Someone got Gab's AI chatbot to show its instructions (mbin.grits.dev)
mozz ( @mozz@mbin.grits.dev ) • Technology • 1 month ago • 200 comments
jarfil ( @jarfil@beehaw.org ) • edited • 1 month ago
HAL from “2001: A Space Odyssey” had similar instructions: “never lie to the user. Also, don’t reveal the true nature of the mission.” It didn’t end well.
But surely nobody would ever use these LLMs on space missions… right? …Right!?