[News] Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
Posted by pavnilschanda@lemmy.world (mod) to AI Companions@lemmy.world · 5 months ago
Cross-posted to: [email protected]
holycrap@lemm.ee · 5 months ago
Has anyone been able to reproduce this? The guy could have typed all that in a previous prompt.
Yes, word for word.