Jailbreak tricks Discord’s new chatbot into sharing napalm and meth instructions
In March, Discord announced that it had integrated OpenAI’s technology into Clyde, its bot, turning it into an AI-powered chatbot. As with nearly every chatbot launched in the last few months, users have been trying to trick Clyde into saying things it isn’t supposed to say, a process colloquially known as “jailbreaking.”