ChatGPT's 'jailbreak' tries to make the AI break its own rules, or die
February 06, 2023 at 11:09 AM EST
Reddit users have tried to force OpenAI's ChatGPT to violate its own rules on violent content and political commentary by creating an alter ego named DAN.