A jailbreak of OpenAI’s GPT-4o used leetspeak to bypass ChatGPT’s usual safety measures, letting users elicit instructions for hotwiring cars, synthesizing LSD, and other illicit activities.