Users have been attempting to jailbreak AI models such as ChatGPT; one such jailbreak, 'GODMODE GPT', successfully answered illicit queries such as how to make LSD or hotwire a car.