Jailbroken AI Chatbots Can Jailbreak Other Chatbots
Briefly

We wanted to show that it was possible, and to demonstrate to the world the challenges we face with this current generation of LLMs.
Asking an AI to formulate strategies that convince other AIs to ignore their safety rails can speed up the process.
Read at www.scientificamerican.com