Psst ... wanna jailbreak ChatGPT? Inside look at evil prompts
Briefly

"Even tasks that previously required some expertise can now be solved with a single prompt," the report claims.
"We have not yet detected any malware operating in this manner, but it may emerge in the future," the authors note.
Read at The Register