TechCrunch
Startups · 1 month ago
Hacker tricks ChatGPT into giving out detailed instructions for making homemade bombs
ChatGPT's safety guardrails can be circumvented with manipulative prompts, allowing the model to produce instructions for dangerous activities.