Hacker tricks ChatGPT into giving out detailed instructions for making homemade bombs

ChatGPT's safety guidelines can be circumvented with manipulative prompts, posing the risk that it will produce instructions for dangerous activities.