It's dangerously easy to 'jailbreak' AI models so they'll tell you how to build Molotov cocktails, or worse

The Skeleton Key jailbreaking method tricks AI models into revealing harmful information.