Leaked Audio Reveals Microsoft Cherry-Picked Examples to Make Its AI Seem Functional
Microsoft's generative AI tool, Security Copilot, frequently produced incorrect responses, and Microsoft had to cherry-pick examples to showcase good results.
The AI tool, built on OpenAI's GPT-4 language model, suffered from hallucinations and gave different answers when asked the same questions.