The 'Make America Healthy Again' report, initially praised as a crucial health policy guide, was found to contain critical errors linked to generative AI, including fictitious citations and misrepresented findings. The White House dismissed these issues as mere formatting problems. Experts, however, argue that the false citations reflect a common failure of AI-generated content known as 'hallucinations,' in which plausible-sounding yet inaccurate information is presented as fact. The report has since been updated, but concerns over AI's role in producing ostensibly credible academic content persist, underscoring the challenges of relying on AI in scholarly work.
The faux citations were formatted correctly, listed reputable journals, and included realistic-sounding digital object identifiers (DOIs). However, multiple cited articles did not exist.
Researchers noted that the presence of fabricated articles is likely the result of AI "hallucinations," which generate plausible-sounding content without factual accuracy.