Judges have noted a significant increase in AI-generated legal citation errors since May 2023, and lawyers are increasingly being held responsible for these mistakes. Legal researcher Damien Charlotin has documented 120 instances in which AI hallucinations misled legal professionals and judges, revealing a troubling trend. While earlier cases primarily involved self-represented litigants, recent data shows a shift: lawyers were implicated in over half of the errors identified in recent months. Courts around the world have responded with sanctions and fines against offending individuals, underscoring the seriousness of AI misuse in legal contexts.
Cases of lawyers or litigants mistakenly citing hallucinated cases have now become a rather common trope, highlighting the ongoing reliance on AI tools in legal work.
In 2023, studies showed that 70% of hallucination cases involved pro se litigants, but by May 2025, legal professionals were to blame for more than half of these errors.