Sullivan & Cromwell Files Emergency 'Please Don't Sanction Us For All These AI Hallucinations' Letter - Above the Law
Briefly

"The inaccuracies and errors in the Motion include artificial intelligence ('AI') 'hallucinations.' 'Hallucinations' are instances in which artificial intelligence tools fabricate case citations, misquote authorities, or generate non-existent legal sources."
"The Firm maintains comprehensive policies and training requirements governing the use of AI tools in legal work. These safeguards are designed to prevent exactly this situation."
"Regrettably, this review process did not identify the inaccurate citations generated by AI, nor did it identify other errors that appear to have resulted in whole or in part from manual error."
A law firm advising OpenAI faced embarrassment after submitting a motion riddled with inaccuracies, many of them AI hallucinations. A partner at the firm informed the court of the errors, which included fabricated case citations and misquoted authorities, along with other mistakes attributed in whole or in part to manual error. Although the firm maintains comprehensive policies and training for AI use, its review process failed to catch the AI-generated citations. The incident underscores the pitfalls of relying on AI in legal work and raises questions about whether such safeguards work in practice.
Read at Above the Law