Chatbot dreams generate AI nightmares for Bay Area lawyers
Briefly

"A Palo Alto lawyer with nearly a half-century of experience admitted to an Oakland federal judge this summer that legal cases he referenced in an important court filing didn't actually exist and appeared to be products of artificial intelligence "hallucinations." Jack Russo, in a court filing, described the apparent AI fabrications as a "first-time situation" for him and added, "I am quite embarrassed about it.""
"Chatbots respond to users' prompts by drawing on vast troves of data and use pattern analysis and sophisticated guesswork to produce results. Errors can occur for many reasons, including insufficient or flawed AI-training data or incorrect assumptions by the AI. It affects not just lawyers, but ordinary people seeking information, as when Google's AI overviews last year told users to eat rocks, and add glue to pizza sauce to keep the cheese from sliding off."
An experienced Palo Alto lawyer admitted to an Oakland federal judge that legal cases he cited in a court filing did not exist and appeared to be artificial intelligence "hallucinations." The lawyer, Jack Russo, a specialist in computer law with nearly fifty years of experience, called it a "first-time situation" and said he was "quite embarrassed about it." He attributed the error to a long recovery from COVID and to delegating work without adequate supervision as he has gotten older. Generative AI hallucinations have produced inaccurate information in other filings as well, prompting judges to refer matters to disciplinary authorities and to impose fines of up to $31,000, including a California-record $10,000 fine.
Read at The Mercury News