An Australian lawyer has been referred to a legal complaints commission after submitting court filings generated by ChatGPT that contained fictitious case citations in an immigration appeal. The unnamed lawyer admitted to using AI for research without verifying its output. The incident underscores a deeper issue of human negligence: reliance on technology does not absolve legal professionals of the responsibility to ensure the accuracy of their work, especially in proceedings where misinformation can have serious consequences.
The lawyer now faces possible disciplinary action after using ChatGPT to generate court filings that included entirely fabricated case citations, a stark reminder that accountability for AI-assisted work remains with the human who files it.
Justice Rania Skaros referred the unnamed lawyer to the NSW Legal Services Commissioner for submitting misleading legal documents sourced from AI without verification.