The lawyer said that he “greatly regrets” using ChatGPT and added that he had never used it for legal research before. He said he was “unaware that its content could be false”.
Artificial Intelligence tool ChatGPT has been making headlines since its debut. It has been used to complete assignments such as writing work emails in specific tones and styles. In a bizarre incident, a lawyer from New York is facing a court hearing after his firm, Levidow, Levidow & Oberman, used the AI tool for legal research, according to a BBC report. This came to light after a filing cited non-existent legal cases as examples. Noticing this, the judge remarked that the situation left the court with an “unprecedented circumstance”. However, the attorney said in court that he was “unaware that its content could be false”.
Initially, the case was about a man who had sued an airline for what he claimed to be personal injury. His legal team filed a brief that cited a number of earlier court cases in an effort to establish, through precedent, why the case should proceed. However, the airline’s lawyers then informed the judge in a letter that they were unable to locate some of the examples cited in the brief.
Judge Castel then wrote to the man’s legal team demanding an explanation. He said, “Six of the submitted cases appear to be bogus judicial decisions with bogus quotes and bogus internal citations.” Later it emerged that the research was not done by the man’s lawyer Peter LoDuca but by one of his colleagues at the law firm. Steven A Schwartz, a lawyer with more than 30 years of experience, used the AI tool to find cases that were comparable to the one at hand.
Further, Mr Schwartz said in a statement that Mr LoDuca was not involved in the research and was unaware of how it was conducted. He said that he “greatly regrets” using ChatGPT and added that he had never used it for legal research before. He said he was “unaware that its content could be false”, and pledged never again to “supplement” his legal research using AI “without absolute verification of its authenticity”.
A Twitter thread going viral on the internet shows the conversation between the chatbot and the lawyer. “Is varghese a real case,” asks Mr Schwartz. ChatGPT responded, “Yes, Varghese v. China Southern Airlines Co Ltd, 925 F.3d 1339 (11th Cir. 2019) is a real case.”
He then asked the bot to reveal its source. After “double checking”, ChatGPT insisted that the case was genuine and could be found on legal research databases such as LexisNexis and Westlaw.