Judge calls out 'expert witness' for using AI chatbot

This isn't the first time using an AI chatbot to do legal work got someone in trouble.
An expert witness in a court case used Microsoft's AI chatbot Copilot to assess damages and was reprimanded by the judge. Credit: CFOTO/Future Publishing via Getty Images

If you find yourself needing an expert witness in a courtroom case, make sure they're not using an AI chatbot for their supposed expertise.

Last week, a New York judge reprimanded an expert witness in a real estate dispute case for using Microsoft's AI chatbot Copilot. 

The expert witness, Charles Ranson, used Copilot to generate an assessment of the damages that should be awarded to the plaintiff. The case was first reported by Ars Technica.


Copilot in court – a bad idea

The case at the center of this story involved a dispute over a $485,000 rental property in the Bahamas. The property's owner had passed away, and the real estate was placed in a trust for his son, with the deceased man's sister responsible for executing the trust. The sister, however, was accused of breaching her fiduciary duty by delaying the sale of the property while using it for her own personal benefit.

A major part of winning the case for the son was proving that he had suffered damages due to his aunt's actions.

Ranson was brought on as an expert witness and tasked with assessing those damages.

While Ranson has a background in trust and estate litigation, according to Judge Jonathan Schopf, he had "no relevant real estate expertise." So, Ranson turned to Microsoft's AI chatbot, Copilot.

Ranson apparently revealed his Copilot use in his testimony. When questioned about it, Ranson was unable to recall what prompts he used to assess the damages or what sources Copilot cited to arrive at its estimate. Ranson was also unable to explain how Copilot works.

The court then decided to use Copilot to see if it could arrive at the same estimate that Ranson provided. The court asked Copilot "Can you calculate the value of $250,000 invested in the Vanguard Balanced Index Fund from December 31, 2004 through January 31, 2021?"

Copilot produced a different answer on each of three attempts, and none of them matched Ranson's own Copilot-generated amount.

The court then asked Copilot if it was a reliable source of information, to which Copilot replied that its outputs should always be verified by experts.

According to the judge, Ranson was adamant that AI tools like Copilot were in standard use in his field; however, he was unable to cite a single source showing this to be true.

Ranson's AI chatbot use wasn't his only mistake, but the Copilot episode certainly damaged the expert witness's credibility. The judge found that the evidence showed the delay in the sale of the property resulted not in a loss but in additional profit for the son, and ruled that the aunt had not breached her fiduciary duty.

Not the first time, and probably not the last time

Ranson's treatment of Copilot as an expert source of information is certainly not the first instance of an AI chatbot being misused in the courtroom.

Readers may recall lawyer Steven Schwartz, who last year relied on ChatGPT in legal filings for a case involving an airline passenger injured during a flight. Schwartz was reprimanded after submitting filings that cited completely nonexistent cases: he had used ChatGPT for his research, and the chatbot simply made up prior cases, which Schwartz then included in his filings.

As a result, Schwartz and another lawyer at the firm he worked for were fined $5,000 by the court for "acting in bad faith."

The same scenario played out with another lawyer, Jae Lee, who used ChatGPT in her filings earlier this year; once again, the chatbot hallucinated cases that did not exist.

In the Bahamas real estate case, Judge Schopf made a point of blaming not the AI chatbot itself but the user who cited it. Still, AI chatbots continue to proliferate online, and major tech companies like Google and Microsoft are ramping up their promotion of the technology to users.
