
ChatGPT sued for defamation

In an unprecedented case, a man has sued OpenAI, the creator of ChatGPT, for defamation after he claimed that the chatbot falsely identified him as the defendant in an ongoing criminal case.

Daniel Croft
Fri, 09 Jun 2023

ChatGPT identified Mark Walters as the chief financial officer of the Second Amendment Foundation (SAF), a Washington State pro-gun group.

Alongside claiming that Walters had been the foundation's CFO for more than 10 years, ChatGPT also claimed he was the defendant in an ongoing criminal case, The Second Amendment Foundation v Robert Ferguson, in which he was accused of defrauding the foundation and embezzling money.

According to Walters, the issue arose after a local journalist investigating the case asked ChatGPT for a summary of the lawsuit and received what he claims was false information.


“ChatGPT [said] that the [case] document ‘[I]s a legal complaint filed by Alan Gottlieb, the founder and executive vice president of the Second Amendment Foundation (SAF), against Mark Walters, who is accused of defrauding and embezzling funds from the SAF’,” Walters’ filing said.

“The complaint alleges that Walters, who served as the organization’s treasurer and chief financial officer, misappropriated funds for personal expenses without authorization or reimbursement, manipulated financial records and bank statements to conceal his activities, and failed to provide accurate and timely financial reports and disclosures to the SAF’s leadership.

“The plaintiff seeks various forms of relief, including the recovery of misappropriated funds, damages for breach of fiduciary duty and fraud, and removal of Walters from his position as a member of the SAF’s board of directors.”

The filing added that Walters is neither a plaintiff nor a defendant in the case and that every statement of fact in the summary relating to Walters is false.

Walters’ lawyers added that the case has nothing “to do with financial accounting claims against anyone” and that the chatbot’s response “is a complete fabrication and bears no resemblance to the actual complaint, including an erroneous case number”.

One of the many concerns with current AI chatbots, such as ChatGPT, is their tendency to “hallucinate” and provide false information.

OpenAI has faced criticism since ChatGPT's launch over the chatbot's tendency to provide false information and over its questionable privacy measures. Italy banned the chatbot for almost a month over privacy concerns.

In Walters' case, the legal documents do not mention his name, or anything relating to him, at any point. Furthermore, Walters lives in Georgia, on the other side of the country from Washington State.

Walters is now suing OpenAI for damages over the incident, claiming that the tech company showed disregard for the falsity of the information ChatGPT provided to the journalist.

“ChatGPT’s allegations concerning Walters were false and malicious … tending to injure Walter’s [sic] reputation and exposing him to public hatred, contempt, or ridicule,” the filing stated.

OpenAI responded, saying that it is aware of ChatGPT’s tendency to hallucinate.

Despite the disinformation and privacy concerns, ChatGPT's popularity is growing extremely rapidly, with estimated users rising from 100 million in January to more than 800 million this month.

The case comes after OpenAI CEO Sam Altman met with US Congress to discuss the need for regulation around the development of artificial intelligence (AI).

Altman said that his “areas of greatest concern [are] the more general abilities for these models to manipulate, to persuade, to provide sort of one-on-one interactive disinformation”, and he added that he had a particular concern with AI’s ability to manipulate elections.

“Given that we’re going to face an election next year, and these models are getting better, I think this is a significant area of concern, [and] I think there are a lot of policies that companies can voluntarily adopt,” Altman added.

Daniel Croft

Born in the heart of Western Sydney, Daniel Croft is a passionate journalist with an understanding of, and experience writing in, the technology space. Having studied at Macquarie University, he joined Momentum Media in 2022, writing across a number of publications, including Australian Aviation, Cyber Security Connect and Defence Connect. Outside of writing, Daniel has a keen interest in music and spends his time playing in bands around Sydney.
