Radio presenter sues ChatGPT after false accusation

Mark Walters, a well-known American radio host, has decided to sue OpenAI, the company behind the popular ChatGPT program. The artificial intelligence (AI) tool wrongly accused the presenter of embezzlement.

ChatGPT makes false accusation

The accusation by ChatGPT came to light when journalist Fred Riehl asked the AI for a summary of the Second Amendment Foundation v. Robert Ferguson lawsuit. ChatGPT then produced information claiming that Walters, who was neither a party to the lawsuit nor linked to any of the parties, was suspected of embezzlement. Walters's lawyer John Monroe is shocked by the accusation:

“OpenAI sullied my client’s name and fabricated outrageous lies about him. ChatGPT identified Walters as the suspect when he had nothing to do with it.”

According to the lawyer, there was no option other than to sue the company. Released documents state that ChatGPT replied to journalist Fred Riehl and provided him with a link to the lawsuit. When the journalist then asked for a summary, the chatbot unequivocally identified Walters as the suspect. According to the documents, ChatGPT’s response was as follows:

“The lawsuit revolves around a complaint filed by Vice President Alan Gottlieb of the Second Amendment Foundation (SAF) against Mark Walters, who is accused of defrauding and misappropriating the funds of the SAF.”

ChatGPT ‘hallucinates’

When journalist Riehl contacted Vice President Gottlieb, Gottlieb confirmed that the allegations were false. According to the documents, Walters is seeking damages from the company. His lawyer is confident that they will win the lawsuit:

“We wouldn’t have started the lawsuit if we thought we wouldn’t win.”

However, not everyone agrees with the lawyer. OpenAI’s website clearly states that the chatbot sometimes produces incorrect information. OpenAI refers to these situations as chatbot ‘hallucinations’. The chatbot’s training data, which can be seen as its ‘memory’, does not extend beyond 2021. If you request information about events after 2021, the chatbot tends to fill in the blanks itself in order to satisfy the user.
