
OpenAI called out for potential GDPR violations

ChatGPT creator OpenAI has been accused of violating the GDPR by Austrian data privacy advocacy group NOYB over its artificial intelligence (AI) tool’s tendency to hallucinate and spread misinformation.

Daniel Croft
Wed, 01 May 2024

The European Centre for Digital Rights (NOYB) filed a complaint with the Austrian Data Protection Authority, saying that while it is well known that AI chatbots tend to “hallucinate” and produce false or misleading information, these tools violate the GDPR (General Data Protection Regulation) when they spread misinformation about individuals.

NOYB said that it had previously contacted OpenAI about ChatGPT’s tendency to spread misinformation about individuals and requested that this be fixed, to which OpenAI responded that it was unable to do so.

“It’s clear that companies are currently unable to make chatbots like ChatGPT comply with EU law when processing data about individuals,” said NOYB data protection lawyer Maartje de Graaf on 29 April 2024.


“If a system cannot produce accurate and transparent results, it cannot be used to generate data about individuals. The technology has to follow the legal requirements, not the other way around.”

According to the GDPR, all personal information must be accurate, and individuals should be able to access any stored data about them in its entirety, including its source. Individuals should also be able to request that their data be deleted, and violations of the GDPR can result in fines of up to 4 per cent of a company’s annual global revenue.

OpenAI, however, said it is unable to reveal the origin of its data or provide individuals with the details of their stored information. It also said it is working on improving factual accuracy, calling it an “area of active research”.

“The obligation to comply with access requests applies to all companies,” added de Graaf.

“It is clearly possible to keep records of training data that was used and at least have an idea about the sources of information. It seems that with each ‘innovation’, another group of companies thinks that its products don’t have to comply with the law.”

Daniel Croft

Born in the heart of Western Sydney, Daniel Croft is a passionate journalist with an understanding of, and experience writing in, the technology space. Having studied at Macquarie University, he joined Momentum Media in 2022, writing across a number of publications including Australian Aviation, Cyber Security Connect and Defence Connect. Outside of writing, Daniel has a keen interest in music and spends his time playing in bands around Sydney.
