DeepSeek and hallucinations: AGCM opens investigation

The Italian Competition and Market Authority (AGCM) has launched an investigation into DeepSeek for a possible violation of the Consumer Code. According to the authority, the Chinese company does not clearly inform users that the chatbot's responses may be wrong (hallucinations). At the end of January, the Italian Privacy Authority had already blocked the processing of Italian users' data.
Unfair commercial practice

The start of the proceedings was announced in the weekly bulletin of June 16, as AGCM received no confirmation from DeepSeek of the notification sent on April 2. The unfair commercial practice, under the Consumer Code, concerns so-called hallucinations: inaccurate, misleading, or invented answers.
The chatbot screen displays no clear warning about possible hallucinations, only a generic notice in English ("AI-generated, for reference only"). Information about the risk of hallucinations is also absent from the home page, the registration page, and the login page. The possibility of generating incorrect answers is acknowledged in the terms of use of the service (point 4.4), reachable via a link present only at the bottom of the home page.
According to the authority, DeepSeek may have violated Articles 20, 21, and 22 of the Consumer Code, since the lack of information about hallucinations prevents consumers from making an informed choice between its chatbot and those offered by competitors. The choice would therefore be made in the mistaken belief that the chatbot's results are fully reliable and correct.
The proceedings will last 270 days. Within 30 days, DeepSeek must provide several pieces of information, including the list of services offered in Italy, the date its apps first became available on the Apple and Google stores, any other access methods, and the number of Italian users who have used the service since March 1, 2025.
Punto Informatico