Siri recording conversations without permission: a probe in France
There is a brewing scandal in France around Apple: the Paris prosecutor's office has opened an investigation into Apple's Siri voice assistant based on a complaint filed by the Ligue des droits de l'Homme, a French human rights NGO. The organization accuses Apple of having Siri record conversations without users’ informed consent, and of sharing those recordings with third-party contractors for analysis. The case has been handed to France's Office for Combating Cybercrime (OFAC), since the practice may violate privacy laws, including the EU General Data Protection Regulation (GDPR).
Why does Apple record conversations with Siri and analyze them?
It’s simple, really: to make the AI assistant better. A benign purpose. To be clear, the program is now a thing of the past, scrapped after an earlier incident of the same kind. While it was active, though, Apple hired contractors to review a fraction of Siri voice recordings, flag inaccuracies in the assistant’s processing, and suggest quality improvements.
In the process, those contractors (third parties from all over the globe) got to hear rather confidential things that people would never willingly share with strangers. The whole program was consequently condemned for its privacy risks and lack of transparency.
Not the first time
This is not the first predicament of its kind that Apple has found itself in. Back in 2019, Siri was found to frequently record private conversations even when users had not explicitly activated the assistant. Under the program described above, those recordings were then sent to contractors for analysis, again without any permission from the people heard in them.
Many lawsuits followed; in the USA, a large class action accused Apple of unlawful and intentional recording, as well as unauthorized disclosure of confidential communications. While continuing to maintain that it never sold the recordings or used them for marketing purposes, Apple agreed to a $95 million settlement in December 2024.
The currently developing story is broadly similar, and this time there is a whistleblower: Thomas Le Bonniec, one of the Apple contractors who analyzed Siri recordings. According to him, many of the recorded conversations were captured accidentally, when Siri was triggered unintentionally, and contained sensitive personal information such as medical discussions and private moments.
Apple stands its ground, claiming that it collects voice data only from users who opt in to help improve Siri, and that it never uses the recordings for marketing or sells them to ad networks. But some users involved in the matter have reported repeated exposure to targeted ads that seemed clearly linked to their private conversations. So Apple may well be about to have another lawsuit on its hands.
How to keep Siri from recording you
- Disable “Improve Siri & Dictation”: The switch is in Settings → Privacy & Security → Analytics & Improvements. Once it’s off, Apple stops storing and reviewing your audio recordings for quality-improvement purposes.
- Opt out of human review: Since iOS 13.2, Apple has required an explicit opt-in to the program that has humans reviewing recordings. You can deny the permission in the same section of Settings.
- Disable Siri: In Settings → Siri & Search, switch off every toggle and confirm that you want to turn Siri off. For good measure, you can also disable Dictation in Settings → General → Keyboard.
- Revoke microphone access for apps: Settings → Privacy & Security → Microphone lists all the apps with access to the mic; revoke the permission for any that have no obvious need for it.
- Combat background listening: Before a private conversation, disable Siri or, better yet, switch your phone to airplane mode and disconnect it from Wi-Fi. Physically covering the mic can help as well.