I still don’t think companies serve you ads based on spying through your microphone
2nd January 2025
One of my weirder hobbies is trying to convince people that the idea that companies are listening to you through your phone’s microphone and serving you targeted ads is a conspiracy theory that isn’t true. I wrote about this previously: Facebook don’t spy on you through your microphone.
(Convincing people of this is basically impossible. It doesn’t matter how good your argument is: if someone has ever seen an ad that relates to a previous voice conversation, they are likely convinced and there’s nothing you can do to talk them out of it. Gimlet Media did a great podcast episode about how impossible this is back in 2017.)
This is about to get even harder thanks to this proposed settlement: Siri “unintentionally” recorded private convos; Apple agrees to pay $95M (Ars Technica).
Apple are spending $95m (nine hours of profit) to settle the case while “denying wrongdoing”.
What actually happened: it turns out Apple were capturing snippets of audio surrounding the “Hey Siri” wake word, sending those back to their servers, and occasionally using them for QA, all without informing users that they were doing this. This is bad.
The 2021 Reuters story Apple must face Siri voice assistant privacy lawsuit -U.S. judge reported that:
One Siri user said his private discussions with his doctor about a “brand name surgical treatment” caused him to receive targeted ads for that treatment, while two others said their discussions about Air Jordan sneakers, Pit Viper sunglasses and “Olive Garden” caused them to receive ads for those products.
The claim from that story was then repeated in the 2025 Reuters story about the settlement.
The Ars Technica story reframes that like this:
The only clue that users seemingly had of Siri’s alleged spying was eerily accurate targeted ads that appeared after they had just been talking about specific items like Air Jordans or brands like Olive Garden, Reuters noted.
Crucially, this was never proven in court. And if Apple settle the case, it never will be.
Let’s think this through. For the accusation to be true, Apple would need to be recording those wake word audio snippets and transmitting them back to their servers for additional processing. That part is likely true. But they would then need to be feeding those snippets, in almost real time, into a system that forwards them on to advertising partners, who in turn feed that information into targeting networks, so that the next time you view an ad on your phone the information is available to help select the relevant ad.
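To make the shape of that accusation concrete, here’s a deliberately hypothetical sketch in Swift. Every type and function name below is invented; nothing here corresponds to a real Apple system. The point is to show how many extra moving parts would have to exist, and stay secret, beyond the QA uploads Apple actually admitted to.

```swift
import Foundation

// Entirely hypothetical: invented names sketching the pipeline the
// accusation implies. Nothing below corresponds to a real Apple system.
struct WakeWordSnippet {
    let audio: Data
    let deviceID: String
}

// Step 1 roughly matches what Apple admitted to: audio captured around
// the wake word, uploaded and occasionally reviewed for QA.
func uploadForQualityAssurance(_ snippet: WakeWordSnippet) {
    print("QA: stored \(snippet.audio.count) bytes for human grading")
}

// Steps 2-4 are the unproven leap: near-real-time transcription,
// ad-keyword extraction, and a live feed out to ad-targeting networks.
func transcribe(_ audio: Data) -> String {
    "air jordans olive garden"  // placeholder transcript
}

func extractAdKeywords(from transcript: String) -> [String] {
    transcript.split(separator: " ").map(String.init)
}

func forwardToAdPartners(_ keywords: [String], device: String) {
    print("ads: targeting \(device) with \(keywords)")
}

let snippet = WakeWordSnippet(audio: Data(count: 32_000), deviceID: "device-123")
uploadForQualityAssurance(snippet)  // the part that happened (roughly)
let keywords = extractAdKeywords(from: transcribe(snippet.audio))
forwardToAdPartners(keywords, device: snippet.deviceID)  // the part that didn't
```

The first function is roughly what the settlement is about. Everything after it is the leap with no evidence behind it.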
That is so far-fetched. Why would Apple do that? Especially given their brand and reputation as a privacy-first company, combined with the large amount of product design and engineering work they’ve put into preventing apps from doing exactly this kind of thing: enforcing permission-based capabilities and ensuring a “microphone active” indicator is visible at all times when an app is listening in.
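For a sense of how much machinery stands in the way of any app that wants to listen, here’s a rough sketch of the permission gate iOS enforces, using the long-standing AVAudioSession API (iOS 17 moved this to AVAudioApplication). Treat it as an illustration, not a complete recording implementation.

```swift
import AVFoundation

// Sketch of the iOS microphone permission gate. An app must also declare
// NSMicrophoneUsageDescription in its Info.plist, or its first attempt
// to access the microphone terminates the app.
func withMicrophoneAccess(_ startRecording: @escaping () -> Void) {
    let session = AVAudioSession.sharedInstance()
    switch session.recordPermission {
    case .granted:
        // Even once granted, iOS shows a microphone-in-use indicator
        // whenever the app is actually listening; apps can't hide it.
        startRecording()
    case .denied:
        print("User has refused microphone access")
    case .undetermined:
        // Pops the system prompt that the user must explicitly accept.
        session.requestRecordPermission { granted in
            if granted {
                DispatchQueue.main.async(execute: startRecording)
            }
        }
    @unknown default:
        break
    }
}
```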
I really don’t think this is happening—in particular for Siri wake words!
I’ve argued these points before, but I’ll do it again here for good measure.
- You don’t notice the hundreds of times a day you say something and don’t see a relevant advert a short time later. You see thousands of ads a day; can you remember what any of them were?
- The tiny fraction of occasions when you do see an ad relevant to something you’ve just said (which is exactly why it breaks through the filter that stops you noticing most ads at all) sticks in your head.
- Human beings are pattern-matching machines with a huge bias towards personal anecdotes. If we’ve seen direct evidence of something ourselves, good luck talking us out of it!
I think the truth of the matter here is much more pedestrian: the quality of ad targeting that’s possible just from apps sharing data about your regular actions within those apps is shockingly high, combined with the fact that just knowing “male, 40s, NYC” is often more than enough. We’re all pretty basic!
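As a made-up illustration of how little that takes (these field names are invented, not any real ad SDK’s schema):

```swift
// Hypothetical shape of the data an ad network can assemble without ever
// touching a microphone: coarse demographics plus in-app behaviour.
struct TargetingProfile {
    let ageBand: String
    let inferredGender: String
    let metroArea: String            // often derived from IP alone
    let recentInAppEvents: [String]  // shared by the apps you use
}

let profile = TargetingProfile(
    ageBand: "40-49",
    inferredGender: "male",
    metroArea: "NYC",
    recentInAppEvents: [
        "viewed: basketball sneakers",
        "searched: italian restaurants near me",
    ]
)
// This is routinely enough to produce an ad that feels eerily like it
// overheard your last conversation.
```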
I fully expect that this Apple story will be used as “proof” by conspiracy theorists effectively forever.