Summary
- Apple agreed to pay a total of $95 million to settle claims that Siri-enabled devices unintentionally recorded users between 2014 and 2024, and that those recordings were used for Siri training without consent.
- The widely reported incident was preceded by Amazon and Google leaking similar audio to unintended recipients and third-party contractors, scandals the public quickly forgot.
- Smartphones and smart speakers aren't conspiratorially spying on individual users' conversations, and the tracking methods companies actually rely on are more worrisome and harder to mitigate.
Over five years ago, The Guardian broke whistleblower allegations that Apple trained Siri, in part, using sensitive “private communications” recorded accidentally, and therefore without users’ explicit consent. Apple has now agreed to a $95 million settlement in exchange for being cleared of damages and claims related to the purported “unintended Siri activation” (Source: Ars Technica).
The alleged offense came to light in July 2019, a year after Amazon acknowledged receiving similarly unintentional recordings, and weeks after Google admitted over 1,000 Google Assistant recordings had been leaked to a Belgian news outlet.
If the US District Court approves the agreement, a portion of the $95 million will go to anyone who files a claim for a Siri-enabled device they owned between the "Hey, Siri" wake word's debut on September 17, 2014, and the settlement cutoff of December 31, 2024. Under the settlement's terms, Apple must also confirm the deletion of Siri recordings collected before October 2019, and issue new guidance on how it uses collected voice data and how consumers can opt in to submitting anonymized recordings to "Improve Siri".
A smoking gun that proves voice assistant espionage?
Or something less nefarious that we’ve known for years?
According to the agreement reached between plaintiff and defendant, the payment would clear Apple and related entities of potential responsibility for the alleged breach. In other words, the company and anyone involved from 2014 to 2024 would be off the hook — and admit no wrongdoing — for what could, theoretically, have cost Apple over $1.5 billion under the Wiretap Act.
Apple's potential release from responsibility is at odds with the public outcry of 2019. Users, industry experts, and journalists cried foul over the supposed misuse of recordings that, because they were captured unintentionally, users could not possibly have consented to. As far back as 2013, however, Apple openly shared with Wired its two-year Siri data retention policy, with anonymization occurring after six months. But that policy was never an admission that Apple received unintentional recordings.
Raise your hand if you thought voice recognition was magic
In 2018, Amazon came under fire for recording a private conversation and sending it to a user’s contact unintentionally. Amazon explained that “a word in the background conversation sounding like ‘Alexa'” had triggered the interaction, and Alexa proceeded to repeatedly misinterpret the background conversation as, ultimately, a confirmed “send message” request directed at a contact. An unlikely turn of events, indeed — and still more likely than Amazon intentionally eavesdropping on a couple’s evening conversation about hardwood flooring.
When the Apple whistleblower emerged in 2019, the public also conveniently forgot the scandal from two weeks prior, in which over 1,000 Google Assistant clips leaked to Belgian news outlet VRT NWS. Google paused the use of voice recordings (followed by an immediate three-month ban courtesy of German regulators) and launched an investigation that ultimately found third-party contractors responsible for leaking the audio.
In the aftermath, Apple and Google removed third-party contractors from the process, pledged to shore up security, and vowed to stop retaining recorded data by default. But whistleblowers had already outed Amazon for giving third-party workers access to personal recordings that April, prior to the Apple accusations. The public's memory didn't even rival that of a goldfish.
Is big tech recording conversations to serve you ads?
No, Alexa and Siri still aren’t spying on you
The initial worries about clandestine recording fall apart quickly when you consider exactly how a company would train voice recognition in the first place. In 2019, Forbes contributor Kevin Murnane pointed out that it's "business as usual" to train voice recognition software on user recordings. Murnane went on to argue that Google needed to fix internal problems, but that the repeated "evil tech" rhetoric surrounding the various recording revelations was "ridiculous".
Yet consumers have latched onto the idea that Apple is squashing a massive conspiracy with an inconsequential $95 million payout. As Reuters reports, "Two plaintiffs said their mentions of Air Jordan sneakers and Olive Garden restaurants triggered ads for those products," an entirely unprovable claim that illustrates how paranoid thinking pushes us to defy reality and distracts us from real issues.
What are the real issues, if spying’s not one?
Irresponsible headlines, like the Washington Post's "Alexa has been eavesdropping on you this whole time," obscure the genuinely concerning ways organizations and advertisers track users. Cookies, browser fingerprinting, and ghost profiles (the subversive methods companies actually use to track us) are far more effective, and considerably harder to mitigate, than surreptitiously activating a microphone, transmitting audio to a server, parsing and categorizing the information, and turning it on users via heavily personalized ads.
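For a sense of why fingerprinting is so much harder to dodge than a hot microphone, here is a minimal sketch in TypeScript, assuming a standard browser environment; the attributes and hashing below are illustrative choices, not any specific tracker's actual implementation. A page can combine a handful of freely readable signals into a stable identifier without ever asking permission:

```typescript
// Minimal, illustrative browser fingerprint: combine a few attributes every
// page can read without any permission prompt, then hash them into a stable ID.
// This is a sketch for explanation only, not any real tracker's code.
async function basicFingerprint(): Promise<string> {
  const signals = [
    navigator.userAgent,                                      // browser + OS build string
    navigator.language,                                       // preferred language
    `${screen.width}x${screen.height}x${screen.colorDepth}`,  // display characteristics
    Intl.DateTimeFormat().resolvedOptions().timeZone,         // local time zone
    String(navigator.hardwareConcurrency),                    // CPU core count
  ].join("|");

  // Hash the combined signals so the identifier is compact and consistent
  // across visits, as long as the underlying attributes don't change.
  const bytes = new TextEncoder().encode(signals);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

// Usage: log the identifier; a real tracker would attach it to ad requests.
basicFingerprint().then((id) => console.log("fingerprint:", id));
```

Muting a microphone is a toggle; hiding every one of these signals means fighting how browsers fundamentally work, which is why this kind of tracking is the harder problem to mitigate.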
Apple offered $95 million because, as the settlement documents make clear, the company saw more downside than upside in trudging forward with a lawsuit on the murky, still-developing frontier of digital privacy law. Every voice recognition developer trains with user inputs, and no voice assistant picks up wake words perfectly every time. Accidental recordings will obviously happen, and Big Tech needs to continue locking down its security measures. But we don't need to invent any more conspiracies.