Apple must have caught a lot of flak when news came out that their phones were recording and transmitting sexual liaisons and other intimate details of some users’ lives to Apple employees. From Tyler Durden at zerohedge.com:
Last week, we reported a shocking revelation about Apple’s friendly personal assistant (read: tool for mass surveillance), Siri: the app “regularly” records people having sex, discussing private medical information, participating in drug deals, and countless other invasive moments, which it promptly sends to Apple contractors for their listening pleasure – all for the sake of “quality control”.
Now, in the face of the latest privacy scandal to rock one of the major American tech giants, Apple has reportedly decided to suspend its global internal program for “grading” users’ Siri commands following the public backlash over the Guardian’s revelations, which came courtesy of a whistleblower.
Previously, Apple contractors listened to less than 1% of Siri commands as part of the program, which was intended to improve the quality and accuracy of the voice-based digital assistant. Yet, with hundreds of millions of users worldwide, the Cupertino-based consumer tech giant was effectively monitoring a huge chunk of the American population without their explicit knowledge.
Siri is neither as nice as she sounds nor as discreet as advertised. From Tyler Durden at zerohedge.com:
Should it come as any surprise? And yet the details are shocking and outrageous. A whistleblower working for Apple has revealed to The Guardian that its popular voice-activated spying device (sold as a helpful virtual assistant), Siri, now in millions of households, “regularly” records people having sex and captures “countless” other invasive moments, which it promptly sends to Apple contractors for their listening pleasure – that is, for “quality control”:
Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or “grading”, the company’s Siri voice assistant, the Guardian has learned.
However, what hasn’t been disclosed – or at least wasn’t widely known until now – is that a “small proportion” of all Siri recordings, captured in what consumers thought were private settings, are actually forwarded to Apple contractors around the world, according to the new report. Supposedly this is to ensure Siri is responding properly and can continue to handle dictation accurately. Apple says, according to The Guardian, the data “is used to help Siri and dictation… understand you better and recognise what you say”.