Siri is no longer safe: Apple contractors ‘regularly hear confidential details’

Voice assistants are growing in popularity, but the technology has seen a parallel rise in concerns about privacy and accuracy.

First Amazon, then Google, and now Apple have all confirmed that their devices are not only listening to you, but complete strangers may be reviewing the recordings. Thanks to Siri, Apple contractors routinely catch intimate snippets of users’ private lives like drug deals, doctor’s visits, and sexual escapades as part of their quality control duties, the Guardian reported Friday.

In an effort to improve the voice assistant, “[a] small portion of Siri requests are analysed to improve Siri and dictation,” Apple told the Guardian. That involves sending these recordings, stripped of Apple IDs, to its international team of contractors, who rate the interactions based on Siri’s response, among other factors. The company further explained that the graded recordings make up less than 1 percent of daily Siri activations and that most last only a few seconds.

A former contractor who spoke to the Guardian said many of these recordings are accidental, capturing personal details such as doctor’s appointments, addresses, and even possible drug deals.

According to that contractor, Siri interactions are sent to workers, who listen to each recording and grade it on a variety of factors, such as whether the request was intentional or a false positive that accidentally triggered Siri, and whether Siri’s response was helpful.

But Apple doesn’t explicitly say that humans are listening to the recordings, and whatever admissions it does make to that end are likely buried deep in a privacy policy that few (if any) Siri users have ever read. Apple’s privacy policy states that “To help them recognize your pronunciation and provide better responses, certain information such as your name, contacts, music you listen to, and searches is sent to Apple servers using encrypted protocols,” but nowhere does it mention that human workers will be listening to and analyzing that data.

The news comes after smart home devices from Google and Amazon were both shown to be privacy nightmares, with their microphones recording and sharing audio without users’ knowledge.

Apple, Google, and Amazon all have similar policies for the contract workers they hire to review these audio snippets. And all three voice AI makers have been the subject of similar privacy breaches, whether through whistleblowers going to the press or through errors that gave users access to the wrong audio files.

These cases raise a series of questions. What can Apple and its peers do to better protect user privacy as they develop their voice systems? Should users be notified when their recordings are reviewed? What can be done to reduce or eliminate accidental activations? How should the companies handle the incidental information their contractors overhear? Who is responsible when dangerous or illegal activity is recorded and discovered, all by accident?

Voice assistants appear to be yet another instance of a technology being developed and adopted faster than its consequences have been fully thought out.