How Apple's privacy stance could give Google an AI edge

23.06.2015
"We at Apple reject the idea that our customers should have to make tradeoffs between privacy and security," said Apple CEO Tim Cook earlier this month during an Electronic Privacy Information Center (EPIC) event where he was honored for corporate leadership. "We can and we must provide both in equal measure."

Via a remote video feed, Cook chastised some of Silicon Valley's most notable companies for "lulling their customers into complacency about their personal information" and added that Apple thinks it's wrong to sell customer information for profit. "You might like these so-called free services, but we don't think they're worth having your email, your search history and now even your family photos data mined and sold off for God knows what advertising purpose," Cook said during the speech, according to TechCrunch.

These statements have won Cook accolades among consumer privacy advocates, but the CEO glossed over the various data Apple collects on its users for advertising and other purposes. Apple's iAd advertising platform pales in comparison to those of Google and Facebook, but to suggest that Apple has no interest at all in its users' data is simply inaccurate. However, advertising isn't Apple's main motivation for collecting information on its users. Some of that data is used to feed services that are core to its mobile experience.


The troves of personal information that Apple, Google and Microsoft collect from their users power the Siri, Google Now and Cortana artificial intelligence (AI) apps. The privacy trade-offs should be weighed on an individual basis, but for all intents and purposes these virtual assistants are only as good as the data they use to make relevant recommendations.

Data is the fuel that powers artificial intelligence

Alexander Gray, CTO and cofounder of machine-learning company Skytree, says AI platforms deliver better results as they collect and use more data. "Predictive accuracy keeps improving as the amount of data keeps increasing, but only up to the limit defined by the fundamental amount of predictive information in the data," he says. AI models will reach a threshold in the value they provide unless the models become more complex and crunch different types of new data, says Gray. 

"The more data AI can use, the better its models will be," Gray says. "Hence, there is a fundamental and unavoidable trade off between privacy and AI effectiveness." 

However, it's not all about the volume of data points, according to Jonathan Crane, chief commercial officer at IPsoft, an IT infrastructure services firm. "It's about adding and joining relevant data together to form a body of knowledge that is relevant to a particular process," Crane says. "Truly intelligent systems need to be able to link the data points together in order to make a helpful and sensible recommendation." 

Crane, who consults for companies looking to use AI, says it's important for users to understand the specific services they receive when they share potentially sensitive information. "Whatever the volume of data collected, it has to be manipulated and used in order to give the customer more of the service information they value," says Crane. "If at any time it is used to promote a service or information that the customer didn't want, then it will quickly turn into a negative experience."
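Crane's idea of joining data points into a helpful recommendation can be illustrated with a toy example. The sketch below is purely hypothetical (the event, location, travel-time lookup and threshold are all invented for illustration, and it is not IPsoft's or any vendor's code): a calendar entry and the user's current location are worth little on their own, but joined together they yield a single actionable suggestion of the "time to leave" variety.

```python
# Toy illustration: individual data points become useful once joined.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class CalendarEvent:
    title: str
    starts_at: datetime
    location: str

def travel_minutes(origin: str, destination: str) -> int:
    """Stand-in for a real traffic or transit lookup; returns a fixed estimate."""
    return 35

def recommend(event: CalendarEvent, current_location: str, now: datetime) -> str:
    # Join the calendar data with location data to decide whether to prompt the user.
    leave_by = event.starts_at - timedelta(
        minutes=travel_minutes(current_location, event.location))
    if now >= leave_by - timedelta(minutes=10):
        return f"Leave for '{event.title}' by {leave_by:%H:%M} to arrive on time."
    return "No action needed yet."

event = CalendarEvent("Design review", datetime(2015, 6, 23, 14, 0), "HQ Building 2")
print(recommend(event, "Home office", datetime(2015, 6, 23, 13, 20)))
```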


More data is not necessarily better, according to Ben Cheung, cofounder of the AI scheduling app Genee. Virtual assistant apps ask for a lot of private information but give users only slight convenience or usefulness in return, Cheung says. "There is good reason for them to accumulate the large set of data first and then figure out what features they can provide to the user, but from the user's point of view, that tradeoff doesn't make sense."

Apple's privacy stance a shot at Google

Cheung says native virtual assistant apps such as Siri and Google's "Now on Tap" represent narrow interpretations of AI. Apple, Google and others collect data in bulk because they want to serve a broad set of basic functions. "It's more important to capture the right data and to know exactly what to do with [it] than just having more data," Cheung says.

Users need to feel like they are in control of the information they release, as well as when it is shared and for what purpose, according to Crane. "Making it difficult or opaque for an individual to change their existing choices will prove unfruitful in the long term."

Heath Ahrens, CEO and founder of the text-to-speech platform iSpeech, says Apple's position on privacy is a competitive tactic against Google. "Apple reasons that since they can't directly benefit from the data, at least they can poison Google's reservoir. Google's entire business is built on selling advertisements targeted using personal data, [so] they will suffer as a result."


Apple and Cook get kudos for taking such a strong stance on user privacy, but as a result Siri and other Apple services could fall behind Google over time. Google also has competitive advantages over Apple's Siri because of its strength and legacy in search, as well as its capability to correlate query results with user data, according to Ahrens.

He who has the most user data wins at AI

Skytree's Gray doesn't see much of a disparity between Siri and Google Now today, but he expects that to change based on how much data the companies collect and utilize over time. Apple and Google are presumably applying similar models and computational resources to their AI efforts, which makes the amount and quality of data the remaining factors of differentiation, he says. "The system which can use more and better data will win."

Apple's stance on user privacy will likely place limitations on its AI systems, according to Gray. "The counterbalance is user loyalty, which may be better won and kept with its user privacy policy."

Most people are more than willing to hand over private information for access to the best technology, Gray says. However, if Apple suddenly decided to send "eerily relevant offers from third parties" via Siri, users might start to react differently and choose other technologies.

Today, consumers typically use services such as Siri or Google Now with reckless abandon or not at all. Ultimately, they must decide if it's worth giving up personal information for access to specific services. "The unfortunate fact, though, is that users are not even aware of all the data points about them that are being used in various AI systems," Gray says, and as a result, it's difficult, if not impossible, for users to effectively manage all of the data they share with Apple and others.

(www.cio.com)

Matt Kapko
