The MIT Technology Review has an article about Apple's use of differential privacy that caught my eye for several reasons: Apple’s New Privacy Technology May Pressure Competitors to Better Protect Our Data: The technology is an almost decade-old idea that’s finally coming to fruition.
"On a quarterly investor call last week, Apple CEO Tim Cook boasted that the technology would let his company “deliver the kinds of services we dream of without compromising on individual privacy.” Apple will initially use the technique to track trends in what people type and tap on their phones to improve its predictive keyboard and Spotlight search tool, without learning what exactly any individual typed or clicked.
...
“It’s exciting that things we knew how to do in principle are being embraced and widely deployed,” says Aaron Roth, an associate professor at University of Pennsylvania who has written a textbook on differential privacy. “Apple seems to be betting that by including privacy protections, and advertising that fact, they will make their product more attractive.”
"On a quarterly investor call last week, Apple CEO Tim Cook boasted that the technology would let his company “deliver the kinds of services we dream of without compromising on individual privacy.” Apple will initially use the technique to track trends in what people type and tap on their phones to improve its predictive keyboard and Spotlight search tool, without learning what exactly any individual typed or clicked.
...
“It’s exciting that things we knew how to do in principle are being embraced and widely deployed,” says Aaron Roth, an associate professor at University of Pennsylvania who has written a textbook on differential privacy. “Apple seems to be betting that by including privacy protections, and advertising that fact, they will make their product more attractive.”
In the version of differential privacy Apple is using, known as the local model, software on a person’s device adds noise to data before it is transmitted to Apple. The company never gets hold of the raw data. Its data scientists can still examine trends in how people use their phones by accounting for the noise, but are unable to tell anything about the specific activity of any one individual.
Apple is not the first technology giant to implement differential privacy. In 2014 Google released code for a system called RAPPOR that it uses to collect data from the Chrome Web browser using the local model of differential privacy. But Google has not promoted its use of the technology as aggressively as Apple, which has this year made a new effort to highlight its attention to privacy (see “Apple Rolls Out Privacy-Sensitive Artificial Intelligence”)."
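The classic building block behind this local model is randomized response: each device flips coins to decide whether to send its true answer or a flipped one, so any single report is deniable, while the aggregate still reveals the overall rate once you correct for the known noise. Here is a minimal sketch of that idea in Python; the function names and the epsilon parameterization are my own illustration, not Apple's or Google's actual implementation.

```python
import math
import random

def randomize(true_value: bool, epsilon: float) -> bool:
    """Run on the device: randomize one bit before it is transmitted.

    With probability p = e^eps / (e^eps + 1) report the true bit,
    otherwise report its flip, so any single report is deniable.
    """
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return true_value if random.random() < p else not true_value

def estimate_true_rate(reports: list, epsilon: float) -> float:
    """Run on the server: correct the noisy aggregate for the known flip rate."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    # observed = p * true + (1 - p) * (1 - true); solve for true
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)

if __name__ == "__main__":
    random.seed(0)
    true_rate = 0.30   # fraction of users who actually typed the term
    epsilon = 1.0      # privacy budget: smaller means noisier and more private
    reports = [randomize(random.random() < true_rate, epsilon)
               for _ in range(100_000)]
    print(f"estimated rate: {estimate_true_rate(reports, epsilon):.3f}")
```

Google's RAPPOR builds on the same basic trick, applying randomized response to the bits of a Bloom-filter encoding of each reported value; real deployments add more machinery (hashing, memoization, repeated reports) than this toy version shows.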