While Apple can guarantee that it won't misuse users' facial data, third parties such as app developers may have a field day at the expense of users' privacy.
Reuters reports that the American Civil Liberties Union and the Center for Democracy and Technology are concerned about what happens to iPhone X users' data when apps interact with the TrueDepth camera, the engine powering Face ID authentication and face-mapping features like Animoji.
With third-party developers set to build their own applications on the technology, Apple's contract for access to consumers' facial data requires that the data not be sold to outside buyers and that customers agree to its collection. Adding to the risk, the data must be stored on a developer's server in order to be used, which leaves the protection of that data up to each individual developer.
Apple insists that the dot patterns developers receive cannot be used to reconstruct the mathematical key generated by the Face ID algorithm, and that repercussions for contract breaches, such as a ban from the App Store, will keep app makers in line.
But the fact that raw facial data can be captured, stored, shared, and stolen is what worries rights advocates. While government agencies have already amassed biometric data of many forms on their citizens, it is commercial interests, they say, that will want to track reactions to advertisements or editorial content, in violation of the contract stipulations Apple attaches to the facial data.
“The bottom line is, Apple is trying to make this a user experience addition to the iPhone X, and not an advertising addition,” said Clare Garvie, an associate with the Center on Privacy & Technology at Georgetown University Law Center in Washington.
How well Apple enforces these and other rules it sets out is in question, as many developers never read the reams of text that make them up. One of the company's "privacy czars" admitted to the US Congress in 2011 that the pre-publication review process amounts to random spot checks of source code and relies on complaints from testers as a safeguard.
“Apple does have a pretty good historical track record of holding developers accountable who violate their agreements,” said Jay Stanley, a senior policy analyst with the ACLU. “But they have to catch them first — and sometimes that’s the hard part.”