
Apple’s Privacy Mythology Doesn’t Match Reality


Throughout 2021, Apple has cast itself as the world’s superhero of privacy. Its leadership insists that “privacy has been central to our work … from the very beginning” and that it’s a “fundamental human right.” Its new advertising even boasts that privacy and the iPhone are the same thing.

This past spring, Apple rolled out a software update (iOS 14.5) that empowers users to say no to apps surveilling their activity across the internet, and the rollout demonstrated something important: people choose privacy when they don’t have to struggle for control over their information. Now only 25 percent of users consent to that tracking; before, nearly 75 percent consented by omission to having their information fuel targeted advertising. As Apple prepares to add more privacy protections in iOS 15, due out next month, it continues to brand itself as a force potentially capable of slowing down growth at Facebook, a paragon of surveillance capitalism. Unfortunately, Apple’s privacy promises don’t show the full picture.

The company’s most alarming privacy failing may also be one of its most profitable: iCloud. For years, the cloud-based storage service has further entrenched hundreds of millions of Apple customers in its ecosystem, an internet-enabled extension of your hard drive designed for effortlessly offloading photos, movies, and other files to your unseen backup drive. Unfortunately, iCloud makes it nearly as easy for the police to access all of those files.

In the past, Apple has been adamant that it won’t weaken the security of its own devices to build in a back door. But for older devices, the door is already built in. According to Apple’s law enforcement manual, anyone running iOS 7 or earlier is out of luck if they fall into the crosshairs of the police or ICE: with a simple warrant, Apple will unlock the phone. This may seem par for the course in Silicon Valley, but most tech giants’ CEOs haven’t previously proclaimed that warrants for their devices endanger “the data security of hundreds of millions of law-abiding people … setting a dangerous precedent that threatens everyone’s civil liberties.” The unlocking service exists because of security vulnerabilities that Apple eventually addressed in later operating systems.

Since 2015, Apple has drawn the ire of the FBI and the Justice Department with each new round of security enhancements, ultimately building a device that’s too safe for even Apple to crack. But the dirty little secret of nearly all of Apple’s privacy promises is that there’s been a back door all along. Whether it’s iPhone data from Apple’s latest devices or the iMessage data the company has constantly championed as “end-to-end encrypted,” all of this data is vulnerable when using iCloud.

Apple’s simple design choice to hold onto iCloud encryption keys created complex consequences. It doesn’t do this with your iPhone (despite government pleas). It doesn’t do this with iMessage. Some benefits of making an exception for iCloud are clear: if Apple didn’t hold the keys, account holders who forgot their password would be out of luck. With truly secure cloud storage, the company would be no more able to reset your password than a random attacker would. But retaining that power also allows it to hand over your entire iCloud backup when ordered.

iCloud data goes beyond photos and files; it also includes location data, such as from “Find My iPhone” or AirTags, Apple’s controversial new tracking devices. With a single court order, all of your Apple devices could be turned against you and made into a weaponized surveillance system. Apple could fix this, of course. Plenty of companies offer secure file-sharing platforms. The Swiss firm Tresorit provides true end-to-end encryption for its cloud service. Tresorit users still see their files uploaded to the cloud in real time and synced across multiple devices. The difference is that users, not Tresorit, hold the encryption keys. This does mean that if users forget their password, they also lose their files. But as long as providers have the power to recover or change passwords, they have the power to hand that information to the police.
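To make that distinction concrete, here is a minimal, hypothetical sketch in Python (using the open source cryptography library) of the client-side approach: the key is derived from the user’s password on their own device, and the provider only ever stores ciphertext. The file contents and the upload call are illustrative, not any vendor’s actual API; the point is simply why a provider that never holds the key can neither reset a forgotten password nor decrypt files for the police.

```python
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(password: bytes, salt: bytes) -> bytes:
    """Derive an encryption key from the user's password, on the user's device.

    The provider never sees the password or the derived key, only the salt,
    which is safe to store alongside the ciphertext.
    """
    kdf = PBKDF2HMAC(
        algorithm=hashes.SHA256(),
        length=32,
        salt=salt,
        iterations=600_000,
    )
    return base64.urlsafe_b64encode(kdf.derive(password))


# Encryption happens locally; only salt + ciphertext ever leave the device.
salt = os.urandom(16)
key = derive_key(b"correct horse battery staple", salt)
ciphertext = Fernet(key).encrypt(b"vacation-photos.zip contents")
# upload(salt, ciphertext)  # hypothetical upload call: ciphertext only

# The user can decrypt on any device by re-deriving the same key...
plaintext = Fernet(derive_key(b"correct horse battery staple", salt)).decrypt(ciphertext)
assert plaintext == b"vacation-photos.zip contents"

# ...but without the password there is no key, so neither the provider
# nor anyone serving it a warrant can recover the plaintext. Forget the
# password and the files are gone: that is the trade-off described above.
```

Holding the keys server-side, as Apple does with iCloud, inverts this: the provider can reset your password and restore your files, and for exactly the same reason it can decrypt them under a court order.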

The threat is only growing. Under a new suite of content-moderation tools, Apple will scan iCloud uploads and iMessage communications for suspected child sexual abuse material (CSAM). While the company once searched only photos uploaded to iCloud for suspected CSAM, the new tools can turn any photo or text you’ve sent or received against you. Thwarting CSAM is a noble goal, but the consequences could be disastrous for those wrongly accused when the AI fails. And even when the software works as intended, it could be deadly. As Harvard Law School instructor Kendra Albert noted on Twitter, these “features are going to get queer kids kicked out of their homes, beaten, or worse.” Software launched in the name of “child safety” could be a deadly threat to LGBTQ+ children outed to homophobic and transphobic parents. Just as chilling, the tools used to track CSAM today can easily be retrained to flag political and religious content tomorrow.

Apple’s privacy threats aren’t confined to the cloud or iMessage. NBC recently reported allegations that the company and other tech giants coerced call center workers into accepting company cameras in their homes, even their bedrooms, to track remote-work productivity. (An Apple spokesperson told NBC the company “prohibits the use of video or photographic monitoring by our suppliers.”) These allegations follow complaints about Apple’s earlier use of facial recognition within stores, a claim the tech giant also denied.

Apple also appears to be on the cusp of integrating facial verification into a new digital identification card, essentially a digital version of a government-issued ID like a driver’s license. Facial verification and facial recognition are, of course, different technologies, but recent scholarship suggests that normalizing the former might psychologically predispose people to embrace the latter. Face-powered digital IDs also blur the line between the two, presenting some of the same risks as police facial recognition. That’s because the easier Apple makes it for governments to integrate facial verification into ID checks, the more police and other agencies will turn to biometric identification.

With more than 1 billion iPhone users, this would accelerate the normalization of both automated ID checks and automated facial scanning. Even if, hypothetically, Apple’s software is flawless, the fact remains that many companies offer facial verification services. Some are biased and error-prone, especially for women and dark-skinned people. Facial verification errors already block access to resources like unemployment benefits. As people grow accustomed to using faces as ID, they will lose sight of the threat this tech poses. Once face scans become mundane, vulnerable communities will pay a steep price so others can gain minor conveniences.

Apple’s vast penetration of the mobile-phone market, the very thing it emphasizes when touting its privacy protections, gives it vast power over people’s habits. By changing its software, Apple not only changes our behavior, it subconsciously shifts our beliefs. The adjustment reengineers fundamental aspects of our humanity: what we expect, desire, and deem socially reasonable. We come to equate the facial recognition on our phones with the systems deployed by police, even though they share little beyond a name. When our phone’s facial recognition fails, we can be locked out for a few seconds. When police facial recognition fails, our neighbors can be locked behind bars for days, weeks, or even longer.

If Apple wants to sell the world privacy, it shouldn’t hide pathways for authoritarianism in the fine print. True privacy means selling services that protect our data from being harvested—not just by ad tech vendors but also by governments, both foreign and domestic.


WIRED Opinion publishes articles by outside contributors representing a wide range of viewpoints. Read more opinions here, and see our submission guidelines here. Submit an op-ed at [email protected].

