For well over a decade, Apple has been praised by privacy advocates for its decision in 2011 to end-to-end encrypt iMessage, securing users’ communications on the default texting app for all its devices so thoroughly that even Apple itself can’t read their messages. Years before WhatsApp switched on end-to-end encryption in 2016, and before Signal—now widely considered the most private end-to-end encrypted messaging platform—even existed, Apple quietly led the way with that security feature, baking it into a core piece of its ecosystem.
So it’s ironic that in the landmark antitrust lawsuit the US Department of Justice has now filed against Apple, alleging that the company has sought for years to monopolize the smartphone market and gravely harmed consumers in the process, iMessage’s end-to-end encryption has become Exhibit A for an argument about Apple’s privacy hypocrisy: that Apple’s allegedly anticompetitive practices have denied users not only better prices, features, and innovation, but also better digital security.
In its sweeping antitrust lawsuit, the DOJ on Thursday laid out a broad set of allegations against Apple, accusing it of monopolistic practices in how it uses its walled-garden operating systems and app stores to deprive consumers of apps and services that might make it easier for them to wean themselves from their Apple addictions—keeping out of the App Store so-called super apps with cross-platform, broad functionality; limiting streaming and cloud-based applications; and handicapping the functionality of competitors’ devices like smartwatches.
The DOJ’s complaint also homes in on Apple’s approach to security and privacy, arguing that it uses those principles as an excuse for its anticompetitive practices, yet jettisons them whenever they might hurt the bottom line. “In the end, Apple deploys privacy and security justifications as an elastic shield that can stretch or contract to serve Apple’s financial and business interests,” the complaint reads.
“I definitely think that Apple has strategically used privacy and security in ways that benefit its business,” says Caitlin Chin-Rothmann, a research fellow at the Center for Strategic & International Studies (CSIS) who focuses on technology policy. “Apple has taken some steps to improve end-to-end encryption in iMessage, for example, but it hasn’t extended that to iPhone users that text Android users or iPhone users that don’t use iMessage.”
In its privacy and security arguments, the DOJ faults Apple for decisions like its deal with Google to make Google’s search engine the default on Apple products, rather than a more privacy-preserving alternative, or allowing data-harvesting apps into its App Store. But it repeatedly returns to iMessage as perhaps the clearest example of how Apple’s anticompetitive practices directly harm users’ security. The DOJ argues that by refusing to let users of other smartphone platforms like Android use its end-to-end encrypted iMessage protocol, Apple has significantly reduced the overall security of messaging worldwide, both for those Android users and for the Apple users who communicate with them.
“Text messages sent from iPhones to Android phones are unencrypted as a result of Apple’s conduct,” the complaint reads. “If Apple wanted to, Apple could allow iPhone users to send encrypted messages to Android users while still using iMessage on their iPhone, which would instantly improve the privacy and security of iPhone and other smartphone users.”
The argument is one that some Apple critics have made for years, as spelled out in an essay in January by Cory Doctorow, the science fiction writer, tech critic, and coauthor of Chokepoint Capitalism. “The instant an Android user is added to a chat or group chat, the entire conversation flips to SMS, an insecure, trivially hacked privacy nightmare that debuted 38 years ago—the year Wayne’s World had its first cinematic run,” Doctorow writes. “Apple’s answer to this is grimly hilarious. The company’s position is that if you want to have real security in your communications, you should buy your friends iPhones.”
In a statement to WIRED, Apple says it designs its products to “work seamlessly together, protect people’s privacy and security, and create a magical experience for our users,” and it adds that the DOJ lawsuit “threatens who we are and the principles that set Apple products apart” in the marketplace. The company also says it hasn’t released an Android version of iMessage because it couldn’t ensure that third parties would implement it in ways that met the company’s standards.
“If successful, [the lawsuit] would hinder our ability to create the kind of technology people expect from Apple—where hardware, software, and services intersect,” the statement continues. “It would also set a dangerous precedent, empowering government to take a heavy hand in designing people’s technology. We believe this lawsuit is wrong on the facts and the law, and we will vigorously defend against it.”
Apple has, in fact, not only declined to build iMessage clients for Android or other non-Apple devices, but actively fought against those who have. Last year, a service called Beeper launched with the promise of bringing iMessage to Android users. Apple responded by tweaking its iMessage service to break Beeper’s functionality, and the startup called it quits in December.
Apple argued in that case that Beeper had harmed users’ security—in fact, it did compromise iMessage’s end-to-end encryption by decrypting and then re-encrypting messages on a Beeper server, though Beeper had vowed to change that in future updates. Beeper cofounder Eric Migicovsky argued that Apple’s heavy-handed move to reduce Apple-to-Android texts to traditional text messaging was hardly a more secure alternative.
“It’s kind of crazy that we’re now in 2024 and there still isn’t an easy, encrypted, high-quality way for something as simple as a text between an iPhone and an Android,” Migicovsky told WIRED in January. “I think Apple reacted in a really awkward, weird way—arguing that Beeper Mini threatened the security and privacy of iMessage users, when in reality, the truth is the exact opposite.”
Even as Apple has faced accusations of hoarding iMessage’s security properties to the detriment of smartphone owners worldwide, it’s only continued to improve those features: In February it upgraded iMessage to use new cryptographic algorithms designed to be immune to quantum codebreaking, and last October it added Contact Key Verification, a feature designed to prevent man-in-the-middle attacks that spoof intended contacts to intercept messages. Perhaps more importantly, Apple has said it will adopt the RCS standard to allow for improvements in messaging with Android users—although the company did not say whether those improvements would include end-to-end encryption.
Even as it makes those advances in iMessage’s security and privacy, Apple has rarely touted them in its public-facing marketing and has actively contributed to public, nonproprietary security work, points out Nadim Kobeissi, a cryptographer focused on secure messaging and the director of the cryptography consultancy Symbolic Software. He argues that this deflates any argument the DOJ might make that Apple has intentionally hoarded security features as a competitive advantage.
Instead, Kobeissi says that the security gap is a byproduct of Apple’s attempt to preserve the exclusivity of more visibly integrated and fun social features—reactions, FaceTime, coveted blue bubbles—while also conscientiously maintaining the product’s security. “It’s not a security question, it’s a societal question about the openness of communication platforms,” Kobeissi says. He points out that people who are aware of iMessage’s security advantages are also aware of other end-to-end encrypted messaging options like WhatsApp and Signal.
Apple critics like Doctorow and Migicovsky both point out, however, that iMessage is deeply integrated in Apple devices as their default messaging app, and thus will always be used on those devices far more often than Signal or WhatsApp. “Defaults matter,” Doctorow says, pointing out that Google pays Apple a fortune for the right to make Google the default search engine on its devices. “Apple makes [nearly] $20 billion a year off the proposition that a click away is a click too far.”
Even if Apple’s security features are good for its customers, “Apple’s size does matter” because it gives the company a degree of power over the broader market that smaller technology companies don’t have, no matter how aggressively those companies market their security and privacy features to beat out the competition, says CSIS’s Chin-Rothmann. The question, ultimately, is whether people should be relying on technology giants like Apple to set the standards for privacy and security at all. She points out that comprehensive data privacy legislation that ensures minimum security requirements for software might well be a better approach than depending entirely on the private sector to decide who gets privacy—and who is denied it.
“If Congress or the US government really wants to increase privacy and security,” Chin-Rothmann says, “we really should take these decisions out of the hands of massive technology companies like Apple.”