Apple’s AirTags are meant to help you effortlessly find your keys or track your luggage. But the same features that make them easy to deploy and inconspicuous in your daily life have also allowed them to be abused as a sinister tracking tool that domestic abusers and criminals can use to stalk their targets.
Over the past year, Apple has taken protective steps to notify iPhone and Android users if an AirTag is in their vicinity for a significant amount of time without the presence of its owner’s iPhone, which could indicate that an AirTag has been planted to secretly track their location. Apple hasn’t said exactly how long this time interval is, but to create the much-needed alert system, the company made some crucial changes to the location privacy design it originally developed a few years ago for its “Find My” device tracking feature, trading away some of that privacy in the process. Researchers from Johns Hopkins University and the University of California, San Diego, say, though, that they’ve developed a cryptographic scheme to bridge the gap—prioritizing detection of potentially malicious AirTags while also preserving maximum privacy for AirTag users.
The Find My system uses both public and private cryptographic keys to identify individual AirTags and manage their location tracking. But Apple developed a particularly thoughtful mechanism to regularly rotate the public device identifier—every 15 minutes, according to the researchers. This way, it would be much more difficult for someone to track your location over time using a Bluetooth scanner to follow the identifier around. This worked well for privately tracking the location of, say, your MacBook if it was lost or stolen, but the downside of constantly changing this identifier for AirTags was that it provided cover for the tiny devices to be deployed abusively.
In reaction to this conundrum, Apple revised the system so an AirTag’s public identifier now only rotates once every 24 hours if the AirTag is away from an iPhone or other Apple device that “owns” it. The idea is that this way other devices can detect potential stalking, but won’t be throwing up alerts all the time if you spend a weekend with a friend who has their iPhone and the AirTag on their keys in their pockets.
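Apple hasn’t published the exact mechanics behind these rotating identifiers. As a rough way to picture the design choice, though, you can think of the broadcast identifier as a value computed from a device secret plus the current time window, where the window stretches from minutes to a full day once the tag is separated from its owner. The Python sketch below is purely illustrative; the hash choice, function name, and window lengths are assumptions for the example, not Apple’s actual scheme.

```python
# Rough illustration only: a rotating identifier derived from a device secret
# and the current time window. NOT Apple's actual derivation.
import hashlib
import time

def public_identifier(device_secret: bytes, owner_nearby: bool, now: float) -> str:
    """Toy rotating identifier: hash of a device secret and the current window."""
    window = 15 * 60 if owner_nearby else 24 * 60 * 60  # seconds per rotation
    epoch = int(now // window)                           # which rotation period we're in
    return hashlib.sha256(device_secret + epoch.to_bytes(8, "big")).hexdigest()[:16]

secret = b"example-device-secret"
print(public_identifier(secret, owner_nearby=True, now=time.time()))   # changes every 15 minutes
print(public_identifier(secret, owner_nearby=False, now=time.time()))  # changes once a day
```

The longer the window, the longer any one identifier stays visible to someone scanning nearby, which is the trade-off the rest of this story turns on.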
In practice, though, the researchers say that these changes have created a situation where AirTags are broadcasting their location to anyone who’s checking within a 30- to 50-foot radius over the course of an entire day—enough time to track a person as they go about their life and get a sense of their movements.
“We had students walk through cities, walk through Times Square and Washington, DC, and lots and lots of people are broadcasting their locations,” says Johns Hopkins cryptographer Matt Green, who worked on the research with a group of colleagues, including Nadia Heninger and Abhishek Jain. “Hundreds of AirTags were not near the device they were registered to, and we’re assuming that most of those were not stalker AirTags.”
Apple has been working with companies like Google, Samsung, and Tile on a cross-industry effort to address the threat of tracking from products similar to AirTags. And for now, at least, the researchers say that the consortium seems to have adopted Apple’s approach of rotating the device public identifiers once every 24 hours. But the privacy trade-off inherent in this solution made the researchers curious about whether it would be possible to design a system that better balanced both privacy and safety.
“There’s this whole standards effort going on around how to do stalking resistance, which is really good. It means Apple and Google and the other companies are taking it seriously,” Green says. “The sad part is that Apple did the thing that everyone does when they’re painted into a corner. They have a big knob—one direction is privacy, one direction is the other thing (in this case, anti-stalking)—and they turned that knob away from privacy.”
The solution Green and his fellow researchers came up with leans on two established areas of cryptography that the group worked to implement in a streamlined and efficient way so the system could reasonably run in the background on mobile devices without being disruptive. The first element is “secret sharing,” which allows the creation of systems that can’t reveal anything about a “secret” unless enough separate puzzle pieces present themselves and come together. Then, if the conditions are right, the system can reconstruct the secret. In the case of AirTags, the “secret” is the true, static identity of the device underlying the public identifier that is frequently changing for privacy purposes.
Secret sharing was conceptually useful for the researchers to employ because they could develop a mechanism where a device like a smartphone would only be able to determine that it was being followed around by an AirTag with a constantly rotating public identifier if the system received enough of a certain type of ping over time. Then, suddenly, the suspicious AirTag’s anonymity would fall away and the system would be able to determine that it had been in close proximity for a concerning amount of time.
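The researchers’ construction isn’t spelled out here, but the threshold behavior they rely on is a standard idea, and a toy Shamir-style version shows the shape of it: the secret becomes the constant term of a random polynomial, each broadcast carries one point on that polynomial, and any set of points smaller than the threshold reveals nothing, while a threshold-sized set reconstructs the secret exactly. The sketch below is that generic technique with invented parameters, not the paper’s actual scheme.

```python
# Toy Shamir-style secret sharing: the "secret" (a tag's stable identity) is
# hidden as the constant term of a random polynomial; each broadcast is one
# point on it. Illustrative only, not the researchers' construction.
import random

PRIME = 2**61 - 1  # prime field for the arithmetic

def make_shares(secret, threshold, count):
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, k, PRIME) for k, c in enumerate(coeffs)) % PRIME)
            for x in range(1, count + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the constant term, i.e. the secret."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

tag_identity = 123456789
shares = make_shares(tag_identity, threshold=5, count=20)  # e.g., one share per broadcast
print(reconstruct(shares[:5]) == tag_identity)  # True: enough pings were collected
print(reconstruct(shares[:4]) == tag_identity)  # False: too few pings to unmask the tag
```

In the anti-stalking setting, the pings a phone collects would play the role of these shares; a tag the phone only passes once never crosses the threshold.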
Green notes, though, that a limitation of secret sharing algorithms is that they aren’t very good at sorting and parsing inputs if they’re being deluged by a lot of different puzzle pieces from all different puzzles—the exact scenario that would occur in the real world where AirTags and Find My devices are constantly encountering each other. With this in mind, the researchers employed a second concept known as “error correction coding,” which is specifically designed to sort signal from noise and preserve the durability of signals even if they acquire some errors or corruptions.
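One way to see why the two ideas combine naturally: Shamir-style shares are points on a polynomial, which is also the structure that Reed-Solomon error-correcting codes are built on, so a decoder can hunt for a polynomial that agrees with most of the points it has collected and treat the rest as noise, such as broadcasts from other people’s tags. The brute-force sketch below only exists to make that intuition concrete; it is not the researchers’ construction, and a practical decoder would never try subsets one by one.

```python
# Toy error-tolerant decoder: find the polynomial consistent with most of the
# received points and read the hidden value at x=0. Purely illustrative.
import random
from itertools import combinations

PRIME = 2**61 - 1  # prime field for the arithmetic

def lagrange_eval(points, x):
    """Evaluate the unique polynomial through `points` at `x` (mod PRIME)."""
    total = 0
    for xi, yi in points:
        num, den = 1, 1
        for xj, _ in points:
            if xj != xi:
                num = num * (x - xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

def decode_with_noise(received, threshold):
    """Brute force: try threshold-sized subsets, keep the polynomial that agrees
    with the most received points, and return its value at x=0."""
    best_secret, best_agreement = None, 0
    for subset in combinations(received, threshold):
        agreement = sum(1 for x, y in received if lagrange_eval(subset, x) == y)
        if agreement > best_agreement:
            best_agreement, best_secret = agreement, lagrange_eval(subset, 0)
    return best_secret

# One "tag": a random degree-(threshold-1) polynomial whose value at 0 is its identity.
secret, threshold = 987654321, 4
coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]

def share(x):
    """One broadcast from this tag: a point on its polynomial."""
    return sum(c * pow(x, k, PRIME) for k, c in enumerate(coeffs)) % PRIME

# Seven genuine broadcasts mixed with three unrelated "noise" points.
received = [(x, share(x)) for x in range(1, 8)]
received += [(x, random.randrange(PRIME)) for x in range(8, 11)]
print(decode_with_noise(received, threshold) == secret)  # True despite the noise
```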
“Secret sharing and error correction coding have a lot of overlap,” Green says. “The trick was to find a way to implement it all that would be fast, and where a phone would be able to reassemble all the puzzle pieces when needed while all of this is running quietly in the background.”
The researchers first published a paper about their findings in September and submitted it to Apple. More recently, they notified the industry consortium about the proposal. Apple did not respond to WIRED’s request for comment about the research and whether it is considering implementing the scheme.
Green says he hopes the company will eventually do something with the work. And he adds that the project is an important reminder of the real-world impacts theoretical cryptography can have.
“What I love about this problem is it seems like there are two competing requirements that can’t be reconciled,” he says. “But in cryptography, we can get full privacy and then, magically, the puzzle pieces click into place, or a ‘chemical reaction’ happens, and we phase-transition to a point where suddenly it’s obvious that this is a stalker, not just a benign AirTag. It’s very powerful to be able to go between those two moments.”