Tracked by hidden tags? Apple and Google unite to propose safety and security standards…


Apple’s AirTag system has famously been subjected to firmware hacking, used as a free low-bandwidth community radio network, and involved in a stalking incident that tragically ended in a murder charge.

To be fair to Apple, the company has introduced various tricks and techniques to make AirTags harder for stalkers and criminals to exploit, given how easily the devices can be hidden in luggage, stuffed into the upholstery of a car, or squeezed into the gap under a bicycle saddle.

But with lots of similar devices already on the market, and Google said to be working on a product of its own to take advantage of the zillions of Bluetooth-enabled phones that are out and about running Google Android…

…surely there should be safety and security standards that are encouraged, or perhaps even demanded and expected, throughout the “smart tag” market?

Apple and Google seem to think so, because experts from both companies have been working together to propose an internet standard they’re calling Detecting Unwanted Location Trackers:

Internet standards, to this day, retain their original, conciliatory designation Request For Comments, almost universally written simply as RFC. But when you want to ask for comments on a proposed new standard, it would be unwieldy to call it an RFCRFC, so they’re just known as Internet Drafts, or I-Ds, and have document names and URL slugs starting draft-. Each draft is typically published with a six-month commentary period, after which it may be abandoned, modified and re-proposed, or accepted into the fold and given a new, unique number in the RFC sequence, which is currently up to RFC 9411 [2023-05-03T19:47:00Z].

How big is too big to conceal?

The document introduces the term UT, short for Unwanted Tracking, and the authors hope that well-designed and correctly implemented tracking devices will take steps to make UT hard (though we suspect this risk can never be eliminated entirely).

Apple and Google’s proposal starts by splitting trackers into exactly two classes: small ones, and large ones.

Large devices are considered “easily discoverable”, which means that they’re hard to hide, and although they are urged to implement UT protection, they’re not obliged to do so.

Small devices, on the other hand, are considered easily concealed, and the proposal demands that they provide at least a basic level of UT protection.

In case you’re wondering, the authors tried to nail down the difference between small and large, and their attempt to do so reveals just how hard it can be to create unarguable, universal definitions of this sort:

 Accessories are considered easily discoverable 
 if they meet one of the following criteria:
    - The item is larger than 30 cm in at least one dimension.
    - The item is larger than 18 cm x 13 cm in two of its dimensions.
    - The item is larger than 250 cm^3 in three-dimensional space.

While we all probably agree that an AirTag is small and easily concealed, this definition also, probably very reasonably, considers our iPhone “small”, along with the Garmin we use on our bicycle, and our GoPro camera.

Our MacBook Pro, however, comes in as “large” on all three counts: it’s more than 30cm wide; it’s more than 13cm deep; and it’s well over 250cc in volume (or three-dimensional space, as the document puts it, which presumably includes the extra overall “straight line” volume added by bits that stick out).

You can try measuring some of your own portable electronic devices; you might be pleasantly surprised how chunky and apparently obvious a product can be, and yet still be considered small and “easily concealed” by the specifications.
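
If you like to think in code, here’s a quick Python sketch (our own back-of-the-envelope interpretation of the three size tests above, with a made-up function name, not anything taken from the draft itself):

    # Hypothetical helper, based on our reading of the draft's size criteria.
    def easily_discoverable(l_cm: float, w_cm: float, h_cm: float) -> bool:
        """Return True if an accessory counts as easily discoverable (i.e. large)."""
        dims = sorted([l_cm, w_cm, h_cm], reverse=True)
        if dims[0] > 30:                       # larger than 30 cm in at least one dimension
            return True
        if dims[0] > 18 and dims[1] > 13:      # larger than 18 cm x 13 cm in two dimensions
            return True
        if dims[0] * dims[1] * dims[2] > 250:  # larger than 250 cm^3 overall
            return True
        return False

    print(easily_discoverable(35.6, 24.8, 1.7))  # roughly MacBook Pro sized: True ("large")
    print(easily_discoverable(3.2, 3.2, 0.8))    # roughly AirTag sized: False ("small")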

To bleat, or not to bleat?

Loosely speaking, the proposed standards expect that all concealable devices:

  • MUST NOT BROADCAST their identity and trackability when they know they’re near their registered owner. This helps ensure that a device that’s officially with you can’t easily be used by someone else to keep track of your every twist and turn as they follow you around in person.
  • MUST BROADCAST a “Hey, I’m a trackable Bluetooth thingy” notification every 0.5 to 2 seconds when they know they’re away from their owner. This helps to ensure that you have a way of spotting that someone else has slipped a tag into your bag in order to follow you around.

As you can see, these devices present two very different security risks: one where the tag shouldn’t bleat about itself when it’s with you and is supposed to be there; and the other where the tag needs to bleat about itself because it’s sticking with you suspiciously even though it’s not yours.
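
In very loose Python (our own sketch, with placeholder function names rather than any official API from the draft), the two advertising modes might look something like this:

    import random
    import time

    def is_near_owner() -> bool:
        # Placeholder: a real tag would decide this from recent contact with its owner's device.
        return False

    def broadcast_separated_advert() -> None:
        # Placeholder for the Bluetooth LE advertisement a real tag would emit when separated.
        print("ADV: unaccompanied trackable tag here")

    def advertising_loop(iterations: int = 5) -> None:
        for _ in range(iterations):
            if is_near_owner():
                time.sleep(2.0)   # near-owner mode: keep quiet about trackability
                continue
            broadcast_separated_advert()
            time.sleep(random.uniform(0.5, 2.0))  # separated mode: advertise every 0.5 to 2 seconds

    advertising_loop()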

Tags must switch from “I am keeping quiet because I am with my real owner” mode into “Here I am, in case anyone is suspicious of me” mode after no more than 30 minutes of not synching with their owner.

Likewise, they must switch back into “I’m holding my peace” mode after no more than 30 minutes of realising they’re back in safe hands.

When with you, they need to change their machine identifier (known in the jargon as their MAC address, short for media access control) every 15 minutes at most, so they don’t give you away for too long.

But they must hang onto their MAC address for 24 hours at a time when they’re parted from you, so they give everyone else plenty of chance to notice that the same unaccompanied tag keeps showing up nearby.
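
Tabulated in code, those timing rules come out something like this (again, our own naming and our own sketch of the numbers described above, not an official implementation):

    # Timing constants as we read them from the draft (hypothetical names).
    MODE_SWITCH_MAX_S        = 30 * 60       # switch modes within 30 minutes either way
    MAC_ROTATION_NEAR_S      = 15 * 60       # rotate the MAC address at least every 15 minutes near the owner
    MAC_ROTATION_SEPARATED_S = 24 * 60 * 60  # keep the same MAC address for 24 hours when separated

    def mac_rotation_interval(near_owner: bool) -> int:
        """How long, in seconds, the current randomised MAC address may be kept."""
        return MAC_ROTATION_NEAR_S if near_owner else MAC_ROTATION_SEPARATED_S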

And if you do spot any unwanted tags in your vicinity, they must respond to any “reveal yourself” probes you send them by bleeping 10 times, and vibrating or flashing if they can, at a sound level laid down very specifically:

The [bleeper] MUST emit a sound with minimum 60 Phon peak loudness as defined by ISO 532-1:2017. The loudness MUST be measured in free acoustic space substantially free of obstacles that would affect the pressure measurement. The loudness MUST be measured by a calibrated (to the Pascal) free field microphone 25 cm from the accessory suspended in free space.
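
Modelled naively in Python (our own toy code; the 60-phon loudness calibration is a hardware matter and very much not included), the required response boils down to something like this:

    # Toy sketch of the mandated response to a "reveal yourself" probe.
    def respond_to_reveal_probe(can_vibrate: bool = False, can_flash: bool = False) -> None:
        for i in range(10):            # ten audible alerts
            print(f"BEEP {i + 1}")     # stand-in for driving the sounder at 60 phon or more
            if can_vibrate:
                print("  (vibrate)")
            if can_flash:
                print("  (flash)")

    respond_to_reveal_probe(can_vibrate=True)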

To track, or not to track?

Very importantly, any tag you find must not only provide a way for you to stop it calling home with its location to its owner, but also provide clear instructions on how to do this:

The accessory SHALL have a way to [be] disabled such that its future locations cannot be seen by its owner. Disablement SHALL be done via some physical action (e.g., button press, gesture, removal of battery, etc.).

The accessory manufacturer SHALL provide both a text description of how to disable the accessory as well as a visual depiction (e.g. image, diagram, animation, etc.) that MUST be available when the platform is online and OPTIONALLY when offline.
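
In data terms, a vendor might ship something like this alongside each product (a hypothetical record of our own devising; the draft mandates the content, not any particular format):

    # Hypothetical record of the disablement instructions a vendor must provide.
    disable_instructions = {
        "text": "Press and hold the button for 10 seconds, then remove the battery.",  # made-up example
        "visual": "https://example.com/tags/model-x/disable.gif",  # image, diagram or animation
        "available_offline": True,  # MUST be available online; offline availability is optional
    }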

In other words, when you think you’ve busted someone who’s trying to track you, you need a way to throw your stalker off the scent, while also being able to retain the suspicious device safely as evidence, instead of resorting to smashing it or flinging it in a lake to keep it quiet.

If you wanted to, assuming that the device wasn’t jury-rigged to turn tracking back on just when you thought you’d turned it off, we guess you could even go off-track somewhere before turning it off, then backtrack to your original location and carry on from there, thus setting a false trail.

What to do?

If you’re interested in mobile device security; if you’re into privacy; if you’re worried about how tracking devices could be abused…

…we recommend reading through these proposed standards.

Although some of the specifications dig into technical details such as how to encrypt serial number data, others are as much social and cultural as they are technical, such as when, how and for whom such encrypted data should be unscrambled.

There are also aspects of the proposal you might not agree with, such as the specification that “obfuscated owner information” must be emitted by the device on demand.

For example, the proposal insists that this “obfuscated” data needs to include at least a partial phone number (the last four digits), or a hollowed-out email address (where tips@sophos.com would become t***@s*****.com, which obfuscates older, shorter email addresses much less usefully than newer, longer ones).
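
Here’s a quick Python rendering of that style of masking (ours, reverse-engineered from the example above rather than lifted from the draft):

    # Our own illustration of the masking style described above.
    def obfuscate_phone(phone: str) -> str:
        """Keep only the last four digits, e.g. '+1 555 010 4242' -> '****4242'."""
        digits = [c for c in phone if c.isdigit()]
        return "****" + "".join(digits[-4:])

    def obfuscate_email(email: str) -> str:
        """Keep the first letter of each part, e.g. 'tips@sophos.com' -> 't***@s*****.com'."""
        local, _, domain = email.partition("@")
        name, _, tld = domain.rpartition(".")
        return f"{local[:1]}{'*' * (len(local) - 1)}@{name[:1]}{'*' * (len(name) - 1)}.{tld}"

    print(obfuscate_email("tips@sophos.com"))   # t***@s*****.com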

The current draft only came out yesterday [2023-05-02], so there are still six months open for comment and feedback…

