We've just seen some amazing reports from the Washington Post about a few of the ways the NSA tracks people around the Internet and the physical world. These newly-revealed techniques hijacked personal information that was being transmitted for some commercial purpose, converting it into a tool for surveillance. One technique involved web cookies, while another involved mobile apps disclosing their location to location-based services.
One interesting thing about this is that the tracking that results is totally independent of the privacy practices that apply to the intended use of this data. Google has its own privacy policies about how it uses PREF cookies, and app developers have privacy policies for their location-based services, but this spying bypassed both of them and simply used this information as grist for the surveillance mill. So the level of intrusion resulting from this spying didn't depend on what the information was meant for, but on how it was repurposed.
Together, these programs show us that transmitting any unique information unencrypted forms the technological basis for a location-tracking technology, existing or potential. Every kind of identifiable information that gets transmitted in the clear over a radio or a public network is either an already-deployed NSA location-tracking program, or an exciting opportunity for some NSA agent to propose a new program to monitor devices’ whereabouts. And it’s not just NSA: we’ve already seen commercial use of phones’ wifi and Bluetooth addresses (which are visible to anybody nearby and which normally don’t change over the lifetime of the device) to monitor shoppers.
There’s a long-term privacy solution for location tracking that can address all of these tracking methods and others. Every transmission of personally-identifiable information, over the air or over a public network, must be encrypted. If addresses can’t be encrypted, they should be random and change frequently. And personally-identifiable information must be understood to include persistent identifiers for people or devices, even if those identifiers don’t directly include a person’s name.(1) For example, a unique cookie can be used to recognize a device, so it should only ever be transmitted encrypted. Even if cookies were out of the picture, there's a lot else that's unique in browser behavior that can potentially be used to track an Internet user.
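One concrete version of the "random and frequently changing addresses" idea is wifi MAC address randomization: instead of broadcasting its permanent hardware address, a device periodically makes up a new one. Here's a minimal sketch in Python of how such an address could be generated; the `random_mac` helper is illustrative, not any platform's actual API.

```python
import secrets

def random_mac() -> str:
    """Return a random, locally administered, unicast MAC address."""
    octets = bytearray(secrets.token_bytes(6))
    # In the first octet, set the locally-administered bit (0x02) so the
    # address can't collide with a manufacturer-assigned one, and clear
    # the multicast bit (0x01) so it remains a valid unicast address.
    octets[0] = (octets[0] | 0x02) & 0xFE
    return ":".join(f"{b:02x}" for b in octets)

print(random_mac())  # a fresh address every call
```

A device that generated a fresh address like this for every network scan would no longer present a stable identifier to every nearby radio receiver.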
The Post’s earlier report on tracking shows that fixing technology to prevent the tracking of device locations will be a challenging task. The paper's recent story on a program called FASCIA referred to dozens of potentially unique and distinctive things about a cell phone that the NSA might be able to observe directly on the air or by tapping into cell phone carriers’ infrastructure. That list shows many different ways in which the existing cell phone infrastructure inherently exposes unique attributes of devices, and hence inherently permits location tracking (either by carriers or by spies monitoring radio signals). We've already noticed that location tracking is an inherent part of the way today's cell phone infrastructure is put together—we can't make some simple technical change to stop mobile carriers (or governments) from being able to know where particular mobile devices are. Rather, fixing this at a technical level will require re-engineering our cell phone networks, so it might take a little longer than turning on HTTPS for tracking cookies and mobile location check-ins.
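On the web side, the mechanism for keeping a tracking cookie off the wire in cleartext already exists: the Secure attribute, which tells the browser to send the cookie only over encrypted HTTPS connections. A short sketch using Python's standard-library `http.cookies` module (the cookie name and value here are illustrative, loosely echoing the PREF cookie from the story):

```python
from http.cookies import SimpleCookie

# Build a Set-Cookie header for a hypothetical unique identifier.
cookie = SimpleCookie()
cookie["pref"] = "unique-device-id-12345"
cookie["pref"]["secure"] = True    # never transmit over cleartext HTTP
cookie["pref"]["httponly"] = True  # hide the value from page JavaScript

header = cookie.output(header="Set-Cookie:")
print(header)
```

A cookie set this way still uniquely identifies the browser to the site that issued it, but a passive eavesdropper on the network path can no longer harvest it as a tracking beacon.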
On whatever timescale we can make these necessary changes, technology developers should commit to the principle that no unique user data is ever exposed unencrypted.
(1) Some privacy engineers use the term uniqueness to refer to unique properties of a device or system that aren’t already associated with a particular person, or that might not ever become associated with a person in practice. For example, if a cell phone hasn’t been purchased yet, or if a transit fare collection system had a unique ID but nobody recorded or noticed who ended up using it for travel, some people would say these systems exhibit uniqueness but not personal identifiability. But a device can make the one-way trip from uniqueness into an association with a person at any moment, so careful privacy engineering would treat any long-term persistent uniqueness as presumptively identifying.