Controlling, lying and blocking – could these activities enable an individual to win the privacy arms race against the data collection, surveillance, behavioural tracking and profiling abilities of search engines, marketers, social networking sites and others?
When we think about an arms race, we might imagine two sides evenly matched, both equally able to equip themselves with weapons and defences. But when it comes to individuals versus data collectors, the position is considerably unbalanced – the equivalent of a cavalry charge against a tank division.
It is not, however, as if the individual is without protections. Let’s take consent, a key principle of European data protection law. Consent based on privacy policies is rather discredited as an effective means of enforcing privacy rights over data held by commercial third parties. If I might quote Lilian Edwards, ‘consent is no guarantee of protection on Facebook and its like, because the consent that is given by users is non-negotiable, non-informed, pressurised and illusory.’[i] So what about regulatory enforcement? In the UK, it could be described as mostly polite; in the rest of Europe, sometimes a little more robust. The FTC in the US has had some notable successes with enforcement action based on unfair practices, with Jessica Rich, Director of the FTC’s Bureau of Consumer Protection, advocating privacy as part of the ‘bottom line.’[ii] It remains to be seen whether market pressures will drive good-faith changes in privacy practices – alternative subscription-based, advertising-free business models have so far failed to make much headway in terms of market share. The so-called ‘right to be forgotten’ has been much debated, and I would question how much the Google Spain decision[iii] adds to the individual’s armoury, since the original publication remains unaffected. And as for personal data anonymisation, that could be the subject of a long debate in itself!
What can individuals do if they want to take matters into their own hands, and become a ‘privacy vigilante’?[iv] Here are three possibilities: first, ‘personal data stores’ (PDS) or ‘personal information management services’ (PIMS) are said by their promoters to enable individuals to take back control over their personal data and manage their relationship with suppliers. Pentland from MIT describes a PDS as ‘a combination of a computer network that keeps track of user permissions for each piece of personal data, and a legal contract that specifies both what can and can’t be done with the data, and what happens if there is a violation of the permissions.’[v]
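Pentland’s description has two halves: a technical register of per-datum permissions, and a legal contract. The contract is beyond code, but the register half can be sketched. Below is a minimal, hypothetical Python sketch – the class names, the ‘location’ category and the ‘acme-ads’ identifier are all invented for illustration – showing how a PDS might record and check permissions for each piece of personal data.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Permission:
    holder: str    # who may use the data, e.g. a named advertising network
    purpose: str   # the permitted purpose, e.g. "billing"

@dataclass
class PersonalDataStore:
    # maps each category of personal data to the permissions granted over it
    permissions: Dict[str, List[Permission]] = field(default_factory=dict)

    def grant(self, category: str, perm: Permission) -> None:
        self.permissions.setdefault(category, []).append(perm)

    def is_permitted(self, category: str, holder: str, purpose: str) -> bool:
        return any(p.holder == holder and p.purpose == purpose
                   for p in self.permissions.get(category, []))

# hypothetical usage: grant one narrow permission, then test two requests
pds = PersonalDataStore()
pds.grant("location", Permission(holder="acme-ads", purpose="billing"))
print(pds.is_permitted("location", "acme-ads", "billing"))    # True
print(pds.is_permitted("location", "acme-ads", "profiling"))  # False - would be a violation
```

Even this toy version makes clear where the difficulty lies: the store can record and check permissions, but it cannot, by itself, make a third party obey the answer – that is the contractual, and ultimately legal, half of Pentland’s definition.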
Secondly, blocking. Systems could prevent the tagging of individuals by third parties and set privacy defaults at their most protective. Lifelogging technologies could prevent the display of any recognisable image unless that individual has given permission.[vi] Individuals could deploy a recently invented Google Glass detector, which impersonates the Wi-Fi network, sends a ‘deauthorisation’ command and cuts the headset’s internet connection.[vii]
Finally, obfuscation, by which technology is used to produce false or misleading data in an attempt, as Murray-Rust et al. put it, to ‘cloud’ the lens of the observer.[viii] It is the technological equivalent of what most of us will already have done online: missing off the first line of our address when we enter our details into an online form; subtly changing our birthday; accidentally-on-purpose giving an incorrect email address in exchange for a money-off voucher. A personal data store could, for instance, be used to add ‘chaff’ (multiple false data points scattered amongst the real ones) or to simulate real behaviour, such as going on holiday. Brunton & Nissenbaum describe obfuscation as a ‘viable and reasonable method of last-ditch privacy protection.’[ix] On the face of it, obfuscation may seem an attractive alternative, giving individuals a degree of control over how much ‘real’ information is released and some confidence that profiling activities will be hampered.
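To make the ‘chaff’ idea concrete, here is a minimal, hypothetical Python sketch – the function name and the latitude readings are invented for illustration. Genuine data points are hidden amongst randomly fabricated ones, so an observer cannot tell which readings are real.

```python
import random

def add_chaff(real_points, n_decoys, value_range):
    """Return the genuine readings hidden amongst fabricated decoys."""
    lo, hi = value_range
    decoys = [round(random.uniform(lo, hi), 2) for _ in range(n_decoys)]
    mixed = real_points + decoys
    random.shuffle(mixed)  # remove any ordering that would betray the real data
    return mixed

# hypothetical usage: five genuine latitude readings hidden amongst twenty decoys
real = [51.06, 51.07, 51.05, 51.08, 51.06]
print(add_chaff(real, n_decoys=20, value_range=(50.0, 52.0)))
```

The more plausible the decoys, the more effectively the observer’s lens is clouded; purely random chaff like this would be comparatively easy for a sophisticated profiler to filter out, which is why simulating realistic behaviour is the more ambitious variant.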
Are these methods ways for the individual to win the privacy arms race? As things stand, I have my doubts, although that is not to say that a legal and regulatory regime could not be created to support these methods. PDSs raise numerous questions about contract formation, incorporation, offers and counter-offers. Service providers would need to be prepared to change their business models fundamentally if PIMS are to fulfil their potential. In the short term, there appears to be little commercial incentive for them to do so.
In terms of blocking, systems could adopt protective measures, but they don’t, because they don’t have to. Google Glass blockers may well fall foul of computer misuse legislation if used by members of the public rather than the network owner: in the UK, there would be a risk of a s 3 offence under the Computer Misuse Act 1990 – an unauthorised act with intent to impair the operation of any computer. Haddadi et al. suggest the ‘continuous broadcast of a Do-Not-Track beacon from smart devices carried by individuals who prefer not to be subjected to image recognition by wearable cameras’, although the success of this would depend on regulatory enforcement and on whether device providers received and conformed to such requests.[x] It would be rather ironic, however, if one had to positively broadcast one’s presence to avoid image recognition.
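Haddadi et al. do not specify an implementation, and nothing below should be attributed to them: it is a toy Python sketch using UDP broadcast as a stand-in for whatever low-level radio advertisement (Bluetooth LE, say) a real device would use, with a payload format and port number invented for illustration.

```python
import socket
import time

# Hypothetical message format and port; a real beacon would use a radio
# advertisement rather than UDP, and its effect would depend entirely on
# camera makers choosing - or being required - to honour it.
DNT_PAYLOAD = b"DNT:1;no-image-recognition"
BROADCAST_ADDR = ("255.255.255.255", 50505)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)

while True:
    sock.sendto(DNT_PAYLOAD, BROADCAST_ADDR)  # continuously announce the preference
    time.sleep(5)                             # repeat every few seconds
```

As the sketch makes plain, the beacon only ever announces a preference; everything turns on whether wearable cameras receive and honour it – which is precisely the enforcement question raised above.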
As for obfuscation, or lying on the internet, Murray-Rust et al. distinguish between official data, where obfuscation may be a criminal offence, and other data that can be obfuscated ‘without legal consequence.’[xi] The distinction is unlikely to be so clear-cut, both on the civil side and on the criminal side (fraud and computer misuse spring to mind), and this is something that I will be writing about in the future.
I would like to finish with this question about privacy vigilantism: by continuing to shift responsibility onto the individual, are we letting society off the hook for finding better solutions to privacy concerns?[xii] I think we probably are. Finding better solutions will require even closer interaction between computer scientists, lawyers and policy-makers.
Marion Oswald is Senior Fellow and Head of the Centre for Information Rights at the University of Winchester: marion.oswald@winchester.ac.uk
The 2nd Winchester Conference on Trust, Risk, Information & the Law on 21 April 2015 will be exploring the theme of the privacy arms race.
[i] Lilian Edwards, ‘Privacy, Law, Code and Social Networking Sites’ in Ian Brown (ed), Research Handbook on Governance of the Internet (Edward Elgar, Cheltenham, 2013) 309–352, 324–328
[ii] Jessica Rich, Director, Bureau of Consumer Protection, Federal Trade Commission, ‘Beyond Cookies: Privacy Lessons for Online Advertising’, AdExchanger Industry Preview 2015, 21 January 2015, 4, http://www.ftc.gov/system/files/documents/public_statements/620061/150121beyondcookies.pdf
[iii] Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González (Case C-131/12), CJEU, 13 May 2014
[iv] Marion Oswald, ‘Seek, and Ye Shall Not Necessarily Find: The Google Spain Decision, the Surveillant on the Street and Privacy Vigilantism’ in K. O’Hara et al. (eds), Digital Enlightenment Yearbook 2014 (IOS Press, 2014) 99–115
[v] A. Pentland, Social Physics: How Good Ideas Spread – The Lessons from a New Science (The Penguin Press, New York, 2014)
[vi] C. Gurrin, R. Albatal, H. Joho and K. Ishii, ‘A Privacy by Design Approach to Lifelogging’ in K. O’Hara et al. (eds), Digital Enlightenment Yearbook 2014 (IOS Press, 2014) 49–73, 68
[vii] A. Greenberg, ‘Cut Off Glassholes’ Wi-Fi With This Google Glass Detector’, Wired, 3 June 2014, http://www.wired.com/2014/06/find-and-ban-glassholes-with-this-artists-google-glass-detector/
[viii] D. Murray-Rust, M. Van Kleek, L. Dragan and N. Shadbolt, ‘Social Palimpsests – Clouding the Lens of the Personal Panopticon’ in K. O’Hara et al. (eds), Digital Enlightenment Yearbook 2014 (IOS Press, 2014) 75–96, 76
[ix] Finn Brunton and Helen Nissenbaum, ‘Vernacular resistance to data collection and analysis: A political theory of obfuscation’, First Monday, Volume 16, Number 5, 2 May 2011, http://firstmonday.org/article/view/3493/2955
[x] H. Haddadi, A. Alomainy and I. Brown, ‘Quantified Self and the Privacy Challenge in Wearables’, Society for Computers & Law, 5 August 2014, http://www.scl.org/site.aspx?i=ed38111
[xi] Murray-Rust et al. (n viii) 90
[xii] Brunton and Nissenbaum (n ix)