Steve Jobs did not become rich and famous by doing things by halves.
Nearly thirteen years ago, Jobs introduced the iPhone® with the now-famous line: “Every once in a while, a revolutionary product comes along that changes everything”.
The smartphone did change almost everything. However, it did not change our thinking about how to control the data ecosystems that this device created and enabled. That job is only half done.
Over the last decade we allowed the smartphone to quietly embed itself in each of our bodies. This humble internet access device evolved into an Apple® with real bite: the clearest window into our souls and innermost demons. As data collection and retention became cheaper and we moved to cloud services, we fed this new bodily appendage more data and loaded and linked more applications. We took this appendage to bed with us, to monitor the quality of our sleep and, inadvertently or not, our sexual activity. We now expect our appendage to converse with us, in the reassuring tones of Siri, Alexa and other nice people. We rely upon this appendage for our memory and our entertainment, and to tell us where we are, where everyone else is, and what everyone is doing. Public modes of transport have fallen eerily silent as we don earphones and retreat into our separate electronic lives. Those separate electronic lives are neither little nor private: these connected lives are usually more shared and more public than our non-electronic lives in public – just not public to the other members of the public physically nearby.
Year on year over the last decade we further clicked open this digital window to our innermost selves. Data captured through our smartphones is now the richest, most enduring record of how, why, when and where we act, go, think, see and feel. Analysis of that data reveals our likes and dislikes, our wellness and our illnesses, what we think about buying and then don’t, what we buy and why we then bought it, and how we interact and with whom.
Each smartphone is now the authoritative record of its host human’s life: of interests, likes and preferences, of geo-location, activity state (including physical activity – such as committing a crime while purportedly asleep), health status and medical condition, all conveniently captured 24x7x365 (subject only to temporary enforced separation for recharging).
Over that same decade, mobile phone data ceased to be a secret shared only between each of us and our mobile service provider, closely regulated by communications privacy law. The smartphone evolved to become an applications delivery platform that captures diverse and valuable data and distributes that data across multiple players operating in ever more complex, multi-party supply-side application data ecosystems.
And we each love our new appendage. All simply enabled through countless, mindless click-throughs: ‘I agree’.
With the linking of diverse applications and cloud services, it is now often unclear who is collecting what data relating to us, and for what purpose. Sometimes it is not even clear what the primary purpose for collection and use of data is, let alone any secondary uses and disclosures of data that may be occurring. Many mobile applications collect geo-location data by default, without any disclosed reason for doing so. Today our mobile service provider usually knows less about what we are doing, and why we are doing it, than a myriad of mobile app providers, the mobile app platform provider and their contractors, many of whom we cannot even name.
The digital exhaust of our everyday lives has become a valuable commodity. If our physical trash were this valuable, we might not have a global problem disposing of trash.
What could possibly go wrong? Should we worry? After all, this is ‘our data’, and there are data privacy laws that address just this. Isn’t this what GDPR has fixed? Can’t similar national extensions to data privacy laws elsewhere bring ‘our data’ back under our control?
Unfortunately, it isn’t that easy.
Some of the global focus on data privacy laws starts from an incorrect premise: that notching up national privacy laws towards the misdescribed GDPR ‘gold standard’ will address what is, in reality, a much broader set of concerns.
GDPR-like regulation does many laudable things, most notably by focussing upon the quality of consent of affected individuals, and taking further tentative steps towards regulation of automated decision-making that significantly affects humans. ‘Legitimate interests’ eliminates some of the clutter and noise around privacy disclosures, enabling data controllers to avoid stating the obvious. Requiring the unambiguous, express, fully informed consent of affected individuals to the collection, use and sharing of data relating to them increases transparency and, through the antiseptic of sunlight, creates accountability of data controllers.
However, in our smartphone world, notice and consent-based data privacy models are under severe strain, and are increasingly recognised as such by users. Almost all users don’t read terms, and take the Faustian bargain of giving access to data about them in exchange for free or subsidised services. Almost all users click straight through while hoping and praying for the best.
Notice and consent are form over substance in a world awash with disclosures clamouring for our attention. This world encourages each of us to absent-mindedly click through ‘I agree’, creating an illusion of consent. Through this consent-washing, users become complicit in whatever it is that the service provider proposes to do with data about them. Insights from Amos Tversky, Daniel Kahneman, Richard Thaler, Cass Sunstein and other giants of behavioural analysis have changed many disciplines. However, the insights of behavioural analysis don’t appear to have changed how privacy professionals think other humans think, or don’t think, or don’t think rationally. Perhaps fortunately for the rest of humanity, most of those other humans don’t think the same way as many privacy professionals. Behavioural insights tell us that notice and the illusion of consent allow service providers to game the obtaining of user consent. In many cases, notching up consent requirements causes service providers to require users to make choices that users do not understand, or that users will not make carefully or well.
Regulatory requirements for consent also assist some application platforms to justify denying competitors an opportunity to share in the benefits of their platform, and even privacy-safeguarded sharing of platform data. Intriguingly, my smartphone does not remind me of the primary service providers collecting geo-location data about me, or of the many secondary uses that these service providers make of this data (yes, I had clicked ‘I agree’). However, my smartphone gleefully reminds me, frequently, of third-party applications getting and using geo-location data about me.
Notice and consent enable many service providers to assert that, because the user exercises control over personal data relating to the user, it is the responsibility of the user, not the service provider, to decide what uses and sharing of data are fair and reasonable, and what uses and sharing are not.
Leaving aside the understandable desire for transparency as to data uses and sharing, why should a consumer need to make a judgment as to whether particular uses and collection of data are fair, proportionate and reasonable? Regulators don’t require consumers to take responsibility for determining whether a consumer product is fit for purpose and safe when used for the product’s stated purpose, and unsuitable or unsafe when used for other purposes. Why should data driven services be any different?
A smartphone user may not want transparency and responsibility forced upon them, such that the user must make a sensible decision (or just click through). Instead, a smartphone user may reasonably expect accountability of the data controller, to ensure that the data controller responsibly and reliably does what is fair, proportionate, and not adverse to the user’s legitimate interests.
Concern as to what is fair, proportionate, and not adverse to the user’s legitimate interests is about to become much greater. Many businesses over the last few years have been investing in data cleansing, normalisation and transformation, to better enable linkage of multiple related data sets and to facilitate data sharing. Smartphones, wellness devices, fitness trackers and numerous other IoT devices expand the range and depth of data that is collected about our interests, preferences and activities. Standardisation of data formats, and improvements in multi-device linking of a unique user (whether or not that user is identifiable), are enabling integration of data across multiple devices, intermediated by the smartphone. Data is more readily available on and across data clouds, discoverable and linkable at low cost. The capabilities of data applications using data collected through smartphones and other IoT devices are rapidly expanding now that better quality data is more readily available to fuel those applications.
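To make that linkage concrete, consider a minimal sketch of how two data sets, captured independently on the same phone, can be normalised and joined on a shared pseudonymous key. Everything here – the app names, field names, identifiers and values – is invented for illustration; only the technique is real and commonplace:

```python
# Hypothetical sketch of data cleansing and linkage: two independently
# captured data sets are normalised and joined on a shared pseudonymous
# device key. All identifiers, field names and values are invented.

import hashlib

def normalise_key(raw_id: str) -> str:
    """Canonicalise a device identifier, then hash it, so that records
    from different sources line up without exposing the raw value."""
    canonical = raw_id.strip().lower().replace("-", "")
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Events from a (hypothetical) fitness app and a retail loyalty app,
# each recording the device identifier in its own format.
fitness_events = [{"device": "AB-12-CD", "resting_heart_rate": 58}]
purchase_events = [{"device": "ab12cd", "category": "vitamins"}]

# Merge both feeds into one profile per normalised device key.
profiles: dict[str, dict] = {}
for event in fitness_events + purchase_events:
    key = normalise_key(event["device"])
    profile = profiles.setdefault(key, {})
    profile.update({k: v for k, v in event.items() if k != "device"})

# One linked profile: a health signal and a purchasing signal, joined
# without any name, email address or account identifier.
for key, profile in profiles.items():
    print(key[:8], profile)
```

Once data sets share a common key, every further feed – location, sleep, spending – joins the same profile at negligible cost, whether or not anyone knows whose profile it is.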
We are now living through an explosion in the capabilities of applications to make valuable and sensitive inferences about individuals, and to apply those inferences in circumstances where the relevant individuals may not be identifiable. Where the inferences are drawn from non-identifying data, and used to shape how an individual is treated without knowing who that individual is, the use of that data will sit outside the operation of data privacy laws in many countries. Yet inferences from this data can be used to differentiate between affected individuals, whether or not identified or identifiable, in whether a product or service is offered to them and as to the price and non-price terms at which a product or service is offered.
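A minimal sketch shows how such differentiation can operate without identification. The segments, thresholds and price multipliers below are invented for illustration, not drawn from any real provider:

```python
# Hypothetical sketch: differential pricing driven by inferences about a
# device, without ever identifying the person behind it. Segments,
# thresholds and multipliers are invented for illustration.

from dataclasses import dataclass

@dataclass
class DeviceSignals:
    device_id: str                # pseudonymous ID - not a name or account
    night_activity_ratio: float   # share of app events between 11pm and 5am
    fitness_app_installed: bool

def infer_segment(signals: DeviceSignals) -> str:
    """Assign a marketing segment from behavioural signals alone.

    Nothing here identifies an individual, yet the output still changes
    how that individual is treated."""
    if signals.fitness_app_installed and signals.night_activity_ratio < 0.1:
        return "low-risk-wellness"
    if signals.night_activity_ratio > 0.4:
        return "irregular-hours"
    return "default"

# Price multipliers per inferred segment (illustrative numbers only).
SEGMENT_PRICE_MULTIPLIER = {
    "low-risk-wellness": 0.90,  # discount offered
    "irregular-hours": 1.25,    # surcharge applied
    "default": 1.00,
}

def quote_price(base_price: float, signals: DeviceSignals) -> float:
    return round(base_price * SEGMENT_PRICE_MULTIPLIER[infer_segment(signals)], 2)

device = DeviceSignals(device_id="a3f9c1", night_activity_ratio=0.55,
                       fitness_app_installed=False)
print(quote_price(100.00, device))  # 125.0 - priced up, never identified
```

Because nothing in the input identifies a person, processing of this kind may fall outside many countries’ data privacy laws, even though the individual behind the device pays the surcharge.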
As algorithms are refined through experience, with more data and through machine learning, these inferences become both more likely to be correct and less explicable (or, to use the newer term, explainable). This explosion in devices, data points and cloud services is occurring at the same time as a continuing decline in the trust of affected individuals and other citizens in how many organisations collect, curate and share digital data relating to individuals.
Should we trust our newest bodily appendage? Maybe, but we need to become much smarter in thinking about smartphones, and not leave the thinking to our smartphones or to platform and app providers. Our DNA ensures that, without our conscious effort, all our other bodily appendages act to the benefit of the corpus, our bodies. We cannot expect our new bodily appendage to apply similar principles of social beneficence. The job of controlling the smartphone will only be fully done when we can be confident that the controllers of the myriad data ecosystems fuelled by our smartphones responsibly and reliably collect, use and share ‘our data’ only as is demonstrably fair, proportionate, and not adverse to the user’s legitimate interests. Notice and click-through consent won’t get the job done properly.
Peter Leonard is a data, content and technology business consultant and lawyer advising data-driven business and government agencies. Peter is principal of Data Synergies and a Professor of Practice at UNSW Business School (IT Systems and Management, and Business and Taxation Law). Peter chairs the IoTAA’s Data Access, Use and Privacy work stream, the Law Society of New South Wales’ Privacy and Data Committee and the Australian Computer Society’s AI Ethics Technical Committee. He serves on a number of corporate and advisory boards, including of the NSW Data Analytics Centre.