As consumers move ever more boldly into the digital realm, it has become increasingly clear that their personal data is of considerable value to a range of stakeholders. Whether through the heart rates monitored by their watches, the geolocation data shared when they check into particular restaurants, the films they choose to watch on streaming services or even the products they browse when shopping online, companies potentially have access to a wealth of data about who consumers are and what they prefer.
Big data has come to the fore over the last decade as a hugely valuable resource, one that offers far more granular insights to the party using it than was previously possible; in many cases, that party is a company seeking to procure large datasets to gain as deep an understanding as possible of consumer behaviours and preferences. “Data is the new oil,” the British mathematician and data scientist Clive Humby famously declared back in 2006. “It’s valuable, but if unrefined it cannot really be used. It has to be changed into gas, plastic, chemicals…to create a valuable entity that drives profitable activity; so must data be broken down, analysed for it to have value.”
While this refining process can reveal fascinating insights and ultimately provide a competitive edge to those who use the end product effectively, much of the data being procured is deeply personal. If it is not used as intended, or if it falls into the wrong hands, it can cause significant damage. “Today, big data is just as disruptive as the technology that enables its collection and availability, except it’s all but traded as a commodity. That has both positive and negative consequences,” explained Scott Schlesinger, senior vice president and head of business information management for Capgemini. “It’s great to get an offer for something you really want that saves you money or gives you options. But, it’s not so great when you start to feel like your every movement is being tracked, shared and sold at will.”
The way companies tailor the products and services they sell to different customers is a good case in point. By agreeing to disclose his or her favourite film genres to an online media company, for example, a consumer can expect to receive future movie recommendations better tailored to those preferences, which will of course prove useful to the consumer. Or by providing more data pertaining to his or her interests, the consumer may begin to see ads aligned with those interests. But given the evidence of recent years showing exactly how that data is being used, there are genuine concerns about whether companies adequately protect the privacy of consumers when accessing and acting upon consumer data.
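To make that exchange concrete, the sketch below shows one very simple way such genre-based tailoring might work. The catalogue, titles and scoring rule are purely hypothetical and are not drawn from any particular company's recommendation system; real personalisation engines are far more elaborate.

```typescript
// Hypothetical, minimal sketch of genre-based recommendations:
// a consumer discloses preferred genres, and the service ranks
// its catalogue by how well each title matches them.

interface Title {
  name: string;
  genres: string[];
}

// Invented catalogue and disclosed preferences, for illustration only.
const catalogue: Title[] = [
  { name: "Night Train", genres: ["thriller", "crime"] },
  { name: "Second Spring", genres: ["romance", "drama"] },
  { name: "Deep Field", genres: ["sci-fi", "thriller"] },
];

const disclosedGenres = ["thriller", "sci-fi"];

// Score each title by the number of genres it shares with the
// consumer's disclosed preferences, then return the best matches first.
function recommend(titles: Title[], preferences: string[]): Title[] {
  return titles
    .map((t) => ({
      title: t,
      score: t.genres.filter((g) => preferences.includes(g)).length,
    }))
    .filter((r) => r.score > 0)
    .sort((a, b) => b.score - a.score)
    .map((r) => r.title);
}

console.log(recommend(catalogue, disclosedGenres).map((t) => t.name));
// ["Deep Field", "Night Train"]
```

Even in this toy form, the trade is visible: the more preference data the consumer hands over, the sharper the ranking the company can produce, and the more valuable the disclosed data becomes to it.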
Such wariness seems entirely justified, especially given the sheer magnitude of many of the biggest data breaches. In 2017, for instance, credit-reporting giant Equifax revealed that hackers had stolen the personal data of some 143 million American customers (a figure later revised upward to 147.9 million), including such sensitive information as Social Security numbers, birth dates and home addresses. The company eventually paid out up to $700 million in fines and compensation. “The stakes are high for companies handling consumer data: even consumers who were not directly affected by these breaches paid attention to the way companies responded to them,” McKinsey observed in its research on the subject. And, of course, one cannot forget the infamously egregious case involving Facebook and Cambridge Analytica, in which the 2016 Trump election campaign hired the data firm, which had harvested the data of more than 50 million Facebook users without their knowledge, to profile and target voters with political advertising.
McKinsey recently surveyed 1,000 North American consumers regarding their views on privacy, data collection, hacks and breaches, regulations, communications, and particular industries. The survey revealed that consumers were becoming increasingly selective about the types of data they shared and were more willing to share their data with providers in healthcare and financial services, although no single industry achieved a trust rating of even 50 percent for data protection. “That lack of trust is understandable given the recent history of high-profile consumer-data breaches. Respondents were aware of such breaches, which informed their survey answers about trust,” McKinsey found.
It is worth stressing that privacy is not an easy concept to define, let alone evaluate. It means different things to different people, and, as such, the amount each person is willing to disclose varies greatly. This implies that we are far from truly understanding the significance of what we relinquish by agreeing to make our data accessible to outside interests. It is also worth asking just how much consumer data is actually worth. Often there is little direct benefit to the consumer in allowing companies to access his or her personal information, while the company reaps all of the economic benefits of gaining such deep consumer insights. This leads to a situation of asymmetry, whereby the consumer remains unaware of how exactly a company will end up using that personal information and how much value it generates. Moreover, once consumers have given up their personal data, companies may engage in more dubious practices, such as presenting offers that are more expensive than they would otherwise have been, selling the data on to third parties or using it in ways the consumers did not intend.
Indeed, an investigation last year by the Washington Post found that popular browser extensions designed to improve the online browsing experience, used by a combined four million people, were collecting and even selling user data. “Some extensions have a side hustle in spying,” reported the Post’s Geoffrey A. Fowler. “From a privileged perch in your browser, they pass information about where you surf and what you view into a murky data economy. Think about everything you do in your browser at work and home—it’s a digital proxy for your brain. Now imagine those clicks beaming out of your computer to be harvested for marketers, data brokers or hackers.” And while Google and Mozilla have since taken action to remove such extensions from their respective web browsers, Chrome and Firefox, it is clear that the mining of personal data is now a thriving industry, one the New York Times described as a “trillion-dollar business” in 2018.
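For readers curious how an extension can see quite so much, the fragment below is a deliberately simplified sketch of the kind of mechanism Fowler describes: a browser extension granted the “tabs” permission is notified of every address a user visits and can relay it elsewhere. The collector endpoint is hypothetical, and real data-harvesting extensions are considerably more sophisticated.

```typescript
// Illustrative sketch of a Chrome-style extension background script.
// With the "tabs" permission, the extension is told about every navigation.
chrome.tabs.onUpdated.addListener((tabId, changeInfo) => {
  // changeInfo.url is populated when the tab navigates to a new address.
  if (changeInfo.url) {
    // Relay the visited URL to a hypothetical third-party collector:
    // the step that feeds the "murky data economy" described above.
    void fetch("https://collector.example.com/visits", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ url: changeInfo.url, visitedAt: Date.now() }),
    });
  }
});
```

A handful of lines, in other words, is enough to turn a convenience tool into a browsing-history pipeline, which is why the permissions an extension requests matter so much.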
“Recent research shows that when there is no privacy protection, consumers may be worse off, especially if they make decisions based on short-term horizons (which is the reality of much consumer behaviour),” Alessandro Acquisti, professor of information technology and public policy at Carnegie Mellon University, wrote in 2016. “In the absence of external protection, over the long term, the consumer surplus may be appropriated by sellers through price discrimination. The point here is that there is a very clear and obvious economic rationale for privacy.”
That’s not to say that external protection does not exist, however, especially in recent years, during which time measures guarding against the nefarious use of consumer data have been implemented and continue to be strengthened. The General Data Protection Regulation (GDPR) is perhaps the most obvious example, providing consumers with greater protection over how their data can be used. The regulation requires businesses to protect the personal data of individuals within the European Union (EU) whenever they collect or process it, ensuring that personal data can be gathered only under strict conditions and for legitimate purposes. Organisations that collect and manage personal information must also protect it from misuse and respect the rights of the individuals it describes.
And the ePrivacy Directive (Privacy and Electronic Communications Directive 2002/58/EC) builds on EU telecoms and data-protection frameworks to ensure that all communications over public networks maintain respect for fundamental rights, in particular, high levels of data protection and privacy, regardless of the technology used. As such, EU member states must ensure that users give their consent prior to cookies being stored on devices such as computers and smartphones.
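In practice, that consent requirement translates into the now-familiar cookie banner. The snippet below is a minimal sketch of the consent-first pattern, with invented element IDs and an invented cookie name used purely for illustration; real consent-management implementations are considerably more involved.

```typescript
// Minimal consent-first sketch: no non-essential cookie is written
// until the user has actively opted in, as the ePrivacy rules require.
function recordConsent(granted: boolean): void {
  if (granted) {
    // Only after explicit consent may a non-essential cookie be stored.
    document.cookie =
      "analytics_id=abc123; Max-Age=31536000; SameSite=Lax; Secure";
  }
  // A rejection is honoured simply by never setting the cookie.
}

// Hypothetical banner wiring: clicking "Accept" constitutes prior consent.
document
  .querySelector("#accept-cookies")
  ?.addEventListener("click", () => recordConsent(true));
document
  .querySelector("#reject-cookies")
  ?.addEventListener("click", () => recordConsent(false));
```

The essential point is the ordering: the storage of the cookie is gated behind the user's action rather than happening by default on page load.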
Ultimately, it would seem that companies still have a long way to go before they can fully earn the trust of their customers when it comes to their personal data. Indeed, McKinsey’s research concludes that its respondents simply do not trust companies to handle their data and protect their privacy. “Companies can therefore differentiate themselves by taking deliberate, positive measures in this domain. In our experience, consumers respond to companies that treat their personal data as carefully as they do themselves.” But while companies on the whole appear to be learning this important lesson, the ultimate goal of regaining consumer trust is likely to remain elusive for some time yet.