There are shivers in Silicon Valley as the data protection winter approaches. Hard on the heels of the implementation of Europe’s General Data Protection Regulation (GDPR) in May 2018, the California Consumer Privacy Act 2018 (CCPA) passed through the state legislature in June. The long and difficult labor which preceded the birth of the CCPA is well documented by Nicholas Confessore in his New York Times article[i], and it raises a serious ethical question: who should take responsibility for data privacy in this connected age?
Is it a matter for the individual to decide what data they share and how it is used, or is it up to government to legislate? There are valid arguments on both sides, each passionately defended by different interest groups. The campaign to bring consumer privacy protection to California pitted Facebook, Google and the might of Silicon Valley against determined individuals who believed people should have a say in the commoditization of their data. Other giants of the tech industry are having their say too. In October 2018, Apple CEO Tim Cook[ii] launched a scathing attack on fellow tech giants for creating a ‘data industrial complex’ which, he said, not only compromises individual choice but also breeds polarized echo chambers and divides society. What are the issues around the management of data privacy, how have regulators in different jurisdictions reacted, and why should the long journey to the enactment of the CCPA give us cause for concern?
Is it practical to place trust in ourselves?
The idea of a continuum of personal privacy, with data use controlled by the individual, is an attractive one. At one end of the scale we have the burgeoning cryptocurrency market and emerging blockchain ecosystem, where a basic premise of decentralization is that the buck stops with you. You have absolute control of your data, who can access it, and how it is used. One of the downsides of this approach is that if you lose the private key to an account, whether monetary or functional, there is no recourse to a provider, no ‘forgotten password?’ link to follow. At the opposite pole we might place health data, kept safe by providers and rarely disclosed in full even to the individual. Even in this siloed sector there are moves afoot to put data in the hands of patients to allow them more choice and greater transparency: the Great North Care Record initiative in the United Kingdom is a good example of careful progress towards data sharing. The parallel with the challenges faced by the tech giants is not lost on Professor Joe McDonald, who leads the initiative and recently penned an open letter to Mark Zuckerberg[iii] suggesting that Facebook might learn valuable lessons from the regulatory minefield surrounding health data.
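The finality of key loss follows from how such keys are made. A minimal sketch, using nothing beyond Python’s standard library, illustrates the principle; it is not any particular wallet’s implementation:

```python
# A sketch of why key loss is final in a decentralized system: the key
# is generated locally as 256 bits of randomness and is never escrowed
# with any provider. Illustrative only; not any real wallet's code.
import secrets

private_key = secrets.token_bytes(32)  # 256 bits known only to you
print("private key:", private_key.hex())

# No provider holds a copy, so there is no reset flow: recovering a
# lost key means guessing one value out of 2**256 possibilities.
```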
Social and search companies are commercial organizations with a responsibility to shareholders. They know that offering granular choices on the sharing and use of personal information, with defaults that favor disclosure, maximizes the valuable data the business can collect and monetize while shifting responsibility onto the user. In turn, actively making those choices feeds our growing awareness that the digital services we eagerly consume at no monetary cost are paid for with our data. The longstanding advice is: “If you don’t want strangers to read it, don’t put it online”. This is a warning we give to our children, and one that we heed according to our personal sense of privacy.
Users with the maturity and digital dexterity to manage complex privacy options and occasional fallout from data breaches are adapting their behavior, but we cannot expect everyone to exercise the same caution. It is a relief to those of us north of forty that our student days were not recorded in lurid Snapchat detail. At eighteen, few of us were sensible and farsighted enough to understand the implications of sharing our lives publicly, and it is unlikely that today’s teenagers are any different. Should those who over-share be protected? In the wider community where so many are vulnerable, should people who cannot manage their lives be expected to manage their data?
Unfortunately, the level of intrusion into our personal data is more insidious than a new boss seeing your drunken college pictures. We pay for our social networking by handing over details of our location, mood, eating habits, vacation plans, social demographic, political views, profession, and even family members. We pay for easy access to third-party websites, apps and games by logging in with our social credentials, sharing our data ever wider. The volume of information on corporate servers is staggering. Austrian privacy lawyer Max Schrems[iv] once requested his Facebook history from the company for a project and was horrified to discover it ran to 1,200 pages. The complaints he filed under the old European data protection laws took on even greater relevance following Edward Snowden’s revelations about the NSA’s PRISM surveillance program. When his action succeeded in the European Court of Justice in 2015, it triggered the abandonment of the Safe Harbor agreement[v] on which tech giants had previously relied to satisfy Europe’s already stringent data privacy laws.
What are society’s moral obligations?
The principle that the leaders of a community have a moral responsibility to protect its weaker members is a fundamental part of the fabric of society. Over three hundred years ago the philosopher John Locke wrote about the protection of life, liberty and property. That phrase resonates in constitutional and common law throughout the world, although modern life gives these simple definitions a good deal of practical complexity. We appoint the leaders of our community on the understanding that they will afford us protection, and the democratic process makes government answerable to the people. But what happens when the legislative machine is too fragmented to be effective? That is precisely what is happening under our noses with data privacy.
Not only are there glaring differences between sovereign states, as demonstrated by the fallout from the failure of the Safe Harbor agreement, but within the U.S. there is a veritable patchwork of legislation at both state and federal levels. This is a difficult reality in our connected world, as data flows do not stop at borders. Across the fifty states, privacy laws have evolved piecemeal to address specific concerns, beginning with Warren and Brandeis’s famous 1890 reaction to intrusive photojournalism. Two current pieces of legislation are of particular interest where the tech giants are concerned. In California, the CCPA focuses on the disclosure of data collection and the ability of a user to opt out of the sale of that data to third parties. In Illinois, the Biometric Information Privacy Act 2008 is the basis of a lawsuit brought by Carlo Licata against Facebook in 2015 over its use of facial recognition for photo tagging. The progress of this case has been delayed by stiff lobbying, but it is now proceeding as a class action.
Global firms, however, are subject to rules wherever they operate. Facebook has just been fined by the U.K.’s Information Commissioner[vi] over the Cambridge Analytica data scandal. The fine itself is a drop in the ocean, only £500,000, but as observers point out it is the maximum that was possible under the older legislation. The General Data Protection Regulation came into force across Europe in May 2018. On that day, European visitors to many U.S. websites found their access blocked, such was the trepidation about meeting its requirements. Some are still inaccessible; many others have revised their terms and privacy notices, prompting visitors to accept terms or restricting areas of a site as soon as a European IP address is detected. The provisions of GDPR go far beyond the California model, restricting the actual collection of data unless it is done for a specific reason or, failing that, with “informed consent” (yet another complex minefield). The regulation includes a requirement to notify breaches within 72 hours and stiff penalties for miscreants. This means that Facebook’s recent disclosure of a security breach and possible data theft from around 50 million user accounts will have much greater consequences for the firm. It could be facing a maximum fine of 4% of global turnover for this mishap; against 2017 revenue of roughly $40 billion, that would be in the region of $1.6 billion.
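The IP-based blocking described above was often blunt. Here is a minimal sketch of the pattern, assuming Python with Flask; the country list and lookup table are illustrative placeholders rather than any publisher’s actual code. Some U.S. news sites did exactly this, returning HTTP status 451, “Unavailable For Legal Reasons”, to European visitors.

```python
# A minimal sketch of the geo-blocking pattern described above, using
# Flask. The country set and IP-to-country table are illustrative
# assumptions; a real site would query a GeoIP database or CDN header.
from flask import Flask, request

app = Flask(__name__)

# EU/EEA country codes (abbreviated here for illustration)
EEA_COUNTRIES = {"AT", "BE", "DE", "DK", "ES", "FR", "IE", "IT", "NL", "SE"}

# Stand-in for a GeoIP lookup (example addresses from reserved test ranges)
IP_TO_COUNTRY = {"203.0.113.7": "US", "198.51.100.9": "DE"}

@app.route("/")
def index():
    country = IP_TO_COUNTRY.get(request.remote_addr or "", "UNKNOWN")
    if country in EEA_COUNTRIES:
        # Refuse service rather than risk a GDPR compliance failure
        return "Unavailable in your region for legal reasons.", 451
    return "Welcome!"

if __name__ == "__main__":
    app.run()
```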
Mark Zuckerberg, in his testimony to the Senate, said of Facebook that “We did not take a broad enough view of our responsibility, and that was a mistake[vii].” Does this signal that the tech giants are ready to clean up their act?
Where do commercial responsibilities lie?
The real question that Zuckerberg’s statement raises is: to whom does Facebook owe responsibility? On the face of it, he seems to refer to a responsibility to users, to the people whose data has been collected, collated and sold on an industrial scale. However, as a publicly quoted business, the company’s real responsibility is to its shareholders. The same applies to all the major players in Silicon Valley. Where laws such as GDPR force responsible behavior by imposing heavy fines, commercial imperatives become bound up with the need to avoid litigation. The challenge in the U.S. is the lack of a consistent message and consistent enforcement.
Tim Cook, in his address to the European Parliament, called for a ‘comprehensive federal privacy law’. This is interesting coming from Apple, which in 2016 fought the FBI’s demand to unlock the iPhone of the San Bernardino shooter. Apple’s forthright defense of privacy and strong encryption was criticized at the time at the highest levels of government. The tables may be turning. There has been a previous attempt to standardize the U.S. approach to data privacy, but in a cruel irony the Snowden revelations undermined the government’s position: it was thought hypocritical for government to legislate against the activities of commercial organizations when it was just as guilty of breaching personal privacy through NSA surveillance. It may be time to revisit the topic but, as the struggle to pass the CCPA shows, the political influence of tech firms has grown substantially since that initial work on federal privacy laws.
By 2015 Facebook had 67 lobbyists in Washington, according to Confessore[viii], and was spending millions of dollars a year on the effort. The fact that data could help win elections was not lost on politicians, and the representatives of the new, clean tech industry were seen as the good guys. Around this time the phrase “data is the new oil” gained wide currency. There are more similarities to the oilmen and their influence than you might imagine. A 2015 paper in the Review of Economic Studies[ix] examined the returns from lobbying expenditure in the energy sector. Kang’s research showed that although lobbying shifted the probability of a bill’s enactment only slightly, the stakes were so large that even that small shift more than covered the cost of the activity, leaving substantial returns. It is well worth the investment to ensure that the final legislation takes a form which keeps the business model running.
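A back-of-the-envelope calculation shows why. The numbers below are illustrative assumptions, not figures from Kang’s paper, but they capture the mechanism: when the prize is large enough, a barely measurable nudge to the odds pays for the whole lobbying operation.

```python
# Illustrative arithmetic only; these are assumed numbers, not figures
# from Kang's paper. The point: a tiny shift in the probability that
# favorable legislation passes can dwarf the cost of lobbying for it.
stake = 8_000_000_000       # value of the favorable outcome (assumed $8bn)
prob_shift = 0.0005         # lobbying nudges enactment probability by 0.05%
lobbying_cost = 1_000_000   # annual lobbying spend (assumed $1m)

expected_gain = stake * prob_shift            # $4,000,000
net_return = expected_gain - lobbying_cost    # $3,000,000
print(f"gain ${expected_gain:,.0f}, return {net_return / lobbying_cost:.0%}")
```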
The CCPA in its final form has supporters and detractors in equal measure. Some see it as ‘unworkable, burdensome and possibly unconstitutional’ (Kosseff)[x], while others argue that Silicon Valley needs regulation[xi]. Whatever the compromise, it must be clear where the responsibilities and interests of all parties really lie, and we must hope that legislators act on their moral obligation to protect the whole community effectively.
[i] The Unlikely Activists Who Took On Silicon Valley — and Won, Nicholas Confessore, Aug 2018. https://www.nytimes.com/2018/08/14/magazine/facebook-google-privacy-data.html
[ii] Keynote address by Tim Cook, CEO, Apple Inc., to the European Parliament, Oct 2018. https://www.youtube.com/watch?v=kVhOLkIs20A
[iii] Dear Mr Zuckerberg, Professor Joe McDonald, Great North Care Record, Apr 2018. https://www.digitalhealth.net/2018/04/joes-view-dear-mr-zuckerberg/
[iv] Max Schrems, the man who took on Facebook – and won, Financial Times, Apr 2018. https://www.ft.com/content/86d1ce50-3799-11e8-8eee-e06bde01c544
[v] Is my data more private than yours? Kate Baucherel, CDEP, Feb 2016. https://digitalethics.org/essays/my-data-more-private-yours
[vi] Facebook fined £500,000 for Cambridge Analytica scandal, BBC News, Oct 2018. https://www.bbc.co.uk/news/technology-45976300
[vii] Transcript of Mark Zuckerberg’s Senate hearing, Washington Post, Apr 2018. https://www.washingtonpost.com/news/the-switch/wp/2018/04/10/transcript-of-mark-zuckerbergs-senate-hearing
[viii] The Unlikely Activists Who Took On Silicon Valley — and Won, Nicholas Confessore, Aug 2018. https://www.nytimes.com/2018/08/14/magazine/facebook-google-privacy-data.html
[ix] Policy Influence and Private Returns from Lobbying in the Energy Sector, Karam Kang, Review of Economic Studies, July 2015. https://academic.oup.com/restud/article/83/1/269/2461194
[x] Ten Reasons Why California’s New Data Protection Law is Unworkable, Burdensome, and Possibly Unconstitutional, Jeff Kosseff, July 2018. https://blog.ericgoldman.org/archives/2018/07/ten-reasons-why-californias-new-data-protection-law-is-unworkable-burdensome-and-possibly-unconstitutional-guest-blog-post.htm
[xi] Silicon Valley Needs Regulation, Marcus Ryu, Sep 2018. https://www.nytimes.com/2018/09/11/opinion/silicon-valley-regulation.html