Rebecca MacKinnon of Ranking Digital Rights argues that transparency in the wake of the Facebook revelations is not enough. Silicon Valley firms must bake privacy into the design of their information platforms
Both sides of the Atlantic are in an uproar over revelations that the data of millions of Facebook users was acquired by the political consulting firm Cambridge Analytica for the purpose of influencing elections. Investors have sent a strong warning signal to the social networking company: Nordea has put a hold on its investments in Facebook, whose market value has shed $70bn in the past 10 days. Analysts are predicting more to come.
But let’s be honest with ourselves. Before suddenly waking up to a scandal so big that it threatens the material value of a highly profitable holding, financial markets gave Facebook no incentive to care about users’ privacy. In fact, investors have reliably rewarded Facebook, and other social media companies whose business models hinge heavily on targeted advertising technology, for being a leading innovator in what privacy scholars have aptly begun to call surveillance capitalism.
Even after Facebook tightened up its policies and practices, it made no serious effort to audit what happened to the data
As privacy experts and technologists have been pointing out for years, Facebook has a long history of playing fast and loose with user data, tightening things up here and there only when its hand got slapped by regulators or in response to bad publicity. An examination of Facebook’s own published policies reveals less commitment to privacy protection than many of its industry peers.
In fact, the Ranking Digital Rights 2017 Corporate Accountability Index, released in March 2017 by a research team that I manage, found that Facebook provided users with less information about how they could control the collection and use of their personal information than any other company evaluated, including two Chinese and two Russian companies. The 2018 Index will be released next month: does anybody want to take bets on whether Facebook improved on that particular question?
Facebook has had no incentive to care about users’ privacy. (Credit: DK Grove/Shutterstock)
It’s important to remember that Cambridge Analytica did not acquire the Facebook user data through theft: Facebook was not hacked. Cambridge Analytica received the data from a researcher whom Facebook had permitted to gather it through a personality quiz app, for ostensible research purposes.
While Facebook says the Cambridge researcher, Aleksandr Kogan, violated the terms of the company’s agreement with him about how the data could be used, Facebook took no meaningful measures at the time to prevent him or other potentially dishonest people from violating such agreements.
Facebook’s systems were designed to be promiscuous with users’ data in order to drive advertising revenue
Even after Facebook tightened up its policies and practices, making Kogan-style collection of users’ data, along with that of all their contacts, impossible for app developers from 2015 onward, it made no serious effort to audit what happened to the data or to take legal action against infringements.
Mark Zuckerberg has apologised, but as the Internet Society said in a recent statement: “This incident is simply the natural outcome of today’s data-driven economy that puts businesses and others first, not users.”
As many industry commentators have pointed out, the Cambridge Analytica incident was the result of a “feature, not a bug” of Facebook’s systems.
Facebook’s systems were designed to be promiscuous with users’ data in order to drive advertising revenue, assisted by advertisers’ ability to access rich data about the preferences and habits of individuals. This has been great for profits, and investors have consistently rewarded the company for many years, making many people very rich.
This business model and incentive system are what technologist and MIT professor Ethan Zuckerman has called the “original sin of the internet”: a bargain “in which people get content and services for free in exchange for having persuasive messages psychographically targeted to them”. Google also develops profiles of its users to help advertisers target them. Most news websites embed trackers containing “cookies” and other advertising technology that follows us all over the web.
Everybody in our news and information ecosystem must take immediate steps to draw bright red lines beyond which they will not venture
It is high time for the industry not merely to repent but to change. In a keynote speech at Ethical Corporation’s Responsible Business Summit West last November, where major tech companies were promoting what they were doing to reduce their carbon footprints, I issued a challenge to Silicon Valley: We all know that our economy’s reliance on fossil fuels poses an existential threat to life on this planet. We now need to wake up to an existential crisis facing democracy.
Just as governments, scientists, engineers, industry, investors, and activists have worked hard for the past generation to develop and incentivise the use of viable alternatives to fossil fuels, it is time to invest in innovation and incentives that build viable alternatives for our information economy.
The perils of surveillance capitalism are as great as those of fossil fuels. (Credit: VLADJ55/Shutterstock)
Everybody in our news and information ecosystem – from social networking platforms to news organisations – who depends on advertising for revenue must take immediate steps to draw bright red lines beyond which they will not venture, not only in their data collection and sharing practices, but also in the types of content and behaviours that they will permit from advertisers and online marketers.
Transparency with the public is vital. The project I run, Ranking Digital Rights, has tracked a persistent lack of progress by the industry when it comes to being clear with users about what is collected about them, with whom it is shared, and under what circumstances.
But transparency is not enough. Privacy, and individual control over one’s own data and content, must be baked into the design of the information platforms and services people increasingly depend upon. We need strong data protection laws not just in Europe, but in the United States and globally. We also need innovation: new technologies and business models centred around individual ownership and control over personal data as well as the content we create and share online.
These things are possible. The markets have finally awoken to the material risks behind fossil fuels and climate change. Now they have got to take the perils of surveillance capitalism just as seriously. If for no other reason, consider this: surveillance capitalism could well be one of the reasons that we now have a climate denier in the White House.
Rebecca MacKinnon is director of the Ranking Digital Rights project, which works to set global standards for how companies in the information and communications technology sector and beyond should respect freedom of expression and privacy. The 2018 Corporate Accountability Index will be released on 25 April.