
Ever started typing a Facebook status update, only to think better of it and hit backspace? I know I have. Facebook knows, too, because the social media monolith recently analyzed these “aborted” status updates from nearly 4 million of its users, then published a research study about what we’re doing — or not doing, to be more precise. That’s right: Facebook may be using you for research without your explicit consent.

“Why u mad?” Facebook might respond in Internet parlance. After all, its researchers used the data anonymously, and you opted in when you signed up … kind of. Facebook’s Data Use Policy — a 9,000-word behemoth you probably never read when you joined — says, “We receive data about you whenever you use or are running Facebook,” and that information could be used for “internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

But anybody who’s taken Psych 101 knows the first step of a research study is obtaining informed consent from participants. Burying this wording deep in a document most users don’t even skim is ethically questionable, especially at a time when Americans are particularly protective of their privacy. In a new poll by the Associated Press and GfK, Germany’s largest market research institute, “61 percent [of respondents] said they prioritize protecting Americans’ rights and freedoms over making sure Americans are safe from terrorists,” the Boston Globe reported recently. (That’s up 2 percent from a similar poll five months ago.) And a research study on Internet user behavior is a far cry from spying on Americans to keep the nation safe.

So in the midst of a seemingly unending assault against personal privacy, how can companies like Facebook respect privacy concerns and proceed with research in a more ethical way?

Was it unethical?

I asked Dr. Annette N. Markham, a communications professor at Aarhus University and Loyola University who has written codes of ethics for Internet research, whether she thought Facebook’s behavior was ethical.

“This is precisely the challenging question, since it points to the difference between a social network site's legal rights and ethical practices. These are not always the same,” Markham replied via email. “In this case, Facebook has the right to use whatever data they collect since the terms of service allow them access. Was it the right thing to do? The answer will vary depending on who's being asked.”

If you ask writer Sean Rintel, he’ll say no. As Rintel recently wrote in Business Spectator, “This is at best a form of passive informed consent. The MIT Technology Review proposes that consent should be both active and real time in the age of big data.” As Facebook was founded in February 2004, that gives users a decade to forget what they agreed to — hardly ethical.

Why informed consent matters

The point of informed consent, writes Markham in an email, is that study participants know what they’re getting into before the research starts. She cites the Tuskegee syphilis experiment of the 1930s-’70s as evidence that research participants can suffer if they aren’t told what’s going on. In that experiment, 128 test subjects — poor African-American men from rural Alabama — died because the U.S. Public Health Service researchers never told them they had syphilis, nor were they treated for it, even though penicillin was available. In this case, lack of informed consent literally killed people.

Although no one is going to die because they didn’t realize Facebook is using their data, the Tuskegee syphilis experiment is a good reminder of why informed consent is so important. Its goal as a first step in research “is to preserve the autonomy of the participant and give them respect,” Markham explains. “Giving them adequate information to make an informed decision was the key. Here, we can and should take Facebook to task for not being clear about the intended use of data. Their defense that the data was anonymous is not adequate since this is not the ethical obligation under dispute.”

The irony of the situation

Facebook’s research paper, “Self-Censorship on Facebook,” discusses who is most likely to erase something before posting it, and under what circumstances. For example, the researchers found that men on Facebook censor themselves more than women do, especially when they have mostly male friends. Part of the researchers’ motivation, it seems, was to find out how to prevent self-censorship, because that means “the SNS [social networking site] loses value from the lack of content generation.” After all, Facebook can’t profit off you if you aren’t using the site.
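The paper doesn’t spell out its instrumentation, but the kind of signal it describes is technically simple to collect in a browser. Here is a minimal, hypothetical TypeScript sketch (the names and reporting mechanism are my assumptions, not Facebook’s code) of how a site could log that a draft was typed and then abandoned, without ever transmitting the draft itself:

```typescript
// Hypothetical sketch: how a site *could* record that a draft was typed and
// then abandoned, sending only metadata and never the draft's content.
// All names and details here are illustrative assumptions, not Facebook's code.

interface SelfCensorEvent {
  composerId: string;       // which text box was used
  charactersTyped: number;  // how much was entered (metadata only)
  posted: boolean;          // was it ultimately submitted?
}

function instrumentComposer(
  textarea: HTMLTextAreaElement,
  report: (event: SelfCensorEvent) => void
): void {
  let maxLength = 0;
  let posted = false;

  // Track the longest draft the user ever had in the box.
  textarea.addEventListener("input", () => {
    maxLength = Math.max(maxLength, textarea.value.length);
  });

  // Assume the surrounding form fires "submit" when the status is posted.
  textarea.form?.addEventListener("submit", () => {
    posted = true;
  });

  // When the user leaves the page, report only that something was typed,
  // roughly how much, and whether it was ever posted.
  window.addEventListener("beforeunload", () => {
    if (maxLength > 0) {
      report({ composerId: textarea.id, charactersTyped: maxLength, posted });
    }
  });
}
```

Even a sketch this small makes the ethical point concrete: the user never clicks “Post,” yet the fact that they typed and erased something can still leave the browser.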

But the very essence of an aborted Facebook status means users didn’t want anyone to see it: not their friends and not the Facebook overlords. People have heard “The Internet is forever” and know that cached versions of web pages linger on after the original content is deleted. But it seems almost cruel that even our half-formed thoughts, updates that never saw the light of day, cannot stay entirely personal.

“Is passive informed consent to collect data on technical interactions sufficient when users have actively chosen not to make content socially available?” asks Business Spectator. I say no. Arguably, most people don’t realize that things they type and then erase are still recorded; I believe the blame lies with Facebook for not clarifying this. Markham seems to agree. “The question is not whether the user consented to give Facebook the right to use all data, it's whether the use of such data as keyboard actions and backspacing was clearly understood by the user to be a form of data that would be collected,” she writes via email.

“In this case, it may not be a legal as much as [an] ethical question of responsibility and fairness,” she continues. “Frankly, we should continue to question the questionable ethics of creating lengthy and complicated blanket TOS [terms of service] that cover all manner of things, when everyone knows it's common practice to simply accept these without close scrutiny.” Indeed, Facebook’s latest research seems symptomatic of a wider culture in which websites intentionally employ long terms-of-use pages they know very few users will read. So what’s a more ethical alternative?

A better way forward

Edward Snowden opened Americans’ eyes to the fact that our data isn’t private. To be sure, some respond with blasé resignation and references to George Orwell. But others are outraged and feel even more fiercely possessive of their personal information — particularly keystrokes they thought had been erased forever. Companies like Facebook need to understand that transparency about their privacy policies will not only boost user trust and head off scandals and backlash down the line, but is also simply more ethical.

Moving forward, Facebook and other sites should adopt much more explicit terms of use, as well as real-time informed consent before launching a research study. One good example is what happens when you authorize an app through Twitter: the user sees a screen that very clearly spells out what the app will and will not be able to do. For example, Twitpic will be able to post tweets for you, but it will not have access to your Twitter password. The list is short and easy to understand, and users know exactly what they’re getting into.

Rather than the arguably underhanded (ethically, if not legally) data collection process behind its recent research study, Facebook should employ a similarly straightforward notification process for future studies. Select users could see a pop-up explaining that they’ve been randomly selected to take part in a study over the following two weeks, in which their data will be used completely anonymously. That would at least give users the choice of opting out. Isn’t that something everyone deserves?
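To make concrete how lightweight that kind of prompt could be, here is a hypothetical TypeScript sketch of a Twitter-style consent request for such a study. Nothing in it reflects a real Facebook or Twitter API; the field names and wording are illustrative assumptions.

```typescript
// Hypothetical sketch of a Twitter-style consent prompt for a research study.
// The structure and wording are illustrative assumptions, not a real API.

interface StudyConsentRequest {
  studyName: string;
  durationDays: number;
  willCollect: string[];    // spelled out, like Twitter's "will be able to" list
  willNotCollect: string[]; // and its "will not be able to" list
}

const selfCensorshipStudy: StudyConsentRequest = {
  studyName: "Self-censorship research study",
  durationDays: 14,
  willCollect: [
    "Whether you start and then delete a post (recorded anonymously)",
    "Rough length of deleted drafts, never their content",
  ],
  willNotCollect: [
    "The text of anything you type and erase",
    "Your name or other identifying details",
  ],
};

// Returns true only if the user explicitly accepts; declining opts them out.
function askForConsent(request: StudyConsentRequest): boolean {
  const summary = [
    `${request.studyName} (${request.durationDays} days)`,
    "This study WILL collect:",
    ...request.willCollect.map((item) => `  - ${item}`),
    "This study will NOT collect:",
    ...request.willNotCollect.map((item) => `  - ${item}`),
  ].join("\n");

  // window.confirm stands in for a properly designed opt-in dialog.
  return window.confirm(summary);
}
```

Whether the dialog is a plain browser confirm or a styled modal, the point is the same as Twitter’s authorization screen: a short, explicit list and an unambiguous yes or no.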

As Jeffrey Rayport wrote in the MIT Technology Review, Big Data giants like Facebook should adhere to “something akin to the Golden Rule: ‘Do unto the data of others as you would have them do unto yours.’ That kind of thinking might go a long way toward creating the kind of digital world we want — and deserve.”

Holly Richmond

Holly Richmond is a Portland writer. Learn more at hollyrichmond.com.
