Marc Rotenberg, Executive Director of the Electronic Privacy Information Center, told Congress last year that “There’s concern that Facebook and other social networks manipulate privacy policies and settings to confuse users, extract more personal information from them, and transfer the information to application developers and websites.”

Since then, the privacy issues of social networks have continued to cause controversy, surfacing time and again as the subject of international media coverage, as well as of numerous surveys, such as Consumer Reports’ study on “Facebook’s dark side” from June 2011. That very report stated that “using Facebook presents children and their friends and families with safety, security and privacy risks.” Just this past week, a further privacy concern arose regarding “alleged cookie snooping”: namely, that Facebook can track users even after they have logged out.

While some users condemn Facebook for what they deem an unethical breach of privacy, the more conscientious among them are trying to figure out how they can eliminate such cookies (and, with them, the tracking by the social network). In fact, entire websites, such as ReclaimPrivacy.org, have emerged to “provide an independent and open tool for scanning your Facebook privacy settings.” After running the scanner, Facebook users see a series of privacy scans that inspect their settings and warn them about anything that might be unexpectedly public.
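For readers who want to see what this means in practice, the sketch below (in Python, using the third-party requests library) is an illustration of how cookies work in general, not of Facebook’s internal tooling: it simply fetches facebook.com as a logged-out visitor and lists the cookie names the server sets in the response. The names and expiry dates it prints will vary, and by themselves say nothing about how any cookie is actually used.

    # Minimal sketch: show which cookies a site sets for a visitor who is not logged in.
    # Requires the third-party "requests" library (pip install requests).
    import requests

    response = requests.get("https://www.facebook.com/", timeout=10)

    print("Cookies set for a logged-out visitor:")
    for cookie in response.cookies:
        # Only the names and expiry times are printed; the values are opaque identifiers.
        print("  {} (expires: {})".format(cookie.name, cookie.expires))

Deleting such cookies in the browser, or blocking them outright, is the blunt remedy the more cautious users described above are after.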

Yet Facebook, and specifically its Help Center, claims that “this is industry standard data” which the company collects for benign reasons: it “helps us optimize your experience depending on which browser you are using or whether or not you are logged into Facebook.” In response to a recent Huffington Post article on Facebook’s alleged “cookie snooping,” the social network’s official reply was:

“Facebook does not track users across the Web. Instead, we use cookies on social plugins to personalize content (e.g. show you what your friends liked), to help maintain and improve what we do (e.g. measure click-through rate), or for safety and security (e.g. keeping underage kids from trying to sign up with a different age). No information we receive when you see a social plugin is used to target ads, we delete or anonymize this information within 90 days, and we never sell your information.

Specific to logged-out cookies, they are used for safety and protection, including identifying spammers and phishers, detecting when somebody unauthorized is trying to access your account, helping you get back into your account if you get hacked, disabling registration for underage people who try to re-register with a different birth date, powering account security features such as second factor login approvals and notification, and identifying shared computers to discourage the use of ‘Keep me logged in.’”

Especially after having to fight “epidemic” levels of bullying, Facebook, Twitter and other social networking sites are increasingly obliged to take action and assume an ethical responsibility. Such responsibility, however, usually comes second, after they have served their primary purpose: economic profit. “It's important to understand, first, that every major social network today (i.e. Facebook, Twitter, LinkedIn, Google+/Google) is a for-profit company. So their primary purpose is to provide a return on their investors' cash,” explains Scott Rosenberg, co-founder of Salon and now Executive Editor of Grist.

“In an ideal world, sure, social networking companies would understand that they have ethical responsibilities. But they are for-profit companies and it's naive to assume that they will behave in any way other than to maximize profits within the legal and regulatory frameworks that govern their actions. Today, in some cases, these companies and their leaders perceive value in taking some actions that we might think are ethically driven (concern over users’ privacy, for instance), but this generally only happens when those actions align with their quest for profits,” Rosenberg adds.

“It’s true that Facebook has taken action against the graver dangers of sharing,” affirms Emily Bazelon in a recent New York Times article titled “Why Facebook is After Your Kids.” Since January 2011, Facebook has been a partner in the police’s Amber Alert system for missing children. Fox News explains that “Facebook users who ‘fan’ their state’s newly-created Amber Alert page will receive alerts in their news feed regarding missing children.” Chris Sonderby, Lead Security and Investigations Counsel at Facebook, was quoted in the same article saying that “everyone at Facebook feels a responsibility to help protect children and, as a former federal prosecutor and a father of two, I am particularly proud that we are now part of the Amber Alert program. We are hopeful that today's announcement offers these dedicated officials another useful tool to find and safely recover abducted children.”

In fact, the idea of combining social networking with the search for missing children isn’t entirely novel; Jennifer Valentino-DeVries from the Wall Street Journal reminds us that “MySpace had an Amber Alert app years ago.” Facebook itself has a Safety Center, whose slogan reads: “we believe safety is a conversation and a shared responsibility among all of us. That's why we provide the information, tools and resources you'll find here.” The Official Facebook Safety Page provides up-to-date information on safety issues and includes a sub-page titled “Experts,” which links to external privacy organizations and agencies, from Sweden to Canada. In addition, “[Facebook] is using a new technology to find and remove child pornography [and] in September [2011], Facebook started testing a special e-mail address with a small group of principals and guidance counselors that gives schools an inside track for urgent reports on bullying and fighting,” explains Bazelon.

In creating its A Thin Line campaign, MTV has similarly recognized a need to assume ethical responsibility on the Web. The campaign “was developed to empower you to identify, respond to, and stop the spread of digital abuse in your life and amongst your peers.” As the name states, it is “built on the understanding that there's a ‘thin line’ between what may begin as a harmless joke and something that could end up having a serious impact on you or someone else.” The music company is aware that no previous generation has had to deal with digital abuse of this kind, and thus wants to partner with young people in this effort, so that they can ultimately “draw [their] own digital line.” The goal is to educate oneself, inspire others to make a difference and be a part of the solution. Katie Davis, Project Manager at Harvard Project Zero and a member of A Thin Line’s Advisory Board, explains: “I think by their very existence campaigns like MTV’s A Thin Line are assuming an ethical responsibility to promote youth's positive online engagement. It's hard to know yet whether they are effective, since they are still so new.”

Anne Collier, Editor of NetFamilyNews.org and Co-Director of ConnectSafely.org, is convinced that every company has ethical obligations to its customers or users, and that media companies are no exception. “But as the nature of media is changing, as production is increasingly shared and distributed, so is responsibility,” she says. According to Collier, all social media companies have the following ethical responsibilities in common:

*   “To protect users' data and devices from fraud, theft, and other abuse

*   Based on how their service is used, to provide the necessary range of tools for the protection of user privacy and safety through both peer and customer-service notification (and a reasonable response time)

*   To educate users on how to protect themselves and each other from harm to their physical and psychological wellbeing and that of their intellectual and physical property.”

“We should crowd-source this list, because I'm sure others will have some valuable points to add,” she notes.

According to Collier, the responsibility doesn’t change that much from one social media service to the next. “Perhaps a little, based on how and for what purpose each is designed – though how a social site is used depends a lot on the users (again, by definition, we're talking about user-driven media),” she explains. “Twitter allows you to tweet privately to a closed circle of ‘followers,’ but that's not typically how it's used; it's for the most part very public. But it has many different uses, from marketing one's business to creating a personal learning network to following the news to celebrity watch. Because it's so utilitarian and used mostly publicly, I think, it has received less scrutiny from consumer privacy groups than Facebook.”

All in all, it’s reassuring to see that companies are attempting to assume ethical responsibility and take action for the greater good. However, a social network can only go so far. Ultimately, the user, too, must assume responsibility for his or her actions. Using common sense and keeping oneself informed are key. If Facebook already puts so many safety resources at your fingertips, use them before you start another privacy debate or, worse, become a victim of “cybercrime.” The United States Department of Justice is here to help you out with an entire page on Cyberethics. No matter whether you’re merely a user or indeed the owner of a social network, it’s up to you to make good use of modern society’s new set of digital guidelines. As Collier says, “with – or better in – media where users are no longer passive, they must be empowered as part of the solution.” Or, as MTV’s colloquial language puts it, you have to defend your very own digital domain.

Learn more about Isabel Eva Bohrer at www.isabelevabohrer.com.
