
The results of the 2016 presidential election have proven to be something of a Rorschach test for the politically conscious. President-elect Donald Trump pulled off a stunning upset of Democratic candidate Hillary Clinton, winning the Electoral College through narrow victories in Michigan, Wisconsin and Pennsylvania. Notably, Trump lost the popular vote to Clinton by more than 2.8 million votes.

Reading the tea leaves after this major event is obviously a partisan exercise for some, but one frequently cited, and notably pesky, culprit for Clinton’s loss presents a unique challenge for seekers of truth in the digital age, and it strikes at the heart of journalism itself.

On Nov. 5, mere days before the general election, the Denver Guardian declared – in all caps – “FBI AGENT SUSPECTED IN HILLARY EMAIL LEAKS FOUND DEAD IN APPARENT MURDER-SUICIDE.” There are a few problems here, not the least of which is that the Denver Guardian doesn’t exist. As The Denver Post dutifully noted shortly after the “story” broke, the Denver Guardian is not a legitimate news source. The murder-suicide story is, quite simply, a piece of fake news. It didn’t happen. It’s patently false. Nonetheless, stories such as this one spread like wildfire on social media sites, most notably Facebook.

Fake news is exactly what it sounds like: misinformation styled as news. Today, it is manufactured and optimized to get clicks, so false news stories tend to be both hyperpartisan and highly sensational. One such story claimed that Fox News anchor and conservative political commentator Megyn Kelly had been fired after endorsing Clinton during the general election. In reality, Kelly never endorsed Clinton, and Fox News never fired her (though she has since accepted an offer to move to NBC).

This disturbing trend recently made real headlines due to the troubling prospect that fake news might have swayed the election in favor of Trump. Did it really have that influence? At this point, it’s unclear. Not surprisingly, Facebook founder and CEO Mark Zuckerberg downplayed the notion. However, an eye-opening analysis by Craig Silverman at BuzzFeed found that in the last three months of the campaign, the top-performing fake election stories generated more engagement on Facebook than the top stories from 19 major news outlets. So it wouldn’t be a huge leap in logic to assume that fake news had some kind of impact on the results. And the margins were razor-thin: across the three most contested states, the combined difference was roughly 80,000 votes. Even if the impact of fake news was minuscule, it could have made all the difference in circumstances that close. The significance of false news stories, however, extends beyond the scope of elections.

While fake news has been around at least since the dawn of the printing press, it has only recently become a steady source of income for unscrupulous entrepreneurs. The formula for its production is rather simple, involving only three steps.

Step one: Create a sensational story with no regard for the truth.
Step two: Publish said story online, and sell ad space on the page.
Step three: Collect ad revenue generated from the story.
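To see how low the barrier to entry is, here is a back-of-the-envelope sketch of step three in Python. The traffic and payout figures are purely hypothetical placeholders; real ad rates vary widely by network and audience.

```python
# Back-of-the-envelope economics of a single fabricated story.
# All figures below are hypothetical placeholders, not reported data.

def ad_revenue(page_views: int, rpm_usd: float) -> float:
    """Estimate ad revenue, where RPM is dollars earned per 1,000 page views."""
    return page_views / 1000 * rpm_usd

# Suppose a sensational story goes viral and draws 500,000 visits,
# and the ad network pays a hypothetical $4 per 1,000 views:
revenue = ad_revenue(page_views=500_000, rpm_usd=4.0)
print(f"One fabricated story earns about ${revenue:,.0f}")  # -> $2,000
```

Even at modest rates, a story that costs nothing to research and an hour to publish can pay for itself many times over.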

As long as there is economic incentive to fabricate sensational stories, the plague of fake news will continue. So, how do we combat such hastily crafted misinformation? Considering potential conflicts with the First Amendment, government censorship is a path we don’t want to take. But, perhaps there are ways to disincentivize the creation and spread of fake news. Google is reportedly taking steps to ensure that fake news culprits are not able to use its ad-selling software. This is an admirable first step, but it is imperative for people to continue applying pressure on Google to ensure that the problem doesn’t fall by the wayside.

Some journalists are calling for readers to practice caution and scrutinize news stories more thoroughly. Brian Stelter of CNN coined the phrase “refuse to be confused,” a desperate plea for journalists and consumers alike to be more vigilant about the spread of misinformation. It’s an admirable sentiment. Edward Snowden recently echoed it, saying, “The answer to bad speech is not censorship. The answer to bad speech is more speech. We have to exercise and spread the idea that critical thinking matters now more than ever, given the fact that lies seem to be getting very popular.” Snowden’s point, like Stelter’s, is well taken. In an ideal world, intelligent readers armed with critical thinking skills would be plentiful, and they would be quick to combat misinformation. But the real world is fraught with complications, partisan sources, confirmation bias and prejudices that work in myriad ways to shut down critical thinking and productive discussion.

It’s difficult to conceive of a complete, accurate profile of the average American, but researchers have uncovered telling details about U.S. adults in general. A study by the Organization for Economic Cooperation and Development found that the reading skills of American adults are significantly lower than those of adults in most other developed countries. Americans also tend to work longer hours than people in other large countries: American adults in full-time positions reported working 47 hours a week on average, the equivalent of nearly six 8-hour days. Despite this schedule, the United States ranks near the 30th percentile for income equality, meaning roughly 70 percent of the countries measured have more equal income distribution. So Americans have relatively poor reading skills, work longer hours than their counterparts in other developed countries, and watch the gap between the average income and that of the country’s richest 1 percent keep widening. What can we discern from these details? One thing is clear: Americans do not have the time, inclination or resources necessary to vet every single piece of news that appears on their Facebook feeds, and it is unrealistic to expect them to do so. A discerning readership is a great ideal to strive for, but not in place of pursuing pragmatic technological solutions to the problem of fake news.

Sites such as Facebook are largely responsible for creating the partisan environment that allows false information to spread online like a virus. British filmmaker Adam Curtis aptly describes the process in his 2016 documentary, “HyperNormalisation,” explaining how the algorithms and filters on social media have gravely limited the content people see.

“In the process, individuals began to move, without noticing, into bubbles that isolated them from enormous amounts of other information,” Curtis says. “They only heard and saw what they liked. And their news feeds increasingly excluded anything that might challenge people’s preexisting beliefs.”
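The mechanism Curtis describes can be made concrete with a toy model. The sketch below is not Facebook’s actual ranking algorithm, which is proprietary; it is a deliberately crude stand-in that shows how ranking stories by predicted engagement, here approximated by ideological proximity, narrows what each user sees.

```python
# A toy model of the filter-bubble mechanism described above. This is NOT
# Facebook's actual ranking algorithm (which is proprietary); it only shows
# how ranking by predicted engagement can narrow a feed.

def predicted_engagement(user_lean: float, story_lean: float) -> float:
    """Score a story by ideological proximity. Leans run from -1.0
    (strongly liberal) to +1.0 (strongly conservative)."""
    return 1.0 - abs(user_lean - story_lean) / 2.0

def build_feed(user_lean, stories, k=2):
    """Return the k stories with the highest predicted engagement."""
    ranked = sorted(stories,
                    key=lambda s: predicted_engagement(user_lean, s[1]),
                    reverse=True)
    return [title for title, _ in ranked[:k]]

stories = [
    ("Celebrities make the case for gun control", -0.8),
    ("Government may ban checking guns in airline baggage", 0.8),
    ("Nonpartisan explainer: what the firearm data show", 0.0),
    ("Op-ed: both parties are failing on gun policy", 0.1),
]

# The same story pool produces two very different feeds:
print(build_feed(user_lean=-0.9, stories=stories))  # liberal user's top 2
print(build_feed(user_lean=+0.9, stories=stories))  # conservative user's top 2
```

Fed the same pool of stories, the two simulated users end up with entirely disjoint feeds, which is the bubble effect in miniature.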

Jon Keegan of The Wall Street Journal goes even further, creatively demonstrating the profound effect of partisan filtering on Facebook. His interactive graphic, “Blue Feed, Red Feed,” lets readers pick hot-button issues such as “guns” and “abortion” and view liberal and conservative Facebook news feeds side by side to see how those topics are represented. The comparisons are striking. A search for “guns,” for instance, surfaces in the liberal feed a video from Upworthy in which celebrities make the case for gun control, while the conservative feed yields a Breitbart article headlined “Debbie Wasserman Schultz: Federal Government May Ban Passengers from Checking Guns in Baggage.” The disparity demonstrates how social media can work to further divide Americans.

For a time after the presidential election, Zuckerberg went on the defensive against the idea that Facebook influenced the results. He refused to call Facebook a media company and seemed perplexed that anyone would even consider it one. Yet despite Zuckerberg’s reluctance to acknowledge the platform’s influence, it is where an astounding number of people get their news: 44 percent of U.S. adults report getting news from the site. Zuckerberg has since walked back his defensive statements, saying that Facebook is, in fact, a media company, just not a “traditional” one. Whatever label you assign this behemoth corporate entity, its goal is abundantly clear: to create a totally immersive online environment. Understandably, Facebook doesn’t want users leaving, so the site is designed to keep them engaged through an endless stream of photos, videos, news articles and, yes, likely some fake news. The ideal Facebook user would never leave the site, and, naturally, the company wants everyone using Facebook as a basic amenity. Everything it does is in pursuit of this ubiquitous ideal, and its efforts are working. CNBC reports that Facebook, with 1.35 billion monthly active users worldwide, has more monthly active users than WhatsApp (500 million), Twitter (284 million) and Instagram (200 million) combined, a total of 984 million among the three. That is about 1 billion more users than Twitter alone and roughly as many monthly users as there are people in China.

Facebook dominates our culture in ways that are impossible to fully articulate. To claim with certainty that it didn’t influence the 2016 presidential election, or any number of other major events, is specious. The platform undoubtedly influences the world by virtue of its market and cultural dominance. And if such dominance is indeed Facebook’s goal, the company has an ethical obligation to ensure that its users are not totally misinformed. When Facebook’s product is used to such a great extent, and when the company operates as the de facto news aggregator for its consumers, it puts itself in a position of responsibility for the stories its users share. Unlike the average American, Zuckerberg is uniquely positioned to face this challenge head-on. If Facebook wishes to continue using the term “news feed” to describe its platform, it had better take every possible step to ensure that what appears on that feed is not grossly inaccurate. But ethical appeals rarely move faceless corporations, whose financial obligations to shareholders and the bottom line have historically taken precedence over common decency.

Perhaps it would be better to frame the issue in pragmatic terms. If Facebook doesn’t want to be perceived as a fringe website awash in dubious information, perhaps the company will take significant action to help stop the spread of fake news. Despite Zuckerberg’s initial downplaying of the potential impact of fake news on the election, Facebook is taking steps to address the problem: it is rolling out a system that lets users flag stories they suspect are false, with flagged stories then referred to third-party fact checkers (a toy sketch of how such a pipeline might work appears below). This, too, is a welcome step in combating the spread of fake news. But is it just window dressing? As long as our social networks reinforce partisan divides through their algorithms, fake news will find a way to linger in the American consciousness. Now, more than ever, it is imperative that we as a society use technological means to combat the problem of misinformation, and that those in positions to effect real change consider the consequences of allowing hyperpartisanship and, in turn, misinformation to thrive. It is for the benefit of humanity as a whole that innovative thinkers find new ways to connect individuals who are not ideologically similar. After all, isn’t that the supposed purpose of social networking: to better connect people?
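Here is that toy sketch: a deliberately simplified flag-and-review pipeline. Facebook’s real thresholds and internals are not public, so the threshold, the queue and the verdict handling below are all invented for illustration.

```python
# A hypothetical flag-and-review pipeline, loosely modeled on the system
# described above. The threshold and data structures are invented for
# illustration; Facebook's actual implementation is not public.

from collections import Counter

FLAG_THRESHOLD = 10  # hypothetical number of user flags before human review

class FlagQueue:
    def __init__(self):
        self.flag_counts = Counter()  # story_id -> number of user flags
        self.review_queue = []        # stories awaiting third-party fact checkers
        self.disputed = set()         # stories fact checkers found to be false

    def flag(self, story_id):
        """Record one user flag; escalate once the threshold is reached."""
        self.flag_counts[story_id] += 1
        if (self.flag_counts[story_id] == FLAG_THRESHOLD
                and story_id not in self.disputed):
            self.review_queue.append(story_id)

    def record_verdict(self, story_id, is_false):
        """Apply a fact checker's verdict and clear the story from the queue."""
        if story_id in self.review_queue:
            self.review_queue.remove(story_id)
        if is_false:
            self.disputed.add(story_id)  # e.g., label the story "disputed"

queue = FlagQueue()
for _ in range(10):                     # ten users flag the same story
    queue.flag("fbi-agent-murder-suicide")
print(queue.review_queue)               # ['fbi-agent-murder-suicide']
queue.record_verdict("fbi-agent-murder-suicide", is_false=True)
print(queue.disputed)                   # {'fbi-agent-murder-suicide'}
```

Even this toy version makes the hard questions visible: who sets the threshold, who qualifies as a fact checker, and what actually happens to a story once it is labeled disputed.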

David Stockdale

David Stockdale is a freelance writer from the Chicagoland area. His political columns and book reviews have been featured in AND Magazine. His fictional work has appeared in Electric Rather, The Commonline Journal, Midwest Literary Magazine and Go Read Your Lunch. Two of his essays are featured in A Practical Guide to Digital Journalism Ethics. David can be reached through his website at http://davidstockdale.tumblr.com/.
