Seventh Annual International Symposium on Digital Ethics

Bot Journalism: Who’s Writing That Piece You’re Reading?

  • Author: Nikki Williams
  • Published Wednesday, March 5th, 2014
  • Comments: 3

Back in the heyday of the journalistic newsroom, Walter Cronkite reigned supreme as “the most trusted man in America.” Millions flocked to their television screens daily to hear him report on current events, and whatever they heard, they took to heart. His compatriots in print journalism include names such as David Warsh, economic and political commentator for Forbes magazine and The Boston Globe; Anna Quindlen, the social and political commentator for The New York Times who won a Pulitzer in 1992; and Alix Freedman, who won the 1996 Pulitzer Prize for National Reporting and recently left The Wall Street Journal to become the ethics editor at Reuters. Even if these names are not familiar to you, you probably have a byline that you search for in print or online when you want news you can trust to be accurate. Over time, we form relationships with the individuals who bring us the news, relying on some more than others for their timely reporting, strict accuracy, insightful analysis or even, perhaps, their sense of humor.

Over the years, print and television journalists have enjoyed a friendly contest, each aiming to be at the forefront of news reporting by garnering more readers or viewers. But now print journalists have new competition: the writer-bots. Like the insidious threat in a sci-fi flick, these algorithm-based bot reporters are infiltrating the ranks of paper and online journalists with alarming speed. The really frightening part is that your favorite reporter could be a bot and you’ll never even know it.

Take journalist Ken Schwenke’s byline. It appears throughout the Los Angeles Times hovering over stories he didn’t write. Well, at least not technically, although perhaps “technical” is precisely the word to describe a bot-written article. Mr. Schwenke has written a program, an algorithm popularly called a “bot,” that takes available data, arranges it, then configures it for publication instantly, accurately and without his further intervention. His bot is specific to earthquake data, so when a quake or tremor occurs, the program gathers the available information, compiles it into a readable format and publishes it in the Times under Mr. Schwenke’s name, sometimes before he’s even had his morning coffee.
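The mechanics are less mysterious than they sound: the bot receives structured data (magnitude, location, time) and slots it into a prose template. Here is a minimal sketch of the idea in Python; the field names and template wording are illustrative stand-ins, not the Times’ actual system or any real data feed:

```python
# Sketch of template-based "bot journalism": structured data in, prose out.
# The dictionary below mimics fields a quake feed might supply; the names
# and values are illustrative, not a real API.

TEMPLATE = (
    "A magnitude {mag:.1f} earthquake struck {distance} miles from "
    "{place} at {time}, according to preliminary data. "
    "No injuries have been reported."
)

def write_story(quake: dict) -> str:
    """Fill the template with quake data -- no human in the loop."""
    return TEMPLATE.format(**quake)

story = write_story({
    "mag": 4.7,
    "distance": 6,
    "place": "Westwood, California",
    "time": "6:25 a.m. Pacific time",
})
print(story)
```

The entire editorial act is choosing the template up front; after that, every story is a mechanical fill-in, which is why it can run before the reporter is awake.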

This kind of computerized reporting was first revealed a few years ago when a company called Narrative Science made headlines – literally. Its groundbreaking technology allowed customers to set up algorithmic newswires that would create automatic news articles from available data without the need for a flesh-and-blood writer. Initially, Narrative Science focused on sports statistics and stories, but it has since branched out into financial reporting, real estate writing and other industries.

Currently, Narrative Science’s technology produces a bot-driven story every 30 seconds or so. These articles are published everywhere, from highly regarded publications such as Forbes to myriad widely known and virtually unknown outlets, some of which are keeping their bot-story consumption on the down-low. While the company’s CTO and co-founder, Kristian Hammond, claims that robonews will not soon replace flesh-and-blood reporters, he does predict with dire certainty that a computer will win the Pulitzer Prize for writing in the next five years.

For a news agency or publisher, part of the draw of bot-based journalism is the lure of cheap writing labor. Narrative Science’s bot journalists can undercut even the most underpaid human writer. Here’s an example of one of their pieces for Builder magazine:

“New home sales dipped year-over-year in May in the New York, NY market, but the percentage decline, which was less severe than in April 2011, seemed to be signaling market improvement. There was a 7.7% decline in new home sales from a year earlier. This came after a 21.6% drop year-over-year last month.

In the 12 months ending May 2011, there were 10,711 new home sales, down from an annualized 10,789 in April.

As a percentage of overall housing sales, new home sales accounted for 11.4%. This is an increase on a percentage basis, as new home sales were 9.5% of total sales a year ago. Following a year-over-year decline last month, sales of new and existing homes also sank year-over-year in May.”

While this isn’t exactly a stimulating read, or even that well written, it isn’t terrible. Add the fact that a piece like this costs around $10 for a 500-word article, while hiring a writer from one of the biggest online content mills, Demand Studios, will set you back $7.50 to $20, with an average article costing $15, and you have a formula, or perhaps an algorithm, for success.

Mr. Hammond says Narrative Science is simply filling the need for these figure-laden accounts of news that no journalist is currently covering while freeing up the reporting staff of their clients to do more in-depth research or analyze more complex data. While this may be true, I know at least a hundred would-be journalists who would jump at the chance to score a gig writing a recap of a Big Ten basketball game, a summary of trending investment strategies or a review of a local theater performance.

I would also bet that a human journalist would be able to inject some excitement into that real estate article above, although Hammond points out that Narrative Science’s technology has now advanced to let clients choose a “voice” for their stories, giving them a tone ranging from sardonic humor to pedantic narration. He believes so ardently in his technology’s burgeoning capabilities that he has estimated that computers will write more than 90 percent of the news within the next 15 years.

And they are well on their way to that goal. The New York Times is one of the 30-some large publishing clients – including trade publisher Hanley Wood and sports journalism site “The Big Ten Network” – that subscribe to Narrative Science’s technology for stories. Concurrently, some media outlets like the Washington Post are using robot fact-checkers to double-check their data before publication. The Post’s program, Truth Teller, uses voice-to-text technology to transcribe speeches and cross-check claims against a database of information. The Post’s executive producer for digital news, Cory Haik, claims the goal is to “get closer to … real time than what we have now. It’s about robots helping us to do better journalism – but still with journalists.”
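A fact-checking pipeline of the sort Truth Teller represents can be sketched in a few lines: pull a numeric claim out of a transcript and compare it to a database of verified figures. Everything below — the claims table, the regular expression, the 10% tolerance — is an illustrative stand-in under my own assumptions, not the Post’s actual system:

```python
import re

# Toy database of verified figures an outlet might maintain.
# The key and value are illustrative stand-ins, not real Truth Teller data.
VERIFIED = {
    "private sector jobs": 4_500_000,
}

def check_claim(transcript: str) -> list:
    """Find 'N million <topic>' claims and compare them to the database."""
    results = []
    for match in re.finditer(r"([\d.]+)\s*million\s+([a-z ]+)", transcript):
        number = float(match.group(1)) * 1_000_000
        topic = match.group(2).strip()
        known = VERIFIED.get(topic)
        # Flag the claim as consistent if it is within 10% of the known figure.
        ok = known is not None and abs(number - known) / known < 0.10
        results.append((match.group(0), ok))
    return results

print(check_claim("our economy has produced about 4.5 million private sector jobs"))
```

Note what the sketch can and cannot do: it verifies that the number matches the database, but it has no way of asking whether the 29-month window behind that number was chosen to flatter the speaker — which is exactly the limitation the next paragraph describes.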

While it’s true that robot fact-checkers can work much more quickly than their human counterparts, their mathematically driven methods only allow them to read data as black or white. The shades of gray – the nuances of speech and subtle manipulations of data by clever statisticians that a human journalist easily discerns – are lost to them. For example, in a speech given by Bill Clinton at the Democratic National Convention in 2012, he wasn’t fibbing when he said, “In the past 29 months, our economy has produced about 4 ½ million private sector jobs.” What he did do was obscure the truth by carefully setting his threshold of data at the 29-month mark. If he’d added just a few more months, the economic growth under Obama’s administration would not have had that rosy look he was going for. A robot might not be able to see through that cunning rhetoric, but a person, like fact-checker Robert Farley, did.

The notion of bot-produced journalism is a polarizing concept for writers, editors and consumers alike. Craig Silverman, a writer for Poynter, lauds journalism bots, claiming that they are doing the grunt work, leaving the context and narrative to “real” journalists. He writes with starry-eyed passion, gushing about the potential for robots to help writers through superior semantic awareness and the ability to flag inconsistencies in previous reporting.

Emily Bell, professor of professional practice at Columbia Journalism School and director of the Tow Center for Digital Journalism, echoes his thoughts, but adds:

“Journalism by numbers does not mean ceding human process to the bots. Every algorithm, however it is written, contains human, and therefore editorial, judgments. The decisions made about what data to include and exclude adds a layer of perspective to the information provided. There must be transparency and a set of editorial standards underpinning the data collection.”

Ah, that’s the real issue – transparency. Is it ethical to put a human byline on a bot-generated story when the byline represents someone that readers have come to know and trust, à la Walter Cronkite? To me, it is unequivocally dishonest to publish a story by a bot under a human byline. In my estimation this amounts to nothing more than plagiarism of the worst kind, in which not only is the original author (the bot or the bot’s creator) not credited, but the consumer of the article is duped into believing that a human being has carefully researched, compiled and checked the facts in the article and will stand behind them. Who is liable for an error produced by a machine-generated story? The writer whose byline appears? His editor? The bot?

The Society of Professional Journalists publishes a code of ethics that for years has been used by thousands of journalists, and in classrooms and newsrooms, as a guideline for ethical decision-making. Among its tenets are the injunctions to clarify and explain news coverage and invite dialogue; to show compassion to those who might be adversely affected by news coverage; and to recognize that private individuals have a greater right to control information about themselves than do public figures. I am not sure that an algorithm has the capacity for compassion, the ability to invite dialogue or the cognizance of the difference between public and private figures that a human writer does, and this lack definitely puts the writer-bots outside the strictures of the modern journalistic code. Add to that the fact that you can’t be certain who (or what) is standing behind that byline, and you have the potential for an anarchic, and untrustworthy, approach to news-gathering and dissemination.

Perhaps the problem brought to light by journalism bots goes beyond transparency issues. The trend toward public acceptance of fill-in-the-blank, impersonal reporting is like a subtle mind-numbing disease brought on by continual exposure to the cult of instant gratification perpetuated by the digital landscape. Could the fact that we’ve become so inured to snippets of brief, emotionless data make it easy for these bots to be successful in reproducing (and stealthily replacing) the stories of their journalistic human counterparts? Are our own standards of compelling, telling journalism being compromised to get more hits and claim a higher position in the search engine hierarchy? Are we losing appreciation for long-form content that requires immersion, thoughtful consideration and analysis?

Ken Schwenke was on to something when he blithely admitted that many people would never even pick up on the fact that they are reading robot-driven content, but inadvertently he has touched upon the real problem behind robonews. We are entering a new era of reporting where you can no longer rely on a flesh-and-blood journalist’s ethics, honesty and integrity. In fact, you can’t rely on the authenticity of the byline at all since at any given time you could be reading the musings of an algorithm-based writer-bot rather than a journalist you know and trust. Rest in peace, Walter, rest in peace.

Nikki B. Williams is a freelance writer based in Houston, Texas.  She has written for a variety of clients from the Huffington Post and D.C.-based political action committees to Celtic jewelry designers in Ireland. If you’ve ever seen her try to help her kids with their geometry homework you would be positive she is not a robot.

3 Responses to “Bot Journalism: Who’s Writing That Piece You’re Reading?”

  1. Regina says:

    Recently I did a project about robot journalism and researched many aspects of it. I disagree with several points made in this article. All forms of communication, from television to print news, have had to make changes with the development of technology. I think robot journalism is just another adjustment professionals have to deal with, and it does not mean the end of professional writers’ careers. Most bot journalists are used to create stories laden with statistical data, such as business and sports reports. These are stories that are more factual, not creative or investigative reporting. Bot journalists allow real writers to focus their time on more important stories. They can also provide more sophisticated, timely fact-checking and editing services. I do not think they should be used to write more personal stories, and I also do not agree with the prediction that a bot will win a Pulitzer Prize in the future. I do agree with the notion that it is unethical to attribute a story written by a bot to a human in the byline of an article. Some articles I have seen are attributed to Narrative Science, a leading robot journalism company employed by Forbes magazine. I was unaware that news outlets would lie and say something was written by a human rather than a bot. People should be aware of who or what actually produced what they are reading. Overall, I think robot journalism should be used as a tool to assist writers, rather than something that takes over the majority of news.

  2. Grace Runkel says:

    I cannot imagine a bot-generated piece ever winning the Pulitzer Prize. I can accept that these bots are good at using algorithms to pick and choose relevant data, but you cannot do in-depth reporting if you are lacking the human element. Without an actual person going out and speaking with people about the subject, all you are left with is an article that reads like a textbook. Bots might take over the number-crunching pieces, but in-depth reporting will always require the human touch. The most appalling part of this story is that people’s names are attached to the bot-produced articles. If all you did was create the program that wrote the piece, then you do not deserve to have the byline. If Ken Schwenke wants credit for the story, the byline should say, “Story composed with algorithms created by Ken Schwenke,” or something to that effect. The best use for these bots would probably be assisting in the fact-checking process for breaking news stories. We have all seen major errors, whether they are factual or grammatical, in these stories, and using some bot-like program to check them could only help the situation. However, I firmly believe that people will always have a place in writing. Computer programs may be able to write faster and cheaper, but the quality and connection of a human-written piece will always be in demand.

  3. lcola says:

    In my view nothing is scarier than robots taking over the world and enslaving their human masters. Although this will probably never occur, it has occurred in the journalism industry. This frightens me for a number of reasons. One, the poor reporter who loses his job to a robot will now be on a painful job search. Two, knowing that a robot wrote the article I am reading takes the magic out of reading the morning paper. The realization that my worldly knowledge is being relayed to me by a non-feeling entity is disturbing. I also wonder where this will all stop. If the robots are allowed a foothold in journalism, what is to stop them from taking people’s jobs in other industries? Jobs being taken from anyone by a nonhuman source is, to me, unethical. This is a huge ethical issue, and I think more attention needs to be paid to it; before this essay I had no idea I could have been reading an article written by a robot. I applaud this essay and will do further research into this fascinating, if not life-changing, topic.
