Robots take over Arts and Culture… #NOT

Technology, for all of its positive contributions, has been feared for as long as it has been celebrated. Software, mechanics, and machinery have transformed many industries, but have also cost many individuals their jobs. People began to fear that they could be replaced by robots that could work more accurately, more consistently, and faster, without rest or pay, saving companies a fortune. Employees in the arts and culture industries seemed to be out of the woods: their work requires a high level of cognitive function that had not crossed over to the technical realm. Robots were great at repeating one task over and over again, but at the time it seemed impossible for them to actively “create”.

Recently, technology has begun to march into the creative industries, and it has found a valuable home in journalism. Working for an increasing number of news publications, computers have become journalists, writing the news accurately and quickly. Though the quality of the writing produced by these algorithms is debated among news publications and journalists, the public does not seem to mind, or even notice, that it is being given information created by a robot. News publications can learn to use the robots alongside their journalists without hurting the quality of their publications, but only if the technology is embraced and understood; otherwise the risk to jobs will rise and the news will lose its human touch. The robots have landed and are here to stay.

The Los Angeles Times made waves in the technology and publishing worlds on March 17th, 2014, when, using a writing algorithm, it became the first media outlet to publish a story about an earthquake that had happened just minutes before (Oremus, Slate). The journalist and programmer Ken Schwencke was awoken by the earthquake at 6:25 am; he “rolled out of bed and went straight to his computer, where he found a brief story about the quake already written and waiting in the system. He glanced over the text and hit ‘publish.’” (Oremus, Slate). An algorithm called “Quakebot” was the real author of the story, pulling live data from the U.S. Geological Survey and plugging it into a template of pre-written text (Oremus, Slate). The article itself read:

“A shallow magnitude 4.7 earthquake was reported Monday morning five miles from Westwood, California, according to the U.S. Geological Survey. The temblor occurred at 6:25 a.m. Pacific time at a depth of 5.0 miles.

According to the USGS, the epicenter was six miles from Beverly Hills, California, seven miles from Universal City, California, seven miles from Santa Monica, California and 348 miles from Sacramento, California. In the past ten days, there have been no earthquakes magnitude 3.0 and greater centered nearby.

This information comes from the USGS Earthquake Notification Service and this post was created by an algorithm written by the author.” (as qtd. in Slate)

This is an example of “automated journalism”, and it isn’t the first. Steven Levy reports in Wired that, in addition to Quakebot, programs are being used to write much more, including “a pennant-waving second-half update of a Big Ten basketball contest, a sober preview of a corporate earnings statement, or a blithe summary of the presidential horse race drawn from Twitter posts” (Levy, Wired). All automated journalism shares a common feature: it pulls in data and presents that data in an appropriate way.
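The data-to-template pattern these systems share can be sketched in a few lines of Python. This is a hypothetical illustration, not Quakebot’s actual code: the field names, template text, and function name are invented for the example.

```python
# Hypothetical sketch of the pattern described above: pull structured
# data from a feed, then plug it into pre-written template text.
# (Invented for illustration; not Quakebot's real implementation.)

QUAKE_TEMPLATE = (
    "A shallow magnitude {magnitude} earthquake was reported {day} "
    "{time_of_day} {distance} miles from {city}, according to the "
    "U.S. Geological Survey."
)

def write_quake_story(data: dict) -> str:
    """Fill the pre-written template with the structured feed data."""
    return QUAKE_TEMPLATE.format(**data)

# In practice the dict would come from a live data feed; here it is
# hard-coded to mirror the quoted article.
story = write_quake_story({
    "magnitude": 4.7,
    "day": "Monday",
    "time_of_day": "morning",
    "distance": 5,
    "city": "Westwood, California",
})
print(story)
```

The “writing” is entirely in the human-authored template; the program only selects and inserts the data, which is why such systems read fluently on routine stories but cannot stray from the script.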

Articles written by these algorithms are best used to share or summarize facts and data, and aren’t programmed to contain any bias or opinion. Kristian Hammond is a leading scientist in the field of automated journalism, and in 2010 he co-founded Narrative Science, which sells these writing systems to news agencies (Levy, Wired). The systems use template text written by writers; engineers then program the computer, telling it what information to pull and how to present it (Levy, Wired). In the case of a sports game, the algorithm must consider things like:

“Who won the game? Was it a come-from-behind victory or a blowout? Did one player have a fantastic day at the plate? The algorithm considers context and information from other databases as well: Did a losing streak end?” (Levy, Wired).

Finally, using vocabulary provided by the writers, the system drafts a narrative. Errors are rare, and the system has built-in measures to protect against them: “If the algorithm realises some data is missing it will stop and ask for it. Once it has what it needs, it goes back to work” (Eudes, The Guardian).
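The missing-data safeguard Eudes describes might look something like the following in outline. The field names, template, and error-raising behaviour are assumptions made for illustration, not the vendors’ actual design.

```python
# Hypothetical sketch of the safeguard described above: before drafting,
# the system checks that every required field is present, and "stops and
# asks" (here, raises an error) rather than publish an incomplete story.
# (Field names and template are invented for illustration.)

REQUIRED_FIELDS = ["home_team", "away_team", "home_score", "away_score"]

GAME_TEMPLATE = "Final score: {home_team} {home_score}, {away_team} {away_score}."

def draft_game_story(data: dict) -> str:
    """Draft a game recap, refusing to proceed if any data is missing."""
    missing = [field for field in REQUIRED_FIELDS if field not in data]
    if missing:
        # Stop and report exactly which data is needed; once the caller
        # supplies it, the drafting "goes back to work".
        raise ValueError("missing data: " + ", ".join(missing))
    return GAME_TEMPLATE.format(**data)
```

For example, `draft_game_story({"home_team": "Hawks"})` would raise an error naming the three absent fields instead of emitting a half-filled sentence, which is the behaviour the quoted description implies.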

Computer-generated articles cannot replicate a journalist’s ability to interview, make connections, and reflect, which suggests that their articles are of lower quality. Christer Clerwall comments that a major factor in the assessment of quality is in fact credibility (Clerwall 521). Clerwall also cites a 2000 study which found that readers are aware that content passes through a filter, but expect that articles “provide for a degree of information reliability, i.e. the information should be factually correct and (at least somewhat) objective” (Clerwall 522). Programs like Quakebot do exactly this: they provide objective, factual information, and thus are arguably producers of quality work. However, they fall short on many other descriptors of quality, including being comprehensive, considerate of reader interests, lively, and creative (Clerwall 523).

However, readers of news publications are largely unable to differentiate between an article written by a person and one written by a computer. In a study conducted by Clerwall, readers were split into two groups: one group was given an article written by a journalist, the other an article written by a computer. The study found almost no significant differences in how the texts were perceived by the readers (Clerwall 526). StatSheet founder Robbie Allen found similar results, telling the New York Times that “he believes fewer than 20 percent of his readers ever suspect they’re reading something produced by a computer program” (van Dalen 651).

The competition for space in print news publications is a valid concern, but in the online space journalists do not have to compete with a computer (Carlson 6). Schwencke believes programs like Quakebot get information out quickly and simply while serving as a starting point for journalists to expand later as more information becomes available, noting that Quakebot’s article on the earthquake was updated by humans 71 times by noon that same day (Oremus, Slate). Kristian Hammond also remarks on what the writing robots will be used to write:

“Robonews tsunami… will not wash away the remaining human reporters who still collect paychecks. Instead the universe of newswriting will expand dramatically, as computers mine vast troves of data to produce ultracheap, totally readable accounts of events, trends, and developments that no journalist is currently covering.” (Levy, Wired).

Hammond suggests that by 2025, 90% of news will be written by computers, and technology advances rapidly with time; regardless of how journalists feel, they will soon find themselves face to face with the robo-writer (Levy, Wired). Hammond reassures that “that doesn’t mean that robots will be replacing 90% of all journalists, simply that the volume of published material will massively increase” (Eudes, The Guardian). Matt Carlson likewise concludes that, amidst the fear of the unknown and the potential harms to the journalist, automated news can also benefit journalists by freeing up time spent gathering facts and handling mechanics, and by helping to identify patterns and timelines (Carlson 14). Journalists have a valuable tool in their midst, but they need to redefine themselves and focus on what they contribute to news publishing; otherwise the robots will be happy to take their place.


Sources

Carlson, Matt. “The Robotic Reporter.” Digital Journalism. Routledge, 11 Nov 2014.

Clerwall, Christer. “Enter the Robot Journalist.” Journalism Practice. Routledge, 25 Feb 2014.

Eudes, Yves. “The journalists who never sleep.” The Guardian: Tech. The Guardian, 12 Sept 2014.

Levy, Steven. “Can an Algorithm Write a Better News Story Than a Human Reporter?” Wired: Science. Wired Magazine, 24 April 2012.

Oremus, Will. “The First News Report on the L.A. Earthquake Was Written by a Robot.” Future Tense. Slate, 17 March 2014.

van Dalen, Arjen. “The Algorithms Behind the Headlines.” Journalism Practice. Routledge, 30 March 2012.

2 Replies to “Robots take over Arts and Culture… #NOT”

  1. This is a very informative essay on how “automated journalism” works. The earthquake example clearly explained what these robots or algorithms currently do. While it does appear at first that a human wrote the article on the earthquake, once you know what you are looking at you can see it is strictly data.

    It also makes sense that journalists will still get paid if that is all the robots do, but it sounds like that is not all they will do, and that leaves me with some questions. If it is true that “the volume of published material will massively increase”, is it not possible that people will go online for a quick news fix, reading only what these ’bots have written and no human-generated content at all?

    The world is already chock-a-block full of content, as Hugh McGuire points out in his article “Sifting Through all these Books” on the book publishing industry, and if people cannot differentiate between robot- and human-generated content, as Clerwall concluded in his study, then why would human-written articles get read as much as ’bot-written articles? It sounds like there will be so much more robot-generated content that some journalists and writers will lose their jobs. Sure, robots cannot be creative, but until recently it was only humans who reported weather and sports statistics; what will those journalists be doing now? Aren’t they out of a job?

    Your title is “Robots take over Arts and Culture… #NOT”, but I feel that by the end you were saying that journalists were going to have to fight hard to retain their jobs. Your closing sentence (“Journalists have a valuable tool in their midst, but need to redefine themselves and focus on what they contribute to news publishing, otherwise the robots will be happy to take their place.”) seems to contradict your title and initial argument that robots will not be taking jobs. It feels like perhaps you had your argument, but after doing research came to a different conclusion.

    Lastly, you did not mention Elon Musk, Bill Gates, and Stephen Hawking who have all expressed their fear that artificial intelligence will be the end of the human race. In a January 15, 2015 article on wired.com, it was reported that Elon Musk, a tech mogul, has donated 10 million dollars to the Future of Life Institute to “keep AI from running loose and growing into something that’s a real danger to humans.”

    In a December 2014 article on bbcnews.com, Stephen Hawking said that “the development of full artificial intelligence could spell the end of the human race”, which is a pretty bold statement from one of Britain’s pre-eminent scientists. And last week in an Ask Me Anything on Reddit, Bill Gates revealed that he agrees with Elon Musk and Stephen Hawking and was “onboard” about their position on artificial intelligence.

    This amount of detail on AI may have been too much, but I think it would have been important to note how three of the world’s leading minds feel about the rapidly increasing rate at which machines are thinking for us.

    Overall the essay was well written, but the argument seemed to shift near the end.

  2. This was an interesting article with some good examples and quotes from those producing the algorithms or otherwise involved. I found it presented the points of view expressed by those individuals well, along with a clear explanation of the current state of the art in automated journalism.

    While you presented the points of view of others clearly, your own voice did not come through. I would have wanted to know your take on the subject, where you see journalism going, and whether you think this kind of automated reporting is something to be feared.

    A couple more paragraphs or a little more opinion interspersed throughout would have helped to bring this from a factual account to an opinion piece. Otherwise, we might mistake it for an essay written by a robot.
