Technology, with all of its positive contributions, has been feared for just as long as it has been celebrated. Software, mechanics, and machinery have transformed many industries, but they have also cost many individuals their jobs. People began to fear that they could be replaced by robots that could work more accurately, consistently, and quickly, without rest or pay, saving companies a fortune. Employees in the arts and culture industries seemed to be out of the woods: their work requires a high level of cognitive function that had not crossed over into the technical realm. Robots were great for repeating one task over and over again, but at the time it seemed impossible for them to actively “create”.
Recently, technology has begun to march toward the creative industries, and it has found a valuable home in journalism. Working for an increasing number of news publications, computers have become journalists and are writing the news accurately and quickly. Though the quality of the writing produced by these algorithms is debated among publications and journalists, the public does not seem to mind, or even notice, that it is being given information created by a robot. News publications can learn to use the robots in conjunction with their journalists without hurting the quality of their publications, but only if the technology is embraced and understood; otherwise the risk to jobs will rise and the news will lack its human touches. The robots have landed, and they are here to stay.
The Los Angeles Times made waves in the technology and publishing worlds on March 17, 2014, when a writing algorithm made it the first media outlet to publish a story about an earthquake that had struck just minutes earlier (Oremus, Slate). The journalist and programmer Ken Schwencke was awoken by the earthquake at 6:25 a.m.; he “rolled out of bed and went straight to his computer, where he found a brief story about the quake already written and waiting in the system. He glanced over the text and hit ‘publish.’” (Oremus, Slate). An algorithm called “Quakebot” was the real author of the story, pulling live data from the U.S. Geological Survey and plugging that data into a template of pre-written text (Oremus, Slate). The article itself read:
“A shallow magnitude 4.7 earthquake was reported Monday morning five miles from Westwood, California, according to the U.S. Geological Survey. The temblor occurred at 6:25 a.m. Pacific time at a depth of 5.0 miles.
According to the USGS, the epicenter was six miles from Beverly Hills, California, seven miles from Universal City, California, seven miles from Santa Monica, California and 348 miles from Sacramento, California. In the past ten days, there have been no earthquakes magnitude 3.0 and greater centered nearby.
This information comes from the USGS Earthquake Notification Service and this post was created by an algorithm written by the author.” (as qtd. in Slate)
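The template-filling approach Oremus describes can be sketched in a few lines of Python. The template wording and field names below are illustrative assumptions, not the Times’ actual code:

```python
# Illustrative sketch of Quakebot-style template journalism: live feed
# values are plugged into pre-written text. The template wording and
# field names are assumptions, not the Los Angeles Times' actual code.

QUAKE_TEMPLATE = (
    "A shallow magnitude {magnitude} earthquake was reported {day} morning "
    "{distance} miles from {place}, according to the U.S. Geological Survey. "
    "The temblor occurred at {time} at a depth of {depth} miles."
)

def write_quake_story(feed_data: dict) -> str:
    """Plug values from a live data feed into the pre-written template."""
    return QUAKE_TEMPLATE.format(**feed_data)

print(write_quake_story({
    "magnitude": 4.7,
    "day": "Monday",
    "distance": "five",
    "place": "Westwood, California",
    "time": "6:25 a.m. Pacific time",
    "depth": "5.0",
}))
```

The program contributes no judgment of its own; every word except the inserted values was written by a person in advance, which is why such stories read as plainly factual.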
This is an example of “automated journalism”, and it is not the first. Steven Levy reports in Wired that, in addition to Quakebot, programs are being used to write much more, including “a pennant-waving second-half update of a Big Ten basketball contest, a sober preview of a corporate earnings statement, or a blithe summary of the presidential horse race drawn from Twitter posts” (Levy, Wired). All automated journalism shares a common feature: it pulls in data and presents that data in an appropriate narrative form.
Articles written by these algorithms are best used to share or summarize facts and data, and they are not programmed to contain any bias or opinion. Kristian Hammond, a leading scientist in the field of automated journalism, co-founded Narrative Science in 2010, which sells these writing systems to news agencies (Levy, Wired). The systems use template text written by writers; engineers then program the computer, telling it which information to pull and how to present it (Levy, Wired). In the case of a sports team, the algorithm must consider things like:
“Who won the game? Was it a come-from-behind victory or a blowout? Did one player have a fantastic day at the plate? The algorithm considers context and information from other databases as well: Did a losing streak end?” (Levy, Wired).
Finally, using vocabulary provided by the writers, the system drafts a narrative. Errors are rare, and the system has built-in measures to protect against them: “If the algorithm realises some data is missing it will stop and ask for it. Once it has what it needs, it goes back to work” (Eudes, The Guardian).
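The safeguard Eudes describes (stop when data is missing, resume once it arrives) can be sketched as follows. The required fields and the exception used to “ask” for data are assumptions for illustration, not Narrative Science’s actual implementation:

```python
# Sketch of the missing-data safeguard described above. The field names
# and the exception-based "ask" are illustrative assumptions, not
# Narrative Science's actual implementation.

REQUIRED_FIELDS = ("winner", "loser", "final_score", "standout_player")

def draft_game_recap(game_data: dict) -> str:
    # If any data is missing, stop and ask for it before writing a word.
    missing = [field for field in REQUIRED_FIELDS if field not in game_data]
    if missing:
        raise ValueError("Missing data, please supply: " + ", ".join(missing))
    # Once it has what it needs, it goes back to work, filling the
    # writers' template vocabulary with the game data.
    return ("{winner} beat {loser} {final_score}, led by a standout "
            "performance from {standout_player}.".format(**game_data))
```

Refusing to draft until every field is present is what keeps errors rare: the system never guesses at a fact it was not given.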
Computer-generated articles cannot replicate a journalist’s ability to interview, make connections, and reflect, which suggests that they are of lower quality. Christer Clerwall notes that a major factor in the assessment of quality is in fact credibility (Clerwall 521). Clerwall also cites a 2000 study which found that readers are aware that content passes through a filter but expect that articles “provide for a degree of information reliability, i.e. the information should be factually correct and (at least somewhat) objective” (Clerwall 522). Programs like Quakebot do exactly this: they provide objective, factual information, and thus arguably produce quality work. However, they fall short on many other markers of quality, including being comprehensive, considerate of reader interests, lively, and creative (Clerwall 523).
At the same time, readers of news publications are largely unable to differentiate between an article written by a person and one written by a computer. In a study conducted by Clerwall, readers were split into two groups: one was given an article written by a journalist, the other an article written by a computer. The study found almost no significant differences in how the texts were perceived by readers (Clerwall 526). StatSheet founder Robbie Allen found similar results, telling the New York Times that “he believes fewer than 20 percent of his readers ever suspect they’re reading something produced by a computer program” (van Dalen 651).
The competition for space in print publications is a valid concern, but in the online space journalists do not have to compete with a computer (Carlson 6). Schwencke believes programs like Quakebot get information out quickly and simply while serving as a starting point that journalists can expand as more information becomes available; he notes that Quakebot’s earthquake article was updated by humans 71 times by noon that same day (Oremus, Slate). Kristian Hammond also remarks on what the writing robots will actually be used to cover:
“Robonews tsunami… will not wash away the remaining human reporters who still collect paychecks. Instead the universe of newswriting will expand dramatically, as computers mine vast troves of data to produce ultracheap, totally readable accounts of events, trends, and developments that no journalist is currently covering.” (Levy, Wired).
Hammond suggests that by 2025, 90% of news will be written by computers, and technology advances rapidly with time, so regardless of how journalists feel, they will soon find themselves face to face with the robo-writer (Levy, Wired). Hammond reassures that “that doesn’t mean that robots will be replacing 90% of all journalists, simply that the volume of published material will massively increase” (Eudes, The Guardian). Matt Carlson likewise concludes that, amid the fear of the unknown and the potential harm to journalists, automated news can also benefit them by freeing up time spent chasing facts and mechanics, and by helping to identify patterns and timelines (Carlson 14). Journalists have a valuable tool in their midst, but they need to redefine themselves and focus on what they contribute to news publishing; otherwise, the robots will be happy to take their place.