
Finding the Words: Digital Publishing and the Rise of Translated Texts

Translation plays a vital role in the spread of important texts and documents on a global scale. The past century has seen rapid globalization, and digital technologies, particularly the Internet, have made it possible for people to communicate instantly from anywhere in the world, increasing the need for translation. This has driven massive development in on-demand translation software, such as that behind Google Translate, and an increase in translated texts being made available to a global public. This paper analyzes the influx of translation brought about by the rise of digital publishing and globalization, and the largely Anglo-Western bias prominent within the industry.

The Long Tail and Translation:
Amazon was one of the first companies to take advantage of the Long Tail, offering a much larger inventory on demand and using the digital storefront of the Internet rather than a brick-and-mortar store to allow for a seemingly endless stock. In doing so, Amazon is able to “treat consumers as individuals, offering mass customization as an alternative to mass-market fare” (Anderson 2004). This has also been its approach to translated books. Through its translation imprint, AmazonCrossing, Amazon seeks to address the massive gap between literary texts in translation, which are largely produced by small presses, and the books produced by large corporate presses, which typically come from big-name international authors such as Haruki Murakami and Stieg Larsson. AmazonCrossing is discussed in an article interviewing Chad Post, head of the non-profit translation press Open Letter Books.

Post states in his interview with Len Edgerly that he personally experienced a “lack of what was not available for international books and books in translation.” This emphasizes that there are unique and important ideas in some languages that are not being addressed in others, a gap that could readily be filled by translated texts. The article opens by citing statistics from 2015: 151 presses in the US published some form of translation that year, and AmazonCrossing led with 75 titles (nearly 14% of all translated texts in the States), three times more than the next publisher (Edgerly 2016). While the number of presses invested in some form of translated literature is inspiring, the arithmetic is sobering: if 75 titles is nearly 14% of the total, then only around 540 titles were translated in the US that year. This is barely a drop in the bucket compared with the total number of books the country produces annually. Therefore, while translation may arguably be on the rise, it remains a largely insignificant area of publishing in the US. In addition to the lack of international representation in literature, the texts that are translated are not as diverse as we might hope for in today’s global culture. Of the three translated AmazonCrossing books that the author mentioned having read in the past year, two were European (written in Danish and Polish respectively) and the third was translated from Turkish (Edgerly 2016).

However, while a large player in translated literature in the US, AmazonCrossing is not the only publisher of translations. Edgerly’s article mentions one other translation press, Open Letter Books, based at the University of Rochester. Post works “as publisher of Open Letter Books, which has a staff of three people,” and Edgerly writes that “it’s pleasing to see that he is now helping to increase the supply of translated books” (Edgerly 2016). Of the two Open Letter titles mentioned in the article, one was A Greater Music by Korean author Bae Suah (the other being the Argentinian novel Gesell Dome). While this exhibits an expansion into a more global realm of translation, the press is ultimately tiny and cannot match AmazonCrossing’s numbers. Moreover, as a largely consumer-driven company (unlike the academically driven Open Letter Books), AmazonCrossing can be read as a representation of what the public wants, which is a small number of mainly European-to-English translations. Ultimately, while overall interest in translation does appear to be growing, it remains a largely Euro-centric enterprise in a Western world reluctant to branch out.

However small the number of translated texts may be, there has been enough of an increase in readership that people are taking notice. In an October 2016 Publishers Weekly article, it was announced that the National Book Foundation will be conducting a study of translation in the United States. The study will cover the expected questions: how many translated texts are published and where they are purchased, how the availability of translated texts affects these numbers, and how this availability affects the ways that people read (Maher 2016). Of note, the study will also analyze the diversity of translated works, which I hope will expose the Euro-centrism I mentioned earlier. Ideally, this exposure will result in a truly diverse range of texts and support what National Book Foundation executive director Lisa Lucas says: that “all readers and all literature” must be valued and represented (Maher 2016).

Algorithmic Translation:
In a world where people increasingly engage with material in languages different from their own, the need for instant translation services has risen dramatically. One of the most widely used examples is Google Translate. But how does this process of computerized translation actually work? Importantly, Google Translate is not a translator but a search engine: it creates matches by scanning the Internet for translated material that matches the phrase it is currently trying to translate (Allen 2016). The process relies on stock phrases, effectively treating language as a finite set of sentences. What people fail to realize is that their assumption that Google Translate produces an “unvarying, literal, mathematical, algorithmically precise translation” is largely untrue: the system cannot account for literary voice and spirit, nor can it produce grammatical sentences any more original than the stock set it relies on (Allen 2016).
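To make the limitation concrete, here is a deliberately crude sketch of the “search engine” behaviour Allen describes: translation as lookup against previously translated material. The phrase table and function below are invented for illustration and are in no way Google’s actual implementation.

```python
# A caricature of translation-by-matching: every "translation" is just
# a lookup against a stock of previously translated phrases. These
# entries are invented examples, not real Google Translate data.
phrase_table = {
    "good morning": "buenos días",
    "thank you very much": "muchas gracias",
    "where is the library": "dónde está la biblioteca",
}

def translate(sentence: str) -> str:
    """Return a stored match if one exists; fail on anything novel."""
    key = sentence.lower().strip(" ?!.")
    # No grammar, no meaning: only an exact match against the stock set.
    return phrase_table.get(key, "[no match found]")

print(translate("Thank you very much!"))    # -> muchas gracias
print(translate("Thank you ever so much"))  # -> [no match found]
```

Anything outside the stock set simply fails, which is precisely why such a system cannot reproduce literary voice or phrasing more original than what it has already seen.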

Adam Geitgey also addresses the functionality of Google Translate under the heading of “Machine Translation,” introducing a newer approach called “sequence-to-sequence learning” (Geitgey 2016). Geitgey traces the evolution of the field: from early (and wildly inaccurate) word-for-word translations, to translations built on language-specific rules and algorithms, to a reliance on probability and statistics rather than the grammar rules of particular languages. In the statistical approach, the system generates thousands of potential translations based on previous translations already deemed accurate, then ranks them by their likelihood of correctness. Google Translate has used this method since the early 2000s; however, it still has flaws. Perhaps the most upsetting is that when translating between two less common languages, English is introduced as a middleman to compensate for the lack of direct translation data between the two, resulting in a less accurate translation (and a further example of the linguistic dominance of English on a global scale). A more recent attempt to move beyond word-for-word methods is the “recurrent neural network” developed around 2014, which uses the results of previous translations to adjust future calculations through accuracy feedback (Geitgey 2016). While still early in development, in just two years this technology has already matched and begun to exceed translation systems that took twenty years to build. Hopefully it will resolve the remaining issues between common languages, but how recurrent neural networks will address less common languages remains to be seen.
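For readers curious what “sequence-to-sequence learning” looks like in practice, the sketch below shows the bare encoder-decoder skeleton of the approach Geitgey describes, written in Python with PyTorch. The class name, vocabulary sizes, and dimensions are toy values chosen for illustration; Google’s production system is vastly larger and more sophisticated.

```python
# A toy encoder-decoder ("sequence-to-sequence") translator in PyTorch.
# All sizes and the random "sentences" below are illustrative only.
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, src_vocab: int, tgt_vocab: int, hidden: int = 128):
        super().__init__()
        self.src_embed = nn.Embedding(src_vocab, hidden)
        self.tgt_embed = nn.Embedding(tgt_vocab, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        # The encoder reads the whole source sentence and compresses it
        # into a single hidden state (a "meaning" vector).
        _, state = self.encoder(self.src_embed(src_ids))
        # The decoder generates the target sentence conditioned on that
        # state, token by token, rather than translating word for word.
        dec_out, _ = self.decoder(self.tgt_embed(tgt_ids), state)
        return self.out(dec_out)  # per-token scores over the target vocab

model = Seq2Seq(src_vocab=1000, tgt_vocab=1000)
src = torch.randint(0, 1000, (1, 7))  # a 7-token "source sentence"
tgt = torch.randint(0, 1000, (1, 9))  # a 9-token "target sentence"
print(model(src, tgt).shape)          # torch.Size([1, 9, 1000])
```

Even at this scale the key idea is visible: the source sentence is encoded whole and the translation is generated from that representation, which is what allows the approach to move past word-for-word substitution.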

It is widely thought that, despite attempts to write algorithms for producing good literature, no computer will be able to write a book the way a human does. I believe the same holds for translation. Rewriting a book in a different language is not a word-for-word endeavour; it requires an understanding of metaphorical and literary language, and the ability to produce a book that an audience can connect with. Will the computerized translation of literature ever carry the same emotional weight as the work of a human translator? I have my doubts.

An Unsatisfactory Conclusion:
Despite the developments being made in the field of translation, I still hold several concerns. To begin, the ratio of translated texts is heavily skewed: the vast majority of translations move from English into other languages. Conversely, the comparatively few texts translated into English are typically those that have already achieved significant popularity in their languages of origin and so carry a built-in demand for translation. This demonstrates just how strong the dominance of the English language is becoming, even in an increasingly global sphere; the demand for originally English texts remains high, whereas the reverse holds true for only a minuscule number of other texts. Post is quoted as saying that “what drew him to world literature was the greater experimentation and innovation he found there, compared with American authors,” and I argue that some of this innovation is lost when authors feel obligated to write in another language (Edgerly 2016). Therefore, rather than globalization and translation producing a balanced flow of ideas from many points of origin, some languages are being valued over others, in instances amounting to linguistic imperialism. In a world where communication is constantly described as a network, the relationship between “The West and The Rest” remains largely one-sided.

References:
Allen, Esther. (2016, August 26). Can Google Help Translate a Classic Novel? Retrieved from: Publishers Weekly. http://www.publishersweekly.com/pw/by-topic/industry-news/tip-sheet/article/71273-google-translating-a-classic-novel.html

Anderson, Chris. (2004, October 1). The Long Tail. Retrieved from: Wired. https://www.wired.com/2004/10/tail/

Edgerly, Len. (2016, August 27). Found in Translation: How Amazon is filling a gap in world literature. Retrieved from: Teleread.org. https://teleread.org/2016/08/27/found-in-translation-how-amazon-is-filling-a-gap-in-world-literature/

Geitgey, Adam. (2016, August 21). Machine Learning is Fun Part 5: Language Translation with Deep Learning and the Magic of Sequences. Retrieved from: Medium.com. https://medium.com/@ageitgey/machine-learning-is-fun-part-5-language-translation-with-deep-learning-and-the-magic-of-sequences-2ace0acca0aa#.gpelbe6qb

Maher, John. (2016, October 4). NBF to Conduct Translation Study. Retrieved from: Publishers Weekly. http://www.publishersweekly.com/pw/by-topic/industry-news/bookselling/article/71659-nbf-to-conduct-translation-study.html

Interview with Matthew Kirschenbaum – Kelsey Wilson

Kelsey Wilson
PUB 401
September 20, 2016

Development of the Digital Humanities in Manuel Portela’s
An Interview with Matthew Kirschenbaum

This interview presents a discussion with Matthew Kirschenbaum, author of Track Changes: A Literary History of Word Processing, released earlier this year. Following the themes of his book, Kirschenbaum answers several questions that compare and contrast the development of writing through word processing with more traditional mediums, including the typewriter and longhand writing. His book centers on the two decades from 1964 to 1984, the era in which word processing went from new innovation to modern convenience. Importantly, neither Portela nor Kirschenbaum offers an overall theory of the impact of word processing technology on the publishing industry; instead they provide a fascinating exploration of the many developments and changes that have led to such rapid change in literature production. Portela divides his interview into a series of nine questions and responses, and I will focus on three points of interest I found among them.

Kirschenbaum’s closing comment to Portela’s question about significant moments in the adoption of word processing for literary writing echoes the stunning diagram drawn by our professor in last week’s class: “the history [of word processing in literary writing] itself is rarely one of simply linear progress” (Portela 2016). Technological advancements that impacted publishing seemed to arrive every year from the late 1970s through the 1980s; Apple alone released the Apple II in 1977 and the Macintosh in 1984. Yet despite many successful developments, many more were flops, producing circular patterns and dead ends throughout the history of word processing.

A very minor comment made by Kirschenbaum, answering a question about how discourse around word processing developed as the technology became more advanced and prevalent, gave me pause. Moving beyond authors mentioning word processing in their works, Kirschenbaum cites 1984 as “the year the illustrator David Levine began sometimes drawing authors with computers instead of typewriters or fountain pens in his caricatures for the New York Review of Books” (Portela 2016). This moment seems incredibly significant because it implies that the majority of viewers would have had to be familiar with computers and word processors, and their use in literary writing, for the illustrations to make sense. It is also interesting that the quotation places fountain pens alongside typewriters as earlier tools of the author, which made me question the fall of the typewriter in the wake of word processing. Personally, when writing comes to mind I think of both the computer and pen and paper, while the typewriter falls by the wayside; it makes me wonder which tool of the trade will disappear next.

As a World Literature major, I am very interested in how texts are shaped, both physically and figuratively, and this is addressed in the seventh question of the interview. Kirschenbaum emphasizes the texture of the prose and cites composition theorist Christina Haas’s notion of the “‘sense of the text,’” which I believe speaks to both senses of how texts are shaped (Portela 2016). Both concepts refer to the mental model an author holds of a work in progress, and with the ease of word processing the need for a purely mental model is disappearing. Rather than having to keep tabs on the various elements of their prose, writers can now refer back to a physical model of their work with a single click. The same ease allows instant changes to be made to a draft throughout the writing process, which impacts a writer’s approach as they are able to jump around and write in a non-linear fashion. Additionally, because word processors simplify formatting, particularly in the more recent developments Kirschenbaum mentions in his book and interview, the possibilities for unique and engaging formatting of a text are far more diverse than anything a typewriter or earlier processor could offer.

To conclude, I found the interview admirably neutral on such a polarizing and hotly debated topic as the development of technology and its impact on literature. It brings facts about the history of word processing into a current debate while making a variety of predictions about the future. I am most looking forward to observing the outcome of the paradox Portela sets out between the “excess of information” and the “loss of information” surrounding word processing technologies and digital information, as I believe this will be a crucial issue in the near future (Portela 2016).

WORKS CITED

Portela, Manuel. “This strange process of typing on a glowing glass screen: An Interview with Matthew Kirschenbaum.” MATLIT 4.2 (2016): http://iduc.uc.pt/index.php/matlit/article/view/3017/2283. Accessed 16 September 2016.
