Information Sharing Online and in Coffeehouses:
Gatekeepers and Social Discourse
Information sharing today has reached an unprecedented peak. Higher literacy rates, the accessibility of the Internet, and the sheer number of pages online, including blogs, comments, and profile pages, contribute to an endless stream of information that must be sorted through in order to be understood. But what are the side effects of the ways users sort through content? By examining the social changes surrounding information sharing during the Age of Enlightenment and comparing them to the challenges of sharing knowledge on a website such as Facebook, this essay will argue that while algorithms help manage the expansive amount of information on the web, they ultimately produce a less knowledgeable, less informed online community. It will examine how the Age of Enlightenment thrived where the Internet, despite its potential for progress and innovation, is failing.
The Age of Enlightenment was a period in eighteenth-century Europe marked by a movement against the then-current state of society, including church and government. In pre-Enlightenment Europe, “individual dignity and liberty, privacy, and human and civic rights… [were] virtually nonexistent… ‘burned and buried’ in medieval society and pre-Enlightenment traditionalism” (Zafirovski 9). This illustrates the church and state’s role as gatekeepers of knowledge, allowing society access only to what they deemed appropriate. Zafirovski states that during the Enlightenment, “Descartes, Voltaire, Diderot, Kant, Hume, Condorcet, and others emphasized overcoming ignorance and intellectual immaturity, including religious and other superstition and prejudice” (4). He is referring to the major thinkers of the time, those who wrote public essays on the tenets of enlightenment and reason. It was an age in which past ideals were rejected in order to champion individual thought and voice. It was not a period of opposition to religion or state as such, but of individual liberty and of pushing against absolutism. During this time the Encyclopédie was published, disseminating the thought of the Enlightenment. Diderot, the editor of the project, is quoted as saying that the goal of the Encyclopédie was to “change the way people think” (“Encyclopédie”). During the Enlightenment, the opinions of those who wanted to remain within the norms of pre-Enlightenment society existed alongside the writings of those who proclaimed it was time for change: “The inner logic, essential process, and ultimate outcome of the Enlightenment are the destruction of old oppressive, theocratic, irrational, and inhuman social values and institutions, and the creation of new democratic, secular, rational, and humane ones through human reason” (Zafirovski 7).
The thinking that existed before the Enlightenment was a necessary precursor; the prominent thinkers emerged from a society whose rules they did not identify with. In other words, they had to know the culture they lived in intimately in order to argue strongly against it.
As noted above in regard to the Encyclopédie, the dissemination of knowledge was paramount during the Enlightenment. For the purposes of this paper, the major channels of knowledge-spread can be reduced to two: book publishing, and the salons and coffeehouses. As Martin Luther’s Ninety-Five Theses had demonstrated much earlier, spreading printed information became far simpler and more efficient after Johannes Gutenberg’s invention of movable type. Before this invention, religious scribes wrote out by hand all of the books that were available. Because this was such a labour-intensive process and paper was handmade, books were very expensive. As time went on, however, the efficiency of the printing press grew, especially with the beginning of the Industrial Revolution. This meant lower prices and therefore greater availability; in turn, literacy grew. Furthermore, the low cost allowed journals, books, newspapers, and pamphlets to spread more widely (“Age of Enlightenment”). More people could engage with texts because of higher literacy rates and the growing number of texts now available. Once articles, essays, and books were read, they were discussed in places such as coffeehouses and salons, where both men and women could meet to debate the ideas of the time. This created a social environment that was a catalyst for new philosophies. In fact, the idea for the Encyclopédie was conceived at the Café Procope, one of the Paris coffeehouses still in operation today (“Age of Enlightenment”). Moreover, because anyone could come to discuss politics and philosophy, these venues undermined the existing class structure, bringing multiple perspectives together in one place.
At the time of its introduction, no one could have predicted how deeply an open public Internet would become ingrained in human society and culture. Douglas Comer attributes the rapid growth of the Internet to its decentralization and the “non-proprietary nature of internet-protocols” (qtd. in “Internet”). As the Internet became popular, information grew at an unprecedented speed. New websites with personalized homepages and links emerged as people began to explore the World Wide Web. Today, sites such as Facebook act as home websites, replacing the “homepages” of before. As “The Rise of Homeless Media” shows, this model is beginning to replace the old ways of the web. Facebook has become a much bigger entity than its developers imagined at its conception. While this change may mean that the web is becoming streamlined, it comes at a cost of control for users. In the ’90s and early 2000s, the popular free web hosting services provided a very personalized experience. Sites such as Angelfire, Freewebs, LiveJournal, and DiaryLand relied on subscribers and ads to keep running while letting users personalize their content, with the exception of ad placement for non-subscribers. Personalization occurred through writing code such as HTML. Furthermore, serious bloggers acted as catalysts for other voices, creating a community in which readers were linked to other bloggers and informative sites with related ideologies or topics. For instance, Mike Shatzkin’s The Shatzkin Files hyperlinks to other sites that may interest a reader of that particular subject. Though it is a fairly recent blog, its design is basic, reminiscent of much earlier blogging interfaces. Today, blogs are increasingly popular and come with pre-made themes, making coding unnecessary, although it remains possible on platforms such as WordPress.
On Facebook, however, users cannot change the style of their page. This control of style is one way the web is becoming more streamlined. The primary benefit of living on a home website such as Facebook, Twitter, Instagram, or LinkedIn is accessibility. Each site has its own niche purpose, and learning to code is not necessary to run these pages. One simply needs to know how to link the various pages properly to allow integrated movement across platforms. Because users do not need to understand code in order to have a profile on these websites, their user base is much larger. This is comparable to the accessibility of literature in the eighteenth century, which made reading a pastime for more than just an educated elite.
This ease of use has led to a global reach of perspectives. In this sense, the age of the Internet parallels the Age of Enlightenment in that the proliferation of knowledge is now much easier than it was in the past. Today, over one billion pages exist on the web (Woollaston). The billions of people using the web have access to a multitude of differing perspectives and insights (“Internet Usage in the World by Regions”). Though this has the potential for tension, research suggests it helps develop critical thinking and empathy. In the article “How Diversity Makes Us Smarter,” Katherine Phillips states, “social diversity… can cause discomfort, rougher interactions, a lack of trust, greater perceived interpersonal conflict, lower communication, less cohesion, more concern about disrespect, and other problems.” However, being confronted with these problems and having to mediate around diversity enhances creativity, “leading to better decision making and problem solving” (Phillips). Thus, diversity creates adversity, but yields good results when people are encouraged to consider other people’s perspectives. Our minds are prompted to work harder when disagreement arises from social differences, and a difference in perspective “[encourages] the consideration of alternatives” (Phillips). The article, published in Scientific American, puts words to a phenomenon being studied by “organizational scientists, psychologists, sociologists, economists and demographers” (Phillips). It illustrates why salons and coffeehouses were so important as places to spark conversation. They were hubs of discourse that generated innovative ideas and ideologies, sometimes for pleasure, but at other times to create planned social movements such as those that led to the French Revolution. Similarly, the web provides an outlet for people to create discourse.
Though not a physical space like the salons, the web allows a greater global discourse to occur; it should be the perfect platform for our globally social world.
The most popular social network today is Facebook (“Leading Social Networks Worldwide”), with approximately 1.59 billion monthly active users (“Number of Monthly Active Facebook Users Worldwide”). Facebook is a platform where users create profiles for personal or business use in order to connect with others. Facebook also doubles as a publication platform, though Facebook would argue against this (Arthur and Kiss). Publishing is defined by the Oxford English Dictionary (OED) as “the act of making something publicly known” (“Publishing”). Users on Facebook create posts and share them with both strangers and friends, thus creating a public publishing platform. These posts and comments are as much a public form as blog posts or online fan fiction. They are documented proof of what has been said by whom; in fact, it is now possible to see the editing history of a single post or comment. Even if a post or comment is deleted, Facebook retains access to that content. Its Help Centre website states, “When you choose to delete something you shared on Facebook, we remove it from the site. Some of this information is permanently deleted from our servers; however, some things can only be deleted when you permanently delete your account.” Thus, content considered “deleted” persists after its creator removes it; it remains available to some, still “published” on Facebook’s servers.
Facebook is a platform where unique content is created, in addition to being a site where users “share” and “like” content they deem relevant. This can result in a lively discourse of back-and-forth commenting, especially with the newer option for users to “reply” to previous comments. However, in order to find content that opposes their currently held views, users must often seek it out deliberately. This is due to Facebook’s algorithms, which are largely invisible to, and kept secret from, its users. Facebook created algorithms that filter its seemingly endless content into curated, personalized “news feeds.” An algorithm, as defined by the OED, is “a precisely defined set of mathematical or logical operations for the performance of a particular task” (“Algorithm”). As a business, Facebook succeeds at the task of retaining consumers; it delivers an appropriate amount of content to them. Where Facebook’s algorithms fail is in giving users novel content, based not only on their specific “likes” but on their broader interests. They are also unsuccessful at providing “readers” with interesting, challenging content that opposes their currently held views. They cannot show a snapshot of the multitude of voices that exist on the platform; instead, they proliferate a user’s preconceived views and reinforce a user’s confirmation bias. Ultimately, Facebook is a business. Its prediction algorithms, which provide users with a personalized news feed, are meant to generate a user-friendly experience; however, in doing so, computers become gatekeepers and users become confined to ideological bubbles.
During the Age of Enlightenment, the book trade and the affordability of books allowed new areas of thinking and novel philosophies to proliferate. Censorship by church and state was dying in favour of books that engaged readers and inspired discussion and debate about their ideas. What made books and discourse interesting was not the sameness of opinion, but the diversity of opinions that were growing louder during the eighteenth century. Facebook could have become a place of social diversity. Instead, its owners have engaged in gatekeeping and invisible editing in order to keep users returning to their site. This comes at the expense of social and intellectual growth and change. The people who manage Facebook’s algorithms base many of them on “likes,” hidden posts, and the amount of time spent reading an article. “[Chris] Cox [Facebook’s chief product officer] and the other humans behind Facebook’s news feed decided that their ultimate goal would be to show people all the posts that really matter to them and none of the ones that don’t,” states Will Oremus in “Who Controls Your Facebook Feed.” However, humans are not as predictable as mathematical equations; using “likes” or time spent reading as a baseline for what is shown to people does not capture the whole complex picture of what human beings can, and should, engage in. In his TED Talk, Eli Pariser gives an example of algorithms attempting to understand a human being from these baselines alone. He says that as a liberal, he engaged more often with progressive content, even though he enjoys politics and likes reading about the conservative side of the political spectrum. He recognized that he was engaging with right-wing content less often, but he was perceptive enough to notice when the conservative viewpoint disappeared from his feed entirely, leaving only content from his liberal friends.
Opposing content, though interesting and necessary for Pariser, was gone, and he now had to seek it out. He had no active role in editing his news feed as content disappeared, and neither do other users. Yet most other users do not notice the content shift happening; instead, they see their own views proliferated. Before the Internet, the broadcast and print media were the gatekeepers of information. It is widely recognized that the media are fallible, but journalistic ethics existed in order to promote multiple perspectives. The Internet undermined this old media as it expanded. Huge companies such as Facebook and Google grew, and computers have become the gatekeepers of information. Oremus states, “Facebook had become… the global newspaper of the 21st century: an up-to-the-minute feed of news, entertainment, and personal updates from friends and loved ones, automatically tailored to the specific interests of each individual user.” The idea of a platform presenting only one perspective to its readers, without an opposing opinion at arm’s reach as there is at a newspaper stand, seems archaic given the movements that have been made to prevent censorship, especially in light of the importance of social diversity. Oremus’s article is informative and supportive of algorithms, yet he still laments, “Drowned out were substance, nuance, sadness, and anything that provoked thought or emotions beyond a simple thumbs-up.” In his TED Talk, Pariser recognizes the biggest issue at hand when computers control the information people see, and it is not simply ideological bubbles: it extends to a dysfunctional democracy, cut off from a conducive and just flow of information. To have a strong conviction requires knowing and understanding all sides of an issue.
As Katherine Phillips states, “We need diversity… if we are to change, grow, and innovate.” Facebook and Internet users cannot let website conglomerates be the only innovators, the only ones capable of seeing solutions from multiple angles, whether the problems involve an algorithm or differences in ideology, religion, or politics. Users cannot let computers be their personal gatekeepers, preventing them from understanding that other perspectives exist and are equally valuable.
Ultimately, Facebook’s algorithms serve a vital purpose: generating revenue, retaining users, and making sense of the expanse of information available on the web. However, these secret, invisible algorithms prevent Facebook’s users from being introduced to novel information or opposing viewpoints. This in turn prevents people from understanding global events and instead creates ideological bubbles. Milan Zafirovski writes that “subjects were literally reduced to the servants of theology, religion, and church, thus subordinated and eventually sacrificed… to theocracy.” He is referring to pre-Enlightenment Europe. Yet as people grow accustomed to seeing their own views proliferated on what many consider their main news source, they come to accept the idea that their view is the only one. As history shows, discourse and challenging opinions and ideas are what fuel social change. Facebook does need to sort through the massive amount of information on its site; however, it cannot act as a gatekeeper that distributes only the information it deems “important.” Facebook users need a voice in what is shown to them, and that voice needs to be bigger than a “thumbs up.”
Works Cited

“Age of Enlightenment.” Wikipedia: The Free Encyclopedia. Wikimedia Foundation, Inc. 30 January 2016. Web. 31 January 2016.
“algorithm, n.” OED Online. Oxford University Press, December 2015. Web. 26 January 2016.
Arthur, Charles and Jemima Kiss. “Publishers or Platforms? Media Giants May be Forced to Choose.” The Guardian. 26 July 2013. Web. 29 January 2016.
Chowdhry, Amit. “Facebook Changes News Feed Algorithm To Prioritize Content From Friends Over Pages.” Forbes. 24 April 2015. Web. 26 January 2016.
Dickey, Michael. “Philosophical Foundations of the Enlightenment.” Rebirth of Reason. Web. 26 January 2016.
“Internet.” Wikipedia: The Free Encyclopedia. Wikimedia Foundation, Inc. 29 January 2016. Web. 29 January 2016.
“Internet Usage in the World by Regions.” Internet World Stats. 26 January 2016. Web. 1 February 2016.
“Leading Social Networks Worldwide as of January 2016, Ranked by Number of Active Users (in Millions).” Statista. January 2016. Web. 31 January 2016.
Luckerson, Victor. “Here’s How Facebook’s News Feed Actually Works.” Time. 9 July 2015. Web. 26 January 2016.
Marconi, Francesco. “The Rise of Homeless Media.” Medium. 24 November 2015. Web. 15 January 2016.
“Number of Monthly Active Facebook Users Worldwide as of 4th Quarter 2015 (in Millions).” Statista. January 2016. Web. 26 January 2016.
Oremus, Will. “Who Controls Your Facebook Feed.” Slate. 3 January 2016. Web. 26 January 2016.
Pariser, Eli. “Beware Online ‘Filter Bubbles.’” TED. March 2011. Lecture.
Phillips, Katherine. “How Diversity Makes Us Smarter.” Scientific American. 1 October 2014. Web. 26 January 2016.
“publishing, n.” OED Online. Oxford University Press, December 2015. Web. 26 January 2016.
“What Happens to Content (Posts, Pictures) that I Delete from Facebook?” Facebook. Web. 29 January 2016.
Woollaston, Victoria. “Number of Websites Hits a Billion: Tracker Reveals a New Site is Registered Every Second.” Daily Mail Online. 17 September 2014. Web. 26 January 2016.
Zafirovski, Milan. The Enlightenment and Its Effects on Modern Society. New York: Springer, 2010. Web.