“Hello world!” These two words are often the first to be published on a new blog – its automatic birth cry after being launched, if you will – to familiarize first-time users with posting, commenting, and making one’s message public.
Funny as this hello to the unknown depths of the Internet may sound, the meaning behind it is real: it has never been simpler to enter the online publishing game and, in theory, anybody with unrestricted access to the Web can view any page – it is just a matter of finding it. In the reality of the attention-driven business that is online publishing, the way the Internet is mapped and presented by search engines determines what is likely to be found and what is not, posing challenges to content creators and readers alike.
Algorithms decide which pages make it into the search results and whether they appear in the top section of that list. These algorithms weigh several criteria to lead users to websites, popularity being one of the most important. To express popularity as a numerical value, Google counts the links and backlinks of a page to calculate its PageRank, for instance. Even after all the refinements of its algorithm over the years, this basic principle still applies. By shaping the list of results that users get to see, search engines not only measure popularity as a factor, but also determine it by creating a hierarchy of how popular websites are. About 75 percent of Internet users do not bother scrolling past the first page of their results to find alternative sources – so it is not surprising that search engine optimization (SEO) has become a line of work of its own for marketers.
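The link-counting principle behind PageRank can be made concrete in a few lines. The sketch below is a minimal illustration of the original PageRank idea – iteratively redistributing each page’s score across its outgoing links – and not Google’s production algorithm; the toy link graph and the damping factor of 0.85 are illustrative assumptions.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank via power iteration.

    links: dict mapping each page to the list of pages it links to.
    Assumes every page has at least one outgoing link (no dangling nodes).
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # Sum the share of rank flowing in from every page that links to p.
            inbound = sum(
                rank[q] / len(links[q]) for q in pages if p in links[q]
            )
            new_rank[p] = (1 - damping) / n + damping * inbound
        rank = new_rank
    return rank


# Hypothetical three-page web: A links to B and C, B links to C, C links to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
```

In this toy graph, C ends up with the highest score because it receives links from both A and B – popularity measured purely through link structure, with no regard for content quality, which is exactly the blindness the next paragraph describes.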
Furthermore, the order of the results does not necessarily equate with quality. For example, there is a revenue strategy to launch small websites on special topics, add SEO content, and then monetize them, which proves that search algorithms can still be somewhat blind to the expertise and background of a website. Ophelie LaChat, editor-in-chief at the Melbourne publisher SitePoint, explains how pages with a niche topic can easily pick up traffic: “Because you are picking a keyword with low competition, even mediocre content can get you a good rank.”
Does that mean that websites at the top of the search results can get the lion’s share of the traffic while others, which might be equally or more comprehensive, starve on the lower ranks? This assumption seems at least plausible, given that search engines strongly affect a website’s discovery and its survival by distributing the attention of users to the pages they display as results. If popular pages are better ranked than alternative sources, with no or little consideration of quality, then the latter are more likely to be undermined as a result.
These circumstances make for quite a competitive environment for whoever wants to make their “Hello, world” heard – not only for bloggers, but for everybody who publishes online and competes for readers’ attention, which is finite and therefore scarce. When applying social theorist Michael Warner’s definitions of publics to the topic, it becomes clear that websites are actually competing for publics – for a group that pays attention to them.
The structure of the Internet as depicted by search engines influences which publics users choose to join – it is not that this structure does not allow for publics and so-called counterpublics (nonmembers of a public) of all sizes to exist, it is rather that the way of finding sources is biased. Of course, Google and other providers of Web search services constantly adjust and personalize their functioning according to their users’ behaviours. This is how search engine bias comes into being. Beyond single factors like popularity, it is generally the underlying notion of knowing what users are looking for that shapes search results, ultimately affecting our perception of what the Internet has to offer.
Not only people’s perception of the Web, but also the competitors themselves within it determine which share of attention a website gets, which is also related to its publics’ willingness to buy products. Media theorist Douglas Rushkoff explains how small firms are put into direct competition online with bigger companies. In his example, the owner of a small music shop finds himself unable to keep up with the low prices offered by shopping aggregators. Rushkoff’s advice to small players with a simple website is to scale up to “become all things to some people, or some things to all people,” as is done by price aggregators like Compare Vinyl. Needless to say, not everybody with a traditional business model can afford and manage this step, especially with big corporations like Amazon not only dominating search results, but also outperforming their competitors on pricing.
From a cultural standpoint, it is great that the Internet enables people to publish. When it comes to reaching one’s public though, the Web is subject to the economic and competitive nature of markets, with real-life consequences, as Rushkoff’s example shows.
There are no real alternatives to using search engines, and they have to categorize content in one way or another in order to function. One might think that Google does the best job at this task, given that, even after dropping in 2013, Google’s global market share is still close to 70 percent. Given its central role as the means of Web navigation for the vast majority of users, the dominance of a single provider may have worrisome cultural implications for the Internet and what gets seen.
Whereas content creators and retailers face the challenge of reaching their publics, readers need to remain aware of what their search results are not giving them. It is necessary to keep in mind that, while some content is systematically displayed in their search results, other content is hidden at the same time because it is deemed not relevant – this is reason enough to use more than one search provider. Comparing the results may seem laborious, but it can lead users to better websites that deserve their attention. In fact, it might just be the best way to truly benefit from the plurality that the Web actually offers.