There’s a whole Web out there

“Hello world!” These two words are often the first to be published on a new blog – its automatic birth cry after being launched, if you will – to familiarize first-time users with posting, commenting, and making one’s message public.

Funny as this hello to the unknown depths of the Internet may sound, the meaning behind it is real: it has never been simpler to enter the online publishing game, and, in theory, anybody with unrestricted access to the Web can view any page – it is just a matter of finding it. In reality, however, online publishing is an attention-driven business: the way the Internet is mapped and presented by search engines determines what is likely to be found and what is not, posing challenges to content creators and readers alike.

Algorithms decide which pages make it into the search results and whether they appear at the top of that list. These algorithms weigh several criteria when leading users to websites, popularity being one of the most important. To express popularity as a numerical value, Google, for instance, counts the links pointing to a page – weighting each by the rank of the page it comes from – to calculate its PageRank. Even after all the refinements of its algorithm over the years, this basic principle still applies. By shaping the list of results that users get to see, search engines not only measure popularity as a factor, but also determine it by creating a hierarchy of how popular websites are. About 75 percent of Internet users do not bother scrolling past the first page of their results to find alternative sources – it is not surprising that search engine optimization (SEO) has become a line of work in its own right for marketers.
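
To make this principle concrete, here is a minimal, illustrative sketch of the original PageRank idea – a power iteration over a link graph in which each page’s score flows along its outgoing links. This is a toy model under simplifying assumptions (a tiny made-up link graph, a fixed number of iterations), not Google’s actual implementation, which layers many more signals on top.

```python
# Toy PageRank power iteration (illustrative only, not Google's production algorithm).
# `graph` maps each page to the pages it links to.

def pagerank(graph, damping=0.85, iterations=50):
    pages = list(graph)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}  # start with a uniform distribution

    for _ in range(iterations):
        # every page keeps a small base score regardless of links
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in graph.items():
            if not outlinks:  # dangling page: spread its rank evenly over all pages
                share = damping * rank[page] / n
                for target in pages:
                    new_rank[target] += share
            else:             # pass rank along each outgoing link
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank


# A tiny hypothetical link graph: the more (and better-ranked) pages link to you,
# the higher your score.
links = {
    "blog": ["sitepoint"],
    "sitepoint": ["blog", "amazon"],
    "amazon": ["sitepoint"],
    "niche-shop": ["amazon"],
}
print(pagerank(links))
```

In this toy graph, the page that nothing links to ends up with the lowest score – a crude illustration of how a site without incoming links sinks to the bottom of such a ranking.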

Furthermore, the order of the results does not necessarily equate with quality. For example, one revenue strategy is to launch small websites on specialized topics, fill them with SEO content, and then monetize them – which shows that search algorithms can still be somewhat blind to the expertise and background of a website. Ophelie LaChat, editor-in-chief at the Melbourne publisher SitePoint, explains how pages with a niche topic can easily pick up traffic: “Because you are picking a keyword with low competition, even mediocre content can get you a good rank.”

Does that mean that websites at the top of the search results get the lion’s share of the traffic while others, which might be equally or more comprehensive, starve in the lower ranks? This assumption seems at least plausible, given that search engines strongly affect a website’s discovery and its survival by distributing users’ attention across the pages they display as results. If popular pages are ranked above alternative sources with little or no consideration of quality, then the latter are more likely to be overlooked as a result.

These circumstances make for quite a competitive environment for whoever wants to make their “Hello world!” heard – not only bloggers, but everybody who publishes online and competes for readers’ attention, which is finite[1] and therefore scarce. Applying social theorist Michael Warner’s definition of publics to the topic, websites are in effect competing for publics – for groups that pay attention to them.

The structure of the Internet as depicted by search engines influences which publics users choose to join – it is not that this structure prevents publics and so-called counterpublics (publics that form in tension with a dominant public) of all sizes from existing; it is rather that the way of finding sources is biased. Of course, Google and other providers of Web search services constantly adjust and personalize how they work according to their users’ behaviour. This is how search engine bias comes into being. Beyond single factors like popularity, it is ultimately the underlying assumption of knowing what users are looking for that shapes search results, and with it our perception of what the Internet has to offer.

Not only people’s perception of the Web, but also the competition within it determines which share of attention a website gets – and, with it, how willing its publics are to buy its products. Media theorist Douglas Rushkoff explains how small firms are put into direct online competition with bigger companies. In his example, the owner of a small music shop finds himself unable to keep up with the low prices offered by shopping aggregators[2]. Rushkoff’s advice to small players with a simple website is to scale up and “become all things to some people, or some things to all people,” as price aggregators like Compare Vinyl do. Needless to say, not everybody with a traditional business model can afford and manage this step, especially with big corporations like Amazon not only dominating search results, but also undercutting their competitors on price.

From a cultural standpoint, it is great that the Internet enables people to publish. When it comes to reaching one’s public though, the Web is subject to the economic and competitive nature of markets, with real-life consequences, as Rushkoff’s example shows.

There are no real alternatives to using search engines, and they have to categorize content in one way or another in order to function. One might think that Google does the best job at this task, given that, even after declining in 2013, its global market share is still close to 70 percent. Considering that a single provider is the means of Web navigation for the vast majority of users, this dominance may have worrisome cultural implications for the Internet and for what gets seen.

Whereas content creators and retailers face the challenge of reaching their publics, readers need to remain aware of what their search results are not giving them. It is necessary to keep in mind that, while some content is systematically displayed in their search results, other content is hidden at the same time because it is deemed irrelevant – reason enough to use more than one search provider. Comparing the results may seem laborious, but it can lead users to better websites that deserve their attention. In fact, it might just be the best way to truly benefit from the plurality that the Web actually offers.
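
As a small, purely illustrative sketch of what such a comparison amounts to, the snippet below diffs two hypothetical first pages of results. The URLs and provider labels are made up; fetching real results would require each provider’s own search interface, which is not shown here.

```python
# Hypothetical example: compare the first page of results from two search providers.
# The result lists below are made-up stand-ins for whatever each provider returns.

results_a = [
    "bigstore.com/product",
    "example.com/guide",
    "niche-blog.net/review",
]
results_b = [
    "bigstore.com/product",
    "indie-shop.org/catalog",
    "forum.example.net/thread",
]

shared = set(results_a) & set(results_b)   # what every searcher sees
only_a = set(results_a) - set(results_b)   # hidden from provider B's users
only_b = set(results_b) - set(results_a)   # hidden from provider A's users

print("Both providers show:  ", sorted(shared))
print("Only provider A shows:", sorted(only_a))
print("Only provider B shows:", sorted(only_b))
```

Even this toy comparison makes the point: the overlap is what everyone sees, while the rest only surfaces for readers who bother to look in more than one place.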


[1] Douglas Rushkoff (2013): Present Shock: When Everything Happens Now (p. 124). Penguin Group (USA).

[2] Douglas Rushkoff (2011): Program or Be Programmed: Ten Commands for a Digital Age (p. 67). Soft Skull Press.

One Reply to “There’s a whole Web out there”

  1. Interesting topic, Till – so rarely, if at all, do we stop to consider how our Google searches are aggregated, and whether the information that tops the list is really the crème de la crème. But, based on a recent class conversation, we take it to be so regardless of the quality, as the majority of us are not willing to scroll past a handful of links, let alone pages of them, before we try another search term. I was actually surprised by the statistic you cited – that 75% of us do not scroll past the first page – I would have thought that number would be near 100%. I’m with John in that I rarely click on an article that doesn’t make it into the top three; there’s a sort of subconscious wariness about the integrity and/or value of anything lower than bronze status.
    This would be an interesting idea to explore if you were to expand your paper – what factors make readers unwilling to scroll past the first page of their Google search and instead opt for a different search term.

    Watching the Google PageRank video (where I learned that top articles are the ones with the most links and backlinks – didn’t know that), I realized that Google prides itself on the quality of information within its hierarchical, popularity-based ranking system, despite its ability to be circumvented, as you suggested. (Scary to think that technology controls the information we see and that there’s no quality assurance process in place like the peer reviews of pre-self-publishing days.) But if using niche, non-competitive keywords to top the search list becomes a thing, and non-niche publishers simply lengthen their lists of tags to include less commonly used search terms, those too will become common, and the niche underdog will still be unable to capture the attention of the publics amid the abundance of information online. As well, the advice from Douglas Rushkoff which you referenced – for small players to “become all things to some people, or some things to all people” – doesn’t seem like a viable business option for most because it requires major operational adjustments (I would have liked to see an example of how this notion can be put into action).

    Easier than changing an entire business model, then, would be tweaking the marketing strategy. While you mentioned SEO, you didn’t go into SEM, which on a broader level takes into account how to generate targeted traffic to websites (through advertising and paid placement) rather than only improving organic search results via keywords (a subset of SEM). A marketing blog I found on the subject claims SEM is particularly useful for increasing visibility immediately and is beneficial to, say, a young company that does not have a big online footprint. Check it out:
    http://www.weidert.com/whole_brain_marketing_blog/bid/115490/What-Exactly-is-the-Difference-Between-SEO-SEM

    So it seems bloggers and businesses do have options when it comes to increasing their online visibility: SEO – trying to keep up with (or cheat) Google’s ever-changing algorithms – or SEM – paying, the good old-fashioned way.
