To What Extent is Google Shaping Our Web Content?
Image credit: Carlos Luna (Creative Commons license)


Blame a couple of plucky students from Stanford University.

In their search for a suitable dissertation theme, classmates Larry Page and Sergey Brin unearthed the idea of converting backlink data from a web page into a measure of importance. Nearly 20 years later, while most of their fellow students’ work has likely been consigned to a space in their parents’ loft, the aftermath of project “BackRub” is Google, a search engine that facilitates around six billion queries every single day. Given a bit of a head start, the pair could have used their own technology to find an appropriate topic for their assignment.
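For readers curious about what “converting backlink data into a measure of importance” actually looks like, here is a minimal sketch of the idea in Python. The graph, damping factor and iteration count are illustrative choices, not Google’s actual parameters: a page’s score is the chance a random surfer lands on it, fed by the scores of the pages linking to it.

```python
# Toy version of the "BackRub" insight: importance flows along links.
# Nothing here reflects Google's real, far more elaborate system.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across the web
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Each outlink receives an equal share of this page's rank
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(web)
# "c" is linked to by both "a" and "b", so it ends up with the top score
```

The scores always sum to one, which is what makes them behave like probabilities rather than raw link counts.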

Websites were quick to recognise the importance of search engines for increasing their visitor numbers, and it was not long until the practice of optimising a page for search purposes - or SEO, as it is now known - became a top priority. Recent updates to Google’s rather unforgiving algorithm have meant that quality content can lead to quality rankings, a fact exemplified by the demand for skilled copywriters and web media editors. Some SEO agencies even favour a content-driven approach for climbing organic search rankings in the most natural way possible.

Google is the owner of an estimated $121 billion worth of assets as of June 30, 2014. Any further praise for its undoubted and invaluable contribution to the evolution of the internet would be added to an already sky-high list of plaudits. Neither Page nor Brin intended to change the way sites were constructed for their own gain. Yet when it comes to evaluating just how much content their creation has inspired, the engine’s chokehold on site pages is there for all to see.

Content in a Google-Powered Web

Websites are under no obligation to construct content in a way that would please anyone other than themselves and their visitors. This in theory is something Google has tried to acknowledge with its more recent algorithm changes. Released in early 2011, the engine’s Panda update dealt a hammer blow to sites sporting thin, cookie-cutter web pages which could rank highly based on their supply of certain information. It was then up to the developer to repeatedly duplicate this winning formula.

These ominous-looking entries still exist, but Google is constantly finding ways of penalising sites that construct uber-optimised content without a care for how it actually appears. The parable of the high school try-hard rings true in this scenario. Much like playground cliques are unforgiving of the pupil who aims to please at every opportunity, Google came down hard on sites that set out to play the system. A post-Panda world looks favourably upon websites with natural, unique content with the goal of providing greater relevancy to search users.

The days of keyword-stuffed pages stealing the show are slowly but surely coming to an end, enabling web developers to focus on a far more virtuous and fulfilling pathway.

Robot after all

Of course, being a machine by nature, Google’s reward system is dictated by a page’s ability to correspond with a series of numbers and figures. Much like it cannot truly appreciate the wordsmithery of Shakespeare or the musical wizardry of your favourite unsigned band, Google’s flagship robot will not reward a blog post for its ability to transfer human emotion into words. Thus, there are certain things publishers must do in order to drive traffic with their content.

Lukasz Zelezny, head of SEO at price comparison site uSwitch, is more than familiar with this line of work.

“I have this conversation sometimes with people about why an article should be 1,000 words rather than 300,” he muses.  

“I have to explain that it is not me setting these rules - longer articles tend to rank higher in search engines. They are algorithm based and, essentially, machines - machines that cannot read a piece of content and say: ‘Wow, this is such an interesting article.’ They do not have this artificial intelligence that people have.”

It is perhaps comforting that decisions in real life are based on personal feelings as opposed to mathematics. In the eyes of Google, though, high-ranking content must conform to a strict set of guidelines. That is regardless of how well it can sum up a fairly complex topic into four sentences, and without a care for the author’s own appreciation of their work.

When asked about his acceptance of Google’s ranking system, Zelezny is clear where he stands.

“Sometimes a keyword can be tricky to rank highly for and you need to follow certain rules that some people may find confusing, but that is also part of my role. This is their (Google’s) website and their business; they can do whatever they want. They are setting up the rules and we can follow them or look somewhere else, you know?”

To conform or not to conform?

Lukasz is right: websites are under no instruction to consider Google in the making of their online media. Yet spend on SEO is continuing to rise, and if there is anything to be taken out of the billions of pounds currently being dedicated to search campaigns worldwide, it is that companies care about Google’s way of thinking.

Fortunately for search agencies around the world, Google returns the gesture by providing for the audiences it serves, and creating content that is both search and reader-friendly is getting easier to do.

Peter Handley, client strategy director of UK search agency theMediaFlow, has witnessed a vast transformation in the way site content is created over nearly ten years as a search professional.

“I think historically and certainly when I started in search, we were writing content to suit what we felt the algorithm wanted. We’d identify keywords we wanted to rank for, look at ensuring the content was written around those words and variations thereof.”

Peter refers to the practice of ‘keyword stuffing’, where writers would include unnatural strings of words within their paragraphs in an effort to rank highly for them. If you were after a ‘web designer in London’, for example, you would not have to look far to read a post from a company claiming to be exactly that - usually in highlighted text.
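Keyword stuffing is easy to spot with even a crude metric. The sketch below - a hypothetical check, not any tool Peter or his agency uses - measures what fraction of a page’s words are taken up by a target phrase; stuffed copy of the ‘web designer in London’ variety scores dramatically higher than natural writing.

```python
import re

def keyword_density(text, phrase):
    """Fraction of the words in `text` occupied by occurrences of `phrase`."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    hits = sum(
        1
        for i in range(len(words) - len(phrase_words) + 1)
        if words[i:i + len(phrase_words)] == phrase_words
    )
    return hits * len(phrase_words) / len(words) if words else 0.0

natural = "We design and build websites for clients across London."
stuffed = ("Web designer in London. Our web designer in London team are "
           "the web designer in London experts for any web designer in "
           "London project.")

print(keyword_density(natural, "web designer in London"))  # 0.0
print(keyword_density(stuffed, "web designer in London"))  # roughly 0.67
```

Google’s real spam detection is far more sophisticated, but the contrast in scores shows why this style of writing was both easy to produce and, eventually, easy to penalise.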

“I think decent agencies and webmasters are now looking at this from the opposite perspective: what do audiences want to see in terms of website content? What can we do to truly earn those top positions in search results?” explains Peter, whose company specialises in driving traffic through organic search.

“Those that are doing it well have switched away from chasing the algorithm to focusing on audiences and their needs. When done correctly, the algorithm rewards this.”

The dark arts

Google has achieved some magnificent feats in repairing relationships between websites and their audiences, but many products of its former, less developed state remain. ‘Blackhat SEO’, the phrase used to describe unscrupulous practices in the world of search optimisation, is still influencing a high proportion of rankings, and still represents a tempting proposition for short-term gains.

Handley believes the situation at present bears plenty of similarities to the game “whack-a-mole”, whereby Google will hammer a website down to the depths of obscurity before another site is launched, meaning another target is begging to be hit.

There is money to be made by straying from the honest path, but most webmasters do not advise it - a point stressed by SEO expert Martyn Slack, who currently serves as affiliate manager at web development agency Twist Digital.   

“I do think it’s vital to create a site for the visitors and not for money or search engines. If a site is created for the visitor in mind, it should have good quality content based on what they are searching for, which could then feed into product and service recommendations.”

Despite this, Slack also agrees that blackhat techniques will be around for some time yet, mainly due to the ability to quickly change something that no longer works. If one method is penalised by Google, webmasters will simply discover a new way to work the system.

It is through these examples of blatant game-playing that we can see just how much of an impact Google has had on site content.

Problems of its own making

Consider this: Google rewards sites with good user experience and what it considers to be useful and unique content. In some instances it is hard to believe that the most search-friendly pages were constructed with SEO guidance in mind. The navigation; the knowledge on offer; the responsiveness; the opportunity to interact with certain elements. Everything flows naturally. So why are all websites not following suit?

Google may have given birth to ‘blackhat’ SEO, diminishing the quality of site design in some corners of the web, but it is hard to comprehend why companies would favour rankings over user experience when they could have both. In a possible explanation, Slack believes the engine has done little to help itself in ensuring that each case is dealt with fairly.

He has seen evidence of publishers having their content stolen but receiving a penalty after being accused of duplication. It does not help that blackhat practitioners tend to be very good at what they do, according to Handley, and the money spent employing them can often be made back. However, the search experts at Google also know a thing or two about how websites should work, and playing the engine as it attempts to whack every mole in sight will always present risks.

“Give them a good experience; develop the types of content they want to see; frame it in an appealing way in terms of site design; make it easy for them to purchase and share that content. Market yourselves,” urges Handley.

“Google rewards this sort of behaviour and while yes, it can be possible to play the system in the short term, they’re always going to catch up. You might as well save yourself the trouble of later having to clean up a mess of your own and do it sustainably from the start.”  

Unfortunately blackhat techniques are part of content creation as Google knows it. Somewhere, somehow, webmasters are finding a way to work the system to their advantage. Brin and Page may recognise just how many articles, landing pages and site designs they have inspired over the years, but coming to terms with everything good, bad and ugly does not make the achievement any less remarkable.

From SEO agencies to online-only retailers, whole businesses are being powered by search, and it is up to Google to ensure that reputable content vendors are being rewarded. Now the proud owners of a social network, a mobile operating system and a pair of hi-tech glasses that can map out a trip from Devon to Dundee, the real question is whether they will take responsibility and rise to the occasion.

Richard Towey


Richard serves as head of content at PerformanceIN. After many years spent covering developments from the automotive, sports, travel and finance sectors, he eventually turned his full attention to reporting on stories from the fast-evolving world of digital marketing. Richard now heads up the editorial team at PerformanceIN: the performance marketing industry's leading publication.  
