On September 23, Google announced a long-awaited update to the Penguin algorithm, to much chatter amongst SEOs and the wider search community.  

For more than 700 days, since Penguin 3.0, thousands of webmasters and brands across the globe had been waiting for this day, after Google took punitive measures against websites using ‘manipulative techniques’ to artificially enhance their search rankings.  

So what is Penguin? 

The Penguin algorithm was first unleashed in 2012 and was devised to target sites using linking techniques that Google deemed to be ‘spamming’ search results, essentially inflating a site’s ranking above where it would naturally sit.  

What followed was a seismic jolt to the SEO industry; the punishment was quick and severe, a combination of algorithmic and manual actions leading to many brands losing large percentages of their traffic overnight. Some knew the risks; some had no idea what their ‘SEO’ company had been doing. It was a massive slap on the wrists from Google: “Don’t do it again”.

And on the whole, people didn’t do it again. The vast majority of SEOs turned their backs on paid link-building outside Google’s guidelines and focused on ‘cleaning up’ links, either to recover from a penalty or to avoid a hit from a further Penguin assault. There were five updates, full or iterative, over the following two and a half years, causing more misery for many. 

According to Google, Penguin 4.0 marks the end of this turbulent era of sporadic updates, with the introduction of ‘real-time’ Penguin processing in the core algorithm. No more waiting for an update for a site to recover; new links and link clean-ups now take effect straight away. To many in the search industry this sounds great, almost too good (bearing in mind the almost two-year wait for 4.0, with no chance of recovery beforehand for those affected). So is 4.0 all it’s cracked up to be?

Real-time Penguin

On the face of it, the fundamental difference between 4.0 and its predecessors is its ‘real-time’ nature, with Penguin updating continuously as part of the core algorithm. As far back as June 2015, Google informed us this would be the case, leaving many SEOs pondering what it would mean. My instant thought was that it could lead to the rise of negative SEO, whereby a third party deliberately places low-quality links that contravene Google’s guidelines, with the express purpose of triggering a penalty and causing the target site to lose positions, all in a short space of time.

Foreseeing a ‘real-time’ world where new links were taken into account continuously, dropping a few thousand spammy links on a competitor looked more likely than ever, given how quickly such an attack could take effect. I was unconvinced that Googlebot would be able to distinguish between a site owner building links for gain and a malicious third party.   

This was clearly a consideration for Google too, so much so that they have apparently changed the way in which links are viewed, with Google’s Gary Illyes stating that the new algorithm devalues spam links instead of demoting the site. So in essence, a link can either be good or, if it is spam, it will be devalued or ignored. So now there is no risk in linking? Pass me the can opener, these worms are ready to roll…

Quality at risk? 

Having witnessed all the Penguin updates first hand, I always believed it to be a fundamentally good thing. Though I sympathised hugely with sites that had suffered more than they should have, or lost out through not understanding what an agency was doing on their behalf, I believed that the ‘get rich quick’ nature of paid links and guaranteed positions gave SEO a bad name. What Penguin did positively was push search marketers towards quality content on quality websites, forcing site owners to put the consumer first rather than focus solely on the purchase of links. It also scared the living hell out of most who were contravening Google guidelines or had been on the business end of Penguin. The risk simply began to outweigh the reward in almost all cases; it wasn’t sustainable to ‘build’ links anymore.

So it’s slightly worrying to think that this risk seems to have been reduced with 4.0, and that all the good work Google had forced marketers to put in, promoting great content and great sites, may be undone by a new generation of paid linking with the specific intention of achieving higher rankings.  

Return of the link 

Sure, it won’t be the same as before; it will be smarter. Gone are directories; exact-match anchor text will give way to a more natural mix with brand terms, and the sites selected will have to be legitimate and relevant. If links are still among the top two most influential ranking factors, as they are widely purported to be, and they can now only benefit a site or do nothing, it seems apparent this could happen. Google states that manual actions suppressing rank can still be handed out; however, now that SEOs are more attuned to what is good and what is not, will this really be a failsafe?

I am not currently convinced. Google’s line has always been to stop focussing on links and simply create great content on great sites for users. This is all well and good; however, when you are in cut-throat verticals such as retail, travel and finance, where many sites are hugely similar in content and UX, won’t you be tempted to turn the link dial once again? Rather than making SEOs think less about links, or be scared of links, this brings links right back into focus.

So in summary, Penguin 4.0 answers a lot of questions and puts to bed an era of uncertainty for many, but it raises a whole raft of new ones: negative SEO, the return of paid links, links that can only ever be good or devalued. It’s certainly not clear how this will pan out yet. 

The real-time element does apparently make it more forgiving, which is great for SEOs. Doing the right thing will be rewarded more quickly, but will Google get its wish of a focus on quality websites and content rather than on links? Only time will tell. For Penguin, this could be just the tip of the iceberg.