INside Performance Marketing
Q&A: Searchmetrics Founder and CTO Dissects Google Panda 4.1
Image credit: fortherock, Creative Commons license


Last week Google set alarm bells ringing around the SEO community with an update to its Panda algorithm.

Google Panda 4.1 is alive and kicking across the web and seems to have already pinpointed its fair share of winners and losers. Some have welcomed the news with open arms, witnessing an unexpected lift in search visibility as a result of their high-quality content. Others have been left to rue their site-building decisions and may be on the comeback trail for some time yet.

With a "slow rollout" of Panda 4.1 still taking place, we called upon the expertise of Marcus Tober, founder and CTO of SEO platform Searchmetrics, to answer some of the most pressing questions about the update.

Hi Marcus. What can sites expect to see from Panda 4.1?

Marcus Tober: Google’s Panda updates are focused on the quality of content with Panda 4.1 impacting about 3-5% of search queries. It is a small iteration of the Panda 4.0 update which was rolled out in May 2014 (a major update impacting about 7.5% of search queries). Google said it has discovered some new signals to help it detect low-quality content. So sites with thin or aggregated content can expect to lose visibility in search results, whereas sites with comprehensive, helpful and user-oriented content are likely to benefit.

Have you noticed any trends in the types of sites being rewarded and targeted?    

MT: The fourth Panda generation is supposed to help small to medium-sized businesses rank better and – as with previous Panda updates – reduce spam and irrelevant content from the search pages. 

In our winners/losers analysis of 4.1, the main losers tended to be aggregators such as similarsites.com or findthebest.com. These domains predominantly use external text that is available elsewhere and provide very little unique content. Moreover, some sites with rather thin content, like answers.com or yellow.com, lost visibility. On the other hand, some news pages and download portals have been rewarded. As usual, there have been some outliers on both sides; portals dealing with medical topics, for example, can be found on both the winners and the losers lists.

There were some pretty unusual entrants into your top Panda 4.1 Winners in the US - the controversial piratebay.se being the most striking. Does this somewhat devalue the work Google does in promoting quality results? 

MT: It seems that thepiratebay.se is a winner, but not because of Panda 4.1. It gained significant visibility at the same time as Panda 4.1 was released, but that looks like a coincidence. Before the increase, the site had suffered a huge drop, so this looks more like a recovery from a technical issue or a relaunch of certain areas of the site.

Other analysts seem to have publishers like Vice.com and NYtimes.com coming off well post-Panda 4.1. So is there still work to be done by Google in promoting the smaller sites? 

MT: As I said, first and foremost, Panda is about quality content – and of course user-relevant content. Vice.com may be a somewhat “special” case, but it is creating user-relevant content. The same with NYtimes.com. It is understandable that these larger sites have benefited.

You must understand that sites win because other sites lose. When a URL loses its #2 ranking for a certain keyword (and also its rankings for related keywords it is indexed for), all the URLs ranked behind it rise up a position. Of course, this is an over-simplified example, but it is basically how it works.
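Tober's over-simplified example can be sketched as a toy list operation. This is purely illustrative (the site names are hypothetical, not real ranking data): dropping one URL from an ordered result list shifts every URL behind it up one position.

```python
# Toy illustration of the ranking-shift effect Tober describes:
# results are an ordered list, position 1 first.
ranking = ["site-a.com", "site-b.com", "site-c.com", "site-d.com"]

def drop_url(ranking, url):
    """Return a new ranking with `url` removed.

    Every URL that was ranked behind `url` implicitly
    moves up one position in the returned list.
    """
    return [u for u in ranking if u != url]

# site-b.com loses its #2 ranking ...
new_ranking = drop_url(ranking, "site-b.com")
# ... so site-c.com and site-d.com each gain one position:
# new_ranking == ["site-a.com", "site-c.com", "site-d.com"]
```

The real algorithm weighs many signals per query, but the zero-sum nature of positions is the point: one site's loss is mechanically another site's gain.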

The “problem” was that the bigger sites often benefited the most, simply because they exhibited a good number of the other signals Google considers relevant (such as backlinks). Now Google is trying to find a good balance when interpreting the different signals from bigger and smaller sites. It is assessing signals from large and small sites in different ways because they require different benchmarks. These signals are continuously improving, so this area will always remain a developing process.

In your own experience, what steps should a website take after seeing its visibility drop?

MT: The first thing to do when your site loses visibility and/or traffic due to an update is to optimise your content. I know this has become a hackneyed saying in recent years, which is why we always try to back up our advice with data.

For example, we have analysed what the content on well-ranked pages has in common and how it differs from the content on lower-ranking sites. Our research found that better-ranked content is on average more comprehensive, more readable and often longer. But the most important finding is that relevant content does not focus only on certain keywords; it also covers semantically related terms, treating several aspects of the given subject in a holistic fashion. We have gathered more of these parameters in our recent Ranking Factors / Rank Correlation Study.

So my advice is: get rid of landing pages that are optimised for single keywords. Remove poor performers from your website and rework your whole site structure. Create fewer, relevant, topic-oriented landing pages rather than several pages with quite similar content. That is how you can improve the user experience and have a positive impact on your rankings in the medium and long term.

Given some of the previous stories of disgruntlement from badly hit sites, do you think it is fair that Google should continually hide the release and core focuses of its algorithm updates? 

MT: I think it helps Google and the whole industry when there is a certain buzz around the release and the effects of updates and penalties. This helps to generate greater attention and to make people aware of what really counts.

Finally - the million dollar question - what do you believe we can next expect from Google?

MT: More updates. Honestly, I think the direction in which Google is developing has become quite clear given the recent trends and changes. Of course, it is still possible to cheat Google and achieve good rankings with certain methods, but only in the short term. In the long term, quality will always win in organic search. And one of the most important “factors” is the user. In the end, it is the user who decides whether they like the content or not. And Google measures user behaviour with several parameters, such as time on site or bounce rate, which I believe strongly influence rankings. So if you provide user-relevant content, you are on the right track.

Has your site been affected by Panda 4.1? Let PerformanceIN know by answering our poll.

Continue the conversation

Got a question or comment? Tweet Richard @RichToweyPI or comment on Twitter, Facebook or LinkedIn.

Richard Towey


Richard serves as head of content at PerformanceIN. After many years spent covering developments from the automotive, sports, travel and finance sectors, he eventually turned his full attention to reporting on stories from the fast-evolving world of digital marketing. Richard now heads up the editorial team at PerformanceIN: the performance marketing industry's leading publication.  
