Duplicate Issues for Product Feeds?

In a recent blog post, Google Principal Engineer Matt Cutts outlined how the search engine giant has been making changes to its algorithm to clamp down on duplicate content across the net.

“We recently launched a redesigned document-level classifier that makes it harder for spammy on-page content to rank highly. The new classifier is better at detecting spam on individual web pages, e.g., repeated spammy words—the sort of phrases you tend to see in junky, automated, self-promoting blog comments.”

Search agency Greenlight issued a press release on the expected impact, but what of the affiliate community?

Over the past week, TradeDoubler have asked a number of affiliates for feedback and the result has been a mixed bag. Some larger affiliates have seen no impact at all, whereas other, smaller sites have dropped off the radar altogether.

The whole ‘duplicate content’ debate is rather old hat in site design and SEO: to build a successful business out of web publishing you will, of course, need to continually give users reasons to return, and original content/UGC is one of the main ways of doing so. One would expect leading affiliates to be doing this anyway. However, one difference that has been spotted is with the use of product feed data.

An unnamed publisher (albeit smaller than the key affiliate players) saw a significant drop-off in traffic after the algorithm change. On inspection, his site looked fine: price comparison for electrical goods, with a good dose of product data to support users’ purchase decisions, all built with the product feeds available from merchants via affiliate networks. However, he was seemingly penalised for the duplicate product information that came as standard in the feed(s) he downloaded.
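To make that concrete, here is a minimal sketch of how a publisher might flag pages whose copy is little more than the unedited feed text. It assumes a CSV feed with hypothetical ‘product_id’ and ‘description’ columns; real feeds vary by network and merchant.

```python
# Rough check for pages whose copy is mostly the unedited merchant feed text.
# The feed format (CSV) and column names are assumptions for illustration only.
import csv
from difflib import SequenceMatcher
from typing import Dict, Iterator


def overlap_ratio(page_text: str, feed_description: str) -> float:
    """Crude similarity score between page copy and the raw feed description."""
    return SequenceMatcher(None, page_text.lower(), feed_description.lower()).ratio()


def flag_feed_only_pages(feed_path: str, pages: Dict[str, str],
                         threshold: float = 0.8) -> Iterator[str]:
    """Yield product IDs whose page copy is little more than the feed text."""
    with open(feed_path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            page_copy = pages.get(row["product_id"], "")
            if page_copy and overlap_ratio(page_copy, row["description"]) >= threshold:
                yield row["product_id"]
```

Anything flagged by a check like this is a candidate for rewriting, or for layering reviews and editorial commentary on top, as discussed below.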

Duplicate Content Issues?

This poses a problem for new sites that are still in the process of building up good site relevance and hence improving their quality scores. If Google are looking at the product data as simply duplicate content, then there is a definite need for publishers to:

  • Utilise user reviews to a deeper level and, where possible, add editorial commentary on the products
  • Mitigate risk by leveraging other forms of traffic (social media, email, etc.)
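As a purely illustrative sketch (the ProductPage structure and its fields are hypothetical, not any network’s actual feed schema), the idea is to make the unique material, editorial commentary and user reviews, the bulk of what gets published rather than the feed description alone:

```python
# Illustrative only: combine the merchant feed description with unique,
# publisher-owned content so the feed text is not the whole page.
from dataclasses import dataclass, field
from typing import List


@dataclass
class ProductPage:
    product_id: str
    feed_description: str                 # text supplied in the merchant feed
    editorial_summary: str = ""           # commentary written in-house
    reviews: List[str] = field(default_factory=list)  # user-generated reviews

    def render(self) -> str:
        """Lead with original content; the duplicated feed text comes last."""
        parts = [self.editorial_summary, *self.reviews, self.feed_description]
        return "\n\n".join(p for p in parts if p)


# Example with dummy data:
page = ProductPage(
    product_id="TV-1001",
    feed_description="42-inch LED TV with Freeview HD.",
    editorial_summary="Our take: great picture for the price, but the speakers are weak.",
    reviews=["Bought this last month; setup took five minutes."],
)
print(page.render())
```

The design point is simply that the publisher-owned content leads the page, so the duplicated feed text becomes a minority of what gets crawled.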

And don’t expect this to go away. Judging from Cutts’ comments below, this will be an increasing focus for Google over 2011:

“We’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content” - Matt Cutts, Principal Engineer

We shall be keeping a close eye on this topic (as I’m sure you will be too), considering product feeds are becoming increasingly utilised by merchants and publishers in affiliate marketing.

Continue the conversation

Got a question or comment? Tweet Sanjit @SanjitAtwal or join the conversation on Twitter, Facebook or LinkedIn.

Sanjit Atwal

