It’s not exactly Martin Luther King, but I have a dream: a dream of a one-keyword search account structure, something that could turn the complexity of search into simplicity. In all of my nine years working in search, the smallest keyword account that I or my team have ever managed consisted of 10 keywords, back in 2008. In contrast, the biggest single-client search operation consisted of 36 accounts with over 10,000,000 active keywords (1+ impressions over the last 30 days), broken into over 30,000 campaigns.

But this approach is a dated one: the introduction of audience-based bid multipliers and RLSA means that more is no longer better.

Audience as the strategy centrepiece

In search, personalisation is not a future trend but a reality. Regardless of the keyword variation a user types into Google search, it has become easier for PPC practitioners to target that user based on who they are, using data on their profile, previous behaviour and other intent signals.

For example, if you are a holiday operator spending an eight-figure annual budget on covering all possible search variations, you could instead buy one keyword only – “holiday” – and, through a bidding strategy based on the audience rather than the keywords, save a big portion of the investment. Every user would then be matched to your brand based on who they are and where they are based, regardless of whether they had visited your site in the past or whether they are an existing or past customer – just by typing “holiday” into Google.

This would work by combining all the first-party data and knowledge a brand has about its existing consumers with third-party data on consumer behaviour on site, as well as data pulled from social platforms and the wider web. Using Google-owned full-stack technology – the DC Digital Marketing Platform – or the likes of BlueKai combined with AdWords targeting capabilities, we could apply a significant layer of different bid multipliers to this one keyword (or small selection of keywords).

For example, a user types in “holidays in Sicily”. The first- and third-party data would show the extent to which the user had previously browsed your site; engaged in discussion about Sicily as a holiday destination on social media; fits the demographic profile of your top-converting customer; and lives in SW2 in a household with above-average income.
This means it would be wise to add a +100% multiplier to the bid and tailor the ad copy specifically to top Sicily offers. Similarly, negative multipliers could be added for users with a very low probability of converting – based on a statistically significant probability score calculated from all of the said data sources.
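To make the idea concrete, the signal-to-multiplier logic could be sketched roughly as below. This is a minimal illustration, not any platform’s actual method: the signal names, weights and thresholds are invented assumptions, and a real probability score would come from a properly validated statistical model rather than hand-picked weights.

```python
# Hypothetical sketch: turning first- and third-party signals into a
# conversion-probability proxy, then into a bid multiplier for one broad
# keyword. All weights and thresholds below are illustrative assumptions.

def conversion_score(visited_site: bool,
                     social_engagement: bool,
                     fits_demographic: bool,
                     high_income_postcode: bool) -> float:
    """Combine binary audience signals into a 0-1 probability proxy."""
    weights = {
        "visited_site": 0.35,          # previously browsed the site
        "social_engagement": 0.25,     # discussed the destination socially
        "fits_demographic": 0.25,      # matches top-converting profile
        "high_income_postcode": 0.15,  # e.g. SW2, above-average income
    }
    signals = {
        "visited_site": visited_site,
        "social_engagement": social_engagement,
        "fits_demographic": fits_demographic,
        "high_income_postcode": high_income_postcode,
    }
    return sum(w for name, w in weights.items() if signals[name])

def bid_multiplier(score: float, base_bid: float = 1.0) -> float:
    """Map the score to a bid adjustment on the single 'holiday' keyword."""
    if score >= 0.8:
        return base_bid * 2.0   # +100% multiplier for likely converters
    if score <= 0.2:
        return base_bid * 0.5   # negative adjustment for unlikely converters
    return base_bid             # default bid for everyone else
```

In practice the score would be recalculated as fresh behavioural data arrives, so the same keyword carries a different effective bid for every audience segment.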

Putting trust in Google’s accuracy

There are a number of obvious pitfalls, most notably a full reliance on Google’s targeting accuracy: there will always be a risk of missing out on a potential customer. However, the advantages still stand – above all, that you could save a fortune. It would allow for extremely personalised targeting and creative, plus a shorter customer journey from search to purchase. It would also simplify the traditional optimisation framework from 40 hours a week to four – yet another saving.
In conjunction with the Google Knowledge Graph and ongoing attempts at semantic search, personalised audience targeting is something we could have much greater control over.

We are already moving in the direction of testing and proving the value of this concept. Using a minimum of keywords, simplifying account structures and speaking to searchers and audiences with a highly tailored message leads to a reduced budget and fewer wasted workforce hours.