Structural SEO breaks down into two broad areas.

In the first, you’re concerned with users and their search behaviour: the goal is to have a suitable page for every relevant search query, so you have the best possible chance of ranking for it. You’re also concerned with the ongoing user experience once a customer has found your site.

The second area concerns search engines: their ability to crawl the site efficiently, and whether you will have problems with duplicate content or thin pages.

Usability and structural SEO

Site structure is often planned in a card sort exercise, where cards representing pages can be arranged in different permutations with navigational links drawn between them. 

SEO data (such as the target keyword groups for each page and the search demand behind them), along with commercial data from the business about the value of a customer, adds a lot of useful context to this exercise and should be one of the first UX/SEO integrations on the table.

For many businesses, one approach I like to take to structural SEO, blending usability and SEO, is something I call ‘site search optimisation.’ Most organisations with site search functionality are sitting on a gold mine of data about what their customers search for and whether the site has the right answer, but this data is usually criminally underused. A good site search optimisation project will mine it for frequently used queries that exhibit signs of a bad user experience, such as poor CTR or clicking through multiple pages of results.

In essence, you’re doing exactly what the major search engines do. Often we also manually rate the quality of the site search results; again this is exactly what the engines do. One outcome is usually a list of new content that needs to be created and integrated into the structure of your site.
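By way of illustration, here is a minimal sketch of that mining step in Python, assuming a hypothetical export of site search logs; the file name, column names and thresholds are all invented for the example rather than taken from any real system.

```python
# Minimal sketch: mine a hypothetical export of internal site search logs
# for frequent queries that show signs of a poor experience (low CTR,
# deep pagination). Column names and thresholds are assumptions.
import pandas as pd

# Assumed columns: query, clicked_result (0/1), results_page_viewed (1, 2, 3...)
searches = pd.read_csv("site_search_log.csv")

summary = (
    searches
    .groupby("query")
    .agg(
        volume=("query", "size"),                        # how often the query is run
        ctr=("clicked_result", "mean"),                  # share of searches with a click
        avg_page_depth=("results_page_viewed", "mean"),  # pagination as a frustration signal
    )
    .reset_index()
)

# Flag frequent queries where users rarely click or page through many results:
# these are candidates for new content, or for better integration of existing pages.
content_gaps = summary[
    (summary["volume"] >= 50)
    & ((summary["ctr"] < 0.2) | (summary["avg_page_depth"] > 2))
].sort_values("volume", ascending=False)

print(content_gaps.head(20))
```

The output is simply a prioritised list of queries to review by hand, which feeds the manual quality rating described above.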

Digging deeper

User data can be analysed at a more granular level to refine the structure of your site. For example, you might change the order of options such as colour in a product filtering menu to match current search demand (promoting white clothing to the top of the list during summer, say), so the most frequently needed options are immediately available and don’t need to be scanned for.
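To make that concrete, here is a toy sketch of the reordering logic; the demand figures and option names are invented for illustration.

```python
# Toy sketch: reorder a colour filter menu by recent demand rather than
# alphabetically. The demand figures below are invented for illustration.
recent_demand = {"white": 4200, "black": 3900, "blue": 2100, "red": 900, "green": 400}

colour_options = ["black", "blue", "green", "red", "white"]  # current alphabetical order

# Promote the most-demanded options to the top of the menu.
colour_options.sort(key=lambda colour: recent_demand.get(colour, 0), reverse=True)

print(colour_options)  # ['white', 'black', 'blue', 'red', 'green']
```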

These seemingly tiny changes can have a significant aggregate impact on usability, and therefore on your sales and your SEO, and in many ways they are the ultimate expression of a holistic, user-centric approach to SEO.

In one case, re-ordering the options in a filtering menu on a fashion website from alphabetical to an order determined by user demand led to a greater than 20% improvement in conversion rate and associated usability signals.

‘Crawl budget’

An important concept related to structural SEO is ‘crawl budget.’ In essence, search engines will only spend so much time crawling your site, and while they do you don’t want them ferreting off into endless navigation loops or getting lost exploring ever-deeper combinations of faceted search options.

Equally, you should be aware of how many thin, duplicate or near-duplicate pages a search engine may be able to crawl, as there is some evidence to suggest you can trip a filter that downgrades your entire site from a quality perspective. In essence, it’s usually in your interest to limit the scale of your site as far as search engines are concerned, so that only unique pages that users actually demand are crawled and indexed.

At its most basic, this is accomplished with crawl and indexing directives in places like robots.txt, robots meta tags and Google’s Search Console. But simply blocking a page is not ideal, as it doesn’t prevent it accruing link value. This has erroneously led lots of sites into using nofollow prolifically to manage crawling, in the conviction that it retains link value – something Google debunked years ago but which is still widely believed.
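For reference, the directives mentioned above look something like this; the paths are hypothetical examples rather than recommendations, and as noted, blunt blocking comes with trade-offs.

```
# robots.txt: ask all crawlers to stay out of internal search result pages
User-agent: *
Disallow: /search/
```

A robots meta tag, by contrast, sits in the page’s own <head> and allows crawling while keeping the page out of the index:

```html
<!-- Allow crawling (and link discovery) but exclude the page from the index -->
<meta name="robots" content="noindex, follow">
```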

This doesn’t leave many options, so I’d advocate trying to work with whoever is building the site to limit the number of problem pages from the outset. Everything else is sticking tape – and sadly is a subject I don’t have space to write about fully here.

Keeping it simple

As I’ve alluded to above, by far the most ‘at risk’ sites when it comes to page proliferation are e-commerce sites with faceted browsing.

Providing so many necessary options to users, from price to colour, brand, size, material and much more, means an almost inevitable problem of too many overly similar pages being available for search engines to index. In the worst cases the problem is virtually infinite in scale, as search engines will give up crawling long before they find all the possible combinations of facets on a big e-commerce site.

A good rule of thumb with these sites is to allow search engines one level deep into your faceting, but forbid anything deeper or any combinations of facets – although, as ever, there are exceptions to any rule.
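As a rough sketch of how that rule of thumb might be expressed, assuming the site appends one query parameter per facet (e.g. /dresses?colour=white for one facet, /dresses?colour=white&size=m for two), you could block any URL that combines two or more facet parameters:

```
# robots.txt sketch – assumes one query parameter per facet.
# Wildcard (*) patterns are supported by the major search engines.
User-agent: *
# Block any URL whose query string contains two or more parameters,
# i.e. two or more facets combined.
Disallow: /*?*=*&*=
```

A pattern this broad would also catch other multi-parameter URLs, such as sorting combined with pagination, so it needs tailoring to how the site actually constructs its URLs; the point is simply that the one-level rule can be enforced mechanically rather than page by page.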