Google Search Console is one of the very few places you can get free link and ranking data. Even compared to paid sources, it often has better link data and reports more long-tail keyword data, because it’s the only provider with direct access to Google’s data.
Link & Ranking Data Limitations
There is a problem – Search Console keyword data is ‘truncated’, whilst links data is ‘sampled’. Search Analytics – the part that gives you the ranking data – limits what it shows you in two ways:
- It strips out ‘very rare’ queries for user privacy reasons; and then
- It shows a maximum of 1,000 queries. When paid ranking tools can often show over a quarter of a million keywords for big sites, this limitation is crushing.
The link data, meanwhile, is a random sampling. That means that if you’re trying to clean up your links then the links that are hurting your site might not even show up in Search Console. Agh!
All of that means Google’s not giving you all the data they have about your site. If you’re trying to build a list of the keywords you’re ranking for – or a complete list of all the links your site has – then that’s a problem.
Solution – Forcing Search Console to give us the data
Simply put, we add the site to Search Console multiple times. It sounds stupid, but it works because:
- Different parts of your site will be ranking for different search terms. If you have a different property for the blog than for a theoretical ‘shoes’ category, then the 1,000 keywords shown in each property will be completely separate sets of keywords.
- Google says the links data is randomly sampled – so another Search Console property will see a different random sampling.
Side note: I don’t believe the links data shown is entirely random, despite what Google says. If it were truly random, there would be plenty of cases where a site’s top links don’t show up in Search Console at all, but I’ve never come across this (except where the link’s very, very new). I believe it’s more likely Google’s showing your top links and then sampling at the lower, more spammy, end. This makes it even more important, if you’re doing a link cleanup, to gain access to all that sampled data!
This can be scaled almost infinitely (depending on the number of folders your site has), but bear in mind that having 50 Search Console properties for one site can be a total pain when, for example, you need to give someone else access.
Let’s use Verve Search’s own site as an example. You’ll probably be starting with only one property for that site in your Search Console account. Time to add (a lot) more! You start by clicking the ‘add a property’ button:
You’ll then be asked what ‘site’ you want to add:
What you do here will depend on your site. You’ll want to create a list of your top folders of interest. If you don’t know what the top folders on your site are, have a look in Google Analytics (Behaviour >> Site Content >> Content Drilldown) for the most visited folders. We’ll be adding:
- Our blog
- The Knowledge Bank
- Our ‘services’ folder
Generally though, you’ll want to consider adding:
- Main categories
- Key product pages (depending on your URL structure)
- Key landing pages
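If you’d rather work from a raw page export than eyeball the Content Drilldown report, a few lines of Python can tally which top-level folders your pages fall into. This is a minimal sketch using only the standard library – the URLs below are hypothetical placeholders, so swap in your own exported page list:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical page URLs, as you might export them from Google Analytics
pages = [
    "https://www.example.com/blog/post-one",
    "https://www.example.com/blog/post-two",
    "https://www.example.com/services/seo",
    "https://www.example.com/knowledge-bank/guide",
    "https://www.example.com/blog/post-three",
]

def top_folder(url):
    """Return the first path segment of a URL, e.g. '/blog/'."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    return "/" + segments[0] + "/" if segments else "/"

# Count pages per top-level folder, most common first
folder_counts = Counter(top_folder(url) for url in pages)
for folder, count in folder_counts.most_common():
    print(folder, count)
```

The folders at the top of that list are your strongest candidates for separate Search Console properties.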
For each one you’ll be asked to verify site ownership. If you have Tag Manager installed this is the easiest thing in the world as Google will do it pretty much automatically for you. If you don’t, then you’ll have to pick from one of these alternative methods:
Of the alternatives, Google Analytics is by far the easiest if you’ve got it installed on the site. Domain name provider is almost always the most painful way of doing it. Once complete, you’ll have a nice list of properties in Search Console:
But wait! You’re not quite done yet. Remember I said that the links data showed random samples? As a result, there are a couple more properties you can register to get even more link data:
- http://[your domain]
- http://www.[your domain]
- https://[your domain]
- https://www.[your domain]
Even if you’ve got redirects set up and have always used one version rather than the others, it’s worth having all four in your Search Console just in case. Depending on your setup, verifying a few of these may be a pain too, but it’s actually recommended by Google:
The bad news is that it can take up to a week before the below message disappears and data starts flowing into the new properties you’ve created:
Once Google has finally processed your data, however, you’ll have lots of shiny new information to check through. Remember, the link data in different properties will often contain duplicates, but by exporting it and removing duplicates in Excel you’ll often be left with up to 50% more link data!
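If Excel isn’t your thing, the same de-duplication step can be done in a few lines of Python. This is a minimal sketch with made-up link URLs standing in for your per-property exports; `dict.fromkeys` removes duplicates while keeping the first-seen order:

```python
# Hypothetical link exports from three properties (overlap is expected)
property_exports = [
    ["https://a.example/1", "https://b.example/2"],
    ["https://b.example/2", "https://c.example/3"],
    ["https://c.example/3", "https://d.example/4"],
]

# Flatten all exports, then de-duplicate preserving first-seen order
all_links = list(dict.fromkeys(
    link for export in property_exports for link in export
))

print(len(all_links))  # unique links across every property
```

In this toy example, six exported rows boil down to four unique links; on a real site the unique remainder is where the ‘new’ link data lives.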
So there you have it – thousands of extra ranking data points and tons of ‘new’ links from a task that should take half an hour at most.