SES London 2014: How to Become a Leading SEO Mechanic
I attended SES London, held a stone’s throw from Big Ben and Westminster Abbey. I sat in on many presentations covering a wide range of SEO problems and solutions, and here’s a quick rundown of some of the best technical SEO tips and tricks from the “How to Become a Leading SEO Mechanic” session.
Dave Naylor of Bronco
The starting point
When analysing a website, start with Google Webmaster Tools. The key things to look out for are:
- Crawl errors: these identify technical issues such as broken links, site errors and URL errors across devices.
- HTML issues: watch for duplicate content and remove or consolidate the offending pages as soon as possible. Staying on top of this helps you maintain and improve your rankings.
Within Webmaster Tools you should be downloading your backlinks daily (yes, daily) according to Dave Naylor. This prevents unwanted links from going unnoticed.
Note that Webmaster Tools gives sampled data, so it doesn’t show the full picture: the set of backlinks you download one day will differ from the set you download the next.
Be aware that Webmaster Tools is not perfect; use it alongside other tools such as Majestic SEO and Ahrefs to complement and verify its data.
The reason is that missed links can be extremely costly on very large sites. If you can’t see the whole spectrum of links and overlook a large number of bad ones, disavowing them becomes very difficult. That matters if you want to avoid a penalty and stay in Google’s good graces.
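The daily-download workflow above can be sketched with a short script that diffs today’s export against yesterday’s, so newly appearing links never go unnoticed. This is a minimal sketch, not part of the talk: the file paths and the `Links` column header are assumptions, and should be adjusted to match whatever CSV your backlink tool actually exports.

```python
import csv

def new_backlinks(yesterday_path, today_path, url_column="Links"):
    """Return links that appear in today's CSV export but not yesterday's.

    Assumes each export is a CSV with a column (default "Links")
    holding one backlink URL per row.
    """
    def load(path):
        with open(path, newline="") as f:
            # Collect the URL column into a set, trimming stray whitespace
            return {row[url_column].strip() for row in csv.DictReader(f)}

    # Set difference: links present today that weren't there yesterday
    return sorted(load(today_path) - load(yesterday_path))
```

Run this once per day after each download and feed the output into your link-audit spreadsheet; any unsavoury newcomers are then easy to queue up for a disavow review.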
Other areas to check
It is still unknown whether a website’s loading time has a direct effect on rankings: some studies have shown no correlation, while others in the industry swear by it. What is certain is that it affects the user experience; if a site takes ages to load, the user will more than likely leave for a competitor. Use tools such as Botify and GTmetrix to find your site’s key problem areas and bring those load times down.
Relevant content has long been a major ranking factor, so it’s extremely important to look out for thin and duplicate content. There are several ways to do this, e.g. a quoted site search: site:www.vervesearch.com “excitement of imagination such as animates an artist”
Andre Alpar of AKM3
Andre Alpar gave an informative presentation on how to find and fix technical issues within your site. It boiled down to two main areas of focus, crawler management and indexation management, framed with the culinary metaphor that a website is like an onion. Crawling and indexation ultimately need to be managed in order to achieve higher rankings.
Controlling the levels to which Googlebot travels through a website is said to be a “necessary precondition for great rankings and efficiency plus effectiveness in SEO”. With very large sites it’s all the more important to pick and choose what the bot crawls: it won’t crawl everything, so it’s vital that the most useful pages are found, crawled and indexed.
Pages such as PDFs, filter and sorting options, and printable versions can be described as the skin of the onion: they serve a purpose for the vegetable but can ultimately be thrown away, so these pages should be blocked with robots.txt.
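As a concrete illustration, a robots.txt along these lines would keep crawlers away from the “onion skin” page types mentioned above. The paths and parameter names here are hypothetical, not from the talk, and would need to match your own URL structure:

```
User-agent: *
# PDFs (assumed to live anywhere on the site)
Disallow: /*.pdf$
# Hypothetical filter and sort query parameters
Disallow: /*?sort=
Disallow: /*?filter=
# Hypothetical printable-version directory
Disallow: /print/
```

Note that robots.txt prevents crawling, not indexing, and that wildcards like * and $ are honoured by Googlebot but are not part of the original robots.txt standard, so it’s worth checking rules in Webmaster Tools before deploying them.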
There are said to be three types of user-oriented web pages; knowing what to noindex, what to block, and what to focus on is key.
- User and internal linking pages: pages such as paginated content, help pages and checkout pages can be described as the outer core, a little hard and tasteless with low overall value, so these should be set to noindex, follow
- User, internal linking and SEO pages: these can be seen as the juicy centre; they hold all the nutrients for both the user and SEO, and therefore need to be crawled and indexed
- SEO-oriented landing pages: these are the juicy, juicy core, and where the majority of your efforts should lie
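For the “noindex and follow” treatment on those outer-core pages, the standard mechanism is a robots meta tag in each page’s head. This snippet is illustrative rather than taken from the talk:

```html
<!-- Keep this page out of the index, but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```

Unlike a robots.txt block, this requires the page to be crawlable so the directive can be read, which is why it suits pages you still want crawled for their internal links.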
Andre Alpar suggested spending most of your time and effort on making sure that every page with relevance to your end user and SEO value has unique, relevant content. By steering the crawler away from pages that don’t need to be indexed via robots.txt, you increase the chances of greater rankings for the content that matters.