How To Do a Technical SEO Audit
Below is a list of items that you should review in your technical SEO audit:
Robots.txt – Is your robots.txt file preventing search engines from crawling key pages or sections of your website?
Robots.txt is a file you can use to tell search engines how to crawl your website. You can check yours by typing /robots.txt after your website URL, for example yourdomain.com/robots.txt.
The file is made up of one or more ‘User-agent’ lines, each followed by ‘Disallow’ (and optionally ‘Allow’) rules, and often a ‘Sitemap’ line.
Make sure you’re not blocking any content or sections of your website that you want crawled. If you are, remove those rules from the file. A couple of tips (with a scripted check sketched after them):
- It’s best practice to add the location of your XML sitemap to your robots.txt file.
- If you have an internal site search function, you should disallow its result pages from being crawled.
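If you’d rather script the check, Python’s standard library can parse a live robots.txt file and tell you whether a given crawler may fetch a URL. A minimal sketch, assuming a placeholder example.com domain and a handful of hypothetical key pages:

```python
# Check whether important URLs are blocked by robots.txt.
# The domain and pages below are placeholders - swap in your own.
from urllib.robotparser import RobotFileParser

ROBOTS_URL = "https://www.example.com/robots.txt"
KEY_PAGES = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/blog/",
]

parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetches and parses the live robots.txt

for url in KEY_PAGES:
    # can_fetch() applies the same Allow/Disallow rules a crawler would
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'OK     ' if allowed else 'BLOCKED'} {url}")
```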
Noindex tags – Are you blocking key pages from being indexed with a meta noindex tag?
A meta noindex tag tells search engines not to include a specific page in their index. You can check whether you are blocking important pages by using a crawling tool.
In Screaming Frog, enter your URL into the bar at the top of the application and start the crawl.
Note: you’ll be using this crawl throughout the guide to analyse different parts of your site.
Once the crawl completes, scroll down the Overview tab until you reach ‘Directives’, then click ‘Noindex’.
If any of these pages are of high value to your website you’ll want to remove the noindex tag.
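Alongside the crawl, you can spot-check a few high-value URLs yourself. A rough sketch (placeholder URLs; assumes the third-party requests and beautifulsoup4 packages) that looks for noindex in both the meta robots tag and the X-Robots-Tag response header:

```python
# Spot-check a handful of important URLs for a noindex directive.
import requests
from bs4 import BeautifulSoup

URLS = [
    "https://www.example.com/",
    "https://www.example.com/services/",
]

for url in URLS:
    response = requests.get(url, timeout=10)
    # noindex can be set in an HTTP header as well as in the HTML
    header = response.headers.get("X-Robots-Tag", "")
    soup = BeautifulSoup(response.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    meta_content = meta.get("content", "") if meta else ""
    if "noindex" in header.lower() or "noindex" in meta_content.lower():
        print(f"NOINDEX    {url}")
    else:
        print(f"indexable  {url}")
```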
Canonicals – are your web pages competing against each other in search results?
Using canonical tags is a way to tell search engines which version of a page you want them to rank in search results, which helps prevent duplicate content issues.
For example, if you have an HTTP and an HTTPS version of your website, you’ll want to tell search engines to rank the HTTPS version.
In your SEO audit, check that all pages on your website have a canonical tag. Even if you only have one version of each page, you want it to use a self-referencing canonical.
You can check this manually on small sites or use Screaming Frog on larger sites.
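For a scripted spot check on a handful of pages, something like the sketch below (placeholder URLs; requests and beautifulsoup4 assumed) reports whether each page declares a canonical and whether it points to itself:

```python
# Check each page for a canonical tag and whether it is self-referencing.
import requests
from bs4 import BeautifulSoup

URLS = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
]

for url in URLS:
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    link = soup.find("link", attrs={"rel": "canonical"})
    canonical = link.get("href") if link else None
    if canonical is None:
        print(f"MISSING canonical  {url}")
    elif canonical.rstrip("/") == url.rstrip("/"):
        print(f"self-canonical     {url}")
    else:
        print(f"points elsewhere   {url} -> {canonical}")
```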
4xx and 5xx errors – are your pages not accessible to search engines?
Search engines can’t access and index your pages if they return errors, the most common being a ‘404 not found’.
Using Screaming Frog, you can identify and fix any 4xx and 5xx status code errors on your website.
To fix the errors you have a couple of options:
- If you still need the page, update it so it can be found at that URL, or 301 redirect it to its new location.
- If the page is no longer needed, 301 redirect the URL to the most relevant page on your website.
It’s very important to use 301 redirects (not 302), because a 301 passes on any links or SEO value the original URL had.
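If you want a quick scripted check outside of a crawler, the sketch below (placeholder URLs; requests package assumed) reports the status code and redirect target for each URL, so 4xx/5xx errors and temporary 302s stand out:

```python
# Report status codes and redirect targets for a list of URLs.
import requests

URLS = [
    "https://www.example.com/old-page/",
    "https://www.example.com/missing-page/",
]

for url in URLS:
    response = requests.get(url, allow_redirects=False, timeout=10)
    status = response.status_code
    if status in (301, 302, 307, 308):
        # 302/307 are temporary redirects - usually these should be 301s
        print(f"{status} {url} -> {response.headers.get('Location')}")
    elif status >= 400:
        print(f"{status} ERROR {url}")
    else:
        print(f"{status} OK    {url}")
```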
Index status – are all your website pages being indexed by search engines?
Unfortunately, having a page on your website doesn’t guarantee that it will be in the index.
You can do a quick check with a site: search in Google, for example site:yourdomain.com.
Then ask: is the number of results in the same range as your Google Analytics data?
Jump into Google Analytics, go to Behaviour > Content > Landing Pages and compare the number of landing pages there.
I have 41 pages in Google’s index and 46 in my Analytics report, so it all seems to be okay.
If you’re seeing a large discrepancy between the pages on your website and the pages in the search engines’ index, you will need to investigate.
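To get a quick ‘pages on your website’ baseline for that comparison, you can count the URLs listed in your XML sitemap. A small sketch using only the standard library (the sitemap URL is a placeholder, and it assumes a plain sitemap rather than a sitemap index):

```python
# Count the URLs listed in an XML sitemap.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

urls = [loc.text for loc in tree.iter(f"{NS}loc")]
print(f"{len(urls)} URLs in the sitemap")
```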
Sitemaps – are you making your website easier to crawl with HTML and XML sitemaps?
Sitemaps are essentially a map of your website that search engines can follow to find all your pages.
Your website should have both an XML and an HTML sitemap, so check that you have both. The XML sitemap is a file built for search engines, while the HTML sitemap is a normal page that helps users (and crawlers) find your content.
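Most CMS platforms and SEO plugins will generate the XML sitemap for you, but for reference, here is a minimal sketch of building one by hand from a list of placeholder URLs:

```python
# Build a minimal XML sitemap from a list of page URLs.
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/contact/",
]

entries = "\n".join(f"  <url><loc>{page}</loc></url>" for page in PAGES)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```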
Site Architecture – is your website well planned and structured to pass authority?
You can spend days auditing your site architecture to make sure it’s well planned.
But for your SEO audit, you want to make sure pages aren’t falling too deep into the architecture. By this, I mean more than three clicks from the homepage.
You can check this in Screaming Frog by selecting the ‘Site Structure’ tab:
If you’re seeing a large percentage of pages three or more clicks from your homepage, you will need to address this.
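If it helps to see the idea behind click depth, here is a small sketch: given a map of each page’s internal links (a hypothetical, hand-made example here; a crawler builds this for you), a breadth-first search from the homepage gives the number of clicks needed to reach every page, which is roughly the figure Screaming Frog reports as crawl depth.

```python
# Compute click depth from the homepage with a breadth-first search
# over a (hypothetical) internal link graph.
from collections import deque

LINKS = {
    "/": ["/services/", "/blog/"],
    "/services/": ["/services/seo/"],
    "/blog/": ["/blog/post-1/"],
    "/services/seo/": [],
    "/blog/post-1/": ["/blog/post-2/"],
    "/blog/post-2/": ["/blog/post-3/"],
    "/blog/post-3/": [],
}

depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for linked in LINKS.get(page, []):
        if linked not in depth:
            depth[linked] = depth[page] + 1
            queue.append(linked)

for page, clicks in sorted(depth.items(), key=lambda item: item[1]):
    flag = "  <-- more than three clicks deep" if clicks > 3 else ""
    print(f"{clicks} clicks  {page}{flag}")
```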
Site Speed – is your page load speed providing a good user experience?
Page speed is important for user experience, and search engines prefer fast websites.
Two seconds is the load time websites should aspire to, but this is often not realistic. Run your website through Google PageSpeed Insights and Pingdom to get an understanding of how you stack up.
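PageSpeed Insights also has an API, which is handy if you want to check scores programmatically. A sketch assuming the v5 endpoint and a placeholder page URL (heavier use requires a Google API key, and the exact response structure may change):

```python
# Query the PageSpeed Insights API for a mobile performance score.
import json
import urllib.parse
import urllib.request

PAGE = "https://www.example.com/"
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

query = urllib.parse.urlencode({"url": PAGE, "strategy": "mobile"})
with urllib.request.urlopen(f"{API}?{query}") as response:
    data = json.load(response)

# Lighthouse reports performance as a 0-1 score; multiply by 100 for the familiar number
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {round(score * 100)}")
```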
If you’re getting average results, you may want to enlist the help of a developer, who will be able to improve your website’s performance.
Security SSL – is your website on a secure protocol?
Having a website secured by an SSL certificate has been a light ranking factor for some time now.
It’s a no-brainer to upgrade your website to HTTPS if you haven’t already. You need to be very careful with the migration and correctly 301 redirect every HTTP URL to its HTTPS equivalent.
Here is a great resource on the process of migrating to HTTPS.
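Before and after the migration, a quick scripted check that the HTTP version 301-redirects to HTTPS can save you some pain. A sketch with a placeholder URL, assuming the requests package:

```python
# Check that the HTTP version of the homepage 301-redirects to HTTPS.
import requests

HTTP_URL = "http://www.example.com/"

response = requests.get(HTTP_URL, allow_redirects=False, timeout=10)
location = response.headers.get("Location", "")

if response.status_code == 301 and location.startswith("https://"):
    print(f"OK: 301 -> {location}")
else:
    print(f"Check this: status {response.status_code}, Location: {location or 'none'}")
```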
Mobile – is your website optimised for the mobile experience?
More than half of internet users are on a smartphone and search engines know this.
Is your website mobile responsive?
If not, you could be in for a big shock as Google rolls out mobile-first indexing and starts ranking websites based on their mobile version. If you haven’t already made your website responsive, you should do so ASAP.
Here’s Google’s guide on how to prepare for mobile-first indexing.
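As a very rough scripted heuristic, you can check that a page declares a viewport meta tag, which responsive templates almost always do. It only proves the tag exists, not that the layout actually works on a phone (placeholder URL; requests and beautifulsoup4 assumed):

```python
# Rough responsiveness heuristic: look for a viewport meta tag.
import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/"

soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")
viewport = soup.find("meta", attrs={"name": "viewport"})

if viewport:
    print(f"viewport meta found: {viewport.get('content', '')}")
else:
    print("No viewport meta tag - the page is unlikely to be mobile responsive")
```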
Structured data – has your website been marked up with structured data?
Using Google Search Console, check whether your website is using structured data by navigating to Search Appearance > Structured Data.
Structured data allows you to enhance your search result appearance, which can increase click-through rates and provide more information to searchers.
For example, you can display review stars for your product or service in the search results.
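Those review stars come from schema.org markup in the page, usually as JSON-LD. Here is a sketch of what that markup might look like for a hypothetical product, generated with Python purely for illustration:

```python
# Print a JSON-LD script tag for a hypothetical product with review markup.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "89",
    },
}

print('<script type="application/ld+json">')
print(json.dumps(product, indent=2))
print("</script>")
```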