Technical SEO: The Only Guide You’ll Ever Need

Written by: Tom Donohoe
Last updated: August 5, 2018

This guide has everything you need to know about technical SEO.

So if you want to learn technical SEO, you’re in the right place.

And let me be clear about something:

I made sure that this is the most thorough guide available online. Your one-stop resource for learning everything.

Let’s dive right in.






What is Technical SEO?

Technical SEO is a subjective area of search engine optimisation. And anyone you ask will give you a different definition. So, I’ll add my two cents to the debate.

At a high level, technical SEO involves ensuring that search engines can crawl, index and rank your website. It encompasses maintaining all technical aspects of your website to keep up with webmaster best practices, which includes, but is not limited to, things like mobile, speed, structured data and security.

In plain English, technical SEO is managing all technical aspects of the website other than content and links.





Why is technical SEO important?

Technical SEO isn’t sexy, and most website owners don’t understand why it’s essential. But it’s critical to every other aspect of SEO.

It breaks down into three categories:

  • Performance: if your website is slow, users will leave your website
  • Crawlability: if search engines can’t easily crawl your website, they won’t understand it
  • Indexation: if they can’t crawl it, they won’t index it, so you’ll never rank in the search results.

So, as you can see, getting these things right is vital if you want your website to drive results. Technical SEO is the foundation of your website. Without it, everything else will be unstable.

Websites with robust technical SEO are well placed to take advantage of other SEO techniques, such as optimised content and quality link building campaigns.




Chapter 1

Crawling

When you use a search engine, believe it or not, you’re not searching the internet. Instead, you’re searching a search engine’s index of the internet.

For your website to be present in a search engine’s index, it must be crawled. If your website can’t be crawled and indexed, you’ll never rank in search results.

In this chapter, I’ll cover everything you need to know from a technical SEO perspective about crawling.





A dead-simple explanation of web crawling

Crawling is when a bot is sent by a search engine to visit your website. The bot goes from one link to another, then makes a note of the content on the website and sends it off to be stored for later.

In Google’s guide to how Search works, they use the analogy of a library to explain crawling:

“The web is like an ever-growing library with billions of books and no central filing system. We use software known as web crawlers to discover publicly available web pages. Crawlers look at web pages and follow links on those pages, much like you would if you were browsing content on the web. They go from link to link and bring data about those web pages back to Google’s servers.”

In plain English, search engines come to your website, view the pages and store them to show to people later.

Technical SEO’s role in crawling

As an SEO or site owner, it’s critical to make sure search engines can crawl your important pages (not all pages).

I say, ‘important pages’ because you don’t want every page on your website to be crawled. You can control which pages on your website you want to be crawled by search engines.

You might be wondering:

“Why not have every page on my website crawled?” Well, if every page on your website is crawled, this can waste your ‘crawl budget’ (more on this later).

A lot goes into making your website accessible to search engines. But solid site structure is critical for search engines to crawl your website. In a nutshell, site structure is about having important pages within three clicks of your homepage.

How your website is crawled and controlling it

I mentioned that search engines give your website a crawl budget. Crawl budget means they won’t crawl every page on your website every time they visit.

So, how can you control the pages that they crawl?

  • Robots.txt: you can tell web crawlers, like Googlebot, to ignore specific sections of your website via a robots.txt file.
  • Robots meta tags: you can tell web crawlers not to index certain pages by using a ‘noindex’ tag in the page’s <head>.
  • X-Robots-Tag: like the robots meta tag, you can control indexing, but the X-Robots-Tag is set in the HTTP response header, typically via your .php files, .htaccess, or server configuration. (Examples of all three approaches are shown below.)
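As a rough illustration (the paths and file types below are made up for the example), here’s what each of these three controls can look like:

    # robots.txt: tell all crawlers to ignore a section of the site,
    # and point them at your sitemap
    User-agent: *
    Disallow: /admin/
    Disallow: /cart/

    Sitemap: https://www.example.com/sitemap.xml

    <!-- Robots meta tag: placed in a page's <head> to keep that page out of the index -->
    <meta name="robots" content="noindex, follow">

    # X-Robots-Tag via .htaccess (Apache with mod_headers): keep PDF files out of the index
    <FilesMatch "\.pdf$">
      Header set X-Robots-Tag "noindex, nofollow"
    </FilesMatch>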

On the flip side, you can help web crawlers access and crawl your website more easily by providing XML and HTML sitemaps.

Sitemaps list all the pages on your website in one location. The main difference is that XML sitemaps can provide more information to crawlers, such as the last time a page was updated.
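As a minimal sketch (the URLs and dates are placeholders), an XML sitemap following the sitemaps.org protocol looks something like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page; <lastmod> tells crawlers when the page last changed -->
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2018-08-01</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/category/sub-category/</loc>
        <lastmod>2018-07-15</lastmod>
      </url>
    </urlset>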




Chapter 2

Mobile

Smartphones have taken over the world. In most countries, there are more mobile phones in use than computers, requiring us as SEOs to shift our strategy.

It’s clear that mobile is a priority for search engines, and recent announcements from Google confirm it. For example, rolling out their mobile-first index and making page speed a ranking factor for mobile searches.

In this chapter, I’ll share everything you need to know to get your website ready for mobile SEO.





Creating Mobile websites

If your website isn’t mobile-friendly, then you could be losing a portion of your customers. Research by Google has uncovered that mobile surpassed desktop searches back in 2015.


If you don’t know whether your website is mobile-friendly, test it with Google’s mobile-friendly testing tool.

When creating a mobile-friendly website, you have three choices: responsive web design, dynamic serving, or separate mobile URLs.

3 ways to implement your mobile website

Google doesn’t favour a particular approach when ranking websites. But, responsive design is Google’s recommended design pattern.

Best practices for creating a Mobile website:

  • Google recommends using Responsive Web Design over other design patterns.
  • Use the meta name="viewport" tag to tell the browser how to adjust the content (see the example below this list).
  • If your website uses JavaScript, Google recommends a JavaScript-adaptive configuration.
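For reference, here’s what that viewport tag typically looks like in a page’s <head>:

    <!-- Tells mobile browsers to match the screen's width and start at 100% zoom -->
    <meta name="viewport" content="width=device-width, initial-scale=1">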

Common mistakes to avoid:

  • Blocked JavaScript, CSS, and Image Files
  • Unplayable Content
  • Faulty Redirects
  • Mobile-Only 404s
  • Intrusive interstitials
  • Irrelevant Cross Links
  • Slow Mobile Pages
  • Incorrect viewport configuration
  • Small font size
  • Touch elements too close

If you’ve submitted your website to Google Search Console, you can use the mobile usability report to fix mobile usability issues affecting your site.

Mobile-first indexing

Google now indexes websites by the mobile version of their content. In the past, Google used the desktop version to rank websites.

So, this means that mobile SEO is more important than ever. And optimising your website for mobile-first indexing is vital.

Best practices for mobile-first indexing:

  • Your mobile site should contain the same content as your desktop site
  • Structured data should be present in both versions of your site.
  • Metadata (page titles and meta descriptions) should be present on both versions of the site.

If your website uses responsive design, it’s likely that you don’t need to do anything to prepare for the mobile-first index. But I recommend reviewing your configuration to make sure it follows best practices. It’s also crucial that you have tailored your content to a mobile audience.

Improve mobile site speed

Page speed became a mobile search ranking factor in July 2018, so it’s critical to improve your mobile site speed. Improving page speed can be very technical, and it’s best left to an experienced developer.

The future of mobile SEO: AMP & PWAs

Technology for improving the mobile experience is advancing at a blistering pace. Two approaches gaining traction are Accelerated Mobile Pages (AMP) and Progressive Web Apps (PWAs). Both aim to provide a better (and faster) experience for mobile users.

Accelerated Mobile Pages (AMP)

AMP is a straightforward way to create web pages that are compelling, smooth, and load near instantly for users. The open-source project was created by Google in collaboration with Twitter.

Progressive Web Apps (PWAs)

PWAs are regular websites that appear and act like native mobile applications. They are much faster than a traditional website and aim to provide a better user experience.

Because PWAs are built on a JavaScript framework, getting them indexed correctly by search engines can be difficult. Read my guide to JavaScript SEO to understand the measures you need to take to get JavaScript-heavy websites indexed properly.

I will deep dive into both AMP and PWAs in a mobile SEO guide I’m working on.




Chapter 3

HTTPs (SSL)

Security is a top priority for Google, and since August 2014, Google has been using HTTPs as a ranking signal.

So, should you switch to HTTPs? Well, research from Ahrefs suggests a slight correlation between HTTPs and higher Google rankings.

I’d recommend making the switch because having a secure website has benefits beyond just SEO. For example, visitors are more likely to trust your site, which can lead to more conversions.





What is HTTPs?

In plain English, HTTPs makes your website more secure for visitors by encrypting the data that travels between their computer and your site.

But if you’re more technical, here’s the definition from Wikipedia:

HTTPS (HTTP Secure) is an adaptation of the Hypertext Transfer Protocol (HTTP) for secure communication over a computer network and is widely used on the Internet. In HTTPS, the communication protocol is encrypted by Transport Layer Security (TLS), or formerly, its predecessor, Secure Sockets Layer (SSL). The protocol is therefore also often referred to as HTTP over TLS, or HTTP over SSL.

You can tell if a website is on HTTPs by the green padlock in the address bar:

secure website example

HTTP vs. HTTPs

The visual difference is that HTTPs URLs start with “https://,” whereas HTTP URLs start with “http://.” But the real difference is that HTTP websites are at risk of attackers stealing sensitive information from your site’s traffic.

HTTP | HTTPs
Non-secure against hackers | Secure against hackers
Not SEO friendly | SEO friendly
Not trusted by visitors | Trusted by visitors
Can’t use AMP | Can use AMP
HTTP/2 not supported | HTTP/2 supported
Does not preserve referrer data | Preserves referrer data

How to migrate to HTTPs

If you’re ready to make the switch to HTTPs (you should!), I’ll now share everything you need to know to get started.

(Note: if you’re not very technical, get the help of an SEO and a developer, or even contact your web host.)

I don’t claim to be a technical expert at migrating websites from HTTP to HTTPs. So, below are a couple of guides I’ve successfully followed.
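One piece most of those guides have in common is a server-level 301 redirect from HTTP to HTTPs. As a rough sketch, assuming an Apache server with mod_rewrite enabled (check your own setup before relying on it), it can look like this:

    # .htaccess: redirect all HTTP requests to their HTTPS equivalents with a 301
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]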

A quick note on HTTP/2

HTTP/2 is an updated version of the HTTP protocol. Its goal is to make website pages load faster and more securely. The original HTTP dates back to the early 1990s, and the previous major revision, HTTP/1.1, arrived in 1997.

HTTP/1.1 requests the resources needed to load a webpage one at a time, which makes loading a page quickly tricky. HTTP/2 is designed to send and receive many responses at the same time. HTTP/2 should result in a faster page load time, in turn giving you SEO and performance benefits.

All the major browsers support HTTP/2, so compatibility shouldn’t be an issue. And chances are your web hosting server also supports it. So, to get support for HTTP/2 on your website contact your web host and find out the steps to enable it.
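If you’re curious whether HTTP/2 is already enabled on your site, here’s a quick check with curl (the URL is a placeholder, and your copy of curl needs to be built with HTTP/2 support):

    # Fetch only the response headers over HTTP/2 and show the first line
    curl -sI --http2 https://www.example.com | head -n 1
    # A server with HTTP/2 enabled typically responds with: HTTP/2 200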

How to get started:

  1. You have to be using HTTPs
  2. Check your server can support it.
  3. Configure the server to support HTTP/2



Chapter 4

Structured Data

Structured data is a term that refers to organising data in a standardised format so that it’s easier to understand.

Search engines, like Google, use structured data to understand the content of a webpage. It also allows them to present your website in ‘rich results,’ which I’ll share examples of later.

It’s important to note that structured data is not a ranking factor. But it can have an indirect impact on your rankings by improving click-through rate.





Structured data explained

You can help search engines understand your web pages by using structured data. Schema.org is the shared vocabulary used to mark up this information, and it was founded by Google, Microsoft, Yahoo, and Yandex.

Here’s an example of a review structured data snippet that might appear on a product page:

a review structured data snippet that might appear on the product page

And here’s how it looks in the search results:

example of review structured data in the search results

The 3 Ways of Structuring Data

Most search engines support structured data in three formats: JSON-LD, Microdata, and RDFa.

I prefer to use JSON-LD because it’s contained within inline <script> tags instead of being hardcoded around HTML elements. It’s also straightforward to install in the header via Google Tag Manager.
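As a minimal sketch (the product name, rating value and review count are made up), a JSON-LD review snippet looks like this:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Product",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.5",
        "reviewCount": "24"
      }
    }
    </script>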

Troubleshooting your markup

Structured data is easy to troubleshoot thanks to Google, who developed a Structured Data Testing Tool to test your markup.

Structured Data Testing Tool

Paste in your code or page URL, and the tool will check whether you’ve created the markup correctly.

Structured Data Testing Tool example with review schema

How Structured Data Helps Your SEO

This is an SEO guide, right? So, you might be wondering: “how does structured data help my SEO?” Well, here’s how:

Rich result: search results that contain styling, images, and other features. For example:

a review structured data snippet that might appear on the product page

Enriched search result: search results that are more interactive or more advanced. For example:

Enriched search result

Knowledge Graph result: a compilation of information from different sources, presented in a visual layout. For example:

Knowledge Graph result

Carousel: a collection of rich results from your website in the form of a carousel. For example:

Carousel

All these features help you get more exposure in the search engine results pages. Another benefit is that you can increase your CTR dramatically.

If you want to learn more, then read Google’s documentation on structured data.




Chapter 5

Migrations

There are four common types of migrations for a website: domain name change, redesign, CMS change, or HTTP to HTTPs.

Migrations are one of the scariest things to undertake as a website owner or SEO. There is a risk of losing search engine rankings, but if done right you can see minimal impact.

In this chapter, I’ll cover the best practices for website migrations.





Common Best practices for Migrations

As mentioned, there are several types of migrations a website can undertake. But there are standard best practices that you should be aware of for all of them.

  • Keep site structure, content, URLs, and metadata the same
  • Put redirects in place from old to new.
  • Update tools and tracking

When migrating, you should avoid changing more than one thing at a time if possible. Migrating with fewer changes makes it easier for search engines to understand that you’re still the same website, and it makes diagnosing any problems much more manageable.

Here’s a helpful diagram from Moz:

migrations diagram considerations

The best approach is to start by reading the documentation that Google provides on site moves. Now let’s take a more in-depth look at the most common types of site migrations.

Domain Migrations

If you’re changing your domain from oldexample.com to newexample.com, then the process is straightforward.

Replicate everything from the old site on the new site and then 301 redirect the old domain to the new one. You can then use Google’s change of address tool to notify Google to index your URLs at the new address. This step will minimise the impact on your current rankings in Google Search results.

Website Redesigns or relaunches

Website redesigns are where things get tricky. From my experience, most website redesigns involve changes to structure, content, URLs, and more. All these changes make it more challenging to migrate without SEO damage.

The key to migration success is planning and testing everything in a staging environment.

Content Management System changes

Content Management System (CMS) migrations are straightforward as long as the only change is the CMS. All you need to do is replicate your website precisely on the new CMS and switch it over.

Moving from HTTP to HTTPs

I mentioned in an earlier chapter the benefits of switching your website to HTTPs and how to do it. For brevity’s sake, I won’t cover it again.




Chapter 6

Page Speed

Page speed refers to the time it takes the content on your page to load.

Google confirmed way back in 2010 that page speed is a ranking signal for desktop searches, and more recently for mobile searches. So, we know that it’s crucial if you want your website to rank higher on Google.

In this chapter, I’ll cover what ‘page speed’ is and how you can improve the page speed of your website.





Why page speed matters

Other than better search engine rankings there are many benefits to improving page load speed, such as:

  • User experience: lower bounce rates, more time spent on site and better experiences.
  • Marketing performance: your advertising becomes more cost-effective as fewer people leave your site before seeing your messaging.
  • Your bottom line: you will get more conversions, which ultimately means more revenue.

I can’t think of a reason not to work on improving page speed, other than lack of financial resources.

How to improve your page speed

There are many techniques you can implement to improve page speed, but most need technical expertise.

I’m by no means a developer and don’t know how to implement the required changes myself. But I do understand what needs to be done to improve page speed.

A checklist for improving page speed:

Measure your existing page speed: benchmark your current page speed by running your website through Google PageSpeed Insights or Pingdom.

Start with your images: uncompressed images are the number one page speed killer, and image compression is the quickest win. Use a tool like TinyPNG to reduce the size of images before uploading them to your website. Aim for between 30KB and 150KB per image for optimal speed. Don’t compromise on image quality.

Minify CSS, JavaScript, and HTML: minifying means removing unnecessary characters and whitespace from your website’s code. It will reduce the size of the files and help shave time off your page load speed.

Enable Gzip compression: gzip compression compresses files on your web server before sending them to the browser. Most servers can do this easily; ask your web developer or contact your hosting provider to assist with setup.
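As a rough sketch, assuming an Apache server with mod_deflate available, enabling gzip can be as simple as this .htaccess snippet (check with your host before relying on it):

    # .htaccess: compress common text-based file types before sending them to the browser
    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/css application/javascript application/json image/svg+xml
    </IfModule>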

Use a content distribution network (CDN): a CDN is a network of servers in different geographic locations that store static files from your website. It delivers those files from the server closest to your website visitor. CDNs also offer further speed optimisations like gzip compression, parallel downloads and cookieless files.

Leverage browser caching: in plain English, browser caching stores static files (images, CSS, JavaScript) in the visitor’s browser, so repeat visits don’t have to download them all over again from your server.
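As a rough sketch, assuming Apache with mod_expires, you can set cache lifetimes per file type in .htaccess (the lifetimes below are illustrative; tune them to how often your files change):

    # .htaccess: tell browsers how long they can keep each file type before re-requesting it
    <IfModule mod_expires.c>
      ExpiresActive On
      ExpiresByType image/png "access plus 1 month"
      ExpiresByType image/jpeg "access plus 1 month"
      ExpiresByType text/css "access plus 1 week"
      ExpiresByType application/javascript "access plus 1 week"
    </IfModule>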

Remove render-blocking JavaScript: JavaScript can slow your website down if the browser has to wait for files to download and execute before the page renders. Best practice is to inline critical JavaScript and load the rest asynchronously or deferred.
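For example, non-critical scripts can be loaded with the async or defer attributes so they don’t block rendering (the file names here are illustrative):

    <!-- async: download in parallel and run as soon as the script is ready -->
    <script async src="/js/analytics.js"></script>
    <!-- defer: download in parallel and run after the HTML has been parsed -->
    <script defer src="/js/main.js"></script>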

Resources:




Chapter 7

Status Codes

In plain English, status codes are a way for web browsers and servers to communicate. To understand status codes, you need to understand the basics of browser-to-server communication.

The browser will send a request to the server to get a webpage. Then the server will send a status code back telling the browser the status of that webpage.
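You can see this exchange yourself by requesting just the response headers with curl (the URL is a placeholder):

    # Send a HEAD request and print the response headers
    curl -I https://www.example.com/some-page/
    # The first line of the response contains the status code, e.g. HTTP/1.1 200 OK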

Status codes are three-digit numbers, known as HTTP status codes. In this chapter, I’ll share everything you need to know about them from a technical SEO perspective.





Common status codes

Five different categories of status codes communicate different messages from the server to the web browser:

  • 1xx – Informational
  • 2xx – Success
  • 3xx – Redirection
  • 4xx – Client error
  • 5xx – Server error

It’s a good idea to memorise which category is which; that way, it’s easy to identify what the issue is. For example, if you see a 5xx error, you immediately know that it’s a server error and to speak with a developer.

Each category has several different response codes that can be sent to the browser. But in this guide, I’ll cover the most important ones for SEO.

Important status codes for SEO

There are many HTTP status codes but only a handful that you need to be aware of for technical SEO. I’ll run you through them now.

HTTP Status Code 200 – OK

A 200 is the status code you’re hoping to see! It means that your page is functioning as expected.

HTTP Status Code 301 – Permanent Redirect

A 301 is used to redirect one URL to another URL permanently. 301 redirects will pass all the SEO value of an old URL to the new URL.
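For example, on an Apache server a single permanent redirect can be added to .htaccess like this (the paths are illustrative):

    # .htaccess: permanently redirect an old URL to its new location
    Redirect 301 /old-page/ https://www.example.com/new-page/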

HTTP Status Code 302 – Temporary Redirect

A 302 also redirects one URL to another, but it tells search engines the move is temporary, and research has shown that it is not as effective at passing SEO value. I’d advise that you always use a 301 over a 302 for permanent moves.

HTTP Status Code 404 – Not Found

You’re probably familiar with the “404 page not found”, which merely means that the resource can’t be found. You should 301 redirect 404 errors to the correct page.

HTTP Status Code 410 – Gone

A 410 is like a 404; however, it tells search engines the page is gone for good and is no longer available on the server.

HTTP Status Code 500 – Internal Server Error

Rather than the page being missing or not found, a 500 is an error with your server. You don’t want to have these issues and should have a developer investigate immediately.

HTTP Status Code 503 – Service Unavailable

A 503 means that the server is unavailable, usually because of maintenance or because the server is overloaded. It’s unlikely to have any impact on your SEO because search engines know to come back later.

Where to find and fix status code errors

To identify status code errors you will need to monitor the Crawl Errors report in Google Search Console.

Crawl Errors report in Google Search Console

The report will tell you the page where the error occurred and the status code associated with it. You can export this information to a spreadsheet and share it with a developer who will be able to fix the issues.

For 404 errors, you will want to 301 redirect them to a new location. If you’re using WordPress, you can use a plugin to redirect pages with ease.




Chapter 8

Indexation

To have your website appear in search engine results, it needs to be indexed.

A search engine’s index is like the index of a book. It’s a collection of all the resources that they have crawled on the internet.

In this chapter, I’ll share what you can do to improve your website’s indexability.





Ways to get your website indexed

There are many ways that search engines can find your website and index it. You can be passive or proactive, and to be honest, if you have excellent content, search engines will likely index it. But there are three core ways to get your website indexed:

  • Passive: do nothing and wait for search engines to crawl and index your website.
  • Active: provide a sitemap on your website that helps search engines crawl all pages.
  • Pro-active: submit your new URLs to search engines through webmaster tools.

There is no single best way to do it; it depends on your resources. But I’d recommend being at least active by placing sitemaps on your website.

For more details on each approach read Google’s documentation on indexing.




Chapter 9

Site Structure

Website structure is critical for user experience and also to get indexed by search engines.

Without a well-planned structure, your search engine rankings can suffer. And users will struggle to navigate to the content they are looking for.

In this chapter, I’ll take you through the best practices for a well-structured website.





Optimal site structure for SEO

The optimal site structure keeps every important page on your website close to your homepage, where users and search engines need no more than three clicks to reach what they are looking for.

Here’s a graphic of what a perfect site structure would look like:

graphic of what a perfect site structure would look like

But you shouldn’t randomly group content to achieve the ‘three click rule.’ It should cascade from homepage to category page and finally sub-category page.

By using categories, you will pass relevancy signals to search engines to help them better understand your content.

Importance of URL Structure

URLs should give people visiting your website, and search engines, an easy way to understand what the page is about. Your URL structure should also reflect your site structure. For example:

www.example.com/category/sub-category

www.tomdonohoe.com.au/blog/blog-post

It’s also best practice to include target keywords in the URL, as long as it doesn’t hinder the user experience or feel like you’re forcing it.

Why internal Links matter

Internal links are vital to maintaining a robust site structure. Internal links are hyperlinked text that links one page on a website to another.

For example, you should have an internal link pointing from your category page to your sub-category page and vice versa. Internal links will be followed by search engine crawlers to find and index pages deeper in your site.
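In HTML, an internal link is just a standard anchor tag pointing at another page on the same domain (the URL and anchor text here are illustrative):

    <!-- An internal link from a category page down to one of its sub-category pages -->
    <a href="https://www.example.com/category/sub-category/">Sub-category name</a>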

Internal links also help to distribute ‘link equity’ (ranking power) throughout your entire website, much like a backlink from an external website does.




Over to you

There you have it: everything you need to know about technical SEO.

I hope this guide was helpful and you learned a lot. If you have any questions please leave them below in the comments.