If you are reading this, you probably don’t need anyone else to tell you why SEO is important.
The equation is fairly simple – if they can’t find you, they can never become your customers.
While there are dozens of different aspects to SEO – all important in their own right – it always starts with technical SEO. Having come across and worked on numerous websites over the years, I know for a fact that people – especially those who run a ready-to-deploy CMS like WordPress – never really get around to working on technical SEO. For marketers, it's tough to onboard clients with long-winded SEO audits, simply because there isn't enough time.
Our mini Technical SEO checklist gives you a simple, easy to understand and actionable way around these problems. And yes, all it takes is about half an hour to find glaring holes that may already be costing your website an SEO fortune.
Technical SEO, as the name suggests, deals with the purely technical aspects of search engine optimisation. It covers optimising the website and its servers so that search engines can crawl, assess and index the content without any trouble.
To understand the importance of technical SEO, we need to understand how search engines work. Search engine crawlers regularly visit your pages to see what's going on. With each visit, pages on the website are assessed and indexed according to their content value, relevance and user experience. When a search query is entered, the most relevant pages are then 'fetched' from this index and displayed to the user.
If your pages aren't easy to crawl or index, Google and other search engines will never surface them properly. The result – your website gets pushed down the SERPs, regardless of the quality of the content. If your website isn't on 'friendly' terms with search engine crawlers, you might as well kiss goodbye to organic traffic.
Organic traffic is the bread and butter for business websites – and that is exactly what makes technical SEO enormously important.
A technical SEO audit tells you what difficulties search engines might be facing while crawling and indexing your website. A full-spectrum technical SEO audit is usually best left to experts and can take 2-4 hours to complete. You can, however, perform a quick technical SEO audit of the most important elements to get a broad idea of what needs to stay, what needs to go and what needs to improve.
In this post, we will draw a basic technical SEO roadmap that is generalised for all websites. Towards the end, you will find our free mini technical SEO checklist that you can keep handy for reference.
It’s important to remember that this mini technical SEO audit will only reveal problems that directly impact your website’s relationship with search engines. To audit and remedy more specific technical SEO factors, have a look at our end-to-end SEO solutions. Get in touch with us here, and we will build a detailed proposal for you.
A quick audit is usually enough to know if the website uses the best technical SEO practices or not. It paves the way to finding the most common problems that can be fixed right then and there. As was the case with our mini local SEO audit, I have developed this strategy with two broad aspects in mind:
The entire point of optimising your website is to stay on the right side of the rules followed by the search engines – especially Google.
A good way of knowing if you are doing this or not is to tie the essential reporting tools into your website. If the website you’re auditing isn’t already using these tools, it’s a red flag right there, and you can expect more technical SEO problems to crop up as you go on.
These tools include:
Google Search Console is your one-stop shop for most things technical SEO. Even when your website isn’t live, you should claim your domain as your property in your Search Console account.
Search Console is more of a diagnostic tool that delivers regular reports to your inbox.
Update: The new Search Console brings many previously scattered testing tools together. It's possible to perform most of the steps detailed in this mini audit using this version.
Focussing only on Google doesn't work anymore. You need Bing on your side, given that it processes roughly 1 out of 4 web searches. Bing Webmaster Tools is Bing's equivalent of Google Search Console.
Google Analytics is an analytics tool that keeps track of all visits to your website and provides some easy-to-process insights.
An XML sitemap is a clearly defined outline of how your website's content is arranged.
Experts have always differed on how sitemaps should be developed, read and reported. The XML format we use these days is more or less universal and accepted by all major search engines. For each URL, an XML sitemap tells search engines where the page lives (loc), when it was last modified (lastmod), how frequently it changes (changefreq) and how important it is relative to other pages on the site (priority).
With this information about every URL on your website, Google knows exactly which pages exist and how fresh they are. This ease of discovery reduces the amount of work crawlers have to do to find and index your content, helping improve the SERP rankings of your pages.
Most platforms automatically generate sitemaps. Generating a sitemap isn't enough, though – you also need to let Google know where it is. You can check the sitemap submission in Google Search Console, as shown above.
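If you prefer to script this check, here is a minimal sketch (Python, standard library only) that fetches a sitemap and prints what it declares for each URL. The sitemap address is a placeholder – swap in the domain you're auditing.

```python
# Minimal sketch: fetch a sitemap and list what it tells search engines
# about each URL. The address below is a placeholder, not a real sitemap.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://your-domain.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL, timeout=10) as resp:
    root = ET.fromstring(resp.read())

# Note: a sitemap index file lists child sitemaps instead of URLs and
# would need one extra loop over its <sitemap> entries.
for url in root.findall("sm:url", NS):
    loc = url.findtext("sm:loc", default="", namespaces=NS)
    lastmod = url.findtext("sm:lastmod", default="n/a", namespaces=NS)
    print(f"{loc}  (last modified: {lastmod})")
```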
Having an SSL-enabled website is no longer a luxury. Thanks to the search algorithm updates introduced by Google in 2014, a secure website now has a ranking edge over a comparable non-secure website.
While you are at it, also make sure that all non-https versions redirect to a single https version of the website. Your Search Console dashboard lets you report all such redirections to Google.
You can quickly check the status of your domain’s SSL certificate in the address bar, by clicking the HTTPS protocol or the secure padlock (depending on the browser you’re using).
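To confirm the redirects are in place, here is a quick sketch that requests the common URL variants and prints where each one ends up. The domain is a placeholder.

```python
# Quick sketch: check that the common URL variants all resolve to one
# https version. "your-domain.com" is a placeholder.
import urllib.request

VARIANTS = [
    "http://your-domain.com/",
    "http://www.your-domain.com/",
    "https://www.your-domain.com/",
    "https://your-domain.com/",
]

for url in VARIANTS:
    try:
        # urlopen follows redirects automatically; geturl() is the final URL
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(f"{url}  ->  {resp.geturl()}  (HTTP {resp.status})")
    except Exception as exc:  # DNS failures, SSL errors, timeouts, etc.
        print(f"{url}  ->  FAILED ({exc})")
```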
Robots.txt is a simple text file that tells search engine bots which pages to crawl and which to leave out. It gives webmasters more control over how their website appears in search results by keeping crawlers away from non-presentable pages like archives, category pages and user pages.
A robots.txt file can disallow as many URL patterns as you need (Google processes only the first 500 KB of the file), and it only becomes more important as you publish more content. This, however, shouldn't mean that only large websites need it. As far as the ease of being indexed goes, I recommend that every website – small or large – have a well-maintained robots.txt file.
Since this is a mini technical SEO audit, you don’t really have to go through the file itself. It’s enough at this point to see the overall health of the file. For a more thorough inspection of robots.txt, a full-scale technical SEO audit is necessary. Our customised SEO audit plans can dig up the smallest – yet significant – SEO issues that may be hurting your website on a daily basis. To know more or to get in touch with us, click here.
You can check the robots.txt file in your Search Console dashboard. For faster results, use the old version (or just skip to the robots.txt Tester). Skim through the Disallow rules to see if any important pages are accidentally blocked (trust me, this happens a lot!). A standard robots.txt file lives directly in the root folder (https://your-domain.com/robots.txt).
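You can also script the check with Python's built-in robots.txt parser to confirm that your most important pages aren't blocked. A minimal sketch, with placeholder URLs:

```python
# Minimal sketch: parse robots.txt and confirm important pages are not
# accidentally blocked for Googlebot. All URLs below are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://your-domain.com/robots.txt")
rp.read()  # fetches and parses the file

important_pages = [
    "https://your-domain.com/",
    "https://your-domain.com/products/",
    "https://your-domain.com/blog/",
]

for page in important_pages:
    status = "OK     " if rp.can_fetch("Googlebot", page) else "BLOCKED"
    print(f"{status}  {page}")
```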
The load times for web pages – or page speeds – are becoming increasingly important to maintain a certain level of user experience.
A bloated, slow-to-load page has multiple impacts – it drives users away (increasing the bounce rate), creates unnecessary maintenance problems and, most importantly, hands an advantage to competing pages that load faster. This is because Google loves fast websites (they have an entire project dedicated to this).
A wide range of on-page and off-page problems can cause page speed to go south. These typically include structural problems, content problems, hosting problems and development/design problems. Since this post is dedicated to finding errors rather than fixing them, we won't dwell too much on how to speed up slow-loading pages (that job should ideally be handled by your dev team). Common fixes include compressing images, enabling caching, minifying CSS and JavaScript, and upgrading the hosting.
The best starting point is the home page of the website. Just enter the home page URL into the PageSpeed Insights tool (don't forget to type in the correct http/https version of the URL), and Google will tell you whether your site is up to the mark or not.
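The same check can be scripted against the PageSpeed Insights v5 API. The sketch below assumes the v5 response layout and uses a placeholder URL, so treat it as a starting point rather than a finished tool.

```python
# Hedged sketch: query the PageSpeed Insights v5 API for a mobile
# performance score. The target URL is a placeholder; the response field
# names reflect the v5 Lighthouse payload.
import json
import urllib.parse
import urllib.request

target = "https://your-domain.com/"  # placeholder home page URL
api = (
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?"
    + urllib.parse.urlencode({"url": target, "strategy": "mobile"})
)

with urllib.request.urlopen(api, timeout=60) as resp:
    data = json.load(resp)

# Lighthouse reports performance as a 0-1 score; multiply by 100 to get
# the familiar PageSpeed number.
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score for {target}: {round(score * 100)}")
```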
I have always maintained that SEO is an ever-evolving process, and you should never rest easy thinking that one-time fixes will come good forever. The mobile-friendliness factor is a great example of this.
Not too long ago, most websites were designed to give the best user experience on desktops. Any traffic coming from mobiles was just a bonus. Things have changed a lot since then. Smartphones are gateways to the internet for billions of people, and all websites need to be serious about giving a good navigation experience on all smartphones (and tablets).
By checking the mobile-friendliness of your website, you make sure that you are not ignoring a large chunk of visitors who can potentially turn into leads and customers.
Important: Accelerated Mobile Pages (AMP) have started to become a measure of mobile-friendliness. If your website can't do without heavy scripts and plugins, enabling AMP can be a good way out. You can check the AMP status of any page by viewing its source code and searching for the rel=amphtml link tag. The AMP version links back to the original with a canonical tag, so it doesn't attract duplicate content penalties.
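Here is a small sketch that automates that view-source check and reports whether a page declares an AMP version. The page URL is a placeholder.

```python
# Small sketch: fetch a page and look for the rel="amphtml" link element.
# The URL below is a placeholder.
import re
import urllib.request

page = "https://your-domain.com/some-post/"  # placeholder URL

with urllib.request.urlopen(page, timeout=10) as resp:
    html = resp.read().decode("utf-8", errors="ignore")

match = re.search(r'<link[^>]+rel=["\']amphtml["\'][^>]*>', html, re.IGNORECASE)
if match:
    print("AMP version declared:", match.group(0))
else:
    print("No rel=amphtml link found on this page.")
```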
Google, once again, is your best friend here. Use this simple testing tool to get an instant mobile rendering of your website. You can also enter the URL in the latest Google Search Console version to check its mobile-friendliness.
This part includes bits and pieces from technical SEO and on-page SEO.
You may be absolutely sure that you haven’t published duplicate content, but you still need to check for any accidental URLs created by your CMS or publishing platform. This includes archive URLs, category URLs and comment threads.
Having multiple URLs that contain the same content confuses Googlebot and dilutes the SEO value your backlinks pass on. Any web practice that increases the workload of crawlers is bound to cost you in the rankings – and duplicate content is no exception to this rule.
Fixing duplicate content can be a time-consuming job, if the ‘infestation’ is large. HQ SEO’s thorough technical SEO audit service can identify all duplicate content events, along with other technical SEO problems. To request a free proposal, get in touch with us here.
The best way to avoid duplicate content penalties is to use canonical elements and to prevent the duplicate URLs from being crawled (via the robots.txt file).
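To spot-check a handful of suspect URLs, the sketch below fetches each one and reports whether it declares a rel="canonical" element. The URLs are placeholders.

```python
# Quick sketch: check whether suspect pages declare a rel="canonical"
# element. Both URLs below are placeholders.
import re
import urllib.request

suspect_urls = [
    "https://your-domain.com/blog/post/",
    "https://your-domain.com/category/news/blog/post/",  # possible duplicate
]

for url in suspect_urls:
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="ignore")
    tag = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]*>', html, re.IGNORECASE)
    href = None
    if tag:
        m = re.search(r'href=["\']([^"\']+)["\']', tag.group(0), re.IGNORECASE)
        href = m.group(1) if m else None
    print(f"{url}\n  canonical -> {href or 'MISSING'}\n")
```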
You can use Copyscape to run a quick and free duplicate content check for important URLs.
Structured data is another way of reducing the burden on search engine crawlers while also improving the visual appeal of search engine results via rich snippets. Since this is a web standards movement supported by all leading search engines, you can assume that it's going to become the norm in the coming years (it's already getting there).
When there’s no tool available, just turn to Google!
Enter the search term site:your-url and see if the results are displayed with extra attributes (date, time, price, location, metadata, reviews etc.). Refer to the image above to see how good structured data markup translates into rich snippets in the SERPs.
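For a quick programmatic look, this sketch pulls any JSON-LD structured data blocks out of a page's source and prints their declared types. The URL is a placeholder.

```python
# Minimal sketch: extract JSON-LD structured data (the markup behind rich
# snippets) from a page's source. The URL below is a placeholder.
import json
import re
import urllib.request

page = "https://your-domain.com/some-product/"  # placeholder URL

with urllib.request.urlopen(page, timeout=10) as resp:
    html = resp.read().decode("utf-8", errors="ignore")

blocks = re.findall(
    r'<script[^>]+type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
    html,
    re.IGNORECASE | re.DOTALL,
)

if not blocks:
    print("No JSON-LD structured data found.")
for raw in blocks:
    try:
        data = json.loads(raw)
        print("Found structured data of type:", data.get("@type", "unknown"))
    except (json.JSONDecodeError, AttributeError):
        print("Found a JSON-LD block that did not parse cleanly.")
```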
Click here (or the image above) to download a handy mini technical SEO checklist. Using this checklist, you can assess the technical SEO health of any website in about 30 minutes.
Technical SEO is the foundation of all SEO. If this mini audit reveals any warning signs or red flags, it’s time to let an expert take over.
Technical SEO, in a nutshell, is all about adopting the best web standards. Think of it as improving the user experience for search engine bots. The easier your website is for them to use, the better it is for you.
Have you found any technical SEO errors on your website? Let us know about them by getting in touch here. You can also request a free proposal by filling in the form below.