You know that optimizing your website for search engine algorithms is important, but don’t know where to start?
A technical SEO audit can be a great way to find potential problems that are preventing you from ranking higher, and our checklist will help you get started on performing a technical SEO audit of your own website. Keep on reading!
What Is Technical SEO And Why Is It Important For Businesses?
Technical SEO refers to the behind-the-scenes aspects of your website that impact how search engines crawl and index your content. These include things like the site’s current architecture, internal links, redirects, and more.
Many businesses are unaware of the importance of technical SEO and neglect this important aspect of their online marketing strategy. However, if you want your website to get higher search engine rankings and attract more visitors, it’s essential to prioritize technical SEO.
But before you get started with a technical site audit…
Before you get started
There are some preliminary questions you need to ask yourself and your team before starting the audit process. This will help you understand your website’s current state and what needs to be improved.
Some questions you should ask include:
- Why are you doing a technical SEO audit?
- Is this website newly migrated from another platform, server, or domain?
- If so, when did the migration happen?
- Has the website been crawled before migration?
- Do you have admin access to the WordPress dashboard and DNS settings?
- Are you using Google Tag Manager?
- Do you have access to Google Search Console?
- Is Google Search Console connected? Does Google Search Console have data?
- Do you have access to Google Analytics? Does Universal Google Analytics have data?
- Do you have Google Analytics 4 set up? Does Google Analytics 4 have data?
Once you have answers to these questions, you can move on to the next stage of the process and start auditing your website.
Step 1. Crawlability
The first step in performing a technical SEO audit is to crawl your website using SEO audit tools like Screaming Frog, Deep Crawl, or Ahrefs. These crawlers will analyze every page of your site and identify any issues that could be holding you back from ranking higher in search results.
Some common issues that these tools can identify include:
- Broken links
- Duplicate content
- 404 errors
- Redirect chains
- Meta tags that are too long or too short and more.
From this, the audit tool creates an in-depth report on everything it finds to help you identify and fix any issues that are hindering your site’s performance. Of course, more advanced issues may need further investigation that involves other tools, such as Google Search Console.
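One of the trickier issues on that list, redirect chains, is easy to spot programmatically once you have a crawl export. Here is a minimal, hypothetical sketch: the `redirects` mapping stands in for the source-to-target redirect pairs a crawler like Screaming Frog would report, and anything longer than one hop is flagged as a chain.

```python
# Hypothetical sketch: detect redirect chains from a crawl export.
# `redirects` maps each source URL to its redirect target, as a crawler
# might report them (the URLs here are made up for illustration).

def find_redirect_chains(redirects, max_hops=1):
    """Return redirect paths longer than max_hops (i.e. chains)."""
    chains = []
    for start in redirects:
        path = [start]
        current = start
        seen = {start}
        while current in redirects:
            current = redirects[current]
            if current in seen:        # redirect loop: stop following
                path.append(current)
                break
            seen.add(current)
            path.append(current)
        if len(path) - 1 > max_hops:
            chains.append(path)
    return chains

redirects = {
    "/old-page": "/new-page",
    "/new-page": "/final-page",   # chain: /old-page -> /new-page -> /final-page
    "/about-us": "/about",        # single hop, fine
}
print(find_redirect_chains(redirects))  # [['/old-page', '/new-page', '/final-page']]
```

Every chain you find should ideally be collapsed so the original URL redirects straight to the final destination in a single hop.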
Overall, you want to look up things like:
- Robots.txt files
– Check if the site has a robots.txt file
– Check if the robots.txt file blocks search engine crawlers or other bots
– Check if robots.txt blocks paginated URLs
- Google Search Console
– Check if the Google Search Console crawl stats report shows any host status issues
– Check if crawl requests appear in Search Console crawl stats
– See if Google Search Console is making crawl requests for “refresh” or “discovery” purposes
– Check if Google Search Console crawled HTML files in the last 72 hours
– Check if there is a sitemap on the website
– See if the sitemap index URLs appear in the robots.txt file
– Check the number of URLs in the sitemap
– See if there are non-indexable URLs found in the sitemap
– Make sure sitemaps are submitted to Google Search Console and processed
– See if there is pagination on the website
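The robots.txt checks above can be scripted with Python's standard-library parser. This is a hypothetical sketch: the robots.txt contents and `example.com` URLs are made up, but the same three checks apply to a file fetched from your own domain.

```python
# A minimal sketch of the robots.txt checks, using Python's standard
# library on an example (hypothetical) robots.txt file.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /page/

Sitemap: https://example.com/sitemap_index.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Does robots.txt block search engine crawlers from a normal page?
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))  # True

# Does it block paginated URLs (here, everything under /page/)?
print(parser.can_fetch("Googlebot", "https://example.com/page/2"))     # False

# Is the sitemap index declared in robots.txt?
print(parser.site_maps())  # ['https://example.com/sitemap_index.xml']
```

In this example the `Disallow: /page/` rule would keep crawlers out of the paginated series, which is exactly the kind of unintended block the audit is meant to catch.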
These are just some of the key areas to look at when performing a technical SEO audit. By taking the time to thoroughly examine your website, you can identify and fix any issues that could be preventing it from getting a boost in search engine rankings.
Once you have completed the crawl of your site, it’s time to start fixing any issues that you find. This may involve making changes to your website’s coding or improving your internal linking structure for users. Whatever the case may be, it’s important to prioritize these fixes in order to see better results from your SEO efforts.
Step 2. Indexing your website
In addition to making sure that your site is crawlable, you also need to focus on the indexing process. This involves submitting your website’s sitemap and ensuring that search engines are able to properly crawl and index all the pages on your site. To do this, you can use tools like Google Search Console or Bing Webmaster Tools to submit your sitemap and monitor the status of your site’s indexing.
Steps to index your website:
- Create a sitemap for your website that includes all the pages on your site
- Submit the sitemap to Google Search Console and Bing Webmaster Tools
- Monitor the progress of indexing in both tools
- Check for any errors or warnings in either tool and fix them as soon as possible
- Regularly review the indexed pages report to make sure all of your pages are being crawled and indexed correctly.
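For the first step, many platforms generate a sitemap for you, but it helps to know what the file actually contains. Here is a hypothetical sketch of building a minimal sitemap.xml from a list of page URLs with the standard library (the URLs are placeholders):

```python
# Hypothetical sketch: generate a minimal sitemap.xml from a URL list.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page  # one <loc> per page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/about",
])
print(sitemap)
```

The resulting file is what you upload to your server and submit in Google Search Console and Bing Webmaster Tools.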
It’s also important to note that there are some technical SEO factors that can affect how quickly your site is indexed, such as the number of links pointing to it and the quality of its content. As such, it’s essential to keep an eye on these factors as well.
Make Sure You Are Using HTTPS
Google prefers websites that use the secure, encrypted version of the Hypertext Transfer Protocol (HTTPS). When Google started using HTTPS as a ranking signal back in 2014, many businesses rushed to switch from regular unencrypted websites to secure ones. This is because it offers more security and privacy for users, so if your website isn’t using this protocol yet, you should definitely consider making the switch. HTTPS will help you not only to protect user data but also to rank higher in search results.
To check if you’re using HTTPS, go to your website’s homepage and look for the “lock” icon in the URL bar. If it is present, you are using the secure version of the protocol – good job! If it isn’t there, then you need to take steps to switch over to using this protocol on your site:
- Purchase an SSL certificate: To get started with switching from regular to secure, you first need to purchase an SSL certificate. This is the digital certificate that will allow your website to use the encrypted protocol. The process for purchasing and installing one can vary depending on your web hosting provider, but most providers offer step-by-step instructions on how to do this.
- Set up your website to use an SSL certificate: Once you have an SSL certificate, you need to set up your website to use it. This will involve making some changes to the code of your site, so you may want to enlist the help of a developer or webmaster if you aren’t familiar with working with HTML and other code languages.
- Check if your website is using HSTS: Additionally, you should also check if your website is using HSTS. This is a web security policy that forces browsers to use secure connections only, so it’s important to ensure that you are using this protocol on your site as well.
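The HSTS check in the last step comes down to one response header: `Strict-Transport-Security`. Here is a hypothetical sketch of inspecting it; the `headers` dictionary stands in for the headers a live request to your homepage would return.

```python
# Hypothetical sketch: check response headers for an HSTS policy.
# `headers` stands in for the headers a request to your site would return.

def check_hsts(headers):
    """Return (enabled, max_age_seconds) for the HSTS policy, if any."""
    value = headers.get("Strict-Transport-Security", "")
    if not value:
        return (False, 0)
    max_age = 0
    for part in value.split(";"):
        part = part.strip()
        if part.lower().startswith("max-age="):
            max_age = int(part.split("=", 1)[1])
    return (True, max_age)

headers = {"Strict-Transport-Security": "max-age=31536000; includeSubDomains"}
print(check_hsts(headers))  # (True, 31536000)
print(check_hsts({}))       # (False, 0) -> HSTS not set
```

A `max-age` of 31536000 seconds (one year) is a common choice; if the header is missing, browsers are free to attempt plain-HTTP connections.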
Besides HTTPS, you want to pay attention to the following:
- Check if the website is using a “non-www” or “www” version of the URL in its links: If the site uses www by default, does non-www redirect to www? If the site uses non-www by default, does www redirect to non-www?
- Check if URLs end with a trailing slash or not: If URLs have a trailing slash by default, are URLs without one redirected to the trailing-slash version? If they don’t have a trailing slash by default, are trailing-slash URLs redirected to the non-trailing-slash version?
- Check if there are orphaned URLs in the sitemap: Are there 301 or 404 URLs in the sitemap? How are sitemaps generated on the website?
- Check the sitemaps in Google Search Console: Do any of them show errors? Are they reported in Google Search Console index coverage? Were they expected?
- Check if there are URLs marked as “Submitted URL blocked by robots.txt” in the Google Search Console index coverage: Are there valid URLs reported? Is there an increasing number of URLs reported as “Indexed not submitted in sitemap”?
- Check if there are excluded URLs reported in Search Console index coverage: Is there an increase of URLs reported in Google Search Console index coverage as “Crawled – currently not indexed”, “Discovered – currently not indexed”, “Excluded by noindex tag”, “Duplicate without user-selected canonical”, “Duplicate, Google chose different canonical than user”?
- Check Google Search Performance: Are impressions, clicks, and CTR for the website increasing or decreasing in the last 16 months? Are there any changes in SERP rankings?
- Check if there are non-indexable URLs: Should they be set as noindex?
- Check if there are 301/302 redirects: Are these redirections correct?
- Check if URLs have canonicals set and if the canonical tags are correct
- Check indexable URLs: How many indexable URLs are there? How many of them are indexed on Google? How many are not?
- Check pagination: Are paginated URLs shown in the search results? Are all paginated series indexable? Are they canonicalized to a “view all” page or to a root page?
- Check H1 tags: Are the heading tags correctly nested across all pages? Is there an empty H1 tag? Is there only one H1 per URL, or more?
- Check WordPress tags: What tags are used? Are they indexable? Do they have canonical URLs? What categories are used? Are they indexable?
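The www/non-www and trailing-slash checks at the top of this list can be automated once you know the site's preferred conventions. Here is a hypothetical sketch: `preferred_url` rewrites any crawled URL to the convention you choose, so any URL that differs from its preferred form needs a redirect.

```python
# Hypothetical sketch: normalize URLs to the site's preferred www and
# trailing-slash conventions, so mismatches can be flagged for redirects.
from urllib.parse import urlsplit, urlunsplit

def preferred_url(url, use_www=True, trailing_slash=True):
    scheme, host, path, query, frag = urlsplit(url)
    if use_www and not host.startswith("www."):
        host = "www." + host
    if not use_www and host.startswith("www."):
        host = host[4:]
    if path in ("", "/"):
        path = "/"                      # the root is always just "/"
    elif trailing_slash and not path.endswith("/"):
        path += "/"
    elif not trailing_slash and path.endswith("/"):
        path = path.rstrip("/")
    return urlunsplit((scheme, host, path, query, frag))

# Site prefers www + trailing slash:
print(preferred_url("https://example.com/blog"))
# Site prefers non-www + no trailing slash:
print(preferred_url("https://www.example.com/blog/", use_www=False, trailing_slash=False))
```

Whichever convention you pick, the important thing is that the three non-preferred variants all 301-redirect to the single preferred one.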
Speaking of canonical URLs…
Use Canonical URLs to Avoid Duplicate Content Issues
Duplicate content is one of the biggest obstacles to SEO rankings, and it can be caused by a number of different things. One such issue is when your website uses URLs that are unnecessarily long or contain unnecessary parameters. To avoid this problem, you should adopt canonical URLs whenever possible. These are the “original” versions of your pages, which search engines will prioritize over the other variants.
But how to do this?
Firstly, make sure to identify any duplicate content issues on your website by using tools like Google Search Console or Screaming Frog. Once you’ve identified these issues, you can then use canonical URLs to address them.
One strategy for incorporating canonical URLs is to use a plugin or extension that automatically adds them to your pages. For example, you can use the URL Canonicalizer plugin for WordPress, which will add canonical tags to all of your posts and pages. You can also manually add these tags by editing the code on each page.
Another option is to use 301 redirects, which will ensure that all traffic is directed to the canonical URL. This can be done by creating a new page with the right URL and then setting up the old page so that it automatically redirects users to this new one.
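If you add canonical tags manually, it's worth verifying what's actually in the rendered HTML. Here is a hypothetical sketch using the standard-library HTML parser to pull the canonical URL out of a page, so you can compare it against the URL you expect search engines to index (the HTML snippet is made up):

```python
# Hypothetical sketch: extract the canonical URL from a page's HTML.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # <link rel="canonical" href="..."> in the page <head>
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

html = '<html><head><link rel="canonical" href="https://example.com/shoes/"></head></html>'
finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)  # https://example.com/shoes/
```

A page with no canonical found (`None`) or a canonical pointing at an unexpected URL is a candidate for the duplicate-content fixes described above.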
If you do everything right and take care of any technical issues that arise, you should see a significant improvement in how well your website ranks in search results.
Once you make sure your website is properly indexed, it’s time to move on to the next step in your Technical SEO Checklist and focus on your content for a moment.
Step 3. Content Optimization
Content optimization is an important part of technical SEO. You need to focus on optimizing your content for both search engine bots and users. This involves making sure that titles, meta tags, headings, image alt texts, and descriptions are optimized properly.
You should also ensure that you have high-quality and relevant content for the topics you’re targeting. If you have multiple pages on the same topic, consider consolidating them into one page or using canonical tags to point search engines to the most important version of a page. Additionally, make sure all your content is linked internally in order to maximize the chances of it being found by search engines.
To sum up:
- Make sure you don’t have any duplicate content
- Check if your content is organized with hierarchical HTML tags
- Make sure your titles, meta descriptions, and headings are optimized for target keywords
- Optimize content itself for the target long-tail keywords
- Ensure your content doesn’t violate Google’s Quality Guidelines
- Check the structure of your URLs
- Check if the content is linked internally
- Make sure all image files have alt text
- Check for broken links on the page
- See if there’s paginated loading for infinite scroll
- Check publication and updated dates, as well as the author’s names
- Fix any spelling or grammar errors.
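Some of these checks lend themselves to a quick script. Here is a hypothetical sketch covering two of them: title/meta description length and missing image alt text. The length ranges are common rules of thumb, not official Google thresholds, and the example values are made up.

```python
# Hypothetical sketch: flag common on-page content issues.
# The length limits are rules of thumb, not official thresholds.

def audit_page(title, meta_description, image_alts):
    issues = []
    if not 30 <= len(title) <= 60:
        issues.append("title length outside ~30-60 characters")
    if not 70 <= len(meta_description) <= 160:
        issues.append("meta description outside ~70-160 characters")
    if any(alt.strip() == "" for alt in image_alts):
        issues.append("image(s) missing alt text")
    return issues

print(audit_page(
    title="Technical SEO Audit Checklist for Beginners",
    meta_description="Short.",
    image_alts=["audit checklist screenshot", ""],
))
```

Run over a full crawl export, a check like this quickly surfaces the pages that need manual attention.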
Once you’ve reviewed and made any necessary changes to your content, you can move further.
Step 4. Speed and mobile-usability
Good and optimized content won’t mean anything if your website takes too long to load or users have a poor experience on their mobile devices. That’s why it’s important to make sure your website loads quickly and works properly across different devices. In other words, responsive design is what ensures that all visitors have the same user experience no matter what device they’re on.
You can use the Google PageSpeed Insights tool, which will give you an idea of how fast your pages are loading and what can be done to improve them. Additionally, you should make sure that your website is optimized for mobile devices by checking the responsiveness of all pages and running Google’s Mobile-Friendly Test.
There are a number of things you can do to improve your website’s speed, including:
- Compressing images
- Leveraging browser caching
- Using a content delivery network (CDN) for static assets
- Optimizing your page’s code structure
- Optimizing your databases
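"Leveraging browser caching" mostly means choosing sensible `Cache-Control` headers per asset type on your server or CDN. The sketch below is hypothetical: the rules and durations are common conventions (long-lived, immutable caching for fingerprinted static assets; revalidation for HTML), not values any particular host mandates.

```python
# Hypothetical sketch: pick a Cache-Control header by asset type,
# as you might configure on a web server or CDN.

CACHE_RULES = {
    ".css":  "public, max-age=31536000, immutable",  # fingerprinted static assets: 1 year
    ".js":   "public, max-age=31536000, immutable",
    ".jpg":  "public, max-age=2592000",              # images: 30 days
    ".png":  "public, max-age=2592000",
    ".html": "no-cache",                             # pages: always revalidate
}

def cache_header(path):
    for ext, value in CACHE_RULES.items():
        if path.endswith(ext):
            return value
    return "no-store"  # conservative default for unknown types

print(cache_header("/assets/app.9f3c.js"))
print(cache_header("/index.html"))
```

The long max-age on static assets only works safely when filenames change with their contents (e.g. a hash in the filename), which is why the HTML itself should stay revalidated.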
By following all these steps, you can ensure that your website provides a great search journey for users.
And last but not least…
Step 5. Monitor and analyze results
Once you’ve completed all the steps, it’s time to monitor your website performance and analyze how your changes are affecting the performance of your website. This means tracking metrics such as organic traffic, primary keyword rankings, and conversions.
You can do this by setting up analytics tools such as Google Analytics, or by using specialized SEO tools like Screaming Frog or Ahrefs.
- Identify the pages that are performing best
- Monitor their rankings and organic search traffic
- Identify any pages that have low or declining performance
- Analyze why this is happening and take action to improve them
- Track conversions from organic search traffic.
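One simple metric to compute yourself from Search Console exports is CTR (clicks divided by impressions). This hypothetical sketch, with made-up page data, shows the calculation and why a page with many impressions but few clicks stands out as a candidate for title/meta improvements:

```python
# Hypothetical sketch: compute CTR from Search Console-style exports
# and flag low-performing pages. The page data here is made up.

def ctr(clicks, impressions):
    """Click-through rate as a percentage, rounded to 2 decimals."""
    return round(clicks / impressions * 100, 2) if impressions else 0.0

pages = [
    {"url": "/blog/seo-audit", "clicks": 120, "impressions": 4000},
    {"url": "/blog/old-post", "clicks": 3, "impressions": 2500},
]
for page in pages:
    page["ctr_percent"] = ctr(page["clicks"], page["impressions"])

print(pages[0]["ctr_percent"])  # 3.0
print(pages[1]["ctr_percent"])  # 0.12
```

The second page gets plenty of impressions but almost no clicks, so its title and meta description are the first things to revisit.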
By following these steps, you’ll be able to identify any issues that need to be addressed, as well as track your progress and success.
The 69-Point Technical SEO Checklist
To help you get started on your Technical SEO Audit, we’ve created a checklist of all the things you should be looking for when analyzing your website. This checklist covers everything from basic optimization tips to more advanced techniques like setting up redirects and leveraging structured data markup. Use it to make sure that you’ve covered all the bases and that your website is in top shape.
Click here to download The 69-Point Technical SEO Checklist
However, if you need help or you have any questions – we’re here for you! Our team of experts can carry out a comprehensive Technical SEO Audit and provide you with actionable insights on how to optimize your website for search engines. We’re available for a free consultation, just check the time slot available on our website and book your appointment!
What is robots.txt?
Robots.txt is a file on your website’s server that tells search engines how to crawl and index the content of your site. It can also be used to block certain URLs from being indexed by search engines. This allows webmasters greater control over what parts of their sites are crawled, which helps improve their ranking in search engine results.
How do I use Secure URLs?
Secure URLs are those that start with “HTTPS” instead of “HTTP”. HTTPS is the secure version of the Hypertext Transfer Protocol that helps protect your website and its visitors from potential security threats. It’s important to switch to this protocol as it ensures the integrity of data exchanged between a server and browser, provides encryption for all transactions, and is becoming increasingly important for SEO.
But how to do so? Firstly, you need to obtain an SSL Certificate from a reliable provider and then install it on your web server. After that, make sure all of the website’s URL links are redirected to the “HTTPS” version instead of “HTTP”. Lastly, submit the updated sitemap to Google Search Console so that search engine bots can start crawling the new URLs.
How do I access Google Search Console?
To access the Search Console, all you need to do is log into your Google account and navigate to the Google Search Console URL. You’ll be asked to enter your domain name, and you can then start exploring the features of the Search Console.
How do I know if my site is already in Google Search Console?
Once you’ve logged in to the Google Search Console, you’ll be able to see a list of all your verified websites. If your site isn’t there, you can use the URL inspection tool or add it manually by verifying ownership.
What is Google Tag Manager?
Google Tag Manager is a tool that enables you to manage and deploy website tags without having to edit the code of your site. It simplifies tagging as it allows users to quickly and easily add, edit, or remove snippets of code from their sites. This helps optimize your site for better performance on search engine results pages (SERPs).
How do I set up Google Analytics?
Google Analytics is a powerful tool for tracking website visitors and understanding user interaction with your content. To set up Google Analytics, all you need to do is create an account, add the code snippet to your site, and start collecting data. Once you have data in your account, you can use it to monitor traffic sources, track conversions, and analyze user behavior.
However, Google has announced they’re moving to a new platform – Google Analytics 4 from July 1, 2023. So if you’re planning to use Google Analytics, it’s important you start preparing as soon as possible, and we can help you with the process of migration!
What does Google mean by mobile-first indexing?
Mobile-first indexing means that Google predominantly uses the mobile version of your site’s content for indexing and ranking. In practice, this makes it essential that your website works properly on mobile devices and offers the same content and user experience there as it does on desktop.
What is the problem with my tags?
If your tags aren’t working properly, it could be due to a number of issues. It might be that they are not correctly configured or that some of the code is missing or incorrect. Other possibilities include incorrect tagging parameters, inadequate tracking setup, and outdated tag management tools. If you’re having trouble with your tags, it’s important to audit them. This will help identify any issues and allow you to make the necessary fixes.
What is Canonicalization?
Canonicalization is the process of choosing a preferred version of a URL when there are multiple versions of the same page. It helps to avoid duplicate content issues, as well as ensure that search engines can index and crawl your site efficiently.
What URLs should be canonicalized?
Any URLs that are similar or identical to one another should be canonicalized, such as:
– Home page URL variations
– Mobile versions of the site
– Trailing / and not trailing slash after the domain name
– Uppercase and lowercase letters in URLs
– Parameters used for tracking or sorting content
– URLs with and without a “www” prefix, etc.
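Most of the variants above can be collapsed programmatically. Here is a hypothetical sketch that normalizes case, strips the “www” prefix and trailing slash, and removes common tracking parameters, so that all variants of a page map to one canonical form (the URLs and the chosen conventions are illustrative, not the only correct ones):

```python
# Hypothetical sketch: collapse common URL variants to one canonical form
# (lowercase, no www, no tracking parameters, no trailing slash).
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "gclid", "fbclid"}

def canonicalize(url):
    scheme, host, path, query, _ = urlsplit(url)
    host = host.lower().removeprefix("www.")
    path = path.lower().rstrip("/") or "/"
    kept = [(k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS]
    return urlunsplit(("https", host, path, urlencode(kept), ""))

variants = [
    "http://www.Example.com/Shoes/?utm_source=news",
    "https://example.com/shoes",
]
print([canonicalize(u) for u in variants])  # both collapse to the same URL
```

In a real audit you would use the same idea in reverse: any crawled URL whose canonicalized form differs from the URL itself should either redirect to, or declare a canonical tag pointing at, that preferred version.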
How do I verify the canonical tag?
To verify the canonical tag, you can use a tool like Screaming Frog to crawl your website and check the source code for any canonical tags. It’s also possible to check with Google Search Console, as it displays all of the URLs that have been indexed by Google.
What are the Crawl Errors?
Crawl errors are issues that may prevent search engines from properly crawling and indexing your website. Some of the most common errors include server errors, broken links, page loading issues, and blocked resources.
What is the best way to check for crawl errors?
The best way to check for crawl errors is to use Google Search Console or another similar tool. This will allow you to see any issues with your website and fix them so that search engine bots can properly access it.
What is the meta tag?
The meta tag is an HTML element that contains information about a page, such as its title, description, keywords, and other metadata. This data helps search engines understand what your page is about and how best to index it.
What is Content pruning?
Content pruning is the process of removing or updating any content that is outdated, thin, or no longer relevant to your website. This will help keep your website fresh and up-to-date for both users and search engines.
What is a redirect?
A redirect is a server-side command that automatically forwards visitors from one URL to another. Redirects are important for SEO because they make it easier for search engines to find and index your content, as well as ensure users don’t land on broken pages or 404 errors.