An SEO audit reveals quick wins that drive results and outlines a path to SEO success.
Yet, so many SEOs struggle to audit a site to deliver immediate results or instill long-term confidence in the SEO program.
To help overcome the issue, I created a simple, easy-to-follow SEO audit template that lists every check you should perform to drive search performance.
Open the SEO audit template and follow along for this thorough checklist designed to drive better search visibility and ROI.
The advice you’ll read below comes from more than a decade of experience helping enterprise-level SEOs drive results. No fluff or theory-based information included.
You'll find that this checklist provides the full scope of an SEO site audit, not just bits and pieces. For a beginner's guide to technical SEO issues, I recommend: 15 Common Technical SEO Issues and How to Fix Them.
Table of Contents:
Although not strictly required, an SEO site crawler will help you conduct a comprehensive site audit.
As you run your crawl through the audit process (and follow along with our checklist!), you’ll gather a list of technical issues. Give each step a grade of pass, fail, or needs improvement.
Doing so will help you create a prioritized list of issues to tackle in order to improve your organic traffic — starting with the fail issues, followed by the most promising needs improvement issues based on your time and resources.
There are plenty of site audit and crawler tools on the market, and in this post I’ll be demonstrating with one of the best: seoClarity’s built-in site crawler technology.
It’s been battle-tested on a site with more than 48 million pages, and gives you full access to find SEO issues that plague your site with no artificial limits. That’s no limits on crawl depth, speed, pages crawled…
Let's get started!
The following SEO site audit checklist has been split into two main focus areas:
Technical SEO is crucial because it ensures that a website meets the technical requirements of modern search engines.
Starting your SEO audit with technical aspects is essential as it lays the foundational groundwork, allowing your content and on-page strategies to perform effectively.
By addressing technical deficiencies first, you ensure that search engines can efficiently index and rank your site, maximizing visibility and traffic.
Before crawling a site, a search engine bot views the robots.txt file which gives directions on how to crawl (or not crawl) the website.
It contains instructions about folders or pages to omit as well as other critical instructions. As a best practice, it should also link to the XML sitemap so the bot can find a list of the most important URLs.
You can view the file manually by going to mydomain.com/robots.txt (replace “mydomain” with your site’s URL, of course). Look for commands that might be limiting or even preventing site crawling.
If you have access to an SEO tool to crawl the site, let it loose on the site, and be sure to set the user agent to follow instructions given to Googlebot.
This way, if you’re blocking Googlebot via the robots.txt file, the crawl data will flag those URLs as blocked by robots.txt rather than fetching them and returning a “200 OK” status code along with the page’s information.
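If you’d like to sanity-check robots.txt rules programmatically before (or alongside) a full crawl, Python’s standard-library robotparser can simulate how a Googlebot-obeying crawler reads the file. Everything in this sketch — the domain and the disallow rules — is hypothetical; swap in your own site’s robots.txt content:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration -- use your own domain's file.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /checkout/
Disallow: /search

User-agent: *
Disallow: /admin/
"""

def is_allowed(robots_txt: str, user_agent: str, url: str) -> bool:
    """Return True if the given user agent may fetch the URL per robots.txt."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, url)

# Googlebot is blocked from /checkout/ but allowed on product pages.
print(is_allowed(ROBOTS_TXT, "Googlebot", "https://example.com/checkout/cart"))   # False
print(is_allowed(ROBOTS_TXT, "Googlebot", "https://example.com/products/shoes"))  # True
```

Running a list of key landing pages through a check like this is a quick way to catch an accidental block before Googlebot does.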
Google Search Console also reports URLs where Googlebot has been blocked. seoClarity users can find this option under the advanced settings of a Clarity Audits crawl.
Mistakes with Robots.txt files can result in significant technical SEO challenges. As such, it's important to be aware of the most common ones so that you can avoid them.
Some of the most prevalent robots.txt issues include:
Using default CMS robots.txt files
Using robots.txt to noindex pages
Using the wrong case
Blocking essential files
Using absolute URLs
Moving a staging or development site’s robots.txt to the live site
The list goes on.
For a more comprehensive and in-depth list, check out our blog on common Robots.txt mistakes and how to avoid them.
A sitemap contains a list of all the pages on the site. Search engines use it to discover important URLs directly, rather than relying solely on the site’s link structure.
Recommended Reading: How to Create an XML Sitemap
Your sitemap should reside in the root folder on the server. The most common place to find it directly is at mydomain.com/sitemap.xml or linked to/from the robots.txt file. Otherwise, the content management system (CMS) may show the URL if there is one.
Crawl the sitemap URLs to make sure they are free of errors, redirects, and non-canonical URLs (i.e. URLs with canonical tags pointing to another URL). Submit your XML sitemaps in Google Search Console and investigate any URLs that are not indexed. They’ll likely contain an error, a redirect, or a non-canonical URL!
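As a starting point for that check, a short script can pull every `<loc>` entry out of a sitemap and flag obvious problems, such as entries still on plain HTTP. The sitemap below is a made-up example; in a real audit you would fetch your live sitemap and then request each URL to confirm it returns a 200 and self-canonicalizes:

```python
import xml.etree.ElementTree as ET

# A tiny, hypothetical sitemap for illustration.
SITEMAP_XML = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/category/hats</loc></url>
  <url><loc>http://example.com/old-page</loc></url>
</urlset>
"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> list[str]:
    """Extract every <loc> URL from an XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

urls = sitemap_urls(SITEMAP_XML)
# Flag insecure entries: every sitemap URL should be the canonical HTTPS version.
insecure = [u for u in urls if u.startswith("http://")]
print(insecure)  # ['http://example.com/old-page']
```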
SSL encryption establishes a secure connection between the browser and the server. Google Chrome marks secure sites (those having an active SSL certificate) with a padlock image in the address bar.
Recommended Reading: HTTP vs HTTPS: What’s The Difference and Why Should You Care?
It also warns users when they try to access an insecure site.
Most importantly, though, Google also uses the HTTPS encryption as a ranking signal.
Visit the site in Chrome and look at the address bar. Look for the padlock icon to determine whether or not your site uses an SSL connection. You can also test your SSL encryption at ssllabs.com/ssltest/ to ensure it is valid.
seoClarity users can use our built-in crawler to run a crawl and leverage the Parent Page report to find all instances of old internally linked HTTP URLs at scale so they can be updated to the new HTTPS version.
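If you don’t have a crawler handy, the same idea can be sketched with Python’s standard-library HTML parser: scan a page’s markup for `href` and `src` attributes that still point at plain-HTTP URLs. The HTML snippet here is hypothetical; in practice you would feed in the source of each crawled page:

```python
from html.parser import HTMLParser

class InsecureLinkFinder(HTMLParser):
    """Collect href/src attributes that still point at plain-HTTP URLs."""
    def __init__(self):
        super().__init__()
        self.insecure: list[str] = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value and value.startswith("http://"):
                self.insecure.append(value)

# Hypothetical page snippet with one stale internal link.
HTML = """
<a href="https://example.com/shop">Shop</a>
<a href="http://example.com/about">About</a>
<img src="https://cdn.example.com/logo.png">
"""

finder = InsecureLinkFinder()
finder.feed(HTML)
print(finder.insecure)  # ['http://example.com/about']
```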
More than half of web searches come from mobile devices, which is why it's crucial to ensure that all basic mobile-friendly aspects are in place at this stage of the audit.
Recommended Reading: Mobile SEO Optimization: 6 Factors That Help Improve Mobile Search Visibility
Select your most important templates such as a category page, product page, and blog post. Test them with the Google Mobile-Friendly Test. Prioritize issues reported for the development team to fix.
Google also offers further resources on how to optimize for mobile.
Note: Google announced mobile-first indexing of the entire web on their Webmaster Blog in March 2020.
The most common mobile SEO issues typically come from limiting the mobile experience compared to desktop. Give mobile users a full experience, not just the parts of the desktop site that work OK on mobile.
Other mobile design aspects that come up are:
Page speed is one of the most critical factors that affect a site’s visibility in Google — and it's only grown more important with Google’s announcement of Core Web Vitals update and page experience.
In the update, the Core Web Vital metrics (which all relate directly to page speed) combine with other experience metrics to create the page experience signal.
Page speed also has a direct correlation with bounce rate and conversions! As a result, optimizing page speed and decreasing load time often deliver instant results to a company’s organic presence and improves the search experience.
Recommended Reading: Page Speed and SEO: How to Improve User Experience and Rankings
Use Google’s Page Speed Insights tool to evaluate key templates on the site. This data is also within the Google Lighthouse data found at web.dev.
seoClarity’s Page Speed Analysis gives a handy point of view by combining all these issues across the site to prioritize the impact. It also allows you to keep track of page speed on a weekly or monthly basis, making it easier to monitor and evaluate your progress.
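To grade pages yourself, you can apply Google’s published Core Web Vitals thresholds directly to field or lab data. The page metrics below are invented for illustration; the thresholds are the ones Google documents for LCP, CLS, and INP:

```python
# Google's published Core Web Vitals thresholds: (good ceiling, poor floor).
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
    "INP": (200, 500),    # milliseconds
}

def grade(metric: str, value: float) -> str:
    """Classify a metric value as good, needs improvement, or poor."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

# Hypothetical measurements for one page template.
page = {"LCP": 3.1, "CLS": 0.05, "INP": 620}
report = {metric: grade(metric, value) for metric, value in page.items()}
print(report)  # {'LCP': 'needs improvement', 'CLS': 'good', 'INP': 'poor'}
```

This mirrors the pass/fail/needs-improvement grading suggested at the top of this checklist, so the results slot straight into your prioritized issue list.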
A few important <head> section tags help Google index the site properly. These include:
Without these tags, Google must assume where to pull content from (title and description) to create the listing, which content among duplicates should be shown to users (canonical tag), and who to show it to (hreflang).
Recommended Reading: How to Write the Perfect SEO Meta Description
Install the Chrome plugin Spark, or manually inspect the code on key landing pages via Inspect in Chrome to spot these tags. Assess whether key SEO tags are present in the <head> section. The Spark plugin will display the data if it’s properly coded.
Also, as shown below, a seoClarity crawl will collect these issues. This report will show any potential issues with these tags (e.g. duplication, values that are too long, or are missing).
Utilize and configure each tag properly for every page on the site. At this stage, the tags must be present and valid on the site.
Duplication Issues:
Incorrect Placement: Placing SEO tags outside the <head> section effectively renders them unrecognized, as if they are absent.
Inconsistencies in SEO Tags:
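A quick script can surface both issue types, duplication and misplacement, by counting key SEO tags only while the parser is inside the `<head>` section. The HTML below is a deliberately broken example: it has two title tags, and its canonical tag sits in the `<body>`, where it does no good:

```python
from html.parser import HTMLParser

class HeadTagAudit(HTMLParser):
    """Count key SEO tags, but only while inside the <head> section."""
    def __init__(self):
        super().__init__()
        self.in_head = False
        self.counts = {"title": 0, "description": 0, "canonical": 0, "hreflang": 0}

    def handle_starttag(self, tag, attrs):
        if tag == "head":
            self.in_head = True
            return
        if not self.in_head:
            return
        a = dict(attrs)
        if tag == "title":
            self.counts["title"] += 1
        elif tag == "meta" and a.get("name") == "description":
            self.counts["description"] += 1
        elif tag == "link" and a.get("rel") == "canonical":
            self.counts["canonical"] += 1
        elif tag == "link" and "hreflang" in a:
            self.counts["hreflang"] += 1

    def handle_endtag(self, tag):
        if tag == "head":
            self.in_head = False

# Hypothetical page: duplicate titles, and a canonical tag misplaced in the body.
HTML = """<html><head>
<title>Chicago Bulls Hats</title>
<title>Duplicate Title</title>
<meta name="description" content="Shop Bulls hats.">
</head><body><link rel="canonical" href="/too-late"></body></html>"""

audit = HeadTagAudit()
audit.feed(HTML)
print(audit.counts)  # {'title': 2, 'description': 1, 'canonical': 0, 'hreflang': 0}
```

A count of 2 for title flags duplication, and the canonical count of 0 shows the misplaced tag is invisible to this check, just as it effectively is to search engines.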
For search engines to index and rank a site, they need to crawl its pages first. Google, for example, releases a bot to crawl a site by executing internal links.
Errors, broken pages, overuse of JavaScript, or a complex site architecture might derail the bot from accessing critical content or use up the available crawl budget trying to figure out your site.
Recommended Reading: A Guide to Crawling Enterprise Sites Successfully
Use an SEO crawler to imitate the path taken by the search engine’s bot — an advanced crawler will replicate Googlebot and see your website as the search engine sees it.
Look for reports of crawl issues caused by unnecessary URLs, broken links, redirect chains, or incorrect canonical configurations. The right crawler will be fully customizable and allow you to set the crawl depth, speed, and frequency.
We’re proud to say that our built-in crawler allows for all of that, all with no limitations!
Google Search Console also surfaces crawling errors it has found.
Rendering involves how Google views and displays your page's content and code. Adhere to progressive enhancement principles to ensure that the content and core functionality are accessible, even from a text-only browser.
As CSS and JS are executed, ensure all content remains available for Google's rendering process. Previously, Google did not process JavaScript, but now that it does, ensure Google can access all necessary files without restrictions to fully render the page like a regular user.
Recommended Reading: AngularJS & SEO: How to Optimize AngularJS for Crawling and Indexing
Run a site crawl with JavaScript enabled to render pages exactly as they would appear in the browser. In doing so, you’ll evaluate issues Google might encounter when rendering your pages with their JavaScript crawling capabilities.
Another way to check how well your page renders to Google is to view the cached version of a few important page templates. You can view this after searching for the page and clicking the option next to the URL to view the cached version.
Google tends to store only the HTML of websites in the cache, as opposed to the JavaScript-executed version. If properly implemented, the page should still render all important content and SEO elements in the HTML state.
Additionally, the Mobile Friendly Testing tool is known to behave as Google’s headless browser, executing the page’s JavaScript and rendering the page. This is a great SEO tool to test if anything is stopping Google from accessing the content.
The index is where Google stores information about pages it has crawled and where it selects the content to rank for a particular search query. Google is rapidly expanding, culling, and updating its index.
Recommended Reading: 3 Common Search Engine Indexing Problems
To audit indexation, start by searching "site:domain.com" on Google to review indexed pages, which helps identify any duplicates or over-indexed content.
Navigate to the last page of those search results to see Google’s message about omitted similar results, which reveals how many pages have been consolidated as duplicates. This step also confirms that Google has discovered and indexed your content without crawl issues.
For a comprehensive view, Google Search Console offers insights on indexation levels, including a section for Excluded assets, indicating pages deemed unworthy of indexing.
Additionally, seoClarity's Site Health report provides detailed indexability data such as:
Finally, you can review bot activity on the server to see how Googlebot is crawling your site which may explain how your site is being indexed. In seoClarity, you can filter the results by response codes to identify page errors instantly.
You may find that some pages that are not getting crawled simply have no links to them. Improve internal links continuously to ensure the bot can reach pages deep in the site’s architecture as well. With non-indexed pages, check if the content is too thin to warrant indexation.
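Reviewing bot activity at scale usually means parsing server logs. As a minimal sketch (the log lines, IPs, and URLs here are all made up), you can filter for Googlebot requests and tally response codes to spot errors and redirects quickly; note that production analysis should also verify Googlebot via reverse DNS, since the user-agent string alone can be spoofed:

```python
import re
from collections import Counter

# Hypothetical access-log lines for illustration.
LOG_LINES = [
    '66.249.66.1 - - [10/May/2024:10:00:01] "GET /category/hats HTTP/1.1" 200 "Googlebot/2.1"',
    '66.249.66.1 - - [10/May/2024:10:00:02] "GET /old-page HTTP/1.1" 404 "Googlebot/2.1"',
    '203.0.113.9 - - [10/May/2024:10:00:03] "GET /category/hats HTTP/1.1" 200 "Mozilla/5.0"',
    '66.249.66.1 - - [10/May/2024:10:00:04] "GET /promo?x=1 HTTP/1.1" 301 "Googlebot/2.1"',
]

PATTERN = re.compile(r'"GET (\S+) HTTP/[\d.]+" (\d{3})')

def googlebot_status_counts(lines: list[str]) -> Counter:
    """Tally response codes for requests identified as Googlebot."""
    codes = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        match = PATTERN.search(line)
        if match:
            codes[match.group(2)] += 1
    return codes

print(googlebot_status_counts(LOG_LINES))  # Counter({'200': 1, '404': 1, '301': 1})
```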
Indexation issues often stem from problems with crawling and rendering. Under-indexed sites may require updates to meta tags or content to boost relevancy and convince Google of their value.
A frequent error is the accidental use of the robots noindex tag, sometimes added to key landing pages during development updates, leading Google to de-index those pages. SEO teams must act swiftly to remove the tag and request re-indexing.
seoClarity users benefit from Page Clarity, which monitors URLs daily and alerts via email if the noindex tag appears, aiming to catch it before Googlebot does.
Additionally, faceted navigation (discussed in step 13) can significantly increase the number of pages indexed if not managed properly, potentially leading to duplicate URLs and over-indexation that weakens the visibility of crucial pages.
This is where the SEO analysis switches hats slightly from technical-minded to content-minded. Your site is in the game, now let’s think about how it’s being played.
Specifically, how well is it optimized for relevant keywords?
Key areas to audit are how the meta tags, header tags, and body copy are being used to create a great search experience for the target keyword topics.
Before evaluating on-page SEO, conduct thorough keyword research so that you know what phrases various content assets target.
Recommended Reading: 6 Steps to an In-Depth Content Audit That Will Ensure a Traffic Boost
Auditing on-page elements can be done in a few ways:
But you can use this capability to do so much more.
Let’s say you want to find all pages with video on them to audit their on-page optimization. Simply look for instances of words such as “video,” “YouTube,” or “short clip” to access every page featuring a video.
Recommended Reading: Finding Additional Content: Narrow in on Specific Site Features
Title Tags - approximately 70 characters, use target keyword. Learn how to run a title tag test.
Meta Descriptions - approximately 150 characters. When it comes to writing a good meta description, you should convince the searcher to click through the site and assure them you have their answer.
Headers - The “H1” tag should be a shortened version of the Title Tag, typically between 2-3 words. H2 tags should be used if they follow the format of the page. H3s and beyond are less important but should be relevant and used in order.
Body Content - Created to improve the search experience. Find the target keywords for the page, write to them with authority, and help solve the searcher's problems. seoClarity users can leverage Research Grid to find the keywords where the URLs ranking for the target keyword also rank.
At this stage of the audit, the goal is to spot a few quick SEO tweaks on pages whose keywords rank in positions 3-8. By doing this you can bank quick wins and establish an ongoing workflow for improving these elements for the search experience.
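The character-limit guidelines above are easy to automate across a page list. This sketch applies the approximate limits mentioned earlier (70 characters for titles, 150 for descriptions); the page values fed in are hypothetical:

```python
TITLE_MAX = 70   # approximate character limits from the guidelines above
DESC_MAX = 150

def audit_page(title: str, description: str, h1: str) -> list[str]:
    """Return a list of on-page issues worth a closer look."""
    issues = []
    if not title:
        issues.append("missing title")
    elif len(title) > TITLE_MAX:
        issues.append(f"title too long ({len(title)} chars)")
    if not description:
        issues.append("missing meta description")
    elif len(description) > DESC_MAX:
        issues.append(f"description too long ({len(description)} chars)")
    if not h1:
        issues.append("missing H1")
    return issues

# Hypothetical page with an overlong title and no H1.
issues = audit_page("W" * 85, "Shop our full range of Bulls hats.", "")
print(issues)  # ['title too long (85 chars)', 'missing H1']
```

These limits are guidelines rather than hard cutoffs (Google truncates by pixel width, not characters), so treat the output as a prioritization aid, not a verdict.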
There are a few common issues with these elements.
Consider the information gaps of the searcher. Do you offer all the information and context needed to choose the best available product or service for their needs, beyond stating the keyword?
Do you help them learn about the product or service and make a well-informed buying decision? Can they see what is unique about your offering or information? This is relevant content.
Recommended Reading: How to Create Relevant Content to Captivate Your Target Audience
Start by analyzing the top-ranking sites for your target keyword on Google. Investigate why they are ranked highly, focusing on what sets them apart and how they provide value.
For example, they might provide supplementary content beyond the target keywords, or incorporate engaging elements like videos and images that enhance the user experience.
Providing relevance is all about considering the contextual aspects of the user experience.
Users of seoClarity have access to an extensive database of over 30 billion keywords, allowing them to align their content strategy with actual user intent. Tools like Topic Explorer help identify key topics and search trends, enabling the creation of highly relevant content that meets user needs and stays ahead of the competition.
Create a great search experience for your target audience. The best search experience is unique and true to your brand and includes well-written content and a sound site structure.
We’ve put together a complete framework for this that we like to call search experience optimization.
Structured data from schema.org allows webmasters and SEOs to add semantic context to website code, enhancing how Google displays search listings with details like telephone numbers, reviews, star ratings, and event information.
This enrichment helps attract user attention and can increase organic click-through rates. However, for structured data to be effective, it must be correctly implemented.
During an audit, it's crucial to review the markup for errors and strategically plan the optimal schema for key website templates.
Recommended Reading: Technical SEO Best Practices: Schema [WEBINAR]
Use Google’s Rich Results Test to evaluate your schema markup and its eligibility to appear as a rich snippet on the SERP. Just grab a few important pages and enter them in.
Google Search Console also reports on potential issues with Schema under the Enhancements section (it shows if you have markup added to the site). It shows errors, warnings, and the number of valid URLs in total.
While Google’s tools can get the job done, you’d have to go page by page — which isn’t feasible for an enterprise site! Use an SEO platform to audit schema at scale.
More on that here: Auditing Schema Markup: Confirming Structured Data's Implementation.
seoClarity’s Schema builder is a Chrome plugin that makes it super easy to apply structured data to your site. Try it now for free below!
Not including all of the essential data is a common mistake with schema. Audit tools will flag issues as “required” or “recommended” to help you prioritize the fixes.
Sometimes developers won't include all the information in the tag. After checking a URL with the Google Structured Data Tool, take a moment to read through the values to make sure everything is complete.
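For a first-pass completeness check of your own, you can extract JSON-LD blocks from a page and look for missing properties. The markup and the list of properties checked below are illustrative only; which properties Google actually requires or recommends varies by rich result type, so confirm against its documentation for your schema type:

```python
import json
from html.parser import HTMLParser

class JsonLdExtractor(HTMLParser):
    """Pull JSON-LD blocks out of <script type="application/ld+json"> tags."""
    def __init__(self):
        super().__init__()
        self.in_jsonld = False
        self.blocks: list[dict] = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self.in_jsonld = True

    def handle_data(self, data):
        if self.in_jsonld and data.strip():
            self.blocks.append(json.loads(data))

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_jsonld = False

# Hypothetical Product markup that leaves out properties an audit would flag.
HTML = """<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Product", "name": "Bulls Hat"}
</script>"""

extractor = JsonLdExtractor()
extractor.feed(HTML)
block = extractor.blocks[0]
missing = [key for key in ("name", "offers", "image") if key not in block]
print(block["@type"], missing)  # Product ['offers', 'image']
```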
Don’t abuse structured data markup. Google is much more aware of Structured Data manipulation these days and will happily apply a manual action if they feel you are spamming them.
The many ways you can trigger a manual action from Google include:
Recommended Reading: 7 Common Issues with Implementing Structured Data
Faceted navigation enhances e-commerce sites by allowing the creation of specific sub-categories, such as different colored Chicago Bulls hats. While a general category page covers searches for "Chicago Bulls Hats," faceted navigation enables tailored pages like "White Chicago Bulls Hats," helping users who already know their preferred color get directly to what they need and streamlining their shopping experience.
However, this feature can also lead to SEO challenges. If each filter generates a new URL without substantial demand, it can result in Google crawling and indexing duplicate content or unnecessary URLs.
Therefore, while faceted navigation is user-friendly and beneficial for filtering specific products, it's crucial to manage these pages to prevent the indexing of redundant content by Google.
Recommended Reading: Faceted Navigation SEO: Optimize for the Long-Tail Experience
Search different filters and product categories in Google to see if your pages show up in the index. Check to see if you’re creating too many pages by finding the URL pattern of how your site creates the pages, e.g. searching “site:wayfair.com inurl:color=”
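To find those URL patterns in bulk, you can count which query parameters appear across a crawled URL list. The URLs below are hypothetical; in practice you would feed in your crawl export and then judge each parameter against search demand:

```python
from urllib.parse import urlparse, parse_qs
from collections import Counter

# Hypothetical crawled URLs from one faceted category.
URLS = [
    "https://example.com/bulls-hats",
    "https://example.com/bulls-hats?color=white",
    "https://example.com/bulls-hats?color=red",
    "https://example.com/bulls-hats?color=white&size=l",
    "https://example.com/bulls-hats?sort=price&color=white",
]

def facet_counts(urls: list[str]) -> Counter:
    """Count how often each query parameter appears across crawled URLs."""
    params = Counter()
    for url in urls:
        for key in parse_qs(urlparse(url).query):
            params[key] += 1
    return params

counts = facet_counts(URLS)
print(counts)  # Counter({'color': 4, 'size': 1, 'sort': 1})
```

Parameters like `sort` almost never deserve their own indexable URLs, while a facet like `color` might, provided real search demand exists for those combinations.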
Utilizing faceted navigation brilliantly may be the single thing separating the top sites from the pack. Only pages that align with real search demand are made indexable, and all of the key on-page elements mentioned above update to match the long-tail target.
It's essential that each faceted navigation URL functions as a standalone entity, complete with a unique URL, title tag, description tag, and H1 tag, effectively treating each as if it were a top-level category page.
Because every possible combination of facets typically generates at least one unique URL, faceted navigation can create a few problems for SEO:
To mitigate these issues, consider the following strategies:
A thorough site crawl can identify these issues, revealing whether the appropriate tags are applied to match search demand and ensure effective SEO for faceted navigation.
A website should support people with physical, cognitive, or technological impairments. Google promotes these principles within its developer recommendations.
For SEO, accessibility issues are a combination of the rendering and relevance issues laid out above. Google describes accessibility as meaning "that the site's content is available, and its functionality can be operated, by literally anyone."
The Google Lighthouse Plugin (web.dev) does a great job of outlining accessibility issues. It flags items in the code such as missing alt text on images or names on clickable buttons. It will also look for descriptive text on links (no “click here”).
These elements help bring the site to life for those using screen readers and help Google understand the web at scale.
Be on the lookout for generic or missing link names, missing alt tags, and headers skipping order (e.g. an H2, without an H1).
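Two of those checks, missing alt text and skipped heading levels, are simple enough to script. This sketch parses a (made-up) HTML snippet and flags both; it is a rough first pass, not a substitute for a full Lighthouse or screen-reader review:

```python
from html.parser import HTMLParser

class A11yAudit(HTMLParser):
    """Flag images without alt text and heading levels that skip order."""
    def __init__(self):
        super().__init__()
        self.issues: list[str] = []
        self.last_level = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img" and not dict(attrs).get("alt"):
            self.issues.append("img missing alt text")
        elif tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            level = int(tag[1])
            if level > self.last_level + 1:
                self.issues.append(f"{tag} skips heading order")
            self.last_level = level

# Hypothetical snippet: an H2 before any H1, plus an alt-less image.
HTML = '<h2>Shop</h2><img src="hat.png"><h3>Sizes</h3>'

audit = A11yAudit()
audit.feed(HTML)
print(audit.issues)  # ['h2 skips heading order', 'img missing alt text']
```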
This is a great opportunity to teach the team about these issues and create an accessibility standard for the site.
If you use a crawler like seoClarity’s Clarity Audit, you’ll get notifications on those issues in the site crawl report as well.
Authoritative content is unique to a brand and showcases its expertise on a topic, which builds trust with the target audience.
Give your content a hard look. It should include original facts and research, answer the reader's search query, provide value, and be well-written.
This is also the step where you evaluate the topic clusters for SEO, interlinking your content with blog posts and resources that cover sub-topics.
Evaluate how the site is targeting “awareness” keywords by filtering Google Search Console queries containing words like “how” (seoClarity users can do it using Search Analytics). Evaluate rankings and performance and look for low-hanging fruit (pages ranking in positions 11-40 that need just a little push to appear on page one).
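That filter is easy to reproduce on a Search Console export. The query rows below are invented for illustration; the logic simply keeps awareness-stage queries sitting just off page one:

```python
# Hypothetical (query, average position) rows exported from Google Search Console.
ROWS = [
    ("how to clean a bulls hat", 14.2),
    ("bulls hats", 3.1),
    ("how to size a fitted hat", 27.8),
    ("white bulls hat", 45.0),
]

def low_hanging_fruit(rows, lo=11, hi=40, prefix="how"):
    """Awareness-stage queries ranking just off page one (positions 11-40)."""
    return [
        (query, pos) for query, pos in rows
        if query.startswith(prefix) and lo <= pos <= hi
    ]

print(low_hanging_fruit(ROWS))
# [('how to clean a bulls hat', 14.2), ('how to size a fitted hat', 27.8)]
```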
This is also a good step to find content gaps — areas where your competition ranks but you don’t — and create more authoritative content to fill them!
To find them, look for keywords where three competitors are in a prominent position but you aren't.
You should also conduct research to determine keyword opportunities at different intent stages related to the target keyword. For example, if you’re selling running shoes, an article on “how to choose the best running shoes” and “tips for running a marathon” could expand your authority for the primary terms (running shoes) and move searchers down the funnel without leaving your site.
Off-page analysis is a look at everything happening off the website that impacts SEO (i.e. external links).
The quality and quantity of relevant websites sharing and linking to your content is a good sign that your content is worthwhile.
After this review of backlinks, you’ll have some specific targets and a good understanding of what drives linking in your industry.
My favorite link-building tactics are the skyscraper technique and content outreach methods.
After auditing your backlinks, you can develop a plan on how to incorporate these tactics to best your competitors.
The biggest mistake SEOs make with off-page tactics is doing outreach without researching the specific value that may be brought to the person they're contacting.
Another misfire is skipping the competitive research, which can reveal invaluable nuggets about what drives competitors' off-page success.
Congratulations! You made it through a site audit. Doesn’t that feel great?
You are now an expert on how to tactically improve your website and perform an SEO audit. When performing this site audit in the future or across other sites, remember:
Over time, as you continue to address these areas on your site, you’ll see the “Needs Improvement” notes turn into your strengths as you see the SEO performance improve.
Not sure if you’ve remembered it all? Bookmark our convenient, free site audit checklist to guide you through each step of the process.