
10-Point Website Audit Checklist → Is Your Site Suffering From These Technical SEO Problems?


Uncover Technical SEO Problems With Our 10-Point Audit Checklist

Google changes its algorithm hundreds of times a year, and it takes due diligence to stay on top of changes that can negatively impact your search traffic. We created this checklist based on the most common issues we’ve seen over the past fifteen years that prevent a website from ranking high in search engines.

Every Website Needs a Technical SEO Audit Every Year

1)    Can Google crawl your website?

Is every URL from your domain in Google’s search index?

The important question is, “Can Google fully access all the important, publicly available URLs on your website?” This initial audit checklist ensures pages aren’t being blocked from Google by mistake. If important pages aren’t in Google’s index, they won’t be returned during a search.


Crawling a web page does not indicate how your website ranks in Google for your target keywords, only that Google can locate and index your pages. (Learn the difference between crawling and indexing.)

Google Search Console “Must-review” Reports

Google offers website owners a free dashboard for website management. With Google Search Console (GSC), you can get an instant website performance review, covering your site’s performance in Google organic search based on keywords, pages, mobile devices, geography and more. Google will contact you through this dashboard to alert you to issues such as malware, HTML improvements, etc. This is the first place to look for problems when search rankings begin to fall.

Index Status Report

Use the Index Status report to see the number of pages Google has indexed from your domain. The graph shows data from the previous year. A healthy website will display a steady or increasing trend in this graph.

Google Search Console Index Status Report

Another quick check is to type site:yourdomain.com into Google (replace “yourdomain.com” with your own website domain).

We recommend checking the Index Status report on a monthly basis. To check a specific page, type “info:” in front of that page’s URL in the Google search box to see whether the page is indexed.

Read the blog “Build your Own SEO Reporting Dashboard → For SEO Ninjas → No Coding Required” for step-by-step directions on creating a Google Search Console tracking dashboard. This will be invaluable in tracking website problems.

Crawl Errors

Review both Site Errors and URL Errors to locate problems search engines are having accessing pages on your site. A website audit will check for both server and page errors.

  • Site Errors: This type of error is serious, as it reports a problem on the web server itself. Review the report for the past 90 days to ensure that Googlebot has not had a problem accessing your entire site.
  • URL Errors: There are many HTTP error messages (see the 5 most common). While Google does not penalize a site for 404 errors, an audit evaluates whether active pages are returning errors or whether only old, abandoned pages are producing them. A 404 is the most common error and means that a URL could not be found. It is easy to fix with a 301 redirect.

404 errors on invalid URLs do not harm your site’s indexing or ranking in any way.
John Mueller, Google
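If you want a quick scripted spot-check to complement the Search Console reports, a few lines of Python can confirm which URLs return a 404 and whether any redirect chains exist. This is a minimal sketch, assuming the third-party requests package is installed; the URLs shown are placeholders for your own.

```python
# Minimal sketch: spot-check a handful of URLs for 404s and redirect chains.
# Requires the third-party "requests" package; the URLs below are placeholders.
import requests

urls = [
    "https://www.example.com/old-product-page/",
    "https://www.example.com/blog/renamed-post/",
]

for url in urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    if resp.history:  # one entry per hop in a redirect chain
        hops = " -> ".join(f"{r.status_code} {r.url}" for r in resp.history)
        print(f"{url}: redirect chain {hops} -> {resp.status_code} {resp.url}")
    elif resp.status_code == 404:
        print(f"{url}: 404 Not Found, a candidate for a 301 redirect to a live page")
    else:
        print(f"{url}: {resp.status_code}")
```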

Robots.txt File

Big problems can happen here! The robots.txt file gives instructions to web crawlers, like Googlebot, not to crawl your website, specific directories or selected pages.

We have seen entire websites blocked by mistake due to a forgotten disallow tag after a redesign.

Search crawlers will automatically index any URLs they can find. Usually, the only reason to use “Allow” is if you want to unblock selected URLs in a blocked directory.

Robots.txt also gives crawlers the URL of the XML sitemap.  You can test your robots.txt file here.

A technical SEO audit evaluates each line item where “Disallow” is present to ensure this is the desired action.

Locate your robots.txt file by adding /robots.txt to the end of your domain: https://totheweb.com/robots.txt. Learn how to correctly create a robots.txt file.
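For a do-it-yourself check, Python’s standard library ships with a robots.txt parser you can use to confirm that your most important pages are not blocked. This is a minimal sketch; the domain and paths are placeholders for your own.

```python
# Minimal sketch: confirm Googlebot is allowed to fetch pages you expect to be
# crawlable, using Python's standard-library robots.txt parser.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")  # placeholder domain
parser.read()

important_pages = [
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/blog/",
]

for page in important_pages:
    status = "OK" if parser.can_fetch("Googlebot", page) else "BLOCKED"
    print(f"{status:8} {page}")
```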

2)   Is your site safe and secure for visitors?

It’s Time to Migrate from HTTP to HTTPS

In 2014, Google announced it was giving a ranking signal (a boost) to sites encrypted with HTTPS. At that time, this signal was “light,” according to Google. Over time, however, the signal has “strengthened.” If you haven’t made the switch yet, learn how.

Migrate from HTTP to HTTPS to Secure Your Website

HTTPS had a reasonably strong correlation with first page Google rankings.
Sep 2016 Research from BackLinko
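Once you have migrated, it’s worth confirming that the old HTTP URLs actually redirect to their HTTPS versions. Here is a minimal sketch, assuming the third-party requests package; the domain is a placeholder.

```python
# Minimal sketch: verify that the HTTP version of the home page redirects to HTTPS.
# Requires the third-party "requests" package; the domain is a placeholder.
import requests

resp = requests.get("http://www.example.com/", allow_redirects=True, timeout=10)

if resp.url.startswith("https://"):
    print(f"OK: HTTP redirects to {resp.url} ({resp.status_code})")
else:
    print(f"WARNING: final URL is still {resp.url}, so no HTTPS redirect is in place")
```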

Google Safe Browsing Tool

Hackers can compromise legitimate websites, tricking visitors into unknowingly installing malicious software. This software can send a user’s private information, like usernames and passwords, to a fraudulent site.

In addition to using Google’s Search Console data, you can test your site’s safe browsing site status here.

Google Safe Browsing Transparency Report

3)    Have you blocked Google with forgotten NoIndex Meta Tags?

This is an issue that happens all too often: inadvertently blocking pages from search indexes with a NoIndex Meta Tag. This tag is often left behind when a page that was blocked during review and testing goes live.

You can check the source of a page for this code: <meta name="robots" content="noindex">
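If you only need to spot-check a handful of pages rather than run a full crawl, a short script can flag the tag for you. This is a minimal sketch, assuming the third-party requests and beautifulsoup4 packages are installed; the URL is a placeholder.

```python
# Minimal sketch: fetch a page and report whether its robots Meta tag contains "noindex".
# Requires "requests" and "beautifulsoup4"; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/new-landing-page/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

robots_meta = soup.find("meta", attrs={"name": "robots"})
content = (robots_meta.get("content", "") if robots_meta else "").lower()

if "noindex" in content:
    print(f'BLOCKED: {url} carries <meta name="robots" content="{content}">')
else:
    print(f"OK: {url} has no noindex directive")
```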

The easiest way to test your entire website for this tag is to run your own crawl using ScreamingFrog. It will provide a page-by-page report showing the current status of any Meta tags.

ScreamingFrog is our ‘go-to’ tool. We recommend adding it to your toolset. It will be the best money you spend as part of your own website audit.


4)    Do you have high-authority domains linking to your website?

One of the most important signals in Google’s ranking algorithm is the quantity and quality of inbound links. There is no “magic number,” because Google looks at a variety of factors, such as topical relevance (links from websites that are about the same topic).

Your website will not receive visibility in Google without trusted domains linking to it.

Count individual domains, not links. One domain linking to your site can show up as hundreds of inbound links.

You Can’t Fool Google or You Risk a Penalty
Tricking search engines using black-hat link building techniques is not an effective strategy. Among other signals, Google examines new link creation rates, in addition to the domain’s authority, to better identify naturally-occurring inbound links. Google’s ranking algorithm is looking for link diversity as well as authority.

A technical SEO audit will evaluate your link profile and other data points such as whether there is a high volume of toxic links pointing to your site.

To learn more about bad links and Google penalties, read “An In-Depth Guide to Link Quality, Link Penalties and ‘Bad Links.’”

There is no such thing as a 100% accurate backlink tool: All link popularity tools will deliver different data, so it’s best to monitor any trends over time.

Conducting a complete backlink audit is usually a separate project, but the do-it-yourself webmaster can start by picking a couple of backlink audit tools, downloading the data, removing duplicate domains and evaluating the inbound links. Set a benchmark using the results, and monitor changes over time.
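To illustrate the deduplication step, the sketch below counts unique linking domains from a backlink export. The CSV filename and the source_url column name are assumptions; adjust both to match your tool’s export format.

```python
# Minimal sketch: count unique linking domains in a backlink export.
# The filename and "source_url" column are assumptions about your tool's CSV format.
import csv
from collections import Counter
from urllib.parse import urlparse

domains = Counter()
with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        host = urlparse(row["source_url"]).netloc.lower()
        if host.startswith("www."):
            host = host[4:]
        if host:
            domains[host] += 1

print(f"{len(domains)} unique linking domains")
for domain, link_count in domains.most_common(20):
    print(f"{link_count:5d}  {domain}")
```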

     Our favorite link auditing tools:

5)    How FAST is your website?

Website Speed: How Fast Is Your Website?

There are three major reasons to care about page speed:

  1. Google gives faster websites better rank potential than slower websites. The best content in the world won’t help you achieve a high rank if the page’s load time is slow!
  2. Your customers won’t wait for a slow page to load. Not only is a slow loading time a one-way ticket to your competitors’ sites, but a bounce back to Google SERPs sends a strong signal that your URL wasn’t a good result for the search query.
  3. Slow page speeds limit the number of pages Google will crawl. John Mueller from Google states that crawl rates will slow down if Googlebot requires over two seconds to fetch a single URL.

How Fast are You? Compare Your Site Load Time

  • 2 second load time – faster than 70% of sites
  • 3 second load time – faster than 54% of sites
  • 6 second load time – faster than only 27% of sites
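To get a rough baseline yourself, you can time how long the raw HTML of a page takes to download. This is only a proxy for Googlebot’s fetch time, not the full in-browser load time (images, CSS and JavaScript are not fetched). A minimal sketch, assuming the third-party requests package; the URL is a placeholder.

```python
# Minimal sketch: time the raw HTML download for a page. This approximates
# Googlebot's fetch time, not the full rendered load time seen by visitors.
import time
import requests

url = "https://www.example.com/"  # placeholder
start = time.perf_counter()
resp = requests.get(url, timeout=30)
elapsed = time.perf_counter() - start

print(f"{url}: HTTP {resp.status_code}, {len(resp.content) / 1024:.0f} KB in {elapsed:.2f}s")
if elapsed > 2:
    print("Over two seconds per fetch, expect Googlebot to slow its crawl rate")
```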

Start right now. Remember “speed kills rankings”.

  • Optimize all your images to reduce file size
  • Evaluate WordPress plugins to identify bandwidth hogs
  • Remove redirect chains
  • Speed up your JavaScript

We like many page speed tools, but these three are our favorites:


This is an ongoing process, as plugins and JavaScript libraries are continually being upgraded. Test server-related speed, and continue optimizing by reading this page speed post.



Content Audit → Finding Issues Affecting SEO

6)    Are you splitting search visibility by unknowingly creating duplicate content?

Duplicate Content Will Dilute the Popularity of the Primary URL
Google’s goal is to deliver high-quality, non-duplicate pages in its search engine.

Duplicate content can hurt your site because it splits “link juice” (think trust and authority) across different URLs. It’s worth saying again – duplicate content dilutes the authority of the main URL.

There are many reasons why pages containing duplicate or even similar content appear under different URLs, including:

  • Query parameters (tracking URLs) used for campaign pages
  • Blog categories and tag pages containing the same content
  • Identical content contained on subdomains

The canonical link tag was created years ago to prevent one page of unique content from appearing under different URLs. We recommend using the canonical tag to identify the primary URL for duplicate, or similar, content that is available through multiple URLs.
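To spot-check which canonical URL each variant declares, a short script can pull the tag from a list of URLs. This is a minimal sketch, assuming the third-party requests and beautifulsoup4 packages; the URLs are placeholders.

```python
# Minimal sketch: report the canonical URL declared by each page in a list, so
# duplicates that should point at the same primary URL are easy to spot.
# Requires "requests" and "beautifulsoup4"; the URLs are placeholders.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/widgets/",
    "https://www.example.com/widgets/?utm_campaign=spring",
    "https://blog.example.com/widgets/",
]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    canonical = tag["href"] if tag and tag.has_attr("href") else "(none declared)"
    print(f"{url}\n  canonical: {canonical}")
```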

Another important consideration: when one piece of content is served from different URLs, it also fragments your data, making it harder to analyze the page’s performance.

Learn more here about creating canonical tags.

7)    Does your site have a high percentage of low-value pages?

Low-value content, or “thin” content, detracts from the user’s experience, and therefore Google doesn’t want to offer it in the SERPs. In fact, Google has developed a separate algorithm, called Panda, to remove low-value pages from its index.

One of the most important steps toward helping your web pages rank higher in Google is to ensure each page contains plenty of original, rich content. This content should include your primary relevant keywords and be content that visitors will want to read.

“If you build high-quality content that adds value, and your readers and your users seek you out, then you don’t need to worry about anything else.”
Amit Singhal, Google

What is a Low-Value Page and How is it Created?

During a content audit, our first assessment identifies and evaluates pages with a word count of less than a few hundred words. In evaluating these low-word-count pages, we often find duplicate content created by press releases, blogs and media article postings.

Landing pages for ad campaigns are often responsible for low-value content, as they often contain only a few brief paragraphs of text plus a lead form to access the “good content”.

  • While great for PPC, these pages are not healthy for organic search. You can evaluate this yourself in Google Analytics by running a Landing Page Report and segmenting for organic traffic.
  • These pages will almost always have a very high bounce rate from organic search when they are the first page a visitor sees upon arriving at your site.

What Can You Do About Low-Value Content?
Savvy SEO marketers should diligently remove these pages from the index. Identify any low-value pages, and check them against visitor behavior in Google Analytics. Redirect each page to a content-rich URL (using a 301 redirect), or improve the page itself.

At ToTheWeb, our mantra is, “only give Google the very best pages to index.” These are the pages with the best chance of reducing website abandonment while increasing visitor actions.

Use ScreamingFrog’s crawler to see a page-by-page word count. It will list each page along with the word count of the text inside the body tag.
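If you’d rather script a quick check yourself, the sketch below approximates the visible body-text word count for a list of URLs. It assumes the third-party requests and beautifulsoup4 packages; the URLs and the 300-word threshold are placeholders to adjust to your own benchmark.

```python
# Minimal sketch: approximate the visible body-text word count for each URL and
# flag potentially "thin" pages. The threshold and URLs are placeholders.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/press-release-archive/",
    "https://www.example.com/campaign-landing-page/",
]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()  # strip non-content markup before counting
    body = soup.body or soup
    words = len(body.get_text(separator=" ").split())
    flag = "  <-- possibly thin content" if words < 300 else ""
    print(f"{words:6d} words  {url}{flag}")
```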

8)    Have you implemented proven SEO techniques to boost page rankings?

A content audit will examine your most important web content, evaluating each page with the following questions:

Are there unique title tags? Your title should be unique and include a primary keyword phrase that best describes the page’s theme. The title length should be maximized for the new mobile search engine result pages (SERPs).

Does a Meta description tag exist? Your description should include the most important keywords within the summary. This tag should not be a duplicate of content currently on that page.

Are keywords emphasized in the body’s content?

  • Clear headings and sub-headings should be used, and each should contain the page’s keywords to help Google identify the content and page topic(s).
  • The page’s code should use an <H1> tag properly for the main page heading. <H2> and <H3> tags should be used for sub-headings.
  • The page’s file name should include the page’s primary keyword, separated by dashes.

Use this search optimization tool as a guide when creating your Title and Meta descriptions.
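For a quick scripted review of these on-page elements, the sketch below reports each page’s title length, Meta description length and the number of <H1> tags. It assumes the third-party requests and beautifulsoup4 packages; the URLs are placeholders, and it is a starting point rather than a definitive set of length rules.

```python
# Minimal sketch: check each page for a title tag, Meta description and a single
# <h1>, and report their lengths. Requires "requests" and "beautifulsoup4".
import requests
from bs4 import BeautifulSoup

urls = ["https://www.example.com/", "https://www.example.com/products/"]  # placeholders

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    description = desc_tag.get("content", "").strip() if desc_tag else ""
    h1_count = len(soup.find_all("h1"))

    print(url)
    print(f"  title       ({len(title):3d} chars): {title or 'MISSING'}")
    print(f"  description ({len(description):3d} chars): {description or 'MISSING'}")
    print(f"  h1 tags: {h1_count}" + ("  <-- expected exactly one" if h1_count != 1 else ""))
```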


9)    Will mobile visitors have a great user experience?

Is your content equally accessible on smartphone, desktop and notebook displays?

Having a responsive website, one that adapts to various devices and screen sizes, won’t automatically guarantee a great user experience. But with Google’s “Mobile First” indexing, it’s critical if you want to appear in mobile searches.
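One narrow check you can script yourself (no substitute for a full mobile-friendliness audit) is whether each page declares a responsive viewport Meta tag. A minimal sketch, assuming the third-party requests and beautifulsoup4 packages; the URLs are placeholders.

```python
# Minimal sketch: confirm each page declares a viewport Meta tag, one small
# prerequisite for rendering correctly on mobile devices. URLs are placeholders.
import requests
from bs4 import BeautifulSoup

urls = ["https://www.example.com/", "https://www.example.com/blog/"]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    viewport = soup.find("meta", attrs={"name": "viewport"})
    if viewport:
        print(f"OK: {url} -> {viewport.get('content', '')}")
    else:
        print(f"WARNING: {url} has no viewport Meta tag")
```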

Desktop and Mobile Displays

People are five times more likely to leave a mobile site that isn’t mobile-friendly.
Google

There are many auditing tools used to evaluate web page mobile-friendliness, but our go-to tools are:

See how device size will impact content rendering with our Homepage Wireframe design examples shown across primary displays.

10) Does your site have a well-planned internal link structure?

Does your site use descriptive anchor text that contains the same keywords as the page being linked to? If it does, that sends a strong signal to Google and other search engines. It is an often-missed SEO opportunity on many sites.

Search engines place greater emphasis on link text than on regular body text, so link text should include important keywords.

This is one more signal that Googlebot uses to understand the page you are linking to. Make it count!
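To review anchor text quickly, the sketch below lists the anchor text of every internal link on a page, so generic anchors like “click here” stand out. It assumes the third-party requests and beautifulsoup4 packages; the URL is a placeholder.

```python
# Minimal sketch: list the anchor text of internal links on one page so weak,
# generic anchors are easy to spot. The URL is a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

page = "https://www.example.com/blog/technical-seo-audit/"
site_host = urlparse(page).netloc

soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    target = urljoin(page, a["href"])
    if urlparse(target).netloc == site_host:  # internal links only
        anchor = a.get_text(strip=True) or "(image or empty anchor)"
        print(f"{anchor!r:40} -> {target}")
```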

Walking through this checklist is the first step in uncovering search-related problems. Step two is execution and monitoring results. We can help!

Talk to ToTheWeb

Contact Us


