The Elementary Digital Complete SEO Audit - Part One

Jan 11, 2016



When you've had a website for a while, you will probably be constantly tinkering with it to keep the SEO up to date. Every now and then you should consider doing a full SEO audit to ensure that everything is in order. This is the first of our three part guide to a thorough, top-to-bottom SEO audit for any site.

Part One: Technical SEO

Here’s a complete checklist of everything you’ll need to look at when performing a technical SEO analysis. It's a lot, so prepare yourself! Make sure that you have a cup of coffee, or four, to hand.

1. Overview

Check indexed pages

Start with a site: search (for example, site:yourdomain.com).

Look at how many pages are returned.

The top result should be the home page – if not, it could be because of a penalty you haven’t found yet or an issue with the site’s internal linking. However, it doesn’t always mean there is a problem – bear in mind this is quite a "quick and dirty" check.

How many landing pages are in Google Analytics?

Tally this with the site: search. It’s a more accurate way of seeing which pages Google thinks are valuable.

Search for your brand

Are pertinent pages appearing in search results? If not, you’ve got an issue somewhere.

Google cache results

Can you see the content? Are the navigation links there? Can you see links that aren’t on your site? Try using a text-only version of your site for a thorough check.

Mobile search

Is your site showing up as mobile friendly? Are landing pages the same? You need to make sure that the site and pages are mobile friendly to boost organic search.

2. On Page Optimisation

Title tags

Make sure title tags are unique and optimised – often this will mean using the primary keywords towards the beginning of the tag and your brand name at the end. Use 55-60 characters at the most to ensure full display.
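As a sketch – the page, keyword and brand here are hypothetical – an optimised title and description might look like:

```html
<!-- Primary keyword first, brand last, within ~55-60 characters -->
<title>Women's Leather Boots | Example Fashion Co</title>
<meta name="description" content="Shop women's leather boots at Example Fashion Co. Free UK delivery and easy returns on every order.">
```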

Check for the following:

  • Important pages have compelling meta descriptions and optimised titles

  • If there are missing titles or meta descriptions, complete them

  • Primary keywords should be used in the titles

  • Ensure pages use keyword phrases and similar alternatives throughout content (but don’t overuse them!)

  • Each page should contain unique content

  • Image file names, alt and title tags should feature appropriate keywords

  • URLs should be appropriately descriptive and optimised

However, changing URLs means putting 301 redirects in place, which can have a negative impact – so only change them when the existing URL is genuinely bad, or when the change won't break existing links. Make sure there aren't excessive parameters and that URLs don't include session IDs. Any URLs exposed to a search engine should be static. URLs should also be 115 characters or fewer.
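If you do change a URL, set the 301 up at the server level. A minimal sketch for Apache (the paths and domain are hypothetical):

```apache
# .htaccess: permanently redirect the old URL to its optimised replacement
Redirect 301 /products/old-page.html https://www.example.com/products/new-descriptive-url/
```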

3. Content

Is the homepage content optimised?

There needs to be enough content to show search engines what you are about. The days of ‘a page has to be 400 words minimum and more is better’ seem to be gone; in fact, with homepages, we’ve lately seen a little more success from a less-is-more approach. However, let’s say 150 words is the bare minimum.

Are landing pages optimised?

Make sure landing page text is plentiful to give search engines an understanding of the page. Don’t just use the same text from page to page – keep it original and unique.

Do you have appropriate landing pages? Look at your main services, locations or sectors you serve – are they all covered with content-rich, optimised landing pages? I once worked with a fashion retailer that had a vast e-commerce website set up with appropriate filtering across every type of garment.

This meant you could potentially have natural landing pages for all types of products. The one element they didn’t have was brands. It took a substantial amount of work to add this in, especially as we made these fully featured landing pages rather than simply optimising the existing category pages. It was worth it. The results were phenomenal – they now had around 30 high quality landing pages targeted at key brands, and organic traffic shot up!

Are the pages full of real content?

This means generally avoiding lists and definitely avoiding random links.

Have the keywords been used properly?

Are the keywords and the page content aligned? Are keyword and long-tail phrases used? Also check for any duplicated content using keywords, as well as duplicated page titles using keywords.

Is the content for humans too?

Don’t just write for search engine spiders – make sure the content is clear and says exactly what it is you do, so human readers understand too. Use proper paragraphs, punctuation, images and headers (with H tags). Make sure it’s easy to read. Use attention-grabbing headlines.

Content vs. Ads

Make sure your content is unique and not just an ad, particularly above the fold.

No duplicate content

One URL per piece of content

Do a search and make sure that you have only used content once on your site. Don’t duplicate anything, including product descriptions – if your filtering puts the same products available at different URLs you need to be able to annotate the canonical version. No duplicate content on other subdomains or any other site that the business may own. Even print-friendly versions of pages can be counted against you.

4. Indexing and Accessibility


Check your robots.txt

Be careful with what you block here – blocking some internal resources and admin pages is fine (though this by no means secures them), but it is no longer advisable to block your JavaScript and CSS files. In fact, doing so nowadays may raise suspicions.

You should also reference the location of your sitemap (or sitemap index if you are using multiple ones).
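As a sketch, a simple robots.txt that blocks an admin area and references a sitemap index might look like this (the paths and domain are hypothetical):

```
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap_index.xml
```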

Turn off CSS, JavaScript and cookies

Now check that all the links work and that the content is still there. This is a good test of how robots may see the site, because though their capabilities get more comprehensive all the time, you can’t trust they’ll be as capable as a human visitor.

Check for meta robots noindex tag on pages

Make sure that only the right pages – such as goal or thank-you pages – carry a noindex command.
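The tag itself is simple; a sketch for a goal page you want crawled but not indexed:

```html
<!-- Keep this thank-you page out of the index, but let robots follow its links -->
<meta name="robots" content="noindex, follow">
```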

5. Site Architecture

How many links on a page?

In the majority of cases this should be fewer than 100 – but as with most things SEO, if there’s a good, non-manipulative reason behind it, the rules can be bent – especially if the page is authoritative.

Think about your horizontal and vertical linking structure

Broadly speaking, items of equal importance – let’s say types of clothing on an e-commerce website – should form your horizontal structure. These would be your top-level categories.

Then, in this example, your vertical structure would comprise logical subdivisions of the main category for your subcategories, then into your product pages.

Use links in content, not off on their own

Don’t just have massive blocks of links off on their own – links should sit within the content, in context and used where appropriate – i.e. where they provide value to the visitor.

Use footer links appropriately

Footer links shouldn’t be used for primary navigation items.

Anchor text

Follow the Google guidelines for anchor text – you can’t game the system with lots of keyword-rich anchor text. The vast majority of your inbound links should feature your brand name/URL.

No broken links

Whilst a few 404s aren’t going to cause you significant ranking issues, they should be kept in check as they do affect user experience.

6. Technical Problems


Use 301 redirects

You should use 301s for pretty much all your redirects – at least all those where you want the ‘link juice’ to flow from the old URL to the new one. Potential exceptions include redirects for logging in/account functions, currency switchers etc.

So in general this means no 302s, 307s, JavaScript redirects or meta refreshes!

Redirects are direct

No complex redirect chains – just A to B.
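In server terms, that means pointing every legacy URL straight at the final destination rather than daisy-chaining. An Apache sketch with hypothetical paths:

```apache
# Good: both old URLs go directly to the final page – no A -> B -> C hops
Redirect 301 /old-page/ https://www.example.com/final-page/
Redirect 301 /interim-page/ https://www.example.com/final-page/
```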


No JavaScript-only links

Don’t have links in JavaScript, because you can’t really rely on crawlers being able to follow them.


No iFrames

Check that content isn’t pulled in via iFrames – except for the odd YouTube video, you’ll want the overwhelming majority of your content to live natively on your website.

Flash (ah-ahhhh)

Your site shouldn’t be done in Flash; you should probably avoid using Flash at all if you can – it’s rapidly becoming obsolete.

Search Console (Google Webmaster Tools)

Use Search Console (formerly Google Webmaster Tools) to submit sitemaps, find all sorts of errors (4xx, 5xx, etc.) and review penalty warnings.

XML Sitemaps

Make sure you’ve used full XML sitemaps, that they aren’t hiding poor architecture on your site and that they follow proper protocols (which you can check with an XML sitemap validator).
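For reference, a minimal sitemap following the sitemaps.org protocol looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-01-11</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```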

Canonical site version

Your server should establish the ‘correct’ version of your site – i.e. does it use www or not in the URLs? Does it establish https throughout the site? If not, it is ok to still establish a secure connection for things like a checkout process, but it should never be haphazardly employed throughout the site.

It’s a little counter-intuitive to me, but the current advice from Google is that each version of your site should be added to Search Console. So for our site we have:

  • http://www.

  • https://www.

  • http://

  • https://
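Establishing the correct version at the server usually comes down to a single direct 301 rule. A sketch for Apache, assuming the https://www. version is your canonical one:

```apache
# Force https://www. – any other host/scheme combination gets one direct 301
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ https://www.example.com%{REQUEST_URI} [R=301,L]
```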

Rel canonical link tag should be used correctly

The most common need for canonical tags that I see is in e-commerce sites where product filtering affects the URL. For example, on a shoe retailer you might navigate via the style first or via the size first, creating two URLs leading to one product. So you need to tell Google that there is a legitimate reason for what they will see as duplication.
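Both filtered URLs would then carry the same canonical tag in their head (the retailer and URL here are hypothetical):

```html
<!-- Whichever route the visitor took, this is the version to index -->
<link rel="canonical" href="https://www.example.com/shoes/mens-brogues/">
```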

7. Speed

Check load times for pages

It needs to be quick (of course!). I would advise initial checks through PageSpeed Insights. This may reveal easy-to-implement server changes such as enabling compression and leveraging browser caching. It will also reveal potentially time-consuming tasks such as optimising images, right up to the potentially impossible or not-worth-it tasks like eliminating render-blocking resources, where the time invested may not be worth the speed gains.

Following this, take a look at GTmetrix and Pingdom for a super thorough breakdown. Again, some of these tasks may be too time-intensive for the returns they generate – follow the improvements at your own discretion!

Finally, talk to your developers and your hosting company about what other steps you can take – this may include employing a content delivery network to deliver geographically distributed copies of your content.

8. Mobile

How is the mobile experience?

Are you using a responsive layout?

Most people these days are offering a responsive layout and it’s not hard to see why. It adapts to new display sizes (such as the recent, even bigger iPads) and avoids the SEO problems – duplicate content among them – that can come up with an improperly configured mobile site.

Are you using a mobile site?

Use analytics on your mobile site as well as your normal site. Make sure the markup is done correctly, with the mobile URLs pointed to from the desktop pages using rel="alternate". The mobile site needs to link back to the matching canonical desktop site.
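The standard annotations Google expects for separate mobile URLs look like this (m.example.com and the page path are placeholders):

```html
<!-- On the desktop page: point to the mobile equivalent -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page/">

<!-- On the mobile page: point back to the canonical desktop page -->
<link rel="canonical" href="https://www.example.com/page/">
```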

Dynamic serving?

Dynamic serving means you need to send the Vary HTTP header to alert search engines that the content served at a URL varies by user agent.
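The response header itself is a one-liner:

```http
HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8
Vary: User-Agent
```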

Mobile experience meeting users’ needs?

Check to see what your mobile users are looking for on your site and optimise appropriately – is your conversion rate much lower here? It could indicate navigational & transactional issues if so.

No mobile redirects

Ensure mobile users are being sent where they want to go, not the homepage – if you redirect everything to the homepage, you’re just asking for bounces.

9. International

If your site is serving the world, you need to make sure that all versions meet SEO guidelines. Additionally:

  • Most people will want to put international versions into subfolders, rather than having individual domains.

  • Indicate the country targeting within Search Console.

  • Use hreflang and rel alternate to clearly demonstrate why there might be a lot of content that could seem like duplication to a crawler.

  • If you have sites in the same language serving different countries, ensure that the copy is original for both versions.

  • Use the correct currency.
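Putting the hreflang point above into practice, the annotations for hypothetical UK and US English versions of a page might look like:

```html
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```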

10. Analytics

Make sure the tracking code is on every page – otherwise you’re not getting the full picture.

Ensure there’s only one analytics tracking code on each page and that it’s the same one throughout – with the latest version you can get demographics information that might reveal where your targeting is going astray, or even new types of customers you hadn’t imagined.
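For reference, the standard Universal Analytics snippet (current at the time of writing) goes just before the closing head tag – UA-XXXXX-Y is Google’s own placeholder for your property ID:

```html
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXX-Y', 'auto');
ga('send', 'pageview');
</script>
```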

Link up your AdWords account to get the full picture of your marketing activities.

Exclude internal IP addresses

The bigger your company is, the more relevant this is – visits by your own staff could be wildly inflating your stats and feeding you false information.

Event tracking is on for user interactions you want to track

Event tracking is great for tracking conversions – both macro and micro – but in my experience once it’s put in place it needs testing, testing again and then once more for luck – though technically quite simple in theory, I’ve always found the practical application a little tricky. Maybe it’s just me!
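As a sketch of the markup side (the file and category/action/label values here are hypothetical), an analytics.js event on a download link:

```html
<!-- Fires an event with category/action/label when the link is clicked -->
<a href="/brochure.pdf"
   onclick="ga('send', 'event', 'Downloads', 'Click', 'Brochure PDF');">
  Download our brochure
</a>
```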

Phew! There you have it – a technical audit guide. Come back for parts two and three of our SEO audit guide.

Note: The opinions expressed in this article are the views of the author, and not necessarily the views of Caphyon, its staff, or its partners.

Article by

Owen Radford

Owen is the online marketing manager at Elementary Digital, a Leeds and London based digital agency specialising in WordPress and Magento websites.
