SEO Techniques

Sam Underwood


It probably seems like the SEO techniques you need to know evolve each year. And that’s because they do!

As each year passes, Google gets more intelligent, and search results constantly change. If you want to keep up to speed, the techniques you use will need to evolve.

Fortunately, this article has you covered.

I’ll go over the top SEO techniques you need to know in 2022, all the way from basics to advanced.

Write useful, compelling, and targeted content

People visiting your site want to read something not only relevant but also valuable for whatever they search for within a search engine. You should do a few key things with your content to fulfill their needs.

Build trust, show expertise, become an authority

Building trust, showing expertise, and becoming an authority should be at your SEO strategy's core.


Expertise, authority, and trust (E-A-T) are the guiding principles Google uses when building and evaluating its algorithms, to make sure they’re doing what they should do.

That does not mean that E-A-T is a ranking factor. It does, however, mean Google is looking for measurable signals that align with E-A-T, such as high-quality content, a popular brand, or links from other reputable sites.

Becoming a brand that shows E-A-T has benefits beyond ranking signals, including a huge impact on things such as the CTR you receive in search results.

For example, when you use a search engine, are you likely to pick a popular site you know has excellent content or one that you’ve never heard of before?

A great example is a SERP showing cake recipes.


Are users more likely to click the BBC Good Food or Waitrose result? Or the Baking Mad result?

Chances are, it’s going to be one of the first three sites. Users are more likely to recognize and trust these brands, as you can see from the branded search volumes of each site.


But how do you get to this point?

You can do certain things to better show E-A-T, but part of the answer is a mix of different types of marketing activity that go beyond the scope of an SEO article.

If you want to better show E-A-T with your content, here are some tips:

  • Use authors who are experts and display them on your site with bios.

  • Tell users why they should trust you, such as the process you’ve gone through to build a product or research an article.

  • Keep content factually accurate. Reference third-party sources to verify statements.

  • Make sure the content is current.

  • Show positive comments and reviews from users or third-party sites.

  • Display trust signals on your site, like any awards, certification, or events you’ve spoken at.

  • Make it easy for users to contact you, especially if you’re a lesser-known ecommerce store.

Quality content, not quantity 

While a simple point, it’s an important one.

The content you publish must have a purpose instead of being there for content’s sake.

There is no SEO benefit from creating a set number of articles at a specific word count that doesn’t fulfill your audience’s needs. 

There is, however, a benefit from developing content that matches search intents, is comprehensive, and is of a higher quality than competing URLs.

You’ll often hear the question, “How many articles should we create a month for SEO?”

The answer is always, “As many as we can create while ensuring they’re better for users than competing ones.”

I regularly see this with my clients: higher-quality, more thorough content outperforms competitors who publish more articles at higher word counts.

Use an inverted pyramid style

The inverted pyramid is a widely adopted way of writing where you start with essential information and end with the least important.

To break it down even further, it might look like this:

  1. The who, what, where, why, how information

  2. A breakdown of the points that you’ve introduced at the beginning

  3. Background information

  4. Any additional/related information or quotes

  5. References 

[Image: inverted pyramid style]

Analyze any well-written piece of content, and you’ll soon be familiar with this structure.

Make your content engaging and useful

There are multiple ways to create engaging and compelling web content.

Some effective methods for blog articles are to:

  • Use short, concise paragraphs that are to the point, as mentioned in Google’s SEO Starter Guide

  • Include images and video

  • Add quotes from experts

  • Provide useful resources like templates or calculators

  • Create custom graphics to explain a concept 

If you’re creating a product or category page for ecommerce, consider:

  • Adding FAQs

  • Videos showing customers using the product

  • Valuable reviews from users

  • Comparisons with similar products

  • Interactives to display your product’s USPs (this great example by Bellroy comes to mind)

In addition, it’s not just about the content itself; think about the layout and design.

Design plays a significant role in how users feel about a brand and their enjoyment from reading a piece.

You’ll quickly get the point if you disable CSS for a webpage.


While this is what a user wants from the content, the design isn’t very inspiring or memorable. A competing site providing similar content could quickly improve upon this with better presentation.

Ensure your content is targeted

You should always create content that your audience is actively searching for.

You can validate you’re creating targeted content by:

  1. Understanding users' searches: You can do this easily with keyword research tools that show you what users are looking for.

  2. SERP analysis: This will ensure you’re answering the query with the suitable types of content, whether that’s a blog post or service/product page.

  3. Competitor analysis: Check competing articles while doing your SERP analysis. Does your content have value-add assets like imagery, video, expert quotes, or unique data?

  4. Contact your company’s help desk and see what people are asking about: Content answering these questions isn’t only targeted; the questions also show users aren’t finding what they’re looking for.

In addition, making targeted content goes beyond what a user searches. A large part of creating valuable content is catering it to your audience.

There are several ways you can learn more about who you’re writing for:

  • Look at your competitors’ audience. A tool like Quantcast or SparkToro can help.

  • Understand demographics: Tools like demographics.io can help you understand demographics based on keywords. You can even filter this for individual countries.

  • Use social insights. Facebook offers a follower breakdown of your company page. Platforms like Instagram and Twitter allow you to create polls, so if you already have an audience and want something answered, these can be useful.

  • Run surveys: Tools like Google Surveys allow you to home in on precisely what you want to know.

Use heading tags for structure

Heading tags are a great way to incorporate structure into your content.

Not only do they break up blocks of text, but they are also an excellent way for your audience to find the part they want to read.

The concept for heading tags is simple — nest lower headings (like H3s) within higher headings (like H2s) to show how content is organized.

Here’s an example heading structure for an article:

H1 - Europe’s best attractions
H2 - France
H3 - Eiffel Tower
H3 - Notre-Dame
H2 - England
H3 - London Eye
H3 - Tower Of London

If it’s a substantial piece, including a table of contents with jump links can be handy.
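To sanity-check a structure like the one above, you can parse the page and flag any heading that skips a level (an H3 straight after an H1, for example). Here’s a minimal sketch using Python’s standard library; the validator and its rules are my own illustration, not an official tool:

```python
from html.parser import HTMLParser


class HeadingParser(HTMLParser):
    """Collects (level, text) pairs for h1-h6 tags in document order."""

    def __init__(self):
        super().__init__()
        self.headings = []
        self._level = None

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self._level = int(tag[1])

    def handle_data(self, data):
        if self._level is not None and data.strip():
            self.headings.append((self._level, data.strip()))
            self._level = None


def skipped_levels(headings):
    """Return headings that sit more than one level below the previous heading."""
    issues = []
    prev = 0
    for level, text in headings:
        if level > prev + 1:
            issues.append((level, text))
        prev = level
    return issues
```

Running `skipped_levels` over the attraction-guide structure above would return no issues, while an H1 followed directly by an H3 would be flagged.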


Keep your content fresh

Google has a freshness algorithm that matches queries looking for new content with new articles, so it’s essential to ensure you’re keeping content up to date.

Not all content needs refreshing. If you’ve already covered a topic thoroughly and there is nothing more to add, there’s no need to update the content — unless you see an opportunity to make it genuinely better.

Fresh content is most important for news, trends, celebrities, or other ever-changing topics.

There are a few ways to ensure you’re always keeping your content fresh:

  • Keep on top of industry news: Update your article and reshare it with your audience if anything exciting changes. This will also show that you know your stuff in the industry and you’re an expert.

  • Implement a content refresh process: It’s off-putting to see a site full of outdated articles. Rather than publishing new content all the time, refresh older pieces that aren’t performing.

Improve your search appearance

It doesn’t end once you’ve created high-quality content; next, you need to make sure it gets clicked from a search result.

The main ways to entice a user to click your result are:

  • Title tags

  • Meta descriptions

  • Rich results

Here’s how to get those optimized.

Create compelling title tags

Title tags can persuade your audience to visit your site and read that piece of content.

For example, this title tag isn’t that exciting, and your CTR would likely mirror this.

"Different ways you can make money doing a side job | Brand Name"

However, consider this title tag:

"Growing Your Side Hustle [Ultimate Guide] | Brand Name"

It would be more likely to entice you to click.

Here are a few age-old tips for making title tags more compelling.

  • Include targeted keywords near the start: While a brand name in titles can be beneficial, in most cases, you won’t want it at the beginning of your title tags.

  • Don’t be repetitive: Try to vary the words used; you don’t need to repeat the exact keywords over and over in different ways.

  • Keep title tags short: They ideally should be a max of 70 characters, or search engines will truncate them in results.

  • Use power words: These words evoke emotion and intrigue, which can boost your clickthrough rates.

While all of these are SEO best practices, what works will vary for each search result. Make sure to try different ones and see what works.
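These tips can also be checked programmatically before a title goes live. Below is a rough heuristic sketch (my own, not an official checker): the 70-character threshold mirrors the tip above, though in practice Google truncates titles by pixel width rather than character count.

```python
# Rough proxy: ~70 characters, per the tip above. Google actually truncates
# by pixel width, so treat this as a heuristic, not a guarantee.
MAX_TITLE_CHARS = 70


def check_title(title, brand=None):
    """Return advisory notes for a proposed title tag (heuristics only)."""
    notes = []
    if len(title) > MAX_TITLE_CHARS:
        notes.append(f"may be truncated ({len(title)} chars)")
    if brand and title.strip().startswith(brand):
        notes.append("brand name leads the title; consider putting keywords first")
    return notes
```

For example, `check_title("Growing Your Side Hustle [Ultimate Guide] | Brand Name", brand="Brand Name")` returns no notes, while an 80-character title starting with the brand name would return two.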

Improve your meta descriptions

SEOs often deprioritize meta descriptions, and you can understand why. Data from Ahrefs shows that Google rewrites meta descriptions in 63% of cases. This means even with custom meta descriptions, Google may decide they know better and create their own based upon your content.

Nonetheless, a well-written meta description can help improve your SEO CTR by giving users valuable information about your page and enticing them to click your result.

Meta descriptions are the short piece of text under the title tag. They’re typically around 160 characters long on desktop and 120 characters on mobile, but this can vary.

Here are some tips for writing better meta descriptions:

  • Match the search intent: If a user is looking for something specific, mention your page provides what they’re looking for. For example, if a user is looking for "things to do," say something like, “Looking for things to do in London? Our guide covers the 20 best things…”.

  • Make them actionable: You want the user to click your result, prompt them to click with phrases like “Check out our guide covering…”

  • Create urgency: Running a sale that ends soon? Mention that within your meta description. 

  • Mention your USPs: Does your brand have anything that makes you unique in your market, like same-day delivery? Mention that within your meta description. 

Optimize for rich results

Acquiring rich results like People Also Ask and featured snippets is a great way to boost where you rank, improve SERP coverage, and receive an enhanced listing in search results.

[Image: Featured snippet]

Here are the steps to help obtain these rich results.

1. Check the search result

First, check which rich results are showing for the targeted keyword and note the content returned. It’ll likely be one of these:

  • Paragraph

  • Table

  • Numbered list

  • Bullet list

2. Mimic the type of snippet returned

Next, mimic the type of snippet with your content, e.g., if it’s a table, you’ll likely need to match the query’s intent with a table.

3. Follow best practices

While you’re answering the intent of the query, you’ll also need to follow a few best practices, such as:

  • Answer the question factually and to the point.

  • For paragraph snippets, use 40-50 words.

  • Use imagery with descriptive alt text to optimize for featured snippets with images.

Acquire rich results with structured data

Adding structured data is a great way to help search engines better understand your content and enhance your listing with search results.

Structured data explicitly tells search engines about your content via code added to the page. This increased understanding means search engines can include additional information about your webpage when a user searches for content like yours.

A great example of these is Recipe rich results.

[Image: Pie recipe search]

The additional information such as the reviews, ingredients, and time to cook comes from the recipe structured data each site has added.

Another example for ecommerce sites is Product rich results.

[Image: Google search]

Providing pricing and stock information within structured data on your site gives Google the confidence to show that information within the search results, potentially positively impacting your CTR.
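As a rough illustration, Product structured data like this is usually embedded as JSON-LD. All the product details below are hypothetical; the field names follow schema.org’s Product and Offer types:

```python
import json

# Hypothetical product details; real values would come from your catalog.
product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Waterproof Leather Boots",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "128",
    },
    "offers": {
        "@type": "Offer",
        "price": "89.99",
        "priceCurrency": "GBP",
        "availability": "https://schema.org/InStock",
    },
}

# Serialize for embedding in the page's HTML.
snippet = json.dumps(product_jsonld, indent=2)
```

You’d place the serialized JSON inside a `<script type="application/ld+json">` tag on the product page, then validate it with Google’s Rich Results Test.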

Organize content logically and optimize internal linking

How you organize content and internally link to it significantly impacts your site’s search performance.

If you’re not incorporating logical site hierarchies and internal links into your strategy, you could be missing out on the many benefits that come with them.

Here are some techniques to improve this aspect of your SEO strategy.

Build content around topics

Search engines are much better at understanding sites built around topics and subtopics.

Internal links are a crucial way to show how you’ve organized your content. By building clusters of content where broader parent topics internally link to more specific subtopics, you’ll help search engines better understand your site.

[Image: Topic cluster]

A simple example to explain this concept would be a travel site.

You’ll often find these sites create logical hubs of content based upon a country, which then have internal links to subpages, such as cities or towns within that country. The city and town pages could have internal links to landmarks or attractions within them.

Building sites in this manner also helps:

  • Your audience navigate through your site

  • Reduce bounce rates and increase time on site

  • Ensure you’re efficiently internally linking all of your content

Reflect organization within your URLs

A good URL should:

  • Be short and readable

  • Indicate how you’ve organized the site

  • Not include irrelevant directories or numbers

  • Be lowercase and use dashes to separate words

The following good URL is short, readable, and shows how the site is organized.


In contrast, this URL has several directories, is challenging to read, and hides the site structure.
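Most of these URL rules can be enforced with a small slug helper that lowercases text, drops apostrophes, and collapses everything else into dashes. A sketch (the helper names and example URL are illustrative):

```python
import re


def slugify(text):
    """Lowercase, drop apostrophes, and collapse other non-alphanumerics into dashes."""
    text = re.sub(r"['’]", "", text.lower())
    slug = re.sub(r"[^a-z0-9]+", "-", text)
    return slug.strip("-")


def build_url(base, *segments):
    """Join slugified path segments onto a base URL."""
    return base.rstrip("/") + "/" + "/".join(slugify(s) for s in segments)
```

For example, `build_url("https://example.com", "Women's Shoes", "Waterproof Boots")` produces a short, lowercase, dash-separated URL that mirrors the site hierarchy.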


Use breadcrumbs

A breadcrumb shows where the current URL sits within the site hierarchy. You’ll usually find them at the start of the content or product description section, and they look something like this:


Following our Women’s Boots URL example, a breadcrumb for that site would look something like this:

Women’s > Shoes > Boots > Waterproof

Usually, your breadcrumbs and URL structure should be pretty similar in that they both indicate the site's hierarchy of content.

Sometimes you may want to omit parent pages from your URL structure to avoid having a large number of subdirectories which can be more challenging for a user to read — it’s important to note this doesn’t directly impact SEO. Still, you should always try to keep URLs as readable as possible.

In these cases, you may exclude specific directories from the URL, but how you’ve organized content should always be correct within your breadcrumbs.

In addition to being helpful for users, they also give Google clues on the structure of your site, especially if you add the associated breadcrumb structured data.
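The breadcrumb structured data mentioned above is typically embedded as JSON-LD using schema.org’s BreadcrumbList type. Here’s a sketch mirroring the Women’s > Shoes > Boots > Waterproof trail; the URLs are hypothetical:

```python
import json

# Hypothetical trail mirroring the Women's > Shoes > Boots > Waterproof example.
trail = [
    ("Women's", "https://example.com/womens"),
    ("Shoes", "https://example.com/womens/shoes"),
    ("Boots", "https://example.com/womens/shoes/boots"),
    ("Waterproof", "https://example.com/womens/shoes/boots/waterproof"),
]

breadcrumb_jsonld = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": i, "name": name, "item": url}
        for i, (name, url) in enumerate(trail, start=1)
    ],
}

# Serialize for a <script type="application/ld+json"> tag.
breadcrumb_snippet = json.dumps(breadcrumb_jsonld, indent=2)
```

Each `ListItem` carries its position in the trail, which is how Google reconstructs the hierarchy for the breadcrumb display in search results.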

Make good use of sitewide links

Sitewide links included in your header and footer are still a key focus for SEO. Having a URL in one of these areas will signal to Google its importance and increase the amount of link equity passed to it.

They’re also essential for UX, allowing users to quickly access popular areas of the site that may be deeper within the site hierarchy.

In a few steps, you can spot opportunities to add high-importance URLs using Sitebulb.

1. Set up a crawl and include Google Search Console/Google Analytics data

First, set up your Sitebulb crawl to pull data from Google Analytics and Google Search Console (GSC).

2. Filter and sort the URL explorer report

Next, head to the URL explorer report in the top bar.

[Image: audit overview]

Then head to Directives > Indexable


Now you’ll want to adjust the columns to include only:

  • Crawl Depth

  • No. Internal followed links

  • Total Clicks

  • Total Impressions

  • Sessions

[Image: Selected columns]

Now sort by "Total Impressions" and spot high-impression URLs with a low number of internal links or a higher crawl depth. 

[Image: Total Impressions]

Review these URLs to see which are most important to your business, then incorporate them as internal links within your header or footer.

Use descriptive anchor text 

Anchor text is the text used whenever you add a hyperlink. It gives vital information to search engines and users about what they’ll find on the page the link is pointing to.

Descriptive anchor text is beneficial for accessibility. Without context, your audience is less likely to click anchor text such as "click here" or "more info."

Longer, more descriptive anchor text gives Google more context, which can positively impact the rankings of the destination page.

Here’s an example.

If the hyperlink is to a page on Japanese restaurants in Osaka, the following anchor text is an example of being both relevant and descriptive:

"Some of the best Japanese restaurants within Osaka."

However, you’ll likely want to spot these opportunities in bulk. Again, a great way to do this is with Sitebulb, thanks to their "Link Explorer."

First, head to the link explorer report from the top bar.

[Image: explorer tab]

Use the search bar to filter for the target URL you want to optimize internal anchor text for.

[Image: Anchor text]

Examine the anchor text used; is it all descriptive? If you see a large portion of anchor text that uses non-descriptive phrases like "here" or "this article," this highlights an optimization opportunity.
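If you’d rather script this check, a small parser can flag generic anchor text across a page. Here’s a sketch using Python’s standard library; the list of generic phrases is my own assumption and worth tailoring to your site:

```python
from html.parser import HTMLParser

# Assumed list of non-descriptive phrases; extend for your own content.
GENERIC = {"here", "click here", "this article", "more info", "read more"}


class AnchorAudit(HTMLParser):
    """Collects anchor texts and flags generic, non-descriptive ones."""

    def __init__(self):
        super().__init__()
        self.flagged = []
        self._in_a = False
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_a = True
            self._text = []

    def handle_data(self, data):
        if self._in_a:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._in_a:
            anchor = " ".join("".join(self._text).split())
            if anchor.lower() in GENERIC:
                self.flagged.append(anchor)
            self._in_a = False
```

Feeding it the Osaka example above would pass, while an anchor reading "click here" would be flagged for rewriting.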

Provide a great user experience

A user-friendly site can't make up for poor content, yet good content can't entirely make up for a poorly functioning site.

Instead, a good-looking site and quality content go hand in hand to provide an excellent experience.

Having a poorly designed or functioning site can also impact SEO. In some cases directly, through algorithms such as Core Web Vitals; in other cases indirectly, with a poor site appearance hurting the site’s ability to perform well for direct ranking factors, like links (something I believe John Mueller is alluding to here).

Here are some ways to improve user experience and your organic traffic.

Go mobile-first and use a responsive design

Mobile-first refers to prioritizing a mobile site's design and user experience over a desktop. Essentially, websites should be created with the mobile user’s experience first and then adapted to larger screens.

This reflects the way users consume content on the web nowadays and how Google has changed how they crawl and index the web’s content.

Mobile-first SEO has no significant changes from what you’d do for desktop, with the largest consideration being that you have parity between the two when it comes to:

  • Content

  • Internal linking

  • Structured data

  • Meta tags/information

The best way to ensure parity is simply to use a responsive design.

[Image: Go mobile first]

This way, the same HTML code is used no matter the device, with the only changes being visual via CSS media queries.

Suppose you use an alternative method of implementing device-optimized sites, such as separate URLs or dynamic serving. In that case, you’ll then need to start doing parity audits, which tend to be complicated and time-consuming.

Optimize your Core Web Vitals

Core Web Vitals are metrics created by Google to measure the user experience of a webpage.

In 2021, Core Web Vitals became a ranking factor (albeit a minor one), meaning improving them is one technique to improve organic traffic.

Google breaks Core Web Vitals down into three key areas, each with different ways to improve them.


Loading performance

Users are impatient when it comes to loading times.

The difference between a two-second and four-second load time can have a massive impact on how many of your users drop off.

The Largest Contentful Paint (LCP) Core Web Vital measures loading performance by indicating how long the most prominent element in the browser viewport took to load.


Interactivity

In addition to how long it takes to load a page visually, how quickly the browser responds to a user interacting with the content is also essential.

First Input Delay (FID) measures this: the time between a user first interacting with a page and the browser being able to respond to that interaction.

Visual stability 

The final measure, called Cumulative Layout Shift (CLS), measures the content’s stability while the page is loading — that annoying reshuffling of page elements right as you're about to click. There are great video examples of what the CLS metric aims to solve on Google’s web.dev site.

Issues here usually happen due to browsers parsing CSS in a particular order. Browsers parse some CSS after the user can see the webpage, causing layout changes as new styles are applied.

Resources for optimizing Core Web Vitals

A good place to start to learn more is my Core Web Vitals article. I’d also definitely recommend reading Google’s dedicated articles on tactics to optimize each metric.

Keep important content above the fold

Unique content above the fold is an important usability and SEO fundamental. 

The importance of above-the-fold content has been reiterated by John Mueller and shown by Google’s page layout algorithm.

[Image: content above the fold]

The technique here is simple: prioritize important content higher up the page and ensure you summarize the page above the fold. That includes both your H1 tag as well as content summarizing the page.

Avoid intrusive interstitials 

There is nothing more frustrating than clicking somewhere on a page, only for an interstitial to show at that exact time, and before you know it, you’re on a site you didn’t want to be on.


That is called an intrusive interstitial, and Google has an algorithm that detects these and demotes sites that use them.

Ads may be a part of how you make your revenue, but providing a frustrating user experience to surface those ads will not help your brand’s perception or search traffic.

Sometimes non-ad interstitials are necessary — for example, a cookie-usage interstitial.


Or even an age-verification interstitial.


You don’t have to worry as much about these as you won’t be penalized for them.

If you can’t get rid of ads, instead make sure they don’t take up too much screen space.


This is much less in-your-face and annoying!

Help Google discover and index your content

While you should write content with your audience in mind, you still need to do your due diligence to help Google discover and serve that content to users.

This is the part technical SEO has to play in your SEO strategy. Here are some key techniques to ensure technical SEO fundamentals are covered.

Optimize crawling 

Crawl budget is the limit a search engine places on how many of your site’s URLs it will access. The crawl budget varies from site to site, based on several factors such as the site’s speed and popularity.

If you don’t have a huge site (over 1 million URLs), optimizing the crawling of your site is unlikely to be an SEO technique you’ll spend much time on.

If you’d like to verify whether it should be a focus, GSC or an auditing tool can help by simulating how a search engine crawls your site, showing you the number of URLs discovered.

In Sitebulb, it’s as simple as setting up a crawl then checking the total number of URLs crawled on the overview page.

[Image: Audit overview]

If you spot a large number of crawlable URLs, you’ll want to use the robots.txt to manage the crawling of your site.


The robots.txt is a file that gives directives on the URLs web crawlers can and cannot crawl on your site.

Good crawlers obey it to the letter, including search engines like Bing and Google.

The contents of the file will look a bit like this:

User-agent: * 
Allow: / 
Disallow: *?color 
Sitemap: https://www.example.com/sitemap.xml

If you’re looking to manage which URLs bots can crawl, you’ll likely be adding disallow rules like the above to the file to instruct search engines not to crawl a URL.

For example, within Sitebulb, if you selected the list of "Crawled" URLs shown earlier, you could scan the table returned and spot that the crawler found a large number of URLs with a "?size=" parameter.

To then exclude that from being crawled, you’d add a wildcard rule like this:

Disallow: *?size
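To preview whether a wildcard rule like this would block the URLs you intend, you can test it with a simple matcher. This is a simplified sketch of wildcard Disallow matching written from scratch (Python’s built-in urllib.robotparser does not support `*` wildcards, so the matching logic here is hand-rolled and not a full robots.txt implementation):

```python
import re
from urllib.parse import urlsplit


def rule_to_regex(rule):
    """Translate a robots.txt path rule (with * wildcards) into a regex."""
    return re.compile(re.escape(rule).replace(r"\*", ".*"))


def is_disallowed(url, disallow_rules):
    """Check a URL's path + query against a list of Disallow rules."""
    parts = urlsplit(url)
    target = parts.path + ("?" + parts.query if parts.query else "")
    return any(rule_to_regex(r).match(target) for r in disallow_rules)
```

With the rules `["*?size", "*?color"]`, a URL like `/shoes?size=9` would be reported as disallowed, while `/shoes` on its own would not.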

Improve indexing

Ensuring quality indexing is a fundamental technique for improving organic traffic. Google evaluates your site based on the URLs indexed, as explained by John Mueller in this webmaster hangout.

A quick and easy way to understand whether you have indexing issues starts with having XML sitemaps that reflect all the important URLs you want indexed on your site.

Most CMSs, including WordPress, generate these automatically. You may also have a more customizable sitemap via WordPress SEO plugins like RankMath.

First, go to the Sitemaps section of GSC to make sure you've submitted an XML sitemap. 

[Image: GSC Sitemaps section]

Next, go to the coverage reports, select "Valid" on the chart, and look at the Details table. 

[Image: Valid URLs]

There is a significant difference between the two rows shown in this report.

Fixing indexing first requires you to understand which indexed URLs shouldn’t be there.

You can get a sample by selecting the "Indexed, not submitted in sitemap" row of the table in the earlier screenshot.

[Image: URL examples]

For small sites, that data may be enough, but the report is limited to 1,000 URLs. Sometimes you’ll need more data to understand the patterns of URLs that are causing indexing issues.

In that case, use a crawler like Sitebulb and go to the "Indexability" report in the left sidebar.

[Image: Audit Overview]

Then you’ll want to click on the green area of the "Indexability Status" chart.

[Image: Indexability Status]

Sitebulb will then return a list of all the indexable URLs found on your site.

Next, go through these URLs and try to spot ones that are indexable but shouldn’t be.

Once you’ve spotted the issue, you’ll want to get those URLs out of the index. There are two ways you can do that.

Canonical tags

You can use canonical tags to consolidate duplicate pages and all ranking signals of those pages into one authoritative page.

That means that you should only be using a canonical tag to solve your indexing issues if the pages causing the problem are duplicates of each other.

Common causes for indexing issues from duplicate content tend to be sites adding URL parameters or the CMS generating more than one URL path to access a piece of content.

Adding a canonical tag looks like this:

<link rel="canonical" href="https://example.com/shoes/leather" />

So if you had a duplicate URL caused by parameters, like the following:


You’d correct it by adding a canonical tag on the URL with the parameters, pointing to the URL without them.
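One way to generate that canonical target is to strip the query string programmatically. A sketch (this assumes every parameter on the URL is non-meaningful for indexing, which you’d need to verify case by case; parameters like pagination can be meaningful):

```python
from urllib.parse import urlsplit, urlunsplit


def canonical_for(url):
    """Return the URL without its query string, plus the <link> tag to embed."""
    parts = urlsplit(url)
    canonical = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    tag = f'<link rel="canonical" href="{canonical}" />'
    return canonical, tag
```

For example, a URL carrying tracking or filter parameters would canonicalize back to its clean path, and the returned tag can be added to the parameterized page’s `<head>`.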

In some cases, a 301 redirect may make more sense to consolidate ranking signals between duplicates, such as when choosing a canonical hostname (e.g., non-www/www or HTTP/HTTPS).

[Image: Canonical tags]

Generally, you should opt for a 301 if the user should be redirected to the canonical URL rather than simply consolidating ranking signals between the duplicates.


Noindex tags

Suppose duplication isn’t causing your indexing issues. Instead, it’s just a CMS generating a page that doesn’t add value for search (a good example is WordPress creating media attachment pages). In that case, you’ll want to use the noindex tag.

The noindex tag tells search engines to stop the indexing part of their processing pipeline, resulting in the URL not being used when evaluating your site's quality. 

The noindex tag is added to the <head> of the page and looks like this:

<meta name="robots" content="noindex">

It’s important to note that if you add the noindex tag, you also need to make sure you haven’t blocked crawling of that page with robots.txt, at least until search engines have removed the page from their index.

If you have, the bot won’t access the page, resulting in the noindex tag not being seen and the URL not being removed from their index.

Ensure search engines can render URLs

Sites that render client-side have become particularly prevalent over the past several years due to the UX benefits and more straightforward implementation using frameworks like Vue.js or React.


Client-side rendering simply means that HTML is rendered from JavaScript by the browser, rather than the server rendering the HTML code, like with popular languages such as PHP.

However, this can come with potential SEO issues as not all crawlers render JavaScript. 

Google has improved its JavaScript rendering capabilities. However, by not server-side rendering key elements like content, meta information, and internal links, you still run the risk of Google not correctly rendering the page’s content, which could impact how you rank.

To test whether Google is rendering your content fully, use GSC's URL inspection tool.

Accessing this tool is as simple as searching a URL within the bar at the top.

[Image: Google indexing]

Once you’ve done that, you’ll then want to select the test live URL button.

[Image: Google index request]

Googlebot will fetch and render your page and let you view the tested page. 

[Image: URL available on Google]

You’ll then want to view the screenshot. Does it accurately reflect your page?

[Image: indexed page]

If you know you’re injecting content with JS and the screenshot is missing key content, bots may not be rendering critical content on the page, negatively impacting how you rank.

Correcting these issues involves either server-side rendering or statically generating content (Next.js is an excellent option for React). You could also implement dynamic rendering, where users get a client-side rendered version of the page and bots get a server-side rendered version.

Make sure to read my complete guide on technical SEO for more techniques to improve how Google discovers, indexes, and renders your content.

Improve your local SEO

For businesses with local premises that customers can visit, local SEO is a must if you want to increase your store footfall.

While I’d recommend reading my complete Local SEO guide, here are some quick tips:

  • Set up/claim your Google My Business (GMB) Profile

  • Add your site to local directories like Yelp 

  • Keep your name, address, and phone number consistent across GMB, your site, and local directories

  • If you manage many locations, use a local SEO management tool like Moz Local

  • Provide a positive customer experience and collect GMB reviews from customers

  • Create localized pages on your site targeting specific areas you cover. For example, if you have a dentist practice in Vermont and New Hampshire, create an individual page for each, with H1s and title tags targeting "Dentists in [location]."

Monitor and analyze performance

Once you’ve put your hard work into perfecting your SEO, it doesn’t end there. 

Naturally, you’ll want to keep on top of how you’re performing. It’s easy to forget to check on your performance regularly when you have 101 other things to be doing, so make the most out of tools that do the hard work for you.

We’ll explore the different options next.

Google Analytics / Google Search Console

Google Analytics and Google Search Console will cover most of the insights you’ll need for how you’re acquiring users to your site for the organic channel.

Google Analytics

With Google Analytics, you can apply an organic segment to see all reports with organic-only data.

[Image: Google Analytics]

Alternatively, head to Acquisition » All Traffic » Channels and view the organic channel's acquisition data.

[Image: Google Analytics]

Rather than manually checking performance, you can also use the alerting system built into Google Analytics. Do this by clicking Admin, followed by Custom Alerts, then select "+ New Alert" and configure it based upon your requirements.

[Image: Google Analytics Custom Alerts]

Google Search Console

Google Search Console provides another layer of acquisition data due to it giving information on the specific queries users searched when they clicked your result. You can find these reports in the performance section.

[Image: Google Search Console performance menu]

Once in there, you’ll find a filterable table with a sample of queries users have used when finding your content.

[Image: Google Search Console]

Rank trackers

Rank trackers like Advanced Web Ranking enable you to monitor your position on the SERP for critical terms.

[Image: AWR search features]

Even a fall from position two to three could have a significant impact, so it’s important to spot these if and when they happen. When changes occur, a rank tracker will alert you to investigate as soon as possible.


You can do so much when it comes to SEO that it’s easy not to know where to begin! With this guide, even if you think you have your strategy perfected, you’ll hopefully find a technique or two for improving organic traffic that might not already be a part of your strategy.

Any techniques I’ve missed that work well for you? Tweet them to me.