It probably seems like the SEO techniques you need to know evolve each year. And that’s because they do!
As each year passes, Google gets more intelligent, and search results constantly change. If you want to keep up to speed, the techniques you use will need to evolve.
Fortunately, this article has you covered.
I’ll go over the top SEO techniques you need to know in 2022, all the way from basics to advanced.
People visiting your site want to read something not only relevant but also valuable for whatever they search for within a search engine. You should do a few key things with your content to fulfill their needs.
Building trust, showing expertise, and becoming an authority should be at your SEO strategy's core.
Expertise, authoritativeness, and trustworthiness (E-A-T) are the guiding principles Google uses when building and qualifying its algorithms, to make sure they are doing what they should.
That does not mean that E-A-T is a ranking factor. It does, however, mean Google is looking for measurable signals that align with E-A-T, such as high-quality content, a popular brand, or links from other reputable sites.
Becoming a brand that shows E-A-T has other benefits outside of ranking signals, including a huge impact on things such as the CTR you receive in search results.
For example, when you use a search engine, are you likely to pick a popular site you know has excellent content or one that you’ve never heard of before?
A great example is a SERP showing cake recipes.
Are users more likely to click the BBC Good Food or Waitrose result? Or the Baking Mad result?
Chances are, it’s going to be one of the first three sites. Users are more likely to recognize and trust these brands, as you can see from the branded search volumes of each site.
But how do you get to this point?
You can do certain things to better show E-A-T, but part of the answer is a mix of different types of marketing activity that goes beyond the scope of an SEO article.
If you want to better show E-A-T with your content, here are some tips:
While a simple point, it’s an important one.
The content you publish must have a purpose instead of being there for content’s sake.
There is no SEO benefit from creating a set number of articles at a specific word count that doesn’t fulfill your audience’s needs.
There is, however, a benefit from developing content that matches search intents, is comprehensive, and is of a higher quality than competing URLs.
John Mueller recently reiterated this point on Twitter.
You’ll often hear the question, “How many articles should we create a month for SEO?”
The answer is always, “As many as we can create while ensuring they’re better for users than competing ones.”
I regularly see this with my clients. Higher-quality, thorough content will outperform that of competitors who write more articles at higher word counts.
"Do we really need to create 20 - 30 articles monthly for SEO?"
Two brands have asked me that within four weeks on intro calls.
The answer is no.
I create four to five articles monthly for my largest content client; here's their growth over nine months.
Quality beats quantity.
— Sam Underwood 🇬🇧 (@SamUnderwoodUK) November 8, 2021
The inverted pyramid is a widely adopted way of writing where you start with essential information and end with the least important.
To break it down even further, it might look like this:
Analyze any well-written piece of content, and you’ll soon be familiar with this structure.
There are multiple ways to create engaging and compelling web content.
Some effective methods for blog articles are to:
If you’re creating a product or category page for ecommerce, consider:
In addition, it’s not just about the content itself; think about the layout and design.
Design plays a significant role in how users feel about a brand and how much they enjoy reading a piece.
You’ll quickly get the point if you disable CSS for a webpage.
While this is what a user wants from the content, its design isn’t especially inspiring or memorable. A competing site providing similar content could quickly improve upon this with better content presentation.
You should always create content that your audience is actively searching for.
You can validate you’re creating targeted content by:
In addition, making targeted content goes beyond what a user searches. A large part of creating valuable content is catering it to your audience.
There are several ways you can learn more about who you’re writing for:
As I mentioned, heading tags are a great way to incorporate structure into your content.
Not only do they break up blocks of text, but they are also an excellent way for your audience to find the part they want to read.
The concept for heading tags is simple: nest lower-level headings (like H3s) within higher-level headings (like H2s) to show how content is organized.
Here’s an example heading structure for an article:
H1 - Europe’s best attractions
H2 - France
H3 - Eiffel Tower
H3 - Notre-Dame
H2 - England
H3 - London Eye
H3 - Tower Of London
If it’s a substantial piece, including a table of contents with jump links can be handy.
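In HTML, the outline above, with hypothetical jump-link anchors for a table of contents, might look something like this (the id values are just illustrative):

```html
<h1>Europe’s best attractions</h1>
<nav>
  <!-- Table of contents: jump links pointing at section ids -->
  <a href="#france">France</a>
  <a href="#england">England</a>
</nav>
<h2 id="france">France</h2>
<h3>Eiffel Tower</h3>
<h3>Notre-Dame</h3>
<h2 id="england">England</h2>
<h3>London Eye</h3>
<h3>Tower Of London</h3>
```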
Google has a freshness algorithm that matches queries looking for new content with new articles, so it’s essential to ensure you’re keeping content up to date.
Not all content needs refreshing. If you’ve already covered a topic thoroughly and there is nothing more to add, there’s no need to update it, unless you see an opportunity to genuinely make the content better.
Fresh content is most important for news, trends, celebrities, or other ever-changing topics.
There are a few ways to ensure you’re always keeping your content fresh:
It doesn’t end once you’ve created high-quality content; next, you need to make sure it gets clicked in the search results.
The main ways to entice a user to click your result are:
Here’s how to get those optimized.
Title tags can persuade your audience to visit your site and read that piece of content.
For example, this title tag isn’t that exciting, and your CTR would likely mirror this.
"Different ways you can make money doing a side job | Brand Name"
However, consider this title tag:
"Growing Your Side Hustle [Ultimate Guide] | Brand Name"
It would be more likely to entice you to click.
Here are a few age-old tips for making title tags more compelling.
While all of these are SEO best practices, what works will vary for each search result. Make sure to try different ones and see what works.
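As a reminder, the title tag itself lives in the page’s <head>; using the earlier side hustle example, it would look like this:

```html
<head>
  <!-- Shown as the clickable headline in search results -->
  <title>Growing Your Side Hustle [Ultimate Guide] | Brand Name</title>
</head>
```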
SEOs often deprioritize meta descriptions, and you can understand why. Data from Ahrefs shows that Google rewrites meta descriptions in 63% of cases. This means even with custom meta descriptions, Google may decide they know better and create their own based upon your content.
Nonetheless, a well-written meta description can help improve your CTR by giving users valuable information about your page and enticing them to click your result.
Meta descriptions are the short piece of text under the title tag. They’re typically around 160 characters long on desktop and 120 characters on mobile, but this can vary.
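Like the title tag, the meta description is declared in the page’s <head>. A sketch with placeholder copy (the content here is invented for illustration):

```html
<head>
  <!-- Roughly 120–160 characters; note Google may rewrite it -->
  <meta name="description" content="Learn how to grow your side hustle with this step-by-step guide, from finding your first clients to scaling your income.">
</head>
```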
Here are some tips for writing better meta descriptions:
Here are the steps to help you obtain rich results like featured snippets.
First, check which rich results are showing for the targeted keyword and note the content returned. It’ll likely be one of these:
Next, mimic the type of snippet with your content, e.g., if it’s a table, you’ll likely need to match the query’s intent with a table.
While you’re answering the intent of the query, you’ll also need to follow a few best practices, such as:
Adding structured data is a great way to help search engines better understand your content and enhance your listing with search results.
Structured data explicitly tells search engines about your content via code added to the page. This increased understanding means search engines can include additional information about your webpage when a user searches for content like yours.
A great example of these is Recipe rich results.
The additional information such as the reviews, ingredients, and time to cook comes from the recipe structured data each site has added.
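As a rough sketch (all values here are invented for illustration), recipe structured data is typically added as JSON-LD in the page’s <head>:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Victoria Sponge Cake",
  "totalTime": "PT50M",
  "recipeIngredient": ["200g caster sugar", "200g butter", "4 eggs"],
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "ratingCount": "312"
  }
}
</script>
```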
Another example for ecommerce sites is Product rich results.
Providing pricing and stock information within structured data on your site gives Google the confidence to show that information within the search results, potentially positively impacting your CTR.
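A minimal Product sketch, again with made-up values, showing the pricing and stock fields mentioned above:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Waterproof Leather Boots",
  "offers": {
    "@type": "Offer",
    "price": "89.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```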
How you organize content and internally link to it significantly impacts your site's search performance.
If you’re not incorporating logical site hierarchies and internal links into your strategy, you could be missing out on the many benefits that come with them.
Here are some techniques to improve this aspect of your SEO strategy.
Search engines are much better at understanding sites built around topics and subtopics.
Internal links are a crucial way to show how you’ve organized your content. By building clusters of content where broader parent topics internally link to more specific subtopics, you’ll help search engines better understand your site.
A simple example to explain this concept would be a travel site.
You’ll often find these sites create logical hubs of content based upon a country, which then have internal links to subpages, such as cities or towns within that country. The city and town pages could have internal links to landmarks or attractions within them.
Building sites in this manner also helps:
A good URL should:
A good URL, such as example.com/womens/shoes/waterproof-boots, is short, readable, and shows how the site is organized.
In contrast, a URL with several cryptic directories is challenging to read and hides the site structure.
A breadcrumb shows where the current URL sits within the site hierarchy. You’ll usually find them at the start of the content or product description section, and they look something like this:
Following our Women’s Boots URL example, a breadcrumb for that site would look something like this:
Women’s > Shoes > Boots > Waterproof
Usually, your breadcrumbs and URL structure should be pretty similar in that they both indicate the site's hierarchy of content.
Sometimes you may want to omit parent pages from your URL structure to avoid having a large number of subdirectories, which can be more challenging for a user to read. It’s important to note this doesn’t directly impact SEO; still, you should always try to keep URLs as readable as possible.
In these cases, you may exclude specific directories from the URL, but how you’ve organized content should always be correct within your breadcrumbs.
In addition to being helpful for users, they also give Google clues on the structure of your site, especially if you add the associated breadcrumb structured data.
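A sketch of what the associated breadcrumb structured data for the Women’s > Shoes > Boots > Waterproof trail might look like, as JSON-LD (the URLs are hypothetical):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Women’s", "item": "https://example.com/womens" },
    { "@type": "ListItem", "position": 2, "name": "Shoes", "item": "https://example.com/womens/shoes" },
    { "@type": "ListItem", "position": 3, "name": "Boots", "item": "https://example.com/womens/shoes/boots" },
    { "@type": "ListItem", "position": 4, "name": "Waterproof", "item": "https://example.com/womens/shoes/boots/waterproof" }
  ]
}
</script>
```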
Sitewide links included in your header and footer are still a key focus for SEO. Having a URL in one of these areas will signal to Google its importance and increase the amount of link equity passed to it.
They’re also essential for UX, allowing users to quickly access popular areas of the site that may be deeper within the site hierarchy.
In a few steps, you can spot opportunities to add high-importance URLs using Sitebulb.
First, set up your Sitebulb crawl to pull data from Google Analytics and Google Search Console (GSC).
Next, head to the URL explorer report in the top bar.
Then head to Directives > Indexable.
Now you’ll want to adjust the columns to include only:
Now sort by "Total Impressions" and spot high-impression URLs with a low number of internal links or a higher crawl depth.
Review these URLs to see which are most important to your business, then incorporate them as internal links within your header or footer.
Anchor text is the text used whenever you add a hyperlink. It gives vital information to search engines and users about what they’ll find on the page the link is pointing to.
Descriptive anchor text is beneficial for accessibility. Without context, your audience is less likely to click anchor text such as "click here" or "more info."
Longer, more descriptive anchor text gives Google more context, which can positively impact the rankings of the destination page.
Here’s an example.
If the hyperlink is to a page on Japanese restaurants in Osaka, the following anchor text is an example of being both relevant and descriptive:
"Some of the best Japanese restaurants within Osaka."
However, you’ll likely want to spot these opportunities in bulk. Again, a great way to do this is with Sitebulb, thanks to their "Link Explorer."
First, head to the link explorer report from the top bar.
Use the search bar to filter for the target URL you want to optimize internal anchor text for.
Examine the anchor text used; is it all descriptive? If you see a large portion of anchor text that uses non-descriptive phrases like "here" or "this article," this highlights an optimization opportunity.
A user-friendly site can't make up for poor content, yet good content can't entirely make up for a poorly functioning site.
Instead, a good-looking site and quality content go hand in hand to provide an excellent experience.
Having a poorly designed or functioning site can also impact SEO: in some cases directly, via algorithms such as Core Web Vitals; other times indirectly, with a poor site appearance limiting the site's ability to perform well for direct ranking factors, like links (something I believe John Mueller is alluding to here).
Here are some ways to improve user experience and your organic traffic.
Mobile-first refers to prioritizing a mobile site's design and user experience over a desktop. Essentially, websites should be created with the mobile user’s experience first and then adapted to larger screens.
This reflects the way users consume content on the web nowadays and how Google has changed how they crawl and index the web’s content.
Mobile-first SEO has no significant changes from what you’d do for desktop, with the largest consideration being that you have parity between the two when it comes to:
The best way to ensure parity is simply to use a responsive design.
This way, the same HTML code is used no matter the device, with the only changes being visual via CSS media queries.
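A minimal sketch of that approach: one HTML document whose layout adapts to larger screens via a CSS media query (the class name and breakpoint are invented for illustration):

```html
<style>
  /* Mobile-first: single column by default */
  .products { display: grid; grid-template-columns: 1fr; }

  /* Larger screens get more columns via a media query */
  @media (min-width: 768px) {
    .products { grid-template-columns: repeat(3, 1fr); }
  }
</style>
<div class="products">...</div>
```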
Suppose you use an alternative method of implementing device-optimized sites, such as separate URLs or dynamic serving. In that case, you’ll then need to start doing parity audits, which tend to be complicated and time-consuming.
Core Web Vitals are metrics created by Google to measure the user experience of a webpage.
In 2021, Core Web Vitals became a ranking factor (albeit a minor one), meaning improving them is one technique to improve organic traffic.
Google breaks Core Web Vitals down into three key areas, each with different ways to improve them.
The difference between a two-second and four-second load time can have a massive impact on how many of your users drop off.
The Largest Contentful Paint (LCP) Core Web Vital measures loading performance by indicating how long the most prominent element in the browser viewport took to load.
In addition to how long it takes to load a page visually, how quickly the browser responds to a user interacting with the content is also essential.
First Input Delay (FID) measures this by evaluating the time between a user’s first interaction with a page and the moment the browser is able to begin responding to that interaction.
The final measure, called Cumulative Layout Shift (CLS), measures the content’s stability while the page is loading — that annoying reshuffling of page elements right as you're about to click. There are great video examples of what the CLS metric aims to solve on Google’s web.dev site.
Issues here usually happen due to browsers parsing CSS in a particular order. Browsers parse some CSS after the user can see the webpage, causing layout changes as new styles are applied.
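One common CLS fix, for example, is reserving space for images before they load by declaring their dimensions (the values here are illustrative):

```html
<!-- width and height let the browser reserve space, so content below doesn't shift when the image loads -->
<img src="/images/hero.jpg" width="1200" height="600" alt="Hero image">
```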
A good place to start to learn more is my Core Web Vitals article. I’d also definitely recommend reading Google’s dedicated articles on tactics to optimize each metric:
Unique content above the fold is an important usability and SEO fundamental.
The technique here is simple: prioritize important content higher up the page and ensure you summarize the page above the fold. That includes both your H1 tag as well as content summarizing the page.
There is nothing more frustrating than clicking somewhere on a page, only for an interstitial to show at that exact time, and before you know it, you’re on a site you didn’t want to be on.
That is called an intrusive interstitial, and Google has an algorithm that detects these and demotes sites that use them.
Ads may be a part of how you make your revenue, but providing a frustrating user experience to surface those ads will not help your brand’s perception or search traffic.
Sometimes non-ad interstitials are necessary — for example, a cookie-usage interstitial.
Or even an age-verification interstitial.
You don’t have to worry as much about these as you won’t be penalized for them.
If you can’t get rid of ads, instead make sure they don’t take up too much screen space.
This is much less in-your-face and annoying!
While you should write content with your audience in mind, you still need to do your due diligence to help Google discover and serve that content to users.
This is the part technical SEO has to play in your SEO strategy. Here are some key techniques to ensure technical SEO fundamentals are covered.
Crawl budget is the limit a search engine places on how many of your site's URLs it will crawl. It varies from site to site, based on several factors such as the site's speed and popularity.
If you don’t have a huge site (over 1 million URLs), optimizing the crawling of your site is unlikely to be an SEO technique you’ll spend much time on.
If you’d like to verify whether it should be a focus, GSC or an auditing tool can help by simulating how a search engine crawls your site, showing you the number of URLs discovered.
In Sitebulb, it’s as simple as setting up a crawl then checking the total number of URLs crawled on the overview page.
If you spot a large number of crawlable URLs, you’ll want to use the robots.txt to manage the crawling of your site.
The robots.txt is a file that gives directives on the URLs web crawlers can and cannot crawl on your site.
Good crawlers obey it to the letter, including search engines like Bing and Google.
The contents of the file will look a bit like this:
User-agent: *
Allow: /
Disallow: *?color
Sitemap: https://www.example.com/sitemap.xml
If you’re looking to manage which URLs bots can crawl, you’ll likely be adding disallow rules like the one above to the file to instruct search engines not to crawl a URL.
For example, within Sitebulb, if you selected the list of "Crawled" URLs shown earlier, you could scan the table returned and spot that the crawler found a large number of URLs with a "?size=" parameter.
To then exclude that from being crawled, you’d add a wildcard rule like this:
Disallow: *?size=
Ensuring quality indexing is a fundamental technique for improving organic traffic. Google evaluates your site based on the URLs indexed, as explained by John Mueller in this webmaster hangout.
A quick and easy way to understand whether you have indexing issues starts with having XML sitemaps that reflect all important URLs you want indexing on your site.
First, go to the Sitemaps section of GSC to make sure you've submitted an XML sitemap.
Next, go to the coverage reports, select "Valid" on the chart, and look at the Details table.
There is a significant difference between the two rows shown in this report.
Fixing indexing first requires you to understand the URLs indexed that shouldn’t be.
You can get a sample by selecting the "Indexed, not submitted in sitemap" row of the table in the earlier screenshot.
For small sites, that data may be enough, but the report is limited to 1,000 URLs. Sometimes you’ll need more data to understand the patterns of URLs that are causing indexing issues.
In that case, use a crawler like Sitebulb and go to the "Indexability" report in the left sidebar.
Then you’ll want to click on the green area of the "Indexability Status" chart.
Sitebulb will then return a list of all the indexable URLs found on your site.
Next, go through these URLs and try to spot any that are indexable but should not be.
Once you’ve spotted the issue, you’ll want to get those URLs out of the index. There are two ways you can do that.
You can use canonical tags to consolidate duplicate pages and all ranking signals of those pages into one authoritative page.
That means that you should only be using a canonical tag to solve your indexing issues if the pages causing the problem are duplicates of each other.
Common causes for indexing issues from duplicate content tend to be sites adding URL parameters or the CMS generating more than one URL path to access a piece of content.
An added canonical tag looks like this:
<link rel="canonical" href="https://example.com/shoes/leather" />
So if you had a duplicate URL caused by parameters, such as https://example.com/shoes/leather?color=black:
You’d correct it by adding a canonical tag on the URL with the parameters, pointing to the URL without them.
In some cases, a 301 redirect may make more sense to consolidate ranking signals between duplicates, such as when choosing a canonical hostname (e.g., non-www/www or HTTP/HTTPS).
Generally, you should opt for a 301 if the user should be redirected to the canonical URL rather than simply consolidating ranking signals between the duplicates.
Suppose duplication isn’t causing your indexing issues. Instead, it’s just a CMS generating a page that doesn’t add value for search (a good example is WordPress creating media attachment pages). In that case, you’ll want to use the noindex tag.
The noindex tag tells search engines to stop the indexing part of their processing pipeline, resulting in the URL not being used when evaluating your site's quality.
The noindex tag is added to the <head> of the page and looks like this:
<meta name="robots" content="noindex">
It’s important to note that if you add the noindex tag, you also need to ensure you haven’t blocked crawling of that page with the robots.txt, at least until a search engine has removed the page from its index.
If you have, the bot won’t access the page, resulting in the noindex tag not being seen and the URL not being removed from their index.
To test whether Google is rendering your content fully, use GSC's URL inspection tool.
Accessing this tool is as simple as searching a URL within the bar at the top.
Once you’ve done that, you’ll then want to select the test live URL button.
Googlebot will fetch and render your page and let you view the tested page.
You’ll then want to view the screenshot. Does it accurately reflect your page?
If you know you’re injecting content with JS, and the screenshot is missing key content, you may have issues with bots not seeing critical content on the page, negatively impacting how you rank.
Correcting these issues involves either server-side rendering or statically generating content (NextJS is an excellent option for React). You could also implement dynamic rendering, where users get a client-side rendered version of the page, and bots get a server-side rendered version.
Make sure to read my complete guide on technical SEO for more techniques to improve how Google discovers, indexes, and renders your content.
For businesses with local premises that customers can visit, local SEO is a must if you want to increase your store footfall.
While I’d recommend reading my complete Local SEO guide, here are some quick tips:
Once you’ve put your hard work into perfecting your SEO, it doesn’t end there.
Naturally, you’ll want to keep on top of how you’re performing. It’s easy to forget to check on your performance regularly when you have 101 other things to be doing, so make the most out of tools that do the hard work for you.
We’ll explore the different options next.
Google Analytics and Google Search Console will cover most of the insights you’ll need for how you’re acquiring users to your site for the organic channel.
With Google Analytics, you can apply an organic segment to see all reports with organic-only data.
Alternatively, head to Acquisition » All Traffic » Channels and view the organic channel's acquisition data.
Rather than manually checking performance, you can also use the alerting system built into Google Analytics. Do this by clicking Admin, followed by Custom Alerts, and select +new alert and configure based upon your requirements.
Google Search Console provides another layer of acquisition data due to it giving information on the specific queries users searched when they clicked your result. You can find these reports in the performance section.
Once in there, you’ll find a filterable table with a sample of queries users have used when finding your content.
Rank trackers like Advanced Web Ranking enable you to monitor your position on the SERP for critical terms.
Even a fall from position two to three could have a significant impact, so it’s important to spot these if and when they happen. When changes occur, a rank tracker will alert you to investigate as soon as possible.
You can do so much when it comes to SEO that it’s easy not to know where to begin! With this guide, even if you think you have your strategy perfected, you’ll hopefully find a technique or two for improving organic traffic that might not already be a part of your strategy.
Any techniques I’ve missed that work well for you? Tweet them to me.