SEO visibility is a handy metric that helps you understand how the changes you make affect where you rank.
Since it doesn’t directly measure what matters to a business, like revenue and traffic, SEO visibility is often misunderstood and disregarded.
However, as an SEO, you need to know how it works and its importance in measuring your work.
What is SEO visibility?
SEO visibility measures how visible you are within search results over time. It’s commonly used alongside other performance indicators to understand how well your SEO strategy’s tactics are working.
The better you rank for a term, the higher your visibility.
However, SEO visibility is based upon ranking, not upon traffic or users visiting your website. High visibility creates the potential for traffic but doesn’t generate traffic by itself.
If searchers are not visiting your site from that SERP, or perhaps not even searching for that term, you could have a better visibility score but not see an increase in traffic or revenue.
In some formulations of the SEO visibility metric, search demand is part of the equation — a higher score is given for queries with a higher search volume.
Your SEO visibility score will be higher in these situations if you rank better within a popular SERP.
Why is SEO visibility useful?
SEOs mainly try to generate more traffic to a website by improving where it ranks.
And in general, higher rankings = more traffic.
But more traffic ≠ higher rankings.
You could be receiving more traffic because of:
Higher demand for your product/service.
Increased clickthrough rates not caused by an SEO-related change, e.g., increased brand recognition.
With so many elements outside an SEO’s control, SEO visibility provides a way to measure the impact of an SEO strategy while minimizing the influence of non-SEO activities.
How is SEO visibility calculated?
If you’re already confused about SEO visibility, it’s made even more confusing by the fact that different tools calculate it differently.
There are four types of SEO visibility metrics you’ll see across tools:
Points-based: You receive 30 points for ranking first, 29 points for second, and so on. This system doesn’t weight points by the keyword’s search volume and is highly influenced by the number of keywords in the dataset: by adding more keywords, you’ll artificially inflate your score.
Percentage-based: This methodology considers your current position and the query’s search volume and then estimates the clicks you’ll receive based upon an estimated CTR curve. It’ll then give you a percentage based upon that compared to the total clicks available for that SERP (examples include Moz, Ahrefs, and Advanced Web Ranking).
An index: Similar to a points or percentage system, but presented as an index, a single metric used to track a site’s performance over time. These are often weighted by the keyword’s estimated search volume (examples include Searchmetrics and Sistrix).
Pixel-based: This is a relatively new way to measure visibility, created in response to the increase in SERP features, which makes CTR curves for classic listings vary significantly between search queries. By considering how far down the SERP your listing appears, measured in pixels, this metric better indicates how visible you actually are. Pixel-based measurement is included in tools like Advanced Web Ranking, Myposeo, and Rank Ranger.
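To make the differences concrete, here’s a minimal sketch of a points-based score and a volume-weighted percentage-based score. The 30-point scale, the CTR curve values, and the keyword data are illustrative assumptions, not any vendor’s actual figures:

```python
# Hypothetical CTR curve: estimated clickthrough rate by ranking position.
# Real curves vary per SERP; these values are illustrative only.
CTR_CURVE = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def points_visibility(rankings):
    """Points-based: 30 points for position 1, 29 for position 2, ... 1 for 30.
    Ignores search volume, so adding keywords to the dataset inflates the score."""
    return sum(max(0, 31 - pos) for pos in rankings.values())

def percentage_visibility(rankings, volumes):
    """Percentage-based: estimated clicks captured vs. the clicks a #1 ranking
    would capture across the same keywords, weighted by search volume."""
    captured = sum(volumes[kw] * CTR_CURVE.get(pos, 0.0)
                   for kw, pos in rankings.items())
    available = sum(volumes[kw] * max(CTR_CURVE.values()) for kw in rankings)
    return 100 * captured / available if available else 0.0

rankings = {"seo visibility": 2, "rank tracker": 5, "ctr curve": 12}
volumes = {"seo visibility": 1000, "rank tracker": 5000, "ctr curve": 200}

print(points_visibility(rankings))                       # 74
print(round(percentage_visibility(rankings, volumes), 1))  # 21.5
```

Note how a low-volume keyword ranking at position 12 still adds 19 points to the points-based score while contributing nothing to the percentage-based score, which is exactly the divergence described above.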
Using visibility metrics
While you may think the best systems for measuring visibility are the ones that consider volumes (as they better measure the number of users seeing your listing), you can use different visibility metrics together to understand performance better.
For example, if the points-based visibility metric goes up, but the percentage-based metric doesn’t go up, this likely means you’re ranking better, but for keywords that either aren’t high search volume or aren’t ranking well enough yet to gain clicks.
If you see percentage-based metrics go up, but your points-based visibility stays the same, it’s likely that search volumes have increased. So you’re getting more clicks, but that isn’t because you’ve improved where you rank.
The difficulty you’ll have with any metric based upon search volumes is that search volume metrics can be wildly inaccurate.
This isn’t an issue specific to any vendor, but more due to them not having access to live search data like you can find within Google Search Console.
Instead, they all rely on data from third-party sources, and the data is often averaged over the past 12 months, meaning any recent search trends won’t show within these visibility metrics.
While this seems like a pitfall, you can still use this to understand your performance better. Suppose you see that your traffic figures within Google Analytics or Google Search Console have dropped/increased, but visibility metrics have stayed the same. In that case, the cause for the drop is likely from changes in search trends, not because your site’s rankings have changed.
Given this information, whatever tool you decide to use, you’ll need to understand how they source search volumes and how frequently the tool refreshes that data.
SEO metrics compared
Now that we’ve defined SEO visibility, how does it compare to other SEO metrics, like those that show real traffic figures?
Here is an overview.
Organic sessions (Google Analytics): The number of sessions started by a user from a search engine, as measured by a tool like Google Analytics that you set up on your own site.
Clicks (Google Search Console): The number of clicks from Google Search, as measured by Google’s internal data.
Estimated traffic: Calculated by multiplying a query’s search volume by an expected CTR based upon your ranking position, often provided by third-party tools like Semrush, Ahrefs, Advanced Web Ranking, and more.
Causes of visibility changes
Almost anything that affects SEO can impact your visibility, which leaves many things to check.
If you’re diagnosing an organic traffic drop, start with my guide.
If you just want to know some common causes of changes, read on.
#1 Technical issues
Technical issues are a likely factor in SEO visibility changes; some of the most common include:
Canonical tags are placed within HTTP headers or the <head> of the page.
They help Google understand which page to consolidate ranking signals to when several pages are duplicates of one another.
A common issue with canonical tags is referencing the wrong page. If this is contributing to a drop in visibility, it’s easy to fix if you have a developer on hand.
Unfortunately, there are several common mistakes when using canonical tags, including:
Incorrectly including parameters in canonical tags.
Having canonical chains (a canonical tag pointing to a page that itself canonicalizes to another page).
Having canonical tags pointing to page one of a paginated sequence, rather than being self-referencing.
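If you have a crawl export mapping each URL to its canonical target, canonical chains are easy to detect programmatically. A minimal sketch, where the URL-to-canonical mapping and the helper name are hypothetical:

```python
def find_canonical_chains(canonicals):
    """Given a {url: canonical_url} mapping from a crawl, return URLs whose
    canonical target itself canonicalizes to a different URL (a chain)."""
    chains = []
    for url, target in canonicals.items():
        next_target = canonicals.get(target, target)
        if target != url and next_target != target:
            chains.append((url, target, next_target))
    return chains

crawl = {
    "/a": "/b",   # /a canonicalizes to /b ...
    "/b": "/c",   # ... which canonicalizes to /c: a chain.
    "/c": "/c",   # self-referencing: fine.
}
print(find_canonical_chains(crawl))  # [('/a', '/b', '/c')]
```

Any tuples returned are chains to flatten: point the first URL’s canonical directly at the final target.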
The robots.txt is used to control how bots crawl your website.
It doesn’t impact indexing in search engines but is particularly useful for sites that create large numbers of crawlable URLs that either aren’t:
Useful for bots to crawl
Indexable, and therefore not worth crawling
In general, the robots.txt will impact your visibility if you’re blocking the crawling of the wrong pages.
This usually happens if a developer blocks crawling via robots.txt in staging and then pushes that live without removing the blocking, or you add a robots.txt rule that is too broad and inadvertently blocks the wrong page (using something like a wildcard).
You can check for issues using Google Search Console’s robots.txt tool. Enter the URLs that concern you, and you’ll quickly identify any problems.
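You can also test rules in bulk with Python’s standard-library robots.txt parser, which applies simple prefix matching (note it doesn’t support Google’s wildcard extensions, so wildcard rules need a different tool). A minimal sketch with an inline, hypothetical rules string:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: a broad prefix rule that blocks all /search URLs.
rules = """
User-agent: *
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

for url in ("https://example.com/search?q=shoes",
            "https://example.com/products/red-shoes"):
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if allowed else "blocked")
```

Swapping in your live robots.txt and a list of important URLs gives you a quick regression check after any deploy, which is exactly when staging rules tend to slip through.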
Another common error when moving a site from staging to live is incorrectly having a noindex tag.
If this happens, the page will not be indexed and shown within search results, impacting how visible you are (obviously).
To check for noindex tags, I use SEO Peek, a Chrome extension that highlights key SEO tags in the page’s rendered source.
This will quickly reveal any problems with HTTP headers and tags.
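To script the same check across many pages, a small parser can flag a robots meta noindex in the rendered HTML. A minimal sketch using Python’s standard library (a noindex set via the X-Robots-Tag HTTP header would need checking separately):

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Flags <meta name="robots" content="...noindex..."> in a page's HTML."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name", "").lower() == "robots"
                    and "noindex" in attrs.get("content", "").lower()):
                self.noindex = True

page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
checker = NoindexChecker()
checker.feed(page)
print(checker.noindex)  # True
```

Feeding this the rendered source of each important URL after a deploy will quickly catch a staging noindex that made it to production.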
Core Web Vitals are a set of speed and UX metrics shown within Lighthouse and other tools (like PageSpeed Insights); since June 2021, web vitals are also a ranking signal within Google.
If you’ve seen visibility incrementally decrease between that date and August 2021, you may have been negatively impacted by the update. Sistrix has shown that declines from sites failing the Core Web Vitals test have, in fact, been measurable using their own visibility metric.
A common cause of decreased visibility comes from changes during a redesign and/or URL migration.
This can be caused either by changes to the site’s content, structure, and internal linking, or by a change in URLs that has confused Google.
In the case of URL migrations, the best way to avoid a drop is to not change URLs at all.
Unless there is a very good reason to change them (e.g., they’re not human readable or don’t follow multiple best practices), URL migrations can introduce unnecessary risk.
But, they’re often unavoidable due to other factors such as a change in CMS occurring at the same time which enforces a certain URL structure.
When you can’t avoid URL changes, attempt to isolate each change. If you can, it’s also best not to change the entire site in one go. For example, if you run an ecommerce site, try redirecting product pages first, waiting for that to settle, then redesigning the page, and then repeating the process for categories.
Separating out changes gives more clarity about what caused a change in visibility. It also gives Google less to reevaluate, reducing the chance of ranking volatility from the change.
Internal linking changes
Internal linking has a significant impact on what URLs Google understands to be important and related to one another.
Carefully consider changes that you make to links within your header/footer and on high-authority pages like the homepage.
A common cause for a drop in visibility after an internal linking change would be:
Moving a high-importance page further away from the homepage.
Reducing the number of internal links to important pages.
One of the best ways to validate whether this is an issue after a change is to evaluate click depth from the homepage using a tool like Sitebulb.
If you know that before your change most URLs were two clicks from the homepage, and afterward most URLs are four clicks away, you’ve likely built a deep site structure that will negatively impact SEO.
Further analysis can be done by using the URL explorer and looking at the number of internal links to a URL vs. the number of clicks.
If you spot after a change that an important page has lots of clicks but not many internal links, it’ll likely perform worse because of it.
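The click-depth idea is straightforward to compute yourself from crawl data: a breadth-first search from the homepage over the internal link graph gives each page’s minimum click depth. A minimal sketch with a hypothetical link graph:

```python
from collections import deque

def click_depths(links, start="/"):
    """BFS over an internal link graph {page: [linked pages]} to find the
    minimum number of clicks from the homepage to each reachable page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

links = {
    "/": ["/category", "/about"],
    "/category": ["/category/page-2"],
    "/category/page-2": ["/product"],
}
print(click_depths(links))
# {'/': 0, '/category': 1, '/about': 1, '/category/page-2': 2, '/product': 3}
```

Running this on crawls from before and after a change shows exactly which important pages moved deeper, and any page missing from the result is orphaned from the homepage entirely.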
Perhaps this is the first thing you’ve checked, and quite rightly so. Server errors occur a lot, and if Google tries to crawl your site when the server is down or slow to respond, you could see a drop in visibility.
If you want to spot server errors quickly, your log files will be the best source of truth. Most hosting providers can provide access to these; if they don’t, you can also look into setting up a CDN that works as a proxy (like Cloudflare) then use the popular Logflare app. This also helps with site speed, and thus Core Web Vitals.
If that isn’t an option, another way to spot server errors is via Google Search Console. Within the "coverage" reports of the tool, Google will report any server errors it finds when crawling.
The downside? There is a delay in receiving these alerts, so you’ll likely be alerted well after the fact if it’s a temporary issue. The result is you’ll see a drop in traffic from the issues and then you’ll have to wait for Google to recrawl the site before recovering the lost visibility.
#2 Algorithm changes
If you’re sure your site is technically sound, a shift in visibility could also be due to dreaded algorithm changes.
Algorithm updates happen daily. While they’re not always announced, some are more noticeable than others.
They can be great and have a positive impact on visibility. Still, the opposite can also happen, which can be pretty infuriating if you’ve been working hard to improve your site!
There is a lot of industry chatter following major updates; think Penguin, Panda, Hummingbird, or even recent core updates. Checking industry news sites and SEO communities is a good first step.
You can also monitor various tools that measure the volatility of search results.
You can also use tools that overlay known algorithm updates with your analytics data, with a popular choice being the Panguin tool.
#3 Increases in competition
You may have checked all of the above and are still scratching your head as to why your visibility has changed.
Sometimes, it’s not you at all; your competitors might have made a change that made them grow in visibility, resulting in a drop for you, or vice versa!
In these scenarios, you want to use a rank tracker to monitor where your competitors rank daily. Advanced Web Ranking does this across various SEO ranking reports, letting you measure visibility changes as well as individual position changes for specific keywords.
Tactics to grow your visibility
You can do many things to grow visibility, more than we could include in just this one article. To help, I’ve curated some of my deep-dives into various tactics that’ll help you grow.
SEO visibility can sound complicated, but once you’ve got your head around what it means, you can understand the importance of using it as a metric.
If you notice a drop in search engine visibility, it can be puzzling, not to mention frustrating. However, if you follow this guide and explore the different possibilities, you can get to the bottom of the problem more quickly, allowing you to take action much sooner.