Most business owners and webmasters today understand that SEO plays a key role in their visibility online. I’m guessing you do too, or you wouldn’t be here.
Data collected by BrightEdge found that an average of 51% of a site’s traffic comes from organic search – far higher than any other traffic source.
This means a well-balanced strategy of on-page and off-page SEO is critical for anyone who wants their website to be seen and, more importantly, clicked on.
But SEO isn’t the only factor playing a part in your website’s success.
User experience is pretty damn important too.
According to Think With Google, “79% of people who don’t like what they find on one site will go back and search on another site.”
In other words: it doesn’t matter how your site appears to search engines; if visitors aren’t getting what they want, your website isn’t working.
In an ideal world this wouldn’t be an issue: search engine and UX optimization would work in harmony. But it’s not always that simple.
Sometimes our desire to give the search engines what they want can get in the way of the user experience.
That’s not good. Both from a user perspective, and, in a roundabout way, from the perspective of search engines.
You have to remember what the search engines’ goals are. They want to make money, of course. But to do that, people have to be using their search engine.
And why do people choose to use a search engine?
Because it provides them with relevant, useful results.
The fact is, you might think your SEO campaign is doing the trick, but search engines today – Google, in particular – are very clever. Google uses the time spent on a page as a signal of its relevance. If users are frequently landing on a page of your site, and quickly returning to the search results, Google takes this as a sign that something is wrong.
This is why it’s so important your SEO and UX optimization work together.
If you fail to take into account the impact your SEO campaign has on your user experience, you could be doing your site more harm than good.
Let’s take a look at a few ways your SEO could be hurting your user experience.
Keyword Stuffing

Keywords are an important part of any SEO campaign. Ensuring the right keywords are included in the right parts of a page provides valuable signals to Google about that page’s subject matter and the search terms it should be ranking for.
Unfortunately, an overzealous keyword strategy can cause problems for your users (and, in turn, for you).
Pages that have been over-optimized suck to read. The copy feels unnatural and, frankly, looks highly unprofessional.
Blatant keyword stuffing is an extreme example, but you don’t have to go to such lengths to write copy that feels unnatural. Shoehorning even just one or two keywords into your page copy is poor practice.
In short: keywords are great, but only when they fit seamlessly into the surrounding text.
Thankfully Google is intelligent enough these days to understand the context of a page without keywords being shoved in its face.
Google understands the semantics of keywords and the relationship that words have to each other. If you’re writing quality, relevant copy you should find that you naturally provide Google with the signals it needs to rank your page.
Perform keyword research, by all means, but use the results to guide the content of the page, rather than to devise an exact match keyword strategy.
Landing Page Copy
Increasing numbers of SEOs are taking keyword stuffing out of their toolbox. Unfortunately, in some cases its replacement involves writing unnecessarily long copy.
My guess is that these SEOs, understanding how today’s Google analyzes the content of a page, have decided that to get that page ranking they need to give Google as much information as possible.
I can certainly see their logic, but where does this leave the user?
Visitors to your site want to find information as quickly as possible. They don’t want to have to trawl through hundreds of words of fluff in order to get an answer.
If the subject matter lends itself to long content, then by all means, write away. It’s normal for blog posts to run to a few thousand words and so long as each word exists for a reason, that’s okay.
Copy elsewhere on your site, particularly on product, service and category pages, serves a very specific purpose. Providing visitors with the information they need to convert is far more important than providing search engines with hundreds of words to analyze.
Internal Linking

In 2012, Google began to penalize sites for aggressive use of anchor text in external links. That’s understandable – the practice was manipulative and made Google’s job harder.
Internal linking is different. We have pretty much free rein over how we link between the pages of our site and the anchor text we use.
In fact, Google’s only real guideline on internal linking is to “keep the links on a given page to a reasonable number.”
However, just because we can do what we want doesn’t mean we should.
Although the anchor text of internal links doesn’t carry the same weight as that of an external link, most signs point to it being a ranking factor.
In my experience, internal linking carries real weight in Google’s algorithm. I work with startups, which gives me the chance to run varied tests on completely new websites, and I’ve seen sites start ranking for keywords used in the anchor text of their internal links before any external links pointed to them. These weren’t highly competitive terms, of course, but it shows that internal linking is an important part of SEO.
This could tempt webmasters to exploit the power of an internal link, and I wouldn’t blame them. But it’s important to remember how these links, and their anchor text, could affect user experience.
Anchor text should accurately reflect the content of the page the link points to. Using an exact-match keyword in place of something more descriptive makes the link less enticing to click and risks misleading those who do click it.
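As a hypothetical illustration (the URL and wording are invented for the example), here is the same internal link written with exact-match and with descriptive anchor text:

```html
<!-- Exact-match anchor text: tells the user little about the destination -->
<a href="/guides/keyword-research">keyword research tool</a>

<!-- Descriptive anchor text: sets an accurate expectation of the page -->
<a href="/guides/keyword-research">our step-by-step guide to keyword research</a>
```

Both links pass the same signals about the destination URL; only the second gives the visitor a reason to click and an accurate idea of what they’ll find.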
Redirects

At some point, most sites will have to use redirects. They’re generally used when the URL of a page changes, or when a site moves to a new domain.
The redirect ensures any visitors to the old URL are taken to the new version – this includes search engine spiders.
A 301 (permanent) redirect will cause search engine spiders to index the new version of the page, drop the old version from their index, and pass the link equity from the old URL to the new one.
A 302 should be used only when the change is temporary.
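As a sketch of what a correct, one-to-one redirect mapping might look like (the paths and domain are hypothetical), here it is in an Apache .htaccess file:

```apache
# Hypothetical one-to-one mapping: each old URL redirects to its
# direct replacement, never to a catch-all like the homepage.
Redirect 301 /old-shop/blue-widget https://example.com/products/blue-widget
Redirect 301 /old-shop/red-widget  https://example.com/products/red-widget

# A 302 for a change that is genuinely temporary.
Redirect 302 /sale https://example.com/summer-sale
```

The same mapping could be expressed in nginx or in your CMS’s redirect settings; the point is that each old URL resolves to its closest equivalent on the new structure.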
Redirects are a necessary feature of the web, but they can become problematic if misused.
The most common (and most problematic) misuse of a 301 is to redirect all old URLs to a single URL (generally a site’s homepage).
This is bad for the user because they wind up on a page that’s very different to the one they were looking for. It’s bad for your SEO for a similar reason: search engines only want to serve relevant content to their users, so if you’re sending visitors to a location that’s very different to what they want to find, you have a problem.
In fact, Google representatives have suggested that bulk redirects to a site’s homepage may be treated as 404s, or soft 404s at best. Instead of passing link equity through the 301, Google may simply drop the old URLs from its index without passing any equity at all.
Another UX-killing redirect error is to use a 301 in place of a canonical tag.
Canonical tags are generally used when a site has duplicate, or near-duplicate content across multiple pages.
This often occurs when a site stocks a number of very similar products, or the same product appears in multiple categories.
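A canonical tag is a single line in the head of each duplicate page. Unlike a 301, it leaves visitors on the page they chose while telling search engines which version to index (the URL below is hypothetical):

```html
<!-- In the <head> of each near-duplicate product page -->
<link rel="canonical" href="https://example.com/products/blue-widget">
```

Every variant of the page carries the same canonical URL, so search engines consolidate their signals on the master version while users browse whichever variant they found.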
Problems arise when webmasters decide to “fix” these duplication issues by using a 301 in place of a canonical tag.
True, the impact on search engines will be the same: they will only index the “master” version of the page that the redirects point to.
However, this technique isn’t so great for users, who will be taken to a different area of the site than they intended to reach.
In the case of 301s being used to consolidate similar products, this might mean the user sees a product in a different colour or style than the one they were interested in.
The consequences aren’t so significant for products that appear in different categories – however, users who have navigated through your site to find a product may be confused when they’re diverted to a different section of it.
All-in-all, neither example offers a great user experience.
For me, the overriding lesson is that SEO and UX don’t just both play an important role in improving a site’s performance: they are intrinsically linked.
It all comes down to what Google, and search engines generally, want: to serve the best possible results to their users. To do this, they need to send those users to sites that don’t just satisfy the search query, but that offer a great UX too.
Regardless of the time or resources you invest in SEO, if your site sucks to use, it isn’t user friendly or SEO friendly.
What are your thoughts? Have you ever been concerned that your SEO efforts might be hurting your user experience? Let us know the details using the comments below.
Note: The opinions expressed in this article are the views of the author, and not necessarily the views of Caphyon, its staff, or its partners.