By now, you’ve got this on-page SEO thing on lock. You’ve learned how to research keywords, develop an editorial calendar, write successful blog posts and optimize images for search. You’ve written brilliant meta descriptions for each page and post.
You’ve done everything right, but your organic traffic hasn’t improved.
Maybe you haven’t earned enough backlinks. Perhaps you could get more aggressive with social media marketing. Those things can make a difference, but they’re probably not what’s tanking your organic traffic.
Chances are, you’ve got some technical SEO mistakes that need to be addressed.
So for now, let’s assume you’ve exhausted all on-page SEO options and your issue lies with how your website is actually built, structured and coded. This guide covers the common technical SEO mistakes that are negatively impacting your rankings, and how to fix them.
(By the way, if you need more specific help than we offer in this guide, join our digital marketing Facebook group! It’s been seeded with digital marketing pros who can answer any questions you may have.)
But first, let’s cover some basics for the “non techie” readers:
- How to tell if your site has technical SEO issues
- 1. Your site doesn’t use SSL (HTTPS)
- 2. You have duplicate content issues
- 3. Your site speed and user experience leave a lot to be desired
- 4. You have 4xx errors (most likely, 404s)
- 5. Your website isn’t mobile friendly
- 6. Google doesn’t crawl and index your pages correctly
- 7. Your robots.txt and sitemap.xml files are missing or improperly formatted
- 8. Your site’s crawl depth rivals the Mariana Trench
- 9. You’re outspending your crawl budget with thin content
- 10. You need to break some redirect chains
- Take Your Technical SEO Knowledge to the Next Level
What is technical SEO?
Technical SEO is what makes your website crawlable, indexable and easy to understand for search engines.
At TCF, we like to say that your website is your digital storefront — and part of driving traffic is making sure your business is appealing and helpful enough to draw the window shoppers in. (Note: if you also have a physical storefront, you should read our guide to local SEO). Technical SEO is the foundation of your storefront. Without a solid technical foundation, all of your window dressing and awnings and signage come crumbling down.
Google doesn’t read your content the way people do. It also reads all the code underlying that content, and if you don’t build your website to Google’s specs, it could hurt your visibility on the search engine results pages (SERPs).
Good news: you don’t have to be a coder to understand technical SEO.
How to tell if your site has technical SEO issues
You’ll need an SEO audit tool to uncover any under-the-hood issues dragging your organic traffic down.
Two popular options are Screaming Frog and SEMrush (hint: you can get a free week of SEMrush Pro with our affiliate link!). You’ll get a site audit report that lists your technical SEO issues and tells you how to resolve them.
Chances are, you probably have some issues that need to be addressed. According to a massive study SEMrush conducted on 1,300 ecommerce websites, even the biggest brands need some level of technical SEO remediation. Here are just a few stats from the research:
- More than 76% of ecommerce sites have 4XX errors (typically, “404 not found”).
- 95% have pages with only one internal link pointing to them — I’ll explain why that’s a problem in the crawl depth section.
- Nearly 12% of ecommerce sites use Flash, which is also a huge security issue.
Get a free, high-level SEO audit
Just pop in your domain name and your email address, and we’ll send you an SEO audit report within the hour. You’ll learn the top three technical SEO issues that could be harming your organic site traffic. This guide will tell you how to address them.
Here are 10 of the most common technical SEO mistakes that kill organic traffic – and how to fix them:
1. Your site doesn’t use SSL (HTTPS)
SSL is short for “secure sockets layer,” and websites use it to secure traffic between browsers and web servers.
You can tell if a site uses SSL by looking at the URL. If it says “https,” it’s secure. If it says “http,” it’s missing that extra layer of security — and that can negatively impact your search rankings.
Google uses https as a ranking signal, so it’s important to implement SSL on your website and make sure every page is secure. First, you’ll need to purchase an SSL certificate from a provider like Rapid SSL, Symantec or Comodo.
Then, you need to install the certificate on your server. If you’re on WordPress, you can do this in one click with the Really Simple SSL plugin.
Congrats — now you’re secure! Well, mostly. Now, you need to set up 301 redirects to make sure links to any of your pages drive to the https versions of those pages.
What’s a 301 redirect?
A 301 redirect is a permanent redirect from one URL (for example, the http version) to another (https). They have plenty of other uses outside of an SSL implementation, like changing domain names or sunsetting popular landing pages, but they’re essential for SSL to avoid duplicate or mixed content issues.
To see a 301 redirect in action, open up a new tab, enter contentfac.com into the address bar, hit Enter, and watch what happens. The address will correct to the https version of the site; that automatic hop is a 301 redirect at work.
There are a few ways to implement 301s:
- You could edit the .htaccess file on your directory, but be warned: it’s very easy to break your site using this method.
- You could also edit the PHP files on your pages.
Both of these methods require development chops, but fortunately, there’s a WordPress plugin for (almost!) anything — use one of these 7 WordPress redirect plugins to make your life easier.
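If you do go the .htaccess route on an Apache server, the rule itself is short. Here’s a rough sketch (not guaranteed copy-paste-ready for every host, since existing rewrite rules and rule order vary), so back up the file before you touch it:

```apache
# Force https for all requests (Apache .htaccess sketch).
# Assumes mod_rewrite is enabled on your server.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```

The `R=301` flag is what makes the redirect permanent; without it, Apache defaults to a temporary 302, which doesn’t pass link equity the same way.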
2. You have duplicate content issues
To paraphrase Frank Herbert, “Duplicate content is the mind-killer. Duplicate content is the little-death that brings organic search traffic obliteration.”
Google has a very good reason to penalize duplicate content: it is often a sign of plagiarism, and even when it isn’t, Google doesn’t know what to prioritize when two pages or blog posts share identical content.
Easy to avoid, right? All you have to do is make sure each page and post on your website is unique. Unfortunately, if you don’t implement https correctly, you can end up with two versions of each page on your site — one http, one https, each competing for the same spot in search results.
That’s why those 301 redirects mentioned in #1 are so crucial.
3. Your site speed and user experience leave a lot to be desired
You could create the most visually stunning and content-rich website in the world – but if your page loading speed is anything over three seconds, 40% of users will bounce and bring their traffic (and business) to your competitors.
A slow loading time hurts your SEO efforts in a number of ways:
- Directly, as page speed is a mobile ranking factor
- Indirectly, as users are more likely to bounce
- If users hit “back” on their browser and click another link on the SERP, Google counts that against you in search
How to optimize your site for speed
First, run your website through Pingdom and GTmetrix (both free!) to audit your current site speed and performance. You’ll see how long it takes your web page to load, what’s slowing it down and how to fix it.
Here are just a few ways to make your site run faster:
Scale and compress images.
We were able to shave three seconds of load time from a client’s site simply by optimizing one image. It was uploaded in its original resolution at 4000×3000 pixels. Scale your images as much as you can, and use a service like Imagify to reduce their size even further with no noticeable loss in quality.
Install the W3 Total Cache Plugin.
Replace plugins that are slowing down your site.
When you generate a performance report in GTmetrix, click the Waterfall tab to see which parts of your site are taking the longest. You can frequently identify resource-intensive plugins that take longer to load.
For example, we used GTmetrix to determine that our old social sharing plugin was adding whole seconds to our load time. Yikes! We switched to Sumo because it provided the same features we needed without dragging down our site performance.
4. You have 4xx errors (most likely, 404s)
Quoth the raven, “404.” 4xx errors like “404 Not Found” harm both your user experience and your site’s crawlability — neither Google nor your users will reward a big roadblock on their browsing journey.
If any of your pages link to a now-deleted page or misspelled URL, you’ll need to go in and either remove those broken links or point them elsewhere.
Another great 404 tip is to customize your 404 page to keep users from bouncing. Providing links to your most popular content and a search box may help users who misspell your URL find what they need and stay on your site.
5. Your website isn’t mobile friendly
Mobile devices are responsible for more than half of internet traffic globally. Here in the US, that statistic is closer to 40%, but don’t expect that number to go down any time soon.
Google knows this, and has been rolling out updates that make the web easier to access for mobile phone users — including its slow migration to mobile-first indexing and its inclusion of page speed as a mobile ranking factor.
What can you do to make sure your site is mobile friendly? First, go to Google’s Mobile-Friendly Test tool and run a report.
Some of the errors you might run into in this report include:
Touch elements are too close.
We all know the agony of trying to tap a button or text link on a mobile site, only to shake our fists at the heavens for the misfortune of fat finger syndrome. Test your site on your phone to make sure all interactive elements are large enough and far enough apart for comfortable mobile browsing.
Your fonts are too small.
Do your font sizes look great on desktop, but become unreadable on mobile? If so, go back to your CSS and adjust.
You haven’t configured the meta viewport tag.
If your site uses responsive design, this error shouldn’t show up on your report. If it does, adding the following line to your page will make it scale to the device your audience is using:
<meta name="viewport" content="width=device-width, initial-scale=1">
You can find more tips for making your site mobile-friendly in this great resource by Bridget Randolph at Moz.
6. Google doesn’t crawl and index your pages correctly
No amount of keyword research will make you rank well if search engines aren’t indexing your pages to begin with. Ever write a great piece of content, wait a few weeks, then search for the title only to come up with nothing? You might have an indexing issue.
To see if Google’s indexing your site correctly, log into your Google Search Console account (formerly Google Webmaster Tools), click Google Index, then click Index Status. Make sure that the number in the “Total Indexed” box matches the number of pages and posts on your site that you want to show up in search results.
Help — Google isn’t indexing all my pages!
There are a few possible reasons Google may not be indexing all of your content, including:
Your site is brand new
Google takes a while to index pages, and that waiting period is longer if you just created those pages and don’t have a ton of domain authority. To speed things up, you should focus on building backlinks to earn some Google cred.
Your page has a noindex tag
Right-click an unindexed page, click View Page Source, then hit “ctrl-f” to search for “noindex.” If that tag exists, there’s your problem — your code is literally telling Google not to include your page in search results.
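If you’d rather check in bulk, a few lines of Python’s standard library can scan saved HTML for that tag. This is a quick sketch — it only looks for the robots meta tag, not the X-Robots-Tag HTTP header, which can also block indexing:

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Scans an HTML document for a robots meta tag containing 'noindex'."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots" and "noindex" in (d.get("content") or "").lower():
                self.noindex = True

def has_noindex(html: str) -> bool:
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex

# A page carrying this tag is telling Google to keep it out of search results:
blocked = has_noindex('<html><head><meta name="robots" content="noindex, nofollow"></head></html>')
indexable = has_noindex('<html><head><title>Hello</title></head></html>')
```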
Your page is an orphan
When you create a page with no internal links or navigation pointing to it, Google sees it as an orphan page not worth indexing. Try linking a few pages or posts to the unindexed page.
Another technical SEO issue you can run into with indexing is a robots.txt file error, which brings us to our next point…
7. Your robots.txt and sitemap.xml files are missing or improperly formatted
Robots.txt files and XML sitemaps both tell search bots how to crawl your site, but they come at it from two different angles. Your sitemap is essentially a list of all the links on your site you want Google to index, while robots.txt tells bots a) that they’re allowed to crawl your site and b) which pages to ignore.
Your robots.txt file can be extremely simple — ours is!
- User-agent: * means any search bot who wants to crawl our site can, from Googlebot to Bingbot.
- Disallow: /wp-admin/ means we’re blocking access to our WordPress admin folder.
- Allow: /wp-admin/admin-ajax.php allows bots to crawl content generated dynamically from plugins using admin-ajax.php.
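Putting those three directives together, a file matching the lines described above looks like this:

```txt
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

That’s the whole thing — three lines that keep bots out of your admin folder without breaking plugin-generated content.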
It’s important to include a robots.txt file on your site, but formatting errors can hurt your indexability.
When you start using robots.txt to disallow too many folders, you can easily make mistakes that cause search bots to ignore files you wanted them to index. Using noindex tags on individual pages is a better way to keep them out of search.
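Before you push robots.txt changes live, you can sanity-check them with Python’s built-in parser. One caveat with this sketch: urllib.robotparser applies rules first-match-wins, while Google uses the most specific match, so the Allow line is listed first here. The example.com URLs are placeholders for your own pages:

```python
from urllib.robotparser import RobotFileParser

# Sanity-check robots.txt rules before deploying them.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Regular pages stay crawlable...
can_crawl_blog = parser.can_fetch("Googlebot", "https://example.com/blog/")
# ...the admin folder is blocked...
can_crawl_admin = parser.can_fetch("Googlebot", "https://example.com/wp-admin/settings.php")
# ...but the Allow line carves out admin-ajax.php.
can_crawl_ajax = parser.can_fetch("Googlebot", "https://example.com/wp-admin/admin-ajax.php")
```

If a page you want indexed comes back as blocked here, you’ve found a formatting mistake before Google did.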
XML sitemaps are a bit more complicated, but there are plenty of WordPress plugins that create them and keep them up-to-date. At The Content Factory, we’re big fans of XML Sitemap & Google News feeds, and Yoast SEO is another great option.
Once you generate your sitemap, submit it to Google Search Console to improve the indexability of your site — and make sure it includes every link you want to rank!
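If you’re curious what’s inside those generated files, here’s a minimal sitemap built with Python’s standard library. A hand-rolled sketch like this is fine for understanding the format, though in practice a plugin that keeps things current is the easier path; the example.com URLs are placeholders:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml string from a list of page URLs.
    Bare-bones sketch: real sitemap plugins also add lastmod dates,
    image entries, and index files for large sites."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        loc = ET.SubElement(entry, "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

# List every page you want Google to index.
sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/blog/",
])
```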
8. Your site’s crawl depth rivals the Mariana Trench
Crawl depth refers to the number of clicks it takes to reach any page or blog post on your website.
Generally speaking, you should aim for no more than three clicks. This doesn’t just make your site easier for Google to crawl and index, but it also makes things easier on your users.
There are many ways to improve crawl efficiency:
- Use categories and tags for blog posts, and link them in the sidebar of your blog
- Implement breadcrumb navigation (Yoast makes this easy)
- Use internal linking to connect your pages
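Crawl depth is just shortest-path distance from your homepage, so you can model it with a breadth-first search over your internal links. This sketch uses a hypothetical site structure, not a real crawl:

```python
from collections import deque

def crawl_depths(links, home="/"):
    """Compute the click depth of every page from the homepage via BFS.
    `links` maps each page to the pages it links to (a simplified
    model of a site's internal link structure)."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical structure: the old post takes four clicks to reach.
site = {
    "/": ["/blog/", "/about/"],
    "/blog/": ["/blog/page-2/"],
    "/blog/page-2/": ["/blog/page-3/"],
    "/blog/page-3/": ["/blog/old-post/"],
}
depths = crawl_depths(site)
```

In this example, /blog/old-post/ sits four clicks deep; a single category or sidebar link from the homepage would cut that to one.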
9. You’re outspending your crawl budget with thin content
Crawl budget is exactly what it sounds like — the upper limit on the number of pages Google will crawl on your site within a given timeframe.
If you have pages with thin or duplicate content, that’s like spending an outsized proportion of your household budget on pet rocks and Flowbees. You’re going to run out of resources for the stuff that really matters.
If you have a lot of pages with not a lot on them, see if you can either beef them up individually to add more value, or combine them into one page. They might even be worth deleting or noindexing — the point is to tell search bots to focus their crawling elsewhere.
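To build a rough inventory of candidates, you can flag short pages programmatically. This sketch assumes you’ve already scraped your pages into a URL-to-text mapping; the 300-word cutoff is an arbitrary illustration, not a Google threshold, since “thin” really means low-value and no word count fully captures that:

```python
def find_thin_pages(pages, min_words=300):
    """Flag pages whose body copy falls under a word-count threshold.
    `pages` is a hypothetical {url: body_text} mapping."""
    return [url for url, text in pages.items() if len(text.split()) < min_words]

# Hypothetical page inventory.
pages = {
    "/guide/": "word " * 500,
    "/tag/flowbees/": "word " * 12,
    "/tag/pet-rocks/": "word " * 8,
}
thin = find_thin_pages(pages)
```

From there, each flagged URL is a beef-up, merge, noindex, or delete decision.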
10. You need to break some redirect chains
Too many 301 redirects can also hurt your technical SEO cred.
You lose a little bit of link equity with each step in a redirect chain, so if you have to use redirects (for example, while implementing https), make sure it’s one URL pointing directly to another, with no stops along the way.
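Here’s the idea in miniature, modeled as a hypothetical {from: to} redirect map rather than live HTTP requests:

```python
def redirect_chain(redirects, url, max_hops=10):
    """Follow a map of 301s and return the full chain from `url` to its
    final destination. `redirects` is a hypothetical {from: to} map
    standing in for a server's redirect rules."""
    chain = [url]
    while url in redirects and len(chain) <= max_hops:
        url = redirects[url]
        chain.append(url)
    return chain

# A two-hop chain like this leaks link equity at every step:
redirects = {
    "http://example.com/old": "https://example.com/old",
    "https://example.com/old": "https://example.com/new",
}
chain = redirect_chain(redirects, "http://example.com/old")
# The fix: one rule sending /old straight to the final https URL.
```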
You can see if redirect chains are a problem for you by downloading the Screaming Frog SEO Spider, entering your URL, then clicking Reports > Redirect Chains.
Take Your Technical SEO Knowledge to the Next Level
While fixing these 10 mistakes will go a long way toward improving your technical SEO, there’s still much to learn when it comes to earning those top spots in the SERPs. If you want to get serious about optimizing your content for organic search traffic, targeted leads and sales, sign up for our Comprehensive Online SEO Training course!
From the comfort of your own home or office, you’ll learn everything from how to conduct keyword research to trusted methods for earning backlinks. For a free preview, enroll in our free webinar on 9 Common SEO Mistakes (and How to Fix Them), then enroll today to go from novice to SEO pro.
We also have a digital marketing Facebook skill sharing group, which has been seeded with experts who can answer your questions about SEO, content writing, social media marketing, digital PR and more.
Are you a lady in SEO? Whether you’re a newbie or a seasoned pro, we welcome you to join Sisters in SEO, a Facebook group founded by Kari DePhillips, TCF’s owner. Some of the top women in the industry are active participants, and it’s a great place to level up your SEO skill set while networking with the best in the business.
This in-depth, step-by-step guide shows you how to drive highly targeted, primed-for-conversion organic traffic to your website. TCF owner Kari DePhillips used these exact methods to build her freelancing side hustle into a multi-million dollar agency without spending a dime on ads or hiring a sales team.