11 Reasons Why Your Organic Traffic Has Dropped

Website traffic comes and goes, unfortunately. No one enjoys it when their organic traffic drops, and when it does, your first instinct may be to panic. You start to wonder, did Google release an algorithm update? Were you the victim of an SEO attack? What happened?!


As a technical SEO consultant, I’ve seen the many ups and downs of organic traffic. More often than not, the root cause isn’t a change that Google made, and the good news is that there is (almost always) something you can do about it. Incorrect redirects, problems with robots.txt files, sitemap issues, and even broken internal links can all cause a site migration to end in a drop in organic search traffic. Working out why your organic traffic dropped significantly after a migration can be a matter of trial and error, so I wrote this article to help others get to the bottom of their issues.

1. Changing your Domain Name

There are many reasons for changing your domain name, but you need to tell Google about the change if you want to do so successfully. Google created the Change of Address tool for this reason.
 
This tool tells the search engine about your new domain name and helps move your Search results from your old domain to the new one.

2. Indexing Everything

It’s essential to prune your content and not allow Google to index everything. In most cases, there is no need to index every piece of content on your website. In the case of a car aggregator website, car history pages are just one example of content that can be noindexed. While it may be helpful for users to search by car registration number, this probably isn’t where leads are coming from – and it likely eats up a lot of crawl budget.
 
A good way to decide which pages to index is to assess which page templates provide the most value to your website, and to focus on indexing pages that have the potential to produce great search results.
Noindex pages that you don’t want to show up in search results or that don’t target any of your keywords.
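If you want a quick way to audit this outside of a full crawl, a small script can report which URLs carry a noindex directive. This is only a minimal sketch: it assumes the third-party requests and beautifulsoup4 libraries are installed, and the URLs are placeholders for your own list (for example, exported from a crawl).

```python
# Minimal noindex audit sketch. The URLs are placeholders for your own list.
# Checks both the meta robots tag and the X-Robots-Tag response header.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/",
    "https://www.example.com/car-history/AB12CDE",
]

for url in urls:
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    meta_value = meta.get("content", "") if meta else ""
    header_value = response.headers.get("X-Robots-Tag", "")
    noindexed = "noindex" in meta_value.lower() or "noindex" in header_value.lower()
    print(f"{url} -> noindexed: {noindexed} (meta='{meta_value}', header='{header_value}')")
```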
 
I find it is good practice to run a “keep, combine, kill” exercise from time to time and use this as an opportunity to prune your low-quality and under-performing content.
 
Remember to also use the Coverage report in Google Search Console by navigating to Index > Coverage. Here you can see which URLs have an error or a warning, are valid, or are excluded from the index. From the URL Inspection tool, you can “Test Live URL” to identify any issues that Google might have when accessing the page.

3. Blocking Content with Robots.txt

Robots.txt is a powerful tool, but it can also be dangerous if not used correctly. You should always place this file at the root of your domain. I’ve seen a few instances of the robots.txt located at domain.com/pub/robots.txt or within another subfolder. This placement is a problem because search engines only look for the file at the root of the host, so a robots.txt sitting under /pub/ will simply be ignored and none of its rules will be applied.
 
You should always place the robots.txt file at domain.com/robots.txt. From this location, its rules will be applied across the whole site.
 
Note that it is also best practice not to block search engines from accessing JavaScript or CSS files in your robots.txt file, as this will affect how Google renders and indexes your content. Blocking these files can result in a drop in rankings.
 
A common issue with this file is accidentally blocking access to your entire website. If you see “Disallow: /”, you may have cause for concern. Google has a robots.txt tester tool that can help you identify and correct any issues you may have with your file.
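You can also run a quick check yourself with Python’s standard library before digging into the tester tool. This is only a rough sketch: the domain and test URLs are placeholders, and it evaluates the rules as Googlebot would read them.

```python
# Quick robots.txt sanity check using the standard library robot parser.
# Swap in pages you expect Google to be able to crawl.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

test_urls = [
    "https://www.example.com/",
    "https://www.example.com/blog/some-post/",
    "https://www.example.com/category/widgets/",
]

for url in test_urls:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED'}  {url}")
```

If an important URL comes back as BLOCKED, or everything is blocked by a stray “Disallow: /”, you have found your problem.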
 
Before you panic about your organic traffic drop, make sure you’re not unintentionally blocking access to parts of your website or even the whole thing!

4. Migrating a Website and Not Mapping your URLs with 301 Redirects Correctly

A domain migration is a significant change and certainly needs to be managed correctly. URL slugs also tend to change during a migration, and each old URL needs a 301 redirect to its appropriate new URL. The last thing a user wants is to hit a 404 error page when trying to find the answer to their query; a 404 error is a bad page experience.
 
You can map all of your URLs individually, or you can work with your dev team to spot patterns in the URL changes and add a few lines to your .htaccess file (or equivalent server configuration) to redirect them.
 
This way, all of your site traffic, authority, reputation etc., should be migrated to your new URLs without experiencing an organic traffic loss.
 
It’s worth noting that you can experience a loss in performance during a migration even if all of your URLs are redirected correctly. Many aspects can change at once during a website migration, and it can be hard to determine which is the cause for the drop. For this reason, some SEOs recommend making these changes in stages.
 
Unfortunately, redirect issues are common, so take some time to address the following (a quick checking sketch follows this list):
  • Make sure each redirect resolves to the correct, new URL,
  • Clean up and remove any redirect chains,
  • Use a permanent 301 redirect rather than a temporary 302 redirect.
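Here is a rough sketch of how you might check these in bulk with Python. The old/new URL pairs and the requests library are assumptions; in practice you would feed in your full redirect map or a crawl export.

```python
# Redirect audit sketch: follows each old URL and reports the first status code,
# the number of hops, and the final destination. Old/new pairs are placeholders.
import requests

redirect_map = {
    "https://old.example.com/about-us": "https://www.example.com/about/",
    "https://old.example.com/contact": "https://www.example.com/contact/",
}

for old_url, expected in redirect_map.items():
    response = requests.get(old_url, allow_redirects=True, timeout=10)
    hops = response.history
    first_status = hops[0].status_code if hops else response.status_code
    print(f"{old_url}")
    print(f"  hops: {len(hops)}, first status: {first_status}, final: {response.url} ({response.status_code})")
    if response.url.rstrip("/") != expected.rstrip("/"):
        print("  WARNING: does not resolve to the expected new URL")
    if first_status == 302:
        print("  WARNING: temporary 302 redirect - use a permanent 301 instead")
    if len(hops) > 1:
        print("  WARNING: redirect chain - point the old URL straight at the final destination")
```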
 
If your website relies heavily on traffic from your images, you need to make sure that you migrated them across and that their URLs were updated too; otherwise, you could be missing out on a large amount of traffic.

5. Moving From a Subdirectory to a Subdomain

Subdirectory vs subdomain is a controversial topic within SEO, with many differing opinions. I have yet to see a migration from a subfolder like domain.com/blog to a subdomain such as blog.domain.com happen without a drop in traffic, even when handled perfectly.
[Screenshot: drop in organic traffic after a subdomain migration]
This drop in organic traffic is likely because the subdomain appears not to carry the same authority as a subfolder due to it being further from the root domain. Internal linking can go a long way to helping here, but that alone is not strong enough to resolve these traffic issues.
 
I’ve seen many instances where the opposite is true. One client had part of their website on a subdomain (blog.domain.com), and we saw an immediate uplift after migrating it to the domain.com/blog subfolder.
[Screenshot: visibility increase after a subfolder migration]
I’m not saying you shouldn’t use subdomains; they definitely have their place. I am saying that you should think carefully about when and where they are used before making any decisions.  

6. Prioritising Quality over Quantity of Backlinks

It’s not about how many backlinks a website has; what matters more is where they come from. A website with ten backlinks from ten reputable domains will generally perform better than a website with thousands of backlinks from a couple of PBNs (private blog networks).
 
It is not advisable to have a lot of links from one domain. Keep your backlink profile clean and healthy and disavow spammy backlinks if you need to. Only use the disavow tool if you know what you’re doing.
 
It’s also good practice to look at your lost and broken backlinks from time to time and look at opportunities to reclaim any lost link equity.
 
Backlinks can easily be lost during a migration, which is another reason why correct redirects are needed. This is especially true for your high-value links from authoritative websites. It may be worthwhile reaching out to these site owners and asking them to update their links to your new site. If their old links are broken or point to a page on your old site that isn’t covered by your redirects, this can cause problems with search rankings.
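A simple way to spot the worst cases is to take the destination URLs of your most valuable backlinks (exported from whichever backlink tool you use) and check what they return now. A minimal sketch, with placeholder URLs and the requests library assumed:

```python
# Backlink target check: flags old link targets that 404 or rely on a redirect.
import requests

backlink_targets = [
    "https://old.example.com/great-guide/",
    "https://old.example.com/press-release-2019/",
]

for url in backlink_targets:
    response = requests.get(url, allow_redirects=True, timeout=10)
    if response.status_code >= 400:
        print(f"BROKEN ({response.status_code}): {url} - add a redirect or ask for the link to be updated")
    elif response.history:
        print(f"REDIRECTED: {url} -> {response.url} - worth asking the linking site to update the link")
    else:
        print(f"OK: {url}")
```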

7. Having Poor Mobile Performance or Drops in Site Speed

Google indexes your content based on the mobile version of your website rather than the desktop version; this is known as mobile-first indexing. While “mobile-first” is the correct term, I find it helps everyone understand how this works, and how important it is, to refer to it as the “mobile-only index.”
 
Your website should be responsive. Responsive means that your website adapts how it looks and behaves in response to the screen size of the device requesting the page. In short, if your website performs slowly or looks bad on mobile devices, it will likely perform poorly on SERPs.
 
You may experience some common issues on mobile devices, including text that is too small to read, clickable elements that are too close together, and content that is wider than the screen.
 
If you moved servers, there is a chance that this could harm your site speed. To combat this, some first steps could be to:
  • Check that your CDN was migrated too and is working as expected,
  • Ensure that your caching system is running correctly,
  • Use PageSpeed Insights to find any easy fixes that give you a quick win (a small API sketch follows this list).
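PageSpeed Insights also has an API, so you can pull the mobile score for key pages without clicking through the web interface. This is a minimal sketch against the v5 endpoint: the page URL is a placeholder, the response field names reflect the API at the time of writing, and Google recommends adding an API key for anything beyond occasional use.

```python
# Minimal PageSpeed Insights v5 API sketch for a mobile performance score.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
page = "https://www.example.com/"  # placeholder

response = requests.get(PSI_ENDPOINT, params={"url": page, "strategy": "mobile"}, timeout=60)
data = response.json()

lighthouse = data["lighthouseResult"]
score = lighthouse["categories"]["performance"]["score"] * 100
print(f"Mobile performance score for {page}: {score:.0f}/100")

# Headline metrics Lighthouse reports alongside the score.
for audit_id in ("largest-contentful-paint", "cumulative-layout-shift", "total-blocking-time"):
    audit = lighthouse["audits"].get(audit_id, {})
    print(f"  {audit_id}: {audit.get('displayValue', 'n/a')}")
```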

8. Having a Poor User Experience

We touched on this briefly earlier, but many factors can contribute to a poor user experience. A slow website is just one example of a poor UX.
 
Poor UX is one reason why your website hosting should be as fast and reliable as possible. TTFB, or Time to First Byte, measures exactly that: how responsive your server is.
 
Most site speed tools, including PageSpeed Insights and GTmetrix, report your TTFB, and there’s a quick test you can run to get to the bottom of it.
 
You need to compare the TTFB of your robots.txt file with the TTFB of your homepage (a quick sketch follows this list):
  • If the robots.txt TTFB is slow, you likely have a slow server and should consider changing hosting.
  • If the homepage TTFB is slow but the robots.txt TTFB is good, there are deeper issues with your site speed and you need to dig further.
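Here is a rough way to run that comparison yourself. The domain is a placeholder, the requests library is assumed, and the 600ms threshold is only a rule of thumb; requests measures elapsed time up to the response headers, which is a reasonable proxy for TTFB.

```python
# Rough TTFB comparison: robots.txt (a static file) vs the homepage.
import requests

domain = "https://www.example.com"  # placeholder

def ttfb(url):
    response = requests.get(url, stream=True, timeout=30)  # stream=True avoids downloading the body
    elapsed = response.elapsed.total_seconds()  # time until the headers were parsed
    response.close()
    return elapsed

robots_ttfb = ttfb(f"{domain}/robots.txt")
homepage_ttfb = ttfb(f"{domain}/")

print(f"robots.txt TTFB: {robots_ttfb:.3f}s")
print(f"homepage TTFB:   {homepage_ttfb:.3f}s")

if robots_ttfb > 0.6:
    print("Slow response even for a static file - look at your server/hosting.")
elif homepage_ttfb > 0.6:
    print("The server looks fine, but the homepage is slow - dig deeper into how the page is built.")
else:
    print("Both responses look reasonably fast.")
```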
 
Cheap hosting isn’t usually the best hosting. As the old saying goes, you get what you pay for. For larger websites and eCommerce websites, a dedicated server is recommended over a shared server.

9. Having Duplicate Content on Your Website, or on Another Website

Everyone can agree that duplicate content is harmful. It can happen on your own website and on other websites too. The problem with having the same content in more than one location online (whether internally or externally) is that Google won’t know which version is the original piece of content and, therefore, which one should rank for the relevant term.
 
The content on your website should be as unique as possible. While you may not necessarily receive a Google penalty for having duplicate content, it will not help you gain organic traffic.
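If you suspect two URLs are competing with near-identical content, a quick similarity check can confirm it. A minimal sketch, assuming the requests and beautifulsoup4 libraries and using placeholder URLs; anything approaching 1.0 means the visible text is effectively the same page.

```python
# Rough duplicate-content check: compares the visible text of two URLs.
import difflib
import requests
from bs4 import BeautifulSoup

def visible_text(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()  # strip non-visible elements before comparing
    return " ".join(soup.get_text(separator=" ").split())

url_a = "https://www.example.com/blue-widgets/"           # placeholder
url_b = "https://www.example.com/category/blue-widgets/"  # placeholder

ratio = difflib.SequenceMatcher(None, visible_text(url_a), visible_text(url_b)).ratio()
print(f"Similarity between the two pages: {ratio:.2f}")
```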

10. Not Migrating your Metadata or Internal Links Across

It is vital to make sure that your page titles, meta descriptions etc., are migrated correctly during the site move. This is because many changes are happening at once, and it helps Google if you can keep certain aspects as similar as possible to reduce the risk of organic traffic drops.
 
Use a crawling tool such as Screaming Frog on your existing website (before launch) to see what metadata is in place. You can then compare this with what is currently on your staging website and see where the gaps are.
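If you would rather script the comparison than eyeball two crawl exports, something like the sketch below will flag any titles or descriptions that changed. The live/staging URL pairs are placeholders, and the requests and beautifulsoup4 libraries are assumed.

```python
# Title and meta description comparison between the live site and staging.
import requests
from bs4 import BeautifulSoup

def metadata(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    description = desc_tag.get("content", "").strip() if desc_tag else ""
    return title, description

pairs = [
    ("https://www.example.com/services/", "https://staging.example.com/services/"),  # placeholders
]

for live_url, staging_url in pairs:
    live_title, live_desc = metadata(live_url)
    new_title, new_desc = metadata(staging_url)
    if live_title != new_title:
        print(f"TITLE CHANGED: {live_url}\n  old: {live_title}\n  new: {new_title}")
    if live_desc != new_desc:
        print(f"DESCRIPTION CHANGED: {live_url}\n  old: {live_desc}\n  new: {new_desc}")
```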
 
If your website has already launched, then things work a little differently, but there are still some ways to see what metadata was in place. Use the “site:yourdomain.com” command to see what is in Google’s index, or use the backup that was (hopefully) taken before the migration. If neither of these is an option, you’ll have to add the page titles and descriptions back in manually, which can be time-consuming, and depending on your website’s size it can be very difficult to match them to the previous versions.
 
As I mentioned earlier, internal links can be very powerful, and they are a natural way to keep people on your website. What’s more, they help search engines navigate from one page to another. I would suggest checking that the internal links in both your blog posts and landing pages point to the new website with updated URLs and not your old site.

11. Your Canonical Tags are Incorrect

Canonical tags, or canonical URLs, help search engines understand which URL is the “preferred” version of a webpage, i.e. the URL you would prefer Google to rank instead of the given URL.
 
Although they are treated more as a hint than a directive, meaning that Google may choose to ignore them, you must still get them right.
 
It is best practice to use self-referencing canonicals (URLs that point back to themselves) unless there is a specific reason to use another URL. A common reason for a drop in traffic is canonical URLs not being updated after a migration, and this happens more often than you might think. I’ve seen an instance where a website was essentially telling Google not to index it because the canonical URLs across the entire website pointed back to the unsecured HTTP version! I’ve also seen a website go live with its canonical URLs pointing back to the staging site! Both of these can be very problematic and should be resolved as soon as possible.
 
So before you panic about your drop in organic traffic and start to blame Google, check your site to see if your canonical tags are correct.
 
Some common issues with canonical tags are (the sketch below flags most of these):
  • Pointing to non-relevant pages,
  • Programmatic issues (such as a missing trailing slash),
  • Pointing to old URLs that no longer exist.
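A small script can catch most of these before they hurt. This is only a sketch: the URLs are placeholders, the requests and beautifulsoup4 libraries are assumed, and the checks mirror the problems described above (HTTP canonicals, staging hosts, targets that no longer resolve, and non self-referencing canonicals).

```python
# Canonical audit sketch: reads each page's rel="canonical" and flags common problems.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/",
    "https://www.example.com/blog/some-post/",
]

for url in urls:
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    link = soup.find("link", attrs={"rel": "canonical"})
    canonical = link.get("href", "").strip() if link else ""

    if not canonical:
        print(f"MISSING canonical: {url}")
        continue
    if canonical.startswith("http://"):
        print(f"Canonical points at the unsecured HTTP version: {url} -> {canonical}")
    if "staging" in canonical:
        print(f"Canonical points at a staging host: {url} -> {canonical}")
    if requests.get(canonical, timeout=10).status_code != 200:
        print(f"Canonical target does not return a 200: {url} -> {canonical}")
    if canonical.rstrip("/") != url.rstrip("/"):
        print(f"Non self-referencing canonical: {url} -> {canonical}")
```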

Summary

No one wants to see their organic traffic drop, and I wouldn’t blame you for panicking if that does happen. But, before pointing the finger at anyone, it’s worth making sure that there are no inherent issues with your website first. 
 
Make sure you benchmark your data before the migration happens so you can keep track of any changes in the future. Correcting all of your site’s issues takes time and patience, and remember to annotate each of your changes in Google Analytics so it is easier to see which change had the most positive impact.
