One of the biggest nightmares for SEO specialists is getting de-indexed by Google, or accidentally de-indexing their site themselves. The consequences of such a mistake can stretch over a long period of time. And, if we are talking about websites that generate revenue, this is a big blow to the business flow.
Let’s see what happens when you accidentally de-index your website, and what you can do to fix and prevent the problem in the future.
How Can Accidental De-Indexing Happen?
Most of the time, this happens when the robots.txt file is modified and deployed with no prior testing. While making changes to robots.txt is common practice, you have to make sure you know what you're doing, as chasing crawlers away from your website will damage your rankings.
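To illustrate how little it takes, the fragment below shows the kind of blanket rule that blocks every crawler from an entire site, next to the scoped rule that was probably intended (example.com and the /private/ path are placeholders):

```
# DANGEROUS: blocks all crawlers from the entire site
User-agent: *
Disallow: /

# What a scoped rule usually looks like instead:
# User-agent: *
# Disallow: /private/
```

A single stray `/` in the Disallow line is the difference between hiding one folder and hiding the whole website.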
One of the most important tools to manage your website’s indexing is the Google Search Console. This tool enables you to control and monitor the indexing of your website and optimise it for Google and other search engines.
If you notice sudden drops in your traffic in Google Analytics, check the robots.txt file first through the Search Console. If you find a noindex directive in a robots meta tag where it doesn’t belong, it means that you accidentally de-indexed pages in Google. You must then check the number of pages that became invisible to crawlers.
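Checking individual pages for a stray noindex directive can be automated. This is a minimal sketch (not an official Search Console tool) that scans a page's HTML for a robots meta tag containing `noindex`, using only Python's standard library:

```python
from html.parser import HTMLParser


class NoindexDetector(HTMLParser):
    """Flags a page whose <meta name="robots"> content includes 'noindex'."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() == "robots":
            content = (attrs.get("content") or "").lower()
            if "noindex" in content:
                self.noindex = True


def has_noindex(html: str) -> bool:
    """Return True if the given HTML carries a noindex robots meta tag."""
    parser = NoindexDetector()
    parser.feed(html)
    return parser.noindex
```

You would feed `has_noindex()` the HTML of each key landing page (fetched however you normally crawl your site) and alert on any page that unexpectedly returns True.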
Case Study by Jeff Baker on Moz
As SEO specialists, we learned the hard way how indexing our content can be a nice boost to rankings, but also a dangerous approach. One of the most useful materials on the subject is the case study written by Jeff Baker, SEO specialist at one of Boston’s most successful advertising companies.
In 2019 he wrote about how the developer of one of his websites accidentally de-indexed the whole site by unknowingly pushing a piece of code live.
He writes: “after sending a few frantic notes to our developer, he confirmed that a sprint deployed on Thursday evening (August 1, 2019), almost three days prior, had accidentally pushed the code live on every page”.
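The article quoted above doesn’t reproduce the exact code, but a site-wide de-indexing of this kind is typically caused by a robots meta tag like the following landing in the `<head>` of every page:

```
<meta name="robots" content="noindex">
```

Staging environments often carry this tag on purpose, to keep test pages out of Google, which is exactly how it can slip into a production deploy unnoticed.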
A week after the incident, “we had lost about 33.2% of our search traffic”, he recalls.
So what did Jeff do? He manually requested re-indexing. But the site was not repaired instantly. “Despite resubmitting my sitemap and manually fetching pages in the Search Console, many pages were still not being indexed”.
Surprisingly, waiting was the only thing left to do after the SEO specialist had done everything he could to get Google to re-index the damaged pages, “including pages that I counted on to drive traffic”.
How to Assess the Damage After De-Indexing Your Site?
To find out just how many pages of your website are indexed, you can run a check in the Google Search Console, by going to the Coverage tab. Once this check has confirmed that the pages with no traffic are, in fact, de-indexed, it’s time to find a solution.
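As a rough supplementary check (the Coverage report remains the authoritative source), you can also run Google’s site: search operator on your domain — example.com here is a placeholder:

```
site:example.com
```

The approximate result count gives a quick sense of how many of your pages Google still has in its index, though it lags behind and is less precise than the Coverage report.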
It’s also time to check the drops in your conversion rates, as you might need to implement crisis solutions, like replacing your best-selling product pages, redirecting to a different page, and so on.
How to Repair the Damage Caused by an Accidental De-Indexing?
If you have inspected your pages and requested new indexing, but nothing is happening, your best bet is to wait for Google to restore your pages to their previous rankings. Make sure that the number of pages that Search Console reported as missing matches the number of pages re-indexed by Google once the ordeal is over.
How to Prevent Accidental De-Indexing of a Site?
To prevent such a horrifying thing from happening again, you have to implement some security measures when it comes to any changes made to your site’s indexing rules. While the robots.txt file is not really a rule of law when it comes to what crawlers do on your page, it can easily steer crawlers away from the most important pages of your site.
Some methods of prevention include:
- Getting an email alert when there is a sudden drop in your traffic. You can use the Search Console to activate such an alert.
- Always checking your robots.txt file when implementing changes.
- Checking your website for errors that could get Google to de-index it, like 503 or 403 responses.
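The last check on the list can be scripted. The sketch below (the URLs and status codes are made up for illustration) flags HTTP responses that, when served persistently, can lead Google to drop a page from the index:

```python
# Status codes that, served persistently, can cause Google to drop a URL:
# 401/403 (access denied), 404/410 (gone), 5xx (server errors, e.g. 503).
RISKY_STATUSES = {401, 403, 404, 410, 500, 502, 503, 504}


def deindexing_risks(pages):
    """Given a {url: status_code} mapping, return the URLs at risk of
    being dropped from the index, sorted alphabetically."""
    return sorted(url for url, status in pages.items()
                  if status in RISKY_STATUSES)


# In practice these statuses would come from a crawl of your own site;
# the values here are placeholders.
sample = {
    "https://example.com/": 200,
    "https://example.com/shop": 503,
    "https://example.com/admin": 403,
}
print(deindexing_risks(sample))
```

Note that a temporary 503 during maintenance is harmless; it is only when Google keeps receiving these errors over an extended period that pages start dropping out of the index.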
Contact Our Sydney Search Engine Optimisation Specialists if You Have an Indexing Nightmare!
Indexing and other similar technical SEO issues can easily become a nightmare for website owners, since they require highly specialised knowledge. But, since every website owner works hard to optimise for better rankings and better revenue, this know-how is valuable.
The safest way to optimise indexing for your website, before sending frantic notes to the Google support team, is to hire an SEO specialist. At Australian Internet Advertising, we consider ourselves SEO geeks. We are the kind of SEO-obsessed people who check robots meta tags and robots.txt files for fun.
Contact us to get an in-depth audit of your website and find a quick solution for your indexing problems.