Recovering from deindexing

by Jim, May 24, 2016

Welcome back Rankers!
Off to The Internet Conference this week, and looking forward to that Gold Coast sunshine.

Thought I’d do a search for eCommerce to see what’s been in the news this week. Didn’t find much except for a nice featured answer picked up by Network Solutions. My search did show a result that gave me a laugh, as I did a story on this site a few years back and it’s a government site. A page that you want ranked by Google ends up being displayed with a notice underneath saying a description isn’t available because of the site’s robots.txt.

SEO, Government & Robots

Not a good thing.
Government Failure: This site should be #1 for the search in Australia. It currently sits sixth, and it’s never going to be #1 because someone has blocked it with robots.txt. This site, incidentally, is a taxpayer-funded site designed to teach businesses how to go digital and get online. Maybe it should be filed under ‘How not to do it’, as all they’ve managed to do is block the search engines from finding their site.

Good one.
I’ve had another client’s site suffer this during the week. They were updating a WordPress plugin, Gravity Forms, and it wrote over the robots.txt file, which, needless to say, is not a great thing for a plugin to do. Getting back to our government site: they’ve set up their site to deny access to all robots. The homepage has design and quality issues that would be frowned upon by Google, and the site itself is incredibly s-l-o-w.
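To show just how little it takes, here’s a hypothetical robots.txt of the blanket-block kind described above, next to a fixed version. These directives are generic illustrations, not the government site’s (or the client’s) actual file:

```text
# Blanket block: hides the entire site from every crawler
User-agent: *
Disallow: /

# Fixed version: an empty Disallow permits all crawling
User-agent: *
Disallow:
```

Note how close the two are. A single stray slash, written by a careless plugin update or a forgotten staging setting, is the difference between indexed and invisible.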

Admittedly, the site does offer good advice with a strong blog structure. It’s just a shame those pages are buried within a site hidden by robots.txt. The site is full of broken links, maybe due to a lack of funding, and maybe that’s why they blocked it with robots.txt. We’re coming up to a Federal Election soon, so I’d hate to point out some politician’s folly, spending money where it’s not required and all.

If you do require expert advice on getting online for business, be sure to attend The Internet Conference this week on the Gold Coast. Starts on Wednesday. If you want a free SEO audit while you’re there, you can either stop by our stand or book a session online with me.

Robot Maintenance

One of the things you should be doing regularly on your own site is looking at the search results to see what Google has indexed (use a site: search, e.g. site:yourdomain.com). If your homepage is not ranked first, it’s a sign that something is wrong: either Google doesn’t like it or Google can’t access it.

When you do that search, the most authoritative page should come up first, and that should be your homepage. The government site has a litany of problems: duplicate content, incorrect indexing of categories, and no search-engine-friendly (SEF) URLs. My advice is not to take their advice!

If you run this test on your own site and have a mild panic attack when you realise you’re being blocked by robots.txt, then depending on the size of your site you can steady the ship relatively quickly: unblock the robots first, then have the site reindexed.
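You can also test a robots.txt file directly rather than waiting on search results. Here’s a minimal sketch using Python’s standard-library robotparser; the file contents and example.com URLs are made-up placeholders, not any client’s actual site:

```python
# Check whether a robots.txt would block a crawler from a URL,
# using Python's built-in parser (urllib.robotparser).
from urllib.robotparser import RobotFileParser

def is_blocked(robots_txt: str, url: str, agent: str = "Googlebot") -> bool:
    """Return True if robots_txt would stop `agent` fetching `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch(agent, url)

# A blanket block, like the one that caused the deindexing:
blocking = "User-agent: *\nDisallow: /"
# The repaired file (an empty Disallow allows everything):
fixed = "User-agent: *\nDisallow:"

print(is_blocked(blocking, "https://example.com/"))  # True: homepage blocked
print(is_blocked(fixed, "https://example.com/"))     # False: crawlable again
```

It’s worth running the same check against both the www and non-www versions of your domain, since, as it turns out below, both can end up blocked.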

I did one recently for a client who had 50,000 pages deindexed. The following day they had all been reindexed. However, on closer inspection, many of the pages that Google said were fine still had the robots.txt message underneath them, which basically meant Google had not yet recrawled those pages.

The other curious thing was that although they’d redirected the www version of the site to the root domain, the robots.txt was blocking both the www and non-www versions. Googlebot therefore never saw the redirect header: before it had even reached the page, it was told it couldn’t look.

So Google would say all these pages were blocked by robots.txt and would start putting them in the index with a simple notice saying the information is not available due to robots.txt. For the client’s site I fixed up, I resubmitted and re-fetched the pages and the rankings returned; one phrase went from #12 to #6. So if you do accidentally block your site through robots.txt you should be able to come back from it quickly, but you need to catch what’s happened through regular site checks. Once a month is advisable, or weekly if you have a busy site.

Hopefully that’s helpful. I hope to see some of you at The Internet Conference on the Gold Coast this week. Bye for now.
