SEO Tools Part 3 – sinister & spiteful SEO

by Jim October 30, 2015

This is a follow-up to a post from a couple of days ago, and also to my SEO Tools post. We discovered that a client's former SEO company had hidden code within the site that had the effect of removing it from Google.

Today I’ve followed up on this story as I wanted to find out how many other former clients of this SEO company had the same problem.

The SEO Task

1. Get a list of URLs of the SEO company's clients.

2. Check each client's home page to see if it had the noindex directive.

The SEO Tools

For this task I used a combination of Screaming Frog, Google Sheets and Excel. You could do away with Excel altogether and just use Google Sheets, though.

Step 1

Fortunately, the SEO company in question had a page listing all their clients, complete with links to them. I simply put that URL into Screaming Frog for it to crawl, then exported all the "Out" links it found. These are essentially all the client websites the page was linking to.
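As an aside, if you wanted to skip Screaming Frog for this step and stay in Google Sheets, IMPORTXML can pull the links off a page too. A minimal sketch, where the URL is just a placeholder standing in for the real client list page:

=IMPORTXML("https://example-seo-company.com/clients", "//a/@href")

That XPath returns the href of every anchor on the page, navigation links included, so you'd still need to tidy the list as in Step 2.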

Step 2

I opened the exported CSV in Excel and tidied up the list of URLs, then saved it and put it aside.

Step 3

I then created a Google Sheet and in cell B1 entered =IMPORTXML(A1, "//meta[@name='robots']/@content"). That formula basically says: grab the URL in cell A1 and display the contents of its robots meta tag.
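One thing to be aware of: IMPORTXML throws an error when a page has no robots meta tag at all, which in this case is the normal, healthy result. If you'd rather not stare at a column of errors, a sketch that wraps the same formula in IFERROR:

=IFERROR(IMPORTXML(A1, "//meta[@name='robots']/@content"), "no robots meta tag")

Sites labelled "no robots meta tag" are fine. It's the rows that come back with something containing noindex that you care about.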

Step 4

Grab the list of URLs from the CSV, drop them into column A, and copy the B1 formula down column B. Voilà.
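To make the problem sites stand out, you could add a flag in a third column that checks whether the robots content contains noindex. A quick sketch, assuming the IMPORTXML results sit in column B and you copy this down column C:

=IF(ISNUMBER(SEARCH("noindex", B1)), "NOINDEX - problem", "ok")

SEARCH is case-insensitive, which matters here because NOINDEX in any mix of upper and lower case will still pull a page out of Google.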

Results

Out of the hundred or so URLs, we found 7 or 8 that had the same problem. We are in the process of contacting them to let them know what is wrong with their sites. Earlier today I spoke with Greg from johnreynoldsmusic.com.au, who features in the video above. He did seem a little surprised.

If your site, or a site belonging to someone you know, has this problem, have a look at the video I did on Wednesday explaining the problem.
