Some people think that it is a bad thing. I would argue it is quite necessary. As you will see from today’s show, sometimes Google goes to places you just don’t want it going, and I don’t mean emotionally either. At the very least you should be directing Google where you want it to go. One of the easiest ways to do this is to use your robots.txt file. This is a really simple thing to set up even if you are not that technical. The robots.txt file keeps Google out of areas you don’t want it nosing around in… in theory, anyway.
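As a minimal sketch of what that looks like, here is a robots.txt you could place at the root of your site. The `/old-site/` and `/admin/` paths are just hypothetical examples of areas you might not want crawled:

```
# Applies to all well-behaved robots
User-agent: *

# Keep crawlers out of a hypothetical archived copy of the site
Disallow: /old-site/

# Keep crawlers out of a hypothetical admin area
Disallow: /admin/
```

Remember that Disallow is a polite request, not a lock; well-behaved robots will obey it, but it does not actually secure anything.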
Most good content management systems will come with a pre-populated robots.txt file that all good robots should obey. However, there are still plenty of robots that won’t. Google has released a new tool inside Webmaster Tools so you can test your robots.txt; see today’s video for details.
Think of Google as a nosy tour guide wanting to find out everything there is to know about your site so it can show its users. If you give it free rein, it will not only access all areas, it will index all areas for everyone else to find as well. Whilst it’s bad enough someone finding a version of your website from the 1990s, what’s worse still is Google trying to make sense of it and how it relates to your current site. Stretching the tour guide analogy even further, why would it want to send someone to a confusing destination when there are easier ones to navigate?
Jim’s been here for a while; you know who he is.