Robots NoIndex – What to Know

When you use the robots noindex tag, you're telling Google and every other search engine's crawling robots (spiders) not to include that page in their search index. By default, spiders will crawl and index every page on your site they can reach, so you need to tell them which pages to leave out, either with a noindex tag on the page itself or by blocking directories in your site's robots.txt file.
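
For reference, the noindex tag itself is a single meta element placed in a page's <head>. A minimal sketch (the surrounding markup is just a placeholder):

<head>
<!-- tells compliant search engines not to add this page to their index -->
<meta name="robots" content="noindex">
</head>

If you want spiders to still follow the links on the page while leaving the page itself out of the index, the common variant is content="noindex, follow".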

Robots noindex should only be used on pages you don't want search engines to index.

For instance, you likely don't want your "checkout" page crawled and indexed if you run a product sales site. From an SEO standpoint, you also don't want the same content indexed twice if it exists in more than one location, since that can generate a duplicate content penalty. If you know you have the same content in two places, say a post that appears under two different categories, you'll want to noindex one of them.

You also want to noindex pages you simply don't care to have indexed. Blocking them means the spiders spend less time crawling your site, which means less bandwidth usage, which in turn means a faster, more responsive site for the end user.
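
For files that don't have an HTML head to carry a meta tag (PDFs, for example), one alternative, assuming an Apache server with mod_headers enabled, is to send the same directive as an HTTP header from your .htaccess file. A hypothetical sketch:

# hypothetical .htaccess rule (Apache, requires mod_headers):
# every PDF on the site gets a noindex header
<FilesMatch "\.pdf$">
Header set X-Robots-Tag "noindex"
</FilesMatch>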

If you're using WordPress, it automatically generates a robots.txt file for you, though it's up to you to add your own rules to it. If you don't already have this file on your site, you can create a plain text file, save it as robots.txt, and upload it to your site's root directory via FTP.

Some standard rules to add to your robots.txt file are:

User-agent: *
Disallow: /feed/
Disallow: /comments/
Disallow: /author/
Disallow: /archives/
Disallow: /trackback/

That's the format to use: the User-agent: * line addresses every spider, and each Disallow line tells them to skip that directory and any files inside it.
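
Once the file is uploaded, a quick way to confirm it's live and readable is to fetch it directly; example.com below is a placeholder for your own domain:

curl https://www.example.com/robots.txt

You should see back exactly the rules you saved.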

An easy way to implement robots noindex if you're using WordPress is to install the very handy PCRobots.txt plugin. You just input any pages you don't want crawled, click save, and the plugin applies the rules for you without your having to touch FTP or the robots file yourself.

If you're comfortable with FTP, though, I recommend just doing it manually yourself, since that's one less plugin slowing down your site (see this post on how to improve website load time).
