How-To: Robots.txt Disaster Prevention

Beacon Blog Article

Published December 1, 2015 | Categories: SEO

It's any SEO's worst nightmare. The production robots.txt file got overwritten with the test version. Now all of your pages have been 'disallowed' and are dropping out of the index. And to make it worse, you didn't notice right away, because how would you?

Wouldn't it be great if you could prevent robots.txt disasters? Or at least know as soon as something goes awry? Keep reading; you're about to do just that.

The 'How'

Our team recently began using Slack. Even if you don't need a new team communication tool, it is worth having for this purpose. One of Slack's greatest features is 'Integrations'. Luckily for us SEOs, there is an RSS integration.

5 Simple Steps for Quick Robots.txt Disaster Remediation:

  1. Take the URL for your robots.txt file and drop it into Page2RSS.
  2. Configure the Slack RSS integration.
  3. Add the Feed URL (from Page2RSS) for your robots file.
  4. Select the channel to which notifications will post. (I suggest having a channel for each website/client; more on why below.)
  5. Relax and stop worrying about an accidental 'disallow all'.
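If you'd rather not depend on a third-party service like Page2RSS, you can get the same effect with a small script of your own. The sketch below is not part of the setup described above; it's a minimal DIY alternative that fetches your robots.txt, compares it to the last copy it saw, and posts to a Slack incoming webhook when something changes. The SITE, WEBHOOK_URL, and state-file path are hypothetical placeholders you would replace with your own values.

```python
#!/usr/bin/env python3
"""Minimal robots.txt change monitor (a DIY alternative to Page2RSS).

Assumes you run it on a schedule (e.g. cron) and have created a Slack
incoming webhook. SITE and WEBHOOK_URL below are placeholders.
"""
import json
import urllib.request
from pathlib import Path

SITE = "https://www.example.com"                               # hypothetical site to watch
WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"   # hypothetical Slack incoming webhook
STATE_FILE = Path("robots_last_seen.txt")                      # local copy of the last-seen file


def fetch_robots(site: str) -> str:
    """Download the current robots.txt as text."""
    with urllib.request.urlopen(f"{site}/robots.txt", timeout=30) as resp:
        return resp.read().decode("utf-8", errors="replace")


def notify_slack(message: str) -> None:
    """Post a plain-text message to the Slack incoming webhook."""
    payload = json.dumps({"text": message}).encode("utf-8")
    req = urllib.request.Request(
        WEBHOOK_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req, timeout=30)


def main() -> None:
    current = fetch_robots(SITE)
    previous = STATE_FILE.read_text() if STATE_FILE.exists() else None

    # Only alert when we have a previous copy and the contents differ.
    if previous is not None and current != previous:
        notify_slack(f"robots.txt changed on {SITE} -- go check it!")

    STATE_FILE.write_text(current)  # remember what we saw for next run


if __name__ == "__main__":
    main()
```

Run it from cron with something like `0 * * * * python3 /path/to/robots_monitor.py` (path hypothetical) and you'll be checking hourly, which is more frequent than Page2RSS's once-a-day crawl.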

The Benefits of this Method

4 Benefits of Using Page2RSS & Slack to Watch Your Robots.txt File:

  1. You can add your teammates to channels, so key people know when changes occur! One person sets up the feed once, and many people get notified.
  2. Page2RSS checks the page at least once a day, so you'll never go more than about 24 hours before a broken robots.txt file is flagged.
  3. No one on your team has to manually check clients' robots.txt files for errors.
  4. You'll know what day your dev team updated the file. This enables you to add accurate annotations in Google Analytics.
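One caveat: a change alert tells you the file changed, not whether the new version is safe. As a complement to the setup above (not something from the Page2RSS/Slack workflow itself), here's a short sketch using Python's standard urllib.robotparser to confirm that your most important URLs are still crawlable. The site and URL list are hypothetical examples.

```python
import urllib.robotparser

# Hypothetical examples -- substitute your own site and must-stay-crawlable URLs.
SITE = "https://www.example.com"
IMPORTANT_URLS = [f"{SITE}/", f"{SITE}/products/", f"{SITE}/blog/"]

parser = urllib.robotparser.RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()  # fetch and parse the live robots.txt

blocked = [url for url in IMPORTANT_URLS if not parser.can_fetch("Googlebot", url)]
if blocked:
    print("WARNING: these URLs are now disallowed for Googlebot:")
    for url in blocked:
        print(" ", url)
```

You could bolt this onto the change monitor so the Slack message also says whether an accidental 'disallow all' slipped in.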


The 'Why'

OK, cool, but why is this necessary? Because you care about your sites' reputation with search engines, that's why!

Mistakes happen with websites all the time, and search engines know that. They're not in the business of destroying websites, but they are in the business of providing the best results to their customers. So if you don't fix problems like this quickly, you're risking demotion.

I've seen sites go weeks with a bad robots.txt file, and it is not pretty. Once search engines have removed your site from the index, it is quite difficult to get back; it can take weeks to regain the indexation you had before. Don't undo the hard work you've put into search engine optimization just because a file was overwritten. Do yourself a favor and set up this automated monitoring.

I've armed you with this info; now there is no excuse for getting de-indexed due to robots.txt issues. If this has happened to someone you know, please share this post with them!

Let's get to work!
