Easily and instantly generate a Robots.txt file for your Blogger (Blogspot) blogs for SEO-friendly crawling.
Robots.txt is an important SEO element that tells search engine crawlers how they should crawl your site. It acts as the 'front door' to your blog or website: the rules inside tell crawlers which parts of your site they may crawl and which they should skip.
In short, robots.txt is a plain text file containing a set of instructions or rules that search engine bots and crawlers, such as Google's and Bing's, follow when they visit, scan, and index your site.
Crawlers from many kinds of services read the Robots.txt file, including social networks like Facebook and Twitter as well as SEO services such as online keyword research tools.
The main purpose of the Robots.txt file is to instruct crawlers and bots about which portions of your site should be accessible via search engines and which should not.
For example, a blogger may want all of her/his articles to appear on Google, but may not want other pages, such as the blog's internal search result pages, to appear on Google!
This Blogger Robots.txt Generator tool is really easy and simple to use. All you need to do is generate the robots.txt file, then paste it into your Blogger blog settings. Follow these simple steps:
Simply input the URL of your Blogger blog into the tool above, including https:// and www but without the trailing slash '/' at the end of the URL, and click the "Generate Robots.txt" button. The tool will automatically generate the most basic yet important rules for you to copy, including the sitemap URLs, which are also important, like the following:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://robotstxt.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: https://robotstxt.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500
Sitemap: https://robotstxt.blogspot.com/atom.xml?redirect=false&start-index=1001&max-results=500
Sitemap: https://robotstxt.blogspot.com/atom.xml?redirect=false&start-index=1501&max-results=500
Sitemap: https://robotstxt.blogspot.com/atom.xml?redirect=false&start-index=2001&max-results=500
Sitemap: https://robotstxt.blogspot.com/atom.xml?redirect=false&start-index=2501&max-results=500
Sitemap: https://robotstxt.blogspot.com/atom.xml?redirect=false&start-index=3001&max-results=500
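The sitemap lines above simply page through the blog's Atom feed in chunks of 500 posts, since Blogger returns at most 500 entries per feed request. As a rough sketch of that pagination (the blog URL, the post-count limit, and the `sitemap_lines` helper are illustrative, not part of the tool), the same lines could be produced like this:

```python
# Sketch: generate Blogger sitemap lines that page through the Atom feed
# in chunks of 500 posts. BLOG_URL and MAX_POSTS are example values.
BLOG_URL = "https://robotstxt.blogspot.com"
MAX_POSTS = 3500  # assumed upper bound on how many posts to cover


def sitemap_lines(blog_url: str, max_posts: int, page_size: int = 500) -> list:
    """Build one 'Sitemap:' line per feed page of `page_size` posts."""
    lines = []
    for start in range(1, max_posts + 1, page_size):
        lines.append(
            f"Sitemap: {blog_url}/atom.xml"
            f"?redirect=false&start-index={start}&max-results={page_size}"
        )
    return lines


for line in sitemap_lines(BLOG_URL, MAX_POSTS):
    print(line)
```

Each line advances `start-index` by 500, which matches the seven sitemap entries shown above.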
You can add your own custom rules, or use the generic Robots.txt Generator for various rules or directives.
Once you're ready, just copy the generated robots.txt rules and keep them handy for the next step.
Log in to your Blogger account now.
Then, under Settings, scroll to the "Crawlers and indexing" section.
Turn on "Enable custom robots.txt," then click "Custom robots.txt."
Now, use CTRL+V on Windows or Command+V on Mac to paste the robots.txt code.
Press the "Save" button.
Once done, you can test your robots.txt rules with the old Google Search Console robots.txt Tester. If your robots.txt file hasn't been picked up yet, just enter the rules manually, then start testing by entering some of your pages' slugs into the URL field.
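If you prefer to check the rules locally instead of using an online tester, Python's standard library ships a robots.txt parser. This sketch feeds it the generated rules and checks two illustrative URLs (the example.blogspot.com addresses are hypothetical):

```python
# Sketch: evaluate robots.txt rules locally with Python's standard library.
# The rules string mirrors the generated file; the URLs are examples only.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A regular post URL is crawlable; an internal search page is not.
print(parser.can_fetch("*", "https://example.blogspot.com/2024/01/post.html"))  # True
print(parser.can_fetch("*", "https://example.blogspot.com/search?q=seo"))       # False
```

This confirms the intent of the rules: everything under / is allowed except the internal /search pages.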
That's it!
Check out other interesting tools related to this one. More tools are coming soon!