Free Online Robots.txt Generator Tool


Default - All Robots are:
Crawl-Delay:
Sitemap: (leave blank if you don't have one)
Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
Restricted Directories: (the path is relative to root and must contain a trailing slash "/")



Now create a file named 'robots.txt' in your site's root directory, copy the text above, and paste it into the file.


About Free Online Robots.txt Generator Tool

What is Robots.txt File?

The robots.txt file contains instructions that tell search engine crawlers how to crawl a website and which parts of it may be indexed.
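For illustration, a minimal robots.txt file might look like this (the paths and domain here are placeholders, not output from the tool):

```
# Allow all crawlers, but keep them out of /admin/
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```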

There are now so many websites on the web that Google applies certain criteria and limits when crawling them. One such limit is the amount of time its crawlers will spend on a website. When Google sees that your website is not tailored to the user experience, it crawls it more slowly, which means that new posts on your website are added to the search engine index later. To make the best use of this crawl limit, your website should have a sitemap (sitemap.xml) and a robots.txt file. These files tell the crawlers that visit your website which parts are important and need to be indexed quickly, speeding up the crawling process.

The robots.txt file is also known as the Robots Exclusion Protocol. The file must be created carefully: a single misspelled or misplaced line or directive can block pages on your website from being crawled or indexed.

What is Robots.txt Generator?

With Robots.txt Generator, a professional robots.txt file is generated quickly and easily. Using this tool, you can specify which parts of your website you want indexed and which parts you do not.

How to use Robots.txt Generator?

While preparing the robots.txt file, use the options above to specify which bots can access your website. If you do not enter any information, Robots.txt Generator allows all robots access by default.
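As a sketch of that default, a robots.txt file that allows every robot full access reduces to the most permissive form (an empty Disallow line means nothing is blocked):

```
User-agent: *
Disallow:
```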

In the second step, select a crawl-delay value, for example in the range of 15-20 seconds. If you have a sitemap for your website, enter its address on the relevant line; otherwise leave it blank and go to the next step.
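For example, choosing a 20-second delay and entering a sitemap address would produce lines like these (the domain is a placeholder):

```
User-agent: *
Crawl-delay: 20

Sitemap: https://www.example.com/sitemap.xml
```

Note that not all search engines honor the Crawl-delay directive; Google, for instance, ignores it and uses its own crawl-rate settings.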

Then check which bots you want to crawl your website.

And finally, specify restricted directories if you want. Any directory path you enter with a trailing slash (/) is treated as restricted.
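For instance, entering /cgi-bin/ and /private/ as restricted directories (hypothetical paths, for illustration only) would add Disallow lines like these to the file:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
```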

When you click the Create Robots.txt button, Robots.txt Generator creates a robots.txt file according to your selections. You can then place this file, with the directives it contains, in the root directory of your site.
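After uploading the file, you can sanity-check its rules with Python's standard urllib.robotparser module. This is a sketch: the rules and URLs below are placeholders, and in practice you would point the parser at your live file with set_url() and read() instead of parsing a string.

```python
from urllib import robotparser

# Example rules to verify (normally fetched from your live site).
rules = """
User-agent: *
Disallow: /admin/
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# Any bot may fetch the home page, but not anything under /admin/.
print(parser.can_fetch("*", "https://www.example.com/"))        # True
print(parser.can_fetch("*", "https://www.example.com/admin/"))  # False
```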

If you want to save the robots.txt file directly, click the Create and Save as Robots.txt button after specifying how the file should be prepared.