Robots.txt Generator


Default - All Robots are:

Crawl-Delay:

Sitemap: (leave blank if you don't have one)
Search Robots:
  Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch
Restricted Directories: The path is relative to the root and must end with a trailing slash "/" (see the sample file below).

Now, create a 'robots.txt' file in the root directory of your website, so that it is reachable at /robots.txt on your domain. Copy the generated text above and paste it into that file.
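
For reference, here is a minimal sketch of the kind of file this generator produces; the sitemap URL and the restricted directories are placeholders, not values the tool requires:

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /tmp/
    Crawl-delay: 10

    User-agent: Googlebot
    Allow: /

    Sitemap: https://www.example.com/sitemap.xml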


About Robots.txt Generator

Web robots, also known as crawlers, are used by search engines such as Google to automatically browse the Web and index its content.

A site's robots.txt file tells web robots which pages they are allowed or forbidden to index. The robots.txt file shouldn't be used to hide information, because it is publicly accessible: anyone can see which pages you allow or forbid the crawlers to index.
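
Because the file is public, anyone can also check how it applies to a given crawler. As a rough illustration (the domain and user-agent strings are placeholders), Python's standard urllib.robotparser module can read a site's robots.txt and answer whether a particular user agent may fetch a particular URL:

    # Minimal sketch using Python's standard library; the domain and
    # user-agent strings below are placeholders, not part of this tool.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser()
    parser.set_url("https://www.example.com/robots.txt")
    parser.read()  # fetches and parses the publicly accessible robots.txt

    # Ask whether a given crawler may fetch a given URL.
    print(parser.can_fetch("Googlebot", "https://www.example.com/private/page.html"))
    print(parser.can_fetch("*", "https://www.example.com/"))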

The Robots.txt Generator is 100% FREE and instantly lets you specify which robots can crawl your pages and which ones cannot.

Usually, this process needs a lot of attention, because a single wrong rule can block URLs you want search engines to reach, making them disappear from search results. To avoid these problems, we suggest the Robots.txt Generator. It's safe and easy to use, and, more importantly, your website will stay fully functional.

Try our Robots.txt Generator and decide which search engines can index your pages.

The Robots.txt Generator is brought to you by SEOmaxim!