Robots.txt Generator


Default - All Robots are:

Crawl-Delay:

Sitemap: (leave blank if you don't have one)

Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch

Restricted Directories: The path is relative to root and must contain a trailing slash "/". (A sample of the generated file is shown below.)
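
For illustration only, the generated text typically looks something like this; the crawl delay, sitemap URL, and restricted directories here are placeholder values, not output from your actual settings:

    User-agent: *
    Crawl-delay: 10
    Disallow: /cgi-bin/
    Disallow: /private/

    Sitemap: https://example.com/sitemap.xml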



Now create a 'robots.txt' file in your site's root directory, copy the generated text above, and paste it into that file.
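
Once the file is uploaded, it should be reachable at the /robots.txt path of your domain. A quick way to confirm this (example.com stands in for your own domain):

    curl https://example.com/robots.txt

The command should print back exactly the rules you pasted into the file.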


About Robots.txt Generator

The Robots.txt Generator helps you create a custom robots.txt file for your website to control how search engine bots crawl and index your pages.

This tool is essential for website owners and developers who want to:

  • Allow or block specific search engine crawlers (see the example below this list)

  • Prevent indexing of sensitive or duplicate content

  • Improve overall SEO and crawl efficiency
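
As a rough sketch of the first two points above, one robots.txt file can give a single crawler full access while keeping every other crawler out of a section you would rather not have indexed; the bot name and directory below are only examples:

    # Googlebot may crawl the entire site (an empty Disallow allows everything)
    User-agent: Googlebot
    Disallow:

    # All other crawlers are kept out of the duplicate /archive/ section
    User-agent: *
    Disallow: /archive/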

Just select the bots you want to allow or disallow, enter the directory rules, and generate your optimized robots.txt file in seconds — no coding required.

Use it to protect your site’s structure and guide search engines the right way!