Forum Discussion

Peter_Reilly
Feb 28, 2008

Disallow Robots

Will the following iRule work for disallowing robots?

Thanks,

    when HTTP_REQUEST {
       if {[HTTP::path] eq "/robots.txt"} {
          HTTP::respond 200 content "User-agent: *
    Disallow: /"
       }
    }

 

3 Replies

  • That is close. robots.txt is a text file, so you don't need to send HTML tags in the response. You can use \r\n to insert a new line:

    
    when HTTP_REQUEST {
       if {[string tolower [HTTP::path]] eq "/robots.txt"} {
          HTTP::respond 200 content "User-agent: *\r\nDisallow: /"
       }
    }

    It looks like Google has a robots.txt analyzer you might be able to use to verify your rule:

    How do I check that my robots.txt file is working as expected?

    Aaron
  • If you want to check for any path ending in /robots.txt, you could use:

    
    when HTTP_REQUEST {
       if {[string tolower [HTTP::path]] ends_with "/robots.txt"} {
          HTTP::respond 200 content "User-agent: *\r\nDisallow: /"
       }
    }
    

    But in practice, a robot will only request /robots.txt at the site root.

    http://www.robotstxt.org/robotstxt.html

    Aaron
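As a quick sanity check outside of the BIG-IP, the body that the iRule serves can be fed to Python's standard-library robots.txt parser (a sketch, not part of the iRule; the body string is copied from the rule above):

```python
from urllib.robotparser import RobotFileParser

# The exact body the iRule responds with for /robots.txt
body = "User-agent: *\r\nDisallow: /"

parser = RobotFileParser()
parser.parse(body.splitlines())

# "Disallow: /" under "User-agent: *" blocks every path for every robot
print(parser.can_fetch("*", "/"))              # False
print(parser.can_fetch("Googlebot", "/page"))  # False
```

If the parser reports both paths as not fetchable, the served content is a valid deny-all robots.txt.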