Peter_Reilly
Feb 28, 2008 · Nimbostratus
Disallow Robots
Will the following iRule work for disallowing robots?
Thanks,
when HTTP_REQUEST {
if {[HTTP::path] eq "/robots.txt"}
{
HTTP::respond 200 content "
User-agent: *
Disallow: /
"
}
}
when HTTP_REQUEST {
if {[string tolower [HTTP::path]] eq "/robots.txt"} {
HTTP::respond 200 content "User-agent: *\r\nDisallow: /"
}
}
It looks like Google has a robots.txt analyzer you might be able to use to verify your rule: see their help article "How do I check that my robots.txt file is working as expected?"
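If you'd rather verify locally, a quick sketch using Python's standard-library `urllib.robotparser` can confirm that the exact content the iRule serves ("User-agent: *" plus "Disallow: /") blocks all crawlers. The user-agent and URL below are placeholders for illustration:

```python
from urllib.robotparser import RobotFileParser

# Feed the parser the same lines the iRule responds with
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# A blanket "Disallow: /" should deny every path to every agent
print(rp.can_fetch("Googlebot", "http://example.com/"))          # False
print(rp.can_fetch("SomeOtherBot", "http://example.com/page"))   # False
```

Any well-behaved crawler that honors robots.txt will interpret the response the same way.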
Aaron
thanks
when HTTP_REQUEST {
if {[string tolower [HTTP::path]] ends_with "/robots.txt"} {
HTTP::respond 200 content "User-agent: *\r\nDisallow: /"
}
}
But the standard convention is for a robot to request only /robots.txt at the site root, so the exact-match version should be sufficient.
http://www.robotstxt.org/robotstxt.html
Aaron