iRules and robots.txt question
Got a quick question for the F5 and iRules experts out there. I have been asked about "putting a robots.txt file
on an LTM", and was wondering if that is possible, or even makes sense.
I understand that robots.txt is supposed to give web crawlers advice on which parts of the directory structure are off limits (advice they are free to ignore). I have looked at
https://devcentral.f5.com/wiki/iRules.Version_9_x_Robot_and_Request_Limiting_iRule.ashx
which restricts access to robots that respect robots.txt and puts rate limits on client requests.
However, the question is whether it is possible to put a robots.txt file on the F5 itself, have the F5 parse it, and then
have the F5 restrict client access according to its rules.
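For what it's worth, the closest thing I could picture is an iRule that both answers requests for /robots.txt and enforces the same restriction on clients that ignore it. This is only a rough sketch based on my reading of the HTTP::respond and HTTP::uri docs, with the content and the /private/ path made up for illustration; it hard-codes the rules rather than parsing an actual file stored on the box (which I gather would need something like an iFile):

```tcl
# Hypothetical sketch - path and content are made-up examples.
when HTTP_REQUEST {
    if { [HTTP::uri] eq "/robots.txt" } {
        # Serve robots.txt directly from the LTM, no backend needed
        HTTP::respond 200 content "User-agent: *\r\nDisallow: /private/\r\n" \
            "Content-Type" "text/plain"
        return
    }
    # Enforce the same Disallow rule on clients that ignore robots.txt
    if { [HTTP::uri] starts_with "/private/" } {
        HTTP::respond 403 content "Forbidden"
    }
}
```

But whether hard-coding like this, or actually parsing a file on the F5, is the sensible approach is exactly what I'd like an expert opinion on.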
I am not an F5/iRule expert (to say the least), so before I go out on a limb and say it can't be done,
I'd like to get some expert opinions.
Thanks,
W.