
Mike_62629
Jul 16, 2008

Rate limiting Search Spiders

We're currently having problems with some web spiders hammering our web servers: they tie up the available sessions in our application and consume a large amount of our bandwidth. We'd like to rate-limit them.

I found what appeared to be a very relevant iRule at http://devcentral.f5.com/Default.aspx?tabid=109 (the third-place winner), but when I try to load it in the iRule editor it complains. It complains, I believe, because HTTP headers are not available from within CLIENT_ACCEPTED and CLIENT_CLOSED logic. That makes sense: those events fire when TCP connections are established and torn down, so no HTTP data (headers or request URIs) has been transferred at that point.
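
For context, here's roughly the shape of the rule I'm after, moved into the HTTP_REQUEST event where the User-Agent header is actually available. This is only an untested sketch: the "bot" substring match, the 10-second window, and the 5-request limit are placeholder values, and the table command it relies on only exists in newer TMOS versions (older boxes would need the session command or a global array instead).

when HTTP_REQUEST {
    # Only throttle clients that identify themselves as spiders
    if { [string tolower [HTTP::header User-Agent]] contains "bot" } {
        # Count requests per client IP; the counter entry expires
        # 10 seconds after this client's last request
        set key "spider_[IP::client_addr]"
        set reqs [table incr $key]
        table timeout $key 10

        # Over the limit: answer directly and ask the spider to back off
        if { $reqs > 5 } {
            HTTP::respond 503 content "Rate limit exceeded" Retry-After "10"
            return
        }
    }
}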

Does anyone have any suggestions on how to accomplish this or something similar?
