Posted to users@httpd.apache.org by Joshua Slive <jo...@slive.ca> on 2002/03/07 18:08:11 UTC
Re: Possible rehash of a question, but looking for a different solution
On Thu, 7 Mar 2002, Aaron wrote:
> I have seen many ways using cgi scripts and User-Agent variables to block
> spiders from an Apache web site.
>
> I am looking for something a little more complex that can be done in the
> httpd.conf file (so that it is not dependent on anything external) which will
> detect when a user has accessed a spam-bot/spider page, and will then set a
> variable which will follow them through their sessions.
There is no such thing as a "session" in HTTP, so I think you need to
refine your requirements a little. You can create sessions with cookies,
but obviously that will not help for a spider that doesn't accept cookies.
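To illustrate the cookie idea, here is a minimal httpd.conf sketch using mod_rewrite. The bait page name (/trap.html), the cookie name (botmark), and the domain (.example.com) are all placeholders, not anything from the original question, and this assumes mod_rewrite is loaded:

```apache
RewriteEngine On

# When the hypothetical bait page is requested, issue a marker cookie.
RewriteRule ^/trap\.html$ - [CO=botmark:1:.example.com]

# On every later request, refuse clients that present the marker cookie.
RewriteCond %{HTTP_COOKIE} botmark=1
RewriteRule .* - [F]
```

As noted above, a spider that never sends cookies back will sail past the second rule, so this only catches clients that play along.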
You could also block all requests from a specific IP address. But then
you may block other users of the same proxy.
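For completeness, a per-IP block in httpd.conf looks like this (the address and path are placeholders):

```apache
# Refuse all requests from one address; everyone else is allowed.
<Directory "/var/www/html">
    Order allow,deny
    Allow from all
    Deny from 192.0.2.1
</Directory>
```

The proxy caveat applies: every user behind that address is blocked along with the spider.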
In other words, this is a very difficult problem, so you need to be very
specific about what you are trying to do.
Joshua.
---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@httpd.apache.org
For additional commands, e-mail: users-help@httpd.apache.org