I have a Squid 2.7 STABLE7 caching server in front of my Apache web servers.
Every now and then when I hit the main page of the company's web site, I get an Apache 403 Forbidden page.
If I hit refresh, I get right in.
The following is from the squid access log:
www.company.com 172.21.84.170 - - [14/Jan/2010:14:28:26 -0500] "GET http://172.21.100.66/ HTTP/1.1" 403 584 "-" "Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10.4; en-US; rv:1.9.1.7) Gecko/20091221 Firefox/3.5.7" TCP_NEGATIVE_HIT:NONE
www.company.com 172.21.84.170 - - [14/Jan/2010:14:28:36 -0500] "GET http://172.21.100.66/ HTTP/1.1" 200 27034 "-" "Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10.4; en-US; rv:1.9.1.7) Gecko/20091221 Firefox/3.5.7" TCP_MISS:ROUNDROBIN_PARENT
Did Apache generate the 403 return code, or did Squid?
I know a change was made to the web servers to address problems with rogue search engine robots/spiders.
An entry was put into .htaccess to block any visitors whose User-Agent string is blank; I believe it looks something like the sketch below. Could that be causing a problem?
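I don't have the exact .htaccess contents in front of me, so this is only my guess at the kind of mod_rewrite rule that was added, not a copy of the real file:

    RewriteEngine On
    # Return 403 Forbidden when the User-Agent header is empty or missing
    RewriteCond %{HTTP_USER_AGENT} ^$
    RewriteRule .* - [F]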
If I'm reading the log correctly, the User-Agent string is defined, but the Referer is blank (the first quoted "-" is the Referer, and the quoted Mozilla/5.0 string is the User-Agent).
Here is an entry from the squid log where BOTH the Referer and the User-Agent string are blank (though the rule only checks for a blank User-Agent):
www.company.com 68.142.243.85 - - [14/Jan/2010:05:47:24 -0500] "GET http://172.21.100.66/ HTTP/1.1" 403 576 "-" "-" TCP_MISS:ROUNDROBIN_PARENT
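If it helps narrow this down, I was going to test by sending a request with no User-Agent straight at the backend; the IP below is just the backend address from the log, and the Host header is my assumption about the vhost:

    curl -sS -o /dev/null -w '%{http_code}\n' \
         -H 'User-Agent:' \
         -H 'Host: www.company.com' \
         http://172.21.100.66/

Passing -H 'User-Agent:' with nothing after the colon makes curl drop that header entirely, so if Apache itself answers 403 to this, the .htaccess rule is at least part of the story.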
I'm not sure if this is the right place to ask, or if I should post this over on the Apache users forum.
Any help would be appreciated.
Thanks