I am having a problem getting Squid to work the way I want, and I
would like some advice on whether it should be able to work that way
at all.
I have Squid installed on an IP masquerading machine running RedHat 6.1,
arranged like this on our campus network:
WORLD <---> ROUTER <---> LINUX/SQUID <---> ALL DORMS
It all works perfectly when I tell the browser to use the caching
proxy, but since this proxy serves about 1000 dormitory students,
I would like them to use the web cache without each of them
having to reconfigure their web browser.
So, following what is suggested in the Squid FAQ at
http://www.squid-cache.org/Doc/FAQ/FAQ-17.html
under "How can I make my users' browsers use my cache without
configuring the browsers for proxying?", I used ipchains to redirect
all web traffic destined for port 80 to port 3128:
ipchains -I input -p tcp -s 192.168.0.0/16 -d 0/0 80 -j REDIRECT 3128
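(As a sanity check, not something from the FAQ, the rule and its packet
counters can be listed with

ipchains -L input -n -v

to see whether the redirect rule is actually matching the dorm traffic.)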
After doing this, it appears that Squid sees the traffic, but the
access log is full of entries like
192.168.2.151 NONE/400 1122 GET
where, according to the documentation, "NONE" means that "Squid does
not forward the request at all".
The store.log file has entries like
RELEASE FFFFFFFF 400 -1 -1 -1 unknown -1/1021
A workstation attempting to access the web via the proxy gets the
error "The requested URL could not be retrieved. While attempting to
retrieve the URL: /  The following error was encountered: Invalid URL".
It seems to be saying that every URL requested is just "/".
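If I understand the FAQ correctly, an intercepted request carries only
the path in its request line (e.g. "GET / HTTP/1.0"), so Squid has to
rebuild the full URL from the Host: header, and the same FAQ page lists
squid.conf settings for that, roughly (quoting from memory, so these
may not be exact):

httpd_accel_host virtual
httpd_accel_port 80
httpd_accel_with_proxy on
httpd_accel_uses_host_header on

Could the "/" entries mean one of these is missing or wrong on my end?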
If I again configure my browser to use the proxy while the port is
forwarded, all works fine.
As soon as I stop the port forwarding, direct access is immediately
restored.
Am I missing something basic, or am I barking up the wrong tree
altogether?
Brent.