> Are your file descriptor limits correctly configured? Give squid at
> least 6 descriptors per client to be safe.
> Is squid's limit under the user limit?
I have:
# ulimit -n
1024
And even as latency skyrockets I only have:
# lsof -u squid | wc -l
80
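A quick way to compare descriptor usage against the per-process limit is to
read /proc directly. This sketch uses /proc/self as a stand-in; on the proxy
host you would substitute squid's PID (e.g. from `pgrep -o squid`), which is
an assumption about how squid is running there:

```shell
# Count open descriptors and read the soft "Max open files" limit for a
# process. /proc/self is used here as a placeholder PID; replace with
# squid's PID on the actual system.
fd_count=$(ls /proc/self/fd | wc -l)
fd_limit=$(awk '/Max open files/ {print $4}' /proc/self/limits)
echo "fds in use: $fd_count / soft limit: $fd_limit"
```

Note this reads the limit of the running process itself, which can differ
from what `ulimit -n` reports in your login shell.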
Is it a problem that the client is behind a router which I have no
control over? Does squid need to establish an inbound connection to
the client?
client -> router (not mine) -> internet -> squid (mine) -> internet
- Grant
>> I'm running squid-3.3.5 on a remote Gentoo system with the default
>> config file. Small sites like squid-cache.org load fine on the client
>> in firefox but large sites like cnn.com and amazon.com don't load all
>> the way. A few page elements load but then it gets stuck on
>> "Connecting", "Waiting", or "Transferring". Once it hangs, SSH
>> latency to the squid system becomes extremely high and I quickly lose
>> contact with the system completely. Someone working on the squid
>> system locally also cannot access the internet. Closing firefox on
>> the client brings internet connectivity back very quickly.
>> /var/log/squid/access.log shows some NONE_ABORTED and TCP_MISS_ABORTED
>> requests. Does anyone know what could be causing this?
>>
>> - Grant
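One way to gauge how often transfers are being dropped is to tally the
ABORTED result codes in the access log. A minimal sketch, assuming the
default native log format (result code in field 4) and the default log
path mentioned above:

```shell
# Count log lines whose result code (field 4 in squid's native access.log
# format, e.g. TCP_MISS_ABORTED/000) contains ABORTED.
count_aborted() {
    awk '$4 ~ /ABORTED/ {n++} END {print n+0}' "$1"
}
# Example: count_aborted /var/log/squid/access.log
```

Comparing that count against total requests over time would show whether
the aborts correlate with the latency spikes.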
Received on Thu Jul 04 2013 - 07:00:15 MDT
This archive was generated by hypermail 2.2.0 : Thu Jul 04 2013 - 12:00:06 MDT