Hi,
I found that when squid fetches some web pages, such as
http://en.beijing2008.com/, the connection to the origin server
always breaks after only one or two packets arrive. A squidclient
diagnosis shows that only the first part of the HTML file arrives;
after that, squid seems to be waiting for something, and then the
client connection is reset. Here is the access.log entry:
1144130265.689 34562 127.0.0.1 TCP_MISS/200 2818 GET
http://en.beijing2008.com/ - DIRECT/61.183.246.108 text/html
There is nothing special in cache.log, no specific tuning of squid,
and no changes to the TCP/IP settings of my OS. I've tried a very
basic squid config, and even disabled caching, but the problem is
still there.
So far I have compiled squid-2.5.STABLE13 on my RHEL, FreeBSD 6.0,
and FreeBSD 5.3 boxes, and even configured my browser to use some
squid proxies published on the Internet; all of them show the same
problem. Yet with wget or fetch on the same machine I can download
the same page (http://en.beijing2008.com/) directly without any
trouble.
This site (en.beijing2008.com) uses HTTP acceleration, and you get
different IPs depending on your source IP. At least with these IPs,
squid does not work well: 61.183.246.108, 211.154.222.24. I suspect
there is some incompatibility between squid and that server daemon
at the TCP/IP layer.
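To help narrow it down, here is a rough sketch (my own illustration, not squid source, and the exact header values are assumptions) of how the upstream request squid forwards can differ from what wget sends directly; the extra Via / X-Forwarded-For headers and the forced HTTP/1.0 are the kind of thing an accelerator front end might mishandle. Capturing real traffic with tcpdump would confirm which difference matters:

```python
# Illustrative only: compare a direct wget-style request with the kind of
# request squid-2.5 typically forwards upstream. Hostnames and header
# values are placeholders for illustration.

def direct_request(host, path="/"):
    # Roughly what wget sends when talking to the origin server itself.
    return (f"GET {path} HTTP/1.0\r\n"
            f"Host: {host}\r\n"
            f"User-Agent: Wget\r\n"
            f"\r\n")

def proxied_request(host, path="/", client_ip="127.0.0.1"):
    # squid-2.5 speaks HTTP/1.0 upstream and adds hop metadata such as
    # Via and X-Forwarded-For; a picky accelerator may reset on these.
    return (f"GET {path} HTTP/1.0\r\n"
            f"Host: {host}\r\n"
            f"Via: 1.0 proxy.example:3128 (squid/2.5.STABLE13)\r\n"
            f"X-Forwarded-For: {client_ip}\r\n"
            f"Connection: close\r\n"
            f"\r\n")

if __name__ == "__main__":
    print(direct_request("en.beijing2008.com"))
    print(proxied_request("en.beijing2008.com"))
```

Diffing the two printed requests against a tcpdump capture of the real sessions should show exactly which header (or the HTTP/1.0 downgrade) triggers the reset.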
Any suggestions would be greatly appreciated.
Regards,
Bin
Received on Tue Apr 04 2006 - 00:39:19 MDT