Thanks. I am looking at the squid access.log and the delay is caused by
a GET which for some reason does not result in a response from the
server: either there is no response, or Squid is missing it.
After a 120-second timeout the page continues loading, but the end
result may be malformed because of the object that did not load.
The error object is different every time and seems random! So the page
never loads properly with Squid 3.x and takes about 125 seconds to load.
It always loads properly without Squid and takes about 5 seconds to
load. It always loads properly using Squid 2.7 and takes about 5 seconds
to load.
For consistency in tracking the problem down, I have Squid's disk and
memory caches disabled so every client request is a "cache miss".
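For reference, a minimal squid.conf fragment along these lines (a sketch, not my exact config; directive names are from stock Squid 3.x, adjust to taste):

```
# No cache_dir defined, so there is no disk cache.
# Disable the memory cache and forbid storing any response,
# so every client request is a cache miss.
cache_mem 0 MB
cache deny all
```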
Strange eh?
Max
P.S. I am debugging natively on my Ubuntu 10.10 64-bit laptop using
Firefox, but the original problem comes from an embedded device running
the QNX RTOS with a libcurl-based WebKit browser (both the browser and
Squid are running on 127.0.0.1 in each case, but the problem happens
across the network as well).
-----Original Message-----
From: Amos Jeffries [mailto:squid3_at_treenet.co.nz]
Sent: Wednesday, January 19, 2011 9:18 PM
To: squid-users_at_squid-cache.org
Subject: Re: [squid-users] Squid 3.x very slow loading on ireport.cnn.com
On 20/01/11 13:31, Max Feil wrote:
> I'm wondering if anybody knows what might be causing this. I've
> confirmed this problem in linux builds of Squid 3.0, 3.1.1, 3.1.10 and
> 3.2.0.4.
>
> Using firefox (or probably any browser - it also happens in a webkit
> based browser under development) clear the browser's disk cache and try
> to load or reload http://ireport.cnn.com (with proxy address/port set to
> Squid of course). Loading the page takes a very long time (several
> minutes) even on a fast network connection. Take Squid out of the mix
> and everything loads in seconds.
>
> This is using the default squid.conf file. The problem does not happen
> in Squid 2.7!
>
> Thanks,
> Max
There are 101 different objects assembled into that one page coming from
10 different domains.
Browsers set a very low limit on the number of connections and objects
fetched in parallel when using a proxy, compared to going direct.
Large pages like this make the speed difference more noticeable.
That will account for some of the extra time, but it should not take
that much longer. You will need to find out which objects are taking too
long (Firebug or the WebKit dev tools should help) and then figure out
why.
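One way to do that from the command line is curl's timing breakdown (a sketch; 127.0.0.1:3128 is Squid's default http_port and the URL is just the page itself - substitute each suspect object URL in turn):

```shell
#!/bin/sh
# Fetch one object through the proxy and print curl's timing breakdown.
# A large gap between time_connect and time_starttransfer points at the
# server (or Squid) sitting on the request rather than network transfer.
PROXY=http://127.0.0.1:3128
URL=http://ireport.cnn.com/
curl --proxy "$PROXY" --output /dev/null --silent \
     --write-out 'code=%{http_code} connect=%{time_connect}s ttfb=%{time_starttransfer}s total=%{time_total}s\n' \
     "$URL"
```

Running that in a loop over the object URLs from access.log should make the 120-second stragglers stand out immediately.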
Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.10
  Beta testers wanted for 3.2.0.4

Received on Thu Jan 20 2011 - 07:50:37 MST