On 02/06/11 16:47, Tim Boughen wrote:
> I have installed and configured Squid (2.7 and 3.0).
>
> Default settings in place except for ACL to allow access for local network.
>
> Squid is working 99.99% with the exception of a few sites, including http://www.geelongadvertiser.com.au/
>
> I get some header banner being displayed, and nothing else. Eventually, after 10 minutes, the page is displayed.
>
> I am currently in Solomon Islands, and not sure if it is due to the location.
>
> I am interested if other users of Squid experience the same issue with this site, or do I need to further configure squid.conf?
>
> Cheers
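(For reference, the local-network ACL you describe is normally only a
couple of lines in squid.conf, something like the below; the subnet is
just an example, adjust it to your own network.)

    # example local network; change to match your LAN
    acl localnet src 192.168.1.0/24
    http_access allow localnet
    http_access deny all
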
Let's see...
http://redbot.org/?descend=True&uri=http://www.geelongadvertiser.com.au/
The site has a huge number of items (196) to download just for the front
page, *all* of them containing at least one HTTP transfer problem.
Most of these are bandwidth-wasting problems (INM, i.e. If-None-Match,
failures). The core pages and scripts also show variant confusion
(leading either to data corruption or to even more wasted bandwidth).
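You can reproduce the INM problem yourself with two conditional
requests. A rough sketch using Python's requests library (the object
URL below is only a stand-in for one of the site's static images):

    import requests

    # Stand-in URL: substitute any static image from the front page.
    url = "http://www.geelongadvertiser.com.au/images/example.gif"

    first = requests.get(url)
    etag = first.headers.get("ETag")
    last_mod = first.headers.get("Last-Modified")

    # Re-request conditionally. A compliant server answers 304 Not
    # Modified with an empty body; this one sends the full object
    # again (200), wasting the bandwidth all over again.
    headers = {}
    if etag:
        headers["If-None-Match"] = etag
    elif last_mod:
        headers["If-Modified-Since"] = last_mod
    second = requests.get(url, headers=headers)
    print(second.status_code, len(second.content))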
FireBug tells me these 196 requests consume 2.3 MB of bandwidth and
take 4 minutes to download on my fast ADSL connection.
Several dozen are fetched by jQuery scripts, which start badly with 10
seconds to download jQuery itself, then go on to make things worse by
sub-fetching a handful of widgets at 10-40 seconds' delay each
(sequentially, not in parallel).
So if your browser is one of those which waits for everything to
arrive before displaying the page, it will wait for a while. If it also
waits for all the scripts to execute, it will be waiting a long time.
The header appearing is the odd one out, but it can be explained as
actually being a (nicely small) second page in an iframe.
It appears that if you wait around for all of this to complete, one of
the scripts is a little mouse-catcher which detects the mouse (moving,
not stopping) over one of the captions and reloads the entire page to
expand it.
It did this to me twice, though I'm not sure why the second time
around. Maybe there is a timer reload as well.
Anyways ... you astound me with your patience in both visiting this
site and waiting the required time for it to download.
The webmaster for this site has something to answer for with regard to
those annoying scripts. However, to be fair, most of the problems and
their severity are caused by the web server not handling HTTP variant
objects in a compliant way. It responds to all INM requests by sending
back the full object (mostly static images). The core objects are
served without a Vary: header, which will cause middleware to settle
(hopefully) on sending non-compressed versions.
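The Vary problem can be demonstrated the same way (again a sketch with
Python's requests): fetch the page once without compression and once
asking for gzip, then compare the Content-Encoding and Vary headers.

    import requests

    url = "http://www.geelongadvertiser.com.au/"

    plain = requests.get(url, headers={"Accept-Encoding": "identity"})
    gzipped = requests.get(url, headers={"Accept-Encoding": "gzip"})

    # A compliant server sending a compressed variant must also send
    # "Vary: Accept-Encoding"; without it a cache cannot tell the two
    # variants apart and may hand a gzipped body to a client that never
    # asked for it, or give up and serve only the uncompressed form.
    for reply in (plain, gzipped):
        print(reply.status_code,
              reply.headers.get("Content-Encoding"),
              reply.headers.get("Vary"))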
The server software alternately claims to be "AkamaiGHost" or "Apache"
(with no version details on either) depending on the page fetched.
Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.12
  Beta testers wanted for 3.2.0.8 and 3.1.12.2