OK, here is an update; I wonder if anyone can help me understand
this. I tracked the problem I was having down to my use of
mod_deflate with Apache. If Squid caches the compressed content from
the web server, I end up with what looks like a separate cached copy
of each page for every browser version that accesses it. If I turn
compression off, I get one cached page for all users and all browsers.
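For reference, the deflate setup I mean is basically the stock
example from the Apache mod_deflate documentation (my exact
directives may differ slightly):

    # compress text content on the way out
    AddOutputFilterByType DEFLATE text/html text/plain text/xml
    # Netscape 4.x only copes with compressed text/html
    BrowserMatch ^Mozilla/4 gzip-only-text/html
    # Netscape 4.06-4.08 have problems with compression altogether
    BrowserMatch ^Mozilla/4\.0[678] no-gzip
    # MSIE masquerades as Netscape but handles compression fine
    BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
    # tell caches the response differs per browser
    Header append Vary User-Agent env=!dont-vary

My guess is that the last line is the culprit: it puts User-Agent
into the Vary header, so Squid keeps a separate variant for every
browser string it sees.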
I then set up an Apache proxy with compression on my Squid server,
listening on port 80; it forwards every request straight to Squid on
port 8080, and on a miss Squid fetches the page from the web server.
With that in place, Squid again works as expected, feeding one cached
copy to all users regardless of browser version.
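In case it matters, the front-end vhost on the Squid box looks
roughly like this (the hostname and addresses are placeholders for my
setup):

    <VirtualHost *:80>
        ServerName www.example.com
        # compress whatever goes back to the client
        SetOutputFilter DEFLATE
        # plain reverse proxy, hand everything to Squid on 8080
        ProxyRequests Off
        ProxyPreserveHost On
        ProxyPass / http://127.0.0.1:8080/
        ProxyPassReverse / http://127.0.0.1:8080/
    </VirtualHost>

so compression happens after Squid, and Squid only ever sees (and
stores) the uncompressed pages.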
The only downside is that the cache is now populated with the
full-size pages instead of the compressed versions. Not a big deal,
but ideally I would like to have the compressed pages cached and
still have the same page served to all users. Any suggestions?
Thanks very much for any info.
Matthew
On 8/7/06, Henrik Nordstrom <henrik@henriknordstrom.net> wrote:
> Mon 2006-08-07 at 14:34 -0700, Matthew Shoemaker wrote:
> > I have Squid 2.6.STABLE1 up and running on SUSE 10.1 in acceleration
> > mode. It works fine under normal testing on my PC. But if I open up,
> > say, Internet Explorer instead of Firefox and hit the same address on
> > the cache server, I get a cache miss in the access log.
>
> Many browsers force a fresh copy on the first request. Make sure you
> navigate to the intended page via a link. Avoid bookmarks and reload
> buttons. Also remember to clear your browser cache on at least one of
> the test stations (the first to request the test page).
>
> Regards
> Henrik
>
>
>