Could we possibly test the news caching to prove that it is working?
(Matthew, not that I'm not confident in your reply, but I guess we need to
empty our local browser cache for the test and see how it really works.
Anyway, how do I safely empty my cache?) Also, I guess a maximum time-out
(time to live) of 48 hours for the news pages would be about right? How do
I go about doing this?
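For clearing the cache on the proxy itself, would something along these
lines be the safe way to do it? The cache directory below is only the
common default; I would use whatever cache_dir points at in my squid.conf.

    squid -k shutdown                # stop Squid cleanly
    rm -rf /usr/local/squid/cache/*  # remove the on-disk cache objects
    squid -z                         # recreate the (now empty) swap directories
    squid                            # start Squid again

(And for the browsers, I suppose we just clear each client's own disk and
memory cache from its preferences before the test.)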
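For the 48-hour idea, I came across the refresh_pattern directive in
squid.conf; would rules roughly like these be the right direction? The
regexes and numbers are only my guess, and as I understand it they only
apply when the server does not send its own expiry information.

    #                <regex>                    <min> <percent> <max>  (minutes)
    refresh_pattern  ^http://www\.cnn\.com/         0    20%     2880
    refresh_pattern  ^http://news\.bbc\.co\.uk/     0    20%     2880
    refresh_pattern  .                              0    20%     4320

2880 minutes is the 48 hours I had in mind; the last line is just the
usual catch-all default.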
hoping for your response
Joel
Matthew King wrote:
> Hi.
>
> Date forwarded: Mon, 5 Apr 1999 05:00:46 -0700 (PDT)
> Date sent: Mon, 05 Apr 1999 18:56:53 +0800
> From: Joel Taqueban <jtaqueba@mnl-hub.dhl.com>
> To: squid-users@ircache.net
> Subject: Caching
> Forwarded by: squid-users@ircache.net
>
> >
> > Just questions on the caching:
> >
> > As I understand it, when a client requests web pages via the proxy,
> > the proxy fetches and stores them in its cache, so that when another
> > client tries to access the same page it is actually served from the
> > proxy's cache, without the proxy fetching the pages from the site
> > again.
> >
> > I've got concerns though:
> >
> > Some sites, such as news sites like CNN News, BBC News, etc., update
> > their pages almost every hour. I just wonder how Squid handles these
> > kinds of web pages. I mean, how do you define a 'time to live' for
> > pages that change every now and then? For example, at 0700 hrs. a
> > certain client checks CNN News and that page is cached on the proxy.
> > At 0900 hrs. the same client checks the same news site, but surely the
> > page has already been updated. How does Squid handle this? Does the
> > proxy delete the previous page from the cache and request a copy of
> > the new one?
>
> When you request a page through Squid, or any proxy, it will first
> check to see if there is a copy of the page in its cache. If there is
> not, it will fetch the page from the location you requested and stream
> it to you while copying it into its cache. When another person requests
> the same page, the proxy checks whether the page exists in its cache;
> if it does, the proxy then checks whether a newer version of the page
> exists at the location you are requesting it from. That way pages are
> always up to date: if the page you request has changed, you get the new
> copy, and if it has not, you get the cached copy. Of course, checking
> for a new version of the page takes a moment (depending on your link to
> the net), but it is much faster than fetching the whole page every
> time :)
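>
> The check itself is just a conditional HTTP request; roughly like this
> (the URL and dates are only made up for the example):
>
>     GET http://www.cnn.com/ HTTP/1.0
>     If-Modified-Since: Mon, 05 Apr 1999 10:00:00 GMT
>
>     HTTP/1.0 304 Not Modified     (unchanged - the cached copy is served)
>     ...or a full 200 OK response with the new page if it has changed.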
>
> Hope that kinda answers your question.
>
> Cya
> Matthew
>
> --------------------------------------------------
> My ICQ#: 2342475
> Message me!
> --------------------------------------------------
>
> --------------------------------------------------
> Cellular Phone: +61 415 257 516
> E-Mail: nerd@zip.com.au
> matthew_king@hotmail.com
> Homepage: http://www.zip.com.au/~nerd
> --------------------------------------------------
>
> ---------------------------------------------------------------------------
>
> -----BEGIN GEEK CODE BLOCK-----
> Version: 3.1
> GM/S d(-)@ s(-):(+) a--- C++++(+++) U P(+) L(+) E? W++>+++ N++ o? K++
> w !O- M--(-) !V- PS PE Y(+) !PGP t+++ 5+++(++) X+ R+++ tv++ b+++ DI+ D
> G++ e h+ r-->+++ y
> ------END GEEK CODE BLOCK------
>
> ---------------------------------------------------------------------------
Received on Wed Apr 14 1999 - 08:17:37 MDT