On Tue, 17 Dec 1996, Donald Neal wrote:
> One other approach to sites like www.microsoft.com would be to use caching
> software which can be set to ignore the wishes of the content publisher,
> caching material based on rules written by the cache administrator despite
> the wishes of the authors. Ignoring cookies and caching everything for
> administrator-set periods based on URLs regardless of headers is quite
> technically feasible. And do we really mind annoying Microsoft?
> Not a general solution, I know, but there will be sites where a
> significant proportion of traffic is to only a very small number of servers,
> and it is thus worth the administrator's time and effort to tune for what
> those servers do.
What about adaptive tuning?
Have some kind of a "fool" list, which records how many times the cache pulled
an object from Micro$oft that hadn't actually changed, compared to the
number of really "needed" pulls.
If the cache gets "fooled" 99.9% of the time, then you multiply the
expiry by 1000.
The numbers may need some adjustment...
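Roughly like this (a Python sketch just to show the idea, not Squid code;
the FoolList name, the per-server counters and the 300-second base TTL are
made-up assumptions of mine, not anything Squid actually has):

from collections import defaultdict

class FoolList:
    """Per-server counters of 'fooled' vs. 'needed' pulls (hypothetical)."""

    def __init__(self, base_ttl=300, threshold=0.999, multiplier=1000):
        self.base_ttl = base_ttl        # administrator-set default expiry, in seconds
        self.threshold = threshold      # fool ratio above which we stop trusting the server
        self.multiplier = multiplier    # how much to stretch the expiry when fooled
        self.fooled = defaultdict(int)  # pulls where the object turned out unchanged
        self.needed = defaultdict(int)  # pulls where the object had really changed

    def record_pull(self, server, changed):
        if changed:
            self.needed[server] += 1
        else:
            self.fooled[server] += 1

    def ttl_for(self, server):
        total = self.fooled[server] + self.needed[server]
        if total == 0:
            return self.base_ttl
        fool_ratio = self.fooled[server] / total
        if fool_ratio >= self.threshold:
            # Fooled 99.9% of the time: stretch the expiry by the multiplier.
            return self.base_ttl * self.multiplier
        return self.base_ttl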
/Morten %-)
PS. You need to store a CRC on all objects, otherwise you can't tell if an
object really changed :-(
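For the CRC part, something like this (again only a sketch; zlib.crc32 is
just a stand-in for whichever checksum the cache would actually store
alongside each object):

import zlib

def crc_of(body: bytes) -> int:
    """Checksum of a cached object's body."""
    return zlib.crc32(body) & 0xFFFFFFFF

def really_changed(stored_crc: int, new_body: bytes) -> bool:
    """True only if a refetched object differs from the copy we already hold."""
    return crc_of(new_body) != stored_crc

A re-fetch would then do fools.record_pull(server,
changed=really_changed(old_crc, new_body)) and use fools.ttl_for(server)
as the object's expiry.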
--
--------------------------------------------------------------
Morten.Guldager.Jensen@uni-c.dk   UNI-C Denmark.  +45 3587 8935
--------------------------------------------------------------
* Linux is FREE, and if you don't like it YOU can CHANGE it. *