This works great!
However, I ran into another problem. I need the page to stay cached for 30
seconds. I added a Last-Modified header set to the current time and a
Cache-Control header with max-age=30 to the response. In squid.conf I have
"refresh_pattern . 0 50% 1". The pages are being refreshed every 113
seconds instead. Can you shed some light on this?
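For reference, here is roughly what I have; the date in the Last-Modified
header below is only a placeholder for "the current time":

    Response headers sent by the origin server:

        Last-Modified: Thu, 05 Oct 2000 14:30:00 GMT
        Cache-Control: max-age=30

    Relevant line in squid.conf:

        refresh_pattern . 0 50% 1

As far as I understand, the min and max fields of refresh_pattern are given
in minutes (so this line means min 0 minutes, 50% last-modified factor, max
1 minute), and an explicit max-age from the origin should normally take
precedence over the refresh_pattern heuristic, but please correct me if I
am reading that wrong.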
Thank you very much!
Alan
> -----Original Message-----
> From: Henrik Nordstrom [mailto:hno@hem.passagen.se]
> Sent: Wednesday, October 04, 2000 12:13 AM
> To: Chan, Alan S
> Cc: squid-users@ircache.net
> Subject: Re: [SQU] Multiple request blocking
>
>
> Chan, Alan S wrote:
> >
> > With Squid, is there a way to block multiple requests from being
> > forwarded to the Internet when a similar request has already been
> > received, forwarded to the Internet, and the response is still being
> > waited upon?
> >
> > This feature would really help for heavily hit web pages that have a
> > long retrieval time, and it would save multiple requests from being
> > unnecessarily forwarded to the web site.
>
>
> Maybe this will work:
>
> edit global.h and change the 1 after
> neighbors_do_private_keys to a 0,
> then run "make install".
>
> If it does work, an option for this should perhaps be added to
> squid.conf.
>
>
> Please note that it will also cause problems with stalled requests, as
> browsers do not allow the user to force a reload until they have at
> least received the headers of the page.
>
> --
> Henrik Nordstrom
> Squid hacker
>