Henrique Abreu wrote:
> Hi!
>
> Does Squid have an option to automatically "suck" (prefetch) the content of
> web sites at predefined times, to save time for connected users?
>
> Regards,
> Abreu
My best bet would be to create a cron job that recursively wgets the pages
in question and then deletes what it downloaded; the point is just to pull
the content through the proxy so it ends up in Squid's cache. You can
instruct wget to use a proxy via the http_proxy environment variable or a
setting in its startup file (~/.wgetrc).
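For example, a minimal sketch of such a crontab entry (the URL, proxy
host/port, crawl depth, and schedule below are placeholders to adjust):

    # Every night at 03:00, crawl the site through the Squid proxy to warm
    # its cache, then discard the downloaded files (wget's --delete-after
    # option exists for exactly this pre-fetching use case).
    0 3 * * * http_proxy=http://my-squid-host:3128/ wget --quiet --recursive --level=2 --delete-after http://www.example.com/

Using --delete-after also saves you a separate cleanup step after the fetch.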
[]'s,