On Tue, Jan 04, 2000 at 09:35:06AM -0800, Ramesh Veemaraj wrote:
...
> So, as I read in the FAQ regarding purging cache files,
> I used the following command:
>
> $ ./client -m PURGE http://www.sample.com/
>
> In the output it says code 200 OK, but
> later all the associated pages (items 2, 3 and 4)
> are still available in the cache.
Unfortunately, the PURGE method only works on an exact URL match.
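Each cached object has to be purged with its own full URL, e.g. (the extra
page names below are just made-up examples):

  $ ./client -m PURGE http://www.sample.com/
  $ ./client -m PURGE http://www.sample.com/page2.html
  $ ./client -m PURGE http://www.sample.com/page3.html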
> My intention is that when I use the above client command,
> giving only http://www.sample.com/, it should delete
> all the associated files, but that is not the case:
> I have to issue client -m PURGE individually to delete
> each file. So is there any option to do that, or can
> you confirm whether Squid has this facility?
No, there isn't.
Sometime in the coming month I should have a Perl script working that
handles this kind of thing: it reads the squid.conf file, walks through
the directories listed in its cache_dir lines, finds all the URLs
belonging to a domain, and issues a PURGE for each of them. It's a pain,
but right now that's the only approach I'm aware of.
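To give an idea of what I have in mind, here's a rough, untested sketch.
It assumes Squid 2.2-style cache_dir lines (with a storage type field),
that each swap file stores the object's URL as a readable string near the
start of the file, and that the "client" binary sits in the current
directory -- all of those are assumptions you'd want to adjust locally:

  #!/usr/bin/perl -w
  # Sketch: purge every cached URL under a given domain.
  use strict;
  use File::Find;

  my ($domain, $conf) = @ARGV;
  $conf ||= '/usr/local/squid/etc/squid.conf';
  die "usage: $0 domain [squid.conf]\n" unless $domain;

  # Collect every cache directory named in squid.conf.
  my @cachedirs;
  open(CONF, $conf) or die "can't read $conf: $!\n";
  while (<CONF>) {
      # e.g. "cache_dir ufs /usr/local/squid/cache 100 16 256"
      push @cachedirs, (split)[2] if /^\s*cache_dir\s/;
  }
  close(CONF);
  die "no cache_dir lines found in $conf\n" unless @cachedirs;

  # Walk the cache directories and pull out URLs for the domain.
  my %urls;
  find(sub {
      return unless -f $_ && $_ ne 'swap.state';
      open(OBJ, $_) or return;
      my $head = '';
      read(OBJ, $head, 4096);   # the object's URL is stored near the start
      close(OBJ);
      $urls{$1} = 1 if $head =~ m{(http://\Q$domain\E/[^\s\0]*)};
  }, @cachedirs);

  # Issue an exact-match PURGE for each URL found.
  for my $url (sort keys %urls) {
      print "purging $url\n";
      system('./client', '-m', 'PURGE', $url);
  }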
A PURGE_REGEX method built into Squid would be mega-cool, but I am
not comfortable hacking into the Squid internals. (Yet?)
-- Clifton
-- Clifton Royston -- LavaNet Systems Architect -- cliftonr@lava.net
   "An absolute monarch would be absolutely wise and good. But no man is
    strong enough to have no interest. Therefore the best king would be
    Pure Chance. It is Pure Chance that rules the Universe; therefore,
    and only therefore, life is good." - AC