On Mon, 26 Apr 2004, Nick wrote:
> I have Squid set up as a reverse proxy and am trying to get one page excluded from the cache. I thought I could do this with url_regex or urlpath_regex, but the page still gets cached. Is my syntax wrong? How do I get one URL excluded? Two examples of URLs that should be excluded are
> http://test.test.com/classifieds-bin/classifieds?temp_type=detail&tl=2&classification=employment
> and
> http://test.test.com/classifieds-bin/classifieds?temp_type=detail&category_number=333&classification=autos&date=today,sunday_before(today)&orderby=start_date:d
>
> Here is what I put in the squid.conf file.
> acl excludeURL url_regex test.test.com\/classifieds-bin\/classifieds?temp_type=detail&category_number=333&classification=auto&date=today,sunday_before(today)&orderby=start_date:d
> no_cache deny excludeURL
This is not correct regex syntax for the given URL, and why have you left
out the protocol part? You should also anchor the pattern to the start/end
of the line.
It is mainly ., ? and + that you need to escape, as these have special
meaning in regex. There are also a few other special characters, but these
are not often seen in URLs. There is no need to escape the /, as / is just
/ and nothing special.
Complete list of special characters in regex: ^.[]$()|*+?{}\
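A minimal sketch of what this could look like, assuming the common prefix of
your two example URLs is enough to identify the pages you want excluded:
# Escape the literal . and ? and anchor the pattern to the start of
# the URL, protocol included.
acl excludeURL url_regex ^http://test\.test\.com/classifieds-bin/classifieds\?temp_type=detail
no_cache deny excludeURL
The end of the pattern is deliberately left unanchored here, since the query
strings differ between your two URLs.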
See
man 7 regex
for an in-depth description of the regex language. Squid uses what is known
as "Modern/Extended" regex, not the old-style regex syntax.
Regards
Henrik