Palula...
On Wednesday 21 December 2005 06:17, Palula Brasil wrote:
> I created a file with some strings I don't want my clients to access.
> It works fine, but it is blocking some sites with strings I
> don't want it to block... So I created another acl with permitted
> strings, ok? So the thing goes like this...
>
> acl bad_strings url_regex "path_to_file/bad_strings_file"
> acl good_strings url_regex "path_to_file/good_strings_file"
>
> Denial:
>
> http_access allow good_strings
> http_access deny bad_strings
>
> But the problem is that I blocked the word "anal" in the bad_strings
> file and I have the word "canal" (it means channel in Portuguese) in the
> good_strings file. But now the word "anal" can be searched/accessed, etc.
> How can I overcome this?
Your syntactical solution would be:
http_access deny bad_strings !good_strings
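
In context, a minimal squid.conf sketch could look like this (the file paths and the localnet ACL are only placeholders, adjust to your own setup):

acl bad_strings url_regex -i "/etc/squid/bad_strings"
acl good_strings url_regex -i "/etc/squid/good_strings"

# deny anything matching a bad string, unless it also matches a good string
http_access deny bad_strings !good_strings
http_access allow localnet
http_access deny all

With "canal" in good_strings and "anal" in bad_strings, a URL containing "canal" matches both ACLs, so the !good_strings condition keeps it from being denied, while a URL that only matches "anal" is still blocked. The -i flag makes the regex match case-insensitively.
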
However, blocking by keywords has proven to be very ineffective. It takes a
user with the IQ of a three-year-old to circumvent this "security":
take the Google cache, all the anonymizing proxies, web anonymizers, etc.
You can't decently block "bad content" using URL keywords. Rather,
depending on how serious the blocking needs to be, try SquidGuard or consider
throwing money at a commercial product.
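
If you go the SquidGuard route, a rough squidGuard.conf sketch would be something like this (the paths, the "adult" category and the redirect page are only assumptions for a typical install, not your actual setup):

dbhome /usr/local/squidGuard/db
logdir /usr/local/squidGuard/logs

dest adult {
    domainlist adult/domains
    urllist    adult/urls
}

acl {
    default {
        pass !adult all
        redirect http://localhost/blocked.html
    }
}

wired into squid.conf with something like:

redirect_program /usr/local/bin/squidGuard -c /etc/squid/squidGuard.conf

That lets you block against maintained domain/URL blacklists instead of guessing at keywords.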
Christoph