Matus UHLAR - fantomas wrote:
> On 21.10.08 16:23, Alejandro Bednarik wrote:
>> You can also use url_regex -i
>>
>> acl bad_sites url_regex -i "/etc/squid/bad_sites.txt"
>> http_access deny bad_sites
>
> using regexes is very inefficient and may lead to problems if you don't
> account for:
> - the dot matching ANY character
> - the regex matching anywhere in the string, not just at the end (as
>   dstdomain does)
- URL path parts often included in regexes not occurring in CONNECT requests
- nor the http(s):// prefix (a dstdomain-based alternative is sketched below)
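
For a domain blacklist like the one quoted below, dstdomain sidesteps all of
these pitfalls: it matches the request's destination host (including the host
carried in CONNECT requests for HTTPS) against domain suffixes. A minimal
sketch, assuming the same /etc/squid/bad_sites.txt file:

acl bad_sites dstdomain "/etc/squid/bad_sites.txt"
http_access deny bad_sites

The leading dots in the file are already dstdomain's subdomain-matching
syntax, so the list can be reused as-is.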
>
>>>> # cat bad_sites.txt
>>>> .youporn.com
>>>> .rapidshare.com
>>>> .googlevideo.com
>>>> .photobucket.com
>>>> .dailymotion.com
>>>> .logmein.com
>>>> .megavideo.com
>>>> .audio.uol.com.br
>>>> .imo.im
>>>> #
>>>>
>>>> But I am able to connect to https://imo.im
>>>> I only get "access denied" when I access http://imo.im
>
Are you absolutely certain the HTTPS request is going through Squid,
then? Your browser may be configured to send only HTTP to Squid and
everything else directly.
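
One quick way to check (a sketch assuming the default log location; adjust
/var/log/squid/access.log if your build logs elsewhere) is to watch for
CONNECT entries while you load the site:

tail -f /var/log/squid/access.log | grep CONNECT

If nothing appears when you open https://imo.im, the request never reaches
Squid, and no ACL can block it.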
Amos
--
Please use Squid 2.7.STABLE4 or 3.0.STABLE9