-----Original Message-----
From: Amos Jeffries [mailto:squid3_at_treenet.co.nz]
Sent: Tuesday, April 03, 2012 8:43 AM
To: squid-users_at_squid-cache.org
Subject: Re: [squid-users] Allowing linked sites - NTLM and un-authenticated users
On 3/04/2012 6:12 p.m., Jasper Van Der Westhuizen wrote:
>
> -----Original Message-----
> From: Amos Jeffries [mailto:squid3_at_treenet.co.nz]
> Sent: Monday, April 02, 2012 9:27 AM
> To: squid-users_at_squid-cache.org
> Subject: Re: [squid-users] Allowing linked sites - NTLM and
> un-authenticated users
>
> On 2/04/2012 5:54 p.m., Jasper Van Der Westhuizen wrote:
>> -----Original Message-----
>> From: Amos Jeffries
>>
>> On 30/03/2012 11:45 p.m., Jasper Van Der Westhuizen wrote:
>>>> Hi everyone
>>>>
>>>> I've been struggling to get a very specific setup going.
>>>>
>>>> Some background: Our users are split into "Internet" users and "Non-Internet" users. Everyone in a specific AD group is allowed to have full internet access. I have two SQUID proxies with squidGuard, load balanced, with NTLM authentication to handle the group authentication. All traffic also then gets sent to a cache peer.
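For context, the NTLM and AD-group arrangement described above typically comes down to something like the following in squid.conf; the helper paths and group name here are assumptions, not taken from this thread:

auth_param ntlm program /usr/bin/ntlm_auth --helper-protocol=squid-2.5-ntlmssp
auth_param ntlm children 20
acl authed proxy_auth REQUIRED
external_acl_type ad_group %LOGIN /usr/lib/squid/wbinfo_group.pl
acl internet_users external ad_group InternetUsers
http_access allow internet_users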
>>>>
>>>> This is basically what I need:
>>>> 1. All users(internet and non-internet) must be able to access sites in "/etc/squid/lists/whitelist.txt"
>>>> 2. If a user wants to access any external site that is not in the whitelist then he must be authenticated. Obviously a non-internet user can try until he is blue in the face; it won't work.
>>>>
>>>> These two scenarios are working 100%, except for one irritating bit. Most of the whitelisted sites have linked websites like facebook or twitter or youtube in them that load icons, graphics, ads, etc. This causes an auth-prompt for non-internet users. I can see the requests in the logs being DENIED.
>>>>
>>>> The only way I could think of getting rid of these errors was to
>>>> implement a "http_access deny !whitelist" after the allow. This
>>>> works great for non-internet users and it blocks all the linked
>>>> sites without asking to authenticate, but obviously this breaks
>>>> access to all other sites for authenticated users.(access denied
>>>> for all sites)
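The ordering described would have been roughly the following (ACL names assumed), which shows why it breaks authenticated users:

http_access allow whitelist
http_access deny !whitelist
http_access allow authed

Everything falls through to the deny before authentication is ever consulted, so authenticated users lose all non-whitelisted sites.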
>>> You can use the "all" hack and two login lines:
>>>
>>> http_access allow whitelist
>>> # allow authed users, but dont challenge if missing auth
>>> http_access allow authed all
>>> # block access to some sites unless already logged in
>>> http_access deny blacklist
>>> http_access deny !authed
>>>
>>>
>>> The authed users may still have problems logging in if the first site they visit is one of the "blacklist" ones. But if they visit another page first they can log in and get there.
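Those rules assume ACL definitions along these lines; the whitelist path appears later in the thread, the blacklist path is illustrative only:

acl authed proxy_auth REQUIRED
acl whitelist dstdomain "/etc/squid/lists/whitelist.txt"
acl blacklist dstdomain "/etc/squid/lists/blacklist.txt"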
>>>
>>>
>>> Amos
>> Hi Amos
>>
>> Thank you for the reply.
>>
>> I think I already tried this method but it still fails. In any case I tried what you suggested and the problem remains: my unauthenticated (non-internet) users can get to the whitelisted sites just fine, but they still get authentication prompts for the linked content like facebook and youtube that the site contains. An example of a site is http://www.triptrack.co.za/ and you will see what I mean. At the bottom right of the site there are links to facebook and youtube. Those links cause an authentication request for the unauthenticated (or non-internet) users. I can't have these prompts appear for these users. They have a set list of sites they can visit; that list should work for them and they should not get asked to authenticate. Only once they try to go directly to sites that are not in the whitelist should they be prompted, and obviously denied since they are not included in the AD group.
>> The problem of course is that they *are* going "directly" to the blacklisted sites when they load an object from those sites. Even if the object was embedded in some third-party whitelisted site's HTML.
>> The HTTP protocol makes no distinction about how HTML, XML, or Flash document structures group objects. All Squid sees is a request for an object on a non-whitelisted site.
>> Current rules:
>> http_access allow whitelist
>> http_access allow authenticated all
>> http_access deny blacklist
>> http_access deny !authenticated
>>
>> Kind Regards
>> Jasper
>>
>
> Something else I've tried was using cache_peer_access to pass the whitelisted domains that everyone should have access to on to another squid instance that only allows access to the whitelisted sites, nothing else. Again it works, kind of. I can see that the proxy sends the request to the cache_peer, but it only sends the requested site there, and again not any sites that are linked within it (like facebook).
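A sketch of that cache_peer_access attempt, with the peer hostname and port as assumptions:

cache_peer whitelist-proxy.example.local parent 3128 0 no-query
cache_peer_access whitelist-proxy.example.local allow whitelist
cache_peer_access whitelist-proxy.example.local deny all
never_direct allow whitelist    # force whitelisted requests through the peer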
>
> Is there a way to send the entire "session" to the cache_peer if a particular domain was requested?
>
> There is maybe the Referer: header. Since the evercookie attacks it has become popular to erase or not send those, though. So good luck.
> You can test that with req_header ACL type and a regex pattern.
> Amos
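For anyone wanting to experiment with that suggestion, a rough and untested sketch of the req_header test; the ACL name and regex are made up and cover only the example site mentioned above:

acl linked_from_whitelist req_header Referer -i triptrack\.co\.za
http_access allow linked_from_whitelist

As noted above, this only helps when the browser actually sends a Referer header.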
I think I found a work-around. I added another ACL and used the same list of "whitelisted" domains, but instead of dstdomain the new acl is of type srcdomain.
So the access list looks like :
--cut--
### Whitelisted sites for all users
acl whitelist dstdomain "/etc/squid/lists/whitelist.txt"
acl blocklist srcdomain "/etc/squid/lists/whitelist.txt"
http_access allow whitelist
http_access deny !blocklist
###############################
### Allow authenticated users #
###############################
http_access allow authed all
--cut--
This allows my un-authenticated users access to the whitelisted domains and blocks any links in those sites that are not whitelisted (like facebook and youtube). It also allows my authenticated users access to all sites, including whitelisted ones, as well as linked sites like facebook etc.
Do you perhaps see any issue with this setup?
Regards
Jasper
Received on Tue Apr 03 2012 - 10:27:46 MDT