Hi everybody,
I am trying to use an external_acl_type to filter Internet traffic
according to the User-Agent header and the destination (let's say you
only have the right to browse Facebook when using Firefox).
This is my external ACL:
external_acl_type getheaders %{User-Agent} %DST /etc/squid3/getheaders
acl myacl external getheaders
http_access allow myacl
This is my getheaders program (I ran it by hand, and there is no
permission problem):
#!/bin/sh
# read the user-agent, then the destination (one value per line)
read agent
read DST
# log both with a timestamp, then tell squid the lookup succeeded
date=`date`
echo "$date $agent" >> /var/log/squid3/headers.log
echo "$DST" >> /var/log/squid3/headers.log
echo "OK"
exit 1
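For completeness, this is roughly how I tested it by hand from the
shell (the two input lines here are just made-up test values):

printf 'Mozilla/5.0 (test agent)\nwww.facebook.com\n' | /etc/squid3/getheaders

It prints OK and appends the two lines to /var/log/squid3/headers.log
as expected.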
And this is what I get in the debug output when I try to access Facebook:
2009/04/16 21:17:16.481| aclMatchExternal: acl="getheaders"
2009/04/16 21:17:16.481| aclMatchExternal:
getheaders("Mozilla/5.0%20...............0Version/4.0%20Safari/528.16
www.facebook.com") = lookup needed
2009/04/16 21:17:16.481| externalAclLookup: lookup in 'getheaders' for
'Mozilla/5.0%20(Macintosh;%20U;%20In...........Version/4.0%20Safari/528.16
www.facebook.com'
2009/04/16 21:17:16.481| externalAclLookup: looking up for
'Mozilla/5.0%20(Macintosh;%20U;%20..............)%20Version/4.0%20Safari/528.16
www.facebook.com' in 'getheaders'.
2009/04/16 21:17:16.481| helperDispatch: Request sent to getheaders
#1, 167 bytes
2009/04/16 21:17:16.482| externalAclLookup: will wait for the result
of 'Mozilla/5.0%20(Macintosh...........0Safari/528.16
www.facebook.com' in 'getheaders' (ch=0x85a4760).
Apparently Squid is waiting for the result of a lookup that my
getheaders program is supposed to return.
I am a bit lost, as I thought the only reply options were OK/ERR.
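For reference, this is how I understood the basic helper loop is
supposed to work from the docs (just a minimal sketch of my
understanding; whether squid sends the user-agent and the destination
on one line or on two separate lines is exactly the part I am not sure
about):

#!/bin/sh
# my understanding: read one request line at a time from squid,
# answer OK (or ERR) for each, and keep looping until squid closes stdin
while read line; do
    date=`date`
    echo "$date $line" >> /var/log/squid3/headers.log
    echo "OK"
done
exit 0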
As I didn't find anything on Google, if anybody has a clue, I would
appreciate it if you shared it! :-)
I am running the latest squid3 on Debian.
Thank you,
Julien