rross@supernet.net writes:
>
>I am using squid-1.1.beta7 and I have implemented an ACL using the
>read-ACL-from-file patch (by Arjan de Vet) added in beta5. The two lines
>below show the simple entry in the squid.conf file.
>
>> acl badsites url_regex "/usr/local/squid/etc/acl.list"
>> http_access deny badsites
>
>The problem that I am experiencing is that, if the acl.list file gets
>above a certain size, then all sites are denied. I have not been able to
>pin down exactly where the "magic size" is. 10,000 lines seems to be too
>many and 4,000 seems to be fine. I believe that the size issue may be
>related to memory (which would explain why I haven't pinned it down).
>
>This size limitation is a real problem. Any suggestions would be greatly
>appreciated.
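(For reference: url_regex with a quoted filename just reads one regular
expression per line from that file, so a small acl.list would look
something like the made-up patterns below.)

    ^http://banned\.example\.com/
    \.example\.net/ads/
    gambling
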
I'm testing it right now with a list of 30,000 URLs. I don't get the
same results--most of my requests are getting through.
But the Squid process is taking 95% of the CPU.
Seems like even a list of 4,000 is too many to put in the main Squid
process. Would probably make more sense to put them in the redirector
process, no? Then you might even do some fancy hashing, etc. to speed
it up.
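For example, something along these lines -- a rough sketch only, and the
script name, error-page URL, and the one-hostname-per-line list format
are my assumptions, not anything Squid itself defines:

    #!/usr/bin/env python3
    # Hypothetical redirector sketch. Squid writes one request per line to
    # the redirector's stdin:
    #     URL client_ip/fqdn ident method
    # and reads back one line per request: the URL to use (either the
    # original, unchanged, or a replacement such as an error page).
    import sys

    # Load banned hostnames into a hash set once, so each request costs one
    # O(1) membership test instead of thousands of regex matches.
    banned = set()
    with open("/usr/local/squid/etc/acl.list") as f:
        for line in f:
            host = line.strip().lower()
            if host and not host.startswith("#"):
                banned.add(host)

    DENIED_URL = "http://proxy.example.com/denied.html"  # hypothetical error page

    for request in sys.stdin:
        fields = request.split()
        if not fields:
            print()                      # stay in lockstep with Squid
            sys.stdout.flush()
            continue
        url = fields[0]
        # Crude hostname extraction from scheme://host[:port]/path
        host = url.split("/")[2].split(":")[0].lower() if "://" in url else ""
        print(DENIED_URL if host in banned else url)
        sys.stdout.flush()

The hash lookup is the "fancy hashing" part: one set membership test per
request instead of a linear scan over every regex. You would hook it in
with redirect_program in squid.conf, e.g.

    redirect_program /usr/local/squid/bin/block_redirect.py

(path hypothetical).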
Duane W.
Received on Tue Oct 15 1996 - 22:53:44 MDT