> -----Original Message-----
> From: Joe Cooper [mailto:joe@swelltech.com]
> Sent: Saturday, October 05, 2002 1:11 PM
> To: Linda
> Cc: squid-users@squid-cache.org
> Subject: Re: [squid-users] How to record squid traffic?
>
>
> Could you define "all traffic"?
>
> Squid logs all requests that go through it, unless you disable access
> logging. However, if you're asking Squid to log all data that passes
> through it including image data, HTML/text, etc., then Squid won't do it
> and it is a not-quite-sane desire. Think how big those logs would be...
----
What if the data were filtered based on client IP, time, or site name?

Let's say I have my PC running, and I want to record any traffic that my PC generates while I am in bed. That is, suppose the PC makes automatic update or information requests at 2 or 3 am -- requests that I don't want to prevent -- I just want a complete record of what was asked for and what came back (assuming it wasn't encrypted). Is that not-quite-sane, or is keeping track of automatic appliance or PC updates just good security practice?

Another example: suppose I am a large brokerage that is required to keep records of all customer interactions. One of those is a web chat that works through a standard HTTP proxy like squid. Does it make more sense to record the data at each client, or to configure all clients to use one squid proxy that records all the traffic?

Another example: as data comes from a server back to a client through squid, I want to examine that data and remove harmful content. Perhaps "harmful" is defined to be all VBS script, or all objectionable content -- that is purely up to the site policy -- but if it is harmful VBS script, maybe I want to quarantine the harmful content for later study.

Another example: intrusion/trojan detection and analysis. If a trojan server has been loaded onto my system, I'd like to filter all HTTP traffic to/from my system and record anything suspicious.

Perhaps, as you say, it involves recording *everything*, but that could be a use for those large, multi-gig disks. On a 128kb link (-> 16kB/s), a 128GB disk would take 97 days to fill running at full speed 24x7. At even 40% usage, that goes up to 243 days, uncompressed. Compress that -- easily 3x storage -- and one disk will last close to 2 years. Of course the logging disk can be attached to a different system (say an old 120MHz Pentium) that only allows console logins. A little bit of filtering helps too (not recording cnn, slashdot, sec-foc, linux-whatever, etc.).
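Selective capture along these lines is roughly what Squid's ACLs already express. A hypothetical squid.conf sketch (the address and path are made up, and attaching ACLs to an access_log line requires a Squid newer than the 2.5 current at the time):

```
# squid.conf sketch -- hypothetical address and path
acl mypc  src  192.168.1.10        # the one PC to watch
acl night time 00:00-06:00         # while I am in bed
# Newer Squids accept a logformat name plus ACLs after access_log;
# this would log only my PC's overnight requests to a separate file:
access_log /var/log/squid/night.log squid mypc night
```

Note this still records only the request log, not the full payloads -- capturing bodies would need the kind of filter hook discussed below.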
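The back-of-envelope figures above check out; here is the same arithmetic spelled out (a quick sketch, assuming binary units, i.e. 128 GiB and 16 KiB/s):

```python
# Back-of-envelope: how long a 128 GB disk lasts logging a 128 kbit/s link.
disk_bytes = 128 * 2**30          # 128 GB, binary units
link_bytes_per_sec = 16 * 1024    # 128 kbit/s -> 16 kB/s

full_speed_days = disk_bytes / link_bytes_per_sec / 86400
at_40pct_days = full_speed_days / 0.40
compressed_days = at_40pct_days * 3   # assume ~3x from compression

print(round(full_speed_days))   # 97 days at full speed, 24x7
print(round(at_40pct_days))     # 243 days at 40% utilization
print(round(compressed_days))   # 728 days -- close to 2 years
```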
Even with a 1.28Mb DSL line, one could still record months of traffic. Not bad for a ~$130 disk and a PC headed for the trash (the logging machine).

Another use: if we are able to call a filter on incoming data, programs like virus checkers and ad/cookie/script/looping-image busters could be plugins to squid. Why shouldn't squid support something like ijb (the Internet Junkbuster Proxy @ http://internet.junkbuster.com/) as a plugin? Squid already supports "redirect" plugins, why not input/output filter plugins? I'm assuming (perhaps incorrectly?) that a redirector won't see the content (data) of a POST request. So an "output" filter would filter not only the GET requests from a client, but the content of POST requests and the like as well. Input filters would filter the headers and data coming from a server before they go into the local cache. Theoretically one could even have "post-process" filters that filter content to individual clients as it comes out of the cache -- so different clients could receive different HTML streams -- but I'm more interested in a pre-cache plugin.

Still sound so crazy?

-linda

Another example: suppose I have multiple clients behind a squid proxy. Now let's say a couple of those are Windows machines (98, 2000, XP, etc.). Now suppose I want to scan for harmful

Received on Sun Oct 06 2002 - 12:00:45 MDT
This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 17:10:36 MST