Hi All:
Maybe I'm crossposting but anyway....
I intend to develop an application that stores all user traffic in
a database, to make it easier to create reports.
I'm considering two solutions:
1) Create a daemon that reads all of squid's log output through a named
pipe in /dev, configure that pipe as squid's log file, and store each
user's IP and URL in the database (a rough sketch of this is below), or
2) Create a daemon that connects to squid's port, sends a
"GET cache_object://hostname/filedescriptors" request, and stores only
the Description fields whose content matches "^http://"
(i.e. URLs); a rough sketch of this is also below.
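For option 1, something like this is what I have in mind (a rough Python
sketch only; it assumes a hypothetical FIFO path and database file, squid's
native access.log layout with the client address in the 3rd field and the
URL in the 7th, and GDBM/Berkeley DB reached through Python's dbm module):

#!/usr/bin/env python
# Sketch: read squid log lines from a named pipe and store ip -> URLs.
import dbm
import os

FIFO_PATH = "/dev/squid_log"     # hypothetical pipe, set as squid's log file
DB_PATH = "/var/lib/traffic.db"  # hypothetical database file

def main():
    if not os.path.exists(FIFO_PATH):
        os.mkfifo(FIFO_PATH)

    db = dbm.open(DB_PATH, "c")
    # Opening the FIFO for reading blocks until squid opens it for writing;
    # a real daemon would reopen the pipe in a loop when squid restarts.
    with open(FIFO_PATH) as pipe:
        for line in pipe:
            fields = line.split()
            if len(fields) < 7:
                continue                      # skip malformed lines
            client_ip, url = fields[2], fields[6]
            if url.startswith("http://"):
                # append the URL to this client's record
                old = db[client_ip] if client_ip in db else b""
                db[client_ip] = old + url.encode() + b"\n"
    db.close()

if __name__ == "__main__":
    main()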
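And for option 2, a similarly rough sketch (it assumes squid listening on
localhost:3128, an unpassworded cachemgr "filedescriptors" action, and that
the Description column is the last field of each row; the real output format
would need checking):

#!/usr/bin/env python
# Sketch: query squid's cache manager and keep only the Description
# fields that look like URLs.
import socket

SQUID_HOST = "localhost"   # assumed host
SQUID_PORT = 3128          # assumed http_port

def fetch_open_urls():
    request = "GET cache_object://%s/filedescriptors HTTP/1.0\r\n\r\n" % SQUID_HOST
    sock = socket.create_connection((SQUID_HOST, SQUID_PORT))
    sock.sendall(request.encode())

    chunks = []
    while True:
        data = sock.recv(4096)
        if not data:
            break
        chunks.append(data)
    sock.close()

    urls = []
    for line in b"".join(chunks).decode("ascii", "replace").splitlines():
        fields = line.split()
        # keep the Description field only when it is a URL
        if fields and fields[-1].startswith("http://"):
            urls.append(fields[-1])
    return urls

if __name__ == "__main__":
    for url in fetch_open_urls():
        print(url)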
I'm wondering which would be the better way to accomplish this.
Suggestions? :)
PS: as the database I'll use Berkeley DB or GDBM, which are much
lighter.
thanks in advance
--
[]'s
Lucas Brasilino
brasilino@recife.pe.gov.br
http://www.recife.pe.gov.br
Emprel - Empresa Municipal de Informatica (pt_BR)
        Municipal Computing Enterprise (en_US)
Recife - Pernambuco - Brasil
Fone: +55-81-34167078