> interesting and useful places on the net. It would be nice to have
> a small stand alone program that could, given a base URL and a number
> specifying how many levels deep it should go, just go out and get the
> pages via Squid.
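Something along those lines might not be hard to knock together. Here is a
rough, untested sketch of such a stand alone fetcher: it takes a base URL and
a depth, and pulls every page through the proxy so Squid keeps a copy. The
proxy address (localhost:3128) and the crude link regex are my assumptions,
not anything Squid ships with.

#!/usr/bin/env python
# Illustrative sketch only: a depth-limited fetcher that pulls pages through
# a Squid proxy so they end up in the cache.  The proxy address and the
# link-extraction regex are assumptions; adjust for your setup.
import re
import sys
import urllib.request
from urllib.parse import urljoin

PROXY = "http://localhost:3128"  # assumed Squid listening address
HREF_RE = re.compile(rb'href\s*=\s*["\']([^"\'#>]+)', re.IGNORECASE)

# Route every request through Squid so each fetched page lands in the cache.
opener = urllib.request.build_opener(
    urllib.request.ProxyHandler({"http": PROXY, "https": PROXY})
)

def prefetch(url, depth, seen=None):
    """Fetch url via the proxy, then follow its links up to `depth` more levels."""
    if seen is None:
        seen = set()
    if depth < 0 or url in seen:
        return
    seen.add(url)
    try:
        with opener.open(url, timeout=15) as resp:
            body = resp.read()
    except Exception as exc:
        print("skipped %s: %s" % (url, exc), file=sys.stderr)
        return
    print("fetched %s (%d bytes)" % (url, len(body)))
    for match in HREF_RE.finditer(body):
        link = urljoin(url, match.group(1).decode("ascii", "ignore"))
        if link.startswith("http"):
            prefetch(link, depth - 1, seen)

if __name__ == "__main__":
    # usage: prefetch.py <base URL> <levels deep>
    prefetch(sys.argv[1], int(sys.argv[2]))

Pointed at a base URL with a depth of two or so, it walks whatever links it
finds and lets Squid hold on to everything it retrieves.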
I believe what we need here is a spider that simply
crawls and explores a specified list of sites, be it
extracted from the access.log or otherwise.
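Building that site list from the access.log could be as simple as pulling the
request URLs out of it. A minimal sketch, assuming the native access.log
format where the URL is the seventh whitespace-separated field (the log path
is also just an assumption):

# Minimal sketch: collect a deduplicated list of sites seen in access.log.
from urllib.parse import urlsplit

def sites_from_access_log(path="/var/log/squid/access.log"):
    """Return a sorted list of sites requested in the log."""
    sites = set()
    with open(path) as log:
        for line in log:
            fields = line.split()
            if len(fields) < 7:
                continue
            parts = urlsplit(fields[6])
            if parts.scheme in ("http", "https") and parts.netloc:
                # Keep only scheme and host as a crawl starting point.
                sites.add("%s://%s/" % (parts.scheme, parts.netloc))
    return sorted(sites)

Feed that list to the spider above and it would re-fetch the sites people are
actually visiting.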
Maybe Harvest Gatherer can fit the bill.
*8)
Ong Beng Hui
ongbh@singnet.com.sg
...Yet Another Day in an ISP Business
...and they lived happily ever after
Received on Fri Aug 02 1996 - 20:20:16 MDT