Where do you see this req/sec figure from? Cachemgr?
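A rough way to measure it yourself is from the timestamps in Squid's access.log; the cache manager's "info" page also reports average request rates. Below is a minimal sketch in Python, assuming the native access.log format (first field is a Unix timestamp) and a log path that may differ on your box:

  # Estimate average requests/sec from Squid's native access.log.
  # The log path below is an assumption; adjust for your installation.
  LOG = "/var/log/squid/access.log"

  timestamps = []
  with open(LOG) as f:
      for line in f:
          fields = line.split()
          if not fields:
              continue
          try:
              timestamps.append(float(fields[0]))   # field 1: epoch seconds.milliseconds
          except ValueError:
              continue                              # skip malformed lines

  if len(timestamps) > 1:
      span = max(timestamps) - min(timestamps)
      if span > 0:
          print("%d requests over %.0f seconds = %.1f req/sec average"
                % (len(timestamps), span, len(timestamps) / span))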
Thank you very much.
Best regards,
Edward Millington
(Network Administrator & Senior Technical Support Technician)
Cariaccess Communications Ltd.
Wildey
St. Michael
Barbados
1-246-430-7435
Fax : 1-246-431-0170
www.cariaccess.com
----- Original Message -----
From: "Joe Cooper" <joe@swelltech.com>
To: "Mark Pace Balzan" <mpb@melitacable.com>
Cc: <squid-users@squid-cache.org>
Sent: Tuesday, May 08, 2001 4:30 PM
Subject: Re: [squid-users] Sizing a pretty Large Squid Install
> Hi Mark,
>
> I've written a tuning article about Squid on Linux in the past (as soon
> as I have the time, I'll update it to reflect that Squid 2.4 is now
> about as fast as the version documented in the article):
>
> http://www.swelltech.com/pengies/joe/squidtuneup/t1.html
>
> Tuning any reasonably modern hardware with enough RAM using the steps in
> the article will get you up to about 100 reqs/sec pretty easily (you'll
> want more than 512MB if you're using 32GB of disk space, however--768MB
> is probably a safe minimum).
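For context, that RAM figure follows from the common rule of thumb of roughly 10MB of RAM per GB of cache_dir for Squid's in-memory index, plus cache_mem and general overhead. A rough sketch of the arithmetic, where the cache_mem and overhead values are assumptions rather than numbers from the article:

  # Rough RAM estimate for a 32GB cache_dir, using the usual
  # ~10 MB of index RAM per 1 GB of on-disk cache rule of thumb.
  cache_dir_gb = 32
  index_ram_mb = cache_dir_gb * 10    # ~320 MB for the metadata index
  cache_mem_mb = 64                   # squid.conf cache_mem (hot objects); assumed value
  other_mb = 128                      # OS, buffers, process overhead; assumed value

  total_mb = index_ram_mb + cache_mem_mb + other_mb
  print("Estimated RAM: ~%d MB" % total_mb)   # ~512 MB, so 768 MB leaves headroom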
>
> Clustering becomes more cost-effective than building a bigger box
> somewhere between 150 and 220 reqs/sec client load (and by that time
> you're into a very big box--at least 1GB RAM, and probably 4 10,000 RPM
> disks, and a GHz processor).
>
> I've also written an article on building a caching infrastructure with
> Squid (it's focused on our products, but if you have tuned your Squid
> properly, then the sizing info will apply to you too):
>
> http://www.swelltech.com/support/sizecache/t1.html
>
> It's also getting some age on it...but the advice is still relevant, and
> Squid hasn't gotten any more memory efficient since I wrote it.
>
> Good luck!
>
> Mark Pace Balzan wrote:
>
> > Hi all,
> >
> > I've looked a bit around the FAQs but didn't quite find all the answers I
> > need.
> >
> > My situation:
> >
> > I am thinking of setting up Squid for caching, having worked with it in
> > the past, but I'm unclear about some sizing issues for a large installation.
> > We are a broadband (cable modem) based ISP with some dialups too. My
> > immediate need is to service around 50 - 60 http requests per second.
> >
> > This got me to the big figure of 4,320,000 requests per day, which I did not
> > find in the JANET article by Martin Hamilton in the FAQ. Also, with an average
> > 8k object size, I gather I'd need 32GB of hard disk and 512MB of RAM minimum.
> >
> > We currently run on a 14Mbps Internet link; this will be up to 30Mbps by
> > the end of the year, so the requests per second will be increasing by quite
> > an amount!
> > I'm estimating 64-80GB of hard disk and 1GB of RAM minimum for this.
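A quick back-of-the-envelope check of the two estimates above, as a sketch. The only inputs are the 8k mean object size, one day of traffic held on disk, and the common ~10MB-of-RAM-per-GB-of-cache rule of thumb; the 100-120 req/sec figures for the 30Mbps link are an assumption, simply scaling the current rate with the bandwidth:

  # Back-of-the-envelope sizing check for the figures quoted above.
  SECONDS_PER_DAY = 86400
  MEAN_OBJECT_KB = 8

  for req_per_sec in (50, 60, 100, 120):        # current load, then the ~30Mbps guess
      req_per_day = req_per_sec * SECONDS_PER_DAY
      disk_gb = req_per_day * MEAN_OBJECT_KB / (1024.0 * 1024.0)
      index_ram_mb = disk_gb * 10               # ~10 MB RAM per GB of cache_dir
      print("%3d req/sec -> %9d req/day, ~%2.0f GB disk/day, ~%3.0f MB index RAM"
            % (req_per_sec, req_per_day, disk_gb, index_ram_mb))

At 50 req/sec this gives 4,320,000 requests and roughly 33GB per day; at 100-120 req/sec it comes out around 66-80GB of disk and 660-790MB of index RAM, which lines up with the 64-80GB and 1GB figures above.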
> >
> > My questions:
> >
> > At what point do I need to consider clustering? (Note the above is at one
> > single physical location.)
> > Practically speaking, what is the max load seen in the field/production?
> > Does Squid break under such heavy load, if running on appropriate
> > hardware/memory?
> > Is anyone aware of any hardware, kernel/OS, or Squid issues with memory
> > addressing over 1GB of physical RAM?
> >
> > I'm open to any OS that fits the job (Linux or Solaris) and any suggestions
> > you may have.
> >
> > Thanks for your time
>
>
> --
> Joe Cooper <joe@swelltech.com>
> Affordable Web Caching Proxy Appliances
> http://www.swelltech.com
>
>
Received on Wed May 09 2001 - 18:54:02 MDT