Have you checked your file descriptors? Each socket uses one file
descriptor, and there is a limit of 1024 on Linux by default. You can
increase it if you need to.
Take a look here:
http://www.onlamp.com/pub/a/onlamp/2004/03/25/squid.html?page=2
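If you want a quick way to see what limit a process inherits on your box,
here is a minimal Python sketch (just my assumption that a POSIX system
with the standard resource module is available; squid itself may be started
with a different limit, so treat this as a rough check, not a measurement
of squid's own ceiling):

    #!/usr/bin/env python
    # Minimal sketch: show the file-descriptor limits a process started
    # from this shell would inherit.
    import resource

    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    print("soft limit (current ceiling): %d" % soft)
    print("hard limit (maximum without raising it as root): %d" % hard)

    # If the soft limit is still the 1024 default, each client/server
    # socket pair a busy accelerator opens eats into it quickly.
    if soft <= 1024:
        print("Warning: only %d descriptors available; a loaded "
              "accelerator can exhaust this." % soft)

If it reports 1024, raising the limit before starting squid (for example
with ulimit -n, or in /etc/security/limits.conf) is the usual fix.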
Hope that helps.
Rafael Sarres de Almeida
Network Management Section
Superior Tribunal de Justiça
Tel: (61) 319-9342
Jeffrey Ng <jeffreyn@gmail.com>
11/07/2005 15:12
Please reply to
Jeffrey Ng <jeffreyn@gmail.com>
To
squid-users@squid-cache.org
cc
Subject
Re: [squid-users] Concurrent Connection Limit
Hello? Does anybody know what's wrong?
On 7/10/05, Joshua Goodall <joshua@roughtrade.net> wrote:
> On Sun, Jul 10, 2005 at 02:04:36PM +0800, Jeffrey Ng wrote:
> > Hi, I have a problem with the squid web accelerator on my site. My site is a
> > photo sharing site like webshots. It has a pretty busy load, so I
> > decided that squid may be able to ease the load on my image server by
> > caching some of the images. We have set everything up and it uses 1GB
> > RAM. It was fine at first. But suddenly all the images stopped loading
> > after 6 hours. I checked netstat and found that there are 1000
> > connections from outside, and squid stops responding whenever the
> > connections hit that number. I am pretty sure that squid has a
> > concurrent connection limit of 1000. How could I increase that limit?
> > Any help is appreciated. Thank you!
>
> Sounds like you're running out of file descriptors.
> See http://www.squid-cache.org/Doc/FAQ/FAQ-11.html#ss11.4
>
> - Joshua.
>
> --
> Joshua Goodall                  "as modern as tomorrow afternoon"
> joshua@roughtrade.net - FW109
>