On Mon, 5 Oct 1998, David Luyer wrote:
> > Is Squid able to act as an HTTP load balancer? Idea: one Squid receives
> > all requests for a www.***.* site and forwards the requests to a farm
> > of httpd daemons behind a firewall, takes the responses, and sends them
> > back to the browser.
> >
To spread the load, you can test-drive the CARP support in Squid, perhaps
with some tweaking for "special" URLs.
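A minimal squid.conf sketch of such a setup (hostnames and ports here are made up for illustration) might look roughly like:

```
# Hash requests across the backend farm; the "carp" option makes
# Squid pick a parent deterministically from the URL hash.
cache_peer web1.example.com parent 3128 0 carp no-query
cache_peer web2.example.com parent 3128 0 carp no-query
cache_peer web3.example.com parent 3128 0 carp no-query

# Force all requests through the parents instead of going direct.
never_direct allow all
```

Because CARP hashes on the URL, each backend tends to see a stable subset of the URL space, which also helps backend-side caching.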
> If you want to be smarter, you could try to maintain a shared memory segment
> with one word containing the latest load average for each server
Shared memory assumes that all Squids run on the same machine, which is not
a good idea in most cases. Note that the "redirector" sees all the requests
that the second-level Squids are processing. Thus, you can maintain a
fairly good local estimate of the load on the second-level servers if you
want to try some smart load balancing.
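A sketch of that idea as a Squid redirector, assuming the classic redirector protocol (one request per line on stdin, rewritten URL on stdout); the backend hostnames and the "requests sent so far" load metric are made up for illustration:

```python
#!/usr/bin/env python3
# Hypothetical least-loaded redirector sketch. Squid feeds it lines of
# the form "URL ip-address/fqdn ident method"; we reply with a rewritten
# URL pointing at the backend we estimate to be least busy.
import sys
from urllib.parse import urlsplit, urlunsplit

# Made-up backend farm.
BACKENDS = ["web1.example.com", "web2.example.com", "web3.example.com"]

# Crude local load estimate: count of requests we have forwarded to each
# backend. A real redirector might age these counts or track completions.
sent = {b: 0 for b in BACKENDS}

def rewrite(url: str) -> str:
    """Rewrite url to point at the backend with the lowest local count."""
    target = min(BACKENDS, key=lambda b: sent[b])
    sent[target] += 1
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, target, parts.path,
                       parts.query, parts.fragment))

def main() -> None:
    for line in sys.stdin:
        fields = line.split()
        if not fields:
            print()            # blank reply = leave the URL unchanged
        else:
            print(rewrite(fields[0]))
        sys.stdout.flush()     # Squid reads replies line by line

if __name__ == "__main__":
    main()
```

The counter only approximates load, but since the redirector sees every request before the second-level Squids do, it is a cheap place to keep such an estimate without any shared memory.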
However, a first-level cache will add forwarding latency to every request.
Estimate the benefits twice before introducing one more level of
indirection...
$0.02,
Alex.
Received on Mon Oct 05 1998 - 10:10:26 MDT
This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:42:20 MST