I solved the problem.
It was the range_offset_limit -1 KB line that was preventing Squid from resuming downloads. I set it back to 0 KB, the default, and voilà! Everything is back to normal!
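For anyone else hitting this, here is a minimal sketch of the relevant squid.conf line (assuming a stock Squid 3.0 configuration; the rest of the file is not shown):

  # range_offset_limit sets how far into an object a Range request may point
  # before Squid fetches the whole object from the beginning so it can cache it.
  # -1 means no limit: Squid always restarts the transfer from byte 0, which
  # breaks resumed/segmented downloads from accelerators like FlashGet.
  # 0 (the default) makes Squid forward Range requests to the origin as-is,
  # never fetching more than the client asked for.
  range_offset_limit 0 KB

After changing it, "squid -k reconfigure" picks up the new value without a full restart.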
Thank you very much for your support. This is one of the best mailing lists.
-----Mensaje original-----
From: Henrik Nordstrom [mailto:henrik_at_henriknordstrom.net]
Sent: Thursday, October 23, 2008 14:07
To: Osmany Goderich
CC: squid-users_at_squid-cache.org
Subject: Re: [squid-users] Problems with downloads
On Thu, 2008-10-23 at 14:34 -0500, Osmany Goderich wrote:
> Hi everyone,
>
> I have Squid 3.0.STABLE9 installed on a CentOS 5.2 x86_64 system. I have
> problems with downloads, especially large files. Downloads are usually
> slow on my network because of the number of users I have, but I dealt
> with that using download accelerators like “FlashGET”. Now the downloads
> get interrupted and they never resume, and I don’t know why.
Can you try "downgrading" to 2.7 to see if that makes any difference? If it does, please file a bug report.
Also check your cache.log for any errors.
> I can’t seem to find
> a pattern as to when or why the downloads get interrupted. I don’t
> know if I explained myself well enough. I suspect that there is
> something wrong with the configuration I did to tune the cache’s effectiveness.
There isn't much you can do wrong at this level.
Regards
Henrik