> From: Amos Jeffries <squid3_at_treenet.co.nz>
> Date: Mon, 19 Oct 2009 18:14:33 +1300
> Cc: "squid-users_at_squid-cache.org" <squid-users_at_squid-cache.org>
> Subject: Re: [squid-users] High CPU Utilization
>
> Ross Kovelman wrote:
>> Any reason why I would have high CPU utilization, avg around 90%? I did
>> build it for PPC although I do have a large dstdomain list which contains
>> URLs that are not allowed on the network. It is a Mac G4 dual 1.33. This
>> is with no load, or I should say no users on the network.
>>
>> Thanks
>>
>
> Could be a few things:
>
> * bug 2541 (except latest 3.0 and 3.1 releases)
>
> * lots of regex patterns
>
> * garbage collection of the various caches
>
> * UFS storage system catching up after a period of load
>
> * memory swapping
>
> * RAID
>
> * ... any combination of the above.
>
> If you have the strace tool available you can look inside Squid and see
> what it is doing. Or use "squid -k debug" to toggle full debug on/off
> for a short period and trawl the cache.log afterwards.
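> For example (a sketch; the PID file path here is a guess, check the
> pid_filename directive in your squid.conf for the real one):
>
>   # attach to the running Squid and count its system calls;
>   # press Ctrl-C after ~30 seconds to print the summary table
>   strace -c -p `cat /var/run/squid.pid`
>
>   # toggle full debugging on, wait a bit, then toggle it back off
>   squid -k debug
>   sleep 30
>   squid -k debug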
>
> Amos
> --
> Please be using
> Current Stable Squid 2.7.STABLE7 or 3.0.STABLE19
> Current Beta Squid 3.1.0.14
Amos,
I am not using RAID, although my single-drive performance might be slow?
Will need to check on the I/O. When I run squid or make any changes to
the config I get a lot of:
2009/10/16 14:44:08| WARNING: You should probably remove 'xxx.com' from the
ACL named 'bad_url'
2009/10/16 14:44:08| WARNING: 'xxx.com' is a subdomain of 'xxx.com'
2009/10/16 14:44:08| WARNING: because of this 'xxx.com' is ignored to keep
splay tree searching predictable
2009/10/16 14:44:08| WARNING: You should probably remove 'xxx.com' from the
ACL named 'bad_url'
Would this by chance do it? There are about 22,000 sites in the bad_url
file.
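If so, I suppose the warnings point at the fix: prune every entry that is
a subdomain of another entry. Something like this might do it (a rough
sketch; "bad_url" stands in for my actual file, assumed to hold one plain
hostname per line, no leading dots):

  # reverse each name so a parent domain groups ahead of its subdomains,
  # map "." to \001 so it sorts below "-" and the grouping holds, then
  # drop any line already covered by a shorter suffix that was kept
  rev bad_url | tr '.' '\001' | LC_ALL=C sort | awk '
      prev != "" && index($0, prev "\001") == 1 { next }
      { prev = $0; print }
  ' | tr '\001' '.' | rev > bad_url.clean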
Thanks