I have seen http://www.squidblacklist.org/, which is a very nice idea, but
I am wondering whether squid.conf and the other squid filtering products
are the right choice for every environment.
For a mission-critical proxy server you will need to prevent any
"reload" of the proxy, since a reload can cause *small* download corruptions.
I know that most of the time admins don't think about reloading a
process, but it can cause problems for some department in the enterprise.
From the ISP point of view you must prevent any problem for the client.
The current solutions for filtering are: a helper, squid.conf ACLs, or an ICAP/eCAP service.
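For example, the plain squid.conf approach usually looks something like
this (a minimal sketch; the blacklist file path is only an assumption):

    # Deny requests to any domain listed in the blacklist file
    # (/etc/squid/blacklist.txt is an example path, not a standard one)
    acl blacklist dstdomain "/etc/squid/blacklist.txt"
    http_access deny blacklist

    # Changes to the file only take effect after a reload:
    #   squid -k reconfigure

That last step is exactly the reload problem described above.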
A helper such as squidGuard is a nice solution, but it also needs a reload
of the whole squid service whenever its blacklist DB is rebuilt (see the
sketch below).
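A typical squidGuard setup is wired in roughly like this (the binary and
config paths are assumptions and vary per distribution):

    # Hand every request URL to squidGuard for rewriting/blocking
    url_rewrite_program /usr/bin/squidGuard -c /etc/squidGuard/squidGuard.conf
    url_rewrite_children 5

    # After rebuilding the squidGuard DB the helpers must be
    # restarted, e.g. with: squid -k reconfigure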
On a small-scale system that is easy to do, but for a 24x7 filtering
solution we can't just reload the squid process or recreate the DB.
Therefore I wrote my own ICAP service for this specific issue: the need
for a more persistent solution which doesn't require downtime, whether
from the DB, the proxy, or the filtering service point of view.
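On the squid side, plugging such an ICAP service in takes only a few
directives (the service name, port, and URI path below reflect my own
setup, not defaults):

    # Enable ICAP and pass each request to the filtering service
    # before the cache lookup (REQMOD pre-cache)
    icap_enable on
    icap_service url_filter reqmod_precache bypass=off icap://127.0.0.1:1344/url_check
    adaptation_access url_filter allow all

Because the blacklist lives behind the ICAP service, squid itself never
needs a reconfigure when the list changes.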
Would you prefer filtering based on a reload, or on a persistent DB such
as MongoDB or Tokyo Tyrant?
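To make the persistent-DB idea concrete, here is a minimal sketch of a
squid external ACL helper that looks each destination up in MongoDB
instead of an in-memory list, so entries can be added or removed with no
reload at all (the database layout, the names, and the pymongo dependency
are my assumptions for the example, not part of any existing tool):

    #!/usr/bin/env python
    # Minimal external ACL helper: reads one destination host per line
    # from squid, answers OK (listed) or ERR (not listed).
    # Assumed squid.conf wiring:
    #   external_acl_type bl_check %DST /usr/local/bin/bl_helper.py
    #   acl blacklisted external bl_check
    #   http_access deny blacklisted
    import sys
    from pymongo import MongoClient

    # Assumed DB layout: one document per blocked domain,
    # e.g. {"domain": "ads.example.com"}
    coll = MongoClient("mongodb://127.0.0.1:27017")["filtering"]["blacklist"]

    for line in sys.stdin:
        domain = line.strip().lower()
        found = coll.find_one({"domain": domain}) is not None
        sys.stdout.write("OK\n" if found else "ERR\n")
        sys.stdout.flush()  # squid blocks waiting for each answer line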
Since there have been some bug fixes in squid, ICAP-based solutions can
now give more throughput, and I got to about 8k persistent requests on an
Intel Atom based filtering system.
Regards,
Eliezer