Coming from a novice background myself, I think I'm well qualified to give
you my understanding of how proxy filtering with squid works (or should
work). I agree with you on the terminology, but IMHO squid has been the
easiest package to get working for me ... although I wish people wouldn't
'assume' knowledge of things like regular expressions ('regex') etc.
Anyway, on with the help!
The simplest method of blocking sites in squid is to set up what is
called an 'ACL', or access control list. These are basically a set of
'rules' that determine whether a request is allowed through. What you want
is an ACL rule that says: allow requests from my six modem IPs, as long as
the URL of the request doesn't contain any of these 'words'.
Without sitting at my actual proxy machine I can't give you a precise
description of what you need to have in your squid.conf, but it should be
something like this (see the Access Control section):
acl allowedsrc src "/usr/local/squid/etc/allowedsrc.squid"
acl bannedsites url_regex "/usr/local/squid/etc/bannedsites.squid"
http_access allow allowedsrc !bannedsites
http_access deny all
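As a rough sketch, the two files referenced above might look like the
following (the addresses and patterns here are invented placeholders for
illustration, not anything from your setup):

```
# /usr/local/squid/etc/allowedsrc.squid -- one IP address per line
192.168.1.10
192.168.1.11

# /usr/local/squid/etc/bannedsites.squid -- one regex per line
www\.playboy\.com
www\.hardcore\.com
```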
The two files mentioned need to be created. The src file (I'm going
from memory here, check the docs/squid.conf for what I'm saying) should
contain a list, one IP address per line, of the machines/modems that are
allowed to use your proxy - it isn't a very good idea to allow just anybody
to access it. The banned sites file is a 'regex'-formatted file; a regex is
a type of 'search pattern'. The simplest way to use it is:
To match the following you would have...
http://www.playboy.com -> www\.playboy\.com
www.hardcore.com -> www\.hardcore\.com
www.geocities.com/shagmequick -> www\.geocities\.com/shagmequick
Each of those will give a match on any URL that merely *contains* the
pattern - so "www\.playboy\.com" matches searches for the site as well as
the site itself.
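To see how those patterns behave, here is a small sketch using Python's re
module (Python rather than squid itself, but squid's url_regex does the
same kind of unanchored search - the pattern can match anywhere in the URL):

```python
import re

# Patterns in the same style as the banned sites file above.
banned = [r"www\.playboy\.com", r"www\.hardcore\.com"]

def is_banned(url):
    """Return True if any banned pattern occurs anywhere in the URL."""
    return any(re.search(p, url) for p in banned)

print(is_banned("http://www.playboy.com/index.html"))         # True
print(is_banned("http://search.example/?q=www.playboy.com"))  # True - matches inside a query string too
print(is_banned("http://www.example.com/"))                   # False
```

Note that the dots are escaped (\.) because an unescaped "." in a regex
matches any character, not just a literal full stop.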
If the user's request does get rejected they don't get any explanation, just
a screen telling them that their access was denied.
If you listen to all the 'big players' they'll recommend you go for a URL
redirector, which I have no experience of. A redirector does much the same
as the above, except that it redirects the user to another page - for
example, a page telling them to check out the local nightclub rather than
use your internet access!
The URL redirector also has other advantages: for heavily loaded,
high-bandwidth sites it is much faster than ACL matching, it can be easier
to manage, and it allows you to have a more intelligent cache. For example,
if you want users to download a certain zip file - Netscape, say - you can
force them to get it from you rather than direct (put in a pattern match
for nexxxxxx.zip and redirect them to a local copy on a web server at your
site). A URL redirector also means you don't have to do a "squid -k
reconfigure" every time you add a banned site; you just reload the
redirector program.
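For what it's worth, a redirector is just a program squid feeds requests to
on stdin, one per line ("URL client-ip/fqdn ident method"), and which
writes back either a replacement URL or a blank line to leave the request
alone. A minimal sketch in Python - the banned-word list and the "denied"
page URL are made-up examples, not anything squid provides:

```python
import sys

BANNED_WORDS = ["playboy", "hardcore"]               # made-up example list
DENIED_URL = "http://proxy.example.com/denied.html"  # hypothetical local page

def rewrite(line):
    """Given one redirector input line, return the reply squid expects.

    Input format: URL client-ip/fqdn ident method
    Reply: a replacement URL, or an empty string to pass through unchanged.
    """
    url = line.split()[0]
    if any(word in url.lower() for word in BANNED_WORDS):
        return DENIED_URL
    return ""  # blank reply: squid fetches the original URL

if __name__ == "__main__":
    for line in sys.stdin:
        # Replies must be written and flushed one per request,
        # or squid will sit waiting for an answer.
        sys.stdout.write(rewrite(line) + "\n")
        sys.stdout.flush()
```

Because the banned list lives in the redirector rather than squid.conf,
updating it means restarting only this little program, not squid.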
I shouldn't imagine a URL redirector would be very difficult to set up, BUT
it is obviously something else that you've got to think about and manage.
I would recommend you try the ACL approach to start with, to get a working
system going and something that you can use to sort out the 'other
problems' you'll get. Then look at getting a redirector going.
All the best,
Tim.
> -----Original Message-----
> From: Canary, Robert W. [mailto:Robert.Canary@alcoa.com]
> Sent: 18 May 1998 21:24
> To: 'squid-users@nlanr.net'
> Subject: httpd filter
>
>
> Hi,
>
> I've taken the advice of the RedHat Linux list and I am proceeding to
> install squid as a filter for pornographic sites. (Yes, I know I
> can't stop
> them all, but I can make a noticeable dent in them.)
>
> I am using IP Masq with six dialin modems. I am very green with this
> squid application. After reading through the docs I am even more
> confused,
> as I am a little frustrated at the lack of clear definitions of the
> terminology.
>
> Can someone point me in the right direction and get me going
> without having to make this a "2 year college credit course".
> I really just need to set up a filter of some sort; all the other
> stuff is
> really moot at this point but will be followed up later.
>
> thanks in advance
> --
> robert
>
Received on Wed May 20 1998 - 11:43:52 MDT
This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:40:15 MST