I've changed:
acl QUERY urlpath_regex cgi-bin \?
to
acl QUERY urlpath_regex cgi-bin
and restarted squid, but it doesn't look like the page is being cached
properly. Do I need to do something else?
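For reference, with that change QUERY only matches URLs containing
"cgi-bin" now, so the "no_cache deny QUERY" line shouldn't be what is
blocking index.php any more. My guess (not verified) is that the CMS
itself is sending headers that forbid caching - PHP will often emit
"Cache-Control: no-cache", "Pragma: no-cache" or an Expires date in the
past - and squid honors those regardless of the acl. One way to check
what the origin is actually sending is to hit it directly (assuming curl
is installed on the box; the URL is the one from the access.log lines
below):

    curl -I "http://10.10.10.19/index.php"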
Thanks,
Max
TCP_MISS/200 27983 GET http://10.10.10.19/index.php? - DIRECT/10.10.10.19 text/html
TCP_MISS/200 27983 GET http://10.10.10.19/index.php? - DIRECT/10.10.10.19 text/html
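Both requests above came back TCP_MISS, so squid is either refusing to
store the reply or treating it as stale immediately. If the headers turn
out to be the problem and the CMS can't easily be changed, a
refresh_pattern with override options might be a workaround. This is only
a sketch - I haven't tested it against this setup, and the options
deliberately violate HTTP - but they do exist in squid 2.5:

    # give .php responses a baseline freshness of 60 minutes, overriding
    # the origin's Expires/Last-Modified and ignoring client reloads
    refresh_pattern -i \.php 60 20% 240 override-expire override-lastmod ignore-reload

It would need to sit above the catch-all "refresh_pattern . 0 40% 1440"
line, since squid uses the first pattern that matches.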
On 3/14/06, Max Clark <max.clark@gmail.com> wrote:
> Oh - and for reference squid is running on a CentOS 4.2 machine with
> the following configuration:
>
> http_port 80
> hierarchy_stoplist cgi-bin ?
> acl QUERY urlpath_regex cgi-bin \?
> no_cache deny QUERY
> cache_mem 128 MB
> maximum_object_size 40960 KB
> cache_dir diskd /var/spool/squid 1024 16 256
> cache_access_log /var/log/squid/access.log
> cache_log /var/log/squid/cache.log
> cache_store_log /var/log/squid/store.log
> auth_param basic children 5
> auth_param basic realm Squid proxy-caching web server
> auth_param basic credentialsttl 2 hours
> auth_param basic casesensitive off
> refresh_pattern -i gif$ 0 50% 4320
> refresh_pattern -i jpg$ 0 50% 4320
> refresh_pattern -i png$ 0 50% 4320
> refresh_pattern -i ico$ 0 50% 4320
> refresh_pattern -i js$ 0 50% 4320
> refresh_pattern -i css$ 0 50% 4320
> refresh_pattern -i html$ 0 50% 4320
> refresh_pattern . 0 40% 1440
> acl all src 0.0.0.0/0.0.0.0
> acl manager proto cache_object
> acl localhost src 127.0.0.1/255.255.255.255
> acl to_localhost dst 127.0.0.0/8
> acl SSL_ports port 443 563
> acl Safe_ports port 80 # http
> acl Safe_ports port 21 # ftp
> acl Safe_ports port 443 563 # https, snews
> acl Safe_ports port 70 # gopher
> acl Safe_ports port 210 # wais
> acl Safe_ports port 1025-65535 # unregistered ports
> acl Safe_ports port 280 # http-mgmt
> acl Safe_ports port 488 # gss-http
> acl Safe_ports port 591 # filemaker
> acl Safe_ports port 777 # multiling http
> acl CONNECT method CONNECT
> http_access allow manager localhost
> http_access deny manager
> http_access deny !Safe_ports
> http_access deny CONNECT !SSL_ports
> http_access allow localhost
> http_access allow all
> http_reply_access allow all
> icp_access allow all
> httpd_accel_host 10.10.10.19
> httpd_accel_port 80
> httpd_accel_single_host on
> httpd_accel_with_proxy on
> coredump_dir /var/spool/squid
>
> On 3/14/06, Max Clark <max.clark@gmail.com> wrote:
> > Hi all,
> >
> > I have an _old_ server running a simple CMS driven web site. This
> > machine is sorely underpowered and has been having extreme issues
> > lately keeping up with the load. We are currently in development on a
> > new CMS platform to replace these old systems with a unified
> > environment - but for now I need to keep this box running.
> >
> > So for the short term I have placed a squid cache in front of this CMS
> > server which for the most part has been working perfectly. The CMS has
> > pages that look like this "/page.dyn?N=12345" where the N=12345
> > references the article to be served. I know squid by default will not
> > cache dynamic pages, but in this case how do I instruct squid to cache
> > these pages while treating each distinct N= value as a separate page?
> > This will save me many hours of sleep.
> >
> > Thanks in advance,
> > Max
> >
> > --
> > Max Clark
> > http://www.clarksys.com
> >
>
>
> --
> Max Clark
> http://www.clarksys.com
>
--
Max Clark
http://www.clarksys.com