Andres Salazar wrote:
> Hello,
>
> I've set up my first reverse proxy to accelerate a site. I've used wget
> to spider the entire site several times and noticed that even after
> running it, some files never get cached, like HTML files! I presume it
> is because the HTML files don't have the correct cache headers.
>
> It didn't even want to cache .swf files, but then I added this line,
> and it helped a lot, though not completely.
>
> refresh_pattern . 0 20% 4320 ignore-reload
>
> I am thinking the best approach is to make Squid cache EVERYTHING,
> and then manually give it specific exceptions for dynamic content
> (like .php and some .html with embedded PHP scripts). I don't want to
> start editing files, because I want to test the performance increase
> before adding headers one by one.
No, it's not the best approach. The best approach is to measure what
really exists right now, then see how the variations you plan change
things.
Side points:

Benchmarking the current setup gives you the minimum threshold you want
to improve on.

The maximum threshold can be found by locating cacheable files of
various sizes and running apache-bench or a similar tool against them.
That will give you the 100%-cached speeds for those file sizes and the
requests/sec Squid can serve them at.
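A minimal sketch of that measurement, assuming Squid answers on
www.example.com and origin.example.com is the backend (both hostnames
and the URL are made up here; substitute your own):

  # baseline: requests/sec straight from the origin server
  ab -n 1000 -c 10 http://origin.example.com/images/logo.png

  # best case: requests/sec from Squid once the object is cached
  ab -n 1000 -c 10 http://www.example.com/images/logo.png

Compare the "Requests per second" figure in the two reports; the gap
between them is roughly the headroom caching can give you for objects
of that size.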
Back to main point:
Squid already attempts to cache every object it possibly can, making
exceptions only for objects whose explicit expiry information indicates
they are non-cacheable or too old.
The exception there is dynamic content: old configs shipped rules that
explicitly prevented dynamic content from being cached.
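For reference, the rules in question looked roughly like this in older
default squid.conf files (the exact lines vary between releases):

  acl QUERY urlpath_regex cgi-bin \?
  cache deny QUERY

That blocks caching of anything with "cgi-bin" or "?" in its URL path,
no matter what cache headers the response carries.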
Squid can easily cache dynamic content as well, provided the responses
contain Cache-Control: and/or Expires: headers. Just follow the
instructions at
http://wiki.squid-cache.org/ConfigExamples/DynamicContent
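In short (see the wiki page for the authoritative version), the idea
for Squid 2.7 and later is to drop the QUERY lines above and let
refresh_pattern handle dynamic URLs instead, roughly:

  # remove/comment out any "acl QUERY ..." and "cache deny QUERY" lines
  refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
  refresh_pattern . 0 20% 4320

With that, dynamic replies carrying explicit Cache-Control/Expires
headers get stored, while replies without any freshness information
stay uncached.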
Amos
--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE17
  Current Beta Squid 3.1.0.12