On Mon, 18 Apr 2011 18:30:51 -0700, Linda Walsh wrote:
> I was wondering if anyone had written a module for squid to change it
> into
> an 'accelerator', of sorts.
>
> What I mean, specifically -- well, there are a couple of levels.
>
> 1) Parsing fetched webpages and looking for statically included
> content
> (especially .css, maybe .js, possibly image files) and starting
> a "fetch" on those files as soon as it determines which files are
> going to be needed to render the page. By 'render', I mean something
> along the lines of "wget"'s --page-requisites.
>
> Theoretically, squid would have an edge as it sees the information
> first, and could start fetching all of the needed content in parallel
> ASAP (of course, if it isn't needed, or the client fetching that page
> stops the render, existing outstanding requests based on that page
> could be aborted).
>
Squid is designed not to touch the content. Doing so makes things
slower.
There are ICAP server apps and eCAP modules floating around that people
have written to plug into Squid to do that kind of adaptation. The only
public one AFAICT is the one doing gzipping; the others are all
proprietary or private projects.
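For illustration, the requisite-scanning half of such an adapter could look
something like the sketch below. It only shows the extraction step (finding
statically included CSS/JS/image URLs in a fetched page, wget
--page-requisites style); the names and the idea of queueing the results for
parallel prefetch are assumptions for this sketch, not part of any real
Squid, ICAP, or eCAP module.

```python
# Sketch: extract statically included page requisites from a fetched
# HTML body, so a hypothetical adapter could start parallel fetches
# before the client asks for them. Illustrative only.
from html.parser import HTMLParser
from urllib.parse import urljoin

class RequisiteExtractor(HTMLParser):
    """Collect URLs of statically included content (CSS, JS, images)."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.requisites = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "stylesheet" and "href" in attrs:
            self.requisites.append(urljoin(self.base_url, attrs["href"]))
        elif tag in ("script", "img") and "src" in attrs:
            self.requisites.append(urljoin(self.base_url, attrs["src"]))

def extract_requisites(base_url, html):
    parser = RequisiteExtractor(base_url)
    parser.feed(html)
    return parser.requisites

page = ('<html><head><link rel="stylesheet" href="/site.css">'
        '<script src="app.js"></script></head>'
        '<body><img src="logo.png"></body></html>')
# Each URL found here is a candidate for an early parallel fetch.
print(extract_requisites("http://example.com/index.html", page))
```

Note this is exactly the content-touching work the paragraph above warns
about: the scan itself costs time on every response, which is why Squid
leaves it to optional ICAP/eCAP hooks rather than doing it in core.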
>
> 2) Another level would be pre-inclusion of included content for pages
> that have already been fetched and are in cache.
>
> I.e. suppose a page is fetched and it's known that it
> includes 3-4 different css files. If it is a commonly fetched page,
> having each client do multiple fetches -- some of which
> may involve nested css files, so a client must parse and ask for
> more -- adds multiple Round-Trip-Times (RTTs) to
> the page's render time.
>
> Depending on load/RTT and sizes, there could be a significant speedup
> to the client if those files were all concatenated into 1 file,
> so instead of getting:
>
ESI does this, but it requires the website to support ESI syntax in the
page code.
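For reference, ESI works by marking up the page server-side; an ESI-aware
cache assembles the fragments into one response before sending it, so the
client gets the composed page in a single fetch. A minimal fragment along
the lines of the ESI 1.0 spec (the src URL is illustrative):

```html
<html>
  <body>
    <!-- An ESI-aware cache fetches and inlines this fragment;
         the client never sees the esi:include tag. -->
    <esi:include src="http://example.com/fragments/header" onerror="continue"/>
  </body>
</html>
```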
Amos
Received on Tue Apr 19 2011 - 01:50:51 MDT
This archive was generated by hypermail 2.2.0 : Wed Apr 20 2011 - 12:00:03 MDT