> -----Original Message-----
> From: Chris Wilcox [mailto:not_rich_yet@hotmail.com]
> Just had a suggestion about a project I'm working on: can we provide
> predictive caching? I know it's possible to use cron and wget to schedule
> downloads of pages to keep them in the cache, but is there any way I can
> get squid to follow links on pages it downloads so they load even quicker
> when requested by users?
I don't think so, but it seems like this is the sort of thing you could
easily tinker with as a separate program. Here's my thought: write your
own very basic proxy, maybe in Perl or some other interpreted language for
the proof-of-concept version so it's easy to tweak. Point your browser at
this proxy, and point your proxy at Squid. Then your experimental proxy can
follow the links in each page after passing it on to the web browser, and
Squid will automatically cache whatever the proxy retrieves. Make sense?
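To make the idea concrete, here's a rough sketch of the prefetching half of
such a proxy, in Python rather than Perl (all names, the port, and the
example URLs are hypothetical, and the proxy address assumes Squid's default
localhost:3128): after a page has been handed to the browser, extract its
links and fetch each one through Squid so the responses land in Squid's
cache before the user clicks them.

```python
# Sketch of the link-prefetching step (hypothetical helper names; the
# Squid proxy address localhost:3128 is an assumption based on Squid's
# default port). Not a complete proxy -- just the "follow the links"
# piece that runs after the page is passed to the browser.
from html.parser import HTMLParser
from urllib.parse import urljoin
import urllib.request


class LinkExtractor(HTMLParser):
    """Collect absolute URLs from <a href=...> tags in an HTML page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL.
                    self.links.append(urljoin(self.base_url, value))


def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links


def prefetch(urls, squid_proxy="http://localhost:3128"):
    # Route every request through Squid; the act of fetching is what
    # populates Squid's cache for the user's later click.
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": squid_proxy})
    )
    for url in urls:
        try:
            opener.open(url, timeout=5).read()
        except OSError:
            pass  # a failed prefetch costs nothing; the user never sees it


if __name__ == "__main__":
    page = '<a href="/a.html">A</a> <a href="http://example.com/b">B</a>'
    print(extract_links(page, "http://example.com/"))
```

In a real version you'd run `prefetch()` in a background thread so the
browser never waits on it, and you'd probably cap the depth and number of
links followed so one busy page doesn't flood the cache.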
If you decide to play with this, keep us posted. I find the idea pretty
interesting.
Received on Tue Jul 22 2003 - 10:10:19 MDT