On Mon, 7 Jul 1997, Christophe Zwecker wrote:
> Gregory Maxwell wrote:
> >
> > On Sun, 6 Jul 1997, Christophe Zwecker wrote:
> >
> > > squid ? The idea is to feed wget my bookmarkfile and to browse through
> > > everything offline, using squid
> >
> > I've never used wget, but I suppose you would just make it use squid
> > as a proxy and redirect wget's output to /dev/null.. Then, if you
> > configure squid to pull from the cache when it can't make a
> > connection, you should be all set..
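(For the record, that route would be something along these lines,
assuming a wget that honours the http_proxy variable and knows
--delete-after, with squid on its default port 3128:

  http_proxy=http://localhost:3128/ wget -r --delete-after -i bookmarks.html

--delete-after throws the files away after fetching, which is fine
since the copies you care about end up in squid's cache.)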
> well, the thing is, I will tar.gz wget's output and get it from my ISP,
> because I will run wget over there (the point is to save money). So the
> question is, how do I pipe the dirs and files wget got into squid..
>
> any ideas?
Eeek.. That would be QUITE a hack.. Hmm.. You'd probably need to hack
wget so that it produced output like this:
say, for squid's site:
/squid.nlanr.net/index.html
...
(^^ see, the complete URL)..
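As an aside, recursive wget already names its output roughly like
that by default, hostname directory first, so the wget side of the
hack may come for free. Made-up example layout:

  squid.nlanr.net/index.html
  squid.nlanr.net/Doc/FAQ.html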
Then you severely hack a copy of some proxy server (maybe squid.. or
maybe something easier to hack)... Or maybe you write a script that
listens on a port and receives proxy requests of the form
http://squid.nlanr.net/index.html
and reads the appropriate file..
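A rough sketch of that listener in Python; the port (3129) and the
mirror location are made up for illustration, and a real squid parent
would probably want more headers than this:

  #!/usr/bin/env python3
  # Rough sketch of a fake "parent proxy": answers proxy-style GET
  # requests out of a local wget mirror tree instead of the network.
  # MIRROR_ROOT and the port (3129) are assumptions, not anything
  # squid or wget ships with.
  import mimetypes
  import os
  from http.server import BaseHTTPRequestHandler, HTTPServer
  from urllib.parse import urlparse

  MIRROR_ROOT = "/var/spool/wget-mirror"  # untarred wget output

  class FakeParent(BaseHTTPRequestHandler):
      def do_GET(self):
          # A proxy request line carries an absolute URL:
          #   GET http://squid.nlanr.net/index.html HTTP/1.0
          url = urlparse(self.path)
          path = url.path or "/"
          if path.endswith("/"):
              path += "index.html"  # wget saves directory URLs this way
          local = os.path.join(MIRROR_ROOT, url.netloc, path.lstrip("/"))
          try:
              with open(local, "rb") as f:
                  body = f.read()
          except OSError:
              self.send_error(404, "not in the mirror")
              return
          ctype = mimetypes.guess_type(local)[0] or "text/html"
          self.send_response(200)
          self.send_header("Content-Type", ctype)
          self.send_header("Content-Length", str(len(body)))
          self.end_headers()
          self.wfile.write(body)

  HTTPServer(("127.0.0.1", 3129), FakeParent).serve_forever()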
Then you configure squid with your fake proxy set as the only parent..
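That part is a couple of lines in squid.conf; a sketch, assuming a
squid that spells the directive cache_peer (older releases called it
cache_host) and the fake parent from above on port 3129:

  cache_peer 127.0.0.1 parent 3129 0 no-query default
  never_direct allow all   # send every miss to the parent

(never_direct needs an 'all' ACL defined; stock configs usually have
one.)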
Then you rerun wget with the exact same parameters to populate your
squid..
OR you could just make a program to take your wget'd stuff and
organise it into a squid cache directory structure and create the
metadata.. Then you feed it to squid..
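Squid's on-disk metadata format is internal and changes between
versions, so I wouldn't guess at it here; the safe half to sketch is
rebuilding the URL list from the wget tree, which you could then pump
back through squid. Assuming the hostname/path layout from above:

  #!/usr/bin/env python3
  # Rough sketch: reconstruct the original URLs from a wget mirror
  # tree (hostname/path layout).  MIRROR_ROOT is an assumption for
  # illustration.
  import os

  MIRROR_ROOT = "/var/spool/wget-mirror"

  for dirpath, _dirnames, filenames in os.walk(MIRROR_ROOT):
      for name in filenames:
          rel = os.path.relpath(os.path.join(dirpath, name), MIRROR_ROOT)
          host, _, path = rel.partition(os.sep)
          if name == "index.html":
              path = path[:-len("index.html")]  # was a directory URL
          print("http://%s/%s" % (host, path.replace(os.sep, "/")))

Pipe its output into wget through the proxy (if your wget reads a URL
list from stdin with "-i -"; otherwise write it to a file first) and
the cache fills itself.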