Hi all,
I have a question about using squid to dump the contents of the cache, keep a copy of the cache, or prevent a site (and its associated content) from being deleted from squid's cache.
There are some sites on the net that I would like to be able to browse in the future even if they go offline or get deleted/modified. wget is not really useful for this, since it does not interpret JavaScript and may return a different result than what I see when browsing - robots.txt and similar nuisances get in the way as well.
So, I would like to keep a secure copy of the websites I have browsed. squid already caches them, but how do I preserve the cached content? Am I missing something in the manuals?
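The closest I can come up with is a squid.conf sketch like the one below. refresh_pattern, maximum_object_size, cache_dir and the swap watermarks are real directives, but the specific values and the idea that the overrides keep objects fresh indefinitely are just my assumptions from reading the docs:

    # Sketch only: one large 10 GB ufs cache directory
    cache_dir ufs /var/spool/squid 10000 16 256

    # Allow big objects into the cache (value is a guess)
    maximum_object_size 256 MB

    # Treat everything as fresh for a year (525600 minutes) and override
    # server hints that would otherwise expire or bypass the cache.
    # These overrides violate HTTP and are assumptions on my part.
    refresh_pattern -i . 525600 100% 525600 override-expire override-lastmod ignore-reload ignore-no-store ignore-private

    # Eviction still kicks in once the cache fills past these watermarks,
    # so this does not really pin anything
    cache_swap_low 90
    cache_swap_high 95

As far as I can tell, though, nothing there stops the replacement policy from evicting objects once the cache fills, which is exactly what I want to prevent.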
Best regards
Luigi Monaco