[H-GEN] Want squid to automatically update certain sites every 10 mins
Sarah Hollings
sarah at humanfactors.uq.edu.au
Sat Mar 22 20:58:53 EST 2003
Shane Ravenn wrote:
> On Sun, 23 Mar 2003 09:49:34 +1000
> "t" <s4565 at lycos.co.uk> wrote:
>
>
>>Thanks for the wget suggestion. I have just been trying it out but I
>>cannot seem to download the images. If I do
>>
>>wget -nd --delete-after "www.smh.com.au"
>>
[snip]
> Hi
>
> Among the best options for this would be this bit, taken directly out
> of the man page:
> -p
> --page-requisites
> This option causes Wget to download all the files that are necessary to
> properly display a given HTML page. This includes such things as inlined
> images, sounds, and referenced stylesheets.
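To tie that back to the squid question in the subject line: a rough, untested
sketch (assuming squid is listening on localhost:3128, and that the images may
live on a different host to the page itself, hence -H) is to let cron pull the
page through the proxy and throw the local copies away - the objects still end
up in squid's cache:

  # fetch the page plus its inline images via squid, then delete the local files
  env http_proxy=http://localhost:3128/ \
      wget -q -nd -p -H --delete-after http://www.smh.com.au/

  # crontab entry to repeat that every 10 minutes
  */10 * * * * env http_proxy=http://localhost:3128/ wget -q -nd -p -H --delete-after http://www.smh.com.au/

Anything the site marks uncacheable will of course still be refetched from the
origin server each time.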
lynx will do some of this, and may fill the need - see "man lynx" and check
out the -crawl and -traverse options.
Sarah Hollings