[Nottingham] Downloading multiple images from a website
Robert Hart
enxrah at nottingham.ac.uk
Wed Jun 1 12:47:31 BST 2005
On Wed, 2005-06-01 at 12:16 +0100, Duncan John Fyfe wrote:
> Rather than messing around with backticks, command line lengths and
> xargs, why not write the URLs to a file and use "wget -i file"?
> If you are concerned about overloading the server try "wget --spider -o
> log -i file" and use the output to filter your list.
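For reference, the approach quoted above might look like this (the URLs and filenames here are invented for illustration):

```shell
# 1. Write the list of URLs to a file:
cat > urls.txt <<'EOF'
http://example.com/a.jpg
http://example.com/b.jpg
EOF

# 2. Dry run: --spider checks each URL without downloading anything,
#    writing the results to "log" so the list can be filtered first.
#    (|| true: the check needs network access; this is only a sketch.)
wget --spider -o log -i urls.txt || true

# 3. Once the list is filtered, fetch it for real:
# wget -i urls.txt
```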
I was talking in a generic sense about long command lines, rather than
about the wget case specifically.
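The generic issue can be sketched as follows (filenames invented). Backticks expand every name onto a single command line, which can exceed the kernel's argument-length limit (ARG_MAX), whereas xargs reads the same names from stdin and batches them into several shorter command lines:

```shell
# Backtick style -- one potentially enormous command line:
#   grep pattern `find . -name '*.log'`

# xargs style -- here -n 2 forces batches of two arguments,
# so three names become two separate invocations:
printf '%s\n' one.log two.log three.log |
    xargs -n 2 echo grep pattern
```

In real use you would omit `echo` (it is included here so the batching is visible) and let xargs pick its own batch size, which it keeps safely under the system limit.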
Rob
--
Robert Hart <enxrah at nottingham.ac.uk>
University of Nottingham
This message has been checked for viruses but the contents of an attachment
may still contain software viruses, which could damage your computer system:
you are advised to perform your own checks. Email communications with the
University of Nottingham may be monitored as permitted by UK legislation.