[Nottingham] Downloading multiple images from a website
Michael Quaintance
penfoldq at penfoldq.co.uk
Wed Jun 1 08:23:27 BST 2005
Roger Light said:
>
> wget --wait 10 --random-wait `for i in $(seq 0 999); do echo "http://blah/${i}.jpg"; done`
>
> I've not tested either of those, so feel free to pick holes.
>
Wonderful, this is the type of thing I was trying to achieve, but I hadn't
got the backticks, for and echo the right way round.
Thanks very much. I know how to do this as a 'C' program, but that is
clearly overkill. I must learn shell programming.
Correct me if I am wrong, but won't this actually spawn 1000 consecutive
instances of wget? I might be better off using the 'for' loop to create a
list file of each URL I want to download, and feeding that to wget so it
actually performs the --random-wait algorithm. Actually, this seems better,
as I can then randomise the list file and get the images in a pseudo-random
order.
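For reference, here is a minimal sketch of that list-file approach (same
placeholder http://blah/ URLs as above; the shuffle step assumes something
like GNU shuf is to hand, but any other way of randomising the lines would
do):

  # Build the list of URLs, shuffle it, then let a single wget
  # instance work through the file with the --wait/--random-wait delays.
  for i in $(seq 0 999); do
      echo "http://blah/${i}.jpg"
  done > urls.txt

  shuf urls.txt > urls-shuffled.txt   # or any other line randomiser

  wget --wait 10 --random-wait --input-file urls-shuffled.txt

That way a single wget process reads the whole list via --input-file, so
the random delays apply between every retrieval across the entire run.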
Cheers
-Penfold.