[Nottingham] Downloading multiple images from a website
Roger Light
rogerlight at gmail.com
Wed Jun 1 00:16:44 BST 2005
> I want to steal all the files from (not the actual website)
>
> http://www.website.com/images/random404img/
>
> with the filenames 0.jpg to 999.jpg but acknowledging that not all of
> these will exist and not wanting to hammer his bandwidth so I want to
> make sure that I wait a reasonable amount of time between each one.
Something like
for i in $(seq 0 999); do
    wget "http://blah/${i}.jpg"
    sleep 3s
done
should do it.
Alternatively:
wget --wait 10 --random-wait $(for i in $(seq 0 999); do echo "http://blah/${i}.jpg"; done)
I've not tested either of those, so feel free to pick holes.
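A third variant, untested like the others: generate the URL list and feed it to wget on stdin with -i -, which keeps the command line short and still lets --wait/--random-wait space out the requests ("blah" is the placeholder host from above).

```shell
#!/bin/sh
# Sketch: emit the candidate URLs 0.jpg .. 999.jpg, one per line.
# "http://blah/" stands in for the real site, as in the examples above.
urls() {
    seq 0 999 | sed 's|.*|http://blah/&.jpg|'
}

urls | head -n 2    # sanity check: prints http://blah/0.jpg and http://blah/1.jpg

# The actual download step (commented out here so nothing is fetched):
# urls | wget --wait 10 --random-wait -i -
```

wget simply logs an error for each 404 and carries on with the next URL, so the missing images are harmless.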
Cheers,
Roger