[Nottingham] Downloading multiple images from a website

Duncan John Fyfe djf at star.le.ac.uk
Wed Jun 1 12:22:00 BST 2005


On Wed, 2005-06-01 at 11:50 +0100, Robert Hart wrote:
> On Wed, 2005-06-01 at 10:16 +0100, Michael Quaintance wrote:
> 
> > That makes sense. Does the builtin nature also stop the command line
> > getting too long? 1000 URLs of approx 30 chars each plus a bit of overhead
> > for the initial wget options is more than I expect I could legally type as
> > a command line.
> 
> The builtin echo can take a much longer command line. But this won't help
> you in this case, because it was wget that you were giving the long
> command line to, and not echo.
> 
> If you are getting close to the limit for length of command line you
> should use xargs. (which has been mentioned often enough here before)
> 
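
That would look something like this (a rough sketch; a file urls.txt
holding one URL per line is an assumption here):

    # xargs splits the list into as many wget invocations as the
    # command-line length limit allows
    xargs wget < urls.txt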

Rather than messing around with backticks, command-line lengths and
xargs, why not write the URLs to a file and use "wget -i file"?
If you are concerned about overloading the server, try "wget --spider -o
log -i file" and use the output to filter your list.
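
For example, a rough sketch (assuming the URLs are in a file called
urls.txt, one per line):

    # download everything listed in the file
    wget -i urls.txt

    # or check the URLs first without downloading anything, then
    # inspect spider.log and trim the list accordingly
    wget --spider -o spider.log -i urls.txt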

> Rob
-- 
Duncan John Fyfe <djf at star.le.ac.uk>



