[SWLUG] How to cope with superlarge file
Jonathan Wright
mailing-lists at djnauk.co.uk
Mon Dec 12 23:32:08 UTC 2005
Neil Jones wrote:
> I am exploring an idea. One of the possible options involves
> downloading a .tar file that is around 300 _Giga_ bytes in size.
No-one's asked the question yet, and you're free not to answer it if
the situation doesn't allow, but why have you opted to download a
300GB tar file via HTTP?
Is it a single file that has been tarred, or a group of files? If it's
a group of files, why not download them separately?
(BTW, it may be worth looking into whether any of the transfer methods,
FTP, HTTP, etc., can support such a large file - you may find many are
limited to, say, 2GB or 4GB files.
Also, even co-location or broadband is not a guarantee that you will
receive the file in one piece in one go - a single drop-out at any point
can render the transfer useless.)
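One way round both problems (per-file size limits and drop-outs killing
the whole transfer) is to split the archive into chunks on the server
and reassemble them after downloading. A minimal sketch using standard
split/cat - the filenames and chunk size here are made up for
illustration, and a small dummy file stands in for the real archive:

```shell
# Stand-in for the real 300GB tar (64KB of random data).
dd if=/dev/urandom of=archive.tar bs=1024 count=64 2>/dev/null

# Record a checksum so the reassembled copy can be verified.
md5sum archive.tar > archive.tar.md5

# Split into 16KB chunks; for a real archive you'd use something
# like 'split -b 1G' so each piece stays well under any 2GB/4GB limit.
split -b 16k archive.tar archive.tar.part-

# ...each part is fetched separately, so a drop-out only means
# re-downloading the one affected part, not the whole 300GB...

# Reassemble on the receiving end and verify against the checksum.
cat archive.tar.part-* > archive-rebuilt.tar
md5sum archive-rebuilt.tar
```

Because split names the parts in sorted order (aa, ab, ac, ...), a
plain `cat` of the glob reassembles them correctly.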
--
Jonathan Wright
~ mail at djnauk.co.uk
~ www.djnauk.co.uk
--
2.6.14-gentoo-r2-djnauk-b1 AMD Athlon(tm) XP 2100+
up 8 days, 12:51, 2 users, load average: 3.15, 3.59, 3.62
--
cat /dev/random (because u never know, u may see something u like)
--
"Did you hear about the Scottish drag queen? He wore pants."
~ Lynn Lavner