[GLLUG] Transferring high volumes of data.
JLMS
jjllmmss at googlemail.com
Wed Jun 11 00:16:09 UTC 2014
Hi,
I am wondering what people out there are doing to transfer high volumes of
data (100 GB or more each time) between geographically distant sites.
I started with rsync over ssh (including a version of ssh optimized for
performance during file transfers) and got very poor throughput (3-7
MB/s).
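For reference, the kind of invocation I have been using looks roughly like
this (host, paths and the cipher choice below are illustrative, not my exact
setup):

    # single-stream baseline: rsync over ssh, picking a cheaper cipher
    # to cut CPU overhead on the encryption side
    rsync -av --partial -e "ssh -c aes128-ctr" \
        /data/ user@remote.example.com:/data/

Even with a cheaper cipher, a single TCP stream came nowhere near filling
the link, which is what pushed me towards parallelism.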
I then played with sending data in parallel (going as far as splitting some
files) and, although that improved speed by a factor of 3 or 4, the time
the transfers take is still unsatisfactory.
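To illustrate the splitting, here is a simplified version of what I do
(file names, chunk size and destination are made up):

    # split one large file into 1 GB chunks, push the chunks over
    # several concurrent rsync sessions, then reassemble remotely
    split -b 1G bigfile.tar bigfile.part.
    ls bigfile.part.* | xargs -P 4 -I{} \
        rsync -a --partial {} user@remote.example.com:/staging/
    # afterwards, on the remote side:
    #   cat bigfile.part.* > bigfile.tar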
My first approach to parallelism was simply opening many instances of
rsync, at which point the bottleneck became the number of sessions ssh
could handle before it started dropping connections. In the end I settled
on running around 10 rsync instances; this works much better, but would
still be considered slow by the powers that be.
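What I settled on is essentially a fixed pool of workers, roughly like the
sketch below (paths and host again illustrative). I suspect, without
having confirmed it, that the dropped connections were sshd's connection
throttling on the receiving end:

    # run up to 10 concurrent rsync instances, one per top-level entry
    find /data -mindepth 1 -maxdepth 1 -print0 |
        xargs -0 -P 10 -I{} \
        rsync -a --partial {} user@remote.example.com:/data/

    # if sshd throttling is the culprit, loosening it on the receiver
    # may help, e.g. in /etc/ssh/sshd_config (values illustrative):
    #   MaxStartups 30:30:100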
I would appreciate any ideas, pointers, etc. that might make it possible to
transfer these amounts of data as efficiently as possible. I am looking for
expertise in the field rather than assisted googling (although if you find
something very, very interesting I would of course love to hear about it)
:-)
Thanks!