[Gllug] Splitting files at 2GB barrier

Nix nix at esperi.org.uk
Fri Jan 26 20:14:52 UTC 2007


On 26 Jan 2007, Richard Cottrill said:

> Hi guys,
>  
> I'm trying to backup a disk image to a device that cuts files at 2GB. In
> this case I've booted to a Live CD, so /dev/hda is not mounted. My first try
> was:
>
> for i in 1 2 3 4 5 6 7 8 9 ; do
>     gzip --best | dd of=<mounted smb share>/image.part${i}.gz obs=1000000
> count=2000 ;
> done < cat /dev/hda

That won't work. It redirects stdin from a file called `cat', and the
extraneous /dev/hda is a syntax error. Try

cat /dev/hda | for i in $(seq 1 9); do
    gzip --best | dd of=<mounted smb share>/image.part${i}.gz obs=1000000 count=4000000
done
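(A caveat worth spelling out, assuming GNU dd: count= limits *input*
records of ibs bytes, which default to 512 -- obs= only sets the output
buffering, so obs=1000000 count=2000 copies about 1MB, not 2GB. To get
roughly 2GB per piece you want count=4000000, i.e. 4000000 * 512 bytes.
A small demonstration with throwaway files:)

```shell
# dd's count= limits input records of ibs bytes (default 512); obs= only
# sets output buffering.  Assumes GNU dd and stat on Linux.
tmp=$(mktemp -d)
dd if=/dev/zero of="$tmp/src" bs=1M count=4 2>/dev/null    # 4MiB source
# count=2000 with default ibs=512 copies 2000 * 512 bytes, whatever obs is
dd if="$tmp/src" of="$tmp/a" obs=1000000 count=2000 2>/dev/null
# bs= sets ibs and obs together, so count now means 1000000-byte records
dd if="$tmp/src" of="$tmp/b" bs=1000000 count=2 2>/dev/null
size_a=$(stat -c %s "$tmp/a")   # 2000 * 512  = 1024000 bytes
size_b=$(stat -c %s "$tmp/b")   # 2 * 1000000 = 2000000 bytes
echo "$size_a $size_b"
rm -r "$tmp"
```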

This is also expressible as

for i in $(seq 1 9); do
    gzip --best | dd of=<mounted smb share>/image.part${i}.gz obs=1000000 count=4000000
done < <(cat /dev/hda)

(note the space between the two `<'s! That's process substitution: bash
runs the cat in a subshell and feeds its output in on stdin.)

Or, if you want to have a single huge gzip sliced into pieces:

cat /dev/hda | gzip --best | for i in $(seq 1 9); do
    dd of=<mounted smb share>/image.part${i}.gz obs=1000000 count=4000000
done
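(This works because every dd reads from the same pipe and stops after its
quota, leaving the rest for the next iteration -- but the pieces are only
meaningful concatenated back in order, since all but the first start
mid-stream. A small-scale sketch of the round trip, with sizes shrunk and
hypothetical temp paths:)

```shell
# Slice one gzip stream into pieces of at most 100 * 512 bytes each,
# then reassemble and verify.  GNU tools assumed; /dev/urandom stands
# in for the disk.
tmp=$(mktemp -d)
head -c 350000 /dev/urandom > "$tmp/data"
gzip --best < "$tmp/data" |
for i in $(seq 1 9); do
  dd of="$tmp/part$i" count=100 2>/dev/null   # default ibs=512
done
# Reassembly: concatenate the pieces in order, decompress, compare.
# Trailing empty pieces (the stream ran out) are harmless.
cat "$tmp"/part[1-9] | gunzip > "$tmp/restored"
restored_ok=0
cmp -s "$tmp/data" "$tmp/restored" && restored_ok=1
echo "round trip ok: $restored_ok"
rm -r "$tmp"
```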

> cat seemed to be essential... I know it's not good form.

It sometimes makes things clearer.

> this didn't work. Aside from creating a number of junk/pseudo-empty files
> when it runs out of data, the image doesn't seem consistent. In addition, it
> seems to cut the files at 1GB, rather than 2GB (I could just fiddle the
> numbers - but it's odd). Please lets not argue about MiB vs. MB...

I'm surprised it generated an image at all.

-- 
`The serial comma, however, is correct and proper, and abandoning it will
surely lead to chaos, anarchy, rioting in the streets, the Terrorists
taking over, and possibly the complete collapse of Human Civilization.'
-- 
Gllug mailing list  -  Gllug at gllug.org.uk
http://lists.gllug.org.uk/mailman/listinfo/gllug

