[Gllug] IDE RAID
Rich Walker
rw at shadow.org.uk
Wed Jun 11 17:18:27 UTC 2003
Richard Cottrill <richard_c at tpg.com.au> writes:
> Simon Faulkner wrote:
> > We have a requirement for a lot of storage on a server.
> > I was thinking of using the Adaptec ATA RAID 2400A and 4 x 250 Gb IDE
> > drives.
>
> Is there a specific requirement for a hardware RAID?
>
> I understand that software RAIDs are generally significantly quicker
> (something to do with the outrageous speed of modern processors). The
> only other comment I can think of is to try to get good IDE cards for
> each drive.
My experience has been that the route to do this kind of thing looks
more like:
When N GB hard drives are at their cheapest per gigabyte, and you need
4*N GB of storage, buy four drives from one manufacturer and four from
another (make sure they come from *different factories*!).
Label the drives A,B,C,D for the one vendor and E,F,G,H for the other.
Install the mdadm tools.
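(On a Debian-ish box that's just "apt-get install mdadm"; other
distributions package it too.) Then spread the drives across the four
IDE channels, one vendor's batch on ide0/ide1 and the other's on
ide2/ide3: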
ide0: A B
ide1: C D
ide2: E F
ide3: G H
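(On a stock Linux kernel those positions have fixed device names -
ide0 carries /dev/hda and /dev/hdb, ide1 /dev/hdc and /dev/hdd, ide2
/dev/hde and /dev/hdf, ide3 /dev/hdg and /dev/hdh - so below, A means
/dev/hda, B means /dev/hdb, and so on through H as /dev/hdh.)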
Configure four RAID-1 arrays, such that the two drives in each pair sit
on separate controllers and come from different vendors (mdadm
invocations sketched after the list):
md0: A E
md1: B F
md2: C G
md3: D H
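In mdadm terms, something like this - a sketch, assuming the device
names above and that you use whole disks (in practice you'd probably
partition each drive and hand mdadm the partitions, e.g. /dev/hda1):

  # mirror each vendor-1 drive with a vendor-2 drive
  # on a different controller
  mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/hda /dev/hde
  mdadm --create /dev/md1 --level=1 --raid-devices=2 /dev/hdb /dev/hdf
  mdadm --create /dev/md2 --level=1 --raid-devices=2 /dev/hdc /dev/hdg
  mdadm --create /dev/md3 --level=1 --raid-devices=2 /dev/hdd /dev/hdh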
*Then* configure a RAID-0 or linear (your preference) drive-set on top
of the mirrors:
md4: md0 md1 md2 md3
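Again roughly:

  mdadm --create /dev/md4 --level=0 --raid-devices=4 \
        /dev/md0 /dev/md1 /dev/md2 /dev/md3

(use --level=linear if you'd rather have linear). Then mkfs /dev/md4
as usual, and save the layout so the arrays reassemble at boot:

  mdadm --detail --scan >> /etc/mdadm.conf

(the conf file lives at /etc/mdadm/mdadm.conf on some distributions).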
This is technically overkill (8 drives for a 4*N result) but it does
have the advantages that the failure of a controller can't bring your
box down, the failure of 1 drive is neither here nor there, and the
failure of an entire batch of drives from a given vendor can't bring
your box down. (Yes, that does happen. My favourite version of it was
an account of a server with ~25 drives being shut down - on reboot it
was found that the dry bearings on the spindles had melted while the
system had been running, and on shutdown had turned into glue. Many
simultaneous failures...)
cheers, Rich.
--
rich walker | technical person | Shadow Robot Company | rw at shadow.org.uk
front-of-tshirt space to let 251 Liverpool Road |
London N1 1LX | +UK 20 7700 2487
--
Gllug mailing list - Gllug at linux.co.uk
http://list.ftech.net/mailman/listinfo/gllug