[sclug] RAID 5 - Summary
Peter Brewer
p.w.brewer at reading.ac.uk
Thu Sep 22 14:37:32 UTC 2005
Thank you all for your assistance. I think all is OK now. I've
reinstalled Ubuntu from scratch using RAID 5 with 7 active disks and 1
spare (inactive) disk. The active/inactive distinction apparently just
means whether a disk is actively used in the RAID and has nothing to do
with boot stuff. So our system now has RAID 5 across 7x 250GB disks
(which gives a total of 1.5TB usable, since RAID 5 capacity is n-1
disks), with one spare disk waiting on the sidelines to save the day if
any of the active RAID disks fail. This means that a total of 3 disks
would have to die in quick succession for us to lose data, assuming the
spare has time to rebuild after the first failure.
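For anyone doing something similar, this is roughly the sort of mdadm
invocation involved; /dev/md0 and the partition names (/dev/sdb1 through
/dev/sdi1) are just placeholders for whatever your eight devices
actually are:

  # Create a RAID 5 array from 7 active partitions plus 1 hot spare.
  # Usable space is (7 - 1) x 250GB = 1.5TB; the spare holds no data
  # until an active disk fails, at which point md rebuilds onto it.
  mdadm --create /dev/md0 --level=5 \
        --raid-devices=7 --spare-devices=1 \
        /dev/sd[b-i]1

  # Record the layout so the array is assembled at boot (Ubuntu keeps
  # this in /etc/mdadm/mdadm.conf).
  mdadm --detail --scan >> /etc/mdadm/mdadm.conf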
Due to the slightly confusing way the OS had named the drives, it
appears that the drive that 'failed' was actually the same drive that I
had removed. I'm quietly confident that the RAID borked because I
physically removed the drive, rather than because of anything more
sinister. I don't suppose we'll know for sure until one of the drives
dies for real!
In summary:
1) Fedora Core sucks :-P
2) Don't boot your RAID system with one drive unplugged - your computer
doesn't like it (see the quick health check below).
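If you do end up booting with a drive missing (or just want to watch a
rebuild), something along these lines shows the array state; again
/dev/md0 is just a placeholder for whatever your array is called:

  # Quick overview of all md arrays, including any rebuild in progress.
  cat /proc/mdstat

  # Per-array detail: which members are active, spare, faulty or removed.
  mdadm --detail /dev/md0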
Thanks again.
Pete