
Archiving with BD-R


nstgc


Before I begin, I'd like to address all those people who will tell me about disc rot and how discs sometimes just can't be read. I have previously archived with DVD+Rs and learned the hard way that discs decay over time and sometimes simply won't read at all. Burnable media isn't as reliable as commercially pressed discs. That's exactly why I'm making this topic: to make sure I don't have the same problem as last time.


Pretty much, I'm backing up a bunch of stuff. I'm a pack rat of sorts (or we can call it what it is: OCD/a phobia developed during a period when I had an HDD failure every few months), and I like to reinstall my operating systems frequently. Every six months or so I reinstall both Windows and Linux. I have a bunch of hard drives lying around, but I'd like to move their contents onto BD-Rs. Needless to say, the amount I need to back up is substantial (>2 TB). I want to store it, as well as whatever comes after, on BD-Rs for a long period of time without having to worry about data loss.
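
Just to put the scale in perspective (my own rough math, assuming single-layer 25 GB discs):

[code]
# Rough sizing for the backlog, assuming single-layer 25 GB BD-Rs.
backlog_tb = 2          # lower bound; the real figure is ">2 TB"
disc_gb = 25            # single-layer BD-R capacity
discs = backlog_tb * 1000 / disc_gb
print(f"at least {discs:.0f} data discs before any redundancy")  # 80
[/code]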


I'm currently planning to employ two different methods to prevent data loss. First, I'll use DVDisaster to create ECC files with 64 roots (33.5% redundancy). I'll store those ECC files on a separate BD-R, which will also carry the ECC file for the previous ECC-bearing disc.
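
For anyone wondering where those percentages come from: as far as I know, DVDisaster's error correction uses Reed-Solomon codewords over GF(2^8), 255 bytes each, where the root count is the number of parity bytes per codeword. A quick sketch (my own arithmetic and variable names, not anything out of DVDisaster itself):

[code]
# Redundancy as a function of Reed-Solomon root count, assuming
# 255-byte codewords where `roots` bytes are parity (GF(2^8)).
def rs_redundancy(roots: int) -> float:
    data = 255 - roots              # payload bytes per codeword
    return roots / data * 100       # parity as a percentage of payload

print(f"32 roots: {rs_redundancy(32):.1f}%")   # 14.3% ("normal")
print(f"64 roots: {rs_redundancy(64):.1f}%")   # 33.5% ("high")
[/code]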


Then I'll take the ISOs of six discs (four with data and two with ECC files) and create par2 files (using MultiPar) with 3000 source blocks, since anything more would take days to compute for ~150 GB of data. I'll store 1500 recovery blocks across three BD-Rs (500 blocks per disc, each in its own file). Each of these discs will in turn be protected with a 32-root (14.3% redundancy) ECC file; six of those files are stored on a single BD-R, plus one for the previous ECC-bearing disc.
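
To put concrete numbers on that layout (assuming ~150 GB per set and reading the 1500 blocks as par2 recovery blocks; the figures and names are mine):

[code]
# Back-of-envelope numbers for one six-disc set, assuming ~150 GB
# of ISOs, 3000 source blocks, and 1500 recovery blocks in total.
set_gb = 150
source_blocks = 3000
recovery_blocks = 1500

block_mb = set_gb * 1000 / source_blocks            # ~50 MB per block
redundancy = recovery_blocks / source_blocks * 100  # 50% of the set

print(f"block size: ~{block_mb:.0f} MB")
print(f"recovery data: {redundancy:.0f}%, i.e. "
      f"{recovery_blocks // 3} blocks on each of 3 discs")
[/code]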


The ECC is supposed to take care of basic aging, and for most discs it should be sufficient. However, I know from experience that some discs age unexpectedly quickly, and others can become damaged in such a way that they simply can't be read at all. That is what the par2 data is for: it covers the discs that age too quickly or become unreadable. It's also why they don't need more than 500 blocks per disc; they're for the cases where a disc is seriously screwed up. And if the par2 discs themselves get too badly off, the files on them are sufficiently small that it should still be possible to get enough blocks off them to reconstruct one entire disc.
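
A quick sanity check that one par2 disc's worth of blocks really can stand in for one dead data disc (reusing the ~50 MB block size worked out above; again, my own sketch):

[code]
# Sanity check: can one par2 disc's worth of recovery blocks
# regenerate one completely unreadable data disc? (Assumes the
# ~50 MB block size from above and 25 GB single-layer discs.)
block_mb = 50
blocks_per_par2_disc = 500
disc_gb = 25

recoverable_gb = blocks_per_par2_disc * block_mb / 1000
print(f"~{recoverable_gb:.0f} GB recoverable per par2 disc "
      f"(one full {disc_gb} GB disc)")   # 25 GB
[/code]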


I chose not to simply keep duplicate discs (instead of par2 files), since that would mean a redundancy ratio of 2:1 (three discs for every one data disc) instead of the 5:4 of my current implementation; I sketch the numbers below. Also, as discs get old and decay, I'll be checking them and making duplicates anyway to replace the dying ones. I am worried about the par2 discs, though, since if one of them goes completely bad I'll likely have to redo all the discs in that set (of three). [edit] Thirdly, is there any brand of media you would suggest? I'm currently using Optical Quantum because it's cheap, and so far the discs I burned in January (seven of them) don't show signs of heavy decay. However, if there is a better brand, I'll switch to that. [/edit]
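
Here's how I'm counting the overhead in both cases (assuming a set of four data discs and two shared ECC discs, and that duplication would mean a second copy of everything):

[code]
# Overhead per set of 4 data discs, as I understand the two options.
data = 4
ecc = 2                          # dvdisaster ECC discs shared by the set

dup_extra = (data + ecc) * 2 - data   # duplicate everything: 8 extra (2:1)
par2_extra = ecc + 3                  # ECC discs + 3 par2 discs: 5 extra (5:4)

print(f"duplication: {dup_extra} extra discs per {data} data discs")
print(f"par2 plan:   {par2_extra} extra discs per {data} data discs")
[/code]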


Comments and suggestions are much needed. I also have a few questions. First, does DVDisaster handle BD-Rs well? Second, is there a respectable alternative to par2? Par2 takes a short eternity to compute. Ideally I'd like to get rid of the two-method system altogether and use only one, but with the need for large block sizes under par2 that isn't possible.


tl;dr: I used to archive with DVD+Rs, and that was a disaster. I'm trying again with BD-Rs and have a plan. If you don't want to read through my plan, at least leave some questions or comments that you find helpful, or perhaps something I can help you with.


Thank you.

Edited by nstgc
