I'm planning on using M-DISC to archive a few terabytes of data. I'm worried about the discs becoming corrupted and unreadable (scratches, deterioration, or whatever) after many years.
My idea is to invest more upfront processing to compute parity blocks and burn them onto the disc for forward error correction. I have software that operates on chunks of data and spits out parity bits, and the same software can reverse the process and recover from bit errors. I realize this will take up more space on the M-DISC (the parity bits come on top of a normal burn), but it's worth it to me.
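For what it's worth, the scheme I have in mind is conceptually like this toy XOR parity sketch (the real tools, e.g. PAR2, use Reed-Solomon codes that can survive more than one lost chunk; the function names here are just mine for illustration):

```python
from functools import reduce

def xor_parity(chunks):
    """XOR equal-sized chunks together to produce one parity chunk."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), chunks)

def recover(chunks_with_gap, parity):
    """Rebuild a single missing chunk (marked None) from the parity chunk."""
    present = [c for c in chunks_with_gap if c is not None]
    return xor_parity(present + [parity])

data = [b"ABCD", b"EFGH", b"IJKL"]
parity = xor_parity(data)            # stored alongside the data
damaged = [b"ABCD", None, b"IJKL"]   # one chunk lost to disc damage
print(recover(damaged, parity))      # prints b'EFGH'
```

So as long as I can read most of the chunks back, plus the parity, I should be able to reconstruct what's missing.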
Will this method work? Will it actually improve my chances of recovering the data?
I'm not sure what error-checking methods are already built into reading and writing discs. I assume there must be something, because I've had times when Windows was completely unable to load a disc. I'd also like to know if there is a "raw data mode" (is this what an ISO format is?). That way, even if there are bit errors (i.e., Windows won't load the disc), I can still ask ImgBurn to read out whatever bits it can, and then run my own parity software over that image and attempt to recover the data myself.