ronbaby Posted November 30, 2016

Due to great Black Friday discounts, at long last I am the proud owner of a Blu-ray burner and also a boatload of blank BD-R discs. (Got 'em all dirt cheap, which is what I had been waiting for for about the past three years.) Now that I've got all this stuff, of course, I want to start using it to make backups of the majority of the nearly 2TB of gunk I have on my main hard drives. But that's easier said than done.

My plan is to write a modest-sized Perl script/program which will do the following: given a list of files and directories, and also the full pathname of a pre-existing "backup history" file (which will initially be empty), the program will begin by skipping over and ignoring any and all files/directories that have already been backed up to optical media (based on what the history file says). For the remaining files/directories that have not yet been written to optical media, I want the program to find/calculate some subset of them that amounts to just under 25GB, i.e. a subset that will just fit onto my output media, then link (or symlink) all of those "selected" files/directories into a temporary directory, which will then be the one and only input I give ImgBurn for the next burn. Finally, of course, the names of all the files so selected will be appended to the tail end of the history file, in preparation for the next burn/run.

Figuring out how best to pack some subset of a given set of files/directories into a 25GB space without exceeding 25GB is a specific example of the general class of computational problems called "bin packing problems". These are all NP-hard, but I don't mind, and I won't be in any hurry, so I'll just have the program try all possibilities until it finds the best fit. So that part of the problem is not my major concern, nor is it why I am posting this message here.

My real question is this: given some arbitrary set of files/directories, what's the magic formula for calculating exactly how much space that set would actually take up if/when it all got burned by ImgBurn onto a piece of blank optical media? (In my case this will generally be blank single-sided BD-R media most of the time, but I may sometimes need to run this kind of auto-splitter-upper program when making backups to blank DVDs or maybe even blank CDs.)

ImgBurn itself quite obviously already knows exactly how to make this calculation. I know because I've gotten helpful error messages from it on occasion, telling me that I've been a little too over-optimistic and that the files/directories I've just asked it to burn will not in fact fit on the specific blank optical media currently in my drive. So it would be quite helpful if someone could tell me exactly how to make this same size calculation, so that I could write either a Perl script or maybe even a C program to do everything I've described above. (If I can do it, then I'll probably release it, either as GPL'd open source or maybe FreeBSD-licensed open source.)

Of course, if there's already a free (or cheap) program out there that can do all I've described above, then by all means please tell me about it. It seems like such a common task, i.e. splitting up a bunch of files and then writing them to a set of consecutive optical discs, that it feels like somebody should have already created and provided a nice tool to help people do this. But maybe not, and maybe the world is just waiting for me to get off my ass and provide this.
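For what it's worth, here's roughly the kind of thing I have in mind for the select-and-stage step. This is just a sketch: it uses a simple largest-first greedy fill rather than the exhaustive best-fit search I described, it only looks at raw file sizes (no allowance yet for file system overhead, which is really my question), and the history-file name, staging path, and capacity figure are placeholders.

#!/usr/bin/perl
# Sketch only: largest-first greedy fill, raw file sizes, placeholder names.
use strict;
use warnings;
use Cwd qw(abs_path);
use File::Find;
use File::Path qw(make_path);

my $history_file = 'backup_history.txt';   # hypothetical "already burned" list
my $staging_dir  = '/tmp/bdr_staging';     # directory to hand to ImgBurn
my $capacity     = 25_025_314_816;         # nominal single-layer BD-R size in bytes

die "usage: $0 dir [dir ...]\n" unless @ARGV;

# Load the paths that have already been burned.
my %done;
if (open my $hist, '<', $history_file) {
    chomp(my @burned = <$hist>);
    @done{@burned} = ();
    close $hist;
}

# Collect candidate files: everything under the given roots not yet in the history.
my @candidates;
find(sub {
    return unless -f $_;
    my $path = abs_path($_);
    push @candidates, { path => $path, size => -s _ }
        unless exists $done{$path};
}, @ARGV);

# Largest-first greedy fill: keep each file if it still fits under the cap.
my $used = 0;
my @selected;
for my $f (sort { $b->{size} <=> $a->{size} } @candidates) {
    next if $used + $f->{size} > $capacity;
    push @selected, $f;
    $used += $f->{size};
}

# Symlink the selection into the staging directory and record it in the history.
make_path($staging_dir);
open my $hist, '>>', $history_file or die "cannot append to $history_file: $!";
for my $f (@selected) {
    (my $flat = $f->{path}) =~ s{/}{_}g;    # crude flat name to avoid collisions
    symlink $f->{path}, "$staging_dir/$flat";
    print {$hist} "$f->{path}\n";
}
close $hist;
printf "Staged %d files, %.2f GB of %.2f GB\n",
    scalar(@selected), $used / 1e9, $capacity / 1e9;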
LIGHTNING UK! Posted November 30, 2016

I'm afraid there's no simple answer to this. The exact size required depends on various things... the number of files in any given directory, the length of each folder and file name, the type of file systems included on the disc (and the versions of said file systems).

ImgBurn basically builds the complete set of file system descriptors when performing its calculation, and that's how it knows exactly how big the resulting image will be or how much of the disc will be used up.

One thing is for certain: no two files can share the same physical sector (unless they're essentially 0 bytes in size), so you always round each file's size up to a multiple of 2048 (the sector size).
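Building on that last point, a rough lower-bound estimate of the data area alone might look something like this in Perl (ronbaby's stated language). It only does the per-file rounding up to 2048-byte sectors; it deliberately ignores the space taken by the file system descriptors themselves, which is exactly the part that varies with file counts, name lengths, and file system types.

# Lower-bound estimate only: per-file rounding to whole 2048-byte sectors,
# ignoring directory records, path tables, UDF structures, etc.
use strict;
use warnings;

sub estimated_data_bytes {
    my @files = @_;
    my $total = 0;
    for my $file (@files) {
        my $size    = (-s $file) // 0;              # raw size in bytes
        my $sectors = int(($size + 2047) / 2048);   # round up to 2048-byte sectors
        $total += $sectors * 2048;
    }
    return $total;
}

printf "%d bytes minimum\n", estimated_data_bytes(@ARGV);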