ronbaby's Achievements

ISF Newbie


  1. Yes. For BD-R, the sector size is 2048 bytes. However, as far as most things relating to BD-R go, that number is almost entirely superfluous and useless, because in reality the only things ever written to BD-Rs are clusters of 32 sectors. In short, the real block size for BD-R is 32 * 2 KiB == 64 KiB.
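The cluster math described above can be sketched in a few lines; this assumes exactly the figures given in the post (2048-byte sectors, 32 sectors per cluster):

```python
# BD-R geometry from the post above: 2048-byte sectors, written only
# in 32-sector clusters, so the effective block size is 64 KiB.
SECTOR_SIZE = 2048
SECTORS_PER_CLUSTER = 32
CLUSTER_SIZE = SECTOR_SIZE * SECTORS_PER_CLUSTER  # 65536 bytes

def clusters_needed(num_bytes: int) -> int:
    """Round a byte count up to whole 64 KiB clusters."""
    return (num_bytes + CLUSTER_SIZE - 1) // CLUSTER_SIZE

print(clusters_needed(1))      # a single byte still consumes one cluster
print(clusters_needed(65536))  # exactly one cluster
print(clusters_needed(65537))  # one byte over spills into a second cluster
```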
  2. Thanks for all the replies, folks. Most illuminating. Now, if I could just figure out a nice formula for predicting, based on the set of input files (and paths), how many sectors (on the BD-R optical media) ImgBurn will decide to use for the corresponding (UDF 1.02 only) image, I'd be in heaven. I'm starting now to try to reverse engineer exactly such a formula, based on empirical testing. I'm doing that just because I do not relish the prospect of having to find the actual detailed specs for the UDF format and then grovel through them for hours on end. (And anyway, even if I did, all of that information might not be a precise predictor of what ImgBurn will actually do.) I hope that my formula will end up being reasonably simple... something like: UDF image size = roundup_mod_64k(actual data blocks) + (per-file overhead * number of files) + general disk overhead. But I already have some hints from my testing that things may not be so simple.
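An estimator of the shape proposed above might look like the sketch below. The overhead constants here are placeholders that would have to be calibrated empirically against what ImgBurn actually produces; they are not real UDF 1.02 figures, and rounding each file to a whole cluster (rather than the grand total) is one plausible reading, not a confirmed behavior:

```python
# Hypothetical UDF image-size estimator of the shape proposed above.
# PER_FILE_OVERHEAD and GENERAL_OVERHEAD are placeholder values to be
# calibrated against ImgBurn's actual output -- NOT real UDF constants.
CLUSTER = 64 * 1024  # BD-R writes in 64 KiB clusters

def roundup_mod_64k(num_bytes: int) -> int:
    """Round a byte count up to the next 64 KiB boundary."""
    return -(-num_bytes // CLUSTER) * CLUSTER

PER_FILE_OVERHEAD = 2048        # placeholder: metadata cost per file
GENERAL_OVERHEAD = CLUSTER * 4  # placeholder: fixed filesystem cost

def estimate_image_size(file_sizes):
    """Pad each file to whole clusters, then add per-file and
    general overhead, following the formula shape in the post."""
    data = sum(roundup_mod_64k(s) for s in file_sizes)
    return data + PER_FILE_OVERHEAD * len(file_sizes) + GENERAL_OVERHEAD

print(estimate_image_size([1, 70_000, 200_000]))
```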
  3. I'd like to use my Blu-ray burner and the blank BD-R disks I have to create backup copies of all of the (essentially arbitrary) data files that I have stored on one particular disk drive and under one particular directory (over on my FreeBSD file server, which is Samba-exporting the relevant filesystem to my Windows 7 system, where I run ImgBurn). I've already written a small Perl script which breaks up this great mass of data files into appropriately sized (25 GB / 23.3 GiB) chunks such that no single file will be split across two of the target BD-R disks. This will make it easier to recover single files from the backup(s) should that ever be necessary. What I've just now realized is that although I have been using ImgBurn happily for years (for DVD backups), I actually don't know what I'm doing when it comes to using this fine tool for data archiving. Specifically, I am utterly confused about all of the available "Image Options", and would like some guidance. For this specific (data file archiving) application, what would be best? ISO 9660? UDF? Joliet? All of the above? I'm totally in the dark. My only requirement is that the resulting burned BD-Rs should be readable on Win7 and also on recent-vintage Linux systems. (It would be nice if they were readable also on recent-vintage FreeBSD systems, but that's not a hard and fast requirement.) Also, with respect to UDF, I am being offered (by ImgBurn) the choice of no fewer than six different UDF revision numbers. I have no idea which one of those will be "best" in this case, and I don't have any idea what the pros and cons of the different versions are (and the ImgBurn guides and FAQ don't seem to say anything specifically about this). I've looked also at the summary of the UDF revisions on the applicable Wikipedia page, but it is all still rather opaque to me. And last but not least, should I just stick with the default of "MODE1"? What is "MODE2" used for? Is MODE2 adding more error correction data?
If so, I might want to use that. Advice on all of these points would be appreciated. P.S. The one useful tidbit of info that I was actually able to glean from the ImgBurn guides regarding the above topics was that ISO 9660 permits only ancient DOS 8.3 filenames (yecch), which would be totally unusable for me, but that Joliet allows filenames up to 64 bytes (characters?) and UDF up to 128 bytes (characters?). Of course, this all sort of encourages me to use just straight UDF, but it also raises the question: what will happen if I have a file in a directory that has a 129-byte-long filename and I then ask ImgBurn to burn that (in straight UDF format)? Will the filename just get silently truncated?? That would be bad. And also, do these limits apply to just the filenames, or do they apply to the complete pathnames? Thanks in advance for any & all replies.
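Whatever ImgBurn does with an over-long name, one way to sidestep the question is to scan the staging tree before burning and flag any name component over the limit. A minimal sketch, assuming the 128-byte UDF figure quoted above and checking individual file/directory names rather than full paths (the root path is a made-up example):

```python
import os

# Pre-burn sanity check: flag file or directory NAMES (not full paths)
# longer than a given byte limit -- 128 bytes for UDF, per the ImgBurn
# guide figure quoted in the post above.
def long_names(root, limit=128):
    offenders = []
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            if len(name.encode("utf-8")) > limit:
                offenders.append(os.path.join(dirpath, name))
    return offenders

# Hypothetical staging directory; substitute your own path.
for path in long_names(r"D:\backup_staging"):
    print("too long:", path)
```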
  4. Due to great Black Friday discounts, at long last I am the proud owner of a Blu-ray burner and also a boatload of blank BD-R disks. (Got 'em all dirt cheap, which was what I had been waiting for for about the past 3+ years.) Now that I've got all this stuff, of course, I want to start using it to make backups of the majority of the nearly 2 TB of gunk I have on my main hard drives. But that's easier said than done. My plan is to write a modest-sized Perl script/program which will do the following: Given a list of files and directories, and also the full pathname to a pre-existing "backup history" file (which initially will be empty), the program will begin by skipping over and ignoring any & all files/directories that have already been backed up to optical media (based on what it says in the history file). For those remaining files/directories that have not yet been written to optical media, I want the program to find/calculate some subset of them that will amount to just under 25 GB, i.e. a subset that will just fit onto my output media, then link (or symlink) all of those "selected" files/directories into a temporary directory, which will then be what I give to ImgBurn as its one and only input for the next burn. Finally, of course, the names of all the files so selected will be appended to the tail end of the history file, in preparation for the next burn/run. Figuring out how best to pack some subset of a given set of files/directories into a 25 GB space without exceeding 25 GB is a specific example of the general class of computational problems called "bin packing problems". These are all NP-hard, but I don't mind, and I won't be in any hurry, so I'll just have the program try all possibilities until it finds the best fit. So that part of the problem is not my major concern, nor is that why I am posting this message here.
My real question is this: Given some arbitrary set of files/directories, what's the magic formula for calculating exactly how much space that set would actually take up, i.e. if/when that set all got burned by ImgBurn onto some piece of blank optical media? (In my case this will generally be blank single-sided BD-R media most of the time, but I may sometimes need to run this kind of auto-splitter-upper program also when making some backups to blank DVDs or maybe even blank CDs.) Anyway, ImgBurn itself quite obviously does already know exactly how to make this exact calculation. I know because I've gotten helpful error messages from it on occasion, when it tells me that I've been a little too over-optimistic, and that in fact the files/directories I've just asked it to burn will not in fact fit on the specific blank optical media that I currently have in my optical drive. So anyway, it would be quite helpful if someone could tell me exactly how to make this same size calculation, you know, so that I could write either a Perl script or maybe even a C program to do everything I've described above. (If I can do it, then I'll probably release it, either as GPL'd open source or maybe FreeBSD-licensed open source.) Of course, if there's already a free (or cheap) program out there that can do all I've described above, then by all means please tell me about it. It seems like this is such a common task, i.e. splitting up a bunch of files and then writing them to a set of consecutive optical disks... that it feels like somebody should have already created and provided a nice tool to help people do this. But maybe not, and maybe the world is just waiting for me to get off my ass and provide this.
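Trying all possibilities works but blows up exponentially with the file count; a common compromise is the first-fit-decreasing heuristic, which gets close to optimal in practice. A minimal sketch (sizes in bytes; the 25 GB capacity is a placeholder, not an exact usable-BD-R figure):

```python
# First-fit-decreasing bin packing: sort sizes largest-first, then put
# each one on the first disc it fits on, opening a new disc if none do.
# A fast heuristic alternative to trying every subset (which is NP-hard).
def pack_discs(file_sizes, capacity=25_000_000_000):
    discs = []  # each entry: [bytes_used, [sizes on this disc]]
    for size in sorted(file_sizes, reverse=True):
        if size > capacity:
            raise ValueError(f"a {size}-byte file cannot fit on one disc")
        for disc in discs:
            if disc[0] + size <= capacity:
                disc[0] += size
                disc[1].append(size)
                break
        else:
            discs.append([size, [size]])
    return [contents for _, contents in discs]

# Small worked example with a toy capacity of 12:
print(pack_discs([10, 7, 5, 4, 3], capacity=12))  # → [[10], [7, 5], [4, 3]]
```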
  5. Yes, I googled around a little on UMEDISK-DL1-64 after I posted, and I see that most people seem to think these are just junk, and not worth the bother. However... here's the bottom line: For me, in my experience... at least up until recently... the vast majority of these burn and verify OK, and they seem to hold up OK too, over time, if stored properly (i.e. vertically, in opaque cases, without extremes of temperature or humidity). This one 50-pack is definitely not too good, and as of late last night I now have about 16 coasters from this pack (which is only about halfway finished), but still, if you look on Newegg right now, the Optical Quantums are on sale... which happens every couple of months or so... so you can get them for like $19 per 50-pack including shipping. Compare that to the wonderful Verbatim AZO DVD+R DL 50-packs, which at the moment are listed at Newegg for $49.98 (shipping included). I mean jeeeeezzzzzz! So the Verbatims cost 2 and 1/2 times as much!! Given that, I'd rather buy the Optical Quantum ones even if it ends up that 50% of them are coasters. They would still be cheaper, per successful burn, even in that case. P.S. I should mention that I always burn my DVD+R DLs at 6x... never 8x. I think that would be pushing the limit, and I'm not in that big of a hurry when burning.
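The cheaper-per-successful-burn claim above checks out arithmetically. A quick calculation using the prices quoted in the post (the success rates are illustrative assumptions, not measured figures):

```python
# Cost per successful burn at the prices quoted above. The success
# rates are illustrative assumptions: 50% coasters for the Optical
# Quantums (the pessimistic case in the post), near-perfect for Verbatim.
def cost_per_good_disc(pack_price, pack_size, success_rate):
    return pack_price / (pack_size * success_rate)

oq = cost_per_good_disc(19.00, 50, 0.50)    # Optical Quantum, worst case
verb = cost_per_good_disc(49.98, 50, 0.98)  # Verbatim, assumed ~98% good

print(f"Optical Quantum: ${oq:.2f} per good disc")   # $0.76
print(f"Verbatim:        ${verb:.2f} per good disc")  # $1.02
```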
  6. The subject line pretty much says it all. I'd like to know if there is any such thing as a definitive test for burner failure, and if so, where I could obtain it. My situation is as follows: I've been using ImgBurn happily & successfully for several years, with only occasional coasters. The burner I have been using for a long time now is an ASUS BC-12B1ST. I've successfully used this to burn hundreds of DVD+R DLs, essentially all of which are either TDK branded or else Optical Quantum branded. Some time ago, I started using the Optical Quantums exclusively, because they seemed to produce only 1 or 2 coasters per 50-pack, and because they tend to be less expensive than the TDKs. I've purchased at least five (5) 50-packs of the Optical Quantum DVD+R DLs, and burned through the first 3 of those with minimal coasters. Now, however, working on the 4th 50-pack, I am getting more than 50% coasters out of the first half of that 50-pack. In general I'm getting either write or verify failures (or both) in layer 1. It's possible that I just got unlucky, and that this one pack happened to come from a bad batch, but... I've read somewhere that over time, the effectiveness of the laser inside a burner can diminish (with use). I've used this ASUS burner quite a lot, and so now I'm just wondering if it is suffering from this effect, and if it is time to put it out to pasture. (I have an LG burner sitting on the shelf already, collecting dust, so it would not be particularly difficult or costly for me to replace that ASUS, but I am reluctant to do so until I know that the ASUS is the source of the problem, and not just a bad batch of DVD+R DL media.) P.S. The media code on the disks in the specific Optical Quantum 50-pack I'm currently working with is UMEDISK-DL1-64. Googling that seems to indicate that Verbatim DVD+R DLs also have that same code. If true, that would make all these recent coasters even more inexplicable (because everyone says that Verbatims are the best).
P.P.S. I will post a burn log (from one of these failures) if someone requests that, however that is not really relevant to my question. No matter what the log shows, I still want to know if there is a definitive way to test a drive to see if its laser is past its prime. P.P.P.S. Yes, my recent run of coasters most probably is due to plain old media failures. I've just now completed a burn of another DVD+R DL from the same 50-pack (and using the same burner) with no hiccups at all. But I'd still like to know how to test the drive.