I've been working on a large project that uses the de-duplication feature to save a lot of space and cram many files onto a disc in various combinations, but each iteration is painful because ImgBurn re-hashes all of the files, not just the ones that have changed since last time. It's impressive how quickly it gets through such a large volume of files, but it's still wasteful: it takes about 10 minutes to hash ~5GB of them.
So I was thinking: why not optionally cache the hashes along with each file's size and timestamps, and only re-hash a file when those don't match? It would speed things up dramatically.
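Something along these lines, just a Python sketch to illustrate the idea (the cache file name, its JSON layout, and the function names here are all made up for illustration, not anything ImgBurn actually does):

```python
import hashlib
import json
import os

CACHE_PATH = "hash_cache.json"  # hypothetical cache location


def load_cache(path=CACHE_PATH):
    """Load the saved hash cache, or start empty if it's missing/corrupt."""
    try:
        with open(path) as f:
            return json.load(f)
    except (FileNotFoundError, ValueError):
        return {}


def cached_hash(filepath, cache):
    """Return the file's hash, re-hashing only if size or mtime changed."""
    st = os.stat(filepath)
    key = os.path.abspath(filepath)
    entry = cache.get(key)
    if entry and entry["size"] == st.st_size and entry["mtime"] == st.st_mtime:
        return entry["hash"]  # cache hit: skip re-reading the whole file

    h = hashlib.sha1()
    with open(filepath, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    digest = h.hexdigest()

    cache[key] = {"size": st.st_size, "mtime": st.st_mtime, "hash": digest}
    return digest
```

On subsequent runs only files whose size or modification time has changed get re-read, so an unchanged 5GB set would verify in a fraction of the time.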
Additionally, to save space on my HDD, I'm using NTFS hardlinks. It would be super awesome if ImgBurn looked at a link's siblings and avoided re-hashing files that share the same linked content. (I can recommend the "Link Shell Extension" for working with these.)
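To show what I mean by checking a link's siblings: hardlinked files share the same underlying file record, so they can be grouped by that identity and hashed once per group. A rough Python sketch (the function name is mine; `os.stat` exposes the file index via `st_ino`/`st_dev` on NTFS as well as on Unix filesystems):

```python
import os


def group_hardlink_siblings(paths):
    """Group paths by (device, inode). Hardlinked siblings share both,
    so only one representative per group actually needs hashing."""
    groups = {}
    for p in paths:
        st = os.stat(p)
        groups.setdefault((st.st_dev, st.st_ino), []).append(p)
    return groups
```

A tool could then hash just the first path in each group and reuse that digest for the rest, which on a heavily hardlinked tree cuts the work down to one pass per unique file body.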
Edited by mr_jrt, 10 November 2015 - 01:11 AM.