
I've been working on a large project that uses the de-duplication feature to save space and cram lots of files onto a disc in various combinations, but each iteration is painful because ImgBurn re-hashes all of the files, not just the ones that have changed since last time. It's impressive how quickly it gets through such a large volume of files, but it's still wasteful and takes about 10 minutes to hash ~5GB of them.

 

So I was thinking: why not optionally cache each file's hash along with its size and timestamp, and only re-hash when they don't match? It would speed things up dramatically. :)
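Something along these lines (just a rough Python sketch of the idea, not ImgBurn's actual code - the cache file name and the hash algorithm are placeholders):

```python
import hashlib
import json
import os

CACHE_PATH = "hash_cache.json"  # placeholder location for the cached hashes


def load_cache(path=CACHE_PATH):
    """Load previously computed hashes, or start with an empty cache."""
    try:
        with open(path, "r", encoding="utf-8") as f:
            return json.load(f)
    except (OSError, ValueError):
        return {}


def save_cache(cache, path=CACHE_PATH):
    """Write the cache back out so the next run can reuse it."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump(cache, f)


def cached_hash(file_path, cache):
    """Return the file's hash, re-hashing only if its size or mtime changed."""
    st = os.stat(file_path)
    key = os.path.abspath(file_path)
    entry = cache.get(key)
    if entry and entry["size"] == st.st_size and entry["mtime"] == st.st_mtime:
        return entry["hash"]  # cache hit: skip the expensive re-hash
    h = hashlib.sha256()  # placeholder algorithm; whatever ImgBurn already uses would do
    with open(file_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    digest = h.hexdigest()
    cache[key] = {"size": st.st_size, "mtime": st.st_mtime, "hash": digest}
    return digest
```

On the second run, only files whose size or timestamp changed would actually be read again; everything else would come straight from the cache.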

 

Additionally, to save space on my HDD, I'm using NTFS hardlinks - it would be super awesome if ImgBurn recognised when files are hardlinked siblings of the same data and avoided re-hashing that content more than once (I can recommend "Link Shell Extension" for working with these).
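For the hardlink side of it, on NTFS two hardlinks to the same data share the same volume serial number and file index, which Python's os.stat() exposes on Windows as st_dev and st_ino. So grouping siblings and hashing only one of them could look something like this (again, just an illustration of the idea):

```python
import os


def one_per_hardlink_group(paths):
    """Keep one path per set of hardlinked siblings so shared data is hashed once.

    On NTFS, os.stat() reports the volume serial number as st_dev and the file
    index as st_ino, and hardlinks to the same data share that (st_dev, st_ino) pair.
    """
    groups = {}
    for p in paths:
        st = os.stat(p)
        groups.setdefault((st.st_dev, st.st_ino), []).append(p)
    # Only the first path in each group needs hashing; the rest point at the same data.
    return [members[0] for members in groups.values()]
```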

Edited by mr_jrt
