
Possible RAM Leak with 1.2.0.0?



Posted

Ladies and Gentlemen,

 

I have just updated ImgBurn to 1.2.0.0, and for some reason my RAM seems to disappear while writing and verifying with this most recent build, even though ImgBurn itself stays rock-solid at 10-30 MB in Task Manager, as does the swap file (no fluctuation in either).

 

From what I can see, the RAM gets eaten up by no visible process in Task Manager: free RAM starts to drop in steps of about 20 MB from 600-700 MB when ImgBurn starts to load the image file (for both writing and verifying), until "Cacheman" kicks in and frees up some RAM, which then gets eaten up again, and so on. This occurs with both SPTI and ELBY I/O. It doesn't matter whether SlySoft's red fox is active or not, nor does it matter from which HD I am writing.

 

Since this behaviour was introduced with the latest build, and since other software (DVDD, for example) is not affected, my wild guess would be that there is something wrong with this version. What do you think, LUK!? Can anyone confirm this behaviour, or does anyone have an idea how to fix it?

 

Best Regards

Posted

This is just your OS caching the reads/writes. Nothing I can do about it (except specifically tell the OS not to, but that causes me additional problems).

 

It's only temporary, and Windows will recycle it when it needs to.
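
(For the curious: "telling the OS not to" would mean something like the FILE_FLAG_NO_BUFFERING flag on the CreateFile call. Here's a minimal sketch - assumed path, not ImgBurn's actual source - of why that causes problems:)

    #include <windows.h>

    int main(void)
    {
        /* Sketch only: FILE_FLAG_NO_BUFFERING bypasses the Windows file
           cache, but in return every read offset and length must be a
           multiple of the volume's sector size, and the buffer itself
           must be sector-aligned - the "additional problems". */
        HANDLE h = CreateFileA("D:\\image.iso",         /* hypothetical path */
                               GENERIC_READ, FILE_SHARE_READ, NULL,
                               OPEN_EXISTING,
                               FILE_FLAG_NO_BUFFERING,  /* bypass the cache */
                               NULL);
        if (h == INVALID_HANDLE_VALUE)
            return 1;

        /* VirtualAlloc returns page-aligned memory, which satisfies any
           common sector size. */
        BYTE *buf = VirtualAlloc(NULL, 64 * 1024, MEM_COMMIT | MEM_RESERVE,
                                 PAGE_READWRITE);

        DWORD got = 0;
        ReadFile(h, buf, 64 * 1024, &got, NULL);        /* sector-multiple length */

        VirtualFree(buf, 0, MEM_RELEASE);
        CloseHandle(h);
        return 0;
    }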

Posted (edited)

Thanks for your reply, but "losing" 500-600 MB of free RAM is not typical behaviour for the ImgBurn I know, and it occurs with every write attempt under the latest 1.2.0.0. And it's only about 500-600 MB because my RAM-management software kicks in. I would bet that if I had 2 GB of RAM installed (right now I have 1 GB), all of it would vanish as well.

 

I am not sure if I was precise enough, but the memory vanishes stepwise pretty quickly - and it would definitely drop to zero free RAM - once ImgBurn has finished the lead-in and starts to write the image file. At the beginning of the lead-out the vanished memory is freed up, and it vanishes again when the verification process begins. The memory remains "missing in action" until verification has finished. Not to mention that the whole PC slows down while this happens, which would probably produce a coaster if there weren't this gatekeeper named "Cacheman" on my machine.

 

ImgBurn 1.0.0.0 and 1.1.0.0 are not affected and leave about 500 MB of free RAM, as do your previous tool and every other writing application. The drive with which this was observed is a Plextor 716A (firmware 1.09) in combination with Verbatim 16x and Ricoh 8x quality media. PoweRec was disabled via ImgBurn's advanced settings.

 

Best Regards

Edited by ????
Posted

I just duplicated your settings with a 716A 1.09, Verbatim 16x discs, and PoweRec disabled. I'm using 22% RAM during the burn, and I have 1 GB.

Posted

This is something I've been aware of since the DVD Dec days; it's nothing new, I promise you.

 

It really is just Windows caching the data that's read from the image file; it does it on my PC too.

 

The second the file handle is closed, Windows frees it up again. Hell, kill that handle in Process Explorer if you like - ImgBurn won't do any cleanup, it'll just give you an error message - and yet your 'available' physical memory will shoot back up.

 

As you mentioned earlier, it's not ImgBurn itself that's using the memory; if it were, it would show up in ImgBurn.exe's memory allocation, which is clearly visible in Task Manager.

 

Not only that, but as the beta team will vouch, the program has been run through CodeGuard many, many times without any leaks being reported.

Posted

Oh, the only difference I made to the file reading was adding the 'sequential' flag to the 'CreateFile' API call. It's supposed to tell Windows to optimise caching for reading a file from start to finish - which is of course exactly what I'm doing.

 

You seem like a clued-up guy, so if you want, go check it out on MSDN.

 

Just search for 'CreateFile' and look at the programmer's help on it.

 

EDIT: look here: http://msdn.microsoft.com/library/default..../createfile.asp
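
Roughly speaking, the change amounts to this (a sketch with an assumed path, not my actual source):

    #include <windows.h>

    int main(void)
    {
        /* Sketch only: the same sort of CreateFile call as before, but with
           FILE_FLAG_SEQUENTIAL_SCAN added. It's purely a hint telling the
           cache manager the file will be read once, from start to finish. */
        HANDLE hImage = CreateFileA("D:\\image.iso",            /* hypothetical path */
                                    GENERIC_READ,
                                    FILE_SHARE_READ,
                                    NULL,
                                    OPEN_EXISTING,
                                    FILE_FLAG_SEQUENTIAL_SCAN,  /* the 'sequential' hint */
                                    NULL);
        if (hImage == INVALID_HANDLE_VALUE)
            return 1;

        /* ...read and burn the image here... */

        CloseHandle(hImage);
        return 0;
    }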

Posted (edited)

@ cornholio7

 

Thanks for testing. Honestly, I do not have a clue what is happening here. Again, ImgBurn does not grab the memory itself, as it takes at most 30 MB in Task Manager; the RAM vanishes for no apparent reason, and that happens only when v1.2 is handling the contents of an image file. And this applies to all of my image files, whether they are processed from HDs connected to the Intel controller (ICH5 RAID 0 S-ATA) or the Promise controller (20378, plain S-ATA and P-ATA).

Edited by ????
Posted

 

Thanks for your reply. Unfortunately I am not into programming, but I will take a deeper look into the topic; maybe I can understand it on a theoretical basis. I just don't understand why this occurred on my machine for the first time with this version. You wrote something about a flag added in this release - is there a way to disable this ("it's not a bug, it's a feature" :lol: ) on my end, to test whether it is the culprit?

 

Thanks for your efforts.

 

Best Regards

Posted

Like I said, I have the same issue on my PC with 1.2.0.0. When I burn, the available memory vanishes into the cache. As such, I can do some testing at my end - which makes things sooooooooo much easier!!! lol

 

I think this is something Windows has done recently, as I remember way back (during the DVD Dec days) someone reporting this as a bug (or maybe a suggestion), asking that I tell Windows not to cache the reads.

As a programmer, it's clear to me that this is what's happening, although I can understand that, to a user, it's not what you'd think the cause would be.

 

Maybe with DVDs becoming more popular, Microsoft changed the way it dealt with caching big files and reads, and kinda bodged it so 64k reads didn't cache on certain (?) large files.

 

When I said earlier that I'd only changed the sequential file flag, I'd forgotten about another change I made - to the transfer length.

As reads are proxied by Windows anyway, I just saw reducing the number of physical reads my program makes as a good thing. Reading larger amounts at a time = fewer reads = less for the OS to do. It was actually added in an attempt to speed up reading from a network drive - again because I assumed the OS would have less work to do.
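
In sketch form (assumed names and constants, not the real source), the change looks like this:

    #include <windows.h>

    /* 1.0.0.0/1.1.0.0 read the image 64kb at a time; 1.2.0.0 switched to
       256kb per read to cut down the number of calls the program makes. */
    #define OLD_TRANSFER_LENGTH (64 * 1024)
    #define NEW_TRANSFER_LENGTH (256 * 1024)

    static void ReadImage(HANDLE hImage, DWORD transferLength)
    {
        BYTE *buffer = HeapAlloc(GetProcessHeap(), 0, transferLength);
        DWORD got = 0;

        /* Windows proxies these reads: ask for 256kb and the OS issues as
           many physical reads as it needs, then hands back the whole lot. */
        while (ReadFile(hImage, buffer, transferLength, &got, NULL) && got > 0)
        {
            /* ...pass 'got' bytes to the burn/verify engine (omitted)... */
        }

        HeapFree(GetProcessHeap(), 0, buffer);
    }

    int main(void)
    {
        HANDLE hImage = CreateFileA("D:\\image.iso",  /* hypothetical path */
                                    GENERIC_READ, FILE_SHARE_READ, NULL,
                                    OPEN_EXISTING,
                                    FILE_FLAG_SEQUENTIAL_SCAN, NULL);
        if (hImage == INVALID_HANDLE_VALUE)
            return 1;
        ReadImage(hImage, NEW_TRANSFER_LENGTH);       /* the 1.2.0.0 behaviour */
        CloseHandle(hImage);
        return 0;
    }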

 

Anyway, that seems to have backfired, and now Windows is no longer ignoring the fact that I'm reading big files.

 

I've just done a test with the transfer length set to 256kb (the default in 1.2.0.0) and then again at 64kb (the default for 1.0.0.0, 1.1.0.0, etc.).

 

NOTHING else was changed.

 

At 64kb, I saw no change in available memory.

At 256kb, Windows gobbles it all up.

 

So the sequential flag is not the problem; it's the transfer length, and the (guessed) fact that I'm now bypassing Windows' caching controls on large files.

 

Oh well, we live and learn!

 

I bet I could search google all day long and not find that little gem documented ;)

Posted

Hehe, somehow I must have missed that you can see the same behaviour on your machine. That raises my hopes that you may find a way to optimize ImgBurn here - and it seems you have already found the reason for the issue. Do I understand correctly that changing the transfer length to 64kb should "stabilize" the available memory? I just did that in the I/O settings, did another burn, and had to say bye-bye to my RAM again. I bet it had a good time on va-cache-ion. :blink: :lol:

 

However, thanks a lot for looking into it; your efforts are highly appreciated.

 

Best Regards

Posted

Noooo, that's the device (DVD drive) transfer length; you cannot configure the value used for reading from the HDD.

 

The two are very different.

 

As mentioned earlier, Windows proxies all HDD reads; it doesn't proxy drive reads.

 

If I read 256kb at a time from an HDD, Windows will just issue multiple physical reads if it can't do 256kb in one go - and then return me the full 256kb.

If I read 256kb at a time from a DVD drive, I get an error.

 

Most direct/physical I/O is limited to 64kb at a time - MAXIMUM. Many old/shite drivers are limited to 32k.
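
For illustration, a raw device read via SPTI looks roughly like this (assumed drive letter and values, not my actual code). Note the request is 32 sectors x 2048 bytes = 64kb; ask for 256kb in a single request like this and the call simply fails:

    #include <windows.h>
    #include <ntddscsi.h>
    #include <stdlib.h>

    int main(void)
    {
        /* Sketch only: a READ(10) sent straight to the DVD drive through
           SPTI. Unlike file reads, Windows does not proxy or split these,
           so the transfer length has to respect the driver's limit. */
        HANDLE hDrive = CreateFileA("\\\\.\\E:",      /* hypothetical drive letter */
                                    GENERIC_READ | GENERIC_WRITE,
                                    FILE_SHARE_READ | FILE_SHARE_WRITE,
                                    NULL, OPEN_EXISTING, 0, NULL);
        if (hDrive == INVALID_HANDLE_VALUE)
            return 1;

        BYTE *buffer = malloc(32 * 2048);             /* 32 sectors = 64kb */

        SCSI_PASS_THROUGH_DIRECT sptd = {0};
        sptd.Length             = sizeof(sptd);
        sptd.CdbLength          = 10;                 /* READ(10) CDB */
        sptd.DataIn             = SCSI_IOCTL_DATA_IN;
        sptd.DataTransferLength = 32 * 2048;          /* 256kb here would error out */
        sptd.DataBuffer         = buffer;
        sptd.TimeOutValue       = 10;
        sptd.Cdb[0] = 0x28;                           /* READ(10) opcode, LBA 0 */
        sptd.Cdb[8] = 32;                             /* transfer length in sectors */

        DWORD returned = 0;
        DeviceIoControl(hDrive, IOCTL_SCSI_PASS_THROUGH_DIRECT,
                        &sptd, sizeof(sptd), &sptd, sizeof(sptd),
                        &returned, NULL);

        free(buffer);
        CloseHandle(hDrive);
        return 0;
    }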

 

As I never saw using 256kb as a problem (because the OS is supposed to deal with it), I never gave the users a slider so they could select the size they wanted it to use.

 

Before you started this thread, I had actually already implemented a slider so people COULD change that value.

 

This was added after reading / dealing with that other post in this forum about slow burning.

 

Basically, I think you both have the same issue. It's just that, because that other chap's system can't cope with having no free memory, his swap file is being used and it's killing the transfer rate - the image is on the same physical drive as the swap file.

 

So now that this issue has been brought to my attention (again, but in a different style!), I've removed the slider and gone back to the trusty old fixed 64kb for HDD reads.

Posted (edited)

LUK!,

 

thanks for clearing that up. For a short time I thought I was completely lost, but the way you explained it makes sense to me (with my limited knowledge), and I am sure you know what you are talking about. :lol:

 

And now I hope the time passes quickly until the next version is unleashed. :P

 

Best Regards

 

P.S.: If you ever get near Cologne, Germany, drop me an email/PM and I will get you a beer or two. :beer:

If you don't, well, then I will have to make another small donation. ^^

Edited by ????
Posted

Oh, OK. I thought the memory depletion issue due to 256K chunks being read from the hard drive was universal. I'll give it a shot on a single-layer image before trying dual-layer.

 

Thanks.

Posted

I have 1 GB of RAM, but my system becomes so unstable that it usually crashes partway through a burn with v1.2.0.0.

 

Thanks, LUK, for tracking this bug down; I look forward to the new version in due course.

 

Best wishes.

Posted

I'm glad I found it too, but you can't call it a bug because it's not actually doing anything wrong!

 

It's perfectly allowable to read 256kb chunks; it's just a limitation of Windows that it tries to cache a 4GB file in memory! lol

 

Anyway, just a couple more bits to sort out and then I'm ready to put another release out.

Posted (edited)
Anyway, just a couple more bits to sort out and then I'm ready to put another release out.

 

Believe me, I am the kind of guy who would rather see updates yesterday than tomorrow, but sometimes I am concerned about the health of people like you, who seem to work, work hard, and then work even harder. Do you ever sleep?

 

Don't forget to relax from time to time, my friend (if I am allowed to call you that :blush: )!

 

Best Regards

Edited by ????
Posted
What is a RAM leak anyway?

 

My definition of a RAM leak is, simply put, an application grabbing all the available RAM for no good reason. So in this case it is not a RAM leak at all; it's all down to the weird Windows caching. If my topic title is too misleading (since it is obsolete now), I could try to change it, but I don't know whether the forum settings allow that.
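
(For contrast, a textbook sketch of a real leak - memory that is allocated and never freed, so the process's own usage, the number next to ImgBurn.exe in Task Manager, climbs forever. That's exactly what was not happening here:)

    #include <stdlib.h>

    /* A genuine RAM leak, for contrast with the caching seen in this
       thread: the pointer is discarded each time, so the memory can
       never be freed and the process's usage grows without bound. */
    int main(void)
    {
        for (;;)
        {
            void *p = malloc(64 * 1024);  /* grab 64kb...              */
            (void)p;                      /* ...and lose it, forever   */
        }
    }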
