
VirtualAlloc Failed: Not Enough Storage...



Posted (edited)

Hi,

 

First and foremost, I would like to thank the Devs for such a wonderful program. Secondly, I've searched the forums using FIFO, Buffer, VirtualAlloc and many other keywords, and I couldn't find what I was looking for. Thirdly, I only use the "Writing files/folders to disc" mode. Finally, the problem happens just after the directory calculation finishes but before any data is burnt to disc, and there are no error messages logged; only error dialogs appear.

 

I saw the change log for v2.4.2.0 and noticed that the "Max buffer size is now 512MB". Every once in a while I've watched the buffer get low, so I decided to try upping the buffer from 256MB to 512MB. First I changed the Build-Read Buffer Size from 256MB to 512MB, but the log still showed "Filling Buffer... (256 MB)". Thinking I needed to change the I/O-Buffer Size, I also changed it from 256MB to 512MB and that's when a problem occurred.

 

After loading a directory, err, folder, I click the "Build" button and ImgBurn tells me that I've "only selected 1 folder" and asks whether this folder "represents the root directory for the image content", to which I click the "Yes" button. ImgBurn then does a calculation and displays a summary, to which I click the "OK" button. That's when I get the following pair of ImgBurn error dialogs. First ImgBurn error dialog:

 

"VirtualAllco Failed!

Reason: Not enough storage is available to process this command."

 

Click "OK", then a second ImgBurn error dialog immediately appears:

 

"Failed to initialise<sic> FIFO Buffer (536,870,912 bytes)"

 

Click "OK" and were back to the "Writing files/folders to disc" view.

 

The system is running WinXP SP2 with 2GB of memory, and at the time of the error there was just over 1GB free. As for storage, there are over 70 defragmented gigabytes free on the hard drive, which runs in DMA mode. So here is what I don't understand, and maybe someone could please clarify it for me:

 

1) Why does it say "Filling Buffer... (256 MB)" in the log when the Build Buffer is set to 512MB?

 

2) When changing the I/O Buffer to 512MB why do I get the above pair of errors every time?

 

3) What is the difference between the Build Buffer and the I/O Buffer, and which one is more important (to increase)?

 

Final observations: when I set the I/O Buffer back to 256MB from 512MB, I still get the same pair of errors mentioned above when clicking the "Build" button. The errors don't stop appearing until I close and re-open ImgBurn. Taking this one step further, it doesn't help to set the I/O Buffer to 512MB and then close and re-open ImgBurn; restarting ImgBurn only seems to fix the problem when going back to an I/O Buffer of 256MB, not when trying to increase it. Lastly, there are no error dialogs when toggling to "Image File Output" mode and then pressing the "Build" button.

 

Thanks for your time.

Edited by Handler
Posted

It seems your machine just cannot handle having 2 buffers of that size.

 

One buffer is for the reading part - i.e. as ImgBurn reads all the files off the hdd (think of it as a cache). That buffer comes into play for both output modes within Build mode.

 

The other buffer is only used when burning to a disc and basically contains cached data exactly as it's sent to the burner (i.e. after all the processing). Obviously then, this buffer is not allocated when creating an image file and hence output to image does not bring up the error.
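
If it helps to picture it, here's a rough, hypothetical sketch in C (nothing like the real code, just the general idea) of how data might flow through the two buffers in disc-output mode:

#include <stdio.h>
#include <stdlib.h>

#define READ_BUF_SIZE (256u * 1024u * 1024u)  /* Build "read" buffer (the cache) */
#define IO_BUF_SIZE   (256u * 1024u * 1024u)  /* device I/O (FIFO) buffer        */

/* Hypothetical stand-ins for the real read/build/burn steps. */
static size_t read_from_hdd(unsigned char *dst, size_t max)
{
    (void)dst; (void)max;
    return 0;   /* stub: pretend there is nothing left to read */
}

static size_t build_sectors(const unsigned char *src, size_t n,
                            unsigned char *dst, size_t max)
{
    (void)src; (void)n; (void)dst; (void)max;
    return 0;   /* stub: file data -> ISO/UDF sectors */
}

static void send_to_burner(const unsigned char *src, size_t n)
{
    (void)src; (void)n;   /* stub: only needed when writing to a disc */
}

int main(void)
{
    /* Both blocks are allocated up front; if either allocation fails,
       the whole build/burn has to stop - which is what the error dialogs report. */
    unsigned char *read_buf = malloc(READ_BUF_SIZE);
    unsigned char *io_buf   = malloc(IO_BUF_SIZE);
    if (!read_buf || !io_buf) {
        fprintf(stderr, "Could not allocate buffers\n");
        return 1;
    }

    for (;;) {
        size_t got = read_from_hdd(read_buf, READ_BUF_SIZE);   /* fill the read cache */
        if (got == 0)
            break;
        size_t built = build_sectors(read_buf, got, io_buf, IO_BUF_SIZE);
        send_to_burner(io_buf, built);                          /* skipped for image output */
    }

    free(io_buf);
    free(read_buf);
    return 0;
}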

 

If anything, you'd want the 2nd one to be bigger, not the 1st (reading) one. (Why can't you leave them both on their default values anyway?!)

 

The actual error messages are direct from the OS. You'll have to talk to Microsoft about why they're coming up when the VirtualAlloc API function is called.
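
Roughly speaking, the failing call boils down to something like this (my own minimal illustration, not ImgBurn's actual source). VirtualAlloc returns NULL and GetLastError() hands back the code that Windows describes as "Not enough storage is available to process this command":

#include <windows.h>
#include <stdio.h>

int main(void)
{
    SIZE_T size = 512u * 1024u * 1024u;   /* 536,870,912 bytes, as in the dialog */

    /* Ask the OS for one contiguous, committed 512MB block. */
    void *buf = VirtualAlloc(NULL, size, MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
    if (buf == NULL) {
        /* ERROR_NOT_ENOUGH_MEMORY (8) is the code behind that dialog text. */
        printf("VirtualAlloc failed, error code %lu\n", GetLastError());
        return 1;
    }

    printf("Got 512MB at %p\n", buf);
    VirtualFree(buf, 0, MEM_RELEASE);
    return 0;
}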

Posted (edited)
It seems your machine just cannot handle having 2 buffers of that size.

Perhaps a better way to put it would be that a machine with only 2GB of RAM installed isn't enough for two 512MB buffers. After rebooting the machine there was over 1.6GB of free RAM available. Yet testing ImgBurn again with the two 512MB buffers still caused the errors to happen. So apparently 1.6GB of free RAM isn't enough for 1GB of total buffer space.

 

Leaving the I/O Buffer at 512MB and slowly dropping the Build Buffer 25MB at a time, there were finally no errors at 425MB. So apparently 1.6GB of free RAM will work with 937MB of buffer space. BTW, now that the errors are gone, the log shows "Filling Buffer... (512 MB)", so it's clear which buffer that line in the log has been referring to. :)

 

One buffer is for the reading part - i.e. as ImgBurn reads all the files off the hdd (think of it as a cache). That buffer comes into play for both output modes within Build mode.

I presume this is the Build Buffer.

 

The other buffer is only used when burning to a disc and basically contains cached data exactly as it's sent to the burner (i.e. after all the processing). Obviously then, this buffer is not allocated when creating an image file and hence output to image does not bring up the error.

I presume this is the I/O Buffer.

 

If anything, you'd want the 2nd one to be bigger, not the 1st (reading) one.

I presume I now have that with the I/O Buffer at 512MB and the Build Buffer at 425MB.

 

(Why can't you leave them both on their default values anyway?!)

Because the hard drive can be a bottleneck at times and RAM is faster. Moving more data into RAM before the burn starts frees up the hard drive and ensures a more consistent fill of the drive's buffer.

 

The actual error messages are direct from the OS. You'll have to talk to Microsoft about why they're coming up when the VirtualAlloc API function is called.

Maybe we both should, for different reasons; seeing as the error dialog said ImgBurn in the title, it appeared as though Microsoft was blaming ImgBurn. :)

 

Seriously, thank you for your reply. I still don't understand why so much free RAM can be available and yet it can't be utilized. Even with the I/O Buffer at 512MB and the Build Buffer at 425MB, the system still had 600MB free. Odd.
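
My best guess, and it is only a guess: a 32-bit process on XP gets 2GB of user address space regardless of installed RAM, and VirtualAlloc needs one contiguous free range of the requested size, so free physical memory isn't really the limit - address space fragmentation is. A little test program that walks the address space with VirtualQuery and reports the largest contiguous free block shows the idea:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    SYSTEM_INFO si;
    MEMORY_BASIC_INFORMATION mbi;
    SIZE_T largest_free = 0;
    unsigned char *p;

    GetSystemInfo(&si);
    p = (unsigned char *)si.lpMinimumApplicationAddress;

    /* Walk the user-mode address space region by region. */
    while (p < (unsigned char *)si.lpMaximumApplicationAddress &&
           VirtualQuery(p, &mbi, sizeof(mbi)) == sizeof(mbi)) {
        if (mbi.State == MEM_FREE && mbi.RegionSize > largest_free)
            largest_free = mbi.RegionSize;
        p += mbi.RegionSize;
    }

    /* If this prints less than 512MB, a 512MB VirtualAlloc in this
       process must fail, no matter how much physical RAM is free. */
    printf("Largest contiguous free block: %lu MB\n",
           (unsigned long)(largest_free / (1024 * 1024)));
    return 0;
}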

 

My next puzzlement is a cancelled CD-RW burn, where ImgBurn says "Synchronizing Cache..." for 45 minutes and I ultimately have to reboot the system to kill off the task. It can only be described as the laborious equivalent of stripping paint off an intricately turned table leg, layer by layer, with a toothpick, until the system finally reboots itself. But I'll have to save that for another time, after doing more tests. ;)

 

Thanks for your time.

 

 

Edit: Winky emoticon doesn't work.

Edited by Handler
Posted

Sorry, the error code (a numerical value) comes direct from the OS... ImgBurn displays the error box with that code translated into plain English (again using an API function, so it actually gets translated into your native language).
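
In other words, something roughly like this (the standard Win32 pattern, not a copy of ImgBurn's code): the numeric code is fed to FormatMessage, which hands back the human-readable text in whatever language Windows is running in.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DWORD code = ERROR_NOT_ENOUGH_MEMORY;   /* 8 - the code behind that dialog */
    char  text[256] = "";

    /* Ask Windows for the message that matches the error code. */
    FormatMessageA(FORMAT_MESSAGE_FROM_SYSTEM | FORMAT_MESSAGE_IGNORE_INSERTS,
                   NULL, code, 0, text, sizeof(text), NULL);

    /* Prints "Not enough storage is available to process this command."
       on an English system (the string ends with a CR/LF). */
    printf("Error %lu: %s", code, text);
    return 0;
}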

Posted
Sorry, the error code (a numerical value) comes direct from the OS... ImgBurn displays the error box with that code translated into plain English (again using an API function, so it actually gets translated into your native language).

Understood and thanks.

  • 3 months later...
Posted

Hi! I'm sorry to revive and hijack this old thread, but I just had a very similar problem, and I don't think it's going to hurt to report it.

 

I hardly ever use ImgBurn (I'm used to Nero 6), but recently I started burning DVDs with files bigger than 2GB, so I started using UDF. Since Nero limits the length of the volume label, I googled for a solution and found that ImgBurn could rename the volume label, with a far larger limit.

 

But I didn't like the idea of having to create an image just so I could rename the label, so I decided to check whether ImgBurn supported disc creation and not only image burning, and to my surprise, it did.

 

So I started using it whenever I needed to create UDF discs. I have always had the impression that the buffer limits used by burning software in general are too low, and I always thought that everyone else should allow 256MB like ImgBurn did. While re-tweaking its options, I was amazed to find that it now allows 512MB. I had to try it right away, and then the same problem happened to me. When I googled it, I found this thread.

 

I do agree that this must be a Windows issue, but I also think it is one that you, the developer, should be aware of. Who knows, it might hurt the stability of the program if it's allowed to be pushed to its limit.

 

I use Windows XP 64-bit Edition with 6GB of RAM and have had exactly the same problem as Handler did. Perhaps Windows has a per-process limit on memory allocation.
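
If ImgBurn happens to be a 32-bit program, it would still be confined to a 32-bit virtual address space on my 64-bit Windows (2GB, or 4GB if the EXE is large-address-aware), no matter how much RAM is installed - that's my assumption, anyway. A small check like this, run from inside the process in question (or pointed at it via a handle from OpenProcess), would confirm whether it is a 32-bit process under WOW64:

#define _WIN32_WINNT 0x0501   /* IsWow64Process needs XP or later */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    BOOL wow64 = FALSE;

    /* TRUE means a 32-bit process running on 64-bit Windows,
       i.e. still limited to a 32-bit virtual address space. */
    if (IsWow64Process(GetCurrentProcess(), &wow64))
        printf("Running under WOW64: %s\n", wow64 ? "yes" : "no");
    else
        printf("IsWow64Process failed, error %lu\n", GetLastError());

    return 0;
}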

 

Since lowering the Build buffer solves this issue (and that size is enough), it's no big deal. But if someday you could look into a way to solve this, I'd be grateful.

 

Thanks.

And by the way, this is a wicked piece of software. Really good. The only thing I miss is that the speed selection box doesn't hide speeds unavailable for the current media/burner.

Anyway, thanks for sharing it with the world.
