Cube 8 Posted August 9, 2009 (edited)
The following message appears:
Failed to initialize FIFO buffer (536.870.912 bytes)
I hate reviving old threads, so I created a new one. This time, I'm trying to read from a CD to an ISO file. I'm on XP x64 with 6GB RAM, running v2.5.0.0. I didn't have this problem with version 2.4.2.0, and I haven't changed anything in the settings.
Transfer length = 64KB
Buffer = 512MB
Enable buffer recovery = Yes
If I set the buffer slightly lower than 512MB (say, 500MB), it works. I think this is a bug in ImgBurn. If you confirm this, you can move the thread to the appropriate forum. Two things make me believe it:
1. Not being able to allocate a 512MB buffer on a system with 5GB of free RAM makes no sense.
2. As I said, v2.4.2.0 works with the maximum buffer setting.
Edited August 9, 2009 by Cube 8
LIGHTNING UK! Posted August 9, 2009 ImgBurn just calls the VirtualAlloc API function. If it fails, it fails; there's nothing I can do about it.
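The allocate-or-fail behaviour described here, together with the "Enable buffer recovery" setting from the first post, can be sketched as a fallback loop that retries at smaller sizes. This is a hypothetical Python illustration, not ImgBurn's actual code (which calls the Win32 VirtualAlloc API from a native app); the `allocate_fifo` helper and its parameters are invented for the example.

```python
# Hypothetical sketch of FIFO-buffer set-up with size fallback.
# Not ImgBurn's real code: the real program calls VirtualAlloc once
# and reports failure; "buffer recovery" here is modelled as retrying
# with a progressively smaller request.

MB = 1024 * 1024

def allocate_fifo(requested_bytes, min_bytes=16 * MB, step=16 * MB,
                  try_alloc=bytearray):
    """Return (buffer, size_obtained), shrinking the request on failure.

    try_alloc is injectable so a failing allocator can be simulated.
    Raises MemoryError if even min_bytes cannot be allocated.
    """
    size = requested_bytes
    while size >= min_bytes:
        try:
            return try_alloc(size), size
        except MemoryError:
            size -= step  # fall back and retry with a smaller buffer
    raise MemoryError("Failed to initialize FIFO buffer")

buf, got = allocate_fifo(64 * MB)
print(got // MB)  # 64 when the first attempt succeeds
```

Because `try_alloc` is injectable, you can simulate an allocator that rejects anything above a threshold and watch the helper settle on the largest size that fits.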
Cube 8 (Author) Posted August 9, 2009 I agree and won't argue about that. How do you explain the fact that it worked in v2.4.2.0? Did you change any of the API parameters?
LIGHTNING UK! Posted August 9, 2009 Did that version allow for the same buffer size? No, I haven't touched how the function is called. I guess some other allocation is simply preventing Windows from finding a contiguous 512MB block for the function to succeed. I don't pretend to know the details of what's actually going on behind the scenes with the API.
Cube 8 (Author) Posted August 9, 2009 Yeah, it allowed setting exactly 512MB. Anyway, this small workaround is not really a problem; allocating 500MB instead of 512MB makes almost no difference. I'd still recommend you look into this (if there is anything you can do; maybe check MSDN?). There's no point in offering a 512MB setting if users can't actually use it.
LIGHTNING UK! Posted August 9, 2009 It really isn't worth my time. A buffer of that size is pointless. The defaults work fine and I'm happy with that. I'd drop the max back to 256MB, but then the odd person would just complain. Also, just because it doesn't work for you doesn't mean it won't work for someone else.
Cube 8 (Author) Posted August 9, 2009 Maybe you should add a notice about this message when someone attempts to set the buffer to the maximum. You shouldn't lower the max value! 512MB is OK. Setting the buffer as large as possible (especially when there is a lot of RAM) is a good thing. You already know that.
Cube 8 (Author) Posted August 9, 2009 (edited) UPDATE: I've just tried 511MB and it works (well... for reading*). I really don't understand what could be wrong with one megabyte more of buffer. *EDIT: I discovered that, after reading a disc, I can't write one; the same message appears. If I close and re-open the program, I can write to a CD using the 511MB buffer. Something must be wrong with how ImgBurn releases its memory. Edited August 9, 2009 by Cube 8
LIGHTNING UK! Posted August 9, 2009 I allocate and release correctly. If fragmentation causes the function to fail, that's down to the memory manager, and of course that's not my work.
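The fragmentation point explains how a 512MB request can fail on a machine with gigabytes free: VirtualAlloc needs one contiguous region of the process's virtual address space, and total free space is irrelevant if no single free region is big enough. A toy Python model (the hole sizes are invented, purely illustrative of the idea, not a trace of the real Windows memory manager):

```python
# Toy model of virtual-address-space fragmentation (an assumption about
# why VirtualAlloc can fail, not real memory-manager behaviour).
# A contiguous allocation succeeds only if some SINGLE free region fits it.

def can_allocate(holes, size):
    """True if any one free region (hole) is at least `size`."""
    return any(hole >= size for hole in holes)

# Free regions (in MB) left between loaded DLLs/heaps in the process:
holes = [500, 300, 200, 120, 80]

print(sum(holes))                # 1200 MB free in total
print(can_allocate(holes, 512))  # False: no single 512MB hole
print(can_allocate(holes, 500))  # True: the 500MB hole fits
```

This also matches the workaround in the thread: 500MB or 511MB can succeed where 512MB fails, simply because the largest contiguous hole happens to sit just under 512MB.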
Cube 8 (Author) Posted August 9, 2009 (edited) After some quick experimentation with the options in the I/O tab, I managed to make it work with the maximum buffer setting: I changed the interface setting from SPTI to Patin-Couffin. I don't know what actual difference it makes (maybe it's just a coincidence), but I don't get this annoying message any more. Edited August 9, 2009 by Cube 8
Neil Wilkes Posted July 28, 2010 I just got hit with this one for the first time too. Buffers, as per the poster here, were at max. Thing is, this time it hit me when trying to write from a folder. I have been using buffers at max since, oh, forever, yet this is the very first time I have run into this issue. I solved it by dropping the buffer back to 256MB. I agree with Cube 8 though; it is very irritating. It has to be a bug, because I had just booted the system: nothing really running in the background, and I have 2GB of RAM in this machine. Why does it happen now, when writing from a folder, when it will happily write an image to a DL disc with buffers at max?
LIGHTNING UK! Posted July 28, 2010 'Build' involves 2 buffers; 'Write' just uses the one. An API command failing is not my problem, nor can I do anything about it. I mentioned this at the top of the thread and things haven't changed. I have no idea why it would work sometimes and not others; I didn't code the OS or the memory manager.
Neil Wilkes Posted July 28, 2010 I'm not saying it's anything you can do anything about; it looks like the OS for some reason. It is very odd though: I have around 1.7GB of free RAM after the initial boot, and the default virtual memory settings have been altered to suit DAW use. A 5GB DVD image, scads of RAM, scads of HDD space. All very odd.