bluraywriter posted a topic in ImgBurn Bugs

I have 12 GB of RAM with only 4 GB in use, yet ImgBurn (love the tool, kudos!) can't allocate ~250 MB of FIFO buffer. After a heavy day of use my PC's memory seems to be fragmented, and there's nothing I can do about that. I actually want to allocate even more than 500 MB, since I'm burning Blu-rays over the network; the read speed is not always consistent and I don't want the buffer to run dry.

You (the developer) said you try to allocate one big chunk. A suggestion from a fellow developer: why not try to allocate the big chunk first, and if that fails, try to allocate two chunks of half the size? Keep subdividing until the desired buffer size has been allocated. The speed of chunked memory isn't a problem (in terms of gigabytes per second), so the only extra work is the internal pointer list that tracks where all the memory chunks/blocks are. Is that (too) hard to build in?

The thing is: yes, as a developer you are right that you're doing the correct thing, but if people have, say, 2 GB of free memory and a tool cannot allocate 25% of that (500 MB), then you can't blame the user.

It would be great if you could look into this (besides the dozens of other points probably on your wishlist). If several people have reported this before, then trust me: hundreds of users will have seen that error and probably did not take the effort to 1) go to the forums, 2) create an account, and 3) post a message about it.

Tip 2: log those errors and report them back over HTTP to your server, so you can see in "real time" what goes wrong out in the field. People could opt out of that, of course. Perhaps it helps you improve this pretty nice product!

Cheers!
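To make the suggestion concrete, here is a minimal sketch of the fallback scheme described above: try the full buffer as one block, and on failure halve the chunk size and allocate the buffer as a list of smaller chunks. This is purely illustrative; `ChunkedBuffer`, `alloc_chunked`, and `MIN_CHUNK` are hypothetical names, and ImgBurn's actual allocator almost certainly works differently.

```c
#include <stdlib.h>

/* Stop subdividing below this chunk size; the value is an assumption. */
#define MIN_CHUNK (16u * 1024 * 1024)

typedef struct {
    void **chunks;      /* pointer list: where each memory chunk lives */
    size_t num_chunks;
    size_t chunk_size;  /* size of each chunk in bytes */
} ChunkedBuffer;

/* Try to allocate `desired` bytes: first as one big chunk, then in
 * progressively smaller chunks. Returns 1 on success, 0 on failure. */
static int alloc_chunked(ChunkedBuffer *buf, size_t desired)
{
    for (size_t chunk = desired; chunk >= MIN_CHUNK; chunk /= 2) {
        size_t n = (desired + chunk - 1) / chunk;  /* chunks needed */
        void **chunks = calloc(n, sizeof *chunks);
        if (!chunks)
            continue;

        size_t i;
        for (i = 0; i < n; i++) {
            chunks[i] = malloc(chunk);
            if (!chunks[i])
                break;  /* fragmentation bit us at this chunk size */
        }
        if (i == n) {   /* got the whole buffer at this chunk size */
            buf->chunks = chunks;
            buf->num_chunks = n;
            buf->chunk_size = chunk;
            return 1;
        }
        while (i > 0)   /* roll back and retry with smaller chunks */
            free(chunks[--i]);
        free(chunks);
    }
    return 0;
}

static void free_chunked(ChunkedBuffer *buf)
{
    for (size_t i = 0; i < buf->num_chunks; i++)
        free(buf->chunks[i]);
    free(buf->chunks);
    buf->chunks = NULL;
    buf->num_chunks = 0;
}
```

The reader/writer side would then index into the pointer list (`chunks[offset / chunk_size]` plus `offset % chunk_size`) instead of a single base pointer, which is the only real added complexity.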