It has been a while since my last post. After over 5 weeks of working round the clock on the 3D engine to compensate for the unplanned 3D conversion, I really needed a break.
Things should get back to regular scheduling now. And even though I took a break, I still got some work done.
First, the integration between the regular game engine and the 3D stuff has been strengthened, and the way you specify the 3D meshes and textures is no longer hard coded. I now have a fairly flexible system, though unfortunately it does not manage to hide the implementation details 100%.
I have a fair amount of new assets courtesy of BrewStew. I also experimented a lot with first person mode. At this stage I do not have the resources, but if I ever do, the "adventure" mode for this game will be Elder Scrolls meets Minecraft, but without so many cubes.
I also have a fair amount of unanswered emails in my inbox. Sorry about that. Breaks are total commitment affairs for me, so I don't check my email that often :). I'll get to answering your questions ASAP.
With all the new resources, game start-up is no longer that fast, so (for my own convenience) I tried to add multithreaded texture loading. The idea was for the game to boot up instantly and load the textures in the background; as soon as a texture finished loading, it would pop in. This was only meant for me, to make testing easier. The consumer really does not want extra pop-in. But this attempt failed completely and I did not manage to get it working in any acceptable fashion.
I complained in the past about the randomness of the GPU + Irrlicht. With new knowledge and techniques, I now have a predictable model for medium to high end GPUs. Actually, these GPUs run the game very well and with predictable behavior and performance. The low end GPUs are the problem. I will be buying a cheapish (around $700) laptop soon and I hope it will run the game perfectly with performance to spare. This is one of the hardware categories I am targeting.
As a lot of you may know, a mesh is just a collection of vertices and indices. The indices tell you in what order the vertices form faces. In Irrlicht, with hardware buffers, indices are 16 bit. This means you can have at most 65536 vertices in a buffer. But if you repeat vertices, you can have more than 65536 indices. It just so happens that meshes tend to have a larger number of indices than vertices, and this is very true for my meshes. So it makes sense to have more than 65536 indices in the same buffer. Better GPUs not only have no problem with this, you even get a nice performance boost. But at least some low-end GPUs complain and do not allow you to have more than 65536 indices.
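As a back-of-the-envelope illustration of why index counts outgrow vertex counts (a plain grid mesh, nothing Irrlicht-specific; these helper functions are just mine for the example):

```cpp
#include <cstddef>

// Vertex/index bookkeeping for an (n x n)-vertex grid of quads.
// Shared corners keep the vertex count low, while every quad still
// contributes 6 indices (two triangles), so indices outgrow vertices fast.
std::size_t gridVertexCount(std::size_t n) { return n * n; }
std::size_t gridIndexCount(std::size_t n)  { return 6 * (n - 1) * (n - 1); }

// 16-bit hardware indices can address at most 65536 distinct vertices,
// but say nothing about how long the index list itself may be.
bool vertsFit16Bit(std::size_t n)     { return gridVertexCount(n) <= 65536; }
bool indicesExceed65536(std::size_t n) { return gridIndexCount(n) > 65536; }
```

A 256 x 256 grid, for instance, has exactly 65536 vertices (so it still fits the 16-bit range), but 390150 indices, nearly six times as many.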
To make matters worse, low-end GPUs also have a maximum total number of vertices and indices, and if you go over either one of these, the entire scene fails to render under DirectX.
So for low-end hardware I needed a way to break buffers up into smaller chunks. The problem is that Irrlicht does not give you direct control over the hardware mapping for a buffer, and it does not like very dynamic and drastic changes to the structure and content of buffers. Once you have created them, if you change their structure Irrlicht will crash, deep inside the DLL, where I really don't want to be debugging. The problem seems to be that the mapping from software buffers to hardware buffers is done only at the render step, and if Irrlicht then does not find something it expects to find, it crashes.
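To give a concrete picture of the kind of chunking involved, here is a small sketch of my own (not Irrlicht code; all the names are made up) that partitions a 32-bit triangle index list into chunks whose remapped vertex count stays within the 16-bit range:

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

// One chunk of a split mesh: a chunk-local vertex remap table plus
// 16-bit indices that reference only the chunk's own vertices.
struct Chunk {
    std::vector<uint32_t> vertexRemap;  // chunk-local id -> original vertex id
    std::vector<uint16_t> indices;      // chunk-local 16-bit indices
};

// Walk the triangle list and start a new chunk whenever the next
// triangle would push the chunk past maxVerts unique vertices.
std::vector<Chunk> splitForLowEnd(const std::vector<uint32_t>& indices,
                                  std::size_t maxVerts = 65536) {
    std::vector<Chunk> chunks;
    chunks.emplace_back();
    std::unordered_map<uint32_t, uint16_t> localId;  // original -> chunk-local

    for (std::size_t tri = 0; tri + 2 < indices.size(); tri += 3) {
        // Count how many of this triangle's vertices are new to the chunk.
        std::size_t newVerts = 0;
        for (int k = 0; k < 3; ++k)
            if (!localId.count(indices[tri + k])) ++newVerts;

        // Overflow? Close this chunk and open a fresh one.
        if (chunks.back().vertexRemap.size() + newVerts > maxVerts) {
            chunks.emplace_back();
            localId.clear();
        }

        Chunk& c = chunks.back();
        for (int k = 0; k < 3; ++k) {
            uint32_t v = indices[tri + k];
            auto it = localId.find(v);
            if (it == localId.end()) {
                it = localId.emplace(v, (uint16_t)c.vertexRemap.size()).first;
                c.vertexRemap.push_back(v);
            }
            c.indices.push_back(it->second);
        }
    }
    return chunks;
}
```

Each chunk then becomes its own buffer, so no single buffer ever exceeds what the low-end hardware accepts.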
I managed to find a needlessly complicated solution for this. When you need to get rid of a buffer, you create a new, empty dummy buffer. You clone all the hardware identification info into it and put this dummy in place of the original, while you take the original and add it to a pool. You also mark the new empty buffer as dirty. You do not remove the old scene node and add a new one with new real buffers. When Irrlicht renders, it will think the new empty dummy buffer is the old one, just emptied. So it will clean up whatever reference it holds to it, the reference that causes the system to crash if I remove the buffer directly.
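In pseudo-C++ the swap-and-pool trick looks roughly like this (a sketch with made-up types; none of these names are Irrlicht API):

```cpp
#include <cstdint>
#include <vector>

// Stand-in for a mesh buffer with a hardware-side identity.
struct GpuBuffer {
    uint32_t hardwareId = 0;        // identification the renderer clings to
    bool dirty = false;             // tells the renderer the contents changed
    std::vector<uint16_t> indices;  // (vertex data omitted for brevity)
};

class BufferPool {
public:
    // Instead of deleting 'live' (which crashes at render time),
    // swap in an empty dummy that impersonates it.
    GpuBuffer* retire(GpuBuffer* live) {
        GpuBuffer* dummy = new GpuBuffer();
        dummy->hardwareId = live->hardwareId;  // clone the hardware identity
        dummy->dirty = true;                   // renderer sees "same buffer, now empty"
        pool_.push_back(live);                 // keep the real one for later reuse
        return dummy;                          // caller places this in the scene node
    }

    // Later, reuse a pooled buffer instead of creating a fresh one.
    GpuBuffer* acquire() {
        if (pool_.empty()) return new GpuBuffer();
        GpuBuffer* b = pool_.back();
        pool_.pop_back();
        b->indices.clear();
        b->dirty = true;
        return b;
    }

    std::size_t pooled() const { return pool_.size(); }

private:
    std::vector<GpuBuffer*> pool_;
};
```

The renderer clears its dangling reference while looking at the dummy, and the real buffer survives in the pool for the next level.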
The system is not perfect yet. There are still some memory leaks that will eventually cause the system to run out of RAM. Level changing is a little bit slow (but a lot faster than without pooling). And the system gets inundated by empty scene nodes, which must stick around for at least one render cycle in order for some buffers to get cleaned up. These nodes don't seem to eat up resources, but I need to find a way to clean them up.
Like I said, all this mess is for low-end GPUs.
As for Irrlicht, I don't have the resources right now to abandon it, but our collaboration will not be long lived. It is a good workhorse, but it is not well suited for very advanced hardware-related tasks. Or maybe it is just poorly documented, and somewhere there is a function cleanUpThisStupidMessOfABuffer or illLetYouManageStuffOnYourOwnBecauseIAmTooStupid. My theory is that (as so often in open source) the more advanced features were not extensively tested, and almost nobody except the one who implemented them has used them for anything non-trivial. For example, I am sending it buffers totaling double the amount of MiBs my GPU has, and it works fine. Only it is frustratingly hard with Irrlicht; with every single line of code you are fighting it. I'm sure nobody has tried this before. If someone claims that "Irrlicht has top world leading buffer management capabilities" they deserve a slapping. No, Irrlicht has buffer management capabilities so simple and painless to use that a monkey could do it, as long as you are only doing simple stuff. That is somewhat of a compliment. Too bad it does not like destroying the buffers you so easily created.
I'll leave you with a few places this project has been referenced on the Internet. Most topics only mention it before the subject changes, but it is still decent exposure. I really need to settle on the final name for the game before this becomes a PR disaster. But finding that name is a lot harder than making Irrlicht delete a stupid buffer.
DwarvesH (Dwarf Fortress for the rest of us?)
DwarvesH - New Dwarf Fortress inspired game in development (Goblin Camp forums)
DWARF FORTRESS 0.34.01 (Feb 14, 2012) has been released!
Dwarf Fortress Clone? (Bay12 Forums)
DO YOU RECOMMED DWARFS!?