This blog series is part of the write-up assignments for our Game Engineering II class in the Master of Entertainment Arts & Engineering program at the University of Utah.
This post covers the updates I made to the project and the final presentation.
Use WASD or a controller analog stick to move the wolf around. Step onto a deer to eat it. After all the deer are gone, the game will shut down after 5 seconds, and you can open the eae6320.log file to see how much time and how many steps it took you to finish the game.
During this week, I added some gameplay code, such as killing the deer, and a game-ending condition for when you kill all the deer. I've also made the level a little bigger and added two more deer.
Aside from that, I also incorporated Ekshit Nalwaya’s particle system to add some visual indication to the game.
My goal for this final project was to create gameplay similar to Crypt of the NecroDancer. However, I made some changes to the original gameplay. First, the player has free movement instead of having to move on the music beats. Second, there are no enemies in the game, which means that the player does not have health, and there is also no lose state.
So basically, my game is a simple hunting game where you play as a wolf and try to hunt all the deer on the screen while they roam around the level. The final result is shown below.
Now let’s talk about all the components that I’ve used in the game. As I said in the previous blog post, aside from my own navigation system, I have included Deng’s behavior tree, Sai Upadhyayula’s controller input, Ekshit Nalwaya’s particle system, and Shantanu Pandey’s audio system in my final game. I will go into how I used each of them below.
Behavior Tree (4 hrs spent)
The behavior tree system is really fun. Since it has basically the same structure and implementation as Unreal Engine’s behavior tree, which I am very familiar with, I had no problem integrating it at all. At the same time, the AI I need in my game is really simple: I only had to make a custom decorator class, a task class, and a service class, and that was all I needed.
One problem that took me some time to resolve was how to share blackboards and behavior trees. At first, I thought I could share a single blackboard and behavior tree among all the NPCs, since they run the exact same logic and the only difference is in the task node. However, a tree instance has a one-to-one relationship with its game object, so a shared tree would only update one of them. Fortunately, this was easy to solve: I gave each NPC its own tree instance while still sharing the same blackboard among them, since all the blackboard stores is basically the movement time interval.
Audio System (1 hr spent)
This system is really easy to integrate and use. I did wish for a function that could give me the beats-per-minute (BPM) of the music so I could use it in my game, and I didn’t get one (which is totally fair, since that feature is really specific to my game), but everything else works pretty well. However, there is one problem that I still cannot resolve, which I will talk about in the obstacles section.
Controller Input (0.5 hr spent)
This was the easiest system to integrate of them all, simply because it is so independent of all the other subsystems. I also didn’t need the custom C# settings graphical interface, so it was trivial to simply add some more code to check the controller inputs.
Particle System (4 hrs spent)
This system is clearly written with a high degree of customization in mind. Also, when I had a problem finding the builder project and contacted Ekshit, he was really responsive and resolved the issue super fast even though he was already back in India. I did encounter some problems when integrating it into my game, though.
When I tried to render a particle at a certain location, I thought I could use the UpdateParameter() template function to change the particle’s position (which is pretty cool, since it also goes through LuaEngine). However, I later realized that there is no way for me to retrieve that information back from the particle system to create the transform matrix I need to render the particle on screen. What I had to do was add another array, particleSystemPositionArrayPtr_perFrame, to my rendering bucket to store all the positions of the particle systems.
Another problem I had to deal with was handling particle destruction. At first, I decreased the reference count and set the pointer to nullptr immediately after passing it on to my Graphics system. The problem was that the update function then wouldn’t update the particle, since the pointer was, of course, nullptr. What I ended up doing is replacing the old particle whenever I need to create a new one, which works fine since my game doesn’t really need more than one particle effect so far.
Obstacles
After Mukul reminded me, I realized that the audio system is probably not cleaning up thoroughly, which causes the application to stay in the background processes after it is shut down. After spending some time on the project and going through the Microsoft XAudio2 API, I still couldn’t figure out which part was causing the problem. I did notice that it only happens in the release build for me, not in the debug build. My guess is that there is still something running on another thread that hasn’t been cleaned up.
What I Have Learned Throughout
Since the final project had us build an engine component for other people to use, it forced me to really think about platform independence and interface structure. To be honest, a lot of the time when I am writing code that I know I will probably be the only one to ever see, I don’t think about making a great interface that much. I do it only to the extent that it makes sense to me, which means that if I come back and revisit the code a few months later, it can take me some time to actually figure out what is happening. What the final project offered us was the experience of creating a real component with two concepts in mind: one is providing a good interface, and the other is allowing others to not have to worry about the implementation. This is also probably the first time I have put that much effort into thinking about the preconditions and postconditions of my functions. I understand now that if I change the implementation of a function, the preconditions and postconditions still need to remain valid for the previous usage; otherwise it might produce unexpected behavior in other people’s applications.
Constantly needing to incorporate and maintain project dependencies gave me really solid training in being aware of how different projects link together and in incorporating external dependencies. This kind of experience can be hard to gain when working only on small projects or with a commercial game engine. Having gone through this semester, I have become extremely comfortable resolving all sorts of linker errors.
I have been used to object-oriented programming for so long that it is just hard not to think in that mindset. Even though I have done data-driven programming before, for a dialogue system I made in Unreal and for a Lua object system in another engine, it was a totally different experience seeing that we can use a custom plugin to export models from Maya into a Lua file, and then build that into a binary file so that we can read and extract the information as efficiently as possible.
To achieve this, we really need to keep the hardware structure constantly in mind. Should I add padding to some structs? What about when exporting binary files? Should I size data types dynamically (for example, using uint16_t instead of unsigned int when the data is small enough)?
I really enjoyed doing this kind of programming and optimization; it makes me feel really good to figure out how to reduce memory usage or read/write data more efficiently, since I have always wanted to do console development, where memory and processing power are super limited.