Battle of Mustafar

Battle of Mustafar is a VR survival game set in the Star Wars universe, where your goal is to defend against waves of drones. You can deflect laser bolts, slice through the drones, or use the Force to throw your lightsaber and retrieve it. It was created in Unity as a group project for the 3D Games Programming module on my Master's course, with a focus on VR and AI.

Showcased at Develop: Brighton 2019!

Battle of Mustafar was created with the aim of exploring VR and AI. I was responsible for the AI, applying machine learning to the drones that float around the player; this was a specific requirement for the module and let me explore new areas of AI. Each drone was controlled by a neural network with one input layer, one hidden layer and one output layer. A genetic algorithm then measured the fitness of each drone based on the collective distance reported by 18 sensors cast out from the centre of the drone: the greater the total distance, the further the drone stayed from obstacles, which was treated as successful avoidance. A set of 18 drones was placed in a training scene for 24 hours to produce the best fitness, which was then extracted and placed into the main game. The hardest part of development was verifying whether the neural networks were actually working as expected or just getting lucky.
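The training code itself isn't shown here, but the idea can be sketched roughly as follows. This is a minimal NumPy stand-in, not the Unity implementation: the 18 sensor inputs and distance-based fitness come from the description above, while the hidden-layer width, output size, mutation rate and selection scheme are illustrative assumptions.

```python
import numpy as np

SENSOR_COUNT = 18          # rays cast out from the centre of each drone
HIDDEN_SIZE = 12           # hypothetical hidden-layer width
OUTPUT_SIZE = 3            # hypothetical outputs, e.g. a steering vector

class DroneBrain:
    """One-hidden-layer network whose weights are evolved, not backpropagated."""
    def __init__(self, genome=None):
        n = SENSOR_COUNT * HIDDEN_SIZE + HIDDEN_SIZE * OUTPUT_SIZE
        self.genome = genome if genome is not None else np.random.uniform(-1, 1, n)
        split = SENSOR_COUNT * HIDDEN_SIZE
        self.w1 = self.genome[:split].reshape(SENSOR_COUNT, HIDDEN_SIZE)
        self.w2 = self.genome[split:].reshape(HIDDEN_SIZE, OUTPUT_SIZE)

    def act(self, sensor_distances):
        """Map the 18 sensor distances to a movement output."""
        hidden = np.tanh(sensor_distances @ self.w1)
        return np.tanh(hidden @ self.w2)

def fitness(sensor_readings_over_time):
    """Fitness = collective distance reported by the sensors: the larger the
    summed distances, the further the drone stayed from obstacles."""
    return sum(np.sum(frame) for frame in sensor_readings_over_time)

def next_generation(population, fitnesses, mutation_rate=0.05):
    """Simple genetic step: keep the fittest half's genomes, breed mutated copies."""
    order = np.argsort(fitnesses)[::-1]
    survivors = [population[i].genome for i in order[: len(population) // 2]]
    children = []
    for genome in survivors:
        child = genome.copy()
        mask = np.random.rand(len(child)) < mutation_rate
        child[mask] += np.random.normal(0, 0.5, mask.sum())
        children.append(DroneBrain(child))
    return [DroneBrain(g) for g in survivors] + children
```

With 18 drones per generation, each drone runs its `act` output every frame, its sensor readings accumulate into a fitness score, and the best genomes carry over into the next run.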

VR is always a struggle to develop for: you have to make sure the player doesn't feel sick and that nothing breaks their immersion, which means a lot more focus has to go into world and sound design. Having experience in sound design, I took this on and immediately ran into a problem. I originally wanted to create my own sounds to move away from the Star Wars connection so that the game could eventually be released somewhere. However, it quickly became apparent that this made people feel worse and took them out of the game, as the sounds weren't what they were expecting. The audio was then redesigned around the original Star Wars sounds, and immediately players stopped noticing they were in a VR game and started calling themselves Jedi! To get convincing spatial sound, the FMOD audio engine was used, providing proper 3D audio that helps the player locate the drones.

After submitting the game for the module, an opportunity came up to showcase it at Develop: Brighton 2019, so a lot more time was spent polishing the project and making it feel like a game rather than a university assignment. Unfortunately, the first thing to be removed was the neural-network drones. Although they were a great learning exercise, they didn't help make the game fun: more often than not the drones still collided and showed no real intelligence, with every drone ending up firing and moving at the same time. The drone AI was rewritten with more randomness to create varied encounters, as sketched below. To make the drones feel more believable, their models were given more depth and basic animation was added so that they weren't just spheres floating in the sky.
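The rewritten behaviour isn't documented in detail, but the "more randomness" idea boils down to giving each drone its own timers and offsets so the swarm never acts in lock-step. A rough Python sketch of that pattern, with made-up numbers (the real version lived in Unity):

```python
import math
import random

class Drone:
    """Per-drone randomness so the swarm never fires or bobs in lock-step."""

    def __init__(self, base_height):
        # Every drone rolls its own timings, so behaviour is staggered.
        self.fire_cooldown = random.uniform(1.5, 4.0)       # seconds between shots
        self.time_until_fire = random.uniform(0.0, self.fire_cooldown)
        self.bob_phase = random.uniform(0.0, 2 * math.pi)   # desynchronised hovering
        self.base_height = base_height
        self.height = base_height
        self.time = 0.0

    def update(self, dt):
        self.time += dt
        # Hover height drifts on a per-drone phase, so drones don't bob together.
        self.height = self.base_height + 0.3 * math.sin(self.time + self.bob_phase)
        self.time_until_fire -= dt
        if self.time_until_fire <= 0.0:
            self.fire()
            # Re-roll so firing never settles into a predictable rhythm.
            self.time_until_fire = self.fire_cooldown * random.uniform(0.5, 1.5)

    def fire(self):
        pass  # placeholder: spawn a laser bolt aimed near the player
```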

The most valuable lesson from this project was about player experience. This was one of the first projects I showed off that felt like a proper game, and that was largely down to the reaction of players. Many of them said afterwards that they felt like a Jedi as they deflected bolts and destroyed drone after drone. The truth, had they known it, is that a hidden aim assist was added to increase the chance that deflected bolts flew directly at drones; without it, most bolts blasted off in the wrong direction and the drones just kept shooting you. It was a sad day when I learnt I wasn't really a Jedi, but a very successful lesson in making the player feel like they are.
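As a rough illustration of how an aim assist like this can work, the direction of a deflected bolt can be blended towards the nearest drone before the bolt is launched. The sketch below is an assumption about the approach rather than the game's actual code, and the blend strength is a made-up value:

```python
import numpy as np

def aim_assist(reflected_dir, deflect_pos, drone_positions, strength=0.6):
    """Nudge a deflected bolt towards the nearest drone.

    strength=0 keeps the raw physical reflection, strength=1 sends the bolt
    straight at the drone; the default here is an illustrative guess.
    """
    reflected_dir = reflected_dir / np.linalg.norm(reflected_dir)
    # Find the drone nearest to the point of deflection.
    nearest = min(drone_positions, key=lambda p: np.linalg.norm(p - deflect_pos))
    to_drone = nearest - deflect_pos
    to_drone = to_drone / np.linalg.norm(to_drone)
    # Blend the true reflection with the ideal direction and renormalise.
    assisted = (1 - strength) * reflected_dir + strength * to_drone
    return assisted / np.linalg.norm(assisted)
```

Keeping some of the true reflection in the blend is what preserves the illusion: the bolt still goes roughly where the player aimed, just reliably enough to hit.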