Tuesday, 23 January 2018

Final Development Log

This is the final development log for my Individual Project, which I've decided to name "Transition Level". This log is unique in that, three weeks before the due date, I decided to scrap my VR game and build a non-VR version. However, after a week of work on that, I was shown several techniques that would make a VR version of the game much more feasible in Unity. I spent a weekend on a crash course in C# and Unity, and after that I moved my project into Unity. I'll go through, step by step, how I managed to produce the Unity version of the game.

First, I needed to import all of my built levels from Unreal into Unity. I decided that if I couldn't do that easily, rebuilding the game in Unity would be far too time-consuming, as I had already spent the whole week prior rebuilding levels. I found a way to export levels from Unreal as FBX files and imported those FBXs into Unity as separate scenes. They had to be resized and given new lighting, and colliders and the like needed to be set up again as well. All in all, though, it wasn't time-consuming enough to put me off trying the Unity version.

One of the major reasons I rebuilt in Unity was to add cloth physics to the game. As I've previously outlined, I had difficulty adding cloth physics in Unreal. It was one of the things my project supervisor suggested, and I really didn't want to let him down. It turned out to be easy in Unity: by adding a Cloth component to the clothes I wanted to behave like cloth and a sphere collider to each of the controllers, I could create the effect. I merely needed to paint constraints so the clothes wouldn't fall through the level and adjust some other settings, such as damping.
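The setup above can be sketched in code — this is a hedged illustration, not my exact project setup, and the field names are illustrative:

```csharp
using UnityEngine;

// Sketch of the cloth setup described above: the garment mesh gets a Cloth
// component, and each motion controller's SphereCollider is registered with
// it so the player's hands can push the fabric around.
public class ClothSetup : MonoBehaviour
{
    public SkinnedMeshRenderer garment;          // the mesh that should behave like cloth
    public SphereCollider[] controllerColliders; // one per Vive controller

    void Start()
    {
        Cloth cloth = garment.gameObject.AddComponent<Cloth>();
        cloth.damping = 0.3f; // settle the fabric, as mentioned above

        // Wrap each controller collider so the cloth reacts to the hands.
        var pairs = new ClothSphereColliderPair[controllerColliders.Length];
        for (int i = 0; i < controllerColliders.Length; i++)
            pairs[i] = new ClothSphereColliderPair(controllerColliders[i]);
        cloth.sphereColliders = pairs;

        // Constraint painting (pinning vertices so the clothes don't fall
        // through the level) happens in the editor's Cloth inspector, not here.
    }
}
```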


The objects simply needed to be grabbable. I don't find the grabbing mechanics quite as fluid in Unity's VR Toolkit as in Unreal's VR Template, but I was able to adjust to them. I wonder whether, with more time, I could have done more with these. For now, though, I focused on making sure the phone worked as intended. By containing the specific audio clips the phone would play within the phone object itself, I was able to create a speaker effect akin to a real phone, where it's more audible when you hold the phone next to your ear.

One of the key differences between Unity and Unreal is the lack of visual scripting. Unreal's visual scripting system, Blueprints, was one of the major reasons I originally chose that engine for this project. Eventually, however, I realized that being constrained to Blueprints limited what I could actually do in the game. I started learning basic C# for Unity and was able to set up my events much more easily than with Blueprints. Here's the script I used for the phone:
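In spirit, it did something along these lines — a sketch with illustrative names, not the original script: the phone's clips live on the phone object and play through a fully 3D AudioSource with a short rolloff, so they're only clearly audible with the phone at your ear.

```csharp
using UnityEngine;

// Sketch of a phone "speaker": a 3D-spatialised AudioSource on the phone
// object whose volume falls off quickly with distance.
[RequireComponent(typeof(AudioSource))]
public class PhoneSpeaker : MonoBehaviour
{
    AudioSource speaker;

    void Awake()
    {
        speaker = GetComponent<AudioSource>();
        speaker.spatialBlend = 1f;                     // fully 3D audio
        speaker.rolloffMode = AudioRolloffMode.Linear; // predictable falloff
        speaker.minDistance = 0.05f;                   // full volume right at the ear
        speaker.maxDistance = 1.5f;                    // near-silent at arm's length
    }

    // Play a specific clip from the phone itself.
    public void Play(AudioClip clip)
    {
        speaker.clip = clip;
        speaker.Play();
    }
}
```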



Level transitions were mostly handled by "locking" the door in each level until the conditions behind "GameController.levelconditions" had been met. The conditions differ from level to level, but they're all tracked through that same "levelconditions" variable. In the Clothes Store level, a more specific series of tasks has to be completed, so the script simply waits until the final task is done before setting "levelconditions" to true, while in the Bedroom the player merely needs to pick up the phone.
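The pattern can be sketched like this — "GameController.levelconditions" is from the game itself, but the surrounding structure and names here are illustrative assumptions:

```csharp
using UnityEngine;

// Assumed shape of the shared game state: a single flag that each level's
// scripts set to true once that level's tasks are complete.
public static class GameController
{
    public static bool levelconditions = false;
}

// Sketch of the door "lock": a blocking collider that is disabled once the
// level's conditions are met.
public class DoorLock : MonoBehaviour
{
    public Collider doorBlocker; // invisible collider keeping the door shut

    void Update()
    {
        if (GameController.levelconditions)
            doorBlocker.enabled = false; // unlock the door
    }
}

// In the Bedroom, picking up the phone is the only condition, so its grab
// handler (hypothetical hook name) just flips the flag directly.
public class PhonePickupCondition : MonoBehaviour
{
    public void OnGrabbed()
    {
        GameController.levelconditions = true;
    }
}
```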

NPCs have a follow mechanic, created by scripting a controller that follows the HTC Vive headset around while locking the model to a specific axis. The rest of the animation is handled through an animation controller running animations I made in Maya. Here are some quick renders of the character models from after I made and rigged one of them:
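The follow mechanic can be sketched as below — "NPCController" is the name used in this log, but the fields and tuning values are illustrative:

```csharp
using UnityEngine;

// Sketch of the follow mechanic: track the headset's position on the ground
// plane while the model stays locked to its own vertical axis, so it neither
// tilts nor leaves the floor.
public class NPCController : MonoBehaviour
{
    public Transform headset;        // the HTC Vive camera/eye transform
    public float followSpeed = 1.5f; // metres per second
    public float stopDistance = 1f;  // don't crowd the player

    void Update()
    {
        // Target the headset's position, but keep our own height (Y locked).
        Vector3 target = new Vector3(headset.position.x,
                                     transform.position.y,
                                     headset.position.z);

        if (Vector3.Distance(transform.position, target) > stopDistance)
            transform.position = Vector3.MoveTowards(
                transform.position, target, followSpeed * Time.deltaTime);

        // Face the player; since target shares our height, this only
        // rotates around the vertical axis.
        transform.LookAt(target);
    }
}
```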



Most of the game was simply wiring voice clips up to different collisions and scripts; however, I do feel I let the game down in that respect. My previous attempts at getting semi-professional or student actors to help with the project fell through, so I was forced to provide the voices for the game myself. I was able to enlist my girlfriend to help, but neither of us is a professional voice actor. The difficult subject matter made it hard for us to properly act out the dialogue, and I had to tone it down a bit in order to salvage it.

The last bit of development I really want to mention is the mirror mechanic. I was most worried about failing to implement this one, as it was always planned to be the crux of the game. I created the mirror by essentially making it an extra camera view that displays what that camera sees. However, a plain mirror wasn't good enough: I needed it to be unable to show the character until the end of the game. I did this by attaching a model, with a script similar to the aforementioned NPCController, to the headset, and using a culling mask so the character is invisible in the game everywhere except the mirror. Since VR games don't tend to use character models for the playable character, it took a while to come up with this solution.
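The culling-mask trick can be sketched like this — a minimal illustration assuming the body model sits on a dedicated layer (the layer name and fields are my placeholders):

```csharp
using UnityEngine;

// Sketch of the mirror setup: the player's body model lives on its own layer;
// the headset camera culls that layer out, while the mirror camera (rendering
// to a RenderTexture on the mirror surface) keeps it visible.
public class MirrorSetup : MonoBehaviour
{
    public Camera mainCamera;   // the VR headset camera
    public Camera mirrorCamera; // renders to a RenderTexture on the mirror quad

    void Start()
    {
        int playerLayer = LayerMask.NameToLayer("PlayerBody");

        // Hide the player's body from the headset view...
        mainCamera.cullingMask &= ~(1 << playerLayer);

        // ...but leave it visible in the mirror. For the endgame reveal,
        // the mirror's mask could be flipped the same way.
        mirrorCamera.cullingMask |= 1 << playerLayer;
    }
}
```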

My final thoughts on the matter are that I'm just glad to have a playable VR game. I ended up paring it down far more than I'd have liked, cutting at least two scenarios and all of the "transition levels". This was because I couldn't get access to a VR headset until the middle of last month; had I had more time to work in VR, I'd likely have a product closer to what I originally planned. I'm not sure this game actually does a very good job of capturing my feelings of gender dysphoria and the harassment I deal with when presenting as a woman, but I am glad to have at least attempted it. Below is a video of the game showcased in VR:

https://youtu.be/RJG5GquWDgU