Jason Lynch, 4th Year Individual Portfolio (The Thinking City) - First Person Controller

Here you will find a step-by-step layout of the development behind the U.I, Door & Robot Pod Design, Quest Design, and the FPS Controller.


Relevance of Sections Covered:

The User Interface

From our user testing we found that one of the main problems in the game was that the player was unsure of what they were supposed to be doing. To address this issue I began designing an in-game U.I for the player to utilize. This U.I contains useful information to help the player get a better understanding of what their current objective is, what they are looking for, and where they are in the level. The U.I contains a map on the left hand side with icons for each room so the player always knows where they are on the map. It has an objective list in the middle with an arrow underneath that tells the player what they have to achieve, with the arrow constantly pointing toward it. Finally, on the right it has a visual image of what the player is looking for. Tests after this addition showed that players found it much easier to know what they were doing, which made the U.I a very important part of the game.

 

The First Person Controller (FPS Controller) Note: Development was stopped once we decided on VR

The First Person Controller, be it VR or keyboard/controller, is the most important part of the game because without one of these there is no way for the player to move around our level and hence we have no game. We are currently experimenting with both a VR player input and a standard keyboard/controller input in an attempt to measure which complements our game more. However, since it is good knowledge to have and would make the game accessible to a larger audience, I began working on the FPS Controller while we conducted tests with a Virtual Reality headset.

The reasoning behind choosing a first person perspective over a third person or top down view, etc., is that to convey the tone and atmosphere of the game effectively the player needs to view things as if they are there themselves, and the best approach to conveying this is the first person perspective.

 

The Door Functionality & Robot Pod

The malfunctioning doors are one of the core mechanics from the early concept of The Thinking City. These doors randomly open and close, providing or restricting access to rooms. From early play tests of the paper prototype for this game we found that users really enjoyed the added tension that doors could suddenly close in front of them, blocking their path, or even open, providing an escape from the A.I enemy. This positive feedback from tests made it important to keep this functionality in the full game.

The robot resides inside a charging pod at the entrance to the labs. It remains there in an Idle state until the player enters the level. Once the player does, a power up event begins where the charging pod starts to open and the robot enters its powering up state. This is the point at which the tension in our game picks up, as the robot is now in the level searching for the player and attempting to kill them.

 

The Player's Quest

In our paper prototype the player had to traverse the level and acquire three different keycards scattered throughout the map in three different but important rooms within our level. The first card is located in robotics, the second in the gravity room, and the third in the power generator room. Once all keycards were acquired the player could restore power to the facility in the power room, head to the security room and lift the lockdown, then finally exit through the right of the map. The reason for these cards was mainly to get the player to explore our environment while adding an obstacle to their objective of escaping. Through testing it proved to be exciting for players, with the A.I constantly on the lookout for them. However there were comments that the ending was a bit anti-climactic since it just faded out once the player exited the bulkhead doors. To address this we added a teaser of the rest of the game at the end, which you will see later in this page.

Technical Challenges:

I faced quite a few challenges when developing the sections covered in this page. These challenges are categorized and listed below.

 

The User Interface 

  1. I had never designed a U.I before.
  2. The U.I had to exist inside the game environment.
  3. It had to be easy to understand.
  4. It mustn't break game immersion.

 

The Door Functionality & Robot Pod

  1. It was my first time using animation curves over pre-made animations. 
  2. I hadn't used IEnumerator much before.

 

The Player's Quest

  1. I had not built scripts that interacted with each other to this extent before.
  2. I needed to make a gameManager to track player progress.
  3. I was unsure how to properly structure scripts involved in mission progress.

 

The FPS Controller

  1. I had never made an FPS Controller before.
  2. Developing input for the keyboard.
  3. Creating an aesthetically pleasing head bob effect on the FPS Controller.
  4. Handling collision with surfaces.
  5. Handling collision with enemies.

References:

The references I used when making the U.I, FPS Controller, Door Functionality, and Player's Quest are the same pool of videos used for the other roles. There are plenty of videos on Unity Scripts, designing user interfaces, and developing scripts for the Unity environment.

 

Keywords: Unity, FPS Controller, U.I, Unity Scripts

Developing the User Interface (U.I)

Considering the U.I

When developing the U.I there were several things we needed to consider before building it. The first was that we wanted the U.I to fit the Sci-Fi theme of our game. Our artist did not have enough time to develop the visuals for this U.I so I searched for third party assets that fit the theme we were looking for. Luckily I managed to find a very good Sci-Fi U.I asset which I've linked below.

The second thing we considered is that we did not want the U.I to take from the game's immersion. To address this we built a watch that the player wears on their left hand. Once the player presses the X button on the left hand controller, it causes the watch to open up a holographic U.I that the player can then use to see the map, objective, or item they're looking for. The hologram can then easily be closed by letting go of the button.

The third thing we wanted to consider was the feedback from the users. We got a lot of consistent feedback saying it was difficult to understand what the objective was in the game, and where the player was in the map, as they often got lost in the halls of the game. To address the issue of players getting lost we designed the U.I to have a map of the facility on the left hand side with the current location of the player as well as icons for each room. To address the issue of players not knowing what they had to do we put a panel in the middle with the player's current objective written down, with an arrow below pointing toward their objective in the level. Finally, to address the issue where players didn't know what they were looking for we put a panel on the right with a visual image of what they are currently searching for.

Below is a video of the U.I working within the game. I will explain each part of its development further below.

Final in game U.I

Download UI_Finished.mp4 [20.72MB]

Setting up the U.I Watch

We wanted to have a visual representation of the U.I in the game and not just a holographic display from out of nowhere. To this end I suggested to the team that the player wears a watch that the hologram will come out of on a trigger button press. Everyone seemed to like the idea so Dylan began modelling the watch. Once he was done he gave the model to me and I connected up the U.I. The first thing I put on was a small U.I on the watch face. This is purely visual in design to give the watch more of a Sci-Fi look.

You can see the watch model below with the U.I on the face of it.

Watch Model with Face U.I


Setting up the Map

The Map Model

The map section of the U.I is generated using a neat little trick I found in the resource videos. It involved modelling an extremely primitive replica of the level. Once I had that done I added it to the game and scaled it up so it mirrored the level perfectly. I then lowered the map model under the level and put it on a "MiniMap" layer. By placing it on its own layer I was able to add a camera and set it up so it would only render the minimap objects.

mapPosition.PNG

 
The Minimap Camera

With the minimap model in place I set up a camera that would look straight down on the level. However, I specified in the culling mask settings that the camera ignore everything in the scene except the "MiniMap" layer. This meant that the camera would only render the minimap objects. With the camera only rendering the minimap I could now use it as the field of view for the map.

mapCamera.PNG

 
The Players Position

My next task after setting all that up was to get the player to show up on this minimap. To achieve this I attached a sphere and cone to the player and lowered them into the map below the game level. This meant that whenever the player moved, the sphere also moved along the map underneath the game. It was a simple trick to achieve what we were looking for in the game.

playerSphere.PNG

 

Displaying the Map on the U.I

With everything set up for the map, the next step was to get the map displaying on the player's U.I. To do this I used a raw image component and created a render texture, which allows what the camera sees to be displayed on an image component. Doing this in the U.I gave us the map of the level that the feedback called for.

mapOnUI.PNG

Setting up the Objective List

The Current Objective

From feedback we found that players didn't know what it was they were supposed to be doing. To address this we implemented a goal tracker in the middle of the U.I. This goal tracker contains the current objective of the player and changes as the player reaches different milestones in the mission. The objective list is contained within a UI script and is displayed on a Text Mesh Pro component on the U.I.
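To make that concrete, here is a minimal sketch of how such a goal tracker could be driven. The class name, field names, and the AdvanceObjective method are assumptions for illustration rather than the exact UI script used in the game.

```csharp
using TMPro;
using UnityEngine;

// Hypothetical sketch of the goal tracker; names are placeholders.
public class UIObjectiveTracker : MonoBehaviour
{
    [SerializeField] private TextMeshProUGUI objectiveText; // Text Mesh Pro component on the U.I panel
    [SerializeField] private string[] objectives;           // One entry per mission milestone

    private int currentIndex = 0;

    private void Start()
    {
        if (objectives.Length > 0)
        {
            objectiveText.text = objectives[currentIndex];
        }
    }

    // Called by the quest scripts when the player reaches the next milestone.
    public void AdvanceObjective()
    {
        if (currentIndex < objectives.Length - 1)
        {
            currentIndex++;
            objectiveText.text = objectives[currentIndex];
        }
    }
}
```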

objectiveList.PNG

 

The Arrow Pointing to Objective

The arrow is located in the center of the middle U.I and constantly points toward the current goal/item the player is after. I built the arrow model in Maya and applied the holographic shader that I made. It uses a script that updates the arrow's transform to point at the current objective, which it gets from the UI script.

arrowUI.PNG

You can find the arrow script below with comments within.
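As a rough idea of what that script does, here is a hedged sketch; the currentObjective field being assigned by the UI script is an assumption, and the real arrow script may differ.

```csharp
using UnityEngine;

// Minimal sketch: rotate the arrow so it faces the current objective each frame.
public class ObjectiveArrow : MonoBehaviour
{
    public Transform currentObjective; // Set by the UI script when the objective changes

    private void Update()
    {
        if (currentObjective == null) return;

        // Point the arrow's forward axis at the objective in the level.
        Vector3 direction = currentObjective.position - transform.position;
        if (direction.sqrMagnitude > 0.0001f)
        {
            transform.rotation = Quaternion.LookRotation(direction);
        }
    }
}
```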

Setting up the Goal Visual

The Objects Name

The U.I on the right contains the name of the object in the level that the player is looking for. This is to help new players find what they need a bit more easily. The name is contained in a Text Mesh Pro component that is updated with the new item's name once the player achieves the current goal.

objectname.PNG

 

Visual of Object

Underneath the name of the object is a holographic visual of the item itself that the player is looking for. We added this as we found from testing that players didn't know which items in the environment were quest oriented. Just like the name of the object, the visual hologram changes as the object they're looking for changes. This is handled inside the UI script.

objectVisual.PNG

Overall Assembly of U.I

The different sections of the U.I were built up using several images on a canvas that use third party holographic textures. From there it was just a matter of customizing the canvas to fit the form I wanted. The U.I had to project from the watch, and since the watch is on the player's hand I wanted it to be as small as possible while still being comfortable for the player to read. The left and right sides of the U.I are also rotated so the U.I mimics the same concept as a curved TV, giving the player's eyes a clear view of each section. With that done the U.I is complete.

You can see an image below of the canvas object with its child images used to make up the U.I.

You can also view a video above of the U.I complete and in the game.

U.I Assembly


Door Functionality & Robot Pod Design:

Developing the Doors

Considering the Doors

We added the idea of malfunctioning doors, which have a specific percent chance of switching from their current state, during the paper prototype phase of development. The addition of these doors was to add a bit of a shifting environment into the level and make things a bit more hectic.

From testing this feature during the paper prototype we found that it added a new level of excitement to the game as the players plans could get changed at any moment forcing them to adapt. They could also have their only escape route from the robot suddenly blocked or even be lucky enough to get through a doorway right before it closes saving them from the robot. 

Since players seemed to really enjoy this feature we added it into the game.

Door Functionality

To create the doors' functionality I built a script that uses a timer and a random number generator to determine whether or not the door should open or close. The doors start off in a closed state with the locked hologram displaying and a red light underneath. The random number generator generates a number between 0 and 100. If that number is above 45 then the door changes its state, giving the doors a 55% chance of changing state, which through testing seemed to be a good number. When changing state the door script calls an IEnumerator that animates the door over a specified duration using an animation curve evaluated over that duration.

Depending on whether the door is opening or closing it will flicker the light between red and green, landing on red if closing and green if opening. The hologram on the door will also change between two different styles depending on whether it is open or closed. The player also has the ability to enter a code on a keypad to cause the doors to open or close immediately.
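To illustrate the timer, random check, and curve-driven IEnumerator described above, here is a simplified sketch. The check interval, slide distance, and sliding motion are assumptions; the actual door script also handles the lights, holograms, and keypad.

```csharp
using System.Collections;
using UnityEngine;

// Hedged sketch of the malfunctioning door logic; values are placeholders.
public class MalfunctioningDoor : MonoBehaviour
{
    [SerializeField] private float checkInterval = 5f;      // Seconds between random state checks
    [SerializeField] private float animationDuration = 1f;  // How long the door takes to slide
    [SerializeField] private float slideDistance = 3f;      // How far the door moves when opening
    [SerializeField] private AnimationCurve slideCurve = AnimationCurve.EaseInOut(0, 0, 1, 1);

    private bool isOpen = false;
    private bool isAnimating = false;
    private float timer = 0f;
    private Vector3 closedPosition;

    private void Start()
    {
        closedPosition = transform.position;
    }

    private void Update()
    {
        timer += Time.deltaTime;
        if (timer >= checkInterval && !isAnimating)
        {
            timer = 0f;
            // Number between 0 and 100; above 45 means the door changes state (55% chance).
            if (Random.Range(0f, 100f) > 45f)
            {
                StartCoroutine(AnimateDoor(!isOpen));
            }
        }
    }

    private IEnumerator AnimateDoor(bool opening)
    {
        isAnimating = true;
        Vector3 start = transform.position;
        Vector3 end = closedPosition + (opening ? Vector3.up * slideDistance : Vector3.zero);

        for (float t = 0f; t < animationDuration; t += Time.deltaTime)
        {
            // Evaluate the curve over the normalised duration to ease the movement.
            float eased = slideCurve.Evaluate(t / animationDuration);
            transform.position = Vector3.Lerp(start, end, eased);
            yield return null;
        }

        transform.position = end;
        isOpen = opening;
        isAnimating = false;
    }
}
```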

You can see a video of the door functionality in action below.

Door Script

Door Functionality in Action

Download DoorFuncVideo.mp4 [5.71MB]

Developing the Robot Pod Event

Purpose of the Robot Pod Event

The robot pod event was designed as a flashier introduction for the robot into the level. The robot was originally planned to be lying lifeless on the ground and to get up once the player had come within a certain distance of it. However this seemed a little anti-climactic so I came up with a new way of introducing the robot. This event has a charging pod straight in front of the player at the end of the hall. Once the player reaches the end of this hallway and enters the lab area, a klaxon lights up above the pod to grab the player's attention and the pod door begins to lower, revealing the robot in a powered down state inside.

After the pod door has lowered completely a holographic head flickers to life on the robot and it begins to power up and leave the pod. This signals the end of the robot pod event and introduces the robot as a new threat in the level.

Final Robot Pod Event

Download Robot power Up.mp4 [3.97MB]

Setting up the event trigger

The start trigger for this event is a child of the Robot Pod itself. It is a box collider set at the end of the hallway, just as it opens up into the labs, which you can see in the image below. Once the player has entered this trigger box it calls the OpenRobotPodDoor() function inside the RobotPod script, which begins the animation of the door lowering. It also sets the triggerKlaxon variable to true before the animation starts so that the klaxon script knows to activate.

You can see an image of the trigger box below.

You can also find the script for the trigger below with comments within.
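As a rough illustration of how that trigger could look, here is a sketch that includes a stub of the RobotPod members it talks to. The member names follow the description above, but the bodies are placeholders rather than the real project code, and in the project each class would live in its own file.

```csharp
using UnityEngine;

// Placeholder stub of the RobotPod script referenced by the trigger below.
public class RobotPod : MonoBehaviour
{
    [HideInInspector] public bool triggerKlaxon = false; // Read by the klaxon script

    public void OpenRobotPodDoor()
    {
        // In the real script this starts the curve-driven door-lowering animation.
    }
}

// Sketch of the start trigger: fire once when the player enters the box collider.
public class RobotPodTrigger : MonoBehaviour
{
    [SerializeField] private RobotPod robotPod; // Reference to the parent pod's script

    private bool hasFired = false;

    private void OnTriggerEnter(Collider other)
    {
        // Only react to the player, and only once.
        if (hasFired || !other.CompareTag("Player")) return;
        hasFired = true;

        robotPod.triggerKlaxon = true; // Let the klaxon script know to activate
        robotPod.OpenRobotPodDoor();   // Begin the pod door lowering animation
    }
}
```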

Event Triggers in Scene


Event Trigger Script

The Robot Pod

The Robot Pod houses the robot and has a script attached to it that controls the majority of the functionality for this event. The triggers hold references to this script and supply it with information when something has entered the trigger box. The RobotPod script tells the klaxon when it should start up and also controls functionality such as the door's animation process and the animation curve specifying how it should move. This event also triggers a boolean inside the Robot's Idle state, which initiates its power up event where it can move into its other states discussed in the A.I page.

You can find an image of the animation curve below.

You can also find the RobotPod script below with comments within.

Pod Door Animation Curve


Robot Pod Script

The Player's Quest

Player Quest Summary

In The Thinking City the goal of the player is to escape the labs that they are stuck in. This is done through several steps. The first step is for the player to restore power to the facility. This is done by acquiring 3 different keycards that are scattered throughout the map. The locations of the cards are:

  1. In the power room on a table next to a body.
  2. In the robotics lab, suspended by a robotic arm.
  3. In the gravity room, floating in the air.

The first keycard is easy for the player to acquire as it's just sitting on a table next to the power console. This is to help the player learn what it is they are looking for. The next two cards can only be acquired by overcoming VR interactable events. The first is moving a robotic arm with controls that the player can grab. The second is destroying an anti-gravity generator by throwing an object at it, resulting in the keycard dropping into reach of the player.

From there the player can then slot the 3 keycards into the power generator and pull the lever attached to it to trigger the power up event. This gives the player access to the second part of their goal, which is to open the bulkhead doors to the exit. Once the lever on the generator is pulled, the security room in the center of the map opens up, giving the player access to the door lever.

The final goal for the player is to pull the lever in security and then make their way out through the exit. As the player leaves the map this triggers the ending event for our level which is a teaser for the rest of our game. 

I will go into more detail on each of these steps below.

Player Quest Walkthrough

The Robot Arm VR Event

The Robot Arm VR event was built by Oisin Murphy. You can read more about its development in his portfolio. 

The Anti-Gravity Generator VR Event

The Gravity Room was originally a Medical Bay back in the concept phase when the player had weapons. This room was an area where the player could go to heal. After deciding to remove weapons from our game to make our robot a bigger threat, we brainstormed new ideas for the Medical Bay and came up with the Gravity Room. In this room there is a generator in the middle of the roof that is altering gravity, causing everything to float. One of these items is a floating keycard that the player needs to acquire for the power generator.

In this VR event the player needs to throw one of the floating items in the room into the generator. If the item is thrown hard enough it will break the generator causing gravity to be restored and all the items to fall to the floor allowing the player to pick up the keycard. 

Below are the steps involved in building the Gravity Room VR event.

Anti-Gravity Room Finished

Download GravityRoomEvent.mp4 [37.43MB]

Creating the Anti-Gravity Mechanic

There are several steps that I took in order to set up the mechanics for the gravity room. The first was setting up the objects so that they could work in the anti-gravity environment, the second was setting up the room so that it would apply and un-apply gravity to objects, and the final part was adding the generator for the player to throw an object at to disable the gravity. Below I have split the room's functionality into three different sections which I will explain in more detail.

 

Setting up objects to work with anti-gravity

In order to have objects affected by in game physics such as gravity, an object needs two main components. The first component is a Rigidbody. This component allows you to specify details such as the object's mass, whether or not it's kinematic, and if it should use gravity or not. The second component needed is a collider. This can be a sphere collider, box collider, or any of the other available ones. You can see in the image below that the barrel has these two components on it, which allows it to react to gravity or the lack of it. I also put an ObjectSlightRotation script on some of the objects so that they slowly rotate while in the weightless environment.

You can find the script for the slight rotation below this section
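As a flavour of what that script might look like, here is a small sketch; the rotation speed and the random axis choice are assumptions for illustration.

```csharp
using UnityEngine;

// Possible form of the ObjectSlightRotation script: slowly tumble a weightless object.
public class ObjectSlightRotation : MonoBehaviour
{
    [SerializeField] private float rotationSpeed = 10f; // Degrees per second

    private Vector3 rotationAxis;

    private void Start()
    {
        // Pick a random axis so each floating object tumbles differently.
        rotationAxis = Random.onUnitSphere;
    }

    private void Update()
    {
        transform.Rotate(rotationAxis, rotationSpeed * Time.deltaTime, Space.World);
    }
}
```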

gravObjectSetup.PNG

 
Setting up anti-gravity room

The Anti-Gravity room itself has a collider the size of the whole room, which you can see in the image below. This collider is responsible for objects either being affected by gravity or not. When an object with a Rigidbody and a collider enters this area it becomes unaffected by gravity. Similarly, once the generator is broken this collider finds all objects inside that are currently unaffected by gravity and changes them so that they are affected by gravity again.

You can see in the image below there is a script attached to the collider. This script is responsible for picking up objects that enter, applying the anti-gravity conditions to them, and removing those conditions once gravity is restored.

You can find the GravityRoom script below with comments within.
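Here is a hedged sketch of that behaviour: objects entering the trigger lose gravity, and a RestoreGravity call re-applies it to everything tracked. The list tracking and the method name are assumptions for illustration.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of the room-sized trigger that toggles gravity on objects inside it.
public class GravityRoom : MonoBehaviour
{
    private readonly List<Rigidbody> floatingObjects = new List<Rigidbody>();

    private void OnTriggerEnter(Collider other)
    {
        Rigidbody body = other.attachedRigidbody;
        if (body != null && body.useGravity)
        {
            // Objects entering the room become weightless.
            body.useGravity = false;
            floatingObjects.Add(body);
        }
    }

    // Called by the generator script once it has been broken by a thrown object.
    public void RestoreGravity()
    {
        foreach (Rigidbody body in floatingObjects)
        {
            if (body != null)
            {
                body.useGravity = true;
            }
        }
        floatingObjects.Clear();
    }
}
```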

gravRoomScript.PNG

 
Setting up anti-gravity generator

The Anti-Gravity generator is located in the middle of the room on the roof. Similarly, it has a Rigidbody and a SphereCollider attached so that it can collide with objects. The generator also has a script attached to it that detects if the player has hit it with one of the objects in the room. If they have, it signals the previous script controlling the trigger box for the room, which applies gravity to all objects inside the room.

You can find the GravGenerator script below with comments within.
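The sketch below shows roughly how that collision check could work, tying into the GravityRoom sketch above. The minimum impact speed is an assumption for illustration.

```csharp
using UnityEngine;

// Sketch of the generator's collision check against thrown objects.
public class GravGenerator : MonoBehaviour
{
    [SerializeField] private GravityRoom gravityRoom;     // The room trigger script sketched above
    [SerializeField] private float breakImpactSpeed = 2f; // Thrown objects must hit at least this fast

    private bool isBroken = false;

    private void OnCollisionEnter(Collision collision)
    {
        if (isBroken) return;

        // Only break if a loose object hits the generator hard enough.
        if (collision.rigidbody != null &&
            collision.relativeVelocity.magnitude >= breakImpactSpeed)
        {
            isBroken = true;
            gravityRoom.RestoreGravity(); // Tell the room to re-apply gravity to everything inside
        }
    }
}
```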

gravGenCollision.PNG

The Power Generator Event

Once the player has acquired the 3 different cards from around the map, the next step in the quest is for the player to slot the cards into the card slots on the power generator and then pull the power lever right next to the slots. Doing this will trigger the next phase of the level where the player will be able to access the security room in the center of the level. There are a few steps involved in designing this event so I will break it up below.

You can see the finished event in the video below.

Power Generator Event

Download PowerUpEvent.mp4 [18.62MB]

Creating the Power Up Event

There are several steps that I took in order to set up the mechanics for the power room event. The first was validating the cards that the player slots into the power generator. The second step was to use these validations along with the power generator lever so that the power generator would only trigger the power up event if all the cards were in place and the lever was pulled down fully. The final step was having the security room in the center of the level open up so that the player can access it, as well as having several special effects and shutters release for flair.

 

Slotting and Validating Keycards

The first phase of this event involves the player slotting 3 different keycards into the slots on the power generator as shown in the final video. These slots are color coded to accept only a card of the matching color. Once the player has put the card into the correctly colored slot it will snap into place and a light above the slot will light up to indicate that the card is in the right place. Only once all these cards are in place can the player pull the lever to restore the power.

You can find the coreCardInput script below, which passes the color of the card being inserted to the corePowerUpQuest script for validation.
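For illustration, here is a simplified sketch of a colour-matched card slot. The Keycard component, the CardColour enum, and the NotifyCardSlotted call are assumptions rather than the exact project code; the quest script side is sketched in the lever section below.

```csharp
using UnityEngine;

// Illustrative colour enum and keycard component, both placeholders.
public enum CardColour { Red, Green, Blue }

public class Keycard : MonoBehaviour
{
    public CardColour colour; // Colour of this keycard
}

// Sketch of a card slot in the spirit of coreCardInput.
public class CoreCardInput : MonoBehaviour
{
    [SerializeField] private CardColour slotColour;         // Colour this slot accepts
    [SerializeField] private GameObject slotLight;          // Indicator light above the slot
    [SerializeField] private CorePowerUpQuest powerUpQuest; // Quest script tracking slotted cards

    private void OnTriggerEnter(Collider other)
    {
        Keycard card = other.GetComponent<Keycard>();
        if (card == null || card.colour != slotColour) return;

        // Snap the card into the slot and light up the indicator.
        card.transform.SetParent(transform);
        card.transform.localPosition = Vector3.zero;
        card.transform.localRotation = Quaternion.identity;
        slotLight.SetActive(true);

        // Report the correctly coloured card to the quest script for validation.
        powerUpQuest.NotifyCardSlotted(slotColour);
    }
}
```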

cardValidation.PNG

 

Pulling the power lever

Once all cards have been slotted into place the player can then trigger the power up event by pulling the lever on the right hand side of the power generator. When this lever is pulled the corePowerUpQuest script validates that the player has acquired all 3 cards and put them in the correct slots. If they haven't, then pulling the lever will do nothing. However if the player has gathered and slotted all cards in the correct place then pulling the lever will cause the power up event to trigger. The script will activate the klaxon above the generator as well as its sound effect. It will also start the lightning and fusion effect. Trish's voice line guiding the player to the next objective will also be activated.
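A rough sketch of that validation, pairing with the card slot sketch above, is shown below. The UnityEvent standing in for the klaxon, lightning/fusion effect, and Trish's voice line is an assumption for illustration.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Sketch of the quest-side validation in the spirit of corePowerUpQuest.
public class CorePowerUpQuest : MonoBehaviour
{
    [SerializeField] private int cardsRequired = 3;
    [SerializeField] private UnityEvent onPowerRestored; // Klaxon, effects, voice line, etc.

    private int cardsSlotted = 0;
    private bool powerRestored = false;

    // Called by each card slot when a correctly coloured card is inserted.
    public void NotifyCardSlotted(CardColour colour)
    {
        cardsSlotted++;
    }

    // Called when the player pulls the lever fully down.
    public void OnLeverPulled()
    {
        // Pulling the lever does nothing unless every card is in place.
        if (powerRestored || cardsSlotted < cardsRequired) return;

        powerRestored = true;
        onPowerRestored.Invoke();
    }
}
```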

powerLeverPull.PNG

 

Security Room Lockdown Lift

Restoring the power makes the security room in the center of the map accessible to the player. Once the player walks out of the room, a trigger box will set off the shutter raise event. This causes all the shutters that were previously over the windows of the security room to rise. It uses an animation curve similar to the other events like the Robot Pod. This also unlocks the doors to the security room, which then function the same as every other door in the level.

shutterTrigger.PNG

Card Input Script

The Game End Event

After lifting the lockdown the player then has access to the security room. In this room the player is able to pull a final lever that opens the exit bulkhead doors. The lever triggers an animation for the doors causing them to open. Once the player proceeds through the exit door it seals behind them trapping them in a dark hallway with a blue light at the end illuminating the windows. When the player heads far enough down this hallway they become frozen in place as a giant reptilian eye bathes the hallway in a yellow glow while it stares at the player.

Games End

Download GameEndEvent.mp4 [41.78MB]

Creating the Games End Event

The end of the game has two main steps. The first step is having the player pull the security lever, which will open the exit doors. The second step is the eye opening event at the very end once the player has left the main play area. I will split each one up and describe how they are done below.

 

Opening the Exit Doors

The first part of the game's end involves the player opening the exit doors. To do this the player needs to pull the lever inside the security room that's in the center of the map. Once the lever is pulled it calls a function inside the bulkheadControl script. This function activates an animation that opens the exit doors and also activates the klaxons above the door. At this point the player can exit the main part of the level.

exitdoor.PNG

 

Ending Event

Once the player makes it past the exit doors there is a trigger box that activates an animation closing the doors behind them, trapping them in a dark hallway with a faint blue light coming in from the windows at the end. As the player heads down the hallway they enter another trigger box that stops the player's movement and triggers an animation that causes a giant eye on the other side of the glass to open and illuminate the hall in a bright yellow light.
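As an illustration, the final trigger could look something like the sketch below; disabling the player's movement component and the "OpenEye" animator trigger name are assumptions, not the actual project code.

```csharp
using UnityEngine;

// Sketch of the ending trigger: freeze the player and start the eye animation.
public class EndingEyeTrigger : MonoBehaviour
{
    [SerializeField] private Animator eyeAnimator;          // Animator on the giant eye
    [SerializeField] private MonoBehaviour playerMovement;  // Movement script to disable

    private bool hasFired = false;

    private void OnTriggerEnter(Collider other)
    {
        if (hasFired || !other.CompareTag("Player")) return;
        hasFired = true;

        playerMovement.enabled = false;    // Freeze the player in place
        eyeAnimator.SetTrigger("OpenEye"); // Play the eye opening animation
    }
}
```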

eyeEnd.PNG

End Event Scripts

FPS Controller Development Process (non-VR):

Setup for FPS Controller

Before writing any script for controlling movement, some things have to be set up in Unity first. The first thing added is an empty game object. This will contain all of the info for the FPS Controller. I then added a Unity component called "Character Controller", which we will need as it handles the player's movement and collision capsule. After that a camera is added as a child of this GameObject. This will be the eyes and ears of the player as it has an "Audio Listener" component by default.

The final setup step is setting the empty GameObject's "Tag" and "Layer" to "Player". This is important as it identifies the object as the player for everything in the game. With all that set up it's time to write the script. You can see some images below of the FPS Controller setup.

FPS Controller Setup Image

Details

FPS Controller Movement & Look Rotation

Now that the setup for the First Person Controller is complete, I can start writing the script that will control the player's movement. This script will hold information about the "Character Controller" such as its height for crouching. It will also contain the walk speed, run speed, jump speed, crouch speed, stick to ground force, a multiplier for gravity and a step lengthen value for running. I will explain a little about these various values below.

  1. Walk speed : Float variable which assigns the player's walking speed
  2. Run speed : Float variable which assigns the player's running speed
  3. Jump speed : Float variable controlling the force at which the player moves up while jumping
  4. Crouch speed : Float variable controlling the speed at which the crouch completes
  5. Stick to ground force : Keeps the player pressed to the ground, as the character controller is not affected by normal in game physics
  6. Gravity multiplier : Alters gravity's effect on the player
  7. Run step lengthen : Alters the step rate while running

 

The look rotation controlled by mouse movement is achieved by creating an instance of Unity's MouseLook script within my FPSController script. This allows the player to look in all directions around them.
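Before the full script, here is a heavily stripped-down sketch of the movement portion, assuming Unity's CharacterController component. Crouching, the head bob, and the MouseLook instance are omitted, and the values shown are placeholders.

```csharp
using UnityEngine;

// Simplified movement sketch built around the CharacterController component.
[RequireComponent(typeof(CharacterController))]
public class SimpleFPSMovement : MonoBehaviour
{
    [SerializeField] private float walkSpeed = 5f;
    [SerializeField] private float runSpeed = 9f;
    [SerializeField] private float jumpSpeed = 8f;
    [SerializeField] private float stickToGroundForce = 10f;
    [SerializeField] private float gravityMultiplier = 2f;

    private CharacterController controller;
    private Vector3 moveDirection = Vector3.zero;

    private void Start()
    {
        controller = GetComponent<CharacterController>();
    }

    private void Update()
    {
        // Read keyboard input and pick walk or run speed.
        float horizontal = Input.GetAxis("Horizontal");
        float vertical = Input.GetAxis("Vertical");
        float speed = Input.GetKey(KeyCode.LeftShift) ? runSpeed : walkSpeed;

        Vector3 desiredMove = transform.forward * vertical + transform.right * horizontal;

        if (controller.isGrounded)
        {
            moveDirection.x = desiredMove.x * speed;
            moveDirection.z = desiredMove.z * speed;
            // Keep the controller pressed onto the ground so it doesn't skip down slopes.
            moveDirection.y = -stickToGroundForce;

            if (Input.GetButtonDown("Jump"))
            {
                moveDirection.y = jumpSpeed;
            }
        }
        else
        {
            // Apply amplified gravity while airborne.
            moveDirection += Physics.gravity * gravityMultiplier * Time.deltaTime;
        }

        controller.Move(moveDirection * Time.deltaTime);
    }
}
```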

Below you can find the script of the FPS Controller, with plenty of comments inside giving a more in depth description of the code's functionality, and a video of the script in action.

FPS Controller Head Bob

It is common for FPS games to have a slight head bobbing feature when the player is walking. While this isn't exactly realistic to how a person moves, it feels a lot more natural than the camera remaining at a fixed height with no movement at all. The head bob feature is achieved by using an animation curve along with a horizontal and vertical multiplier. As the player moves, the animation curve is played and the camera adjusts its position based on the value of the curve at the time.
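A cut-down sketch of a curve-controlled bob is shown below; the bob speed, multiplier, and the shape of the default curve are placeholder assumptions rather than the values used in the project.

```csharp
using UnityEngine;

// Simplified curve-driven head bob: dip the camera and return to rest each cycle.
public class SimpleHeadBob : MonoBehaviour
{
    [SerializeField] private Transform cameraTransform;        // Camera child of the controller
    [SerializeField] private AnimationCurve bobCurve =
        new AnimationCurve(new Keyframe(0f, 0f), new Keyframe(0.5f, 1f), new Keyframe(1f, 0f));
    [SerializeField] private float verticalMultiplier = 0.08f; // How far the camera dips
    [SerializeField] private float bobSpeed = 1.5f;            // How fast one bob cycle plays

    private Vector3 restPosition;
    private float cycleTime;

    private void Start()
    {
        restPosition = cameraTransform.localPosition;
    }

    // Called by the movement script each frame with the player's current speed;
    // the real CurveControlledHeadBob also applies a horizontal multiplier.
    public void UpdateBob(float playerSpeed)
    {
        // Advance through the curve faster when the player moves faster.
        cycleTime = (cycleTime + playerSpeed * bobSpeed * Time.deltaTime) % 1f;

        float offset = bobCurve.Evaluate(cycleTime) * verticalMultiplier;
        cameraTransform.localPosition = restPosition + Vector3.down * offset;
    }
}
```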

You can see the CurveControlledHeadBob class inside the FPS Controller script with comments.

You can also see a video of the head bob below. It is subtle so it doesn't take from the game, but its values can easily be adjusted.

FPS Controller Script

FPS Controller Video

Download FpsControllerVideo.mp4 [10.47MB]

FPS Controller Headbob

Download HeadBobVideo.mp4 [13.55MB]

Halt on FPS Controller Development

Since we have now decided to build our game for VR, we have halted development on the FPS Controller and Oisin has started building the VR Player. However, the FPS Controller is being repurposed as a debugging tool for the members of our team that don't have VR headsets, so that they can still traverse the level and set off events.