Final Year Project Portfolio - Creating VR Interactions in Unity with SteamVR

In order to give the player something to do in our game environment, I needed to create a number of VR interactions using the tools supplied by Valve's SteamVR Unity Plugin, as well as some home-made functionality of my own.

This section will go over the creation of some of the VR interactions present in The Thinking City - ones that push the player forward in their objective, ones that communicate story, and some that are purely cosmetic.


Creating the Robotic Arm Interaction

One of the major VR interactions that we wanted to include in our game was a large Robotic Arm. In the game's fiction, the arm was used by the scientists who worked in the environment prior to the player's arrival to transport objects and carry out precise tooling procedures, making it of great importance. Gameplay-wise, we wanted the Robotic Arm to be something the player interacts with directly - moving it around with a control panel in VR, receiving some story context from it, and using it to advance through the game's main mission. This section will go over the different things I needed to do in order to create an on-rail system that the player must move a Robotic Arm across in order to receive a Keycard & advance through the game.

 

Robotic Arm Control Panel & Basic Movement

My first task when creating the Robotic Arm was to make working joysticks for the control panel. The first step was to bring the joystick model into Blender and rig it so that its base would deform when moved, as seen here:

2020-01-28_17-45-16.gif

Next was allowing the player to grab onto and move the joystick around in VR. This was a tough and time-consuming task, and one that I had to compromise on with my team, as I had struggled to create a joystick system that allowed the player to move an object in a spherical motion. We concluded that we should forgo that approach and instead create two joysticks - one that rotates forward / back, and one that rotates left / right. This turned out to be a nice compromise, as it allowed me to create a system where the right joystick controls object translation and the left joystick controls object rotation.

To drive the Robotic Arm, I programmed a system that measures how far a joystick has rotated away from its resting state, normalizes that deviation against its max / min angle, and passes the resulting value into the Robotic Arm's movement / rotation script as a speed modifier. For now, I restricted the Robotic Arm to a 10 unit circle around its original position so that it could not leave the map. I then moved on to making the Robotic Arm toggle ragdoll physics to create the "drop" effect. Using this tutorial, I managed to figure out how to implement the ragdoll system for the Robotic Arm, and then hooked it up to the Control Panel's button for activation. The first iteration of the Robotic Arm, showing the 10 unit radius movement and ragdoll physics, is demonstrated in the video below at the 1:30 mark.
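To illustrate the idea, here is a simplified sketch of the joystick-to-speed logic in plain Unity code. It is not the exact project script - the field names (joystick, armRoot, maxAngle and so on) are placeholders of my own, and the real arm is driven through its own movement / rotation script rather than a simple forward translation.

using UnityEngine;

// Simplified sketch: converts a joystick's deviation from rest into a
// normalized [-1, 1] value and uses it as a speed modifier, keeping the arm
// within a 10 unit radius of its starting position.
public class JoystickArmDriver : MonoBehaviour
{
    public Transform joystick;     // the joystick pivot (placeholder name)
    public Transform armRoot;      // the Robotic Arm's root transform (placeholder)
    public float maxAngle = 45f;   // max deflection either side of rest
    public float maxSpeed = 2f;    // arm speed at full deflection
    public float maxRadius = 10f;  // the 10 unit limit mentioned above

    private Vector3 armStartPosition;

    void Start()
    {
        armStartPosition = armRoot.position;
    }

    void Update()
    {
        // Signed deviation from the resting (zero) rotation around local X.
        float angle = Mathf.DeltaAngle(0f, joystick.localEulerAngles.x);

        // Normalize against the max / min angle and clamp to [-1, 1].
        float normalized = Mathf.Clamp(angle / maxAngle, -1f, 1f);

        // Use the normalized value as a speed modifier.
        Vector3 newPosition = armRoot.position +
            armRoot.forward * (normalized * maxSpeed * Time.deltaTime);

        // Keep the arm inside the allowed circle around its original position.
        Vector3 offset = newPosition - armStartPosition;
        if (offset.magnitude > maxRadius)
            newPosition = armStartPosition + offset.normalized * maxRadius;

        armRoot.position = newPosition;
    }
}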

 

 

Posing the Robotic Arm

In order to make this VR mechanic more eye-catching and "important" to the player once they come into contact with it, we needed to pose it in a way that better represents the arm's capabilities / purpose, as well as add the collectible Key-Card into its grip (in some form) to signal to the player that they need to interact with it.

To pose the Robotic Arm, the mesh needed to be rigged and re-exported in a new position, a task that was done in Blender. It did not take much time, but care was taken when rigging to make sure that all aspects of the mesh were accounted for and could therefore be posed in a believable way for use within the game. Below is a demonstration of the rig's capabilities:

2020-04-03_13-52-02.gif

Once this was finished, the arm was posed in an arced position as if it was stuck holding up an object. This was exported with the armature and imported into Unity. Once it had been hooked up with all of the scripts and dependencies that the previous arm had, that was it. The differences between the two are stark, as the new version much more accurately communicates the purpose and form of the Robotic Arm.

Before Posing (left), After Posing (right)

Unity_2020-02-17_19-52-37.png          Unity_2020-04-03_14-16-37.png

 

Changes to the Robotic Arm Control Panel

In order to move the Robotic Arm, the player is required to use one of the two control panels found within the Robotics Lab - each giving the player a different view of the arm during movement. The control panel features two joysticks for moving the arm, and a button off to the right that allows the player to drop the arm. There were a few issues with this setup. Firstly, after moving the joysticks, they would not return to their default position when the player let go. Secondly, during user testing we discovered that players would accidentally press the release button. These two issues needed to be addressed in order to make the interaction more usable and intuitive for the player.

To fix the issue with the joysticks, I added code to the SteamVR CircularDrive script which makes a joystick return to its initial orientation once the player's hand has stopped grabbing it. This worked quite well, except in some cases where the joystick would overcompensate and move slightly past its initial orientation in the opposite direction. I was unable to figure out what was causing this, since the initial orientation was initialized on object start and looked as if it was never overwritten during play.
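The actual change lives inside the modified CircularDrive script, but the behaviour can be shown as a standalone sketch. The isGrabbed flag below is a stand-in for however the drive reports that the hand has let go, and the names are placeholders of my own.

using UnityEngine;

// Standalone sketch of the return-to-rest behaviour added to the joysticks.
public class JoystickReturnToRest : MonoBehaviour
{
    public bool isGrabbed;          // true while a hand drives the joystick (stand-in flag)
    public float returnSpeed = 5f;  // how quickly the stick eases back

    private Quaternion restRotation;

    void Start()
    {
        // Record the initial orientation once, on object start.
        restRotation = transform.localRotation;
    }

    void Update()
    {
        // When no hand is grabbing, ease the joystick back towards its rest pose.
        if (!isGrabbed)
        {
            transform.localRotation = Quaternion.Slerp(
                transform.localRotation, restRotation, returnSpeed * Time.deltaTime);
        }
    }
}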

To fix an issue with the control panel where players would accidentally press the "crash" button when using the joysticks, the layout of the control panel was altered. I moved the Left and Right joysticks further towards the edges of the panel face, moved the release button into the center, and scaled the keyboard down and duplicated it onto the opposite side of the button.

Before Changes (left), After Changes (right)

Unity_2020-02-17_20-00-33.png         Unity_2020-03-19_10-31-02.png

 

Adding Purpose to the Robotic Arm

Now that the Robotic Arm was posed, it was time to add purpose to the arm. It needed a means for the player to retrieve a Key-Card, and therefore a compelling reason for the player to interact with the mechanic. To do this, I sourced a skeleton model from CDTrader, re-textured it in Substance Painter, posed it on the robotic arm, and set up ragdoll physics on its rig. In order to have it move in tandem with the robotic arm, the ragdoll physics are initially disabled, but enabled when the player "drops" the arm. This allows the skeleton to drop with the arm and release the Key-Card from its mouth, letting the player pick it up from the floor. The finished positioning and setup of the skeleton can be seen here:

Unity_2020-03-28_14-21-18.png

Lighting was added to bring emphasis to the robotic arm, in an attempt to guide the player's eye to it as the main focus of the room.

 

Creating the Robotic Arm Rail System

One of the key aspects of this Robotic Arm mechanic was that it would be constrained to an overhead rail. This was to bring an air of believability and logic to the system in terms of the environment, as well as to give the player a way to gauge what they are supposed to do and to see their progress whilst performing the task. The first thing I needed to do was map out a path for the rail system to follow in the level. We had the individual rail parts pre-made, so we just needed to decide on a layout that made the most sense.

The layout that we chose was a rather simple one - it merely loops around some of the props in the Robotics Lab and features some faux broken segments (hence the pieces of broken rail on the floor) to give the impression that the system has degraded over time.

CLIPStudioPaint_2020-04-03_17-03-00.png

The major hurdle when creating this system was getting the robotic arm to conform to a path. However, with the help of a very detailed YouTube video on the topic, this became significantly easier to implement. I was able to create a path system and build the backbone of the following system, and then edited the Robotic Arm controls script so that the arm moves forward and backward along the path. This meant moving the Robotic Arm's movement mechanics into a dedicated PathFollower script.

The implementation of the rail system brought some changes to the Robotic Arm's structure. Initially, the Robotic Arm used a conventional top-to-bottom object hierarchy from root to last child. This needed to change so that the rotation of the main body of the Robotic Arm could be maintained, and so that the player's use of the Rotation joystick wouldn't be immediately overwritten. To do this, I separated the Robotic Arm object into two pieces - one containing the ceiling rail attachment (PathFollower), and one containing the main arm (Robotic_Arm_Posed). The PathFollower object is controlled by the control panel's translation joystick, taking in movement input and using it to move up / down the path. The Robotic_Arm_Posed object, however, takes the input of the control panel's rotation joystick and is rotated to whatever degree the player desires.

CLIPStudioPaint_2020-04-27_16-30-23.png

To make these two objects work in tandem, I added some code to the Robotic_Arm_Posed object that translates it to a point directly under the PathFollower at any given moment. This gives off the illusion that the two objects are attached.
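A minimal sketch of that follow behaviour is shown here; it assumes the arm simply tracks the rail attachment's position every frame, and the field names are placeholders rather than the project's exact ones.

using UnityEngine;

// Sketch: keeps the main arm (Robotic_Arm_Posed) directly beneath the
// PathFollower object while leaving the arm's own rotation untouched.
public class FollowUnderRail : MonoBehaviour
{
    public Transform pathFollower;   // the ceiling rail attachment
    public float hangOffset = 0.5f;  // vertical gap between rail and arm root (assumed)

    void LateUpdate()
    {
        // Translate to a point directly under the PathFollower every frame.
        transform.position = pathFollower.position + Vector3.down * hangOffset;

        // Rotation is deliberately left alone so the rotation joystick's
        // input is never overwritten by the rail.
    }
}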

Before demonstrating, I made one final improvement to the rail system. Per my request, Dylan modeled and textured some straight and curved lights to be attached to the existing rails, as well as rounded end pieces for the entire system, each with an emissive component. This was done in order to implement a system in which the path illuminates depending on how much progress has been made along it, as well as to give positive feedback to the player when using this mechanic.

This was achieved by associating each rail piece with a different way-point along the path, allowing the game to swap the piece between its emissive and non-emissive materials once that way-point had been reached. The result of these improvements can be seen in the demonstration video below.
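As a rough illustration of that association, the setup could look like the sketch below. The class and field names are my own, and the real project may store and swap the materials differently.

using UnityEngine;

// Sketch: each rail piece is paired with a way-point index and swapped to its
// emissive material once the arm's progress passes that way-point.
public class RailIllumination : MonoBehaviour
{
    [System.Serializable]
    public class RailPiece
    {
        public Renderer pieceRenderer;  // the rail / light mesh
        public int waypointIndex;       // way-point that lights this piece
        public Material litMaterial;    // emissive variant
        public Material unlitMaterial;  // non-emissive variant
    }

    public RailPiece[] pieces;

    // Called by the path-following code whenever a new way-point is reached.
    public void OnWaypointReached(int currentWaypoint)
    {
        foreach (RailPiece piece in pieces)
        {
            bool lit = currentWaypoint >= piece.waypointIndex;
            piece.pieceRenderer.sharedMaterial = lit ? piece.litMaterial : piece.unlitMaterial;
        }
    }
}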

This incarnation of the Robotic Arm reflects the state it was in by the end of the Production release.

 

Before the Robotic Arm could be deemed "complete", I added some code that stopped the robotic arm at the end of the track, disabling movement after it had successfully hit the end of the rail. This was to avoid any issues where the arm would deviate from the end in the time between the player letting go of the joysticks and pressing the release button.

With this addition, the Robotic Arm VR Interaction was completed (at least in terms of major additions).

 

 

Reflection / Weaknesses of the Robotic Arm

Evaluating the Robotic Arm as it is now, we can see a number of weaknesses that will inform what we may plan to change / improve as a result of user testing. These are:

  • Control Panel Joysticks return to a close-to-zero position on release after being brought to their max rotation - causing unwanted, slow movement in the arm.
  • Arm Rotation controls serve no important gameplay purpose.
  • Having two control panels in the Robotics Lab may confuse the player OR cause them to disregard them as merely props.
  • The path illuminates piece-by-piece rather than granularly - it looks a bit primitive.

Going forward, we aim to use these insights, as well as any gleaned from user testing, to improve & refine the Robotic Arm VR interaction. However, whether any, all, or only some of these problems get addressed depends heavily on the team's priorities and the time remaining in this project.

 

 

 

Creating Minor Interactions

Story Holotable

Up to this point, our game was quite devoid of story context beyond what could be deduced from the environment. To add a little more context, I got to work creating a VR interaction wherein the player can use one of the holotables to learn some of the G.S.A. Tainted robot's backstory.

The holotable was made by appropriating the desk asset that already existed in the game and adding new functionality to it. I started by making the table's control panel buttons work with the SteamVR Hover Button script, then assigned a function to each button's OnButtonDown callback that spawns a given object, with a holographic shader applied, at a transform above the table. This was pretty straightforward to do in code, leaving the meat of the work in deciding what story should be told and how.
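A single button of the holotable could be wired up roughly as in the sketch below. This assumes the Hover Button exposes its onButtonDown event as a UnityEvent that passes the interacting Hand (as it did in the plugin version we used); the prefab and transform names are placeholders of my own.

using UnityEngine;
using Valve.VR.InteractionSystem;

// Sketch of one holotable button: when the HoverButton fires onButtonDown,
// a prefab with a holographic material is spawned above the table,
// replacing whatever was projected before.
public class HolotableButton : MonoBehaviour
{
    public HoverButton hoverButton;    // SteamVR Interaction System button
    public GameObject hologramPrefab;  // the story object tied to this button
    public Transform projectionPoint;  // empty transform above the table

    private static GameObject currentHologram;

    void Start()
    {
        hoverButton.onButtonDown.AddListener(OnButtonDown);
    }

    private void OnButtonDown(Hand hand)
    {
        // Remove whatever the table is currently projecting.
        if (currentHologram != null)
            Destroy(currentHologram);

        // Project this button's object above the table.
        currentHologram = Instantiate(hologramPrefab,
            projectionPoint.position, projectionPoint.rotation);
    }
}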

After talking to my team, we decided that it would be nice to give context as to how the robot character became infected in the first place, so I got to work trying to make that happen with the holotable. My approach was limited by both time and what I had to work with. Using the table, I felt that an object-by-object, storyboard-style approach to the storytelling would be appropriate, tying a 3D object to each of the table's number pad buttons so that, when pressed in sequence, they give the player a timeline of how the robot got corrupted. One idea I had was to make the table turn against you in some way at the end, and I decided to do this by having the last object in the story be an eye (similar to the robot's) that looks at the player's face for the remainder of the game. I needed to model some items to fill in gaps in the story as necessary, as we did not have all of the assets needed to communicate it.

A demonstration of the VR Holotable story can be seen below:

The story goes as such:

  1. Old Robot - A previous iteration of the Service Bot that the scientists once used.
  2. Robot Head
  3. Trashcan - A call to scrap that model of the robot and create an improved model.
  4. Blueprint - A plan for a new & improved Service Bot
  5. Tools - They began working on building the Robot from the blueprints.
  6. Tentacles in Cup - At the same time, the scientists were experimenting with a biological mass extracted from the City's Power Source.
  7. Tentacle Spill - An accident occurred during the experiments, causing the specimens to escape.
  8. Tainted / Infected Robot - The new Robot that the scientists had been building got taken over by the specimens & corrupted.
  9. Error - Every button (except the final one) now results in an Error; something has gone wrong.
  10. Eye - Follows you and disables all other buttons - You've learned too much.

Some improvements were made as a result of brief user testing, where we learned that players had trouble making reliable contact with the control panel buttons. To mitigate this issue, I decided to move the control panel to the face of the table. This gives the player a better view of the buttons and helps them orient their hands when pressing them, as the overhead panel no longer obscures them.

Unity_2020-04-27_17-05-02.png

Internally, we noticed immediately that this allowed for less strenuous use of the holotable, and in fact improved its visibility in the scene, as the change visually differentiated it from the other "standard" holotables. One more improvement I made was to have the table project the first object in the story from the start, in order to telegraph to the player that there is something worth checking out about this prop.

On reflection, I am unsure about the effectiveness of this interaction as a story element. Its means of telling the story is primitive and leaves a lot to the imagination. It's possible that the objects shown are too ambiguous and come across as meaning nothing.

Here are some ways I think it could be improved with further development:

  • Animated holotable projections - have characters / objects depicted move in a way that relates the story being told.
  • Transitions between objects - holographic distortion effect.
  • Sound effects & voices - communicate story through emotional response brought on through sound & small voice clips.

 

 

Door Hacking with Keypads

One of the things we included in our original paper prototype was the ability for the player to hack open / closed doors, as a means to get through temperamental, randomly cycling doors by sacrificing their safety (movement). From testing that prototype at the beginning of the project, we found that players felt the challenge of escaping the robot persisted but the game felt much fairer overall, as they were no longer at the mercy of malfunctioning doors. Calling back to this, I decided to create that system in Unity, but in a way that requires the player to use a code found on noticeboards scattered around the level.

Unity_2020-04-27_17-45-36.png

An in-game noticeboard where the player could note the game's door code

To create the Keypad system, I ripped one of the keypads that Dylan had made for the storage shelves and made the buttons functional by attaching the SteamVR Hover Button script. The idea was to allow the player to input a 4 digit code, press enter, and then open the door if the code is correct. This wasn't hard to implement, as we just needed to store button presses (each of which was assigned a number from 1 - 4) and check them against a hard-coded value to see if the player got the combination right. If correct, we give the player feedback through a green light. If incorrect, we show them a red light and reset the input, allowing them to try again. I used UI text objects to communicate the values held by each button and to keep track of the numbers the player had input. Below is a video showing my implementation of this system:
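Alongside the video, here is a simplified sketch of the keypad logic just described. The field names and the hard-coded code are placeholders, and the call into Jason's DoorFunctionality script is only indicated in a comment.

using UnityEngine;
using UnityEngine.UI;

// Sketch of the keypad: digits are appended as buttons are pressed, and on
// Enter the input is compared against a hard-coded door code.
public class KeypadHack : MonoBehaviour
{
    public string correctCode = "1234";  // placeholder code
    public Text display;                 // UI text showing the current input
    public GameObject greenLight;        // feedback for a correct code
    public GameObject redLight;          // feedback for an incorrect code

    private string currentInput = "";

    // Wired to each number button's HoverButton event in the inspector,
    // with the button's digit passed as a constant argument.
    public void PressDigit(int digit)
    {
        if (currentInput.Length < correctCode.Length)
            currentInput += digit.ToString();
        display.text = currentInput;
    }

    // Wired to the Enter button.
    public void PressEnter()
    {
        bool correct = currentInput == correctCode;
        greenLight.SetActive(correct);
        redLight.SetActive(!correct);

        if (correct)
        {
            // The real project opens the door here via DoorFunctionality and
            // resets its transition countdown so it doesn't immediately close.
            Debug.Log("Code accepted - door opens");
        }
        else
        {
            // Wrong code: reset the input and let the player try again.
            currentInput = "";
            display.text = "";
        }
    }
}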

Initially, there was an intermittent issue where the door would open after being hacked but then immediately close because of the door's transition timer in Jason's DoorFunctionality script. To fix this, I had the hacking function reset the countdown on the door. It also took a bit of self-testing to get the buttons right. Originally, the keypad was a little frustrating - each button needed to be pushed in significantly to register as pressed, and players would often accidentally press the wrong button. I changed the push and release thresholds on each button's HoverButton script to make it snappier to use, so the button no longer needs to be pressed the whole way in.

There are still a few ways in which this mechanic could be improved, namely disabling a button's input once it has been pressed, until the keypad resets, to stop codes with repeated digits (e.g. 1123) being input.

We look to gather feedback on this mechanic from user testing & trials, and to use people's experiences to figure out ways to make it a more responsive and intuitive system.

 

 

 

Creating Distance Grab Mechanics

One of the major issues that plagued our game was that objects that fell onto the ground were impossible to pick up with certain VR setups. We noticed this once we were playing with the retrieval of Keycards. If a Keycard was dropped by the player, its small size coupled with the floor height of the player's VR setup made it impossible to retrieve in almost every case (sometimes we got lucky). This essentially rendered the game unbeatable if the player dropped a key item.

In order to remove this problem completely, I began work on a system that allows the player to grab certain items from a distance by pointing at them with their hand and grabbing.

 

Background Research / Prep

Before I could really get into the meat of developing this mechanic, I needed to get a greater understanding of how SteamVR's Hand script works, particularly how it allows the player to grab. Thankfully, this wasn't a very hard thing to figure out. The Hand script contains a function called AttachObject(), which takes a GameObject and a GrabTypes enum as parameters. Using this, I could attach any object to the player's hand from anywhere in the level, or in response to any event in Unity.
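For illustration, a call of the kind described looks like this; the hand reference and the object being attached are placeholders.

using UnityEngine;
using Valve.VR.InteractionSystem;

// Sketch: attach an arbitrary object to a SteamVR hand from code.
public class AttachExample : MonoBehaviour
{
    public Hand hand;              // a SteamVR Interaction System hand
    public GameObject someObject;  // any object in the level (placeholder)

    public void AttachNow()
    {
        // Snaps the object into the hand as if it had been grabbed with the grip.
        hand.AttachObject(someObject, GrabTypes.Grip);
    }
}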

I came up with a plan to use this to create my own Distance Grab mechanic, in which the player's hand would continuously cast a ray to detect throwable objects (or "pickups"). If the ray detected one of these objects and the player grabbed using the controller for that hand, the detected object would appear immediately in their hand. It's a simple approach that lacks the nuance seen in other games like Half-Life: Alyx, in which the player must point at an object, pull it towards them with a gesture, then catch it mid-air. Given the time constraints of the college environment and my inexperience with VR overall, I thought my plan was a solid, achievable task, with room to add flair later if time allowed.

 

Raycasting from Player Hand

The first thing I needed to implement was a raycasting system that shoots from the player's hand and detects objects that can be picked up. Initially, I attempted to shoot raycasts from the tip of the player's index finger, as this gives the player the ability to use the pointing hand animation to aim at and grab the objects they want with accuracy. However, this did not work as intended once implemented: since the finger curls and extends according to its hand animations, the raycast was unreliable.

Instead, I decided to edit the VR hand prefabs (left & right) supplied by the SteamVR Unity Package to include an empty GameObject to act as the origin of the raycasts. These were placed just above the back of the player's hands.

Unity_2020-04-27_20-06-47.pngUnity_2020-04-27_20-07-04.png

It took a few attempts to get these raycasting points aligned. As you can see in the pictures above, it appears as if these transforms are greatly misaligned, but they're not. During startup, the hands spawn in a completely different orientation - a process that I do not fully understand. So how did I align them properly? I positioned the raycasting points during play. I loaded into my test scene and ensured the hands were present, then used the Unity Editor's pause functionality to change the position of the raycasting points in-game. Once I had them positioned above the back of the player's hands and pointing outwards from the player along their Z axis, I copied the transforms and stopped running the scene. In each hand's prefab editor, I then pasted the transform values into the GrabPointerRight and GrabPointerLeft GameObjects respectively. This set their positions correctly for when the player loads into the game.

Next, I moved on to scripting. I attached a script called Oisin_GrabDistance (to differentiate it from the OVR distance grab script) to both of the player's hands. It constantly gets the position of the new raycasting point and casts a ray from that point on the given hand. If the ray hits something with a collider, the script gets the root object of the collider and stores it as the currently selected object. If this newly selected object has a Throwable component (meaning it's an object the player can pick up & throw), the script checks whether the player has pressed the Grab action on that hand's controller. If they have, it calls that hand's AttachObject() function, bringing the object to their hand.

Below is the Update method of the script I have just described, implementing the functionality mentioned above.

devenv_2020-04-27_21-27-52.png

This is the finished code for the Distance Grab, with some functionality that I will mention in a later section.
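For anyone who cannot make out the screenshot, here is a simplified sketch of the same logic. It is not the exact code from the image - the field names (grabPointer, grabDistance) are placeholders, and the highlighting step is omitted here.

using UnityEngine;
using Valve.VR.InteractionSystem;

// Simplified sketch of the distance grab described above.
public class DistanceGrabSketch : MonoBehaviour
{
    public Hand hand;              // the SteamVR hand this script sits on
    public Transform grabPointer;  // the raycast origin above the back of the hand
    public float grabDistance = 10f;

    private GameObject selectedObject;

    void Update()
    {
        selectedObject = null;

        // Cast a ray forward from the pointer on the back of the hand.
        RaycastHit hit;
        if (Physics.Raycast(grabPointer.position, grabPointer.forward, out hit, grabDistance))
        {
            // Work with the root of whatever collider was hit.
            GameObject root = hit.collider.transform.root.gameObject;

            // Only objects with a Throwable component count as pickups.
            if (root.GetComponent<Throwable>() != null)
                selectedObject = root;
        }

        // If something is selected and the player squeezes grab, attach it.
        GrabTypes grab = hand.GetGrabStarting();
        if (selectedObject != null && grab != GrabTypes.None)
        {
            hand.AttachObject(selectedObject, grab);
        }
    }
}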

 

Here's a demonstration of the distance grab in action at this point - as you can see there are some issues with inconsistent grabbing, and grabbing with the left hand would not work at all.

To fix these issues, I needed to adjust the rotation and position of the raycasting points again, because they were registering the hand object itself as the selected object thanks to the raised nature of the player's knuckles. Once I made this adjustment it worked just fine.

 

Highlighting

As you may have noticed from the video and code shown above, I implemented a system that highlights objects when they are selected by the distance grabber's raycast. This was done firstly by creating a script called Highlighter, which swaps an object's base material with a highlighted material (and vice versa) when its HighlightObject() function is called. HighlightObject() takes a boolean as input to dictate the direction of the change.
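A sketch of how such a component might look is shown here; it is not the project's exact Highlighter script, and it assumes one Renderer per mesh.

using UnityEngine;

// Sketch of the Highlighter: sits on a textured mesh of a throwable object and
// swaps between its base material and a highlighted (glow) material.
public class Highlighter : MonoBehaviour
{
    public Material highlightMaterial;  // glow variant, assigned in the editor

    private Renderer meshRenderer;
    private Material baseMaterial;

    void Awake()
    {
        meshRenderer = GetComponent<Renderer>();
        baseMaterial = meshRenderer.sharedMaterial;
    }

    // true = show the glow material, false = restore the base material.
    public void HighlightObject(bool highlighted)
    {
        meshRenderer.sharedMaterial = highlighted ? highlightMaterial : baseMaterial;
    }
}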

The Highlighter script is added to every textured mesh on a given throwable object, and requires a Highlight material to be passed to it from the editor. This meant that for each material a throwable object had, I needed to create a copy of that material with a glow shader. To make these highlighted materials, I cloned the shader & materials that Dylan had made for the game's Keycards, then changed the highlight colour to dark blue to differentiate it from any other holographic effect in the game. I then created materials for each object using this new shader, and passed their base material's maps into it to create the finished highlighted variant - an example of this can be seen here:

Unity_2020-04-27_22-14-48.png

A highlighted hammer object from the Gravity Lab

Once this was done, I needed to create functionality in my GrabDistance script that would loop through all of the meshes in the selected object and activate OR deactivate the highlight. This can be seen in the image below:

devenv_2020-04-27_21-29-11.png
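In essence, the functionality boils down to something like the sketch below, which switches every Highlighter found under the selected object on or off. The method name and its static form are my own choices for the sake of a self-contained example.

using UnityEngine;

// Sketch of the highlight toggling used by the distance grab.
public static class HighlightToggler
{
    public static void SetHighlight(GameObject selectedObject, bool highlighted)
    {
        if (selectedObject == null)
            return;

        // Loop through every mesh on the selected object carrying a Highlighter.
        foreach (Highlighter highlighter in selectedObject.GetComponentsInChildren<Highlighter>())
        {
            highlighter.HighlightObject(highlighted);
        }
    }
}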

One of the difficulties I had when testing the script, however, was that, because I was getting the root GameObject of the raycast's detected collider, the selected object would often be an empty GameObject that we used to organize objects in the scene hierarchy. This meant that objects nested this way would neither highlight nor grab when the player pointed at them. To rectify this, I needed to take every object in the level that I intended to be distance grabbable and pull it out of any parent GameObject.

Here's a demonstration of how the Distance Grab mechanic worked by the end:

*** ADD VIDEO ***

 

 

Reflection

In the end, I managed to create a simple distance grab mechanic that resolves the issue where an object on the ground would be impossible to pick up, thus breaking the game. However, despite the problem being solved, I can't help but notice a few ways in which I could improve it if I were to iterate on it over time.

Regarding the Distance Grabbing, here are some of the things I reckon I could improve:

  • Provide haptic feedback (i.e. controller vibration) to the player once a distance-grabbable object is highlighted.
  • Adding sound effects for when the player selects and grabs an object.
  • Instead of teleporting the object into the player's hand, lerp the object's position to the hand's position over time (see the sketch below).
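A quick sketch of how that last improvement might work is shown here - a coroutine that chases the hand for a fraction of a second before completing the grab. The duration and names are placeholders, and the coroutine would need to be started with StartCoroutine from the grab code.

using System.Collections;
using UnityEngine;
using Valve.VR.InteractionSystem;

// Sketch of the suggested improvement: instead of teleporting the object,
// move it toward the hand over a short duration, then attach it.
public class LerpGrab : MonoBehaviour
{
    public float pullDuration = 0.25f;  // assumed pull time

    public IEnumerator PullToHand(Hand hand, GameObject target)
    {
        Vector3 start = target.transform.position;
        float elapsed = 0f;

        while (elapsed < pullDuration)
        {
            elapsed += Time.deltaTime;
            float t = Mathf.Clamp01(elapsed / pullDuration);

            // Chase the hand's current position so the pull tracks a moving hand.
            target.transform.position = Vector3.Lerp(start, hand.transform.position, t);
            yield return null;
        }

        // Finish the grab exactly as the instant version does.
        hand.AttachObject(target, GrabTypes.Grip);
    }
}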

With these improvements, I would hope that the interaction becomes a bit more satisfying for the player to use during play.

When it comes to the highlighting of objects, there's one thing that I would improve. The shader that was made to allow materials to glow blue was incapable of interpreting the base map's alpha channel. This meant that glass materials, as seen in the image below, would be rendered completely opaque.

Unity_2020-04-27_22-41-34.png

 

 

 

 

 

Creating Different Hand Poses for more Immersive Interactions

For each and every object in the game that the player is able to pick up with their hands, I needed to create a set of hand poses to give the player visual feedback communicating that their hands have realistically grabbed onto the object. To do this, I first looked into the SteamVR Unity Plugin documentation to read about the Skeleton Poser component. Thankfully, as well as their written documentation, they had a link to a handy video tutorial that explains how to create hand poses on an object for both the Left and Right hands.

Following the documentation & tutorial, I created a variety of different hand poses for the different props that the player can pick up in our game; below are some examples:

Unity_2020-04-27_23-23-01.png  Unity_2020-04-27_23-23-33.png 

Unity_2020-04-27_23-23-54.png  Unity_2020-04-27_23-24-18.png   Unity_2020-04-27_23-24-36.png

Unity_2020-04-27_23-25-00.png

Note - Hand poses exist for both the Right and Left hands for each object the player can pick up.

 

One of the things I could not create hand poses for, much to my dismay, was the levers and joysticks that exist within the level. This was because the SteamVR CircularDrive script that I used to create them actively disables skeleton poses on the object. This was frustrating enough for me to begin prototyping my own lever system, which eventually got scrapped in favor of more pressing tasks. As a result, I had to settle for the unrealistic, offset attachment poses for the game's levers.

References & Resources

Valvesoftware.github.io, 2020. Skeleton Poser | SteamVR Unity Plugin. [online] Available at: <https://valvesoftware.github.io/steamvr_unity_plugin/tutorials/Skeleton-Poser.html>

 

Zulubo Productions, 2019. SteamVR Hand Animation - Skeleton Poser Tutorial. [video] Published 30 Jan 2019. Available at: <https://youtu.be/a9EBILq2ep8>