Researching VR Movement Methods
After several weeks of work on our game concept, voices from outside the team repeatedly suggested that we look into implementing Virtual Reality. This got us thinking: could VR enhance our game concept? And if so, how much of the groundwork we had already laid down would be disturbed? Despite my team's stance that our game should remain a traditional desktop experience, we agreed to give VR a fair chance and run some tests.
After some deliberation we identified that, for our game concept, player movement within the environment would be the most important factor in deciding whether to pursue VR. As such, Jason & I volunteered to take charge of this process: Jason took on immediate development of a VR controller in Unity, while I took on the main background research - particularly into methods of movement. Once the research was complete, I joined in on building the stage of the demo in Unity.
Before looking into the different movement methods available in VR, I created a set of requirements for a VR movement method that took into consideration our already established game concept. These were:
- Must allow for the player to move freely in the environment - to the same extent as traditionally able.
- Must allow the player to vary the speed at which they move (e.g. sneak & run).
- Movement must not hamper the player’s ability to look around.
- Must not require additional hardware from the user.
My concern about accessibility risks was prompted by a short post on ablegamers.org detailing a number of grievances AJ Ryan had with both VR hardware and software (linked in the bibliography below). With this in mind, I also made a list of things to look out for in terms of accessibility & risk to the end user. These are:
- Does the movement method pose the risk of inducing motion sickness? And how can we mitigate that?
- How accessible will the movement method be? Could someone use it if bound to a wheelchair?
- Could the player use a specialized or standard controller if they cannot use the motion controls?
I used the recent announcement of Half-Life: Alyx as a starting point, as Valve's official website lists the VR play styles that will be supported (see below).
I then began informing myself about the different movement methods that have already been explored in the VR space, aiming to catalogue each method's freedom of movement, risks to the player, accessibility, and the impact it would have on our game concept. One paper in particular - (Costas, 2019) - contributed greatly to this research by comparing the most popular movement techniques. Using this paper and a number of other sources (listed in the bibliography below), I created the following table detailing those qualities for each movement method.
| Method | Freedom of Movement | Risks | Accessibility | Gameplay Impact |
| --- | --- | --- | --- | --- |
| Smooth artificial locomotion | Player moves using the analogue stick or keyboard input and looks around with the VR headset. | Motion sickness - dizziness or nausea due to the disconnect between physical and visual movement ("vestibular mismatch"). | Allows the player to play sitting or standing. Requires an analogue stick (on a traditional or motion controller) or keyboard input, and may be capable of taking specialized controller input. Easy to master thanks to its familiarity to traditional movement methods (Costas, 2019). | The player could move much as in traditional games, but due to the risk of motion sickness the speed at which they can run will need to be examined. |
| On-rail locomotion | Player moves along a rail, similar to on-rail shooters, except head movement remains free. The rail itself is controlled by the developer, letting them script the pacing and choreography of scenes much more tightly. | Potential motion sickness if handled poorly. A slow / medium pace should mitigate these concerns. | The player need not worry about controlling their movement and only has to focus on what enters their range - less physically demanding. | Since the game is designed around the player making decisions and reacting to the environment in real time, forcing them into a predetermined pace and scripted gameplay would be detrimental, likely hampering the overall experience and replayability. |
| Instant teleportation | The player places a pointer in the environment from their current position, then teleports, blinks, or shifts towards it when activated. This allows them to traverse the environment in "steps". | The unnatural movement can break immersion, and the constant visual jumps reportedly induce tiredness in some players (Costas, 2019). Players must also think through and plan every action, requiring a degree of mental presence. | The player must be able to articulate a controller to place a pointer into the environment. Those sensitive to eye strain may find the experience discomforting over time. | The emphasis on planning your positioning, and the potentially jarring movement, would break up our game concept a little too much. Escaping the robots could become a usability issue, where a stressed player may be unable to use this technique effectively; alternatively, this method may drain all tension and flow from the game. |
| Walking in place | The player moves in place - simulating walking - to traverse the environment, free to move at their own pace depending on the rate of their physical movement. | The player is required to stand. Physical movement is heavily relied upon, raising the risk of fatigue & exhaustion. Fear of collisions and motion sickness, though a highly immersive method (Costas, 2019). | Requires a tracker on the player's limbs (e.g. a Vive controller), which may be unavailable to most users. Reliance on physical movement excludes people with motor difficulties or wheelchair users from play. | While this method lets the player move at different speeds around the environment, the emphasis on physical movement would exclude an unknown portion of potential players from ever having the chance to experience the game. |
Once this was complete, we as a team convened with our mentors and concluded that smooth artificial locomotion - using a motion controller joystick, keyboard input, or gamepad input - would be the only viable method of movement we could apply to our game without significant design changes. With this decision made, we began preparing to design and conduct a VR experiment to test whether this method would be worth adopting over traditional desktop gameplay.
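The core of such a controller is simple. Below is a minimal, hypothetical Unity sketch of smooth artificial locomotion (Jason's actual controller differs; the axis and button names here are Unity's legacy Input Manager defaults, and a real VR project would read the thumbstick and trigger through its VR input system instead):

```csharp
using UnityEngine;

// Hypothetical sketch: the headset drives the look direction while an
// analogue stick (or keyboard axes) drives planar movement.
public class SmoothLocomotion : MonoBehaviour
{
    public Transform head;          // the VR camera / headset transform
    public float walkSpeed = 1.5f;  // metres per second
    public float runSpeed = 3.0f;   // used while the trigger is held

    void Update()
    {
        // Read stick input ("Horizontal"/"Vertical" are the default axes).
        Vector2 input = new Vector2(Input.GetAxis("Horizontal"),
                                    Input.GetAxis("Vertical"));

        // Move relative to where the player is looking, flattened onto the
        // ground plane so tilting the head doesn't fly the player upward.
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 right   = Vector3.ProjectOnPlane(head.right,   Vector3.up).normalized;

        // Holding the (assumed) trigger button switches walk -> run.
        float speed = Input.GetButton("Fire1") ? runSpeed : walkSpeed;
        transform.position += (forward * input.y + right * input.x) * speed * Time.deltaTime;
    }
}
```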
Designing & Building the Test
Test Design
I began to design a test to measure the extent of motion sickness within our game environment using our already decided intended gameplay mechanics. We would use the game environment that we had developed up until this point as the stage of this test, as this would be the closest possible thing to the finished game's setting that we have.
The test would be conducted as follows:
- 4-10 participants fitting our game concept's target demographic would be taken into our VR test environment.
- They would be tasked with traversing the environment by following a set of Waypoints placed in the following positions:
- The participant will be able to move themselves through the environment using a motion controller's analogue stick, with the option to run by holding the same controller's trigger input.
- The participant will run the course twice: the first run at the player's walking speed, the second at the player's running speed.
- After this is complete, the participant will be taken out of the VR environment and asked to provide feedback on their experience verbally and through filling out an anonymous questionnaire.
Our aim with this test is to quantify the severity of motion sickness at both walking and running speeds, as our game concept relies heavily on the player being pursued by patrolling robots.
Building the test
Building the test was a task split between Jason and me. Jason took responsibility for creating the player controller in Unity so that participants could actually play the level, whilst I prepared the environment & Waypoint system.
To prepare the environment, I set up basic lighting to approximate the feel our game intends to have. Additionally, I applied a temporary green material to the walls to make the environment slightly more compelling, as the level was completely untextured prior. I also added some animated Service Bot models that I had created earlier in the semester, to give the participant a sense of their presence in the VR environment. Here is what it looked like after this process:
Next I needed to implement a Waypoint system that met the following criteria:
- Waypoint stands out from the environment, and is easy for the participant to identify.
- Appears one at a time in the fixed positions matching its proposed course.
- Once the player collides with the Waypoint's collision mesh, that Waypoint would be destroyed and the next would appear.
To create this system, I first created a number of spheres - sans the Mesh Renderer component - to act as anchors for Waypoint positions in the environment. These were added to a list of transforms in my VRWaypointSystem script, which would be cycled through one at a time as the player reached each active Waypoint.
Then I created a simple Waypoint prefab to be instantiated once an existing Waypoint is collided with by an object tagged "Player". To test this before Jason had finished the VR Player Controller, I used a box primitive with the tag "Player". The finished Waypoint system consisted of the aforementioned objects and about 20 lines of code.
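Those ~20 lines looked roughly like the following sketch (class and member names approximate my description above; this is a reconstruction, not the exact script):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Cycles through anchor positions, spawning one Waypoint at a time.
public class VRWaypointSystem : MonoBehaviour
{
    public List<Transform> anchors;     // sphere anchors placed around the level
    public GameObject waypointPrefab;   // visible marker with a trigger collider
    private int index = 0;

    void Start() { SpawnNext(); }

    void SpawnNext()
    {
        if (index < anchors.Count)
            Instantiate(waypointPrefab, anchors[index].position, Quaternion.identity);
    }

    // Called by a Waypoint once the "Player"-tagged object touches its trigger.
    public void WaypointReached()
    {
        index++;
        SpawnNext();
    }
}

// Attached to the Waypoint prefab: destroy self and advance on player contact.
public class Waypoint : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            FindObjectOfType<VRWaypointSystem>().WaypointReached();
            Destroy(gameObject);
        }
    }
}
```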
Here's what the finished Waypoint system being tested with a cube object looked like:
Once this was finished, I pushed it to our team's GitHub repository and waited for Jason to finish the VR Player Controller with smooth locomotion and running mechanics. Once this was done & implemented, we were ready to start running the test to see if VR was viable for our game.
Running the Test
Creating the Google Forms Feedback Sheet
Jason and I needed to create a Google Forms sheet to consolidate and organize the test results, so we had to come up with a number of questions that would serve our investigation. Since we were investigating the impact of motion sickness for our game concept, we needed questions that probed the participant's experience in the game. We came up with the following questions:
- Have you ever used VR prior to this test? -> this was to see if there was a correlation between new users of VR and susceptibility to motion sickness.
- From 1 - 10, what degree of motion sickness did you feel whilst Walking around our scene? -> to deduce severity of motion sickness when walking.
- From 1 - 10, what degree of motion sickness did you feel whilst Running around our scene? -> to deduce severity of motion sickness when running.
- How fast is the running speed in your opinion? -> to gauge if our running speed was too fast, too slow, or fine as is.
- Would you rather see this game as a VR experience, or as a traditional desktop / console experience? -> after explaining the game concept, we wanted to know if they thought it would be better in VR.
(Extra Qs - added because of the situation regarding the original robot design - discussed in my Creating Art page)
- From 1 - 10, how 'scary' do the robots look in your opinion? -> to get feedback on how suitable the old robots were in the environment.
- Honest thoughts on the Service Robot -> written opinion on how the robot looks, moves, and seems in-game.
Once we felt our questions covered everything we wished to investigate, we were prepared to move on to recruiting participants for the test.
Running the Test
The first thing we needed to do before running the test was to gather participants. To do this, Jason and I went to the area outside Starbucks at DkIT's PJ Carroll's building and approached groups of students to ask if they'd like to participate. When doing so, there were a handful of things we needed to mention to ensure they were fully informed. These were:
- Privacy - that no personal information would be collected beyond what was needed to organize participation, and that it would be scrubbed afterwards.
- How the test would be run - length, equipment, testing criteria.
- The risks - Motion sickness, visual vestibular mismatch, etc.
Once we had them fully informed, we took willing participants' names and college email addresses and told them that we would email them when and where we would run the test.
We ended up getting 6 participants total, and organized to run the test on a day that we had free - Thursday 19/12/2019 from 9am - 4pm, in one of the consultation rooms in the Carroll's building. Once we had booked the room, we sent emails out to participants with info regarding the location of the test and a link to a Google Sheets page for them to input their names beside the time slot that suited them most.
*We scrubbed the names from this form as soon as testing ceased*
On that Thursday, we ran the test as planned. The test followed the structure as detailed below:
- The player would don the Oculus headset and load into our Test Scene
- The player would then maneuver themselves around the course twice, once at a regular walking speed, and again at a running speed.
- Afterwards, we would ask them to give brief feedback on the props and environment as well as our (old) character that would loop through its animations within the level.
- Once the test was completed, they were asked to document their thoughts on the experience on a Google Forms survey.
Results
Each test went by without a hitch - thankfully nobody got sick, and we got a small set of results as seen here. This set of results is what Google Forms put together when looking at all submissions:
From this set of results, we could glean a few things about our implementation of VR movement and how the participants experienced it. Most participants had used VR previously, and felt middling levels of motion sickness when walking but strong motion sickness when running. We can also see that our movement speeds need adjusting, as the running speed borders on being too fast. These two findings likely correlate, and can be addressed by lowering the running speed and applying techniques like field-of-view (FOV) reduction during movement.
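To illustrate the FOV-reduction idea (after the Fernandes & Feiner paper in the bibliography), here is a hypothetical Unity sketch. In a real HMD the headset dictates the rendered FOV, so shipping implementations typically draw a vignette overlay rather than changing Camera.fieldOfView; this sketch only shows the principle of narrowing the view while the player moves artificially:

```csharp
using UnityEngine;

// Hypothetical comfort aid: narrow the visible field while the player is
// moving artificially, and restore it smoothly when they stop.
public class MovementVignette : MonoBehaviour
{
    public Camera cam;              // the player's camera
    public float restingFov = 90f;  // full view when stationary
    public float movingFov = 65f;   // narrowed view while moving (tunable)
    public float lerpSpeed = 4f;    // how quickly the view narrows/restores

    private Vector3 lastPosition;

    void Start()
    {
        lastPosition = transform.position;
    }

    void LateUpdate()
    {
        // Treat any artificial displacement this frame as "moving".
        bool moving = (transform.position - lastPosition).sqrMagnitude > 1e-6f;
        float target = moving ? movingFov : restingFov;

        // Ease towards the target so the change is subtle rather than jarring.
        cam.fieldOfView = Mathf.Lerp(cam.fieldOfView, target, lerpSpeed * Time.deltaTime);
        lastPosition = transform.position;
    }
}
```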
Despite the motion sickness, all participants agreed that our game concept would work best in VR, giving us a compelling argument for pursuing that mode of gameplay - with the stipulation that we work on reducing motion sickness as much as possible.
In terms of the Robot character (the extra addition to the form), the consensus was that it looked "cute" and nonthreatening, which confirmed some of our doubts about that Robot's suitability in our game.
Conclusion / Reflection
As a result of this series of tests, we did indeed decide to go with VR for our project, pledging as well to make some fundamental changes to our concept so it fits, and makes the most of, VR-specific interaction.
Looking back though, there are a few things I would have done differently in order to tidy up the test process and to get some more information. These are:
- Separate Robot feedback into its own Google Form, so as not to take away from the focus of the test.
- Inquire as to how fluid the game felt - frame rate, graphics quality, environment scale, etc.
- Extend the test to investigate how people interact with the environment, documenting how people explore environments in VR & how it differs from regular PC gameplay.
Overall though, the test went smoothly and we learned a lot about setting up these kinds of trials. We also got the justification we needed to begin shifting our game to VR, which was our goal going into this test.
Bibliography
Costas Boletsis and Jarl Erik Cedergren, “VR Locomotion in the New Era of Virtual Reality: An Empirical Comparison of Prevalent Techniques,” Advances in Human-Computer Interaction, vol. 2019, Article ID 7420781, 15 pages, 2019. https://doi.org/10.1155/2019/7420781.
A.S. Fernandes, S.K. Feiner, "Combating VR sickness through subtle dynamic field-of-view modification", Proc. 3DUI, 2016. https://ieeexplore.ieee.org/document/7460053.
- Supplement: https://www.youtube.com/watch?v=lHzCmfuJYa4
Ryan, A. (n.d.). Thoughts on Accessibility Issues with VR. Retrieved from ablegamers.org: https://ablegamers.org/thoughts-on-accessibility-and-vr/
Kim, W. Kim, S. Ahn, J. Kim, S. Lee, "Virtual Reality Sickness Predictor: Analysis of Visual-Vestibular Conflict and VR Contents", QoMEX, 2018.
Carbotte, K. (2019). Do the Locomotion: The 19 Ways You Walk and Run in VR Games. [online] Tom's Hardware. Available at: https://www.tomshardware.com/picturestory/807-virtual-reality-games-locomotion-methods.html.
Nelius, J. (2019). How to combat VR sickness. [online] pcgamer. Available at: https://www.pcgamer.com/how-to-combat-vr-sickness/.