My Team and "The Thinking City"
Team Robo-Kraken
Team Robo-Kraken consists of four members: Jason Lynch, Dylan Reilly, Eoghan Maguire, and me.
Each of us fulfills a different role on this project, but we also work in tandem with one another with the goal of making "The Thinking City" a successful final year project.
More information about the team can be accessed via our Team Mahara page - listed here.
"The Thinking City"
The Thinking City is a game born from a series of pitches within the team, the original concept coming from Jason Lynch. From there, the idea evolved as the entire team began to work with it.
The Thinking City is designed to be a vertical slice of a conceptually larger single player game, intended to reflect the small-scoped game demos distributed up to and during the 2000s. You play as a silent protagonist who enters the secret underground city after receiving a mysterious invitation. On arrival, you find that the entire technological paradise has been abandoned, with no signs of life. You are greeted by an A.I. character named T.R.I.S.H. who guides you on a journey to unravel the mystery of the city's downfall, and to restore prosperity to the now desolate underground metropolis. The game is primarily focused on delivering tone and atmosphere in its environment, as well as storytelling through its non-player character dialogue and environmental detail.
The vertical slice to be developed for this project sees the player trapped in a lab environment patrolled by seemingly possessed Service Robots that are alerted by any noticeable movement or sound. This part of the game takes place during the player's excursion into the Science District - one of the four districts of the city. During this slice of gameplay, the player will need to carefully and tactically explore the lab environment, with the goal of restoring full power to the facility and opening the door to escape.
The player has little in the way of defending themselves. Instead, they will need to manipulate the attention of their robot adversaries by throwing distractions (of which the player has only a limited supply) and by moving carefully around the environment, hiding behind objects or slipping through malfunctioning doors that constantly open and close.
Aesthetically, there are a number of games that have influenced what we intend for the game's "look". Games such as Doom (2016), BioShock, the map "Five" from Call of Duty: Black Ops, Portal 2, and Alien: Isolation, among others, have been heavily influential in terms of our core aesthetics & tone. We aim to create an unsettling, dilapidated, yet futuristic environment for the player to roam, one that reinforces the game's implied narrative.
This game is geared towards a rather specific sort of player - one who is represented by this persona that we designed:
A game trailer that I edited for this project can be seen below:
Relevance of My Roles to the Project
How are my roles relevant to the project, you may ask? Well...
Lead Artist
The role of Artist is incredibly important to this project thanks to its heavy emphasis on tone and atmosphere. To construct a convincing sense of the game's world, we need to design characters and an environment that feel part of the same continuity / setting, that reflect the narrative of the game, that fulfill their intended purposes, and that are informed by examples from inside and outside the game industry. To fulfill this role, I shall create art concepts and designs digitally, using a Wacom Intuos 5 drawing tablet and a fully licensed copy of Clip Studio Paint.
Character Design, Modelling, & Texturing
For the non-player and player characters to exist in a 3D context at all, they must be modelled using 3D modelling software. Without character models, our game would lose a massive portion of itself - the enemy robots that patrol the environment as well as the player's presence within the game (the hand / arm model that performs actions during animated sequences). I shall be creating all of these models, as well as fully rigging them, within Blender 2.8, adapting my 3DS Max knowledge to take advantage of the new Blender update's improved functionality & usability.
Each model I create will need to be textured and presented in a way that fits the general aesthetic of the game's environment and communicates the character's attributes. To create textures and materials, and to apply them to any characters and/or objects I make, I shall be availing of Allegorithmic's Substance Painter - on a student license.
Character Animation
Characters in games need animation to communicate a number of things: their personality, their function, and their condition and context within a game's world. Animation lends a character and their actions a sense of believability, pulling the player in and invoking an emotional response that reflects the character's intended design specifications & purpose. Without animations, the game could easily seem flat and lose all of its intended impact. Animations will be created in Blender's animation workflow, then brought over to Unity for splicing, testing, & implementation into its animation state machine system.
VR Lead
At a certain point, we decided to re-gear our game concept towards VR. As a result, I took up the mantle of VR Lead. It is important for us to have a member in this specific role as none of us had been familiar with VR before, and there are many pitfalls one can stumble into when developing without reference - e.g. motion sickness, immersion-breaking interactions, etc. In this role, I would experiment with and design different interactions for our game in VR, researching best practices in order to avoid creating an unpleasant experience, as well as learning more about how VR works in Unity.
Personal Goals for Project
Over the course of this project there are a number of things I wish to accomplish and put further work into, whether through research or practice.
Generally, I wish to improve my artistic ability - specifically in the field of character design through the robot characters present in our game project, but also attempt to create environment art that communicates the game's setting, aesthetic, mood, and atmosphere. I shall be putting effort into researching lighting & light properties, environmental composition, and how to elicit an emotional response from the player through the game's presentation.
Additionally, I wish to learn how to properly model and rig characters to a production standard for use in the Unity engine. There are a number of aspects of this that I wish to learn more about and implement into the game's character models - namely: UV mapping, texturing, rigs (focusing on bone weights and kinematics), and how to make use of Unity's animation state machines. I will also be working with Unity's lighting & shader system.
I will be putting time into researching how these things are done currently in industry, as well as following a number of courses when learning how to make use of unfamiliar software. Any and all resources & references I use during the project will be available via the 'References & Resources' section of this Mahara page.
Project Journal
Week 27 & 28 - Leaps & Bounds (Sprint 12) [21st Mar - 1st Apr 2020]
After finishing up our major assignments and getting ((((slightly)))) more adjusted to restricted life under the rule of Covid-19, we return to a sprint chock full of work in need of doing to prepare for the upcoming CA3 presentation.
The major responsibilities I took on this sprint were the fixing and improvement of the Closet objects, the implementation of a pause menu for the player, filling in the bug / issue tracker on our GitHub repo, restricting the movement of the Robotic Arm to an illuminating path, and creating some story elements (noticeboards and character design).
Improving the Closets
One of the issues persisting from weeks prior is the Closets. Before this week they were functionally useless, with handles and door-opening mechanics that were too difficult to use during gameplay. To address this, we took on some of the earlier advice given to us by testers and lecturers - use buttons instead. It makes sense, doesn't it? It's a sci-fi game, so why not?
With that justification, I hooked up a button to each door of the closets and created opening and closing animations for each. On button press, that specific door opens, stays open for a number of seconds, then closes. This was tested a few times to make sure the player had a comfortable but limited window to get in. To allow the player to leave the closet once inside, I placed a button inside that opens both doors.
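The door logic boils down to a small timed state machine. The real implementation is a Unity C# script driven by animations; this Python sketch (with an illustrative 4-second window and an injectable clock) just shows the open-then-auto-close behaviour:

```python
import time

class ClosetDoor:
    """Opens on button press, then auto-closes after open_duration seconds."""

    def __init__(self, open_duration=4.0, clock=time.monotonic):
        self.open_duration = open_duration  # tuned through playtesting
        self.clock = clock                  # injectable so tests are deterministic
        self.is_open = False
        self.opened_at = None

    def press_button(self):
        # Re-pressing while open simply restarts the countdown.
        self.is_open = True
        self.opened_at = self.clock()

    def update(self):
        # Called once per frame; closes the door once the timer runs out.
        if self.is_open and self.clock() - self.opened_at >= self.open_duration:
            self.is_open = False


# Example with a fake clock so the behaviour is deterministic:
now = [0.0]
door = ClosetDoor(open_duration=4.0, clock=lambda: now[0])
door.press_button()
door.update()
assert door.is_open          # still within the 4-second window
now[0] = 4.5
door.update()
assert not door.is_open      # auto-closed after the timer expired
```

Restarting the countdown on a re-press was the simplest choice; it also means a panicking player can't lock themselves out by mashing the button.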
Now, whilst buttons aren't the most sensible or "believable" solution, they were one that could be implemented quickly so that the player could use the closets effectively. We may need to look into alternative solutions / designs further down the road. Below is a demonstration of how the closets work after this process.
Making a Pause Menu for VR
I needed to make a menu system that allows the player to Pause, Resume, Restart, and Quit the game entirely during play. This gives the player more control over their game, but also more power to us devs, who want to re-do or quit a test mid-way without the hassle of fiddling with a VR headset and clicking buttons in Unity. The implementation itself was rather simple, using default Unity UI objects. Because we've designed our game for VR, we needed to make the UI as VR-friendly as possible. This means spawning the UI in the game space, not as part of the camera's peripheral view. Doing this lets the player view the UI in the context of the game world and interact with it more easily - using their hands to point at an option, for example. I followed this series of tutorials in order to learn about Unity's event system as well as creating the VR pointers.
I had some issues with the restart option: pressing it would reload the level but create a second instance of the Player object, breaking interaction. This was easily fixed, as I just needed to disable the DontDestroyOnLoad toggle on the player prefab.
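Why that toggle caused a duplicate is easier to see in miniature. In Unity, an object flagged with DontDestroyOnLoad survives a scene reload, while the reloaded scene spawns its own fresh copy from the prefab. A toy Python model of that interaction (class and function names are illustrative, not Unity API):

```python
class Player:
    """Stand-in for the Player prefab; `persist` mirrors DontDestroyOnLoad."""
    def __init__(self, persist):
        self.persist = persist

def reload_scene(objects, spawn_persist):
    """Reload: persistent objects survive, then the scene spawns its own Player."""
    survivors = [o for o in objects if o.persist]
    survivors.append(Player(spawn_persist))  # fresh copy from the prefab
    return survivors

# With DontDestroyOnLoad enabled, the old Player survives alongside the new one:
buggy = reload_scene([Player(persist=True)], spawn_persist=True)
assert len(buggy) == 2   # duplicate Player -> broken interactions

# With the toggle disabled, only the freshly spawned Player exists:
fixed = reload_scene([Player(persist=False)], spawn_persist=False)
assert len(fixed) == 1
```

Disabling the flag works here because the player has no state worth carrying across a restart; a game that did need persistent state would instead guard against duplicates explicitly.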
Below is a demonstration of how the Menu System works:
Right now I think there's a problem with player bindings being broken on restart, so I'll have to fix that later.
Bug / Issue Tracker in GitHub
During this week I filled in GitHub's issue tracker on our repository with the major bugs and issues we've found so far. Up until now we had been using Google Sheets to track issues one-by-one, but that sheet would go days or weeks without a single look. Given the frequent neglect that page (and others) had suffered, it only made sense for us to move to a more "relevant" system, one we wouldn't be able to ignore.
Our GitHub repo can be found by clicking on the GitHub icon below - or clicking here.
Pathing & Illuminating the Robotic Arm
One of the things the Robotic Arm mechanic was initially designed around was being constrained to an overhead rail. This brings an air of believability and logic to the system in terms of the environment, and also gives the player a means to gauge what they're supposed to do and to mark their progress while performing the task. The first thing I needed to do was map out a path for the rail system to follow in the level, which was a relatively simple task. I had the individual rail parts pre-made by Dylan from the start of the project, so I just needed to decide on a layout that made the most sense.
The layout that I chose was a rather simple one - one that merely loops around some of the props in the level and features some faux broken segments (ergo the pieces of broken rail on the floor) to give off the impression that the system had degraded over time.
The major hurdle when creating this system was getting the robotic arm to conform to a path. However, with the help of this tutorial, it was possible to create a path system and build the backbone of the following system. I then had to hook up the Robotic Arm controls in a way that allows the arm to move forward and backward along the path, while exempting the main parts of the arm from conforming to the directional rotation of the follower object (so that the player can use the Rotation controls comfortably & without losing their input). This meant re-directing the Transform joystick's normalized rotation into the script responsible for moving along the path, to tell it which direction to move.
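Stripped of the Unity specifics, the follower reduces to two small pieces: a clamped distance parameter driven by the joystick axis, and an interpolation from that distance to a point between waypoints. A Python sketch of the idea (the actual path-follow script comes from the tutorial; function names and the 2D waypoints here are illustrative):

```python
def advance_along_path(t, joystick_value, speed, dt, path_length):
    """Move the follower's distance t along the rail from a joystick axis.

    joystick_value: normalized axis in [-1, 1] (forward / backward).
    The result is clamped so the arm can't run off either rail end.
    """
    t += joystick_value * speed * dt
    return max(0.0, min(path_length, t))

def point_on_path(waypoints, t):
    """Linearly interpolate a position at distance t along a waypoint chain."""
    for a, b in zip(waypoints, waypoints[1:]):
        seg = ((b[0] - a[0]) ** 2 + (b[1] - a[1]) ** 2) ** 0.5
        if t <= seg:
            f = t / seg if seg else 0.0
            return (a[0] + (b[0] - a[0]) * f, a[1] + (b[1] - a[1]) * f)
        t -= seg
    return waypoints[-1]

# An L-shaped rail: 3 units along it lands halfway up the second segment.
rail = [(0, 0), (2, 0), (2, 2)]
assert point_on_path(rail, 3.0) == (2.0, 1.0)
```

Keeping the arm's own rotation separate from the follower's orientation then just means never feeding the follower's facing back into the arm's rotation controls.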
One of the last-minute things I wanted to implement was a rail illumination system. I asked Dylan to model and texture some straight and curved lights to be attached to the existing rails, as well as rounded end pieces for the entire system, each with an emissive component. He did an excellent job as always, in record time. This was done in order to implement a system in which the path illuminates depending on how much progress has been made along it, as well as to give positive feedback to the player when using this mechanic.
I made the effect by associating each rail piece to different way-points along the path, allowing the game to turn on/off emissive/non-emissive materials on the object once the way-point had been reached by the robotic arm. The result of these improvements can be seen in the demonstration video below.
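The waypoint association above amounts to a threshold check per rail piece. A minimal Python sketch (in Unity this is a material swap on each segment's renderer; the names here are illustrative):

```python
class RailSegment:
    """A rail piece paired with a waypoint distance along the arm's path."""
    def __init__(self, waypoint_distance):
        self.waypoint_distance = waypoint_distance
        self.material = "rail_dark"

    def set_lit(self, lit):
        # In Unity this swaps the renderer's material for the emissive variant.
        self.material = "rail_emissive" if lit else "rail_dark"

def update_rail_lighting(segments, arm_t):
    """Light every segment whose waypoint the arm has already passed."""
    for seg in segments:
        seg.set_lit(arm_t >= seg.waypoint_distance)

segments = [RailSegment(d) for d in (0.0, 1.5, 3.0, 4.5)]
update_rail_lighting(segments, arm_t=2.0)
assert [s.material for s in segments] == [
    "rail_emissive", "rail_emissive", "rail_dark", "rail_dark"]
```

Because the check runs over every segment each update, lights also switch back off automatically when the player reverses the arm along the rail.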
Story Elements - Player Character
Originally, our game was set up without a protagonist in mind, instead opting for a "blank canvas" type of entity for the player to inhabit. Letting the player play as "themselves" seemed like a good decision thanks to examples set by other games, plus it would save us time in the college environment so that we could focus on creating mechanics and learning the tool sets we would eventually choose. This was beneficial for the majority of the project, giving us time to experiment with a number of things in and out of game. However, over time, as we added certain mechanics (Gravity Room, Watch UI, etc.), it became apparent that we needed some justification for how the player character interacted with things, and for their history prior to the beginning of this "vertical slice" of a game. Additionally, as T.R.I.S.H. began to be brought into the game, we received feedback that players & onlookers had no investment in their situation, and could not see any point to it given the lack of background knowledge and experience.
So, in my spare time mind you, I decided to reconsider the protagonist of our game. I thought about how they could look and why, and invented ways to explain game mechanics through their design.
This potential protagonist design is intended to communicate the mid-game state of the player character. I started with a character who entered the Thinking City in formal attire, as they had been given an ambiguous and prestigious invite into the secretive city. Through the trials they may have faced up until the point of our "vertical slice", their clothes have become torn and dirty, with parts replaced by items found around the environment. The design needed to feature reasons / justifications for in-game interactions as well as items seen in the game. The Watch can be seen on their left hand, which allows the player to access in-game UI and check their objective. The Gravity Boots allow the player to traverse zones with anomalous gravitational properties without being affected by them.
Thankfully this was met with approval from the rest of the team, who were happy to have more than a lame excuse of a player-character justification to use when discussing the topic.
Practicing for CA3
We spent a lot of our time meeting up and practicing over Discord and Zoom, in order to nail down the format and pacing of our presentation. We employed the same strategy as in CA2: create and edit a video to showcase aspects of our game alongside our own commentary and PowerPoint presentation. Since we were using Zoom, we needed to pay extra attention to the quality of our content over the internet, so we decided that Jason, the member with the fastest connection, would broadcast the video and presentation from his PC and tab between the two when necessary.
Creating the video was a horror this time around. Due to some unfortunate issues with the game's lighting, we met delay after delay on the video presentation because baked lighting just wouldn't work on some objects within the game. This caused a lot of stress as we aimed to create the videos for our practice presentations with Shane and Derek - our mentors. We ended up presenting with the video missing significant portions during these practice runs, but managed to get it finished and practiced a few times before the final thing. Below is the final version of the video that I just finished editing - I hope it goes well.
Oisin's good video of the week:
Week 26 - PANDEMIC or "How the 2020s will be remembered for the rest of time" (Week 11) [14th - 20th Mar 2020]
Welp. This is it I suppose. Ireland has closed its schools and colleges for a few weeks (as of Friday the 13th of March). For now anyway, we've been instructed to stay home and take the coming week as a study week, meaning I can work on other assignments with more time and continue to add stuff into the game.
This week I needed to do some major in-level work - applying blood decals and creating room signs so the player knows the purpose of each room within the level.
Creating Lab Room Signs
I was tasked with creating Lab Room signs in order to telegraph to the player what each room of the level is for and to improve environment readability. Initially, I pursued sourcing some ceiling-mounted signs, but came up with no real contenders that suited the style and aesthetic of our game. Around the same time, I was investigating the HDRP Decal Projector object within Unity, and I remembered that in the original concept art I had made during the Concept phase, details were printed directly onto the walls.
With this in mind, I set about making a trial piece of art to pitch the idea to my team. I came up with this (below), and my team preferred it to the initial approach.
Taking on the feedback of my team, I went to work creating a unique decal using a similar base for each room, with side pieces to make them look more natural on the walls. Below are a number of examples of the room sign decals in our scene.
*** NOTE: Very big thanks to this website for its ability to generate normal maps for decals, saved my life <3 ***
Decorating the Level with Blood Decals
One of the tasks I elected to do this sprint was to source and place blood decals and wall writing / smears about the level. This was something small that we hoped would greatly enhance the player's involvement in the level & increase the feeling that something horrible happened in the game space. I sourced them from this page on the Unity Asset Store, paying the fee myself, then got to work creating prefabs from the included textures that would work with Unity's HDRP Decal Projector objects. Some of the environment work I did with these can be seen below.
Please note that these act as a first pass on the blood decal decoration of our scene, and it may be refined in the future if time becomes available.
For the remainder of the week I worked on some other college assignments and sat under my desk reading article after article after article about Covid-19.
Oisin's good video of the week:
Week 25 - Hacker Mode :sunglasses_emoji: (Sprint 11) [7th - 13th Mar 2020]
Continuing Production, I've moved on to implementing a number of new game mechanics this week - namely the Door Keypad system and Player Death. This sprint was rather light overall, as we had a number of crippling assignments for other classes that required our attention.
Creating the Keypad System
To create the Keypad system, I ripped one of the keypads that Dylan had made for the storage shelves and made its buttons functional. The idea is to let the player input a 4-digit code, press enter, and open the door if the code is correct. This wasn't hard to implement: we just store button presses (each button is assigned a number from 1 - 4) and check them against a hard-coded value to see if the player got the combination right. If correct, we give the player feedback through a green light. If incorrect, we show them a red light and reset the input, allowing them to try again. Below is a video of my implementation of this system:
There was an occasional issue where the door would open after being hacked, but then immediately close because of the door's transition-clock interval. To fix this, I had the hacking function reset the countdown on the door. It also took a bit of testing to get right: originally, the keypad was a little frustrating - each button needed to be pushed in significantly to register as pressed, and it was common for the player to accidentally press the wrong button. Through testing, we managed to iron a lot of this out, making the keypad more responsive, more comfortable to use under pressure, and more ingrained in the gameplay loop.
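The keypad logic, including the countdown reset that stops the door slamming shut again, can be sketched like this. The real version is a Unity C# script; the code value, class names, and the 10-second interval here are all illustrative:

```python
class Door:
    def __init__(self):
        self.is_open = False
        self.countdown = 0.0

    def open(self):
        self.is_open = True

    def reset_countdown(self):
        # Restart the transition clock so the door doesn't close on its
        # very next tick after being opened via the keypad.
        self.countdown = 10.0   # illustrative transition interval

class Keypad:
    def __init__(self, code="3142"):   # illustrative code, not the game's
        self.code = code
        self.entry = ""
        self.light = None              # "green" or "red" after pressing enter

    def press_digit(self, digit):
        # Each physical button is assigned one digit from 1-4.
        if len(self.entry) < len(self.code):
            self.entry += str(digit)

    def press_enter(self, door):
        if self.entry == self.code:
            self.light = "green"
            door.open()
            door.reset_countdown()
        else:
            self.light = "red"
        self.entry = ""                # right or wrong, start fresh


door, pad = Door(), Keypad()
for d in "3142":
    pad.press_digit(d)
pad.press_enter(door)
assert pad.light == "green" and door.is_open
```

Clearing the entry after every enter press keeps a failed attempt from polluting the next one, which matters when the player is fumbling under pressure.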
Player Death System
I could only work on the Player Death system up to a point, as the player damage system was (and continues to be) broken to the point where the player registers no damage when hit by the Robot character. Despite this, I was able to create a basic death-box area and a system that checks if the player's health is 0, then teleports them there.
There seems to be a lot of speculation that the colleges and schools around Ireland are going to close. The Covid-19 epidemic is continuing to grow in both England and here, which makes me really quite anxious for my family (who live in England) and their safety. I reckon I'll be fine because I can continue college work from home but, the uncertainty is beginning to distract me hardcore. We'll have to see where this goes.
Oisin's good video of the week:
Week 23 & 24 - Spooky Skeletons & Holographic Storymode (Sprint 10) [22nd Feb - 6th Mar 2020]
After a week off post-CA2, we're heading right back in with a whole new list of priorities and goals for the game in this Production cycle. During this week, I was tasked with posing and adding the skeleton to the robotic arm (to allow it to hold a keycard), improving the Control Panel layout, modelling and texturing a tentacle clump for the creature lab, and creating a VR Hologram system that alludes to how the robots became infected with tentacles.
Improving the "look" of the Robotic Arm
In order to make this VR mechanic more eye-catching and "important" to the player once they come into contact with it, we needed to pose the arm in a way that better represents its capabilities / purpose, and to place the collectible Key-Card in its grip (in some form) to signal to the player that this is something they need to interact with.
To pose the Robotic Arm, the mesh needed to be rigged and re-exported in a new position, which was done in Blender. It did not take much time, but care was taken when rigging to make sure all aspects of the mesh were accounted for and could therefore be posed in a believable way for use within the game. Here's a demonstration of the rig's capabilities:
Once this was finished, the arm was posed in an arced position, as if it were stuck holding up an object. This was exported with the armature and imported into Unity. Once it had been hooked up with all of the scripts and dependencies the previous arm had, that was it. The difference between the two is stark, as the new version much more accurately communicates the purpose and form of the Robotic Arm. Below is a before (left) and after (right) of what the Robotic Arm looked like in the scene.
Improving the Robotic Arm Control Panel
In order to move the Robotic Arm, the player is required to use one of the two control panels in the Robotics lab - each giving the player a different view of the arm during movement. The control panel features two joysticks for moving the arm and a button off to the right that allows the player to drop the arm. There were a few issues with this setup. Firstly, after moving the joysticks, they would not return to their default position when the player let go. Secondly, user testing revealed that players would accidentally press the release button. These two issues needed to be addressed to make the interaction more usable and intuitive.
To fix the joystick issue, some code needed to be added to the SteamVR CircularDrive script to let the dev toggle a position reset on an object, which we enabled on the joysticks. This was a great improvement overall and gives the user much better control over the mechanic.
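The ease-back behaviour itself is simple: each frame after release, move the handle's deflection a fixed amount back toward its rest angle and stop when it gets there. A Python sketch of the idea (the actual change lives in SteamVR's CircularDrive C# script; the angles and speed here are illustrative):

```python
def ease_to_rest(angle, rest_angle=0.0, return_speed=90.0, dt=1 / 60):
    """One frame of easing a released joystick back toward its rest angle.

    return_speed is in degrees per second; dt is the frame time. The clamp
    (max / min against rest_angle) stops the handle overshooting its rest.
    """
    if angle > rest_angle:
        return max(rest_angle, angle - return_speed * dt)
    return min(rest_angle, angle + return_speed * dt)


# A joystick released at 30 degrees settles back to rest over ~20 frames:
angle = 30.0
for _ in range(30):
    angle = ease_to_rest(angle)
assert angle == 0.0
```

Running this per frame (rather than snapping instantly to rest) keeps the motion readable in VR, where sudden teleporting geometry tends to feel wrong.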
In order to fix the issue with the control panel layout causing accidental button presses, the setup of the control panel was altered. The Left and Right joysticks were moved further towards the edges of the panel face, the release button moved into the center, and the keyboard scaled down and duplicated onto the opposite side of the button. Below is a before (left) and after (right) comparison of the control panel as a result of this process.
Adding Purpose to the Robotic Arm
Now that the Robotic Arm was posed, it was time to give it purpose: a means for the player to retrieve a Key-Card, and therefore a reason to interact with the mechanic. To do this, a skeleton model was sourced, re-textured in Substance Painter, posed on the robotic arm, and given rag-doll physics. To have it move in tandem with the robotic arm, the rag-doll physics are initially disabled, then enabled upon "dropping" the arm. This lets the skeleton drop with the arm and release the Key-Card from its mouth, allowing the player to pick it up from the floor. The finished positioning and setup of the skeleton can be seen here:
Lighting was added to bring emphasis to the robotic arm, in an attempt to guide the player's eye to it as the main focus of the room.
Creating a Holographic Table to Communicate Story
In essence, the holotable was made by appropriating a desk asset that already existed in the game and adding new functionality to it. I started by making the buttons work with the SteamVR button scripts, then tied them to a function that spawns a given object above the table with a holographic shader attached. This was pretty straightforward to do in code, leaving the meat of the work in deciding what story should be told, and how.
After talking to my team, we decided it would be nice to give context as to how the robot character became infected in the first place, so I got to work trying to make it happen with the holotable. My approach was limited both by time and by what I had to work with. Using the table, I felt a storyboard-style storytelling angle would be appropriate: tying a 3D object to each of the table's numpad buttons so that, when pressed in sequence, they give the player a play-by-play of how the robot got corrupted. One idea I had was to make the table turn against you in some way at the end, which I did by having the last object in the story be an eye (similar to the robot's) that looks at the player's face for the remainder of the game. I needed to model some items to fill gaps in the story (like the tentacles in the cup & the spill), but it was a necessity, so I didn't mind.
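The button-to-story-beat mapping is essentially a lookup table with a "currently shown hologram" slot. A Python sketch (the beat names here are placeholders; in-game each maps to a prefab with the holographic shader):

```python
# Illustrative story-beat names, not the actual in-game asset names.
STORY_BEATS = {
    1: "robot_at_work",
    2: "tentacles_in_cup",
    3: "the_spill",
    4: "robot_corrupted",
    5: "watching_eye",   # final beat: the eye that tracks the player afterwards
}

class HoloTable:
    def __init__(self, beats=STORY_BEATS):
        self.beats = beats
        self.current = None   # the hologram currently shown above the table

    def press(self, button):
        # Spawning a new hologram replaces whatever was shown before;
        # unmapped buttons leave the display unchanged.
        if button in self.beats:
            self.current = self.beats[button]
        return self.current


table = HoloTable()
for b in (1, 2, 3, 4, 5):
    table.press(b)
assert table.current == "watching_eye"
```

Leaving unmapped buttons inert means stray presses on the rest of the numpad can't blank the story mid-sequence.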
A demonstration of the VR Holotable story can be seen below:
To be honest, I question how effective this method is at communicating story, but considering the time limitations as well as our limited object resources, I feel it's at least OK.
Oisin's good video of the week:
Week 22 - Again, Again, Again! (Sprint 9) [7th - 14th Feb 2020]
During this week, I mainly worked on fixing some animation issues that came up with the Robot character, and joined my team a number of times for repeated presentation practice and preparation in the run-up to CA2.
Fixing Robot Animation Issues
Jason brought it to my attention that there were some issues with the character's attack animation: the robot would step forward whilst attacking and misalign itself with the level geometry, making it possible for it to clip between walls or out of the map. I took some time to fix that by changing the animation so that the robot returns to its original position after attacking.
Presentation Practice
On most days this week, we met up and practiced going through our presentation. Working mainly through Discord, we nailed down what we needed to say and made notes of what to improve on and include when talking. A massive help in this process was the assistance of both Derek O'Reilly and Shane Dowdall, who each gave up portions of their time to meet us and watch us present with our work-in-progress setup. Through their feedback, we improved a method of side-by-side video and presentation demonstration that we had in mind going into this presentation: we would go through the details on our respective slides, then move to the video and play the segment relevant to that information, with the speaker going into more detail. This was met with much approval from Derek and Shane as we refined it.
Making a Demo Video for CA2
Jason and I worked to create the Demo Video for CA2, divvying up portions of gameplay to record for each slide, which were then sent to me for editing. I worked in Sony Vegas 15 to create the video you can watch below.
CA2 went really well! WHAT A RELIEF! We got a little criticism as to how our scrum work was unbalanced and got a lot of helpful suggestions as to how to go forward. All in all, our hard work paid off!
Oisin's good video of the week:
Week 21 - VR is kinda jank.. at least when I try to do it. (Sprint 9) [1st - 7th Feb 2020]
This is our last sprint before heading into our CA2 presentation, and there are still a number of things that need to make it into the game before then - namely the Robotic Arm VR mechanic and the Control panel for it. I also needed to fix some issues with the Robot's animations and texturing, as well as explore some ways of optimizing it to reduce the impact on performance.
Various Robot Texturing Improvements & A Lesson Learned
Going into this sprint, I made some changes to the just-finished Robot textures. After a conversation within the team, Dylan said he'd look into optimizing the textures in the level as much as possible (resolution-wise), as all of them were 2048x2048 in size. I agreed to do the same and went to work reducing the size of many of the textures used on the Robot model. In Substance, I reduced any less-detailed or obscured texture to 1024 or 512 as appropriate, experimenting so as not to lose noticeable quality. 2048-sized textures were reserved for important items like the Robot Torso and the Tentacle Arm. These changes were made, the textures re-exported, and re-assigned to the character materials inside Unity. It's unknown exactly how much this improved game performance, but any means of lessening the load is helpful, especially as we're making a VR game.
In hindsight, and because I may get time to revisit it in the future, it was probably unwise of me to create a new texture set for every part of the robot mesh. Each texture set becomes its own material, so the GPU has to bind more textures and issue more draw calls to render the character. It would have been much more beneficial to make a Texture Atlas, or just one large texture set that encompasses the entire Robot mesh.
Story - Nametag & Schematics
For the presentation we wanted to have at least some token towards story integration in our game. This took the form of creating an Employee Nametag and creating some schematics that denote the Robot's original design.
The nametag began with me drawing out a template of how it should look - taking into account desired shape, contents, and potential space for Easter eggs (barcodes, QR codes, etc.). After a look at real-world examples, I came up with a base that holds details of a potential in-game character. Next, I modeled a nametag object to the spec I had designed, then unwrapped it in a way that would facilitate the design as an applied texture. The larger portion of the below image is the face that holds the nametag information; its size is important, as that part of the mesh holds the most detail.
Once I had this done, I exported the mesh as .fbx and brought it into Substance Painter for proper texturing. The result looked like this:
I went through much the same process with the schematics, sans the Substance portion. I took one of the character design sketches I made earlier in development and (quickly, admittedly) edited it to look like schematics, then created an object in Blender that matched its dimensions and brought it into the scene.
Locker Hiding Spot
Using the locker that I modeled in a previous sprint, I began creating a system in VR for the player to open and close the doors and hide inside of the container. This gameplay aspect is intended to allow the player to wait out a robot's occupation of a room and avoid detection.
I created a handle opening system for each door of the closet with assistance from this video. This essentially allowed the user to open the door by twisting a handle, then pulling on an area around the handle to swing the door open and closed.
We got some useful feedback when testing this, as many users couldn't operate the handles correctly to open the door. One of our lecturers suggested investigating doors operated simply by buttons - an idea we decided to explore after this sprint, as time was of the essence. The version of the closet as of this journal entry is demonstrated in the VR Interactions video below the Robotic Arm section.
Robotic Arm - Part 1
In beginning to work on the Robotic Arm, my first task was to make working joysticks for the control panel. The first step of this was to bring the joystick into Blender and rig it so that the base of the joysticks would deform on movement, as seen here:
Next was allowing the player to grab onto and move the joystick around in VR. This was a tough and time-consuming task, and one I had to compromise on with my team, as I had struggled to create a joystick system that let the player move an object in a spherical motion. We concluded that we should forego that approach and instead create two joysticks - one that rotates forward / back, and one that rotates left / right. This turned out to be a nice compromise for the time being.
In order to facilitate movement for the robotic arm, I programmed a system that takes the joystick's rotation away from its resting state, clamps it to the stick's max / min angle, and normalizes it. This value is passed into the Robotic Arm's movement / rotation script as a speed modifier. For now, I made it so the Robotic Arm can only move within a 10-unit radius of its original position, so that it can't leave the map. Then I moved on to making the Robotic Arm toggle ragdoll physics to create the "drop" effect. Using this tutorial, I managed to implement the ragdoll system for the Robotic Arm, then hooked it up to the Control Panel's button for activation. This and the other VR interactions I worked on up to and during this week are visible in the video below:
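The two pieces of logic above boil down to some simple maths. This is a minimal language-agnostic sketch (the actual implementation is a Unity C# script; the names and the 30-degree travel here are my own assumptions, not the project's values):

```python
import math

MAX_ANGLE = 30.0   # degrees of joystick travel either way (assumed value)
ARM_RADIUS = 10.0  # max distance the arm may stray from its origin

def normalized_deflection(angle_from_rest):
    """Clamp a joystick angle to its max/min travel and map it to a
    speed modifier in [-1, 1], as described above."""
    clamped = max(-MAX_ANGLE, min(MAX_ANGLE, angle_from_rest))
    return clamped / MAX_ANGLE

def clamp_to_radius(pos, origin, radius=ARM_RADIUS):
    """Keep a 2D arm position within `radius` units of its origin,
    so the arm cannot leave the map."""
    dx, dy = pos[0] - origin[0], pos[1] - origin[1]
    dist = math.hypot(dx, dy)
    if dist <= radius:
        return pos
    scale = radius / dist
    return (origin[0] + dx * scale, origin[1] + dy * scale)
```

Pushing the stick halfway gives half speed, and any position outside the 10-unit circle is pulled back onto its edge.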
Oisin's good video of the week:
Week 20 - IT LIVES!! (Sprint 8) [25th - 31st Jan 2020]
During this week my major tasks were to continue work on our robot character's animations and to start texturing the character in Substance Painter.
Finishing Robot Animations
I moved on with the leftover animations this week, which was tougher than expected thanks to the time lost fixing the character's rig. Because of this, I feverishly got to work whenever I could, knocking out one animation after the other. The animations I needed to do this week were as follows:
- Left Turn
- Right Turn
- Investigate
- Attack
- Detect Player
As mentioned in the previous journal entry, some of these had been started to an extent, but ultimately had to be scrapped thanks to the rig changes brought about as a result of creating the Chase and Walk animations.
Ultimately, these animations were pretty straightforward to make. I had no issue creating something for each required animation - making full use of root motion wherever possible. I did have to re-do the Investigate and Right / Left Turn animations near the end, because I had forgotten to rotate the root bone in the direction the robot faces when turning. This led to scenarios where the character AI would attempt to turn, but then reset to its original direction because the root bone's rotation never changed. This, again, ate into my time, but it wasn't too much of a hit overall.
The finished animations (in Blender) in full can be seen in this video:
Texturing the Robot
Once the animations were done and approved by the rest of the team, I moved onto texturing the Robot in Substance Painter. This wasn't a very hard process, as I've become familiar with the software quite fast thanks to my background in art programs like Photoshop, Paint Tool SAI, and Clip Studio Paint - Substance Painter acts much the same in regards to layer management and tool set.
In preparation for working in Substance, I needed to apply a material to each object in Blender so that Substance would recognize it as a Texture Set - where the user can create layers and apply materials to that part of the mesh. Next, in Substance, I baked the mesh's maps, including an ambient occlusion map - this let me create dirt effects in areas that would naturally accumulate dirt due to their surroundings. At this point I got to work texturing.
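The AO-driven dirt idea boils down to a simple rule: the more occluded a point is, the more dirt it collects. A toy sketch of that mapping (the name and the linear falloff are my own illustration, not Substance's internals):

```python
def dirt_weight(ao, strength=1.0):
    """Map a baked ambient occlusion value (1 = fully open,
    0 = fully occluded) to a dirt amount in [0, 1]: crevices
    and recesses (low AO) collect the most dirt."""
    return max(0.0, min(1.0, (1.0 - ao) * strength))
```

In Substance this is done by using the baked AO map as a mask generator input rather than by hand-written code, but the principle is the same.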
Once this process was finished I could export the maps into Unity to make materials. However, since we're using the High Definition Render Pipeline, I needed to create an export preset that outputs textures in the HDRP format. By looking at the inputs of the HDRP/Lit material inside Unity, I arranged an output that puts the correct data in the correct channels of each texture.
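For reference, HDRP's Lit shader reads a single packed "mask map": Metallic in the R channel, Ambient Occlusion in G, Detail Mask in B, and Smoothness in A. A minimal sketch of the channel arrangement the export preset has to produce (plain Python over per-pixel grayscale lists, purely illustrative):

```python
def pack_mask_pixel(metallic, ao, detail_mask, smoothness):
    """Pack four grayscale values (0-1) into one RGBA mask-map pixel,
    in the channel order HDRP/Lit expects: R=Metallic, G=AO,
    B=Detail Mask, A=Smoothness."""
    return (metallic, ao, detail_mask, smoothness)

def pack_mask_map(metallic_map, ao_map, detail_map, smoothness_map):
    """Combine four single-channel maps into one list of RGBA tuples."""
    return [pack_mask_pixel(m, a, d, s)
            for m, a, d, s in zip(metallic_map, ao_map, detail_map, smoothness_map)]
```

In practice Substance Painter's export preset editor does this packing for you; the point is just that each output channel must carry the specific map the shader samples from it.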
Next, I created all of the Materials needed within Unity and applied them to the imported Robot .fbx. It turned out pretty well in my opinion.
One other thing of note: I assisted Eoghan in implementing the Robot sounds to play during animation, using Unity's animation events system. My input was to show him how it worked, and he went from there.
Oisin's good video of the week:
Week 19 - One Way Ticket to Hell and Back (Sprint 8) [19th - 24th Jan 2020]
College has begun again! After a few weeks of Christmas break and some of the most productive times we've seen, we now enter a new semester in a brand new decade - what's in store? Good times I hope! [EDIT: This aged like fine milk]
This week my priority was to create new animations for our new Robot character. This process began with the ARDUOUS task of rigging a character that at first looked simpler than it was.
Rigging the GSA
Rigging the robot was a process that started off relatively straightforward but one that got quite complicated quite fast - particularly when trying to make the arm bellows deform in a believable fashion.
I began rigging from the legs up, starting with the root bone and then moving onto the digitigrade-esque legs our character features. This wasn't a very difficult process, and I got assistance from this video in order to figure out how to create the piston system that connects the two legs. The finished left leg rig can be seen in the video below:
The rigging process wasn't much of an issue in terms of setup; I had a working armature within 2 hours without much headache. The real problems came about when skinning the mesh - attaching the mesh to the bones to facilitate movement.
An Unholy Amount of Issues
Again, for a majority of the rigging process things went swell. Bones would be applied to specific parts as standard bones (as seen in the parenting menu in Blender) and would work fine. Same could be said about areas that make use of IK constraints - like the arms and legs (including the arm tentacle). The parts of the model that caused trouble were the Right Shoulder, Right Bellow, and the Right Leg Tentacle Connector.
I was on the verge of ripping my hair out when skinning the Left Shoulder and Left Bellow. I wanted the bellow to open and close as the arm moved at the shoulder, but I couldn't suss out how to pull it off. I went through a few different methods. First, I tried creating bones along the entire length of the bellow and connecting them to a bicep bone, but that failed because the shoulder pad and bellow clipped into and through each other. I then tried a number of times to rectify this by merging the shoulder pad and bellow, to frustrating but ultimately no avail. Finally, in a last-ditch effort, I forwent the individual bones and decided the bicep bone would act as the hinge for the bellow, with the shoulder pad and bicep parts skinned to that same bone. The results of this attempt weren't the best, but it was a fair compromise after a few days of frustration and a limited time frame. The results can be seen below (right).
The other issue affected the Right Leg Connector Tentacle and the character's Tentacle Fingers, and was much simpler. It turned out I had neglected to apply the subdivision surface modifiers before skinning the objects, and on top of this, duplicate vertices were present in the same meshes. This was an easy fix, but took a little figuring out to get right.
The most frustrating thing about this process overall was how much it slowed me down in regards to getting the animations done. I planned to get at least half of them done this week, but only managed to get 2 done - each bringing with it necessary changes to the mesh / armature.
Animating the GSA
Like the last robot character, we had a short list of animations that were needed in order to work with Jason's AI code. These were:
- Left Turn
- Right Turn
- Investigate
- Walk
- Chase
- Attack
- Detect Player
During this week, I only managed to get the Chase and the Walk completed, with parts of others started or in a to-be-scrapped state.
For each animation, each pose of the sequence would be blocked out, with vague timing in mind. Looping back over the pose-to-pose sequence, the timing would be adjusted in order to suss out a suitable pace. Next came the in-between keyframing, wherein the transitions between each pose would be given more detail and structure - allowing for smoother, more believable movement. It was at this point that reference material would be utilized to the best of our ability, needing a keener eye to identify the little details in movement. Like all animations done for this project, the frame rate chosen for use in our game was 60 fps.
The sequence was animated on the spot, without any transformations yet keyframed on the character's root bone. This allowed animation to be done without the hassle of following the character around the Blender space, mitigating potential errors. After the on-the-spot animation was complete, the root bone would be paced along the forward axis by an amount matching the pace of the character's walk. On its own this gives the character a 'gliding' effect, so to mitigate that, each foot would be halted in place from the point it hits the ground - giving the illusion that the character is pushing itself forward with whichever foot is currently planted.
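The pacing arithmetic behind this is straightforward. A sketch, with assumed numbers rather than values from the actual Blender file: if one walk cycle covers a given stride length over a given number of frames at 60 fps, the root must advance a constant amount per frame, and a planted foot must move backward in root space at exactly that rate so its world position stays fixed.

```python
FPS = 60  # the project animates at 60 fps

def root_offset_per_frame(stride_length, cycle_frames):
    """Constant forward distance the root bone must cover each frame so
    that a walk cycle of `cycle_frames` frames travels `stride_length` units."""
    return stride_length / cycle_frames

def planted_foot_local_y(frame, foot_world_y, stride_length, cycle_frames):
    """Root-relative position of a planted foot: it moves backward in root
    space exactly as fast as the root moves forward, keeping its world
    position fixed and avoiding the 'gliding' look."""
    return foot_world_y - root_offset_per_frame(stride_length, cycle_frames) * frame
```

In Blender this counter-motion is authored with keyframes rather than computed, but this is the relationship those keyframes have to satisfy.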
The result of the walk animation can be seen here:
This approach was one that I was going to take when doing all of the other animations, as it seemed to be quite effective when at least creating the first pass of animations.
Oisin's good video of the week:
Week 18 - Auto-Robotic Transfixiation (Sprint 7) [12th - 17th Jan 2020]
This week my focus was on modelling the GSA robot, sourcing / implementing player hands for the VR controller, and investigating / implementing VR buttons & levers. At the same time, I had exams to study for - and that was VERY fun, let me just say.
Modelling the GSA
I hopped into Blender after making the decision to pursue the GSA design and got straight to modelling. Since creating the original Service Bot I had learned a few lessons - like avoiding Ngons and prioritizing edge loops in the mesh. That prior experience meant the new mesh was completed much faster. However, I did need to figure out how to make organic-looking tentacles, which took a number of attempts to get a good balance between detail and vert count. I used the following resource to create the tentacle arm seen in the finished mesh:
The finished mesh came out as seen below. The greatest challenge honestly was creating and manipulating the tentacles around and within the robot's body.
From here, I would need to Rig the model up and make it suitable for animation.
VR Hands
I scoured the internet for VR hand models that we could use in our game - to immerse the player a bit more and get rid of those pesky standard VR controllers. I looked mainly on the Unity Asset Store, as it would guarantee Unity compatibility and likely have integration tips / tutorials relevant to our use case. Unfortunately, a majority of the assets came with a paywall. There were single-pair hands, hand packs, transparent hands, and gloved hands galore!
I made a shortlist of 10 VR Hand assets, factoring in design and price, as well as their capabilities in VR (amount of articulation, etc.). In a call with my team, we looked at all of the possible options and unanimously chose the Sci-Fi VR hand asset made by Kandooz Studio - As seen here.
I then set about trying to implement it into the Unity scene, which seemed easy enough but turned out to be a BIT more complicated. I didn't get it completely sorted out this week as Jason found errors with implementation, but I hope to rectify it at the start of next week.
VR Buttons & Lever movement
Another task I had to complete this sprint was figuring out button & lever movement in Unity. I started by cruising through YouTube tutorials that covered the mechanics of how it worked, and different ways to implement these systems in Unity using plugins. Once I felt that I had found a good set of resources (this and this), I went right ahead and installed the SteamVR plugin into our Unity project.
SteamVR, developed by Valve, contains a lot of basic scripts that can be used to create different VR interactions, with the ability to assign callback functions at different states (say, to be activated at the start and end of a lever's arch). The tools included are seemingly VERY powerful.
I followed the tutorials I found and adapted what they showed to some of the assets Dylan had made - the lever and a button pad that was attached to a shelving unit. My results came out pretty well despite me not having a VR headset and things looking majorly jank.
I then sent these to the group so that someone with a VR headset could test it - and got the all clear that they worked in the same way as in the video.
Oisin's Good Video of the week:
Week 17 - New Year, New Robot (Sprint 7) [6th - 11th Jan 2020]
The new year begins! And with it, the need to redesign our Service Bot character (which until now I had spent over a month creating (ಥ﹏ಥ) ). Either way, it began with a team call to discuss some design ideas that we could attempt to explore. These were:
Using these specifications as a guide, as well as the previous examples, I began sketching out new ideas & concepts, aiming to make them more robotic-looking and sinister, with an emphasis on their utility within the context of the game (as lab helpers). The results were the General Service Assistant (Tainted & Untainted) and the Carrier Bot (Infested and Clean).
General Service Assistant
The GSA was the first new concept that I came up with. I considered what a robot capable of performing normal human tasks would look like, with the added ability to extend its legs somewhat in order to reach higher places. When it came to incorporating the idea of a holographic head, I first interpreted it as the possessor using the robot's inherent abilities to "fix" itself if it had a broken head - as if the possessor were adapting to circumvent any damage done to it.
The 'Tainted' variant was an idea pitched to me by Jason: the robot's possessor is a more physical force - the tentacles - that has permeated the inside of the bot and seized control. This would also influence the robot's movement to be more like that of a newborn child or a drunk - an entity that struggles with basic coordination.
Carrier Bot
The carrier bot was based on the idea of a robot designed to assist with carrying or transporting machinery, tools, supplies, and other things from place to place within the lab environment. Acting as the Scientist's mules, the Carrier would know the environment well thanks to their inherent need for navigation. Same as the GSA, the Carrier would have an Infested variant where the possessor entity has permeated its internal mechanics and has made changes / fixes to the body's many points of damage.
Decision Making
These, along with the previous design (the Service Bot), were put together into a Google Form distributed to 25 people - friends, family, and external participants - for feedback. We wanted to gauge how scary each design was on a scale of 0 - 10, and also which was the favorite amongst our participants. The results of the questionnaire can be seen below:
From this data, it's clear that the GSA & the Infested Carrier designs were rated the 'scariest' of all the designs presented. Additionally, the GSA was also the most favored amongst participants.
Internally, we also leaned towards the GSA design, particularly the Tainted variant. We could attach story elements to it and potentially build interesting gameplay around it, so we figured it had the most potential. We left the questionnaire active for 2 days to gather results, then held a team meeting to evaluate them and make a decision, as the clock was ticking down fast on the robot's 3D model. We decided to go with the GSA design, as it was our favorite of the bunch and also the most successful candidate in the questionnaire.
After this, I went to work creating an official model sheet that would be used during the modelling process as seen below:
Oisin's good video of the week (a new trend to make journals a bit more interesting) :
References & Resources
Art
Gurney, J. (2010). Color and light : a guide for the realist painter. Kansas City, Missouri: Andrews McMeel Publishing.
Aaron's Design Class YouTube Channel - published 5 December 2018 - "Silhouette Drawing Methods". https://www.youtube.com/watch?v=lGL878oEh9k
Mart's Struggle with Drawing - published 14 July 2016 - "About Using Silhouettes in Character Design - DigitalDrawingStruggle". https://www.youtube.com/watch?v=GTdfQIHiJ9Q
Sycra - published 28 June 2017 - "Going From Grayscale to Colour". https://youtu.be/oQOFPraUNoQ
My concept art speedpaint & misc playlist: https://www.youtube.com/playlist?list=PLdt1gbIaw0MY6D-AjzcZVG2WnxyEC5PTu
CLIP STUDIO TIPS. (2019). How To Create Glitch Effect. by ayu.shi - CLIP STUDIO TIPS. [online] Available at: https://tips.clip-studio.com/en-us/articles/1931
I.A. Magazine. (2019). The Art Of Doom : 40 Concept Art. [online] Available at: https://www.iamag.co/the-art-of-doom-40-concept-art/
Robot Design Feedback Google Form: https://docs.google.com/forms/d/15WtoIL12YhT6mfXI88wj68eUelI7AoK7yHTKPMj5NmI/edit?usp=sharing
3D Modelling
Julien, D. (2019). Blender 2.8 The complete guide from beginner to pro. Link: https://www.udemy.com/share/101WyMCUAfdVlWRXg=/
Texturing using Substance
Substance 3D. (2019). Substance | The leading software solution for 3D digital materials. [online] Available at: https://www.substance3d.com/
80.lv. (2019). Benefits of Procedural Materials. [online] Available at: https://80.lv/articles/benefits-of-procedural-materials/
Innocenti, U. (2019). Comparative Case Studies: Methodological Briefs - Impact Evaluation No. 9. [online] UNICEF-IRC. Available at: https://www.unicef-irc.org/publications/754-comparative-case-studies-methodological-briefs-impact-evaluation-no-9.html
Gdcvault.com. (2019). GDC Vault. [online] Available at: https://gdcvault.com/
Gdcvault.com. (2019). Texturing Uncharted 4: a matter of Substance (presented by ALLEGORITHMIC). [online] Available at: https://gdcvault.com/play/1023488/Texturing-Uncharted-4-a-matter
Gdcvault.com. (2019). 'Marvel's Spider-Man': A Deep Dive into the Look Creation of Manhattan (Presented by Substance). [online] Available at: https://gdcvault.com/play/1026495/-Marvel-s-Spider-Man
Substance. (2019). Blade Runner 2049’s Oscar-winning Texturing Workflow at Framestore. [online] Available at: https://store.substance3d.com/blog/blade-runner-2049-s-oscar-winning-texturing-workflow-framestore
Kevin O. (2019) Learn the ART of Substance Painter. Link: https://www.udemy.com/course/learn-the-art-of-substance-painter/
Virtual Reality
Costas Boletsis and Jarl Erik Cedergren, “VR Locomotion in the New Era of Virtual Reality: An Empirical Comparison of Prevalent Techniques,” Advances in Human-Computer Interaction, vol. 2019, Article ID 7420781, 15 pages, 2019. https://doi.org/10.1155/2019/7420781.
A.S. Fernandes, S.K. Feiner, "Combating VR sickness through subtle dynamic field-of-view modification", Proc. 3DUI, 2016. https://ieeexplore.ieee.org/document/7460053.
- Supplement: https://www.youtube.com/watch?v=lHzCmfuJYa4
Ryan, A. (n.d.). Thoughts on Accessibility Issues with VR. Retrieved from ablegamers.org: https://ablegamers.org/thoughts-on-accessibility-and-vr/
Kim, W. Kim, S. Ahn, J. Kim, S. Lee, "Virtual reality sickness predictor: Analysis of visual-vestibular conflict and vr contents", QoMEX, 2018.
Carbotte, K. (2019). Do the Locomotion: The 19 Ways You Walk and Run in VR Games. [online] Tom's Hardware. Available at: https://www.tomshardware.com/picturestory/807-virtual-reality-games-locomotion-methods.html.
Nelius, J. (2019). How to combat VR sickness. [online] pcgamer. Available at: https://www.pcgamer.com/how-to-combat-vr-sickness/.
Creating Art for our Game Project
In order to lay out a foundation for how the game should look, I created a number of designs for characters and environmental objects, as well as explorations of the setting's mood and lighting.
I have outlined my process to create, and the importance of both the Service Bot designs, Protagonist design, and Foyer environment art here.
Texturing Our Robot Character Using Substance Painter
Procedural Texturing is a relatively recent development in the "AAA" development space. It is a texturing technique being adopted across the board and implemented in games like Control, Uncharted, Spider-Man (2018) and many other titles.
In this section, I will be describing how I went from learning Substance from scratch all the way up to having a completed Robot character textured and imported into a Unity project that uses the High Definition Render Pipeline.
Read about this here.
Should We Use VR? - Trials & Experimentation
On the 22nd of November 2019, we as a team decided to commit to experimenting with Virtual Reality as part of our game concept. We wanted to quantify the use case for VR integration within our project, so we conducted a study to do just that. I took charge of the background research, assisted in developing our test stage, and authored the study structure.
Read about the process we went through here!
Creating Character Animations with Root Motion in mind
Over the course of this project, I learned about Root Motion in character animation and applied this principle to our G.S.A. Tainted Robot character's animations.
I will go through the process I took to make use of this technique in animating our robot character, the problems I faced, and how I imported the animations into Unity - here.
Creating VR Interactions in Unity with SteamVR
After we shifted to VR, we found ourselves in the position where we needed a dedicated VR developer to design and implement systems for the player to interact with in VR. I took up that mantle. In this section, I will go over the process I took to create a number of VR game mechanics and the problems I encountered using SteamVR in Unity!
Click here to get to the page.