For my Master's thesis at the Georgia Institute of Technology, I explored 3rd person control schemes in VR. This was an expansion of the research I was already involved in with Michael Nitsche on puppet archiving, but the central topic was born from an observation I had made about VR game design: although other 3rd person VR games exist, none of them make use of the hand-tracked controllers. I felt this was an oversight and wanted to explore what a 3rd person VR game could play like. Because of my previous work, I felt puppetry could be a useful practice on which to base my control scheme and interface design.
| Technology | Unity, C#, VRTK |
| --- | --- |
| Role | Sole Developer/Designer |
| Links | Website, Documentation, Github Repo, Executable (VR Required) |
I designed and developed a playable interface prototype for a third person VR video game. In the game, the player controls 3 puppets, each based on a different form of puppetry and a different circus act: one is a strongman, another is a balance beam gymnast, and the last is a juggler. The player's goal is to keep the crowd energized by performing with each puppet in a way that fits its act, e.g. lifting objects as the strongman, walking as the gymnast, or juggling as the juggler.
The final version of the prototype was presented in May 2018, for which I won an honorable mention in a yearly award presented by the faculty of my department. I was also recognized by the Ivan Allen College of Liberal Arts at the GVU Research Showcase 2018.
This question, what a 3rd person VR video game could play like, led to a six-month iterative design process in which I worked with my academic advisor, Michael Nitsche, through weekly design sprints in Unity. The first set of prototypes focused on solving some of the core technical problems involved in manipulating ragdoll characters from more than one point at a time. This was troublesome, but it was a requirement for me to argue in my thesis that the design was based on puppetry practices.
To solve this issue, I iterated on a custom physics tracking and constraint system over the course of the project. The system was built around a PID controller that continuously tracked the position of each puppet's interface while preventing the player from directly manipulating the position of the ragdoll (seen above).
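To give a sense of how this works, below is a minimal sketch of a PID follower in Unity C#. The class, field names, and gains are my own placeholders rather than the actual thesis code; the point is only that the interface handle is tracked by applying forces, instead of setting the ragdoll's transform directly.

```csharp
using UnityEngine;

// Illustrative PID follower (placeholder names, not the thesis code).
// Each physics step, a force pushes the ragdoll rigidbody toward the
// interface handle, so the puppet chases the player's input without
// being snapped to it.
public class PIDFollower : MonoBehaviour
{
    public Transform target;      // the 3D interface handle the player moves
    public Rigidbody body;        // the ragdoll part being driven
    public float kP = 200f;
    public float kI = 10f;
    public float kD = 20f;
    public float maxForce = 500f; // clamp so the puppet can't be yanked through the scene

    Vector3 integral;
    Vector3 lastError;

    void FixedUpdate()
    {
        Vector3 error = target.position - body.position;
        integral += error * Time.fixedDeltaTime;
        Vector3 derivative = (error - lastError) / Time.fixedDeltaTime;
        lastError = error;

        Vector3 force = kP * error + kI * integral + kD * derivative;
        body.AddForce(Vector3.ClampMagnitude(force, maxForce));
    }
}
```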
After these early technical prototypes, the work shifted toward design. I would go back and forth between the two modes: either designing new ideas for puppets and acts, designing interactions and 3D interfaces for those puppets, or solving technical problems that arose from the design work. This cyclical process allowed the design and development to push and help one another. A technical solution could be supported by an interface design devised after the fact, which in turn exposed more problem areas for the tech.
Interface design in particular went through multiple iterations before it reached its current form. The idea from the beginning was to draw from puppetry, but I quickly realized that the "interfaces" used with physical puppets interfere with one's ability to see the actual puppet. That's a problem I could overcome digitally, though. I started with transparent 2D shapes that corresponded to the positions of the body parts, but I found I lost the way physical interfaces afford certain motions. At that point I switched to 3D interfaces, but rendered them in a distinct way relative to the rest of the environment, not unlike how GUIs in games are visually distinct from the rest of the visuals.
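For that kind of visual separation, one common Unity approach (shown here only as a hedged sketch; the layer name and camera setup are assumptions, and the prototype may well have done this differently, e.g. with a custom shader) is to put the interface geometry on its own layer and draw it with an overlay camera on top of the scene.

```csharp
using UnityEngine;

// Hedged sketch: render interface objects on a dedicated layer via a
// second camera so they read as visually separate from the environment.
// The "PuppetInterface" layer is an assumed project setting.
public class InterfaceOverlayCamera : MonoBehaviour
{
    public Camera mainCamera;                      // the player's main camera
    public string interfaceLayer = "PuppetInterface";

    void Start()
    {
        int layerMask = 1 << LayerMask.NameToLayer(interfaceLayer);

        // Main camera stops drawing the interface layer...
        mainCamera.cullingMask &= ~layerMask;

        // ...and a child overlay camera draws only that layer on top.
        GameObject overlayGO = new GameObject("InterfaceOverlay");
        overlayGO.transform.SetParent(mainCamera.transform, false);

        Camera overlay = overlayGO.AddComponent<Camera>();
        overlay.CopyFrom(mainCamera);
        overlay.clearFlags = CameraClearFlags.Depth;
        overlay.cullingMask = layerMask;
        overlay.depth = mainCamera.depth + 1;
    }
}
```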