Now that Midmortems is over and the dust has settled, I can announce that String Theory was selected to move forward for further development!

Production II

My Production II course is the most dynamic and exciting game development course I have taken at Champlain College so far, and it’s only just starting to get serious. For the first three weeks of class, we were tasked with rapid-prototyping a new game idea each week, in teams, and presenting it to the class. Rapid prototyping is no joke; as the lead programmer, I needed strict time management (and a few crunch periods) to bring an idea worth playing to life. The goal of these rapid prototyping sessions is to simulate the process most AAA studios go through in the industry.

Doceratops Studios

My development team during the prototyping phase included:

  • Andrew Rimpici (me!), lead programmer.
  • Tim Carbone, lead designer and associate programmer, whom I’ve happily worked with on all of my Production I games as well.
  • Michelle Lee, our amazing lead artist, who specializes in environment art.
  • Brett Schwartz, lead producer, who is excellent at keeping team morale high while also keeping us organized and on track.

String Theory

During the second week of rapid prototyping, our team decided to create String Theory, a VR yo-yo combat game where the player, as a child, enters a world and uses their yo-yo to fend off enemies, explore levels, and solve puzzles. The game takes place in a secret underground city, and the player’s goal is to find the ultimate treasure that is rumored to be hiding there.

Once the rapid prototyping phase was over, my team decided to continue development on String Theory. Our professor gave us one more week to add extra features and polish before presenting the game to the entire junior class at Midmortems. Yikes! Midmortems is an event where the game teams from all of the Production II classes come together and present their games. After presentations, certain games get cut while others are selected to move forward. Through these cuts, the teams that move forward can then adopt new team members to help ease the development process and workload.

Here is the Midmortem Gameplay Trailer:

String Theory Midmortem Gameplay Demo

The Development Process

As the lead programmer, undertaking a whole virtual reality game seemed intimidating and out of scope at first, especially during the rapid prototyping phase. We decided to push forward with VR because this course is about taking risks, and if the prototype failed, we would have two other prototypes to fall back on. Plus, VR is pretty awesome, and who wouldn’t want to at least try developing for it?

The approach I chose for development was to use Unity3D in combination with the Oculus SDK as a base to build from. As development of String Theory progressed, I noticed that VR programming is not very different from traditional PC programming.

My main focus during the prototyping phase was to make sure the yo-yo felt good to hold and to yo. Our goal was to take how the yo-yo works in the real world and exaggerate the physics so the player can throw it in any direction. The three primary systems I implemented to bring the yo-yo to life were a motion tracker, an input gesture tracker, and a yo-yo state machine.

Motion Tracker

Without the motion tracking framework, String Theory would have no way of knowing how fast the controllers are moving and rotating. I chose to write a simple script, as opposed to using Unity3D’s Rigidbody component, since I only needed to know the acceleration, velocity, and position of each controller. The Rigidbody component comes with a lot of extra overhead and functionality that would have been overkill for my purposes. Besides, Rigidbody does not give access to the object’s acceleration, so I would’ve had to track that on my own anyway.
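
As a rough illustration, here is a minimal sketch of that per-controller tracking in Unity C#. The class and member names are my own placeholders, not the project’s actual code:

using UnityEngine;

// Tracks the position, velocity, and acceleration of a controller by
// differencing its transform across frames. A fuller version would also
// track rotation, but this shows the core idea.
public class MotionTracker : MonoBehaviour
{
    public Vector3 Velocity { get; private set; }
    public Vector3 Acceleration { get; private set; }

    private Vector3 previousPosition;
    private Vector3 previousVelocity;

    private void Start()
    {
        previousPosition = transform.position;
    }

    private void Update()
    {
        // Finite differences: velocity from position, acceleration from velocity.
        Vector3 newVelocity = (transform.position - previousPosition) / Time.deltaTime;
        Acceleration = (newVelocity - previousVelocity) / Time.deltaTime;
        Velocity = newVelocity;

        previousPosition = transform.position;
        previousVelocity = newVelocity;
    }
}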

Input Gesture Tracker

Collecting motion data for each controller then allowed me to create a gesture tracking system that uses motion as one of the deciding factors in which gesture occurred. The yo-yo has many actions that the user can perform, like flicking the controller down to simulate using the yo-yo, flicking the controller outward to simulate attacking, and flicking the controller to the side after throwing to grapple objects. All of these actions need gestures to go along with them, and without the motion tracking system, I would not be able to keep track of them all concisely.

The input gesture tracker uses the motion data to detect when something of interest happens. For example, if the velocity of a controller abruptly changes from one frame to the next, we can assume the controller has either just stopped or started moving based on the direction of motion. Once we have a snapshot where something interesting happened, we can send off an event so systems like the yo-yo state machine can use it.
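
A minimal sketch of that detection loop might look like the following; the threshold value and the event’s shape are illustrative assumptions on my part:

using System;
using UnityEngine;

// Watches the MotionTracker's velocity and fires an event when it changes
// abruptly between frames, which we treat as the start or end of a flick.
public class GestureTracker : MonoBehaviour
{
    public event Action<Vector3> OnFlick; // carries the flick direction

    [SerializeField] private MotionTracker motion;
    [SerializeField] private float flickThreshold = 3f; // velocity change that counts as "abrupt"

    private Vector3 lastVelocity;

    private void Update()
    {
        Vector3 delta = motion.Velocity - lastVelocity;

        // An abrupt change means the controller just started or stopped
        // moving; the delta's direction tells us which way it was flicked.
        if (delta.magnitude > flickThreshold)
        {
            OnFlick?.Invoke(delta.normalized);
        }

        lastVelocity = motion.Velocity;
    }
}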

Yo-yo State Machine

The yo-yo state machine is responsible for keeping track of the current yo-yo state in a clean and expandable way. It would have been straightforward to cram all of the yo-yo states into one script, but as time went on, adding new yo-yo actions would have grown into a nightmare; thus the idea to use a state machine was born.

The state machine uses the event data that is sent from the input tracker to move from one state to the next, keeping the yo-yo code organized and extremely flexible. Another upside to using a state machine is that now the same gesture can lead to different actions based on the current yo-yo state.
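
To make that concrete, here is a minimal sketch of the idea: each state decides for itself how to react to a gesture, so the same flick can mean different things depending on the yo-yo’s current state. The state names and transitions below are placeholders, not the game’s real states:

using UnityEngine;

// Each state interprets a flick in its own way and returns the next state.
public abstract class YoyoState
{
    public abstract YoyoState HandleFlick(Vector3 direction);
}

public class IdleState : YoyoState
{
    public override YoyoState HandleFlick(Vector3 direction)
    {
        // A downward flick while idle throws the yo-yo.
        return direction.y < -0.5f ? new ThrownState() : this;
    }
}

public class ThrownState : YoyoState
{
    public override YoyoState HandleFlick(Vector3 direction)
    {
        // The same sideways flick now means a grapple instead of a throw.
        return Mathf.Abs(direction.x) > 0.5f ? new GrappleState() : this;
    }
}

public class GrappleState : YoyoState
{
    public override YoyoState HandleFlick(Vector3 direction) => this;
}

// The owner subscribes to the gesture tracker and delegates to the current state.
public class YoyoController : MonoBehaviour
{
    [SerializeField] private GestureTracker gestures;
    private YoyoState state = new IdleState();

    private void OnEnable() => gestures.OnFlick += Handle;
    private void OnDisable() => gestures.OnFlick -= Handle;

    private void Handle(Vector3 direction) => state = state.HandleFlick(direction);
}

Adding a new yo-yo action then means adding one new state class, rather than threading another flag through a monolithic script.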

Closing Remarks

I want to thank all of my team members for their hard work. We’ve grown so close through this exciting process, and I wouldn’t have been able to do this without them. There is still a long way to go now that we are moving into full development for the rest of the year, but it’s gratifying to see how much my team and I were able to get done in just two weeks of development.