
Jukebox Beatdown, Producing, Games, Design, Programming, XR

Blog Post #4: The Finish Line

This most recent sprint was one of the most challenging but fulfilling sprints yet!

The end of this sprint coincided with the final submission deadline for the Oculus Launchpad program, so it was time to finish the project’s remaining essential features while also building out promotional materials and responding to playtest notes. Needless to say, I didn’t sleep much. 😊

The Oculus Launchpad program only distributes grants to a handful of its more than 100 participants based on the merit of their vertical slices, so my demo had to look and play its best. I didn’t have the money to continue building Jukebox Beatdown beyond my vertical slice, so the project’s survival depended on the grant.

For my project to be eligible for Oculus’s consideration, it had to meet the following requirements:

  1. Provide a complete experience in five minutes or less.

  2. Meet the Oculus Store’s technical standards, especially in terms of performance.

  3. Be accompanied by at least a basic press kit.

In this post, I’ll discuss how I reached each of these goals, which are divided nicely into the categories of Design, Engineering, and Marketing:

  1. DESIGN: Building the game’s five-minute demo by scripting the attacks of Funky Franny, the game’s first boss. By doing so, we hoped to achieve the game’s central promise of boss fights set to music.

  2. ENGINEERING: Optimizing the game so that it met the Oculus Store’s technical requirements (and didn’t trigger VR-nausea).

  3. MARKETING: Building a professional and exciting presence for the game online through my press kit.

The final banner art for the project.

Design

This was far and away the most fun sprint of this game’s production because I finally had the technical foundation to fulfill the game’s central promise: boss battles set to music.

As in the movie Fantasia, we wanted all the bosses in Jukebox Beatdown to have their actions choreographed to music.

This was an intimidating goal. The song I commissioned for this project had multiple saxophone, horn, drum, synth, bass, guitar, string, and keyboard tracks, each with its own unique melody or rhythm. It was a bit overwhelming to decide which sound would be paired with which attack.

To make this process easier, Ben Young, the game’s composer, designed our song so that it had four distinct, self-contained phases. We thought that these self-contained phases would make it easier for the player to notice when the music matched with the action of the game.

A screenshot of the Ableton project for the game’s song.

To cut through the noise (pardon the pun), I made a list with two columns for each of these four sections. One column had all the tracks in that section (saxophone, guitar 1, guitar 2, etc.) and the other column had a list of attacks I could create for the game. This list was an anthology of riffs on attacks I had seen in other games with great bosses, such as Dark Souls 3, Cuphead, Titan Souls, and Furi, plus some inventions of my own.

From there, it became a jigsaw puzzle of music, visuals, and gameplay. Using the music, which was basically set in stone, as a starting point, I tried to pair the attacks and music together. The design process was complex and went like this: what fits a guitar melody better, a flurry of lasers or a series of rapidly-exploding bombs? If I tie the lasers to the guitar, that means I can’t use them for the saxophone, so what should I tie into the saxophone? If the lasers are more difficult than the bombs, maybe they should be with the saxophone, which comes in predominantly at the end of the song – but now I am using the bombs for the horns, so what should go with the guitar now? Moving one element displaced another.

This process of experimentation continued until we had built out the first several phases of Franny’s fight:

However, now that we had added all this new content, our framerate had dropped into the teens. It was time to put our engineering hat on and optimize the project.

Engineering (Optimization)

From an engineering perspective, VR development can be one of the most challenging forms of game development, largely because even the slightest amount of lag can make players sick. If you’ve ever played a new game on older hardware, you’ve probably noticed the game’s graphics stutter for a moment before updating to where they should be. This is never desirable, but it is especially damaging in VR, where that momentary lag can leave the player nauseous.

The dangers of lag.

People get nauseous when the stimuli that their eyes are receiving do not match the stimuli that the vestibular system in their ears is receiving. (The vestibular system gathers and transmits data about our body’s position through several organs in the inner ear.) To minimize the difference between these stimuli, Oculus requires all desktop-connected VR content to run at 90 frames per second.

Not my diagram. I would credit this but the original link is now a 404.

Unfortunately, after my first design pass, my game was running at somewhere between 15 and 30 FPS on my mid-range Razer laptop. To hit 90 FPS, I had to use many optimization tricks, including:

  1. Using object pools as I mentioned in my previous blog post.

  2. Eliminating almost every nonessential UI canvas object from my scene in Unity, as they were constantly being recalculated, putting unnecessary stress on my CPU.

  3. Eliminating almost every dynamic light in the scene and replacing them with lightmapping, which is essentially the practice of “painting in” a scene’s lights and shadows beforehand rather than simulating them at runtime.

However, the most impactful step for me was reducing my draw calls.

A draw call is the set of directions that the CPU (Central Processing Unit) gives to the GPU (Graphics Processing Unit) about what to render in a scene. Specifically, a draw call contains the information that the GPU needs to render each object in the scene. While most computers’ GPUs do not struggle to execute these directions once received, preparing them puts significant strain on the CPU, which results in lag.

To use a filmmaking metaphor, you can imagine the CPU as a location scout and the GPU as a lightning-fast set-builder. In the context of the metaphor, the CPU is visiting the “real location” and sending instructions back to the GPU on what to build in the movie set. The CPU/location scout looks at the objects that make up the “real location” and communicates every visual detail about them to the GPU/set-builder, who recreates them. However, to extend the metaphor, the CPU/location scout is using a slow fax machine, so sending these details to the GPU/set-builder takes a long time and can slow down the entire process. Thus, the more succinct the directions, the faster the GPU will be able to build the set. We’ll use this metaphor as a way of explaining some of these optimization techniques.

Below is a timelapse that shows a scene in Jukebox Beatdown being rendered one draw call at a time.

To reduce my draw calls, I used two techniques: mesh-baking and texture-atlasing.

Mesh-baking is when a developer takes several meshes (3D models) in their level/scene and turns them into one mesh. If we bake three meshes into one, our CPU will now need to process one draw call for those three meshes instead of three. In the context of Jukebox Beatdown, we generally baked together most meshes that shared the same shader, which is code that dictates how an object reacts to light. Our floor, for example, was made of about sixty different meshes; we baked these into one object.

To continue our movie metaphor, now that the meshes are baked together, our CPU/location-scout can describe, for example, a group of ten stereos as a group of ten stereos rather than communicate information about each stereo one-by-one. Put another way, it’s the difference between counting bowls of rice versus counting individual grains of rice. Preparing and sending the directions is the bottleneck in this context, so using succinct instructions is paramount.
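If you’re curious what this looks like in practice, below is a minimal Unity C# sketch of runtime mesh-baking. This is an illustration rather than our actual code; it assumes an empty parent object whose children all share one material:

```csharp
using UnityEngine;

// A minimal mesh-baking sketch: merges every child mesh into one mesh
// so the whole group costs a single draw call instead of one per child.
// Assumes the children all share the same material.
public class MeshBaker : MonoBehaviour
{
    void Start()
    {
        MeshFilter[] filters = GetComponentsInChildren<MeshFilter>();
        var combine = new CombineInstance[filters.Length];

        for (int i = 0; i < filters.Length; i++)
        {
            combine[i].mesh = filters[i].sharedMesh;
            // Bake each child's position/rotation/scale into the merged mesh.
            combine[i].transform = filters[i].transform.localToWorldMatrix;
            filters[i].gameObject.SetActive(false); // hide the originals
        }

        var baked = new Mesh();
        baked.CombineMeshes(combine);

        gameObject.AddComponent<MeshFilter>().sharedMesh = baked;
        gameObject.AddComponent<MeshRenderer>().sharedMaterial =
            filters[0].GetComponent<MeshRenderer>().sharedMaterial;
    }
}
```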

Texture-atlasing is the process of aggregating all of a game’s textures onto one image file. If a project is not texture-atlased, every texture in the game is contained within a unique image file. The problem with this setup is that as the number of unique images goes up, the number of draw calls goes up as well. So, in order to minimize the number of images that need to be sent by the CPU, developers will pack as many textures as they can onto one image, or texture atlas. The GPU will then look at this atlas for every texture that it needs.

In our location-scouting metaphor, texture-atlasing would mean that instead of taking pictures of every scene in our metaphorical “real-location” and sending them through the slow fax machine, our CPU is instead sending one page that contains all the pictures.
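Unity can even build an atlas in code at runtime. We atlased our textures ahead of time, so treat the following as a sketch of the concept rather than our pipeline:

```csharp
using UnityEngine;

// A texture-atlasing sketch: packs many source textures into one atlas
// so renderers can share a single texture (and often a single material).
public class AtlasBuilder : MonoBehaviour
{
    [SerializeField] private Texture2D[] sourceTextures; // must be readable

    void Start()
    {
        var atlas = new Texture2D(2048, 2048);

        // PackTextures returns one UV rect per source texture describing
        // where that texture landed inside the atlas.
        Rect[] uvRects = atlas.PackTextures(sourceTextures, 2, 2048);

        // To actually use the atlas, each mesh's UVs must be remapped
        // into its texture's rect so it samples the right region.
        Debug.Log($"Packed {uvRects.Length} textures into one atlas.");
    }
}
```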


A texture atlas for the buildings in a virtual city.

All these changes together helped us reach our technical performance goals. Now, it was time to make sure our project’s marketing was as engaging as the project itself.

Marketing

The Oculus Launchpad program is highly competitive, with only three to five grants awarded to around fifty entries. I knew that some of the other entrants had teams that were five or more people strong (compared to my two) and had significantly larger budgets than I did, so I knew my project needed to look as polished and exciting as possible.

At the Oculus headquarters in the San Francisco area for the presentation day.

For my project to receive the grant, it had to look professional. I knew that I had the skills to reach the required level of polish as a designer and a programmer, but maybe not as a voice-actor, graphic designer, or 3D modeler. Even if I did have the skills for those latter responsibilities, I knew I didn’t have the time to do them all.

I had $200 and a month to get the project ready for Oculus.

To ensure that I would get everything I needed by the grant deadline, I made a list of what features and assets would constitute the vertical slice of Jukebox Beatdown and then planned backwards from that date (February 28th). I then prioritized each of these items by how important they were and how much lead time they would need in order to be completed. My scrum training came in handy here.

From there, to decide which items I would outsource, I took a “time is money” approach. I estimated how long each item would take in hours if I did it myself and then multiplied that number by my hourly pay. I then compared how much it would cost to pay someone to do the same job on Fiverr. When a task was significantly cheaper to do via Fiverr, I outsourced it.

Ultimately, I ended up outsourcing a considerable amount of work, including the voice-acting, poster design, logo design, and character artwork. I spent $200 between several vendors to get all these items. This amount of work took about two weeks to be delivered.

In the gallery below, you can see the final artwork followed by the original sketch I sent to the Fiverr artists:

For this outsourcing, I used Fiverr, a freelancing website, which was a great experience. If you decide to use Fiverr for your work, consider using my referral code, which will give you 20% off your first order: http://www.fiverr.com/s2/5b56fb53eb

Full disclosure: I will receive Fiverr credits if you sign up with the above link as part of their referral program. I am not affiliated with them in any other way. My opinions are my own.

Presentation

With my game built, optimized, and marketed, it was time to fly to Menlo Park, CA and present the game:

Next Steps (Several Weeks Later)

Unfortunately, Jukebox Beatdown did not win any of the Launchpad grants. However, I am very happy with the project and will be polishing the vertical slice for release on itch.io so that others can play it.

Amongst other small tweaks, this next sprint will be about:

  1. Adding more attacks to Funky Franny’s moveset.

  2. Adding a system that ranks the player’s performance within the boss battle like this.

  3. Giving the game more “juice,” which means to make the game feel more visceral.

Thank you for following the development journey of Jukebox Beatdown! You can expect one or two more blog posts about the project followed by a release on itch.io!

Jukebox Beatdown, Awards and Festivals, Design, Games, XR

I Gave a Talk About my Game Jukebox Beatdown at the Oculus/Facebook Headquarters!

In February, I had the chance to talk about my rhythm-combat game Jukebox Beatdown at the Oculus / Facebook headquarters as part of the Oculus Launchpad program’s Demo Day!

During Demo Day, all of the Oculus Launchpad members are invited to Northern California to present their projects to the Oculus leadership and the other Launchpad members.

Check out a recording of the talk below!




Jukebox Beatdown, Design, Games, XR, Programming, Producing

Jukebox Beatdown Development Blog #3: Hitting Reset

This past month working on Jukebox Beatdown has been demanding but productive: I rebuilt the game’s mechanics, reconstructed my codebase, and recruited new team members. In this blog post, I will update readers on new features within Jukebox Beatdown. Along the way, I will also talk about the challenges I faced and how I approached them as a designer and software engineer.

(The section entitled Engineering is structured more like a tutorial, so feel free to skip if you are not interested in computer science.)

Want more Jukebox Beatdown? Join our mailing list:

Overview

My design process is very player-centric, so playtesting (asking friends to give feedback on the game) is a crucial part of my process. My friends’ feedback provides direction for the next phase of development. If you have been following this blog, you may remember that my most recent playtest session gave me four primary notes:

  1. The game needs more “juice.” In other words, there needs to be more feedback for the player’s inputs. More simply, the gameplay is not satisfying.

  2. If the game is going to be marketed as a rhythm game, music needs to be a bigger part of the game’s mechanics.

  3. It needs to be clear that the player’s hands are the player, not their head. Alternatively, this mechanic needs to be reworked.

  4. Most importantly, the core gameplay loop (“boss fights synced to music”) sounds compelling, but the most recent execution of that idea is not engaging players.

This blog post will cover three main topics: Design, Engineering, and Producing.

A still from the previous iteration of Jukebox Beatdown. Dr. Smackz, the giant boxer pictured above, punched to the beat of Mama Said Knock You Out by LL Cool J.

Design

I am generally a bottom-up designer, which means that I try to find a fun, exciting, and unique mechanic and then build the other aspects of the game (story, art, engineering) around that mechanic.

While the above adjectives are subjective terms, there are a few concrete questions that can confirm the presence of each:

  1. Fun: If a player plays a prototype of my game, do they want to play it a second time?

  2. Exciting: When someone hears the elevator pitch for my game, do they ask a follow-up question?

  3. Unique: When someone hears the elevator pitch for the game, do they assume it is a clone of an existing game? (E.g., “So it’s basically Beat Saber?”)

As I mentioned in my previous blog post, Jukebox Beatdown was passing the “Exciting” test but failing the “Unique” and “Fun” tests. People were hooked by the pitch but bored by the gameplay. They also perceived the game as being a Beat Saber-clone, when it was actually a rhythm game crossed with a bullet-hell game.

Beat Saber is one of the most famous VR games and by extension, the most famous VR rhythm game. Due to its fame, I wanted to steer clear of any mechanic that resembled Beat Saber too closely.

Given this, I decided it was time to start over and try to create a new mechanic that engaged players and incorporated music. If I could not create a new mechanic that passed these requirements in two weeks, I would put the project on ice.

My previous game’s most popular minigame revolved around using tennis rackets to keep falling eggs from hitting the ground. It was a weird game, but a fun one!

My last VR game had had success with a basic “bat” mechanic (you had to bounce eggs falling from the sky with tennis rackets), so my first inclination was to prototype a version of that mechanic that could work in the context of Jukebox Beatdown. I created a “Beat Bat” that the player could swing at enemies. If they hit an enemy on the beat, which was designated by a bullseye-like icon that shrank as the beat approached, they would get a critical hit.

A very quick screen-grab of the Beat Bat prototype. It didn’t feel intuitive to me and it felt too similar to Beat Saber.

As a player, I found this mechanic difficult and awkward. It also felt too much like Beat Saber, so I went back to the drawing board once more.

My next idea was to have the player shoot on the song’s beat in order to apply a damage multiplier to each bullet. I was worried that this mechanic would feel shallow, but I also figured it would be accessible to less musically-inclined players, so I built a quick prototype. My first prototype rewarded the player with a more powerful bullet when they shot on the beat in 4/4 time regardless of whether a note was played at that time. I liked this mechanic, but it felt too basic and unsatisfying.
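For reference, here is roughly what that on-beat check can look like in Unity. This is a hedged sketch with assumed values (the BPM and the tolerance window are illustrative), not my music plugin’s actual API:

```csharp
using UnityEngine;

// A sketch of on-beat detection: a shot counts as "on the beat" if it
// lands within a small tolerance window around the nearest quarter note.
public class BeatClock : MonoBehaviour
{
    [SerializeField] private float bpm = 120f;               // assumed tempo
    [SerializeField] private float toleranceSeconds = 0.08f; // ~80 ms window

    private double songStartDspTime;

    public void StartSong() => songStartDspTime = AudioSettings.dspTime;

    public bool IsOnBeat()
    {
        float beatLength = 60f / bpm; // seconds per beat in 4/4
        float songTime = (float)(AudioSettings.dspTime - songStartDspTime);

        // Distance from the current time to the nearest beat, before or after.
        float offset = Mathf.Repeat(songTime, beatLength);
        float distance = Mathf.Min(offset, beatLength - offset);

        return distance <= toleranceSeconds;
    }
}
```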

To learn how to make the rhythm mechanic more compelling, I decided to study existing rhythm games made for both VR and traditional platforms. I studied Audica, Beat Saber, and Rez Infinite, but by far the most useful game to play was Pistol Whip. Pistol Whip was useful partly because it had a similar shoot-on-the-beat mechanic, partly because its implementation of that idea was frustrating, and partly because it was built for a different kind of audience. These elements made me think about how Jukebox’s mechanic could be different and, I thought, more satisfying to its audience. (As a side note, all of the games mentioned above are excellent. My notes below reflect my personal tastes rather than my opinion of the games’ quality. They are all extremely well-crafted.)

Below were my main takeaways and notes from playing those games:

Pistol Whip:

Pistol Whip is an on-rails VR shooter in which you get additional points when you shoot on the beat.

  • The shoot-on-the-beat mechanic always applied even if there was no instrument playing on the beat. This created awkward moments in which you had to shoot on the beat but there was no way for you to know when the beat was occurring besides a faint visual tremor in the level’s environment art and a haptic effect on your controller.

  • The shoot-on-the-beat mechanic served no strategic purpose; this made it less compelling. As far as I could tell, there was no incentive to shoot on the beat besides raising your score. I felt that this limited the appeal of the game to people who care about high scores. As someone who never felt strongly about leaderboards, this made the mechanic less interesting to me. (There are people who love this kind of mechanic, so points and leaderboards are a great idea; I just felt the mechanic was a missed opportunity.)

  • The feedback for shooting on the beat was too subtle: when you shot on the beat in Pistol Whip, the only feedback you got was a red dot above the enemy you shot. This felt like a missed opportunity to reward players.

  • Your hitbox was your head: in Pistol Whip, you are your head. In other words, to dodge the enemy’s bullets, you need to move your head around. I’m not a fan of this design because:

    • I personally dislike standing up for long periods of time to play a game.

    • I worry about banging my head against my furniture.

    • My hands afford me finer control than my head does.

    • This control pattern makes the game inaccessible to some players with disabilities.

Audica:

Audica is a music-shooter from Harmonix, the creator of Guitar Hero. It was one of my favorites.

  • A rhythm-game can be made more interesting by requiring the player to hold down the trigger at times. This was a mechanic I had not seen in many other VR rhythm games and which I may incorporate into Jukebox Beatdown in the future.

  • Audica has fantastic particle feedback for every successful hit. Particle feedback is highly satisfying.

Rez Infinite:

Rez Infinite is a critically acclaimed music-action-adventure game in which your shots are timed to the music.

  • Rez Infinite made the interesting choice to ensure that the player’s bullets always hit the enemies on the beat: rather than shooting enemies directly, the player locks on to them and then fires homing missiles. When the beat played, the missiles would fire out of the player and hit the locked-on enemies so that it appeared the player had hit the enemy in perfect time with the beat. I want to recreate this effect with the homing missiles Jukebox Beatdown’s bosses will use against the player.


With these notes in mind, I built a new prototype with changes that I felt made the gameplay more interesting:

  • Shooting-on-the-beat became a risk vs. reward mechanic. If a player shot on the beat consistently, they would be awarded an increasing damage multiplier: their first shot on the beat would multiply their damage by two, their next shot would multiply their damage by three, and so on. However, if the player missed the beat or was hit by an enemy, their multiplier would reset to one. This gave players two options: they could either time their shots to the beat in pursuit of a high damage multiplier (but have to lower their firing rate to do so) or they could ignore the multiplier and simply blast away, making up for their lack of damage per bullet with a higher firing rate. (A simplified sketch of this logic follows this list.)

  • Shooting-on-the-beat was made more dramatic. As the player shot on the beat, their bullets would grow larger and change color. Additionally, a Guitar Hero-esque combo counter was tied to the player’s hands.
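Here is the simplified sketch of the multiplier logic mentioned above (the class and method names are illustrative, not the game’s actual code):

```csharp
using UnityEngine;

// A sketch of the risk-vs-reward multiplier: consecutive on-beat shots
// grow the damage multiplier; an off-beat shot or taking a hit resets it.
public class DamageMultiplier : MonoBehaviour
{
    private int multiplier = 1;

    // Called whenever the player fires. 'onBeat' would come from the
    // music-sync system (e.g., the BeatClock sketch earlier in this post).
    public float RegisterShot(bool onBeat, float baseDamage)
    {
        multiplier = onBeat ? multiplier + 1 : 1; // first on-beat shot = x2
        return baseDamage * multiplier;
    }

    // Getting hit by the boss also resets the combo: the "risk" half.
    public void OnPlayerHit() => multiplier = 1;
}
```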


With the shoot-on-the-beat mechanic on firmer ground, it was time to incorporate more music into the boss’s attack patterns. In my previous prototype, I had programmed the prototype’s boss, a giant boxer, to punch the player on the beat. However, almost none of my playtesters perceived that the boss’s attacks were tied to the music, and some even expressed that they wished they had been!

It was time to turn things up a notch. I felt that if players did not recognize events on the beat, they might recognize specific actions tied to specific notes. With some engineering secret sauce, I put together a pipeline that automatically tied notes to specific game-events. For example, a C# note could fire a rocket while a B note would shoot a laser.
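I can’t share the whole pipeline here, but the note-to-event mapping at the end of it can be as simple as a dictionary dispatch. A hedged sketch (the callback and attack methods are hypothetical):

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

// A sketch of note-to-event routing: the music-sync layer reports which
// note just played, and a lookup table fires the matching attack.
public class NoteEventRouter : MonoBehaviour
{
    private Dictionary<string, Action> noteActions;

    void Awake()
    {
        noteActions = new Dictionary<string, Action>
        {
            { "C#", FireRocket }, // a C# note fires a rocket
            { "B",  ShootLaser }  // a B note shoots a laser
        };
    }

    // Called by the music-sync plugin whenever a note plays
    // (hypothetical callback name).
    public void OnNotePlayed(string noteName)
    {
        if (noteActions.TryGetValue(noteName, out Action attack))
            attack();
    }

    private void FireRocket() { /* pull a rocket from the object pool */ }
    private void ShootLaser() { /* pull a laser from the object pool */ }
}
```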

The red bullets that the Beat Brothers are dodging are triggered by notes within the song’s melody.

However, as I will note in the Looking Forward section, this change was still too subtle for players to notice.

Engineering

This section is more technical in nature and resembles a tutorial. If you are not interested in computer science, feel free to skip this section.

After I implemented the above mechanics, I found that the game had two big problems: poor performance and significant disorganization.

Performance:

Due to the high number of projectiles on screen and some funkiness with my music-syncing plugin, Jukebox was crashing within ten seconds of starting.

To fix this, I implemented a game programming pattern called an Object Pool. The purpose of an Object Pool is to enable a program to generate and manipulate large groups of in-game objects without adversely affecting performance. Large groups of objects can cause problems because the operations for Creating and Destroying these objects are computationally expensive, especially when executed many times per frame. To sidestep this issue, the Object Pool instead generates objects at program start and places them within a “pool” of similar objects. When one of these objects is required, it is activated and moved to where it needs to be. Once it is no longer needed, it is deactivated until it is required once more. This improves performance significantly because it removes the need to perform many expensive Create and Destroy operations.

In the case of my game, this pattern was a lifesaver because the gameplay evolved to include up to 80 bullets on-screen at any given time. With this pattern in place, I was able to eliminate crashes.
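A minimal version of the pattern looks something like this (a sketch in the spirit of the game’s pool, not its exact code):

```csharp
using System.Collections.Generic;
using UnityEngine;

// A minimal Object Pool: every bullet is Instantiated once at startup;
// Spawn and Despawn just toggle bullets on and off, so no expensive
// Create/Destroy calls happen during gameplay.
public class BulletPool : MonoBehaviour
{
    [SerializeField] private GameObject bulletPrefab;
    [SerializeField] private int poolSize = 80; // max bullets on screen

    private readonly Queue<GameObject> pool = new Queue<GameObject>();

    void Awake()
    {
        for (int i = 0; i < poolSize; i++)
        {
            GameObject bullet = Instantiate(bulletPrefab, transform);
            bullet.SetActive(false);
            pool.Enqueue(bullet);
        }
    }

    public GameObject Spawn(Vector3 position)
    {
        if (pool.Count == 0) return null; // pool exhausted; never Instantiate mid-game

        GameObject bullet = pool.Dequeue();
        bullet.transform.position = position;
        bullet.SetActive(true);
        return bullet;
    }

    public void Despawn(GameObject bullet)
    {
        bullet.SetActive(false);
        pool.Enqueue(bullet); // back into the pool for reuse
    }
}
```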

As the bullets go offscreen, they are deactivated and returned to the pool. From RayWenderlich.com, one of the many resources I used to learn how to create an Object Pool. Click on the link above to learn more about Object Pools.

Organization:

Once I felt more confident about the project’s direction, it was time to refactor my code.

During prototyping, my code had gotten messy, and I found myself losing time because I was writing multiple versions of the same few functions for similar classes. For this reason, I decided to create some basic Inheritance Trees.

If you are not familiar with Inheritance Trees, they are a way of organizing code so that it incorporates basic is-a relationships. Is-a relationships are useful because they allow us to define objects using abstract terms rather than stating every attribute of every object from scratch. The textbook example is classifying animals:

Assume that you do not know what a Dog is, but you do know what an Animal is. If I tell you that a Dog is an Animal, you may, for example, know that an Animal is living and that it can reproduce, so a dog must also be able to do those things. Rhinos and Chickens, by virtue of being Animals, must also have these attributes.


Assume now that you know what a Dog is but you do not know the traits of particular Dog breeds. If you know a Dog has ears and four legs, you can assume that a Greyhound, Chihuahua, and Great Dane do as well. That is the value of an Inheritance Tree: it enables you to define objects/classes without having to repeat yourself.

To write the inheritance tree for my health functionality, I took all my health classes and wrote down all their properties and functions. I then identified which properties and functions I wanted in every object that had health. These “essential” functions and properties were then put into my most basic health class. After this class was written, I worked myself “down” the tree, creating increasingly more specific classes as necessary.

One advantage of an inheritance tree like this is that it helped me enforce certain design standards. For example, I wanted every object to play a sound effect and spawn an explosion effect when it “died” so that combat always felt dramatic. By defining this functionality in my base Health class, it was included in all the sub-classes (descendants of the base Health class like EnemyHealth, PlayerHealth, and BossHealth), so I did not have to remember to write the same functionality for every sub-class.
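In code, the idea looks roughly like this (a simplified sketch of the approach, with illustrative class names):

```csharp
using UnityEngine;

// The base class owns the shared "every death is dramatic" rule;
// sub-classes inherit it for free and only add what is specific to them.
public class Health : MonoBehaviour
{
    [SerializeField] protected float maxHealth = 100f;
    [SerializeField] protected AudioClip deathSound;
    [SerializeField] protected GameObject explosionPrefab;

    protected float currentHealth;

    protected virtual void Awake() => currentHealth = maxHealth;

    public virtual void TakeDamage(float amount)
    {
        currentHealth -= amount;
        if (currentHealth <= 0f) Die();
    }

    // Defined once here: a sound and an explosion on every death.
    protected virtual void Die()
    {
        AudioSource.PlayClipAtPoint(deathSound, transform.position);
        Instantiate(explosionPrefab, transform.position, Quaternion.identity);
        Destroy(gameObject);
    }
}

// A sub-class only overrides what it needs; the drama comes from the base.
public class BossHealth : Health
{
    protected override void Die()
    {
        // e.g., advance to the next musical phase here, then...
        base.Die(); // ...play the shared death effects.
    }
}
```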

Producing

One of the more challenging aspects of this project has been finding appropriate and compelling music on a tight budget.

Fortunately, I’m excited to announce that Benjamin Young, a composer whose credits include Love, Death, & Robots and Star Wars: The Old Republic, will be composing and recording an original disco song for the game’s vertical slice! Check out more of Ben’s awesome music here!

For the last bit of exciting news, I’m happy to say that we have a new logo! In my next blog post, I will introduce our new poster and concept art as well!


Looking Forward

After implementing the above changes, I hosted a playtest party in December to see how players would react.

The response from the twelve playtesters was generally positive, and it was clear that the next move would be augmenting these changes rather than changing direction once more.

The critical feedback grouped around four main points:

  • The shoot-on-the-beat mechanic could be made more complex. In its current iteration, the shoot-on-the-beat mechanic is tied to the snare drum section of Disco Inferno. Some playtesters felt that the rhythm component could be made more complex.

  • The boss’s attacks need to be made more interesting. At present, the boss follows a simple attack pattern in which he moves around the stage and then does a special attack in which he spins while shooting many green pellets.

  • The player needs more interesting choices within the game-loop. The risk-vs-reward dynamic in the shoot-on-the-beat mechanic is interesting, but I can give the player more opportunities to make interesting choices. For example:

    • Create “beat zones” that reward the player additional points when the player hits the beat within them. Make this a risk vs reward mechanic by putting the zones in dangerous spots.

    • Build a mechanic around holding down the trigger.

    • Reward the player for alternating which Brother/hand shoots on the beat.

  • There needs to be clearer feedback for the player’s actions. Combos, hits, and boss attacks need more visual and audio feedback.

Thank you for following the development of Jukebox Beatdown! To get blog posts, beta-invitations, and launch coupons for the game delivered to your inbox, sign up for our mailing list below:

Jukebox Beatdown, Design, XR, Awards and Festivals

Exciting News: I have been accepted into Oculus's 2019 Launch Pad program!

I’m excited to announce that I have been accepted into the 2019 iteration of Oculus’s Launch Pad program!

What is Launch Pad?

If you’re not in the world of VR, Oculus is the world’s preeminent VR hardware company. They are known for building some of the world’s most popular VR headsets, including the Oculus Rift and the Oculus Quest, their amazing new standalone VR headset.

The Oculus Quest.

The purpose of Oculus’s Launch Pad program is to populate the VR ecosystem with new and diverse content. At the start of the program, one hundred developers from North America are invited to San Jose to attend a two-day VR bootcamp led by Oculus. They are also invited to Oculus Connect, Oculus’s flagship VR conference, that same week. After this initial training, Launch Pad members are provided technical support as they develop vertical slices of the projects they initially pitched to Oculus in the application stage. In early 2020, these developers will have the opportunity to pitch their vertical slices to Oculus again in hopes of gaining funding and, ideally, launching their games on the Oculus Store.

Some amazing projects have come out of Launch Pad in previous years, including Bizarre Barber, an awesome VR action game from NYU. I am thrilled to have the opportunity to work towards creating a VR demo of the same caliber.

For my application to Launch Pad, I submitted Jukebox Beatdown, a VR boss-rush game in which every boss fight is a distinct interactive music video.

In Jukebox Beatdown, you play as Kleft and Kright, two up-and-coming alien musicians who are tied to the player’s left and right hands respectively. Your goal is to make it to the top of the Billboard Galaxy Top 10. To do so, you will need to battle the existing Top 10 musicians in a series of fast-paced, music-themed boss fights.

For more detailed information about Jukebox Beatdown, please read my initial post about the project.

Kleft and Kright’s first album. If these characters look familiar, that is because they are Goopy Le Grande from Cuphead (2017) on the left and Slime from Dragon Quest (1986) on the right. They will be replaced when we finalize our concept art.

So What’s Next?

Right now, I am exploring the best way to build an awesome vertical slice of Jukebox Beatdown. Since the game is made up of a series of boss fights, I think the most logical vertical slice would be a single boss fight.

To create this vertical slice, I will need to achieve the following:

  1. Find or commission original music for the boss’s score.

  2. Nail down the game’s mechanics as I outlined in the previous blog post.

  3. Sync the game’s visuals to its music in a satisfying and clear manner.

  4. If there is time, optimize the project so that it approaches the technical requirements for the Oculus store.

Most likely, I will not get to step four and will not totally complete step three. However, I think the game should be able to stand on its own should that happen. In game producing, I believe you should find what makes your game fun first and then build everything else around that element.

I’m excited to see San Jose and attend Launch Pad!


Will you be at Oculus Connect and/or Launch Pad? If so, fill out the form below and we can meet up!

Real Al's Human Academy, XR, Design, Games

Design Retrospective: Real Al's Humanity Academy #3: Crafting Game Mechanics

In this series of blog posts, I will talk about my process as a producer, designer, and programmer of Real Al’s Humanity Academy. I’ll discuss why I made the decisions I did in each of these roles to give readers a better idea of how I approach these disciplines. The intended audience of these posts are potential employers, collaborators, and/or fans of the game who want a more behind-the-scenes look.

In this post, I will discuss how our team designed the project’s game mechanics.


In the previous posts, I discussed why our team set out to make a VR party game and why we thought a minigame collection was the best fit for our team.

With our gameplay concept selected, we brainstormed individual gameplay mechanics. As in Warioware or Dumb Ways to Die, each minigame would be based around a simple mechanic. When the game progressed, the minigames would become increasingly complex and difficult.

The Dumb Ways to Die minigame collection was another source of inspiration for our game.

A core principle that we kept in mind throughout this process was to design every minigame for VR and only VR. At that time, there were many VR titles on the market that were essentially 2D experiences shoehorned into the medium. We wanted to make something that could only be experienced in VR and that utilized VR’s strengths, such as its physicality and 360 degree environments.

One of my favorite aspects of VR is its physicality. Like the Wii remotes before them, the HTC Vive’s motion controllers feel fantastic when you use them to perform physical tasks like swinging tennis rackets, boxing heavyweights, or climbing mountains. To brainstorm minigame ideas, I would grab a Vive controller and play with it like a toy until I found a motion that I found satisfying. Then, I would try to spin this action in a surprising way. For our “Bounce the Eggs” minigame, I first started with the motion of hitting a tennis ball. At that point, I prototyped surreal variants of this action, like the balls transforming into balloons on impact or the balls shooting out from the floor. Somehow, we arrived at eggs falling from the sky in a supermarket.

Our “Bounce the Eggs” minigame in action.

Another aspect of VR that I loved was the fact you could build a game all around the player. There’s something magical about turning around while in VR and finding a whole new side of the environment for you to explore. We applied this thinking to our game by designing every minigame so that you had to turn around and explore your environment to win.

Early in my time at NYU, I went to a VR masterclass that had Saschka Unseld, the director of Oculus Story Studio, in attendance. He said:

Film is about showing, not telling.

VR is about discovering, not showing.

I found this statement to be true in my time as a VR consumer and we tried to apply this paradigm to our design of the minigames. Imagine that each level is divided into a north, east, south, and west quadrant. Whenever it was possible, we tried to put a compelling and unique gameplay or art element in each quadrant so that the player would always feel rewarded when they turned around.

The “Grab the Numbers” minigame forced players to turn around because the numbers they needed were sometimes behind them.

I loved this phase of production because our team had the chance to experiment. We made minigames where you popped balloons, smashed computers, played blackjack with floating cards, and dodged evil bees.

To externally validate these designs, we were constantly playtesting our ideas. In order to expedite playtesting, I would parameterize the settings of the various minigames and then have playtesters try variants of the same minigame one after the other until they felt just right. For the egg bouncing minigame, I would modify the eggs’ speed, size, color, and sound effects until every bounce felt satisfying and fair. This method of iteration was one of my favorite parts of the process.
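One way to parameterize minigame settings like this in Unity is with a ScriptableObject, so values can be tweaked in the Inspector between playtest sessions without touching code (the field names here are illustrative, not our actual asset):

```csharp
using UnityEngine;

// A sketch of parameterized minigame settings: designers tweak these
// values between playtest sessions instead of editing code.
[CreateAssetMenu(menuName = "Minigames/Egg Settings")]
public class EggMinigameSettings : ScriptableObject
{
    public float fallSpeed = 2.5f;   // how fast eggs drop
    public float eggScale = 1f;      // size of each egg
    public Color eggColor = Color.white;
    public AudioClip bounceSound;    // played on every successful bounce
}
```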

Sometimes, a minigame would stay stuck even after several rounds of tweaking. When this happened, we’d review our playtest notes for a diagnosis. Most of the time, the issue was that the minigame was too complicated to understand in five seconds. In those cases, we identified the best part of the minigame and simplified or eliminated the rest. In one case, we had a minigame in which you had to use a flashlight to find a lever in the dark and then pull that lever. Playtesters didn’t like the minigame, but they did like the flashlight, so we kept the flashlight and redesigned the game around finding ghosts in a dark room.

Players loved our flashlight mechanic, but not the rest of the minigame, so we kept the flashlights and ditched the rest. To make the flashlights even more rewarding, we added about forty unique images to the walls of the flashlight level so that players would be encouraged to explore the room.

When designing minigames, we found that a few guidelines generally held true. We measured each minigame’s success by two simple metrics: A) Did playtesters want to play it again? B) Did they say they liked it? In general, we listened to “A” a lot more than “B!”

  1. Simpler minigames performed better. Every minigame that required two phases ultimately became a one phase game.

  2. Minigames that heavily utilized motion controllers performed better.

  3. Minigames that gave dramatic reactions to the player’s actions performed better, regardless of whether the player won or lost. [For most players, a dramatic failure is more satisfying than a tepid victory.]

  4. Minigames that forced the player to utilize all 360 degrees of an environment performed better.

  5. Minigames that asked the player to simply touch an object were not as satisfying as minigames that required “bigger” actions like swinging, punching, or throwing.

  6. For most players, the game felt well balanced and fair when they won about 75% of the minigames in the first round and won about 50% in the later rounds.

  7. Players appreciated failure the most when they could clearly see how close they were to success. I think the egg game was popular in part due to the fact that you could clearly see how many eggs were left to bounce at any given time. On the other hand, I felt that the ghost game was sometimes unsatisfying because you could easily feel like you had made no progress if you didn’t spot any ghosts.

A progress bar from the amazing Cuphead. Players love to see how close they are to success.

At this point, we kept the game’s art and theming to a minimum. We wanted to communicate the game’s general tone, but we did not want to finalize art assets until we knew that our game mechanics were solid. In general, we thought it would be easier to create a story around a set of mechanics than vice versa. I also didn’t want our team’s artists to spend time building assets for a mechanic that would ultimately be cut.

We knew from playtests that many players enjoyed taking turns playing the game with their friends. While this was great, we needed to give players a reason to invite others to play the game with them, so we implemented a simple local high score system. Putting a leaderboard in a game can awaken players’ competitive spirits; we found that most players would do an additional playthrough if they were aware of the leaderboard and were in the presence of their friends.
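A local high score like this can be as simple as a PlayerPrefs lookup. A minimal sketch (the key name is illustrative):

```csharp
using UnityEngine;

// A sketch of a simple local high-score system using PlayerPrefs.
public static class LocalHighScore
{
    private const string Key = "HighScore";

    // Records the score if it beats the stored best;
    // returns true when this run set a new record.
    public static bool Submit(int score)
    {
        int best = PlayerPrefs.GetInt(Key, 0);
        if (score <= best) return false;

        PlayerPrefs.SetInt(Key, score);
        PlayerPrefs.Save();
        return true;
    }
}
```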

The final big test of our game’s mechanics that semester was the NYU Game Center’s biannual show. This was an important show for us because the show often had professional game designers in attendance.

The NYU Game Center, where this game was first designed.

The NYU Game Center, where this game was first designed.

Thankfully, the game was well received; several people even said the game was terrific. Most importantly, the game seemed to achieve its stated mission: it thrived in the show’s party-like atmosphere. Many people would first play the game individually then challenge their friends to beat their high scores. For me, this was a profound moment. We had built the foundation of a solid VR party game.

However, there was still significant work to do; the minigames needed refinement and the game lacked a theme. In my next post, I will discuss how I wrote the game’s script and how our team approached the game’s art.

Enjoyed this blog post?

Subscribe below for more dev blogs and news delivered to your inbox.


XR, Design, Real Al's Human Academy

Design Retrospective: Real Al's Humanity Academy #2: Choosing an Idea

In this series of blog posts, I will talk about my process as a producer, designer, and programmer of Real Al’s Humanity Academy. I’ll discuss why I made the decisions I did in each of these roles to give readers a better idea of how I approach these disciplines. The intended audience of these posts are potential employers, collaborators, and/or fans of the game who want a more behind-the-scenes look.

In this post, I will discuss why our team chose to make a minigame collection given our stated mission.


In my previous post, I discussed the mission that guided this project, which was:

Make a VR game that people would want at their party.

With this goal in place, I started recruiting people who felt as passionate as I did about making VR games less isolating. I reached out to Keanan Pucci and Matthew Ricci because they were some of the hardest working people in my VR production class and seemed equally invested in solving this isolation issue.

Once our developer and designer team was in place, we started brainstorming ideas to achieve our goal. We decided that we would each bring five gameplay ideas and five theme ideas to our meetings until we found an idea we all believed in. I personally set aside twenty-five to fifty minutes a day to brainstorm ideas so that I was always bringing my best ideas to the group meetings.

We analyzed these ideas with a lens similar to the “Hedgehog Concept” mentioned in Jim Collins’s book, Good to Great. Essentially, we asked ourselves three questions:

1) If we sold this game, would other people want to buy it?

2) Could we make a version of this game that was competitive with similar games on Steam?

3) Are we passionate about this idea?

If the answer to all these questions was yes, we would go forward with the idea.


Between the three of us, “Warioware in VR” was our favorite idea, so we tested that idea first. If you’re not familiar with Warioware, it is a fast-paced casual game in which players compete to complete as many five to ten second minigames as they can. In the minigame pictured below, you are given five seconds to slap Wario and wake him up.


To answer question number one, we pitched our game to anyone who would listen and gauged their reaction. The game was easy to pitch and most people seemed enthusiastic about the idea. For question two, we spent significant time looking through Steam and Itch.io for similar games. There were one or two Warioware-like VR games, but they were either buggy or bland; we felt that we could do better. It was clear our whole team was excited about the idea, so we decided to move into planning the project.

The Warioware concept had other game development-specific advantages to it. For one, it was modular: a minigame could fail in production and the rest of the project would be fine. This modular design gave us the flexibility to pursue three minigames if things were going slowly and ten if they were going quickly. If we had instead made a narrative game, we would have to commit to finishing every chapter or the experience would be incomplete. Some members of the team were relatively new to VR development, so this modular paradigm helped diffuse production risk.

The other advantage to the Warioware concept was that it gave us room to experiment with a variety of art styles. If you have played Warioware, you’ll know that each of its minigames has its own unique look: there are games rendered in claymation, 3D animation, acrylic paint, watercolor, and more. Warioware’s art is not always the most realistic or technically polished, but it makes up for this with visual inventiveness and humor. We knew our team couldn’t afford to create naturalistic animations like in AAA VR experiences like Henry or Robo Recall, so we decided to also aim for humor and inventiveness in our art rather than technical polish. You can find some images from our initial moodboard below:

Now that we had our core gameplay paradigm in place, it was time to brainstorm our mechanics.

Enjoyed this blog post?

Subscribe below for more dev blogs and news delivered to your inbox.


XR, Design, Real Al's Human Academy

Design Retrospective: Real Al's Humanity Academy #1: Finding Purpose

In this series of blog posts, I will talk about my process as a producer, designer, and programmer of Real Al’s Humanity Academy. I’ll discuss why I made the decisions I did in each of these roles to give readers a better idea of how I approach these disciplines. The intended audience of these posts are potential employers, collaborators, and/or fans of the game who want a more behind-the-scenes look.

In this post, I will discuss why our team set out to make a VR party game and what problems we hoped to solve by doing so.


This project started as a question in Robert Yang’s VR production class at the NYU Game Center. On the first day of class, Robert asked us, “What do you dislike most about VR?” Though I was absolutely fascinated with VR, I still found some issues with it: the cords got caught on everything, the sensors took a millennium to set up, and the headsets often ran hot.

However, there was one issue with VR that outranked all the rest: VR was incredibly isolating. When Robert posed his question to me, I had owned a Gear VR and borrowed an Oculus Rift for about a year. I had enjoyed many great single-player games and narratives with both headsets, but these play sessions were always dampened somewhat when I took off my headset and saw that no one around me had shared in my experience. The technology made you feel lonely.

Moreover, the technology was difficult to share. If you invited your friend over to try your Oculus Rift or HTC Vive, they might have a great time, but you would be stuck watching them play through your computer’s monitor. There are a few great local multiplayer VR games, such as the amazing Keep Talking and Nobody Explodes, but these are few and far between. Checkers, which can be played with stones and grid paper, had more staying power as a social activity than VR, which has billions of dollars of investment behind it. If VR could not fix this isolation problem, I honestly thought it would die out (again).

In Keep Talking and Nobody Explodes, one player defuses a bomb in VR while the other reads them a series of complicated directions from their phone.

I thought VR had incredible potential as both a gaming platform and a storytelling medium, so I wanted to make something that proved to others that VR gaming could be a fun social activity. I wanted to make a game that someone would turn on at a party and that the whole room would enjoy, such as Wii Sports. With this in mind, I began the project with a simple mission:

Make a VR game that people would want at their party.

In my next post, I will discuss how our team approached this question as game designers.

XR, Programming

Had a great time at the 2019 Unity XR Jam!

This last weekend, I participated in the 2019 Unity XR Jam at the RLab, NYU’s XR center in the Brooklyn Navy Yard.

It was an amazing experience and I got to meet many great people in the developer community.


While I was there, I got the opportunity to create an awesome project called Surveillance State. It’s essentially Where’s Waldo with CCTV cameras in VR. You can check it out here.

Finally, here’s a video of me piloting a drone with a Magic Leap headset! (Not my project.)

XR, Games

Real Al's Humanity Academy Launches Today!

I'm excited to announce that Real Al's Humanity Academy, a VR game I produced, wrote, co-designed, and co-programmed, officially launched on Steam today!

Check out the game here:

https://store.steampowered.com/app/1025710/Real_Als_Humanity_Academy/

If you like the game, please be sure to leave a positive review! (The game is made for the HTC Vive VR headset.)

Keanan Pucci and Matthew Ricci co-created this project; we collaborated on the project's design and programming.
Other awesome collaborators on this project include Brodie Cornett (the voice of Real Al), Daniel Pauker, Amanda Berlind, Anjali Krishnan, Alex Danger Raphael, Emily Zhao, Anthony Michael, Hilary Taylor, Julia Hemsworth, and Natalia Bell.

Without this superstar team, this project would have been impossible!

Thank you also to all those who playtested the game and to our professor Robert Yang at the Game Center for nurturing the original version of this project in his class. Thank you also to all of those who gave us advice as we set up an LLC to launch the project on Steam (you have been thanked in the project's credits!)

XR, Awards and Festivals

The Wamco PIE is at the National Film Festival for Talented Youth!

Happy to say that the Wamco PIE, a VR experience I wrote, programmed, and directed, was shown at NFFTY, the National Film Festival for Talented Youth, this weekend!

The project was shown in NFFTYX, the festival's interactive and VR section.

Unfortunately, I am in China with a one-entry visa, so I could not attend, but I am grateful the project was shown at the festival!

You can check out a play-through of the experience at my website here.

A still from the experience.

The project was also shown at the Downtown Los Angeles Film Festival earlier this month.

Thank you to Anjali Krishnan and Emily Zhao for all their hard work on this project!

XR, Awards and Festivals

The Wamco PIE is Headed to the Downtown Los Angeles Film Festival (DTLAFF)!


The Wamco Product Immersion Experience, a VR comedy that I wrote, programmed, and directed, is headed to the Downtown Los Angeles Film Festival!

I unfortunately will not be at the festival because I am in China on a one-entry visa, but I will be following the festival’s releases!

You can check out a play-through of the experience below:

XR, Awards and Festivals

My Wamco VR Experience is Headed to the Ivy Film Festival!

This weekend, the Wamco PIE VR experience is headed to the Ivy Film Festival at Brown University!

 

Ivy Film Festival is one of the world's largest student film festivals, so it is an honor to be invited. In the past, the festival has hosted guests such as Robert De Niro, Wes Anderson, and Jack Nicholson and has held preview screenings of films like No Country for Old Men and Super Size Me.

The piece will be featured in the festival's Virtual Reality Arcade. If you are in the Brown area, you can sign up for it here. The festival is free and open to the public. For those not in the Brown area, you can check out a video play-through of the experience here.

I am feeling a bit under the weather today, but I hope to be able to take the train up to see the festival for a bit this weekend. If you are in the Brown area, let's meet up!