At the end of the year I thought I’d take the time to reflect on a critical part of game design that is very easy to ignore in the pre-production phase: audio. Specifically, I want to talk about the importance of sound effects. As important as music and ambient noise are to a game’s sense of presence and mood, sound effects are pivotal in establishing clear and consistent feedback for player behavior, feedback that helps “sell” a concept without locking the designer into a specific direction before they’re ready to commit to one.
For example, if someone were designing a first-person shooter with enemies spawning off screen to attack you, it would feel incredibly bad to just start losing health and have the camera shake every time you got shot. You wouldn’t know where the damage was coming from, or what was dealing it. With even simple sounds, though, you could incorporate the crack of the bullets firing, the bark of the enemies as they call for reinforcements, even the footsteps and shuffles of those enemies getting into position. This way a player can make inferences from audio feedback that visuals are too limited to represent. And that’s an important note not only for the work I’ve been doing on creating tutorials for PuppetJump, but for VR game design in general: in a medium where almost every game is first person, your ability to provide clean visual feedback is limited, and audio has to pull more weight throughout the project.
This came up as I was working on adding a timed section to the tutorials as a “final test” of sorts. Implementing the timer was easy enough, and applying it to the UI was also simple, but I was having trouble getting across the idea that you were being timed in real time. My initial solutions were all visual in nature, and none of them were good. My first thought was a static UI element on the screen, similar to a timer in racing games. I ruled that out very quickly: you want to avoid cluttering the screen, and forcing a player to move their eyes independently of their head puts a lot of strain on someone who already has a pound of plastic strapped to the front of their face.
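For context, the timer itself really is as simple as it sounds. Here’s a minimal sketch of the kind of countdown I mean, assuming a Unity-style MonoBehaviour; the class and member names are my own illustration, not PuppetJump’s actual API:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Hypothetical countdown driving a timed "final test" section.
public class SectionTimer : MonoBehaviour
{
    public float duration = 60f; // seconds allotted for the section
    public UnityEvent onTimeUp;  // fired once when the countdown hits zero

    public float Remaining { get; private set; }
    bool running;

    public void Begin()
    {
        Remaining = duration;
        running = true;
    }

    void Update()
    {
        if (!running) return;
        Remaining -= Time.deltaTime;
        if (Remaining <= 0f)
        {
            Remaining = 0f;
            running = false;
            onTimeUp.Invoke(); // fail the test, restart the section, etc.
        }
    }
}
```

The hard part was never this logic; it was surfacing `Remaining` to the player without the problems described above.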
I wanted to keep the timer though, so I thought about having a UI element parented to the player, one that moves alongside them, like the holoscreens in a sci-fi or superhero movie. I tried that, but took it out because of how distracting it was. It was very uncomfortable having something follow you that has no direct anatomical analogue (like hands) and no entire entity dedicated to it (like an NPC following you), and moving while looking at the screen was disorienting enough that I imagine players prone to motion sickness would have an even harder time completing those sections.
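For the curious, the follow-screen experiment was roughly this: a world-space panel re-pinned in front of the headset every frame. Again a sketch under the same Unity assumption, not the code I actually shipped:

```csharp
using UnityEngine;

// Pins a world-space UI panel a fixed distance in front of the HMD.
public class FollowScreen : MonoBehaviour
{
    public Transform head;        // typically the VR camera's transform
    public float distance = 1.5f; // metres in front of the player's face

    void LateUpdate()
    {
        transform.position = head.position + head.forward * distance;
        transform.rotation = Quaternion.LookRotation(head.forward);
    }
}
```

Smoothing the motion (lerping toward the target instead of snapping) softens the jitter, but in my experience it doesn’t fix the underlying problem: the thing following you still feels like it belongs to nothing.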
Then I thought I could add a visual element to the hands, so there wouldn’t be a sense of detachment from the player, but that felt antithetical to the overall minimalist design of PuppetJump’s avatar. Taking that clean, abstract hand model and slapping an extra piece onto it would have broken the avatar’s simple balance. Maybe someday there will be a reason to expand on the avatar, but for something small and specific like this? It felt unnecessary.
So I gave up. Thinking visually, that is! I realized that if I wanted the player to focus visually on one thing while staying aware of another, I had to rely on a different sense. So I added sound! I added a ticking sound effect that gets faster as the timer approaches its end. Specifically, one ticking sound plays for most of the run; when the timer reaches the last quarter of its runtime, a faster ticking sound takes over. This was exactly what the scene needed! I still have some work to do balancing the sound, and I’m sure the sound itself will be replaced with a cleaner recording during polish, but right now it provides exactly the feedback I was looking for. Now a player can focus on traversing the scene while remaining aware at all times that they are under a specific time constraint!
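Sketched out, the two-stage tick is about this simple, building on the hypothetical `SectionTimer` above (again my own illustration of the idea, not PuppetJump code):

```csharp
using UnityEngine;

// Plays a looping tick, swapping to a denser loop in the last quarter.
[RequireComponent(typeof(AudioSource))]
public class TimerTick : MonoBehaviour
{
    public SectionTimer timer;
    public AudioClip slowTick; // normal ticking loop
    public AudioClip fastTick; // faster ticking for the final stretch
    AudioSource source;
    bool urgent;

    void Start()
    {
        source = GetComponent<AudioSource>();
        source.clip = slowTick;
        source.loop = true;
        source.Play();
    }

    void Update()
    {
        // Switch exactly once, when less than a quarter of the time is left
        // (the Remaining > 0 check keeps this from firing before Begin()).
        if (!urgent && timer.Remaining > 0f &&
            timer.Remaining <= timer.duration * 0.25f)
        {
            urgent = true;
            source.clip = fastTick;
            source.Play();
        }
    }
}
```

Two discrete loops rather than a continuously accelerating one was a deliberate choice: the sudden change in tempo is itself a readable signal that the situation just got more urgent.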
All this to say: it is important to consider the user experience as early as pre-production, including what kind of feedback you use in your greyboxes and why. Understanding where your visual feedback and stimuli will be limited, and using other senses to enhance or replace those visuals, gives your game a strong foundation from the start and an easier transition from greybox to alpha to beta to release. As with most aspects of game design, the more thought and work you put into refining and establishing the user experience early, the less time you’ll spend later redesigning something this intricate and specific.