SOURCE: Block Interval

Anyone who has tried modern virtual reality technology will agree that it offers a level of visual immersion far beyond what past technologies were capable of. That first time experiencing a sense of presence, that “is this real?” moment, is unforgettable.

I see VR as the next great leap in sound design. Presence doesn’t only come from what we see, but from all senses. Since we can’t (yet!) control what you touch, taste, and smell, we have to rely on visual and audio cues to immerse the player.

The brain has this amazing ability to interpret its environment through what we hear, and we need to exploit that. We as sound designers have some amazing tools at our disposal, such as the sound engine Wwise, which Block Interval is using for Life of Lon. More important than the tool, however, is how it’s used, and I believe we have broken some new ground in this area.

BUILDING A SEASCAPE

Most of the gameplay of Life of Lon will take place underwater, and one tricky part has been expressing this to the player. After making underwater recordings with a hydrophone, watching videos shot underwater, and simply dunking my head in, the conclusion I came to was: water sounds boring! Everything sounds dull and lifeless, and the last thing we want in VR is an environment that sounds dead and flat. Add to that the fact that the player will be wearing a virtual helmet (which in real life would make things sound even more claustrophobic), and I realized that realism just won't cut it.

My “assistant” and I making hydrophone recordings at the lake

So I set out to create a unique underwater ambience that sounds exciting and vibrant, but still gives the player the feeling of being underwater. I don’t necessarily want to slap a filter or effect on every sound effect in the game. After all, would you want to play a game where everything sounds dull or warbly?

Instead, I’m using an ambient backdrop to give the player a sense of space. Think wind noise, but with water. The backdrop I created is made of dozens of hydrophone recordings made in a stream near my house. Some spots were turbulent, some were relatively calm, and by filtering, pitch shifting, and blending them, I’m able to create a wide variety of ambient loops. I created four different “turbulence” levels, from calm to choppy water.

I blended the loops in Wwise using a blend container, and tied it to a game sync called Amb_Turbulence, with a range of 0–100. Using this game sync parameter, I can change how turbulent the water sounds around the player at different spots in the environment. The backdrop is a non-positional stereo loop, and gives the player the sense of a dynamic, changing environment.
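Wwise’s blend container does the crossfading between the turbulence loops internally, driven by the RTPC. Just to make the idea concrete, here’s a rough sketch of what such a mapping might look like. The loop names, the evenly spaced crossfade points, and the equal-power crossfade are all my own illustrative assumptions, not the actual project settings:

```python
import math

# Four turbulence loops, calm to choppy, and the Amb_Turbulence values
# (0-100) at which each loop is at full level. Evenly spaced here as an
# assumption; in Wwise the crossfade points are authored by hand.
LOOPS = ["calm", "mild", "rough", "choppy"]
ANCHORS = [0.0, 33.3, 66.7, 100.0]

def blend_gains(amb_turbulence: float) -> dict:
    """Map an Amb_Turbulence value (0-100) to per-loop gains using an
    equal-power crossfade between the two nearest loops."""
    t = max(0.0, min(100.0, amb_turbulence))
    gains = {name: 0.0 for name in LOOPS}
    for i in range(len(ANCHORS) - 1):
        lo, hi = ANCHORS[i], ANCHORS[i + 1]
        if lo <= t <= hi:
            frac = (t - lo) / (hi - lo)  # 0.0 at lo, 1.0 at hi
            # Equal-power crossfade: the two gains always sum to unity power.
            gains[LOOPS[i]] = math.cos(frac * math.pi / 2)
            gains[LOOPS[i + 1]] = math.sin(frac * math.pi / 2)
            break
    return gains
```

At the bottom of the range only the calm loop plays; halfway between two anchors, the two adjacent loops each sit at about 0.707 gain, so the combined power stays constant as the parameter sweeps.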

All of this is well and good; however, integrating these parameter changes into Unity in a way that is easy to use, streamlined, and versatile took a little finesse.

PAINTING WITH SOUND

In order to create this truly dynamic environment, I had to make a custom tool in Unity. What I wanted was the ability to create zones where the backdrop would change smoothly and seamlessly. If you swim through a current, or through a tunnel with rushing water, you don’t hear the current come from a point source; you hear the water around you become more agitated.

The central idea to express here is that the environment is a character. You are surrounded by water and it’s alive. This tool allows me to do that. It has four main components: Emitters, Groups, Zones, and Emission Points.

  • Emitters are GameObjects that the Wwise ‘Play’ events are attached to. They also contain scripts defining the Game Syncs, which control the RTPCs in Wwise for the blend containers.
  • Groups are containers for Zones and Emission Points. They determine which Emitter the zones affect, which Game Sync(s) they drive, and how overlapping zones behave. If two zones overlap, do we average the values, take the loudest, or take the quietest? This is set in the Inspector for the Group.
  • Zones are the areas where the sound is affected. They can be either box-shaped or spherical, and when the player moves through them, they set the Game Sync to a certain value, as determined by a slider in the Inspector. The box-shaped zones can be tapered across the x/y/z axes to give a smoother transition, and the spherical zones can be tapered from the origin to the edge.
  • Emission Points are optional objects that, when added to a Group, will cause the sound to taper out from either a point source or a line, adding more flexibility to the taper behavior of multiple zones. They are represented by a wire sphere or cylinder gizmo.
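To make the taper idea concrete, here is a minimal sketch of how a spherical zone’s contribution might fall off from full strength at its origin to nothing at its edge. The class and field names (`SphereZone`, `taper_start`, and so on) are hypothetical illustrations, not the actual tool’s code:

```python
import math
from dataclasses import dataclass

@dataclass
class SphereZone:
    """A spherical zone that drives a game sync value, tapering from
    full strength near the origin to zero at the edge (sketch only)."""
    center: tuple        # (x, y, z) world position of the zone's origin
    radius: float        # outer edge of the zone
    taper_start: float   # distance inside which the zone is at full strength
    value: float         # game sync value this zone drives toward (0-100)

    def weight(self, player_pos: tuple) -> float:
        """1.0 inside taper_start, fading linearly to 0.0 at the edge."""
        d = math.dist(self.center, player_pos)
        if d <= self.taper_start:
            return 1.0
        if d >= self.radius:
            return 0.0
        return 1.0 - (d - self.taper_start) / (self.radius - self.taper_start)
```

A box-shaped zone would work the same way, except that the falloff is computed per axis and the per-axis weights are combined. The linear falloff here is an assumption; a smoothstep or curve-driven taper would plug into the same structure.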

Multiple groups can affect the same Emitter and parameter, and groups can be children of any GameObject in the scene, so they can move with it.
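The overlap behavior described above (average, loudest, or quietest) could be resolved with something as simple as the following sketch. The function and mode names are my own; in the actual tool the resolved value would then be pushed to Wwise through the Group’s Emitter:

```python
def resolve_group(zone_values, mode="average"):
    """Combine the game sync values of all zones the player is currently
    inside, according to the group's overlap mode (illustrative sketch).

    zone_values -- values (0-100) contributed by each overlapping zone,
                   already scaled by each zone's taper weight.
    mode        -- "average", "loudest", or "quietest".
    """
    if not zone_values:
        return 0.0  # no zone active: assume the parameter falls back to calm
    if mode == "loudest":
        return max(zone_values)
    if mode == "quietest":
        return min(zone_values)
    return sum(zone_values) / len(zone_values)
```

Keeping the overlap policy per Group, rather than per Zone, means every zone in a group resolves the same way, which keeps the behavior predictable as zones are painted into the scene.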

Here is a video demonstrating how I can dynamically change Wwise game parameters using this tool. Right now I just have a simple blend container with a basic turbulent water sound, but this can be applied to multiple (or nested) blend containers with several parameters, which will give me even more control over the soundscape.

CONCLUSION

Thanks for reading! I hope it’s been helpful as you work on making your experiences more immersive.

Dave Nelson runs Alley Cat Recording and is the lead sound designer of the in-development VR game Life of Lon. For more information about the project, check out lifeoflon.com or the VR announcement.
