Custom Dynamic Ambient Systems

Calum Slee
6 min read · Feb 12, 2022

Using Unity and Wwise

Using Wwise as an audio middleware tool gives us quick access to creating dynamic audio systems within our games. Using Real-Time Parameter Controls (RTPCs) or pre-existing Game Syncs such as Distance, we can easily manipulate audio based on the game's input.

The Great Fleece takes place in a museum, with rain visible outside. The camera switches between multiple fixed positions as you progress, much like a security camera system. Most of the sound effects are panned based on their position relative to the camera: left is left, and right is right. But I wanted to create some movement within our ambience.

The common alternative is to have the ambience play in relation to the Player itself, but this felt unnatural. Instead, I wanted to create a mixture of the two: a 2.5D design, if you will. To achieve this, I wanted the ambient noise from each side of the building to be manipulated by how close each camera angle was to that wall, resulting in panning and volume relative to the viewer. On top of this, I wanted suspense to rise and fall as the Player moved around the space, towards and away from the windows. For this, I could create an additional manipulation that also controls the overall settings.

Firstly, to create the camera manipulation in Wwise, we want to use Listener Relative Routing, which allows us to capture the position of the game object in relation to our listener (the camera).

We can also make use of the Attenuation settings within the Positioning tab. An Attenuation lets us curve properties such as volume against the distance between the object and the listener.

In our game, two instances of this rain ambience Event are posted, each manipulated individually due to its different location. If the camera sits in the dead center of the space, the rain Events on the left and right of the camera play at the same volume. If the camera moves closer to the left, for example, the distance on that Event decreases and it gets louder, while the opposite side's distance increases and it gets quieter.
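Posting the two instances can be as simple as calling the same Event on two emitter objects. Here is a minimal sketch; the Event name and field names are placeholders, not taken from the project:

```csharp
using UnityEngine;

// Hypothetical sketch: posting one rain ambience Event on two emitters,
// one near each wall, so each instance is attenuated independently.
public class RainAmbience : MonoBehaviour
{
    [SerializeField] private GameObject _leftEmitter;
    [SerializeField] private GameObject _rightEmitter;

    private void Start()
    {
        // Each PostEvent call creates its own instance of the Event,
        // positioned at the emitter it was posted on.
        AkSoundEngine.PostEvent("Rain_Ambience", _leftEmitter);
        AkSoundEngine.PostEvent("Rain_Ambience", _rightEmitter);
    }
}
```

Because each instance is tied to its own GameObject, Wwise evaluates the listener distance (and therefore the Attenuation) separately for each side.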

Controlling the distance from the Player requires a custom RTPC, but it works in a similar way to the Attenuation we used previously. The key difference is that we don't want the distance value calculated from the Listener. Instead, we need to drive it manually from a different game object: our Player.

I don’t want the Player distance affecting the volume too much, just a subtle bit of dynamics, so we can create a much smaller range than our listener distance, and can also set a Slew Rate interpolation to create a longer fade time.

In addition to the smaller range, we only want to affect the overall volume by a slight amount. Instead of setting the far end of the graph to -200 dB, we can settle for a smaller change such as -6 dB.

Now, once we assign a way to drive this RTPC value, each rain Event’s volume will vary over a 6 dB range depending on the Player’s distance from the windows.

To make use of these features in Unity, we need to create some custom trackers. For a start, we can define the area the rain will play from, using a box collider covering the desired space. Within this, we can place two sphere colliders that act as our PlayerTracker and CameraTracker. We can then create functionality for these spheres to follow their respective targets while staying within the defined area.

Within the Camera Tracker is an Emitter object containing a script that posts the Event. Once we create the tracking functionality, this script will follow its parent object, enabling the posted Event to move through the world and drive the Wwise Attenuation we created earlier.

To go further, we should actually pass this information through to our Audio Manager. Passing itself through as a game object allows the Audio Manager to post the Event on the Emitter object, achieving the exact same result.
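A minimal sketch of that hand-off might look like this; the singleton wiring, method name, and Event name are assumptions for illustration:

```csharp
using UnityEngine;

// Hypothetical AudioManager: the emitter passes itself in, and the
// manager posts the Event on that object rather than on itself.
public class AudioManager : MonoBehaviour
{
    public static AudioManager Instance { get; private set; }

    private void Awake() => Instance = this;

    public void PlayAmbience(string eventName, GameObject emitter)
    {
        // Posting on the emitter keeps the sound positioned at the
        // emitter's transform as it moves through the world.
        AkSoundEngine.PostEvent(eventName, emitter);
    }
}
```

The Emitter's own script would then call something like `AudioManager.Instance.PlayAmbience("Rain_Ambience", gameObject);`, keeping all audio calls routed through one place.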

To make the Camera Tracker actually move, we can pass in a GameObject variable to track. Before we move anything, we also need to set our limits: grabbing a handle on the object's Box Collider allows us to store a Bounds variable.

With this Bounds variable, we can use the ClosestPoint function to get a Vector3 position within the area of our Box Collider, simply passing in our target's position.

Lastly, we can use Vector3.MoveTowards to create a smooth transition. In this game, the camera cuts between various static positions, and simply setting a new position every frame results in quick, drastic changes. Depending on the situation, adjusting the step speed may take some trial and error to achieve the desired result.
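Putting those steps together, a CameraTracker sketch might look like the following. The field names and step speed are assumptions, not the project's actual code:

```csharp
using UnityEngine;

// Minimal CameraTracker sketch: clamp the camera's position to the
// ambient area's bounds, then ease towards it each frame.
public class CameraTracker : MonoBehaviour
{
    [SerializeField] private GameObject _target;        // the Main Camera
    [SerializeField] private BoxCollider _ambientArea;  // the rain zone
    [SerializeField] private float _stepSpeed = 5f;     // tune by trial and error

    private Bounds _bounds;

    private void Start() => _bounds = _ambientArea.bounds;

    private void Update()
    {
        // ClosestPoint returns the target's own position if it is inside
        // the bounds, otherwise the nearest point on the bounds' surface.
        Vector3 clamped = _bounds.ClosestPoint(_target.transform.position);

        // MoveTowards smooths out the camera's sudden cuts between
        // static positions.
        transform.position = Vector3.MoveTowards(
            transform.position, clamped, _stepSpeed * Time.deltaTime);
    }
}
```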

Now as the camera switches positions, we can see the sphere collider following the camera whilst keeping within the bounds of the ambient area.

To follow the Player, we can create a similar system, but we need to manually control the RTPC value we created in Wwise. Again, we get a reference to the Box Collider and store its Bounds, and we also need a target GameObject. From here, we can again use the ClosestPoint function to set our transform position. As the Player doesn't teleport around, we don't need to introduce a step speed or MoveTowards.

To pass this information through to Wwise, we first need a distance variable. We can use Vector3.Distance to calculate the space between the target and our position within the Bounds.

To access the RTPC, we can store its ID; then, using AkSoundEngine.SetRTPCValue, we can pass in the distance. For this to apply to the correct GameObject, we also pass in the Emitter object parented under the Camera Tracker.
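Combining the last few steps, a PlayerTracker sketch could look like this. The RTPC name "Player_Distance" and the field names are placeholders for whatever was set up in Wwise:

```csharp
using UnityEngine;

// Hypothetical PlayerTracker: follow the Player within the bounds,
// measure the distance to it, and drive the custom Wwise RTPC.
public class PlayerTracker : MonoBehaviour
{
    [SerializeField] private GameObject _target;        // the Player
    [SerializeField] private BoxCollider _ambientArea;  // the rain zone
    [SerializeField] private GameObject _emitter;       // Emitter under the Camera Tracker

    private Bounds _bounds;

    private void Start() => _bounds = _ambientArea.bounds;

    private void Update()
    {
        // No MoveTowards needed: the Player moves continuously,
        // so snapping to the clamped point each frame is smooth enough.
        transform.position = _bounds.ClosestPoint(_target.transform.position);

        // The gap between the Player and our clamped position drives the RTPC,
        // applied to the emitter so the correct Event instance is affected.
        float distance = Vector3.Distance(_target.transform.position, transform.position);
        AkSoundEngine.SetRTPCValue("Player_Distance", distance, _emitter);
    }
}
```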

Now our sphere collider moves in relation to the Player, within the bounds of the defined ambient area. Capturing the distance between the two allows us to pass in the RTPC value, controlling the volume over a range of 6 dB.

In my opinion, the most amazing feature of Wwise is the ability to Remote Connect to Unity and track everything happening in real time, while also adjusting the likes of volumes on the fly. Using the Game Object Viewer, we can identify where our Main Camera (Listener) is, as well as our two rain emitters.

To verify that our camera’s distance attenuation works, we can manually move our camera in Unity and watch the values for each emitter move along the graph. The attenuation shows all objects, as it is a global parameter.

The RTPC we created for the Player distance exists in its own instance per Event, so we can only view one graph at a time. Moving the Player to and from the wall shows the emitter appearing and moving up the graph.

Of course, there's plenty more control to be had in Wwise. Two of the most crucial tools for testing systems like these are Mutes and Solos, letting us hear only what we're testing, and the Voice Profiler, ensuring nothing is playing incorrectly.