Audio And Coder Communication in Game


This devblog is mainly aimed at audio designers and coders who have not worked much in the other field yet and might find some tips on how to avoid miscommunication useful. Its main focus, however, is making the perspective of audio easier to grasp for coders.

What Is This Post About?

I am about to finish the second year of my master’s degree in audio and have had the honor of working on a few game projects so far. One of the biggest issues I encountered along the way was the communication between audio designers and coders. Because both sides understand the topic differently, problems arise and miscommunication is common, especially when either side doesn’t really understand what the other is talking about due to differing vocabulary or a lack of knowledge.

In this devblog I want to share what I have learned and what I think is helpful to know as a game-audio beginner in order to talk properly about game audio and its integration.

Audio is important, and so is the code for it!

This might seem obvious, but it is often forgotten on both sides. Coders tend to forget that a game needs audio too and approach audio people far too late in the project. Audio designers, on the other hand, might underestimate the workload of audio integration at first. This can lead to heavy crunch for audio designers and coders alike and a weaker audio experience in the game, which can be avoided, at least to some degree, through good preparation and communication.

Audio is a great medium to subtly add storytelling to games without adding more visual input. It might even make some visual details less necessary, as audio can convey a lot of information in a short time. Think especially of UI sounds, which can make a simple button interface feel epic, quirky or sad.

Now, game audio consists of roughly one third sound design, one third logic for the playback of audio in the game, and one third troubleshooting that logic in the game. Simply put: the integration takes more time than the sound design itself. This might seem exaggerated at first, but considering that great sound design stands and falls with its integration, it makes sense to put a lot of effort into it. A sound designer might be the best in the world, but if the integration falls short, no one will hear that great sound design in the game, and as a non-coder there is only so much an audio person can do themselves. There are always problems with the integration too, and troubleshooting them can take far more time than anticipated. It is therefore advisable to plan some time for troubleshooting in the production plan as well, in order to avoid additional stress.

What really makes the difference though, is good preparation.

The Preparation

Depending on one’s role, parts of the work for the project can already be prepared before there is even a lot of material to work with:

For Audio Designers:

It is never too early to think about integration! The structure of events and the like can be set up early on, and parameters can be prepared very soon by the code person, or by yourself if you want to do some of the coding. Start talking to your code person as early as possible, so that they can estimate how much time the audio code will take them. It also helps to see how the game logic works, as the integration may vary depending on how the game code is structured.

For example: for our master’s project we made a local multiplayer game with three minigames, and I wanted to add a switch that tells me which player is performing which interaction in the game. While this worked in the first minigame, the second was not built, code-wise, to tell which player’s bottle was interacting with which level element, as this was set at the beginning of a throw for all bottles simultaneously. Since the switch did not work out, we had to change the logic and add an event for each player’s bottle in order to trigger the sounds correctly; a rough sketch of that pattern is shown below. While this was an easy fix, there may be moments where something more complex doesn’t work out and where you, as an audio designer, have to change how you approach the sound design so that the sound works properly.
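To make that fix a bit more concrete, here is a minimal sketch of the kind of pattern this ends up as, assuming Wwise’s C++ SDK (the same idea works in FMOD). The event names, the player index, the game-object handle and the OnBottleCollision callback are all hypothetical; this is not our actual project code.

#include <AK/SoundEngine/Common/AkSoundEngine.h>

// Hypothetical: each bottle was registered as its own Wwise game object earlier,
// e.g. via AK::SoundEngine::RegisterGameObj(bottleGameObject), and the game
// logic was changed so that each bottle knows its owning player at collision time.
void OnBottleCollision(int playerIndex, AkGameObjectID bottleGameObject)
{
    // One event per player's bottle instead of a single event plus a switch.
    const char* eventName =
        (playerIndex == 0) ? "sfx_p1_bottle_hit" :
        (playerIndex == 1) ? "sfx_p2_bottle_hit" :
                             "sfx_p3_bottle_hit";

    AK::SoundEngine::PostEvent(eventName, bottleGameObject);
}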

Concerning sound design, one can also prepare the session to a certain degree, even if there are no animations in the game yet. First, there is the previously mentioned middleware logic, which can already be prepared and pre-built with placeholder sounds. Second, one can always start by recording some general textures, such as ambiences and other general sounds. An animation is not strictly needed for this, as all movements follow a general logic, and there are recurring elements which are always the same and only vary in texture. The sounds will later be designed to the visuals anyway, and the pre-recorded files can then be laid out as fitting, without spending precious time hunting for textures.

The disadvantage of joining a game project early on is, sadly, that some work might be scrapped and not used in the final game. However, this is a common issue even late in a project and can’t really be avoided, so it isn’t really a disadvantage per se to start with audio early on: already designed sounds can often be repurposed in the game and are ultimately not lost. Starting early also leaves more time for iteration, which is a definite advantage!

For Coders:

Depending on the project and its available resources, it might seem as if implementing audio code-wise will not take a lot of time and can be done on the go. It’s actually quite the contrary: there will be a lot of events to implement, and depending on the complexity of the game, a lot of parameters, logic and so on may be needed. In general it doesn’t take too much time, but it is crucial for audio to work, and it can easily be overlooked and left undone because of the pressure of other coding tasks. So always make sure to plan at least half a day to a day per week for audio code alone! The more time one commits to audio, the better the audio system can get, especially if one talks a lot with the audio designer about how the audio should work and develops a good system for it, although most of this thinking still lies in the hands of the audio person.

Another common mistake is to involve the audio designer far too late in the project. In order for the audio person to plan their workload accordingly, it is always better to inform them about the project as early as possible, so that they can better manage their time and work. The audio designer may also have input that improves the gaming experience in general, as there are many ways to add storytelling through sound design.

Both:

While preparing for the integration, code and audio people might stumble over some communication problems, as several terms in both fields are essentially the same thing but are called differently. This is something one eventually learns through the practice of several projects. Within a project, it is certainly advisable to first do a

Common Ground Check

on how much each person understands of either code or audio. Some might have learned something about code or audio in previous projects and already have a basic understanding of what goes on in both fields. Some might know nothing at all and will need an introduction to the other field. Either way, it helps to get a small overview of the terms that are often encountered in audio and what their counterparts in code are.

Audio Terms and their counterparts in Code!

As audio designers, we often talk about middleware.

You know what that is? Skip to the next paragraph! 

For those who don’t: the most common middleware options out there are Wwise and FMOD. They emerged out of the need for better audio integration, offering tools that are familiar from audio editing without having to delve into code to build them. These tools are missing in game engines like Unity or Unreal, but they are essential for good-sounding audio.

At the start of a project, audio designers will go on about which middleware to use and which version fits the project. Then, most of the time, FMOD or Wwise needs to be integrated into the game engine, which can sometimes be very buggy (a topic for another time). Once this is done, the base is set for audio designers to build their sound structure in the middleware, which will later be integrated (again) into the game engine. For this integration and some basic sound logic, a few common terms are used:

Terms used in Middleware

RTPC?!

When using Wwise, one will stumble upon the term RTPC, short for Real-Time Parameter Control. It is basically a normal game parameter; Wwise just decided to give it a fancy name. In FMOD, the same thing is simply called a parameter. In both middlewares it is commonly used to modulate sound in real time and can be driven by almost anything. While in Wwise it is normally a float, FMOD offers three types of parameters to choose from: continuous (float), discrete (integer) and labeled (string), where the latter works similarly to switches and states (see Switch & States).

The most common use case is, for example, to pair the parameter with a health value and let the sound change the further the health drops. The audio designer defines in the middleware the range of the parameter (e.g. health 0–100), its default value (health: 100) and its name. These must match the parameter used in code, otherwise issues will arise, so it is really important to agree on these points when discussing the needed parameters. A rough sketch of what this looks like on the code side follows below.
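The following is a minimal sketch, assuming a health-driven parameter named "Health" with a range of 0–100 that the audio designer has set up in the middleware. The parameter name, range, game-object handle and event instance are hypothetical; the important part is that the name in code matches the name in the middleware exactly.

#include <AK/SoundEngine/Common/AkSoundEngine.h>  // Wwise SDK
#include <fmod_studio.hpp>                        // FMOD Studio API

// Wwise: an RTPC is set by name, optionally scoped to one game object.
void UpdateHealthRtpcWwise(float health, AkGameObjectID playerGameObject)
{
    // "Health" must be spelled exactly like the RTPC in the Wwise project.
    AK::SoundEngine::SetRTPCValue("Health", health, playerGameObject);
}

// FMOD: the same idea is simply called a parameter, set here on an event instance.
void UpdateHealthParameterFmod(float health, FMOD::Studio::EventInstance* instance)
{
    // "Health" must be spelled exactly like the parameter in the FMOD project.
    instance->setParameterByName("Health", health);
}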

Switch & States

Switches and states are called the same in code, so when an audio designer talks about them, a code person ought to understand what is meant. FMOD is somewhat different with its use of a labeled (string) parameter, but logically it is still the same, only that no distinction is made between a switch and a state. That is something to keep in mind in code; a small sketch of both follows below.
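Here is a minimal sketch of the concepts, again assuming Wwise’s C++ SDK and a recent FMOD Studio API that supports labeled parameters. The group, switch, state and label names, as well as the game-object handle and event instance, are hypothetical and would have to match what the audio designer set up in the middleware.

#include <AK/SoundEngine/Common/AkSoundEngine.h>  // Wwise SDK
#include <fmod_studio.hpp>                        // FMOD Studio API

void SetSurfaceAndGameStateWwise(AkGameObjectID characterGameObject)
{
    // Switch: set per game object, e.g. which surface this character walks on.
    AK::SoundEngine::SetSwitch("Surface", "Grass", characterGameObject);

    // State: set globally, e.g. whether the whole game is in combat right now.
    AK::SoundEngine::SetState("GameState", "Combat");
}

void SetSurfaceFmod(FMOD::Studio::EventInstance* footsteps)
{
    // FMOD has no separate switch/state concept; a labeled (string) parameter
    // covers both, scoped either to an event instance (as here) or globally.
    footsteps->setParameterByNameWithLabel("Surface", "Grass");
}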

Global & Game Object

This is a setting audio designers set up in the middleware independently of the coder, but it often leads to issues when set up wrong. If the audio person has integration issues, it is useful, as a coder, to remember this term and to check whether the sound in Wwise or the parameter in FMOD is set up with the right scope in the middleware.

Both refer to how a triggered event should treat the sound and how often it should be played. Global is used when a value or a sound is shared between all instances. Game Object (Wwise) or Local (FMOD) is used when each instance or game object has its own value or sound. The sketch below shows how the two scopes differ on the code side.
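A minimal sketch of the two scopes, assuming Wwise’s C++ SDK and the FMOD Studio API. The parameter names and handles are hypothetical, as is the assumption that "TimeOfDay" is defined as a global parameter in the FMOD project.

#include <AK/SoundEngine/Common/AkSoundEngine.h>  // Wwise SDK
#include <fmod_studio.hpp>                        // FMOD Studio API

void SetScopesWwise(AkGameObjectID enemyGameObject, float distance, float timeOfDay)
{
    // Game-object scope: this one enemy gets its own value of the RTPC.
    AK::SoundEngine::SetRTPCValue("Distance", distance, enemyGameObject);

    // Global scope: omitting the game object applies the value everywhere.
    AK::SoundEngine::SetRTPCValue("TimeOfDay", timeOfDay);
}

void SetScopesFmod(FMOD::Studio::System* system,
                   FMOD::Studio::EventInstance* enemyInstance,
                   float distance, float timeOfDay)
{
    // Local scope: only this event instance is affected.
    enemyInstance->setParameterByName("Distance", distance);

    // Global scope: set on the Studio system and shared by all instances.
    system->setParameterByName("TimeOfDay", timeOfDay);
}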

Terms used in Game Engine

Soundbanks (.bnk files)

Soundbanks are files generated by the middleware for the game engine. They basically describe which audio files are used, how they should be modulated and when they should be played. Without them, audio will not work in the game engine. This is actually a common source of bugs: sometimes the audio person forgets to build their soundbanks at the end of their work, or the game engine or source control settings are not set up right, so the engine can’t find the banks and the audio seems broken again. Always make sure the soundbanks are set up correctly! A small sketch of loading banks with an explicit error check follows below.
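As a minimal sketch, assuming Wwise’s C++ SDK (a recent version whose LoadBank takes only a name and a bank ID) and the FMOD Studio API; the bank names and paths are hypothetical. The point is to check the result at load time, so that a missing or unbuilt bank is caught immediately instead of the audio just staying silent.

#include <cstdio>
#include <AK/SoundEngine/Common/AkSoundEngine.h>  // Wwise SDK
#include <fmod_studio.hpp>                        // FMOD Studio API

bool LoadBanksWwise()
{
    AkBankID bankId = 0;
    // "Main.bnk" stands for whatever bank the audio designer generated from the Wwise project.
    if (AK::SoundEngine::LoadBank("Main.bnk", bankId) != AK_Success)
    {
        std::printf("Failed to load Main.bnk - was the soundbank generated?\n");
        return false;
    }
    return true;
}

bool LoadBanksFmod(FMOD::Studio::System* system)
{
    FMOD::Studio::Bank* bank = nullptr;
    // FMOD needs the strings bank (for event paths) in addition to the content banks.
    if (system->loadBankFile("Master.bank", FMOD_STUDIO_LOAD_BANK_NORMAL, &bank) != FMOD_OK ||
        system->loadBankFile("Master.strings.bank", FMOD_STUDIO_LOAD_BANK_NORMAL, &bank) != FMOD_OK)
    {
        std::printf("Failed to load FMOD banks - check the bank build and source control paths.\n");
        return false;
    }
    return true;
}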

Events

The event is the object that is added to game objects in the game; it contains specific sound logic and specific sound files and plays the audio file once triggered. The game engine can only make sense of this event once the soundbank is correctly linked, as the bank tells the engine what to do with that event.

E.g. in a game where the player hits enemies all the time, the event might be called sfx_char_hit; it only contains the hit sound and plays it back on collision enter. A small sketch of triggering it follows below.
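A minimal sketch of triggering that event, assuming Wwise’s C++ SDK; the collision callback and the game-object handle are hypothetical (in FMOD the equivalent would be starting an event instance obtained from "event:/sfx_char_hit").

#include <AK/SoundEngine/Common/AkSoundEngine.h>

void OnCollisionEnter(AkGameObjectID characterGameObject)
{
    // The event name must match the event the audio designer created in the
    // middleware, and its soundbank must already be loaded (see above).
    AK::SoundEngine::PostEvent("sfx_char_hit", characterGameObject);
}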

That’s about it!

This is a short glimpse into game audio and by no means a full description. If you are motivated to learn more about game audio and the intricacies of the middlewares, go to their websites, sign up and use their free tutorials. There are also many tutorials on YouTube and elsewhere, with endless possibilities to deepen your knowledge. Have fun learning!
