Should You Always Use an Audio Manager in Unity?

Omer Faran

When building a game in Unity, especially as a solo developer or a small team, it’s common to implement an AudioManager as a centralized system to play sound effects. This can simplify things: you can call something like AudioManager.Instance.PlaySFX("button_click") from anywhere and not worry about where or how the sound is played.

How AudioManager is Implemented

A basic AudioManager is often implemented as a singleton, which means it can be accessed globally from any script without needing to reference it manually. Here's a simple example:

using UnityEngine;

public class AudioManager : MonoBehaviour
{
    public static AudioManager Instance;

    public AudioSource sfxSource;
    public AudioClip buttonClickClip;

    void Awake()
    {
        // Classic singleton: keep the first instance alive across scene loads
        // and destroy any duplicates that appear in later scenes.
        if (Instance == null)
        {
            Instance = this;
            DontDestroyOnLoad(gameObject);
        }
        else
        {
            Destroy(gameObject);
        }
    }

    public void PlaySFX(string name)
    {
        // A real manager would look clips up in a dictionary keyed by name;
        // a single hardcoded check keeps the example short.
        if (name == "button_click")
        {
            sfxSource.PlayOneShot(buttonClickClip);
        }
    }
}

With this setup, any script can simply call AudioManager.Instance.PlaySFX("button_click") without needing a direct reference to the AudioManager. This makes it very convenient for playing UI sounds.
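For example, a UI script might trigger it from a button’s OnClick event. This is just a sketch; MenuButton and OnButtonClick are hypothetical names:

using UnityEngine;

public class MenuButton : MonoBehaviour
{
    // Wire this method to the Button's OnClick event in the Inspector.
    public void OnButtonClick()
    {
        AudioManager.Instance.PlaySFX("button_click");
    }
}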

But should you always use an AudioManager? Especially when working with 3D audio, the answer is often no.


The Problem with 3D Sounds in an Audio Manager

An AudioManager typically lives on a GameObject in a persistent scene and is accessed globally via a singleton. This works well for 2D sounds like UI clicks or global notifications, but it breaks down when you try to play 3D positional sounds.

Let’s say you want to play a footstep sound at the player's location. You call:

AudioManager.Instance.PlaySFX("footsteps");

But the AudioManager’s AudioSource sits on its own GameObject, usually nowhere near the actual sound source, so the clip plays from the wrong position, such as the world origin (0, 0, 0).

You might try to fix this by passing a position to the AudioManager:

AudioManager.Instance.PlaySFX("footsteps", somePosition);

But this requires instantiating or repositioning an AudioSource, adding unnecessary complexity and defeating the purpose of centralization.
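To make this concrete, here’s a rough sketch of what such an overload might look like using Unity’s built-in AudioSource.PlayClipAtPoint. The footstepsClip field is hypothetical, added just for this example:

// Hypothetical additions to the AudioManager above:
public AudioClip footstepsClip;

public void PlaySFX(string name, Vector3 position)
{
    if (name == "footsteps")
    {
        // PlayClipAtPoint spawns a temporary GameObject with an AudioSource
        // at the given position, plays the clip, then destroys the object.
        AudioSource.PlayClipAtPoint(footstepsClip, position);
    }
}

It works, but every call now allocates a throwaway GameObject, and PlayClipAtPoint returns no handle, so you can’t route the sound through an Audio Mixer group or reuse carefully tuned AudioSource settings.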


A Better Alternative: Audio Player Instances

For 3D sounds, it’s much easier to attach a dedicated AudioPlayer script to the object that emits the sound (e.g. a character's foot or a weapon).

Benefits:

  • Audio plays at the correct location without any complex setup.

  • AudioSource settings like spatial blend are local to the emitter.

  • Easy to adjust volume/pitch per instance.

Here’s a simple example of a basic AudioPlayer script:

using UnityEngine;

public class AudioPlayer : MonoBehaviour
{
    public AudioSource audioSource; // sits on this GameObject, so its 3D settings apply here
    public AudioClip clip;

    public void Play()
    {
        // PlayOneShot lets repeated sounds overlap on the same source.
        if (clip != null)
        {
            audioSource.PlayOneShot(clip);
        }
    }
}

Attach this script to any GameObject (like the player or a weapon), assign the AudioSource and AudioClip in the Inspector, and then call Play() whenever that object needs to emit a sound. You can also tweak the AudioSource directly on the GameObject to customize spatial settings, volume, or route it to a specific group in your Audio Mixer.
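For instance, a footstep might be triggered from an animation event on the walk cycle. FootstepEmitter and OnFootstep are hypothetical names for this sketch:

using UnityEngine;

public class FootstepEmitter : MonoBehaviour
{
    public AudioPlayer footstepPlayer; // the AudioPlayer on this character

    // Called by an animation event placed on the walk cycle.
    public void OnFootstep()
    {
        footstepPlayer.Play();
    }
}

Because the AudioPlayer lives on the character itself, the sound originates at the character’s position with no extra bookkeeping.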

You can still keep the AudioManager for 2D or global sounds. Best of both worlds (:


So, in conclusion

Use an AudioManager when it makes sense:

  • UI clicks

  • Music

  • 2D overlays or notifications

But for 3D positional audio, prefer AudioPlayer instances tied to the object that owns the sound.

It’s not an either-or decision. The best games use both approaches together, depending on context.
