Lesson 10 made sight readable on screen. This lesson makes sound honest: footsteps that match speed and surface, attenuation that sells distance, and stingers that fire when AI alert changes—the same enum or blackboard tier you already drive in Lesson 7 and surface in the HUD.



Lesson objective

By the end of this lesson you will have:

  1. Footstep (and optional foley) events on your Animation Notifies that call a small Blueprint or C++ interface so walk / jog / crouch all sound different.
  2. Attenuation and occlusion (or a manual low-pass driven by line-of-sight rules) so distance and walls sound to the player the way the Lesson 5–7 hearing logic already treats them.
  3. Short stingers or one-shots triggered from AI state changes (idle → suspicious → investigate → engaged) with headroom left for music and VO.

Step 1: One bus, one lie detector

Create or name Submixes so you can duck and meter without guesswork:

| Submix | Holds |
| --- | --- |
| SMX_SFX_Gameplay | Footsteps, impacts, world interactions |
| SMX_AI_Feedback | Stingers tied to alert (not long music) |
| SMX_Music | Bed (optional for slice) |
| SMX_UI | Menus and HUD clicks (Lesson 10) |

Route your player and AI gameplay sounds into SMX_SFX_Gameplay first. Stingers that must read over music go to SMX_AI_Feedback with a slightly higher send or volume curve—still clamped so peaks do not clip the master.
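The routing rule above (stingers ride hotter than gameplay SFX, but the summed peak stays clamped) can be sketched as plain math. This is a hedged illustration, not engine code; `SubmixSends` and `MixPeak` are hypothetical names, and the send values are tuning assumptions.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Hypothetical per-submix gain staging: AI feedback stingers get a slightly
// hotter send than the gameplay bed, but the summed master peak is clamped
// at 1.0 linear (0 dBFS) so nothing clips.
struct SubmixSends {
    float sfxGameplay = 0.8f;  // base gameplay bed
    float aiFeedback  = 1.0f;  // stingers must read over the bed
};

// Mix two linear peak levels through their sends and clamp the master.
inline float MixPeak(float sfxPeak, float stingerPeak, const SubmixSends& s) {
    float sum = sfxPeak * s.sfxGameplay + stingerPeak * s.aiFeedback;
    return std::min(sum, 1.0f);  // never exceed the master ceiling
}
```

The same clamp-at-the-top idea applies whether you do it with a limiter on the master submix or with curve math in a Blueprint.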


Step 2: Footsteps from notifies (not tick spam)

  1. Open your locomotion Animation Blueprint or montage.
  2. Add Anim Notifies at heel contacts for walk and run cycles (offset if your mocap differs).
  3. Each notify calls PlayFootstep on the pawn (or AnimNotify_Blueprint that plays a Sound Base).
  4. Pass surface type: line trace down from foot bone, read Physical Material → map to switch (concrete, metal, grass, carpet).

Pro tip: crouch-walk should use quieter cues or fewer notifies per cycle so the AI hearing radius from Lesson 5 matches what the player expects to be heard.
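Steps 3–4 of the notify flow (read the Physical Material, map it to a cue, quiet it down when crouching) boil down to a lookup plus a volume scale. A minimal sketch, assuming hypothetical `PM_*` material names and a made-up `SurfaceFromPhysMat` helper; your project's material names will differ:

```cpp
#include <cassert>
#include <string>
#include <unordered_map>

// The anim notify traces down from the foot bone, reads the Physical
// Material, and maps its name to a footstep cue key.
enum class Surface { Concrete, Metal, Grass, Carpet };

inline Surface SurfaceFromPhysMat(const std::string& physMatName) {
    static const std::unordered_map<std::string, Surface> kMap = {
        {"PM_Concrete", Surface::Concrete},
        {"PM_Metal",    Surface::Metal},
        {"PM_Grass",    Surface::Grass},
        {"PM_Carpet",   Surface::Carpet},
    };
    auto it = kMap.find(physMatName);
    return it != kMap.end() ? it->second : Surface::Concrete;  // safe default
}

// Crouch-walk plays the same cue quieter, per the pro tip above.
inline float FootstepVolume(bool crouching) { return crouching ? 0.35f : 1.0f; }
```

Always return a default surface rather than silence: a missing Physical Material should sound like concrete, not like a bug.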


Step 3: Attenuation assets that match your level scale

  1. Create Sound Attenuation settings: log or linear falloff tuned to your greybox units (Lesson 3).
  2. Enable spatialization appropriate to your target (headphones vs TV).
  3. Set max distance so AI hearing and the player's ears agree: if the AI can hear you at 2000 units, the player should hear something meaningful at that range, or the audio will feel unfair.

Gameplay note: loud gadgets from Lesson 8 should use their own attenuation or larger inner radius so distraction reads as intentional.
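The log-vs-linear choice in step 1 is easier to tune if you can see the numbers. Here is a hedged sketch of the two falloff shapes in greybox units; the `LogFalloff` body uses a simple 1/d approximation of a logarithmic curve, not the engine's exact curve:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Linear: straight ramp from the inner radius to max distance.
inline float LinearFalloff(float dist, float inner, float maxDist) {
    if (dist <= inner)   return 1.0f;
    if (dist >= maxDist) return 0.0f;
    return 1.0f - (dist - inner) / (maxDist - inner);
}

// "Log-ish": halves roughly per doubling of distance past the inner radius.
inline float LogFalloff(float dist, float inner, float maxDist) {
    if (dist <= inner)   return 1.0f;
    if (dist >= maxDist) return 0.0f;
    return inner / dist;  // 1/d approximation of a log curve
}
```

Note how the log shape keeps distant sources faintly audible for longer, which is usually the right feel for the 2000-unit AI hearing range above.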


Step 4: Occlusion without buying a middleware license (slice scope)

Pick one approach for the vertical slice:

  • Option A — Audio Volume brushes: Place Reverb / filter volumes in doorways and thick walls; overlap reduces highs on SFX when the listener and source are in different volumes.
  • Option B — Blueprint gate: When line trace from listener to sound fails, apply low-pass or volume scale on footstep Sound Class for that play only.
  • Option C — Native occlusion (if enabled): Use engine occlusion settings on attenuation where your platform and project settings allow—verify on headphones once.

Avoid three systems at once; ship one readable rule.
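Option B is small enough to sketch in full. This is an illustrative stand-in for the Blueprint gate, with made-up names (`OcclusionResult`, `GateForPlayback`) and assumed tuning values, not engine defaults:

```cpp
#include <cassert>

// If the listener-to-source trace is blocked, scale volume and apply a
// low-pass for that one playback only; otherwise play clean.
struct OcclusionResult {
    float volumeScale;      // applied to this play only
    float lowpassCutoffHz;  // 20000 Hz = effectively open
};

inline OcclusionResult GateForPlayback(bool lineOfSightBlocked) {
    if (!lineOfSightBlocked) return {1.0f, 20000.0f};
    return {0.5f, 1200.0f};  // muffle: halve volume, roll off the highs
}
```

Because the result applies per playback rather than to the Sound Class globally, one occluded guard does not muffle your own footsteps, which is exactly what mini challenge 3 below checks.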


Step 5: Stingers tied to AI state (same authority as HUD)

Reuse the alert tier or blackboard key that MissionDirector / AI controller already updates:

  1. On state enter (Behavior Tree Decorator, AI Perception callback, or Notify from your Lesson 7 flow), call PlayAlertStinger with an enum value.
  2. Map states to cues: Suspicious → short tick or riser (0.3–0.6 s); Investigate → low brass or pulse; Engaged → hit that stops on cooldown or death.
  3. Cooldown the stinger channel (1–2 s) so perception spam does not machine-gun the player’s ears.

Critical: if Lesson 10’s HUD shows Investigate, the stinger for Investigate must fire from the same transition—no orphan sounds.
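The cooldown in step 3 is the part people get wrong, so here it is as a standalone sketch. `StingerChannel` and `TryFire` are hypothetical names; in the project this would live on the component that owns `PlayAlertStinger`:

```cpp
#include <cassert>

// The channel refuses to retrigger inside the cooldown window, so
// perception spam cannot machine-gun the player's ears.
struct StingerChannel {
    float cooldownSeconds = 1.5f;   // 1-2 s per the text above
    float lastFireTime    = -1e9f;  // "never fired"

    // Returns true if the stinger actually fires at time 'now' (seconds).
    bool TryFire(float now) {
        if (now - lastFireTime < cooldownSeconds) return false;
        lastFireTime = now;
        return true;
    }
};
```

Gate the playback, not the state change: the HUD tier from Lesson 10 must still update on every transition even when the stinger stays silent.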


Step 6: Mix headroom and LUFS sanity

  1. Keep the master or submix limiter light enough that footsteps still punch while sitting 6–10 dB below stinger peaks.
  2. Music, if present, ducks 2–4 dB under stingers using submix send or Blueprint mixer snapshot on Engaged.
  3. Capture 30 s of worst-case combat + footsteps and listen on one bad speaker—if sizzle disappears, your highs are overcrowded.
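The headroom targets above are easier to hit if you convert them to linear gain. A quick sketch of the standard dB-to-linear math (20·log10 amplitude convention); `GainBelow` is a made-up helper name:

```cpp
#include <cassert>
#include <cmath>

// Amplitude dB to linear gain: -6 dB is roughly half amplitude.
inline float DbToLinear(float db) { return std::pow(10.0f, db / 20.0f); }

// Gain that places a source 'headroomDb' below a reference peak,
// e.g. footsteps 6-10 dB under stingers, music ducked 2-4 dB.
inline float GainBelow(float referencePeak, float headroomDb) {
    return referencePeak * DbToLinear(-headroomDb);
}
```

So if stingers peak at 1.0 linear, footsteps at 6 dB of headroom sit near 0.5, and a 2–4 dB music duck lands around 0.63–0.79 of the bed's level.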

Mini challenge

  1. Jog across metal then carpet—confirm AI hearing perception updates match audible difference.
  2. Force Suspicious three times in five seconds—confirm stinger cooldown prevents overlap spam.
  3. Stand one wall away from a patrolling guard—occlusion or volume should soften their footsteps without muting yours incorrectly.

Troubleshooting

| Symptom | Likely cause | Fix |
| --- | --- | --- |
| Double footsteps | Notify on both feet and mirrored anim | One notify per contact; check mirrored assets |
| Silent in packaged build | Sound not cooked / bank path | Validate include in packaging; check staging |
| Stinger never plays | Wrong state hook | Print enum on transition; align with BT task |
| Everything flat | 2D non-spatial sounds | Enable attenuation on wave or cue |
| Music masks stingers | No ducking | Lower music submix on Engaged snapshot |

Summary

  • Notifies tie animation time to audio time.
  • Attenuation sells distance; occlusion sells geometry.
  • Stingers are state UI for the ears—keep them sparse and authoritative.

FAQ

MetaSounds vs Sound Cues?
Sound Cues are enough for many slices; MetaSounds help when you want modular layers and parameters without graph explosion.

VO in trailer vs in slice?
Keep VO on SMX_UI or its own bus so ducking rules do not fight gameplay SFX.


Next: Lesson 12: Lighting and Visibility Tuning locks lighting and exposure so silhouette and readability match what players heard here—Lumen and post tuned for stealth, not cinematic mood alone. Finish this lesson when alert stingers and footsteps survive a packaged Development build, not only PIE.