Introducing Spatial Audio Capabilities for Xbox and Windows

Description

A key component of audio immersion is sound localization: sounds appearing
to come from a specific position in space. In both virtual and augmented worlds,
spatial perception can be a critical component of successful user interactions.
This talk covers new capabilities in the Windows 10 Creators Update that enable
consistent and intuitive spatial audio implementations spanning devices (headphones
and speakers), experiences, and technologies.

Day: 1

Code: GDC2017-002

The Discussion

  • chkpnt

    If I've understood correctly, the architecture of ISpatialAudioClient is channel-based (0:25:00). So how does the Dolby Atmos encoding work when not using "Dynamic Objects", given that its ceiling speakers can only be used for objects? How are the objects "extracted" from the channels?

    Wouldn't it be more natural to use a technique like Auro 3D [1], where the height channels are encoded within an uncompressed 5.1 PCM signal? Or are you using Auro 3D somewhere in the background? A game called "Get Even", which uses Auro 3D for spatial sound [2], will be released soon. But since its development started some time ago, I guess it isn't using Windows Sonic, is it?

    [1] http://www.barco.com/secureddownloads/cd/MarketingKits/3d-sound/White%20papers/Auro%2011.1_versus_objectbased_sound_in_3D.pdf
    [2] https://www.youtube.com/watch?v=vTCv5P0PI5w

  • Lifespan

    @chkpnt: ISpatialAudioClient is actually object-based, with the ability to tie an object to a channel, so it can be used either way. The rendering system can currently use a variety of endpoints such as Dolby Atmos for home theater, Dolby Atmos for headphones, and Windows Sonic for Headphones (which is also used for HoloLens). The headphone technologies are variations of HRTF, as is Auro 3D.
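
    For illustration only (this sketch is not from the talk), here is roughly what
    "either way" looks like against the documented ISpatialAudioClient API: a single
    render stream carrying a channel-bound static object alongside a positional
    dynamic object. RenderSpatialSketch is a hypothetical helper name, error handling
    and the real PCM source are omitted, and the memset calls stand in for copying
    actual audio frames; dynamic-object activation can fail if no spatial format
    (Windows Sonic or Dolby Atmos) is enabled on the endpoint.

        // Sketch: one channel-bound static object plus one dynamic object on the
        // same ISpatialAudioObjectRenderStream.
        #include <windows.h>
        #include <mmreg.h>
        #include <mmdeviceapi.h>
        #include <spatialaudioclient.h>
        #include <wrl/client.h>
        #include <cstring>

        using Microsoft::WRL::ComPtr;

        void RenderSpatialSketch()  // hypothetical helper, error checks omitted
        {
            CoInitializeEx(nullptr, COINIT_MULTITHREADED);

            // Activate ISpatialAudioClient on the default render endpoint.
            ComPtr<IMMDeviceEnumerator> enumerator;
            CoCreateInstance(__uuidof(MMDeviceEnumerator), nullptr, CLSCTX_ALL,
                             IID_PPV_ARGS(&enumerator));
            ComPtr<IMMDevice> device;
            enumerator->GetDefaultAudioEndpoint(eRender, eMultimedia, &device);
            ComPtr<ISpatialAudioClient> sac;
            device->Activate(__uuidof(ISpatialAudioClient), CLSCTX_INPROC_SERVER,
                             nullptr, reinterpret_cast<void**>(sac.GetAddressOf()));

            // Per-object format: mono 32-bit float PCM at 48 kHz.
            WAVEFORMATEX fmt = {};
            fmt.wFormatTag = WAVE_FORMAT_IEEE_FLOAT;
            fmt.nChannels = 1;
            fmt.nSamplesPerSec = 48000;
            fmt.wBitsPerSample = 32;
            fmt.nBlockAlign = fmt.nChannels * fmt.wBitsPerSample / 8;
            fmt.nAvgBytesPerSec = fmt.nSamplesPerSec * fmt.nBlockAlign;

            // Request a stereo channel bed plus room for one dynamic object.
            HANDLE bufferEvent = CreateEvent(nullptr, FALSE, FALSE, nullptr);
            SpatialAudioObjectRenderStreamActivationParams params = {};
            params.ObjectFormat = &fmt;
            params.StaticObjectTypeMask = static_cast<AudioObjectType>(
                AudioObjectType_FrontLeft | AudioObjectType_FrontRight);
            params.MinDynamicObjectCount = 0;
            params.MaxDynamicObjectCount = 1;
            params.Category = AudioCategory_GameEffects;
            params.EventHandle = bufferEvent;

            PROPVARIANT pv = {};
            pv.vt = VT_BLOB;
            pv.blob.cbSize = sizeof(params);
            pv.blob.pBlobData = reinterpret_cast<BYTE*>(&params);

            ComPtr<ISpatialAudioObjectRenderStream> stream;
            sac->CreateSpatialAudioStream(&pv, IID_PPV_ARGS(&stream));

            // One object locked to the front-left channel, one free to move.
            ComPtr<ISpatialAudioObject> bedObject, dynObject;
            stream->ActivateSpatialAudioObject(AudioObjectType_FrontLeft, &bedObject);
            stream->ActivateSpatialAudioObject(AudioObjectType_Dynamic, &dynObject);
            stream->Start();

            for (int pass = 0; pass < 1000; ++pass)
            {
                // Wait until the pipeline asks for the next buffer of frames.
                WaitForSingleObject(bufferEvent, 100);

                UINT32 availableDynamicObjects = 0, frameCount = 0;
                stream->BeginUpdatingAudioObjects(&availableDynamicObjects,
                                                  &frameCount);

                BYTE* buffer = nullptr;
                UINT32 bufferLength = 0;

                bedObject->GetBuffer(&buffer, &bufferLength);
                std::memset(buffer, 0, bufferLength);       // copy real PCM here

                dynObject->GetBuffer(&buffer, &bufferLength);
                std::memset(buffer, 0, bufferLength);       // copy real PCM here
                dynObject->SetPosition(0.0f, 2.0f, -1.0f);  // 2 m up, 1 m ahead

                stream->EndUpdatingAudioObjects();
            }

            stream->Stop();
            CloseHandle(bufferEvent);
            CoUninitialize();
        }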
