
Why Immersive Audio Is a Critical Component of Immersive Experiences: A Practical Guide to Production, Delivery and Rendering of Spatial Audio

Nov 9

11:50 AM - 12:45 PM

Description

We are living through a watershed year for immersive experiences. COVID isolation has accelerated every form of immersive experience: streaming concerts, augmented museums, virtual meetings and immersive storytelling. All of these verticals have turned to the world of immersive storytellers to keep us connected. Across all of these activities, spatial audio has emerged as one of the most important and visceral elements of the immersive storytelling toolbox. Apple has unleashed head-tracked spatial audio on more than 90 million AirPods Pro users; Dolby Atmos, Sony 360 and Apple have been accelerating the adoption of spatial audio standards; and customers are now aware of the power of immersive sound. Join us as we discuss spatial audio production, delivery methods and the ever-expanding customer experiences made possible by spatial audio. We will imagine the future of spatial audio and talk about the evolution of practical tools that are available today.

Speakers

Founder, CrossBorderWorks
Audio Experience Innovation Lead, Bose
Founder / CEO, Mach1
Managing Partner, WXR Fund

Related Sessions

Nov 9

10:30 AM - 10:55 AM

Description

Leading head-mounted displays deliver immersive visuals and audio, but they’re packaged with plastic controllers designed for gaming. Enterprise and government VR users need to use their hands naturally and feel what they see. The success of mission-critical VR applications often depends on sufficient realism and accuracy. Learn how a new generation of wearable products delivers true-contact haptics through a combination of detailed tactile feedback, force-feedback exoskeletons, and sub-millimeter motion tracking. Hear how application developers for enterprise and government have embraced this technology to improve virtual training, design, education and even robotics.

Speakers

Chief Revenue Officer, HaptX
Nov 9

11:20 AM - 11:45 AM

Description

Travel through Qualcomm’s plans for helping developers scale up the future of AR.

Speakers

Director of Product Management, Qualcomm
Nov 9

02:45 PM - 03:10 PM

Description

Journey with XR national and international speaker Christopher Lafayette as he covers the intriguing world of non-fungible tokens (NFTs) and what they bring to the XR ecosystem. Chris will cover the topics of: 1) Non-Fungible Tokens Defined, Today; 2) How NFTs Are Made: Art vs. Natural Assets; 3) Where Are They Stored?; 4) NFTs in Virtual Worlds; and 5) Relevant Industries.

Speakers

Emergent Technologist, HoloPractice
Nov 9

03:15 PM - 03:40 PM

Description

Conor will dive into OpenBCI's latest development: Galea, which combines mixed reality (XR) headsets with state-of-the-art biosensing and brain-computer interfacing (BCI) techniques. Galea comes equipped with multiple sensors to simultaneously monitor biometric data streams in real time and is designed to seamlessly attach to head-mounted displays (HMDs), including virtual reality (VR) and augmented reality (AR) devices. The talk will also explore the amazing possibilities and potential pitfalls of introducing this technology to the world.

Speakers

Founder & CEO, OpenBCI
Nov 9

04:35 PM - 05:00 PM

Description

Eye tracking will be a critical success factor in making interaction with AR/MR devices more natural and intuitive, paving the way for mainstream AR. Eye data are, for example, essential for displaying information exactly where the wearer’s attention is, without interrupting the immersive experience. Through versatile eye tracking-based applications, from iris identification to improving image quality in VR and gaming, providers can take their XR experiences to a new level and gain access to the most intuitive human-machine interface. However, integrating eye tracking technology is challenging, mainly due to the size of the components and the complexity of the technology. There is also the risk of patent infringement. In this session, you will learn how complex eye tracking technology can be condensed into preconfigured, calibrated and ready-to-use modules, enabling easy integration for providers of smart glasses. You will also hear how eye tracking will evolve over the coming years and how more and more functions will fit into ever smaller form factors.

Speakers

CEO & Owner, Viewpointsystem