Render it Real: Computational Techniques to Overcome the Limits of Optical Fidelity and Achieve Realistic and Natural Visual Experience in VR/AR Displays

Nov 9

10:00 AM - 10:25 AM

Description

The presentation explores the fundamental limits of VR/AR HMD optical design and shows how dynamic aberration correction can increase apparent resolution, eliminate color fringing and pupil-swim effects, and enlarge the eye-box in the highest-end displays, such as the Varjo VR-1 and HP Reverb G2. To achieve high resolution, a wide field of view, and a large eye-box, VR/AR head-mounted display makers face challenges that cannot be overcome by hardware design alone. Even the latest and greatest devices retain common flaws that spoil the user experience: blur and color fringing outside a small “sweet spot,” degraded picture quality and distorted geometry at wide gaze angles, and a tiny eye-box. To achieve realistic picture quality and a natural visual experience, the rendering pipeline has to include advanced image pre-processing beyond the standard geometry warp and channel scaling. Almalence Digital Lens is a computational solution that corrects optical aberrations and distortions in head-mounted displays by combining a precise characterization of the HMD’s optical properties with a dynamic aberration correction technique that adjusts on the fly to eye-tracking data.
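For illustration only, the sketch below shows one possible shape of such an eye-tracking-driven correction step: each color channel of the rendered frame is pre-warped with a displacement field looked up from a prior optical characterization of the headset, so the channels arrive aligned at the eye. The function names, the displacement-field provider, and the use of NumPy/SciPy are assumptions made for the example, not Almalence’s actual pipeline.

    # Hypothetical sketch of per-channel, eye-tracking-driven pre-distortion.
    import numpy as np
    from scipy.ndimage import map_coordinates

    def identity_grid(h, w):
        """Pixel-coordinate grid of shape (2, H, W): row coords, then column coords."""
        rows, cols = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
        return np.stack([rows, cols]).astype(np.float32)

    def correct_frame(rgb, pupil_xy, displacement_for_pupil):
        """Pre-warp each color channel so that, after passing through the HMD
        optics, the channels land aligned on the retina (counteracting color
        fringing and pupil swim).

        rgb                    : (H, W, 3) float image about to be displayed.
        pupil_xy               : (x, y) pupil position from the eye tracker.
        displacement_for_pupil : callable returning a (3, 2, H, W) array of
                                 per-channel pixel displacements for this pupil
                                 position, derived from a prior characterization
                                 of the headset optics (assumed interface).
        """
        h, w, _ = rgb.shape
        grid = identity_grid(h, w)
        disp = displacement_for_pupil(pupil_xy)
        out = np.empty_like(rgb)
        for c in range(3):
            # Sample the source image at displaced coordinates: the inverse of
            # the aberration the optics will introduce for this channel at this gaze.
            out[..., c] = map_coordinates(rgb[..., c], grid + disp[c],
                                          order=1, mode="nearest")
        return out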

Speakers

CEO, Almalence

Related Sessions

Nov 9

10:30 AM - 10:55 AM

Description

Leading head-mounted displays deliver immersive visuals and audio, but they’re packaged with plastic controllers designed for gaming. Enterprise and government VR users need to use their hands naturally and feel what they see. The success of mission-critical VR applications often depends on sufficient realism and accuracy. Learn how a new generation of wearable products delivers true-contact haptics through a combination of detailed tactile feedback, force-feedback exoskeletons, and sub-millimeter motion tracking. Hear how application developers for enterprise and government have embraced this technology to improve virtual training, design, education, and even robotics.

Speakers

Chief Revenue Officer, HaptX
Nov 9

11:50 AM - 12:45 PM

Description

We are living through a watershed year for immersive experiences. Our COVID isolation has accelerated every form of immersive experience: streaming concerts, augmented museums, virtual meetings, and immersive storytelling. All of these verticals have reached out to the world of immersive storytellers to keep us all connected. Across all of these activities, spatial audio has emerged as one of the most important and visceral elements of the immersive storytelling toolbox. Apple has unleashed head-tracked spatial audio on more than 90 million AirPods Pro users; Dolby Atmos, Sony 360, and Apple have been accelerating the adoption of spatial audio standards; and customers are now aware of the power of immersive sound. Join us as we discuss spatial audio production, delivery methods, and the ever-expanding customer experiences made possible by spatial audio. We will imagine the future of spatial audio and talk about the evolution of practical tools that are available today.

Speakers

Founder, CrossBorderWorks
Founder / CEO, Mach1
Audio Experience Innovation Lead, Bose
Managing Partner, WXR Fund
Nov 9

02:15 PM - 02:40 PM

Description

Coming soon!

Speakers

Competency Supervisor, Wireless & Mobility Solutions, Naval Information Warfare Center Atlantic
Nov 9

02:45 PM - 03:10 PM

Description

Journey with national and international XR speaker Christopher Lafayette as he covers the intriguing world of Non-Fungible Tokens and what they bring to the XR ecosystem. Chris will cover the following topics:
1. Non-Fungible Tokens defined, today
2. How NFTs are made: art vs. natural assets
3. Where they are stored
4. NFTs in virtual worlds
5. Relevant industries

Speakers

Emergent Technologist, HoloPractice