Journey with XR national and international speaker Christopher Lafayette as he covers the intriguing world of Non-Fungible Tokens and what they bring to the XR ecosystem. Chris will cover: 1) Non-Fungible Tokens defined, today; 2) How NFTs are made: Art vs Natural Assets; 3) Where they are stored; 4) NFTs in virtual worlds; 5) Relevant industries.
Leading head-mounted displays deliver immersive visuals and audio, but they're packaged with plastic controllers designed for gaming. Enterprise and government VR users need to use their hands naturally and feel what they see. The success of mission-critical VR applications often depends on having sufficient realism and accuracy. Learn how a new generation of wearable products delivers true-contact haptics through a combination of detailed tactile feedback, force-feedback exoskeletons, and sub-millimeter motion tracking. Hear how application developers for enterprise and government have embraced this technology to improve virtual training, design, education, and even robotics.
Travel through Qualcomm’s plans for helping developers scale up the future of AR.
We are living through a watershed year for immersive experiences. Our COVID isolation has accelerated every form of immersive experience: streaming concerts, augmented museums, virtual meetings, and immersive storytelling. All of these verticals have reached out to the world of immersive storytellers to keep us all connected. Across all of these activities, spatial audio has emerged as one of the most important and visceral elements of the immersive storytelling toolbox. Apple has unleashed head-tracked spatial audio on more than 90 million AirPods Pro users; Dolby Atmos, Sony 360 Reality Audio, and Apple have been accelerating the adoption of spatial audio standards; and customers are now aware of the power of immersive sound. Join us as we discuss spatial audio production, delivery methods, and the ever-expanding customer experiences made possible by spatial audio. We will imagine the future of spatial audio and talk about the evolution of practical tools that are available today.
Conor will dive into OpenBCI's latest development, Galea, which combines mixed reality (XR) headsets with state-of-the-art biosensing and brain-computer interfacing (BCI) techniques. Galea comes equipped with multiple sensors to simultaneously monitor biometric data streams in real time and is designed to attach seamlessly to head-mounted displays (HMDs), including virtual reality (VR) and augmented reality (AR) devices. The talk will also explore the remarkable possibilities and potential pitfalls of introducing this technology to the world.
Eye tracking will be a critical success factor in making interaction with AR/MR devices more natural and intuitive, paving the way for mainstream AR. Eye data are, for example, essential for displaying information exactly where the wearer's attention is, without interrupting the immersive experience. Through versatile eye tracking-based applications, from iris identification to improving image quality in VR/gaming, providers can take their XR experiences to a new level and gain access to the most intuitive human-machine interface. However, integrating eye tracking technology is challenging, mainly due to the size of the components and the complexity of the technology. There is also the risk of patent infringement. In this session, you will learn how complex eye tracking technology can be condensed into preconfigured, calibrated, and ready-to-use modules, enabling easy integration for providers of smart glasses. You will also hear how eye tracking will evolve over the coming years and how more and more functions will be miniaturized into an ever-smaller form factor.