Augmented reality will fail without a way to adapt to the user's context: poorly timed pop-ups lead to distraction, frustration, and confusion. This talk presents the world's first contact-free facial expression and emotion sensing glasses. Applications include content testing in real-world environments, adaptive displays, and hands-free gesture interaction. The presentation includes a live demo.
Learn how Medtronic built a framework to introduce Augmented Reality (AR) to their teams. The goal is to involve and align different stakeholders by presenting AR as a solution rather than technology hype. William Harding and Peter Tortorici (both Medtronic) and Dirk Schart (REFLEKT) will use the example of Training Guides to show the framework and the impact of the solution. Medtronic's AR Training Guides reduce onboarding times by 50%. They will dive deep into operator onboarding with the EPIX use case to show you exactly how the switch to AR Training Guides transformed the way their facilities work. What's more, they will give you a full understanding of how Medtronic reached its scaled rollout, from the initial problem and project planning, to pilot, to full deployment, along with the results they are now seeing. Here's what we will talk about:
• Medtronic use case – Why were AR Training Guides deployed?
• Daily usage – What the Medtronic AR Training Guides look like and how operators use them on a normal day
• Benefits – What results the facilities are seeing after implementing AR Training Guides
• Framework – How easy is it to enable teams to use new technologies?
Leading head-mounted displays deliver immersive visuals and audio, but they're packaged with plastic controllers designed for gaming. Enterprise and government VR users need to use their hands naturally and feel what they see. The success of mission-critical VR applications often depends on sufficient realism and accuracy. Learn how a new generation of wearable products delivers true-contact haptics through a combination of detailed tactile feedback, force-feedback exoskeletons, and sub-millimeter motion tracking. Hear how enterprise and government application developers have embraced this technology to improve virtual training, design, education, and even robotics.
Travel through Qualcomm’s plans for helping developers scale up the future of AR.
We are living through a watershed year for immersive experiences. COVID-19 isolation has accelerated every form of immersive experience: streaming concerts, augmented museums, virtual meetings, and immersive storytelling. All of these verticals have turned to the world of immersive storytellers to keep us connected. Across all of these activities, spatial audio has emerged as one of the most important and visceral elements of the immersive storytelling toolbox. Apple has unleashed head-tracked spatial audio on more than 90 million AirPods Pro users; Dolby Atmos, Sony 360, and Apple have been accelerating the adoption of spatial audio standards; and customers are now aware of the power of immersive sound. Join us as we discuss spatial audio production, delivery methods, and the ever-expanding customer experiences made possible by spatial audio. We will imagine the future of spatial audio and talk about the evolution of the practical tools available today.
P&C's smart AR glasses are a wearable HMD equipped with an AI engine and the Qualcomm Snapdragon XR2 platform. Information is displayed clearly on the HMD as a flat graphic overlay or as augmented 3D objects and immersive content. The AR glasses use a slim optical system with a see-through design, built on the Snapdragon XR2 platform, the best solution for the AR industry. P&C's proprietary vSLAM platform enables convenient augmented reality, seamlessly merging real and augmented objects in medicine, education, military, and smart factory applications.