Learn how Medtronic built a framework to introduce Augmented Reality (AR) to their teams. The goal is to involve and align different stakeholders by presenting AR as a solution rather than technology hype. William Harding, Peter Tortorici (both Medtronic), and Dirk Schart (REFLEKT) will use the example of Training Guides to show the framework and the impact of the solution.
Medtronic’s AR Training Guides reduce onboarding times by 50%. They will take a deep dive into operator onboarding with the EPIX use case to show you exactly how the switch to AR Training Guides transformed the way they work in their facilities. What’s more, they will give you a full understanding of how Medtronic made it to their scaled rollout – all the way from the initial problem and project planning, to pilot, to full deployment, as well as the results they are now seeing. Here's what we will talk about:
• Medtronic Use Case – Why were AR Training Guides deployed?
• Daily Usage – See what the Medtronic AR Training Guides look like and how operators use them in a normal day
• Benefits – What results are the facilities seeing after implementing AR Training Guides?
• Framework - How easy is it to enable teams to use new technologies?
Leading head-mounted displays deliver immersive visuals and audio, but they’re packaged with plastic controllers designed for gaming. Enterprise and government VR users need to use their hands naturally and feel what they see. The success of mission-critical VR applications often depends on having sufficient realism and accuracy. Learn how a new generation of wearable products delivers true-contact haptics through a combination of detailed tactile feedback, force-feedback exoskeletons, and sub-millimeter motion tracking. Hear how enterprise and government application developers have embraced this technology to improve virtual training, design, education, and even robotics.
Travel through Qualcomm’s plans for helping developers scale up the future of AR.
We are living through a watershed year for immersive experiences. Our COVID isolation has accelerated every form of immersive experience: streaming concerts, augmented museums, virtual meetings, and immersive storytelling. All of these verticals have reached into the world of immersive storytellers to keep us all connected.
Across all of these activities, spatial audio has emerged as one of the most important and visceral elements of the immersive storytelling toolbox.
Apple has unleashed head-tracked spatial audio on more than 90 million AirPods Pro users; Dolby Atmos, Sony 360, and Apple have been accelerating the adoption of spatial audio standards; and customers are now aware of the power of immersive sound.
Join us as we discuss spatial audio production, delivery methods, and the ever-expanding customer experiences made possible by spatial audio. We will imagine the future of spatial audio and discuss the evolution of the practical tools available today.
P&C’s smart AR Glasses are a wearable HMD equipped with an AI engine and the XR2 platform. Information is displayed clearly on the HMD as a flat graphic overlay or as augmented 3D objects and immersive content. The AR Glasses use a slim optical system with a see-through solution achieved with the Qualcomm Snapdragon XR2 platform, the best solution for the AR industry. P&C’s proprietary vSLAM platform provides convenient augmented reality, allowing seamless merging of real and augmented objects in medicine, education, the military, and smart factories.
Journey with national and international XR speaker Christopher Lafayette as he covers the intriguing world of Non-Fungible Tokens (NFTs) and what they bring to the XR ecosystem.
Chris will cover:
1) Non-Fungible Tokens Defined, Today
2) How NFTs Are Made: Art vs. Natural Assets
3) Where Are They Stored?
4) NFTs in Virtual Worlds
5) Relevant Industries
Conor will dive into OpenBCI's latest development: Galea, which combines extended reality (XR) headsets with state-of-the-art biosensing and brain-computer interfacing (BCI) techniques. Galea comes equipped with multiple sensors to simultaneously monitor biometric data streams in real time and is designed to attach seamlessly to head-mounted displays (HMDs), including virtual reality (VR) and augmented reality (AR) devices. The talk will also explore the remarkable possibilities, and potential pitfalls, of introducing this technology to the world.
Worlds is the Spatial AI Platform for real-world digital transformation. Our software helps organizations transform critical, real-world activities into a live data stream, where users can learn from, analyze, and build automation into their physical world in ways that have never been possible before.
In this talk, Jason Fox, VP of Worlds Labs, will discuss and demonstrate how cameras, sensors, digital twins, and AI all come together inside the Worlds platform to fully sense and measure the physical world.
Eye tracking will be a critical success factor in making interaction with AR/MR devices more natural and intuitive, paving the way for mainstream AR. Eye data are, for example, essential for displaying information exactly where the wearer’s attention is, without interrupting the immersive experience. Through versatile eye tracking-based applications, from iris recognition to foveated rendering in VR/gaming, manufacturers and suppliers of XR devices can take their XR experiences to a new level - and gain access to the most intuitive human-machine interface.
However, integrating eye tracking technology is challenging, mainly because of component size and the complexity of the technology; there is also the risk of patent infringement. In this session, you will learn how complex eye tracking technology can be reduced to standardized, preconfigured, calibrated, ready-to-use modules, enabling easy integration for manufacturers of smart glasses. You will also hear how eye tracking will evolve over the coming years, with more and more functions combined in an ever-smaller form factor.
Over the past 15 years, we’ve seen mobile spearhead the biggest shift in interaction. Touchscreens are now the predominant interaction method for personal devices, leaving keyboards and button controls behind. The result? More natural and intuitive experiences. So why should it be any different for a virtual world?
Interaction solutions that remove the boundaries between the physical and digital worlds are not only intuitive; they’re essential for any immersive experience. Just think: a digital world is infinitely more engaging when you can reach out naturally and interact with your own hands, just as you do in the real world. This is why we believe hand tracking will be as essential to VR as touchscreens have been to mobiles.
In this session we’ll cover:
* The importance of hand tracking
* How hand tracking can make VR more accessible
* Lessons learned from the paradigm shift from controllers, keyboards, and mice to touchscreens
* How to adapt VR games and apps made for controllers
* How to design hand-tracking-first
One of the key challenges for augmented reality is the development of ultra-compact, lightweight, low-power near-to-eye display solutions with good image quality. Laser Beam Scanning (LBS) technologies can meet these key requirements and deliver form factors that enable lightweight, fashionable, all-day-wearable AR smart glasses with the ability to scale resolution and field of view (FoV) at low power consumption.
In this session, we will briefly highlight the key technologies and solutions behind LBS that enable AR smart glasses, as well as more complex mixed reality HMDs, including MEMS micromirror scanner systems, laser diode modules, waveguides, and LBS system design considerations. Finally, we will touch on the ecosystem that supports end-product manufacturers with key technologies and devices.