In this opening keynote for the 12th annual AWE, Ori Inbar - the founder of AWE and Super Ventures - will welcome the XR community to the first in-person AR/VR event in two years (!) and will highlight the industry's recent achievements as it enters the mainstream. Mr. Inbar will then share his vision for the next decade and call attention to the responsibilities and challenges we all share in ensuring the new wave of Spatial Computing is harnessed to fight humanity's existential threats and to create an inclusive, just, and meaningful world worth living in.
The presentation explores the fundamental limits of VR/AR HMD optical design and shows how dynamic aberration correction can increase apparent resolution, eliminate color fringing and pupil-swim effects, and enlarge the eye box in the highest-end displays, such as the Varjo VR-1 and HP Reverb G2. Striving to achieve high resolution, a wide field of view, and a large eye box, VR/AR head-mounted display makers face challenges that are impossible to overcome by hardware design alone. Even the latest and greatest devices retain common flaws that spoil the user experience: blur and color fringing outside of the small “sweet spot,” picture-quality degradation and geometric distortion at wide gaze angles, and a tiny eye box. To achieve realistic picture quality and a natural visual experience, the rendering pipeline has to include advanced image pre-processing beyond the standard geometry warp and channel scaling. Almalence Digital Lens is a computational solution that corrects optical aberrations and distortions in head-mounted displays by combining a precise characterization of the HMD's optical properties with a dynamic aberration-correction technique that adjusts on the fly to eye-tracking data.
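To make the idea concrete, here is a minimal Python sketch of gaze-adaptive, per-channel pre-distortion. The radial model, the 2x2 calibration grid, and the bilinear gaze interpolation are hypothetical stand-ins, not Almalence's actual method, which relies on a precise measured optical characterization of the HMD.

```python
import numpy as np

def radial_prewarp(uv, k, center=(0.5, 0.5)):
    """Pre-warp normalized texture coords with a radial term so that a
    lens with the opposite radial distortion renders them straight."""
    c = np.asarray(center)
    d = uv - c
    r2 = np.sum(d * d, axis=-1, keepdims=True)
    return c + d * (1.0 + k * r2)

def gaze_interpolated_k(gaze, grid_k):
    """Bilinearly interpolate a distortion coefficient from a 2x2
    calibration grid indexed by normalized gaze position (gx, gy).
    This mimics adjusting the correction on the fly to eye tracking."""
    gx, gy = gaze
    top = grid_k[0][0] * (1 - gx) + grid_k[0][1] * gx
    bot = grid_k[1][0] * (1 - gx) + grid_k[1][1] * gx
    return top * (1 - gy) + bot * gy

def precorrect(uv, gaze, grids):
    """Pre-warp each color channel separately for the current gaze;
    per-channel warps are what counter lateral color fringing."""
    return {ch: radial_prewarp(uv, gaze_interpolated_k(gaze, g))
            for ch, g in grids.items()}

# Hypothetical calibration: slightly different radial coefficients per
# channel (chromatic aberration), measured at four gaze extremes.
grids = {
    "r": [[0.20, 0.22], [0.21, 0.23]],
    "g": [[0.18, 0.20], [0.19, 0.21]],
    "b": [[0.16, 0.18], [0.17, 0.19]],
}
uv = np.array([[0.8, 0.8], [0.5, 0.5]])  # off-center and center samples
warped = precorrect(uv, gaze=(0.3, 0.6), grids=grids)
```

In a real pipeline this per-channel remapping would run as a GPU pass per frame, sampling a dense measured distortion map rather than a toy polynomial.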
A detailed look into the making of Fictioneers’ and Aardman’s latest location-based AR experience, Wallace & Gromit: Fix Up The City. Hear from Susan Cummings, Co-Founder of the immersive storytelling venture Fictioneers, who will talk about how the team used city architecture to generate occlusion meshes and overcame the challenges of embedding AR features into cities in a new and pioneering way.
Immersive experiences are about to become the new frontier of marketing, giving customers more control over their experience and the ability to invite in the brands they love. With the looming fight for attention, brands need to understand the opportunities and threats: how to be discovered, how to be "invited in" as trusted partners in VR, and how to stay involved when consumers navigate their world with augmented reality as the information and experience layer. With new ways to access AR experiences (including WebXR), this panel will explore how consumers will use augmented reality to engage with brands in a variety of ways - including potentially excluding them completely - and how brands should set themselves up for success in the near future.
The "old web" of information, centered on 2D static content with HTML/CSS as the dominant paradigm, is built on web technologies that are now decades old and is poorly equipped to serve the needs of AR/VR and spatial computing. Currently, a number of exciting new open source initiatives are coming together to create a new open spatial paradigm for the web, with grammars of interaction suitable for blending the virtual with the real and for supporting AI and ML safely and wisely, to create a web of perception and shared context. This talk will look at the philosophy, technologies, innovations, and innovators working on this critical mission for spatial computing.
Leading head-mounted displays deliver immersive visuals and audio, but they’re packaged with plastic controllers designed for gaming. Enterprise and government VR users need to use their hands naturally and feel what they see. The success of mission-critical VR applications often depends on having sufficient realism and accuracy. Learn how a new generation of wearable products delivers true-contact haptics through a combination of detailed tactile feedback, force-feedback exoskeletons, and sub-millimeter motion tracking. Hear how application developers for enterprise and government have embraced this technology to improve virtual training, design, education, and even robotics.
Mayo Clinic has a firm belief that the needs of the patient come first and a foundational culture that unites every health care provider, scientist, student, and trainee in a singular mission to provide hope and healing to the world. Through our education activities, we aim to develop the workforce of the future and lifelong learners who are trained on new models of care delivery and ready to keep pace with changes in technology over their careers. As we move forward in a bold new direction, Mayo Clinic aims to bridge excellence in education and clinical practice across our enterprise by investing in immersive technologies to deliver one-of-a-kind educational and training experiences. It's important to recognize that Mayo Clinic has 3 campuses in 3 states (AZ, FL, and MN) with 65,000+ staff and a College of Medicine & Science that includes 5 distinct schools (School of Graduate Medical Education, Alix School of Medicine, School of Health Sciences, Graduate School of Biomedical Sciences, and School of Continuous Professional Development) with 4500+ active learners in 410+ subspecialty programs. In this talk we will discuss Mayo Clinic’s current journey to strategically deploy Extended Reality (XR)-enabled education across our vast and diverse learner populations (i.e., surgical/medical trainees, staff, medical students, nurses, technicians, therapists, and patients). Additionally, we will provide insights into our preliminary work to utilize XR for intraoperative navigation of patient-specific 3D data in the operating room using an expert-novice proctoring approach. Lastly, we will overview our new, bold-thinking strategy for establishing proper governance and collaborative leadership in an agile organization that is always learning and evolving, and moving fast, to expand knowledge and create new capabilities that will help transform health care delivery on behalf of patients.
Today Adidas demonstrates once again why they are an innovation powerhouse by completely disrupting their design process through the use of immersive technology. They have implemented a new workflow that empowers their teams to work together digitally on 3D concepts in space by jumping into the same virtual studio in Gravity Sketch and collaborating from any location around the globe. As futuristic as this might sound, it is actually a thing of the present, and the first sneaker designed in virtual reality is already on the market. The ease of use of the tool, and the fact that people create and communicate in space, pushes cross-disciplinary collaboration to the next level. From design and marketing all the way to the developers in China, their teams can now make decisions faster because they understand the 3D form at the ideation phase and throughout the entire design process. Collaborative 3D sketching accelerates creative iteration and presents the company with a new era of opportunities in end-to-end digital integration. Join Gravity Sketch's co-founder & CXO and Adidas designers for a talk on the revolution of the sportswear design process and how this technology has allowed them to take their products to the next level.
How do you experiment with XR in the immersive and location-based experience space? Join Meow Wolf's CTO Barbara Ford Grant as she shares the history of Meow Wolf's POV on XR, and how Meow Wolf has integrated XR through a variety of real-time interactive technologies that allow participants to further engage and dive deep into our narrative experiences, including RFID gamification, AR simulations, and multisensory "memory storms," all while aligning XR with a mission and commitment to connect to our community.
Snapchat Lenses have taught a whole generation how to use augmented reality to express themselves, learn about the world, and even get things done. But behind the Lenses are Snap’s powerful tools to build AR experiences and a vibrant, global community that’s pushing the boundaries of what’s possible in AR. Join Snap AR leaders as they discuss how AR creators, developers, and partners can tap into the power of Snap’s camera products to build the next generation of AR.
This talk will cover commercial use cases across South Korea's XR industry, one of the world's most innovative and fast-moving markets.
XR gaming brings a new level of immersive interaction, and multiplayer experiences can create an amazing sense of social presence. Chicken Waffle has built some of the top social XR content in the world and has launched an Indie Publishing Fund to help innovative developers bring their content to market! This session will cover advanced development tips for creating interactive cross-platform multiplayer AR/VR experiences.
You’ve been told it works, you’ve seen it work, and maybe you’ve even made it work—but why does it work? Even enterprise developers and pilot champions do not always know the robust research that lives in the bones of their wonderful products, through no fault of their own. The XR industry has somewhat overused valuable terms and frameworks (presence, immersion, transfer, embodiment, recall, DICE, etc.) for the worthy sake of speed of communication to the hoi polloi, but at the expense of breaking down the mechanics that would create a dialogue to push our products and pilots to the bleeding edge of efficacy. This I promise: the words you use will change your outcomes. To that end, in this talk I’ll cover the aforementioned broad vocabulary, add some depth to each item with research and stories (near transfer, far transfer, embodiment congruency, contextual learning, Zone of Proximal Confusion, etc.), and end with a few practical examples of how these terms will very practically impact your simulation design and execution.