In this session we will explore how dynamic aberration correction can increase apparent resolution, eliminate color fringing and pupil-swim effects, and enlarge the eye-box in the highest-end displays, such as the Oculus Quest, Varjo VR-1, and HP Reverb G2. To achieve high resolution, a wide field of view, and a large eye-box, VR/AR head-mounted display makers face challenges that are impossible to overcome by hardware design alone. Even the latest-and-greatest devices retain common flaws that spoil the user experience: blur and color fringing outside of the small “sweet spot,” picture-quality degradation and geometry distortion at wide gaze angles, and a tiny eye-box. To achieve realistic picture quality and a natural visual experience, the rendering pipeline has to include advanced image pre-processing beyond the standard geometry warp and channel scaling. Almalence created the Digital Lens, a computational solution utilizing a precise characterization of HMD optical properties along with a dynamic aberration correction technique that adjusts on the fly to eye-tracking data.
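The gaze-dependent, per-color-channel pre-warp described above can be sketched roughly as follows. This is a minimal illustration only, assuming a simple two-coefficient radial polynomial distortion model in normalized coordinates and a naive re-centering on the gaze point; the function names and model are assumptions for illustration, not Almalence's actual Digital Lens characterization, which relies on a precise per-device optical profile.

```python
import numpy as np

def radial_distortion(uv, k1, k2):
    # Apply a radial polynomial distortion to coordinates centered at the origin.
    r2 = np.sum(uv ** 2, axis=-1, keepdims=True)
    return uv * (1.0 + k1 * r2 + k2 * r2 ** 2)

def correction_map(width, height, gaze, coeffs_per_channel):
    """Build per-channel sampling grids that pre-warp the rendered image so
    the lens's radial and chromatic aberrations cancel at the current gaze.

    gaze: (x, y) pupil position in normalized [-1, 1] coordinates,
          taken from the eye tracker each frame.
    coeffs_per_channel: one (k1, k2) pair per R, G, B channel — using
          different coefficients per channel is what corrects color fringing.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    uv = np.stack([(xs / (width - 1)) * 2.0 - 1.0,
                   (ys / (height - 1)) * 2.0 - 1.0], axis=-1)
    uv = uv - gaze  # re-center on the pupil to model pupil swim
    maps = [radial_distortion(uv, k1, k2) + gaze
            for (k1, k2) in coeffs_per_channel]
    return np.stack(maps)  # shape (3, H, W, 2): a sampling grid per channel
```

In a real pipeline these grids would be recomputed (or interpolated from a precomputed table) every frame from fresh eye-tracking data and used to resample each color channel before scan-out.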
The ability of field medics and field surgeons to get immediate consultation from an expert surgeon may be the critical difference in the survival and long-term recovery of a wounded warfighter. Being able to do this in an augmented-to-virtual reality space would raise this collaborative experience to an even higher level. This 3D collaborative space provides a spatial calibration of the physical and virtual realities, allowing the novice surgeon to work on the physical body in augmented reality alongside an expert surgeon's avatar, who works with a virtual reproduction of the patient's injury in virtual reality. The Virtual Augmented Distributed Engagement Reality (VADER) is an AR-VR remote telementoring/telemedicine application implemented by SOLUTE for the US Navy's Virtual Medical Center (VMC) at Naval Medical Center San Diego (NMCSD). VADER is part of NMCSD's Virtual Medical Operations Center (VMOC) showcase offerings highlighting the latest leading-edge medical technologies.
This session will review some of the most exciting technological innovations that are being patented throughout the world in the AR / VR / MR space. Numerous examples of patented and patent-pending innovations covering both hardware and software inventions from the smallest companies to the largest companies (e.g., Apple, Microsoft, Google, Facebook) will be provided. Predictions of future patent infringement lawsuits will also be made.
OnBoardXR is a seasonal, virtual festival of short, live performances by artists showcasing their own experimental XR tools in a single 3D virtual event. Audiences may access the show from almost any device to experience roughly one hour of curated, synchronous, short spatialized performances. Join this session to see highlights from the last year, including full-body motion-capture dance, IoT connecting the physical and virtual worlds, structured improvisation, and semi-traditional theater using avatars and web cameras.
A panel discussion with XR platform leaders comparing and contrasting which form factors and scenarios will entice mainstream consumers to engage in XR. We will evaluate VR versus AR in terms of what consumers will actually enjoy and see value in. Do we really need 360 VR, or is location-triggered AR a better use case?
At the heart of every successful Augmented Reality (AR) experience is solid storytelling that leaves the user with an emotional connection to the brand. Consumer demand for personalized experiences and emotional realism in content has been a major driver for AR. However, the ability to capture lifelike digital recreations has been limited. Volumetric capture (VolCap) enables AR to break free from the frame and into fully immersive worlds, creating authentic, realistic experiences for the user across multiple verticals, including entertainment, sports, training, enterprise, gaming, and more. How do you tell your brand story in AR using VolCap? Join our panel as they discuss how AR has taken the reins, using VolCap to create personalized experiences for consumers while providing brands with the ultimate immersion.
Advances in XR technology raise important socio-technological issues, not the least of which is the potential for unprecedented insight into users' behaviors, preferences, and mental processes. That said, the adoption of XR technology need not and should not be an "us (the consumer) versus them (the industry)" scenario. Instead, industry and civil rights advocates should work together to tackle privacy challenges and ensure we get forthcoming policies and best practices right. Because XR is still relatively nascent, we have the opportunity to avoid the mistakes of the past and develop this technology with a more informed, clear-eyed approach. The XR industry and XR users should be partners in this process. This panel will explore the ways in which the XR industry, civil society, policymakers, and other stakeholders can work together to advance the greater good. We will discuss the influence of civil society; the role of law and policymakers; the science behind XR technology and the challenges privacy by design presents to engineers; and the opportunity for industry to help lead the way.
18Loop and the American Childhood Cancer Organization (ACCO) have conducted a Joint Experimental Intervention Research Study (JEIRS) to measure the efficacy of VR for children with cancer. This session will report on the findings and recommendations, and will present the benefits of virtual stress management, the gamification of meditation, and collaboration on the larger issues of mood and pain management. VR is truly a family intervention, and data suggests that the experience strengthens family bonds, especially in families coping with illness. Attendees will learn about three types of VR approaches in pediatric oncology, what our results look like, and what the industry needs to do moving forward regarding standards and hardware to support the Virtual Pharmacy.
Miniaturization is the trend toward manufacturing ever-smaller mechanical, optical, and electronic products, medical devices, and other high-value parts. This trend continues to be strong, with year-over-year growth in many markets. One of the limiting factors to miniaturization is the inability of traditional manufacturing methods, like injection molding and CNC machining, to effectively and economically produce smaller and smaller parts. Additive Manufacturing (AM), or 3D printing, has been around for over 30 years. For a long time, only a few technologies were available, and applications were generally limited to prototyping. Past advances in AM fell short of meeting the needs of small parts: printing them at the resolution, accuracy, precision, and speed that would make them a viable option for end-use production. That has all changed. Additive Manufacturing and miniaturization are now converging in a very meaningful and impactful way. The growth of the AR/VR market and its pace of innovation are opening up applications that were not even imagined a decade ago. With this come challenges for manufacturing to scale at the same pace. Many leading AR/VR technology companies have started using micro 3D printing to produce micro-precision components as an alternative to traditional fabrication methods, finding huge time and cost savings. Learn how micro-precision 3D printing enables companies in this competitive space to address development challenges that current microfabrication methods cannot, and to push the limits of miniaturization beyond boundaries otherwise thought impossible.
360 XR technology is immersive and cool, but sometimes it's hard for end users to grasp how the technology can be used to drive positive impact for their business or mission. Hugh will discuss use cases and highlight examples pulled from his personal work to help the audience understand how 360 XR can drive impact for them. Use cases include:
- Agriculture: California Supima cotton, the world's finest cotton; captured footage showcasing a cotton grower's fields
- Healthcare: immersive virtual trainings for healthcare workers and other industries
- Tourism & travel: São Paulo 360 VR content that immerses potential visitors
- Live events: Special Olympics 360 event capture
- Automotive: experience the world's fastest production car, the SSC Tuatara
The Metaverse is coming and it will impact all our lives before the current decade is out. As an industry pioneer who began working in AR and VR over thirty years ago, I have a broad perspective on where the field has been and where it's going. And right now, I am both excited and concerned. Excited because the momentum has finally hit critical mass. Concerned because of the very real prospect that large corporations could bring the destabilizing features of social media into the Metaverse. To address this, I have been calling for sensible regulation of metaverse platforms to help ensure the technology rolls out in a positive way. This talk will address this important issue by answering three key questions: (1) What will the metaverse really be like when widely deployed? (2) What are the most significant risks consumers will face? and (3) What are the most helpful and sensible forms of regulation to avoid the metaverse becoming a destabilizing force like social media?
NASA developed a series of XR medical scenarios for astronaut training during deep-space exploration missions. Led by Dr. Roger Dias, MD, PhD, MBA, at Brigham and Women's Hospital (BWH) and Harvard Medical School, and funded by the Translational Research Institute for Space Health (TRISH), this project carries a rather lengthy title, "Mixed Reality (MR) Care-delivery Guidance System to Support Medical Event Management on Long Duration Exploration Missions," but serves a very noble and important mission. The development of these highly realistic XR medical scenarios for in-flight space training was carried out through a collaboration between BWH, Radiant Images, Inc., and Panogramma Inc. The photogrammetry and volumetric video capture, editing, and processing were performed with several other collaborators, including Metastage, Sony, and Arcturus. This session will cover the challenges this team faced and the solutions it devised in the production of these medical training modules.
Within USA TODAY NETWORK's DNA is the desire to innovate how the organization tells stories. Since 2018, USA TODAY's Emerging Technology team has developed over 40 highly engaging and interactive augmented reality (AR) experiences exploring complex, historical, and trending news topics. Recognizing an opportunity to improve the core user experience within the app's framework, the team leveraged engagement analytics and external user testing to develop a robust and intuitive user-onboarding sequence, resulting in a 3x increase in average time spent per user. Media-rich AR experiences that include the updated framework are "Seven Days of 1961: A Dangerous Ride on the Road to Freedom," "Champions of Women's Suffrage," "Accused: The Impending Execution of Elwood Jones," "Attica: A Prison Uprising 50 Years Later," and many others. The Emerging Technology team has also implemented accessible designs to meet their audience's growing expectations for new and inclusive forms of premium interactive experiences. Will Austin, Senior Designer of Emerging Technology, will explore the evolution of UI/UX to foster a deeper understanding of AR among casual consumers. He will also share insights and case studies on USA TODAY's onboarding challenges, new instructional design, and its vision for the future of premium interactive editorial experiences.
IEEE Future Directions drives efforts on current, new, and emerging technologies. Of specific interest is the focus on immersive and intelligent realities leveraging AR/VR/MR and AI/ML to drive overall digital transformation. Hear more about what IEEE is doing in this space.
The definition of the open metaverse needs to go beyond the notion of simply connecting isolated game platforms and virtual worlds, extending to real-time sensemaking that includes AR, digital twins, cyber-physical experiences, Web3, and more. This panel will look at the future of open spatial cyber-physical UI/UX and OS architectures through the experience of leading technology architects and designers.