
Dynamic Aberration Correction Enables Users to See High Resolution in VR Displays

Jun 3

09:00 AM - 09:25 AM

Description

In this session we will explore how dynamic aberration correction can increase the apparent resolution, eliminate color fringing and pupil-swim effects, and enlarge the eye box in the highest-end displays, such as the Oculus Quest, Varjo VR-1, and HP Reverb G2.

In pursuit of high resolution, a wide field of view, and a large eye box, VR/AR head-mounted display makers face challenges that are impossible to overcome by hardware design alone. Even the latest and greatest devices retain common flaws that spoil the user experience: blur and color fringing outside of a small “sweet spot,” picture-quality degradation and geometric distortion at wide gaze angles, and a tiny eye box.

In order to achieve realistic picture quality and a natural visual experience, the rendering pipeline has to include advanced image pre-processing beyond the standard geometry warp and channel scaling. Almalence created the Digital Lens, a computational solution that uses a precise characterization of the HMD's optical properties along with a dynamic aberration correction technique that adjusts on the fly to eye-tracking data.

Speakers

CEO, Almalence

Related Sessions

Jun 1

11:00 AM - 11:25 AM

Description

Whether for hands-free mobile displays or in-situ data overlay, head-mounted augmented reality offers much to improve productivity and reduce human error in space. Unfortunately, existing solutions for tracking and holographic overlay alignment tend to rely on, or at least assume, Earth gravity. Nothing inherent to a microgravity environment makes AR tracking impossible, but several factors need to be taken into account. First, high-frequency camera pose estimation uses SLAM, which relies on data from IMU sensors that, by default, accommodate the acceleration of gravity in their base measurements. Second, most holographic alignment strategies assume a consistent down direction. This session will explore strategies to mitigate these limitations.

Speakers

Chief Technology Officer, Argyle.build
Jun 1

11:30 AM - 11:55 AM

Description

Digital Twins, the Metaverse, game engines, 3D maps, and computer vision are all merging to realize a vision of "Sim City" where the city is our real city. When will this happen? How will it happen? What will it mean for AR and VR? What will we use it for? Why will web3 and crypto be important?

Matt will explore what comes after the AR Cloud is built and we can finally connect the Metaverse to the real world, showing technically groundbreaking, world-first demos of his new startup, which is making all this possible.

Speakers

CEO, Dejavu
CEO, Stealth
Jun 1

12:00 PM - 12:25 PM

Description

Volumetric video technology captures full-body, dynamic human performance in four dimensions. An array of 100+ cameras points inward at a living subject (a person, animal, or group of people) and records its movement from every possible angle. Processed and compressed video data from each camera becomes a single 3D file – a digital twin of the exact performance that transpired on stage – for use on virtual platforms. Finished volcap assets are small enough to stream on mobile devices yet deliver the visual detail of 100+ cameras, making them a go-to solution for bringing humans into the Metaverse.

The volumetric video market is expected to grow from $1.5B USD in 2021 to $4.9B USD by 2026 as holographic imaging becomes increasingly crucial for the development of compelling, human-centric immersive content and Metaverse creators strive to solve the “uncanny valley” problem.

The session dives into the latest and greatest applications of volcap in augmented reality across multiple sectors – including fashion, entertainment, AR marketing and branding, enterprise training, and more…

We’ll examine the groundbreaking potential this technology holds for augmented and mixed reality, as well as some of the challenges this burgeoning industry may face.

Speakers

Stage Hand, Departure Lounge Inc.
General Manager, Metastage
Jun 1

01:55 PM - 02:20 PM

Description

After a full-house presentation at AWE 2017, our team returns with new hands-free 3D user interfaces for XR headsets that use the eyes alone. For the first time, we will give insight into our technology, explain how eye-vergence-based 3D interaction was achieved, and show why it is as significant for XR headsets as the touch screen was for smartphones.

During the presentation we will demonstrate, on the latest XR headsets, how users can interact in their forward visual space (e.g., pick up, move, and rotate 3D content) using binocular eye-vergence movements, without having to engage other input such as controllers or display elements. In summary, we will take a deep dive into hands-free 3D user interfaces driven by the eyes alone and share how they are useful in consumer and enterprise environments.

Speakers

Co-Founder, CEO, Pillantas Inc.
Jun 1

02:25 PM - 02:50 PM

Description

In 2019, Eliud Kipchoge chased a GPS-guided green laser into history and ran the first sub-2-hour marathon. In technical terms, the laser was a real-time positional indicator. Said differently, the green laser was a ghost runner, a virtual running partner - a tech-enabled Enhanced Reality solution.

What if everyone - not just world-class athletes - had the ability to compete against a ghost runner in real time? Not against an abstract segment displayed on a watch or cycling computer, but against a visible projection in the natural field of view. Non-distracting, yet instantly accessible. That's Enhanced Reality.

Mark Prince, GM of smart eyewear company ENGO, will present advances in display systems that enable anyone to do exactly that. ENGO's product is the first wearable display for sports that achieves glanceability for instant access to data, light weight for comfort and practical use, the long battery life required for general use and for endurance sports in particular, and a high-performance display that works in all light conditions.

Prince will explain how ENGO's solution can be used to change the dominant mode of sports training from post-activity review and analysis to in-activity decision making - with the potential to unlock performance for runners, cyclists, and other athletes - much as a vehicle-mounted green laser helped Kipchoge break the 2-hour barrier. Prince will demonstrate how use cases like this can finally pave the way for mass adoption of AR technology beyond niche applications.

Speakers

GM and Chief Commercial Officer | ENGO, MicroOLED