Technologists in industries from retail to medical technology, gaming, and manufacturing are scrambling to keep pace in the rapidly expanding world of augmented reality (AR) and virtual reality (VR)—collectively known as XR. This rush for revolutionary technology is particularly exciting for developers who are eager to find new ways to deliver immersive experiences to end users. We are just on the cusp of a new frontier of XR applications and experiences, with limitless potential for innovation; however, significant barriers to the growth of this frontier exist. Fragmentation hurts the end user and causes confusion: “Will the next hot title come to the platform I purchase?”; “Will the next innovative controller be supported?”; “Will my platform remain relevant going forward?” Consumers are left uncertain as to what kinds of software and hardware they should invest in. This hesitancy kills consumer interest, further hindering the AR/VR industry’s ability to grow. OpenXR is the cross-platform standard for the VR and AR ecosystem: it helps applications run on a multitude of platforms and allows hardware manufacturers to gain access to a broader range of pre-existing content. This presentation looks at what it will take to succeed in this new XR frontier and the role that open standards and ecosystems play in getting software and hardware developers to common ground.
Near-eye displays – like those in augmented, virtual, and mixed reality devices – project virtual objects and information in close proximity to the human eye, sometimes encompassing the user’s entire field of view. This proximity not only creates immersive visual experiences, but also amplifies defects like poor contrast, nonuniform brightness or color, line and pixel defects, low image clarity, and image positioning issues. To accurately test the quality of displays viewed so near to the eye, a display measurement solution should take into account the position, limitations, and characteristics of the human eye within the unique viewing environment of an XR headset. In this presentation, Davis Bowling from Radiant Vision Systems will describe an innovative imaging solution for near-eye display testing that replicates human vision for the most accurate evaluation of the user’s visual experience. Topics include:
· Challenges of near-eye display measurement
· Important optical design features that allow imaging systems to replicate human vision within headsets
· Radiant’s AR/VR display measurement system and software analysis capabilities
An audience Q&A with representatives from the Radiant team will follow the presentation.
To enable ubiquitous spatial AR (the AR Cloud), a continuously refreshed and accurately localizing Digital Twin is needed. Creating a localizing Digital Twin is no easy task, let alone doing so with standard mobile phones. This talk walks through the process of creating a city-scale Digital Twin in Helsinki, Finland. Also covered are the tools available, the skills needed, and how to move to the next level.
The future of augmented reality lies beyond specific use cases, applications, or even a single application. It involves a mindset change, a novel scope of user experiences, and a platform requirement. We must shift from augmenting individual objects with single-user-experience applications and begin to look at AR at an architectural scale. That starts with examining connected experiences that help guide people and make spatial data accessible within its spatial context. This allows the user to feel an intuitive connection with computing. In this session, Valentin Heun will provide a deep understanding of, and share demonstrations of, the next important moves in an augmented reality industry exploring its most powerful paradigm: spatial context.
Unfortunately, the days of delivering sizzle videos and corporate logos to starry-eyed investors and corporate boards are over. Like any mature solution, we need to show true ROI, both internally and to end customers. Eric will review the hurdles of moving beyond the innovation lab to deployment, detailing how Lenovo is accelerating XR adoption by bridging partnerships between XR solution providers and its sizable enterprise customer base. We’ll present a deep-dive case study on our solution partner Holo One.
This session introduces current brain-computer interface (BCI) technologies and their potential, especially when combined with augmented or virtual reality. It surveys the devices available for both brain reading and emotion reading, and presents first-person feedback on using this kind of hardware. It then explores the business opportunities of employing this technology and its future outlook.
We're entering the era of immersive computing. Spatial technologies such as MR, VR, IoT, AI, Cloud, Edge, Wearables, and 5G are all converging to make the physical world much more digitally interactive. The whole world is becoming a giant computer, and we're running around inside it. MR will be the user interface for this new computing paradigm. In this talk we'll make sense of how these technologies are coming together to create the next big tech disruption.
ThinVR is an approach to simultaneously provide wide FOV and a compact form factor in a near-eye VR display, through the combination of heterogeneous lenslets and curved displays. This is a research effort and is not currently the basis of a commercial product. In this presentation, I will explain the basics of this approach and show photos and videos shot from working prototypes that prove this approach works. This talk centers around the design process and fabrication challenges we had to overcome to successfully build working prototypes.
In order for AR headsets to become mass-market wearables, locating the AR processing in the mobile network is a tempting move. But could it work? In this talk, Leslie Shannon lays out the structure and physics of mobile networks and how they may be the missing link between the AR of today and the AR of the future.
The AR Cloud is a digital 3D copy of the real world: a 1:1 model of the world that is understandable to IT systems and constantly updated in real time. (Works freely with Android and iOS.)
The Open AR Cloud Association, a global non-profit that aims to drive the development of open and interoperable real-world spatial computing technologies to connect the physical and digital worlds for the benefit of all, is currently hard at work designing and building a reference platform to enable what some might call the Spatial Web or the Mirror World. Like the World Wide Web, the so-called Open Spatial Computing Platform (OSCP) will be based on a set of open standards and protocols that allow everyone, everywhere, with any AR-enabled device to connect to services on the platform for localization (GeoPose) and a machine-readable description of the physical reality around them (RML), and to seamlessly access and summon any experience through a Spatial Discovery Service. This platform can potentially unleash trillions in new value creation and become a major new engine of growth. In this talk you will hear all about the current progress, future plans, and how to get involved in this historic undertaking.