The presentation includes an overview of Perception Grid's cloud-native infrastructure, which enables bi-directional, real-time interaction between multiple users while simultaneously orchestrating immersive video, 2D media, and text. The session highlights Perception Grid's graph database technology, which builds relationships between data to drive contextualized, multimodal presentation of XR media, triggering, enhancing, and accelerating the cognitive processes of perception, problem-solving, and learning. The session will showcase through demonstration how Perception Grid helps accelerate cognitive processes via XR to solve real problems.
You've been told it works, you've seen it work, and maybe you've even made it work—but why does it work? Even enterprise developers and pilot champions do not always know the robust research that lives in the bones of their wonderful products, through no fault of their own. The XR industry has somewhat overused valuable terms and frameworks (presence, immersion, transfer, embodiment, recall, DICE, etc.) for the worthy sake of communicating quickly with the hoi polloi, but at the expense of breaking down the mechanics that would create a dialogue to push our products and pilots to the bleeding edge of efficacy. This I promise: the words you use will change your outcomes. To that end, in this talk I'll cover the aforementioned broad vocabulary, add depth to each item with research and stories (near transfer, far transfer, embodiment congruency, contextual learning, Zone of Proximal Confusion, etc.), and end with a few practical examples of how these terms will very concretely impact your simulation design and execution.
As we embark on our digital transformation, the digital twin has become a more central focus in our strategy. Enterprises with distributed facilities, regardless of how many or few, are challenged to digitalize because our workflows are spread across too many unintegrated platforms. Allow me to share our vision, our journey, the wins, and the pitfalls as we forge ahead into the METALverse with Beamo.
Trainers, L&D professionals, architects, product developers, and more can all benefit from immersive workflows with HP VR Solutions. Join Matt Gaiser, HP's North America Head of VR Business Development, to learn how to integrate VR into your enterprise.
The rise of virtual care services is an emerging trend that provides medical services without the need for physical presence. However, barriers related to the physician's ability to measure vital signs and assess emotional and physical conditions prevent virtual care from delivering the same level of experience as face-to-face care. We have come up with innovative solutions based on a set of our technologies, related to vital sign measurement, EmoGraphy stress management, 3D depth sensing, etc., to enable AR/VR-enhanced virtual care. The proposed solutions aim to improve upon existing virtual care by combining remote physiological measurement with 3D computer vision models of human pose and new forms of rendering 3D health information in an immersive and interactive manner, allowing spatially aligned rendering of vital sign data with captured images. The proposed solution can be extended to remotely monitor multiple subjects, which is useful for providing elderly care in crowded places.
Technologies such as AR and VR can provide great benefits in the world of biopharma, improving the efficiency and collaboration of scientists working in lab and manufacturing environments. This session will describe current and future use cases of the technology, in addition to challenges that were overcome to enable its use. The timeline will be discussed, from initiating a pilot program and obtaining senior leadership buy-in, through overcoming logistical challenges during COVID-19, to expanded use throughout multiple company sites.