Storytelling is as old as time itself. As humans, we have used stories for centuries to build collective knowledge, adapt, and learn. With each new technology, the way we experience stories, educate, and learn evolves. Now we are in the time of XR. We have lived through the early years of XR technology dissemination, and today we can begin to collectively reflect and share our knowledge on what is and is not working, what creates the most impact in XR for users, and how best to improve these experiences. In this session, two experts with very different backgrounds compare notes and share their experiences from the cognitive science perspective and the simulation technologies perspective. This is the beginning of a conversation establishing guidelines to augment learning by living memorable virtual experiences.
You’ve been told it works, you’ve seen it work, and maybe you’ve even made it work—but why does it work? Even enterprise developers and pilot champions do not always know the robust research that lives in the bones of their wonderful products, and through no fault of their own. The XR industry has somewhat overused valuable terms and frameworks (presence, immersion, transfer, embodiment, recall, DICE, etc.) for the worthy sake of speed of communication to the hoi polloi, but at the expense of breaking down the mechanics that would create a dialogue to push our products and pilots to the bleeding edge of efficacy. This I promise: the words you use will change your outcomes. To that end, in this talk I’ll cover the aforementioned broad vocabulary, add depth to each item with research and stories (near transfer, far transfer, embodiment congruency, contextual learning, Zone of Proximal Confusion, etc.), and end with a few practical examples of how these terms will impact your simulation design and execution.
The rise of virtual care services is an emerging trend in providing medical services without the need for physical presence. However, barriers related to the physician’s ability to measure vital signs and assess emotional and physical conditions prevent virtual care from delivering the same level of experience as face-to-face care. We have developed innovative solutions based on a set of our technologies—vital signs measurement, EmoGraphy stress management, 3D depth sensing, and more—to enable AR/VR-enhanced virtual care. The proposed solutions aim to improve upon existing virtual care by combining remote physiological measurement with 3D computer vision models of human pose and new forms of rendering 3D health information in an immersive and interactive manner, allowing spatially aligned rendering of vital signs data with captured images. The proposed solution can be extended to remotely monitor multiple subjects, which is useful for providing elderly care in crowded settings.