Computers are rapidly evolving to better perceive the world through technologies like computer vision and voice interfaces. Simultaneously, users increasingly expect multisensory user experiences in place of traditional, two-dimensional audiovisual interfaces. How does this new technology affect our perception, and what can we learn from our perceptual systems to inform multisensory design? This talk will cover how the human brain perceives the world, how it perceives in immersive experiences, and how we can leverage this understanding to build a more empathetic future of spatial computing. Presenters Stefanie and Laura will draw on their expertise as both cognitive neuroscientists and user experience researchers in industry, sharing their observations from rigorous user research studies at the forefront of AR/VR content creation, along with their synthesis of previous neuropsychological studies on AR/VR interaction models.
It is inevitable that augmented reality will change the way we explore and interact with our world, and although the field will continue to evolve, now is the time to get in, explore, and shape how you want to see it. Storytellers are defining what AR looks like today and forging what the future will look like.
The "old web" of information, centered on 2D static content with HTML/CSS as the dominant paradigm, is built on web technologies that are now decades old and is poorly equipped to serve the needs of AR/VR and spatial computing. Currently, a number of exciting new open source initiatives are coming together to create a new open spatial paradigm for the web, with grammars of interaction suitable for blending the virtual with the real and supporting AI and ML safely and wisely, to create a web of perception and shared context. This talk will look at the philosophy, technologies, innovations, and innovators working on this critical mission for spatial computing.
As we know, there has been a lot of recent chatter, which many may even call "hype," over the concept of the "Metaverse." As with most emerging technologies, the biggest challenge is navigating through all this hype, filtering out the rogue companies and individuals that may seek to exploit others in the early stages, and identifying viable business models that could appropriately be built upon these technologies and become sustainable. Join us to hear from a curated group of technologists and entrepreneurs whose companies have found early success with investors, earnings, and both community and public support, and will hopefully serve as foundational models that further contribute to the advancement of the Metaverse.
Are you tired of trying to build a better future? Well, you've come to the right place! In this satirical talk, Lucas Rizzotto's evil alter ego will teach you everything you need to know about using XR for evil! Manipulate the masses using immersive propaganda, pervert their data to uncover their innermost secrets, and discover the XR design tips and tricks that unleash your full villainous potential. So let's get slimy, let's get greedy, and let's turn the Metaverse into a dystopia that works for you! (You can also use this information to stop terrible people from doing this to you. But will you?)
Learn how the MRTK Figma Toolkit can help your UX design process for Mixed Reality. With the MRTK Figma Toolkit, you can use 2D versions of the HoloLens 2 style UI components to create UI layouts and storyboards. The Figma Toolkit's Unity plugin can then import the Figma file and recreate the corresponding MRTK components in Unity, saving significant time and effort in designer-developer collaboration.