Computers are rapidly evolving to better perceive the world through technologies like computer vision and voice interfaces. At the same time, users increasingly expect multisensory experiences rather than traditional, two-dimensional audiovisual interfaces. How does this new technology affect our perception, and what can we learn from our perceptual systems to inform multisensory design? This talk will cover how the human brain perceives the world, how it perceives in immersive experiences, and how we can leverage this understanding to build a more empathetic future of spatial computing. Presenters Stefanie and Laura will draw on their expertise as both cognitive neuroscientists and user experience researchers in industry, sharing observations from rigorous user research studies at the forefront of AR/VR content creation and a synthesis of previous neuropsychological studies on AR/VR interaction models.