Olfaction, our oldest and most primal sense, is deeply intertwined with our memory, emotion, cognition, and behavior. Aaron Wisniewski, Co-Founder and CEO of OVR Technology, explores how combining this often overlooked and misunderstood sense with today’s most cutting-edge technology is the crucial missing piece that can positively impact healthcare and therapy. With a focus on mental health applications - like exposure therapy for PTS in war veterans and first responders - this talk will highlight how Olfactory Virtual Reality works and how clinicians might use it to meet urgent human needs.
Throughout history, our tools have defined the shape of our world—and so it will be with the augmented world. If a tool makes something easy, creators will explore and innovate in that space. If a tool makes something difficult, they’ll take their first idea and move on. In this session we’ll break down the unique challenges and opportunities for the XR tools that will empower a diverse group of creators to shape the next generation of augmented worlds.
The beauty and cosmetics industries are constantly evolving. Personalization is becoming essential, and mastering a consumer-centric strategy is key. Through the use of advanced image recognition and machine learning technology, smart beauty tech solutions can help brands and retailers connect with consumers in a whole new way. Alice Chang, Founder and CEO of Perfect Corp., will discuss how brands can master AI + AR beauty tech solutions to deliver personalized customer experiences that drive conversion.
Knowing how people will react to and engage with your spatial computing experience is an important part of ensuring its success. Researching with your target users is the best way to do this, but user research on new and emerging platforms can be challenging, particularly as we move "beyond screens" to smart and spatial computing interfaces. Existing user research methods may be less effective, meaning we need to experiment with new ways to understand how successful the design of an experience will be. Using examples, we will explore the hurdles of researching new and emerging technology and discuss techniques for gathering great user research insights. This talk is aimed at product managers, designers, and developers of spatial computing experiences who want to run their own user research.
Every year, it’s the same story: a year-on-year epidemic of closures, retail stores heading into administration, and a seemingly bleak outlook for products in the physical world. At SPATIALx, we’re experimenting and creating a new concept we call “_BLANK”. We’re on a mission to transform the high street through immersive tech integrated into retail environments, creating stores with nothing inside. Curious? You should be! We believe the future belongs to the bold: to those brave enough to chart a completely new course and try something truly new and different. In this talk, we’ll explore the concept of Mixed Reality-powered retail experiences. You'll learn what it takes to combine immersive technologies with a socialized re-fit of retail, new emerging commercial models, the hurdles to overcome, and how to uncover the potential that lies ahead with holographically projected products and zero-hour delivery.
A deep dive into Google's Immersive Arts team, a group of designers and technical artists creating original Augmented Reality and Virtual Reality experiences.
Join 8th Wall in a discussion on how WebAR is driving real results for brands using this platform for marketing and advertising. The panel will discuss how they are incorporating web-based augmented reality into their campaign strategy to create content that resonates and converts, and how they foresee the landscape of advertising changing in the context of increasingly immersive media.
12:30-1:00pm PDT: Introduction to the AWE Immersive Arts Symposium, Presented in the Museum of Other Realities with Kaleidoscope; 1:00-2:00pm PDT: The Once and Future Artistic Legacies of Virtual Realities; 2:30-3:30pm PDT: VR Sculpting Best Practices Panel; 4:00-5:00pm PDT: How to Fund your XR Project with Kaleidoscope. To check these out, read these instructions (https://www.notion.so/A-Guide-to-Attending-Events-in-the-MOR-2410f5de72054b339531fa4d15ec36d5) and enter 'awe1' for the code.
How do new products & services triumph in crowded marketplaces? What can we learn from neuroscience to help inform design choices? This talk will be a brief tour of the science and psychology behind making unforgettable experiences. The talk will also be a second-screen experience, where audiences are invited to contribute, share their views, interact, and ultimately affect the outcome of the talk in real time.
From black and white to color, and desktop to mobile, our electronic devices are visual mediums that have rapidly evolved and advanced over the past century. New device technologies continue to add clarity and mobility, enhancing our viewing experience to be more convenient and closer to our reality. Today, more than ever, these devices have become indispensable information vehicles that are ingrained into every aspect of our daily lives. The average person is now so acclimated to having thousands of digital interactions each day that they’re hungry for a more immersive experience. They’re ready for the next medium that will seamlessly enhance their lives and bridge the gap between real life and technology (without a headset). They’re ready for light fields. In this talk, David will discuss the light field as the next-generation medium. He will explain how this emerging technology provides users with a fully interactive, lifelike viewing experience by rendering images and videos with 3D depth and complex, realistic light effects such as sparkles, texture, and highlights. He will also discuss how this type of technology will change the game for businesses by empowering them to provide end-users with more immersive and engaging experiences across industries including auto, retail, medical, and education.
As the media landscape continues to segment, technology is pushing the boundaries on how media companies connect brands and audiences. At the center of this digital evolution, ViacomCBS is driving authentic brand extensions that reach fans through emerging products and experiences. Join Tim Adams, VP, Emerging Products Group, ViacomCBS, to learn how the company is doubling down on innovation to extend brand narratives through the use of spatial computing, augmented reality, and voice interactivity.
Transfer your consciousness, via a virtual avatar, into ENGAGE, where you will join Chris Madsen, Steven Sato, and fellow AWE peers for an immersive, interactive journey to experience first-hand how virtual reality is being used for collaboration, training, education, communication, events, and more! ENGAGE is an education and corporate training platform in virtual reality. It empowers educators and companies to host meetings, presentations, classes, and events with people across the world. Using the platform, virtual reality training and experiences can be created in minutes. The tools are very easy to use and require no technical expertise. You can choose to host your virtual reality sessions live, or record and save them for others to experience later. A wide variety of effective and immersive virtual experiences can be created with an extensive library of virtual objects, effects, and virtual locations available on the platform. Space is limited to 50 people for one hour each day from May 26-29 at 1pm PST. Details to sign up for this virtual event are on the ENGAGE event calendar. IMPORTANT: The AWE conference password is 'vr'. Day 1: https://app.engagevr.io/events/V8jqZ/share The livestream can be found at: https://engagevr.io/awe-conference/
The NBA has been at the forefront of creating innovative digital fan experiences. Scott Stanchak, the league’s Vice President, Emerging Technology, will detail how the NBA is using augmented reality, mixed reality and virtual reality to bring fans from around the world closer to the game. Stanchak will also discuss how the NBA is using a similar innovation process to enhance the live game-viewing experience.
This talk will present a few use cases from Dr. Jayaram’s work at Intel, start-ups, and universities to discuss some of the technology and business aspects of bringing VR to two different audiences: 1) end-consumers in sports and concerts, and 2) enterprises for engineering design and manufacturing. What are the challenges in delivering VR-based experiences and solutions at scale? What does it mean to weave them into the “normal” way of doing things to arrive at a new normal where VR is a mundane, predictable, and regular part of entertainment and work?
This year, NFL football fans were able to enjoy a new AR activation on-site in Miami, the host city of Super Bowl LIV. Utilizing the Unity platform, Bose created a beacon-based fan experience as part of the 10-day celebration leading up to the game, in partnership with the NFL and Trigger Digital. Fans were asked to don a pair of Bose headphones and approach a set of “players’ lockers”. Activated by a beacon, the lockers came to life with an audio experience offering content related to the relevant NFL player. For example, a fan may have heard a previously conducted interview between the player and a reporter. In this presentation, Michael Ludden, Global Head of Enablement & Principal AR Advocate at Bose; Tony Parisi, Head of AR/VR Ad Innovation at Unity Technologies; and Jason Yim, CEO at Trigger Digital, will explain how this activation was developed and why an exclusively audio-first approach to augmented reality with tailored audio content is an often overlooked way of engaging consumers. The speakers will demonstrate what is possible with audio AR, including how it can provide a sense of motion and direction.
3D face scanning is the first step in cloning a person and bringing them into a virtual environment with high fidelity. High-quality 3D face scanning has typically been done with specialized and complicated photogrammetry systems. The emergence of low-cost, high-quality mobile depth sensors, such as those in the Apple iPhone X, helps fuel the advance of 3D scanning and brings scanning capabilities to the hands of millions of users. The talk will discuss different types of mobile depth sensors, including structured light, time-of-flight (ToF), and active stereo, and compare their pros and cons for close-range 3D face scanning. The talk will also present the unique challenges of 3D face scanning and discuss its applications in fields such as medical, dental, eyewear, entertainment, and 3D printing.