"Out There" is the first-ever immersive musical in Spatial Computing - a unique combination of storytelling, music, and technology. "Out There" was introduced to the world at Comic Con Paris in late 2019 and is making its U.S. premiere in 2020. Creator Thibault Mathieu (Wilkins Avenue) and Sound Supervisors Scot Stafford and Richard Warp (Pollen Music Group) will discuss the project's genesis and how they produced and operated the largest-scale location-based Magic Leap experience in the world to date.
As XR headset sales move closer to mainstream adoption and attractive content becomes more accessible, the performance demands on current and next-generation headsets rise as well. Eye tracking already offers a myriad of benefits for immersive user experiences and more natural interaction, but what can it offer to improve the capabilities and performance of the headsets themselves? Join Doug Eggert as he explains the value of Tobii Spotlight Technology and how this advanced eye tracking, specialized for foveation, can help overcome two of the biggest challenges facing headset manufacturers today. As the XR market continues to press for wider fields of view and higher-resolution displays, Tobii Eye Tracking brings a clever solution inspired by nature but designed for XR.
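The abstract names foveation without spelling out the mechanism: spend full rendering effort only in the few degrees around the tracked gaze point, where the eye actually resolves detail, and shade the periphery coarsely. A minimal illustrative sketch of that decision (the tier thresholds and rates here are assumptions for illustration, not Tobii's actual parameters):

```python
def shading_rate(angle_from_gaze_deg: float,
                 fovea_deg: float = 5.0,
                 periphery_deg: float = 15.0) -> int:
    """Pick a shading-rate tier from a pixel's angular distance to the gaze point.

    1 = full resolution (fovea), 2 = half resolution, 4 = quarter resolution.
    The thresholds roughly mirror the eye's falloff in acuity away from the fovea.
    """
    if angle_from_gaze_deg <= fovea_deg:
        return 1
    if angle_from_gaze_deg <= periphery_deg:
        return 2
    return 4
```

In a real engine this choice is made per tile by variable-rate-shading hardware; the point is that eye tracking lets the expensive full-rate region shrink to a small moving window around the gaze rather than covering the whole widening field of view.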
Cloud-rendered gaming and XR face a unique challenge: the user's quality of experience must be equivalent to that of a locally rendered experience. Merely quantifying this quality of experience is a challenge, let alone building the levers to control it. In this talk we will discuss these challenges and the various approaches being trialed. The conversation will span rendering and streaming (Nvidia), cloud gaming (Blade Shadow), cloud AR (Gridraster), media formats (Tencent), and edge computing (Charter Communications). We will also discuss GPU multi-tenancy, network readiness for immersive content, and the trends and enablers for market traction.
Futurus partnered with United Way of Greater Atlanta, Everfi, and the NFL to create “Call the Play.” The experience is based on the Character Playbook, a digital learning platform designed to help middle school kids learn how to make tough decisions in challenging situations and develop healthy relationships. The experience debuted at the Super Bowl Live Experience in Atlanta, Georgia in the nine days leading up to Super Bowl LIII. Over 12,000 attendees passed through United Way of Greater Atlanta's booth. Individuals were given the opportunity to put on a headset and be transported to a 3D replica of Mercedes-Benz Stadium, where they were greeted by NFL Hall of Famer Jerry Rice, who served as the host of the experience. From there, individuals were able to make selections and learn how to navigate difficult situations in a safe, judgment-free zone. The children (and adults) who tried the experience learned which responses elicited the best outcomes. This interactive learning experience proved to be an effective way to communicate how best to navigate tough situations like bullying, presented in an engaging and memorable format the target audience could relate to. Sara Fleeman (United Way of Greater Atlanta) and Annie Eaton (Futurus) will walk audience members through the decision-making, development, and deployment processes. They will share insights, challenges, and solutions from a fast-paced deployment on the national stage. Attendees of this session will learn:
• How to make tough decisions on a deadline for a national event.
• The rationale for using VR to create empathy and educate.
• The buying process for a company seeking creative thinking.
• The process behind delivering three digital mediums of characters in one experience: 2D video, 3D character modeling, and volumetric capture.
• Challenges and learnings from hardware deployment on-site.
Companies are always looking for ways to create value for their customers and consumers. As new technologies emerge, they look for ways to seize the opportunity, changing the world and getting the best ROI. XR and eye tracking are no different: enterprises that embrace them as a foundation of innovation have increased their competitive advantage. In this presentation, Johan Bouvin, a veteran of 15+ years in eye tracking, will look at the applications of eye tracking and its proven business value as the technology becomes a standard. He will discuss how these success stories inspire further innovation in XR applications and create value. Developers and practitioners are invited to explore where to leverage eye tracking and XR in the enterprise. And where not to.
There is no interaction design standard for virtual reality, which makes it difficult to create intuitive experiences. We will walk through use cases based on our learnings at LiveLike building apps for broadcasters that let fans watch live sports events with friends.
To effectively blur the line between the digital world and the physical world, augmented reality (AR) apps need a sense of presence and context. Krikey, the developers of an AR gaming app, will share how they’re creating a deeper sense of immersion and presence in their newest mobile AR game, Sia’s Adventure. Attend this session to discover the insights Krikey learned from using Unity’s Mixed and Augmented Reality Studio (MARS) and AR Foundation in the first AR adventure game with human characters, built with Google Maps. You’ll understand how these tools and workflows work together to create deeply interactive AR experiences that intelligently interact with the real world.
Computers are rapidly evolving to better perceive the world through technologies like computer vision and voice interfaces. Simultaneously, users increasingly expect multisensory user experiences in lieu of traditional, two-dimensional audiovisual interfaces. How does this new technology affect our perception, and what can we learn from our perceptual systems to inform this multisensory design? This talk will cover how the human brain perceives the world, how the brain perceives in immersive experiences, and how we can leverage this understanding to build a more empathetic future of spatial computing. Presenters Stefanie and Laura will draw from their expertise as both cognitive neuroscientists and user experience researchers in industry, sharing their observations from rigorous user research studies at the forefront of AR/VR content creation, and synthesis from previous neuropsychological studies on AR/VR interaction models.
Through touch and physical play, children come to understand reality, discover the world, and develop social interactions. Kids today spend most of their free time interacting with screens, limiting their physical experiences of the real world. We at Hasbro want to create the world's best play by combining the best of the physical, the digital, and storytelling through the use of augmented reality and voice. Hasbro is pioneering the future of play through toy-grade AR platforms and experiences such as Marvel Avengers Hero Vision Iron Man and Bee Vision Bumblebee. The Hero Vision platform relies on smartphone technology inside an HMD ergonomically designed for kids. The HMD plugs into a roleplay mask (Iron Man or Bumblebee), allowing kids to see the action through the hero's eyes. Markers printed on the gauntlet enable gesture gameplay that mimics the hero. Kids can wear the mask with or without the HMD, allowing either AR gameplay or classic roleplay. Another example is "CLUE with the Ghost of Mrs. White," classic Clue gameplay now augmented with Amazon Alexa: a board game that delivers classic tabletop gameplay with an immersive audio experience that delights, inspires, and connects people. Hasbro is a global play and entertainment company committed to Creating the World's Best Play and Entertainment Experiences. From toys and games to television, movies, digital gaming, and consumer products, Hasbro offers a variety of ways to experience its iconic brands, including NERF, MY LITTLE PONY, TRANSFORMERS, PLAY-DOH, MONOPOLY, BABY ALIVE, MAGIC: THE GATHERING and POWER RANGERS. Spark Labs is Hasbro's industry-leading, forward-thinking technology and innovation group, crafting and delivering outstanding products and experiences that push the boundaries of play!
Despite the ongoing megapixel race, the visual experience in VR is still nowhere close to the real world and falls short of consumer expectations and industry needs. Even the latest high-end HMDs retain the common flaws: blur and color fringing outside of the small “sweet spot,” an insufficient field of view due to quality degradation at wide angles, and a tiny eye box. Almalence has invented a technology to overcome the physical constraints of HMD optical performance that cause picture clarity deficiencies. Utilizing eye tracking and a precise, consistent characterization of the HMD optical system, our Digital Lens technique corrects optical aberrations, achieving high apparent resolution and removing color fringing across the entire field of view for any gaze direction, inside and outside the eye box. Objective measurements with the Vive Pro Eye show a twofold visible resolution increase and a tenfold reduction of chromatic aberrations. In the presentation, we will explore how the Digital Lens literally makes high display resolution visible and how it should be used in future HMDs to achieve the long-awaited breakthrough in picture clarity and optical fidelity.
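One simple piece of the idea the abstract describes is pre-compensating lateral chromatic aberration: because the lens bends red and blue light by slightly different amounts, the renderer samples those channels at radially scaled coordinates so that, after the lens disperses them, all three channels land together. A toy sketch with assumed constant scale factors (the talk's actual technique uses a full, gaze-dependent characterization of the optics, not fixed coefficients):

```python
def precompensate_chromatic(u: float, v: float,
                            k_red: float = 0.995,
                            k_blue: float = 1.005) -> dict:
    """Return per-channel sample coordinates for one output pixel.

    (u, v) are lens-centered coordinates in [-1, 1]. Red is sampled slightly
    inward and blue slightly outward; the lens's opposite dispersion then
    re-aligns the channels on the eye. Scale factors here are illustrative.
    """
    return {
        "red": (u * k_red, v * k_red),
        "green": (u, v),              # reference channel, left unscaled
        "blue": (u * k_blue, v * k_blue),
    }
```

In practice this runs per pixel in a distortion-correction shader, and the scale factors vary with radius, gaze direction, and eye position within the eye box.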
A decade ago, advancements in hardware technology drove innovations in network infrastructure and cloud resource availability -- and we've collectively been reaping the benefits ever since. But this symbiosis of high-speed wireless connectivity and device capabilities is being upended by the arrival of 5G. The advanced network platform is poised to enable an edge computing and hardware revolution, and this has significant (some might say "huge") implications for creators and technologists. The fact is: 5G is here, and innovators are skating to where the puck is going to be. Learn how you can meet them there and be ready when this new technology presents new opportunities for the augmented reality industry. (Spoiler: that time is now.)
User flows for mixed reality are difficult to review effectively using traditional 2D tools like presentations and animated mockups. Instead of creating 2D storyboards, you can build prototypes directly in VR. During the talk, we will review the production workflows of companies like Walmart, Unity Technologies, and Cartoon Network. Spatial apps need spatial tools: you gain speed, effectiveness, and a true feeling of scale inside VR. In XR there is a huge demand to prototype ideas quickly, with a highly iterative process and an efficient workflow. We want to see more apps that feel born in VR rather than like PC/web apps transplanted into VR; the same is true for AR, with all the constraints it places on usability, comfort of menu navigation, and more. Too much development happens on guesswork, without a vivid immersive prototype that stakeholders can approve, and the cost of development, especially in XR, is very high. How can we easily and quickly create prototypes and see the whole simulation in a headset or on a device before deciding whether to commit the budget and move forward? With VR prototyping and animation tools like Tvori or Microsoft Maquette, you can quickly create 3D UI, and with Tvori you can animate your users' workflows. Within hours or days instead of months, you get a full simulation of the experience, in a headset, before you actually start coding. Think of VR game development in Unity: it is extremely hard to get a feeling of scale if you are not prototyping inside VR; you end up repeatedly taking the headset on and off and probably still never getting the scale right. Spatial apps need spatial creation tools.
Over the last ten years, augmented, mixed, and virtual reality have been steadily entering our lives. Yet these immersive and interactive technologies are still not well known to the general public. Because both the creation and the distribution of such experiences remain complex, there is still quite a long way to go to truly democratize these technologies. As time passes, however, more and more solutions are being engineered to that end. This talk will survey the current landscape of XR experience creation and distribution by and for the general public.
In this talk, Kirin will outline the technical challenges of bringing a compelling AR gaming experience into the real world: incorporating physical positional data at room scale while immersing the user in a world-scale fantasy. When taking gaming into an open and ever-changing environment, game design needs to be approached in a new way. Certain "set-up" elements of AR (room scanning, for example) need to be woven into the experience itself so the gamer feels in control of their own experience and remains engaged. But the opportunity of a truly AR gaming experience is huge: augmenting players' actual reality lets you tap into a new level of excitement, anticipation, or fear, making for a more meaningful and habit-forming experience that leverages 3D directional audio, haptics, and compelling AR visuals.
Despite herculean efforts from industry heavyweights, the promise of AR smartglasses with mass-market appeal has remained stubbornly out of reach. Key technology breakthroughs are needed in microdisplays and optics to enable bright, vivid content in an all-day-wearable size that doesn't obscure the real world. In this showcase, we'll start with a brief history of AR headsets, displays and optics. We'll explore some of the technology hurdles and tradeoffs, and describe a novel way of thinking about the problem. Then we'll reveal the first live demonstration of the QPI, the wearable display technology needed to unlock the massive economic potential of the next great wave of personal computing.
Experts will discuss all things Intellectual Property (IP)! Get acquainted with how to build your own IP or work with partners to use their existing IP, what to expect when dealing with licensed IP, and how to implement licensed IP in your games and marketing. Our panelists will focus on inbound and outbound IP licensing for creating value in a game, and on the related advertising issues for those games.
In 2019, Synesthetic Echo made and released Bumblebee Jam, an explorative choose-your-own-adventure musical experience set in a world where flowers produce music. It was one of the first games / creative tools played exclusively with AR audio wearables, such as Bose AR glasses and headphones. Designing for a new medium always presents challenges, so in this talk Maria will share her design process and the discoveries that helped make Bumblebee Jam an accessible and memorable experience. Attendees will learn the primary design considerations for AR audio, how to set up a fast development pipeline, how to structure user tests, and how to ensure a smooth publishing process.