As the immersive media industry takes shape, we're finding it breaks the mold in more ways than one. Storytellers are embracing new formats and matching them with new narrative mechanisms. These immersive stories leverage alternative funding models and new distribution channels, like location-based and VR storefronts. Without an industry status quo, leading women are stepping up to blaze trails in immersive media.
Our group poses the question: what if the “Metaverse” is made up of an open ecosystem of interoperating, simultaneously running applications that blend the advantages of AR, VR, the web, open source, and standards? We demonstrate a joint system for multi-user, multi-app composited application development built on the open WebXR standard. To participate in the demonstration, enter the Mozilla Hub via https://awe.webaverse.com/. Supported devices and browsers: because Hubs runs in the browser, it works across many different platforms. For a 2D experience, Hubs works on most modern desktop browsers on Windows, Mac, and Linux, as well as most mobile browsers on Android and Safari on iOS. For desktop VR (Oculus Rift, Windows Mixed Reality, or HTC Vive), Hubs works with a WebVR-compatible browser (e.g. Firefox); users must also have either the Oculus Desktop App (Oculus Rift) or SteamVR (Windows Mixed Reality, HTC Vive) installed. For standalone VR devices (Pico Neo 2, Oculus Quest, and Oculus Go), Hubs works in the Mozilla Firefox Reality Browser or the Oculus Browser. Mobile-phone-powered VR systems, such as Samsung Gear and Google Cardboard, work using the Google Chrome browser. Note that some features may be limited on low-powered devices such as mobile phones. See https://hubs.mozilla.com/docs/hubs-create-join-rooms.html#supported-devices--browsers
AR is being used to create new, engaging, effective ways for consumers to interact with brands and businesses. By adding AR gamification and location-based context, businesses are increasing reach, user sentiment, and actual foot traffic. Having worked with global and national brands like AT&T, Simon Malls, Starbucks, McDonald's, and Samsung, Niantic's sponsorship platform has been evolving into a new form of AR advertising that is non-intrusive, organic, and engaging. Gaming is the new way to reach the so-called "unreachable" demographic: those who can no longer be engaged via traditional forms of advertising. With the gaming narrative and experience in mind, we will walk through how location-based AR ads can drive foot traffic to brick-and-mortar stores, create location context for digital brands, and provide highly contextual and timely moments of engagement with customers. Pulling from real-world results, we will share case studies on what works in location-based AR gaming and how brands can leverage this new form of advertising.
In this session, leading news organizations will reflect on the current state of immersive journalism. Talking points will include storytelling and the use of interactive narratives, ethical considerations, what medium (AR vs VR) is better for specific stories, and the process of choosing what projects are worth turning into immersive experiences.
Watch VR artist The Sabby Life paint live in virtual reality using the VR art app Tilt Brush. See a galactic scene unfold and explore the alien landscape as she paints live. Join the VR experience here: https://account.altvr.com/events/1475671672559239766. AltspaceVR minimum specs and supported devices: https://help.altvr.com/hc/en-us/articles/115003538414-Minimum-System-Specifications
Learn how Artie’s technology enables developers to create games in Unity that have next-gen interactive features, including speech recognition and computer vision.
The Entertainment Technology Center at USC has now run three Challenges asking students and recent graduates what they would like to be able to create in 3-5 years that they cannot easily create now, and what needs to happen to make it possible. This session will further describe the Challenge and show some of the winning 3-minute pitch videos.
The Augmented Conversation is a dialog between human and AI participants that enables them to imagine, describe, and create live virtual objects and simulations that they can interactively and simultaneously explore and modify. The AI is a full participant in this exploration, listening and watching the human participants and responding instantly to reify the ideas and concepts that they discuss. This isn’t an app. This is live collaboration: a foundation for how the next devices will mediate our engagement with others and with the world. Not only will these devices replace your smartphone, they will replace your PC. The new devices will and must become super workstations that live up to the promise of delivering the “bicycle for the mind.” They will enable you to work with ideas and concepts that are simply not possible today, and you'll be able to share those ideas at any time with anyone.
There is a new form of augmented reality in film and television production. This session will cover advancements in “in-camera visual effects” and how this technique is changing the film and TV industry. New software developments in real-time game engines, combined with hardware advances in GPUs and on-set video equipment, mean that filmmakers can now augment their sets and capture final-pixel visual effects while still on set, enabling new levels of creative collaboration and efficiency during principal photography. These developments allow changes to digital scenes, even those at final-pixel quality, to be seen instantly on high-resolution LED walls, a dramatic time savings over a traditional CG rendering workflow. This session will include a case study on the use of Unreal Engine in the making of ‘The Mandalorian.’
Since the mid-1990s, a significant scientific literature has evolved regarding the mental/physical health outcomes from the use of what we now refer to as Clinical Virtual Reality (VR). While the preponderance of clinical work with VR has focused on building immersive virtual worlds for treating anxiety disorders with exposure therapy, providing distracting immersive experiences for acute pain management, and supporting physical/cognitive rehabilitation with game-based interactive content, there are other emerging areas that have extended the impact of VR in healthcare. One such area involves the evolution of conversational virtual human (VH) agents. This has been driven by seminal research and development leading to the creation of highly interactive, artificially intelligent and natural language capable VHs that can engage real human users in a credible fashion. No longer at the level of a prop to add context or minimal faux interaction in a virtual world, VH representations can now be designed to perceive and act in a 3D virtual world, engage in face-to-face spoken dialogues with real users, and in some cases, can exhibit human-like emotional reactions. This presentation will provide a brief rationale and overview of research that has shown the benefits derived from the use of virtual humans in healthcare applications. Research will be detailed reporting positive outcomes from studies using VHs in the role of virtual patients for training novice clinicians, as job interview/social skill trainers for persons on the autism spectrum, and as online healthcare support agents with university students and military Veterans. The computational capacity now exists to deliver similar VH interactions by way of mobile device technology. 
This capability can support the “anywhere/anytime” availability of VH characters as agents for engaging users with healthcare information and could provide opportunities for improving access to care and emotional support for a wide range of wellness and clinical applications for a variety of populations. This work will be discussed along with a look into the future of this next major movement in Clinical VR. For more information on this topic, please visit our website: http://medvr.ict.usc.edu/ and YouTube channel: https://www.youtube.com/user/AlbertSkipRizzo/videos?view=0&sort=dd&shelf_id=1
Team Glue is hosting open hours and demos throughout the AWE Online 2020 conference. We are happy to invite you to participate in our virtual gatherings in Glue. You can jump in briefly or stay longer to meet with fellow AWE participants and Glue team members. Request access to Glue AWE 2020 Open Hours & Demos here: https://online.awexr.com/organizations/RHNRxYfACocu7xTHY. Learn more about Glue at www.glue.work.