The digital revolution has changed every aspect of our lives, the very essence of how we live, work and play. It has defied the predictions of visionaries and techno-prophets: instead of creating an alternative cyberspace parallel to our reality, the opposite is happening. Information is no longer confined to the pixels on our screens. The entire physical world -- even living and breathing matter -- is being infused with data, demanding new, unprecedented forms of interactivity. In this talk, Ivan Poupyrev will discuss his explorations of the present and the future, where technology, connectivity and intelligence are woven into the very fabric of our lives. He will outline his vision of the world as an interface, where everything is connected and interactive, viewed through the lens of projects that cover over 20 years of research and that span and connect a variety of fields, from augmented and virtual reality, haptics and touch interfaces, to radars, smart fabrics and interactive living plants. The talk will present his most recent explorations in creating future ambient computing environments: the development of a pico-radar for awareness and touchless gesture interaction (Project Soli), and the design of a platform for manufacturing interactive, connected soft goods at scale (Project Jacquard).
Since the mid-1990s, a significant scientific literature has evolved regarding the mental and physical health outcomes from the use of what we now refer to as Clinical Virtual Reality (VR). While the preponderance of clinical work with VR has focused on building immersive virtual worlds for treating anxiety disorders with exposure therapy, providing distracting immersive experiences for acute pain management, and supporting physical/cognitive rehabilitation with game-based interactive content, other emerging areas have extended the impact of VR in healthcare. One such area involves the evolution of conversational virtual human (VH) agents. This has been driven by seminal research and development leading to the creation of highly interactive, artificially intelligent, natural-language-capable VHs that can engage real human users in a credible fashion. No longer mere props that add context or minimal faux interaction in a virtual world, VH representations can now be designed to perceive and act in a 3D virtual world, engage in face-to-face spoken dialogues with real users, and in some cases exhibit human-like emotional reactions. This presentation will provide a brief rationale and overview of research that has shown the benefits derived from the use of virtual humans in healthcare applications. It will detail research reporting positive outcomes from studies using VHs in the role of virtual patients for training novice clinicians, as job interview/social skill trainers for persons on the autism spectrum, and as online healthcare support agents with university students and military Veterans. The computational capacity now exists to deliver similar VH interactions by way of mobile device technology.
This capability can support the “anywhere/anytime” availability of VH characters as agents for engaging users with healthcare information and could provide opportunities for improving access to care and emotional support for a wide range of wellness and clinical applications for a variety of populations. This work will be discussed along with a look into the future of this next major movement in Clinical VR. For more information on this topic, please visit our website: http://medvr.ict.usc.edu/ and YouTube channel: https://www.youtube.com/user/AlbertSkipRizzo/videos?view=0&sort=dd&shelf_id=1
As the media landscape continues to segment, technology is pushing the boundaries of how media companies connect brands and audiences. At the center of this digital evolution, ViacomCBS is driving authentic brand extensions that reach fans through emerging products and experiences. Join Tim Adams, VP Emerging Products Group, ViacomCBS, to learn how the company is doubling down on innovation to extend brand narratives through the use of spatial computing, augmented reality and voice interactivity.
Despite the ongoing megapixel race, the visual experience in VR is still nowhere close to the real world and falls short of consumer expectations and industry needs. Even the latest high-end HMDs retain the common flaws: blur and color fringing outside of the small “sweet spot,” an insufficient field of view due to quality degradation at wide angles, and a tiny eye box. Almalence has invented a technology to overcome the physical constraints of HMD optical performance that cause picture clarity deficiencies. Utilizing eye tracking and precise, consistent characterization of the HMD optical system, our Digital Lens technique corrects optical aberrations, achieving high apparent resolution and removing color fringing across the entire field of view for any gaze direction, inside and outside the eye box. Objective measurements with the Vive Pro Eye show a twofold visible resolution increase and a tenfold reduction of chromatic aberrations. In the presentation, we will explore how the Digital Lens literally makes high display resolution visible and how it should be used in future HMDs to achieve the long-awaited breakthrough in picture clarity and optical fidelity.
Join Patrick Costello, Sr Director Business Development at Qualcomm Technologies, Inc. and Raffaella Camera, Global Head of Innovation & Strategy at Accenture Extended Reality, as they present a case study around event planning in the hotel industry. The Accenture XR Event Planner is an immersive, collaborative and interactive augmented reality (AR) and virtual reality (VR) solution piloted in partnership with Qualcomm, InterContinental Hotels Group (IHG) and InterContinental Los Angeles Downtown. The XR Event Planner solution extends the digital consumer journey to include mobile, mobile AR, AR glasses and VR headsets—enabling event planners, buyers and hotel sales staff to visualize, customize and move through event spaces remotely and collaborate throughout the process. Focus group feedback and market analysis show hotels could shorten the sales cycle, improve brand affinity and connection with clients, and increase revenue up to 8 percent in the $330 billion annual US meetings and events industry.
Augmented and virtual reality (also known as XR) have the potential to transform the retail industry as a powerful merchandising tool. 3D representations of products will be everywhere in retail, appearing on mobile, web and AR/VR devices and enabling customers to interact with photorealistic products in virtual showrooms. Retailers will also benefit from 3D, as they will be able to create richer representations of their products across a diverse set of platforms. This technological advent is proving itself game-changing for retail, but without 3D being experienced consistently across all platforms and devices, production remains siloed, expensive and tough to scale. Making the future XR- & 3D-rich retail landscape a reality will require collaboration among many different retail & technology companies, which must ensure customers can find these experiences on many different platforms—from search results to social feeds to ad units to apps, ecommerce, websites, mobile AR devices, VR/AR headsets, & more. In this session, panelists from retail and technology come together to discuss some of the current challenges of bringing XR & 3D applications to retail & explain what efforts are being made so far to overcome them.
In this session, leading experts will discuss the state of the volumetric capture ecosystem. What are some of the challenges of volumetric capture, from talent sourcing and scheduling to output and post-processing? Non-standard formats have been a big challenge, from playback to the distribution of content on viewing hardware. How could 5G help with distribution?
This talk will present a few use cases from Dr. Jayaram’s work at Intel, at start-ups, and in academia to discuss some of the technology and business aspects of bringing VR to two different audiences: 1) end-consumers in sports and concerts, and 2) enterprises for engineering design and manufacturing. What are the challenges in delivering VR-based experiences and solutions at scale? What does it mean to weave them into the “normal” way of doing things to arrive at a new normal where VR is a mundane, predictable, and regular part of entertainment and work?
Predicting the future, and immersing oneself in it, is both a challenge and a desire. The IEEE rises to that challenge through its work on multiple new and emerging technologies, serving as a catalyst for developing new innovations, products and services. IEEE Future Directions serves as an incubator for these new initiatives. One of its focus areas, Digital Reality, explores and enables the coming Digital Transformation through collaboration among technologists, engineers, regulators, practitioners, and ethicists around the world. The Digital Transformation is fueled by advances in technology, such as Artificial Intelligence (AI), Machine Learning (ML), quantum computing, and applications using the copious amounts of continuously generated data. By leveraging these technologies and others, such as Augmented Reality (AR), Virtual Reality (VR), and Digital Twins, the line between the physical world and the digital world will become increasingly less distinct, enabling an immersive, intelligent digital world. Applications are already quickly emerging across the broad fields of education, manufacturing, medicine, entertainment, automotive, shared services, and more. Emphasis will be on practical applications and implementations of interest to attendees. Subject matter experts will comment on current and past implementations.
The flat web is evolving to become spatial. With standards and browser updates gradually being rolled out and development platforms now available to enable WebAR and WebVR experiences, we are witnessing the rise of the Immersive Web. What is the state of the Immersive Web today? What needs to happen to fully usher it in? And what role does it play in our augmented and virtual futures? Join Erik Murphy-Chutorian, CEO and Founder of 8th Wall, the leading development platform for reality content on the web, as he shines a light on the massive opportunity of WebAR and WebVR.
For years, many have tried to develop a see-through, near-eye display technology that combines beautiful design, excellent image quality and scalable mass production with high yields. At Dispelix, we’ve figured it out. Dispelix is helping product companies to create beautiful AR glasses based on single-waveguide full-color displays with superior image quality and mass manufacturability.
The fashion industry is one of the most powerful growth engines in eCommerce. Online sales from accessories and bags alone, for example, are expected to reach $85 billion by 2020. But fashion returns are estimated to be between 30% and 50%, amounting to losses of billions of dollars annually. Augmented Reality can change all that. The potential of AR to transform retail fashion is huge. AR is already impacting many fashion experiences, from viewing designer bags in 3D to trying on shoes, eyewear, earrings, hats and other accessories with your smartphone. AR is also now enabling shoppers to experience virtual fashion shows with the world’s top models right in their own living rooms. For an industry that is plagued by high return rates and focused on design and brand, the ability to evaluate apparel online for fit and design is a true game-changer. This panel will examine the current state of AR in fashion, explore the progress of virtual try-on solutions and take a look at a future where personal virtual mirrors could change retail in a profound way.
Despite herculean efforts from industry heavyweights, the promise of AR smartglasses with mass-market appeal has remained stubbornly out of reach. Key technology breakthroughs are needed in microdisplays and optics to enable bright, vivid content in an all-day-wearable size that doesn't obscure the real world. In this showcase, we'll start with a brief history of AR headsets, displays and optics. We'll explore some of the technology hurdles and tradeoffs, and describe a novel way of thinking about the problem. Then we'll reveal the first live demonstration of the QPI, the wearable display technology needed to unlock the massive economic potential of the next great wave of personal computing.
AR can drive real business value and the industry now has the numbers to prove it, so much so that many Fortune 100 companies across many verticals are taking notice and looking for the right use cases for their business. This info-packed session highlights companies across several different industries and their paths to success with AR. Panel participants will share their successes in implementing AR in their organizations, their methodology for choosing their initial use cases, and discuss lessons learned along the way. This is a must-attend session for companies that are looking to begin their AR journey or are thinking about how they can take their existing AR projects to the next level by expanding into new applications or business units.
The new medium of immersive computing offers so many opportunities for expression, and at the same time puts creatives in a tricky spot when they have to consider all the parameters that are beyond their control. When every user has different ergonomic and environmental preferences, it's important for us to design and build dynamic experiences that fit the audience's context. We will explore why proceduralism and machine learning play a crucial role in growing content and experiences in the new spatial medium. Through examples and case studies, we will review the insights and lessons learned on how to use technology to augment new forms of hardware input and creative direction.