Emerging technology looks set to reshape not just what entertainment is, but how we connect and work with each other every day. This presentation will feature intersectional minorities working in emerging VR and AR development to discuss the steps needed to create for equity instead of only equality. It will serve as an educational opportunity to guide newcomers and veterans alike in practicing inclusive frameworks throughout every stage of a project's life cycle. Our discussion will push the boundaries of how people perceive the metaverse and its potential impact on the VR/AR community.
Smart glasses in a normal form factor will significantly change our everyday lives. tooz envisioned this more than a decade ago when the tooz journey began in the corporate research labs of ZEISS Germany. Starting with the first curved waveguide, tooz developed several generations of optical engines and provided a continuous stream of patented inventions to the smart glasses industry. Five years ago, Deutsche Telekom/T-Mobile joined the journey as a 50% shareholder and enabled the development of full smart glasses solutions based on tooz's waveguides. At AWE 2022, tooz will launch its next breakthrough innovation on its mission to lead this market: tooz ESSNZ Berlin is the first market-ready smart glasses reference design with vision correction that will change how consumers interact daily with data, media and ecosystem interfaces. The underlying tooz technology is highly customizable, scalable, and marketable, not in the future, but already today.
With Lightship’s Visual Positioning System (VPS) available in the Lightship ARDK, developers can localize users with centimeter-level accuracy in under three seconds. Learn how you can leverage this technology to build the future of the metaverse and create AR experiences in the real world.
When an immersive experience takes on all or most of a person’s senses, it becomes a new dimension. A great deal of creativity goes into architecting a dimension (characters, storyline, visuals, sound, touch feedback), and a great deal of compute goes into rendering that dimension into reality (high-quality sound and graphics, tracking), all on top of the ingeniously instrumented space where such a dimension exists. Previously, rendering the sounds and visuals of a dimension was limited by the amount of compute a person could carry. Compute offloading over high-bandwidth, low-latency wireless technologies is removing that limitation. New dimensions can quickly take advantage of what state-of-the-art compute and rendering technologies can deliver, with the potential to shorten development and deployment cycles.
Leverage your passion and skills for gaming to change the world! Do you like creating incendiary fights with fire demons, finding ways to uncover story secrets, or stealing artifacts from spaceships while avoiding robots? Then join Robin Moulder as she shares insights and lessons learned in moving from game development to creating XR applications for enterprises.
Introducing a new technology into an organization is always a challenge; funding it, building it, and operationalizing it, even more so. This is especially true when the perception is that it could make existing skillsets obsolete. The creation of 3D content involves new workstreams and new stakeholders, but the aesthetic sensibilities and consumer insights needed to engage customers remain unchanged. The impact of curation and storytelling will never fall away. We need the talent we have today, most of whom have never leveraged 3D assets in their work. Emotional intelligence has a place on your technology roadmap. It’s a big LOE, of course, but baking it in means you’ll spend a lot less time dragging colleagues into the future kicking and screaming. (Not literally, of course.) The process is rooted in this basic truth: we need one another.
If you need to raise capital for your games studio, where do you start? Bobby Thandi from XR Games reveals the do's and don'ts of raising investment, including what you need to get ready before investment, what to expect during the process, red and green flags regarding potential investors, the questions you need VCs to answer, and the questions VCs will be asking YOU.
At its best, technology can be a powerful force for good in the world. Cutting-edge computer vision and code-scanning techniques are helping CPG brands reimagine their packaging and product offerings to help the visually impaired locate and access product information through their phones, both in-store and at home. Increasingly, brands and businesses are looking to embrace the accessibility and inclusivity agenda for all their customers: providing access to relevant and augmented information for their total customer base, whether they are sighted, blind, or partially sighted. The trick is to deliver this in a way that works operationally through the design, production and manufacturing chain while offering a simple, single scanning solution for all end users. Discover a new approach to solving this problem.
Against the backdrop of the Metaverse, how does the real world fit in, what do open, interoperable, and ethical look like, and what is the role of the Web? We explore some potential guiding principles for the real-world Spatial Web and early steps toward realization through standards activities and the Open Spatial Computing Platform (OSCP). Along the way, we highlight areas that present strong opportunities for academic research, standards, and open source contributions.
MetaVRse is a low-code 3D creation platform for the future of human communication, collaboration, commerce and culture. Launched at AWE 2020, the MetaVRse Engine has been used for training and marketing by Fortune 500 organizations. TheMall.io is a 100m sq ft virtual retail and entertainment destination built on the MetaVRse Engine. We will be showcasing the new features of the completely rebuilt v2.0 while building a multiplayer experience in real time.
Imagine a world where you can attend a Broadway show any day of the week, at a price you can afford, from anywhere across the globe, and have the best seats in the house. Now imagine that the community of artists, producers and crew members who create this experience have unified to craft an ecosystem of equality where all make a living wage, and issues such as disability, sexual orientation, gender, and race inequity have been reduced to a case study in the history books. Imagine a world where we celebrate rich, diverse stories accessible to everyone, inspiring generations to come and elevating humanity. The intersection of foundational technologies has crossed the exponential growth point, making what was futuristic now real. In this session we explore current and future XR live theatrical productions that intersect stage, screen and beyond. By creating digital assets and story moments in XR, we accelerate the traditional development lifecycle while defining and building the virtual future of live entertainment.
Join us for a discussion about the security concerns and seamier side of the Metaverse, and how we can establish trust and keep safety top of mind as we develop the infrastructure and regulations around immersive and emerging technologies.
The future is volumetric…is the title of a talk you could’ve given in 2017. Five years later, where are we? Anthony Fenu, CEO of Soar, injects hope and possibility into the conversation while summarizing the incredible potential of, and the pitfalls faced in, the development of an entirely new medium. Stepping aside from all the hand-waving and futurecasting, he explains the technical and pragmatic challenges Soar has solved to bring about the volumetric future we’ve wanted all along.
The Open AR Cloud is working to democratize the AR Cloud with infrastructures based on open and interoperable technology, and we are building city-scale AR testbeds that are being experienced in cities around the world. These are real-world use cases that combine the digital with the physical: rich experiences that are synchronous, persistent, and geospatially tied to a specific location. Content in situ allows the user to explore the world, connect with others, and have a shared experience. We will discuss new types of content activations based on proximity, gaze, voice, sensor data, and algorithmic spatial ads. Partners will present use cases such as wayfinding and NFT exhibits, as well as case studies that demonstrate how the technology is being used to build more diverse, equitable, and inclusive real-world communities that raise awareness of critical issues like climate change and public health.
Outlets located throughout the Convention Center & the Hyatt. Food available for purchase. Credit card sales only.
The University of Pennsylvania Cancer Center is one of the largest and most technologically advanced centers of its kind in the world. While this esteemed institution is well known for its work in the areas of immunotherapy, precision surgery, and proton therapy, it is quickly gaining a reputation for work being done in the field of medical virtual reality. This technology is now being utilized to help manage the stress and anxiety experienced by patients and caregivers alike. It is quickly finding a place in the areas of patient education, as well as the education of medical trainees. Researchers and clinicians at Penn continue to explore and develop XR applications for the benefit of patients, medical trainees, and caregivers.
The metaverse is enabling us to explore new horizons of inclusivity across every sector. We have the opportunity and the responsibility to build a safe online space with radically inclusive principles at its core. No longer defined by physical space, we can create bespoke experiences that reflect audiences' needs and create a technology-led, equitable future, whilst keeping these spaces safe for all. Featuring two leading experts from the UK cultural and academic sectors, this talk will draw on a range of international projects in which they have played vital roles, asking questions and providing practical solutions as to how the metaverse can act as a catalyst to rethink our shared future, and how embracing new technologies can encourage fresh perspectives, future thinking, and a radical approach to inclusion and access. Takeaways: • How utilising the metaverse can benefit the telling of cultural narratives. • What can we expect from the next generation of experiences, festivals, and events? • How do we meaningfully address the challenge of making content that is truly inclusive? • Practical solutions to inclusivity and safety in the metaverse.
Where does the Great Resignation leave more than 2 billion frontline workers? Thanks to recent advancements in assisted reality, the frontline workforce will be able to get real-time contextual data in the right place at the right time and be a part of the industrial metaverse. Digital transformation and XR will connect them to their machines and teams in ways they never thought possible. In so doing, they will be seen as an essential part of the Digital Factory, elevated with new skills to navigate exciting new environments, including AI and IoT. Hear from RealWear what the future holds using an industrial-strength HMD such as the RealWear Navigator™ 500 solution. Learn how macro trends including AI, IoT and Cloud have converged to create the modern frontline industrial worker. RealWear’s chief product officer will share new emerging use cases that will continue to accelerate the next industrial revolution.
The Immersal mobile app will revolutionize how you understand the metaverse, and how you create, share and collaborate in it. We will introduce an easy way to create a private piece of the metaverse anywhere and demonstrate how sharing your private piece of the metaverse with your friends, or with everyone, is fast and fun. Someone has to pay the bill, of course, so monetization is also discussed.
Speakers from AbbVie and Tipping Point Media (TPM) will discuss the case study of AbbVie’s Virtual Reality Suite of resources, including the newest addition: an overview of presbyopia, or age-related degeneration of the eye. These VR experiences have been designed to support the launch of AbbVie’s newest products and indications by engaging healthcare providers (HCPs) and creating disease awareness through experiential learning. The newest eye-care-focused expansion to this training tool leverages VR’s unique visual-immersion capabilities to simulate the impact of an eye disease on a patient from a first-person perspective. The speakers will detail how the use of innovative technologies can provide physicians and their staff with a deep understanding of various disease states through immersion. The session is meant to give audiences a first-hand account of the huge strides that can be made when companies invest in creativity and innovation and build toward the future.