02:00 PM - 02:25 PM
First responders (FRs) need every advantage during critical, life-or-death moments, and SMART cyber-physical systems (CPS) provide crucial information for time-sensitive events. Wireless, real-time convergence of building information modeling, 3D data exchange through machine-to-machine and machine-to-human analytics, and bio-telemetry monitoring of victims and responders will save lives.
Auto-populated, legacy, and real-time remote-sensed point clouds would allow FR personnel and their commanders to intelligently and visually understand their environments during a response. Mixed-physics computational fluid dynamics programs allow for predictive outputs in structure and wildland fires, hazardous materials releases, and other natural and human-initiated events. Technical rescues and high life-hazard incidents will benefit from SMART cities, IoT, remote sensing, and virtual, augmented, and mixed realities. The unconscious patient will be identified via facial recognition software, allowing responders to know their medical history, and complementary real-time field communications will be transmitted to base-hospital trauma surgeons in 3D.
The SMART FR, during response, mitigation, and post-incident analysis, will view emergencies through 3D projected-light and holographic displays and immersive HUD lenses. This multi-sensor-fusion-enabled technology is focused on saving civilian lives and empowering responders.
02:00 PM - 02:25 PM
Fans don’t just want to watch the game - they want to be part of the experience. To hold their attention, brands must deliver a personalized, emotional experience, under increasing pressure to innovate. Tech-enabled experiences give brands an opportunity to gain insights into their fans (their needs, motivations, etc.) and to become more influential in their purchasing decisions.
Whether through digital platforms, AR, VR, AI, holograms, or NFTs, the gamification of the fan experience is an example of how teams and brands can optimize consumer interactions to increase sales, analyze customer engagement, and measure partnerships. From pre-event to on-site, at-home, and post-game experiences, our fireside chat will explore how brands can leverage XR to connect with consumers, build fanbases, and influence opinion.
02:00 PM - 02:25 PM
The increasing popularity of XR applications is driving the media industry to explore the creation and delivery of new immersive experiences, while pushing engineers and inventors to address the challenges of manipulating real video content. With this in mind, the talk will discuss these challenges and introduce key technologies that leverage open standards to enable large-scale distribution of new immersive experiences.
A volumetric video comprises a sequence of frames, each of which is a static 3D representation of a real-world object or scene captured at a different point in time. Volumetric video is bandwidth-heavy content that can be presented as dynamic point clouds, multi-view plus depth, or multi-plane image representations.
These bandwidth requirements can be met through dedicated compression schemes that produce data rates and file sizes that are economically viable for the industry. Standards play a crucial role in ensuring interoperability across these different types of content and experiences, and this talk will spotlight the Moving Picture Experts Group (MPEG) Visual Volumetric Video-based Coding (V3C) standard as an open-standard solution for efficient streaming of volumetric video.
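As a rough back-of-the-envelope illustration (not drawn from the talk itself), the bandwidth problem that motivates dedicated codecs can be sketched in a few lines of Python; the frame structure and the 15-bytes-per-point figure below are illustrative assumptions, not V3C specifics:

```python
from dataclasses import dataclass

@dataclass
class PointCloudFrame:
    """One frame of a volumetric sequence: a static 3D snapshot at one instant."""
    timestamp_s: float
    num_points: int  # each point carries position (x, y, z) and color (r, g, b)

def raw_bitrate_mbps(frames_per_second: int, points_per_frame: int,
                     bytes_per_point: int = 15) -> float:
    """Uncompressed data rate in megabits per second.

    Assumes 3 x 32-bit floats for position plus 3 x 8-bit color channels,
    i.e. 15 bytes per point (an illustrative figure only).
    """
    bytes_per_second = frames_per_second * points_per_frame * bytes_per_point
    return bytes_per_second * 8 / 1_000_000

# A modest 1-million-point capture at 30 fps already needs ~3.6 Gbit/s
# uncompressed, far beyond consumer links - hence compression standards.
rate = raw_bitrate_mbps(30, 1_000_000)
```

Even under these conservative assumptions the raw rate dwarfs typical broadband capacity, which is why compressed representations are a precondition for streaming volumetric content at scale.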
To enrich XR experiences, inventors and industry must work together to create and distribute additional media and interaction mechanisms alongside the volumetric video components. In fact, the MPEG-I Haptics and MPEG-I Scene Description standards are currently under development and will soon offer solutions to the industry.
From the perspective of real content creation, enabling technologies that cover 3D capture, calibration, depth processing, format conversion, transmission, and rendering remains challenging but crucial. This session will explore two creation pipelines that leverage the aforementioned standard codecs: (1) a real-time pipeline for telepresence using depth and color cameras; (2) an offline pipeline for sports and media applications using color cameras and prior geometric information.
02:00 PM - 02:25 PM
"AWS Spatial: Building Blocks of the Metaverse" is a session for developers and tech enthusiasts interested in the potential of XR to revolutionize various industries. It is focused on helping those who have moved past the pilot or prototype phase and are now looking to scale and adopt XR in a meaningful way. In this session, attendees will learn how to take disparate 3D workloads and XR proofs of concept and scale them into enterprise-level applications using the tools, building blocks, and patterns available right now. The session will cover the complete XR workload lifecycle, including supporting asset creation, ingestion, transformation, and distribution/deployment.
The session will be led by three AWS Spatial Computing prototype architects, each representing a different emerging-technology team within AWS. They will demonstrate how to leverage CAD engineering models, photogrammetry, and AI/ML in 3D asset pipelines to create high-quality XR experiences, and how to use AWS Spatial to build XR workloads that are scalable, secure, and reliable.
02:00 PM - 02:25 PM
With a proven 20-plus-year track record of daily use in mission-critical health and military applications globally, 3dMD’s dynamic 4D capture systems now provide the same ‘near-ground-truth’ 3D images and high-throughput workflows to its tech-sector customers. 3dMD captures natural human actions, behaviors, gestures, and nuances, personalizing ‘ground-truth’ people into ‘near-ground-truth’ digital 3D input for Web 3.0. This talk explores the true economic benefits of baselining with a ‘near-ground-truth’ digital 3D doppelganger population and how this can create derivative avatar market opportunities beyond gaming, entertainment, and social media interactions.
02:00 PM - 02:25 PM
This is the very first time Ant Reality will unveil the secrets of its Crossfire 120° AR optics - the world’s widest OST (optical see-through) solution. First, Zheng will make the case that Crossfire is among the best options for wide-FoV AR compared with birdbath, free-form, and other waveguide solutions. Targeting AR/VR hybrid use cases, Zheng will then walk you through the reasons why Crossfire outperforms the competing pancake-plus-VST (video see-through) solution. Additionally, Zheng will introduce several reference products designed with some of Ant Reality’s clients.
02:00 PM - 02:25 PM
svarmony revolutionizes indoor navigation with aryve, the cutting-edge augmented reality platform that is transforming the way we explore and experience indoor spaces.
In this presentation, we will delve into the breakthrough technology behind aryve and show how visual positioning systems (VPS) have high potential to disrupt industries, improve user experiences, and generate new business opportunities.
Join us as we navigate the uncharted territory of AR indoor wayfinding and unveil the possibilities it offers.
02:00 PM - 02:25 PM
Immersive soft-skills training is one of the most significant untapped opportunities in bringing mixed reality to the enterprise. When implemented, organizations’ learning and training processes scale, moving employees from the classroom into the real-world workforce faster. By combining mixed reality with generative AI, enterprise XR unlocks greater levels of efficiency, safety, and knowledge transfer while building mission-critical skills in scenarios that span complex presentation, negotiation, leadership skill-building, one-on-one coaching, conflict resolution, and customer support.
In this case study, we will showcase how Scoot Airlines, a subsidiary of Singapore Airlines, upskills employees in preparation for dynamic interactions with passengers. Leveraging its partnership with TeamworkAR, Scoot Airlines creates rich content experiences regardless of the team’s technical level; deploys XR-as-a-service, including content creation, best-practice learning methodology, and device management; and uses generative AI with a digital replica of the work environment in immersive 3D to accelerate and improve the quality of employee training and drive operational efficiencies.
02:05 PM - 03:20 PM
Join us as startups pitch to our panel of judges for a chance to be named AWE's "Startup to Watch" for 2023. The winner will be announced at the Auggie Awards ceremony.
02:30 PM - 02:55 PM
Showcasing the importance of real-time 3D visualization for improving workflows across different industries.
02:30 PM - 02:55 PM
Since 2005, Forklift University has been at the forefront of training people to operate Powered Industrial Trucks (PITs), providing OSHA-compliant training and issuing OSHA certificates.
The brain retains more information from real-life experiences, and younger generations require training that captivates and teaches in a more visual manner. Forklift University therefore needed a solution that would engage Gen Z while remaining user-friendly and adaptable for everyone from Gen Y to boomers.
In PIT training, safety is of utmost importance, followed by cost. Forklift University needed a solution that would enable its users to learn the dos and don’ts of forklift driving in a reasonably realistic warehouse setting.
Travancore Analytics (TA) has been a key player in the extended reality domain. When Forklift University approached TA with the problem, TA suggested a combination of the metaverse and VR to achieve the intended goals. TA created a realistic warehouse and a sit-down forklift using 3D software and incorporated them into a complex yet user-friendly application developed in Unity. The application works in conjunction with the HTC VIVE Pro 2, a VR head-mounted device. The combination allows the user to drive a virtual forklift in a virtual environment and undertake training curated with OSHA guidelines in mind. With training modules of varying degrees of difficulty and functionality, Forklift University can provide its users with close-to-real-life forklift training in complete safety. With VR technology, multiple PIT training applications can be created in various environments that replicate real-life challenges and incorporate features that enable more in-depth training.
02:30 PM - 02:55 PM
Ericsson Digital Human enables the next generation of connected, human experiences.
Make language universal with EDH AI video dubbing (Dub Sync). Dub Sync leverages the power of AI to seamlessly dub video of any speaker with audio in any language, bridging the gap between audiences across the world better than ever. Imagine a new world of possibilities for virtual conferencing, live events, and content localization. Dub Sync is the first in a family of products designed to enhance the connections of today with the technologies of tomorrow.
02:30 PM - 02:55 PM
Enterprises face common challenges with XR, from overcoming the uncertainty of metaverse hype to understanding how to start and scale an XR implementation and which products will get them there. Slalom and PIXO VR have partnered to help enterprises cut through the noise and confusion in the XR market. Come learn how we are combining the best XR strategists with the best XR platform and products, plus elite customer care along the way, to help enterprises run successful XR programs.
02:30 PM - 03:15 PM
John Gaeta is the legendary creator best known for his Academy Award-winning work on the original Matrix trilogy and for creating Bullet Time. John, now Chief Creative Officer of Inworld AI, will talk alongside Chief Product Officer and Inworld co-founder Kylan Gibbs about their incredible new concept demo, ‘Origins’, which lets players take on the role of a detective in a sci-fi world. Using your mic, you’re tasked with questioning witnesses, uncovering the narrative, and cracking the case. All of the NPCs use generative AI to create dynamic conversations, meaning you can ask them whatever you like and they will react accordingly - whilst staying in character. John and Kylan will give you an insight into this fascinating new technology, set to revolutionize games and immersive media.
02:30 PM - 03:25 PM
E-commerce is the beachhead of the metaverse. Virtual try-on, AR and 3D room-design tools, and a rising marketplace for purely digital items are changing both how people shop and what they shop for. Built for interoperability and optimized for web-based applications, the open glTF standard - the “JPEG of 3D” - remains the best transmission solution for multi-platform e-commerce assets, but to meet the challenges of the open metaverse and the next generation of e-commerce use cases, it must evolve.
In this session, members of the asset creation and retail communities will discuss the gaps between the built-in capabilities and attributes of the most common 3D asset format, and current and emerging e-commerce requirements. We’ll examine use cases and discuss the challenges of current approaches. Finally, we’ll explore new extensions under development for glTF, including interactivity, behaviors, and the ability to assemble multiple assets into a single scene. Our panelists will share demonstrations of the kinds of experiences retailers will soon be able to offer cross-platform. You’ll leave inspired, with new ideas for your e-commerce platform and strategies to implement them using cost-efficient, interoperable standards-based assets.
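As background for readers unfamiliar with the format (this sketch is not taken from the session itself), glTF is a JSON-based container whose scenes, nodes, and meshes reference one another by array index. The hypothetical Python snippet below builds the skeleton of a minimal glTF 2.0 document; buffers, accessors, and materials are omitted for brevity:

```python
import json

# Skeleton of a minimal glTF 2.0 document. Only the "asset" object with a
# version string is strictly required; everything else cross-references by
# array index (scene 0 -> node 0 -> mesh 0). Accessors and buffers, which a
# complete asset would need, are omitted here for brevity.
gltf = {
    "asset": {"version": "2.0", "generator": "illustrative-sketch"},
    "scene": 0,
    "scenes": [{"nodes": [0]}],
    "nodes": [{"mesh": 0, "name": "product"}],
    "meshes": [{"primitives": [{"attributes": {}}]}],
}

# glTF is plain JSON, so serialization round-trips with the standard library.
doc = json.dumps(gltf)
parsed = json.loads(doc)
```

Because the format is plain JSON with index-based cross-references, extensions such as interactivity or multi-asset scene assembly can be layered on without breaking existing loaders, which is what makes it attractive for cross-platform e-commerce pipelines.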
02:30 PM - 03:25 PM
XR is at the heart of industry, education, health, and social networks. The development of immersive technology will only be successful if it is created by a diverse workforce, empowered to create an XR that represents the experience of all people. As our world is transformed by the prospect of XR, employment opportunities abound for people of all levels of education and experience.
**XR Impact**
02:35 PM - 03:30 PM
In the multisensory virtual experience, the interplay of sight, sound, and touch simulation can enhance our sense of immersion, embodiment, and delight. How far away are we from achieving that? How can we integrate visual, audio, and haptic interfaces to create more compelling experiences? How can we use these tools today to facilitate genuine connection and communication – and what advances do we want to watch out for?
Our panelists will share learnings and observations from their research and development work to exceed the expectations of customers who want new, tactile experiences and unique personal environments. Join us to explore how multisensory experiences might affect the ways in which we create our real (and virtual) spaces.
03:00 PM - 03:25 PM
After over a decade searching for mixed reality smart glasses capable of enhancing his low vision, legally blind Chris McNally walked into the Paris offices of Lynx with his blind cane and sighted guide, and met founder Stan Larroque.
After putting on the Lynx R1, Chris’ world was transformed, making it possible for him to see tables, chairs, and other obstacles in the room, and the expressions on people’s faces during conversations - all impossible just moments before. Was it magic? No, just the right combination of technology capabilities to meet the needs of Chris’ low vision.
However, this is just the tip of the iceberg: advancements in smart-glasses hardware, software, and MR/AR/XR technologies, along with advancing AI and more, will create incredible opportunities for people with low vision.
03:00 PM - 03:25 PM
In this thought-provoking talk we follow the story of John Henry, a character known for his incredible strength and stamina, as he competes against a steam drill powered by artificial intelligence in a high-stakes race to build AI-powered XR applications and platforms - and even reshape society.
Together we probe critical questions on AI generation of art, music, code, and more.
1. How do these AI tools affect XR creation?
2. What new kinds of applications and services do AI tools unlock?
3. How does AI support new opportunities to scale?
4. What are new kinds of AI enabled business models?
5. What concerns should we have about the rapid deployment of AI to change society?
ChatGPT, Midjourney, Stable Diffusion, DALL-E, Copilot, Lensa, and 3D XR Asset Creation are just the beginning.
Will AI eat our reality?
Join us as we consider the age-old question of what happens when technology surpasses humanity's abilities in the race to build a new XR reality and how it fundamentally changes our existing one.
03:00 PM - 03:25 PM
Discover real engagement practices and strategies in XR experience design that put human connection first.
In an age where we work on distributed teams, being able to stay creative, resilient, and connected in a meaningful way makes all the difference.
Being emotionally connected matters. Great leaders know how to reach others and how to tend to their own wellness in order to thrive at their best level.
Come learn these practices and strategies together, and see how they apply to enterprise experiences that are engaging and serve the core well-being needs of our teams, those we serve, and ourselves.