According to leading executives of XR companies, what are the key considerations when designing haptics experiences and products? What insights and lessons do you need to know to save you from reinventing the wheel? What are the brightest minds in XR predicting the future of haptics will look like?
Get ready for a look into the world of spatial computing with AWS! Join us as we dive into how AWS is transforming the way 3D models are brought to life in the cloud. We'll be showcasing the latest spatial computing services on AWS that enable you to build, deliver, and manage your 3D workloads with ease and efficiency.
But that's not all - in an on-stage demonstration, you'll get to see how we paired a Magic Leap 2 with a Boston Dynamics Spot robot to showcase how AWS's cutting-edge technology can help users visualize live robot telemetry and control the robot in even the most challenging and remote environments.
This session and the one following it are a must-attend for professionals interested in exploring the full potential of spatial computing on AWS. Join us for a captivating and informative presentation that is sure to inspire and inform!
Digital Twins connect physical systems with virtual models, enabling visualization, integration of sensor data, and prediction of how assets or processes will behave in the future. Globally, organizations are grappling with the acceleration of remote operations and an increasingly prevalent skills gap. Forward-looking companies are addressing this problem by equipping their in-person, hybrid, and off-site teams with mixed reality (MR) solutions that enhance productivity, especially when integrated with Digital Twins. In this session, learn how Sphere and AWS are working together to develop a digital workspace that enables professionals to communicate across languages, distances, dimensions, and time (with predictive capabilities). By partnering on initiatives that include AWS IoT TwinMaker and Amazon Lookout for Vision, as well as cloud rendering powered by AWS and its partners, Sphere’s cutting-edge solution is pioneering the future of collaboration, expert assistance, and workflow guidance using MR.
Since 2018, Zheng Qin has been developing a cutting-edge, wide-FoV AR optics system called Mixed Waveguide, whose most advanced Crossfire solution delivers a 120-degree FoV and is interchangeable between AR and VR. It’s even slimmer than most VR optics (including Pancake solutions), so it could be the ultimate optical solution for AR & VR hybrid glasses. Zheng will walk you through the reasons why Crossfire outperforms its competitor, the Pancake + VST (video see-through) solution. Additionally, Zheng will introduce the whole family of Mixed Waveguide solutions, which has been adopted by many key clients around the world.
Recently, there has been, and continues to be, a flurry of activity around AR and the Metaverse. How these domains intersect and unfold over time is still very much in the early stages. What is clear, however, is that the "on-ramp," or gateway, into the Metaverse starts with the ability to perceive the physical and digital worlds simultaneously. Many technologies and devices are needed to enable true immersion, and first and foremost is the ability to overlay the digital domain onto the physical space. In this talk we will discuss these aspects and delve deeply into near-to-eye display technologies that allow users to coexist in the physical and digital domains.
Construction is a complex and messy process, combining millions of individual components - all needing to be positioned in exactly the right place with mm-level accuracy. Yet the construction process has largely remained unchanged for thousands of years. A group of people design a building, listening to every wish and elaborate desire coming from the owner, and then they hand the design over to thousands of other people to actually build it, without any understanding of how the design fits the real world. It’s like building a massive jigsaw puzzle where thousands of people are each responsible for one piece or another, and no one really knows how they all fit together. This waterfall process leads to building things up only to tear them down immediately after and build them right back again - just moved over by 1 ft - something the construction industry spends over $280B per year doing. This is simply not sustainable for the industry, for the stakeholders, and most importantly - for the planet. With nearly 40% of the world’s carbon emissions attributable to the construction industry, transformation is desperately needed.
And that’s exactly what Trimble is working to do. As a leader in high-accuracy positioning technologies, Trimble has a long-standing history of bringing precision to the construction industry - helping to fit all those puzzle pieces together. But we see the opportunity to do more. Since 1997, when we filed our first XR patent, Trimble has been transforming the way the world works by connecting the physical and digital worlds. Now, we’re working to change this archaic narrative by empowering everyone in construction to visualize, interpret, and act on the design through Augmented and Mixed Reality technologies in the field. From catching design mistakes by viewing 3D models on any worker’s iPad, to working more efficiently by controlling the most precise total station with nothing more than your gaze, we are improving communication and collaboration around design intent, enabling more efficient and sustainable projects. Follow us on this journey as we outline how Extended Reality technologies are revolutionizing the way the construction industry operates today, tomorrow, and for decades to come.