09:30 AM - 09:55 AM
Miniaturization is the trend toward manufacturing ever-smaller mechanical, optical, and electronic products, medical devices, and other high-value parts. This trend continues to be strong, with year-over-year growth in many markets. One of the limiting factors to miniaturization is the inability of traditional manufacturing methods like injection molding and CNC machining to produce ever-smaller parts effectively and economically.
Additive Manufacturing (AM), or 3D printing, has been around for over 30 years. For a long time, only a few technologies were available, and applications were generally limited to prototyping. Past advances in AM fell short of meeting the needs of small parts: printing them at the resolution, accuracy, precision, and speed that would make them a viable option for end-use production parts. That has all changed. Additive Manufacturing and Miniaturization are now converging, in a very meaningful and impactful way.
The growth of the AR/VR market and the pace of innovation are opening up applications that were not even imagined a decade ago. With this come challenges for manufacturing to scale at the same pace. Many leading AR/VR technology companies have started using micro 3D printing to produce various micro-precision components as an alternative to traditional fabrication methods, finding substantial time and cost savings.
Learn how micro-precision 3D printing not only enables companies in this competitive space to address development challenges imposed by current microfabrication methods, but also allows them to push the limits of miniaturization, expanding boundaries otherwise thought impossible with 3D printing.
11:00 AM - 11:25 AM
Come learn about what AWS is doing in spatial computing, and specifically how AWS is helping people bring their 3D models to life in the cloud.
This talk will feature a live, on-stage demonstration showing how a Magic Leap 2 can be paired with the Spot robot to help the user visualize live robot telemetry and health, and to control the robot as it scans and operates in harsh or remote environments. Spot will come out on stage, controlled live via the Magic Leap 2.
This talk will also cover new spatial computing services on AWS that help you build, deliver, and manage all of your spatial workloads.
11:30 AM - 11:55 AM
Digital Twins connect physical systems with virtual representations and models, enabling visualization, integration of sensor data, and prediction of how assets or processes will behave in the future. Globally, organizations are grappling with the acceleration of remote operations and an increasingly prevalent skills gap. Forward-looking companies are addressing this problem by equipping their in-person, hybrid, and off-site teams with mixed reality (MR) solutions that enhance productivity, especially when integrated with Digital Twins.

In this session, learn how Sphere and AWS are working together to develop a digital workspace that enables professionals to communicate across languages, distances, dimensions, and time (with predictive capabilities). By partnering on initiatives that include AWS IoT TwinMaker and Amazon Lookout for Vision, as well as cloud rendering powered by AWS and its partners, Sphere’s cutting-edge solution is pioneering the future of collaboration, expert assistance, and workflow guidance using MR.
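To make the core idea concrete, here is a minimal, illustrative sketch of the digital-twin pattern the session describes: a virtual object mirrors a physical asset's sensor stream and extrapolates its future behavior. This is not the AWS IoT TwinMaker API; the `PumpTwin` class, the `pump-7` asset ID, and the naive linear forecast are all hypothetical simplifications.

```python
# Illustrative digital-twin sketch (NOT the AWS IoT TwinMaker API).
# A twin mirrors a physical asset's telemetry and offers a simple
# predictive capability over that history.

from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    asset_id: str                                   # hypothetical asset name
    temps_c: list = field(default_factory=list)     # ingested sensor readings

    def ingest(self, temp_c: float) -> None:
        """Mirror the physical asset: record the latest telemetry sample."""
        self.temps_c.append(temp_c)

    def predict(self, steps: int) -> float:
        """Naive linear extrapolation of the most recent temperature trend."""
        if len(self.temps_c) < 2:
            return self.temps_c[-1] if self.temps_c else 0.0
        slope = self.temps_c[-1] - self.temps_c[-2]
        return self.temps_c[-1] + slope * steps

twin = PumpTwin("pump-7")
for t in (60.0, 62.0, 64.0):   # telemetry trending upward by 2 °C per step
    twin.ingest(t)
print(twin.predict(3))          # extrapolates 3 steps ahead: 70.0
```

A production twin would replace the linear extrapolation with a physics or machine-learning model and stream telemetry from real sensors, but the mirror-then-predict structure is the same.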
01:55 PM - 02:20 PM
Construction is a complex and messy process, combining millions of individual components - all needing to be positioned in exactly the right place with mm-level accuracy. Yet the construction process has remained largely unchanged for thousands of years. A group of people design a building, listening to every wish and elaborate desire of the owner, and then hand the design over to thousands of other people to actually build it, without any understanding of how the design fits the real world. It’s like building a massive jigsaw puzzle where thousands of people are each responsible for one piece, and no one really knows how they all fit together. This waterfall process leads to building things up only to tear them down immediately afterward and build them right back again - just moved over by 1 ft - something the construction industry spends over $280B per year doing. This is simply not sustainable: for the industry, for its stakeholders, and most importantly, for the planet. With the construction industry contributing nearly 40% of the world’s carbon emissions, transformation is desperately needed.
And that’s exactly what Trimble is working to do. As a leader in high-accuracy positioning technologies, Trimble has a long-standing history of bringing precision to the construction industry - helping fit all those puzzle pieces together. But we see the opportunity to do more. Since 1997, when we filed our first XR patent, Trimble has been transforming the way the world works by connecting the physical and digital worlds. Now, we’re working to change this archaic narrative by empowering everyone in construction to visualize, interpret, and act on the design through Augmented and Mixed Reality technologies in the field. From catching design mistakes by viewing 3D models on any worker’s iPad, to controlling the most precise Total Station with nothing more than your gaze, we are improving communication and collaboration around design intent, enabling more efficient and sustainable projects. Follow us on this journey as we outline how Extended Reality technologies are revolutionizing the way the construction industry operates today, tomorrow, and for decades to come.
02:25 PM - 02:50 PM
The JPEO-CBRND partnered with MRIGlobal and ForgeFX Simulations to produce the CBRND HoloTrainer, a networked, multiuser Microsoft HoloLens 2 augmented reality training simulator for Chemical, Biological, Radiological, and Nuclear detection device operators. This presentation will cover the entire project lifecycle, from government requirements through the development of holographically projected interactive virtual equipment, culminating in deployment to soldiers. The CBRND HoloTrainer is a groundbreaking spatial computing application that significantly increases the effectiveness of its users.
02:55 PM - 03:20 PM
Port 6 creates hardware and software to enable efficient, robust, and intuitive AR interaction. In this talk, they will showcase how these solutions can work as standalone input modalities or be combined to supercharge eye tracking and hand tracking.
User input is one of the many challenges standing in the way of the mass adoption of AR. How will the everyday person interact in Augmented Reality? At Port 6, we research different ways for people to interact in AR and develop the best hardware to detect these interactions efficiently and reliably.
Currently, many AR input methods rely on the headset's built-in sensors: hand tracking, eye tracking, and voice input. However, if we want AR input to be as transformative as the graphical user interface or the capacitive touchscreen, we need to put deliberate thought into building the ideal input paradigm, along with hardware that may not reside in the headset itself.
At this talk:
• We’ll demonstrate how a machine learning algorithm on existing smartwatches can already significantly improve AR interaction.
• We’ll show how it can be combined with eye tracking and hand tracking sensors in the headset to improve interactions even more.
• Lastly, we'll show some of our future custom hardware dedicated to sensing advanced micro-gestures in a small and convenient form factor.
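The smartwatch-based gesture detection in the first bullet can be sketched at a high level. Port 6's actual models are not public, so the following is a hypothetical, minimal illustration of the usual wearable-sensing pipeline: window the accelerometer signal, extract cheap features, and apply a lightweight classifier. The `Window` type, the `is_pinch` function, and the 0.5 threshold are all invented for this sketch.

```python
# Hypothetical sketch of wrist-worn micro-gesture detection (not Port 6's
# actual algorithm): a short burst of accelerometer variance, above resting
# noise, is treated as a "pinch" event.

from dataclasses import dataclass
from math import sqrt

@dataclass
class Window:
    samples: list  # accelerometer magnitudes (g), one value per sample

def features(w: Window) -> tuple:
    """Mean and standard deviation of the window: crude but cheap to compute."""
    n = len(w.samples)
    mean = sum(w.samples) / n
    var = sum((s - mean) ** 2 for s in w.samples) / n
    return mean, sqrt(var)

def is_pinch(w: Window, energy_threshold: float = 0.5) -> bool:
    """Classify the window: a pinch shows up as variance above resting noise."""
    _, std = features(w)
    return std > energy_threshold

# Resting wrist: signal hovers near 1 g of gravity with tiny jitter.
resting = Window([1.0, 1.01, 0.99, 1.0, 1.02, 0.98])
# Pinch: a brief high-energy transient in the same window length.
pinch = Window([1.0, 1.8, 0.2, 1.9, 0.1, 1.0])

print(is_pinch(resting))  # False
print(is_pinch(pinch))    # True
```

A real system would replace the variance threshold with a trained classifier over richer features (and fuse the result with the headset's eye- and hand-tracking streams, as the second bullet describes), but the window-features-classify loop is the standard shape of the problem.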