Niantic is empowering creators to seamlessly and realistically bring virtual content into the real world with the new Lightship ARDK. Amanda Whitt, Product Manager for Lightship ARDK, and Pablo Colapinto, Head of Immersive at Nexus Studios, will share insights into how the Nexus team used Niantic’s Lightship ARDK to create the sample app developers can use to quickly learn its features and discover what’s possible with the ARDK. With semantic segmentation, meshing, and multiplayer tools, you can bring your vision for more dynamic and engaging digital experiences to life with AR.
Snapchat Lenses have taught a whole generation how to use augmented reality to express themselves, learn about the world, and even get things done. But behind the Lenses are Snap’s powerful tools to build AR experiences and a vibrant, global community that’s pushing the boundaries of what’s possible in AR. Join Snap AR leaders as they discuss how AR creators, developers, and partners can tap into the power of Snap’s camera products to build the next generation of AR.
Learn more about Unity’s AR companion app and how it helps AR creators speed up their workflows. We’ll touch on how to capture real-world data, create in the context of where the content will be experienced, and import your data into the Unity Editor.
OpenXR has transformed the way developers deliver cross-platform AR/VR experiences. As a royalty-free open standard from the Khronos Group, OpenXR gives applications high-performance access to AR and VR devices through a single, common API that supports a number of different platforms. Containing everything an application needs to drive XR devices in a system, OpenXR helps applications run on a multitude of platforms and gives hardware manufacturers access to a broader range of pre-existing content. In this session, Brent Insko, Lead XR Architect at Intel and OpenXR Working Group Chair, is joined by Antoine Bezborodko, Executive Producer at Asobo, who will talk about developing Microsoft Flight Simulator VR, what motivated the team to adopt OpenXR, and the value it brings to users. Brent will share recent industry developments for OpenXR, explain how XR developers can integrate OpenXR into their workflows, and share updates about expanded augmented reality support that is under development. Brent and Antoine will also highlight the role OpenXR plays in bringing gaming, entertainment, and even enterprise XR developers to common ground. Developers will leave with a fresh understanding of how to leverage OpenXR to create cross-platform XR experiences.
An overview of the multi-year development of AR in Google Maps and how Live View evolved into its current state. This session will cover the technology and data powering Live View, along with lessons learned from the global deployment of AR.
Our future Metaverse is a shared 3D experience of the physical world and our many digital worlds. Naturally, the physical world is spatial and distributed. This spatial distribution means that physical objects and spaces are self-evident, and each of us can independently explore the world’s diversity. This PTC Reality Lab talk will explore a UI and platform vision for empowering the future builders and explorers of the physical world.
The ISAR SDK (Interactive Streaming for Augmented Reality) is a developer tool that enables remote rendering and real-time streaming of entire XR apps from cloud or on-premises infrastructure to mobile XR devices. This approach breaks through the performance limitations of mobile XR devices (HoloLens 2, Oculus Quest 2, iOS/Android, …) and allows developers to enrich their apps, or even create entirely new experiences, with high-quality content. More than 100 enterprises already use applications built with ISAR technology. In this session you’ll learn more about the ISAR SDK and its benefits, and see a live integration demo.
Covering the revolution in mobile scanning with the recent addition of depth sensors to many mobile devices, Laan Labs talks about creating a capture pipeline for consumers and enterprise alike. They share a few case studies of how the combination of LIDAR & computer vision is opening up new opportunities for industries like construction/BIM, shipping, GIS, and everyday consumers.
The web is a powerful place for digital content. Today we expect websites to provide optimal experiences across various screen sizes including mobile, tablet and desktop computers. But as the web evolves from 2D to 3D, it is expected to support new types of experiences including AR and VR and a new range of devices such as HMDs. As we enter the era of Spatial Computing, what will the future of responsive web design look like? And what role will AR and VR play in the future of the browser? Join Rigel Benton, Lead Product Designer at 8th Wall, the leading development platform for web-based reality content (AR/VR), as he discusses The New Responsive Web.
This session is the culmination of a workshop series back in August designed to forge partnerships and an open community for building the AR Cloud together. Development teams have been collaborating with a group of AR Cloud technology partners since the beginning of September to build experiences that foster collaboration, bridge social divides, and deepen relationships. In this session you will have the opportunity to see the projects that our illustrious panel of judges selected as finalists in October, then witness them making the final call for best in class across a number of categories. The qualifying experiences are built using AR Cloud technologies that incorporate some combination of:

- Persistent content that is always accessible
- Real-time multi-user interaction
- Instant, ubiquitous localization
- Geometric understanding of the world that allows real-world interaction

For details and to learn about the technology partners visit: https://www.awexr.com/usa-2021/ar-cloud-challenge