This talk will be an overview of the different components needed to build the future of spatial computing. First, we'll dig into how computers perceive the world and the current state of sensor-based tech. Then we'll dive into which event types are most useful or specific to AR. Spatial computing requires a reassessment of how humans interact with computers, so this lecture will also cover inputs and modalities, from button presses to facial sentiment analysis.
Finally, we'll close with how to get to that future, covering both developer tools and the current state of hardware in 2019.
Whether small toys or large building facades, object recognition and tracking allow developers to interact with real-world elements by layering AR content through a mobile device or smart glasses.
In this session, Wikitude CTO Philipp Nagele will explore:
- how AR experiences can benefit from working with recognizable objects
- what steps are required to integrate your 3D objects into an AR experience
- the technical challenges to solve for stable, real-time object tracking
- the differences between smartphone-based AR and smart glasses AR
The Reality Virtually Hackathon at the MIT Media Lab is the largest spatial computing hackathon in the world, spanning five days of learning, ideation, designing, and building. It extends the definition of a hackathon to prototyping at scale: Reality Virtually is designed to crowdsource new ideas and innovation, apply XR to new fields, and impact participants' lives through new jobs and new companies.
Imagine being able to view and interact with 3D objects in your environment in real time, directly in your mobile browser. No app required. AR for the web is here, and we’re pretty excited about it.
8th Wall’s AR Camera allows for prototyping and sharing of AR content on the mobile web by uploading 3D models. No coding required.
Come learn about the latest developments at 8th Wall and see how to create both custom Web AR experiences and AR Camera projects in a matter of minutes.
Most software developers today have “mobile-first” tattooed somewhere on their bodies because we humans currently explore our world primarily through portable devices. While the internet has caught up with this behavior, the physical world has not. Street signs, airports, and TVs could all do a better job of delivering mobile and digital moments.
Geenee helps mobile devices see the world around them through lightweight and robust image recognition that triggers contextual content at the moment of relevancy, from ticket sales and film trailers to augmented reality stories and storefronts.
Geenee provides rich mobile results through a scalable, web-based platform with no downloaded app required. We’re working to make accessing contextual content directly from the world around you easy, rewarding, and meaningful.
Applications for nreal glasses can now be developed to run directly from your mobile device, powering not only immersive AR and MR apps but your native Android ones as well. Content delivered through glasses has never been more accessible to consumers.
Kicking off with the ol’ hello world, we will explore ways to migrate mobile AR and native Android apps to nreal’s MR environment.