2 Oct 2025 | Mike Boland
AWE Talks: Tips & Tactics for Building Lenses for Snap Spectacles
AWE USA 2025

Welcome back to AWE Talks, our series that revisits the best AWE conference sessions. With AWE USA 2025 concluded, we have a fresh batch of session footage to sink our teeth into for weeks to come. 

We continue the action this week with a deep dive on Snap's Lens Studio and its orbiting set of tools for building experiences for Spectacles. What does that toolset look like today, and how is it evolving?

See the summarized takeaways below, along with the full session video. Stay tuned for more video highlights each week and check out the full library of conference sessions on AWE’s YouTube Channel.

Speakers
Bobby Murphy, Snap

Key Takeaways & Analysis

– Snap sits in a unique spot in today's AR competitive landscape.
   – It's the only player launching full-fledged consumer AR glasses in 2026.
– To pause and define that: we're talking full AR with scene understanding.
   – This means digital elements interact dimensionally with physical scenes.
– Others such as Meta are launching AR glasses but not full AR (yet).
   – For example, Meta Ray-Ban Display Glasses offer a flat HUD interface. 
– Meanwhile, the devices that do achieve full AR today are enterprise focused. 
   – Any other consumer full-AR glasses (Apple, Meta) are planned for ~2027.
– So that makes Snap uncontested in consumer full-AR glasses in 2026.
   – That alone gives Snap an edge in attracting developers to its platform.
– Snap also makes it easy by housing Spectacles development in Lens Studio.
   – This gives it a familiar environment and less friction for existing lens creators.
– Besides these benefits, what else does Lens Studio offer in terms of capabilities?
   – One key feature, according to Murphy, is standalone/untethered hardware.
      – This liberates consumer movement and, in turn, developers' creativity.
   – Other platform features include Spatial Simulator for real-time UX testing.
   – The asset library meanwhile offers a range of 3D elements to get started.
   – The Spectacles Interaction Kit provides defined interaction mechanics.
   – Sync Kit anchors objects in shared spaces for multi-user experiences.
   – Snap ML lets developers bring machine learning models into Lens Studio. 
   – Lastly, OpenAI and Gemini integrations unlock multimodal AI for lenses.
– Using these tools, Snap built a POV experience that tracks your basketball stats.
   – Enklu's Sightcraft offers an interactive game in real-world spaces.
   – PlayCanvas lets users interact with high-fidelity 3D models in WebXR.
   – We'll see similar work as developers flock to the platform and get creative. 

For more color and depth, see the full session below... 

  Want more XR insights and multimedia? ARtillery Intelligence offers an indexed and searchable library of XR intelligence known as ARtillery Pro. See more here.  
