Agenda

May 30

03:00 PM - 03:30 PM

Description

The medium of 360/VR requires a new taxonomy for discussing point of view (POV). The simple designations of first, second, and third person that we use to categorize books, movies, and video games don't cover all the options and combinations available in this new, immersive medium of virtual reality. Drawing on my experience as a 360/VR content creator, professional screenwriter, and Professor of Virtual Reality Filmmaking, I have created a new taxonomy for VR that allows us to discuss, analyze, and communicate clearly about this medium. I will present -- and explain the applications of -- this taxonomy, which includes four POV tiers: 1) Narrative, 2) Visual, 3) Experiential -- whether the viewer feels invisible, acknowledged, active, or passive -- and 4) Transcendental -- the viewer's relationship to gravity, time, and space.

Speakers

CEO & Creator of Realities, Exelauno
May 30

03:00 PM - 03:30 PM

Description

Many people think of brain-computer interfaces (BCIs) as devices that provide sci-fi superpowers like telekinesis or "The Force." BCIs do not confer psychic powers; they are systems that allow computers to understand the human mind. Obtaining information about an individual's cognitive state is difficult and often subjective. However, imagine if we had direct, quantified, and objective measures of an individual's cognitive state. At Neurable, we do just that. We build BCIs not only so that people can control next-generation interfaces, but also so that we can create experiences that understand and react to the individual -- a future of affective computing. This same information can also be used to add new layers of understanding to training applications, educational programs, wellness efforts, and more.

Speakers

CEO, Neurable
May 30

03:00 PM - 03:30 PM

Description

For years, many have tried to develop a see-through, near-eye display technology that combines beautiful design, excellent image quality and scalable mass production with high yields. At Dispelix, we’ve figured it out. Dispelix is helping product companies to create beautiful AR glasses based on single-waveguide full-color displays with superior image quality and mass manufacturability.

Speakers

CEO, Dispelix Ltd
May 30

03:00 PM - 03:30 PM

Description

Smart devices don't offer enough computing and graphics power to enable high-quality AR experiences, so the AR community invests significant time and money in polygon reduction and optimization of 3D models. In this session you will learn how Holo-Light solves this problem with its own Remote Rendering solution optimized for industrial users.

Speakers

Co-founder, Holo-Light
May 30

03:00 PM - 04:00 PM

Description

Network with your peers and share your session highlights. NOTE: Food and beverages are available for purchase at the following food stands located within the Santa Clara Convention Center: Peet's Coffee Cart, Great America Lobby Food Court, and Mission City Lobby Food Court.

May 30

03:00 PM - 04:00 PM

Description

HTC VIVE's Head of Product Vinay Narayan and Circuit Stream's Lead XR Instructor Usman Mir share the development potential of HTC VIVE's latest eye-tracking SDK: Pro Eye. Discover the possibilities Pro Eye opens for developers, experience enterprise use cases, and learn how to build an eye-tracking app yourself from expert VR instructors.

May 30

04:00 PM - 04:30 PM

Description

Gaming and entertainment technology is evolving rapidly to become more immersive and interactive. Significant investment in 5G infrastructure will enable more content to be delivered to consumers while simultaneously giving those consumers more control and greater detail. Currently, content developers and broadcast/streaming services are working to determine the best container for delivering that content over 5G. MPEG-H is a leading contender because it allows a single audio/video experience to be delivered to a customer base across multiple platforms (e.g. VR/AR, music video, film, streaming, or broadcast). Likewise, MPEG-H can be used for mobile, PCs, the car, and a multitude of connected devices, forever changing the workflows of mixers and broadcasters who currently must produce numerous (and separate) audio mixes for each method of content consumption. Furthermore, MPEG-H enables the end consumer to personalize their entertainment experience like never before: for example, allowing the hard of hearing to adjust the dialog of a sports broadcast independently of the other audio, or letting the super fan remove the commentary altogether. Additional features include playing back the entire soundscape on headphones with high immersion using spatial audio processing, and allowing the HRTFs used on the playback device to be changed for an even more detailed and accurate representation of the audio. In this talk, we will discuss how THX is working with the industry to enable the creation and delivery of next-generation immersive content to enhance the entertainment experience for consumers.

Speakers

VP, Audio Architecture, THX
May 30

04:00 PM - 04:30 PM

Description

In March, Kevin wrote an eye-opening cover story for Wired about the Mirrorworld, the digital twin of the real world. This is an old idea, but now companies like Google, Snap, and Niantic, and startups like 6D.ai, are building parts of the Mirrorworld. They use technology like SLAM -- geolocation plus computer vision -- which enables us to precisely anchor the digital to the physical. The real world will thus become searchable and clickable, making the Mirrorworld the most valuable asset in spatial computing.

Speakers

XR Consultant, Columnist, Author, Forbes
Founding Executive Editor, WIRED
May 30

04:00 PM - 04:30 PM

Description

It takes not only great hardware and apps to bridge our digital and physical worlds, but also the endurance to survive as the market matures.

Speakers

CEO, DigiLens
May 30

04:00 PM - 04:30 PM

Description

We have seen a shift in Life Sciences development from small molecules to highly sophisticated biologics to complex personalized medicine. With that complexity comes the need to streamline human-centric processes and to assure greater reliability, compliance, speed, and safety across all unit operations. Paper procedures lack the ability to provide deep insight or process improvement, and current electronic batch record (EBR) systems are rigid, lacking the contextual understanding that makes augmented-reality-driven systems so successful within complex environments. As a result, these intelligent and spatial systems are being implemented across life science teams from R&D through manufacturing to improve collaboration, communication, and speed. These systems allow organizations to take their existing content, augment it, and execute it on any device, providing a fully traceable system for accessing and storing critical data.

Speakers

CEO & Co-Founder, Apprentice
May 30

04:00 PM - 05:00 PM

Description

nreal applications can now be developed to run directly from your mobile device, empowering not only immersive AR and MR apps but your native Android ones as well. Content delivered through the glasses has never been more accessible to consumers. Kicking off with the ol' hello world, we will explore ways to migrate mobile AR and native Android apps to nreal's MR environment.

Speakers

Product Manager, nreal Technology Ltd.
Software Product Manager, nreal Technology Ltd.
May 30

04:30 PM - 05:00 PM

Description

Microsoft released HoloLens 2 at MWC. I think it is the benchmark for mixed reality devices, but it only reaches the pass mark. To be adopted by ordinary consumers, headsets must be turned into glasses to eliminate consumers' fears of tech equipment.

Speakers

CEO, Shadow Creator
May 30

04:30 PM - 05:00 PM

Description

The future of live experiences such as sports, concerts, and theatre venues is radically changing. The rollout of 5G will trigger a revolution in AR and spatial mapping technologies. Nexus Studios, a leader in immersive storytelling, and AEG, the world's leading sports and live entertainment company, are working closely to enhance live experiences with immersive technologies. By creating a highly accurate digital overlay of the venue that can be accessed via your phone's camera, venues will soon be able to offer an array of new services to customers. In this session we'll explore the opportunities within this digital infrastructure and cover new ways for brands to engage with customers. Together, Nexus Studios and AEG are designing the venue of the future.

Speakers

Head of Interactive, Nexus Studios
Vice President of Strategy, Digital and Analytics for Global Partnerships, AEG
May 30

04:30 PM - 05:00 PM

Description

Sharing reliable information via a virtual platform is key to success in day-to-day work activities. Integrating wearable technology in the right marriage of hardware and software is an all-important first step that determines whether the result is a game changer or a dud. How can we properly mix the two and succeed?

Speakers

Fleet Engineering Innovation and Technology Project Manager, Southern Company
May 30

04:30 PM - 05:00 PM

Description

Come hear Matt Miesnieks, CEO and co-founder of 6D.ai, dive into the latest AR technical challenges 6D is tackling, catch a glimpse of what's to come with the most recent breakthroughs, and learn more about how spatial computing solutions are being applied across businesses, organizations, and entertainment alike.

Speakers

CEO, 6D.ai
May 30

04:30 PM - 06:00 PM

Description

Coming Soon

May 30

05:00 PM - 05:30 PM

Description

Architectural teams at Arrowstreet are on a mission to advance their industry's approach to design. Since establishing an in-house innovation lab, AIR (Arrowstreet Innovation + Research), in 2016, the firm has been leading its industry in collaborative project practices and design visualization. The use of immersive technologies has elevated decision making, streamlined project coordination, and built stronger relationships with clients and end users. This fireside chat features Arrowstreet's President Amy Korté, AIA, discussing the implementation of immersive design practice with Kachina Studer, Product Developer + UX Strategist for the innovation lab, and Kat Schneider, Architectural Designer. The conversation will examine the firm's bold investment in architectural practice and its planning for future cities, and will take a deep dive into one of its most progressive and complex projects in Boston.

Speakers

Product Developer + UX Strategist, Arrowstreet
Architectural + Interface Designer, Arrowstreet
President, Arrowstreet
May 30

05:00 PM - 05:30 PM

Description

In the early nineties, I co-invented the CMOS image sensor at the Jet Propulsion Laboratory in Pasadena. Although it was clear to all of us from the start that this technology would transform the world of digital imaging, industry adoption was initially non-existent. In 1995, I founded Photobit Corporation, obtained an exclusive license for the new technology, and built a team to publicize and commercialize it. Now, two decades later, CMOS image sensors are in every smartphone and digital camera, and they are the enabling technology behind image and scene recognition for the XR industry. In this talk, I will discuss the elements that propel the adoption of disruptive technologies, and will talk about my personal experiences and challenges as a woman and CEO in the world of advanced technology.

Speakers

President, Tap Systems, Inc.
May 30

05:00 PM - 05:30 PM

Description

From face filters to avatars, augmented reality and virtual reality are equipping people with new ways to express themselves and change the way they are perceived by the rest of the world. This panel will discuss how spatial computing is impacting identity and the implications of being able to remix who we are.

Speakers

Social Media & Community Manager, Facemoji
Co-Producer, AWE & Partner, Super Ventures
Software Engineer (AR/VR), Google
Co-founder & CEO, Loom.ai