10:10 AM - 10:35 AM
The desire to innovate in how the organization tells stories is part of the USA TODAY NETWORK's DNA. Since 2018, USA TODAY's Emerging Technologies team has developed over 40 highly engaging, interactive augmented reality (AR) experiences exploring complex, historical, and trending news topics. Recognizing an opportunity to improve the core user experience within the app's framework, the team leveraged engagement analytics and external user testing to develop a robust, intuitive user onboarding sequence, resulting in a 3x increase in average time spent per user.
A few media-rich AR experiences that include the updated framework are “Seven Days of 1961: A Dangerous Ride on the Road to Freedom,” “Champions of Women's Suffrage,” “Accused: The Impending Execution of Elwood Jones,” and “Attica: A Prison Uprising 50 Years Later,” among others. The Emerging Technologies team has also implemented accessible designs to meet its audience's growing expectations for new and inclusive forms of premium interactive experiences.
Will Austin, Senior Designer of Emerging Technology, will explore the evolution of UI/UX to foster a deeper understanding of AR among casual consumers. He will also share insights and case studies on USA TODAY's onboarding challenges, its new instructional design, and its vision for the future of premium interactive editorial experiences.
01:00 PM - 01:55 PM
02:00 PM - 02:25 PM
The increasing popularity of XR applications is driving the media industry to explore the creation and delivery of new immersive experiences, while pushing engineers and inventors to address the challenges of capturing and manipulating real video content. Against this backdrop, the talk will discuss these challenges and introduce key technologies that leverage open standards to enable large-scale distribution of new immersive experiences.
A volumetric video comprises a sequence of frames, each of which is a static 3D representation of a real-world object or scene captured at a different point in time. Volumetric video is bandwidth-heavy content that can be presented as dynamic point clouds, multi-view plus depth, or multi-plane image representations.
These bandwidth requirements can be met through dedicated compression schemes that produce data rates and file sizes that are economically viable for the industry. Standards play a crucial role in ensuring interoperability across these different types of content and experiences, and this talk will spotlight the Moving Picture Experts Group (MPEG) Visual Volumetric Video-based Coding (V3C) standard as an open-standard solution for efficient streaming of volumetric video.
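As a rough illustration of why dedicated compression schemes matter, the sketch below (not from the talk; the 15-bytes-per-point layout and frame counts are illustrative assumptions) models a volumetric video as a sequence of point-cloud frames and estimates its raw, uncompressed data rate:

```python
from dataclasses import dataclass

@dataclass
class PointCloudFrame:
    """One static 3D snapshot: N points, each with XYZ position and RGB color."""
    num_points: int
    bytes_per_point: int = 15  # assumed: 3 x 32-bit float position + 3 x 8-bit color

    def size_bytes(self) -> int:
        return self.num_points * self.bytes_per_point

def uncompressed_mbps(frames_per_second: int, points_per_frame: int) -> float:
    """Raw data rate in megabits per second for an uncompressed point-cloud sequence."""
    frame = PointCloudFrame(points_per_frame)
    return frame.size_bytes() * 8 * frames_per_second / 1_000_000

# A modest capture: 1 million points per frame at 30 fps.
rate = uncompressed_mbps(30, 1_000_000)
print(f"{rate:.0f} Mbps uncompressed")  # prints "3600 Mbps uncompressed"
```

Several gigabits per second for even a single modest object is far beyond consumer networks, which is precisely the gap that codecs such as V3C are designed to close.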
To enrich XR experiences, inventors and industry must work in sync to create and distribute additional media and interaction mechanisms alongside the volumetric video components. In fact, the MPEG-I Haptics and MPEG-I Scene Description standards are currently under development and will soon propose solutions to the industry.
From the perspective of real content creation, it remains challenging but crucial to enable technologies that cover 3D capture, calibration, depth processing, format conversion, transmission, and rendering. This session will explore two creation pipelines that leverage the aforementioned standard codecs: (1) a real-time pipeline for telepresence using depth and color cameras; and (2) an offline pipeline for sports and media applications using color cameras and prior geometric information.
02:30 PM - 02:55 PM
John Gaeta is the legendary creator best known for his Academy Award-winning work on the original Matrix trilogy and for creating Bullet Time. John, now Chief Creative Officer of Inworld AI, will talk alongside Chief Product Officer and Inworld co-founder Kylan Gibbs about their incredible new concept demo. Called ‘Origins’, the demo allows players to take on the role of a detective in a sci-fi world. Using your mic, you’re tasked with questioning witnesses, uncovering the narrative, and cracking the case. All of the NPCs use generative AI to create dynamic conversations, meaning you can ask them whatever you like and they will react accordingly, whilst staying in character. John and Kylan will give you an insight into this fascinating new technology, set to revolutionize games and immersive media.
04:05 PM - 04:30 PM
The roots of current-day VR and AR technology date back to pioneers in military, healthcare, and gaming innovation. Those initial steps toward letting humans interact with real-world visualizations drove the development of the technology and architecture that formed the basis of what we now see in movies, games, and the many virtual scenarios in which we engage every day. Milestone movies like Iron Man and Minority Report were inspired by these same early innovators, and these movies are now the inspiration for our future interactions with technology. Today, we are on the cusp of a revolution that will redefine how we communicate and interact with others in ways both small and large. Data visualization, AI, and real-time data processing are well on their way to changing our lives. This presentation will explore how VFX artisans, technologists, and programmers have influenced and are currently redesigning this future.
04:35 PM - 05:30 PM
Producing VR content is a hard problem to solve: studios invest in multi-year efforts to produce a single title. The technical requirements, new design paradigms, and a relatively small market size make it difficult to scale content production for VR. With headset adoption on the rise, now is the time to scale content, and the solution is VR tools that leverage the power of the medium itself.
Topics led by panelists to explore include:
• Insights from the companies that build these tools, and their stories about working with the next generation of content creators.
• What can businesses and industries do to adopt VR creation?
• How do we get “YouTube” level content creation by enabling at-home creators?