In 2019, the USA TODAY NETWORK team set an ambitious goal: produce a dozen emerging tech projects in a single year. In this case study presentation, Ray will dig into the AR projects the NETWORK has released so far and best practices for creating AR experiences that stick with readers. Drawing on his experience in gaming, Ray will lay out the process by which his team approached a variety of AR initiatives for the NETWORK, from the 321 LAUNCH app bringing real-life rocket launches to life to the interactive features launched alongside the podcast The City, showing the devastating effects a scandalous six-story-tall pile of waste had on Chicago in the '90s. Most recently, he and his team worked closely with the Fashion Institute of Design & Merchandising, collaborating with the 27th Annual Art of Motion Picture Costume Design team and its engineers to shoot each set of Oscar-nominated costumes, giving readers an immersive, up-close look at the award-worthy designs. These are just a few examples of how Ray and his team are creating sticky AR experiences for news audiences; in this presentation he will dig into the technical, creative and journalistic requirements for successful AR activations in media.
The optimization of product placement and assortment in-store and on-shelf helps maximize sales and reduce costs. Using mobile VR with embedded eye-tracking, Accenture teamed with Qualcomm and Kellogg's to reinvent how brands and retailers gather consumer data and perform research, more rapidly, more affordably and at larger scale. Kellogg's was launching Pop Tarts Bites and needed market data to determine placement, assortment and promotion. Using Qualcomm's reference headset powered by the Snapdragon 845 with embedded eye-tracking, and software from InContext Solutions and Cognitive3D, Accenture Extended Reality (XR) developed a pilot VR solution that immersed consumers in a full-scale simulated store, enabling them to move through the space, shop, pick up products and place them in carts, all while tracking what consumers were looking at, for how long and why. The pilot demonstrated a high degree of correlation between the VR test results and those of traditional testing methods. More importantly, eye-tracking provided deeper behavioral data that led to a different merchandising conclusion: placing Pop Tarts Bites on a lower shelf, which increased total brand sales by 18 percent during testing.
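The validation step described above, checking how closely VR test results track traditional research, comes down to a simple correlation of paired measurements. As a minimal sketch with purely hypothetical numbers (the case study does not describe the actual analysis pipeline), a plain Pearson correlation suffices:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-shelf-position pickup rates measured by the two methods.
vr_results = [0.12, 0.08, 0.15, 0.10, 0.05]
in_store_results = [0.11, 0.09, 0.14, 0.10, 0.06]

# A coefficient near 1.0 would indicate the VR test tracks the
# traditional method closely, as the pilot reportedly found.
r = pearson(vr_results, in_store_results)
```

A value of `r` well above 0.9 on such paired data is the kind of evidence that would support the "high degree of correlation" claim.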
Education is on the cusp of a technology revolution that will transform teaching and learning into an experience-based approach. At Facebook, we are creating virtual reality hardware and immersive experiences to pioneer this movement. This talk will cover how VR is the next step in the natural evolution of computer-based instruction and how we are redefining the way we teach and learn by developing industry-leading content, virtual collaboration and creation tools. We believe that virtual reality has the power to transform education and turn classrooms into 21st-century, technologically advanced places of learning. The technology and content will not only increase engagement and knowledge retention but also give students equitable access to education regardless of distance and turn them into innovative creators. After all, our generation built the technology, but it's going to take the next generation to make it great!
In this fireside chat, we'll dive deep into the Niantic Real World Platform. What problems is it trying to solve? For whom? What types of experiences will be unlocked? How does it compare with competing efforts from Apple, Google, or 6D.ai? What impact could Harry Potter have on the industry?
Moderated by Kai Frazier, Founder & CEO, Curated x Kai. Afrofuturism imagines a future that values Black lives and social justice through the creative power of Black culture, music, film, literature and science fiction. Coined in 1994 by the cultural critic Mark Dery, its roots go back decades and include jazz musician Sun Ra, the band Parliament-Funkadelic, and writers Octavia Butler and Samuel R. Delany. As the next generation of artists like Janelle Monáe (Dirty Computer) and filmmakers Ryan Coogler (Black Panther) and Boots Riley (Sorry to Bother You) immerse audiences in a hyper-real Black experience, our panelists will explore Afrofuturism today and how it impacts AR, VR and spatial design.
We are in a transition from "see what you see" to "understand what you see." This session will touch on topics including: 1. Applications that recognize the objects in your view (image captioning, object recognition, landmark recognition). 2. Applications that understand your surroundings: ARKit, ARCore, and technology built upon them. 3. Assisting people's social interactions (face recognition, facial expression recognition, voice recognition). 4. Industry vertical applications in AR. 5. Limitations on the deployment of AI applications.
Technology support service providers typically maintain thousands of products in order to deliver the highest possible level of service to clients. Supporting such a broad portfolio requires a large technical workforce trained across a wide range of products. With product portfolios growing and technicians under pressure to cover multiple domains equally effectively, training becomes a major bottleneck for service providers. In this talk, we describe our experience building and deploying a scalable AR-driven solution to address this problem.
In recent years there has been huge growth in interest in watching people play games, both through streaming services and at esports events, to the extent that it is now possible to get college scholarships for participating in these events. At the same time, technologies such as virtual reality, augmented reality and tracking systems have seen rapid improvements in quality and adoption. At PlayStation's Magic Lab at Sony Interactive Entertainment, we have been exploring ways to leverage these technologies to create novel spectating experiences. We believe there are many opportunities to make spectating a more dynamic, interactive experience, increasing engagement for both players and spectators. This session will cover some of the exploration we have conducted in this space.
Two use cases running on the Insider Navigation platform: 1. Schiphol Airport: maintenance without prior knowledge of the venue; managing assets in real time becomes faster, more accurate and more efficient. Technicians can quality-rate assets and report defects, all of this information is linked to any database system, and work orders can be put in place quickly. 2. US Navy: large vessels are hard to navigate. In an emergency, items such as pumps and the engine have to be found immediately, along with their real-time values and manuals for resolving issues.
USC's JOVRNALISM has combined photogrammetry, 360, audio and AR -- and worked within Snapchat Lens Studio's limits -- to create interactive, immersive experiences around homelessness, immigration and more. This session discusses how the technology -- which is meant for lightweight content -- can be used for serious, non-fiction storytelling. (One example: http://bit.ly/JOVRNALISM-HR-01)
This session will provide insights and a basis for further discussions on how technologies like Augmented and Virtual Reality are being tested and explored in the healthcare sector.
Millions of people live in fear of their job being taken by a robot, but the truth is that autonomous industrial robots have not yet lived up to their initially inflated expectations. Without next generation enterprise software that uses MR to connect people with robots and AI-driven task optimization, robotics vendors will never unlock the complexity and scale needed to drive ROI for their customers and investors. Join this session to learn how connecting technology-augmented human workers with robots is the key to unlocking the potential of autonomous robots.
Charity Everett's hands-on tutorial teaches creators the processes and tools they can use to pair traditional storytelling techniques with AR data visualization, opening hearts and minds to new perspectives that bring people together. By pairing art, storytelling, edutainment and virtual beings, the session shows how to build a seamless landscape for the future of spatial computing, one that lets you leap between networked WebXR-based interactive projects and spatial information sites such as Wikipedia.
• Current state of VR/AR in esports. • The current player base for VR/AR esports and how to target excited fans. • Which games are best suited for VR/AR esports? • How do we start to build decent-size prize pools? • Esports events that were well produced for VR/AR. • The future of esports in the VR/AR field.
Edge compute and 5G are coming fast, and separately and together they will change the way immersive applications can function. Learn how multiple telcos are working to enable device manufacturers and service providers to address significant technical challenges in bandwidth, location positioning, AR cloud deployment, multi-user collaboration, computer vision services and more, all while users operate with true mobility.
30 years ago we began working with 3D visualization to support product design review and analysis. 3D computer graphics was a struggle in 1989. There was no general-purpose software available, and the computing hardware had serious performance limitations. This early work led to the development of a practical, large-scale virtual reality system in the early 1990s. This system was used as the electronic mockup to support the design and construction of the Virginia Class submarine. In the late '90s we began to look at mobile computing and wearable computing. Before the ubiquity of cell phones and the advent of smartphones, mobile and wearable computing was hard. We used a series of Palm Pilot and PocketPC devices, as well as some unique, purpose-built computing devices. Around 2000 we also started to work with wireless networking. This was before the term WiFi was in use. We were using PCMCIA cards running something called "WaveLAN". We looked at amplified signals for connection to planes and boats, and we looked at area connectivity inside of ship hulls. In the early 2000s we worked on the development of a virtual reality system for welder training. This was an early mixed reality installation with tracked physical hardware and a VR display. In the late 2000s we started developing applications in the area of computer vision using OpenCV with digital SLRs and FireWire digital video cameras. The final piece in the XR puzzle came with our investigation of AR in 2013. We worked with early SLAM implementations (PTAM) and various toolkits: KinectFusion, Metaio, ARUCO (markers), and finally Google Tango. We developed some early Tango applications for electrical wiring and cable installation. This led to a prototype application for overlaying product model geometry on the work surface as a replacement for paper drawings. We have been using this in limited-scope production.
We will show some examples of how our AR installation tool is being used in final production of end products.
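At the heart of the marker-based overlay described above is a planar mapping between work-surface coordinates and the camera image. As an illustrative sketch only (not the team's actual implementation), once four marker corners have been detected, a homography can be estimated with a minimal direct linear transform in NumPy and used to draw model geometry into the image:

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Estimate the 3x3 homography mapping src_pts -> dst_pts (DLT).
    Expects four or more point correspondences as (N, 2) arrays."""
    A = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A: the right singular
    # vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize scale (assumes H[2,2] != 0)

def project(H, pts):
    """Apply a homography to (N, 2) points, returning (N, 2) points."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    out = pts_h @ H.T
    return out[:, :2] / out[:, 2:3]

# Hypothetical example: a 10 cm square marker on the work surface,
# with its corners detected at these pixel locations.
marker_m = np.array([[0, 0], [0.1, 0], [0.1, 0.1], [0, 0.1]], float)
marker_px = np.array([[320, 240], [420, 238], [424, 338], [318, 342]], float)
H = estimate_homography(marker_m, marker_px)

# Any point of the product model (in surface coordinates) can now be
# overlaid at the right pixel, replacing the paper drawing.
overlay_px = project(H, np.array([[0.05, 0.05]]))
```

In a production pipeline the corner detection itself would come from a marker library such as ARUCO, and a full pose estimate would be used for non-planar geometry; this sketch covers only the planar case.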
Social VR platforms like Facebook Spaces can help community builders and purpose-driven organizations innovate the way they engage their members and bring people together. Find out how ARVR Women uses social VR to elevate the visibility of their members and add value to the group.
As technology constantly evolves, so does the way we integrate it into our workplaces. What does the future hold? What innovations will we need to keep up, and how will this impact our future workforce and the way they engage and train? In this session, we will look at virtual reality and how it impacts our daily workplace and future success, and discuss how it can help solve current industry challenges, like the skills and labor gap.
BREAK – Visit the AWE XR expo hall to interact with the latest and greatest in the industry. NOTE: Food and beverages are available for purchase at the following food stands located within the Santa Clara Convention Center: Pete’s Coffee Cart, Great America Lobby Food Court, Concession Stand C (located inside the exhibit hall) and Concession Stand A (located inside the exhibit hall).
This session will cover how sound is an integral part of spatial experiences, from enhancing immersion and reducing cognitive load to, at times, replacing the need for visuals. Advances in hearables will support a new wave of creation using audio to augment reality.