Ulta Beauty, one of Fast Company’s most innovative retailers of 2020, is a proud leader in the AR space. The retailer acquired two startups as part of its strategic approach to AR and continues to invest in its talented, in-house Digital Innovation team. In this session, learn more about the nation’s largest beauty retailer’s bold, engaging AR journey, its challenges and strategies to scale, and a look at what’s next.
The central location of the book "Ready Player One" is the OASIS, a fully embodied VR metaverse that began as a game but was so compelling, so useful, and so fun that everyone - every person, every business, every institution, and even every game - had to have a presence there. Sort of like the formative Internet of the 90s. That is fiction, of course, but Mark Zuckerberg says we will soon regard Facebook as a "metaverse company," and Epic Games (maker of Fortnite) announced that its most recent funding round of one billion dollars will be spent building a metaverse. But how shall we define it? Who else is building - or has built - a metaverse? What technology does a metaverse require? What might its terminal form be? On which platforms - AR, VR, mobile - will these virtual, social spaces manifest, when, and how?
Mixed Reality (XR) offers an opportunity to catalytically change how humans operate in the world, enabling understanding and action never before possible. At the center of this potential are the challenges and opportunities of the human-machine interface and the ability to use these systems in real-world scenarios. Specifically, the challenge entails how (and when) to integrate XR appropriately and generatively into the Perception-Action loop during real-world tasks. The DoD industry base, like many other industries, is investing heavily in XR across many dimensions. The implementation of XR technologies on the individual Soldier entails a complete and robust understanding not only of the optical-mechanical systems themselves but, more importantly, of the challenges of implementing such a device on the human system. Dr. Palmer will speak to these challenges and opportunities, as well as their portability to other industries looking to bring increased performance, efficiency, or joy to the users of these devices. Topics will include Data to Decision (D2D) in Context, Architectural Requirements and Trades when implementing on Human Systems, the challenge of Cognitive/Attentional/Physical Load, and Management of Local (XR) – Global (Real World) relations in different contexts.
As AI and machine learning advance, human augmentation becomes less dystopian and more utilitarian. There is definite trepidation about the increase of machines in the workforce: many fear for their livelihoods, imagining robotic overlords and other such visions of what a robot/AI future would look like. Some roles may disappear; others will emerge, bringing employment to more people. Recent global disruptions, shifting marketplaces, and other unexpected situations have magnified the need for flexibility and new ways of working. Technology will further revolutionize its relationship with people, powering us toward greater productivity. As organizations gradually transition out of ‘survival’ mode and review what lies ahead, the experience and lessons learned from 2020’s biggest remote-working experiment show us that the best world of work is ours to create, if we are willing to adapt. Leaders must address the significant upskilling and augmentation that will be required to elevate their workforces, and communicate a compelling vision in which technology plays an additive, not subtractive, role in the lives of employees - empowering them and making work more interesting, productive, and meaningful, so that people can think, learn, and create. Talking points:
• How NEW is augmentation at work?
• Is it replacing or enhancing, or both?
• Greater digital divide, or new employment opportunities?
• Does EQ currency come into play?
There remains a huge gap between hopes of scaling wearable AR to the general public and real-world success stories involving large volumes of paying customers. ARtGlass has bridged that gap. CEO Greg Werkheiser and Vice President Lexi Cleveland will present with client Therese Alvich of Merlin Entertainments. They’ll share lessons applicable across many verticals from a journey of scaling to serve millions of thrilled visitors who pay for 45-120 minute tours at iconic cultural sites around the globe. How and why did ARtGlass and its client sites break through quietly, inexpensively, and with a team characterized by atypical diversity?
Every human is creative. Over the past year and a half, in a pandemic world, creativity was applied in new ways - whether through innovative approaches to a challenge or through building individual creative habits. Creativity is the essence of the human mind. Technology has played a significant role in expanding the ability to create: the evolution of the camera, the synthesizer for music, social media for brands connecting with customers. And now we are in a new era with AI. AI is a tool that can enable creativity and imagination through original avenues, making innovation more human for us, not less. After all, the future belongs to those who invent it. Creative AI tools will transform how we create and express ourselves, across online and offline spaces, between people and the businesses in our lives. In this conversation Spencer Ante, Head of Facebook IQ Editorial at Facebook, will share the latest insights into what’s new and what’s next in AI and how it is being applied across passions, productivity, and purchases. Attendees will also come away with a deeper understanding of:
• Real-life use cases showcasing how creative AI tools can reach business objectives
• The ways in which AI can rapidly expand creativity and create meaningful, human-driven results
• The marketing and advertising value and potential of AI
• Tangible ways in which you can introduce AI tools into your business
Why are investors talking about the Metaverse, and how does that inform where they are funding? This panel will feature top investors in AR, VR, AI, and the Metaverse who have backed the first VR unicorn, Rec Room, and the most successful VR game, Beat Saber, and who have seen acquisitions by Facebook, Apple, Google, and more. We will look at what parts of this future vision exist today, what is missing and still needs to be created, and where the next cycle of funding is going. A must for startups looking for funding.
Black Movement Library (BML): BML is a library for activists, performers, and artists to create diverse XR projects, a space to research how and why we move, and an archive of Black existence. BML seeks to grow community through performances, XR experiences, workshops, conversations, and tool making.
Through countless partnerships, ZEPETO is home to virtual versions of celebrities like Selena Gomez, BLACKPINK, Twice, and more. From avatar-driven official music videos to virtual merchandise and virtual meet and greets, real-world celebrities are embracing ZEPETO as they double down on metaverse strategies. However, there is also a rapidly growing community of influencers in ZEPETO who are avatar natives. From a virtual fashion designer who earns a six-figure income selling clothes on ZEPETO, to virtual influencers who have landed deals with MCNs that discovered them through their ZEPETO activity, there is a large community - almost 1 million creators - using ZEPETO exclusively or near-exclusively for their online output.
Automation is not the future, human augmentation is... A simple strategist’s guide to successfully augmenting human capabilities to drive scalable value:
• There can be no success without a vision.
• Evolution vs. revolution.
• Rethinking how we work.
• Culture eats strategy for breakfast and technology for lunch.
• Continuous innovation: the new continuous improvement?
• When you have the ideas but the plants have the money.
• Millennials, meet the Gen Z workforce.
• Upskilling for the factory of the future.
• Standards are meant to be modernized.
• "Success is not final, failure is not fatal" (Winston Churchill).
AR and VR are among the most exciting new technologies and will have a big impact on business and the enterprise over time. Although some expect these technologies to see slow adoption in the mainstream enterprise, it turns out that major segments of the enterprise - especially CRM, field service, and training - are seeing rapid interest and serious adoption within all types of CRM IT programs. Mr. Goldenberg, a noted expert and author on CRM, and Mr. Bajarin, a well-known technology analyst, will share how AR and VR are already being used in large enterprises around the world, along with some of the most successful enterprise CRM implementations that use AR and VR to add greater value for their customers.
WorldsAI is the platform for the enterprise metaverse. Our live digital twin platform combines spatial data, sensor streams and deep learning to convert actions in the physical world into a live data stream that can be used to measure, analyze and automate real-world processes in ways that were never possible before. Our platform is an important, new software infrastructure - a live 1:1 map of the world that empowers companies to extend digital transformation beyond their firewall to their partners and suppliers. WorldsAI introduces an entirely new way for organizations to experience their world and augment people’s capabilities through more meaningful interactions with spatial AI.
The Metaverse is largely bullshit, as defined. Proponents claim our real lives (IRL) will be inundated with digital worlds, virtual stuff everywhere we look. AR and VR certainly share many technologies and source data. One might expect the experiences to be the same. This talk will explain why the Metaverse hipsters are wrong, revealing a brighter recipe for our future.
When the world’s largest furniture manufacturer, Ashley Furniture, decided to fully shift away from traditional product photography to 3D modeling, there were a lot of unknowns. Although the company identified CGI as one of the focus areas for their $1 billion investment in digital transformation, the sheer scale of the initiative presented numerous challenges. From rewiring the entire organization to work with 3D first to building out QA processes at scale, converting thousands of products to 3D and augmented reality required a radically different approach. Join Dalia Lasaite, CEO at CGTrader, and Bethany Foose, Senior Director of Digital Asset Services at Ashley Furniture Industries, to learn how the two companies came together to build a streamlined, scalable 3D production pipeline that made the transformation possible.
Should robots feel emotions? More importantly, would these emotions cause them to perform their tasks better? In humans, emotions have historically played a vital role in the way we survived and evolved as a species. Moreover, emotions are an indispensable part of what makes up the fabric of intelligence and critical functioning. No wonder it has been a long-held view among leading experts in Artificial Intelligence (Fig. 1) that infusing robots and intelligent systems with emotions would greatly heighten their functioning capabilities in the roles they were designed for and beyond. "I don't personally believe we can design or build autonomous intelligent machines without them having emotions." - Yann LeCun, Chief AI Scientist at Facebook, Silver Professor, and winner of the Turing Award in 2019. Affective computing, born in 1995, relies on big data to recognize, process, and simulate human emotions; however, it does not provide the machines with their own emotional response. In contrast, Emoshape aims to encode real-time emotions in AI using Emoshape's EPU (Emotion Processing Unit) through Emotion Synthesis. Instead of the bootstrap approach of Google's DeepMind (1, 2), Emoshape's vision is to teach the machine to preserve biological life above mechanical life. Through the combination of cloud learning and a novel microchip, Emoshape's patented technology can be ported into any existing AI or robot. The EPU will synthesize an emotional state within the machine, enabling real-time language appraisal, emotion-based reasoning, and emotional personality development. This will make possible intelligent machines capable of understanding how words are associated with feelings, and able to respond with the natural empathy of a human.
This response will also be reflected in the machine's vocal responses and visual expressions, which play an important role in audiovisual speech communication - increasing comprehension and creating a soothing environment or a connection of trust and loyalty, with wide applications in the entertainment, education, healthcare, and industrial sectors. Moreover, emotion also improves situational body-consciousness, which is particularly relevant to real-world autonomous intelligent machines, whether in robots, the metaverse, games, or beyond.
Following their successful second year of BRCvr Burn Week, Doug and Athena will share what they learned: new discoveries, best practices, and the approaches used to bring the community together while empowering it to create in the digital plane.