The Virtual Reality (VR) market could surpass $40 billion by 2020. The U.S. military recently closed a deal worth $480 million for the Microsoft HoloLens Mixed Reality (MR) device. Oculus has already released the first mobile immersive VR system, with no wires and no need for a high-end gaming PC, for $399. While these are exciting times, an important question needs to be investigated: are we ensuring the security and privacy of these systems? In this talk I will present experiments we conducted in our lab and the resulting findings related to the security and forensics of consumer-grade immersive VR systems. I will show how we are able to move people in physical space without their knowledge or consent, along with other attacks on immersive VR that we coined and implemented. We will also explore the forensic artifacts these systems produce.
AR/VR applications are becoming increasingly social and collaborative. From multiplayer games to virtual events to immersive workspaces, these technologies are bringing individuals across the world together. The emergence of the Metaverse marks a new paradigm shift. It has the potential to transform the way we communicate—but as other digital communications platforms have shown, it also raises risks of harm to users. What's more, the immersive, 3D nature of AR/VR experiences means that traditional trust and safety measures for mitigating these harms will likely be insufficient to fully protect users in these virtual spaces. This panel will explore how to both define and address trust and safety concerns in immersive experiences. Experts in trust and safety and in multi-user immersive experiences will highlight the unique challenges for trust and safety in the Metaverse and discuss how companies building these realities can begin to address them.
In today's world, technology is ubiquitous. COVID-19 has taught us that we can(*) live and operate in a virtual world. We as a society rely on technology more than ever, from doing our jobs to running our homes to education. Tomorrow will bring more immersive technologies into everyday use for younger and younger audiences. This talk is meant to spark conversations about topics that we're not talking about today. We'll explore hypothetical scenarios and ask difficult questions to ensure we're protecting our next generation to the best of our ability.
(*) with a lot of effort and sacrifice.
Today we see advancements in practical applications of emerging technologies in our homes and at work. As these solutions grow toward ubiquity, so do the potential harms and the obstacles to the world in which we all want to live.
In this talk, Noble Ackerson discusses trust and ethics in XR, specifically:
Dark patterns in XR - impact, example, opportunity;
AI fairness & bias - impact, example, opportunity;
Data privacy/trust and security - impact, example, opportunity.
He presents practical examples of principles for avoiding dark patterns lurking in the shadows of this emerging world, and shows how to optimize for trust through explainability, context, control, and choice.
A few years ago, a client told us we had a year to become ISO 27001 compliant if we wanted to continue working with them. As a small creative agency with a one-man I.T. department, we had no clue what ISO certification was, let alone how to achieve it. What followed was a journey to the center of the cybersecurity sun that culminated in an entire overhaul of not only our security but also our project management policies and procedures. ISO 27001, GDPR, SOC, Privacy Shield, MPA Content Security—the list goes on and on. More and more, enterprise clients are asking for these certifications from their suppliers and vendors. In this presentation, I take you through what these different certifications mean to you and why they matter, how to start your cybersecurity journey, and how your team can keep creativity and innovation moving when it seems security protocols only want to slow you down. No I.T. or security experience required.