University Administrators Address Privacy, Security, and “danger of physical and psychological harm” of AR/VR Systems

By B.N. Frank

Despite research and reports confirming that virtual reality (VR), augmented reality (AR), and mixed reality (MR) systems can be extremely harmful to our wellbeing, they continue to be manufactured and marketed to people of all ages. In fact, last month a report from the Department of Defense (DoD) confirmed that 80% of the soldiers who used Microsoft HoloLens mixed reality headsets experienced “mission-affecting physical impairments”. Kudos to these university administrators for acknowledging the numerous issues (including liability) associated with incorporating this technology into their curricula.

From Gov Tech:


Metaverse Meets Higher Ed: Security, Privacy, Safety Concerns

Administrators from the Georgia Institute of Technology and University of Michigan say that users and providers of emerging XR technologies should be conscious of privacy, security and safety challenges.

Jeremy Nelson, director of the University of Michigan Extended Reality Initiative, demonstrates how AR/VR tools can be used for skill development. (University of Michigan)

As AR/VR technology continues to improve and as universities experiment with the “metaversity” concept to provide experiential digital lessons within extended reality (XR) environments, higher education institutions will need to carefully consider the data privacy and security implications that could come with mass adoption of VR technologies.

These concerns were the focus of a recent webinar led by Richard LaFosse, compliance and policy lead for academic innovation at the University of Michigan, and Didier Contis, executive director of academic technology, innovation and research computing at the Georgia Institute of Technology, who offered an overview of security and ethical challenges of XR adoption both on and off campus at the virtual Educause Annual Conference this month.

According to LaFosse, AR/VR technology’s scanning capabilities could raise privacy concerns similar to those associated with virtual room scans for remote test proctoring. As an example, he cited a recent ruling from the U.S. District Court for the Northern District of Ohio which said “room scans” have the potential to violate students’ constitutional rights to privacy when required by a public university.

“XR technologies such as VR headsets far exceed webcams in terms of data capture concerning one’s surroundings, so it’s easy to see how similar violations might be found where privacy considerations are left out of XR initiatives, particularly if students are using devices in their home where the expectation of privacy is so high,” he said. “XR devices collect a lot of [personally identifiable information] that could trigger [Family Educational Rights and Privacy Act] FERPA protections.”

When it comes to student privacy and data practices, Contis said it’s important to take note of just how much data can actually be collected through the use of XR devices. He said XR headsets can scan and analyze the space around students wearing them in detail, similar to room scans used for proctoring, and also track students’ unique movements, interactions with objects, facial features and biometric data, among other data points. He said institutions must know if data such as this is stored locally on the device or in the cloud, and whether the collection itself could violate current federal or state student privacy regulations.

“With each new end user computing device, more data and personal information is collected,” Contis said. “To understand how much data is collected by XR devices, we need to consider the sensors this new class of devices has. In fact, XR headsets have a lot more sensors than a phone.”

In addition to carefully reviewing data collection practices and terms of service before adopting new devices, Contis said universities must consider the implications of requiring students to use their student accounts when using or logging into XR devices, as well as how to manage the devices and keep them up to date.

“Analyzing device manufacturers’ data collection and use practices and processes to address concerns is not enough,” he said. “The same needs to apply to XR application developers or other service providers themselves, as they may be collecting potentially sensitive and personally identifiable information as part of delivering a specific service or app functionalities.”

Other considerations include the danger of physical and psychological harm from the use of these devices, such as users bumping into objects in the room while using headsets or getting “cyber sickness,” a form of disorientation or nausea that can come from seeing motion without physically being in motion. LaFosse added that these devices, used by so many students, could also raise concerns about hygiene and sanitation.

Noting potential liability concerns in this area, he said universities should clearly communicate lab safety procedures and possible dangers for students using XR technologies.

“As much as XR is a great technology, the risk of harm either to yourself or others is unfortunately real and should not be underestimated,” he said.


Activist Post reports regularly about AR, MR, VR, and other unsafe technology. For more information, visit our archives.

