While virtual reality is finding more and more applications in everyday life, the data these systems collect could reveal far too much about us… even allowing us to be identified when we think we are playing anonymously.
This is the conclusion of a study published in Nature in October, in which researchers aim to alert the public to the risks associated with the collection of gesture and movement data by virtual reality systems.
Our signature movements constitute our identity
Virtual reality lets you immerse yourself in a 3D simulation through a headset. To reproduce our movements faithfully in the virtual world, these systems use sensors that measure head orientation, spatial body position and hand gestures. Some (rarer) systems go so far as to collect data on foot position, back and chest posture, and even the elbows and knees.
It is the collection and use of this data that poses a serious threat to users’ privacy. From this information, virtual reality systems can infer the identity of players, their medical conditions and even their moods.
In an experiment conducted by a group of Stanford University scientists with 511 participants, a simple machine learning model was able to identify 95% of participants within just 5 minutes, simply by analysing how they positioned themselves in space. The system cannot find the player’s identity itself (i.e. the person’s name), but it can recognise and distinguish players across several sessions thanks to the unique way they move: each has a signature movement that sets them apart from the other participants.
And the more players play, the more accurately the system is able to identify them. Beyond 8 sessions, the algorithm approaches 100% recognition success.
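To make the idea concrete, here is a minimal, hypothetical sketch of this kind of re-identification. Every value here is invented for illustration: each synthetic “player” gets a random signature posture, later sessions are noisy samples around it, and a simple nearest-signature rule matches sessions back to an enrolled reference. The actual study used real tracking data and a far richer model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each of 5 "players" has a characteristic mean posture
# (head height, head pitch, hand spread, ...), and each session is a noisy
# sample around that signature movement.
n_players, n_features, n_sessions = 5, 6, 8
signatures = rng.normal(size=(n_players, n_features))
sessions = signatures[:, None, :] + 0.1 * rng.normal(size=(n_players, n_sessions, n_features))

# "Enroll" each player from their first session, then re-identify later
# sessions with a nearest-signature rule: the closest enrolled vector wins.
enrolled = sessions[:, 0, :]

def identify(sample):
    """Return the index of the enrolled player whose signature is closest."""
    return int(np.argmin(np.linalg.norm(enrolled - sample, axis=1)))

correct = sum(identify(sessions[p, s]) == p
              for p in range(n_players) for s in range(1, n_sessions))
accuracy = correct / (n_players * (n_sessions - 1))
print(f"re-identification accuracy: {accuracy:.0%}")
```

The point of the toy example is that no name is needed anywhere: as long as the movement signatures are stable and distinct, sessions can be linked back to the same individual.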
While this data can be used to improve the gaming experience, it could also be used to draw up a complete marketing profile and serve targeted advertising based on our morphology (build, height, etc.) and our possible health problems (overweight, hyperactivity, depression…).
Three major threats linked to this identification
The authors of the study point to three main risks:
1/ It is impossible to anonymise virtual reality players’ data: the two largest headset manufacturers (HTC and Oculus) state in their terms of use that they reserve the right to share anonymised data (stripped of the name) of the users of their services, a fairly common practice among digital giants, Facebook and Google in the lead. Except that… removing the name is no longer enough to protect the data, since the way a person moves makes it possible to identify them, or at least to recognise them distinctively.
2/ Players can easily be identified across several sessions: it is possible to link a player’s different sessions, even when they do not identify themselves, thanks to their “signature movement”, the unique way they move. Ultimately, there is therefore a risk of building a gigantic individual file of all our experiences in virtual reality, including those considered “sensitive” (which we discuss a little further below).
3/ In virtual reality, forget about incognito mode: even if it requires many layers of protection, private browsing is possible on a computer or a smartphone. But it seems impossible to separate a player’s identity from the data collected in virtual reality, since that data is tied to the very essence of their body, to the way they move.
This is not the first study to point the finger at such dangers.
Even without providing data related to our movements, the choices we make in video games can already say a lot about our personality. As early as 2011, researchers had shown that it was possible to better understand players’ real personalities based on the way they played in Second Life, a virtual world very popular in the 2010s. According to the study’s findings, a high degree of map exploration tends to correspond to personalities with a high degree of conscientiousness, for example. Add to this the physiological data collected by virtual reality headsets, and profiling becomes extremely precise.
In 2018, Jeremy Bailenson, professor of communication at Stanford, was already alerting public opinion to the danger of large-scale data collection by virtual reality systems. In an article for the Virtual Human Interaction Lab, which he founded, he wrote:
“These [virtual reality] systems typically track body movements 90 times per second to display the scene appropriately, and high-end systems record 18 types of movements across the head and hands. Consequently, spending 20 minutes in a VR simulation leaves just under 2 million unique recordings of body language.”
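Bailenson’s figure checks out with simple arithmetic: 90 samples per second, across 18 tracked movement types, over 20 minutes of play.

```python
# Back-of-the-envelope check of the figures quoted above:
# 90 samples per second x 18 tracked movement types x 20 minutes.
samples_per_second = 90
movement_types = 18
minutes = 20

recordings = samples_per_second * movement_types * minutes * 60
print(f"{recordings:,} recordings")  # 1,944,000 -- "just under 2 million"
```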
Virtual reality will gradually invade our lives
Formerly reserved for entertainment, virtual reality headsets are finding more and more concrete applications in everyday life.
It must be said that the technology is progressing very quickly. For example, Microsoft’s mixed reality headset, the HoloLens 2 (released in late 2019), has a field of view twice as large and is much more comfortable than the first model. It is not as immersive as the Oculus or HTC headsets, but it allows you to create mixed reality applications (embedding virtual elements in the real world).
These advances in the comfort and quality of headsets have made it possible to develop a whole host of new services. Virtual reality can be found in professional environments, such as with the solution from the American start-up Spatial, which creates a collaborative workspace in mixed reality to have more “immersive” meetings for employees.
Spatial, which presents itself as the Zoom or Google Hangouts of virtual reality, raised $14 million in funding in early January 2020. The start-up boasts clients that are large companies outside the tech world, such as Mattel, Nestlé and BNP Paribas, which use it as a tool to stimulate creativity and the sharing of ideas.
There are also more and more applications related to well-being and even healthcare. The Spanish startup Psious is developing immersive therapeutic treatments using virtual reality that simulate stressful situations to help patients learn to overcome phobias.
And of course… pornography. Today, the biggest porn sites offer videos to watch in virtual reality (we’ll let you do your own research 😇). Just one year after launching its dedicated “virtual reality” section in 2016, Pornhub was already reporting more than 500,000 views per day across a total catalog of 2,600 available videos. This type of video seems to particularly appeal to women, who make up 22% of the audience, according to figures from the Womansera site.
Now imagine being able to tie all these sessions together through a signature movement, the unique way of moving one’s arms and head. This would give a very precise profile of the user’s identity: their profession, their fears, their desires…
The risks to our privacy
Giving information on how one moves the head or the hands does not appear to constitute a major threat.
But that data defines our individual cognitive signature (also known as a kinematic fingerprint), information that allows systems to identify us uniquely. The US military already uses this concept to secure its internal networks, analysing how each individual types on the keyboard or moves the mouse in order to recognise their identity.
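As an illustration of this kind of behavioural biometrics (a toy example, not the military’s actual system), here is a crude “typing signature” built from the delays between successive keystrokes. All names, delays and the tolerance threshold are invented for the example; real systems use far richer features.

```python
import statistics

# Hypothetical "typing signature": mean and spread of inter-keystroke
# delays (in milliseconds). Stable per person, different between people.
def typing_signature(inter_key_delays_ms):
    return (statistics.mean(inter_key_delays_ms),
            statistics.pstdev(inter_key_delays_ms))

def same_user(sig_a, sig_b, tolerance_ms=15):
    """Accept if mean delay and variability both fall within the tolerance."""
    return all(abs(a - b) <= tolerance_ms for a, b in zip(sig_a, sig_b))

alice_enrolled = typing_signature([120, 130, 115, 125, 118])
alice_later    = typing_signature([122, 128, 119, 124, 121])
bob            = typing_signature([210, 190, 230, 205, 220])

print(same_user(alice_enrolled, alice_later))  # True: same rhythm
print(same_user(alice_enrolled, bob))          # False: different rhythm
```

The same logic transfers to VR: replace keystroke timings with head and hand trajectories, and the “signature” becomes the kinematic fingerprint described above.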
Our individual cognitive signature is therefore as important to protect as a photo or a civil-status record, since it is linked to our very identity. The sensitive nature of this data also stems from the fact that it is closely linked to our state of health.
This is supported by digital law lawyer Laura Barrera Cano, who demonstrates it in her analysis of virtual reality, biometric data and the GDPR (General Data Protection Regulation):
“A study carried out in 2004 on students in a virtual classroom made it possible to diagnose each student’s attention. The amount of movement of the head, arms and legs was higher in children who were diagnosed with attention deficit disorder or hyperactivity disorder than in those who did not receive such a diagnosis. They also measured head movements and demonstrated that students diagnosed with high-level autism disorder looked less frequently at their classmates in the virtual classroom during a conversation, compared to those who were not diagnosed as such.”
For the moment, the use of virtual reality in health seems rather therapeutic (as the case of Psious shows), but we cannot rule out that in the long term these identification elements could be used for exclusion or stigmatisation. “What about an insurance policy that refuses a customer because the biometric data collected during a virtual reality game made it possible to diagnose depression? This is a scenario that could arise in the society of tomorrow,” concludes Laura Barrera Cano.
How can you protect yourself from such risks?
So what solutions exist to protect us? Here are some options:
► Idea n°1: Inform players more precisely and collect their consent more explicitly
The GDPR is quite clear: “The processing of personal data (…) as well as the processing of genetic data, biometric data for the purpose of uniquely identifying a person, data concerning health (…) shall be prohibited (…) unless the data subject has given explicit consent (…)”.
But the problem stems from what is clearly meant by “uniquely identifying a person”. The manufacturers rely on the fact that this collected data is not used to identify people but to make the game function better.
Except that the collection of this data seems impossible to separate from an identification process. Virtual reality games should therefore collect players’ consent much more explicitly and present a clear message on the processing of their personal data and the identification that may result from it.
► Idea n°2: Rethink virtual reality with the principle of Privacy by design
This is one of the founding principles on which the GDPR was developed, which recommends including the issue of respect for privacy and the processing of sensitive data at the core of the various stages of product design.
Concretely, in the context of virtual reality headsets, this could mean not sharing players’ personal information (even anonymised) with third parties. But this would force manufacturers to completely rethink their business model which, on the contrary, relies on the massive collection of personal data.
It is no coincidence that Facebook got its hands on the headset maker Oculus in 2014 for a little over $ 2 billion. It is also in order to have privileged access to this pool of personal data.
► Idea n°3: Entrust the sector’s regulation to an independent committee
Finally, a proposal emerged at a summit of experts held at Stanford in 2018: the establishment of an independent committee for the virtual reality sector, also called an IRB (Institutional Review Board). This type of board already oversees university and medical research to ensure that researchers or professors have no conflicts of interest on certain subjects of study, in the manner of an independent ethics committee.
This board would be responsible for identifying the extent and likelihood of potential harm to users and for acting preventively to minimise such risks.
However, virtual reality companies would still have to agree to self-regulate; and what would they gain from it? Perhaps introducing a little more transparency into their operations would ultimately make it possible to offer a technology that is better understood and therefore better accepted by all.