As Virtual Reality and Augmented Reality grow in popularity and are expected to dominate the next era of cyberspace, giving many of us a first look at a new digital realm, the hardware of those headsets and the interfaces of their virtual keyboards may be handing hackers new openings.
Computer scientists at the University of California, Riverside, documented the results of their research in two papers, which they will present at the annual international cybersecurity conference known as the Usenix Security Symposium.
Mark Zuckerberg's Meta (formerly Facebook) and other leading tech companies are racing to develop these metaverse technologies, which rely on hardware devices that detect our physical gestures, including blinks, steps, and nods. That way, users can explore Virtual Reality and Augmented Reality to play games, meet new people, socialize, and conduct commerce in new ways.
Professors Nael Abu-Ghazaleh and Jiasi Chen, who lead the UCR team, have shown that surveillance software can record and monitor every one of these movements, then use AI to convert them into written words with an accuracy of 90 percent or more.
Abu-Ghazaleh explained that when a user runs more than one application, one of them may be spyware that tracks the user's activity in the other applications and monitors the surrounding space, including seeing who is near the user and how far away they are. The attacker can also observe the user's interactions with the metaverse hardware.
The privacy risks go further: the spyware can even capture the user's personal data when they switch from one application to another and key in a password on the headset's virtual keyboard. Attackers can use the same methods to record a user's body motions during a virtual meeting and decode those movements to extract confidential information.
Professors Chen and Abu-Ghazaleh wrote the two research papers they will discuss at the conference together with Yicheng Zhang, a University of California, Riverside doctoral student in computer science, and Slocum, a visiting assistant professor from Harvey Mudd College.
Zhang is the lead author of the paper titled 'It's all in your head(set): Side-channel attacks on AR/VR systems.' The research examines how malicious actors can exploit users' body movements to infer their speech and what they type on the virtual keyboard with more than 90 percent accuracy. The study also shows how these attackers can identify applications as users launch them, and how they can estimate the proximity of other individuals to the headset user to within 10.3 cm.
Slocum is the lead author of the second research paper, titled 'Going through the motions: AR/VR keylogging from user head motions', which raises a more pressing concern about the security of virtual keyboards. The paper shows that subtle body gestures, such as the head motions a headset wearer makes while typing a password on a virtual keyboard, are enough to let an attacker recover the typed text. The team built TyPose, a machine-learning system that extracts these motion signals and translates them into the characters or words the user is typing.
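The article does not describe TyPose's internals, but the core intuition is that a wearer's head orientation drifts toward each key they aim at on the virtual keyboard. The following toy sketch illustrates that idea with a nearest-key classifier over synthetic motion samples; the keyboard layout, the feature choice, and the classifier here are simplifying assumptions made for illustration, not the researchers' actual design, which is a trained machine-learning model.

```python
# Toy illustration of head-motion keylogging. Assumptions: a hypothetical
# 2D key layout and clean orientation samples; NOT the TyPose design.
import math
import random

# Hypothetical virtual-keyboard layout: character -> (x, y) head-aim target.
KEY_POSITIONS = {
    "a": (0.0, 0.0), "s": (1.0, 0.0), "d": (2.0, 0.0),
    "k": (7.0, 0.0), "t": (4.0, 1.0), "e": (2.0, 1.0),
}

def infer_key(head_vector, keys=KEY_POSITIONS):
    """Classify one head-orientation sample as the nearest key center."""
    hx, hy = head_vector
    return min(keys, key=lambda c: math.hypot(keys[c][0] - hx, keys[c][1] - hy))

def simulate_trace(word, noise=0.0, rng=random.Random(0)):
    """Simulate head-orientation samples recorded while `word` is typed."""
    return [(KEY_POSITIONS[c][0] + rng.gauss(0, noise),
             KEY_POSITIONS[c][1] + rng.gauss(0, noise)) for c in word]

# An attacker observing only the motion trace recovers the typed word.
trace = simulate_trace("task")  # real sensor traces would be noisy
recovered = "".join(infer_key(v) for v in trace)
print(recovered)  # → task
```

A real attack must segment a continuous sensor stream into per-keystroke windows and cope with noise and per-user variation, which is why the researchers used machine learning rather than a fixed geometric rule like the one above.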
The two research papers share one purpose: to alert a tech industry planning to take the metaverse to a new scale about the cybersecurity threats that come with it. Abu-Ghazaleh said the team focused on demonstrating that such attacks are possible and on responsible disclosure, giving these tech giants a chance to tackle the security issues before the research was made public.
by Ahmed Naeem via Digital Information World