Trends in XR Interaction Technology: 6DoF Tracking, Gesture Recognition, Eye Tracking, and more
16 June 2023
XR interaction technology gives users the means to interact with virtual environments, while real-time cloud rendering provides true 3D, interactive, and highly immersive graphics rendering and computing capabilities. By combining the two, users can enjoy more realistic and immersive virtual experiences through XR devices, with high-quality graphics and smooth interaction responses. This article introduces the current trends in XR interaction technology for reference by professionals in the XR field.
To achieve more immersive experiences, future interaction technologies will evolve toward multimodality and greater refinement.
Currently, 6DoF interaction for the head and hands has become standard on mainstream virtual reality headsets, and these will remain the focus of future development. Gesture tracking and eye-tracking technologies will see wide application in the short to medium term and become key areas for breakthroughs in enterprise applications. These trends will drive further advances in virtual reality technology, providing users with more realistic, natural, and seamless virtual experiences.
6DoF tracking and positioning
In XR immersive experiences, 6DoF tracking and positioning technology is a key component.
This technology enables tracking of six degrees of freedom, including rotational and translational movements, allowing for more realistic and precise spatial positioning and movement tracking, thereby enhancing user immersion. Currently, 6DoF tracking and positioning technology is primarily applied to interactions involving the head and hands, serving as a critical factor in achieving high-quality XR experiences.
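To make the idea concrete, a 6DoF pose can be represented as a translation plus a rotation, and a tracked pose reported relative to the headset (such as a hand) can be composed into world space. The sketch below is illustrative Python under those assumptions, not any particular XR SDK's API:

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    # Translation (x, y, z) in metres; rotation as a unit quaternion (w, x, y, z).
    position: tuple
    rotation: tuple

def quat_multiply(a, b):
    # Hamilton product of two quaternions in (w, x, y, z) order.
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate_vector(q, v):
    # Rotate vector v by unit quaternion q: q * (0, v) * conj(q).
    qv = (0.0,) + tuple(v)
    qc = (q[0], -q[1], -q[2], -q[3])
    r = quat_multiply(quat_multiply(q, qv), qc)
    return r[1:]

def compose(parent: Pose6DoF, child: Pose6DoF) -> Pose6DoF:
    # Express a child pose (e.g. a hand tracked relative to the headset)
    # in the parent's (e.g. world) coordinate frame.
    rotated = rotate_vector(parent.rotation, child.position)
    pos = tuple(p + r for p, r in zip(parent.position, rotated))
    return Pose6DoF(pos, quat_multiply(parent.rotation, child.rotation))
```

For example, a hand held half a metre in front of an unrotated headset at (1, 2, 3) lands at world position (1, 2, 2.5).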
The development of XR interaction technology encompasses multiple directions. 6DoF tracking and positioning underpins immersive experiences by tracking rotational and translational movement, with the head and hands as the main focus. Gesture tracking detects hand movements to enable interactive control, with solutions ranging from bare-hand recognition to data gloves. Eye tracking allows more precise interaction, including gaze-based rendering, interpupillary distance adjustment, and iris unlocking. Facial recognition captures users' expressions through sensors such as cameras, enabling more expressive and interactive experiences. Electromyography (EMG) sensing detects muscle and nerve activity to achieve more natural human-computer interaction. Virtual smell technology provides olfactory sensations, for example by emitting odor particles, to create a more realistic olfactory experience.
Additionally, haptic feedback technology provides users with tactile sensations through methods such as vibration and ultrasound, enhancing the sense of realism in virtual experiences. Brain-computer interface (BCI) technology enables human-computer interaction through brainwave-based interfaces, allowing people to control actions such as mental typing and mind-controlled gaming. The continuous development and application of these interaction technologies will further drive the advancement of XR technology, delivering more realistic, natural, and convenient virtual experiences for users.
XR Gesture Tracking
The current device solutions for XR gesture tracking mainly consist of bare-hand tracking and wearable devices. Bare-hand tracking relies on technologies such as visual tracking, inertial tracking, and bend sensors, but faces challenges such as hand occlusion and limited tracking range. Data gloves integrate multiple sensing technologies and provide high tracking accuracy, but at the cost of comfort and higher expense. By comparison, lightweight and portable wearables such as wristbands and finger rings are expected to become the main solutions for consumer-grade XR gesture tracking: they strike a balance among feature integration, tracking accuracy, and portability, and are less intrusive to hand movement. Future iterations of these wearables may incorporate additional features such as buttons and displays to enable richer interactions.
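As an illustration of how tracked hand data becomes an interaction, a common primitive across these solutions is the thumb-index pinch. The sketch below derives a continuous pinch strength from fingertip distance; the thresholds are illustrative assumptions, not any specific runtime's values:

```python
import math

def pinch_strength(thumb_tip, index_tip, engage_m=0.02, release_m=0.08):
    """Continuous pinch strength in [0, 1] from tracked thumb and index
    fingertip positions (metres). Fully pinched at <= 2 cm apart, fully
    open at >= 8 cm; both thresholds are illustrative assumptions."""
    d = math.dist(thumb_tip, index_tip)
    if d <= engage_m:
        return 1.0
    if d >= release_m:
        return 0.0
    return (release_m - d) / (release_m - engage_m)

def is_pinching(thumb_tip, index_tip, threshold=0.9):
    # A discrete gesture derived from the continuous signal; real runtimes
    # also debounce over several frames to avoid flicker near the threshold.
    return pinch_strength(thumb_tip, index_tip) >= threshold
```

A continuous strength rather than a raw boolean lets applications drive analog actions, such as grab tightness, from the same signal.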
In addition to the existing gesture tracking technologies, there are emerging technologies under development, such as ultrasound-based gesture tracking, machine vision-based gesture recognition, and electromagnetic wave-based gesture tracking. These new technologies offer different characteristics and advantages, further enhancing the precision and stability of gesture tracking for a superior XR interaction experience.
Furthermore, future developments in gesture tracking technology will prioritize comfort and naturalness. Current gesture tracking devices often require wearing or holding, which limits user freedom and comfort. However, future gesture tracking technologies will focus more on ergonomic design, allowing users to perform gestures naturally and improving the comfort and naturalness of interactions.
With the continuous advancement and widespread adoption of XR technology, gesture tracking will become an integral part of XR interaction experiences, finding applications in various domains such as virtual reality, augmented reality, smart homes, and smart offices.
Eye tracking technology
Eye tracking technology has been widely applied in XR hardware with various benefits.
It can be used for dynamic foveated rendering, eye-gaze interaction, eye-tracking data analysis, virtual character expression tracking, identity recognition, as well as assessing visual and psychological health. By leveraging the physiological characteristics of eye fixation, eye tracking technology selectively renders high-definition graphics only in the central foveal region, significantly reducing the rendering load on head-mounted devices. Additionally, eye tracking technology can record user gaze trajectories, fixation durations, pupil size, and other data, providing valuable insights for optimizing product design and enhancing training efficiency in domains such as shopping and education.
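A minimal sketch of the foveated-rendering decision described above: choose a shading rate from the angular distance between the gaze ray and a screen region's view ray. The cone thresholds and rate values are illustrative assumptions, not figures from any specific headset:

```python
import math

def foveation_level(gaze_dir, pixel_dir, inner_deg=5.0, outer_deg=20.0):
    """Pick a shading rate from the angle between the gaze ray and a
    region's view ray: full rate inside the foveal cone, coarser outside.
    Both directions are unit 3-vectors; the 5/20 degree thresholds are
    illustrative assumptions."""
    dot = sum(g * p for g, p in zip(gaze_dir, pixel_dir))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    if angle <= inner_deg:
        return 1   # full-resolution shading in the foveal region
    if angle <= outer_deg:
        return 2   # half rate in the near periphery
    return 4       # quarter rate in the far periphery
```

Because visual acuity falls off steeply away from the fovea, most of the frame can be shaded at the coarser rates without the user noticing.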
XR hardware manufacturers employ different eye-tracking principles, including the pupil-corneal reflection vector method, retinal imaging, reflected-light-intensity modeling, and 3D model reconstruction. Among these, the pupil-corneal reflection vector method is the most widely adopted. In this approach, infrared light illuminates the eye, and a camera captures both the pupil and the reflection (glint) the light produces on the cornea; the direction of gaze is then computed from the vector between the pupil center and the corneal reflection. Each tracking method has its own strengths, weaknesses, and suitable scenarios.
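The pupil-corneal reflection idea can be sketched as a calibrated mapping from the glint-to-pupil vector in the eye-camera image to a point on the display. The affine model and the coefficients in the usage note are illustrative simplifications; production trackers fit richer per-eye polynomial models during calibration:

```python
def gaze_from_glint(pupil_px, glint_px, coeffs_x, coeffs_y):
    """Map the glint-to-pupil vector (pixels in the eye-camera image) to a
    normalized gaze point on the display. coeffs_x and coeffs_y are per-axis
    affine terms (a, b, c) from a prior calibration: gaze = a*vx + b*vy + c.
    The affine form is a deliberate simplification of real calibration models."""
    vx = pupil_px[0] - glint_px[0]
    vy = pupil_px[1] - glint_px[1]
    ax, bx, cx = coeffs_x
    ay, by, cy = coeffs_y
    return (ax * vx + bx * vy + cx, ay * vx + by * vy + cy)
```

With assumed calibration coefficients (0.01, 0, 0.5) per axis, a pupil 10 px to the right of the glint maps to a gaze point near (0.6, 0.5) on a normalized display.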
In addition to the mentioned applications and benefits, eye tracking technology can also be utilized in XR hardware for body language recognition and emotion analysis in human-computer interaction. By analyzing data such as eye movement trajectories, gaze points, and fixation durations, it is possible to recognize users’ body language cues, such as feelings of anxiety, tension, or excitement. This technology can be applied in virtual meetings, educational training, psychological therapy, and other scenarios to help users better understand and adjust their emotions and body language.
Eye tracking technology can also be combined with other XR technologies, such as head tracking and gesture tracking, to achieve more natural, intuitive, and efficient human-computer interaction. Head tracking technology tracks the movement of the user’s head, enabling more natural changes in the viewing perspective, while gesture tracking technology tracks the movement of the user’s hands, enabling more natural hand interactions. The integration of these technologies allows users to experience a truly immersive and intuitive interaction in VR/AR environments.
Eye tracking technology has a wide range of applications and enormous potential in XR hardware. With ongoing technological advancements, eye tracking technology is expected to become more mature and widely adopted, delivering even more impressive results for XR interactive experiences.
XR interaction technology and real-time cloud rendering work together to form and advance a complete XR experience.
Real-time cloud rendering enables large-scale graphic computation on high-performance cloud servers, delivering real-time rendering results to terminal devices over the network, allowing them to present complex virtual scenes without requiring high-performance hardware. This is particularly important for resource-limited terminals such as mobile devices. Additionally, real-time cloud rendering supports multi-user collaborative experiences and cross-platform application development, providing a broader space for the development of XR interaction technology.
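One way to reason about whether cloud rendering can still feel responsive is a motion-to-photon budget: the pose travels to the server, a frame is rendered and encoded, streamed back, decoded, and displayed. The stage costs and the 20 ms comfort target below are illustrative assumptions; real Cloud XR systems also hide residual latency with techniques such as late-stage reprojection on the client:

```python
def motion_to_photon_ms(network_rtt_ms, render_ms, encode_ms,
                        decode_ms, display_ms):
    """Rough end-to-end latency for one cloud-rendered frame, summing each
    pipeline stage. All stage costs are supplied by the caller and are
    illustrative, not measurements of any particular system."""
    return network_rtt_ms + render_ms + encode_ms + decode_ms + display_ms

def fits_budget(latency_ms, budget_ms=20.0):
    # A commonly cited comfort target for XR is roughly 20 ms
    # motion-to-photon; treat the exact figure as an assumption.
    return latency_ms <= budget_ms
```

For example, an 8 ms network round trip with 4 ms rendering, 2 ms encode, 2 ms decode, and 3 ms scan-out totals 19 ms and just fits the budget, which is why edge deployment close to the user matters for Cloud XR.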
Therefore, XR interaction technology and real-time cloud rendering are complementary, driving the development and application of XR technology and providing users with more immersive and interactive virtual reality and augmented reality experiences.
As one of the most widely used real-time cloud rendering and Cloud XR solutions in the industry, Paraverse LarkXR supports a wide range of applications in industries such as digital twins, education, and virtual events and exhibitions in the metaverse.
In the recently released "2022-2023 User Survey Report on Real-time Cloud Rendering Solution LarkXR" by Paraverse, you can find data and information about these application scenarios, helping practitioners in the metaverse/XR industry better understand the industry landscape.
We welcome close communication with users who are implementing cloud rendering solutions, exploring the Cloud XR technology roadmap, or using or considering the Paraverse LarkXR solution. Together, we can continue to enrich the depth and breadth of real-time cloud rendering solutions.