A rendering of a UGV (right) and a UAV (centre) working in tandem at a rescue site

Researchers from Xi’an Jiaotong-Liverpool University, King’s College London, and the University of British Columbia have developed a virtual reality (VR) system to improve remote human-vehicle interaction. The team won the Best Paper Award at the 18th International Conference on Virtual Reality Continuum and its Application in Industry. The technology employs the “world in miniature” (WiM) technique, allowing users to control unmanned ground vehicles (UGVs) with hand gestures on VR devices. This approach is particularly useful for search and rescue missions, where UGVs and unmanned aerial vehicles (UAVs) work together.

Introducing Worlds-in-Miniature for efficient teleoperation of UGVs

Worlds-in-Miniature (WiM) is a technique that presents a scaled-down replica of the original environment, combining the advantages of an operation space, a cartographic map, and an interface that lets users switch quickly between overview and detail. The technique has been applied to VR-based remote control of drones, but its potential for teleoperator-drone manipulation remains under-explored. Traditional approaches to teleoperator-drone remote control rely on computer monitors to display information and on keyboards, mice, or joysticks to steer the vehicle. These conventional methods have limitations, especially when controlling UGVs from fixed camera perspectives.
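At its core, a WiM is a uniform scaling between the world and a hand-reach replica of it. The minimal sketch below illustrates that idea only; the sizes, names, and flat 2D coordinates are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class WorldInMiniature:
    """Maps points between a square world region and a hand-scale replica of it."""
    world_size: float      # side length of the world region, in metres
    miniature_size: float  # side length of the tabletop miniature, in metres

    @property
    def scale(self) -> float:
        # Uniform scale factor applied to every object in the replica.
        return self.miniature_size / self.world_size

    def to_miniature(self, x: float, y: float) -> tuple[float, float]:
        # World coordinates -> coordinates on the miniature map.
        return (x * self.scale, y * self.scale)

    def to_world(self, mx: float, my: float) -> tuple[float, float]:
        # Inverse mapping: a point touched on the miniature -> world coordinates.
        return (mx / self.scale, my / self.scale)

# A 100 m rescue site shrinks to a 0.5 m tabletop replica the operator can reach.
wim = WorldInMiniature(world_size=100.0, miniature_size=0.5)
mini_pos = wim.to_miniature(40.0, 10.0)   # where a world object appears on the replica
world_pos = wim.to_world(0.25, 0.25)      # where a touched replica point lies in the world
```

Because the same transform runs in both directions, the replica doubles as a display (overview) and as an input surface (detail), which is what gives WiM its map-plus-controls character.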

UGVs are commonly used in search and rescue operations, where they can be remotely controlled based on real-time imagery transmitted by UAVs. This combination of aerial and ground perspectives enables large-scale searches without blind spots, making the process highly efficient. Nevertheless, the traditional method of operating a UGV with two joysticks can be challenging: the angle of the UAV's camera is fixed, and the map's perspective does not change when the vehicle changes direction [2]. This makes it difficult to distinguish left from right, degrading the operator's control of the UGV.

Implementing WiM for better control and natural interaction

To tackle these challenges, the research team applied the WiM technique to remote human-vehicle interaction, providing a first-person perspective that moves with the vehicle. This approach lets operators control UGVs more conveniently and accurately using hand gestures, such as grasping and moving, on VR devices. The WiM-based VR interface not only addresses the limitations of traditional methods but also lets teleoperators interact with and control a miniature UGV directly on the miniature map.
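One plausible way to turn a grasp-and-move gesture on the miniature into a vehicle command is a small state machine: a pinch closes on the miniature UGV, and the release point, scaled back up, becomes the world-space navigation goal. The sketch below assumes a pinch-distance threshold, a fixed miniature scale, and 2D fingertip coordinates; all of these are illustrative choices, not details from the paper.

```python
import math

PINCH_THRESHOLD = 0.03   # metres between thumb and index tips (assumed value)
MINIATURE_SCALE = 0.005  # miniature metres per world metre (0.5 m replica of a 100 m site)

class GraspController:
    """Turns pinch-and-drag gestures on the miniature UGV into world-space goals."""

    def __init__(self):
        self.grasping = False
        self.goal = None  # last world-space navigation goal, or None

    def update(self, thumb, index):
        # thumb / index: (x, y) fingertip positions in miniature-map coordinates.
        pinch = math.dist(thumb, index)
        midpoint = ((thumb[0] + index[0]) / 2, (thumb[1] + index[1]) / 2)
        if pinch < PINCH_THRESHOLD:
            self.grasping = True  # fingers closed: operator is holding the miniature UGV
        elif self.grasping:
            # Fingers opened after a grasp: the drop point, scaled up, is the new goal.
            self.grasping = False
            self.goal = (midpoint[0] / MINIATURE_SCALE, midpoint[1] / MINIATURE_SCALE)
        return self.goal

ctrl = GraspController()
ctrl.update((0.10, 0.10), (0.11, 0.10))          # pinch closed: grasp the miniature UGV
goal = ctrl.update((0.20, 0.15), (0.26, 0.15))   # fingers open: release, emit a goal
```

In this sketch the release at miniature midpoint (0.23, 0.15) maps to a goal near (46, 30) in world metres, which a planner on the real UGV would then drive toward.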

To evaluate the approach, the researchers conducted a user study comparing four conditions formed by crossing two factors: map visibility and control method. The findings showed that the WiM technique significantly improved teleoperators' performance, reduced their workload, and was preferred for teleoperation tasks. The work thus contributes to the VR community a WiM-based VR interface for UGV teleoperation and opens doors for other researchers designing VR interfaces in similar contexts.

Applications beyond search and rescue missions

While the primary focus of this research is on search and rescue missions, the technology has the potential for broader applications. Jialin Wang, the paper’s second author and a PhD student at the School of Advanced Technology, expressed hope that the technology will contribute to fields such as agricultural measurement, TV and filmmaking, and more.

Dr Yushan Pan, a supervisor of the research team at Xi’an Jiaotong-Liverpool University’s Department of Computing, highlighted the ability of the team’s interface to let users control UGVs more flexibly and accurately from a top-view perspective. The paper also offers insights for researchers worldwide in areas such as natural human-machine collaboration and remote control. With their cross-disciplinary research combining robots, remote control, and VR, the team showed that UGVs can be controlled remotely without computer monitors, keyboards, mice, or joysticks, making remote interaction between humans and UGVs more natural.