Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) recently released a presentation detailing their work on a virtual reality (VR) system for robot tele-operation. The team, funded in part by Boeing, used an Oculus Rift headset to monitor and control a Baxter robot from Rethink Robotics.
There are two fundamental methods of VR tele-operation. In the direct model, the user experience is coupled to the robot: the user sees what the robot sees, and the user's motion is mirrored by the robot. While this method provides an immersive experience, it also limits the amount of information that can be communicated and increases the chance of nausea and motion sickness. In the cyber-physical model, which was implemented in this project, the user is placed within a virtual control room, which provides a layer of abstraction for incoming information and outgoing control. In the MIT experiment, the virtual control room was equipped with several video feeds, displayed in 2D, and a simplified 3D mapping of the robot's end effectors. By manipulating the virtual end effectors using the Oculus controllers, participants completed tasks such as picking up screws, stapling wires, and stacking and assembling blocks.
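To make the abstraction layer concrete, here is a minimal Python sketch (not the MIT team's actual code; all names and limits are hypothetical) of the cyber-physical idea: the operator moves a virtual end-effector marker in the control room, and the system scales and clamps that motion into the robot's workspace before any command is issued, rather than mirroring the operator's motion directly.

```python
# Illustrative sketch of the cyber-physical tele-operation model.
# The virtual marker lives in control-room coordinates; the mapping
# below is the abstraction layer that turns it into a safe robot target.
from dataclasses import dataclass


@dataclass
class Pose:
    x: float
    y: float
    z: float


def clamp(v, lo, hi):
    return max(lo, min(hi, v))


def map_marker_to_robot(marker: Pose, scale: float = 0.5,
                        workspace=((-0.5, 0.5), (-0.5, 0.5), (0.0, 1.0))) -> Pose:
    """Scale the virtual marker's position and clamp it to the robot's
    reachable workspace. The operator never commands the robot directly;
    only the mapped, bounded pose is sent on."""
    (xl, xh), (yl, yh), (zl, zh) = workspace
    return Pose(
        clamp(marker.x * scale, xl, xh),
        clamp(marker.y * scale, yl, yh),
        clamp(marker.z * scale, zl, zh),
    )


# The operator drags the marker far beyond the robot's reach;
# the mapped command stays inside the workspace.
cmd = map_marker_to_robot(Pose(2.0, -3.0, 0.4))
print(cmd)  # Pose(x=0.5, y=-0.5, z=0.2)
```

This decoupling is what lets the cyber-physical model filter and reinterpret the operator's motion, at the cost of some immersion compared with the direct model.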
Compared with similar state-of-the-art VR implementations, the system performed remarkably well, completing tasks in nearly half the time and with a much higher success rate. However, the technology is still far from industry use. Success in a controlled environment with specific, controlled tasks is one thing; moving to industrial applications will require significantly more complicated tasks, involve greater risks, and is still some time away.
The potential benefits are intriguing. While a human-controlled robot may not be ideal for a standard manufacturing process – which relies on quick, repeated operations and high levels of precision – it could excel in open-environment applications. Inspection and repair come to mind immediately, especially in the marine and power generation industries, where harsh environments and remote locations restrict access. The core concept, essentially a more immersive drone-control method, is well suited to the aerospace & defense industry, which explains Boeing's involvement. But I don't think the opportunities end there. This technology could enable instant worldwide access for any job that requires a human's full control over a piece of industrial equipment. In the future, roads, bridges, buildings, and plants might be constructed by machines whose operators tele-commute from all around the globe. Furthermore, MIT is investigating moving the project one level up: rather than viewing and controlling a single piece of equipment, the operator would have a view of several pieces of equipment, allowing him or her to choreograph their movement in 3D space like a virtual foreman.