Soon You’ll Sit Inside a Robot’s Head at Work

MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) has developed a way to teleoperate a Baxter humanoid robot with an Oculus Rift VR headset. The project is aimed in part at making manufacturing jobs telecommutable (and a hell of a lot more fun). It could even be a way to supervise robot workers from a distance.

In a nutshell, the user controls the robot remotely from within a virtual reality environment. That environment is modeled as a control room with multiple sensor displays, making it feel as if the user is sitting inside the robot’s head. Using hand controllers, users map their own movements onto the robot’s to complete various tasks. If you’ve seen Pacific Rim, you’re probably picturing a Jaeger right about now, minus the psychic linking.

The whitepaper (PDF) walks through the choices the team made in developing this system. In the past, teleoperation was usually approached with either a direct model or a cyber-physical model.

In a “direct” model, the user’s vision is coupled directly to the robot’s state, but a delayed signal can cause nausea and headaches because the motion the user sees isn’t matched by any vestibular sensation. In a “cyber-physical” model, the user is kept separate from the robot and interacts with a virtual copy of the robot and its environment. While that method was better for the operator, it required much more data and specialized spaces.

Like any successful hodgepodge, the CSAIL team’s system combines the two previous methods into something a bit more advanced. It solves the delay problem because the user constantly receives visual feedback from the virtual world. It also solves the cyber-physical problem of feeling distinct from the robot: once a user puts on the headset and logs into the system, it feels as if they are sitting inside Baxter’s head. The brilliance of the system is that it mimics the “homunculus model of mind”: the idea that there is a tiny human inhabiting our brains, controlling our actions, viewing the images we see, and understanding them for us.
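To make that decoupling concrete, here is a minimal sketch in Python of how a hybrid loop like this might be structured: the virtual control room is rendered locally every frame, while controller poses go out and robot state comes back over a delayed link. This is an illustration under assumptions, not the CSAIL implementation, and every class, function, and constant name below is hypothetical.

```python
# A minimal sketch (assumptions, not the CSAIL implementation) of the
# hybrid idea: render the virtual "control room" locally every frame,
# while controller poses go out and robot state comes back over a
# delayed link. All names here are hypothetical illustrations.

import collections
import time

NETWORK_DELAY_S = 0.25  # assumed one-way delay to/from the robot


class DelayedChannel:
    """Queues messages and releases them only after a fixed delay,
    standing in for the network link between operator and robot."""

    def __init__(self, delay_s):
        self.delay_s = delay_s
        self.queue = collections.deque()

    def send(self, msg):
        self.queue.append((time.monotonic() + self.delay_s, msg))

    def receive(self):
        ready = []
        while self.queue and self.queue[0][0] <= time.monotonic():
            ready.append(self.queue.popleft()[1])
        return ready


def render_virtual_control_room(headset_pose, robot_state):
    # Placeholder for the actual VR rendering of the control room.
    print(f"frame: head={headset_pose} robot={robot_state}")


def control_loop(headset_pose, controller_pose, to_robot, from_robot,
                 robot_state):
    """One VR frame: always render from the freshest local data,
    even if the robot's reported state is stale."""
    # 1. Forward the operator's hand pose; the robot mirrors it later.
    to_robot.send({"target_gripper_pose": controller_pose})

    # 2. Fold in whatever state updates have made it back so far.
    for update in from_robot.receive():
        robot_state.update(update)

    # 3. Render the control room. The sensor displays may show stale
    #    robot data, but the scene itself tracks the operator's head
    #    motion every frame, so vision and vestibular sense stay matched.
    render_virtual_control_room(headset_pose, robot_state)


if __name__ == "__main__":
    to_robot = DelayedChannel(NETWORK_DELAY_S)
    from_robot = DelayedChannel(NETWORK_DELAY_S)
    state = {"gripper_pose": (0.0, 0.0, 0.0)}
    for frame in range(3):
        control_loop((0.0, 0.0, frame), (1.0, 2.0, 3.0),
                     to_robot, from_robot, state)
        time.sleep(1 / 90)  # assumed 90 Hz headset refresh
```

The point is step 3: network lag only shows up as stale readouts on the virtual displays, never as lagging head-tracked vision, which is how the team sidesteps the nausea problem of the direct model.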

Unsurprisingly, users with gaming experience had a much easier time with the system. Much like the military has shown an interest in skilled FPS players, we might soon see a wave of recruitment of video gamers into the manufacturing industry. Maybe all of those hours poured into the claw machine were worth it after all.

[Thanks Adam]

