Gesture-Based Commanding of Robotic Movements
What sorts of gestures might be used to command or fine-tune a robot's movements? How does one program a computer to interpret gestures? These are the first problems I set out to solve.
I started by trying to interpret "nudge" gestures imparted to the end-effector of a robot to modify the way it was moving. It is rather trivial to have the robot simply move in the direction it is pushed. Instead, I assume the robot has a finite set of trajectories it can follow. The nudges fine-tune those trajectory shapes, making them longer or shorter, wider or skinnier, and so on. This is still not too hard. The real challenge is interpreting a change from one shape to another.
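As a rough illustration of the fine-tuning idea, the sketch below decomposes a sensed nudge into components along and across the direction of motion and maps them to changes in the trajectory's shape parameters. The parameter names, the 2-D simplification, and the gain are all assumptions for illustration, not details of the actual system.

```python
import math

def interpret_nudge(nudge, heading, length, width, gain=0.5):
    """Map a 2-D nudge force into changes to a trajectory's shape parameters.

    nudge   -- (fx, fy) force sensed at the end-effector
    heading -- current direction of travel along the trajectory, in radians
    length  -- current trajectory length parameter
    width   -- current trajectory width parameter
    gain    -- scaling from nudge magnitude to parameter change (illustrative)
    """
    fx, fy = nudge
    # Decompose the nudge into components along and across the motion.
    along = fx * math.cos(heading) + fy * math.sin(heading)
    across = -fx * math.sin(heading) + fy * math.cos(heading)
    # A push along the motion lengthens the shape; a sideways push widens it.
    return length + gain * along, width + gain * across

# Example: robot traveling along +x, nudged mostly forward.
new_length, new_width = interpret_nudge((1.0, 0.4), heading=0.0,
                                        length=2.0, width=1.0)
```

Deciding whether such a nudge means "stretch this shape" or "switch to a different shape entirely" is the harder interpretation problem noted above.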
Is this useful? Here is a mock-up of a cable harness routing task. In routing the wires, the robot has a finite number of shapes it needs to drag wires through the bed of nails (pegs that facilitate bending the wire). A human can demonstrate the path and then nudge the robot to fine-tune the trajectory.
Image Sequence of Cable Harnessing (about 300 Kbytes)
Gesture-Based Programming Demos
Gesture-Based Programming (GBP) is a form of programming by human demonstration aimed at contact-intensive tasks. First, the human demonstrates the task:
Image Sequence of Task Demonstration (about 300 Kbytes)
Then, the robotic system (here, a PUMA arm with a Utah/MIT hand) attempts to replicate the task based on its own previously-learned skills and its observations of the demonstration:
Image Sequence of Robotic Execution (about 300 Kbytes)
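One way to picture the replication step is as matching segments of the observed demonstration to the closest previously-learned skill primitive. The sketch below does this with a nearest-neighbor lookup over toy feature vectors; the skill names, features, and prototype values are invented for illustration and are not the actual GBP skill set.

```python
import math

# Illustrative skill prototypes: (average speed, average contact force).
SKILLS = {
    "guarded_move": (0.05, 2.0),
    "free_move":    (0.30, 0.0),
    "grasp":        (0.01, 5.0),
}

def classify_segment(features):
    """Pick the learned skill whose prototype is nearest to the observation."""
    return min(SKILLS, key=lambda s: math.dist(SKILLS[s], features))

# Segmented observations from a demonstration become a sequence of skills
# that the robot then executes with its own controllers.
observations = [(0.28, 0.1), (0.04, 1.8), (0.02, 4.6)]
plan = [classify_segment(f) for f in observations]
```

In this toy version, `plan` comes out as free motion, then a guarded move, then a grasp; the real system must also carry over the parameters (forces, poses) observed during the demonstration.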
Copyright: © 1999 by Richard Voyles
rvoyles [at] purdue [dot] edu