Purdue University

SuperBaxter: Next Generation Collaborative Robot for Manufacturing


Super Baxter is a collaboration among the NSF RoSe-HUB Center, Rethink Robotics, and Barrett Technology to develop a next-generation collaborative robot for manufacturing and assistive applications. The robot combines programming by demonstration, human activity recognition, recognition and display of facial emotion and gestures, and natural language understanding and generation, enabling it to engage in verbal and non-verbal dialog with humans.

Super Baxter Photo

Gesture-Based Programming

Gesture-Based Programming (GBP) is a form of programming by human demonstration based on an expansive definition of gestures that extends beyond traditional hand motions. The process begins by observing a human demonstrate the task to be programmed.

Observation of the human's hand and fingertips is achieved through a sensorized glove with special tactile fingertips. The modular glove system senses hand pose, finger joint angles, and fingertip contact conditions. Objects in the environment are sensed with computer vision, while a speech recognition system extracts "articulatory gestures."

Primitive gesture classes are extracted from the raw sensor information and passed on to a gesture interpretation network. The agents in this network extract the demonstrator's intentions based upon the knowledge previously stored in the system's skill library from prior demonstrations. Like a self-aware human trainee, the system generates an abstraction of the demonstrated task, mapped onto its own skills. In other words, the system is not merely remembering everything the human does, but is trying to understand -- within its scope of expertise -- the subtasks the human is performing ("gesturing"). These primitive capabilities in the skill base take the form of encapsulated expertise agents -- semi-autonomous agents that encode sensorimotor primitives and low-level skills for later execution.
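The pipeline just described -- glove and speech sensing, primitive gesture extraction, and interpretation against a skill library -- can be sketched as a simple dataflow. All class, function, and agent names below are hypothetical stand-ins for illustration, not identifiers from the actual GBP system, and the classification heuristics are placeholders for the trained recognizers:

```python
from dataclasses import dataclass

@dataclass
class GestureObservation:
    """Raw-sensor summary for one time window of a human demonstration."""
    hand_pose: tuple          # (x, y, z, roll, pitch, yaw) from the glove
    finger_angles: list       # joint angles from the modular glove system
    fingertip_contact: bool   # tactile fingertips report a contact condition
    spoken_words: list        # tokens from the speech recognition system

def extract_primitive_gestures(obs: GestureObservation) -> list:
    """Classify raw sensor information into primitive gesture classes.
    (Stand-in rules; the real system uses trained gesture recognizers.)"""
    gestures = []
    if obs.fingertip_contact:
        gestures.append("grasp")
    if obs.spoken_words:
        gestures.append("articulatory")
    if not gestures:
        gestures.append("free_motion")
    return gestures

# Skill library: maps interpreted gestures to encapsulated expertise agents
# learned from prior demonstrations (names are illustrative only).
SKILL_LIBRARY = {
    "grasp": "guarded_grasp_agent",
    "articulatory": "verbal_annotation_agent",
    "free_motion": "transport_agent",
}

def interpret(gestures: list) -> list:
    """Map primitive gestures onto the system's own skills, producing an
    abstraction of the task rather than a verbatim replay of the human."""
    return [SKILL_LIBRARY[g] for g in gestures if g in SKILL_LIBRARY]
```

The point of the sketch is the separation of stages: sensing produces observations, extraction reduces them to primitive gesture classes, and interpretation maps those classes onto skills the system already owns.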

Gesture-Based Programming Photo

The output of the GBP system is the executable program for performing the demonstrated task on the target hardware. This program consists of a network of encapsulated expertise agents of two flavors. The primary agents implement the primitives required to perform the task and come from the pool of primitives represented in the skill base. The secondary set of agents includes many of the same gesture recognition and interpretation agents used during the demonstration. These agents perform on-line observation of the human to allow supervised practicing of the task for further adaptation.
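One way to picture the two flavors of agents in the output program is as a small network object holding primary (task-executing) and secondary (human-observing) agents. This is a minimal sketch under assumed names; none of the classes or agent names come from the GBP implementation:

```python
class ExpertiseAgent:
    """Semi-autonomous agent encapsulating a sensorimotor primitive or skill."""
    def __init__(self, name: str, role: str):
        self.name = name
        self.role = role  # "primary" performs the task; "secondary" observes

class GBPProgram:
    """Executable output of GBP: a network of encapsulated expertise agents."""
    def __init__(self):
        self.agents = []

    def add(self, agent: ExpertiseAgent):
        self.agents.append(agent)

    def primaries(self) -> list:
        # Primitives drawn from the skill base that perform the task itself.
        return [a for a in self.agents if a.role == "primary"]

    def secondaries(self) -> list:
        # Recognition/interpretation agents reused from the demonstration
        # phase for on-line observation during supervised practicing.
        return [a for a in self.agents if a.role == "secondary"]

# Assemble an illustrative program for a demonstrated pick-and-place task.
program = GBPProgram()
program.add(ExpertiseAgent("guarded_grasp", "primary"))
program.add(ExpertiseAgent("transport", "primary"))
program.add(ExpertiseAgent("gesture_recognizer", "secondary"))
```

Keeping the secondary agents in the deployed program is what allows the human to correct the robot during practice runs: the same recognizers that interpreted the demonstration continue to watch for corrective gestures at execution time.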

Demos

Link to Demo

Gesture-Based Programming Demos

RoSe-HUB Publications

Prior Publications

  • S. Watters, T. Miller, P. Balachandran, W. Schuler, R. Voyles, "Exploiting a Sensed Environment to Improve Human-Agent Communication," in Proceedings of the 4th International Joint Conference on Autonomous Agents and Multi-Agent Systems.
  • M.S. Sutton, A. Larson, R. Voyles, "Performance Evaluation of Sensorimotor Primitives Using Eigenvector Learning Method," in Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, v. 1, pp. 963-967.
  • P.E. Rybski and R.M. Voyles, "Task Training of a Mobile Robot Using a Hidden Markov Model-Based Gesture Recognition System," in Proceedings of the 1999 IEEE International Conference on Robotics and Automation, Detroit, MI, v. 1, pp. 664-669.
  • R.M. Voyles and P.K. Khosla, "Gesture-Based Programming: A Preliminary Demonstration," in Proceedings of the 1999 IEEE International Conference on Robotics and Automation, Detroit, MI, v. 1, pp. 708-713.
  • R.M. Voyles, J.D. Morrow, and P.K. Khosla, "Towards Gesture-Based Programming: Shape from Motion Primordial Learning of Sensorimotor Primitives," Journal of Robotics and Autonomous Systems, v. 22, n. 3-4, Dec. 1997, pp. 361-375.

Additional Support

This work is supported by the NSF Center for Robots and Sensors for the Human Well-Being through CNS-1439717 with additional support from an NSF MRI grant, CNS-1427872.


Copyright: © 2017, 2018 by Richard Voyles

rvoyles [at] purdue [dot] edu

Purdue University, West Lafayette, IN 47907, (765) 494-3733