Gesture-Based Programming (GBP) is a form of programming by human
demonstration.
The process begins by observing a human demonstrate the task to be programmed.
The human's hand and fingertips are observed with a modular sensorized glove fitted with special tactile fingertips; this glove system senses hand pose, finger joint angles, and fingertip contact conditions. Objects in the environment are sensed with computer vision, while a speech recognition system extracts "articulatory gestures."
Primitive gesture classes are extracted from the raw sensor information and passed on to a
gesture interpretation network. The agents in this network extract the demonstrator's intentions based upon knowledge stored in the system's skill library during prior demonstrations. Like a self-aware human trainee, the system is
able to generate an abstraction of the demonstrated task, mapped onto its own skills. In other words, the system is not merely
remembering everything the human does, but is trying to understand -- within its scope of expertise -- the subtasks the human
is performing ("gesturing"). These primitive capabilities in the skill base take the form of encapsulated expertise agents -- semi-autonomous agents that encode sensorimotor primitives and low-level skills for later execution.
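The mapping from observed gestures onto stored skills can be sketched in code. This is a minimal illustration only; the class and function names (`ExpertiseAgent`, `SkillLibrary`, the 0.5 acceptance threshold) are hypothetical and not the actual GBP interfaces.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional

# Illustrative sketch only: names and the 0.5 threshold are assumptions,
# not the actual GBP implementation.
@dataclass
class ExpertiseAgent:
    """An encapsulated expertise agent: a recognizer for 'does this
    gesture map onto my skill?' bundled with the executable primitive."""
    name: str
    matches: Callable[[dict], float]   # confidence that a gesture instance is this skill
    execute: Callable[[dict], None]    # the sensorimotor primitive itself

class SkillLibrary:
    """Pool of primitives accumulated from prior demonstrations."""
    def __init__(self) -> None:
        self.agents: Dict[str, ExpertiseAgent] = {}

    def register(self, agent: ExpertiseAgent) -> None:
        self.agents[agent.name] = agent

    def interpret(self, gesture: dict) -> Optional[ExpertiseAgent]:
        """Abstract an observed gesture onto the best-matching stored
        skill, i.e. 'understand' the subtask rather than replay raw
        motion. Returns None if no stored skill explains the gesture."""
        best = max(self.agents.values(),
                   key=lambda a: a.matches(gesture), default=None)
        if best is not None and best.matches(gesture) > 0.5:
            return best
        return None
```

Returning `None` for an unexplained gesture reflects the "within its scope of expertise" caveat above: a gesture outside the skill base is simply not interpreted.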
The output of the GBP system is the executable program for performing the demonstrated task on the target hardware. This
program consists of a network of encapsulated expertise agents of two flavors. The primary agents implement the primitives
required to perform the task and come from the pool of primitives represented in the skill base. The secondary set of agents
includes many of the same gesture recognition and interpretation agents used during the demonstration. These agents perform
on-line observation of the human to allow supervised practicing of the task for further adaptation.
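The two-flavor structure of the generated program can be sketched as follows: primary agents execute the task's primitives in sequence, while secondary observation agents watch the human during supervised practice and record corrective gestures for later adaptation. All names here are illustrative assumptions, not the actual GBP runtime.

```python
# Hypothetical sketch of the executable program GBP emits. The class
# names and the correction-logging convention are assumptions for
# illustration only.
class PrimaryAgent:
    """Executes one sensorimotor primitive drawn from the skill base."""
    def __init__(self, name, action):
        self.name = name
        self.action = action          # callable(state) mutating task state

    def run(self, state):
        self.action(state)

class ObservationAgent:
    """Gesture recognition/interpretation agent kept on-line so the
    human can supervise practice runs of the generated program."""
    def __init__(self, detect):
        self.detect = detect          # callable(human_input) -> correction or None

    def check(self, human_input):
        return self.detect(human_input)

def run_program(primaries, observers, state, human_inputs):
    """Run each primitive; after each step, let the observers look for
    a corrective gesture and log it against the step it applies to."""
    for primary, human_input in zip(primaries, human_inputs):
        primary.run(state)
        for obs in observers:
            correction = obs.check(human_input)
            if correction is not None:
                state.setdefault("corrections", []).append((primary.name, correction))
    return state
```

In this sketch the logged corrections are simply accumulated in the task state; how the real system folds them back into the skill base is beyond the scope of the example.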
Learning from Demonstration with Coaching
Super Baxter
Super Baxter is a next-generation collaborative robot based on existing technologies from Rethink Robotics and Barrett Technology,
coupled with the Purdue/DU physical avatar for natural human interaction.
Demos
Publications
- G.T. Gonzalez, U. Kaur, M. Rahman, V. Venkatesh, N. Sanchez, G. Hager, Y. Xue, R. Voyles, J. Wachs, "From the
Dexterous Surgical Skill to the Battlefield—A Robotics Exploratory Study," in Military Medicine, v. 186, Supplement 1, pp. 288-294, 2021.
- Md Masudur Rahman, Mythra V. Balakuntala, Glebys Gonzalez, Mridul Agarwal, Upinder Kaur, Vishnunandan L. N. Venkatesh, Natalia Sanchez-Tamayo, Yexiang Xue, Richard M. Voyles, Vaneet Aggarwal & Juan Wachs,
"SARTRES: a semi-autonomous robot teleoperation environment for surgery," in Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization, DOI: 10.1080/21681163.2020.1834878, 2020.
- N. Madapana, M.M. Rahman, N. Sanchez-Tamayo, M.V. Balakuntala, ... R. Voyles, Y. Xue, J. Wachs, "DESK: A Robotic Activity Dataset for Dexterous Surgical Skills Transfer to Medical Robots,"
in Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 2019.
- ME Cabrera, N Sanchez-Tamayo, R Voyles, JP Wachs, "One-shot Gesture Recognition: One Step Towards Adaptive Learning,"
in 12th IEEE International Conference on Automatic Face & Gesture Recognition, pp. 784-789, 2017.
- ME Cabrera, K Novak, D Foti, R Voyles, JP Wachs, "What Makes a Gesture a Gesture? Neural Signatures Involved in Gesture Recognition,"
in 12th IEEE International Conference on Automatic Face & Gesture Recognition, pp. 748-753, 2017.
- T. Soratana, M.V.S.M. Balakuntala, P. Abbaraju, R. Voyles, J. Wachs, M. Mahoor, "Glovebox Handling of High-Consequence
Materials with Super Baxter and Gesture-Based Programming - 18598",
in Proceedings of the 44th International Symposium on Waste Management (WM 2018), Phoenix, AZ, Mar. 2018.
- S. Watters, T. Miller, P. Balachandran, W. Schuler, R. Voyles,
"Exploiting a Sensed Environment to Improve Human-Agent Communication," in
Proceedings of the 4th International Joint Conference on Autonomous Agents
and Multi-Agent Systems (AAMAS), Utrecht, The Netherlands, 2005.
- M.S. Sutton, A. Larson, R. Voyles, "Performance Evaluation of
Sensorimotor Primitives Using Eigenvector Learning Method," in Proceedings
of the 2001 IEEE/RSJ International Conference on Intelligent Robots and
Systems, v. 1, pp. 963-967.
- P.E. Rybski and R.M. Voyles, "Task Training of a Mobile Robot Using a
Hidden Markov Model-Based
Gesture Recognition System," in Proceedings of the 1999 IEEE
International Conference on
Robotics and Automation, Detroit, MI, v. 1, pp. 664-669.
- R.M. Voyles and P.K. Khosla,
"Gesture-Based Programming: A Preliminary
Demonstration," in
Proceedings of the 1999 IEEE International Conference on Robotics and
Automation, Detroit, MI, v. 1, pp. 708-713.
- R.M. Voyles, J.D. Morrow, and P.K. Khosla, "Towards Gesture-Based
Programming: Shape from Motion Primordial Learning of Sensorimotor
Primitives," Journal of Robotics and Autonomous Systems, v. 22,
n. 3-4, Dec. 1997, pp. 361-375.
Copyright: © 1999, 2006, 2017-2021 by Richard Voyles
rvoyles [at] purdue [dot] edu