STAR

Surgical Telementoring using AR

An innovative System for Telementoring with Augmented Reality (STAR) that relies on table and touchscreen displays, transparent screens, tablets, and color and depth sensors to improve the quality of communication between mentor and trainee.

Read more
image01

FIST-D

A Robotic Explosive Ordnance Disposal (EOD)

Our telerobot can deliver an easily interpretable multimodal image, learn from limited observations using principles of transfer learning, and recognize objects from texture, chemical, and weight signatures.

Read more
image02

Gestonurse

A robotic nurse that understands human gestures

Our goal is to bridge these gaps by introducing a robotic assistant that understands speech and gestural commands, and can anticipate or detect a surgeon's requests without requiring the surgeon to alter her behavior or undergo retraining.

Read more
image03

Welcome to the website of Juan P. Wachs at the School of Industrial Engineering at Purdue University!

Our research lies at the nexus of robotics, interaction, and data science, specifically as applied to healthcare, in line with the foci of Human Systems and Healthcare Systems. Throughout our work at Purdue, our lab has focused on human-machine interaction, specifically in the medical field. Key areas of research in the ISAT Lab include surgical robotics, hand gesture interfaces, and assistive technologies.

Research

Publications

Videos and News

Our Research Projects

This is a sample (non-exhaustive) list of research projects, to give you an idea of the type of work done at the ISAT Lab.

forward sketch

Autonomous Surgical Robots

Goal: Algorithms to Make Surgical Robots Intelligent

Our objective is to develop a theoretical framework for supervised autonomy, capable of self-adjusting its autonomous behavior and performing procedures in never-before-seen settings using a transfer-learning paradigm. Our working hypothesis is that an existing procedure can be adapted to a new domain using an encoding scheme to restore supervisory content, combined with a one-shot learning framework.
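As a toy illustration of the adaptation idea (not the lab's actual method), the sketch below re-embeds a demonstrated tool trajectory from a source workspace into a new target workspace by per-axis normalization; the function name, the bounds representation, and the linear mapping are all assumptions made for illustration:

```python
def adapt_trajectory(traj, src_bounds, dst_bounds):
    """Map a demonstrated tool path from a source workspace into a
    new target workspace by per-axis normalization and re-scaling."""
    adapted = []
    for point in traj:
        new_point = []
        for x, (s_lo, s_hi), (d_lo, d_hi) in zip(point, src_bounds, dst_bounds):
            t = (x - s_lo) / (s_hi - s_lo)          # normalize to [0, 1]
            new_point.append(d_lo + t * (d_hi - d_lo))  # re-embed in target
        adapted.append(tuple(new_point))
    return adapted
```

A real system would of course also have to restore the supervisory content of the procedure, not just rescale geometry; this only shows the domain-mapping step.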

Sample paper:

Rahman MM, Sanchez-Tamayo N, Gonzalez G, Agarwal M, Aggarwal V, Voyles RM, Xue Y, Wachs J. Transferring Dexterous Surgical Skill Knowledge between Robots for Semi-autonomous Teleoperation In 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) 2019 Oct 14 (pp. 1-6).

project Gestix

Gestureclean

Goal: Develop an optimal vocabulary for gesture control in the OR

The use of keyboards and mice in the surgical setting can compromise sterility and spread infection. Instead, interfaces based on natural means of communication, such as gestures, have been suggested to address this problem. This research proposes an analytical and systematic approach for the design of gesture lexicons for the operating room.
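A minimal sketch of what an analytical lexicon design could look like, assuming (hypothetically) that each candidate gesture has been rated on criteria such as intuitiveness and fatigue; the criteria names, weights, and greedy assignment are illustrative, not the paper's actual procedure:

```python
def score(gesture, weights):
    # Weighted sum of the gesture's per-criterion ratings.
    return sum(weights[c] * gesture[c] for c in weights)

def build_lexicon(commands, candidates, weights):
    """Assign each command the highest-scoring candidate gesture,
    never reusing the same gesture for two commands."""
    used, lexicon = set(), {}
    for cmd in commands:
        best = max((g for g in candidates[cmd] if g["name"] not in used),
                   key=lambda g: score(g, weights))
        lexicon[cmd] = best["name"]
        used.add(best["name"])
    return lexicon
```

For example, with weights `{"intuitiveness": 0.6, "low_fatigue": 0.4}`, a highly intuitive "pinch" would beat a less intuitive "wave" for a zoom command.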

Sample paper:

Madapana N, Gonzalez G, Rodgers R, Zhang L, Wachs JP (2018) Gestures for Picture Archiving and Communication Systems (PACS) operation in the operating room: Is there any standard? PLoS ONE 13(6): e0198092.

project Wheelchair

Robotic Control for Wheelchair AT

Goal: Wheelchair-mounted robotic arm with object recognition

We developed an integrated, computer vision-based system to operate a commercial wheelchair-mounted robotic manipulator (WMRM). In addition, a gesture recognition interface was developed specifically for individuals with upper-level spinal cord injuries, including object tracking, face recognition, and a hands-free WMRM controller.

Sample paper:

Jiang, H., Duerstock, B. S., and Wachs, J. P. (2013). A machine vision-based gestural interface for people with upper extremity physical impairments IEEE Transactions on Systems, Man, and Cybernetics: Systems, 44(5), 630-641.

project Gestonurse

Gestonurse

Goal: Develop a robotic assistant that understands gestures

This is an automated solution to anticipate or detect a surgeon's requests without requiring the surgeon to alter her behavior or undergo retraining, preventing many communication failures in the OR. To address this problem, we developed a holistic approach that considers the variability of surgical teams and instrumentation, surgeons' verbal and nonverbal communication, and the context of the surgery.

Sample paper:

Jacob M, Li YT, Akingba G, Wachs JP. Gestonurse: a robotic surgical nurse for handling surgical instruments in the operating room. Journal of Robotic Surgery. 2012 Mar 1;6(1):53-63.

project telementoring

System for Telementoring using AR

Goal: Increase the sense of co-presence for mentor and surgeon

We develop and assess a System for Telementoring with Augmented Reality (STAR) that increases the mentor's and trainee's sense of co-presence through an augmented visual channel. The technology uses see-through glasses and augmented reality for surgical annotations.

Sample paper:

Rojas-Muñoz E, Cabrera ME, Andersen D, Popescu V, Marley S, Mullis B, Zarzaur B, Wachs J. Surgical telementoring without encumbrance: a comparative study of see-through augmented reality-based approaches. Annals of Surgery. 2019 Aug 1;270(2):384-9.

project manifolds

Accessibility in Gaming for Everyone

Goal: Develop a Vocabulary of Gestures for People with Quadriplegia

The objective of this work is to incorporate gesture variability analysis into the existing framework, using robotics as an additional validation platform. On this basis, a physical metric (referred to as work) was empirically obtained to compare the physical effort of each gesture. The vocabularies created were tested in a Pac-Man game.
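To make the "work" metric concrete: mechanical work along a gesture can be approximated by summing force magnitude times displacement over consecutive trajectory samples. The sketch below is a generic illustration of that idea under the assumption of piecewise-constant force per segment, not the paper's empirical protocol:

```python
import math

def gesture_work(positions, forces):
    """Approximate the mechanical work of a gesture: for each pair of
    consecutive position samples, multiply the segment's force
    magnitude by the displacement and accumulate.
    positions: list of (x, y) points; forces: one (fx, fy) per segment."""
    total = 0.0
    for (p0, p1), f in zip(zip(positions, positions[1:]), forces):
        disp = math.dist(p0, p1)          # displacement of this segment
        total += math.hypot(*f) * disp    # |F| * |d|
    return total
```

A lower total suggests a less fatiguing gesture, which is what allows candidate vocabularies to be ranked by physical effort.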

Sample paper:

Jiang H, Duerstock BS, Wachs JP. User-centered and analytic-based approaches to generate usable gestures for individuals with quadriplegia. IEEE Transactions on Human-Machine Systems. 2015 Nov 20;46(3):460-6.

fist-D

FIngers See Things Differently (FIST-D)

Goal: Model algorithms to identify and co-act on threats

In addition to its main application, the theories and technologies involved in this proposal impact other fields in which dexterity and tactile feedback are key to successful task completion, such as tele-surgery. A unique feature of this project is a set of bimanual tool tips, equipped with multi-sensory devices for collecting tactile, force, and chemical composition information for target characterization and action.

Sample paper:

Xiao C, Wachs J. Triangle-Net: Towards Robustness in Point Cloud Classification. arXiv preprint arXiv:2003.00856. 2020 Feb 27.

turn-taking

Turn Taking in Surgery

Goal: Using Spiking Neural Networks for Turn-Taking Prediction

We present the Turn-Taking Spiking Neural Network (TTSNet), a cognitive model for early prediction of a human or agent's turn-taking intentions. The TTSNet framework relies on implicit and explicit multimodal communication cues (physical, neurological, and physiological) to predict when a turn-taking event will occur in a robust and unambiguous fashion.
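The basic computational unit of a spiking network is the leaky integrate-and-fire (LIF) neuron: its membrane potential leaks toward zero, accumulates input current, and emits a spike when it crosses a threshold. The sketch below is a textbook single-neuron illustration, not TTSNet itself; the decay and threshold values are arbitrary:

```python
def lif_simulate(inputs, decay=0.9, threshold=1.0):
    """Leaky integrate-and-fire neuron: at each time step the membrane
    potential decays, integrates the input current, and a spike (1) is
    emitted when the threshold is crossed, after which it resets."""
    v, spikes = 0.0, []
    for i in inputs:
        v = v * decay + i      # leak, then integrate input current
        if v >= threshold:
            spikes.append(1)
            v = 0.0            # reset after spiking
        else:
            spikes.append(0)
    return spikes
```

Because the potential integrates evidence over time, a sustained sub-threshold cue stream eventually produces a spike, which is the property that makes such neurons natural for *early* event prediction.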

Sample paper:

Zhou T, Wachs JP. Spiking Neural Networks for early prediction in human–robot collaboration. The International Journal of Robotics Research. 2019 Dec; 38(14):1619-43.

project haptimage

Multimodal Interface for the Blind

Goal: An interface to help individuals who are blind explore images

This research developed a real-time multimodal user interface that conveys visual information to blind users through haptic, auditory, and vibrational feedback. It can also suggest exploratory strategies, based on the users' exploration behaviors, to help them understand the image more accurately and efficiently.
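One simple way such an interface can render an image through touch is to map local contrast at the touched pixel to vibration intensity, so edges feel "rough" and flat regions feel smooth. The sketch below assumes a grayscale image with values in [0, 1] and a central-difference gradient; it is an illustrative mapping, not the system's actual rendering pipeline:

```python
def vibration_at(image, r, c):
    """Map local contrast (gradient magnitude) at a touched interior
    pixel (r, c) to a vibration intensity in [0, 1]."""
    gx = image[r][c + 1] - image[r][c - 1]   # horizontal difference
    gy = image[r + 1][c] - image[r - 1][c]   # vertical difference
    mag = (gx * gx + gy * gy) ** 0.5
    return min(1.0, mag / 2.0)               # normalize and clamp
```

Sweeping a finger across the image then produces a vibration profile that traces object boundaries, which is one channel through which image structure can be conveyed non-visually.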

Sample paper:

Zhang, T; B. S. Duerstock, and Wachs, JP. 2017. Multimodal Perception of Histological Images for Persons Who Are Blind or Visually Impaired. ACM Trans. Access. Comput. 9, 3, Article 7 (February 2017).

  • project forward

    FORWARD

    Autonomous, Unmanned Medical Robots

  • project Gestix

    Gestureclean

    Touchless MRI Navigation System

  • project wheelchair

    Robotic control for Wheelchair

    Wheelchair-mounted robot for assistance

  • project Gestonurse

    GESTONURSE

    The first multimodal robotic assistant for the OR

  • project telementoring

    STAR

    Telementoring using Augmented Reality

  • project manifolds

    Accessibility in Gaming for Everyone

    Gestures for Individuals with Quadriplegia

  • fist-d

    FIST-D

    Bomb dismantling through tactile exploration

  • turn-taking

    Turn Taking in Surgery

    Using Spiking Neural Networks for Turn Taking

  • project haptimage

    Accessible Imaging for the Blind

    Empowering blind people to explore images

ISAT Lab Members

Welcome to the ISAT (Intelligent Systems and Assistive Technologies) Laboratory. Our team is diverse, multicultural, and above all friendly. The environment and culture at the lab are ones of inclusion and tolerance. If you are interested in robotics, AI, and HCI for medical applications, you have a place with us!

team 1

Juan P Wachs, PhD

Principal Investigator

Professor

(Currently Program Director at NSF)

University Faculty Scholar

Regenstrief Center for

Healthcare Engineering

Adjunct Professor of Surgery

IU School of Medicine

Professor of Biomedical

Engineering (by courtesy)

School of Industrial Engineering

team 1

Naveen Madapana

PhD Student

School of Industrial Engineering

Purdue University

nmadapan{at}purdue{dot}edu

Chenxi

Chenxi Xiao

PhD Student

School of Industrial Engineering

Purdue University

xiao237{at}purdue{dot}edu

Edgar Rojas

Edgar Rojas

PhD Student

School of Industrial Engineering

Purdue University

edtec217{at}gmail{dot}com

Glebys Gonzalez

Glebys Gonzalez

PhD Student

School of Industrial Engineering

Purdue University

gonza337{at}purdue{dot}edu

Ting Zhang

Ting Zhang

PhD (Alumni)

School of Industrial Engineering

Purdue University

juliet.tingcheung{at}gmail{dot}com

Juan Antonio

Juan Antonio Barragan

Master's Student

School of Industrial Engineering

Purdue University

barragan{at}gmail{dot}com

Xingguang Zhang

Xingguang Zhang

Master's Student

School of Electrical and Computer Engineering

Purdue University

zhan3275{at}purdue{dot}edu

Natalia Sanchez

Natalia Sanchez Tamayo

Master's (Alumni)

School of Industrial Engineering

Purdue University

n.sanchez3330{at}gmail{dot}com

Nuela Enebechi

Nuela Enebechi

PhD Student

School of Industrial Engineering

Purdue University

cenebech{at}gmail{dot}com

Daniela Chanci Arrubla

Daniela Chanci Arrubla

Master's Student

School of Industrial Engineering

Purdue University

dchancia{at}purdue{dot}edu

Natalia Sanchez

Akash Agarwal

Master's Student

School of Industrial Engineering

Purdue University

sanch174{at}gmail{dot}com

About Us

Our vision is to enable robots to understand the variability and range of human motions and gestures (including physical constraints) to support patient rehabilitation (instead of the other way around), to facilitate work with machines, and to improve surgical outcomes in the Operating Room through medical robotics.

We are currently working on fundamentally new problems at the intersection of robotics and human-AI interaction. We are particularly interested in new AI- and data-science-based paradigms that can help transfer learning from controlled settings to uncontrolled or austere scenarios. Of special interest are transfer learning theories that explore the ability of machines to recognize actions, and execute them effectively, after observing only a few instances.

Skills Needed

  • Robotics

  • Machine Learning & AI

  • Machine Vision

  • Data Science & Python

We're Hiring

We are always looking for excellent students with strong communication and programming skills, who are team players and excited to work with robots in clinical settings. We look forward to having you join us and enjoy the wonderful academic environment at Purdue.

Covers in Scientific Journals and Press

ISAT Lab - Machine Intelligence to Save Lives

Our Work on the Media

These are brief quotes from media appearances of our work at the ISAT Lab.

"Surgeons of the future might use a system that recognizes hand gestures as commands to control a robotic scrub nurse or tell a computer to display medical images of the patient during an operation."

Science Daily

"That’s Dr. Juan Wachs, the professor who helped design this, a machine that’s also good at preventing what he calls “retained instruments.” Retained, as in surgical tools being sewed up inside patients by mistake."

NPR Marketplace

"Purdue University researchers are developing a gesture-driven robotic scrub nurse prototype that may one day relieve the nurse of some of her technical duties or replace the scrub technician"

Fox News

"Blind people 'see' microscope images using touch-feedback device."

NewScientist

"He and his team are developing devices like augmented reality lens to allow surgeons to collaborate via hologram-like visuals in real time from afar, AI-imbued robots to assist doctors in operation. "

WIRED

"Researchers are developing an “augmented reality telementoring” system that could provide effective support to surgeons on the battlefield from specialists thousands of miles away."

Futurity

"Juan Wachs, an assistant professor at Purdue University who builds gestural interfaces to help surgeons work with robots in the operating room."

Perfil Noticias

"Robots in the operating room: the Argentine pioneer of remote surgery"

La Nacion

"Teleportation ala Star Trek—beaming a person place to place—remains science fiction, but a Purdue University researcher is developing a system for surgeons that he believes could be the next best thing."

Inside Indiana Business

Contact Us


ISAT Lab Rm. 128

315 North Grant St

West Lafayette, IN 47906