Header image

Regenstrief Center for Healthcare Engineering Scholar
Assistant Professor, School of Industrial Engineering
Purdue University

 
 

Research


Taurus image
Touchless Robotic Surgery. Advantages of touchless interfaces include allowing more natural gestures for surgical robotic control, as well as maintaining asepsis. Applied to robotic surgery, this type of solution has the potential to let surgeons operate as if they were physically engaged in in-situ surgery. By relying on touchless interfaces, the system can incorporate natural gestures that resemble instinctive hand movements, thus enhancing the user experience, a trending topic in the area of AI User Experience.

By Juan Wachs, Maria Eugenia Cabrera, Tian Zhou, Glevys Gonzales

STAR image










Telementoring using Augmented Reality. Optimal trauma treatment integrates different surgical skills, not all of which are available in military field hospitals. Telementoring can provide the missing expertise, but current systems require the trainee to frequently shift focus to a nearby telestrator, fail to illustrate the next surgical steps, and give the mentor an incomplete picture of the ongoing surgery. We address these gaps by developing STAR, the System for Telementoring with Augmented Reality. We believe that increasing the mentor's and trainee's sense of co-presence through augmented visualization will measurably improve the trainee's surgical performance.

By Juan Wachs, Voicu Popescu, Dan Andersen, Maria Eugenia Cabrera, Aditya Shanghavi

TSP graph

Modeling Attention Through Physical Action. Embodied interaction concerns the way a user senses the environment, acquires information, and exhibits intention through physical action. This project creates a framework that allows decision makers to interact with information using the whole body in intuitive ways, which may offer cognitive advantages and greater efficiency. As a result of this work, we developed a computational framework based on a Bayesian approach (coined BAN) that infers an operator's focus of attention from the operator's physical expressions. Utility theory is then adopted to determine the best combinations of interaction modalities and attentional levels for rendering better feedback to the operator.

By Juan Wachs, Ting Zhang, and Yu-Ting Li
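As a toy illustration of the Bayesian inference step in this kind of framework, a single attention update from one observed physical cue might look like the sketch below. The prior, the likelihood values, and the binary "low/high" attention levels are all made up for illustration; they are not the actual BAN model.

```python
import numpy as np

# Hypothetical two-level attention model: infer P(attention | cue) from one
# observed physical cue (here, gaze directed at the display). All numbers
# below are illustrative assumptions, not parameters from the BAN framework.

ATTENTION = ["low", "high"]
prior = np.array([0.5, 0.5])        # P(attention), assumed uniform
likelihood = np.array([0.2, 0.9])   # P(gaze_on_display | attention)

def posterior_given_gaze_on() -> np.ndarray:
    """Bayes rule: posterior proportional to prior times likelihood."""
    unnorm = prior * likelihood
    return unnorm / unnorm.sum()

post = posterior_given_gaze_on()    # "high" attention becomes more probable
```

A full system would chain such updates over many cues (posture, hand motion, head pose) and feed the posterior into the utility-based modality selection described above.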

Gestonurse
 

Gestonurse: A Robotic Scrub Nurse that Understands Hand Gestures. Up to 31% of interactions between surgeons and scrub nurses in the operating room (OR) involve errors that can have negative effects on patients. The proposed research will reduce the morbidity risks to patients due to communication failures and retained surgical instruments (instruments being left in patients) by introducing a robotic scrub nurse that responds to hand gesture commands. This research provides an accurate and rapid method of detecting the need for specific surgical instruments (through gestures), thus increasing efficiency and lessening the risk of complications. Click here to see a video.

By Juan Wachs, Stephen Adams, Mithun Jacob, and Yu-Ting Li

Gestix II
 

Gestix II: Context-Based Hand-Gesture Recognition for the Operating Room. Keyboards, mice, and touch screens are the main methods of accessing visual information (images) in the operating room. They are also among its main channels of contamination. We plan to develop an effective sterile surgeon-computer interface for the operating room, for medical image browsing and manipulation. Deploying this interface has the potential to reduce healthcare-acquired infections (and thereby costs), while providing a more intuitive, fast, and reliable way for surgeons to access medical imaging.

By Juan Wachs, Rebecca Packer, and Mithun Jacob

Larynx
 

“A Window on Tissue” - Using Facial Orientation to Control Endoscopic Views of Tissue Depth. The goal of this project is to dynamically update the displayed image on a laparoscope’s monitor screen according to the surgeon’s head orientation with respect to the monitor—thereby providing a sense of depth and space. The surgeon’s head movements will be converted into rotation (pan/tilt motion) commands to the laparoscope device. These commands will result in a rotation of the image around the surgical point of interest, thus providing a panoramic view of the tissue. Click here to see a video.

By Juan Wachs and Stephen Adams.
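A minimal sketch of such a head-to-camera mapping is shown below. The gains and dead-zone threshold are assumptions chosen for illustration; the project's actual controller and parameter values are not reproduced here.

```python
# Illustrative mapping from surgeon head orientation (relative to the
# monitor) to laparoscope pan/tilt commands about the surgical point of
# interest. PAN_GAIN, TILT_GAIN and DEAD_ZONE are assumed values.

PAN_GAIN = 0.5    # camera pan degrees per degree of head yaw (assumed)
TILT_GAIN = 0.5   # camera tilt degrees per degree of head pitch (assumed)
DEAD_ZONE = 2.0   # ignore small head jitter below this angle (degrees)

def head_to_camera(yaw_deg: float, pitch_deg: float) -> tuple[float, float]:
    """Return (pan, tilt) commands; zero inside the dead zone."""
    pan = PAN_GAIN * yaw_deg if abs(yaw_deg) > DEAD_ZONE else 0.0
    tilt = TILT_GAIN * pitch_deg if abs(pitch_deg) > DEAD_ZONE else 0.0
    return pan, tilt
```

The dead zone keeps the image stable when the surgeon's head is nominally still, so only deliberate head movements rotate the view.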


marines

Marine localization, pose estimation and classification. We consider the problem of detecting and classifying Marines' postures in still images, where the background is unconstrained, cluttered, and not modeled. This requires detecting a large number of different objects and views. We focus our efforts on a multi-class boosting procedure that favors discriminative features shared by multiple objects and views. As opposed to traditional methods, the implemented approach scales logarithmically with a growing number of classes. Click here to see a video.

By Juan Wachs and Mathias Kolsch.
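A toy illustration of why sharing features across classes can scale logarithmically: with binary codewords, C classes can be separated by only ceil(log2(C)) shared binary decisions instead of C one-vs-all detectors. This simple coding scheme is only an analogy for the scaling behavior, not the paper's actual boosting procedure.

```python
import math

# Assign each class a binary codeword; n_bits shared binary classifiers
# (one per bit) then suffice to distinguish all classes.

def codeword(class_id: int, n_bits: int) -> list[int]:
    """Binary code of a class id, least-significant bit first."""
    return [(class_id >> b) & 1 for b in range(n_bits)]

def decode(bits: list[int]) -> int:
    """Recover the class id from its bit responses."""
    return sum(bit << b for b, bit in enumerate(bits))

C = 20                              # e.g. posture classes times views
n_bits = math.ceil(math.log2(C))    # only 5 shared decisions for 20 classes
```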


IR track  

Tracking objects at sub-pixel resolution. Robust video surveillance systems require detection of events at sub-pixel resolution to be practical. This work involves background modeling using both spatial and intensity distribution characteristics. The former assumes a normal model for the background and noise intensity distribution, while the latter is captured by an online estimate of the neighborhood structure affected by the sub-pixel event.

By Juan Wachs, Mathias Kolsch and Kevin Squire.
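The intensity-distribution component can be sketched as a standard per-pixel Gaussian background model, shown below with an assumed k-sigma threshold; this is a textbook baseline, not the system's actual implementation.

```python
import numpy as np

# Per-pixel Gaussian background model: a pixel is flagged as foreground
# when its intensity deviates more than k standard deviations from the
# per-pixel mean estimated over background-only frames. k is assumed.

class GaussianBackground:
    def __init__(self, frames: np.ndarray, k: float = 3.0):
        # frames: (T, H, W) stack of background-only training frames
        self.mean = frames.mean(axis=0)
        self.std = frames.std(axis=0) + 1e-6   # avoid division by zero
        self.k = k

    def foreground_mask(self, frame: np.ndarray) -> np.ndarray:
        """Boolean mask of pixels deviating beyond k sigma."""
        return np.abs(frame - self.mean) > self.k * self.std
```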


detect apples  

Enhancement of Sensing Technologies for Selective Tree Fruit Identification and Targeting in Robotic Harvesting Systems. A vision system is developed that recognizes occluded apples within a tree canopy using IR and color imagery. The optimal registration parameters for the affine transformation are obtained by maximizing mutual information. Haar features are then applied separately to the color and IR images, and finally, a voting scheme is proposed that reduces false alarms without affecting the recognition rate.

By Juan Wachs, Victor Alchanatis, Helman Stern and Tom Burks.
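A minimal sketch of mutual-information-driven alignment is shown below. For brevity it searches only integer horizontal shifts between two grayscale images; the actual system optimizes full affine parameters between the color and IR channels.

```python
import numpy as np

# Mutual information from a joint intensity histogram, then a brute-force
# 1-D search for the shift that maximizes it. Toy version of MI-based
# registration: real affine registration searches a continuous parameter
# space, not integer translations.

def mutual_information(a: np.ndarray, b: np.ndarray, bins: int = 16) -> float:
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()                    # joint distribution
    px = pxy.sum(axis=1, keepdims=True)        # marginals
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                               # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def best_shift(fixed: np.ndarray, moving: np.ndarray, max_shift: int = 3) -> int:
    """Integer horizontal shift of `moving` that maximizes MI with `fixed`."""
    shifts = range(-max_shift, max_shift + 1)
    return max(shifts,
               key=lambda s: mutual_information(fixed, np.roll(moving, s, axis=1)))
```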


surgeon using Gestix  

A Gesture-based Tool for Sterile Browsing of Radiology Images. The use of doctor-computer interaction devices in the operating room (OR) requires new modalities that support medical imaging manipulation while allowing doctors' hands to remain sterile, supporting their focus of attention, and providing fast response times. "Gestix" is a vision-based hand gesture capture and recognition system that interprets the user's gestures in real time for navigation and manipulation of images in an electronic medical record (EMR) database. Click here for details.

By Juan Wachs, Helman Stern, Yael Edan, Michael Gillam, Jon Handler, Mark Smith and Craig Feied.


 

Optimal Hand Gesture Vocabulary Design Methodology for Robot Control. A global approach to hand gesture vocabulary (GV) design is proposed that includes human as well as technical design factors. The human-centered factors (intuitiveness, comfort) of multiple users are implicitly represented through indices obtained from ergonomic studies capturing the psycho-physiological aspects of users. The main technical aspect considered is machine recognition of gestures. We believe this is the first conceptualization of the optimal hand gesture design problem in analytical form.

By Juan Wachs, Helman Stern and Yael Edan.
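A toy instance of the selection idea: pick the vocabulary that maximizes a weighted combination of human-centered indices and recognition accuracy. The gesture names, scores, and weights below are invented for illustration and do not come from the ergonomic studies.

```python
from itertools import combinations

# Each gesture gets (intuitiveness, comfort, recognition accuracy) indices
# in [0, 1]; a vocabulary's value is the weighted sum over its gestures.
# All numbers here are hypothetical.

SCORES = {
    "point": (0.9, 0.8, 0.95),
    "fist":  (0.6, 0.9, 0.90),
    "wave":  (0.8, 0.5, 0.70),
    "pinch": (0.7, 0.6, 0.85),
}
WEIGHTS = (0.4, 0.3, 0.3)   # assumed trade-off between the three factors

def value(vocab) -> float:
    return sum(sum(w * s for w, s in zip(WEIGHTS, SCORES[g])) for g in vocab)

def best_vocabulary(k: int):
    """Exhaustive search over all size-k vocabularies (fine for small sets)."""
    return max(combinations(sorted(SCORES), k), key=value)
```

The real problem is combinatorial and far larger, which is what makes the analytical formulation (rather than exhaustive search) interesting.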


 

Tele-Gest Project: Hand Gestures Telerobotic Control. This project describes a telerobotic gesture-based user interface system using visual recognition. Experimental results showed that the system satisfies the requirements for a robust and user-friendly input device. The Fuzzy C-Means algorithm provided sufficient speed and reliability to perform the desired tasks. Although gestures were recognized quickly and sent in packet form over the Internet, successful execution of the commands could not be verified until the image of the robot environment was received at the user interface.

By Juan Wachs, Uri Kartoun, Helman Stern and Yael Edan.
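For reference, a minimal sketch of the standard Fuzzy C-Means algorithm on toy 2-D data is given below (fuzziness exponent m = 2, a common default); this is the textbook algorithm, not the project's actual implementation.

```python
import numpy as np

# Standard Fuzzy C-Means: alternate between (1) computing cluster centers
# as membership-weighted means and (2) updating fuzzy memberships from
# distances to the centers.

def fcm(X: np.ndarray, c: int = 2, m: float = 2.0, iters: int = 50, seed: int = 0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per point
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-9
        inv = d ** (-2.0 / (m - 1.0))          # standard membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U
```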