For my term project in User Centered Research and Evaluation, I worked with doctors from UPMC (University of Pittsburgh Medical Center) on improving workflow in the operating room. Through a series of interviews and contextual inquiries, I discovered that surgeons had a hard time manipulating radiological images such as X-rays, CT scans, and MRIs during surgery: because they must remain sterile at all times to avoid contamination, they cannot touch foreign objects such as a mouse or keyboard.
So I created KineMed, a system that uses a Kinect sensor to provide a gestural interface for browsing these images. Doctors can control playback, navigate between studies, and even adjust brightness without touching anything. Extensive user research went into designing a gestural vocabulary that is deliberate and robust enough for the dynamic environment of the operating room. The prototype was evaluated and refined through think-aloud sessions with surgeons and medical professionals across specialties.
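The source doesn't describe KineMed's implementation, but the core idea of a deliberate, touch-free gesture vocabulary can be sketched as follows. This is a hypothetical illustration, not the actual system: `detect_swipe`, its thresholds, and `ImageNavigator` are all invented names, and the input is simply a window of hand x-positions such as a skeletal-tracking stream might supply.

```python
def detect_swipe(xs, min_travel=0.35, max_frames=15):
    """Classify a window of hand x-positions (meters) as a swipe.

    Returns 'next', 'prev', or None. The thresholds are illustrative:
    requiring substantial, monotonic travel is one way to keep the
    gesture deliberate and avoid accidental triggers mid-surgery.
    """
    if len(xs) < 2 or len(xs) > max_frames:
        return None
    travel = xs[-1] - xs[0]
    steps = [b - a for a, b in zip(xs, xs[1:])]
    if travel >= min_travel and all(s >= 0 for s in steps):
        return 'next'
    if travel <= -min_travel and all(s <= 0 for s in steps):
        return 'prev'
    return None


class ImageNavigator:
    """Maps recognized gestures onto a position in an image study."""

    def __init__(self, num_images):
        self.num_images = num_images
        self.index = 0

    def handle(self, gesture):
        if gesture == 'next':
            self.index = min(self.index + 1, self.num_images - 1)
        elif gesture == 'prev':
            self.index = max(self.index - 1, 0)
        return self.index
```

Monotonicity plus a travel threshold is a simple proxy for the "deliberation" the research called for: a hand that drifts or hesitates produces no command.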