Gestural Interaction in the Operating Room
- Posted: Dec 19, 2013 at 6:00 AM
Today's inspirational project is another example of how the Kinect is being used in unusual ways in the medical industry...
Student Tamara Worst, working with the interaction design group IxD Hof in cooperation with Siemens Healthcare, has completed a fascinating project that frees doctors' hands from routine image-handling tasks during operations. Siemens was looking for a non-contact interaction solution for surgeons in the operating room, so Tamara used two sensors, a Wii remote and a Microsoft Kinect, to detect the orientation and position of the surgeon's right foot. She then wrote software in Processing that lets surgeons view and manipulate images and 3D models entirely hands-free.
The Kinect sensor detects the position of the surgeon’s right foot, allowing him or her to accomplish different tasks, depending on where the foot lies.
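The zone-based idea above can be sketched in code. The following is a hypothetical illustration only, not Tamara's actual Processing code: the zone boundaries, coordinate convention, and task names are all assumptions made up for the example.

```java
// Hypothetical sketch: map a tracked foot position to a task zone.
// Zone boundaries and task names are illustrative assumptions,
// not the project's actual values.
public class FootZoneMapper {
    enum Task { IDLE, SCROLL_IMAGES, ZOOM, ROTATE_MODEL }

    // x: lateral foot offset in metres from a calibrated rest point
    //    (negative = left, positive = right);
    // z: forward/back offset. Each floor region maps to one task.
    static Task taskFor(double x, double z) {
        if (Math.abs(x) < 0.10 && Math.abs(z) < 0.10)
            return Task.IDLE;                      // foot at rest
        if (z > 0.10)  return Task.ZOOM;           // foot forward
        if (x < -0.10) return Task.SCROLL_IMAGES;  // foot to the left
        if (x > 0.10)  return Task.ROTATE_MODEL;   // foot to the right
        return Task.IDLE;                          // anywhere else
    }

    public static void main(String[] args) {
        System.out.println(taskFor(0.0, 0.0));   // IDLE
        System.out.println(taskFor(0.0, 0.25));  // ZOOM
        System.out.println(taskFor(-0.2, 0.0));  // SCROLL_IMAGES
    }
}
```

A dead zone around the rest position, as sketched here, would keep small involuntary foot movements from accidentally triggering a command.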
In this lengthy post, Tamara walks through her prototyping process in depth. She initially used simple pressure sensors to detect different foot movements and orientations, but found that the wires needed to connect the sensors were clunky, annoying, and restricted the surgeon's movement in the operating room.