Kinesthesia - A Kinect Based Rehabilitation and Surgical Analysis System
Today's inspirational project shows how the Kinect has a future helping in stroke rehabilitation, gait analysis and laparoscopic surgery...
I’ve been scratching my head about how to kick my Ubelly career off, but luckily (as things often do) it all fell into place. One of my old school friends is at the University of Leeds, studying for a Mechanical Engineering Masters. And it so happens his final year research project involves a rather nifty use of the Kinect SDK…
Who are you guys? We are Chris, Barnaby and Dom. We’re all in our final year of study for an MEng in Mechanical Engineering at the University of Leeds. Barnaby and I have focused largely on tech-based subjects, a lot of programming and mechatronics, while Dom has focused his studies on Biomedical Engineering (Engineering at Leeds is a world leader in biomedical engineering research).
Tell me a bit about your research project. Who/ what/ why/ how? During the summer of 2011, Barnaby and I were both on independent summer placements. Over the summer, a large number of Kinect hacks started going viral – stimulating our imagination. Through a long stream of emails, we started to come up with some ideas and think about how we could get the project approved as a final year project. After approaching the university, two academics were keen to investigate the potential of the Microsoft Kinect: Prof. Martin Levesley (who is researching the next generation of healthcare devices) and Dr. Peter Culmer (who is head of Surgical Technology). We were linked to National Instruments as our industrial mentor. National Instruments’ LabVIEW software is used extensively within the department, and so developing code that could be used in the department was pretty damn vital.
When did you adopt the Kinect SDK? Why? The Microsoft Kinect is game-changing technology. Through the combination of its depth map and its skeleton tracking abilities, it enables a massive range of applications. It was a clear choice to attempt to use this incredible technology to produce a low-cost and effective toolkit for the use of biomedical engineers. From working with it, we, and the academic staff supporting us, are consistently blown away by the capabilities of the hardware and how it has been packaged into such an affordable device that’s open to anyone.
How does the Kinect facilitate your research? The SDK has enabled us to gain access to the full capabilities of the Kinect, controlling camera functions and collecting all data streams from it. We have built Virtual Instruments (sub-programmes within LabVIEW) that provide a simplified interface for people wanting to develop Kinect-based applications in LabVIEW and gain access to the data they need. Through these VIs we are developing software to analyse the data collected from the Kinect, so that clinicians and researchers can assess patients.
What implications does your research with the Kinect have for changing the current tools used for treatment/ research? ...
Project Information URL: http://www.ubelly.com/2012/03/innovative-tech-man-3-students-and-a-kinect-sdk/
This project encompasses the development of a user-friendly interface between the Microsoft Kinect SDK and National Instruments' LabVIEW, and the subsequent development of a selection of tools for use in the fields of stroke rehabilitation, gait analysis and laparoscopic surgery. This LabVIEW-Kinect toolkit is designed with the intention of allowing both ourselves and future users to easily interface the depth and skeletal tracking functionalities of the Kinect with any LabVIEW system. In addition to the medical applications detailed, a number of further examples are demonstrated in order to showcase the potential for connectivity between the two systems.
Products: NI LabVIEW, NI Vision Development Module, Microsoft Kinect, Microsoft Kinect SDK
The video below shows an example of how we can interface the Microsoft Kinect with LabVIEW to produce an intuitive control system, using the Kinect's skeleton tracking ability to control a VTOL aircraft demonstration rig through an NI DAQ board. Users have the option to control the rig using a hand control, a signal generator, or their body position via the Kinect Toolkit developed as part of this project; all three can be operated through either a manual or a fly-by-wire control system.
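The project implements this in LabVIEW with NI DAQ hardware, but the core idea of body-position control can be sketched in a few lines. The following is a hedged illustration, not the project's code: the joint coordinates and the mapping choices (shoulder-relative height, a half-metre travel range) are assumptions for the example.

```python
def hand_to_setpoint(hand_y, shoulder_y, span=0.5):
    """Map the tracked hand's height relative to the shoulder onto a
    normalised control setpoint in [0, 1].

    hand_y, shoulder_y: vertical joint positions in metres (skeleton
    space); span: hand travel (m) that covers the full control range.
    """
    offset = hand_y - shoulder_y          # above shoulder -> positive
    setpoint = 0.5 + offset / (2 * span)  # shoulder height -> mid-range
    return max(0.0, min(1.0, setpoint))   # clamp to the valid range

# Hand level with the shoulder gives a mid-range command.
print(hand_to_setpoint(1.5, 1.5))  # -> 0.5
```

In a fly-by-wire setup, a value like this would feed a controller rather than drive the actuator directly, so a jittery skeleton reading cannot command a sudden jump.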
Besides the development of the Kinect toolkit as a whole, three key challenges are addressed in this project: stroke rehabilitation, laparoscopic surgery and gait analysis.
Stroke, the disturbance of blood supply to the brain, is the leading cause of disability in adults in the USA and Europe, and has a profound effect on the quality of life of those it affects. The physical effects of stroke are numerous, and there exists a substantial field of study devoted to the assistance and rehabilitation of stroke sufferers. It is a complex and multidisciplinary affair, requiring constant, monitored cognitive and physical therapy. As such, there is a need for a system that provides mental stimulation to the patient whilst accurately recording body position and movement, allowing physiotherapists both to maintain patient motivation and to extract detailed information on their movements.
With the advent and continued development of laparoscopic (keyhole) surgery, it is essential that the operating surgeon has accurate and up-to-date information on an area that may not be directly visible to them. Before operating, the abdomen is usually inflated to provide the surgeon with the necessary space to both view and access the internal organs. However, it is difficult for the surgeon to accurately gauge the level of inflation and the working space available to them. A tool that can determine inflation of the abdomen, and provide an estimate of the inflated volume, would increase the safety of procedures and provide the surgeon with more information about the patient prior to operation.
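The depth map makes this estimate conceptually simple: compare a frame captured before inflation with one captured after, and integrate the surface rise over the abdominal region. The sketch below is a hedged simplification of that idea (not the project's implementation); in particular, the constant per-pixel area is an assumption, whereas a real tool would project each pixel through the camera's intrinsics.

```python
def inflation_volume(depth_before, depth_after, pixel_area_m2):
    """Estimate inflated volume (m^3) from two depth maps (metres).

    depth_before / depth_after: 2D lists of depth readings covering the
    same region of interest; pixel_area_m2: patient-surface area that
    one pixel covers (assumed constant here for simplicity).
    """
    volume = 0.0
    for row_b, row_a in zip(depth_before, depth_after):
        for d_b, d_a in zip(row_b, row_a):
            rise = d_b - d_a      # surface moved toward the camera
            if rise > 0:          # ignore noise in the other direction
                volume += rise * pixel_area_m2
    return volume
```

Summing only positive rises is a crude noise guard; averaging several frames per state would be the more robust choice.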
On top of this, work will be carried out to see if the skeleton tracking abilities of the Kinect can be used to programmatically and objectively assess a surgeon's skill through economy of movement, accuracy and time.
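One common way to quantify economy of movement is the total path length traced by a tracked hand joint during a task: for the same task, a shorter path generally indicates more economical motion. A minimal sketch, assuming the skeleton stream has been reduced to a list of (x, y, z) samples in metres:

```python
import math

def path_length(positions):
    """Total 3D path length (m) of a joint trajectory.

    positions: list of (x, y, z) samples of one tracked joint; the sum
    of straight-line distances between successive samples.
    """
    total = 0.0
    for p0, p1 in zip(positions, positions[1:]):
        total += math.dist(p0, p1)
    return total

# An L-shaped hand trajectory: two unit moves give a path length of 2 m.
print(path_length([(0, 0, 0), (1, 0, 0), (1, 1, 0)]))  # -> 2.0
```

Combined with task time and an accuracy score, a metric like this could feed the kind of objective skill assessment the paragraph describes.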
It has been suggested that a person's gait is more unique than their fingerprint. Indeed, the way in which we walk can offer insight into a number of medical problems that may take longer to present in other forms. Stroke, for example, can result in a defined limp on one side of the body, and the extent of this limp may offer further information on the severity of the stroke. Similarly, analysis of one's gait can offer information on struggling hip, knee or ankle joints, and can suggest not only that a joint replacement is required, but also that a specific type of replacement would be beneficial. A low-cost, investigative tool that can be used prior to expensive specialist referrals would offer a significant benefit to clinicians.
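A simple way to put a number on a one-sided limp is a symmetry ratio between the two legs, for example comparing mean step durations: a symmetric gait gives a ratio near 1, and a limp pushes it away from 1. This is a hedged sketch of the idea only; the step timings would in practice come from detecting heel strikes in the Kinect's ankle-joint trajectories, which is not shown here.

```python
def symmetry_ratio(left_step_times, right_step_times):
    """Ratio of mean left to mean right step duration.

    left_step_times / right_step_times: step durations in seconds for
    each leg; a value near 1.0 indicates a symmetric gait.
    """
    mean_left = sum(left_step_times) / len(left_step_times)
    mean_right = sum(right_step_times) / len(right_step_times)
    return mean_left / mean_right

# Equal step times on both sides: a perfectly symmetric gait.
print(symmetry_ratio([0.5, 0.5], [0.5, 0.5]))  # -> 1.0
```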
The Microsoft Kinect has already revolutionised the gaming industry with its ability to track users' motions, marking a key movement away from traditional control systems. This project is aimed at taking advantage of Microsoft's innovative technology and interfacing it with NI LabVIEW, via the development of a fully functional LabVIEW driver and toolkit. With this in place, we are developing a selection of motion tracking tools and programs specifically aimed at tackling three key areas:
The toolkit is approaching full functionality, allowing the user to initialise and close the Kinect's different components (RGB Camera, Depth Camera and Skeletal Tracking) through polymorphic VIs, depending on the functionalities they require. A number of sub-VIs relating to the processing, extraction and display of data from the Kinect are either complete or approaching completion. Figure 10 shows the polymorphic VIs that the user can simply drag into the block diagram; the code displayed lets users access the video, depth and skeleton data. In the example below, the code also produces a 3D picture control plot of the skeleton on the fly.
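The toolkit itself is graphical LabVIEW code, but the lifecycle its polymorphic VIs expose (initialise only the streams you need, poll for frames, close) can be sketched textually. The class and method names below are illustrative stand-ins, not part of the Kinect SDK or the toolkit:

```python
class KinectSession:
    """Illustrative stand-in for the toolkit's open-poll-close pattern."""

    def __init__(self, enable_depth=True, enable_skeleton=True):
        # Like the polymorphic init VIs, enable only the streams the
        # application needs, so unused streams cost nothing.
        self.streams = []
        if enable_depth:
            self.streams.append("depth")
        if enable_skeleton:
            self.streams.append("skeleton")

    def read(self):
        # Placeholder frame: a real session would poll the sensor here
        # and return per-stream data for the enabled streams only.
        return {name: None for name in self.streams}

    def close(self):
        self.streams = []

session = KinectSession(enable_depth=True, enable_skeleton=False)
frame = session.read()   # contains only the streams that were enabled
session.close()
```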
Project Information URL: https://decibel.ni.com/content/docs/DOC-20973