Today's project has been in my queue to be highlighted since I first saw it in September, but there was always something...
Well, enough of that (better late than never and all that). While this may be a niche use case, if you DO need this, it will really come in handy...
Tool for extracting depth frames from Kinect v2 to .mat files, with point cloud generator script. Ready to use!
The new Kinect v2 is awesome; however, for people who are not coding experts, it can be hard to get the Kinect's data into a workable environment like MATLAB. This tool solves the problem of getting depth data from the Kinect SDK into MATLAB. And no external libraries are needed (except the ones required for the Kinect for Windows SDK and Windows itself)!
Furthermore, a class is also provided, which can be used to export any uint16 array to a loadable .mat file.
I tried a few libraries (including matio) for extracting the depth frames to .mat files. None of them seemed to work, so I decided to write my own .mat file writer. It's not meant to be an example of good coding, but rather a usable tool.
The main tool is called "KinectMLConnect", and it is provided both as source code, ready to be built in Visual Studio (tested in VS 2013), and as an .exe that can be run directly on Windows.
(The .exe is located in: "KinectMLConnect\KinectMLConnect\KinectMLConnect\bin\Release".)
The tool simply listens for an active sensor (or for the Kinect Studio sensor emulator), grabs the stream, and exports each frame as a .mat file.
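The listen-grab-export step can be sketched as follows. This is a Python illustration with a stand-in frame source, not the author's code: the real tool is C# and pulls 512x424 uint16 frames from the Kinect v2 SDK's depth reader. The `export_frames` function, the `depth` variable name, and the file-naming pattern are my own, and this sketch leans on SciPy's `savemat` rather than the project's hand-rolled writer.

```python
import numpy as np
from scipy.io import savemat

def export_frames(frames, prefix='depth'):
    """Save each depth frame as its own .mat file, numbered in arrival order.

    `frames` is any iterable of 2-D arrays and stands in for the Kinect
    depth stream; each frame becomes e.g. depth_0000.mat, depth_0001.mat...
    """
    names = []
    for i, frame in enumerate(frames):
        fname = f'{prefix}_{i:04d}.mat'
        # Store the frame under a fixed variable name so MATLAB's
        # load() always yields the same workspace variable.
        savemat(fname, {'depth': np.asarray(frame, dtype=np.uint16)})
        names.append(fname)
    return names
```

Numbering the files in arrival order keeps the frames trivially sortable when they are loaded back into MATLAB in a loop.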
The interface is quite simple and self-explanatory, and is shown here:
Also included is a class file for the MATWriter class, which performs the actual export of the frames. Its constructor (and only callable code) is given here:
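To give a feel for what such a writer has to do, here is a minimal, illustrative Level-5 MAT-file writer in Python. This is not the author's C# MATWriter: the function name and header text are my own, only the 2-D uint16 case is handled, and the byte layout follows my reading of the MAT-file Level 5 format (128-byte header, then a miMATRIX element whose subelements are array flags, dimensions, name, and column-major data, each padded to 8 bytes).

```python
import struct
import numpy as np

def write_mat_uint16(filename, name, array):
    """Write a 2-D uint16 array as a MATLAB Level-5 .mat file."""
    arr = np.ascontiguousarray(array, dtype=np.uint16)
    rows, cols = arr.shape

    def element(mtype, payload):
        # Data element: 8-byte tag (type, byte count), then the payload
        # padded out to the next 8-byte boundary.
        pad = (-len(payload)) % 8
        return struct.pack('<II', mtype, len(payload)) + payload + b'\x00' * pad

    miINT8, miUINT16, miINT32, miUINT32, miMATRIX = 1, 4, 5, 6, 14
    mxUINT16_CLASS = 17

    body  = element(miUINT32, struct.pack('<II', mxUINT16_CLASS, 0))  # array flags
    body += element(miINT32, struct.pack('<ii', rows, cols))          # dimensions
    body += element(miINT8, name.encode('ascii'))                     # array name
    body += element(miUINT16, arr.T.tobytes())                        # data, column-major

    # 116-byte descriptive text, 8-byte subsystem offset (unused),
    # version 0x0100, and the little-endian indicator bytes 'IM'.
    header = b'MATLAB 5.0 MAT-file, minimal writer'.ljust(116) \
             + b'\x00' * 8 + struct.pack('<H2s', 0x0100, b'IM')
    with open(filename, 'wb') as f:
        f.write(header + struct.pack('<II', miMATRIX, len(body)) + body)
```

Note the data is written column-major, since that is MATLAB's native array order.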
To make it even simpler, I've added a MATLAB script, ...
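The point-cloud generation itself boils down to pinhole back-projection of each depth pixel. A Python sketch of the math (the project ships a MATLAB script instead; the function name is mine, and the intrinsics below are rough Kinect v2 depth-camera values I've assumed, not calibrated values from the SDK):

```python
import numpy as np

def depth_to_point_cloud(depth, fx=366.0, fy=366.0, cx=256.0, cy=212.0):
    """Back-project a depth frame (millimetres) to an N x 3 point cloud (metres).

    Pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy,
    where (u, v) is the pixel column/row and z the measured depth.
    """
    v, u = np.indices(depth.shape)           # pixel rows (v) and columns (u)
    z = depth.astype(np.float64) / 1000.0    # millimetres -> metres
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]                # drop pixels with no depth reading
```

Pixels with zero depth carry no measurement on the Kinect, so they are filtered out rather than back-projected onto the camera plane.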
Project Information URL: http://www.codeproject.com/Tips/819613/Kinect-version-depth-frame-to-mat-file-exporter-to
- Blog: SergentMT