Interactive Artists' Books: Direct Gestural Interaction with 3D Objects (Kinect v2 + Unity3D)

I came across this project in the Kinect 2 SDK forum and thought you all would like it...
[Post copied in full]
I recently finished an installation for the Swiss National Library (CH) using Kinect v2, Unity3D and the C# SDK. It's about presentation of, and direct interaction (e.g. rotation, zoom) with, sensitive exhibits in a museum and exhibition context, using a large-venue projection.
Showcase video: https://vimeo.com/146547262
I'd like to share some experiences back to the forum as I was very happy to see and use answers from other projects / questions. If you have questions on specific topics, feel free to ask.
- Equipment: gaming notebook with Nvidia graphics card (Lenovo Y50-70), Kinect v2 for Windows, Windows 8.1 Pro, Visual Studio CE for C# code
- In the video, you can see two 3D cursors for the hands. In contrast to the UI SDK, the hand cursors also convey the depth (Z) of the hands. Most important: use the joint smoothing code from here: https://social.msdn.microsoft.com/Forums/en-US/045b058a-ae3a-4d01-beb6-b756631b4b42/joint-smoothing-code?forum=kinectv2sdk
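The linked forum code is in the double exponential (Holt) smoothing family; a minimal language-agnostic sketch of that idea, with illustrative `alpha`/`beta` values that are not taken from the post:

```python
# A minimal sketch of double exponential (Holt) smoothing for a 3D joint
# position -- the same family of filter as the forum code linked above.
# alpha/beta here are illustrative assumptions, not values from the post.

def make_joint_smoother(alpha=0.5, beta=0.5):
    """Return a function that smooths successive 3D joint positions."""
    state = {"level": None, "trend": (0.0, 0.0, 0.0)}

    def smooth(raw):
        if state["level"] is None:           # first sample: no history yet
            state["level"] = raw
            return raw
        level, trend = state["level"], state["trend"]
        # Blend the raw sample with the previous prediction (level + trend).
        new_level = tuple(alpha * r + (1 - alpha) * (l + t)
                          for r, l, t in zip(raw, level, trend))
        # Update the trend estimate from the change in level.
        new_trend = tuple(beta * (nl - l) + (1 - beta) * t
                          for nl, l, t in zip(new_level, level, trend))
        state["level"], state["trend"] = new_level, new_trend
        return new_level

    return smooth
```

Feeding each frame's raw joint position through the returned function yields a smoothed position that still tracks motion thanks to the trend term.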
- To get a nice cursor, you need to implement something like a 3D PHIZ (the 2D PHIZ is described in https://channel9.msdn.com/Series/Programming-Kinect-for-Windows-v2/07). This normalizes for body/shoulder rotation and arm length, and makes the movement of each hand relative to the body.
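A sketch of that normalization idea, assuming camera-space joint positions in metres and a measured body yaw and arm length (the function name and signature are hypothetical, not from the SDK):

```python
import math

def phiz_cursor(hand, shoulder, body_yaw, arm_length):
    """Map a hand joint into a body-relative, rotation- and
    reach-normalized 3D cursor. body_yaw is the body's rotation around
    the vertical axis in radians; arm_length is the user's arm length
    in metres. Sketch only -- not the actual installation code."""
    # Vector from shoulder to hand in camera space.
    dx, dy, dz = (hand[i] - shoulder[i] for i in range(3))
    # Undo the body's yaw so the cursor is expressed relative to the torso.
    c, s = math.cos(-body_yaw), math.sin(-body_yaw)
    rx = c * dx + s * dz
    rz = -s * dx + c * dz
    # Normalize by arm length so reach is comparable across users.
    return (rx / arm_length, dy / arm_length, rz / arm_length)
```

The result stays in roughly [-1, 1] per axis regardless of the user's size or how they are turned toward the sensor.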
- Hand states: if a person is more than 2 m away from the sensor, the hand states get very jittery. In my installation, I use the closed hand state for zooming and rotating. I smooth the hand state over 0.25 s, which is quite stable and yet short enough not to be really noticeable.
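One way to realize that smoothing is a simple debounce: only accept a new hand state once it has been observed continuously for the hold time. The 0.25 s value comes from the post; the rest of this sketch is an illustrative assumption:

```python
def make_hand_state_filter(hold_time=0.25):
    """Report a new hand state (e.g. 'open'/'closed') only after it has
    been observed continuously for hold_time seconds. 0.25 s matches
    the value quoted in the post; the structure is a sketch."""
    state = {"stable": None, "candidate": None, "since": 0.0}

    def update(raw_state, timestamp):
        if raw_state == state["stable"]:
            state["candidate"] = None        # back to the accepted state
            return state["stable"]
        if raw_state != state["candidate"]:
            state["candidate"] = raw_state   # new candidate: start timing
            state["since"] = timestamp
        elif timestamp - state["since"] >= hold_time:
            state["stable"] = raw_state      # held long enough: accept it
            state["candidate"] = None
        return state["stable"] if state["stable"] is not None else raw_state

    return update
```

Brief jitter (shorter than the hold time) never reaches the application, while a deliberate grab still registers after a quarter second.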
- In the exhibition context, only the foremost person in front of the sensor is able to control the object (an artificial limit).
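Selecting the foremost person can be as simple as picking the tracked body with the smallest depth; a sketch, assuming bodies arrive as (id, spine depth in metres) pairs, which is an assumed layout for illustration:

```python
def foremost_body(bodies):
    """Return the id of the tracked body closest to the sensor, which
    gets exclusive control. 'bodies' is a list of (body_id, spine_z)
    pairs with spine_z in metres, or None when untracked -- an assumed
    layout for this sketch."""
    tracked = [b for b in bodies if b[1] is not None]
    if not tracked:
        return None
    return min(tracked, key=lambda b: b[1])[0]
```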
- Body movement and position in the room are also used for walking "around" the object or coming closer. This works very well and is stable.
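A minimal sketch of how lateral body position could drive such a view rotation; the mapping and the angle cap are illustrative assumptions, not details from the post:

```python
import math

def orbit_angle(body_x, body_z, max_angle=math.radians(45)):
    """Map the user's lateral position in front of the sensor to a view
    rotation around the object, so walking sideways 'walks around' it.
    max_angle is an illustrative cap, not a value from the post."""
    angle = math.atan2(body_x, body_z)   # bearing of the user from the sensor
    return max(-max_angle, min(max_angle, angle))
```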
- I used floor marks to show visitors the best standing position. When a person walks out of the sensor's range, the object goes into a screensaver state and just drifts slowly away from its position, so when a person comes closer again, the position feels quite natural.
- Implementing a 3D cursor that drives the PhysX physics body (touch, pull) is not trivial in Unity3D. The problem is that the hands can pass through the object, or a pull with closed hands collides with the object mesh. The installation uses a mesh collider with custom raycasting to prevent pass-throughs and a custom push/slide mechanism (similar to a character controller).
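The core of the pass-through fix is a swept move: cast from the cursor's previous position toward its new one and stop just before the first hit. A self-contained sketch against a sphere collider (a stand-in for the installation's mesh collider; in Unity one would use `Physics.Raycast` instead):

```python
import math

def sweep_to_sphere(start, end, center, radius, skin=0.01):
    """Move a point cursor from start toward end, but stop just outside
    a sphere collider if the segment would pass through it. A simplified
    stand-in for the mesh-collider raycast used in the installation."""
    d = tuple(e - s for s, e in zip(start, end))        # movement vector
    f = tuple(s - c for s, c in zip(start, center))     # start rel. centre
    a = sum(x * x for x in d)
    if a == 0.0:
        return end                                      # no movement
    b = 2.0 * sum(x * y for x, y in zip(f, d))
    c = sum(x * x for x in f) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return end                                      # segment misses sphere
    t = (-b - math.sqrt(disc)) / (2 * a)                # first intersection
    if t < 0.0 or t > 1.0:
        return end                                      # hit outside segment
    t = max(0.0, t - skin / math.sqrt(a))               # back off a little
    return tuple(s + t * x for s, x in zip(start, d))
```

The small `skin` offset keeps the cursor resting just outside the surface, the same trick character controllers use to avoid getting stuck inside geometry.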
- The 3D models themselves were digitized using photogrammetry and are of high quality.
- The chocolate cake shown in the video is a real exhibit that cannot be displayed physically because the chocolate would melt, so virtual touch is the ideal way to present this object publicly.
- There's more information on the objects, including a WebGL viewer at http://nationalbibliothek.ch/3d.
Regards, Kai
Project Information URL: https://social.msdn.microsoft.com/Forums/en-US/f6accdeb-bc4e-4d20-b319-af221bb9ebf8/interactive-artists-books-direct-gestural-interaction-with-3d-objects?forum=kinectv2sdk
Follow @CH9
Follow @Coding4Fun
Follow @KinectWindows
Follow @gduncan411