Kinect Controlled Unity Avatars with RUIS Toolkit
I had to search the blog a couple of times before posting this. I just couldn't believe I hadn't highlighted this previously!
I've seen several topics about Kinect-controlled avatars in Unity, so I'll bring to your attention our RUIS toolkit, which is intended for creating virtual reality applications and can be used to animate 3D avatars:
In 2013 we used our toolkit and Kinect v1 to animate the character in this demo:
For Kinect v2 avatars, our toolkit includes the following useful features:
- Bones can automatically obtain rotation and scale from Kinect, so that avatar joint positions match what Kinect sees
- Bone rotations can be filtered (currently joints between torso and hands only)
- Angular velocity for bone rotations can be capped, making the avatar more stable but less responsive
- Fingers can be set to curl when Kinect v2 detects that you are making a fist
- Avatar root position can be scaled, so that your movement is amplified and covers a larger area
- The avatar can also be controlled with keyboard/gamepad, in which case its leg pose is blended with a walking animation
- You can create your own scripts to blend custom Mecanim animations into the Kinect animated avatar's individual limbs
- You can use the fist gesture to grab objects
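To make two of the features above concrete, here is a minimal, hypothetical sketch of the underlying math: capping how fast a joint may rotate per frame, and amplifying the avatar's root movement around a calibration origin. This is not RUIS code (the toolkit itself works on Unity quaternions and transforms); it is a one-dimensional illustration of the idea only.

```python
def cap_rotation_step(current_deg, target_deg, max_deg_per_sec, dt):
    """Move current_deg toward target_deg, but never faster than the
    angular-velocity cap. A lower cap gives a more stable but less
    responsive avatar, as described above."""
    max_step = max_deg_per_sec * dt
    delta = target_deg - current_deg
    # Clamp the per-frame step to [-max_step, +max_step].
    step = max(-max_step, min(max_step, delta))
    return current_deg + step

def scale_root_position(root, origin, scale):
    """Amplify the avatar root's offset from a calibration origin,
    so small real-world movement covers a larger virtual area."""
    return [o + scale * (r - o) for r, o in zip(root, origin)]
```

For example, with a 45 deg/s cap and a 0.1 s frame time, a sudden 90-degree change in the Kinect-reported target advances the joint by only 4.5 degrees that frame; with a root scale of 2, a one-meter step from the origin moves the avatar two meters.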
If you download the RUIS toolkit, you can get started quickly by opening the example scene in \RUISunity\Assets\RUIS\Examples\KinectTwoPlayers
The above example includes the Constructor model from Unity standard assets. If you want to replace that model with your own, you need to parent your rig under the MecanimBlendedCharacter gameObject, move all the scripts and components from the Constructor gameObject to your rig, and relink the joint transforms. For details, see the last paragraph of the "Oculus Rift with Kinect, PS Move, and Razer Hydra" section of our readme:
Please note that the main "modules" of RUIS (RUISInputManager and RUISDisplayManager) are currently coupled: if you animate your Kinect avatars with RUIS, you also need to use RUIS's display management, replacing your scene camera with the RUISCamera prefab and linking it to a RUISDisplay. See the readme for details.
Reality-based User Interface System (RUIS) is an open source toolkit for creating the next generation of virtual reality applications. The idea of RUIS is to give hobbyists easy access to state-of-the-art interaction devices, so that they can bring their innovations into the field of virtual reality.
RUIS is available for Unity3D and Processing, both widely used development environments favored by many game developers, artists, designers, and architects. RUIS enables the development of virtual reality applications where devices like Kinect, Razer Hydra, and PlayStation Move are used together with immersive display devices such as Oculus Rift. The simultaneous use of these devices allows the creation of novel virtual reality applications and 3D user interfaces.
RUIS includes a versatile display manager for handling several display devices simultaneously, features stereo 3D rendering and head tracking, and supports the use of Kinect (v1 and v2), Oculus Rift DK2, and PlayStation Move together in the same coordinate system. Developers can implement and test their own motion-controlled applications even with just a mouse and keyboard, which are emulated as 3D input devices. RUIS has been used to teach virtual reality concepts and application implementation for five consecutive years in Aalto University's virtual reality course.
The RUIS project was initiated by researchers Tuukka Takala and Roberto Pugliese at Aalto University's Department of Media Technology, Finland. Other important contributors are Mikael Matveinen and Yu Shen.
Project Information URL: http://blog.ruisystem.net
Project Download URL: http://blog.ruisystem.net/download/