Humanoid Robot Torso A1 made more human with help from the Kinect


Today's project is a cool, inspirational example of how the Kinect and the Kinect for Windows SDK can be used as part of a robot to provide some human-like behavior, such as following you with its "eyes" as you move about the room...
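The "following you with its eyes" behavior boils down to simple geometry: the Kinect skeleton stream reports a tracked person's head position in camera space (x right, y up, z forward, in meters), and from that you can derive pan/tilt angles for a motorized head. A minimal sketch of that angle math (the function name and servo convention are assumptions, not from the project):

```python
import math

def gaze_angles(head_x, head_y, head_z):
    """Given a tracked head position in Kinect camera space
    (x: right, y: up, z: forward, all in meters), return the
    (pan, tilt) angles in degrees a pan/tilt head would need
    to point its "eyes" at the person."""
    # Pan: horizontal angle off the camera's forward axis.
    pan = math.degrees(math.atan2(head_x, head_z))
    # Tilt: vertical angle relative to the horizontal plane.
    tilt = math.degrees(math.atan2(head_y, math.hypot(head_x, head_z)))
    return pan, tilt
```

In a real loop these angles would be fed to the head's pan/tilt servos each time a new skeleton frame arrives; a person standing directly in front of the sensor yields (0, 0), someone equally far to the right as away yields a 45-degree pan.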

Humanoid robot torso A1

[Machine Translated] A larger project that I have been working on for some time is a humanoid robot torso. Even though its basic functionality is not yet finished, I am now taking the time to present it here in more detail and describe it.

Since nothing better occurred to me, I have given the project the working title "A1". But because that sounds a bit dehumanizing for a humanoid robot, it has received the additional name "Adam". Since God called his first attempt that, I may finally do the same. For those who prefer a more rational explanation: Adam can also be read as an abbreviation for "Advanced Dual Arm Manipulator" - but at the moment that is not much more than an ambitious vision...

A crucial component, and in a sense the starting point of this project, is the robolink joint system from igus, with which I have been experimenting since about mid-2010. A robolink joint provides two degrees of freedom: a pivoting and a rotating movement. Power is transmitted through the joint via cables, so the actuators are spatially separated from the degrees of freedom they drive and do not have to be moved along with the joint. The joints can be connected via profiles, which makes it natural to build a human-like arm from multiple joints. So far I have developed a four-axis arm with drives; two arms with five degrees of freedom each are planned for the A1. First, however, one arm including its motorization and adequate control should be completed; once everything works perfectly there, the second arm can be built accordingly.
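To get a feel for how a chain of such two-DOF joints positions an end effector, here is a minimal forward-kinematics sketch. The conventions are assumptions for illustration only (they are not the project's actual geometry): each joint first rotates about its local z axis, then pivots about its local y axis, and each connecting profile extends a fixed length along the local z axis.

```python
import math

def rz(t):
    """Rotation matrix about the z axis (the joint's rotary DOF)."""
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def ry(t):
    """Rotation matrix about the y axis (the joint's pivot DOF)."""
    c, s = math.cos(t), math.sin(t)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3))
             for j in range(3)] for i in range(3)]

def matvec(m, v):
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

def forward_kinematics(joints, link_length=0.3):
    """joints: list of (rotation, pivot) angle pairs in radians,
    one pair per two-DOF joint. Returns the end position [x, y, z]
    of the chain, with each link extending link_length along the
    joint's local z axis."""
    r = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # accumulated orientation
    pos = [0.0, 0.0, 0.0]                   # accumulated position
    for rot, piv in joints:
        r = matmul(r, matmul(rz(rot), ry(piv)))
        link = matvec(r, [0.0, 0.0, link_length])
        pos = [p + l for p, l in zip(pos, link)]
    return pos
```

With both joints at zero the chain points straight along z; pivoting the first joint 90 degrees swings the whole arm over to the x axis, which matches the intuition of an elbow-like kink propagating down the chain.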

A humanoid robot is of course complex enough to provide a huge playground on many different technical levels. Accordingly, the project still has a whole range of other construction sites that I am working on in addition to the construction, motorization and control of the arms. The sensor head, a movable head with several sensors and the ability to speak, is relatively far along on the hardware side. I am also currently working out a gripper concept, which I have already described here in its basic features.

This project will of course only become really interesting once the basic functionality of the torso is in place; then we can tackle the realization of - perhaps even "intelligent" - behavior. The picture above only hints at that, however.

More detailed information on the projects described can be found here:

Project Information URL: