Coffeehouse Thread

8 posts

Forum Read Only

This forum has been made read only by the site admins. No new threads or comments can be added.

Interview with Clint Rutkas and Dan Fernandez about the Kinect controlled Lounge Chair

  •

    Hey Niners,

    I am looking for your questions for an interview tomorrow. I am talking to Clint and Dan about the Kinect-powered Lounge Chair they built for the MIX keynote. You can check it out here:

    At roughly 7:09.

    Let me know if you have any questions for them and I will make sure to ask. Thanks,


  •

    Who gets to keep the chair?

  •

    Have you had any contact from people who are considering using Kinect to control wheelchairs? A one-handed version that requires just a finger to control would be useful. I am not up to date on modern electric wheelchair controls, but my brother had Duchenne muscular dystrophy, and over time, as his muscles deteriorated, it became harder and harder for him to control his chair via the joystick, which still required mechanical input. Other motions besides hand/finger control would also be useful for people with paralysis and other conditions. Recognizing blinking and eye motion might be possible as well.

  • Dr Herbie

    How much code did they have to write on top of the SDK to track the position of both hands?


  •

    How much calibration was required? Was it easy to move from a laboratory setup to the final product (if you had a lab/working setup to begin with)? I'm mostly just curious how easily the SDK adapts to different environments (dev setup -> final environment).

  •

    I'm curious about the same thing Herbie is asking. You probably can't divulge much about the actual SDK yet, but was it as simple as player1.LeftHandVector or did you need some work on top of that?

    Also: as far as I could tell, the current scheme involves moving two virtual thumbsticks around by moving your hands up and down, and left and right. Another scheme I can imagine would be tilting that 90 degrees so that you move your hands forwards and backwards, and left and right, as if you're moving two levers that stick out of the floor. I'd imagine that would provide a greater range of motion (your hands wouldn't be 'off camera' as quickly) making the controls a bit less finicky, but there are probably reasons you decided against this. How did you decide on the current control scheme, and which others did you try?
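    (For readers picturing the scheme described above: here is a minimal sketch of how two hand positions could drive two virtual thumbsticks, with a dead zone so small movements don't move the chair. All names here are hypothetical; this is not the actual chair code or the Kinect SDK API, neither of which appears in this thread.)

    ```python
    def hand_to_stick(hand_y, center_y=0.5, dead_zone=0.1):
        """Map a normalized hand height (0..1) to a virtual stick axis
        (-1..1), ignoring small offsets inside the dead zone."""
        axis = (hand_y - center_y) * 2.0           # rescale to -1 .. 1
        if abs(axis) < dead_zone:
            return 0.0                             # hand near rest: no motion
        return max(-1.0, min(1.0, axis))

    def tank_drive(left_hand_y, right_hand_y):
        """Tank-style drive: each hand acts as one thumbstick/lever,
        controlling one side of the chair."""
        return hand_to_stick(left_hand_y), hand_to_stick(right_hand_y)

    # Left hand raised, right hand at rest height: only the left side
    # drives forward, so the chair pivots.
    left, right = tank_drive(0.8, 0.5)
    ```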

  •

    In Dan's demo during the keynote, the frame rate from the camera looked way below 30 FPS; this was especially visible in the drawing demo. What was up with that?

    Can you poll the camera for an image as well as wait for one?

    Related to @bas and @herbie: did you run into issues with control sensitivity?

    It looked from the demos like the version you were using didn't include skeletal tracking at all. Is that true, or did you just not show it?

    Can you talk about any differences in terms of stream options compared to the unofficial SDK? (Long shot.)


  •

    Probably too late, but I just saw this video in which Clint mentions IR flares, and I was wondering how they solved that problem.
