Today's post from Mashable's Adario Strange highlights a project from MIT that is pushing Kinect-driven interfaces beyond two dimensions...
With the advent of touchscreens and increasingly powerful mobile computers, digital interface innovation has become one of the most exciting areas of research. Now a new project called inFORM, crafted at the MIT Media Lab, offers a peek at an even richer interface dynamic for future devices.
By combining 900 actuators, each driving a square rod, with the Microsoft Kinect, the team, which included Sean Follmer, Daniel Leithinger, Alex Olwal, Akimitsu Hogge and Hiroshi Ishii, created an interface that brings computer-generated 3D objects and motions to life via real-world shapes and movements.
“We are using two Kinects — one Kinect is mounted in a remote location to capture the remote user, who has two video screens so they can see both the inFORM table surface and other participants,” Follmer, one of the project leaders, told Mashable.
“[The] remote Kinect is mounted on the ceiling and captures the depth image and 2D color image of the user's hands or any other objects placed under it,” Follmer added. “This captured data is sent over the network to the inFORM surface, which renders it physically on the inFORM pins. A projector above the inFORM projects the color image of the remote user's hands onto the rendered shape.”
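The pipeline Follmer describes, depth capture downsampled onto a pin grid, can be sketched roughly as below. This is a hypothetical illustration, not the team's actual code: the 30×30 grid (matching the 900 actuators), the pin travel, and the depth range are all assumed values.

```python
# Hypothetical sketch of an inFORM-style pipeline: a Kinect depth frame
# is downsampled to the pin grid (900 pins, assumed 30x30 here) and
# depth is mapped to pin extension heights. A projector would then
# overlay the matching color image onto the raised shape.
import numpy as np

GRID = 30             # 900 actuators -> assumed 30x30 pin grid
PIN_TRAVEL_MM = 100.0 # assumed maximum pin extension

def depth_to_pin_heights(depth_mm, near=500.0, far=1200.0):
    """Map a Kinect-style depth image (mm) onto pin heights (mm).

    Closer surfaces (e.g. the remote user's hands) raise pins higher;
    depths outside [near, far] are clamped. Pixel depths are averaged
    per block so each pin receives a single height.
    """
    h, w = depth_mm.shape
    # Crop so the image divides evenly into the pin grid.
    h, w = h - h % GRID, w - w % GRID
    blocks = depth_mm[:h, :w].reshape(GRID, h // GRID, GRID, w // GRID)
    avg = blocks.mean(axis=(1, 3))                    # one depth per pin
    norm = np.clip((far - avg) / (far - near), 0.0, 1.0)
    return norm * PIN_TRAVEL_MM                       # 30x30 heights in mm

# Synthetic 480x640 frame: flat background with a nearer "hand" region.
frame = np.full((480, 640), 1200.0)
frame[200:280, 300:400] = 600.0
heights = depth_to_pin_heights(frame)
print(heights.shape)  # (30, 30)
```

In the real system this grid of heights would be streamed over the network each frame and handed to the actuator controllers, while the 2D color image is warped and projected from above.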
According to the team’s research paper outlining the details of inFORM, the cost and scale of the system currently limit it to a research device, but the team members have an ambitious vision for possible future applications.
“Urban planners and architects can view 3D designs physically and better understand, share and discuss their designs. We are collaborating with the urban planners in the Changing Places group at MIT on this. In addition, inFORM would allow 3D modelers and designers to prototype their 3D designs physically, at low resolution, without 3D printing.”
Beyond those possibilities, the team is already working on a new version of the system that would let two people interact virtually with tactile feedback.
“We are working on building a second inFORM to connect to this current one,” Follmer said. “This will allow us to have bi-directional remote collaboration.”
Project Information URL: http://mashable.com/2013/11/12/mit-inform-kinect/
Past research on shape displays has primarily focused on rendering content and user interface elements through shape output, with less emphasis on dynamically changing UIs. We propose utilizing shape displays in three different ways to mediate interaction: to facilitate by providing dynamic physical affordances through shape change, to restrict by guiding users with dynamic physical constraints, and to manipulate by actuating physical objects. We outline potential interaction techniques and introduce Dynamic Physical Affordances and Constraints with our inFORM system, built on top of a state-of-the-art shape display, which provides for variable stiffness rendering and real-time user input through direct touch and tangible interaction. A set of motivating examples demonstrates how dynamic affordances, constraints and object actuation can create novel interaction possibilities.