Today's inspirational project shows how the Kinect and Kinect SDK are being used in some unusual ways.
This project is an extension of some work I did last November that integrated gesture-based control with a Programmable Automation Controller (PAC). The purpose of this simple follow-on experiment is to demonstrate interactions between a control program running in a PAC and a Windows-based HMI program that uses both conventional mouse-based input and gesture input. A short video of this experiment can be viewed below.
PACs are used extensively in automation and machine control applications. They can be programmed to perform advanced data acquisition and control tasks, and offer many flexible options for communicating with other hardware devices and SCADA/HMI systems. The concept I am trying to promote in my experimentation is the use of advanced natural user interface (NUI) devices like the Microsoft Kinect in automation and machine control applications. There are many practical uses for NUI technologies in industrial applications, where conventional "hands-on" human-machine interactions may otherwise prove to be difficult or potentially hazardous.
This project was created with the recently released Kinect for Windows sensor and the Microsoft v1.0 commercial SDK. The hardware device used in this demonstration is a Snap PAC Learning Center courtesy of Opto 22 Corporation, which includes a PAC-R1 controller and I/O rack containing a mix of various input and output modules. The Learning Center provides a convenient platform for these types of experiments, as it includes switches, LEDs and other components for simulating real-world operating conditions.
In this latest experiment I wrote a simple PAC program that monitors state transitions of an ON-OFF toggle switch. The toggle switch provides a voltage to one of the DC input modules on the rack, and a DC output module switches one of the panel LEDs on and off. The PAC program monitors both state transitions of the switch and the on/off status of the LED; if a switch-state change is detected, the program toggles the voltage to the LED. This could easily have been accomplished by directly mapping the input module to the output module, but the idea here is to allow an external HMI program to also interact with the I/O points.
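The control logic described above is a simple edge-detection loop. Here is a minimal sketch of that logic in Python; the names (`read_switch`, `read_led`, `write_led`, `poll`) are hypothetical stand-ins for the PAC's I/O points, and the real program runs in Opto 22's PAC Control environment, not Python:

```python
# Sketch of the PAC program's scan loop: toggle the LED output each
# time the input switch changes state. All I/O functions here are
# hypothetical placeholders for the actual input/output points.

def run_control_loop(read_switch, read_led, write_led, poll):
    """Toggle the LED on every switch-state transition.

    read_switch / read_led return booleans for the DC input and output
    points; write_led drives the LED output; poll yields once per scan.
    """
    last_switch = read_switch()
    for _ in poll():
        current = read_switch()
        if current != last_switch:          # state transition detected
            write_led(not read_led())       # toggle the panel LED
            last_switch = current
```

Because the LED is toggled through the program rather than hard-wired to the input, an external HMI can drive the same output point and the panel stays consistent either way.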
Project Information URL: http://bkbrown.com/projects-Opto2.html
A related project, which is kind of meta...
UNITY GAME PROTOTYPE
April 2012 - This is a prototype for a 3D computer game that incorporates motion control using a Kinect sensor and real-world device control using a programmable automation controller (PAC). All character movement in the Unity game environment is controlled through right-hand gesture recognition.
In this proof-of-concept, the panel lamp on the virtual instrument tracks the physical instrument lamp, regardless of whether control is from the toggle switches on the panel or from the virtual buttons. The buttons react to "collisions" with the green sphere hovering in front of the first person game character. The game's control scripts interact with the PAC hardware, which is also running a simple program for physical control of the panel lamp.
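The synchronization scheme described above treats the PAC as the single source of truth: the virtual lamp only mirrors the physical LED, and virtual button "collisions" write back through the PAC. A minimal sketch of that game-side logic follows; the `VirtualInstrument` class and the PAC client interface are assumptions for illustration, since the actual project uses Unity scripts talking to the PAC hardware:

```python
# Sketch of the game-side logic: the virtual lamp always reflects the
# physical LED state read from the PAC, regardless of whether the panel
# switch or a virtual button changed it. The PAC client is hypothetical.

class VirtualInstrument:
    def __init__(self, pac):
        self.pac = pac          # hypothetical PAC client with read/write
        self.lamp_on = False    # rendered state of the virtual lamp

    def update(self):
        # Called every frame: mirror the physical LED so both lamps
        # can never disagree, no matter which side changed the state.
        self.lamp_on = self.pac.read_led()

    def on_button_collision(self):
        # The green sphere "colliding" with a virtual button toggles
        # the output through the PAC, just like the physical switch.
        self.pac.write_led(not self.pac.read_led())
```

Routing the button press through the PAC instead of flipping the virtual lamp directly is what keeps the two lamps in lockstep: the next `update()` picks up whatever the hardware actually did.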
Project Information URL: http://bkbrown.com/projects-Unity.html