Been a while since we've covered RoomAlive, so when I saw this download I thought it was a good time to revisit the project...
The RoomAlive Toolkit calibrates multiple projectors and cameras to enable immersive, dynamic projection mapping experiences such as RoomAlive. This download includes sample calibration files which can be used with the RoomAlive Toolkit. The RoomAlive Toolkit itself is hosted at https://github.com/Kinect/RoomAliveToolkit
The toolkit also includes a simple projection mapping sample.
This document covers a few things you should know about using the toolkit's projector/camera calibration, and gives a tutorial on how to calibrate one projector and Kinect sensor (AKA 'camera').
- Visual Studio 2015 Community Edition (or better)
- Kinect for Windows v2 SDK
The project uses the SharpDX and Math.NET Numerics packages. These will be downloaded and installed automatically via NuGet when the RoomAlive Toolkit is built.
The 'Shaders' project requires Visual C++. Note that in Visual Studio 2015, Visual C++ is not installed by default. You may be prompted to install the necessary components when building the 'Shaders' project of the RoomAlive Toolkit.
Tutorial: Calibrating One Camera and One Projector
We outline the procedure for calibrating one projector and one camera. While one of the main features of RoomAlive Toolkit is support for multiple cameras and multiple projectors, this minimal configuration is a good place to start.
- Room Setup...
- Configure calibration.xml...
- Acquire Calibration Images...
- Run the Calibration...
- Inspect the Results ...
The projection mapping sample included in the RoomAlive Toolkit uses the calibration information. Pass the path of your calibration .xml file as a command line argument so that the sample can find it. The main window shows the target rendering used in projection mapping.
In Visual Studio, look at the Settings.settings file under the Properties folder in the ProjectionMappingSample project. There are a few settings that control how the sample runs:
Calibrating Multiple Cameras and Multiple Projectors
How Does Projection Mapping Work?
- A 'user view' off-screen render is performed. This is the 'target' or 'desired' visual the user should see after projection onto a possibly non-flat surface. When rendering 3D virtual objects, this will likely require the user's head position.
- A graphics projection matrix is assembled for each projector in the ensemble. This uses the projector intrinsics, and, because the principal point of the projector is most likely not at the center of the projected image, uses an 'off-center' or 'oblique' style perspective projection matrix.
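The off-center projection matrix mentioned above can be sketched from projector intrinsics. This is a minimal numpy sketch using the standard OpenGL-style frustum formula with made-up intrinsic values; the toolkit itself is C#/DirectX and its sign conventions differ, so treat this only as an illustration of why an off-center principal point produces the 'oblique' terms.

```python
import numpy as np

def projector_projection(fx, fy, cx, cy, width, height, near=0.1, far=100.0):
    """Build an off-center (oblique) projection matrix from projector
    intrinsics (focal lengths fx, fy and principal point cx, cy in pixels).
    OpenGL-style convention; all values here are hypothetical."""
    left   = -near * cx / fx
    right  =  near * (width - cx) / fx
    bottom = -near * (height - cy) / fy
    top    =  near * cy / fy
    return np.array([
        [2*near/(right-left), 0, (right+left)/(right-left), 0],
        [0, 2*near/(top-bottom), (top+bottom)/(top-bottom), 0],
        [0, 0, -(far+near)/(far-near), -2*far*near/(far-near)],
        [0, 0, -1, 0],
    ])

# A principal point away from the image center (cx != width/2) yields
# nonzero off-center terms in the third column.
P = projector_projection(fx=1400, fy=1400, cx=700, cy=300, width=1280, height=800)
```

For a projector the principal point is typically well above the image center (projectors throw upward), which is exactly why the toolkit cannot use a symmetric projection matrix.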
- The projector's projection matrix is combined with calibrated projector and depth camera pose information to create a transformation matrix mapping a 3D point in the coordinate frame of a given depth camera to a 3D point in the projector's view volume.
- A second transformation matrix is assembled, mapping a point in a given depth camera's coordinate system to the user's view volume. This is used to compute the texture coordinates into the 'user view' (above) associated with each 3D depth camera point.
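The two matrix compositions above can be sketched numerically. The poses and the projection matrix below are toy stand-ins (identity camera pose, a 0.5 m projector offset, and a trivial projection); real values would come from the calibration .xml, and the user-view matrix would be built the same way from the head position.

```python
import numpy as np

# Hypothetical calibration: 4x4 world-from-device poses for the depth camera
# and projector, plus a toy projection matrix (w' = z) for the projector.
pose_cam = np.eye(4)                        # depth camera at the world origin
pose_proj = np.eye(4)
pose_proj[0, 3] = 0.5                       # projector 0.5 m along world x
P_proj = np.array([[1.0, 0.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0, 0.0],
                   [0.0, 0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0, 0.0]])

# Depth-camera point -> world -> projector frame -> projector view volume
cam_to_proj = P_proj @ np.linalg.inv(pose_proj) @ pose_cam

p_cam = np.array([0.0, 0.0, 2.0, 1.0])     # a point 2 m in front of the camera
clip = cam_to_proj @ p_cam
ndc = clip[:3] / clip[3]                    # perspective divide
uv = ndc[:2] * 0.5 + 0.5                    # texture coordinates in [0, 1]
```

The second matrix of the step above (depth camera to the user's view volume) is the same composition with the user-view matrices substituted for the projector's.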
- Vertex and geometry shaders use the above transformations to render a depth image to transformed vertices and texture coordinates for a given projector and a given depth camera. Essentially, the shaders render the receiving surface of the projected light, with a texture that is calculated to match the 'user view' from the user's point of view, as projected by the projector.
- A projector's final rendering is performed by rendering each Kinect depth image using the above shaders. This procedure is performed for all projectors in the ensemble. Note that in this process, the depth images may be updated every frame; this is possible because the calibration and projection mapping process is fundamentally 3D in nature.
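The per-vertex work the shaders do starts by back-projecting each depth pixel to a 3D point in the depth camera's frame, which is then pushed through the matrices described above. Here is a CPU sketch of that unprojection with hypothetical intrinsics (not real Kinect v2 values); the GPU shaders do the same thing per depth pixel, every frame.

```python
import numpy as np

# Toy depth-camera intrinsics (hypothetical, not from a real Kinect)
fx = fy = 365.0
cx, cy = 256.0, 212.0

def unproject(u, v, depth_m):
    """Back-project a depth pixel (u, v) with depth in meters to a
    homogeneous 3D point in depth-camera coordinates."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m, 1.0])

# Per vertex, the shaders then apply the camera-to-projector matrix for
# position and the camera-to-user-view matrix for texture coordinates.
p = unproject(300, 200, 2.0)
```

Because this pipeline consumes live depth images rather than a static scan, the projection mapping stays correct even as the room's geometry changes.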
More Online Resources
- Channel9 pre-recorded talk introducing the RoomAlive Toolkit with Ben Lower and Andy Wilson
- video of 3 camera/3 projector calibration
- The resulting calibration set can be downloaded here. Try opening this in CalibrateEnsemble.exe.
- video of projection mapping sample, showing view dependent rendering of 3D object
- video of projection mapping sample, showing wobble effect
- RoomAlive video
- 5 projector, 8 camera calibration example (2GB!)
Project Source URL: https://github.com/Kinect/RoomAliveToolkit