Live Projection Mapping Software

Project Update:

Andy completed the second phase of this project for a December final project showcase. In the video above, he explains his proposed method for real-time, markerless facial projection mapping.

Summary

The goal of this project is to achieve real-time projection mapping and masking onto moving individuals using Intel RealSense™ technology and a custom software solution. The project will deploy in two phases: the first, static map rasterization, completed by the midterm presentation date and showcased with a brief artistic demonstration; the second, continuous masking and mapping, prepared for a final project showcase later in the semester. A stereoscopic IR sensor, the Intel RealSense D435 depth camera, was purchased for use alongside the Intel RealSense™ SDK 2.0. The supporting application will be developed in C++ with the Visual Studio IDE for deployment in a Windows environment. The hope is for this system to be a worthy replacement for the expensive, over-engineered solutions currently used in the rare instances of real-time mapping onto live choreography.
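For readers curious what the SDK side of this looks like, here is a minimal sketch of opening the D435 and reading a single depth frame through the RealSense™ SDK 2.0 (librealsense2) in C++. The 848x480 @ 30 fps stream mode and the console printout are illustrative assumptions rather than settings from the actual application, and error handling is omitted.

    // Minimal sketch: open the D435 and read one depth frame with
    // the Intel RealSense SDK 2.0 (librealsense2).
    #include <librealsense2/rs.hpp>
    #include <iostream>

    int main() {
        rs2::pipeline pipe;   // owns the streaming session
        rs2::config cfg;
        // Request the stereo depth stream; 848x480 @ 30 fps is a
        // common D435 mode (assumed here for illustration).
        cfg.enable_stream(RS2_STREAM_DEPTH, 848, 480, RS2_FORMAT_Z16, 30);
        pipe.start(cfg);

        rs2::frameset frames = pipe.wait_for_frames();  // blocks for a coherent set
        rs2::depth_frame depth = frames.get_depth_frame();

        // Distance, in meters, at the center pixel.
        float meters = depth.get_distance(depth.get_width() / 2,
                                          depth.get_height() / 2);
        std::cout << "Center pixel depth: " << meters << " m" << std::endl;
        return 0;
    }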

System Vision

The system designed for this project will read stereoscopic IR depth data from the D435 sensor, placed just above the projector lens, and relay that information to a host computer running a custom RealSense™ application. The program interprets depth data in a 1-10 m range to both map and mask the projection media, which is received via capture card from a conventional media cue list hosted in QLab and digitally projected into the 3D depth-sensing environment. This effectively provides a "texture" for objects (or people) rendered in the defined space while simultaneously producing an implicit mask for out-of-range elements. Each resulting frame is then rasterized by the program and sent to the projector. In parallel, Isadora, configured as a mezzanine interceptor for the QLab video signal, will perform basic object tracking using the RGB input from the D435 sensor. The resulting modifications to the source video are then passed into the custom 3D environment before frames are rasterized and masked, achieving a basic real-time map. Ideally, the mezzanine would be replaced with native RealSense™ functionality, but the existence of such features is presently unknown.
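To make the implicit masking idea concrete, the hypothetical helper below thresholds each depth pixel against the active 1-10 m range, producing an 8-bit mask (255 = show the projected media, 0 = black) that could be multiplied against the incoming QLab video frame before rasterization. The function name and per-pixel loop are illustrative only and are not taken from the actual application.

    // Hypothetical sketch of the implicit masking step: pixels whose
    // depth falls inside the active range keep the projected texture;
    // everything else is masked out. Assumes a depth frame obtained
    // as in the previous sketch.
    #include <librealsense2/rs.hpp>
    #include <vector>
    #include <cstdint>

    // Build an 8-bit mask (255 = show media, 0 = black) from a depth frame.
    std::vector<uint8_t> build_mask(const rs2::depth_frame& depth,
                                    float near_m = 1.0f, float far_m = 10.0f) {
        const int w = depth.get_width();
        const int h = depth.get_height();
        std::vector<uint8_t> mask(static_cast<size_t>(w) * h, 0);

        for (int y = 0; y < h; ++y) {
            for (int x = 0; x < w; ++x) {
                float d = depth.get_distance(x, y);  // meters; 0 means no data
                if (d >= near_m && d <= far_m)
                    mask[static_cast<size_t>(y) * w + x] = 255;
            }
        }
        return mask;  // multiply against the video frame before rasterizing
    }

In a real-time loop, reading the raw 16-bit buffer via get_data() and scaling by the sensor's depth scale would be considerably faster than calling get_distance() per pixel, but the per-pixel form keeps the idea legible.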

Deliverables

Several intermediate goals are defined in preparation for the final project presentation in December. Building first toward the midterm presentation, phase one of the RealSense™ application development must be completed at least one week before the presentation to allow sufficient design time for the projected media. This presentation will ideally feature projection masking around several articles of clothing placed around the stage shortly before the performance, highlighting the speed at which a map can be generated in the proposed workflow, even for chaotic or random geometries. After the midterm, the full application will be developed alongside a choreographed performance, likely created in conjunction with a dance student, before video elements are introduced in the final phases of the software development process. The final presentation deliverable would ideally consist of live projection mapping and masking onto a highly dynamic dancer, augmented by affiliated video designs.

Space Shirt Demo
Live Static Mask Demo
