Paper
28 February 2014
Interactive projection for aerial dance using depth sensing camera
Tammuz Dubnov, Zachary Seldess, Shlomo Dubnov
Proceedings Volume 9012, The Engineering Reality of Virtual Reality 2014; 901202 (2014) https://doi.org/10.1117/12.2041905
Event: IS&T/SPIE Electronic Imaging, 2014, San Francisco, California, United States
Abstract
This paper describes an interactive performance system for floor and aerial dance that controls visual and sonic aspects of the presentation via a depth-sensing camera (MS Kinect). In order to detect, measure, and track free movement in space, three-degree-of-freedom (3-DOF) tracking (on the ground and in the air) is performed using IR markers. Gesture tracking and recognition are performed using a simplified HMM model that allows robust mapping of the actor's actions to graphics and sound. Additional visual effects are achieved by segmenting the actor's body based on depth information, allowing projection of separate imagery onto the performer and the backdrop. Artistic use of augmented reality performance relative to more traditional concepts of stage design and dramaturgy is discussed.
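The abstract does not give implementation details for the depth-based segmentation or the HMM gesture recognition. As a rough illustration of the kind of processing involved, the sketch below shows (a) a hypothetical depth-window segmentation of a Kinect depth frame into performer and backdrop regions, and (b) a generic forward-algorithm scorer that classifies a quantised marker trajectory against a set of per-gesture discrete HMMs. The depth window, the quantisation scheme, and the model parameters are all assumptions for illustration, not values from the paper.

import numpy as np

def segment_performer(depth_mm, near_mm=500, far_mm=3500):
    """Boolean mask: True where a pixel lies in the (assumed) performer depth
    window, False for the backdrop. depth_mm is a 2-D Kinect depth frame in mm."""
    valid = depth_mm > 0                      # Kinect reports 0 for unknown depth
    return valid & (depth_mm >= near_mm) & (depth_mm <= far_mm)

def composite_layers(performer_img, backdrop_img, mask):
    """Choose per pixel between the imagery projected on the performer and the
    imagery projected on the backdrop, using the segmentation mask."""
    return np.where(mask[..., None], performer_img, backdrop_img)

def log_forward(obs, log_pi, log_A, log_B):
    """Log-likelihood of an integer observation sequence under a discrete HMM,
    computed with the forward algorithm in log space.

    log_pi: (N,)   log initial-state probabilities
    log_A:  (N, N) log transition matrix, A[i, j] = P(next state j | state i)
    log_B:  (N, M) log emission matrix,   B[i, k] = P(symbol k | state i)
    """
    alpha = log_pi + log_B[:, obs[0]]
    for o in obs[1:]:
        # marginalise over the previous state, then emit the current symbol
        alpha = np.logaddexp.reduce(alpha[:, None] + log_A, axis=0) + log_B[:, o]
    return np.logaddexp.reduce(alpha)

def classify_gesture(obs, gesture_models):
    """Return the name of the gesture whose HMM assigns the sequence the highest
    likelihood. gesture_models maps name -> (log_pi, log_A, log_B)."""
    return max(gesture_models, key=lambda g: log_forward(obs, *gesture_models[g]))

In a system of the kind described, the classified gesture label would then be mapped to graphics and sound events, while the segmentation mask determines which projected content lands on the performer versus the backdrop.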
© (2014) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Tammuz Dubnov, Zachary Seldess, and Shlomo Dubnov "Interactive projection for aerial dance using depth sensing camera", Proc. SPIE 9012, The Engineering Reality of Virtual Reality 2014, 901202 (28 February 2014); https://doi.org/10.1117/12.2041905
CITATIONS
Cited by 3 scholarly publications.
KEYWORDS
Infrared imaging
Visualization
Cameras
Video
Infrared cameras
Augmented reality
3D modeling