THE ELECTRIC BODY PROJECT:
Gestural Sampling and Mapping Techniques for Dance Choreography
(Applications Sketch for Siggraph '97)
By Thecla Schiphorst (thecla@cs.sfu.ca) & Sang Mah
The Electric Body Project
The Electric Body Project is a software tool for creating choreography using gestural sampling and body-mapping techniques. Movement is sampled using Ascension Technologies' Flock of Birds, a six-degree-of-freedom motion capture system.
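The samples themselves are time-stamped six-degree-of-freedom records, one per sensor. As a rough illustration only (the Sample class and its field names below are assumptions for the sketch, not the project's actual data format), a single sample might be represented like this:

    from dataclasses import dataclass

    @dataclass
    class Sample:
        sensor_id: int      # which sensor on the performer's body produced the sample
        timestamp: float    # capture time in seconds
        position: tuple     # (x, y, z) translation reported by the sensor
        orientation: tuple  # (azimuth, elevation, roll) rotation reported by the sensor

    # A movement phrase is then an ordered sequence of such samples per sensor.
    phrase = [Sample(0, 0.0, (0.0, 1.2, 0.3), (0.0, 15.0, 0.0))]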
The Electric Body Project is a standalone software package running on the SGI platform, designed to work as a plug-in to the Life Forms choreographic software. Its output is file-format compatible with Life Forms, so that sampled movement can be used for both choreography and animation. What was previously accessible only through synthesized movement phrases in Life Forms can now be accessed and explored with real-time sampled movement phrases input directly from a participant "dancing" within the system.
In The Electric Body Project, the participant's movement is literally sampled and displayed, and metaphorically treated to influence, direct, and determine what is presented and represented visually. The Electric Body uses a series of mapping systems that originate from the physical mirrored body and result in the transformed mapped body. Mapping strategies include 'imprinting', 'following', and 'tracing', concepts informed by choreographic studio techniques. Sampled movement is treated using inverse kinematics techniques.
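To give a flavor of how such mapping strategies might operate on sampled poses, the sketch below models a pose as a simple dictionary of joint values and offers one possible reading of 'imprinting', 'following', and 'tracing'. The function names, signatures, and exact behavior of each strategy are illustrative assumptions, not the project's implementation.

    # One possible reading of the three mapping strategies, operating on a pose
    # represented as a dict of joint values; all details here are assumptions.

    def imprint(mirrored_pose, canvas):
        """Leave a copy of the current pose behind on a growing 'canvas' of past poses."""
        canvas.append(dict(mirrored_pose))
        return mirrored_pose

    def follow(mirrored_pose, history, delay=10):
        """Return the pose from `delay` frames ago, so the mapped body trails the dancer."""
        history.append(dict(mirrored_pose))
        return history[max(0, len(history) - 1 - delay)]

    def trace(mirrored_pose, path):
        """Accumulate the trajectory of a single joint, e.g. the right wrist."""
        path.append(mirrored_pose.get("right_wrist"))
        return mirrored_pose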
The Electric Body extends the literal, photo-realistic world model of motion capture by introducing the concept of metaphorical mapping, drawn from the compositional practices used to choreograph movement for dance in the studio. Metaphorical mapping means that a sampled input gesture can be transformed using user-defined maps that are either predefined or created on the fly during the data-capture process. Sampled input data can be amplified, filtered, mapped to alternate limbs or limb groups, or distributed to multiple limb groups using acceleration (a gradual increase in the range of movement over time) or deceleration (a gradual decrease in the range of movement over time). User maps can also be combined over time to produce additive phasing. These filters can be applied to sampled movement without limit, making it possible to define movement variations in layers, much as images are layered in Photoshop. Together these techniques provide a mechanism for composing movement that parallels sound filtering in music composition and image filtering in image processing.
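As a concrete illustration of this layered filtering, the sketch below treats a phrase as a list of frames, each a dictionary of joint values, and chains a few user maps on top of one another. The filter names (amplify, remap_limb, accelerate, layer) and the phrase representation are assumptions made for the example, not the actual Electric Body filter set.

    # Hedged sketch of layered filtering of a sampled phrase; a phrase is a
    # list of frames, each frame a dict mapping joint names to values.

    def amplify(phrase, gain):
        """Scale the range of movement uniformly."""
        return [{j: a * gain for j, a in frame.items()} for frame in phrase]

    def remap_limb(phrase, src, dst):
        """Copy the sampled values of one limb onto another (e.g. arm onto leg)."""
        return [dict(frame, **{dst: frame[src]}) for frame in phrase]

    def accelerate(phrase):
        """Gradually increase the range of movement over the length of the phrase."""
        n = max(1, len(phrase) - 1)
        return [{j: a * (1.0 + i / n) for j, a in frame.items()}
                for i, frame in enumerate(phrase)]

    def layer(phrase, *filters):
        """Apply user maps one on top of another, Photoshop-layer style."""
        for f in filters:
            phrase = f(phrase)
        return phrase

    # Example: amplify a phrase, echo the right arm onto the left leg, then ramp it.
    raw = [{"right_arm": 0.2, "left_leg": 0.0}, {"right_arm": 0.4, "left_leg": 0.1}]
    composed = layer(raw,
                     lambda p: amplify(p, 1.5),
                     lambda p: remap_limb(p, "right_arm", "left_leg"),
                     accelerate)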
Users can work directly with the computer-monitor display or, in a performance situation, interact with their own life-size projected images. The performer can replicate and enhance those images, and encounter and respond to imprints of their own projected movement. This enables a dancer to use characteristics of their own gesture to compose and influence movement created through the filtering process. Gesture lets the participant compose their own place within a dance, modify the timing and placement of visual images, and select and integrate movement phrases. In performance, this real-time movement-exploration system can create virtual dancers whose movement is generated directly from the live performer, producing a complex responsive system.
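The real-time behavior can be pictured as a simple loop: sample the live performer, pass the mirrored pose through whatever user maps are currently active, and render both the mirror and its transformation. The sketch below is a minimal, assumed outline of such a loop; get_sample, active_maps, render, and running are placeholders rather than Electric Body APIs.

    # Assumed outline of the real-time performance loop; every name here is a
    # placeholder supplied by the caller, not part of the actual system.

    def performance_loop(get_sample, active_maps, render, running):
        while running():
            mirrored_pose = get_sample()        # live 6-DOF input from the dancer
            mapped_pose = mirrored_pose
            for user_map in active_maps():      # maps may be added or changed on the fly
                mapped_pose = user_map(mapped_pose)
            render(mirrored_pose, mapped_pose)  # display both the mirror and the transform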
Funding:
This project has been sponsored by the Media Arts Section of the Canada Council.