Movement sonification

> Sonification of body movement

The use of computer vision, body-tracking techniques, or electronic sensors whose collected data is used to generate or control sound.
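Across the works below the basic pipeline is the same: a sensor stream is mapped onto synthesis parameters. The following minimal sketch, not taken from any specific piece, maps a tracked vertical position to pitch and movement speed to loudness; the fake tracker data and the mapping ranges are illustrative stand-ins for a real camera, Kinect, or wearable sensor feeding a synthesis engine.

```python
def height_to_frequency(y, y_min=0.0, y_max=2.0, f_min=110.0, f_max=880.0):
    """Map a vertical position in metres to a frequency in Hz."""
    t = min(max((y - y_min) / (y_max - y_min), 0.0), 1.0)
    return f_min + t * (f_max - f_min)


def speed_to_amplitude(prev_y, y, dt, max_speed=3.0):
    """Map movement speed (m/s) to an amplitude between 0 and 1."""
    speed = abs(y - prev_y) / dt
    return min(speed / max_speed, 1.0)


# Fake motion data standing in for a body tracker (heights in metres).
positions = [0.2, 0.5, 1.1, 1.6, 1.3, 0.8]
prev = positions[0]
for y in positions[1:]:
    freq = height_to_frequency(y)
    amp = speed_to_amplitude(prev, y, dt=0.1)
    print(f"y={y:.2f} m -> {freq:.1f} Hz, amplitude {amp:.2f}")
    prev = y
```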

The Terpsitone 1932
“A dancer may create music by the movements of her body. A capacity device in the floor is mainly responsible. – The dancers dancing in tune.” (Leon Theremin)
The Terpsitone was an electronic musical instrument that used the capacitance of the dancer’s body to vary the pitch of its oscillators. It was created by Leon Theremin as an adaptation of the Theremin intended for dancers, who could use their whole body to create sound. The instrument had a metallic conductive platform on which the dancer moved: vertical movements changed the pitch, while volume was controlled by the distance from the back of the platform. More information here and here.
For a deeper look at the Terpsitone, the original Theremin and its variations, and other experimental sound instruments from early 20th-century Russia, see the book Sound in Z – Experiments in Sound and Electronic Music by Andrei Smirnov.

Seine hohle form 2000

“It is a fragment from the prose poem “Gesichter” by Rainer Maria Rilke, roughly translating to “its hollow form.” As a starting point for this interactive work, it serves as an emblem for the interesting challenge of creating a musical work that only exists when a dancer moves. Using real-time synthesis and video tracking technology, the choreography is affected by the live generation of sound through sensors and computer systems, and the music is in turn shaped by these movements.”
Music & Interactive Programming: Butch Rovan, Video Tracking System: Frieder Weiss, Choreography: Robert Wechsler, Dance: Robert Wechsler / Laura Warren. More info here.

 

AUDFIT 2014
“AUDFIT is a dance costume that precisely reads the movement and position of the dancer’s body using motion detectors and accelerometers designed by Krystian Klimowski. Data that is thus provided is converted in real time into sounds. The audience listens to the music created from the dancer’s movement through headphones (Silent Disco headphone sets), with a choice of three CB radio channels, each expressing the sound in a completely different manner.”

“In the AUDFIT project the music is generated by the movement – even the tiniest move of the body releases a sound sequence and electronic tunes. The body of the dancer is mapped by 9 strategic points that orient its position in space. In AUDFIT the movement creates the sound, which is integral with the action; dance becomes an instrument, a programmer of audio-visual perception formed in the skull of the recipient.”
By Strangeloop. Marta Romaszkan – dance/movement, Krystian Klimowski – physical computing, sensor programming, Patryk Lichota – conception and sound programming.
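The project description suggests a simple interaction model: each body-mounted sensor fires its own sound sequence when it registers enough acceleration. The sketch below is a hypothetical illustration of that idea; the sensor names, threshold, and trigger routine are invented, not taken from AUDFIT.

```python
import math

# Nine illustrative body points; the real costume's sensor layout is not documented here.
SENSORS = ["head", "chest", "l_arm", "r_arm", "l_hand", "r_hand", "hips", "l_leg", "r_leg"]
THRESHOLD = 1.5  # m/s^2, an arbitrary trigger level


def magnitude(ax, ay, az):
    """Length of a 3-axis acceleration vector."""
    return math.sqrt(ax * ax + ay * ay + az * az)


def process_frame(frame):
    """frame: dict of sensor name -> (ax, ay, az). Returns sensors that should trigger."""
    return [name for name, acc in frame.items() if magnitude(*acc) > THRESHOLD]


# One fake frame of sensor data: the body is mostly still,
# except for a small gesture of the right hand.
frame = {name: (0.1, 0.0, 0.2) for name in SENSORS}
frame["r_hand"] = (1.2, 0.9, 1.1)
for sensor in process_frame(frame):
    print(f"trigger sound sequence for {sensor}")
```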

 

Ominous 2014
“Similarly to a mime, I model malleable sonic matter produced by my body. By using the wearable biotech instrument “Xth Sense”, the bioacoustic sound of muscle contractions is amplified, digitally processed, and played back through eight loudspeakers and subwoofers. The natural sound of my muscles and its virtual counterpart blend together into an unstable sonic object.”

“This oscillates between a state of high density and one of violent release. As the listeners imagine the object’s shape by following the gesture, the sonic stimuli induce a perceptual coupling. The listeners see through sound the sculpture which their sight cannot perceive.” By Marco Donnarumma. More info here.

 

Experiments in choreography and 3D sound 2013


“Sound spatialization and choreography experimentation” by Sebasthiankurth. Performance by Sebastian Kurth & Mario Athanasiou. Spatialization setup: Johannes Schütt.


 

Sonic dance
Body motion instruments developed by Pablo Palacio and Muriel Romero.
Interactive Dynamic Stochastic Step Size Modulator
“This instrument is an interactive version of a dynamic stochastic synthesizer. This method, initially devised by Iannis Xenakis, employs simulated Brownian motion as a stochastic mechanism to directly generate the sound pressure curve via the manipulation of individual digital samples.”
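Since the quote names the technique, a compact sketch of dynamic stochastic synthesis may help: the waveform is built from a handful of breakpoints whose amplitudes and durations drift by bounded random walks, so the sound pressure curve itself is generated stochastically. All parameter values here are arbitrary, and the interactive version described above steers them from motion data rather than leaving them fixed.

```python
import random, struct, wave

SR = 44100
N_POINTS = 12                   # breakpoints per waveform cycle
amp = [0.0] * N_POINTS          # breakpoint amplitudes (-1..1)
dur = [40] * N_POINTS           # breakpoint durations in samples


def next_cycle():
    """Random-walk the breakpoints, then interpolate one cycle of samples."""
    for i in range(N_POINTS):
        amp[i] = max(-1.0, min(1.0, amp[i] + random.uniform(-0.1, 0.1)))
        dur[i] = max(10, min(120, dur[i] + random.randint(-2, 2)))
    samples = []
    for i in range(N_POINTS):
        a0, a1 = amp[i], amp[(i + 1) % N_POINTS]
        for k in range(dur[i]):
            samples.append(a0 + (a1 - a0) * k / dur[i])
    return samples


out = []
while len(out) < SR * 3:        # about three seconds of sound
    out.extend(next_cycle())

with wave.open("gendyn_sketch.wav", "w") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(SR)
    f.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in out))
```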

 

STOCOS (motion tracking detail)
“This interactive instrument combines artificial intelligence simulations of swarm behaviour with dynamic stochastic synthesis. The first 5 movements of the dancer give rise to 5 agents that will stop moving and freeze into a stochastic chord when the dancer moves, and dance frantically when the dancer stops. Accordingly, this meta-instrument is ultimately performed by a hybrid sextet comprising 5 artificial participants and a natural one, the dancer.”


By Pablo Palacio and Muriel Romero. More instruments can be seen here.
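The STOCOS description above amounts to an inverted coupling between dancer and agents. The toy sketch below illustrates that logic with five agents that hold a frozen “chord” of pitches while the dancer moves and wander stochastically while the dancer is still; the agent model and the pitch mapping are stand-ins, not the actual implementation.

```python
import random


class Agent:
    def __init__(self):
        self.position = random.uniform(0.0, 1.0)   # abstract 1-D position

    def pitch(self):
        # Map position to a frequency between 100 and 900 Hz.
        return 100.0 + self.position * 800.0

    def step(self, dancer_moving):
        # Agents only "dance" (random-walk) while the dancer is still.
        if not dancer_moving:
            self.position = min(1.0, max(0.0, self.position + random.uniform(-0.05, 0.05)))


agents = [Agent() for _ in range(5)]
for frame, dancer_moving in enumerate([True, True, False, False, True]):
    for a in agents:
        a.step(dancer_moving)
    chord = [round(a.pitch(), 1) for a in agents]
    print(f"frame {frame}: dancer {'moving' if dancer_moving else 'still'} -> {chord}")
```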

 

EPI::AKTEN (Motion Tracking Experiments)

“We used a combination of technologies to create a flexible and transportable low-impact energy motion-tracking system. Through the SenseStage microcontroller boards developed by Marije Baalman at STEIM we can determine the acceleration of joint positions wirelessly.”
“Through the ASUS Xtion PRO LIVE interface we can determine the skeleton joint positions in 2D space. We used network communication to transfer the bodily information to a second computer for sound design, which used pre-recorded sound of the Montealbano village in a concatenative synthesis fashion. This added a sense of location specificity to the raw technology.”

“The dancer was inspired by the metaphors of body giving/body receiving in her actions and could autonomously construct several improvisations, which were always very creative and different. The aesthetic imposed by the developed technology plays a lot with the synchronicity of machine-human action-reaction. Mostly the dancer decides when to trigger sound events, but sometimes an unexpected sound triggers the creativity of the dancer, which adds a random creative element to the discourse.”
By Federica Dauri – Movement Improvisation, Antoni Rijekoff and JesterN a.k.a. Alberto Novello: Motion tracking and Sound Design.
More info at Transformatorio and on the Ikpan website.
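The hand-off described above (tracking on one computer, sound design on another) is a common pattern. The sketch below shows one hypothetical way to do it, serialising joint positions as JSON over UDP; the address, port, message format, and fake skeleton are assumptions, and the actual piece used the SenseStage boards, the Xtion interface, and concatenative synthesis.

```python
import json, random, socket, time

SOUND_HOST, SOUND_PORT = "192.168.0.20", 9000   # hypothetical sound computer
JOINTS = ["head", "torso", "l_hand", "r_hand", "l_foot", "r_foot"]

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)


def fake_skeleton():
    """Stand-in for a real skeleton tracker: random 2-D joint positions."""
    return {j: [random.random(), random.random()] for j in JOINTS}


for _ in range(5):                        # a few frames of tracking data
    message = json.dumps({"t": time.time(), "joints": fake_skeleton()})
    sock.sendto(message.encode("utf-8"), (SOUND_HOST, SOUND_PORT))
    time.sleep(1 / 30)                    # roughly 30 frames per second
```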

Interactive music for dance performance

Work in progress. Experiment #1.
“Real-time sound is generated according to the dancer’s motion data, acquired through an inertial motion capture system (motioner). openFrameworks (RAM Dance Toolkit) + MaxMSP.”


By João Menezes. More info here.

 

Traces
“Microphones attached to Sebastian’s wrists capture the sound of his movement. The sound is digitally manipulated and played back through a surround sound system. None of the sounds are prerecorded. Part of an ongoing collaboration with Sebastian Kurth.”


By Marios Athanasiou.

 

> Machines that are put in motion by sequenced/generative data or live triggering

The sound they produce is the result of their mechanisms in motion.

image
Pic. “Radius Music”, Dave Young

Radius Music
“Radius Music combines ideas of cartography and graphic scores as a means to produce sound. The device itself is an autonomous revolving machine that reads a distance value in real time between itself and another object. As the machine slowly rotates and scans the room, it takes this radial distance and outputs it as a relative sonic frequency and a corresponding visual score.”


More info here. By Dave Young.
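The core mapping, a radial distance reading converted to a frequency as the device rotates, can be sketched in a few lines. The distance range, frequency range, and simulated scan below are assumptions, not values from the actual installation.

```python
def distance_to_frequency(d, d_min=0.2, d_max=5.0, f_min=80.0, f_max=1600.0):
    """Map a radial distance in metres to a frequency in Hz (closer = higher)."""
    t = min(max((d - d_min) / (d_max - d_min), 0.0), 1.0)
    return f_max - t * (f_max - f_min)


# One simulated revolution: a fake distance reading every 10 degrees.
angles = range(0, 360, 10)
scan = [1.0 + 0.5 * (angle % 90) / 90 for angle in angles]
for angle, d in zip(angles, scan):
    print(f"{angle:3d} deg: {d:.2f} m -> {distance_to_frequency(d):.1f} Hz")
```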

 

Signal To Noise
“Signal To Noise is a kinetic installation immersing the spectator in patterns of sonic motion, based on generative principles executed by 512 mechanical split-flaps.”
“The work consists of a 3.40 m circular structure containing 4 horizontal rows of 128 split-flaps at eye height. The external surface exposes the stripped-back technology of the split-flaps and driver boards, while the internal surfaces reveal the characters of the split-flaps. The circular installation invites the visitor to plunge into a kinetic composition in the midst of the eternal calculation process of an auto-poetic machine. The split-flaps are constantly spinning at a variable speed/rhythm that depends on the underlying algorithm, analyzing in the maze of information the appearance of a word-equal-meaning.”


By LAb[au] – laboratory for architecture and urbanism (Manuel Abendroth, Jérôme Decock, Els Vermang)

 

Solenoid β
“Eight tap shoes are set in a symmetrical circular pattern. This allows the kinetic elements of the installation to produce three-dimensional sounds in relation to the listener / observer. The shoes are attached to simple robotic structures which utilize pneumatic actuators and solenoid valves. This enables the shoes to be moved in a multitude of ways with each movement having a distinct characteristic sound. The movements of the shoes are sequenced with a computer to produce an auditory performance.”


By Peter William Holden
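The phrase “sequenced with a computer” suggests an ordinary step sequencer driving the eight shoes. The sketch below illustrates that idea; the pattern and the trigger routine are invented for the example, and the installation itself drives pneumatic valves rather than printing.

```python
import time

BPM = 120
STEP = 60.0 / BPM / 2          # eighth-note step duration in seconds

# One bar: each row is a step, each column one of the eight shoes (1 = strike).
pattern = [
    [1, 0, 0, 0, 1, 0, 0, 0],
    [0, 1, 0, 0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0, 0, 1, 0],
    [0, 0, 0, 1, 0, 0, 0, 1],
]


def trigger(shoe):
    print(f"strike shoe {shoe}")   # stand-in for opening a solenoid valve


for step in pattern:
    for shoe, on in enumerate(step):
        if on:
            trigger(shoe)
    time.sleep(STEP)
```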

 

Frequencies [A]
“Frequencies (a) is a sound performance combining the sound of mechanically triggered tuning forks with pure digital soundwaves. The performer triggers sequences from the computer, activating solenoids that hit the tuning forks with high precision. Streams of light burst in synchronicity with the forks, creating a not-quite-minimal sound and light composition.”


By Nicolas Bernier

David Rokeby was part of a group of pioneers, alongside Erkki Kurenniemi, who developed interactive systems that allow the human body to be used as an instrument: moving the body produces direct sound feedback, and different parts of the body control different parameters of the sound.
His early explorations of computer-based interactive systems defined metaphors and interactions that are the basis of many of today’s interactive systems.

image
Image from the 16×16 pixel camera tracking Rokeby’s movements

 

Reflexions 1983
“Reflexions was my first interactive sound installation. I constructed some very bulky 8 x 8 pixel video cameras (the large black box over the monitor in the image), connected them to a wire-wrapped card in the Apple ][ which digitized the images, and wrote a program in 6502 assembly code for the Apple ][ which controlled a Korg MS-20 analog synthesizer to make sounds in response to the movements seen by the cameras. Movement also controlled the volume of two tape loops of water sounds. The synthesizer and water sounds were mixed individually to 4 speakers in a tetrahedron (one on the ceiling and three in a triangle on the floor). The sounds would move around you in addition to responding to your movement.” (Rokeby)


More info on Reflexions here.
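The core idea, differencing successive low-resolution camera frames and letting the amount of change drive a sound parameter, can be sketched as follows. The 8 x 8 frames here are fabricated, and the original of course ran in 6502 assembly and drove a Korg MS-20 rather than printing control values.

```python
import random


def frame_difference(prev, cur):
    """Sum of absolute pixel differences between two 8x8 frames."""
    return sum(abs(a - b) for row_p, row_c in zip(prev, cur)
               for a, b in zip(row_p, row_c))


def random_frame(activity):
    """Fake 8x8 grayscale frame; 'activity' controls how much it varies."""
    return [[random.randint(0, activity) for _ in range(8)] for _ in range(8)]


prev = random_frame(0)
for activity in [0, 2, 10, 40, 5]:
    cur = random_frame(activity)
    motion = frame_difference(prev, cur)
    volume = min(motion / 1000.0, 1.0)      # normalise to a 0..1 control value
    print(f"motion {motion:4d} -> volume {volume:.2f}")
    prev = cur
```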

 

Very Nervous System 1986
Very Nervous System was an evolution of Rokeby’s previous interactive sound installations (Reflexions and Body Language).

Human gestures and movements are captured through a video camera and translated in real time into improvised music that reflects and reacts to the qualities of the movement.

image
Interactive system scheme (left). Audience interacting with V.N.S. (right)

“I use video cameras, image processors, computers, synthesisers and a sound system to create a space in which the movements of one’s body create sound and/or music. It has been primarily presented as an installation in galleries but has also been installed in public outdoor spaces, and has been used in a number of performances.” (Rokeby)

Rokeby interacting with Very Nervous System in 1991

“The installation is a complex but quick feedback loop. The feedback is not simply ‘negative’ or ‘positive’, inhibitory or reinforcing; the loop is subject to constant transformation as the elements, human and computer, change in response to each other. The two interpenetrate, until the notion of control is lost and the relationship becomes encounter and involvement.
The diffuse, parallel nature of the interaction and the intensity of the interactive feedback loop can produce a state that is almost shamanistic. The self expands (and loses itself) to fill the installation environment, and by implication the world. After 15 minutes in the installation people often feel an after image of the experience, feeling directly involved in the random actions of the street.” (Rokeby).

image
One of the 3 cameras used to track the interactive space (left). Rokeby performing with Very Nervous System in Dam Street, 1993 (right).

More info on Very Nervous System here.

 

International Feel (2011)
“International Feel” was an interactive sound installation made for the Strategic Arts Initiative 2.0 (2011) exhibition, and it was an update of “Body Language”, made for the same exhibition in 1986.

“International Feel” is a telematic version of Very Nervous System. Two systems are installed in two different physical locations, the visitors of both systems meet in cyberspace and interact together to create a collaborative soundscape.

image
A visitor interacting with the installation in Toronto (left), and David Rokeby in Rotterdam (right).

“For “International Feel” I created identical 2.8 x 2.8 meter spaces in Toronto at Inter/Access and in Rotterdam at V2. The Kinect sensor in each space captured the depth image of whoever was in this space, and translated it into a “bubble-body”, a set of spheres that built up an approximate representation of that body in space.
This body data was transmitted over the internet to the other location, allowing each installation to place both virtual bubble-bodies into an imaginary shared space. The spaces were outfitted with directional sound, and this was used to give a sense of the location of the other person in your shared space. If there was no contact between the bodies, there was the sound of breathing, coming from the exact direction of your invisible partner. By moving toward the sound of breathing, you could attempt to touch the other virtual body. On contact, other sounds emerged, with the sounds changing to indicate how much of your body was in contact with the remote body, and the directionality of the sound intensified to give precise cues as to the direction to move to maximize contact. The sense of physical engagement was very powerful. One found oneself almost bouncing off the remote person’s body on contact.”
(Rokeby).

More info about “International Feel” here.
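Rokeby’s description implies two pieces of logic: counting contacts between the local and remote sets of spheres, and deriving a direction for the spatialised sound. The sketch below illustrates both under invented radii and positions; it is not the installation’s code.

```python
import math


def overlapping_pairs(body_a, body_b):
    """Count sphere pairs (centre, radius) from the two bodies that touch."""
    count = 0
    for (ca, ra) in body_a:
        for (cb, rb) in body_b:
            if math.dist(ca, cb) < ra + rb:
                count += 1
    return count


def azimuth_to(body, listener=(0.0, 0.0, 0.0)):
    """Horizontal angle (degrees) from the listener to the body's centroid."""
    cx = sum(c[0] for c, _ in body) / len(body)
    cz = sum(c[2] for c, _ in body) / len(body)
    return math.degrees(math.atan2(cx - listener[0], cz - listener[2]))


# Two tiny "bubble-bodies": lists of (centre, radius) spheres, values invented.
local = [((0.0, 1.0, 1.0), 0.15), ((0.0, 1.5, 1.0), 0.12)]
remote = [((0.1, 1.0, 1.1), 0.15), ((1.2, 1.5, 2.0), 0.12)]

contacts = overlapping_pairs(local, remote)
print(f"contacts: {contacts}, contact level: {min(contacts / 10.0, 1.0):.2f}")
print(f"partner azimuth: {azimuth_to(remote):.1f} deg")
```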
