SOUNDMAPPING:

INTERFACING AUDIO & VIDEO IMAGES, TECHNICAL INFORMATION

 

Michael Saup

Institut für Neue Medien, Frankfurt 1993


Abstract

 

This text demonstrates several approaches to interfacing audio and images, all done at the Institut für Neue Medien in Frankfurt, Germany. I strongly believe that any of the discussed topics could easily be applied to an interactive stage show using real-time graphics hardware, which no one has yet performed at this scale.

 

1. Introduction

For the past several years, the main focus of my work has been on controlling pictures through audio input and applying different set-ups in art and performance environments. There already exists some tradition of synchronising sound to images, for example in multimedia installations or video games.

A main fact is that most artistic electronic work, in the sense of craftsmanship, can be done automatically by computer instead of by an operator. There already exist large libraries of file formats, both for audio and for images, that serve this purpose, such as Standard MIDI Files (SMF), AIFF, TIFF and so on. Along with sophisticated hardware and software, there should now exist enough tools for interfacing pictures with sound, even in real-time applications. These set-ups could then be used for live stage shows or various interactive purposes. Even the production of video clips could be automated in very complex ways. Instead of exhausting frame-by-frame rendering of images, 3D models could be extracted directly from video images or audio input in any format, and the effects could be seen just as they happen.

2. Input materials

2.1. Audio

Audio data can be supplied by the following sources:

- Standard MIDI files (created by software sequencers like Cubase, Vision, Digital Performer...)

- MIDI data (created by MIDI keyboards, MIDI guitars)

- Analogue sound sources (violins, space shuttle, voice, atomic explosion...)

- Pre-recorded samples on hard disk

2.2. Images

Image data can be supplied by the following sources:

- Live camera input

- Laser disk

- Hard disk

- Video tape

- Workstations with real-time graphics capabilities

By combining these different devices, it is possible to control images by musical parameters such as frequency, amplitude, spectrum or rhythm.
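As a minimal sketch of what such a mapping could look like, the following (my own illustrative code, not part of the original set-ups) extracts two of the named parameters, amplitude and a rough frequency, from one block of audio samples and maps them to hypothetical image parameters:

```python
import math

def audio_features(samples):
    """Extract simple control parameters from one block of audio samples.

    Returns (rms, zcr): the RMS amplitude and the zero-crossing rate
    in cycles per sample (a crude frequency estimate)."""
    n = len(samples)
    rms = math.sqrt(sum(s * s for s in samples) / n)
    crossings = sum(1 for a, b in zip(samples, samples[1:])
                    if (a < 0) != (b < 0))
    zcr = crossings / (2.0 * (n - 1))  # two crossings per cycle
    return rms, zcr

def map_to_image(rms, zcr, sample_rate=44100):
    """Map audio features to image parameters (an arbitrary choice here):
    amplitude -> brightness (0..255), frequency -> hue (0..360)."""
    brightness = min(255, int(rms * 255 * math.sqrt(2)))  # full-scale sine -> 255
    freq_hz = zcr * sample_rate
    hue = freq_hz % 360
    return brightness, hue

# Example: one block of a 440 Hz sine wave at half amplitude
sr = 44100
block = [0.5 * math.sin(2 * math.pi * 440 * t / sr) for t in range(1024)]
rms, zcr = audio_features(block)
brightness, hue = map_to_image(rms, zcr, sr)
```

The specific feature-to-parameter assignments are assumptions; in a live set-up any audio feature could drive any image parameter.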

3. Image manipulation

3.1. Non-Real-time

Non-real-time techniques can supply visual footage for laser disks, hard disks or video tapes that can be triggered later on. Mainly, these techniques include algorithms that require large amounts of calculation time:

- Digital signal processing: FFT, filtering, edge detection...

- traditional 2D or 3D animation

- complex audio-triggered image synthesis

- standard video production
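The first item above, Fourier analysis, is the core of most offline audio-to-image work. A minimal sketch (a naive DFT rather than a fast FFT, purely for illustration; frame size and output format are my own choices):

```python
import cmath
import math

def dft_magnitudes(samples):
    """Naive discrete Fourier transform, returning the magnitude of each
    frequency bin up to the Nyquist bin. O(n^2), which is acceptable for
    non-real-time analysis of short frames."""
    n = len(samples)
    mags = []
    for k in range(n // 2 + 1):
        acc = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                  for t in range(n))
        mags.append(abs(acc) / n)
    return mags

# A pure tone landing exactly on bin 4 of a 64-sample frame
n = 64
frame = [math.sin(2 * math.pi * 4 * t / n) for t in range(n)]
mags = dft_magnitudes(frame)
peak_bin = max(range(len(mags)), key=mags.__getitem__)
```

The resulting magnitude spectrum per frame is exactly the kind of data that can later be mapped onto colour, displacement or cut decisions.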

3.2. Real-time

Real-time techniques can be used to manipulate image data in live interaction:

- live mixing with video devices

- computer-controlled hard disk and laser disk access

- mapping of raw or interpreted audio data on 3D grids or models

- warping

- morphing
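The third technique in the list above, mapping raw audio data onto a 3D grid, can be sketched very simply: each audio sample displaces one vertex of the grid (my own minimal illustration, not the original implementation):

```python
def displace_grid(width, height, samples, gain=1.0):
    """Map one block of audio samples onto the z-coordinates of a
    width x height grid of vertices, row by row. The block is assumed to
    hold at least width*height samples; extra samples are ignored."""
    grid = []
    for y in range(height):
        row = []
        for x in range(width):
            z = gain * samples[y * width + x]
            row.append((x, y, z))
        grid.append(row)
    return grid

# A tiny 2x2 grid displaced by four raw samples
samples = [0.0, 0.5, -0.5, 1.0]
grid = displace_grid(2, 2, samples, gain=2.0)
```

In a real-time set-up, re-running this per audio block makes the surface ripple with the sound.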

4. Sound synthesis

Along with the manipulation of image data, sound can also be processed or changed with the help of MIDI-controllable devices like samplers, mixing consoles, effects units or light controllers.

Auto composition:

- MIDI brain: listens to the input and supplies a musical backup

- MIDI cloning: imitates the input

- mathematical methods: random walk, Feigenbaum, scaling...

- 2D sound
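Of the mathematical methods above, the random walk is the simplest to show. A minimal sketch (the step size, range and pitch representation are my own illustrative choices):

```python
import random

def random_walk_melody(length, start=60, max_step=2, lo=48, hi=84, seed=None):
    """Generate a melody as MIDI note numbers by a bounded random walk:
    each note moves at most max_step semitones from the previous one and
    is clamped to the range [lo, hi]."""
    rng = random.Random(seed)
    notes = [start]
    for _ in range(length - 1):
        step = rng.randint(-max_step, max_step)
        notes.append(min(hi, max(lo, notes[-1] + step)))
    return notes

# A 16-note phrase starting on middle C (MIDI note 60)
melody = random_walk_melody(16, seed=1)
```

The same walk could just as well drive an image parameter, which is the point of treating composition and picture control with one set of methods.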

5. Results

XTRA. TRAX

A program that creates video edit lists from Standard MIDI files, which then automatically assemble a video clip.
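One way such a program might work, sketched here under my own assumptions (note events already decoded from the SMF, pitch class selecting one of 12 hypothetical source clips, inter-onset time becoming shot length):

```python
def edit_list_from_notes(note_events, fps=25):
    """Turn a list of (time_seconds, midi_note) note-on events into a
    simple edit list: each note selects a source clip by pitch class,
    and the interval to the next note becomes the shot length in frames."""
    edits = []
    for (t0, note), (t1, _) in zip(note_events, note_events[1:]):
        edits.append({
            "clip": note % 12,              # pitch class chooses a clip
            "in_frame": round(t0 * fps),    # edit point in frames
            "length": round((t1 - t0) * fps),
        })
    return edits

# Three note-on events: C at 0.0 s, E at 0.4 s, G at 1.4 s
notes = [(0.0, 60), (0.4, 64), (1.4, 67)]
edl = edit_list_from_notes(notes)
```

The actual clip-selection rule in XTRA.TRAX is not documented here; any MIDI parameter (velocity, channel, controller) could drive the cut.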

A 6 6 6

The Abekas A60 hard-disk recorder is controlled by a musical instrument. With this, a video is no longer cut, but controlled instantly in time.

PAULA CHIMES

A musical instrument made out of 16 chimes and 2 video monitors, capable of real-time image warping and auto-composition of music.

 

TILT IT

A video that uses its sound track for the direct creation of 3D computer graphics. For instance, a sample of a guitar solo is calculated and animated into 3D space and reflection-mapped with the original pictures that created the sound.
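How a waveform might become geometry of this kind can be sketched as follows (my own guess at the simplest possible construction, not the method used in the video): sample index becomes x, sample value becomes height, and a small extrusion gives the shape depth.

```python
def waveform_to_ribbon(samples, depth_step=0.1, gain=1.0):
    """Extrude an audio waveform into a 3D ribbon: sample index -> x,
    sample value -> y, with a front and back edge separated by
    depth_step in z. Returns a flat vertex list of (x, y, z) tuples,
    ordered as a triangle strip."""
    verts = []
    for i, s in enumerate(samples):
        y = gain * s
        verts.append((float(i), y, 0.0))         # front edge
        verts.append((float(i), y, depth_step))  # back edge
    return verts

# Three samples of a waveform become a six-vertex strip
verts = waveform_to_ribbon([0.0, 0.5, -0.5])
```

Reflection-mapping the original video frames onto such a surface is then a standard texturing step on the graphics hardware.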

 

DWARFMORPH

A piece of software that is able to create 3D models out of 2D video images. In a real-time application, the viewer of a scene can directly become part of the 3D world he looks at.
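One simple way to get a 3D model from a 2D video image, shown here purely as an illustrative assumption (the actual DWARFMORPH method is not described in this text), is to read pixel brightness as depth:

```python
def heightfield_from_image(pixels, scale=1.0):
    """Interpret a 2D grayscale image (a list of rows with values
    0..255) as a 3D height field: pixel position gives (x, y) and
    pixel brightness gives the z-coordinate, normalised to 0..scale."""
    mesh = []
    for y, row in enumerate(pixels):
        for x, v in enumerate(row):
            mesh.append((x, y, scale * v / 255.0))
    return mesh

# A 2x2 grayscale frame becomes four 3D vertices
frame = [[0, 128], [255, 64]]
mesh = heightfield_from_image(frame)
```

Applied per video frame, this is enough to place a camera image of the viewer into the same 3D space he is watching.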

 

HYENA DAYS

(Steina Vasulka: violin, Michael Saup: guitar) The musicians control image and sound devices with their instruments.
