
Working with Software

Questions and Answers:

Question #6: Describe your experience of working with the MAX or Image/ine interface in real time in co-operation with the other artists in the space - as you change those tiny parameters on the screen, what are you looking for? Is it an intuitive process for you?

Answer #1: Working with MAX in real time is, for me, first a matter of finding the expected range of data coming from the space. For sound to be responsive it is necessary to take advantage of the full range of data, anticipating both the minimum and the maximum levels of activity. After this mundane step is taken, it becomes a matter of finding the "interesting" range of whatever patch has been previously created and then saving those parameters. I look for audio responses to movement that somehow approximate what the dancers feel when they are moving, striving to get sounds to have more energy when the dancers exert more, and so on. This is not particularly intuitive for me, except in the sense of drawing on my ten years of experience in this process. It is a matter of tedious trial and error, otherwise known as "tweaking". (John Mitchell)
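The scaling step described here - tracking the observed minimum and maximum of the incoming data and mapping new values into a useful parameter range - can be sketched in a few lines. The following Python fragment is only an illustration of that idea, not the actual Max patch used in the project; the class name and the sample sensor values are invented for the example.

```python
# Hypothetical sketch (not the project's Max patch): auto-range incoming
# sensor data and map it onto a sound parameter, as described above.

class RangeMapper:
    """Tracks the observed min/max of incoming data and scales new
    values into a chosen output range (e.g. an amplitude of 0.0-1.0)."""

    def __init__(self, out_min=0.0, out_max=1.0):
        self.in_min = None
        self.in_max = None
        self.out_min = out_min
        self.out_max = out_max

    def update(self, value):
        # Expand the expected input range as new extremes arrive.
        if self.in_min is None or value < self.in_min:
            self.in_min = value
        if self.in_max is None or value > self.in_max:
            self.in_max = value

    def map(self, value):
        # Scale the value into the output range; a flat input maps to out_min.
        if self.in_min is None or self.in_max == self.in_min:
            return self.out_min
        t = (value - self.in_min) / (self.in_max - self.in_min)
        return self.out_min + t * (self.out_max - self.out_min)


# Example: made-up movement readings driving a loudness parameter.
mapper = RangeMapper(out_min=0.0, out_max=1.0)
for reading in [120, 450, 3000, 900, 58000, 200]:
    mapper.update(reading)
    print(reading, "->", round(mapper.map(reading), 3))
```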

Question #7: What is MAX and what is MIDI - and how and why are we using these two technologies here in this project?

Answer #1: MAX is an object-oriented programming language originally built around MIDI but later expanded into a multipurpose program for controlling MIDI- or serial-controllable media. Add MSP, a programming environment for real-time audio synthesis, and you have a way to get from media "control data" (such as sensing information) to actual real-time phenomena such as the production of sound waves. MIDI is basically the control system used for audio keyboards to communicate with other audio devices. (This was expanded into a special protocol, "MIDI Show Control", used for certain theater applications.) MIDI is limited by its structure to parameters that usually comprise 16 channels of data, each ranging from 0 to 127 in value (resolution). This small resolution makes MIDI unsuitable for representing real-life (or analog) activities with any degree of accuracy. With the Very Nervous System, for example, we often look at data that ranges from a minimum of 100 to a maximum of over 60,000; compare this with MIDI's 0-127. With MSP, however, we are in a system where the resolution in both time and range far surpasses anything possible in MIDI. Amplitude is expressed as a value between 0 and 1, and the resolution is essentially unlimited (limited only by the floating-point processing capability of the computer), which allows a much more natural and smooth mapping of motion data onto sound. (John Mitchell)
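The resolution difference can be made concrete with a small comparison. This sketch is purely illustrative (the sensor bounds of 100 and 60,000 are taken from the answer above, and the mapping functions are invented for the example): quantising a wide-range sensor value to MIDI's 128 steps discards detail that a floating-point amplitude between 0.0 and 1.0 retains.

```python
# Illustrative only: map a wide-range sensor value (roughly 100-60,000,
# as with the Very Nervous System) onto MIDI's 128 steps versus a
# floating-point amplitude between 0.0 and 1.0, as in MSP.

SENSOR_MIN, SENSOR_MAX = 100, 60_000

def to_midi(value):
    """Quantise to one of MIDI's 128 values (0-127)."""
    t = (value - SENSOR_MIN) / (SENSOR_MAX - SENSOR_MIN)
    return round(t * 127)

def to_amplitude(value):
    """Map to a float between 0.0 and 1.0, limited only by float precision."""
    return (value - SENSOR_MIN) / (SENSOR_MAX - SENSOR_MIN)

for v in (100, 101, 350, 35_000, 60_000):
    print(f"sensor {v:>6}  midi {to_midi(v):>3}  amplitude {to_amplitude(v):.6f}")

# Neighbouring sensor values such as 100 and 101 collapse onto the same
# MIDI step, while the floating-point amplitude still distinguishes them.
```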

Question #15: What is Image/ine, what does it do, what is its relevance for live performance and what are the possibilities in the context of this project as different from other contexts?

Answer #1: Image/ine is a real-time video system. It does many of the same things as better-known software like Adobe Premiere or After Effects, but all in real time (no waiting for rendering). This makes it ideal for a performance situation - video clips, still images, texts and drawings can be recalled instantly, either by an operator working at the computer or by some kind of triggering mechanism (such as the VNS). More interestingly, a live video input can be used, so that things that are actually happening can be manipulated and relayed over monitors or projectors. Live video can also be stored, either in memory or on disk, and recalled - a kind of 'video sampling'. All of these things can be combined, making it a very flexible system. Its disadvantages are that it can be quite slow (often the video will be jerky or slowed down) and a little temperamental. Image/ine was developed at STEIM in Amsterdam, an institution specialising in the development of technology interfaces for performance. (Jo Hyde)
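The 'video sampling' idea - holding recent live frames in memory and replaying them on a trigger - can be sketched conceptually. The example below uses Python with OpenCV rather than Image/ine itself, and the keyboard 'trigger', buffer length and frame rate are assumptions made for the illustration.

```python
# Conceptual sketch of 'video sampling': keep a rolling buffer of live
# frames in memory and replay it when a trigger arrives. Uses OpenCV,
# not Image/ine; press 'r' to replay the buffer, 'q' to quit.
from collections import deque
import cv2

BUFFER_SECONDS = 3
FPS = 25

capture = cv2.VideoCapture(0)                # live camera input
buffer = deque(maxlen=BUFFER_SECONDS * FPS)  # rolling in-memory frame store

while True:
    ok, frame = capture.read()
    if not ok:
        break
    buffer.append(frame)
    cv2.imshow("live", frame)

    key = cv2.waitKey(1000 // FPS) & 0xFF
    if key == ord("r"):                      # trigger: replay the stored sample
        for stored in list(buffer):
            cv2.imshow("live", stored)
            cv2.waitKey(1000 // FPS)
    elif key == ord("q"):
        break

capture.release()
cv2.destroyAllWindows()
```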
