The interface is concerned with how your software appears to, and
communicates with, the outside world. This section contains tutorials
on building a graphical user interface (GUI) and on communicating with
external devices such as MIDI controllers. It also covers issues
surrounding interface devices, such as communications protocols,
haptics and so on.
Interface - By Paul Doornbusch
A controller is the part of an instrument, acoustic or electronic, that
remains if we remove the production of sound. It is the part of the
instrument the performer “plays”. The controller or interface is an
extension of the performer’s body. While this is called an “interface”,
we should remember that musicians have very intimate relationships with
their instruments – they exert significant energy through them, and they
share their saliva and sweat with them – and the instrument should be
capable of capturing every nuance of their control gestures and
translating them into an expressive sound. Instrumental performance
gestures are extensions of physical and vocal gestures.
Making a computer-based musical instrument that matches a mechano-acoustic
instrument in these regards is not an easy task, because when designing,
working with, or listening to computer-based musical instruments we are
directly confronted with the abstractness of the computer, which seems
at odds with physical and gestural intimacy.
Nevertheless, it is undeniable that this is the direction in which music
is heading, which is why this is such an important field of musical research.
At the outset of designing an electronic instrument controller, a myriad
of choices is available: knobs and other rotary and linear controllers
can be used, and proximity detectors (for large and small distances),
pressure sensors, video cameras, joysticks, turntables, accelerometers
and so on are also available. It is important to design the instrument
to fit the gestural behaviour and the musical intentions. This narrows
the range of sensible options for the controller, and a final design will
only be possible with a degree of experimentation and analysis. Controllers
should be selected to make maximum use of the performer’s physical gestures,
without minimising them or rendering them meaningless or trivial.
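How a raw sensor reading is mapped onto a musical parameter is itself part
of this design work. As a minimal sketch of the idea in Python (the 10-bit
sensor range and the cutoff values in Hz are hypothetical, chosen only for
illustration), compare a linear mapping with an exponential one; the curve
chosen changes how the same physical gesture feels to the player:

    # A minimal sketch of sensor-to-parameter mapping.
    # The 0-1023 sensor range and the 200 Hz - 8 kHz cutoff range
    # are hypothetical values chosen for illustration.

    def scale_linear(raw, raw_min, raw_max, out_min, out_max):
        # Normalise the reading to 0..1, then stretch it over the output range.
        norm = (raw - raw_min) / (raw_max - raw_min)
        return out_min + norm * (out_max - out_min)

    def scale_exponential(raw, raw_min, raw_max, out_min, out_max):
        # An exponential curve keeps fine control at the low end, which
        # often suits perceptual parameters such as amplitude or pitch.
        norm = (raw - raw_min) / (raw_max - raw_min)
        return out_min * (out_max / out_min) ** norm

    # e.g. an accelerometer reading of 512 on a 0-1023 scale,
    # mapped to a filter cutoff between 200 Hz and 8 kHz:
    print(scale_linear(512, 0, 1023, 200.0, 8000.0))       # about 4104 Hz
    print(scale_exponential(512, 0, 1023, 200.0, 8000.0))  # about 1267 Hz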
The interface or controller can be in the form of a traditional instrument,
where sensors are used to detect the gestures of the performer. An example
of this would be a wind instrument where the closure of holes or keys
and breath are detected via sensors and the information used to control
a synthesis engine. Another traditional instrument that can be used
as a controller is the keyboard, and commercial manufacturers make many
of these. Traditional instruments may also be modified to add new controllers,
such as a joystick or light sensor added to a guitar (as practiced by
Tom Fryer) or an accelerometer on a percussionist’s sticks. Other
controllers may take completely new forms, such as Michel Waisvisz’s
and Laetitia Sonami’s hand controllers and the instruments of the
Sensorband. Some tools, such as fader boxes, game controllers and control
surfaces, may also find use as gestural interfaces for computer musical
instruments. Some of the most interesting current research is in using
force-feedback mechanisms (from game controllers) to provide physical
feedback for the performer. This opens completely new territory in the
field of expressive control interfaces for musical performance.
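As a concrete, if simplified, illustration of the wind-instrument example
above, the sketch below reads gestures from a MIDI controller and turns
them into synthesis parameters. It assumes the third-party Python library
mido (with a backend such as python-rtmidi) and a connected device; CC 2
is the conventional MIDI breath-controller assignment, and the print calls
stand in for a real synthesis engine:

    # A minimal sketch, assuming the 'mido' library
    # (pip install mido python-rtmidi) and a MIDI device
    # available on the default input port.
    import mido

    BREATH_CC = 2  # CC 2 is the conventional breath-controller number

    with mido.open_input() as port:   # open the default MIDI input
        for msg in port:              # blocks, yielding messages as they arrive
            if msg.type == 'note_on' and msg.velocity > 0:
                # Key or hole closure: sets the pitch of the synthesis engine.
                print('note:', msg.note)
            elif msg.type == 'control_change' and msg.control == BREATH_CC:
                # Breath pressure: normalise 0-127 to a 0.0-1.0 amplitude.
                print('amplitude:', msg.value / 127.0)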
There is no “free lunch” with computer instrument controllers
(or computer music for that matter) – it is at least as difficult
to build and play a worthwhile and non-trivial digital computer instrument
as it is an acoustic instrument.