Realtime audio in jMusic

This tutorial provides an introduction to realtime audio with jMusic. When producing audio in realtime, data is played back as soon as it is created. Usually the data is not stored as a file, so no record of the music is maintained. Realtime playback can go on forever and is not limited by file size. The complexity of realtime audio production is limited by processing capacity, however, so the number of parts and the sophistication of the audio processing vary from computer to computer.

The architecture of realtime audio in jMusic differs considerably from rendered audio in that the audio engine deals directly with notes, which need not be encapsulated inside phrases, parts, or a score. The composition itself can also be generated in realtime, note by note. This opens up new opportunities for interactive music. For realtime audio, notes are generated within an RTLine object. This class is conceptually similar to a phrase except that it generates notes rather than holds them.

The tutorial is in two parts. First, the RTTest class is presented, which sets up the instruments and the RTLine. Second, the RealtimeMelody class is presented. This class is an extension of the RTLine class; each realtime piece must create an extension of the RTLine class for each musical line (phrase). The RTTest class has the main() method.

Here is the source for these two files:

RTTest source.

RealtimeMelody source.

In addition, the program uses the RTPluckInst class, which is included in the inst directory of the jMusic download.

Let's have a closer look at the RTTest class.

import jm.JMC;
import jm.audio.RTMixer;
import jm.audio.Instrument;
import jm.music.rt.RTLine;

As well as the familiar JMC class, we import a number of jMusic audio classes. The RTLine class is in the jm.music package because of its similarity to the other music data classes (Note, Phrase, Score, etc.).

public class RTTest implements JMC {
    public static void main(String[] args) {
        int sampleRate = 44100;     // CD-quality audio
        double controlRate = 0.05;  // how often sound adjustments are calculated (illustrative value)
        int bufferSize = 4096;      // size of the audio buffer (illustrative value)
        Instrument inst = new RTPluckInst(sampleRate);
        Instrument[] insts = new Instrument[] {inst};
        RTLine[] rtlines = {new RealtimeMelody(insts, controlRate, bufferSize)};
        final RTMixer rtm = new RTMixer(rtlines);
        rtm.begin();
    }
}

The class is short, so all the workings are in the main() method.

The first few lines of the main() method set up the audio specifications for the music. The sample rate in this case is the same as CD-quality audio (44100 samples per second), and the piece will produce a monophonic audio stream. The control rate specifies how frequently adjustments to the sound are calculated (a shorter control rate gives more precision), and the buffer size sets how much audio data is handled at a time.

An instrument to use for the music is declared - the RTPluckInst. We only have one musical line in this piece, so one instrument will suffice. Realtime instruments in jMusic are slightly different from rendered instruments in that they don't write a file to disk; instead they send audio to the speakers. (Writing jMusic instruments is quite detailed and is covered in other tutorials.)

The instrument is added to an instrument array. This seems superfluous when we only have one instrument, but it makes more sense in pieces that use several.
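
For illustration, here is a sketch of how an array holding more than one instrument might be declared. The SimpleFMInst shown here is the one used in the MyLine example later in this tutorial and, like RTPluckInst, it comes from the inst directory; the exact instruments and parameter values are just one possible choice, not part of the RTTest code.

    // A sketch of an instrument array holding two instruments.
    // The instruments and parameter values are illustrative only.
    Instrument pluck = new RTPluckInst(sampleRate);
    Instrument fm = new SimpleFMInst(sampleRate, 800, 34.4);
    Instrument[] insts = new Instrument[] {pluck, fm};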

An array of RTLines is declared; in this case it contains only one RTLine instance, made from our RealtimeMelody class (see below). Notice that an RTLine takes the instrument array (along with, in this case, the control rate and buffer size) as arguments.

Each jMusic realtime audio piece has an RTMixer object. Its function is to combine all of the RTLines together and send the mix to the audio output. The processing is begun by calling the begin() method of the RTMixer object.
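
As a sketch of how a multi-part piece might be set up, imagine a second RTLine subclass, here called RealtimeHarmony and written in the same way as RealtimeMelody, each line with its own instrument array. The class and variable names in this sketch are hypothetical, not part of the tutorial's code.

    // Hypothetical sketch: mixing two realtime lines with one RTMixer.
    // RealtimeHarmony is an imagined second subclass of RTLine, and
    // melodyInsts and harmonyInsts are its own instrument arrays.
    RTLine[] rtlines = {new RealtimeMelody(melodyInsts, controlRate, bufferSize),
                        new RealtimeHarmony(harmonyInsts, controlRate, bufferSize)};
    final RTMixer rtm = new RTMixer(rtlines);
    rtm.begin();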


Now let's look at the note generation class - what are we going to play with this RTPluckInst?

import jm.music.rt.*;
import jm.audio.*;
import jm.music.data.*;

public class RealtimeMelody extends RTLine {

    public RealtimeMelody(Instrument[] inst, double controlRate, int bufferSize) {
        super(inst, controlRate, bufferSize);
    }

    // required method override
    public Note getNextNote() {
        return new Note((int)(Math.random() * 60 + 30), 0.25,
                (int)(Math.random() * 100 + 27));
    }
}

The RealtimeMelody class has the job of providing notes as the music proceeds. It gets much of its functionality by extending the RTLine class. Each realtime audio program in jMusic must contain one or more classes that extend the RTLine class in this way.

The class imports the appropriate jMusic packages.

There are two parts that are critical: the constructor and the getNextNote() method.

The constructor takes the arguments needed for the audio data to be calculated (the instrument array, the control rate, and the buffer size). These are passed to the super class, RTLine, for use.

The getNextNote() method simply generates one note. When this note has been rendered the method will be called again, and so on, until we quit the application. In this case a semiquaver of random pitch and dynamic is produced each time the method is called. Simplistic, but effective. You can put any compositional logic you want in this method, or have it call a method in another class, so long as the result is the next note to be played back.
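
For instance, here is a sketch of a getNextNote() with slightly more compositional logic: a random walk that steps up or down a few semitones from the previous pitch. The pitch field, the range limits, and the step sizes are illustrative choices, not part of the tutorial's example.

    // A sketch of getNextNote() as a random walk rather than purely random pitches.
    // The field and the chosen ranges are illustrative only.
    private int pitch = 60;    // start at middle C

    public Note getNextNote() {
        pitch += (int)(Math.random() * 5) - 2;         // move up or down by at most two semitones
        if (pitch < 48) pitch = 48;                    // keep the walk within a comfortable range
        if (pitch > 84) pitch = 84;
        int dynamic = (int)(Math.random() * 60 + 60);  // medium to loud
        return new Note(pitch, 0.25, dynamic);         // a semiquaver, as in the original
    }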

As One

We can put all this together in one class as in the MyLine example below. Don't forget to include the inst directory in your classpath when compiling and running to access the SimpleFMInst.

Here is the source for this example:

MyLine source.

import jm.JMC;
import jm.music.data.*;
import jm.audio.*;
import jm.music.rt.*;

public final class MyLine extends RTLine implements JMC {

    public static void main(String[] args) {
        Instrument inst = new SimpleFMInst(44100, 800, 34.4);
        new MyLine(new Instrument[] {inst});
    }

    public MyLine(Instrument[] insts) {
        super(insts);
        RTMixer mixer = new RTMixer(new RTLine[] {this});
        mixer.begin();
    }

    public Note getNextNote() {
        Note n = new Note((int)(Math.random() * 12 + 60), QN);
        return n;
    }
}




© 2002 Andrew R Brown