Realtime audio in jMusic
        
This tutorial provides an introduction to realtime audio 
          with jMusic. When producing audio in real time, data is played back as 
          soon as it is created. 
          Usually the data is not stored as a file, so no record of the music is maintained. 
          Realtime playback can go on forever and is not limited by file size. 
          
          The complexity of realtime audio production is limited by processing 
          capacity, however, and so the number of parts and the sophistication 
          of the 
          audio processing varies from computer to computer.
The architecture of realtime audio in jMusic differs considerably 
          from rendered audio in that the audio engine deals directly with notes, 
          and they need
          not be encapsulated inside phrases, parts, or a score. The composition 
          can also be generated in real time, note by note.
          This opens up new opportunities for interactive music. For realtime 
          audio, notes are generated within an RTLine object. 
          This class is, conceptually, a realtime equivalent of a phrase.
The tutorial is in two parts. First, the RTTest class is 
          presented, which sets up the instruments and the RTLine. 
          Second, the RealtimeMelody class is presented. 
          This class is an extension of the RTLine class. Each realtime piece 
          must create an extension of the RTLine class for each musical line (phrase).
          The RTTest class has the main() method.
        Here is the source for these two files:
        RTTest source.
        RealtimeMelody source.
In addition you will need the RTPluckInst class, available 
          from the jMusic instruments page, or directly from here:
        RTPluckInst source.
Let's have a closer look at the RTTest class.
        
          
             
    import jm.JMC;
    import jm.audio.RTMixer;
    import jm.audio.Instrument;
    import jm.music.rt.RTLine;
            
          
        
As well as the familiar JMC class, we import a number of 
          jMusic audio classes. 
          The RTLine class is in the jm.music package because of its similarity 
          to the other music data classes (Note, Phrase, Score, etc.).
          
        
        
          
             
    public class RTTest implements JMC {
        public static void main(String[] args) {
            int sampleRate = 44100;
            int channels = 1;
            double controlRate = 0.1;
            int bufferSize = (int)((sampleRate * channels) * controlRate);
            
          
        
The class is short, so all the workings are in the main 
          method.
        The first four lines of the main() method set up the audio 
          specifications for the music.
          The sample rate in this case is the same as CD-quality audio; the piece will 
          produce a monophonic audio stream; the control rate specifies how frequently
          adjustments to the sound are calculated (shorter = more precision); 
          and the buffer size is calculated from those previous settings. 
          The buffer size is the amount of sample data computed at a time - it 
          affects the latency of the audio system. 
          In practice it is good to have as small a control rate as possible, to 
          minimise the buffer size.
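As a quick check of that arithmetic: a control rate of 0.1 seconds at 44100 Hz mono gives a 4410-sample buffer, i.e. 100 milliseconds of latency. A minimal, self-contained sketch of the calculation (the BufferCheck class name is ours, not part of jMusic):

```java
// Buffer size and latency arithmetic from the settings above.
// BufferCheck is a hypothetical helper class, not part of jMusic.
public class BufferCheck {
    public static void main(String[] args) {
        int sampleRate = 44100;     // CD-quality sample rate
        int channels = 1;           // mono
        double controlRate = 0.1;   // seconds between control updates

        // Samples computed per control cycle, as in RTTest
        int bufferSize = (int)((sampleRate * channels) * controlRate);

        // The buffer must be filled before it can be heard, so it
        // sets a lower bound on the latency of the audio system
        double latencyMs = 1000.0 * bufferSize / (sampleRate * channels);

        System.out.println(bufferSize + " samples");   // 4410 samples
        System.out.println(latencyMs + " ms latency"); // 100.0 ms latency
    }
}
```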
          
        
        
          
             
            Instrument inst = new RTPluckInst(sampleRate);
            Instrument[] insts = new Instrument[] {inst};
            RTLine[] rtlines = {new RealtimeMelody(insts, controlRate, bufferSize)};

            RTMixer rtm = new RTMixer(rtlines, bufferSize,
                                      sampleRate, channels, controlRate);
            rtm.begin();
        }
    }
            
          
        
An instrument to use for the music is declared - the RTPluckInst.
          We only have one line in this piece, so one instrument will suffice.
          Realtime instruments in jMusic are slightly different from rendered 
          instruments in that they don't write a file to disk. It is usually better 
          to have separate realtime and non-realtime versions of an instrument.
        The instrument is added to an instrument array. 
          This seems superfluous when we only have one instrument, but in cases 
          where there are several this makes more sense.
An array of RTLines is declared. In this case it only contains 
          one line and uses our RealtimeMelody class (see below).
          Notice that the RTLine takes the instruments, control rate, and buffer size 
          as arguments.
Each jMusic realtime audio piece has an RTMixer object. 
          Its function is to combine all of the RTLines and send the 
          mix to the audio output. 
          The processing is begun by calling the begin() method of the RTMixer 
          object.
        
          Now let's look at the note generation class - what are we going to play 
          with this RTPluckInst?
        
          
             
    import jm.music.rt.*;
    import jm.audio.*;
    import jm.music.data.*;

    public class RealtimeMelody extends RTLine {

        public RealtimeMelody(Instrument[] inst, double controlRate, int bufferSize) {
            super(inst, controlRate, bufferSize);
        }

        public Note getNote() {
            return new Note((int)(Math.random() * 60 + 30), 0.25,
                            (int)(Math.random() * 100 + 27));
        }
    }
            
          
        
        The RealtimeMelody class has the job of providing notes 
          as the music proceeds. 
          It gets much of its functionality by extending the RTLine class.
          Each realtime audio program in jMusic must contain one or more classes 
          that extend the RTLine class in this way.
        The class imports the appropriate jMusic packages.
Two members are critical: the constructor 
          and the getNote() method.
The constructor takes the necessary arguments allowing the 
          appropriate audio data to be calculated. 
          These are passed to the superclass - RTLine - for use.
The getNote() method simply generates one note.
          When this note has been rendered the method is called again, and so 
          on, until we quit the application. 
          In this case a semiquaver of random pitch and dynamic is produced each 
          time the method is called.
          Simplistic, but effective. 
          You can put any compositional logic you want in this method - or have 
          it call a method in another class - 
          so long as the result is the next note to be played back.
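For instance, the uniform random choice could be replaced by a random walk over a scale. The sketch below keeps the compositional logic in a plain helper class so it can be run on its own; a getNote() implementation could then call pickPitch() for the pitch of each new Note. The PentatonicWalk name and the scale choice are our own illustration, not part of jMusic.

```java
import java.util.Random;

// A hypothetical compositional helper: a random walk over a C minor
// pentatonic scale. An RTLine subclass could call pickPitch() from its
// getNote() method instead of choosing pitches uniformly at random.
public class PentatonicWalk {
    // MIDI pitches: C4, Eb4, F4, G4, Bb4
    private static final int[] SCALE = {60, 63, 65, 67, 70};
    private final Random rand = new Random();
    private int index = 2; // start in the middle of the scale

    public int pickPitch() {
        // Step down, stay, or step up, clamped to the scale bounds
        index += rand.nextInt(3) - 1;
        index = Math.max(0, Math.min(SCALE.length - 1, index));
        return SCALE[index];
    }

    public static void main(String[] args) {
        PentatonicWalk walk = new PentatonicWalk();
        for (int i = 0; i < 8; i++) {
            System.out.print(walk.pickPitch() + " ");
        }
        System.out.println();
    }
}
```

Inside a RealtimeMelody-style class, getNote() would then return something like new Note(walk.pickPitch(), 0.25, 100) each time it is called.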