Creating an Internal MIDI Environment


Because a basic 3DO unit has no MIDI ports or internal MIDI hardware, the Music library must import MIDI messages from a score file. It must then create an internal MIDI environment that uses the audio voices of a 3DO unit for note playback. This effectively turns the 3DO audio system into a multichannel polyphonic synthesizer. The system is set up to receive MIDI messages extracted from the file, act on them, and keep track of its own internal state as MIDI messages change parameters such as overall volume or number of voices currently being played.

To play MIDI messages in this internal MIDI environment, the Music library must be able to translate a score file's messages into events that the Juggler can read. The Juggler can then use appropriate software MIDI playback mechanisms for each event. Once the score is translated, the Music library must be able to control score timing, that is, the rate at which it feeds the score's MIDI messages to the MIDI playback mechanisms. This is the same type of score timing used by a standard MIDI sequencer, which sets the tempo according to the number of MIDI clocks it receives.

The following sections describe the MIDI environment set up by the Music library, an environment of Juggler objects and data structures where the MIDI playback functions can go to work.

Creating a Virtual MIDI Device

To play an imported MIDI score, a task must first create a virtual MIDI device using Music library elements. The virtual MIDI device responds to MIDI messages and keeps track of its own state. While playing, a hardware MIDI device has an internal state that changes during playback. The number of voices playing can increase and decrease, the program assigned to each channel can change, the overall volume can go up or down, the stereo mix can pan left and right, and many other conditions can change.

The virtual MIDI device created by the Music library also changes its state during playback. To keep track of the device's current state, the Music library provides a set of data structures designed to store current setting values. The values within these data structures are maintained internally by the Music library, so a task need not (and should not) write to them directly. Nevertheless, it is useful to know what these data structures are and how they are used because a task playing back a MIDI score needs to provide these data structures for score playback.

Providing Voices

A dedicated synthesizer usually has a fixed set of voices. Each voice provides the capability to play a single note at one time. The more voices the synthesizer has, the more notes it can play at once. For example, a 16-voice synthesizer can play up to 16 notes at one time. If the synthesizer is a multichannel synthesizer, it can allocate those voices to different channels. Each channel can then use its allocated voices to play the program assigned to the channel. For example, if a 16-voice synthesizer assigns nine voices to channel 1 and seven voices to channel 2, then channel 1 can play up to nine simultaneous notes, while channel 2 can play up to seven simultaneous notes.

When a hardware synthesizer plays notes, it uses one voice for the first note; if a second note begins before the first finishes, it plays the second note on a second voice so the two notes sound simultaneously. Whenever a new note starts, the synthesizer uses a free voice, unless no free voices are available, in which case it steals a voice already in use to play the new note.

To provide the equivalent of a voice in a dedicated synthesizer, the Music library uses the NoteTracker data structure, one for each voice. Each NoteTracker data structure is tied to an audio instrument and keeps track of whether the instrument is playing a note or not. If it is playing a note, and score playback requires another note to be started, the Music library's voice allocation looks for a free NoteTracker. It then plays the note using that NoteTracker voice and marks the NoteTracker data structure to show that a note is playing.

Note that the NoteTracker data structure is only used internally by Music library functions. A task should not write values directly to a NoteTracker data structure, just as a task should not touch other internally used data structures.

Setting Channels

The virtual MIDI device, like a dedicated synthesizer, must keep track of the state of its channels. To do so, it uses the ScoreChannel data structure, which holds values that, among other things, give the default program for the channel, the current program setting, the priority of the channel's voices, the number of voices assigned to the channel, the amount of pitch bend set for the channel, and the overall volume of the channel. The ScoreChannel data structure, like the NoteTracker data structure, is for Music library internal use only and a task should not write to it directly.

Assigning Instrument Templates to Program Numbers

MIDI messages use programs and program numbers where the Audio folio uses instrument templates and template names. To translate between the two, the Music library uses a PIMap (program-to-instrument map) to associate program numbers with instrument template names. The PIMap is an indexed array of instrument template names. When a MIDI message supplies a program number, the function accepting the number uses it as an index into the PIMap. It retrieves the appropriate instrument template name and uses that template as the timbre for the specified program.

Keeping Track of Overall Playback

To keep track of overall playback, the Music library supplies the ScoreContext data structure. This data structure contains, among other things, a pointer to the PIMap structure used for score playback, the maximum number of voices available (to limit strain on system resources), the maximum volume allowed per voice, the name of a mixer instrument used to accept the output of NoteTracker voices, the right and left gain values for the mixer, a pointer to an array of NoteTracker data structures used for playback, a list of free NoteTrackers available to play a note, and an array of ScoreChannel data structures, one for each possible channel. The ScoreContext data structure sets the score context, the overall control for this virtual MIDI device.

The ScoreContext data structure is another Music library internal data structure that should not be written to directly by a task.

Importing a MIDI Score

The Music library provides two functions that import and translate MIDI scores: MFLoadCollection(), which handles MIDI format 1 scores, and MFLoadSequence(), which handles MIDI format 0 scores. Both of these functions are described in Importing a MIDI Score.

When MFLoadSequence() imports a format 0 score, it turns the score into a single Juggler sequence. When MFLoadCollection() imports a format 1 score, it turns the score into a Juggler collection that contains a Juggler sequence for each of the MIDI score's component tracks. In either case, each track within a MIDI score is translated into a single Juggler sequence, whether that sequence stands alone (for a format 0 score) or is part of an encompassing collection (for a format 1 score).

Within a Juggler sequence created from a MIDI score, each MIDI message is turned into a MIDIEvent data structure, a form that the Juggler can handle directly during playback. The MIDIEvent data structure is defined as follows:

typedef struct MIDIEvent
{
    uint32        mev_Time;
    unsigned char mev_Command;
    unsigned char mev_Data1;
    unsigned char mev_Data2;
    unsigned char mev_Data3;
} MIDIEvent;

The first element of the structure is the MIDI message's time clock value, which is translated here into a 32-bit unsigned value that gives an equivalent number of audio ticks. For example, a MIDI message with a timing clock value of 328 is assigned an audio tick value of 328. Audio tick values are ticked off by the audio clock to time events during playback by the Juggler.

The second element of the structure is the single-byte MIDI message type. The third, fourth, and fifth elements are single-byte data values that may (or may not) accompany a MIDI message. If a MIDI message type does not require accompanying data or does not require all three bytes of data, the contents of the unused data elements are not relevant and are not read.

Providing MIDI Playback Functions

Once a MIDI score is in place as a Juggler object and a MIDI environment is set up using the proper data structures, the Juggler can play the object; the MIDI messages stored in the object's events must then be properly executed. To do so, the Music library provides a collection of MIDI playback functions that act according to the MIDI messages in the score. These functions initiate Audio folio activities to carry out the action called for by a MIDI message. They also update the appropriate data structures to reflect their actions.

The MIDI Interpreter Procedure

Every Juggler sequence includes a pointer to an interpreter procedure. As the events in the sequence are played, the interpreter procedure receives the data associated with each event and processes that data. When a MIDI score is imported as a Juggler object, that Juggler object points to the Music library function InterpretMIDIEvent() as its interpreter procedure. When the Juggler plays the MIDI score object, its events are passed to InterpretMIDIEvent(), which strips out the MIDI message embedded in the event and passes it on to InterpretMIDIMessage().

InterpretMIDIMessage() reads the MIDI message and calls an appropriate Music library function to act on that message type, passing along the data that accompanies the message. The function that acts on the MIDI message makes the appropriate Audio folio calls and updates the appropriate Music library data structures.

Setting the Audio Clock Speed

An imported MIDI score has no innate tempo, only arbitrary time values measured in audio ticks. Because a MIDI sequence is played back using a BumpJuggler() loop (a technique described in "Creating and Playing Juggler Objects"), the audio clock controls the playback tempo. A MIDI score is usually saved with a particular tempo measured in MIDI clocks per second. The audio clock should be set to the same tempo, measuring audio ticks per second instead of MIDI clocks per second. To do so, the task playing back the score takes ownership of the clock and changes its rate as described in "Playing Instruments." Once the clock rate matches the prescribed tempo, the MIDI environment is set for playback.