Tutorial on MIDI and Music Synthesis

Written by Jim Heckroth, Crystal Semiconductor Corp.
Used with Permission.

Published by:
The MIDI Manufacturers Association
POB 3173
La Habra CA 90632-3173

Windows is a trademark of Microsoft Corporation. MPU-401, MT-32, LAPC-1 and Sound Canvas are trademarks of Roland Corporation. Sound Blaster is a trademark of Creative Labs, Inc. All other brand or product names mentioned are trademarks or registered trademarks of their respective holders.

Copyright 1995 MIDI Manufacturers Association. All rights reserved.
No part of this document may be reproduced or copied without written permission of the publisher.
Printed 1995

HTML coding by Scott Lehman

Table of Contents

Introduction
MIDI vs. Digitized Audio
MIDI Basics
MIDI Messages
MIDI Sequencers and Standard MIDI Files
Synthesizer Basics
The General MIDI (GM) System
Synthesis Technology: FM and Wavetable
The PC to MIDI Connection
Multimedia PC (MPC) Systems
Microsoft Windows Configuration
Summary

Introduction

The Musical Instrument Digital Interface (MIDI) protocol has been widely accepted and utilized by musicians and composers since its conception in the 1982/1983 time frame. MIDI data is a very efficient method of representing musical performance information, and this makes MIDI an attractive protocol not only for composers or performers, but also for computer applications which produce sound, such as multimedia presentations or computer games. However, the lack of standardization of synthesizer capabilities hindered application developers and presented new MIDI users with a rather steep learning curve to overcome.

Fortunately, thanks to the publication of the General MIDI System specification, wide acceptance of the most common PC/MIDI interfaces, support for MIDI in Microsoft Windows and other operating systems, and the evolution of low-cost music synthesizers, the MIDI protocol is now seeing widespread use in a growing number of applications. This document is an overview of the standards, practices and terminology associated with the generation of sound using the MIDI protocol.

MIDI vs. Digitized Audio

Originally developed to allow musicians to connect synthesizers together, the MIDI protocol is now finding widespread use as a delivery medium to replace or supplement digitized audio in games and multimedia applications. There are several advantages to generating sound with a MIDI synthesizer rather than using sampled audio from disk or CD-ROM. The first advantage is storage space. Data files used to store digitally sampled audio in PCM format (such as .WAV files) tend to be quite large. This is especially true for lengthy musical pieces captured in stereo using high sampling rates. MIDI data files, on the other hand, are extremely small when compared with sampled audio files. For instance, files containing high quality stereo sampled audio require about 10 Mbytes of data per minute of sound, while a typical MIDI sequence might consume less than 10 Kbytes of data per minute of sound. This is because the MIDI file does not contain the sampled audio data; it contains only the instructions needed by a synthesizer to play the sounds. These instructions are in the form of MIDI messages, which instruct the synthesizer which sounds to use, which notes to play, and how loud to play each note. The actual sounds are then generated by the synthesizer.

For computers, the smaller file size also means that less of the PC's bandwidth is utilized in spooling this data out to the peripheral which is generating sound. Other advantages of utilizing MIDI to generate sounds include the ability to easily edit the music, and the ability to change the playback speed and the pitch or key of the sounds independently. This last point is particularly important in synthesis applications such as karaoke equipment, where the musical key and tempo of a song may be selected by the user.

MIDI Basics

The Musical Instrument Digital Interface (MIDI) protocol provides a standardized and efficient means of conveying musical performance information as electronic data. MIDI information is transmitted in "MIDI messages", which can be thought of as instructions which tell a music synthesizer how to play a piece of music. The synthesizer receiving the MIDI data must generate the actual sounds. The MIDI 1.0 Detailed Specification provides a complete description of the MIDI protocol.

The MIDI data stream is a unidirectional asynchronous bit stream at 31.25 Kbits/sec, with 10 bits transmitted per byte (a start bit, 8 data bits, and one stop bit).
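The link rate just mentioned, together with the storage comparison made in the previous section, reduces to a short back-of-the-envelope calculation, sketched below in Python. The 44.1 kHz, 16-bit stereo format is a typical assumption for "high quality" sampled audio, not a figure taken from this tutorial.

    # Rough arithmetic behind the figures quoted above (illustrative only).

    SAMPLE_RATE = 44_100       # samples per second, per channel (assumed CD quality)
    BYTES_PER_SAMPLE = 2       # 16-bit samples
    CHANNELS = 2               # stereo

    pcm_bytes_per_minute = SAMPLE_RATE * BYTES_PER_SAMPLE * CHANNELS * 60
    print(f"PCM audio: about {pcm_bytes_per_minute / 1e6:.1f} Mbytes per minute")  # ~10.6

    # The MIDI link: 31.25 Kbits/sec with 10 bits on the wire per byte
    # (start bit + 8 data bits + stop bit).
    midi_bytes_per_second = 31_250 // 10
    print(f"MIDI link: at most {midi_bytes_per_second} bytes per second")          # 3125

    # So a three-byte message such as Note On occupies the cable for roughly:
    print(f"Three-byte message transit time: {3 * 10 / 31_250 * 1000:.2f} ms")     # ~0.96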

The MIDI interface on a MIDI instrument will generally include three different MIDI connectors, labeled IN, OUT, and THRU. The MIDI data stream is usually originated by a MIDI controller, such as a musical instrument keyboard, or by a MIDI sequencer. A MIDI controller is a device which is played as an instrument, and it translates the performance into a MIDI data stream in real time (as it is played). A MIDI sequencer is a device which allows MIDI data sequences to be captured, stored, edited, combined, and replayed. The MIDI data output from a MIDI controller or sequencer is transmitted via the device's MIDI OUT connector.

The recipient of this MIDI data stream is commonly a MIDI sound generator or sound module, which will receive MIDI messages at its MIDI IN connector, and respond to these messages by playing sounds. Figure 1 shows a simple MIDI system, consisting of a MIDI keyboard controller and a MIDI sound module. Note that many MIDI keyboard instruments include both the keyboard controller and the MIDI sound module functions within the same unit. In these units, there is an internal link between the keyboard and the sound module which may be enabled or disabled by setting the "local control" function of the instrument to ON or OFF respectively.

The single physical MIDI Channel is divided into 16 logical channels by the inclusion of a 4-bit Channel number within many of the MIDI messages. A musical instrument keyboard can generally be set to transmit on any one of the sixteen MIDI channels. A MIDI sound source, or sound module, can be set to receive on specific MIDI Channel(s). In the system depicted in Figure 1, the sound module would have to be set to receive the Channel which the keyboard controller is transmitting on in order to play sounds.

Figure 1: A Simple MIDI System

Information received on the MIDI IN connector of a MIDI device is transmitted back out (repeated) at the device's MIDI THRU connector. Several MIDI sound modules can be daisy-chained by connecting the THRU output of one device to the IN connector of the next device downstream in the chain.

Figure 2 shows a more elaborate MIDI system. In this case, a MIDI keyboard controller is used as an input device to a MIDI sequencer, and there are several sound modules connected to the sequencer's MIDI OUT port. A composer might utilize a system like this to write a piece of music consisting of several different parts, where each part is written for a different instrument. The composer would play the individual parts on the keyboard one at a time, and these individual parts would be captured by the sequencer. The sequencer would then play the parts back together through the sound modules. Each part would be played on a different MIDI Channel, and the sound modules would be set to receive different channels. For example, sound module number 1 might be set to play the part received on Channel 1 using a piano sound, while module 2 plays the information received on Channel 5 using an acoustic bass sound, and the drum machine plays the percussion part received on MIDI Channel 10.
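According to the MIDI specification, the Channel number mentioned above occupies the low four bits of a Channel message's status byte, so a receiving module can decide whether a message is addressed to it with a simple mask. The sketch below is illustrative; the receive-Channel setting and status byte values are chosen to match the Figure 2 example rather than anything mandated by the standard.

    # The low nibble of a Channel message's status byte carries the Channel
    # (0-15 internally, presented to the user as Channels 1-16).

    def message_channel(status_byte):
        """Return the 1-based Channel number addressed by a Channel message."""
        return (status_byte & 0x0F) + 1

    # A module set to receive on Channel 5, as in the Figure 2 example, would
    # act on the second message below and ignore the other two.
    RECEIVE_CHANNEL = 5
    for status in (0x90, 0x94, 0x99):   # Note On status bytes for Channels 1, 5, 10
        action = "play" if message_channel(status) == RECEIVE_CHANNEL else "ignore"
        print(f"status {status:#04x} -> Channel {message_channel(status)}: {action}")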

Figure 2: An Expanded MIDI System

In this example, a different sound module is used to play each part. However, sound modules which are "multitimbral" are capable of playing several different parts simultaneously. A single multitimbral sound module might be configured to receive the piano part on Channel 1, the bass part on Channel 5, and the drum part on Channel 10, and would play all three parts simultaneously.

Figure 3 depicts a PC-based MIDI system. In this system, the PC is equipped with an internal MIDI interface card which sends MIDI data to an external multitimbral MIDI synthesizer module. Application software, such as multimedia presentation packages, educational software, or games, sends MIDI data to the MIDI interface card in parallel form over the PC bus. The MIDI interface converts this information into serial MIDI data which is sent to the sound module. Since this is a multitimbral module, it can play many different musical parts, such as piano, bass and drums, at the same time. Sophisticated MIDI sequencer software packages are also available for the PC. With this software running on the PC, a user could connect a MIDI keyboard controller to the MIDI IN port of the MIDI interface card, and have the same music composition capabilities discussed in the last two paragraphs.

There are a number of different possible configurations of PC-based MIDI systems. For instance, the MIDI interface and the MIDI sound module might be combined on the PC add-in card. In fact, the Multimedia PC (MPC) Specification requires that all MPC systems include a music synthesizer, and the synthesizer is normally included on the audio adapter card (the "sound card") along with the MIDI interface function. Until recently, most PC sound cards included FM synthesizers with limited capabilities and marginal sound quality. With these systems, an external wavetable synthesizer module might be added to get better sound quality. Recently, more advanced sound cards have been appearing which include high quality wavetable music synthesizers on-board, or as a daughter-card option. With the increasing use of the MIDI protocol in PC applications, this trend is sure to continue.
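As a concrete illustration of the application-to-interface path just described, the sketch below sends a few MIDI messages to whatever MIDI output the operating system exposes (a sound card synthesizer or an external module behind a MIDI interface). It assumes the third-party python-rtmidi package; the port index, program number, and note values are arbitrary examples, not anything required by the MPC specification or the MIDI standard.

    import time
    import rtmidi   # third-party package "python-rtmidi" (assumed installed)

    midiout = rtmidi.MidiOut()
    ports = midiout.get_ports()          # available MIDI outputs on this system
    if ports:
        midiout.open_port(0)             # first available output; arbitrary choice
    else:
        midiout.open_virtual_port("Tutorial Out")

    # Select a program, then play middle C for half a second on Channel 1.
    # Status bytes 0xC0 (Program Change) and 0x90 (Note On) carry the Channel
    # number in their low nibble.
    midiout.send_message([0xC0, 0])        # Program Change, program 0, Channel 1
    midiout.send_message([0x90, 60, 100])  # Note On, key 60, velocity 100
    time.sleep(0.5)
    midiout.send_message([0x80, 60, 0])    # Note Off, key 60
    del midiout                            # close the port and free the device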

Figure 3: A PC-Based MIDI System

MIDI Messages

A MIDI message is made up of an eight-bit status byte which is generally followed by one or two data bytes. There are a number of different types of MIDI messages. At the highest level, MIDI messages are classified as being either Channel Messages or System Messages. Channel messages are those which apply to a specific Channel, and the Channel number is included in the status byte for these messages. System messages are not Channel specific, and no Channel number is indicated in their status bytes.

Channel Messages may be further classified as being either Channel Voice Messages or Channel Mode Messages. Channel Voice Messages carry musical performance data, and these messages comprise most of the traffic in a typical MIDI data stream. Channel Mode messages affect the way a receiving instrument will respond to the Channel Voice messages.

Channel Voice Messages

Channel Voice Messages are used to send musical performance information. The messages in this category are the Note On, Note Off, Polyphonic Key Pressure, Channel Pressure, Pitch Bend Change, Program Change, and Control Change messages.

Note On / Note Off / Velocity

In MIDI systems, the activation of a particular note and the release of the same note are considered as two separate events. When a key is pressed on a MIDI keyboard instrument or MIDI keyboard controller, the keyboard sends a Note On message on the MIDI OUT port. The keyboard may be set to transmit on any one of the sixteen logical MIDI channels, and the status byte for the Note On message will indicate the selected Channel number. The Note On status byte is followed by two data bytes, which specify key number (indicating which key was pressed) and velocity (how hard the key was pressed).
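A minimal sketch of how these three bytes fit together is shown below. The specific status byte values (0x90 for Note On and 0x80 for Note Off, with the Channel in the low nibble) come from the MIDI 1.0 Detailed Specification rather than from the text above.

    # Building the three bytes of Note On and Note Off messages.

    def note_on(channel, key, velocity):
        """Build a Note On message for a 1-based Channel (1-16)."""
        assert 1 <= channel <= 16 and 0 <= key <= 127 and 0 <= velocity <= 127
        return bytes([0x90 | (channel - 1), key, velocity])

    def note_off(channel, key, velocity=0):
        """Build the matching Note Off message (release velocity is usually ignored)."""
        assert 1 <= channel <= 16 and 0 <= key <= 127 and 0 <= velocity <= 127
        return bytes([0x80 | (channel - 1), key, velocity])

    # Middle C (key number 60) struck fairly hard on Channel 1:
    print(note_on(1, 60, 100).hex())   # 903c64
    print(note_off(1, 60).hex())       # 803c00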

The key number is used in the receiving synthesizer to select which note should be played, and the velocity is normally used to control the amplitude of the note. When the key is released, the keyboard instrument or controller will send a Note Off message. The Note Off message also includes data bytes for the key number and for the velocity with which the key was released. The Note Off velocity information is normally ignored.

Aftertouch

Some MIDI keyboard instruments have the ability to sense the amount of pressure which is being applied to the keys while they are depressed. This pressure information, commonly called "aftertouch", may be used to control some aspects of the sound produced by the synthesizer (vibrato, for example). If the keyboard has a pressure sensor for each key, then the resulting "polyphonic aftertouch" information would be sent in the form of Polyphonic Key Pressure messages. These messages include separate data bytes for key number and pressure amount. It is currently more common for keyboard instruments to sense only a single pressure level for the entire keyboard. This "Channel aftertouch" information is sent using the Channel Pressure message, which needs only one data byte to specify the pressure value.

Pitch Bend

The Pitch Bend Change message is normally sent from a keyboard instrument in response to changes in position of the pitch bend wheel. The pitch bend information is used to modify the pitch of sounds being played on a given Channel. The Pitch Bend message includes two data bytes to specify the pitch bend value. Two bytes are required to allow fine enough resolution to make pitch changes resulting from movement of the pitch bend wheel seem to occur in a continuous manner rather than in steps.

Program Change

The Program Change message is used to specify the type of instrument which should be used to play sounds on a given Channel. This message needs only one data byte which specifies the new program number.

Control Change

MIDI Control Change messages are used to control a wide variety of functions in a synthesizer. Control Change messages, like other MIDI Channel messages, should only affect the Channel number indicated in the status byte. The Control Change status byte is followed by one data byte indicating the "controller number", and a second byte which specifies the "control value". The controller number identifies which function of the synthesizer is to be controlled by the message. A complete list of assigned controllers