I currently have eight articles: the History of Music, the History of MIDI, the History of my Musical Creations, the History of my BBC Micro Music, a Brief History of the Single, Technology and Media in 1996, Autographs, and Tomorrow's World - The Jam Spreading Myth.
History of MIDI
This paper has been sourced from several other sites, too numerous to mention. If there are any mistakes or omissions, please let me know.
MIDI - the Musical Instrument Digital Interface - is a protocol specifying how electronic musical instruments may be controlled remotely. Synthesisers were originally a mass of cables and components the size of a large fridge or bigger, but following the development of the transistor, Robert Moog was able to design analog synthesisers for the mass market which were considerably smaller and cheaper. By the end of the 1960s Moog synthesisers were appearing in popular music.
Monophonic (single-note) Moog and ARP synthesisers were being used by bands such as ELP and Genesis when the Oberheim company introduced the first commercial polyphonic (able to play several notes at a time) keyboard synthesiser. Relative to its predecessors it was far more portable and easier to program: it had a built-in keyboard, four-note polyphony and a simple array of knobs and switches you could manipulate to quickly create rich, wonderful new sounds.
In the late 1970s, the analog and digital worlds of electronic music began to merge, and with the decreasing cost of microprocessors and mass-produced integrated circuits, synthesiser manufacturers began to incorporate digital circuits into their instruments. The patching of oscillators and controllers was replaced by small control panels and special software. By the end of the decade many synthesisers were entirely digital.
Sequential Circuits, Yamaha, Moog, Roland, ARP, and other companies introduced new models of electronic instruments, all able to play multiple notes simultaneously. What just a few years earlier had been expensive, unwieldy and difficult-to-use machines were becoming popular instruments among a growing and diverse crowd of musicians.
Prior to programmable memory, the reason that people like Keith Emerson and Rick Wakeman had such extravagant keyboard setups on stage was that each of the instruments could only be set up to produce a single sound per show. Hours of preparation were needed to patch together the sounds and the different instruments. When memory came along, it allowed a single synthesiser to be used for several different sounds during a live show or recording session, simply by pressing a single button.
Josef Zawinul, of the seventies jazz group Weather Report, developed a unique technique for playing two keyboards simultaneously. He placed himself between a pair of ARP 2600 synthesisers, one of which had its keyboard electronically reversed, going from high notes on the left to low notes on the right.
All these elaborate measures were designed to accomplish one thing - getting the most from these great new instruments. The layering of sounds upon sounds became an important tool, almost like a trademark sound for some of these and other artists. Then, in 1979, came the next big step: some new keyboards were coming equipped with computer interface plugs on the back. Instruments from the Oberheim, Rhodes, and Roland companies could, for the first time, be connected to another of the same model of synthesiser. For example, an Oberheim OBX synthesiser could be connected to other OBXs: when you played on the keyboard of one, both would play whatever sound was programmed. This was an improvement for performers, since sounds could be layered on top of each other while playing a single keyboard, but it didn't answer the big question of how to connect different instruments from different brands together for unique combinations.
One person who took matters into his own hands was jazz musician Herbie Hancock. Newly enthralled with the technology of synthesisers, he spent a small fortune to have many of his electronic instruments custom modified to connect with each other, allowing him to mix and match sounds any way he wished. For the first time, instruments of different makes were connected with each other by means of a common, though custom, digital connection.
The work that would eventually result in the MIDI 1.0 standard began as a conversation between three audio engineers at the June 1981 trade show of the National Association of Music Merchants (NAMM). I. Kakehashi (Roland Corporation), Tom Oberheim (Oberheim Electronics) and Dave Smith (Sequential Circuits) were all concerned with the difficulty musicians faced in connecting synthesisers from different manufacturers. Starting from the existing literature on computer networks, Smith worked up an initial proposal, called the Universal Synthesiser Interface (USI), which he presented to the Audio Engineering Society in November of that year.
Smith's proposal reached the ears of Japanese synthesiser manufacturers, who had been working on their own standard. The Japanese standard was more complex than USI, which was intended mainly for note on and off events. At the next NAMM show in January 1982, the Japanese, who included such major manufacturers as Korg, Kawai, and Yamaha, joined with the American manufacturers to coordinate the two efforts. Five months later, the basics of the Musical Instrument Digital Interface were presented at the June NAMM show, and vendors began production of commercial instruments which conformed to the infant standard. The whole concept of MIDI was created to make live performances more manageable for the musician.
Roland was one of the first companies to start making musical products that attached to computers. At the time, IBM had just released its first personal computer, the IBM PC, and Commodore had released the Commodore 64, one of the first affordable home computers. The Commodore 64 had a built-in synthesiser chip, the SID, whose designer Bob Yannes later co-founded the musical instrument company Ensoniq. It wasn't very fancy, but nonetheless musicians started playing around with it, and quickly discovered that the programmability of digital computers, combined with a musical instrument, offered solutions to many of the problems they were facing.
Roland saw the potential musical use that computers offered, so they began work on a musical interface for the IBM PC. Unlike the C-64, the IBM PC had no built-in sound chip, so there was a void to be filled by a third-party product. Roland envisioned a digital sequencer to replace the analog sequencers of the time, built around an IBM PC, which offered a lot of programmability and versatility using tools made by many other companies. Since the PC had no built-in sound chip, the sequencer would need its own interface hardware, and Roland wanted it to work with Roland's entire line of new keyboards.
The PC card itself would become the MPU-401, the first MIDI interface for a computer. Roland made some MPU-401 programming information available to other parties, and soon there were other PC programs that supported the MPU-401 interface. Indeed, those other programs proved to be more popular than Roland's initial MESA software. One of those products was called Cakewalk, made by a small upstart known as Twelve Tone Systems, who later changed their name to Cakewalk as that product became synonymous with the company.
The first keyboard on the market with a MIDI interface was the Prophet 600 by Sequential Circuits in 1983, and new Roland keyboards also sported a MIDI interface. Yamaha released the DX7 later that same year, their first keyboard with a MIDI interface. The DX7 proved to be a huge hit with musicians: they loved the sound of its new FM synthesis, bought it in droves, and started to fool around with this new MIDI interface. In the UK, whenever you saw a group with a keyboard on Top Of The Pops, it was likely to be a DX7, where before it might have been a Roland Jupiter-8 or a Fairlight CMI. Apple made a MIDI interface available for its Macintosh computer, and started promoting the computer in the music market during the mid-to-late 1980s, since that was a market completely ignored by their biggest competitor IBM, and one that offered many potential sales, as musicians loved the flexibility and ease of use of the new computer-based digital sequencers. By 1985, virtually every new musical keyboard on the market had a MIDI interface.
With the experience of early implementations, the standard was reshaped and refined; and in August 1983 the MIDI 1.0 Standard was formally published. MIDI specifies that MIDI devices must have a MIDI IN port, a MIDI OUT port, and (optionally) a MIDI THRU port. The MIDI IN port allows the device to accept messages in the electronic language which is also part of the MIDI protocol. The MIDI OUT port allows the device to issue such messages to other devices. The optional THRU port allows the device to pass messages directly from its IN port, allowing devices to be daisy-chained together.
In addition to the hardware specification, MIDI provides a description of the information that is passed along the three types of ports. MIDI commands consist of one or more bytes of data, and include such things as note on and off commands, clock synchronisation, etc. These commands are transmitted serially at a specified rate of 31250 bits/sec, and most MIDI messages include a slot for a channel number. This allows simulation of parallel instrument control: MIDI devices can be configured to accept only messages corresponding to a given channel or set of channels. The possible such configurations are specified in the protocol, and are called modes.
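As an illustration of the byte format described above, here is a short Python sketch (not part of the original standard text) that builds Note On and Note Off messages. The status byte carries the message type in its high four bits and the channel number in its low four bits; the function names are mine:

```python
def note_on(channel, note, velocity):
    """Build a 3-byte MIDI Note On message.

    Status byte: 0x90 | channel (channels are 0-15 on the wire,
    usually shown to users as 1-16). Data bytes must be 0-127.
    """
    assert 0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127
    return bytes([0x90 | channel, note, velocity])

def note_off(channel, note, velocity=0):
    """Build a 3-byte MIDI Note Off message (status 0x80)."""
    assert 0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127
    return bytes([0x80 | channel, note, velocity])

# Middle C (note number 60) at full velocity on channel 1 (wire channel 0):
msg = note_on(0, 60, 127)
print(msg.hex())  # → 903c7f
```

Three bytes at 31250 bits/sec (ten bits per byte on the wire, with start and stop bits) take roughly a millisecond - which is why heavily layered MIDI setups could audibly smear chords.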
The abstraction layer between the control signal and the sound of the synthesiser is, however, a mixed blessing. For composers who want to use MIDI as musical notation, MIDI by itself is not enough; there is no way in the original standard, for example, to differentiate the violin part from the flute part. In late 1982 the first set of universal MIDI specifications was adopted, which caused a surge of interest in synthesisers and keyboards. With the advent of MIDI the creative possibilities opened right up: users could now connect not only synthesisers and keyboards together, but also drum machines and sound modules.
The next logical progression for MIDI was the connection of computers to MIDI devices. Originally this took the form of all-in-one systems combining the computer with the synthesiser; the Fairlight CMI (Computer Musical Instrument) was one of the first of this sort. Computer-based music systems weren't without their drawbacks, the two major ones being that (a) they were very expensive - £2000 to £4000 for the Fairlight CMI or £4000 to £6000 for the New England Digital Synclavier - and (b) the systems tended to be difficult to operate.
Industry leaders quickly realised the demand for some level of uniformity in the structure of synthesisers, and began work on what became General MIDI to try to address this need. General MIDI specifies a relationship between timbres and MIDI instrument numbers. These instrument numbers consist of two lists of 128 slots, where the first list is predetermined by the General MIDI specifications and the second list is for proprietary use by the manufacturer (or, in some cases, by the user). General MIDI was adopted by the MMA and JMSC in 1991.
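To make the timbre-to-number relationship concrete, the sketch below pairs a small excerpt of the published General MIDI Level 1 sound set with the Program Change message used to select a sound. The dictionary covers only five of the 128 slots, and the helper function is hypothetical:

```python
# A small excerpt of the General MIDI Level 1 melodic sound set;
# the full specification defines all 128 program numbers.
GM_PROGRAMS = {
    1: "Acoustic Grand Piano",
    25: "Acoustic Guitar (nylon)",
    41: "Violin",
    57: "Trumpet",
    74: "Flute",
}

def program_change(channel, program):
    """Build a 2-byte Program Change message (status 0xC0).

    `program` is the 1-based GM number as printed in the spec;
    on the wire it is sent 0-based.
    """
    assert 0 <= channel <= 15 and 1 <= program <= 128
    return bytes([0xC0 | channel, program - 1])

# Select Violin (GM program 41) on channel 1 (wire channel 0):
print(program_change(0, 41).hex())  # → c028
```

This is exactly the gap the previous paragraph mentions being closed: with GM, program 41 means "violin" on every conforming device, so a sequence no longer plays back with the wrong instruments.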
SMF (the Standard MIDI File format) was created to provide a standard file format that could be used to transfer music data between all sequencers and software sequencer applications. Most sequencers now give the user the choice of saving files in the sequencer's own format, or in SMF for convenient transfer to other equipment.
SMF actually supports three subtly-different formats:
Format 0 - This format assembles all MIDI data on a single track, allowing playback on even the simplest of sequencers or playback devices. It stands to reason that this format also offers the greatest compatibility.
Format 1 - Format 1 is capable of handling multiple tracks, and is designed to work best with sequencers that allow different parts to be recorded and played back on different tracks - essential for editing and modifying data as well as simple playback.
Format 2 - This little-used format allows multiple tracks and multiple sequence patterns.
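As a rough sketch of how a Format 0 file is put together - an MThd header chunk (format, track count, timing division) followed by a single MTrk chunk of delta-time/event pairs - the following Python snippet writes a minimal one-note file. The function name, note choice and division value are mine, not fixed by the SMF specification:

```python
import struct

def write_format0(path, events, division=480):
    """Write a minimal Standard MIDI File, Format 0.

    `events` is a list of (delta_time, message_bytes) pairs.
    Delta times are encoded as variable-length quantities (VLQs):
    7 bits per byte, high bit set on all but the last byte.
    """
    def varlen(n):
        out = [n & 0x7F]
        n >>= 7
        while n:
            out.append((n & 0x7F) | 0x80)
            n >>= 7
        return bytes(reversed(out))

    track = b"".join(varlen(dt) + msg for dt, msg in events)
    track += varlen(0) + b"\xFF\x2F\x00"  # End of Track meta event
    with open(path, "wb") as f:
        # MThd: chunk length 6, format 0, one track, ticks per quarter note
        f.write(b"MThd" + struct.pack(">IHHH", 6, 0, 1, division))
        f.write(b"MTrk" + struct.pack(">I", len(track)) + track)

# One quarter note of middle C on channel 1 (wire channel 0):
write_format0("test.mid", [
    (0,   bytes([0x90, 60, 100])),   # Note On
    (480, bytes([0x80, 60, 0])),     # Note Off, one quarter note later
])
```

A Format 1 writer would differ only in the header (format 1, several tracks) and in emitting one MTrk chunk per track, which is why moving data between the two formats is mostly a matter of merging or splitting tracks.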
Between 1988 and 1991, seven members of the Just Intonation Network constructed a proposal to extend MIDI to include not only nonstandard tunings, but ways of adjusting tunings in real time. This proposal, which uses a system-exclusive command to send tuning data, was adopted by the MMA/JMSC, again in 1991. However, because system-exclusive commands can be heeded or ignored depending on the implementation, this MIDI Tuning Specification amounts to a recommendation rather than a strict standard.
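To show what "uses a system-exclusive command" means in practice, here is a hedged Python sketch of a single-note tuning change as I read the MIDI Tuning Specification - the frame layout in the comments is my understanding of the spec, and the function name is mine. The key point is the framing: everything between the 0xF0 start and 0xF7 end bytes is vendor- or spec-defined payload, which is precisely why receivers are free to ignore it:

```python
def single_note_tuning(device_id, tuning_program, key, semitone, fraction):
    """Build a System Exclusive message retuning one key in real time.

    Assumed frame layout (my reading of the MIDI Tuning Specification):
      F0 7F <device> 08 02 <program> <count> <key> <xx yy zz> F7
    where xx is the nearest equal-tempered semitone and yy zz is a
    14-bit fraction of a semitone, 7 bits per byte.
    """
    assert 0 <= fraction < 2 ** 14
    return bytes([
        0xF0, 0x7F, device_id,       # SysEx start, universal real-time ID
        0x08, 0x02,                  # sub-IDs: MIDI Tuning, note change
        tuning_program, 1,           # tuning program, one note follows
        key, semitone,
        (fraction >> 7) & 0x7F,      # fraction, most significant 7 bits
        fraction & 0x7F,             # fraction, least significant 7 bits
        0xF7,                        # SysEx end
    ])

# Retune key 60 a quarter tone (50 cents) above its equal-tempered pitch:
msg = single_note_tuning(0, 0, 60, 60, 8192)
```

A device that implements the specification retunes the key on receipt; any other device simply discards the frame - the "recommendation rather than strict standard" behaviour described above.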
In 1999, yet another update was introduced, called General MIDI Level 2. GM2 increases the number of sounds available and also the amount of control over the sound. It is also backwards compatible, meaning GM2 devices can play data written for GM1 devices. Yamaha and Roland had both found the original GM standard limiting, and developed their own additions to it (Yamaha's XG standard and Roland's GS standard); features from both of these were incorporated into GM2.
The XG Format is upwardly compatible with GM and provides more voices, editing capabilities, three effect processors and other functions to enhance the musical expressiveness of MIDI data.
Minimum of 520 Voices
16 or 32 MIDI Channels
Minimum 32 Note Polyphony & Optional Support for Additional Notes
Minimum of 11 Drum Kits on MIDI Channels 10 & 26
The GS Format, created by Roland, builds on GM and is designed to make richer musical expressiveness possible and to enhance compatibility by standardising in detail expanded functions such as voice editing and effects, as well as providing additional tones.
Minimum of 226 Voices
16 or more MIDI Channels
Minimum 24 Note Polyphony & Optional Support for Additional Notes
Minimum of 9 Drum Kits on MIDI Channel 10
History of MIDI Timeline
1981 - Sequential Circuits introduced a common digital interface, called USI (Universal Synthesiser Interface), as a technical paper at the 1981 AES convention.
1982 - USI was revised and completed as the Musical Instrument Digital Interface specification. The first MIDI interface was introduced on the Sequential Circuits Prophet 600.
1983 - The official MIDI 1.0 Detailed Specification was published.
1984 - The popularity of the Yamaha DX7 helped drive MIDI adoption.
1986 - The MIDI Sample Dump Standard added to the MIDI spec.
1987 - MIDI Time Code added to the MIDI spec.
1991 - General MIDI standard added to the MIDI spec, mapping specific types of sounds to voice numbers.
1999 - General MIDI 2 extensions added to the MIDI spec, increasing the number of available sounds and the amount of control (such as reverb, chorus and tuning).