CA2234419A1 - Real-time music creation system - Google Patents

Real-time music creation system

Info

Publication number
CA2234419A1
CA2234419A1 CA002234419A CA2234419A
Authority
CA
Canada
Prior art keywords
rhythm
pitch
user
input device
note
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002234419A
Other languages
French (fr)
Inventor
Eran B. Egozy
Alexander P. Rigopulos
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harmonix Music Systems Inc
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of CA2234419A1 publication Critical patent/CA2234419A1/en
Abandoned legal-status Critical Current

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00 Details of electrophonic musical instruments
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00 Details of electrophonic musical instruments
    • G10H1/0033 Recording/reproducing or transmission of music for electrophonic musical instruments
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00 Details of electrophonic musical instruments
    • G10H1/32 Constructional details
    • G10H1/34 Switch arrangements, e.g. keyboards or mechanical switches specially adapted for electrophonic musical instruments
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00 Details of electrophonic musical instruments
    • G10H1/36 Accompaniment arrangements
    • G10H1/361 Recording/reproducing of accompaniment for use with an external source, e.g. karaoke systems
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00 Details of electrophonic musical instruments
    • G10H1/36 Accompaniment arrangements
    • G10H1/40 Rhythm
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00 Details of electrophonic musical instruments
    • G10H1/36 Accompaniment arrangements
    • G10H1/40 Rhythm
    • G10H1/42 Rhythm comprising tone forming circuits
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/155 User input interfaces for electrophonic musical instruments
    • G10H2220/315 User input interfaces for electrophonic musical instruments for joystick-like proportional control of musical input; Videogame input devices used for musical input or control, e.g. gamepad, joysticks
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2240/00 Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
    • G10H2240/171 Transmission of musical instrument data, control or status information; Transmission, remote access or control of music data for electrophonic musical instruments
    • G10H2240/281 Protocol or standard connector for transmission of analog or digital data to or from an electrophonic musical instrument
    • G10H2240/311 MIDI transmission

Abstract

An electronic music system has an input device (12), one or more computer storage media, a rhythm generator (100), a pitch selector (108), and a sound generator (102). The input device (12) generates rhythm-related input signals and pitch-related input signals in response to manipulations of the input device (12) by a user attempting to create and play a solo. The computer storage media have a plurality of user-selectable musical accompaniment tracks over which the user can create and play the solo and a plurality of rhythm blocks wherein each rhythm block defines, for at least one note, at least a time at which the note should be played. The computer storage media also store at least a portion of the solo created by the user over a predetermined time interval in the immediate past. The rhythm generator (100) receives the rhythm-related input signals from the input device (12), selects one of the rhythm blocks from the computer storage media based on the rhythm-related input signals, and outputs an instruction to play the note at the time defined by the selected rhythm block. The pitch selector (108) receives the pitch-related input signals from the input device (12) and selects an appropriate pitch based on the pitch-related input signals, the user-selected musical accompaniment track, and the stored solo information. The pitch selector (108) then outputs that appropriate pitch. The sound generator (102) receives instructions from the rhythm generator (100), pitches from the pitch selector (108), and the user-selected musical accompaniment track and generates an audio signal representative of the user-created solo and the accompaniment track.

Description

REAL-TIME MUSIC CREATION SYSTEM

Field of the Invention

This invention relates to electronic music and, more particularly, to an electronic music system with which a non-musician can produce melodic, creative music without knowledge of music theory or the ability to play an instrument or keep time.
Background of the Invention

Electronic keyboards and other electronic musical instruments are known. Many electronic keyboard instruments generate digital data compatible with the Musical Instrument Digital Interface (MIDI) standard. Many electronic musical instruments also provide an automatic accompaniment or background which is played by the instrument at the performer's request. With many known electronic musical instruments, in order to make organized melodic sounds which would be considered "music", the performer must actually be able to play the instrument or at least be able to strike the instrument's "actuators" (i.e., keys of a music keyboard, strings of a stringed instrument such as a guitar, etc.) in "time", meaning in some order appropriate for the time signature and tempo of the piece of music, song, or melody being played by the performer on the instrument. With other known musical instruments, the performer makes music by keying a pre-recorded melody on and off whenever desired.

WO 97/15043 PCT/US96/15913

U.S. Patent No. 5,099,738 to Hotz discloses a MIDI-compatible electronic keyboard instrument that does not allow the musician to strike a wrong note. During the interval of time in which a particular chord is being played, the instrument generates, in response to the musician's depression of any key, a "correct" note (i.e., pitch) in that chord or a "correct" note in a scale which is compatible with that chord. Like other known electronic musical instruments, the times when notes are played are determined entirely by when the musician depresses a key on the keyboard. If the musician does not or cannot depress the keys at appropriate times, the result will be "correct" notes played in an unorganized, random sequence. The musician thus is given "creative input" as to the time when notes are played but does not have the option of playing an incorrect chord or note.
U.S. Patent No. 5,393,926 to Johnson discloses a virtual MIDI guitar system. The system has a personal computer which utilizes a CD-ROM player to play back a stored audio and video accompaniment selected by a user. The accompaniment is a recording of a song with the guitar track omitted. The personal computer stores the guitar track of the song. The guitar has strings and a tremolo bar, and a user's manipulation of the strings and tremolo bar sends digital data to the personal computer. The personal computer uses that data to access and play back relevant portions of the guitar-only track, as described below. The personal computer also mixes the guitar track with the audio track from the CD-ROM player and broadcasts it through speakers while at the same time displaying the video image on a monitor connected to the personal computer. The guitar-only track contains all of the guitar notes in the sequence in which they are to be played, and it is partitioned into a sequence of frames. The guitar player is able to generate only those notes that are within the current frame and only in the order in which they appear in the current frame, "current" being determined by a clock variable which tells the elapsed time since the song began. The pace at which the notes are played within the current frame is determined by when the user strikes the strings such that the user may be able to get somewhat out of alignment with the accompaniment in any particular frame and may have some flexibility to modify or experiment with the timing of the notes of the guitar track within a frame. If the player does not play the guitar during a period associated with a given frame, none of the music within that frame will be generated. Striking strings of the guitar thus causes an otherwise silent, running, pre-recorded guitar-only track to be heard, and the guitar thus essentially operates as an on/off or play/silent button for the pre-recorded guitar track.
U.S. Patent No. 5,074,182 to Capps et al. discloses a guitar-like instrument with encoded musical material that includes a plurality of multi-part background songs and a plurality of solo parts or "riffs" that harmonize with the background songs. A read-only memory (ROM) in the instrument stores a program and the encoded musical material. Once the user has selected and started a background song, the user can trigger a guitar riff by operating some switches on the instrument. Manipulating the switches thus causes one of a plurality of pre-stored riffs to play over the selected background song.

Summary of the Invention

It is an object of the invention to provide an electronic music system that non-musicians can use to generate melodic, creative music in real-time without knowledge of music theory and without the ability to play an instrument or keep time.
It is also an object to allow the user of the system to create and play easily a non-pre-recorded solo over a pre-recorded background or accompaniment track.
It is a further object to allow the user of the system to create solos without the need to strike actuators in time or otherwise physically establish and maintain the timing of the notes of the solo. The system does not require the user to, for example, keep a steady beat.
It is still another object to provide the user of the system with one or more simple controllers (e.g., a joystick which can have one or more buttons) for manipulating the system and generating the solo in real-time.
All of the complexity associated with creating music is placed in the system of the invention. A user of the system need not know anything about music or musical instruments to create music with the system. Except for the background track, the music generated by the system under the control of the user is produced in real-time; it is not simply a playback of a pre-recorded solo track.
In general, in one aspect, the invention features an electronic music system having an input mechanism, computer storage media, a rhythm generator, a pitch selector, and a sound generator. The input mechanism provides rhythm-related input signals and pitch-related input signals, for example, in response to a user's manipulations of it. In one embodiment, the user manipulates the input mechanism to create and play music (e.g., a solo line) over one of a plurality of user-selectable musical background or accompaniment tracks. In general, a solo means a composition or section for one performer. A solo can be a musical line of single pitches sounded one after another (i.e., a melody), or it can be a line that has intervals (i.e., two different pitches sounded at the same time) and/or chords (i.e., three or more different pitches sounded simultaneously) as well as, or in place of, just single pitches. (In general, whenever "melody" is used hereinafter, it should be taken to mean a melody or a solo, as those two words have been defined above. Also, "solo" includes "melody" by definition.)
The computer storage media (e.g., computer memory such as RAM, a computer hard disk drive, and/or a CD-ROM drive with a CD-ROM therein) contain the user-selectable accompaniment tracks and a plurality of rhythm blocks.
Each rhythm block defines, for at least one note, at least a time at which the note should be played. A rhythm block also can specify a duration and a loudness for the note, and if these are not specified by the rhythm block, default or predetermined values are used. The computer storage (e.g., RAM) also stores at least the portion of the solo created over some time interval in the immediate past. It preferably stores all of the user's solo line automatically in real-time as it is created by the user. This "past solo" information is used by the pitch selector in selecting the next pitch to output.
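The rhythm block just described can be sketched as a simple data structure. The field names, sentinel convention, and default values below are illustrative assumptions for exposition, not the encoding disclosed in the patent (the patent's own examples appear in FIGS. 6A-6D).

```cpp
#include <cassert>
#include <vector>

// Hypothetical sketch of a rhythm block: for each note it stores at least
// an onset time; duration and loudness are optional and fall back to
// predetermined defaults when the block does not specify them.
struct NoteEvent {
    double onset_beats;     // time at which the note should be played
    double duration_beats;  // <= 0 means "use the default duration"
    int    loudness;        // <= 0 means "use the default loudness" (MIDI-style 0-127)
};

struct RhythmBlock {
    std::vector<NoteEvent> notes;
};

// Assumed defaults; the patent only says predetermined values are used.
constexpr double kDefaultDuration = 0.5;  // an eighth note, in beats
constexpr int    kDefaultLoudness = 96;   // a moderately strong velocity

double effective_duration(const NoteEvent& n) {
    return n.duration_beats > 0 ? n.duration_beats : kDefaultDuration;
}

int effective_loudness(const NoteEvent& n) {
    return n.loudness > 0 ? n.loudness : kDefaultLoudness;
}
```

A block that specifies only onset times thus still yields complete notes once the defaults are applied.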

The rhythm generator receives the rhythm-related input signals from the input device, selects one of the rhythm blocks from storage based on the rhythm-related input signals, and then outputs a "play note" instruction which indicates the time at which to play the note as defined by the selected rhythm block. The pitch selector receives the pitch-related input signals from the input device and selects an appropriate pitch based on the pitch-related input signals, harmony and metric data in the user-selected accompaniment track, and the "past solo" information. The pitch selector then outputs that appropriate pitch. The sound generator receives both: (i) the user-selected accompaniment track; and (ii) the user-created solo track which includes timing information from the rhythm generator and pitch information from the pitch selector. The sound generator then generates a representative audio signal.
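The division of labor above can be made concrete with a small glue sketch: the rhythm generator supplies *when* (a play-note instruction), the pitch selector supplies *what* (a pitch), and the sound generator consumes the combined note event alongside the accompaniment. The type and function names are assumptions for illustration only.

```cpp
#include <cassert>

// Output of the rhythm generator: timing only, no pitch yet.
struct PlayNoteInstruction {
    double onset_beats;
    double duration_beats;
};

// A complete solo note, ready for the sound generator.
struct NoteOn {
    double onset_beats;
    double duration_beats;
    int    pitch;  // MIDI note number supplied by the pitch selector
};

// Merge timing from the rhythm generator with a pitch from the pitch
// selector into one event on the user-created solo track.
NoteOn combine(const PlayNoteInstruction& when, int pitch) {
    return NoteOn{when.onset_beats, when.duration_beats, pitch};
}
```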
In a preferred embodiment, the input device is a joystick having a base, a movable handle, and one or more buttons. The buttons can be used by the user to tell the electronic music system "play" and to perform certain musical effects such as: sustain the current note; play a particular riff; repeat the lick just played; alter the timbre; bend the pitch; play a chord instead of a single note; add a dynamic accent; and/or add articulation.
Moving the joystick's handle along the forward/backward axis can provide the rhythm-related input signals, and the right/left axis can be associated with the pitch-related input signals. For example, pulling the handle all the way backward can be an indication to the electronic music system to generate notes with the lowest rhythmic activity (e.g., whole notes), and pushing it all the way forward can mean generate the highest-activity notes (e.g., sixty-fourth notes). When the handle is between these two extremes, rhythmic activity between the two extremes is generated.
Also, moving the handle all the way to the right can correspond to the highest possible pitch, and the leftmost position can mean the lowest possible pitch, with a position therebetween meaning a pitch between the highest and lowest pitches. The user can manipulate the joystick handle and quickly and easily switch rhythms and pitches. These two simple movements alone (back and forth, and left and right) allow the user to create a solo with rich and varied rhythmic and tonal qualities.
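The two-axis mapping described above can be sketched as follows. The raw axis range, struct names, and normalization are assumptions; the point is only that each handle position reduces to a rhythmic-activity level and a pitch level.

```cpp
#include <cassert>

// Hypothetical raw joystick reading; the +/-127 range is an assumption.
struct JoystickState {
    int forward_backward;  // -127 (full backward) .. +127 (full forward)
    int left_right;        // -127 (full left) .. +127 (full right)
};

// The musical meaning of the handle position.
struct MusicalControls {
    double rhythm_activity;  // 0.0 = whole notes .. 1.0 = sixty-fourth notes
    double pitch_level;      // 0.0 = lowest pitch .. 1.0 = highest pitch
};

// Map a raw axis value onto 0.0..1.0.
double normalize_axis(int raw) {
    return (raw + 127) / 254.0;
}

MusicalControls read_controls(const JoystickState& js) {
    return MusicalControls{normalize_axis(js.forward_backward),
                           normalize_axis(js.left_right)};
}
```

Positions between the extremes fall between 0.0 and 1.0, yielding the intermediate rhythmic activity and pitch described in the text.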
In some embodiments, the electronic music system includes one or more speakers for broadcasting the audio signal from the sound generator. An amplifier generally must be used to amplify the audio signal before it is broadcast through the speaker(s).
Also, in a preferred embodiment of the electronic music system according to the invention, a programmed computer performs the functions of the rhythm generator and the pitch selector. The programmed computer can also perform the functions of the sound generator, or the sound generator can be a MIDI tone generator separate from the computer. The speaker(s) and/or the amplifier can be internal to the computer as well.
In general, in another aspect, the invention features an electronic rhythm generation system having an input device, a computer storage medium, and a rhythm generator.
The input device generates rhythm-related input signals in response to manipulations of the input device by a user.
The computer storage medium has a plurality of rhythm blocks wherein each rhythm block defines, for at least one note, at least a time at which the note should be played.
The rhythm generator receives the rhythm-related input signals from the input device, selects one of the rhythm blocks from the computer storage medium based on the rhythm-related input signals, and outputs an instruction to play the note at the time defined by the selected rhythm block.
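The selection step of this aspect can be sketched as a quantization of the rhythm-related signal over an ordered set of rhythm blocks. The block contents and the linear selection rule are assumptions; the patent leaves the selection strategy to the rhythm generator described later.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// A minimal rhythm block: onset times (in beats) within one 4/4 bar.
struct SimpleRhythmBlock {
    std::vector<double> onsets_beats;
};

// Pick a block from an activity-ordered set using the normalized
// rhythm-related signal (0.0 = lowest activity, 1.0 = highest).
const SimpleRhythmBlock& select_rhythm_block(
        double rhythm_signal, const std::vector<SimpleRhythmBlock>& blocks) {
    if (rhythm_signal < 0.0) rhythm_signal = 0.0;
    if (rhythm_signal > 1.0) rhythm_signal = 1.0;
    std::size_t idx =
        static_cast<std::size_t>(rhythm_signal * (blocks.size() - 1) + 0.5);
    return blocks[idx];
}

// Example block set, ordered from low to high rhythmic activity.
const std::vector<SimpleRhythmBlock> kBlocks = {
    {{0.0}},                                     // whole note
    {{0.0, 2.0}},                                // half notes
    {{0.0, 1.0, 2.0, 3.0}},                      // quarter notes
    {{0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5}},  // eighth notes
};
```

Each note of the selected block then becomes a "play note" instruction at the time the block defines.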
In still another aspect, the invention involves an electronic pitch selection system which comprises an input device, computer storage media, and a pitch selector. The input device generates pitch-related input signals in response to manipulations of the input device by a user.
The computer storage media have a plurality of user-selectable musical accompaniment tracks, and store at least the pitches selected by the system over a predetermined time interval in the immediate past. The pitch selector receives the pitch-related input signals from the input device, and then selects an appropriate pitch based on the pitch-related input signals, the user-selected musical accompaniment track, and the stored pitches. The pitch selector outputs that appropriate pitch.
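One plausible reading of this pitch-selection step: the pitch-related signal picks a target within a pitch range, and the selector snaps it to the nearest pitch compatible with the accompaniment's current harmony. The scale content and the snapping rule are assumptions, and this sketch omits the influence of the stored past pitches that the text also requires.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Snap a normalized pitch signal (0.0 = lowest, 1.0 = highest) to the
// nearest allowed pitch. `allowed_pitches` is a sorted list of MIDI note
// numbers compatible with the accompaniment's current harmony.
int select_pitch(double pitch_signal, const std::vector<int>& allowed_pitches) {
    if (pitch_signal < 0.0) pitch_signal = 0.0;
    if (pitch_signal > 1.0) pitch_signal = 1.0;
    int lo = allowed_pitches.front();
    int hi = allowed_pitches.back();
    double target = lo + pitch_signal * (hi - lo);  // where the handle points
    int best = allowed_pitches.front();
    for (int p : allowed_pitches) {
        if (std::fabs(p - target) < std::fabs(best - target)) best = p;
    }
    return best;
}

// Assumed harmony-compatible scale: C major pentatonic over one octave.
const std::vector<int> kScale = {60, 62, 64, 67, 69, 72};
```

Because only scale members can be returned, any handle position yields a pitch that fits the accompaniment, mirroring the "no wrong note" property discussed in the background.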
In general, in yet another aspect, the invention involves an electronic system for processing data representative of a musical score to modify automatically the score by adding instrument-specific performance parameters or musical ornamentation.
The foregoing and other objects, aspects, features, and advantages of the invention will become more apparent from the following description and from the claims.

Brief Description of the Drawings

In the drawings, like reference characters generally refer to the same parts throughout the different views. Also, the drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention.
FIG. 1 is a block diagram of a computer-assisted real-time music composition system which uses a simple controller in accordance with the invention.
FIG. 2 is a simplified block diagram of a computer in which the present invention can be embodied.
FIG. 3A is a perspective view of a computer joystick for use as an input device/controller of the system in accordance with the invention.
FIG. 3B is also a perspective view of the joystick showing the meaning of various movements in one embodiment of the invention.
FIG. 4A is a simplified flowchart of a set-up procedure a user goes through before generating music with the system of the invention.
FIG. 4B is a more complete depiction of the set-up procedure.
FIG. 4C is a data path diagram showing which functional blocks of the system according to the invention use what data/variables.
FIG. 5 is a high-level flowchart of the operations performed by the system of the invention after set-up is complete.
FIGS. 6A, 6B, 6C, and 6D each shows an example of the rhythm block data structure.
FIGS. 7A and 7B each shows an example of the rhythm style data structure.
FIG. 8 is a high-level flowchart of the steps performed by the rhythm generator functional block of the system of the invention.
FIG. 9 is a high-level flowchart of the steps performed by the pitch selector functional block of the system of the invention.
FIG. 10 is a detailed functional block diagram of the computer-implemented system according to the invention.

Description

Referring to FIG. 1, a system 10 according to the invention generates music in real-time in response to a user's manipulation of one or more simple controllers/input devices 12 such as a joystick. The system 10 includes a computing device 14, a sound generator 16, and one or more speakers 18. The computing device 14 typically is a personal-type computer running programs which generate in real-time digital data representative of music in response to the joystick 12 manipulations. The data is then turned into audible music by the combination of the sound generator 16 and the speaker(s) 18.
The system 10 is an electronic music system that is designed for non-musicians but which can be used by anyone that wants quickly and easily to generate melodic, creative music in real-time. The user is not required to have any knowledge of music theory or the ability to play an instrument or keep time. All the user needs to know how to do is to manipulate the joystick 12. Other equally simplistic input devices can be used in place of the joystick 12 to create music including, for example, a mouse, a game pad, a trackball, a MIDI keyboard, a MIDI guitar, other MIDI instruments, any of a variety of spatial sensors that can track hand/body motion through the air, one or more switches such as the up/down volume touch buttons on an electronic car radio, or any combination of such input devices. The user's manipulations of the input device (e.g., joystick 12) send actuator signals (e.g., changes in the positions of buttons or continuous controllers like the axes of a joystick's handle) which cause the system 10 to generate and play a non-pre-recorded melody over a user-selected pre-recorded accompaniment/background track.
All of the complexity associated with creating music from a traditional or known instrument has been incorporated into the system 10 of the invention. The system 10 relieves the user of the burden of having to learn to play a traditional or known instrument. The system 10 provides the user with a simple controller/input device (e.g., the joystick), and the user thus is free to concentrate solely on the music itself. The user does not have to worry about instrument-playing technique, being in tune, playing in time, etc. The system 10 of the invention has been designed to handle all of those concerns. Even though the system 10 uses a very simple-to-operate interface (e.g., the joystick 12) and the user need not have any special musical abilities or knowledge, the user generally is not limited in the type, style, or variety of music that he can produce with the system 10 of the invention. The system 10 allows a user to do essentially anything that can be done with any traditional or known instrument.
Referring still to FIG. 1, the function of the sound generator 16 is to generate signals representative of audible music, and this can be accomplished by, for example, synthesis or sample playback. The electronic hardware needed to generate these signals can reside on a card plugged into the computer 14, or it can be in a separate box external to the computer 14. Also, in the case of synthesis, the signal generation can be performed either in hardware or entirely by software running on the computer 14. The sound generator 16 can be, for example, a MIDI tone generator or other synthesis device. The signals generated by the sound generator 16 generally must be amplified and broadcast by the speakers 18. The amplification and broadcasting can be accomplished by, for example, hardware internal to the computer 14 or hardware external to the computer 14.
The computer 14 can be any personal-type computer or workstation such as a PC or PC-compatible machine, an Apple Macintosh, a Sun workstation, etc. The system 10 was developed using a Macintosh Powerbook 540c computer with 12 megabytes of RAM and the MacOS 7.5.1 operating system, and the computer programs for implementing the functionality described herein were written in the C++ programming language. In general, any computer could be used as long as it is fast enough to perform all of the functions and capabilities described herein without adversely affecting the quality of the generated music. The particular type of computer or workstation is not central to the invention.
In fact, the music composition system according to the invention can be implemented in a variety of ways including an all-hardware embodiment in which dedicated electronic circuits are designed to perform all of the functionality which the programmed computer 14 can perform.
Referring to FIG. 2, the computer 14 typically will include a central processor 20, a main memory unit 22 for storing programs and/or data, an input/output (I/O) controller 24, a display device 26, and a data bus 28 coupling these components to allow communication therebetween. The memory 22 includes random access memory (RAM) and read only memory (ROM) chips. The computer 14 typically also has one or more input devices 30 such as a keyboard 32 (e.g., an alphanumeric keyboard and/or a musical keyboard), a mouse 34, and the joystick 12. In a preferred embodiment, the system 10 includes the single joystick 12, the alphanumeric keyboard 32, and the mouse 34. In general, the joystick 12 is used by the user to create music with the system 10, and the alphanumeric keyboard 32 and mouse 34 are used by the user to set up and configure the system 10 prior to the actual creation of music with the system 10.
The computer 14 typically also has a hard drive 36 with hard disks therein and a floppy drive 38 for receiving floppy disks such as 3.5 inch disks. Other devices 40 also can be part of the computer 14 including output devices (e.g., printer or plotter) and/or optical disk drives for receiving and reading digital data on a CD-ROM. In the disclosed embodiment, one or more computer programs written in C++ define the operational capabilities of the system 10, as mentioned previously. These programs can be loaded onto the hard drive 36 and/or into the memory 22 of the computer 14 via the floppy drive 38. In the disclosed embodiment, the executable versions of the C++ programs are on the hard drive 36, and the music composition system 10 according to the invention is caused to run by double-clicking the appropriate icon. In general, the controlling software program(s) and all of the data utilized by the program(s) are stored on one or more of the computer's storage media such as the hard drive 36, CD-ROM 40, etc. In general, the programs implement the invention on the computer 14, and the programs either contain or access the data needed to implement all of the functionality of the invention on the computer 14.

Focusing back on some of the more central aspects of the invention, the input device/controller (e.g., the joystick 12) which a user of the system 10 manipulates to create music preferably allows the user to indicate to the computer 14 a variety of information. Referring to FIGS. 3A and 3B, in a preferred embodiment, this is accomplished by the joystick 12 being movable in at least four directions 42, 44, 46, 48 and having at least three buttons 50, 52, 54.
In the disclosed embodiment, pulling the handle of the joystick of FIG. 3B in the backward direction 42 indicates to the computer 14 that the user wants to play fewer notes over time (e.g., half notes as opposed to eighth notes) in the given time signature, and pushing it forward 44 is an indication to play more notes over time (e.g., thirty-second notes as opposed to quarter notes). The handle of the joystick 12 moves from its backwardmost position to its forwardmost position through a series of rhythmic values starting with notes having the lowest rhythmic activity (e.g., whole notes) at the backwardmost position and going all the way to notes having the highest rhythmic activity (e.g., sixty-fourth notes) at the forwardmost position. The user generally can create any rhythmic output by moving the handle of the joystick back and forth. The selection of the end points of this series and the number and type of notes in between the two end points generally is made by the system designer/programmer. There are a large number of possible series or continuums, and the system usually selects one or more particular series automatically without any user involvement. The system typically will select one or more series of rhythm values based on the user-selected (or default) accompaniment and/or style of music. These rhythm continuums and the selection of them will become clear hereinafter from discussions about the "rhythm generator" aspect of the system 10 according to the invention.
Re~erring to FIG. 3A, in the disclosed embo~;m~nt, the joystick 12 has at least a play button 50, a sustain button 52, and a lick repeat button 54.
The play button 50 is used by the user to indicate to the computer 14 when to start creating and playing the melody under the user's joystick control. The user must depress and hold the play button 50. Depressing the play button 50 enables the "rhythm generator" (discussed hereina~ter). As alluded to previously, in the disclosed embodiment, the output o~ the rhythm generator is determined by the ~orward/backward position o~ joystick 12 W O 97/15043 PCT~US96/15913 (FIG. 3B). The user is only allowed to create and play a melody a~ter the aCcomp~n;m~nt has been started, and the user pre~erably starts the accomp~n;m~nt by using the mouse 34 and/or alphanumeric keyboard 32 to click on a graphic ~ 5 start button on the monitor 26 of the computer 14.
The sustain button 52 is used by the user to indicate to the computer 14 that the note currently playing (or the next note played) should be sustained or held such that it continues to sound. That is, the current note is maintained for an extended period of time. This is similar to a vocalist "holding a note". The note ends when the user releases the sustain button 52.
The lick repeat button 54, when depressed, causes the system 10 to repeat a particular collection of notes previously played. This button 54 is useful if the user has just created a particularly pleasing "lick" or "riff" (which generally is a catchy collection of several notes) and wants to repeat it automatically without having to figure out and re-enact exactly what she just did with the joystick 12 to create the lick in the first instance. The lick stops repeating when the user releases the lick repeat button 54. The point in history at which the system 10 demarcates the beginning of the lick is randomly or algorithmically determined by the computer program. The length of the repeated segment is typically a few beats or less, as described hereinafter under the "licker" section.
The discussion of this algorithm brings up the important point that the programmed computer 14 is a digital processing device which is capable of storing in digital format some or all of the data it generates and outputs to the sound generator 16 (FIG. 1). That is, it can, and does, store (e.g., on the hard drive 36, in memory 22, etc.) the data representative of the melody the user is creating as it is being created. This capability is what allows the user to repeat a lick with the lick repeat button 54. The computer 14 generally stores the last ten notes of the melody, although this parameter is configurable and can be set to store more or fewer notes.
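The rolling ten-note history described above behaves like a bounded queue. The sketch below is a minimal illustration, not the disclosed implementation; the class name, the (pitch, duration) note representation, and the method names are assumptions, while the ten-note default comes from the text.

```python
from collections import deque

# Illustrative sketch of the note history behind the lick repeat button:
# the computer stores roughly the last ten notes of the melody as it is
# created, and the licker replays them while the button is held.

class MelodyHistory:
    def __init__(self, max_notes=10):          # configurable limit, per the text
        self.notes = deque(maxlen=max_notes)   # oldest notes fall off the front

    def record(self, pitch, duration_ticks):
        self.notes.append((pitch, duration_ticks))

    def lick(self):
        """Return the stored notes for replay while the button is held."""
        return list(self.notes)

history = MelodyHistory()
for p in range(60, 72):          # play twelve notes...
    history.record(p, 240)
print(len(history.lick()))       # ...but only the last ten are retained
```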
Having described the environment in which the invention operates and generally the overall purpose and functionality of the music composition system 10 of the invention, the following is a more detailed description of the invention.

SETUP:
Referring to FIGS. 1, 2, and 4A, the programmed computer 14 of the system 10 takes the user through a configuration or setup procedure before the user is allowed to create music with the system 10. In a preferred embodiment, the input devices 30 used by the user to configure or setup the system 10 are the keyboard 32 and/or the mouse 34. After the setup is complete, the user generally uses the joystick 12 (or other similarly simple-to-operate input device) to create music with the system 10. During the setup stage, the programmed computer 14 allows the user to select a particular background or accompaniment track (step 68) from a list of a plurality of possible tracks. In a preferred embodiment, the background tracks are stored either as MIDI files or as audio files. In general, MIDI files are small (i.e., do not take up a large amount of space on the storage medium such as the memory 22, the hard drive 36, or the CD-ROM 40) and audio files are comparatively large. If the tracks are MIDI, the selected track typically will be loaded into the memory 22 of the computer 14 from its hard drive 36 or CD-ROM 40, for example. If the tracks, however, are audio, the selected track typically will not be loaded into memory 22 and will instead be streamed off of, for example, the hard drive 36 or the CD-ROM 40 as needed during the user's performance.
After selection of the desired accompaniment, the user may select a particular style of music that he wishes to play, but the default is that the computer 14 chooses the style that has been pre-associated with each of the possible background tracks. Once the style is determined by either default or user selection, the computer 14 loads into memory 22 the data relevant to that style.
The user is then allowed by the computer 14 to select an instrument from a list of a plurality of possible instruments (step 70). In a preferred embodiment, the instrument list is stored by the computer 14 on, e.g., the hard drive 36 or the CD-ROM 40. For each instrument in the list, there are stored all kinds of data relevant to that instrument. These instrument-specific data are representative of, for example, the functionality of actuators (e.g., buttons) on the joystick 12 or other input device 30, whether the instrument can play chords and what voicings for the chords, the timbre of the instrument (which is the characteristic quality of sounds made by the instrument independent of pitch and loudness), pitch envelopes for one or more notes that the instrument is capable of producing, and the pitch range for the instrument.
A more complete description of the setup stage is provided below with reference to FIG. 4B. In FIG. 4B, the user-selectable items include the skill level 72 (novice through expert), the type of interface 74 (e.g., joystick, game pad, MIDI keyboard, spatial sensors, etc.), the type of instrument 76 (e.g., guitar, piano, saxophone, etc.), the background track 78 (i.e., the accompaniment piece over which the user wishes to play), and a musical style 80 in which the user wishes to play. Each background track has associated with it a default musical style that is most compatible with the accompaniment, but the user may choose an alternative style for the sake of experimentation. Once the user makes a selection from the plurality of choices available for each of these user-selectable items (e.g., novice, expert, or somewhere in between for the skill level 72 item), a whole set of data/variables associated with that selection for that item are loaded into memory 22 from the hard drive 36, and those data/variables are used to configure the system 10 in a particular way, as described hereinafter with reference to FIG. 4C.
Referring now to FIGS. 1, 2, 3A, 3B, and 5, with the setup stage complete, the programmed computer 14 waits until the user presses a "start" button (e.g., a graphic button on the monitor 26 which the user points to with the mouse 34 and clicks on). See step 82 in FIG. 5. Once "start" has been indicated, the playback of the background track commences (step 84). In the preferred embodiment, the user then uses the joystick 12 (or other similarly simple-to-operate input device) to create music with the system 10. As described previously with reference to FIG. 3A, the user must depress and hold the play button 50 on the joystick 12 (step 86) to enable the "rhythm generator" (discussed hereinafter) and thus the system 10 (step 88).

The following is a further description of the user-selected (or default) configuration data/variables (FIG. 4B), and the way that they affect the operation of the system 10 according to the invention. Referring to FIG. 4C, the configuration data associated with the selected skill level 72, interface type 74, instrument type 76, and musical style 80 are provided to one or more of the functional blocks of the system 10 of the invention as depicted. These functional blocks are all described hereinafter with reference to FIG. 10. The selected background track 78 also is provided to some of the functional blocks.
Still referring to FIG. 4C, some of the configuration data for the selected skill level 72 is provided to an automator functional block, and some is provided to an interface processor functional block. Both of those blocks are described hereinafter with reference to FIG. 10. The automator receives data about how much system automation should be turned on. For a novice, full automation will be turned on such that the novice user need only operate the play button to create music, for example. For each level higher than novice, the level of system automation decreases to the point where an expert is given the most amount of control possible. For an expert, the system might enable all buttons and axes on the joystick and a plurality of additional buttons. These additional buttons typically are keys of the alphanumeric computer keyboard (or a MIDI keyboard or similar device). The interface processor is told what buttons, sliders, etc. on the interface (e.g., joystick and/or keyboard) are enabled/disabled.

Some of the configuration data for the selected interface type 74 is provided to a gesture analyzer functional block, and some is provided to the interface processor. Both of those blocks are described hereinafter with reference to FIG. 10. The gesture analyzer can be a joystick-sensing system or possibly an electronic eye system, and the data it receives indicates the user's gestures or movements (with the joystick) for which the gesture analyzer should be looking and also the corresponding system functions that should be triggered as a result of those gestures. The interface processor is told what non-instrument-specific system functions should be triggered by each of the various enabled actuators (e.g., buttons) on the interface (e.g., joystick).
Some of the configuration data for the selected instrument type 76 is provided to the interface processor, and other data is provided to a chord builder, a timbre manager, an envelope manager, an articulator, and a pitch selector. All of these functional blocks are described hereinafter with reference to FIG. 10. The interface processor is told what instrument-specific system functions should be triggered by each of the various enabled actuators (e.g., buttons) on the joystick. The chord builder is told whether or not the selected instrument can play chords and, if so, what are the characteristic chord structures or voicings for the selected instrument. The timbre manager is provided with the timbre information for the selected instrument. The envelope manager is told the pitch envelopes to be used for the selected instrument in order to shape the pitch of the note (e.g., bend it up or down) to simulate how that instrument would sound if played by a trained musician. The articulator is told whether slurring the chosen instrument will affect the attack portion of the timbre for that instrument. The pitch selector is provided with information about the range of pitches (lowest to highest) that the selected instrument could produce if played by a trained musician.
Some of the configuration data for the default (or selected) musical style 80 is provided to the pitch selector, and the other data is provided to a sustainer, a riffer, an accenter, and the rhythm generator. These functional blocks are described hereinafter with reference to FIG. 10. The pitch selector is provided with information about various melodic constraints for the given style such as at which times (metrically) consonant notes are more likely. The sustainer is told which times (metrically) are eligible for sustaining notes in the given style. The riffer is provided with "riffs" (which generally are rhythm blocks coupled with melodic contours) appropriate for the given style, and these are used for effects such as grace notes, glissandi, trills, tremolos, and other melodic ornaments. The accenter and the rhythm generator are both provided with rhythm blocks associated with the given style.
As somewhat of an aside, it is noted that each of the background tracks from which the user can select comprises: (i) a harmony track 90; (ii) a tempo track 92; and (iii) a MIDI and/or audio track 94. The third component of the background track typically is either a MIDI track or an audio track. In either case, it is a data file of the music over which the user wants to play a solo or melody.
It could be a song by, for example, James Brown, Black Sabbath, or Barry Manilow. The other two tracks, the harmony and tempo tracks, are created from scratch by the system programmers/designers based on the song (i.e., the MIDI/audio track). Unlike the MIDI/audio track, the harmony and tempo tracks are not recordings of a song that a person could listen to and recognize. Instead, these two tracks contain data that the system 10 of the invention utilizes in selecting and playing notes (under the user's control) that are appropriate for the song. The harmony track contains key and chord information about the song. More specifically, it contains data representative of the key and chord at any particular point in the song. The tempo track contains data representative of the timing of the song. It essentially provides timing information about the song in the form of a time grid.
The harmony track provides to the pitch selector the current "key" and the current "chord". The "key" data provided to the pitch selector includes both the root note of the key and the type of key. Examples are: "F major" (where "F" is the root and "major" is the type), which is defined by the notes F, G, A, B-flat, C, D, and E; "D minor" (where "D" is the root and "minor" is the type), which is defined by the notes D, E, F, G, A, B-flat, and C; and "C major", which includes the notes C, D, E, F, G, A, and B.
The tempo track provides to the pitch selector the aforementioned time grid which the pitch selector uses to select a pitch from one of two or more classes of pitches. The pitch selector makes this selection between or among classes based, in part, on the current metric position. For instance, the two classes might be chord tones (i.e., notes in the current chord) and passing tones (i.e., notes in the current key or scale). For example, it is a general melodic principle that chord tones should normally be played on the beat (e.g., the down beat or other strong beats) and passing tones should normally be played off the beat or on weak beats. Given the current metric position with respect to the beat or measure, the pitch selector will select the most appropriate pitch class. Then, a particular pitch from that class is selected by the pitch selector based on the current harmony and the current pitch-related joystick position. An example is when the current chord is a C chord and the current key is "D minor", in which case a G note might be played on a strong beat and a B-flat note might be played off the beat or on a weak beat. It is noted that some notes may, and very often will, overlap between or among the plurality of classes such as in the previous example where the current chord is C (i.e., the chord tones are C, E, and G) and the key is "D minor" (i.e., the passing tones are D, E, F, G, A, B-flat, and C).
The tempo track also provides data to the rhythm generator. The rhythm generator gets the aforementioned time grid which the rhythm generator uses to synchronize the user-created melody or solo line with the background track.
RHYTHM BLOCKS:
The "rhythm blocks" alluded to above are now described in detail. Rhythm blocks are ~lln~AmPntal to the operation o~ the invention. Rhythm blocks are utilized by the "rhythm generator" (described hereina~ter) to produce rhythmic signals when, ~or example, the user depresses the play button 50 on the joystick ( FIGS. 3A and 5). As alluded to above with rei~erence to FIG. 3B, rhythm blocks are organized by the system designer/programmer into a plurality o~ groupings where each grouping ranges ~rom a block with a lowest rhythmic activity ~or that group to a block with a highest rhythmic activity for that group.
Once the musical style is selected (by the user or by default), the associated group or list of rhythm blocks is copied into the memory 22 of the computer 14 from, for example, the hard drive 36. A given style of music might cause a set of rhythm blocks to be copied into memory that range from a whole note at the lowest activity level block to a sixty-fourth note at the highest activity level block. In such a case, and if the joystick of FIG. 3B is used as the interface, pulling the handle of the joystick 12 all the way backward and holding it there would result in a series of whole notes being output by the rhythm generator and played by the system, holding the handle all the way forward would cause a series of sixty-fourth notes to be output, and moving the handle to a position somewhere therebetween would result in the output of a series of notes having a rhythmic activity level somewhere between whole notes and sixty-fourth notes, such as eighth notes.
Also, and perhaps more importantly, as the user moves the handle back and forth, the rhythmic output is varied accordingly, and the user is thus able to, for example, follow a half note with a sixteenth note; the user generally is able to create a rhythmic output of any variety or combination.
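The forward/backward axis mapping just described can be sketched as an index into an activity-ordered group. This is a hypothetical illustration; the group contents (whole note through sixty-fourth note) mirror the example in the text, but the function name and axis convention are assumptions.

```python
# Sketch: the forward/backward joystick axis selects a rhythm block from a
# group ordered by increasing rhythmic activity.

RHYTHM_GROUP = ["whole", "half", "quarter", "eighth", "sixteenth",
                "thirty-second", "sixty-fourth"]   # least -> most active

def block_for_axis(axis, group=RHYTHM_GROUP):
    """Axis -1.0 (fully back) .. +1.0 (fully forward) -> rhythm block name."""
    fraction = (axis + 1.0) / 2.0
    index = min(int(fraction * len(group)), len(group) - 1)
    return group[index]

print(block_for_axis(-1.0))  # fully back -> whole notes
print(block_for_axis(1.0))   # fully forward -> sixty-fourth notes
print(block_for_axis(0.0))   # intermediate position -> moderate activity
```

Because the selection is re-evaluated continuously as the handle moves, a half note can be followed immediately by a sixteenth note, as the text notes.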
A rhythm block can be thought of as a data structure that has five fields: (i) identifier (a name and an identification number); (ii) length; (iii) event list; (iv) dynamics list; and (v) durations list. A rhythm block does not need to have a value in each of these five fields, but every rhythm block typically will have at least an identifier, a length, and an event list. In the current embodiment, all five fields are used. The name component of the identifier indicates the type of note(s) in the rhythm block. The length of a rhythm block typically is one beat, but in general the length can be more than one beat such as 1.5 beats or two beats. The system designer/programmer has set one beat to equal 480 "ticks" of a scheduler. The preferred scheduler is OMS 2.0 which is available from Opcode Systems of Palo Alto, CA, although another scheduler could be used such as Apple's QuickTime product. The event list specifies the precise times (in units of ticks) within a beat when the rhythm is to play. In the disclosed embodiment, the dynamics (i.e., volume, loudness, accent, which is called "velocity" in MIDI terminology) are measured or specified on a scale from 0 to 127 where 0 is silent and 127 is maximum velocity. The dynamics list specifies the loudness of each of the notes in the rhythm block. The durations list of the rhythm block sets how long the note(s) should last in units of ticks.
Referring to FIG. 6A, one possible rhythm block defines two eighth notes. The block has a length of one beat and an event list with the values 0 and 240, which means that the first eighth note will sound at the beginning of the beat and the second one will sound exactly halfway through the beat (240/480 = 1/2). The dynamics list has values of 84 and 84, implying mezzo forte loudness for each note. The durations list has values of 240 and 240, implying legato articulation for each eighth note. In other words, the first eighth note will last until the second one plays (i.e., for ticks 0 through 239), and the second eighth note will last until the end of the beat (i.e., for ticks 240 to 479). The repeat notation in the "musical equivalent" section of this example indicates that the rhythm generator will continue to output this same rhythm block unless the user moves the position of the handle of the joystick 12. The same is true for all rhythm blocks; once the user has depressed the play button 50 on the joystick, the only way the rhythm generator will stop outputting the appropriate rhythm blocks is if the play button 50 is released.
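The five-field structure and the FIG. 6A example can be expressed together as a small data class. The numeric values (480 ticks per beat, events at 0 and 240, dynamics 84, durations 240) come from the text; the class name, field names, and the ID value are assumptions for illustration.

```python
from dataclasses import dataclass

# Sketch of the five-field rhythm block structure, populated with the
# FIG. 6A example of two legato eighth notes.

TICKS_PER_BEAT = 480

@dataclass
class RhythmBlock:
    name: str          # name component of the identifier
    block_id: int      # identification number component
    length_ticks: int  # one beat = 480 ticks (can be 1.5 or 2 beats)
    events: list       # onset times within the block, in ticks
    dynamics: list     # MIDI velocities, 0 (silent) .. 127 (maximum)
    durations: list    # how long each note lasts, in ticks

two_eighths = RhythmBlock(
    name="two eighth notes",
    block_id=1,                    # assumed ID
    length_ticks=TICKS_PER_BEAT,
    events=[0, 240],               # start of beat, and halfway through it
    dynamics=[84, 84],             # mezzo forte for each note
    durations=[240, 240],          # legato: each note lasts to the next onset
)

# The second eighth note sounds exactly halfway through the beat.
print(two_eighths.events[1] / two_eighths.length_ticks)
```

The FIG. 6B-6D blocks differ only in these field values (e.g., events [120, 360] for the syncopated sixteenths, or a 720-tick length for the cross-rhythm).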
Referring to FIG. 6B, another example of a rhythm block is two syncopated sixteenth notes. This block has a length of one beat, and an event list with the values 120 and 360, which means that the first sixteenth note will sound one-quarter of the way through the beat (120/480 = 1/4) and the second sixteenth note will play at three-quarters of the way through the beat (360/480 = 3/4). The dynamics are as in the previous example. The durations list has values of 120 and 120, implying detached articulation.
Referring to FIG. 6C, a third example of a rhythm block is a dotted eighth note cross-rhythm. In this example, the length is not one beat but instead 1.5 beats (i.e., 720 ticks). The event list has the value zero which means that the dotted eighth note will play at the beginning of the block. The dynamics and duration are as indicated in the figure.

Referring to FIG. 6D, the final example shows two eighth notes with an offbeat accent. The length is one beat or 480 ticks, and the event list values of 0 and 240 will cause the first eighth note to play at the beginning of the beat and the second one to play in the middle of the beat, as in FIG. 6A. The dynamic values of 72 and 96 will cause the second note to sound accented. The duration values of 120 and 240 will further distinguish the two notes.
Once the system designer/programmer has defined all desired rhythm blocks, he assembles a plurality of groups or lists using the rhythm blocks as the items in the list. As described previously, each grouping contains two or more rhythm blocks organized in order of increasing rhythmic activity. The rhythm blocks and the groupings of them are essentially transparent to the user. The musical style that is selected by the user or by default (FIGS. 4A-4C) determines the group(s) of rhythm blocks that will be available to the user.
Referring to FIG. 7A, one example of a style and its associated rhythm block data is the slow rock musical style. Associated with this style are four separate groupings of rhythm blocks, each one having its rhythm blocks ordered in increasing rhythmic activity. The four groupings of rhythm blocks are titled "Normal", "Syncopated", "Alternate 1", and "Alternate 2". A user can be allowed to switch among these four groups by, for example, operating a button on his joystick. Like the example in FIG. 3B, in this example, with the handle of the joystick in the leftmost position, the rhythm block at the top of the appropriate list is selected, and with the handle in the rightmost position, the rhythm block at the bottom of the appropriate list is selected. This example of a musical style shows other data or variables that can be determined by the style configuration 80 (FIGS. 4B and 4C), and these are "swing" and "half-shuffle" parameters. In the slow rock style example, the swing is set to 0% and the half-shuffle also is set to 0%. Swing and half-shuffle are defined below.
The "swing" parameter is a measure o~ how much the o~beat (or upbeat) eighth note should be delayed. The delay range is 0 to 80 ticks where 0~ corresponds to 0 ticks and 100~ corresponds to 80 ticks. Thus, a swing o~
50~ means to delay the o~beat eighth notes by 40 ticks.
Swing is a well-known term used by musicians and composers t-o indicate the of~beat eighth note delay described above.
The "hal~-shu~le" parameter is a measure o~ how much the upbeat sixteenth notes (occurring at ticks 120 and 360 within the beat) should be delayed. The delay range is 0 to 40 ticks where 0~ corresponds to 0 ticks and 100~
corresponds to 40 ticks. Thus, a hal~-shu~le o~ 50~ means to delay the o~beat sixteenth notes by 20 ticks. Hal~-shu~le is a well-known term used by musicians and composers to indicate the upbeat sixteenth note delay described above.
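The swing and half-shuffle arithmetic above is simple to sketch. The tick ranges (0-80 and 0-40) and the affected event positions (offbeat eighth at tick 240; upbeat sixteenths at ticks 120 and 360) come from the text; the function names and the combined `apply_feel` helper are illustrative assumptions.

```python
# Sketch of the swing and half-shuffle delay computations described above.

def swing_delay_ticks(swing_percent):
    """Delay applied to offbeat eighth notes: 0% -> 0 ticks, 100% -> 80."""
    return round(swing_percent / 100.0 * 80)

def half_shuffle_delay_ticks(shuffle_percent):
    """Delay applied to upbeat sixteenth notes: 0% -> 0 ticks, 100% -> 40."""
    return round(shuffle_percent / 100.0 * 40)

def apply_feel(event_ticks, swing_percent=0, shuffle_percent=0):
    """Shift a rhythm block's event list by the style's swing/shuffle feel."""
    out = []
    for t in event_ticks:
        if t == 240:                      # offbeat eighth note
            t += swing_delay_ticks(swing_percent)
        elif t in (120, 360):             # upbeat sixteenth notes
            t += half_shuffle_delay_ticks(shuffle_percent)
        out.append(t)
    return out

print(swing_delay_ticks(50))                   # 50% swing -> 40 ticks
print(apply_feel([0, 240], swing_percent=50))  # the second eighth is delayed
```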
Referring to FIG. 7B, another example of a style and its associated rhythm block data is the fast blues musical style. Associated with this style are three separate groupings of rhythm blocks, each one having its rhythm blocks ordered in increasing rhythmic activity. The three groupings of rhythm blocks are titled "Normal+Syncopated", "Alternate 1", and "Alternate 2". A user can be allowed to switch among these three groups by, for example, operating a button on his joystick. Like the style example in FIG. 7A, in this example, with the handle of the joystick in the leftmost position, the rhythm block at the top of one of the three lists is selected, and with the handle in the rightmost position, the rhythm block at the bottom is selected. The swing parameter for this example style is set to 50%, which means that all offbeat eighth notes will be delayed by 40 ticks. As with the previous style example, the half-shuffle parameter is set to 0%, which means no delay of the upbeat sixteenth notes.
RHYTHM GENERATOR:
The "rhythm generator" that outputs the above-described rhythm blocks is now described in detail. The rhythm generator allows the user to produce "musically correct" rhythms without requiring the user to have the physical dexterity needed to play those rhythms on a traditional or known instrument. The user can enable and disable the rhythm generator with the play button on the joystick. This button causes the music to start and stop, and thus it can be used by the user to simulate the way an improvising musician starts and stops musical phrases during a solo. The user can use a combination o~ buttons and continuous controllers (e.g., the axes o~ a joystick handle, ~aders, sliders, etc.) on his inter~ace to control the activity and complexity o~ the generated rhythms.
Referring to FIGS. 8 and 10, the rhythm generator 100 selects a rhythm block (from the group of rhythm blocks provided by the style configuration 80, FIGS. 4B and 4C) in response to every rhythm-related input signal from the joystick or other similarly simple-to-operate interface 12 (steps 202 and 204). Once the rhythm generator 100 selects a rhythm block, it transmits messages to a note builder functional block 102, the riffer 104, and the accenter 106.
To the note builder 102, the rhythm generator 100 sends a "play note" instruction at the correct times as defined by the rhythm block itself (step 206). A "play note" instruction includes all of the information defined by the rhythm block, specifically the name of the block, its length, and its event list as well as either specified or default dynamics and duration information. If the rhythmic activity is sufficiently high, it can be difficult or impossible for the user to manipulate the input device (e.g., joystick) fast enough to avoid rapid repetition of the same pitch. To remedy this situation, when the rhythmic activity gets sufficiently high, the rhythm generator 100 sends an instruction to enable the riffer 104. Once enabled, the riffer 104 disables the rhythm generator 100, and the riffer 104 then automatically outputs pre-stored melodic elaborations (e.g., arpeggios). When the rhythmic activity becomes sufficiently low again, the riffer 104 will return control to the rhythm generator 100. The riffer 104 is described in more detail hereinafter under the "riffer" heading.
The information transmitted by the rhythm generator 100 to the accenter 106 is the identification number for the current rhythm block. The accenter 106 uses that ID number to add accent patterns, as described hereinafter under the accenter heading.

PITCH SELECTOR:
Referring to FIGS. 9 and 10, the "pitch selector" 108 ensures that the pitches of the notes generated by the user are "musically correct". In response to every pitch-related input signal from the joystick or other similarly simple-to-operate interface 12, the pitch selector 108 selects a pitch for playback (steps 208 and 210). The pitch selector selects an appropriate pitch as a function of the pitch-related input signals from the joystick, the current key and chord of the accompaniment (provided by the harmony track 90 part of the background track 78, FIG. 4C), the current metric position (provided by the tempo track 92 part of the background track 78), and information about previous pitches played. See steps 218, 210, 208, 212, 216, and 214 of FIG. 9. Note that the metric position is an indication of the current position in, for example, a beat (e.g., on the beat or off the beat) or a measure (e.g., strong beat or weak beat), and it generally is independent of the harmony associated with that same point in time. Once a pitch has been selected, the pitch selector sends the selected pitch to the note builder 102 to be used in the next note that is played (step 220).
As described previously, the pitch selector 108 selects an appropriate pitch from one of a plurality of classes of pitches. The pitch selector 108 makes this selection between or among classes based on the factors disclosed in the preceding paragraph. As an example, there might be three classes where one is a collection of chord tones (i.e., notes in the current chord), another is a collection of passing tones (i.e., notes in the current key or scale), and another is a collection of chromatic tones. For example, a general melodic principle is that chord tones should normally be played on the beat (e.g., the down beat or other strong beats) and passing tones should normally be played off the beat or on weak beats. Given the current metric position with respect to the beat or measure, the pitch selector will select the most appropriate pitch class. Then, a particular pitch from that class is selected by the pitch selector based on the current harmony and the current pitch-related joystick position. An example is when the current chord is a C chord and the current key is "D minor", in which case a G note might be played on a strong beat and a B-flat note might be played off the beat or on a weak beat. It is noted that some notes may, and very often will, overlap between or among the plurality of classes such as in the previous example where the current chord is C (i.e., the chord tones are C, E, and G) and the key is "D minor" (i.e., the passing tones are D, E, F, G, A, B-flat, and C).
When selecting a pitch class, the pitch selector also utilizes historical information about the melody. The pitch selector utilizes information such as the pitch classes of the preceding notes, the actual pitches of the preceding notes, and other melodic features of the preceding notes such as melodic direction. For example, a general melodic principle is that if a melody leaps to a non-chord tone, the melody should then step in the opposite direction to the nearest chord tone.
Once the pitch class is determined by the pitch selector 108, the pitch selector 108 then utilizes the pitch-related input signals to select a particular pitch from within that class. In general, the pitch-related input signal corresponds directly to either: (i) pitch register (i.e., how high or low the pitch of the note should be); or (ii) change in pitch register (i.e., whether the next pitch should be higher or lower than the preceding pitch and by how much).
INTERFACE PROCESSOR:
Referring to FIG. 10, the "interface processor" functional block 110 is responsible for channeling or "mapping" the signals from the input device 12 (e.g., joystick) to the correct system functional blocks. There are many ways that the interface processor 110 could have been configured. In the disclosed embodiment, the interface processor 110 is configured to transmit messages to the rhythm generator 100, the pitch selector 108, the sustainer 112, the riffer 104, a licker 114, the timbre manager 116, the envelope manager 118, the chord builder 120, an articulator 122, and the accenter 106.
To the rhythm generator 100, the interface processor 110 sends the position of the play button 50 on the joystick 12 which enables/disables the rhythm generator 100. Also sent is the position of the joystick handle along the forward/back axis, or whatever axis is used to increase/decrease rhythmic activity. The interface processor 110 also sends to the rhythm generator 100 the position of the other buttons on the joystick which can be used to change rhythm blocks for rhythmic special effects such as cross-rhythms, poly-rhythms, and syncopation.
To the pitch selector 108, the interface processor 110 sends the position of the joystick's handle along the left-right axis, or whatever axis is used to raise/lower the pitch of the notes.
To the sustainer 112, the interface processor 110 sends the position of the sustain button 52 on the joystick 12 which enables/disables the sustainer 112.

To the riffer 104, the interface processor 110 sends the position of the various riff buttons which enable/disable the riffer's functions, and it sends information about the release of the sustain button 52 and the simultaneous position of the joystick handle along the left-right axis to trigger the riffer 104.
To the licker 114, the interface processor 110 sends the position of the lick repeat button 54 which enables/disables the licker 114, and it sends information about when the lick repeat button 54 is held depressed and the coincident position of the joystick handle along the left/right axis to move the lick up and down in register on each repeat.
To the timbre manager 116, the interface processor 110 sends the position of the various timbre buttons which enable/disable various functions of the timbre manager 116, and it sends information about when the sustain button 52 is held depressed and the coincident position of the joystick handle along the forward/backward axis to control continuous blending of multiple timbres.
To the envelope manager 118, the interface processor 110 sends the position of the various envelope buttons which enable/disable various functions of the envelope manager 118, and it sends information about when the sustain button 52 is held depressed and the coincident position of the joystick handle along the left/right axis to control pitch bending.
To the chord builder 120, the interface processor 110 sends the position of various chord buttons which enable/disable various functions of the chord builder 120.

To the articulator 122, the interface processor 110 sends the position of various articulation buttons which enable/disable various functions of the articulator 122.
To the accenter 106, the interface processor 110 sends the position of various accenter buttons which enable/disable various functions of the accenter 106.
The input device 12 typically will not include all of these buttons, although it may. FIGS. 3A and 3B show only three buttons, but there can be a variety of other buttons provided on, for example, the base of the joystick (or they can be the keys of the computer keyboard or the keys of a MIDI keyboard).
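The signal routing described in this section can be sketched as a simple dispatch table. The event names, message fields, and the notion of returning a (target, message) pair are illustrative assumptions for this sketch, not part of the disclosure; the actual interface processor routes many more signals.

```python
def route_event(control, value):
    """Map one raw input-device signal to a (target functional block, message)
    pair, in the spirit of the interface processor's "mapping" role.
    Control names and message fields are hypothetical."""
    routing = {
        "play_button":     ("rhythm_generator", {"enabled": bool(value)}),
        "fwd_back_axis":   ("rhythm_generator", {"activity": value}),
        "left_right_axis": ("pitch_selector",   {"register": value}),
        "sustain_button":  ("sustainer",        {"enabled": bool(value)}),
        "lick_button":     ("licker",           {"enabled": bool(value)}),
    }
    # Unrecognized controls fall through unmapped rather than raising.
    return routing.get(control, ("unmapped", {"value": value}))
```

A table-driven mapping like this makes it easy to reconfigure which input device controls drive which functional blocks, matching the text's observation that the interface processor "could have been configured" in many ways.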

GESTURE ANALYZER:
Referring still to FIG. 10, instead of allowing the user only to provide input to the system by pressing a button or moving a continuous controller on the input device 12 (e.g., joystick), the gesture analyzer 124 can be used to allow the user to trigger specific system functions with "gestures". For example, the system can be configured to recognize "wiggling the joystick wildly" as a trigger for some special rhythmic effect. The gesture analyzer 124 is responsible for analyzing the user's manipulation of the interface and determining whether or not the user is, for example, currently "wiggling the joystick wildly", which would mean the gesture analyzer 124 should send the appropriate signal to the interface processor 110 in order to enable the desired rhythmic effect.
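One minimal way to implement such a "wiggling" detector, assuming sampled axis positions are available, is to count direction reversals over a short window; the threshold and window are assumptions for this sketch, not values from the disclosure.

```python
def is_wiggling(positions, reversal_threshold=6):
    """Crude gesture detection: count sign reversals of the axis velocity
    over a short window of sampled handle positions. Many reversals in a
    short window suggests the user is "wiggling the joystick wildly"."""
    reversals = 0
    prev_delta = 0
    for a, b in zip(positions, positions[1:]):
        delta = b - a
        if delta * prev_delta < 0:  # sign flip: the handle changed direction
            reversals += 1
        if delta != 0:
            prev_delta = delta
    return reversals >= reversal_threshold
```

A production gesture analyzer would also consider amplitude and timing, but counting reversals already separates rapid back-and-forth motion from ordinary sweeps.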

SUSTAINER:
The sustainer 112 allows the user to sustain a played note for an indefinite duration. When the sustainer 112 is enabled, it sends an instruction to the rhythm generator 100. This instruction tells the rhythm generator 100 to interrupt its normal stream of "play note" messages and sustain the next played note until further notice. When the sustainer 112 is disabled, it sends an instruction to the rhythm generator 100 to silence the sustaining note and then resume normal note generation.
RIFFER:
The riffer 104 is used to play back "riffs", which are pre-stored data structures that each contain: (i) a time-indexed list of "play note" events; and (ii) a list specifying a melodic direction offset (up or down, and by how much) for each of those "play note" events. This data structure enables the riffer 104 to automatically perform musical "riffs" for the purpose of melodic automation.
Some examples of pre-stored riffs are: grace notes, mordents, trills, tremolos, and glissandi. Another use for riffs is to add melodic contours (e.g., arpeggios) when the rhythmic activity gets so high that it would be difficult for the user to add plausible melodic contours manually.
When enabled, the riffer 104 transmits messages to the rhythm generator 100 and the note builder 102.
To the rhythm generator 100, the riffer 104 sends an instruction to stop generating rhythms when the rhythmic information for the note builder 102 starts being supplied by the riffer 104.
To the note builder 102, the riffer 104 sends an instruction to play a note (or chord) at the correct times as determined by the current riff. This "play note" instruction is also accompanied by a melodic offset, duration, and loudness (i.e., MIDI "velocity") as specified by the current rhythm block.
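The two-list riff structure described above can be sketched as follows; the dictionary layout, field names, and the sample trill values are assumptions for illustration only.

```python
# Hypothetical riff following the description: a time-indexed list of
# "play note" events paired with a per-event melodic direction offset.
TRILL = {
    "times":   [0.0, 0.125, 0.25, 0.375],  # beats, relative to the trigger
    "offsets": [0, 1, 0, 1],               # melodic offset per event
}

def expand_riff(riff, base_pitch, start_time):
    """Turn a stored riff into concrete (time, pitch) play-note events,
    as the riffer would feed them to the note builder."""
    return [(start_time + t, base_pitch + o)
            for t, o in zip(riff["times"], riff["offsets"])]
```

Because the offsets are relative, the same stored riff can ornament any base pitch the pitch selector produces.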

LICKER:
The licker 114 allows the user to "capture" pleasing melodic fragments from the immediate past and replay them in rapid succession. Licks are stored in the same data structure format as riffs. However, licks are not pre-stored. The user's solo or melody is recorded automatically in the memory 22 of the computer 14 in real time as it is created by the user. When the licker 114 is enabled (by the lick repeat button 54), it chooses a lick of random length from recent memory (usually a few beats or less) and saves the lick into the riff data format. The licker 114 then passes that lick to the riffer 104 along with an instruction to enable the riffer 104. The licker 114 then resumes recording the generated music.
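A minimal sketch of the capture step, assuming the recorded history is a list of (time, pitch) pairs and reusing the hypothetical riff format of relative times and offsets (both assumptions, not the disclosed data layout):

```python
import random

def capture_lick(history, max_events=8, rng=None):
    """Pick a random-length fragment from the tail of the recorded
    (time, pitch) history and re-express it in riff form: times relative
    to the fragment start, pitches as offsets from its first note."""
    rng = rng or random.Random()
    n = rng.randint(1, min(max_events, len(history)))
    fragment = history[-n:]
    t0, base = fragment[0]
    return {"times": [t - t0 for t, _ in fragment],
            "offsets": [p - base for _, p in fragment]}
```

Storing the lick as relative times and offsets is what lets the riffer transpose it "up and down in register on each repeat", as described for the lick repeat button.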
TIMBRE MANAGER:
This functional block, the timbre manager 116, allows the user to affect the timbre of the current solo instrument. This is accomplished by sending the generated notes to multiple MIDI channels, each of which is using a different MIDI patch (timbre). The timbre manager 116 can then continually adjust the MIDI volume of these respective MIDI channels, thus changing the timbral "mix" of the output. Note that some MIDI tone generators also allow direct manipulation of timbre by controlling synthesis parameters. The default MIDI patches for each instrument are provided in the instrument configuration 76.
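The continuous blending can be sketched as a cross-fade of MIDI channel volumes driven by the handle position; the two-channel linear cross-fade below is an assumption for illustration, not the disclosed blending curve.

```python
def timbre_mix(axis_position):
    """Map a handle position in [-1, 1] to MIDI volume values (0-127) for
    two channels carrying different patches, cross-fading between them so
    moving the handle shifts the timbral "mix"."""
    x = (axis_position + 1.0) / 2.0  # normalize to 0..1
    return [round(127 * (1.0 - x)), round(127 * x)]
```

At the extremes only one patch sounds; in between both contribute, which is the "continuous blending of multiple timbres" the interface processor routes the forward/backward axis to.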
ENVELOPE MANAGER:
The envelope manager 118 allows the user to modulate the pitch and loudness of sounding notes to achieve multiple effects such as pitch bends or crescendi. When enabled, the envelope manager 118 uses pitch bend and loudness (i.e., MIDI "velocity") envelopes to alter the playback of notes. These envelopes are either pre-stored (in which case they are provided in the instrument configuration 76) or controlled in real time by signals from the input device 12. The envelope manager 118 also automatically adds minute random fluctuations in pitch to some instruments (specifically string and wind instruments) so as to mimic human performance imperfections.
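A pitch envelope with the "minute random fluctuations" idea could be sketched as a vibrato-like curve plus bounded jitter; the sine shape and every constant here are assumptions, not the disclosed envelopes.

```python
import math
import random

def humanized_bend(t, depth_cents=10.0, rate_hz=5.0, jitter_cents=3.0, rng=None):
    """Pitch deviation in cents at time t: a vibrato-like sine envelope
    plus a small random fluctuation mimicking human performance
    imperfection (as described for string and wind instruments)."""
    rng = rng or random.Random()
    vibrato = depth_cents * math.sin(2 * math.pi * rate_hz * t)
    return vibrato + rng.uniform(-jitter_cents, jitter_cents)
```

Sampling this function over a note's duration and sending the values as MIDI pitch-bend messages would produce the slightly imperfect, human-sounding intonation the text describes.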
CHORD BUILDER:
The chord builder 120 directs the note builder 102 when to perform chords instead of single notes. When enabled, the chord builder 120 sends a message to the note builder 102 telling it: (i) how many chord notes to play in addition to the main melody note just created; (ii) how close together (in pitch) those chord notes should be; and (iii) whether those chord notes should be above or below the main melody note. This information is provided to the chord builder 120 in the instrument configuration 76.
ARTICULATOR:
The articulator 122 allows the user to add articulation effects to the generated notes. Articulation is defined as the way in which individual notes are attacked and how much rest space is left between sequential notes. For example, if the "staccato" function of the articulator is enabled, it will send an instruction to the note builder 102 to shorten the duration of the next generated note. If the "slur" function of the articulator is enabled, it will send an instruction to the note builder 102 to lengthen the duration of the next generated note, and it will tell the note builder 102 to enable MIDI Porta Mode (with Porta Time = 0), in which case the attack portion of the timbre envelopes of new notes will not be re-articulated. This is analogous to slurring notes on a traditional instrument.
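The duration adjustment part of this behavior can be sketched directly; the scaling factors are assumptions for illustration (the disclosure names the effects but not the amounts).

```python
def articulate(duration, style):
    """Adjust a note's duration for an articulation style: 'staccato'
    shortens the note, leaving rest space before the next one; 'slur'
    lengthens it so it runs into the next note."""
    if style == "staccato":
        return duration * 0.5
    if style == "slur":
        return duration * 1.1
    return duration
```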

ACCENTER:
The accenter 106 allows the user to add accent patterns to the generated notes. The accenter 106 has knowledge (from the style configuration 80) of all of the available rhythm blocks. The accenter 106 also has knowledge (from the rhythm generator 100) of which of those rhythm blocks are currently being used. When enabled, the accenter 106 uses this information to choose a complementary rhythm block for use as an accenting template or an "accent block", in which a certain note or notes have a higher loudness value than the loudness of the corresponding note(s) in the rhythm block from the rhythm generator 100. At the times of the "play note" events defined by that accent block, the accenter sends messages to the note builder 102 instructing it to add a specified accent to any notes generated at that time.
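Applying an accent block can be sketched as boosting the velocity of notes whose start times coincide with the block's "play note" times; the tuple layout and boost amount are assumptions for this sketch.

```python
def apply_accents(notes, accent_times, boost=20):
    """Raise the loudness (MIDI velocity) of any (time, pitch, velocity)
    note whose start time matches a "play note" time in the accent block,
    clamping to the MIDI maximum of 127."""
    accented = set(accent_times)
    return [(t, p, min(127, v + boost)) if t in accented else (t, p, v)
            for t, p, v in notes]
```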

NOTE BUILDER:
The note builder 102 combines all of the performance information from all of the other enabled functional blocks. As an example, the note builder 102 can integrate a "play note" instruction from the rhythm generator 100, a pitch from the pitch selector 108, a timbre adjustment from the timbre manager 116, a pitch bend value from the envelope manager 118, a duration value from the articulator 122, and a loudness (i.e., MIDI "velocity") value from the accenter 106. These data come into the note builder 102 and are integrated thereby and then sent out to the MIDI output device 16. Other outputs from the note builder 102 go to the pitch selector 108 and the licker 114.
When instructed by the chord builder 120 to play a chord, the note builder 102 causes the pitch selector 108 to execute X number of additional times in order to produce X number of pitches until the desired chord has been constructed, where X and the desired chord are determined by the chordal parameters supplied by the chord builder 120.
To the memory buffer of the licker 114, the note builder 102 sends all of the note builder's output such that the licker 114 will always have a record of what has been played and will be able to perform its function (which is described hereinabove) when called upon by the user to do so.
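The note builder's integration step can be sketched as merging each block's contribution into one output event; the field names and defaults here are assumptions, not the disclosed message format.

```python
def build_note(play_time, pitch, *, duration=0.5, velocity=96,
               accent=0, bend=0, channel=0):
    """Merge contributions from the enabled functional blocks (rhythm
    generator timing, pitch selector pitch, articulator duration,
    accenter loudness, envelope manager bend, timbre manager channel)
    into one output event, clamping velocity to the MIDI maximum."""
    return {"time": play_time, "channel": channel, "pitch": pitch,
            "velocity": min(127, velocity + accent),
            "duration": duration, "bend": bend}
```

Each keyword argument stands in for one functional block's message, so a disabled block simply leaves its default in place.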
MIDI OUTPUT DEVICE:
This block 16 is the actual sound generating hardware (or, in some cases, software as mentioned previously) that "renders" the MIDI output stream from the note builder 102, meaning it translates the MIDI output stream into an audio signal which may then be amplified and broadcast.
MIDI RECORDER:
The MIDI output stream from the note builder 102 also can be recorded. For example, the MIDI output stream can be sent to the hard drive 36 of the computer 14 and stored thereon. This allows a user to save his performance and easily access (e.g., listen to) it at any time in the future.
AUTOMATOR:
The system of the invention thus clearly provides the user with a large number of control functionalities. Given enough buttons and faders, a user could independently control rhythm, pitch, sustain, riffs and licks, timbre, pitch envelopes, chords, articulation, and accents. However, such a great degree of control would be overwhelming for most users.
The purpose of the automator 130 is to act like a user's assistant and to automatically control many of these system functions, thereby allowing the user to concentrate on just a few of them. The automator 130 is told which system functions to control by the skill level configuration 72.
In FIG. 10, the automator 130 is a different shape than all of the other blocks to indicate that it receives information from every block in FIG. 10 (even though all of the lines are not shown). The automator 130 has access to all of the information in the entire system, and it uses this information to decide when to enable various system functions.
As an example, the automator 130 can regularly or occasionally send pre-stored pitch-related input signals to the pitch selector 108. This might be done, for example, if the user has identified himself as having a very low skill level (i.e., a beginner) to the skill level configuration 72.
As another example, the automator 130 can regularly or occasionally send pre-stored rhythm-related input signals to the rhythm generator 100. Again, this might be done, for example, if the user has identified himself as having a very low skill level (i.e., a beginner) to the skill level configuration 72.
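The skill-level-driven delegation described above can be sketched as a lookup from skill level to the set of functional blocks the automator takes over; the tier contents are assumptions standing in for what the skill level configuration 72 would supply.

```python
def blocks_to_automate(skill_level):
    """Return the set of functional blocks the automator drives on the
    user's behalf, by skill level (0 = beginner). The tiers shown are
    hypothetical: beginners have nearly everything automated, experts
    control everything directly."""
    tiers = [
        {"pitch", "rhythm", "timbre", "envelope", "chords", "accents"},
        {"timbre", "envelope", "chords", "accents"},
        {"accents"},
        set(),  # expert: nothing automated
    ]
    return tiers[min(skill_level, len(tiers) - 1)]
```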

Another example is where the automator randomly or algorithmically enables one or more functional blocks (e.g., the timbre manager 116, the envelope manager 118, the chord builder 120, the articulator 122, the accenter 106, and/or the riffer 104) in order to automatically add complexity to the user's solo line.
One component of this complexity is instrument-specific performance parameters such as pitch bends and timbre substitutions (e.g., guitar harmonics). Another component of this complexity is automatic ornamentation of the score by the addition of effects such as grace notes, tremolos, glissandi, mordents, etc.
In general, the automator is an electronic system for processing a musical score to modify automatically the score by adding instrument-specific performance parameters or musical ornamentation. The musical score is represented by digital data such as MIDI data. The score can be the score that is created in real-time by the system according to the invention, or it can be a score which has been created in the past and stored or recorded on, for example, a computer hard disk drive or other computer-readable data storage medium.
Variations, modifications, and other implementations of what is described herein will occur to those of ordinary skill in the art without departing from the spirit and the scope of the invention as claimed. Accordingly, the invention is to be defined not by the preceding illustrative description but instead by the following claims.
What is claimed is:

Claims

1. A system for creating music in real time, comprising:
an input mechanism which provides rhythm-related input signals and pitch-related input signals;
one or more computer storage media having a plurality of selectable musical accompaniment tracks over which music can be created and played in real time, having a plurality of rhythm blocks wherein each rhythm block defines, for at least one note, at least a time at which the note should be played, and for storing at least a portion of the music created in real time over a predetermined time interval in the past;
a rhythm generator which in real time: receives the rhythm-related input signals from the input mechanism;
selects one of the rhythm blocks from the computer storage media based on the rhythm-related input signals; and outputs an instruction to play the note at the time defined by the selected rhythm block;
a pitch selector which in real time: receives the pitch-related input signals from the input mechanism;
selects an appropriate pitch based on the pitch-related input signals, a selected one of the musical accompaniment tracks, and the stored music; and outputs the appropriate pitch; and a sound generator which in real time: receives instructions from the rhythm generator, pitches from the pitch selector, and the selected musical accompaniment track; and generates an audio signal representative of at least the created music.

2. The system of claim 1 wherein the input mechanism comprises a joystick having a base and a movable handle which a user manipulates to generate the rhythm-related and pitch-related input signals.
3. The system of claim 2 wherein the joystick generates the rhythm-related input signals in response to the user's manipulations of the handle along a first predetermined axis, and the joystick generates the pitch-related input signals in response to the user's manipulations of the handle along a second predetermined axis.
4. The system of claim 1:
wherein the input mechanism also provides musical effect-related input signals;
further comprising a musical effect generator which receives the musical effect-related input signals from the input mechanism and which alters the instructions from the rhythm generator or the pitches from the pitch selector based on the musical effect-related input signals; and wherein the sound generator receives the altered instructions and pitches and generates the audio signal which is representative of at least the created music with musical effects.
5. The system of claim 4 wherein the input mechanism comprises a joystick having a base, a movable handle, and at least one button.
6. The system of claim 5 wherein the joystick generates the rhythm-related input signals in response to a user's manipulations of the handle along a first predetermined axis, the joystick generates the pitch-related input signals in response to the user's manipulations of the handle along a second predetermined axis, and the joystick generates the musical effect-related input signals in response to the user's manipulations of the button.
7. The system of claim 1 wherein the rhythm generator outputs the instruction to play the note at the time defined by the selected rhythm block and for a default duration and at a default loudness.
8. The system of claim 1 wherein:
at least one of the plurality of rhythm blocks in the one or more computer storage media further defines, for at least one note, how long the note should last when played and how loud the note should be played; and the instruction output by the rhythm generator indicates the time at which the note should be played and the note's length and loudness as defined by the rhythm block.
9. The system of claim 1 further including at least one speaker for broadcasting the audio signal.
10. The system of claim 9 further comprising an amplifier for amplifying the audio signal before it is provided to the speaker for broadcast.
11. The system of claim 10 wherein a programmed computer performs the functions of the rhythm generator and the pitch selector, and the amplifier and the speaker are internal to the computer.
12. The system of claim 11 wherein the programmed computer also performs the functions of the sound generator.

13. The system of claim 1 wherein the sound generator comprises a MIDI tone generator.
14. The system of claim 1 wherein a programmed computer performs the functions of the rhythm generator and the pitch selector.
15. The system of claim 14 wherein the programmed computer also performs the functions of the sound generator.
16. The system of claim 1 wherein the one or more computer storage media comprise computer memory.
17. The system of claim 1 wherein the one or more computer storage media comprise computer memory and a computer hard disk drive, the hard disk drive having the plurality of selectable musical accompaniment tracks over which music can be created and played.
18. The system of claim 1 wherein the one or more computer storage media comprise computer memory and a CD-ROM having the plurality of selectable musical accompaniment tracks over which the music can be created and played.

21. A system for creating music in real time, comprising:
an input device including at least a first axis of manipulation and a second axis of manipulation, the input device for providing rhythm-related signals in response to a user's manipulation of the input device with respect to the first axis and for providing pitch-related signals in response to the user's manipulation of the input device with respect to the second axis; and a real-time music generator for receiving the rhythm-related and pitch-related signals from the input device and creating in real time music comprising (i) pitches based on the pitch-related signals and (ii) rhythmic activity based on the rhythm-related signals.

22. The system of claim 21 wherein the input device comprises a joystick having a movable handle which the user manipulates along the first axis to cause the joystick to provide the rhythm-related signals and along the second axis to cause the joystick to provide the pitch-related signals.

23. The system of claim 21 wherein the input device has the first and second axes arranged perpendicular to each other.

24. The system of claim 21 wherein the input device comprises a continuous controller.

25. The system of claim 21 wherein the input device comprises a mouse.

26. The system of claim 21 wherein the input device comprises a trackball.

27. The system of claim 21 wherein the input device comprises a fader.

28. The system of claim 21 wherein the input device comprises a slider.

29. The system of claim 21 wherein the input device comprises a spatial sensor.

30. The system of claim 21 wherein the input device comprises a discrete controller.

31. The system of claim 21 wherein the input device comprises a game pad.

32. The system of claim 21 wherein the input device comprises a plurality of switches.

33. The system of claim 21 wherein the input device comprises a plurality of buttons.

34. A method for creating music in real time, comprising:
(A) receiving rhythm-related signals provided in response to a user's manipulation of an input device with respect to a first axis of manipulation of the input device;
(B) receiving pitch-related signals provided in response to a user's manipulation of the input device with respect to a second axis of manipulation of the input device; and (C) creating in real time music comprising (i) pitches based on the received pitch-related signals and (ii) rhythmic activity based on the received rhythm-related signals.

35. The method of claim 34 wherein:
step (A) comprises receiving the rhythm-related signals which are provided in response to the user manipulating the input device along the first axis which is perpendicular to the second axis; and step (B) comprises receiving the pitch-related signals which are provided in response to the user manipulating the input device along the second axis which is perpendicular to the first axis.

36. The method of claim 34 wherein:
step (A) comprises receiving the rhythm-related signals which are provided in response to the user manipulating a handle of a joystick along the first axis; and step (B) comprises receiving the pitch-related signals which are provided in response to the user manipulating the handle of the joystick along the second axis.
CA002234419A 1995-10-16 1996-10-03 Real-time music creation system Abandoned CA2234419A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US08/543,768 US5627335A (en) 1995-10-16 1995-10-16 Real-time music creation system
US08/543,768 1995-10-16

Publications (1)

Publication Number Publication Date
CA2234419A1 true CA2234419A1 (en) 1997-04-24

Family

ID=24169491

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002234419A Abandoned CA2234419A1 (en) 1995-10-16 1996-10-03 Real-time music creation system

Country Status (9)

Country Link
US (2) US5627335A (en)
EP (1) EP0857343B1 (en)
JP (1) JPH11513811A (en)
KR (1) KR19990064283A (en)
AT (1) ATE188304T1 (en)
AU (1) AU7389796A (en)
CA (1) CA2234419A1 (en)
DE (1) DE69605939T2 (en)
WO (1) WO1997015043A1 (en)

Families Citing this family (92)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6362409B1 (en) 1998-12-02 2002-03-26 Imms, Inc. Customizable software-based digital wavetable synthesizer
US6011212A (en) * 1995-10-16 2000-01-04 Harmonix Music Systems, Inc. Real-time music creation
US5852800A (en) * 1995-10-20 1998-12-22 Liquid Audio, Inc. Method and apparatus for user controlled modulation and mixing of digitally stored compressed data
US7423213B2 (en) * 1996-07-10 2008-09-09 David Sitrick Multi-dimensional transformation systems and display communication architecture for compositions and derivations thereof
US7098392B2 (en) * 1996-07-10 2006-08-29 Sitrick David H Electronic image visualization system and communication methodologies
US7297856B2 (en) * 1996-07-10 2007-11-20 Sitrick David H System and methodology for coordinating musical communication and display
US7989689B2 (en) 1996-07-10 2011-08-02 Bassilic Technologies Llc Electronic music stand performer subsystems and music communication methodologies
US5990407A (en) * 1996-07-11 1999-11-23 Pg Music, Inc. Automatic improvisation system and method
JP3632411B2 (en) * 1997-09-24 2005-03-23 ヤマハ株式会社 Music signal generation method, music signal generation device, and medium recording program
US6121533A (en) * 1998-01-28 2000-09-19 Kay; Stephen Method and apparatus for generating random weighted musical choices
US6103964A (en) * 1998-01-28 2000-08-15 Kay; Stephen R. Method and apparatus for generating algorithmic musical effects
US6121532A (en) 1998-01-28 2000-09-19 Kay; Stephen R. Method and apparatus for creating a melodic repeated effect
US6051770A (en) * 1998-02-19 2000-04-18 Postmusic, Llc Method and apparatus for composing original musical works
JP3533974B2 (en) * 1998-11-25 2004-06-07 ヤマハ株式会社 Song data creation device and computer-readable recording medium recording song data creation program
US6087578A (en) * 1999-01-28 2000-07-11 Kay; Stephen R. Method and apparatus for generating and controlling automatic pitch bending effects
US6153821A (en) * 1999-02-02 2000-11-28 Microsoft Corporation Supporting arbitrary beat patterns in chord-based note sequence generation
JP3371132B2 (en) * 1999-08-25 2003-01-27 コナミ株式会社 GAME DEVICE, GAME DEVICE CONTROL METHOD, AND COMPUTER-READABLE STORAGE MEDIUM USED FOR THE GAME DEVICE
JP4211153B2 (en) * 1999-09-17 2009-01-21 ソニー株式会社 Recording apparatus and method
US6702677B1 (en) 1999-10-14 2004-03-09 Sony Computer Entertainment Inc. Entertainment system, entertainment apparatus, recording medium, and program
DE60023081D1 (en) 1999-10-14 2005-11-17 Sony Computer Entertainment Inc Entertainment system, entertainment device, recording medium and program
US7058462B1 (en) 1999-10-14 2006-06-06 Sony Computer Entertainment Inc. Entertainment system, entertainment apparatus, recording medium, and program
US7176372B2 (en) * 1999-10-19 2007-02-13 Medialab Solutions Llc Interactive digital music recorder and player
US9818386B2 (en) 1999-10-19 2017-11-14 Medialab Solutions Corp. Interactive digital music recorder and player
US6392133B1 (en) 2000-10-17 2002-05-21 Dbtech Sarl Automatic soundtrack generator
US7078609B2 (en) * 1999-10-19 2006-07-18 Medialab Solutions Llc Interactive digital music recorder and player
JP3700532B2 (en) * 2000-04-17 2005-09-28 ヤマハ株式会社 Performance information editing / playback device
US7827488B2 (en) * 2000-11-27 2010-11-02 Sitrick David H Image tracking and substitution system and methodology for audio-visual presentations
JP4497264B2 (en) * 2001-01-22 2010-07-07 株式会社セガ Game program, game apparatus, sound effect output method, and recording medium
US6388183B1 (en) 2001-05-07 2002-05-14 Leh Labs, L.L.C. Virtual musical instruments with user selectable and controllable mapping of position input to sound output
US6822153B2 (en) 2001-05-15 2004-11-23 Nintendo Co., Ltd. Method and apparatus for interactive real time music composition
WO2002103671A2 (en) * 2001-06-18 2002-12-27 Native Instruments Software Synthesis Gmbh Automatic generation of musical scratching effects
DE10153673B4 (en) * 2001-06-18 2005-04-07 Native Instruments Software Synthesis Gmbh Automatic generation of musical scratch effects
KR20030000379A (en) * 2001-06-23 2003-01-06 정우협 Joystick omitted
US20030069655A1 (en) * 2001-10-05 2003-04-10 Jenifer Fahey Mobile wireless communication handset with sound mixer and methods therefor
US7174510B2 (en) 2001-10-20 2007-02-06 Hal Christopher Salter Interactive game providing instruction in musical notation and in learning an instrument
EP1326228B1 (en) * 2002-01-04 2016-03-23 MediaLab Solutions LLC Systems and methods for creating, modifying, interacting with and playing musical compositions
US7076035B2 (en) * 2002-01-04 2006-07-11 Medialab Solutions Llc Methods for providing on-hold music using auto-composition
US6977335B2 (en) * 2002-11-12 2005-12-20 Medialab Solutions Llc Systems and methods for creating, modifying, interacting with and playing musical compositions
US7928310B2 (en) * 2002-11-12 2011-04-19 MediaLab Solutions Inc. Systems and methods for portable audio synthesis
US7169996B2 (en) * 2002-11-12 2007-01-30 Medialab Solutions Llc Systems and methods for generating music using data/music data file transmitted/received via a network
US9065931B2 (en) * 2002-11-12 2015-06-23 Medialab Solutions Corp. Systems and methods for portable audio synthesis
US8841847B2 (en) 2003-01-17 2014-09-23 Motorola Mobility Llc Electronic device for controlling lighting effects using an audio file
US8008561B2 (en) * 2003-01-17 2011-08-30 Motorola Mobility, Inc. Audio file format with mapped lighting effects and method for controlling lighting effects using an audio file format
KR20100067695A (en) * 2003-02-07 2010-06-21 노키아 코포레이션 Control of multi-user environments
TWI221186B (en) * 2003-09-19 2004-09-21 Primax Electronics Ltd Optical detector for detecting relative shift
JP2006084749A (en) * 2004-09-16 2006-03-30 Sony Corp Content generation device and content generation method
US7563975B2 (en) * 2005-09-14 2009-07-21 Mattel, Inc. Music production system
KR100689849B1 (en) * 2005-10-05 2007-03-08 삼성전자주식회사 Remote controller, display device, display system comprising the same, and control method thereof
WO2007053687A2 (en) * 2005-11-01 2007-05-10 Vesco Oil Corporation Audio-visual point-of-sale presentation system and method directed toward vehicle occupant
WO2007073353A1 (en) * 2005-12-20 2007-06-28 Creative Technology Ltd Simultaneous sharing of system resources by multiple input devices
SE0600243L (en) * 2006-02-06 2007-02-27 Mats Hillborg melody Generator
US20080000345A1 (en) * 2006-06-30 2008-01-03 Tsutomu Hasegawa Apparatus and method for interactive
US9251637B2 (en) * 2006-11-15 2016-02-02 Bank Of America Corporation Method and apparatus for using at least a portion of a one-time password as a dynamic card verification value
US8907193B2 (en) * 2007-02-20 2014-12-09 Ubisoft Entertainment Instrument game system and method
US20080200224A1 (en) 2007-02-20 2008-08-21 Gametank Inc. Instrument Game System and Method
US20080223199A1 (en) * 2007-03-16 2008-09-18 Manfred Clynes Instant Rehearseless Conducting
US8678896B2 (en) 2007-06-14 2014-03-25 Harmonix Music Systems, Inc. Systems and methods for asynchronous band interaction in a rhythm action game
US8439733B2 (en) * 2007-06-14 2013-05-14 Harmonix Music Systems, Inc. Systems and methods for reinstating a player within a rhythm-action game
US8138409B2 (en) 2007-08-10 2012-03-20 Sonicjam, Inc. Interactive music training and entertainment system
CN102037486A (en) * 2008-02-20 2011-04-27 Oem有限责任公司 System for learning and mixing music
US8827806B2 (en) 2008-05-20 2014-09-09 Activision Publishing, Inc. Music video game and guitar-like game controller
US8558100B2 (en) * 2008-06-24 2013-10-15 Sony Corporation Music production apparatus and method of producing music by combining plural music elements
US8663013B2 (en) 2008-07-08 2014-03-04 Harmonix Music Systems, Inc. Systems and methods for simulating a rock band experience
US9120016B2 (en) * 2008-11-21 2015-09-01 Ubisoft Entertainment Interactive guitar game designed for learning to play the guitar
US8237042B2 (en) * 2009-02-18 2012-08-07 Spoonjack, Llc Electronic musical instruments
US7939742B2 (en) * 2009-02-19 2011-05-10 Will Glaser Musical instrument with digitally controlled virtual frets
US8449360B2 (en) 2009-05-29 2013-05-28 Harmonix Music Systems, Inc. Displaying song lyrics and vocal cues
US8465366B2 (en) 2009-05-29 2013-06-18 Harmonix Music Systems, Inc. Biasing a musical performance input to a part
US8158873B2 (en) * 2009-08-03 2012-04-17 William Ivanich Systems and methods for generating a game device music track from music
US9981193B2 (en) 2009-10-27 2018-05-29 Harmonix Music Systems, Inc. Movement based recognition and evaluation
US8702485B2 (en) 2010-06-11 2014-04-22 Harmonix Music Systems, Inc. Dance game and tutorial
EP2494432B1 (en) 2009-10-27 2019-05-29 Harmonix Music Systems, Inc. Gesture-based user interface
US8874243B2 (en) 2010-03-16 2014-10-28 Harmonix Music Systems, Inc. Simulating musical instruments
US8299347B2 (en) 2010-05-21 2012-10-30 Gary Edward Johnson System and method for a simplified musical instrument
US9358456B1 (en) 2010-06-11 2016-06-07 Harmonix Music Systems, Inc. Dance competition game
US8562403B2 (en) 2010-06-11 2013-10-22 Harmonix Music Systems, Inc. Prompting a player of a dance game
US9024166B2 (en) 2010-09-09 2015-05-05 Harmonix Music Systems, Inc. Preventing subtractive track separation
US8847053B2 (en) 2010-10-15 2014-09-30 Jammit, Inc. Dynamic point referencing of an audiovisual performance for an accurate and precise selection and controlled cycling of portions of the performance
FR2973549B1 (en) * 2011-04-01 2014-02-21 Espace Musical Puce Muse Device for playing recorded music
US20140129235A1 (en) * 2011-06-17 2014-05-08 Nokia Corporation Audio tracker apparatus
US10496250B2 (en) * 2011-12-19 2019-12-03 Bellevue Investments Gmbh & Co, Kgaa System and method for implementing an intelligent automatic music jam session
US9281793B2 (en) 2012-05-29 2016-03-08 uSOUNDit Partners, LLC Systems, methods, and apparatus for generating an audio signal based on color values of an image
CN104380371B (en) * 2012-06-04 2020-03-20 索尼公司 Apparatus, system and method for generating accompaniment of input music data
US8847054B2 (en) * 2013-01-31 2014-09-30 Dhroova Aiylam Generating a synthesized melody
WO2014204875A1 (en) 2013-06-16 2014-12-24 Jammit, Inc. Synchronized display and performance mapping of musical performances submitted from remote locations
JP6228805B2 (en) * 2013-10-17 2017-11-08 Pioneer DJ Corporation Additional sound control device, acoustic device, and additional sound control method
KR20150072597A (en) 2013-12-20 2015-06-30 삼성전자주식회사 Multimedia apparatus, Method for composition of music, and Method for correction of song thereof
JP6631444B2 (en) * 2016-09-08 2020-01-15 ヤマハ株式会社 Electroacoustic apparatus and operation method thereof
US10319352B2 (en) * 2017-04-28 2019-06-11 Intel Corporation Notation for gesture-based composition
GB2574429B (en) * 2018-06-06 2022-07-20 Digit Music Ltd Input device
CN110444185B (en) * 2019-08-05 2024-01-12 腾讯音乐娱乐科技(深圳)有限公司 Music generation method and device
CN112420002A (en) * 2019-08-21 2021-02-26 北京峰趣互联网信息服务有限公司 Music generation method, device, electronic equipment and computer readable storage medium

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US539799A (en) * 1895-05-28 William k
JPS4829416A (en) * 1971-08-20 1973-04-19
US5177311A (en) * 1987-01-14 1993-01-05 Yamaha Corporation Musical tone control apparatus
US5146833A (en) * 1987-04-30 1992-09-15 Lui Philip Y F Computerized music data system and input/out devices using related rhythm coding
US5099738A (en) * 1989-01-03 1992-03-31 Hotz Instruments Technology, Inc. MIDI musical translator
US5403970A (en) * 1989-11-21 1995-04-04 Yamaha Corporation Electrical musical instrument using a joystick-type control apparatus
US5074182A (en) * 1990-01-23 1991-12-24 Noise Toys, Inc. Multiple key electronic instrument having background songs each associated with solo parts which are synchronized with and harmonious with the background song
US5254803A (en) * 1991-06-17 1993-10-19 Casio Computer Co., Ltd. Automatic musical performance device for outputting natural tones and an accurate score
US5245803A (en) * 1991-11-14 1993-09-21 Haag E Keith Connector means for roof panels and a method for installation thereof
JP2650546B2 (en) * 1991-12-26 1997-09-03 ヤマハ株式会社 Electronic musical instrument
US5451709A (en) * 1991-12-30 1995-09-19 Casio Computer Co., Ltd. Automatic composer for composing a melody in real time
US5399799A (en) * 1992-09-04 1995-03-21 Interactive Music, Inc. Method and apparatus for retrieving pre-recorded sound patterns in synchronization
US5465384A (en) * 1992-11-25 1995-11-07 Actifilm, Inc. Automatic polling and display interactive entertainment system
US5440071A (en) * 1993-02-18 1995-08-08 Johnson; Grant Dynamic chord interval and quality modification keyboard, chord board CX10
US5393926A (en) * 1993-06-07 1995-02-28 Ahead, Inc. Virtual music system
US5491297A (en) * 1993-06-07 1996-02-13 Ahead, Inc. Music instrument which generates a rhythm EKG
US5464384A (en) * 1993-11-24 1995-11-07 Leonardo W. Cromartie Achilles tendon support brace
US5488196A (en) * 1994-01-19 1996-01-30 Zimmerman; Thomas G. Electronic musical re-performance and editing system

Also Published As

Publication number Publication date
KR19990064283A (en) 1999-07-26
US5763804A (en) 1998-06-09
US5627335A (en) 1997-05-06
EP0857343B1 (en) 1999-12-29
WO1997015043A1 (en) 1997-04-24
ATE188304T1 (en) 2000-01-15
EP0857343A1 (en) 1998-08-12
JPH11513811A (en) 1999-11-24
DE69605939T2 (en) 2000-08-03
DE69605939D1 (en) 2000-02-03
AU7389796A (en) 1997-05-07

Similar Documents

Publication Publication Date Title
US5763804A (en) Real-time music creation
US6011212A (en) Real-time music creation
US5355762A (en) Extemporaneous playing system by pointing device
JP3309687B2 (en) Electronic musical instrument
JP3829439B2 (en) Arpeggio sound generator and computer-readable medium having recorded program for controlling arpeggio sound
JP3266149B2 (en) Performance guide device
JP3915807B2 (en) Automatic performance determination device and program
US7381882B2 (en) Performance control apparatus and storage medium
JPH09204176A (en) Style changing device and karaoke device
Jaffe et al. The computer-extended ensemble
JP3047879B2 (en) Performance guide device, performance data creation device for performance guide, and storage medium
Menzies New performance instruments for electroacoustic music
US20230035440A1 (en) Electronic device, electronic musical instrument, and method therefor
JP3430895B2 (en) Automatic accompaniment apparatus and computer-readable recording medium recording automatic accompaniment control program
JP2002297139A (en) Playing data modification processor
JP7331887B2 (en) Program, method, information processing device, and image display system
JP2000356987A (en) Arpeggio sounding device and medium recording program for controlling arpeggio sounding
JP4175364B2 (en) Arpeggio sound generator and computer-readable medium having recorded program for controlling arpeggio sound
JP2002182647A (en) Electronic musical instrument
JP2003114680A (en) Apparatus and program for musical sound information editing
Risset The computer as an Journal of New Music Research: Interlacing instruments and computer sounds; real-time and delayed synthesis; digital synthesis and processing; composition and performance
JP4178661B2 (en) Teaching data generation device and recording medium
Huber MIDI
Casabona et al. Beginning Synthesizer
JP2002014673A (en) Method and device for processing performance data of electronic musical instrument, and method and device for automatic performance

Legal Events

Date Code Title Description
FZDE Discontinued