US20010039872A1 - Automatic compilation of songs - Google Patents
Automatic compilation of songs
- Publication number
- US20010039872A1 (application US09/805,170)
- Authority
- US
- United States
- Prior art keywords
- songs
- song
- beat
- compilation
- profile
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/0033—Recording/reproducing or transmission of music for electrophonic musical instruments
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/36—Accompaniment arrangements
- G10H1/40—Rhythm
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
- G11B27/034—Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2250/00—Aspects of algorithms or signal processing methods without intrinsic musical character, yet specifically adapted for or used in electrophonic musical processing
- G10H2250/025—Envelope processing of music signals in, e.g. time domain, transform domain or cepstrum domain
- G10H2250/035—Crossfade, i.e. time domain amplitude envelope control of the transition between musical sounds or melodies, obtained for musical purposes, e.g. for ADSR tone generation, articulations, medley, remix
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/00007—Time or data compression or expansion
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B2220/00—Record carriers by type
- G11B2220/20—Disc-shaped record carriers
- G11B2220/21—Disc-shaped record carriers characterised in that the disc is of read-only, rewritable, or recordable type
- G11B2220/215—Recordable discs
- G11B2220/218—Write-once discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B2220/00—Record carriers by type
- G11B2220/20—Disc-shaped record carriers
- G11B2220/25—Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
- G11B2220/2537—Optical discs
- G11B2220/2545—CDs
Abstract
Description
- This invention relates to a method and an apparatus for producing a compilation of overlapping songs, each song in the compilation having a sequence of repeating beats. The compilation may, for instance, be suitable for recording onto a data carrier such as a compact disc, or for broadcast by a radio station.
- It will be understood that by song we mean any musical song or tune, with or without a vocal element, and with or without elements of melody and harmony, where the rhythmic element of the music is characterised by repetitive beats. The invention is especially but not exclusively suitable for compilations of dance music, which is generally characterised by a continuous repetitive sequence of beats. The beat is important for dance music and usually dominates the overall subjective impression provided by the song.
- It is well known to manually produce a compilation by “mixing” together a set of individual songs, usually dance music songs, to form a single continuous track in which one song is merged seamlessly into the adjacent song or songs. The resulting compilation comprises an apparently continuous stream of music.
- At present, the production of such “compilations” (sometimes referred to as “mixes”) has required many hours of work by skilled disc jockeys (DJs). The job of a DJ involves great skill in deciding on the order of the songs (the “sequence”) and also in seamlessly mixing one track into the next. The amount of time and skill needed puts the production of compilation mixes beyond the abilities of most listeners.
- We are aware of European patent application No. EP 0932157A1. This document describes an apparatus and method for automatically performing a cross-over between two songs which are played back consecutively. As soon as the start of a pre-determined end part of the first song is detected, this end part is played simultaneously with a pre-determined beginning part of the second song. After play-back of the end part concludes, play-back continues for the second song only. During simultaneous play-back a fade-out and/or a fade-in may be performed for the end part and/or the beginning part respectively.
- The apparatus known from EP 0932157 has problems in that it requires, as a pre-requisite, that two songs are provided which include markers for the end part and the beginning part. Most, if not all, songs cannot be purchased with suitable markers allocated to them for use by such an apparatus. It is not taught how such markers are established, and presumably a skilled operator is needed to set the markers prior to playback and to choose the sequence of the songs.
- An object of the present invention is to produce a method and apparatus for the production of a compilation mix of songs which does not require the presence of pre-established markers and provides for greater flexibility.
- According to the present invention, a method of automatically producing a compilation mix from a set of songs is provided, the method comprising the steps of:
- (a) generating a beat profile comprising a temporal map representing an ideal rate of repeat of the dominant beat for the compilation mix at different points in time over the duration of the compilation;
- (b) generating at least one set of song data indicative of the rate of repeat of the dominant beat for each of the songs in the set of songs; and
- (c) allocating the songs temporally within the compilation mix by processing the song data together with the temporal map, the songs being located temporally with respect to one another such that the rate of repeat of the dominant beat of the compilation approximates that of the temporal map and end portions of adjacent songs overlap. The invention thus provides a method of automatically determining the sequence of the songs according to their rate of repeat of the dominant beat (the tempo—usually measured as beats per minute) and an ideal temporal map for the resulting compilation. This enables the production of effective compilations with a predetermined overall feel to be achieved from any set of songs.
- The invention may include a preliminary step of selecting the set of songs to form the compilation mix from one or more master sets of songs.
- The method step (a) may comprise the sub-steps of selecting a beat profile from a set of different predetermined beat profiles. The predetermined beat profiles may be stored within an area of memory.
- The beat profile may comprise a quantitative map of the ideal beat repetition rate throughout the duration of the compilation to be constructed. It may comprise a set of beat rate values and corresponding time values for a compilation.
- Alternatively, the temporal beat profile may be a qualitative map in which relative values of the beat repetition rate at various time locations in the compilation are provided. For instance, the map may consist of the location in time in the compilation of where the maximum and minimum rates should be, and the change in rate between these points.
- The time information in the map may also be either quantitative, i.e. from zero to 74 minutes or qualitative, i.e. from “start” to “finish”. The beat profile may be stored as a set of pairs of data, each pair consisting of a time location in the compilation and either a quantitative or qualitative value of the beat repetition rate.
- The predetermined beat profiles may all be of a predetermined duration. In that case, the method may further comprise the steps of temporally expanding or compressing the selected profile to match the length of the compilation mix that is being produced.
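- By way of illustration only (the data layout and the 74-minute stored length below are assumptions, not taken from the patent), such a profile could be held as the time/rate pairs described above and rescaled to the requested run-time:

```python
# Minimal sketch of stored beat profiles and of scaling one to the
# compilation length; all names and values are illustrative.
QUALITATIVE_PROFILE = [(0.0, "min"), (0.7, "max"), (1.0, "min")]          # FIG. 8(a) shape
QUANTITATIVE_PROFILE = [(0.0, 120.0), (3000.0, 150.0), (4440.0, 125.0)]   # (seconds, bpm)

def scale_profile(profile, target_length_s, stored_length_s=4440.0):
    """Temporally expand or compress a quantitative profile so that it
    spans the length of the compilation being produced."""
    factor = target_length_s / stored_length_s
    return [(t * factor, bpm) for t, bpm in profile]

# e.g. scale_profile(QUANTITATIVE_PROFILE, 3300.0) squeezes the same
# tempo curve into a 55-minute broadcast slot.
```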
- The step of selecting the beat profile may be automatically performed by the method. A beat profile may, for example, be selected at random from a set of profiles.
- Alternatively, the beat profile may be entered or selected manually by a user. The method may therefore include a sub-step of requesting a user to enter or to select a beat profile.
- The step (b) of generating song data may comprise processing each song that is selected to produce a beat value representative of the rate of repetition of the dominant beat of the song.
- Typically, the beat repetition rate (tempo) of a song will remain substantially constant over the duration of the song. The beat repetition data for each song may therefore comprise a step of, for each song, measuring the beats per minute of only one part of the song. This may, for instance, be a start portion, a middle portion or an end portion of the song.
- It may also include the step of processing each song to produce a time value indicative of the overall duration of the song from start to finish.
- It may additionally or alternatively include the step of identifying the location of any beat drop-outs within each song. These represent points in the song where the beat stops before restarting. For instance, in a dance music song it may be a point at which a drum beat defining the dominant beat stops for a few beats or measures before restarting.
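- The patent does not prescribe a particular tempo-detection algorithm for this measurement. One common approach, sketched below purely as an illustration (all function and parameter names are hypothetical), is to autocorrelate an onset-strength envelope computed from the analysed portion of the song:

```python
import numpy as np

def estimate_bpm(samples, sr, frame=1024, hop=512, lo=60.0, hi=180.0):
    """Rough tempo estimate for one portion of a song.

    Builds an onset-strength envelope from frame-energy increases, then
    autocorrelates it and returns the tempo whose beat period gives the
    strongest correlation within the [lo, hi] BPM range."""
    samples = np.asarray(samples, dtype=float)
    frames = np.lib.stride_tricks.sliding_window_view(samples, frame)[::hop]
    energy = (frames ** 2).sum(axis=1)
    onset = np.maximum(np.diff(energy), 0.0)          # rises in energy roughly track beats
    onset -= onset.mean()
    ac = np.correlate(onset, onset, mode="full")[len(onset) - 1:]
    fps = sr / hop                                    # onset frames per second
    lags = np.arange(1, len(ac))
    bpm = 60.0 * fps / lags
    mask = (bpm >= lo) & (bpm <= hi)
    if not mask.any():
        raise ValueError("portion too short for tempo analysis")
    best_lag = lags[mask][np.argmax(ac[1:][mask])]
    return 60.0 * fps / best_lag
```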
- The generated song data may be stored in individual data files corresponding to each song or in a single file.
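- A minimal sketch of such a per-song data record follows; the field names are illustrative rather than taken from the patent:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SongRecord:
    """One data file per analysed song: tempo, run-time and drop-out positions."""
    title: str
    bpm: float                 # average rate of repeat of the dominant beat
    length_s: float            # overall duration from start to finish
    dropouts_s: List[float] = field(default_factory=list)  # beat drop-out locations (seconds)
```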
- The beat profile may be used to calculate the value and the temporal position of the minima and maxima beat rates in the profile. Data indicative of the value and position may be stored in a memory. In some arrangements, the temporal map may already be stored in terms of minima and maxima, and so this processing is not required.
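- Where the profile is stored quantitatively as time/rate pairs, finding these extrema is straightforward, as in this hedged sketch:

```python
def profile_extrema(profile):
    """Return ((time_of_min, min_bpm), (time_of_max, max_bpm)) for a
    quantitative profile stored as (time_s, bpm) pairs."""
    lowest = min(profile, key=lambda point: point[1])
    highest = max(profile, key=lambda point: point[1])
    return lowest, highest
```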
- The songs may be allocated to their temporal positions within the compilation such that the songs having the minima and maxima beat per minute are located at the point in the sequence corresponding approximately to the position of the minima and maxima of the temporal map.
- The remaining songs may be allocated to the remaining positions in the compilation mix to complete the sequence by matching the beat rates of the songs to the temporal profile using a “best-fit” approximation.
- The allocation may be made by processing the song data in combination with the temporal map.
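- One simple way to realise such a best-fit allocation, assuming one profile slice per song, is to pair songs and slices in matching tempo rank order. The sketch below is only an illustration of that idea, not the allocation procedure the patent mandates:

```python
def allocate_songs(songs, profile_slices):
    """songs:          list of dicts such as {"title": ..., "bpm": ...}
    profile_slices: list of (position_in_play_order, target_bpm) pairs,
                    positions running from 0 to len(songs) - 1.

    Sorting both by bpm and pairing them places the fastest song at the
    slice with the highest ideal tempo and the slowest at the lowest,
    crudely approximating the temporal map."""
    by_bpm = sorted(songs, key=lambda s: s["bpm"])
    slices_by_bpm = sorted(profile_slices, key=lambda s: s[1])
    play_order = [None] * len(songs)
    for song, (position, _target_bpm) in zip(by_bpm, slices_by_bpm):
        play_order[position] = song
    return play_order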
- It is expected that in many applications of the present invention the duration of the created compilation will be required to be no greater than some maximum time-duration M seconds. For example, in creating a compilation to be written to a standard length compact disc (CD), the CD format imposes a maximum duration of 74 minutes (M = 4440 seconds). In a broadcast application the compilation mix may have to fit into a precise time slot (e.g. 12:05 to 1:00 am), and so the maximum time in this case would be 55 minutes. To create a continuous/seamless mix, the songs in the compilation need to be overlapped so that the mix can cross-fade from one song to another. A variety of overlaps are envisaged within the scope of the present invention.
- The method may therefore comprise the further step of, after allocating the songs to the temporal map to determine the order of the songs, locating the songs relative to one another in time such that the songs overlap in a predetermined manner.
- The predetermined amount of overlap between songs may initially be chosen to be the same for each song.
- For a set of N songs, with total combined length T seconds, the initial amount of overlap L between each song may be automatically determined from the relative values of M and T. If T is greater than M, there is more music than can be fitted into the time duration so each song may be overlapped by an equal amount determined by the equation:
- L=(T−M)/(N−1).
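- As a worked example with hypothetical numbers, fifteen songs totalling 85 minutes (T = 5100 s) to be fitted onto a 74-minute CD (M = 4440 s) would give L = 660/14, roughly 47 seconds of overlap at each of the fourteen transitions:

```python
def equal_overlap_seconds(total_song_length_s, max_duration_s, n_songs):
    """Initial equal overlap L = (T - M) / (N - 1), applicable when T > M."""
    return (total_song_length_s - max_duration_s) / (n_songs - 1)

# equal_overlap_seconds(5100, 4440, 15) -> 660 / 14, about 47.1 s per transition
```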
- Subsequent to the generation of the initial overlap, the individual overlap between pairs of songs may be adjusted. For example, where the song data indicates that a beat drop out is present in a song, this may be used as the point at which the song ends and the subsequent song begins. In that case, the decision to provide equal overlap may be overruled.
- After initial overlap of the songs, the method may further comprise the step of processing the songs so that the beats occurring in the overlap section from each song are aligned temporally.
- This step may comprise the step of phase aligning the overlapping songs so that the dominant beats of each song are exactly aligned. Thus, the beats of each song will occur at the same time and to the listener will appear as a single beat.
- The dominant beats in the overlap can only be aligned throughout the entire overlap portion if they are at the same repetition rate.
- The method may therefore include the step of time-stretch and/or time-compression processing one or both of the overlapping portions to match the rate of repetition. This may be achieved by speeding up one song in the overlap region, or slowing down one song, or a combination of both.
- Where the method comprises the step of speeding up/slowing down an overlapping portion of a song, the method may further comprise the step of gradually adjusting the speed of the remainder of the song over time to bring it back to its nominal (pre-adjusted) beat repetition rate. The human ear is not especially good at identifying slow changes in tempo over time, and so such adjustment can be made without the user noticing in the final compilation.
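- A minimal sketch of the two calculations involved, assuming it is the incoming song that is adjusted and that a simple linear ramp is used to drift back to the nominal tempo (neither assumption comes from the patent):

```python
import numpy as np

def stretch_factor(outgoing_bpm, incoming_bpm):
    """Factor by which the incoming song's overlap portion is lengthened
    (>1, slowed down) or shortened (<1, sped up) to match the outgoing tempo."""
    return incoming_bpm / outgoing_bpm

def tempo_ramp(matched_bpm, nominal_bpm, n_beats):
    """Per-beat tempo targets that return the song to its nominal rate so
    gradually that the listener should not notice the change."""
    return np.linspace(matched_bpm, nominal_bpm, n_beats)
```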
- After the songs have been aligned, the subsequent alterations to the overlaps may cause the overall length of the resulting compilation to exceed the allocated duration, i.e. to exceed M seconds. In such a case, the method may further comprise the step of automatically truncating the compilation by fading out the final song early (before its end) or fading in the first song late (after its start), or by a combination of both.
- The same step of early/late fade can also be applied if the duration of any one of the calculated overlaps exceeds a predetermined value L. This can arise if the overlap is aligned using a drop-out that occurs some way into a song. The value of L may be quantitative, i.e. 30 seconds, or qualitative, i.e. 30 percent of the duration of a song. Thus, if the overlap exceeds the predetermined threshold an appropriate fade-out, fade-in or both may be used to lower the overlap to within the threshold. If no maximum time limit is set, the overlap may be chosen at random or set to some predetermined initial value, say 20 seconds. This may also be used where the total length T of the songs for the compilation is less than the predetermined limit M. The value of the overlap may be selected as a function of the length of the songs being compiled, for example as a percentage of the length of an outgoing song, as a percentage of the length of an incoming song, or as a percentage of the average length of the songs being overlapped. This percentage could be fixed or generated at random from a probability distribution.
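- The last option above might be sketched as follows; the 5 to 30 percent range is an assumed distribution, not one given in the patent:

```python
import random

def overlap_without_limit(outgoing_len_s, incoming_len_s, fraction=None):
    """Overlap choice when no maximum duration M applies: a fixed or randomly
    drawn fraction of the average length of the songs being overlapped."""
    if fraction is None:
        fraction = random.uniform(0.05, 0.30)   # illustrative probability distribution
    return fraction * (outgoing_len_s + incoming_len_s) / 2.0
```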
- The method may further include the step of fading out the amplitude (volume) of the overlapping end portion of a song. The method may also include the step of fading in the overlapping start portion of a song. A linear increase in amplitude may be provided or it may be faded in accordance with any other predetermined pattern.
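- A linear crossfade over an already beat-aligned overlap might look like the following sketch (mono numpy arrays are assumed, and any other predetermined fade pattern could replace the linear ramps):

```python
import numpy as np

def crossfade(outgoing_tail, incoming_head):
    """Fade out the end of one song while fading in the start of the next;
    both portions are the same length and already beat-aligned."""
    n = len(outgoing_tail)
    fade_out = np.linspace(1.0, 0.0, n)   # amplitude envelope of the outgoing song
    fade_in = np.linspace(0.0, 1.0, n)    # amplitude envelope of the incoming song
    return outgoing_tail * fade_out + incoming_head * fade_in
```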
- The method may further comprise the step of recording the songs in their selected order over time onto a data carrier, such as a compact disc.
- According to a second aspect, the invention also provides apparatus adapted to automatically produce a compilation mix of songs in accordance with the method of the first aspect.
- The apparatus may include a first area of memory in which the songs to be compiled are stored, a processing unit which processes the data in the memory, and a computer program which, when run on the processing unit, instructs it to perform the method of the first aspect of the invention.
- A second area of memory may be provided in which one or more user defined beat profiles are stored for access by the processor.
- The apparatus may further include a connector for connection to an output storage device adapted to record the resulting compilation on a storage data carrier. The storage device may comprise a compact disc writer.
- The apparatus may include a data input device whereby a user may select the temporal profile in response to a prompt by the apparatus. The apparatus may produce an audio or visual prompt.
- According to a third aspect, the invention provides a data carrier including a computer program which, when operating on a computer, produces an apparatus according to the second aspect of the invention.
- There will now be described, by way of example only, one embodiment of the present invention with reference to the accompanying drawings, of which:
- FIG. 1 is a schematic of an apparatus in accordance with the present invention for automatically producing a compilation of songs;
- FIG. 2 is a general flow diagram of the overall method carried out by the apparatus of FIG. 1;
- FIG. 3 is a flow diagram illustrating the steps involved in selecting the songs to be used in constructing the compilation mix;
- FIG. 4 is a flow diagram illustrating the steps of calculating the temporal profile of the compilation mix;
- FIG. 5 is a flow diagram illustrating the steps of calculating the song data for storage in data files;
- FIG. 6 is a flow diagram illustrating the steps of allocating the songs to temporal locations within the compilation;
- FIG. 7 is a flow diagram of the steps for providing seamless transitions between songs; and
- FIG. 8 is a diagram showing three typical beat profiles.
- FIG. 1 is a schematic diagram of an apparatus in accordance with one aspect of the present invention.
- The apparatus comprises three main functional elements. Firstly, a memory storage area 1 is provided which stores, in either digital or analogue form, a number of songs from which a compilation mix is to be made. As shown, this comprises a compact disc player which reads compact discs on which songs are recorded in a digital format. The storage may alternatively or additionally comprise a solid state electronic memory, and/or magnetic storage such as a number of hard drives and/or magnetic tape drives. Any combination of these and other storage technologies is appropriate.
- A compilation generation apparatus 2 is provided, which includes a central processing unit 3 and a first area of memory 4 which stores a program that provides operating instructions to the central processing unit 3. The apparatus further includes an input port 5 which is connected to the memory storage area 1 whereby the central processing unit 3 can download the song data from the storage area. In the example, the compilation apparatus 2 is located in the same room as the memory storage area 1, and is connected through a conductive cable 6, such as a copper cable. Optical connections could be used, and it is envisaged that the storage area may be located in a different geographical area for access over the telephone network using a modem.
- The compilation generation apparatus 2 further includes an intermediate memory buffer 7 into which the songs to be compiled are stored prior to processing, and into which song data produced by processing the songs can be stored.
- An output port 8 is also provided which enables the final compilation mix in the intermediate memory 7 to be passed along a cable 9 to a storage device such as a CD writer 10.
- It will be appreciated that the compilation generation apparatus may be embodied as a personal computer, in which case it will further include an input device (not shown) for receiving commands from a user and a display (not shown).
- The apparatus automatically produces a compilation by performing the steps illustrated in the flow charts of FIGS. 2 to 7 of the accompanying drawings.
- FIG. 2 is a flowchart which gives an overview of the key steps of the compilation method carried out by the apparatus of FIG. 1 of the accompanying drawings.
- In a first step 21, the songs which will make up the finished compilation mix are selected from those stored in the first storage area 1. Subsequently, a temporal beat profile is selected 22 for the finished compilation. This profile provides a template for the finished compilation and determines the mood of the finished compilation. Several different profiles may be provided.
- FIGS. 8(a) to 8(c) illustrate three possible profiles. Each illustration plots a qualitative value of beats per minute (bpm) against time over the length of the compilation. In FIG. 8(a), the bpm starts at a lower value, designated “minimum”. This gradually increases over time to a “maximum” value, and then decreases towards the end. In FIG. 8(b) the bpm starts at a “maximum” value and gradually falls. In FIG. 8(c) it starts at a minimum and gradually increases over time. In each case, the profile covers a time span from “start” to “stop”. The profile can thus be scaled, for example to correspond to the maximum run-time of a standard format compact disc.
- Having selected the desired “ideal” profile, the selected songs are next classified 23 to generate a set of song data. A data record is generated for each song in which is recorded the average bpm of the song, its run-time (length) and the location of any drop-outs in the song.
- In the next step, the songs are allocated 24 to the profile to determine the order in which the songs will play in the finished compilation. This is performed by comparing the song data in the data records with the temporal profile that has been selected to provide a best fit. This ensures the bpm of the compilation follows, as closely as possible, the ideal bpm profile.
- After allocating a play order of the songs, the amount of overlap between songs is calculated and the overlapping portions are processed 25 to provide a seamless transition between songs.
- Finally, the processed set of songs is output 26 to the storage area for recording onto the compact disc by the compact disc writer.
- Each of the steps is performed automatically, is illustrated in more detail in the flow charts of FIGS. 3 to 7, and is described below.
- The step of selecting songs 21 is represented schematically in FIG. 3 of the accompanying drawings.
- Song selection can be made either manually by a user who selects from a list or automatically. Initially, the apparatus queries 31 whether manual or automatic selection is required.
- If manual selection is required, the user is next prompted 32 to select a song. A list of songs from which the selection is to be made may be displayed or a prompt may be made for the user to provide a song. For example, following the prompt, the user may load a compact disc containing the song into the CD player.
- Once a selection has been made 33 (from a list or by providing a disc), the selected song is stored 35 in intermediate memory. The song itself may be recorded in the memory, or a marker may be stored which points to the location of the selected song.
- The total run-time of the compilation mix is also requested 34. This may be entered by a user from a prompt or may be a default value. For example, for a compact disc the default of 74 minutes may be used. For a compilation for play on a radio show, other lengths may be stored.
- The apparatus continues to request songs to be selected until the user indicates that the selection is complete or until the length of the songs when played end-to-end exceeds the requested run-time. The selection is then complete. After selection, a complete set of songs, or markers pointing to the location of songs, is stored in the intermediate memory.
- The temporal beat profile is next calculated or selected, as illustrated in the flow chart of FIG. 4 of the accompanying drawings. Again, a profile may be selected automatically or manually by a user. The apparatus therefore initially queries 41 the selection status. A set of profiles is stored in memory and identified by an appropriate label, i.e. profile A, profile B, profile C, etc. The apparatus prompts 42 the user to select an appropriate profile by entering the label relating to the profile from the input device.
- If automatic selection is requested, the profile is either chosen at random or a default profile is used. This profile is then stretched/compressed temporally to fit the length of the compilation required.
- Once a profile has been selected 43, it is stored in memory or the label identifying the selected profile is recorded 44.
- In the next step, illustrated in more detail in the flow chart of FIG. 5, the selected songs are analysed to generate a set of data records. For each song 51, a blank data file is first produced 52. The average rate of repetition of the dominant beat of the song is calculated 53 and stored 54 in the data file for the respective song. The temporal length of the song is calculated and stored 55, and the temporal location within the song of any drop-outs is calculated 56 and stored 57 in the data file. After analysing the first song, the process is repeated 58 until all songs are analysed and a full set of data files has been produced.
- The selected temporal profile and the data files are used to arrange the order of the songs in the completed compilation mix as best illustrated in the flow chart of FIG. 6 of the accompanying drawings.
- The profile is first analysed 61 to calculate the ideal temporal position of the highest and lowest beats per minute. The profile is then divided 62 into N temporal slices, each slice being ranked in order by the beats per minute value in the profile.
- The slice having the highest beats per minute is then matched 63 to the song whose data file indicates it has the highest beats per minute. Similarly, the slice with the lowest beats per minute is matched 64 to the song whose data file has the lowest bpm.
- The remaining songs are then allocated 65 to the profile so as to provide a “best-fit” solution before the songs are aligned 66.
- If all the songs are the same length, the temporal profile is divided into equal width slices, the width of the slice being equal to the length of a song. In that case, songs are fitted directly to the slots with a small amount of overlap.
- Alternatively, the highest and lowest bpm songs may be “centred” around the maxima and minima in the profiles. The remaining songs are then fitted into the profile and the songs are re-aligned to provide the smallest overlap between songs. The overlaps may be made equal length. Having arranged the songs into an approximate run-time order, and roughly overlapped the songs, the overlapped portions are next processed to provide a seamless transition between songs. This is illustrated in the flow chart of FIG. 7 of the accompanying drawings.
- Initially, the first two songs allocated to the sequence are selected, and the areas of overlap are analysed. The apparatus checks 71 to see if the bpm value data in the two data files is the same. If not, then the two overlapping portions are processed 72 by either time-compressing to speed up or time-stretching to slow down one or both of the songs until their tempos are matched.
- After ensuring the beat rates match, the phase of the two end portions is aligned 73. This is achieved by moving the two songs temporally until the beats coincide.
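- Given beat times detected within the overlap of each song, the required temporal shift could be computed as in the sketch below (a simplifying assumption, not stated in the patent, is that only the incoming song is moved):

```python
def phase_shift_s(outgoing_beats_s, incoming_beats_s):
    """Seconds by which to delay (positive) or advance (negative) the incoming
    song so that its first beat in the overlap coincides with the nearest
    beat of the outgoing song."""
    first_in = incoming_beats_s[0]
    nearest_out = min(outgoing_beats_s, key=lambda t: abs(t - first_in))
    return nearest_out - first_in
```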
- Having aligned the two songs, a fade is then applied to both songs. The end of the first song is faded out 74, whilst the start of the next song is faded in 75. This can be achieved by allocating an amplitude profile to the songs which is stored in the data file for the song. The data record contains this information alongside the information indicating where the songs start and finish.
- If the speed of an overlap portion has been adjusted, the songs are next processed so that the speed returns to its normal value throughout the song.
- The process is again repeated 76 if required until all the songs have been processed.
- Finally, the data held in the data files, which provides a full indication of the temporal alignment, phase adjustment, and amplitude data for fading, is used to record the completed compilation mix onto the CD in the CD writer. In an alternative, the details may be written to the CD writer alongside a set of separate tracks. On play-back, the data file is used to control the order and timing of play-back so that a full compilation mix is heard.
Claims (21)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP00303960A EP1162621A1 (en) | 2000-05-11 | 2000-05-11 | Automatic compilation of songs |
EP00303960.9 | 2000-05-11 | | |
EP00303960 | 2000-05-11 | | |
Publications (2)
Publication Number | Publication Date |
---|---|
US20010039872A1 true US20010039872A1 (en) | 2001-11-15 |
US6344607B2 US6344607B2 (en) | 2002-02-05 |
Family
ID=8172979
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/805,170 Expired - Fee Related US6344607B2 (en) | 2000-05-11 | 2001-03-14 | Automatic compilation of songs |
Country Status (3)
Country | Link |
---|---|
US (1) | US6344607B2 (en) |
EP (1) | EP1162621A1 (en) |
JP (1) | JP2002049370A (en) |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030212466A1 (en) * | 2002-05-09 | 2003-11-13 | Audeo, Inc. | Dynamically changing music |
US20040196988A1 (en) * | 2003-04-04 | 2004-10-07 | Christopher Moulios | Method and apparatus for time compression and expansion of audio data with dynamic tempo change during playback |
US20040196989A1 (en) * | 2003-04-04 | 2004-10-07 | Sol Friedman | Method and apparatus for expanding audio data |
WO2005071662A1 (en) * | 2004-01-21 | 2005-08-04 | Koninklijke Philips Electronics N.V. | Method and system for determining a measure of tempo ambiguity for a music input signal |
WO2007060605A2 (en) * | 2005-11-25 | 2007-05-31 | Koninklijke Philips Electronics N.V. | Device for and method of processing audio data items |
WO2007072350A3 (en) * | 2005-12-22 | 2007-10-18 | Koninkl Philips Electronics Nv | Electronic device and method for determining a mixing parameter |
US7319185B1 (en) | 2001-11-06 | 2008-01-15 | Wieder James W | Generating music and sound that varies from playback to playback |
US20080190267A1 (en) * | 2007-02-08 | 2008-08-14 | Paul Rechsteiner | Sound sequences with transitions and playlists |
US20080236369A1 (en) * | 2007-03-28 | 2008-10-02 | Yamaha Corporation | Performance apparatus and storage medium therefor |
US20080236370A1 (en) * | 2007-03-28 | 2008-10-02 | Yamaha Corporation | Performance apparatus and storage medium therefor |
US20080314232A1 (en) * | 2007-06-25 | 2008-12-25 | Sony Ericsson Mobile Communications Ab | System and method for automatically beat mixing a plurality of songs using an electronic equipment |
US20090049979A1 (en) * | 2007-08-21 | 2009-02-26 | Naik Devang K | Method for Creating a Beat-Synchronized Media Mix |
US20090259326A1 (en) * | 2008-04-15 | 2009-10-15 | Michael Joseph Pipitone | Server side audio file beat mixing |
US20100031805A1 (en) * | 2008-08-11 | 2010-02-11 | Agere Systems Inc. | Method and apparatus for adjusting the cadence of music on a personal audio device |
US7732697B1 (en) | 2001-11-06 | 2010-06-08 | Wieder James W | Creating music and sound that varies from playback to playback |
US20100215195A1 (en) * | 2007-05-22 | 2010-08-26 | Koninklijke Philips Electronics N.V. | Device for and a method of processing audio data |
US8258390B1 (en) | 2011-03-30 | 2012-09-04 | Google Inc. | System and method for dynamic, feature-based playlist generation |
US8487176B1 (en) * | 2001-11-06 | 2013-07-16 | James W. Wieder | Music and sound that varies from one playback to another playback |
US20140018947A1 (en) * | 2012-07-16 | 2014-01-16 | SongFlutter, Inc. | System and Method for Combining Two or More Songs in a Queue |
US8766078B2 (en) * | 2010-12-07 | 2014-07-01 | JVC Kenwood Corporation | Music piece order determination device, music piece order determination method, and music piece order determination program |
US9070352B1 (en) * | 2011-10-25 | 2015-06-30 | Mixwolf LLC | System and method for mixing song data using measure groupings |
US20150207481A1 (en) * | 2007-11-16 | 2015-07-23 | Adobe Systems Incorporated | Genre dependent audio level control |
US9459828B2 (en) | 2012-07-16 | 2016-10-04 | Brian K. ALES | Musically contextual audio advertisements |
US20170090855A1 (en) * | 2015-03-20 | 2017-03-30 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method and device for processing audio files |
US20190236209A1 (en) * | 2018-01-29 | 2019-08-01 | Gary Bencar | Artificial intelligence methodology to automatically generate interactive play along songs |
US10607500B1 (en) * | 2019-05-21 | 2020-03-31 | International Business Machines Corporation | Providing background music tempo to accompany procedural instructions |
US11188605B2 (en) | 2019-07-31 | 2021-11-30 | Rovi Guides, Inc. | Systems and methods for recommending collaborative content |
US12118984B2 (en) | 2020-11-11 | 2024-10-15 | Rovi Guides, Inc. | Systems and methods to resolve conflicts in conversations |
Families Citing this family (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020002039A1 (en) | 1998-06-12 | 2002-01-03 | Safi Qureshey | Network-enabled audio device |
KR100867760B1 (en) * | 2000-05-15 | 2008-11-10 | Sony Corporation | Playback device, playback method and recording medium |
US7840691B1 (en) | 2000-09-07 | 2010-11-23 | Zamora Radio, Llc | Personal broadcast server system for providing a customized broadcast |
US6915261B2 (en) * | 2001-03-16 | 2005-07-05 | Intel Corporation | Matching a synthetic disc jockey's voice characteristics to the sound characteristics of audio programs |
US6933432B2 (en) * | 2002-03-28 | 2005-08-23 | Koninklijke Philips Electronics N.V. | Media player with “DJ” mode |
JP4134945B2 (en) * | 2003-08-08 | 2008-08-20 | Yamaha Corporation | Automatic performance device and program |
US8028323B2 (en) | 2004-05-05 | 2011-09-27 | Dryden Enterprises, Llc | Method and system for employing a first device to direct a networked audio device to obtain a media item |
US7081582B2 (en) * | 2004-06-30 | 2006-07-25 | Microsoft Corporation | System and method for aligning and mixing songs of arbitrary genres |
US20090223352A1 (en) * | 2005-07-01 | 2009-09-10 | Pioneer Corporation | Computer program, information reproducing device, and method |
WO2007036846A2 (en) * | 2005-09-30 | 2007-04-05 | Koninklijke Philips Electronics N.V. | Method and apparatus for automatic structure analysis of music |
US20070074617A1 (en) * | 2005-10-04 | 2007-04-05 | Linda Vergo | System and method for tailoring music to an activity |
US20070074618A1 (en) * | 2005-10-04 | 2007-04-05 | Linda Vergo | System and method for selecting music to guide a user through an activity |
US20070074619A1 (en) * | 2005-10-04 | 2007-04-05 | Linda Vergo | System and method for tailoring music to an activity based on an activity goal |
JP2007242215A (en) * | 2006-02-13 | 2007-09-20 | Sony Corp | Content reproduction list generation device, content reproduction list generation method, and program-recorded recording medium |
WO2007105180A2 (en) * | 2006-03-16 | 2007-09-20 | Pace Plc | Automatic play list generation |
US8195725B2 (en) | 2006-04-11 | 2012-06-05 | Nokia Corporation | Electronic device and method therefor |
US20080114665A1 (en) * | 2006-11-10 | 2008-05-15 | Teegarden Kamia J | Licensing system |
US20090044687A1 (en) * | 2007-08-13 | 2009-02-19 | Kevin Sorber | System for integrating music with an exercise regimen |
US8173883B2 (en) * | 2007-10-24 | 2012-05-08 | Funk Machine Inc. | Personalized music remixing |
US9015147B2 (en) * | 2007-12-20 | 2015-04-21 | Porto Technology, Llc | System and method for generating dynamically filtered content results, including for audio and/or video channels |
US8316015B2 (en) | 2007-12-21 | 2012-11-20 | Lemi Technology, Llc | Tunersphere |
US8117193B2 (en) | 2007-12-21 | 2012-02-14 | Lemi Technology, Llc | Tunersphere |
US20100017455A1 (en) * | 2008-07-17 | 2010-01-21 | Lemi Technology, Llc | Customized media broadcast for a broadcast group |
US8494899B2 (en) | 2008-12-02 | 2013-07-23 | Lemi Technology, Llc | Dynamic talk radio program scheduling |
US8806047B2 (en) | 2009-04-29 | 2014-08-12 | Lemi Technology, Llc | Skip feature for a broadcast or multicast media station |
US7657337B1 (en) | 2009-04-29 | 2010-02-02 | Lemi Technology, Llc | Skip feature for a broadcast or multicast media station |
US9111519B1 (en) * | 2011-10-26 | 2015-08-18 | Mixwolf LLC | System and method for generating cuepoints for mixing song data |
JP5962218B2 (en) * | 2012-05-30 | 2016-08-03 | 株式会社Jvcケンウッド | Song order determining apparatus, song order determining method, and song order determining program |
JP6011064B2 (en) * | 2012-06-26 | 2016-10-19 | ヤマハ株式会社 | Automatic performance device and program |
US9880805B1 (en) | 2016-12-22 | 2018-01-30 | Brian Howard Guralnick | Workout music playback machine |
CN108831425B (en) * | 2018-06-22 | 2022-01-04 | 广州酷狗计算机科技有限公司 | Sound mixing method, device and storage medium |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2927229B2 (en) * | 1996-01-23 | 1999-07-28 | ヤマハ株式会社 | Medley playing equipment |
US5824933A (en) * | 1996-01-26 | 1998-10-20 | Interactive Music Corp. | Method and apparatus for synchronizing and simultaneously playing predefined musical sequences using visual display and input device such as joystick or keyboard |
JP3799761B2 (en) * | 1997-08-11 | 2006-07-19 | ヤマハ株式会社 | Performance device, karaoke device and recording medium |
EP0932157A1 (en) * | 1998-01-26 | 1999-07-28 | Deutsche Thomson-Brandt GmbH | Automatically performed crossover between two consecutively played back sets of audio data |
US5969283A (en) * | 1998-06-17 | 1999-10-19 | Looney Productions, Llc | Music organizer and entertainment center |
2000
- 2000-05-11: EP application EP00303960A filed, published as EP1162621A1; status: not active (withdrawn)
2001
- 2001-03-14: US application US09/805,170 filed, granted as US6344607B2; status: not active (expired for failure to pay maintenance fees)
- 2001-05-08: JP application JP2001137488A filed, published as JP2002049370A; status: not active (withdrawn)
Cited By (60)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7319185B1 (en) | 2001-11-06 | 2008-01-15 | Wieder James W | Generating music and sound that varies from playback to playback |
US8487176B1 (en) * | 2001-11-06 | 2013-07-16 | James W. Wieder | Music and sound that varies from one playback to another playback |
US9040803B2 (en) * | 2001-11-06 | 2015-05-26 | James W. Wieder | Music and sound that varies from one playback to another playback |
US7732697B1 (en) | 2001-11-06 | 2010-06-08 | Wieder James W | Creating music and sound that varies from playback to playback |
US20150243269A1 (en) * | 2001-11-06 | 2015-08-27 | James W. Wieder | Music and Sound that Varies from Playback to Playback |
US10224013B2 (en) * | 2001-11-06 | 2019-03-05 | James W. Wieder | Pseudo-live music and sound |
US11087730B1 (en) * | 2001-11-06 | 2021-08-10 | James W. Wieder | Pseudo-live sound and music |
US20030212466A1 (en) * | 2002-05-09 | 2003-11-13 | Audeo, Inc. | Dynamically changing music |
US7078607B2 (en) * | 2002-05-09 | 2006-07-18 | Anton Alferness | Dynamically changing music |
US7189913B2 (en) * | 2003-04-04 | 2007-03-13 | Apple Computer, Inc. | Method and apparatus for time compression and expansion of audio data with dynamic tempo change during playback |
US20040196989A1 (en) * | 2003-04-04 | 2004-10-07 | Sol Friedman | Method and apparatus for expanding audio data |
US20040196988A1 (en) * | 2003-04-04 | 2004-10-07 | Christopher Moulios | Method and apparatus for time compression and expansion of audio data with dynamic tempo change during playback |
US7233832B2 (en) * | 2003-04-04 | 2007-06-19 | Apple Inc. | Method and apparatus for expanding audio data |
US7425674B2 (en) | 2003-04-04 | 2008-09-16 | Apple, Inc. | Method and apparatus for time compression and expansion of audio data with dynamic tempo change during playback |
US20070137464A1 (en) * | 2003-04-04 | 2007-06-21 | Christopher Moulios | Method and apparatus for time compression and expansion of audio data with dynamic tempo change during playback |
WO2005071662A1 (en) * | 2004-01-21 | 2005-08-04 | Koninklijke Philips Electronics N.V. | Method and system for determining a measure of tempo ambiguity for a music input signal |
WO2007060605A2 (en) * | 2005-11-25 | 2007-05-31 | Koninklijke Philips Electronics N.V. | Device for and method of processing audio data items |
WO2007060605A3 (en) * | 2005-11-25 | 2007-08-16 | Koninkl Philips Electronics Nv | Device for and method of processing audio data items |
US20080319756A1 (en) * | 2005-12-22 | 2008-12-25 | Koninklijke Philips Electronics, N.V. | Electronic Device and Method for Determining a Mixing Parameter |
WO2007072350A3 (en) * | 2005-12-22 | 2007-10-18 | Koninkl Philips Electronics Nv | Electronic device and method for determining a mixing parameter |
US20080190267A1 (en) * | 2007-02-08 | 2008-08-14 | Paul Rechsteiner | Sound sequences with transitions and playlists |
US20110100197A1 (en) * | 2007-02-08 | 2011-05-05 | Kaleidescape, Inc. | Sound sequences with transitions and playlists |
US7888582B2 (en) * | 2007-02-08 | 2011-02-15 | Kaleidescape, Inc. | Sound sequences with transitions and playlists |
US7982120B2 (en) * | 2007-03-28 | 2011-07-19 | Yamaha Corporation | Performance apparatus and storage medium therefor |
US20100236386A1 (en) * | 2007-03-28 | 2010-09-23 | Yamaha Corporation | Performance apparatus and storage medium therefor |
US20080236369A1 (en) * | 2007-03-28 | 2008-10-02 | Yamaha Corporation | Performance apparatus and storage medium therefor |
US20080236370A1 (en) * | 2007-03-28 | 2008-10-02 | Yamaha Corporation | Performance apparatus and storage medium therefor |
US7956274B2 (en) | 2007-03-28 | 2011-06-07 | Yamaha Corporation | Performance apparatus and storage medium therefor |
US8153880B2 (en) | 2007-03-28 | 2012-04-10 | Yamaha Corporation | Performance apparatus and storage medium therefor |
US20100215195A1 (en) * | 2007-05-22 | 2010-08-26 | Koninklijke Philips Electronics N.V. | Device for and a method of processing audio data |
US20080314232A1 (en) * | 2007-06-25 | 2008-12-25 | Sony Ericsson Mobile Communications Ab | System and method for automatically beat mixing a plurality of songs using an electronic equipment |
US7525037B2 (en) | 2007-06-25 | 2009-04-28 | Sony Ericsson Mobile Communications Ab | System and method for automatically beat mixing a plurality of songs using an electronic equipment |
WO2009001164A1 (en) * | 2007-06-25 | 2008-12-31 | Sony Ericsson Mobile Communications Ab | System and method for automatically beat mixing a plurality of songs using an electronic equipment |
US8269093B2 (en) * | 2007-08-21 | 2012-09-18 | Apple Inc. | Method for creating a beat-synchronized media mix |
US20130008301A1 (en) * | 2007-08-21 | 2013-01-10 | Naik Devang K | Method for creating a beat-synchronized media mix |
US20090049979A1 (en) * | 2007-08-21 | 2009-02-26 | Naik Devang K | Method for Creating a Beat-Synchronized Media Mix |
US8704069B2 (en) * | 2007-08-21 | 2014-04-22 | Apple Inc. | Method for creating a beat-synchronized media mix |
US20150207481A1 (en) * | 2007-11-16 | 2015-07-23 | Adobe Systems Incorporated | Genre dependent audio level control |
US20090259326A1 (en) * | 2008-04-15 | 2009-10-15 | Michael Joseph Pipitone | Server side audio file beat mixing |
US9014831B2 (en) * | 2008-04-15 | 2015-04-21 | Cassanova Group, Llc | Server side audio file beat mixing |
US20100031805A1 (en) * | 2008-08-11 | 2010-02-11 | Agere Systems Inc. | Method and apparatus for adjusting the cadence of music on a personal audio device |
US7888581B2 (en) * | 2008-08-11 | 2011-02-15 | Agere Systems Inc. | Method and apparatus for adjusting the cadence of music on a personal audio device |
US8766078B2 (en) * | 2010-12-07 | 2014-07-01 | JVC Kenwood Corporation | Music piece order determination device, music piece order determination method, and music piece order determination program |
WO2012135335A1 (en) * | 2011-03-30 | 2012-10-04 | Google Inc. | A system and method for dynamic, feature-based playlist generation |
US8258390B1 (en) | 2011-03-30 | 2012-09-04 | Google Inc. | System and method for dynamic, feature-based playlist generation |
US8319087B2 (en) | 2011-03-30 | 2012-11-27 | Google Inc. | System and method for dynamic, feature-based playlist generation |
US9070352B1 (en) * | 2011-10-25 | 2015-06-30 | Mixwolf LLC | System and method for mixing song data using measure groupings |
US20140121797A1 (en) * | 2012-07-16 | 2014-05-01 | SongFlutter, Inc. | System and Method for Combining a Song and Non-Song Musical Content |
US9459828B2 (en) | 2012-07-16 | 2016-10-04 | Brian K. ALES | Musically contextual audio advertisements |
US9355627B2 (en) * | 2012-07-16 | 2016-05-31 | Brian K. ALES | System and method for combining a song and non-song musical content |
US20140018947A1 (en) * | 2012-07-16 | 2014-01-16 | SongFlutter, Inc. | System and Method for Combining Two or More Songs in a Queue |
US20170090855A1 (en) * | 2015-03-20 | 2017-03-30 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method and device for processing audio files |
US10031714B2 (en) * | 2015-03-20 | 2018-07-24 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method and device for processing audio files |
US20190236209A1 (en) * | 2018-01-29 | 2019-08-01 | Gary Bencar | Artificial intelligence methodology to automatically generate interactive play along songs |
US10534811B2 (en) * | 2018-01-29 | 2020-01-14 | Beamz Ip, Llc | Artificial intelligence methodology to automatically generate interactive play along songs |
US20200142926A1 (en) * | 2018-01-29 | 2020-05-07 | Beamz Ip, Llc | Artificial intelligence methodology to automatically generate interactive play along songs |
US10607500B1 (en) * | 2019-05-21 | 2020-03-31 | International Business Machines Corporation | Providing background music tempo to accompany procedural instructions |
US11188605B2 (en) | 2019-07-31 | 2021-11-30 | Rovi Guides, Inc. | Systems and methods for recommending collaborative content |
US11874888B2 (en) | 2019-07-31 | 2024-01-16 | Rovi Guides, Inc. | Systems and methods for recommending collaborative content |
US12118984B2 (en) | 2020-11-11 | 2024-10-15 | Rovi Guides, Inc. | Systems and methods to resolve conflicts in conversations |
Also Published As
Publication number | Publication date |
---|---|
EP1162621A1 (en) | 2001-12-12 |
US6344607B2 (en) | 2002-02-05 |
JP2002049370A (en) | 2002-02-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6344607B2 (en) | Automatic compilation of songs | |
US6888999B2 (en) | Method of remixing digital information | |
US8046688B2 (en) | System for and method of adjusting tempo to match audio events to video events or other audio events in a recorded signal | |
US7208672B2 (en) | System and method for structuring and mixing audio tracks | |
CA2477697C (en) | Methods and apparatus for use in sound replacement with automatic synchronization to images | |
US7041892B2 (en) | Automatic generation of musical scratching effects | |
US8115090B2 (en) | Mashup data file, mashup apparatus, and content creation method | |
US8445768B1 (en) | Method and apparatus for audio mixing | |
Cliff | Hang the DJ: Automatic sequencing and seamless mixing of dance-music tracks | |
JP2004527786A (en) | Automatic melody tempo and phase detection method and harmony method, and interactive music playback device using them | |
US20050235811A1 (en) | Systems for and methods of selection, characterization and automated sequencing of media content | |
US8969700B2 (en) | Systems and methods of selection, characterization and automated sequencing of media content | |
JP2006517679A (en) | Audio playback apparatus, method, and computer program | |
KR20080051054A (en) | Distribution method of mashup data, mashup method, server device and mashup device of mashup data | |
CN103514867B (en) | Use the automatic Playing technology of audio waveform data | |
CN1838229B (en) | Playback apparatus and playback method | |
JP3821103B2 (en) | INFORMATION DISPLAY METHOD, INFORMATION DISPLAY DEVICE, AND RECORDING MEDIUM CONTAINING INFORMATION DISPLAY PROGRAM | |
Cliff | Patent: US 6,344,607: Automatic Compilation of Songs | |
US7511214B2 (en) | Automatic performance apparatus for reproducing music piece | |
Cliff | hpDJ: An automated DJ with floorshow feedback | |
GB2365616A (en) | Computer aided music mixing system | |
US20230343314A1 (en) | System for selection and playback of song versions from vinyl type control interfaces | |
Petelin et al. | Cool Edit Pro2 in Use | |
JPH10503851A (en) | Rearrangement of works of art | |
JP3460562B2 (en) | Input / editing device and storage medium |
Legal Events
Code | Title | Description
---|---|---
AS | Assignment | Owner name: HEWLETT-PACKARD COMPANY, COLORADO. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: CLIFF, DAVID T.; REEL/FRAME: 011666/0930. Effective date: 2000-12-21
FEPP | Fee payment procedure | Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
FEPP | Fee payment procedure | Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
FPAY | Fee payment | Year of fee payment: 4
FPAY | Fee payment | Year of fee payment: 8
AS | Assignment | Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: HEWLETT-PACKARD COMPANY; REEL/FRAME: 026945/0699. Effective date: 2003-01-31
REMI | Maintenance fee reminder mailed |
LAPS | Lapse for failure to pay maintenance fees |
STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362
FP | Lapsed due to failure to pay maintenance fee | Effective date: 2014-02-05