Identifying 'Cover Songs' with Chroma Features and Dynamic Programming Beat Tracking
Large music collections, ranging from thousands to millions of tracks, are unsuited to manual searching, motivating the development of automatic search methods. When different musicians perform the same underlying song or piece, the recordings are known as 'cover' versions. We describe a system that attempts to identify this relationship between music audio recordings. To overcome variability in tempo, we use beat tracking to describe each piece with one feature vector per beat. To deal with variation in instrumentation, we use 12-dimensional 'chroma' feature vectors that collect the spectral energy supporting each semitone of the octave. To compare two recordings, we simply cross-correlate their entire beat-by-chroma representations and look for sharp peaks indicating good local alignment between the pieces. Evaluation on several databases indicates good performance, including the best performance in an independent international evaluation, where the system achieved a mean reciprocal rank of 0.49 for true cover versions among the top-10 returns.
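The matching step described above can be sketched in a few lines: slide one beat-synchronous chroma matrix against the other over all beat lags and all 12 chroma rotations (to absorb transposition between versions), and take the sharpest correlation peak as the match score. This is a minimal illustrative sketch, not the authors' exact implementation; the function name and the absence of normalization and high-pass filtering of the correlation are assumptions.

```python
import numpy as np

def cross_correlate_chroma(a, b):
    """Score the similarity of two beat-synchronous chroma matrices.

    a, b : arrays of shape (12, n_beats), one 12-dimensional chroma
    vector per beat. Returns the height of the sharpest correlation
    peak over all beat lags and all 12 transpositions.

    Hypothetical sketch of the approach in the abstract; the real
    system also normalizes and filters the correlation.
    """
    best = -np.inf
    for rot in range(12):                  # try every chroma rotation (transposition)
        b_rot = np.roll(b, rot, axis=0)
        # sum per-semitone cross-correlations -> correlation vs. beat lag
        xc = sum(np.correlate(a[k], b_rot[k], mode="full") for k in range(12))
        best = max(best, xc.max())
    return best

# Usage: a track matched against a transposed copy of itself scores
# exactly as well as against itself, since all rotations are tried.
rng = np.random.default_rng(0)
track = rng.random((12, 64))
same = cross_correlate_chroma(track, track)
transposed = cross_correlate_chroma(track, np.roll(track, 3, axis=0))
```

Searching all 12 rotations is what makes the score invariant to the key change between an original and its cover, while beat-synchronous features make it invariant to tempo.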
- EllisP07-coversongs.pdf (application/pdf, 563 KB)
Also Published In
- 2007 IEEE International Conference on Acoustics, Speech, and Signal Processing: Proceedings: April 16-20, 2007, Hawaii Convention Center, Honolulu, Hawaii, U.S.A.
More About This Work
- Academic Units
- Electrical Engineering
- Published Here
- June 27, 2012