
Deep attention based music genre classification

Indexed by:Journal Papers

Date of Publication:2020-01-08

Journal:NEUROCOMPUTING

Included Journals:EI, SCIE

Volume:372

Page Number:84-91

ISSN No.:0925-2312

Key Words:Music genre classification; Deep neural networks; Serial attention; Parallelized attention

Abstract:As an important component of music information retrieval, music genre classification has attracted great attention in recent years. Benefitting from the outstanding performance of deep neural networks in computer vision, some researchers apply CNNs to music genre classification tasks, taking audio spectrograms, which share similarities with RGB images, as input. These methods rest on a latent assumption that spectra at different temporal steps are equally important. However, this contradicts the theory of the processing bottleneck in psychology, as well as our observations of audio spectrograms. By accounting for the differences among spectra, we propose a new model that incorporates an attention mechanism into a bidirectional recurrent neural network. Furthermore, two attention-based models (serial attention and parallelized attention) are implemented in this paper. Compared with serial attention, parallelized attention is more flexible and achieves better results in our experiments. In particular, the CNN-based parallelized attention models taking STFT spectrograms as input outperform previous work. (C) 2019 Elsevier B.V. All rights reserved.
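The core idea the abstract describes, weighting hidden states from different temporal steps instead of treating them equally, can be sketched as attention-weighted pooling over the outputs of a bidirectional RNN. The following is a minimal NumPy illustration, not the paper's implementation; the scoring vector `w` and all shapes are assumptions for demonstration.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(H, w):
    """Attention-weighted pooling over time steps.

    H: (T, d) hidden states from a bidirectional RNN
       (forward and backward directions concatenated).
    w: (d,) learnable scoring vector (hypothetical parameter).
    Returns (context, alphas): context is the (d,) weighted sum of
    hidden states; alphas are the (T,) attention weights, summing to 1.
    """
    scores = H @ w            # one relevance score per time step
    alphas = softmax(scores)  # normalize into attention weights
    context = alphas @ H      # weighted sum over time
    return context, alphas

# Toy example: 10 time steps of 8-dimensional hidden states,
# standing in for spectrogram-derived RNN outputs.
rng = np.random.default_rng(0)
H = rng.standard_normal((10, 8))
w = rng.standard_normal(8)
context, alphas = attention_pool(H, w)
```

The resulting `context` vector would then feed a genre classifier; a serial variant applies such pooling after the RNN in sequence, while a parallelized variant can combine several attention branches side by side.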
