Robust feature representation for classification of bird song syllables

Maria Sandsten, Mareile Große Ruse, Martin Jönsson

Research output: Contribution to journal › Article › peer-review


Abstract

A novel feature set for low-dimensional signal representation, designed for classification or clustering of non-stationary signals with complex variation in time and frequency, is presented. The feature representation of a signal is given by the first left and right singular vectors of its ambiguity spectrum matrix. If the ambiguity matrix is of low rank, most signal information in the time direction is captured by the first right singular vector, while the signal's key frequency information is encoded by the first left singular vector. The resemblance of two signals is investigated by means of a suitable similarity assessment of the signals' respective singular vector pairs. Application of multitapers for the calculation of the ambiguity spectrum gives increased robustness to jitter and background noise and a consequent improvement in performance, as compared to estimation based on the ordinary single Hanning window spectrogram. The suggested feature-based signal compression is applied to a syllable-based analysis of a song from the bird species Great Reed Warbler and evaluated by comparison to manual auditory and/or visual signal classification. The results show that the proposed approach outperforms well-known approaches based on mel-frequency cepstral coefficients and spectrogram cross-correlation.
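The core idea of the feature extraction can be illustrated with a short sketch. The snippet below is not the authors' implementation; it assumes a precomputed time-frequency (ambiguity-type) matrix and shows only the SVD-based feature pair and a simple similarity score built from the absolute cosine similarities of the two singular vectors (the sign of a singular vector is arbitrary, hence the absolute value). The function names `svd_features` and `similarity` are illustrative choices.

```python
import numpy as np

def svd_features(ambiguity_matrix):
    """Return the first left (frequency) and first right (time) singular
    vectors of a time-frequency matrix as a low-dimensional feature pair."""
    U, s, Vt = np.linalg.svd(ambiguity_matrix, full_matrices=False)
    return U[:, 0], Vt[0, :]

def similarity(feat_a, feat_b):
    """Score resemblance of two signals by combining the absolute cosine
    similarities of their left and right singular vectors."""
    ua, va = feat_a
    ub, vb = feat_b
    cos_u = abs(np.dot(ua, ub)) / (np.linalg.norm(ua) * np.linalg.norm(ub))
    cos_v = abs(np.dot(va, vb)) / (np.linalg.norm(va) * np.linalg.norm(vb))
    return cos_u * cos_v
```

For a rank-1 matrix, the first singular vector pair captures the matrix exactly, so a signal compared with itself scores 1, while signals with disjoint time-frequency support score near 0.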

Original language: English
Article number: 68
Journal: EURASIP Journal on Advances in Signal Processing
Volume: 2016
Issue number: 1
DOIs
Publication status: Published - 2016 Dec 1

Subject classification (UKÄ)

  • Signal Processing

Keywords

  • Ambiguity spectrum
  • Bird song
  • Multitaper
  • Singular value decomposition
  • Time-frequency analysis
