Ear in the sky: Ego-noise reduction for auditory micro aerial vehicles

  • Lin Wang and Andrea Cavallaro. Ear in the sky: Ego-noise reduction for auditory micro aerial vehicles. In Proceedings of the 13th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS), Colorado Springs, USA, 2016. doi:10.1109/AVSS.2016.7738063

    We investigate the spectral and spatial characteristics of the ego-noise of a multirotor micro aerial vehicle (MAV) using audio signals captured with multiple onboard microphones and derive a noise model that grounds the feasibility of microphone-array techniques for noise reduction. The spectral analysis suggests that the ego-noise consists of narrowband harmonic noise and broadband noise, whose spectra vary dynamically with the motor rotation speed. The spatial analysis suggests that the ego-noise of a P-rotor MAV can be modeled as P directional noises plus one diffuse noise. Moreover, because of the fixed positions of the microphones and motors, we can assume that the acoustic mixing network of the ego-noise is stationary. We validate the proposed noise model and the stationary mixing assumption by applying blind source separation to multi-channel recordings from both a static and a moving MAV and quantify the signal-to-noise ratio improvement. Moreover, we make all the audio recordings publicly available.
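
    To make the model concrete, here is a rough, self-contained sketch (Python; not the authors' pipeline): it mimics the proposed noise model with P directional harmonic noises plus one diffuse noise, mixes them through a fixed (stationary) mixing matrix, separates the channels with FastICA as a stand-in for the blind source separation applied in the paper, and reports the resulting SNR improvement. The array size, rotor count, signal parameters and the instantaneous-mixing simplification are all assumptions made for the demo; the real ego-noise mixing network is convolutive.

    # Illustrative sketch only: instantaneous mixture of P harmonic rotor noises plus
    # diffuse noise, separated with FastICA; all parameters below are assumed for the demo.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    fs, dur = 16000, 3.0                      # sample rate (Hz) and duration (s), assumed
    t = np.arange(int(fs * dur)) / fs
    n_mics, P = 8, 4                          # microphones and rotors, assumed for the demo

    # Target source: an amplitude-modulated tone standing in for the sound of interest.
    target = np.sin(2 * np.pi * 440 * t) * (0.6 + 0.4 * np.sin(2 * np.pi * 3 * t))

    # Ego-noise model: P narrowband harmonic (directional) noises whose fundamental
    # drifts with the rotor speed, plus one broadband diffuse noise added later.
    rotor_noises = []
    for p in range(P):
        f0 = 180 + 20 * p + 10 * np.sin(2 * np.pi * 0.5 * t)   # drifting fundamental (Hz)
        phase = 2 * np.pi * np.cumsum(f0) / fs
        rotor_noises.append(sum(np.sin(k * phase) / k for k in range(1, 6)))  # 5 harmonics

    # Stationary mixing: fixed gains, reflecting the fixed motor/microphone geometry
    # (a crude instantaneous stand-in for the real convolutive mixing network).
    A = rng.uniform(0.5, 1.5, size=(n_mics, P + 1))             # columns: rotors, then target
    sources = np.stack(rotor_noises + [target])                 # (P + 1, n_samples)
    diffuse = 0.05 * rng.standard_normal((n_mics, len(t)))      # diffuse / sensor noise
    mixture = A @ sources + diffuse                             # (n_mics, n_samples)

    def snr_db(est, ref):
        """SNR of an estimate after projecting it onto the reference (scale/sign invariant)."""
        s = (est @ ref) / (ref @ ref) * ref
        return 10 * np.log10(np.sum(s ** 2) / np.sum((est - s) ** 2))

    # Input SNR at microphone 0: target image versus everything else in the mixture.
    target_img = A[0, -1] * target
    in_snr = 10 * np.log10(np.sum(target_img ** 2) / np.sum((mixture[0] - target_img) ** 2))

    # Blind source separation; keep the component that best matches the target.
    ica = FastICA(n_components=P + 1, random_state=0, max_iter=1000)
    estimates = ica.fit_transform(mixture.T).T                  # (P + 1, n_samples)
    best = max(estimates, key=lambda y: abs(np.corrcoef(y, target)[0, 1]))
    print(f"input SNR  : {in_snr:5.1f} dB")
    print(f"output SNR : {snr_db(best, target):5.1f} dB")

    Because the mixing gains are fixed in this toy setup, a demixing matrix learned on one segment stays valid on later segments; this is, loosely, what the stationary-mixing assumption validated in the paper provides.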

    @InProceedings{2016-08-WANG,
    title = {{Ear in the sky: Ego-noise reduction for auditory micro aerial vehicles}},
    author = {Lin Wang and Andrea Cavallaro},
    booktitle = {{Proceedings of the 13th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS)}},
    address = {Colorado Springs, USA},
    date = {2016-08-23/2016-08-26},
    doi = {10.1109/AVSS.2016.7738063},
    year = {2016},
    abstract = {We investigate the spectral and spatial characteristics of the ego-noise of a multirotor micro aerial vehicle (MAV) using audio signals captured with multiple onboard microphones and derive a noise model that grounds the feasibility of microphone-array techniques for noise reduction. The spectral analysis suggests that the ego-noise consists of narrowband harmonic noise and broadband noise, whose spectra vary dynamically with the motor rotation speed. The spatial analysis suggests that the ego-noise of a P-rotor MAV can be modeled as P directional noises plus one diffuse noise. Moreover, because of the fixed positions of the microphones and motors, we can assume that the acoustic mixing network of the ego-noise is stationary. We validate the proposed noise model and the stationary mixing assumption by applying blind source separation to multi-channel recordings from both a static and a moving MAV and quantify the signal-to-noise ratio improvement. Moreover, we make all the audio recordings publicly available.}
    }
