Frequency-dependent integration of auditory and vestibular cues for self-motion perception

Corey S. Shayman, Robert (Bob) Peterka, Frederick J. Gallun, Yonghee Oh, Nai Yuan N. Chang, Timothy Hullar

Research output: Contribution to journal › Article › peer-review


Abstract

Recent evidence has shown that auditory information may be used to improve postural stability, spatial orientation, navigation, and gait, suggesting an auditory component of self-motion perception. To determine how auditory and other sensory cues integrate for self-motion perception, we measured motion perception during yaw rotations of the body and the auditory environment. Psychophysical thresholds in humans were measured over a range of frequencies (0.1-1.0 Hz) during self-rotation without spatial auditory stimuli, rotation of a sound source around a stationary listener, and self-rotation in the presence of an earth-fixed sound source. Unisensory perceptual thresholds and the combined multisensory thresholds were found to be frequency dependent. Auditory thresholds were better at lower frequencies, and vestibular thresholds were better at higher frequencies. Expressed in terms of peak angular velocity, multisensory vestibular and auditory thresholds ranged from 0.39°/s at 0.1 Hz to 0.95°/s at 1.0 Hz and were significantly better over low frequencies than either the auditory-only (0.54°/s to 2.42°/s at 0.1 and 1.0 Hz, respectively) or vestibular-only (2.00°/s to 0.75°/s at 0.1 and 1.0 Hz, respectively) unisensory conditions. Monaurally presented auditory cues were less effective than binaural cues in lowering multisensory thresholds. Frequency-independent thresholds were derived, assuming that vestibular thresholds depended on a weighted combination of velocity and acceleration cues, whereas auditory thresholds depended on displacement and velocity cues. These results elucidate fundamental mechanisms for the contribution of audition to balance and help explain previous findings indicating its significance in tasks requiring self-orientation.

NEW & NOTEWORTHY: Auditory information can be integrated with visual, proprioceptive, and vestibular signals to improve balance, orientation, and gait, but this process is poorly understood. Here, we show that auditory cues significantly improve sensitivity to self-motion perception below 0.5 Hz, whereas vestibular cues contribute more at higher frequencies. Motion thresholds are determined by a weighted combination of displacement, velocity, and acceleration information. These findings may help in understanding and treating imbalance, particularly in people with sensory deficits.
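The frequency trends described above can be illustrated with a small numerical sketch. This is not the authors' fitted model: the channel coefficients below are illustrative assumptions, and the combination rule shown is the standard maximum-likelihood (inverse-variance) formula from the cue-integration literature, used here only to show how a multisensory threshold can fall below both unisensory thresholds. For sinusoidal yaw at frequency f with peak velocity v, peak displacement is v/(2πf) and peak acceleration is v·2πf, which is what makes a displacement-plus-velocity channel best at low frequencies and a velocity-plus-acceleration channel best at high frequencies.

```python
import math

def vestibular_threshold(f, k_vel, k_acc):
    """Velocity + acceleration channel: if detection occurs when
    k_vel*v + k_acc*a = 1 with a = v*2*pi*f, the peak-velocity
    threshold falls as frequency rises. Coefficients are assumed."""
    return 1.0 / (k_vel + k_acc * 2 * math.pi * f)

def auditory_threshold(f, k_disp, k_vel):
    """Displacement + velocity channel: with d = v/(2*pi*f), the
    peak-velocity threshold rises as frequency rises."""
    return 1.0 / (k_vel + k_disp / (2 * math.pi * f))

def mle_combined(t_a, t_v):
    """Textbook maximum-likelihood (inverse-variance) cue combination:
    the combined threshold is below either unisensory threshold."""
    return math.sqrt((t_a**2 * t_v**2) / (t_a**2 + t_v**2))

if __name__ == "__main__":
    for f in (0.1, 0.5, 1.0):  # frequency range tested in the study
        t_a = auditory_threshold(f, k_disp=1.0, k_vel=0.35)
        t_v = vestibular_threshold(f, k_vel=0.3, k_acc=0.15)
        print(f"{f:.1f} Hz: auditory {t_a:.2f}, vestibular {t_v:.2f}, "
              f"combined {mle_combined(t_a, t_v):.2f} deg/s")
```

With the reported unisensory values at 0.1 Hz (auditory 0.54°/s, vestibular 2.00°/s), the inverse-variance formula predicts a combined threshold of about 0.52°/s; the observed multisensory threshold of 0.39°/s is lower still, so this sketch should be read only as the qualitative baseline against which such results are usually compared.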

Original language: English (US)
Pages (from-to): 936-944
Number of pages: 9
Journal: Journal of Neurophysiology
Volume: 123
Issue number: 3
DOIs
State: Published - Mar 1 2020

Keywords

  • auditory motion
  • motion perception
  • multisensory integration
  • perceptual threshold
  • vestibular

ASJC Scopus subject areas

  • Neuroscience (all)
  • Physiology
