Rendering localized spatial audio in a virtual auditory space
Title | Rendering localized spatial audio in a virtual auditory space |
Publication Type | Journal Article |
Year of Publication | 2004 |
Authors | Zotkin DN, Duraiswami R, Davis LS |
Journal | IEEE Transactions on Multimedia |
Volume | 6 |
Issue | 4 |
Pagination | 553-564 |
Date Published | 2004/08 |
ISSN | 1520-9210 |
Keywords | 3-D audio processing, Audio databases, audio signal processing, audio user interfaces, augmented reality, data sonification, Digital signal processing, head related transfer functions, head-related transfer function, Interpolation, Layout, perceptual user interfaces, Real time systems, Rendering (computer graphics), Scattering, spatial audio, Transfer functions, User interfaces, virtual audio scene rendering, virtual auditory spaces, virtual environments, Virtual reality, virtual reality environments |
Abstract | High-quality virtual audio scene rendering is required for emerging virtual and augmented reality applications, perceptual user interfaces, and sonification of data. We describe algorithms for creating virtual auditory spaces by rendering the cues that arise from anatomical scattering, environmental scattering, and dynamical effects. We use a novel way of personalizing the head-related transfer functions (HRTFs) from a database, based on anatomical measurements. We present details of the algorithms for HRTF interpolation, room impulse response creation, HRTF selection from a database, and audio scene presentation. Our system runs in real time on an office PC without specialized DSP hardware. |
DOI | 10.1109/TMM.2004.827516 |
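The HRTF personalization step described in the abstract, selecting the best-matching HRTF set from a measured database using a listener's anatomical measurements, can be illustrated with a small sketch. The snippet below is a minimal, hypothetical implementation assuming a nearest-neighbor match over variance-normalized anthropometric features; the database layout, feature choice, and distance measure are illustrative assumptions, not the paper's actual method or data format.

```python
import numpy as np

# Hypothetical anthropometric database: each entry holds a few ear/head
# measurements (e.g., cavum concha height/width, pinna height, head width)
# for one subject whose measured HRTF set is stored alongside. Subject IDs,
# features, and values are illustrative only.
database = {
    "subject_003": np.array([19.0, 15.8, 64.1, 152.0]),
    "subject_010": np.array([20.4, 17.2, 66.5, 148.3]),
    "subject_021": np.array([18.1, 16.0, 61.9, 155.7]),
}

def select_hrtf(subject_measurements: np.ndarray, db: dict) -> str:
    """Return the database subject whose anatomical measurements are closest
    to the new listener's, using a variance-normalized Euclidean distance."""
    keys = list(db)
    features = np.stack([db[k] for k in keys])
    # Normalize each measurement by its spread across the database so that
    # large-valued features (e.g., head width in mm) do not dominate.
    scale = features.std(axis=0)
    dists = np.linalg.norm((features - subject_measurements) / scale, axis=1)
    return keys[int(np.argmin(dists))]

listener = np.array([19.5, 16.1, 63.0, 150.0])  # measurements for a new listener
print(select_hrtf(listener, database))  # -> ID of the best-matching HRTF set
```

In the paper's setting, each database entry would carry a full set of measured HRTFs; picking the nearest anthropometric neighbor then yields the HRTF set used for rendering.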