Over 30 papers were presented on topics ranging from sound zones through higher order ambisonics, mode matching and psychoacoustics, to sound field control theories, microphone arrays and array transducers. Delegates had ample opportunity to experience fascinating demonstrations of sound field control technology, including personal sound zones, 3D audio capture, and adaptive object-based stereo reproduction, among other memorable listening opportunities.
Bruce Drinkwater’s invited talk on ultrasonic levitation proved a remarkable tour of a new technology that enables small objects to be lifted using ultrasonic “tractor beams,” something that a few years ago would have seemed like science fiction. He brought along a small demonstrator of the technology that could hold little balls in midair, which held conference delegates riveted during the lunch session on day two.
Keynote speaker Philip Nelson is Professor of Acoustics in the Institute of Sound and Vibration Research at the University of Southampton, and a leader of research council activity in the UK. Introduced by Fazi as a "scientific superstar," Nelson opened the first day with a tour of the history of sound field control that provided an excellent scene-setter for the event. His talk illuminated the connections between the active control of sound and contemporary approaches to sound reproduction, within the consistent framework provided by multichannel digital signal processing and the physical behavior of linearly superposed sound fields.
The day two keynote speaker was Gary Elko, president of mh acoustics. With Jens Meyer, Elko developed the Eigenmike, a spherical microphone array that decomposes the sound field into a compact set of orthogonal spherical harmonic signals, and which is now gaining commercial interest in the field of immersive audio. In his keynote address, Elko offered delegates a comprehensive review of differential microphone array technology.
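The idea of representing a sound field as a set of spherical harmonic signals can be illustrated with a minimal sketch. The snippet below is not mh acoustics' processing, just an assumed first-order ambisonic (B-format) encoder: it projects a mono signal arriving from a given direction onto the four lowest-order spherical harmonic components (the function name and normalization convention are illustrative).

```python
import numpy as np

def encode_b_format(signal, azimuth, elevation):
    """Encode a mono signal arriving from (azimuth, elevation), in
    radians, into first-order ambisonic channels W, X, Y, Z.

    W is the omnidirectional (zeroth-order) component; X, Y, Z are
    the three first-order components, i.e. figure-of-eight patterns
    aligned with the Cartesian axes. Hypothetical sketch, not the
    Eigenmike's actual beamforming chain.
    """
    signal = np.asarray(signal, dtype=float)
    w = signal                                        # order 0
    x = signal * np.cos(elevation) * np.cos(azimuth)  # order 1, front-back
    y = signal * np.cos(elevation) * np.sin(azimuth)  # order 1, left-right
    z = signal * np.sin(elevation)                    # order 1, up-down
    return np.stack([w, x, y, z])

# A source directly in front (azimuth 0, elevation 0) excites only W and X.
tone = np.sin(2 * np.pi * 440 * np.arange(8) / 48000)
b = encode_b_format(tone, azimuth=0.0, elevation=0.0)
```

Higher-order arrays such as the Eigenmike extend this set with additional, higher-order harmonic components, giving a sharper spatial resolution from the same compact representation.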
Steven van de Par reviewed the effects of reverberation on auditory perception, concluding with a description of a system for reproducing the most perceptually important parameters of direct and reflected sound in a room that already has its own reverberation.
Themes of the conference, arising out of workshop discussions and informal interactions, coalesced around the different forms and meanings of the broad term "sound field control," and how it includes, but is broader than, spatial audio. Delegates were particularly interested in how to design systems that engineer an intended user experience, one that possibly follows people around as they move, adapts to their circumstances, and serves multiple listeners at once. There was talk of navigable sound fields that can only be experienced by exploring them, which connects strongly to the growing importance of virtual reality.
All AES members can download the Conference proceedings at www.aes.org/publications/conferences/?confNum=52.