Anthony Collamati and Timothy Richardson
New technologies like Dolby Atmos and DTS:X are transforming the composition and playback of soundtracks. Instead of confining sounds to speaker channels, these surround-sound platforms create “audio objects” and situate them freely in 3D space. Just as these playback technologies have re-engineered concepts of listening space, 360-degree cameras have redefined recording spaces, making mobile the formerly fixed viewing angle of an audio event. With these fuller, more dynamic reproductions of spatial audio, one might wonder if anything still eludes capture. In an act Mark Katz calls a “performative quotation,” digital technologies can re-create the “details of timbre and timing that evoke and identify a unique sound event” (Capturing Sound 149). That is, these technologies e/invoke what he calls a location’s “sonic aura.” In the primary sense, this aura is a “reverberation that imparts a sense of space.” However, it can also point to “the slight but constant ambient noise—that is a byproduct of imperfect recording fidelity.”

To test this theory and interrogate fidelities in new VR technologies, we record three rooms. Each is filmed with a 360-degree camera and arranged to leverage the unique features of spatial audio. The goal is to make legible both the primary sonic aura of a location (a room’s ambient tone) and its secondary one (the artifacts of its recording). To do so, ambient noises and sampling anomalies are remixed to respond to an interactive, 360-degree interface (phones and tablets work well). Headphones are essential; please wear them.