----- Forwarded message from physnews@aip.org -----
From: physnews@aip.org
Reply-to: physnews@aip.org
Subject: Physics News Update 848
To: chrism@MCCORMICK.CX
RECREATING THE WORLD INSIDE YOUR HEAD. The first use of individualized virtual-reality sounds in a functional MRI (fMRI) environment to reproduce a naturalistic acoustic experience for studying brain function might provide a better explanation of the *cocktail party* effect: the process by which we try to make sense of a conversation at a crowded party even as several other potentially distracting conversations proceed at the same time. New fMRI brain scans are helping researchers understand how the brain segregates objects in space when a person hears, but does not necessarily see, multiple sources of sound. At Kourosh Saberi's (saberi@uci.edu) lab at the University of California, Irvine, human subjects are exposed to several sounds; sometimes the sounds come from different locations near the subject, and sometimes several sounds come from a single location. Looking at fMRI scans of enhanced blood flow, which provide 2-mm-resolution maps of brain activity, the U.C. Irvine scientists report two main results. First, no specific brain region accounts exclusively for identifying auditory motion, in contrast to the visual cortex, which does have specific motion-sensing regions. And second, spatial auditory information seems to be processed in a neural region, called the Planum Temporale, in a way that can facilitate the segregation of multiple sound sources. (ASA meeting talk 2aPP8, http://www.acoustics.org/press/)
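
For readers wondering how spatialized "virtual-reality" sound can place sources at different locations at all, the short Python sketch below is a deliberately simplified illustration, not the UC Irvine lab's method (which relied on individualized measurements). It assigns each tone an interaural time difference from a spherical-head approximation plus a crude level difference, then mixes the tones into one stereo scene; the head radius, tone frequencies, gains, and azimuths are all illustrative assumptions.

# Minimal sketch (NOT the actual experimental pipeline): approximate
# spatialization of several sources via interaural time and level differences.
import numpy as np

FS = 44100              # sample rate in Hz
HEAD_RADIUS = 0.0875    # assumed head radius in metres
SPEED_OF_SOUND = 343.0  # metres per second

def itd_seconds(azimuth_deg):
    # Woodworth approximation of the interaural time difference.
    theta = np.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + np.sin(theta))

def spatialize(mono, azimuth_deg):
    # Return an (n, 2) stereo array with the mono signal placed at azimuth_deg.
    # Positive azimuth = source on the right: the left ear hears it a little
    # later and a little quieter.
    delay = int(round(abs(itd_seconds(azimuth_deg)) * FS))
    delayed = np.concatenate([np.zeros(delay), mono])[:len(mono)]
    gain_near, gain_far = 1.0, 0.7   # crude interaural level difference
    if azimuth_deg >= 0:
        left, right = gain_far * delayed, gain_near * mono
    else:
        left, right = gain_near * mono, gain_far * delayed
    return np.stack([left, right], axis=1)

if __name__ == "__main__":
    t = np.arange(int(0.5 * FS)) / FS
    sources = [np.sin(2 * np.pi * f * t) for f in (440.0, 550.0, 660.0)]
    # "Different locations" condition: three distinct azimuths.
    scene_separated = sum(spatialize(s, az) for s, az in zip(sources, (-60, 0, 60)))
    # "Single location" condition: all sources from straight ahead.
    scene_colocated = sum(spatialize(s, 0) for s in sources)
    print(scene_separated.shape, scene_colocated.shape)

Summing the individually spatialized sources mirrors the two stimulus conditions described above: the same mixture is presented either with its components spread across space or collapsed onto a single apparent location.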
----- End forwarded message -----