2/01/2012

Scientists decode how the brain hears words


US scientists said Wednesday they have found a way to decode how the brain hears words, in what researchers described as a major step toward one day helping people communicate after paralysis or stroke. 

[Image: positron emission tomography scans of the brains of healthy adults, showing low (L) and high (R) levels of beta-amyloid protein. Courtesy of the Center for Vital Longevity, The University of Texas at Dallas. Credit: AFP]
By placing electrodes on the brains of research subjects and then having them listen to conversations, scientists were able to analyze the sound frequencies registered and figure out which words they were hearing. 

"We were focused on how the brain processes the sounds of speech," researcher Brian Pasley of the Helen Wills Neuroscience Institute at the University of California Berkeley told AFP. 

"Most of the information in speech is between 1 and 8,000 hertz. Essentially the brain analyzes those different sound frequencies in somewhat separate locations." 

By tracking how and where the brain registered sounds in the temporal lobe -- the center of the auditory system -- scientists were able to map out the words and then recreate them as heard by the brain. 

"When a particular brain site is being activated, we know that roughly corresponds to some sound frequency that the patient is actually listening to," Pasley said. 

"So we could map that out to an extent that would allow us to use that brain activity to resynthesize the sound from the frequencies we were guessing." 

One word the researchers mapped was "structure." The high-frequency "s" sound showed up as a certain pattern in the brain, while the lower harmonics of the "u" sound appeared as a different pattern. 

"There is to some extent a correspondence between these features of sound and the brain activity that they cause," and putting together the physical registry in the brain helped rebuild the words, Pasley explained. 
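Pasley's description amounts to learning an inverse mapping from brain activity back to a time-frequency representation of the sound, then resynthesizing from that. The study's actual stimulus-reconstruction models are more sophisticated, but the core idea can be sketched with a toy linear decoder; all array sizes and the linear encoding assumption below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "spectrogram": 32 frequency bands x 500 time points.
# (The study decoded speech energy in roughly the 1-8,000 Hz range.)
n_bands, n_times, n_electrodes = 32, 500, 64
spectrogram = rng.random((n_bands, n_times))

# Hypothetical linear encoding: each electrode responds to a weighted
# mix of frequency bands, plus noise. This stands in for the recorded
# temporal-lobe activity; the real encoding is more complex.
encoding_weights = rng.normal(size=(n_electrodes, n_bands))
neural = encoding_weights @ spectrogram \
    + 0.1 * rng.normal(size=(n_electrodes, n_times))

# Decoding: fit a linear map from neural activity back to each
# frequency band by least squares, then rebuild the spectrogram,
# from which a waveform could in principle be resynthesized.
decoder, *_ = np.linalg.lstsq(neural.T, spectrogram.T, rcond=None)
reconstruction = (neural.T @ decoder).T

# How well does the reconstruction match the original sound energies?
r = np.corrcoef(spectrogram.ravel(), reconstruction.ravel())[0, 1]
print(f"reconstruction correlation: {r:.2f}")
```

With more electrodes than frequency bands and modest noise, the fitted decoder recovers the toy spectrogram almost exactly; the paper's reported reconstructions from real cortical recordings are necessarily noisier.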

The work builds on previous research in ferrets, in which scientists read to the animals and recorded their brain activity. 

They were able to decode which words the creatures heard even though the ferrets themselves didn't understand the words. 

The next step for researchers is to figure out just how similar the process of hearing sounds may be to the process of imagining words and sounds. 

That information could one day help scientists determine what people want to say when they cannot physically speak. 

Some previous research has suggested there may be similarities, but much more work needs to be done, Pasley said. 

"This is huge for patients who have damage to their speech mechanisms because of a stroke or Lou Gehrig's disease and can't speak," co-author Robert Knight, a UC Berkeley professor of psychology and neuroscience, said in a statement. 

"If you could eventually reconstruct imagined conversations from brain activity, thousands of people could benefit." 

Participating researchers came from the University of Maryland, UC Berkeley and Johns Hopkins University in Baltimore, Maryland. 

The study appears in the January 31 edition of the open-access journal PLoS Biology. 

Author: Kerry Sheridan | Source: AFP [February 01, 2012]
