Thursday, March 06, 2008

Would you trust your government with this technology?

Is mind reading next?

Researchers using brain scans try to predict a person's thoughts

By Jeremy Manier

A research team has managed to crack the mind's internal code and deduce what a person is looking at based solely on brain activity, a feat that could pave the way for what the scientists described as "a brain-reading device."

The ability to read minds reliably is still beyond the grasp of science, but the study published Wednesday by neuroscientists at the University of California at Berkeley builds on a growing body of work on how to hack into the brain's inner language.

The Berkeley team, which published its study online Wednesday in the journal Nature, used a brain scan to find patterns of activity when people looked at black-and-white images of items such as bales of hay, a starfish or a sports car. When the people then looked at different photos, a software program drew on activity in the brain's vision center to guess which images they saw with up to 92 percent accuracy.

Other researchers have stolen glances at people's secret intentions and memories, and the new findings suggest that brain scanners could even reveal the elusive content of dreams.

Such abilities could have positive uses, such as aiding communication for people who are paralyzed or disabled, but other applications might be questionable, such as extracting information from unwilling subjects. Experts said the work's ethical implications should be examined now, while the field is still young.

The deepest problem facing scientists working to understand the brain is how its billions of neurons work together to make our inner life of sensations, ideas and recollections.

The Berkeley group, led by professor Jack Gallant and graduate student Kendrick Kay, did not solve that enduring puzzle. But by using brute computing force, they showed how the raw noise of neurons firing could be linked with specific visual images.

"The finding is very important," said John-Dylan Haynes, a professor at the Bernstein Center for Computational Neuroscience in Berlin. "This is a very sophisticated way of getting around a problem that seems almost impossible to solve."

The researchers fine-tuned their computer model by showing 1,750 images to each of the subjects, who were actually Kay and study co-author Thomas Naselaris.

Kay said the researchers used their own brain scans because they knew they would be patient subjects, adding that there was no way they could have manipulated the outcome.

When asked via e-mail how it felt to be in a brain-scan machine that might read his thoughts, Kay replied: "To me it's just data—I often forget that it is actually my brain activity!"

In the study's second stage, the subjects looked at 120 new photos while in the brain-scan machine. The computer program also analyzed the new photos and predicted how a human brain would respond. The program then used its predictions to match actual brain scans with the photos the people were looking at. That technique was 92 percent accurate for one subject and 72 percent correct for the other.
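
For readers who want a concrete picture of that matching step, the sketch below shows a toy version of the idea in Python: the model's predicted pattern of voxel activity for each candidate photo is compared with the observed scan, and the photo whose prediction matches best is chosen. This is only an illustrative sketch, with assumed array names, sizes and a simple correlation rule; it is not the authors' actual model, which was fit to real fMRI data.

import numpy as np

def identify_photo(observed_scan, predicted_scans):
    # Toy identification step: compare one observed pattern of voxel
    # activity against the predicted pattern for every candidate photo
    # (here via Pearson correlation) and return the best match's index.
    scores = [np.corrcoef(observed_scan, prediction)[0, 1]
              for prediction in predicted_scans]
    return int(np.argmax(scores))

# Hypothetical example: 120 candidate photos, 500 voxels each.
rng = np.random.default_rng(0)
predicted = rng.standard_normal((120, 500))    # model's predicted responses
true_photo = 42
observed = predicted[true_photo] + 0.5 * rng.standard_normal(500)  # noisy "scan"
print(identify_photo(observed, predicted))     # should print 42

In the study itself, accuracy was highest when the observed pattern was an average over several presentations of the same photo, which is why the after-the-fact analysis described below outperformed real-time guesses.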

The scientists' model still cannot reconstruct from scratch what a person is seeing or imagining. For now, the computer can only work from a well-defined set of images to identify which one a person is looking at. It's an impressive advance, but experts said the technique's uses would be limited in the short term.

In addition, the type of brain scan the group used is far too sluggish to capture a person's response to fast-moving images, Kay said. But in theory, advances in computer programs and brain scans could allow scientists to record the narrative of dreams or let people communicate through pure imagery.

"Perhaps it will be possible for you to visually imagine an object, and then use a brain decoder to translate your underlying brain activity into an actual image that other people can see," Kay wrote via e-mail.

One shortcoming of the Berkeley study is that the accuracy plummeted when the computer program tried to guess in real time which photo a subject was looking at, said Julius Dewald, a neurophysiology expert at Northwestern University.

The program made its best guesses when it could analyze everything after the fact, using an average of how the subject's brain looked during multiple views of the same photo.

The study's authors estimate it could be 30 to 50 years before techniques for decoding brain activity are advanced enough to raise urgent ethical issues.

But Haynes' group published a study last year in which brain activity revealed whether a person had chosen to add or subtract two numbers. That seems like a modest feat, but he said it suggests that future methods could delve further into a person's private plans or aspirations.

"People should know that there is an ethical dilemma here, and scientists are already thinking about it," Haynes said.

Original article posted here.
