01 February 2016

What do you think about when you look at this brick?

Researchers have learned to read human perception


The results of their study (Miller et al., "Spontaneous Decoding of the Timing and Content of Human Object Perception from Cortical Surface Recordings Reveals Complementary Information in the Event-Related Potential and Broadband Spectral Change") are published in the journal PLOS Computational Biology, and a summary is given in the University of Washington press release "Scientists decode brain signals nearly at speed of perception." The study was conducted on seven volunteers with epilepsy.

The study involved patients who already had electrodes implanted in the temporal lobes of their brains, placed by doctors to localize the sources of their epileptic seizures. The volunteers were shown photographs of houses and faces, ten centimeters wide, at a distance of one meter. Each image was displayed for 400 milliseconds, with 400-millisecond pauses in between.

The scientists filled the pauses between images with a uniformly gray screen. In total, the volunteers were shown 50 photographs of houses and 50 photographs of faces. To keep them focused, some photographs were occasionally shown upside down, and the volunteers were asked to briefly describe what they saw. Each volunteer went through three viewing sessions.
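To make the timing of the paradigm concrete, here is a minimal sketch of one session's stimulus schedule, assuming 50 house and 50 face images presented in random order; the 10 percent inversion rate is an assumption for illustration, since the article does not state how often images were flipped.

```python
import random

random.seed(42)

# One session: 50 houses + 50 faces in random order, each shown for
# 400 ms with a 400 ms gray-screen pause in between (per the article).
trials = [("house", i) for i in range(50)] + [("face", i) for i in range(50)]
random.shuffle(trials)

schedule, t_ms = [], 0
for category, img_id in trials:
    upside_down = random.random() < 0.1  # occasional inverted images (rate assumed)
    schedule.append({"t_on_ms": t_ms, "dur_ms": 400,
                     "category": category, "image": img_id,
                     "inverted": upside_down})
    t_ms += 400 + 400  # stimulus plus gray-screen pause

print(schedule[0])
print("session length:", t_ms / 1000, "s")  # 100 trials -> 80 s
```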

In total, the study lasted about a week. By its end, a special algorithm analyzing the data from the patients' electrodes had learned to determine with high accuracy what exactly a volunteer was seeing: a house, a face, or a gray background. In slightly more than 96 percent of cases, the algorithm correctly decoded the person's perception.

The algorithm determined what a person was seeing, on average, within 20 milliseconds of the onset of each new photograph. According to the researchers, this precision became possible because they thoroughly characterized the components of the "broadband" brain signal and learned to detect changes in the frequency spectrum of cortical potentials.
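As a rough illustration of the broadband idea, here is a toy sketch on synthetic single-channel data: high-frequency band power is extracted after stimulus onset and fed to a simple classifier. The sampling rate, 65-115 Hz band, window length, and use of linear discriminant analysis are all illustrative assumptions, not the exact pipeline of the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 1000  # sampling rate in Hz (assumed)

def broadband_power(ecog, fs=FS, lo=65.0, hi=115.0):
    """Band-limited power envelope as a stand-in for the 'broadband'
    spectral change; the 65-115 Hz band is an illustrative choice."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecog)
    return np.abs(hilbert(filtered)) ** 2  # instantaneous power

def epoch_features(signal, onsets, fs=FS, win_ms=400):
    """Mean broadband power in the 400 ms window after each onset."""
    power = broadband_power(signal, fs)
    win = int(win_ms * fs / 1000)
    return np.array([[power[t:t + win].mean()] for t in onsets])

# Synthetic demo: "face" trials carry extra high-frequency energy.
rng = np.random.default_rng(0)
n_trials = 100
signal = rng.normal(0, 1, n_trials * FS)
onsets = np.arange(n_trials) * FS
labels = rng.integers(0, 2, n_trials)  # 0 = house, 1 = face
for t, y in zip(onsets, labels):
    if y == 1:
        carrier = np.sin(2 * np.pi * 90 * np.arange(400) / FS)
        signal[t:t + 400] += 1.5 * carrier

X = epoch_features(signal, onsets)
clf = LinearDiscriminantAnalysis().fit(X[:70], labels[:70])
print("held-out accuracy:", clf.score(X[70:], labels[70:]))
```

On this synthetic data the two classes separate almost perfectly, which is the point of the broadband feature: it concentrates the stimulus-driven change into a single, easily classified quantity.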


The brain activity of a volunteer and the images displayed.
Illustration: Kai Miller / University of Washington

The US Army Research Laboratory is conducting similar research today. The military plans to use these developments to create a system that will help military analysts review images and video faster. The laboratory is currently developing an algorithm that simultaneously tracks the analyst's gaze direction and analyzes their neural responses to each picture.

The system consists of a computer and an electroencephalograph. The algorithm the scientists have already developed can identify images of interest to the analyst. In one of the experiments, a soldier was shown a set of pictures divided into five main categories: boats, pandas, strawberries, butterflies, and chandeliers. The soldier was asked only to silently count the images from whichever category interested him, while the pictures changed every second. Based on the results of the experiment, the algorithm determined that the soldier's category of interest was boats.
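The underlying idea is that covertly counted target images tend to evoke a larger P300-like deflection in the EEG, so averaging the evoked response per category and picking the largest one reveals the category of interest. Below is a toy sketch of that logic on synthetic epochs; the sampling rate, peak latency, window, and amplitudes are all invented for illustration and do not describe the laboratory's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
categories = ["boats", "pandas", "strawberries", "butterflies", "chandeliers"]
target = "boats"                 # the category the viewer silently counts
n_trials, n_samples = 250, 300   # one 1-second epoch at 300 Hz (assumed)

# Synthetic EEG epochs: counted images get a P300-like bump ~300 ms in.
p300 = np.exp(-0.5 * ((np.arange(n_samples) - 90) / 15.0) ** 2)
epochs, labels = [], []
for _ in range(n_trials):
    cat = categories[rng.integers(len(categories))]
    trial = rng.normal(0, 1.0, n_samples)
    if cat == target:
        trial += 3.0 * p300
    epochs.append(trial)
    labels.append(cat)
epochs = np.array(epochs)

# Average the evoked response per category and score the P300 window.
window = slice(75, 105)  # samples around the expected peak (illustrative)
scores = {c: epochs[[l == c for l in labels]].mean(axis=0)[window].mean()
          for c in categories}
print("inferred category of interest:", max(scores, key=scores.get))
```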

Today, military analysts have to process huge amounts of data, from photographs to hours-long recordings from unmanned aerial vehicles. When they spot important objects, analysts must mark them on the images and then sort the marked photos into categories. With the new system, analysts would only need to look carefully at the images; categorization, as well as the marking of important objects, would be handled by a computer based on their brain signals.

Portal "Eternal youth" http://vechnayamolodost.ru