Artificial Intelligence Generates Personally Attractive Images by Reading Brain Waves

Varying conceptions of what “beauty” is have long made it a challenging topic, but one thing is certain: what we find attractive in other people’s faces is stored in our brains, with social and mental factors playing an unconscious role in our preferences.

Now, scientists at the University of Helsinki and the University of Copenhagen have used electroencephalography (EEG) measurements to teach an artificial intelligence our subjective notions of what makes faces attractive, according to findings published in IEEE Transactions on Affective Computing.

The experiment, conducted with 30 volunteers, “worked a bit like the dating app Tinder,” explained senior researcher Michiel Spapé from the Department of Psychology and Logopedics, University of Helsinki.

The scientists used a generative adversarial network (GAN) to create hundreds of artificial portraits. The images were shown one at a time to the volunteers, who focused on the faces they found attractive while wearing elastic caps fitted with electrodes to measure their brain activity.
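The core of this step is sampling random latent vectors and decoding each one into a face image. A real face GAN is a large pretrained convolutional network; the sketch below uses a toy linear "generator" purely to illustrate the latent-to-image pipeline. The latent size of 512 and the image size of 64×64 are assumptions for illustration, not details from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 512   # common latent size for face GANs; an assumption here
N_PORTRAITS = 200  # "hundreds" of portraits, per the article

def generate_portrait(z, weights):
    """Toy stand-in for a GAN generator: maps a latent vector to an 'image'.

    A real generator is a deep convolutional network; this linear map to a
    flat 64x64 grayscale array only illustrates the data flow.
    """
    return np.tanh(weights @ z).reshape(64, 64)

# Random "generator weights" and one latent vector per portrait to show.
weights = rng.standard_normal((64 * 64, LATENT_DIM)) / np.sqrt(LATENT_DIM)
latents = rng.standard_normal((N_PORTRAITS, LATENT_DIM))
portraits = np.array([generate_portrait(z, weights) for z in latents])
```

Keeping the latent vector of each shown portrait is what later lets the system work backwards from "this face was found attractive" to "faces like this one."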

When a participant found a face attractive, they simply had to look at it instead of swiping right.

The measured neural activity was then assessed by the GAN to interpret the brain responses in terms of how attractive the viewer considered each face. “A brain-computer interface such as this is able to interpret users’ opinions on the attractiveness of a range of images,” said Academy Research Fellow and Associate Professor Tuukka Ruotsalo, who heads the project.
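Interpreting brain responses here amounts to a classification problem: given the EEG epoch recorded while a face was viewed, decide whether the viewer found it attractive. The sketch below simulates epochs with a crude event-related positivity on "attractive" trials and classifies them with a simple nearest-class-mean rule; the channel counts, timing, and classifier are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated EEG epochs: (trials, channels, time samples). Dimensions are
# illustrative, not the study's recording setup.
n_trials, n_channels, n_samples = 300, 32, 128
attractive = rng.integers(0, 2, n_trials)          # hidden ground truth
epochs = rng.standard_normal((n_trials, n_channels, n_samples))
# Inject a crude P300-like positivity on "attractive" trials.
epochs[attractive == 1, :, 60:90] += 0.5

features = epochs.reshape(n_trials, -1)            # one flat vector per trial
train_f, train_y = features[:200], attractive[:200]

# Nearest-class-mean classifier: compare each test epoch to the average
# "attractive" and "not attractive" training epochs.
mean_attr = train_f[train_y == 1].mean(axis=0)
mean_not = train_f[train_y == 0].mean(axis=0)

def predict(f):
    return int(np.linalg.norm(f - mean_attr) < np.linalg.norm(f - mean_not))

preds = np.array([predict(f) for f in features[200:]])
accuracy = (preds == attractive[200:]).mean()
```

With the strong injected signal this toy classifier separates the two classes easily; real single-trial EEG is far noisier, which is why the study aggregates responses across many faces per participant.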

“By interpreting their views, the AI model interpreting brain responses and the generative neural network modeling the face images can together produce an entirely new face image by combining what a particular person finds attractive.”
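One plausible way to "combine what a particular person finds attractive" is to steer the GAN's latent space: weight each shown face's latent vector by its decoded attractiveness score and average them, then decode the result into a new face. This is a hedged sketch of the general idea, not necessarily the paper's exact procedure; the scores and dimensions are assumed.

```python
import numpy as np

rng = np.random.default_rng(2)

LATENT_DIM = 512
latents = rng.standard_normal((200, LATENT_DIM))   # latents of the shown faces
# Hypothetical per-face attractiveness scores decoded from EEG (0..1).
scores = rng.random(200)

# Score-weighted average of latent vectors: faces the brain responses
# flagged as attractive pull the personalized latent toward themselves.
weights = scores / scores.sum()
personal_latent = weights @ latents                # shape: (LATENT_DIM,)
# Feeding personal_latent back through the generator would yield a new,
# participant-specific face image.
```

Because the average lives in the same latent space the generator was trained on, decoding it produces a coherent novel face rather than a pixel-level blend of the originals.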

The newly generated faces, unique to each participant, were tested in a double-blind procedure against matched controls. The new images matched the subjects’ preferences with an accuracy of over 80%.

“The study demonstrates that we are capable of generating images that match personal preference by connecting an artificial neural network to brain responses. Succeeding in assessing attractiveness is especially significant, as this is such a poignant, psychological property of the stimuli. Computer vision has thus far been very successful at categorizing images based on objective patterns. By bringing in brain responses to the mix, we show it is possible to detect and generate images based on psychological properties, like personal taste,” Spapé further explained.

Ultimately, the study may benefit society by advancing computers’ capacity to learn and progressively understand subjective preferences, through interaction between AI solutions and brain-computer interfaces.
