Computers can now predict our preferences directly from our brain

14 May 2021

human-computer interaction

A research team from the University of Copenhagen and the University of Helsinki demonstrates that it is possible to predict individual preferences based on how a person’s brain responses match up with those of other people. This could potentially be used to provide individually tailored media content, and perhaps even to enlighten us about ourselves.

Photo: Getty Images

We have become accustomed to online algorithms trying to guess our preferences for everything from movies and music to news and shopping. This is based not only on what we have searched for, looked at, or listened to, but also on how these activities compare to others. Collaborative filtering, as the technique is called, uses hidden patterns in our behavior and the behavior of others to predict which things we may find interesting or appealing.
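
To make the idea concrete, the sketch below shows user-based collaborative filtering in its simplest form: a missing rating is estimated as a similarity-weighted average of other users’ ratings for the same item. The data, names and weighting scheme here are invented for illustration and are not taken from the study.

```python
import numpy as np

# Hypothetical ratings matrix: rows are users, columns are items, 0 = unrated.
ratings = np.array([
    [5, 4, 0, 1],   # user 0
    [4, 5, 2, 0],   # user 1, tastes similar to user 0
    [1, 0, 5, 4],   # user 2, tastes quite different
], dtype=float)

def predict(ratings, user, item):
    """Predict a missing rating as a similarity-weighted average
    of other users' ratings for the same item."""
    target = ratings[user]
    num, den = 0.0, 0.0
    for other in range(ratings.shape[0]):
        if other == user or ratings[other, item] == 0:
            continue
        # Cosine similarity between the two users' rating vectors.
        sim = np.dot(target, ratings[other]) / (
            np.linalg.norm(target) * np.linalg.norm(ratings[other]) + 1e-9)
        num += sim * ratings[other, item]
        den += abs(sim)
    return num / den if den else 0.0

# User 0 has not rated item 2; the prediction (~2.6) is pulled toward
# the low rating given by the most similar user.
print(predict(ratings, user=0, item=2))
```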

But what if the algorithms could use responses from our brain rather than just our behavior? It may sound a bit like science fiction, but a project combining computer science and cognitive neuroscience showed that brain-based collaborative filtering is indeed possible. By using an algorithm to match an individual’s pattern of brain responses with those of others, researchers from the University of Copenhagen and the University of Helsinki were able to predict a person’s attraction to a not-yet-seen face.

In earlier work, the researchers placed EEG electrodes on the heads of study participants and showed them images of various faces, demonstrating that machine learning can use electrical activity from the brain to detect which faces the subjects found most attractive.

“Through comparing the brain activity of others, we’ve now also found it possible to predict faces each participant would find appealing prior to seeing them. In this way, we can make reliable recommendations for users, just as streaming services suggest new films or series based on the users’ viewing history,” explains senior author Dr. Tuukka Ruotsalo of the University of Copenhagen’s Department of Computer Science.

In the experiment, participants were shown images of human faces while wearing EEG electrodes on their heads.

Towards mindful computing and greater self-awareness

Industries and service providers increasingly offer personalized recommendations, and we have come to expect individually tailored content from them. Consequently, researchers and industry are interested in developing more accurate techniques for satisfying this demand. However, current collaborative filtering techniques, which are based on explicit behaviour such as ratings, clicks and content sharing, are not always reliable at revealing our real, underlying preferences.

“Due to social norms or other factors, users may not reveal their actual preferences through their behaviour online. Therefore, explicit behaviour may be biased. The brain signals we investigated were picked up very early after viewing, so they are more related to immediate impressions than carefully considered behaviour,” explains co-author Dr. Michiel Spapé.

“The electrical activity in our brains is an alternative and rather untapped source of information. In the longer term, the method can probably be used to provide much more nuanced information about people’s preferences than is possible today. For example, it could be used to decode the underlying reasons why a person likes certain songs, which could be related to the emotions they evoke,” explains Tuukka Ruotsalo.

But researchers don’t just see the new method as a useful way for advertisers and streaming services to sell products or retain users. As lead author Keith Davis points out:

“I consider our study a step towards an era that some refer to as ‘mindful computing’, in which, by using a combination of computers and neuroscience techniques, users will be able to access unique information about themselves. Indeed, Brain-Computer Interfacing, as it is known, could become a tool for understanding oneself better.”

Nevertheless, there is still a way to go before the technique can be applied beyond the laboratory. The researchers point out that brain-computer interface devices must become cheaper and easier to use before they find themselves in the hands or strapped to the heads of casual users. Their best guess is that this will take at least 10 years.

The researchers also underscore that the technology comes with significant challenges in protecting brain-based data from misuse, and that it is important for the research community to carefully consider data privacy, ownership and the ethical use of raw EEG data.

ABOUT THE EXPERIMENT

In the experiment, participants were shown a large number of images of human faces and asked to look for those that they found attractive. While doing so, their brain signals were recorded. This data was used to train a machine learning model to distinguish between the brain activity when the participant saw a face that they found attractive versus when they saw a face that they did not find attractive.
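
As a rough illustration of this step, the sketch below trains a simple linear classifier to separate two classes of trials. The synthetic features, labels and scikit-learn pipeline are stand-ins chosen for this example; they are not the researchers’ actual data or model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Pretend each trial is summarized as a feature vector
# (e.g., average amplitudes per electrode and time window).
n_trials, n_features = 200, 32
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 2, size=n_trials)   # 1 = face judged attractive
X[y == 1, :8] += 0.5                    # inject a weak class difference

# A linear classifier is a common, simple baseline for single-trial EEG data.
clf = LogisticRegression(max_iter=1000)
print(cross_val_score(clf, X, y, cv=5).mean())  # should score above chance (~0.5)
```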

With a different machine learning model, the brain-based data from a larger number of participants was used to calculate which new facial images each participant would find attractive. Thus, the prediction was based partly on each participant’s own brain signals and partly on how other participants responded to the images.
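
The sketch below illustrates that idea in hypothetical terms: each participant’s brain-derived “appeal” scores for faces already seen are compared with other participants’ scores, and the most similar participants’ responses are used to predict reactions to faces not yet viewed. The score matrix and weighting scheme are invented for illustration, not drawn from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# 6 participants x 10 faces; each entry stands in for a brain-derived
# "appeal" score. In the real study these came from EEG; here they are random.
scores = rng.random((6, 10))
scores[0, 7:] = np.nan   # participant 0 has not yet seen faces 7-9

def predict_unseen(scores, user):
    """Fill a participant's missing scores with a similarity-weighted
    average of the other participants' scores for the same faces."""
    seen = ~np.isnan(scores[user])
    filled = scores[user].copy()
    for face in np.flatnonzero(~seen):
        sims, vals = [], []
        for other in range(scores.shape[0]):
            if other == user or np.isnan(scores[other, face]):
                continue
            # Similarity computed over the faces both participants responded to.
            both = seen & ~np.isnan(scores[other])
            sims.append(np.corrcoef(scores[user, both], scores[other, both])[0, 1])
            vals.append(scores[other, face])
        w = np.clip(np.array(sims), 0.0, None)
        filled[face] = np.average(vals, weights=w) if w.sum() > 0 else np.mean(vals)
    return filled

# Predicted appeal of the three faces participant 0 has not seen.
print(predict_unseen(scores, user=0)[7:])
```
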
Source: University of Copenhagen
