In a world flooded every day with new music (or "resurrected" music), choosing the next song to listen to can become an epic undertaking. Today, streaming platforms like Spotify combine human intuition and artificial intelligence to select the most worthy tracks, with a success rate just above that of a coin flip. A new study, however, could radically change the rules of the game.
When Neuroscience meets Machine Learning
A group of American researchers has discovered that our brain's response to music can predict with surprising accuracy which song will become a hit. Using a combination of machine learning techniques and neurophysiological data, they achieved an incredible feat: 97% accuracy in predicting musical successes.
Professor Paul J. Zak is the senior author of a study published in Frontiers in Artificial Intelligence (I link it here). In the study, Zak explains how he and his team identified what appears to be the brain's rating system for social and emotional experiences, which they call Immersion. The team then applied this system to music, rating new songs using data gathered from non-invasive physiological sensors such as those found in smartwatches.
From your wrist to the playlist
In the study, scientists analyzed a series of songs from different musical genres, both successful and unsuccessful. The neurophysiological data collected from the study participants were then entered into Immersion, the platform created by the research group. Unlike more invasive methods such as direct brain imaging, Zak's approach relies on simpler and more accessible signals, such as heart rate, making the technology much more practical and affordable. And, apparently, extremely effective.
Neuroforecasting: predicting the future of music
The concept behind this method is called “neuroforecasting”, which is the use of the neural activity of a small group of individuals to predict the responses of a larger population. It has previously been used to predict stock market swings, viral videos and election results.
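To make the neuroforecasting idea concrete, here is a minimal sketch (not the authors' actual pipeline) of how physiological signals from a small panel of listeners could be used to predict whether a song becomes a hit. All data below is synthetic, and the feature names (heart-rate change, an "immersion" score) and the model choice are illustrative assumptions:

```python
# Sketch of neuroforecasting: predict hit vs. flop from panel physiology.
# Synthetic data and a simple classifier; the real study used its own
# proprietary Immersion platform, not this model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

# For each song, two features aggregated across a small listener panel:
# average heart-rate change (bpm) and a crude "immersion" score.
n_songs = 200
hits = rng.integers(0, 2, n_songs)                    # 1 = hit, 0 = flop
heart_rate_delta = rng.normal(2.0 + 3.0 * hits, 1.5)  # hits raise HR more
immersion_score = rng.normal(0.4 + 0.3 * hits, 0.1)   # hits immerse more
X = np.column_stack([heart_rate_delta, immersion_score])

# Cross-validated accuracy: how well a handful of listeners' signals
# generalize to predicting popularity across songs.
model = LogisticRegression()
scores = cross_val_score(model, X, hits, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

On cleanly separated synthetic data like this, even a linear model scores well above chance; the striking claim of the study is that real wrist-sensor data carries a similarly strong signal.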
What impact will this scientific “tsunami” have on the music of the future? Although the sample of participants and songs analyzed in the study was relatively small, the incredible results suggest that this approach could have a significant impact on how music is selected and promoted in the future. With the use of wearable devices becoming more widespread, we may see a future where entertainment choices are tailored based on our neurophysiology.
In the future, also thanks to AR, people will "immerse themselves" (pun intended) in games and interactive experiences with sounds and music created in real time: sound itineraries perfectly adapted to their mood and sensations. The perfect, totally personalized soundtrack.
A winning solution for artists and listeners
Beyond the commercial implications, Zak's approach could also help young artists develop hit songs, addressing a dilemma every artist faces: will the audience like what I like? Using Immersion, artists could quickly gauge what people will enjoy, stimulating their creativity and making listeners happier.
A true win-win scenario? Or the death of unpredictability? Let me know what you think.