In the past, scientists tried to better understand the impact of music on the body, brain and emotions by analysing MRI scans. With the help of AI, they can now explore those effects in far greater depth.
The question of why certain music affects a listener’s emotions in unusual ways has remained on our minds for quite some time. Now, a study by a team of researchers from the USC Viterbi School of Engineering, US, attempts to answer it with the help of artificial intelligence.
A neuroimaging test was conducted on a group of volunteers as they listened to three unfamiliar pieces of music, ensuring that no listener’s memories were attached to any particular piece. To measure physical reactions, 60 people listened to the music on headphones while their heart activity and skin conductance were recorded. The same group also rated the intensity of emotion (happy or sad) from 1 to 10 while listening. In all, the researchers tracked heart rate, skin response (sweat-gland activity), brain activity and subjective feelings of happiness and sadness.
Then, the computer scientists processed the data using AI algorithms to determine which auditory features people responded to consistently. The results showed that the dynamics, register, rhythm and harmony of a piece were directly related to listeners’ responses.
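The study’s actual pipeline is not detailed here, but the core idea of relating a musical feature to a bodily signal can be illustrated with a minimal sketch. The code below is purely hypothetical: it computes frame-wise loudness (a stand-in for “dynamics”) from a synthetic crescendo and correlates it with a synthetic skin-conductance trace that is built to track the swell.

```python
import numpy as np

def short_time_rms(signal, frame_len, hop):
    """Frame-wise RMS energy -- a simple proxy for musical dynamics."""
    n_frames = 1 + (len(signal) - frame_len) // hop
    return np.array([
        np.sqrt(np.mean(signal[i * hop : i * hop + frame_len] ** 2))
        for i in range(n_frames)
    ])

def pearson_r(x, y):
    """Pearson correlation between a musical feature and a body signal."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float(np.mean(x * y))

# Toy data (not from the study): a tone whose loudness swells, and a
# skin-conductance trace that, by construction, follows the swell noisily.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 80_000)
envelope = 0.2 + 0.8 * (t / t.max())            # gradual crescendo
audio = envelope * np.sin(2 * np.pi * 220 * t)

loudness = short_time_rms(audio, frame_len=2048, hop=1024)
skin = envelope[::1024][: len(loudness)] + 0.05 * rng.standard_normal(len(loudness))

print(round(pearson_r(loudness, skin), 2))       # strong positive correlation
```

A high correlation here simply reflects how the toy data was generated; in the real study, such feature-to-response relationships were discovered from measured data across many listeners and several modalities.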
“Taking a holistic view of music perception, using all different kinds of musical predictors, gives us an unprecedented insight into how our bodies and brains respond to music,” said Tim Greer, a computer science PhD student and a member of the USC Signal Analysis and Interpretation Laboratory (SAIL).
It all lies in the rhythm
The researchers noted that music had a powerful influence on the auditory regions of the brain, namely Heschl’s gyrus and the superior temporal gyrus. The brain responded vigorously to pop music with danceable beats. They also found that changes in dynamics, rhythm and timbre, or the introduction of new instruments (with different rhythms and dynamics), activated these regions and heightened their response.
“If a song is loud throughout, there’s not a lot of dynamic variability, and the experience will not be as powerful as if the composer uses a change in loudness,” said Greer. “It’s the songwriter’s job to take you on a rollercoaster of emotions in under three minutes, and dynamic variability is one of the ways this is achieved.”
So, if you consistently listen to loud, heavy music such as black metal, you are probably not going to see an active response in the brain. However, if that loud music constantly shifts from a quiet verse to a loud chorus and back again, an active response can occur.
Increased skin and brain response
It was also discovered that the skin response, which corresponds to the secretion of sweat, increased after the introduction of a new instrument or near a build-up in the music.
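The spikes Greer describes are sudden rises in an otherwise slowly varying skin-conductance trace. As an illustrative sketch (not the study’s method), a crude detector can flag the onset of any run of samples where the trace rises faster than a chosen rate, which is one simple way to line such spikes up with instrument entries.

```python
import numpy as np

def sc_spikes(trace, fs, threshold):
    """Return sample indices where the skin-conductance trace starts
    rising faster than `threshold` units/second -- a crude event detector."""
    slope = np.diff(trace) * fs                      # rate of change per second
    rising = slope > threshold
    # keep only the onset of each run of above-threshold samples
    return np.flatnonzero(rising & ~np.r_[False, rising[:-1]])

# Toy trace sampled at 10 Hz: a flat baseline with two abrupt rises,
# standing in for responses to two new instruments entering the mix.
fs = 10
trace = np.concatenate([
    np.full(50, 1.0),
    np.linspace(1.0, 2.0, 10),   # first "instrument entry"
    np.full(50, 2.0),
    np.linspace(2.0, 3.2, 10),   # second entry
    np.full(30, 3.2),
])
events = sc_spikes(trace, fs, threshold=0.5)
print(len(events))  # → 2
```

Real skin-conductance data is far noisier, so practical analyses smooth the trace and validate detected events against the audio timeline; the threshold and sampling rate here are arbitrary choices for the toy example.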
“When each new instrument enters, you can see a spike in the collective response of the skin,” said Greer.
By using algorithms to analyse the data gathered in the lab, the scientists could study how people felt while listening to music over longer periods of time, drawing not only on brain scans but also on data from the other modalities.
“Novel multimodal computing approaches help not just illuminate human affective experiences to music at the brain and body level, but in connecting them to how actually individuals feel and articulate their experiences,” said Professor Shrikanth (Shri) Narayanan, study co-author, Niki and C. L. Max Nikias Chair in Engineering and professor of electrical and computer engineering and computer science.
Music is known for providing calmness in stressful situations. The researchers hope that this study will provide a new insight into how different types of music can positively manipulate our emotional responses and whether the intent of the composer matches the listener’s perception of a piece of music.
The above study titled “A Multimodal View into Music’s Effect on Human Neural, Physiological, and Emotional Experience,” was presented at ACM Multimedia 2019.