According to researchers at the University of California, Berkeley, the spontaneous nonverbal exclamations we make speak volumes: everything from elation (woohoo) to embarrassment (oops) says far more about what we’re feeling than previously understood. Published in American Psychologist, the results are demonstrated in vivid sound and colour on the first-ever interactive audio map of nonverbal vocal communication. Proving that a sigh is not just a sigh, the UC Berkeley scientists conducted a statistical analysis of listener responses to more than 2,000 nonverbal exclamations known as ‘vocal bursts’ and found that they convey at least 24 kinds of emotion. Previous studies of vocal bursts had set the number of recognisable emotions closer to 13.
A rich demonstration of emotional expression
Dacher Keltner, a psychology professor at UC Berkeley and faculty director of the Greater Good Science Center, explains: “This study is the most extensive demonstration of our rich emotional vocal repertoire, involving brief signals of upwards of two dozen emotions as intriguing as awe, adoration, interest, sympathy and embarrassment.”
According to the study, humans have long used wordless vocalisations to communicate feelings, and listeners can decode them in a matter of seconds.
“Our findings show that the voice is a much more powerful tool for expressing emotion than previously assumed,” said study lead author Alan Cowen, a Ph.D. student in psychology at UC Berkeley.
A new era of voice-controlled digital assistants?
Among other applications, Cowen explains how the map can be used to help teach voice-controlled digital assistants and other robotic devices to better recognise human emotions based on the sounds we make.
As for clinical uses, the map could in theory help medical professionals and researchers working with people with dementia, autism and other emotional-processing disorders to zero in on specific emotion-related deficits.
“It lays out the different vocal emotions that someone with a disorder might have difficulty understanding,” Cowen adds.
“For example, you might want to sample the sounds to see if the patient is recognising nuanced differences between, say, awe and confusion.”
Though limited to responses from U.S. listeners, the study suggests humans are so keenly attuned to nonverbal signals – such as the bonding ‘coos’ between parents and infants – that we can pick up on the subtle differences between surprise and alarm, or an amused laugh versus an embarrassed laugh.