Politicians use gestures to strengthen the delivery of speeches, while many people shrug or move their hands wildly to make a point. Now, scientists have shown that gesticulating while talking is part of language and affects how the meaning of words is interpreted. Researchers believe that gestures form part of a communication system deeply ingrained in humans, which allows us to be understood. This might explain why many of us gesture while talking on the phone, where our actions can’t be seen.
Dr Marina Nespor, a neuroscientist at the International School for Advanced Studies (SISSA) of Trieste, Italy, who carried out the research, set out to explore why humans find it difficult to keep still while speaking. She thinks it’s because gestures and words ‘very probably’ form a single communication system, which serves to enhance expression and help us to make ourselves understood. ‘In human communication, voice is not sufficient: even the torso and in particular hand movements are involved, as are facial expressions,’ she said.
Together with research fellow Dr Alan Langus, she studied prosody, which is what makes a sentence understood in a specific way and usually relies on intonation and the rhythm of language. For example, without prosody, nothing would distinguish the spoken statement ‘this is an apple’ from the question ‘this is an apple?’
In their study, published in the journal Frontiers in Psychology, the experts say that prosody should include gestures.
Dr Langus explained: ‘Prosodic information, for the person receiving the message, is a combination of auditory and visual cues. ‘The “superior” aspects at the cognitive processing level of spoken language are mapped to the motor-programmes responsible for the production of both speech sounds and accompanying hand gestures’.
To come to their conclusion, the researchers asked 20 Italian speakers to listen to various sentences with ambiguous meanings and watch videos of people saying the sentences. In the videos, the sentences could be ‘matched’ – when gestures corresponded to the meaning of the spoken words – or ‘mismatched’ – when the gestures matched an alternative meaning. ‘In the matched conditions there was no improvement ascribable to gestures,’ Dr Langus said. ‘The participants’ performance was very good both in the video and in the audio-only sessions.’
He said it was in the mismatched experiment that the effect of hand gestures became clear. ‘With these stimuli the subjects were much more likely to make the wrong choice – that is, they’d choose the meaning indicated in the gestures rather than in the speech – compared to matched or audio-only conditions. ‘This means that gestures affect how meaning is interpreted, and we believe this points to the existence of a common cognitive system for gestures, intonation and rhythm of spoken language.’