An Inherent Bond: External Visual Aid Has a Minor Effect on the Rate of Co-Speech Gestures
DOI: https://doi.org/10.29038/eejpl.2020.7.2.jar

Keywords: co-speech gestures, speech conceptualization, visual aid, Information Packaging Hypothesis

Abstract
Traditionally, representational co-speech gestures have been assumed to repeat or depict the semantic content of the accompanying speech and thereby to facilitate speech comprehension. To test this assumption, each of 22 participants delivered an informative speech once with the support of a visual aid, data-show (DS) projector slides, and delivered the same speech again in a separate session without any visual aid (NDS); the aim was to determine whether the visual aid had any significant effect on gesture rate during speech production. The study's theoretical framework draws on the Information Packaging Hypothesis, the Gesture as Simulated Action framework, and related findings in cognitive psychology and neuroscience. All participants gestured in both sessions; the mean number of co-speech gestures was 7.2 in the NDS sessions and 6 in the DS sessions. Thus, a visual aid that supports the semantic content of speech did not lead to a significant reduction in co-speech gestures in the DS sessions, which indicates that the role of co-speech gestures is not merely to repeat the semantic content of the accompanying speech. These results are consistent with previous findings in cognitive psychology that speech and accompanying gesture are cognitively and instinctively linked as a single unit and that co-speech gestures may play an essential role in speech conceptualization and production. Speech and co-speech gestures are neurologically interconnected and are produced spontaneously whenever a speaker intends to communicate a message. The findings add further evidence to research emphasizing that co-speech gestures are not produced merely as a visual aid supplementing speech.
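The within-subject design described above (each of the 22 speakers gives the same talk once with slides and once without) lends itself to a paired comparison of per-participant gesture counts. The sketch below is illustrative only and is not the authors' analysis script; the file name and column names (gesture_counts.csv, nds_count, ds_count) are hypothetical.

# Minimal sketch of a paired, within-subject comparison of gesture counts
# in the no-visual-aid (NDS) vs. data-show (DS) sessions.
# The file and column names are hypothetical, not taken from the study.
import pandas as pd
from scipy import stats

# Each row = one participant; columns hold that participant's gesture counts.
data = pd.read_csv("gesture_counts.csv")
nds = data["nds_count"]
ds = data["ds_count"]

print(f"Mean NDS = {nds.mean():.1f}, mean DS = {ds.mean():.1f}")

# Paired t-test: the same speakers appear in both conditions.
t_stat, p_value = stats.ttest_rel(nds, ds)
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")

# With a small sample (n = 22), a non-parametric check such as the
# Wilcoxon signed-rank test is a common robustness complement.
w_stat, w_p = stats.wilcoxon(nds, ds)
print(f"Wilcoxon signed-rank: W = {w_stat:.1f}, p = {w_p:.3f}")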
License
Copyright (c) 2020 East European Journal of Psycholinguistics
This work is licensed under a Creative Commons Attribution 4.0 International License.