Presenter
Christine Moller, Department of Psychology, University of Alberta
Abstract
When deciding whether to approach someone, we often try to figure out whether they are happy or angry from their non-verbal communication, such as their facial expressions and body language. The purpose of this study is to test whether learning American Sign Language (ASL) improves people’s sensitivity to non-verbal communication. ASL may improve sensitivity because it marks some grammatical features with facial and other bodily movements. To test this hypothesis, we assess people’s ability to recognize emotions in three modalities: the eyes, the face, and the body. ASL learners and learners of a spoken language will participate. Participants’ ability to recognize emotions will be measured at the start of the term and again after one semester of learning the language. We predict that ASL learners will be better at emotion recognition than spoken-language learners after one semester. Such results would be consistent with the argument that the grammar of ASL trains learners to notice small changes in facial and body movements, and that this ability generalizes to a non-linguistic task (i.e., emotion recognition). They would also suggest that non-verbal emotion recognition can be improved through practice and experience with other skills.
Authors & Affiliations
C. Moller & E. Nicoladis (Department of Psychology, University of Alberta)