Volume 14, Issue 2 (March & April 2023)                   BCN 2023, 14(2): 289-296


Lohrasbi S, Moradi A R, Sadeghi M. Exploring Emotion Recognition Patterns Among Iranian People Using CANTAB as an Approved Neuro-Psychological Assessment. BCN 2023; 14(2):289-296
URL: http://bcn.iums.ac.ir/article-1-2255-en.html
1- Institute for Cognitive Science, Shahid Beheshti University, Tehran, Iran.
2- Institute for Cognitive Sciences Studies, Kharazmi University, Tehran, Iran.
1. Introduction
The production, perception, and interpretation of, and response to, emotional signals shape human life and interpersonal relationships; emotions therefore play a significant role in human communication. Recognizing emotion from facial expressions is a key element of that communication (Varghese et al., 2015). These abilities serve distal goals, such as survival and reproduction, in humans and other species (Preston & De Waal, 2002). As a result, accurately recognizing one's own feelings and those of others carries considerable advantages. These benefits may constitute an advantage in identifying emotions even among members of the same group or community (Anderson & Keltner, 2002) and suggest a possible cultural benefit of emotion recognition (Wickline et al., 2009).
Each emotion is identified by a characteristic pattern of appraisal factors. Cognition is believed to be a prerequisite for emotion, and emotion the product of a cognitive evaluation process (Izard, 2009). Notably, a well-regulated relationship between emotion and cognition supports healthy functioning, whereas an imbalance between the two may lead to emotional or cognitive difficulties such as anxiety or depression (Ebner & Johnson, 2009).
Studies show that the amygdala responds to both positively and negatively valenced stimuli (Hamann et al., 2002), provided they are high in arousal (Cunningham et al., 2004). In particular, the human amygdala activates robustly in response to fearful faces (Chiao et al., 2008).
The ability to comprehend and recognize others' emotional states is difficult to attain because the process is complex: a person cannot simply be aware of what is going on in others' minds. To understand others' emotional states, an individual must attend to people's facial expressions and gestures as emotions occur (Juckel et al., 2018). Observable physical signs include changes in facial expression and coloration, as well as shifts in the position of facial features relative to each other, such as raised eyebrows or changes in the position of the lips and mouth (Petersen & Higham, 2020). Understanding facial emotion could thus play a significant role in social interactions (Elliott & Jacobs, 2013). However, the role of culture should not be ignored, and several studies have examined cultural differences in recognizing others' emotions (Jack et al., 2012; Biehl et al., 1997). For example, a study of Asian and Caucasian participants found that Caucasians were much more accurate at recognizing anger, sadness, and fear (Biehl et al., 1997). On the other hand, another study examined the validity of a Japanese face database across Japan, the United States, Poland, Vietnam, Hungary, and Sumatra and found that people in all of these countries could correctly identify the emotions in the provided images, suggesting a universal component to facial emotion recognition (Jack et al., 2012).
Given the above, emotion recognition could serve as an appropriate, internationally applicable test of social cognition. Because the ability to recognize emotion can be specific to an individual's culture and nationality, normative data on emotion recognition in a given society would greatly aid in distinguishing mental disorders from ordinary psychological characteristics (Lim, 2016). Moreover, community- or ethnicity-specific cut-off points are necessary to predict the psycho-emotional characteristics of a community or ethnic group and to delineate the boundary between disorder and normal psychological variation. In addition, shifts in these cut-off points can reveal emotional biases within a country, which is essential information for mental health decision-making. Therefore, the purpose of this study was to investigate emotion recognition patterns among Iranians using CANTAB, a valid computer-based neuropsychological test.

2. Materials and Methods
Participants were recruited randomly by posting flyers. All interested individuals were screened for eligibility with the Raven-2 progressive matrices to ensure that their overall intelligence fell within the normal range (90-110) before participation. Participants included 117 males and females (mean age ±SD: 32.1±6.4 years). The minimum educational level of all participants was a high school diploma, and all were employed full-time or part-time. All participants were assessed with the emotion recognition task. Exclusion criteria were an intelligence quotient (IQ) below normal on the Raven-2 test and, as reported on the demographic form, use of psychiatric drugs, a history of psychiatric disorders, or any movement disorder or disability that could affect the examinee's reaction time. Movement ability was assessed by an occupational therapist.

Materials
The emotion recognition task (ERT): The emotion recognition task (ERT), part of the Cambridge neuropsychological test automated battery (CANTAB; Robbins et al., 1994), is a computer-generated paradigm for the recognition of six basic facial emotional expressions: happiness, sadness, anger, disgust, fear, and surprise. The emotions (15 stimuli per emotion at different levels of intensity) are mimicked by actors and presented randomly in two blocks (90 stimuli each). After each stimulus presentation (200 ms), participants were asked to choose among the six emotional expressions displayed as labels on the screen. The task yields a percentage of correct responses for each emotion and an overall mean response latency. The non-verbal, culture-free nature of the CANTAB also reduces the variance that language and cultural translation of stimuli introduce in word-based tests (Barnett et al., 2015).
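The two summary measures the task produces, percent correct per emotion and overall mean response latency, can be sketched as follows. This is only an illustration: the trial records and field layout here are hypothetical, since the real CANTAB software exports its own data format.

```python
from collections import defaultdict

# Hypothetical trial records: (emotion shown, emotion chosen, latency in ms).
# The real ERT presents 180 trials; four are enough to show the scoring.
trials = [
    ("happiness", "happiness", 950),
    ("happiness", "surprise", 1400),
    ("fear", "fear", 1600),
    ("fear", "surprise", 1750),
]

def summarize(trials):
    """Return percent correct per emotion and overall mean latency (ms)."""
    correct = defaultdict(int)
    total = defaultdict(int)
    latencies = []
    for shown, chosen, latency_ms in trials:
        total[shown] += 1
        if shown == chosen:
            correct[shown] += 1
        latencies.append(latency_ms)
    pct_correct = {e: 100.0 * correct[e] / total[e] for e in total}
    mean_latency = sum(latencies) / len(latencies)
    return pct_correct, mean_latency

pct, latency = summarize(trials)  # pct["happiness"] -> 50.0, latency -> 1425.0
```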
The Raven-2 visual intelligence test: This test was used to obtain normative data appropriate to individuals' intelligence; it is valid, widely used, and largely independent of culture and language, and is recognized as suitable for research and for evaluating and screening intelligence (Raven, 2000). Previous research shows that Raven's progressive matrices test is a reliable and valid measure of general intelligence for Iranian undergraduate students, with significant reliability and validity (P<0.01) (Rahmani, 2008).

Procedure
Detailed explanations of the study aim and the confidentiality of participants' identities were provided, and written consent was obtained from all participants. Participants were seated in a quiet room in front of a touch-screen computer placed about 30 cm away, with nothing between the subject and the screen. They were instructed to observe a face in the middle of the screen and, after it was displayed, to choose one of the six emotional states (disgust, sadness, happiness, surprise, anger, and fear) shown. The examiner was also present in the test room, at an angle outside the subject's view. The whole test ran automatically for each person, and the results were saved. An Asus TP510U laptop with a 15-inch touch screen was used; in line with the test standards, the screen was rotated 360 degrees so that the keyboard was not available to the subject, and a wireless keyboard was used alongside it.

Statistical analysis
In this study, data analysis was conducted using IBM SPSS Statistics software, version 26 (IBM Corp., NY, USA). A one-way analysis of variance (ANOVA) was used to determine whether there were statistically significant differences between the means of correct responses and the times to recognize emotional states. Eta squared effect sizes were calculated and reported for the ANOVA analyses, with 0.01 considered small, 0.06 medium, and 0.14 large (Cohen, 1988). Post-hoc t-tests and Cohen's d effect sizes were calculated, with 0.20 considered small, 0.50 moderate, and 0.80 large (Cohen, 1988).
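As a rough illustration of the effect-size computations described above (a sketch, not the SPSS implementation), partial eta squared can be recovered from an F statistic and its degrees of freedom, and a paired-samples Cohen's d (the d_z variant, one of several conventions) from two score vectors:

```python
import math

def eta_squared_partial(f_stat, df_effect, df_error):
    # Partial eta squared from a within-subjects F test:
    # eta_p^2 = (F * df_effect) / (F * df_effect + df_error)
    return (f_stat * df_effect) / (f_stat * df_effect + df_error)

def paired_cohens_d(x, y):
    # Cohen's d_z for paired samples: mean difference / SD of differences.
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    sd = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    return mean_d / sd
```

Applied to the reported F(5, 580)=162.09, this formula gives about 0.58, close to but not identical to the reported η²=0.57; the small gap may reflect a different eta squared variant or rounding in the original analysis.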

3. Results
Table 1 presents the Mean±SD of the percentage of the correct responses and the time to recognize emotional states. 

Correct responses
The Mean±SD for correct responses are presented in Figure 1.

The one-way within-subjects ANOVA revealed a significant difference in correct responses related to various emotional states F(5, 580)=162.09, P=0.001, η2=0.57 (Table 2). 

Further post-hoc t-tests showed that only the correct responses for the sadness and surprise emotions did not differ significantly, t(112)=-0.59, P=0.55, d=0.05. These results suggest that the rate of correct responses varies across emotions, with the exception of sadness and surprise.

Recognition time
The Mean±SD for recognition time are presented in Figure 2.

The one-way within-subjects ANOVA revealed a significant difference in recognition time related to various emotional states F(5, 580)=73.14, P=0.001, η2=0.38 (Table 3). 

Further post-hoc t-tests showed that only the recognition times for the happiness and anger emotional states did not differ significantly, t(112)=1.79, P=0.07, d=0.16. These results suggest that recognition time varies across emotional states.

4. Discussion
The purpose of this study was to investigate patterns of emotion recognition among the Iranian population using the CANTAB as a valid computer-based neuropsychological assessment (CBNA). Our findings fall into two categories: the percentage of correctly recognized emotions and the reaction time, in milliseconds, for responding correctly to each of the six basic emotional states (disgust, sadness, happiness, surprise, anger, and fear). Participants best identified happiness (75.83% correct), followed by sadness (70%) and surprise (68.48%), compared with disgust (47.84%), anger (42.54%), and fear (38.26%). Happiness thus had the highest percentage of correct recognition, while fear was the least correctly recognized emotion. Regarding response times, disgust had the shortest recognition time (mean 322 ms) and sadness the longest (1800 ms); the other mean times were 966 ms for surprise, 1187 ms for anger, 1264 ms for happiness, and 1529 ms for fear. Fear therefore combined the lowest recognition accuracy with one of the longest recognition times and could be considered a complicated emotion for Iranians to recognize, whereas happiness, recognized most accurately and relatively quickly, could be considered an easy one.
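Averaging the six per-emotion accuracies reported above reproduces, to within rounding, the overall mean correct recognition quoted in the Conclusion:

```python
# Per-emotion percentages of correct responses reported in the Discussion.
pct_correct = {
    "happiness": 75.83,
    "sadness": 70.00,
    "surprise": 68.48,
    "disgust": 47.84,
    "anger": 42.54,
    "fear": 38.26,
}

# Unweighted mean across the six emotions.
overall_mean = sum(pct_correct.values()) / len(pct_correct)
# round(overall_mean, 2) -> 57.16, matching the reported 57.15% to within rounding
```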
The findings on emotion recognition are in line with Fritz et al. (2009). In a meta-analysis of 87 articles including 22148 participants, Elfenbein and Ambady (2002) showed that emotion recognition differs across societies. Moreover, Americans were better at judging happiness than disgust, while the Japanese were better at recognizing disgust than Americans (Matsumoto, 1989). The study by Biehl et al. (1997) showed that Caucasians recognize the emotion of fear well. The present study's finding that the highest recognition accuracy belonged to happiness is in line with Matsumoto et al. (2000).
To the best of our knowledge, this is the first study comparing emotion recognition among Iranians. Therefore, we cannot compare our results with previous Iranian emotion recognition findings. 

5. Conclusion
This study showed that Iranians were best at judging happiness, while fear was the worst recognized emotion, possibly reflecting the cultural and societal differences discussed previously. From a neuroscience perspective, as noted in the Introduction, the amygdala is strongly activated during facial emotion recognition, especially the recognition of fear; the weakness in fear recognition among Iranian people could therefore be related to specificity in amygdala activity. Overall, the mean correct emotion recognition was 57.15%, corresponding to nearly 42% error, so it can be assumed that Iranians might be comparatively weak at judging emotions. In addition, the findings of this study can inform emotion recognition testing in people who might have difficulties in this domain, such as individuals with autism, schizophrenia, and similar disorders. Finally, the findings might help staff members of companies better determine others' emotions.
As for the limitations of this study, the sample size was rather small, and the participants were recruited from limited areas of Iran; future work should recruit participants from other parts of the country as well. Research with neuroimaging devices capable of indexing amygdala activity could help test the idea of differential amygdala activity in Iranian people.

Ethical Considerations
Compliance with ethical guidelines

Ethics approval was obtained from the Kharazmi University Institutional Review Board (Code: IR.KHU.REC.1398.037). 

Funding
The paper was extracted from the PhD thesis of Soroush Lohrasbi at the Department of Cognitive Science-Psychology, Shahid Beheshti University.

Authors' contributions 
All authors contributed equally to prepare the article.

Conflict of interest
The authors declared no conflict of interest.


References
Anderson, C., & Keltner, D. (2002). The role of empathy in the formation and maintenance of social bonds. Behavioral and Brain Sciences, 25(1), 21-22. [DOI:10.1017/S0140525X02230010]
Barnett, J. H., Blackwell, A. D., Sahakian, B. J., & Robbins, T. W. (2016). The paired associates learning (PAL) test: 30 years of CANTAB translational neuroscience from laboratory to bedside in dementia research. Current Topics in Behavioral Neurosciences, 28, 449–474. [DOI:10.1007/7854_2015_5001] [PMID]
Biehl, M., Matsumoto, D., Ekman, P., Hearn, V., Heider, K., & Kudoh, T., et al. (1997). Matsumoto and Ekman’s Japanese and Caucasian facial expressions of emotion (JACFEE): Reliability data and cross-national differences. Journal of Nonverbal Behavior, 21(1), 3-21. [DOI:10.1023/A:1024902500935]
Chiao, J. Y., Iidaka, T., Gordon, H. L., Nogawa, J., Bar, M., & Aminoff, E., et al. (2008). Cultural specificity in amygdala response to fear faces. Journal of Cognitive Neuroscience, 20(12), 2167–2174. [DOI:10.1162/jocn.2008.20151] [PMID]
Cohen J. (1988). Statistical Power Analysis for the Behavioral Sciences. New York: Routledge Academic. [DOI: 10.4324/9780203771587]
Cunningham, W. A., Johnson, M. K., Raye, C. L., Chris Gatenby, J., Gore, J. C., & Banaji, M. R. (2004). Separable neural components in the processing of black and white faces. Psychological Science, 15(12), 806–813. [DOI:10.1111/j.0956-7976.2004.00760.x] [PMID]
Ebner, N. C., & Johnson, M. K. (2009). Young and older emotional faces: Are there age group differences in expression identification and memory? Emotion, 9(3), 329–339. [DOI:10.1037/a0015179] [PMID] [PMCID]
Elfenbein, H. A., & Ambady, N. (2002). On the universality and cultural specificity of emotion recognition: A meta-analysis. Psychological Bulletin, 128(2), 203–235. [DOI:10.1037/0033-2909.128.2.203] [PMID]
Elliott, E. A., & Jacobs, A. M. (2013). Facial expressions, emotions, and sign languages. Frontiers in Psychology, 4, 115.[DOI:10.3389/fpsyg.2013.00115] [PMID] [PMCID]
Fritz, T., Jentschke, S., Gosselin, N., Sammler, D., Peretz, I., & Turner, R., et al. (2009). Universal recognition of three basic emotions in music. Current Biology, 19(7), 573–576. [DOI:10.1016/j.cub.2009.02.058] [PMID]
Hamann, S. B., Ely, T. D., Hoffman, J. M., & Kilts, C. D. (2002). Ecstasy and agony: Activation of the human amygdala in positive and negative emotion. Psychological Science, 13(2), 135–141. [DOI:10.1111/1467-9280.00425] [PMID]
Izard, C. E. (2009). Emotion theory and research: Highlights, unanswered questions, and emerging issues. Annual Review of Psychology, 60, 1–25. [DOI:10.1146/annurev.psych.60.110707.163539] [PMID] [PMCID]
Jack, R. E., Garrod, O. G., Yu, H., Caldara, R., & Schyns, P. G. (2012). Facial expressions of emotion are not culturally universal. Proceedings of the National Academy of Sciences of the United States of America, 109(19), 7241–7244. [DOI:10.1073/pnas.1200155109] [PMID] [PMCID]
Juckel, G., Heinisch, C., Welpinghus, A., & Brüne, M. (2018). Understanding another person’s emotions-an interdisciplinary research approach. Frontiers in Psychiatry, 9, 414. [DOI:10.3389/fpsyt.2018.00414] [PMID] [PMCID]
Lim, N. (2016). Cultural differences in emotion: differences in emotional arousal level between the East and the West. Integrative Medicine Research, 5(2), 105–109. [DOI:10.1016/j.imr.2016.03.004] [PMID] [PMCID]
Matsumoto, D. (1989). Cultural influences on the perception of emotion. Journal of Cross-Cultural Psychology, 20(1), 92-105. [DOI:10.1177/0022022189201006]
Matsumoto, D., LeRoux, J., Wilson-Cohn, C., Raroque, J., Kooken, K., & Ekman, P., et al. (2000). A new test to measure emotion recognition ability: Matsumoto and Ekman’s Japanese and Caucasian brief affect recognition test (JACBART). Journal of Nonverbal Behavior, 24(3), 179-209. [DOI:10.1023/A:1006668120583]
Petersen, R. M., & Higham, J. P. (2020). The role of sexual selection in the evolution of facial displays in male non-human primates and men. Adaptive Human Behavior and Physiology, 6, 249-276. [DOI:10.1007/s40750-020-00139-z]
Preston, S. D., & de Waal, F. B. (2002). Empathy: Its ultimate and proximate bases. The Behavioral and Brain Sciences, 25(1), 1–71.[DOI:10.1017/S0140525X02000018] [PMID]
Rahmani, J. (2008). [The reliability and validity of Raven’s progressive matrices test among the students of Azad Khorasgan University (Persian)]. Knowledge & Research in Applied Psychology, 9(34), 61-74. [Link]
Raven, J. (2000). The raven's progressive matrices: Change and stability over culture and time. Cognitive Psychology, 41(1), 1–48. [DOI:10.1006/cogp.1999.0735] [PMID]
Robbins, T. W., James, M., Owen, A. M., Sahakian, B. J., McInnes, L., & Rabbitt, P. (1994). Cambridge neuropsychological test automated battery (CANTAB): A factor analytic study of a large sample of normal elderly volunteers. Dementia, 5(5), 266–281. [DOI:10.1159/000106735] [PMID]
Varghese, A. A., Cherian, J. P., & Kizhakkethottam, J. J. (2015). Overview on emotion recognition system. Paper presented at: 2015 International Conference on Soft-Computing and Networks Security, Coimbatore, India, 27 February 2015. [DOI:10.1109/ICSNS.2015.7292443]
Wickline, V. B., Bailey, W., & Nowicki, S. (2009). Cultural in-group advantage: Emotion recognition in African American and European American faces and voices. The Journal of Genetic Psychology, 170(1), 5–29. [DOI:10.3200/GNTP.170.1.5-30] [PMID]
Type of Study: Original | Subject: Clinical Neuroscience
Received: 2021/08/15 | Accepted: 2021/11/7 | Published: 2023/03/1

Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
