
Emotion Recognition Ability Test Using JACFEE Photos: A Validity/Reliability Study of a War Veterans' Sample and Their Offspring

  • Ivone Castro-Vale ,

    ivonecastrovale@med.up.pt

    Affiliations Medical Psychology Unit, Department of Clinical Neurosciences and Mental Health, Faculty of Medicine, University of Porto, Porto, Portugal, Instituto de Investigação e Inovação em Saúde, Universidade do Porto, Porto, Portugal

  • Milton Severo,

    Affiliations Department of Clinical Epidemiology, Predictive Medicine and Public Health, Faculty of Medicine, University of Porto, Porto, Portugal, Department of Medical Education and Simulation, Faculty of Medicine, University of Porto, Porto, Portugal

  • Davide Carvalho,

    Affiliations Department of Endocrinology, Diabetes and Metabolism, Centro Hospitalar São João, Faculty of Medicine, University of Porto, Porto, Portugal, Instituto de Investigação e Inovação em Saúde, Universidade do Porto, Porto, Portugal

  • Rui Mota-Cardoso

    Affiliation Medical Psychology Unit, Department of Clinical Neurosciences and Mental Health, Faculty of Medicine, University of Porto, Porto, Portugal


Abstract

Emotion recognition is very important for social interaction. Several mental disorders influence facial emotion recognition. War veterans and their offspring are subject to an increased risk of developing psychopathology, and emotion recognition is an important aspect that needs to be addressed in this population. To our knowledge, no test exists that is validated for use with war veterans and their offspring. The current study aimed to validate the JACFEE photo set for studying facial emotion recognition in war veterans and their offspring. The JACFEE photo set was presented to 135 participants, comprising 62 male war veterans and 73 war veterans’ offspring. The participants identified the facial emotion presented from amongst the seven emotions tested for: anger, contempt, disgust, fear, happiness, sadness, and surprise. A loglinear model was used to evaluate whether the agreement between the intended and the chosen emotions was higher than expected under independence. Overall agreement between chosen and intended emotions was 76.3% (Cohen’s kappa = 0.72). The agreement ranged from 63% (sadness expressions) to 91% (happiness expressions). The reliability by emotion ranged from 0.617 to 0.843, and the overall Cronbach’s alpha for the JACFEE photo set was 0.911. The offspring showed higher agreement than the veterans (RR: 41.52 vs 12.12, p < 0.001), which supports the construct validity of the test. The JACFEE set of photos showed good validity and reliability indices, which makes it an adequate instrument for researching emotion recognition ability in the study sample of war veterans and their respective offspring.

Introduction

Emotions are inherent to human nature. They exist because we need to recognize and respond to significant events related to survival and/or the maintenance of well-being, and thus they serve several functions [1]. Emotions are common to other animals, particularly vertebrates [2]. They are brought about by stimulus presentation and comprise processes such as arousal, appraisal, production of an affective state, and behavior and its regulation [3]. Basic emotions can be distinguished from one another and are the result of evolution developing the best way to deal with fundamental life tasks [4]. The theme of an emotion is influenced phylogenetically, and variations on that theme are in turn influenced by social experience [4].

Basic emotions are considered to be universal, because they have been shown to be categorized by different cultures and ethnicities [5, 6]. Emotions have distinctive behavior patterns and some are known to involve specific neuronal structures, as evidenced by stimulation, lesion and functional neuroimaging studies. For example, fear is known to depend in great part on the amygdala [7, 8] and disgust involves the insula [9].

Basic emotions have been shown to have specific autonomic nervous system responses [10]. For example, in the case of fear, blood flow is directed to the legs to prepare for fleeing from the dangerous stimulus. Interestingly, when a person is asked to pose a facial expression of fear, the same autonomic response emerges, evidencing the strong emotional pattern that has developed phylogenetically [11, 12].

Facial expressions are an important component of emotional behavior and are specific to basic emotions [13]. Facial emotion recognition is essential for social interaction, as well as for clinical communication. Facial expressions have been used extensively to study the neurobiology of emotion, such as the amygdala’s role in social judgments [14], as well as that of many other structures [15]. Neuroimaging studies also suggest that different neural circuits underlie the perception of each facial emotion [8].

Facial emotion recognition is known to diminish with age [16–20], and women have been reported to be more accurate than men in identifying facial expressions of emotion [19, 21, 22]. Literacy has also been found to positively influence facial emotion recognition [19].

Many clinical conditions limit facial emotion recognition, including psychiatric disorders and some neurologic disorders. Depression and bipolar disorder are associated with a moderate and stable deficit in facial emotion perception, moderated by self-reported depression, age at time of testing, and years of education [23]. Facial emotion recognition ability is impaired in patients with schizophrenia [24, 25] and is probably a trait characteristic of this illness [26]. Anxiety disorders have also been associated with impairment in facial emotion recognition: patients with panic disorder were unable to recognize disgust and fear and showed higher accuracy in recognizing surprise [27]. Patients with post-traumatic stress disorder (PTSD) showed reduced accuracy and sensitivity in recognizing facial expressions of fear and sadness [28]. Personality disorders, such as schizotypal and psychopathic personality, have also been associated with impairments in recognizing facial expressions of emotion [29, 30]. Autism spectrum disorders have been extensively studied with regard to facial emotion recognition, with global impairment in identifying emotions found particularly for negative ones [31, 32]. Neurologic disorders, such as Huntington’s disease [33] and Parkinson’s disease [34], show specific deficits in recognizing disgust and sadness.

Several methodologies have been used to study facial emotion recognition, and many photo sets and videos are available in the literature for this purpose. Some sets include only one ethnicity, whilst others include hundreds of stimuli [35], computerized faces [36], or morphed faces [37, 38].

One of the most important and frequently used instruments is Matsumoto and Ekman’s Japanese and Caucasian Facial Expressions of Emotion (JACFEE) photo set [39]. It comprises 56 photos of different posers, with equal numbers of Japanese and Caucasian posers and of females and males, and eight photos for each of the seven basic emotions (anger, contempt, disgust, fear, happiness, sadness, and surprise). All of the facial expressions have been scored with the Facial Action Coding System [40] to ensure that they are exactly the ones intended. The JACFEE photos have been validated and have shown multi-culture reliability [5].

JACFEE photos have been used in several research fields, such as the neurobiology of basic emotions in healthy subjects [17, 41] and psychiatric disorders, including schizophrenia [25, 42], anxiety disorders [27], and personality disorders [29, 43].

War veterans are a unique population, as they have been exposed to traumatic events and are at risk of developing psychiatric disorders [44], such as PTSD and other anxiety disorders, mood disorders, and substance dependence disorders, all conditions that have been associated with deficits in emotion recognition ability. Their offspring are also at increased risk of psychopathology [45, 46] and are likewise an important population for study purposes, as, for example, there are reports of altered facial emotion recognition in healthy offspring of depressed patients [47–49].

Portugal was involved in colonial wars from 1961 to 1974, in which 1 million Portuguese soldiers took part. This has resulted in psychopathology both in veterans [50] and in their offspring [51].

To our knowledge, there are no reliability or validity studies of facial emotion recognition tests in veteran populations or their offspring. A valid and reliable test for veterans and their offspring is needed, one that can be used in future studies that attempt to control for hereditary problems influencing facial emotion recognition, or in studies that aim to show that problems influencing facial emotion recognition are transmitted across generations. To carry out this type of study we need a test that is valid and reliable in both groups. Our aim was to study the reliability/validity of the JACFEE photo set for a sample of veterans from the Portuguese colonial wars and for their offspring. We show that the JACFEE set of photos has good validity and reliability indices in the study sample of war veterans and their respective offspring.

Material and Methods

Ethics Statement

This study was approved by the Ethics Committee of the authors’ university (Comissão de Ética para a Saúde do Centro Hospitalar São João / Faculdade de Medicina da Universidade do Porto). All participants received a written and verbal description of the study and gave their written informed consent.

Participants

Data are presented as the mean (standard deviation). Sixty-two white veterans [mean age = 65.31 (3.71) years] of the Portuguese colonial wars were enrolled in the study. The sampling procedure used two ways of selecting participants: 75.8% came from an outpatient clinic of the Portuguese Disabled Veterans Association (ADFA), and 24.2% from three lists of war-time veterans’ companies. Half of the veterans (51.61%) were disabled, and all were outpatients. Women did not participate in the Portuguese colonial wars as combatants. Offspring (n = 73; 54.1% of the sample) from 46 veterans were also included [mean age = 35.66 (4.82) years; women: 60.3%]. The mean age of the total sample (n = 135) was 49.27 (15.41) years, and the median education in years (inter-quartile range) was 9 (4–15). Veterans’ median education was 4 (4–9.5) years, and offspring’s median education was 12 (9–15) years. All participants were of European descent.

Task

In a university setting, participants identified the emotion expressed in 56 photos from the JACFEE set [39], balanced for emotion and gender. Photos were displayed on a computer screen (after being controlled for brightness) in random order, with no successive photos belonging to the same emotion; the order of presentation was kept constant across subjects. During this task, participants chose a single term from a list of seven emotion names that best described the emotion expressed by the poser in the photo. The time taken to complete the task was measured in minutes. Because of the age of the veterans, we chose not to restrict the presentation time of each photo, nor to measure the intensity of the emotions. Socio-demographic data were then collected.
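The constraint of a fixed presentation order with no two successive photos of the same emotion can be produced with a simple seeded rejection-sampling shuffle. The sketch below is purely illustrative and is not the software used in the study; the photo identifiers are hypothetical.

```python
import random

EMOTIONS = ["anger", "contempt", "disgust", "fear", "happiness", "sadness", "surprise"]

def fixed_presentation_order(photos, seed=42, max_tries=100_000):
    """Return a shuffled copy of (photo_id, emotion) pairs in which no two
    consecutive photos share an emotion. A fixed seed makes the order
    reproducible, so every participant sees the same sequence."""
    rng = random.Random(seed)
    for _ in range(max_tries):
        order = photos[:]
        rng.shuffle(order)
        if all(a[1] != b[1] for a, b in zip(order, order[1:])):
            return order
    raise RuntimeError("no valid order found within max_tries")

# 56 hypothetical photo identifiers, 8 per emotion, mirroring the JACFEE layout
photos = [(f"{emotion}_{i}", emotion) for emotion in EMOTIONS for i in range(1, 9)]
order = fixed_presentation_order(photos)
print(order[:4])
```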

Statistical methods

We calculated the relative and absolute frequencies of chosen emotions per intended emotional expression. Cronbach’s alpha was used to evaluate internal consistency (see Table 1).

Table 1. Example of the data collapsed according to the frequencies of the chosen and intended expression categories for each individual.

https://doi.org/10.1371/journal.pone.0132293.t001
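As an illustration of the internal-consistency analysis, Cronbach’s alpha can be computed from a participants-by-photos matrix of 0/1 agreement scores (1 when the chosen emotion matched the intended one). This is a minimal sketch with made-up data, not the study’s actual scores.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_participants x n_items) score matrix."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1)        # variance of each photo across participants
    total_var = x.sum(axis=1).var(ddof=1)   # variance of participants' summed scores
    return (k / (k - 1)) * (1 - item_var.sum() / total_var)

# Hypothetical 0/1 agreement scores for the eight photos of one intended emotion
scores = np.array([
    [1, 1, 1, 1, 0, 1, 1, 1],
    [1, 0, 1, 1, 1, 1, 0, 1],
    [0, 1, 1, 0, 1, 1, 1, 1],
    [1, 1, 0, 1, 1, 0, 1, 1],
    [1, 1, 1, 1, 1, 1, 1, 0],
])
print(round(cronbach_alpha(scores), 3))
```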

A generalized linear Poisson model with a log link was used to evaluate whether the agreement between the intended and chosen emotions was higher than expected under independence between chosen and intended emotions [52]. Furthermore, these models were used to assess whether agreement was higher in offspring than in veterans, in participants with higher versus lower education, in younger versus older participants, and in women versus men. These comparisons were done to assess the construct validity of the test. The magnitude of the agreement was measured using the relative risk and the respective 95% confidence interval.
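The agreement model can be illustrated with a loglinear Poisson regression on the collapsed intended-by-chosen contingency table: main effects for the two factors reproduce the independence expectation, and the exponentiated coefficient of an agreement indicator estimates how many times more frequent the agreement cells are than expected under independence (the relative risk reported in the Results). The counts in this sketch are simulated, not the study data, and the statsmodels package is assumed to be available.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

EMOTIONS = ["anger", "contempt", "disgust", "fear", "happiness", "sadness", "surprise"]

# Simulated collapsed counts: one row per (intended, chosen) cell of the table
rng = np.random.default_rng(1)
rows = [{"intended": i, "chosen": c,
         "count": rng.poisson(400 if i == c else 20)}
        for i in EMOTIONS for c in EMOTIONS]
table = pd.DataFrame(rows)
table["agreement"] = (table["intended"] == table["chosen"]).astype(int)

# Independence model (main effects) plus an agreement indicator
fit = smf.glm("count ~ C(intended) + C(chosen) + agreement",
              data=table, family=sm.families.Poisson()).fit()
rr = np.exp(fit.params["agreement"])            # agreement vs. disagreement cells
ci = np.exp(fit.conf_int().loc["agreement"])    # 95% confidence interval
print(f"RR = {rr:.1f}, 95% CI: {ci.iloc[0]:.1f} to {ci.iloc[1]:.1f}")
```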

Results

For each image, we calculated how many participants chose the target emotion (see Table 2). The overall agreement between chosen and intended emotions was 76.3%, and Cohen’s kappa was 0.72. The agreement ranged from 63% (sadness expressions) to 91% (happiness expressions). When we stratified by offspring and veterans, the overall agreement was lower for veterans than for their offspring for all the emotions studied, as shown in Table 3.
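The overall agreement and Cohen’s kappa can be obtained directly from the paired intended/chosen labels, one pair per photo presentation. Below is a minimal sketch with hypothetical labels, assuming scikit-learn is available.

```python
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Hypothetical intended/chosen labels, one pair per photo presentation
intended = ["happiness", "fear", "sadness", "surprise", "anger", "contempt", "disgust", "fear"]
chosen   = ["happiness", "surprise", "contempt", "surprise", "anger", "contempt", "disgust", "fear"]

print(f"overall agreement = {accuracy_score(intended, chosen):.3f}")
print(f"Cohen's kappa     = {cohen_kappa_score(intended, chosen):.3f}")
```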

Table 2. Number and Percentage of Chosen Emotions per Intended Emotional Expression.

https://doi.org/10.1371/journal.pone.0132293.t002

Table 3. Number and Percentage of Chosen Emotions per Intended Emotional Expression, Stratified by Veterans and Offspring.

https://doi.org/10.1371/journal.pone.0132293.t003

A close look at Table 2 reveals that, for some expressions, responses were not equally distributed across the chosen expressions: faces with intended fear were sometimes confused with surprise (20.3%), and vice versa (8.5%). Intended sadness was sometimes confused with either contempt (14.8%) or surprise (11.4%), and intended contempt was sometimes mistaken for either sadness (9.8%) or surprise (7.3%) (see Table 2). For the pooled sample, the reliability by emotion ranged from 0.617 to 0.843 (see Table 4), and the overall between-image reliability of the JACFEE photo set was 0.911. The overall reliability stratified by veterans and offspring was 0.905 and 0.837, respectively.

Table 4. Reliability between Images of Each Intended Expression, Pooled and Stratified by Veterans and Offspring.

https://doi.org/10.1371/journal.pone.0132293.t004

The frequencies of the categories in which the chosen expression equaled the intended expression (agreement categories) were 21.4 (95% CI: 20.2 to 22.7) times higher than those of the categories in which the intended expression differed from the chosen expression (disagreement categories), assuming independence between both factors (Table 5). Table 5 also shows an interaction between group (offspring vs. veterans) and agreement (p < 0.001). The offspring showed 41.53 (95% CI: 37.71 to 45.74) times higher frequencies in the agreement categories than in the disagreement categories, whilst the veterans showed only 12.12 (95% CI: 11.26 to 13.06) times higher frequencies. After stratifying by veterans and offspring, we found that higher education (p < 0.001) and younger age (p < 0.001) were associated with higher agreement in both groups, and that in the offspring group female gender was associated with higher agreement than male gender (p < 0.001) (Table 6).

Table 5. Loglinear Poisson Model for the Frequencies of Chosen and Intended Expression Categories for Each Individual, Adjusted for Expected Frequencies, Assuming Independence between Both Factors.

https://doi.org/10.1371/journal.pone.0132293.t005

Table 6. Loglinear Poisson Model for the Frequencies of Observed and Intended Expression Categories for Each Individual, Adjusted for Expected Frequencies, Assuming Independence between Both Factors. Stratified by Education, Age, and Gender.

https://doi.org/10.1371/journal.pone.0132293.t006

Discussion

In this study, the 56 photos from the JACFEE set [39] were validated for a sample of veterans of the Portuguese colonial wars and their offspring. The photos were evaluated for their emotional content. The overall agreement between chosen and intended emotions was 76.3%, which is comparable to the multicultural agreement percentages found with this set of photos, which vary between 76.1% and 86.6% [5]. The kappa value for the agreement rates can be considered substantial [53]. The overall agreement was superior to that reported for other sets of photos [35, 54]. Happiness had the highest agreement level, followed by surprise, which replicates the findings of Biehl et al. [5], Dodich et al. [19] and Langner et al. [55]. Happiness is consistently found to be the emotion with the highest agreement level across studies [5, 19, 35, 54, 55]. The lowest agreement level was found for sadness, followed by fear. This finding was also reported by Elfenbein et al. [54]. Goeleven et al. [35], using the unbiased hit rate, also found sadness to be the least correctly recognized emotion, apart from fear.

There were systematic patterns of disagreement for fear, surprise, contempt and sadness. These patterns are similar to those found by Goeleven et al. [35] and Langner et al. [55]. Lower agreement for facial expressions of contempt may be related to problems with the expression label [56]. As for fear, it has been argued that it shares characteristics with surprise [5].

Since facial emotion recognition can be considered a multiple-effect-indicator construct model, reliability analysis using Cronbach’s alpha is adequate [57]. For the overall set of expressions, Cronbach’s alpha is excellent according to the Nunnally criterion [58], although for the intended expressions of happiness and fear it is questionable; for the other emotions it is acceptable (sadness, surprise and disgust) or good (anger and contempt). When we stratified by veterans and offspring, the latter showed lower alpha values for the intended expressions of fear, happiness and sadness, owing to the low variability among the offspring, which is reflected in their high agreement frequencies.

We found that the offspring had significantly higher frequencies in the agreement categories than the veterans. To our knowledge, no studies exist comparing the performance of parents with that of their adult offspring in facial emotion recognition. However, the offspring would be expected to perform better. This is probably accounted for by their lower age, higher education, and the inclusion of women in the offspring group, all factors that have been reported to positively influence facial emotion recognition ability [16–19, 21]. In fact, the literature shows that young adults and those with higher education perform better than the elderly and those with lower education, respectively [19, 20]. Moreover, our offspring sample included women, who have also been found to perform better in emotion recognition [19, 21, 22]. Our results confirm these previous findings, which supports the construct validity of the test.

Limitations: the participants were forced to choose between a limited set of emotion terms, which might have biased the choices they would otherwise have made. However, this methodology is in accordance with the one used to validate this photo set, and it relies on the rigorous construction of the set and on the universality of these basic emotions [5]. Other factors that may influence emotion recognition, such as symmetry, have been studied previously with the JACFEE set of photos, which was found to be mostly symmetrical [59].

Some researchers still use the earlier Ekman and Friesen Pictures of Facial Affect [19, 26], but this set has been reported to be superseded, which limits its ecological validity [35, 60]. Some picture databases are too long, with tasks that are difficult to perform. The JACFEE photo set is well balanced, in that it has a sufficient number of stimuli and its task is not too difficult to perform. This makes it a good instrument for studying facial emotion recognition in elderly individuals.

In conclusion, for the first time, this research replicated, in a sample of war veterans and their respective offspring, the results of previous studies that aimed to validate instruments for the evaluation of facial emotion recognition in non-clinical samples. The JACFEE set of photos showed good validity and reliability indices and seems to be a valid instrument for investigating the recognition of facial emotions in populations of war veterans and their offspring. We believe that this test can now be used in future studies that aim to control for hereditary problems influencing facial emotion recognition, or that aim to show that such problems are transmitted across generations. We will now use this same test to characterize the influence of war-related PTSD on facial emotion recognition ability, as well as the role of inheritance in the interaction between these variables. Emerging questions about the inheritance of trauma and how it relates to psychopathology and social interaction need to be investigated further. The JACFEE photo set seems to be an adequate instrument for this purpose.

Acknowledgments

We thank the Portuguese Disabled Veterans Association: Associação dos Deficientes das Forças Armadas (ADFA) for its help in selecting the sample. We also thank Sara Rocha and Gisela Vasconcelos for their assistance.

Author Contributions

Conceived and designed the experiments: IC-V MS DC RM-C. Performed the experiments: IC-V. Analyzed the data: MS. Wrote the paper: IC-V MS. Provided critical revisions: DC RM-C.

References

  1. Farb NA, Chapman HA, Anderson AK. Emotions: form follows function. Curr Opin Neurobiol. 2013;23(3):393–8. pmid:23375166.
  2. LeDoux J. Rethinking the emotional brain. Neuron. 2012;73(4):653–76. pmid:22365542.
  3. Phillips ML, Drevets WC, Rauch SL, Lane R. Neurobiology of emotion perception I: The neural basis of normal emotion perception. Biol Psychiatry. 2003;54(5):504–14. pmid:12946879.
  4. Ekman P, Cordaro D. What is Meant by Calling Emotions Basic. Emot Rev. 2011;3(4):364–70.
  5. Biehl M, Matsumoto D, Ekman P, Hearn V, Heider K, Kudoh T, et al. Matsumoto and Ekman's Japanese and Caucasian Facial Expressions of Emotion (JACFEE): Reliability Data and Cross-National Differences. J Nonverbal Behav. 1997;21(1):3–22.
  6. Sauter DA, Eisner F, Ekman P, Scott SK. Cross-cultural recognition of basic emotions through nonverbal emotional vocalizations. Proc Natl Acad Sci U S A. 2010;107(6):2408–12. pmid:20133790.
  7. Adolphs R, Tranel D, Damasio H, Damasio A. Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature. 1994;372(6507):669–72. pmid:7990957.
  8. Phan KL, Wager T, Taylor SF, Liberzon I. Functional neuroanatomy of emotion: a meta-analysis of emotion activation studies in PET and fMRI. Neuroimage. 2002;16(2):331–48. pmid:12030820.
  9. Phillips ML, Young AW, Senior C, Brammer M, Andrew C, Calder AJ, et al. A specific neural substrate for perceiving facial expressions of disgust. Nature. 1997;389(6650):495–8. pmid:9333238.
  10. Kreibig SD. Autonomic nervous system activity in emotion: a review. Biol Psychol. 2010;84(3):394–421. pmid:20371374.
  11. Levenson RW, Ekman P, Friesen WV. Voluntary facial action generates emotion-specific autonomic nervous system activity. Psychophysiology. 1990;27(4):363–84. pmid:2236440.
  12. Levenson RW, Ekman P, Heider K, Friesen WV. Emotion and autonomic nervous system activity in the Minangkabau of west Sumatra. J Pers Soc Psychol. 1992;62(6):972–88. pmid:1619551.
  13. Ekman P. Facial expression and emotion. Am Psychol. 1993;48(4):384–92. pmid:8512154.
  14. Adolphs R. Emotion. Curr Biol. 2010;20(13):R549–52. pmid:20619803.
  15. Vandekerckhove M, Plessers M, Van Mieghem A, Beeckmans K, Van Acker F, Maex R, et al. Impaired facial emotion recognition in patients with ventromedial prefrontal hypoperfusion. Neuropsychology. 2014;28(4):605–12. pmid:24773416.
  16. Demenescu LR, Mathiak KA, Mathiak K. Age- and gender-related variations of emotion recognition in pseudowords and faces. Exp Aging Res. 2014;40(2):187–207. pmid:24625046.
  17. MacPherson SE, Phillips LH, Della Sala S. Age-related differences in the ability to perceive sad facial expressions. Aging Clin Exp Res. 2006;18(5):418–24. pmid:17167306.
  18. Montagne B, Kessels RP, De Haan EH, Perrett DI. The Emotion Recognition Task: a paradigm to measure the perception of facial emotional expressions at different intensities. Percept Mot Skills. 2007;104(2):589–98. pmid:17566449.
  19. Dodich A, Cerami C, Canessa N, Crespi C, Marcone A, Arpone M, et al. Emotion recognition from facial expressions: a normative study of the Ekman 60-Faces Test in the Italian population. Neurol Sci. 2014;35(7):1015–21. pmid:24442557.
  20. Ruffman T, Henry JD, Livingstone V, Phillips LH. A meta-analytic review of emotion recognition and aging: implications for neuropsychological models of aging. Neurosci Biobehav Rev. 2008;32(4):863–81. pmid:18276008.
  21. Montagne B, Kessels RP, Frigerio E, de Haan EH, Perrett DI. Sex differences in the perception of affective facial expressions: do men really lack emotional sensitivity? Cogn Process. 2005;6(2):136–41. pmid:18219511.
  22. McClure EB. A meta-analytic review of sex differences in facial expression processing and their development in infants, children, and adolescents. Psychol Bull. 2000;126(3):424–53. pmid:10825784.
  23. Kohler CG, Hoffman LJ, Eastman LB, Healey K, Moberg PJ. Facial emotion perception in depression and bipolar disorder: a quantitative review. Psychiatry Res. 2011;188(3):303–9. pmid:21601927.
  24. Kohler CG, Walker JB, Martin EA, Healey KM, Moberg PJ. Facial emotion perception in schizophrenia: a meta-analytic review. Schizophr Bull. 2010;36(5):1009–19. pmid:19329561.
  25. Yu SH, Zhu JP, Xu Y, Zheng LL, Chai H, He W, et al. Processing environmental stimuli in paranoid schizophrenia: recognizing facial emotions and performing executive functions. Biomed Environ Sci. 2012;25(6):697–705. pmid:23228840.
  26. Demirbuga S, Sahin E, Ozver I, Aliustaoglu S, Kandemir E, Varkal MD, et al. Facial emotion recognition in patients with violent schizophrenia. Schizophr Res. 2013;144(1–3):142–5. pmid:23333505.
  27. Cai L, Chen W, Shen Y, Wang X, Wei L, Zhang Y, et al. Recognition of facial expressions of emotion in panic disorder. Psychopathology. 2012;45(5):294–9. pmid:22797533.
  28. Poljac E, Montagne B, de Haan EH. Reduced recognition of fear and sadness in post-traumatic stress disorder. Cortex. 2011;47(8):974–80. pmid:21075363.
  29. Dickey CC, Panych LP, Voglmaier MM, Niznikiewicz MA, Terry DP, Murphy C, et al. Facial emotion recognition and facial affect display in schizotypal personality disorder. Schizophr Res. 2011;131(1–3):242–9. pmid:21640557.
  30. Dawel A, O'Kearney R, McKone E, Palermo R. Not just fear and sadness: meta-analytic evidence of pervasive emotion recognition deficits for facial and vocal expressions in psychopathy. Neurosci Biobehav Rev. 2012;36(10):2288–304. pmid:22944264.
  31. Enticott PG, Kennedy HA, Johnston PJ, Rinehart NJ, Tonge BJ, Taffe JR, et al. Emotion recognition of static and dynamic faces in autism spectrum disorder. Cogn Emot. 2014;28(6):1110–8. pmid:24341852.
  32. Greimel E, Schulte-Ruther M, Kamp-Becker I, Remschmidt H, Herpertz-Dahlmann B, Konrad K. Impairment in face processing in autism spectrum disorder: a developmental perspective. J Neural Transm. 2014;121(9):1171–81. pmid:24737035.
  33. Sprengelmeyer R, Young AW, Calder AJ, Karnat A, Lange H, Homberg V, et al. Loss of disgust. Perception of faces and emotions in Huntington's disease. Brain. 1996;119(Pt 5):1647–65. pmid:8931587.
  34. Hipp G, Diederich NJ, Pieria V, Vaillant M. Primary vision and facial emotion recognition in early Parkinson's disease. J Neurol Sci. 2014;338(1–2):178–82. pmid:24484973.
  35. Goeleven E, De Raedt R, Leyman L, Verschuere B. The Karolinska Directed Emotional Faces: A validation study. Cogn Emot. 2008;22(6):1094–118.
  36. Schultz RT, Gauthier I, Klin A, Fulbright RK, Anderson AW, Volkmar F, et al. Abnormal ventral temporal cortical activity during face discrimination among individuals with autism and Asperger syndrome. Arch Gen Psychiatry. 2000;57(4):331–40. pmid:10768694.
  37. Surguladze SA, Marshall N, Schulze K, Hall MH, Walshe M, Bramon E, et al. Exaggerated neural response to emotional faces in patients with bipolar disorder and their first-degree relatives. Neuroimage. 2010;53(1):58–64. pmid:20595014.
  38. Scrimin S, Moscardino U, Capello F, Altoe G, Axia G. Recognition of facial expressions of mixed emotions in school-age children exposed to terrorism. Dev Psychol. 2009;45(5):1341–52. pmid:19702396.
  39. Matsumoto D, Ekman P. Japanese and Caucasian facial expressions of emotion (JACFEE) [Slides]. San Francisco, CA: Intercultural and Emotion Research Laboratory, Department of Psychology, San Francisco State University; 1988.
  40. Ekman P, Friesen W. Facial Action Coding System. Palo Alto: Consulting Psychologists Press; 1978.
  41. Susskind JM, Littlewort G, Bartlett MS, Movellan J, Anderson AK. Human and computer recognition of facial expressions of emotion. Neuropsychologia. 2007;45(1):152–62. pmid:16765997.
  42. Lin IM, Fan SY, Huang TL, Wu WT, Li SM. The Associations between Visual Attention and Facial Expression Identification in Patients with Schizophrenia. Psychiatry Investig. 2013;10(4):393–8. pmid:24474989.
  43. Zheng L, Chai H, Chen W, Yu R, He W, Jiang Z, et al. Recognition of facial emotion and perceived parental bonding styles in healthy volunteers and personality disorder patients. Psychiatry Clin Neurosci. 2011;65(7):648–54. pmid:22176284.
  44. Sareen J, Cox BJ, Afifi TO, Stein MB, Belik SL, Meadows G, et al. Combat and peacekeeping operations in relation to prevalence of mental disorders and perceived need for mental health care: findings from a large representative sample of military personnel. Arch Gen Psychiatry. 2007;64(7):843–52. pmid:17606818.
  45. Davidson AC, Mellor DJ. The adjustment of children of Australian Vietnam veterans: is there evidence for the transgenerational transmission of the effects of war-related trauma? Aust N Z J Psychiatry. 2001;35(3):345–51. pmid:11437808.
  46. Yehuda R, Bell A, Bierer LM, Schmeidler J. Maternal, not paternal, PTSD is related to increased risk for PTSD in offspring of Holocaust survivors. J Psychiatr Res. 2008;42(13):1104–11. pmid:18281061.
  47. Le Masurier M, Cowen PJ, Harmer CJ. Emotional bias and waking salivary cortisol in relatives of patients with major depression. Psychol Med. 2007;37(3):403–10. pmid:17109777.
  48. Lopez-Duran NL, Kuhlman KR, George C, Kovacs M. Facial emotion expression recognition by children at familial risk for depression: high-risk boys are oversensitive to sadness. J Child Psychol Psychiatry. 2013;54(5):565–74. pmid:23106941.
  49. Burkhouse KL, Woody ML, Owens M, McGeary JE, Knopik VS, Gibb BE. Sensitivity in detecting facial displays of emotion: Impact of maternal depression and oxytocin receptor genotype. Cogn Emot. 2015:1–13. pmid:25622005.
  50. De Albuquerque A, Soares C, De Jesus PM, Alves C. [Post-traumatic stress disorder (PTSD). Assessment of its rate of occurrence in the adult population of Portugal]. Acta Med Port. 2003;16(5):309–20. pmid:14750273.
  51. Pedras S, Pereira MG. Secondary Traumatic Stress Disorder in War Veterans' Adult Offspring. Mil Behav Health. 2014;2(1):52–8.
  52. Hornik K, Zeileis A, Meyer D. The strucplot framework: Visualizing multi-way contingency tables with vcd. J Stat Softw. 2006;17(3):1–48.
  53. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33(1):159–74. pmid:843571.
  54. Elfenbein HA, Mandal MK, Ambady N, Harizuka S, Kumar S. Hemifacial differences in the in-group advantage in emotion recognition. Cogn Emot. 2004;18(5):613–29.
  55. Langner O, Dotsch R, Bijlstra G, Wigboldus DHJ, Hawk ST, van Knippenberg A. Presentation and validation of the Radboud Faces Database. Cogn Emot. 2010;24(8):1377–88.
  56. Matsumoto D, Ekman P. The relationship among expressions, labels, and descriptions of contempt. J Pers Soc Psychol. 2004;87(4):529–40. pmid:15491276.
  57. Atkinson MJ, Lennox RD. Extending basic principles of measurement models to the design and validation of Patient Reported Outcomes. Health Qual Life Outcomes. 2006;4:65. pmid:16995937.
  58. Nunnally JC, Bernstein IH. Psychometric Theory. New York: McGraw-Hill; 1994.
  59. Kessler H, Bachmayr F, Walter S, Hoffmann H, Filipic S, Traue HC. Symmetries in a standardized set of emotional facial expressions (JACFEE). Psychosoc Med. 2008;5:Doc09. pmid:19742281.
  60. Matsumoto D, Ekman P. Commentary on "A New Series of Slides Depicting Facial Expressions of Affect" by Mazurski and Bond. Aust J Psychol. 1994;46(1):58.