Previous studies have shown that females and males differ in the processing of emotional facial expressions including the recognition of emotion, and that emotional facial expressions are detected more rapidly than are neutral expressions. However, whether the sexes differ in the rapid detection of emotional facial expressions remains unclear.
We measured reaction times (RTs) during a visual search task in which 44 females and 46 males detected normal facial expressions of anger and happiness or their anti-expressions within crowds of neutral expressions. Anti-expressions expressed neutral emotions with visual changes quantitatively comparable to normal expressions. We also obtained subjective emotional ratings in response to the facial expression stimuli. RT results showed that both females and males detected normal expressions more rapidly than anti-expressions and normal-angry expressions more rapidly than normal-happy expressions. However, females and males showed different patterns in their subjective ratings in response to the facial expressions. Furthermore, sex differences were found in the relationships between subjective ratings and RTs. High arousal was more strongly associated with rapid detection of facial expressions in females, whereas negatively valenced feelings were more clearly associated with the rapid detection of facial expressions in males.
Citation: Sawada R, Sato W, Kochiyama T, Uono S, Kubota Y, Yoshimura S, et al. (2014) Sex Differences in the Rapid Detection of Emotional Facial Expressions. PLoS ONE 9(4): e94747. doi:10.1371/journal.pone.0094747
Editor: André Aleman, University of Groningen, Netherlands
Received: August 7, 2013; Accepted: March 20, 2014; Published: April 11, 2014
Copyright: © 2014 Sawada et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Funding: This study was supported by funds from the Japan Society for the Promotion of Science Funding Program for Next Generation World-Leading Researchers. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing interests: The authors have declared that no competing interests exist.
Rapid communication via facial expressions is fundamental to human social interaction. The ability to immediately detect emotional signals from facial expressions has been understood as an evolutionary mechanism that enables the receiver to interpret emotional states and to anticipate future actions. Consistent with this notion, several behavioral studies have demonstrated the rapid detection of the emotional facial expressions of others.
Given the evidence for sex differences in other cognitive functions, such as verbal fluency and visual perception, it has been proposed that females and males differ in their processing of emotional facial expressions as a result of evolution. Consistent with this idea, empirical studies have reported sex differences in the ability to recognize emotion based on facial expressions. For example, several studies have found that females recognized emotional facial expressions more accurately or sensitively than did males. Meta-analytic studies have also supported a small but significant female advantage in the ability of adults and of children and adolescents to recognize the emotions portrayed in facial expressions. However, some studies have reported that females were superior at decoding facial expressions only for some emotional categories. Other studies have reported no sex differences in the recognition of emotional facial expressions. Taken together, these data appear to suggest sex differences in the recognition of emotional facial expressions, although the patterns characterizing such differences remain inconclusive.
Sex differences in subjective emotional reactions to emotional stimuli have also been demonstrated. Indeed, a previous study found sex differences in subjective emotional ratings of emotional facial expressions. Such data are consistent with another line of evidence regarding sex differences in emotional experience using non-facial stimuli, including scenes and autobiographical memories.
In contrast to studies showing sex differences in the processing of emotional facial expressions, findings on the rapid processing of emotional facial expressions have been unclear with regard to sex differences. The rapid detection of emotional facial expressions is a critical component of the processing of facial expressions and allows immediate responses to others or the environment. Several previous experimental studies using the visual search paradigm have demonstrated that the reaction time (RT) for detecting an emotional face (e.g., angry, happy) was shorter than that for detecting a neutral face, and such rapid detection was attributed to the emotional significance of the facial stimuli rather than to their visual features. However, these studies did not examine sex differences. Indeed, few studies have investigated sex differences in the detection of emotional facial expressions. These studies examined the detection of emotional facial expressions among crowds of neutral facial expressions but reported inconsistent findings: one study reported that males detected angry expressions more rapidly than did females, and the other reported no sex differences in the detection of emotional facial expressions. Thus, it remains difficult to draw conclusions about sex differences in the detection of emotional facial expressions from these findings. Additionally, these studies focused only on differences in the detection of different emotional facial expressions (e.g., anger vs. fear or anger vs. happiness) and did not compare the detection of emotional versus emotionally neutral facial expressions or consider the effect of visual factors. The possibility that females and males differ in the efficient detection of emotional versus neutral facial expressions under conditions in which visual features are controlled remains unexamined.
Furthermore, no study has assessed sex differences in the relationship between subjective emotional experience and the detection of emotion in response to emotional facial expressions. Several neurocognitive models have proposed that the efficient detection of emotional compared with neutral facial expressions may be accomplished through a process that first detects the emotional significance of facial expressions, with this emotional processing then modulating the visual processing of the expressions. Consistent with this notion, a previous study using the visual search paradigm showed that higher levels of arousal were related to faster detection of emotional facial expressions. These studies indicate that efficient attentional capture by emotional, relative to neutral, faces may enhance subjective awareness. However, the possibility that females and males differ with regard to the relationship between subjective emotional ratings and the detection of emotion in response to emotional facial expressions remains untested.
In the present study, we investigated sex differences in the detection of emotional facial expressions using the visual search paradigm. Following a previous study, we used facial expressions depicting anger and happiness as target stimuli presented within crowds of neutral expressions. We also presented their anti-expressions as targets, again following a previous study. Anti-expressions were created using a morphing technique that produced changes equivalent in magnitude to those in the normal emotional facial expressions relative to neutral expressions, but anti-expressions were usually recognized as emotionally neutral. This method allowed us to determine whether sex differences in detection performance were attributable to basic visual processing or to emotional significance. To investigate the emotional processes related to facial expression detection, we asked participants to rate their subjectively experienced arousal and valence. We also tested stimulus familiarity and naturalness as possible confounding factors. Previous studies showing a female advantage in the recognition of emotional facial expressions led to the expectation of sex differences in the detection or subjective ratings of emotional facial expressions. However, as mentioned above, evidence regarding sex differences in the processing of emotional facial expressions is not consistent, and data directly relevant to the present study are scarce. Therefore, we conducted an exploratory investigation of whether females and males differ in RTs for detecting normal versus anti-expressions. Furthermore, we investigated sex differences in subjective emotional experiences of arousal and valence and in the relationship between emotional experiences and detection performance in response to facial expressions.
Materials and Methods
This study was approved by the local ethics committee of Primate Research Institute, Kyoto University. Written informed consent was obtained from all participants.
Ninety volunteers (44 females and 46 males) participated in this study. Females (M ± SD age, 23.3±5.3 years) and males (22.5±3.4 years) did not differ with regard to age, t(88) = .8, p = .4. All participants were right-handed as assessed by the Edinburgh Handedness Inventory and had normal or corrected-to-normal visual acuity.
Normal and anti-expressions of angry and happy faces were used as target stimuli, and neutral expressions were used as distractor stimuli. The stimuli were identical to those used in a previous study. Each individual face subtended a visual angle of 1.8° horizontally × 2.5° vertically. Schematic illustrations of the stimuli are shown in Figure 1A.
Actual stimuli were photographic faces (see Figure 1 of the previous study).
Normal expressions were gray-scale photographs depicting angry, happy, and neutral expressions of one female (PF) and one male (PE) model chosen from a facial expression database. Neither model was familiar to any of the participants. No expression showed bared teeth.
Anti-expressions were created from these photographs using computer-morphing software (FUTON System, ATR) on a Linux computer. First, the coordinates of 79 facial feature points were identified manually and realigned based on the coordinates of the bilateral irises. Next, the differences between the feature points of the emotional (angry and happy) and neutral facial expressions were calculated. Then, the positions of the feature points for the anti-expressions were determined by moving each point by the same distance in the direction opposite from that in the emotional faces. Minor color adjustments by a few pixels were performed using Photoshop 5.0 (Adobe).
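The geometric core of this procedure, displacing each feature point by the same distance as in the emotional face but in the opposite direction from neutral, can be sketched in Python. This is an illustration only; the landmark coordinates below are hypothetical and not taken from the FUTON system.

```python
import numpy as np

def anti_expression_points(neutral, emotional):
    """Return anti-expression feature points: each point is moved by the
    same distance as in the emotional face, but in the opposite direction
    relative to the neutral face (i.e., 2 * neutral - emotional)."""
    neutral = np.asarray(neutral, dtype=float)
    emotional = np.asarray(emotional, dtype=float)
    displacement = emotional - neutral   # change produced by the emotion
    return neutral - displacement        # same magnitude, opposite direction

# Toy example with 3 of the 79 feature points (hypothetical x, y coordinates)
neutral_pts = [[10.0, 20.0], [30.0, 25.0], [50.0, 20.0]]
angry_pts   = [[10.0, 18.0], [30.0, 27.0], [50.0, 18.0]]
anti_angry = anti_expression_points(neutral_pts, angry_pts)
print(anti_angry)  # y-coordinates mirror the angry displacements: 22, 23, 22
```

Because the displacement magnitudes are identical, low-level visual change relative to neutral is matched between a normal expression and its anti-expression.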
Two types of adjustments were made to the stimuli using Photoshop 5.0. First, the photographs were cropped into a circle, slightly inside the frame of the face, to eliminate contours and hairstyles not relevant to the expression. Second, the photographs were prepared so that significant differences in contrast were eliminated, thereby removing possible identifying information.
We prepared eight positions, separated by 45° and arranged in a circle (10.0°×10.0°), for the presentation of stimulus faces. Stimuli occupied four of the eight positions; half were presented on the left side and half on the right side. A schematic illustration of an example stimulus display is presented in Figure 1B. Each combination of the four positions was presented an equal number of times. In the target-present trials, the position of the target stimulus was chosen randomly; however, the target stimulus was presented on the left side in half of the trials and on the right side in the other half. In the target-absent trials, all four faces were identical and depicted neutral expressions.
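The position layout can be illustrated with a short Python sketch. The 5.0° radius (half the 10.0° × 10.0° array) and the 22.5° angular offset are assumptions; the offset is chosen so that four candidate positions fall on each side of the vertical midline, matching the left/right split described above.

```python
import math

# Eight candidate positions, 45 deg apart on a circle (radius and offset assumed)
radius = 5.0
angles = [22.5 + 45.0 * i for i in range(8)]   # degrees
positions = [(round(radius * math.cos(math.radians(a)), 2),
              round(radius * math.sin(math.radians(a)), 2))
             for a in angles]
# Split the candidate positions by hemifield (negative x = left of fixation)
left = [p for p in positions if p[0] < 0]
right = [p for p in positions if p[0] > 0]
print(positions)
```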
Stimulus presentation was controlled by Presentation 14.9 (Neurobehavioral Systems) and was implemented on a Windows computer (HP Z200 SFF, Hewlett-Packard Company). The stimuli were presented on a 19-inch CRT monitor (HM903D-A, Iiyama) with a refresh rate of 150 Hz and a resolution of 1024×768 pixels. The refresh rate was confirmed by using a high-speed camera (EXILIM FH100, Casio) with a temporal resolution of 1000 frames/s.
The experiment was conducted in an electrically shielded and soundproofed room (Science Cabin, Takahashi Kensetsu). Participants sat in chairs with their chins fixed into steady positions 80 cm from the monitor. They were asked to keep their gaze on the fixation cross (0.9°×0.9°) at the center of the display when the cross was presented. Before the experiment began, participants engaged in 20 practice trials to gain familiarity with the apparatus.
The experiment consisted of a total of 432 trials presented in six blocks of 72 trials, with an equal number of target-present and target-absent trials. The trial order was pseudo-randomized. In each trial, the fixation cross was presented for 500 ms and then the stimulus array consisting of four faces was presented until participants responded. Participants were asked to respond as quickly and accurately as possible by pushing the appropriate button on a response box (RB-530, Cedrus) using their left or right index finger to indicate whether all four faces were the same or one face was different. The position of the response buttons was counterbalanced across participants.
After the visual search task, participants rated the target and distractor (neutral) facial stimuli, which were presented individually. They were asked to evaluate each stimulus in terms of emotional arousal and valence (i.e., the intensity and the nature, respectively, of the emotion that participants felt when perceiving the stimulus expression) using a nine-point scale ranging from 1 (low arousal and negative, respectively) to 9 (high arousal and positive, respectively). To test possible confounding factors, they were also asked to rate familiarity (i.e., the frequency with which they encountered facial expressions such as those depicted by the stimulus in daily life) and naturalness (i.e., the degree to which the expression depicted by the stimulus seemed natural) using a nine-point scale ranging from 1 (not at all) to 9 (very much). The order of facial stimuli and rating items during the rating task was randomized and balanced across participants.
All statistical tests for the behavioral data were performed using the SPSS 10.0J software (SPSS Japan), and statistical significance was set at p<.05.
RT. The mean RTs of correct responses in target-present trials were calculated for each condition, excluding measurements more than ±3 SD from the mean as artifacts. The RTs were then subjected to a three-way repeated-measures ANOVA with type (normal/anti-expression) and emotion (anger/happiness) as within-participant factors and sex (female/male) as a between-participant factor. Follow-up simple-effects analyses of significant interactions were conducted. When higher-order interactions were significant, the main effects and lower-order interactions were not interpreted because they would be qualified by the higher-order interactions.
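The ±3 SD artifact-rejection step can be sketched as follows. This is illustrative Python with hypothetical RT values, not the authors' SPSS procedure.

```python
import numpy as np

def mean_rt_excluding_outliers(rts, n_sd=3.0):
    """Mean RT after excluding measurements more than n_sd standard
    deviations from the mean (single-pass artifact rejection)."""
    rts = np.asarray(rts, dtype=float)
    m, sd = rts.mean(), rts.std()
    kept = rts[np.abs(rts - m) <= n_sd * sd]
    return kept.mean()

# Hypothetical correct-trial RTs (ms) for one condition; 5000 ms is an artifact
rts = np.concatenate([np.arange(600.0, 620.0), [5000.0]])
print(mean_rt_excluding_outliers(rts))  # 609.5 once the artifact is removed
```

Note that with very few trials a single extreme value inflates the SD enough to escape a ±3 SD criterion, so the rule is only effective with a reasonable number of observations per condition.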
Preliminary analyses showed that accuracy was high under all conditions (M ± SE %; normal-anger: 91.8±1.4, 91.9±1.6; normal-happy: 92.2±1.2, 91.7±1.4; anti-anger: 80.8±2.2, 78.9±2.5; anti-happy: 81.8±2.3, 80.6±2.4 for females and males, respectively), and we found no evidence of a speed–accuracy tradeoff. Therefore, we report only the RT results.
Each rating of arousal, valence, familiarity, and naturalness was analyzed according to the protocol used for the RT analysis (i.e., ANOVA and follow-up analyses).
Relationship between ratings and RTs.
Multiple regression analyses were performed to examine the relationship between subjective ratings and RTs. First, separate analyses were conducted for females and males. In these analyses, the mean RT for each participant under each condition (normal-anger, normal-happiness, anti-anger, and anti-happiness) was used as the dependent variable to test the between-response variability (vs. the between-participant variability). The independent variable was the rating for arousal, valence, familiarity, or naturalness (effect of interest), and dummy variables were used to represent participants (effects of no interest).
To examine sex differences in the relationship between ratings and RTs, we then tested for differences in the slopes of the regression lines for females and males. The independent variables were the interaction between sex and rating (effect of interest) and the main effects of sex and participant (effects of no interest). Adjusted RTs, calculated by partialling out the group mean and the effect of participant, were used to plot the relationship between ratings and RTs.
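A minimal sketch of this regression setup follows, using simulated data (all numbers are hypothetical; a steeper negative female arousal–RT slope is built in for illustration). One simplification relative to the text: the sex main effect is omitted from the design matrix here because sex is constant within each participant and is therefore already absorbed by the participant dummies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 6 participants (3 female, 3 male) x 4 conditions
n_participants, n_conditions = 6, 4
sex = np.repeat([0, 1], 3)                               # 0 = female, 1 = male
participant = np.repeat(np.arange(n_participants), n_conditions)
rating = rng.uniform(1, 9, n_participants * n_conditions)  # arousal ratings
sex_long = sex[participant].astype(float)

# Simulated RTs: females get slope -40 ms/rating, males -20 ms/rating
rt = 900 - (40 - 20 * sex_long) * rating + rng.normal(0, 10, rating.size)

# Design matrix: intercept, participant dummies (effects of no interest),
# rating, and the sex x rating interaction (effect of interest)
dummies = (participant[:, None] == np.arange(1, n_participants)).astype(float)
X = np.column_stack([np.ones_like(rating), dummies, rating, sex_long * rating])
beta, *_ = np.linalg.lstsq(X, rt, rcond=None)

print(beta[-2])  # rating slope for females (sex = 0): about -40
print(beta[-1])  # sex x rating interaction (male - female slope): about +20
```

A positive interaction coefficient here means the negative rating–RT slope is shallower in males than in females, which is the kind of slope difference the authors' second analysis tests.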
In terms of RTs (Figure 2), the three-way ANOVA with type, emotion, and sex as factors revealed significant main effects of type, F(1, 88) = 92.7, p<.001, and emotion, F(1, 88) = 6.8, p<.05, as well as a significant interaction between type and emotion, F(1, 88) = 21.8, p<.001. No other main effects or interactions were significant, F(1, 88)<.5, p>.1, indicating no significant sex differences in the RTs to facial targets.
AN = normal-anger; HA = normal-happiness; aAN = anti-anger; aHA = anti-happiness.
To assess the general patterns of RTs across females and males, follow-up analyses were conducted for the interaction of type × emotion. The results revealed that the simple effects of type were significant for both anger, F(1, 176) = 105.8, p<.001, and happiness, F(1, 176) = 16.5, p<.001, indicating shorter RTs for the normal expressions of both anger and happiness than for their anti-expressions. The simple effect of emotion was significant for normal expressions, F(1, 176) = 27.8, p<.001, indicating shorter RTs for normal-anger than for normal-happiness, but not for anti-expressions, F(1, 176) = 3.2, p = .08.
In terms of arousal ratings (Figure 3A), the three-way ANOVA with type, emotion, and sex as factors revealed a significant main effect of type, F(1, 88) = 218.1, p<.001, indicating higher arousal for normal expressions than for anti-expressions, and of emotion, F(1, 88) = 10.1, p<.005, indicating higher arousal for angry than for happy expressions. No other main effects or interactions were significant, F(1, 88)<1.5, p>.1, indicating no significant sex differences in evaluations of emotional arousal.
AN = normal-anger; HA = normal-happiness; aAN = anti-anger; aHA = anti-happiness.
In terms of valence ratings (Figure 3B), the three-way ANOVA revealed significant main effects of type, F(1, 88) = 17.7, p<.001; emotion, F(1, 88) = 233.0, p<.001; and sex, F(1, 88) = 7.1, p<.01. We also observed a significant two-way interaction between type and emotion, F(1, 88) = 332.0, p<.001, and a significant three-way interaction among type, emotion, and sex, F(1, 88) = 4.9, p<.05. No other interactions were significant, F(1, 88)<.6, p>.1. Follow-up analyses of the three-way interaction revealed a significant simple effect of sex for normal-happy, F(1, 352) = 7.6, p<.01, and anti-angry, F(1, 352) = 7.2, p<.01, expressions, indicating that females experienced more pleasant emotions in response to normal-happy and anti-angry expressions than did males.
With regard to familiarity ratings (Figure S1A), the three-way ANOVA revealed significant main effects of type, F(1, 88) = 51.3, p<.001, emotion, F(1, 88) = 157.7, p<.001, and sex, F(1, 88) = 4.7, p<.05, and a significant interaction between type and emotion, F(1, 88) = 240.4, p<.001. We observed no other significant interactions, F(1, 88)<1.4, p>.1. The main effect of sex indicated that females experienced the target stimuli as more familiar facial expressions than did males.
For naturalness ratings (Figure S1B), the three-way ANOVA revealed significant main effects of type, F(1, 88) = 19.3, p<.001, and emotion, F(1, 88) = 82.5, p<.001, and a significant two-way interaction between type and emotion, F(1, 88) = 137.8, p<.001. No other main effects or interactions were significant, F(1, 88)<1.0, p>.1, indicating no sex differences in evaluations of naturalness.
Relationship between ratings and RTs
With respect to the relationship between arousal ratings and RTs (Figure 4A), we first conducted separate multiple regression analyses for females and males. The results showed that the negative relationship between arousal ratings and RTs was significant in both females, t(131) = −6.9, p<.001, and males, t(137) = −5.5, p<.001, indicating that higher arousal ratings were related to shorter RTs for detecting facial expressions in both sexes. We next tested for sex differences in the relationship between arousal ratings and RTs. The results revealed a significant sex difference, F(2, 268) = 38.5, p<.001, indicating that females showed a more robust negative arousal–RT relationship than did males.
The scatter plots and regression lines indicate relationships between ratings and adjusted RTs.
In terms of the relationship between valence and RTs (Figure 4B), the multiple regression analyses for each sex showed a significant positive valence–RT relationship in both females, t(131) = 2.2, p<.05, and males, t(137) = 2.7, p<.01, indicating that more negative feelings were related to shorter RTs for detecting target facial expressions in both sexes. The test for differences in the slopes revealed a significant sex difference in the valence–RT relationship, F(2, 268) = 6.0, p<.01, indicating a stronger positive valence–RT relationship in males than in females.
No significant relationship was found between familiarity and RT in females, t(131) = .4, p = .7, or males, t(137) = 1.5, p = .1, or between naturalness and RT in females, t(131) = 1.0, p = .3, or males, t(137) = .4, p = .7. The test for differences in the slopes also revealed no sex difference in the relationship between familiarity and RT, F(2, 268) = 1.2, p = .3, or between naturalness and RT, F(2, 268) = .6, p = .6.
The general patterns of RTs across females and males showed that normal expressions of both anger and happiness were detected more rapidly than were their anti-expressions and that normal-angry expressions were detected more rapidly than were normal-happy expressions. The subjective ratings revealed that normal expressions elicited more arousal than did anti-expressions and that normal-angry expressions elicited more negatively valenced emotion than did normal-happy expressions. Moreover, regression analyses showed a significant negative relationship between arousal and RTs, indicating that higher levels of emotional arousal facilitated the rapid detection of facial expressions. Collectively, these results are consistent with those of a previous study. However, the regression analyses in this study also revealed a positive relationship between valence and RTs, indicating that more unpleasant feelings were associated with more rapid detection of facial expressions. This discrepancy may be attributable to the superior statistical power of the larger sample used here compared with that in the previous study (n = 90 vs. 17). Taken together, the RT and rating data indicate that humans, regardless of sex, detect emotional facial expressions more rapidly than anti-expressions, detect normal-angry expressions more rapidly than normal-happy expressions, and that such rapid detection of facial expressions is related to emotional elicitation.
More important, the current study investigated sex differences in the rapid detection of emotional facial expressions and the relationship between such differences and subjective emotion. The RTs showed that females and males performed equally in tasks involving the rapid detection of emotional versus neutral and angry versus happy facial expressions. This result is consistent with one previous visual search study that found no sex difference in the detection of emotional facial expressions of anger and happiness, but it is inconsistent with another previous study that reported male superiority in detecting angry expressions. However, these previous studies did not compare the detection of emotional versus emotionally neutral facial targets. This is the first study to show the absence of sex differences in the rapid detection of emotional versus neutral facial expressions. Furthermore, because the anti-expressions contained changes in visual features comparable to those in the emotional facial expressions, our results regarding emotional versus neutral expressions cannot be attributed to basic visual processes. Additionally, the difference in detection performance in response to angry versus happy facial expressions is difficult to explain in terms of visual factors because comparable detection results were obtained for their anti-expressions. In summary, our data suggest that females and males are equally efficient at detecting emotional versus neutral and angry versus happy expressions.
In contrast, we found sex differences in the subjective emotional ratings given in response to the facial expressions. Specifically, females accorded higher valence ratings to normal-happy and anti-angry expressions than did males. Because valence reflects the quality of an emotional experience, these results suggest that females experience qualitatively stronger feelings in response to others' emotional facial expressions than do males. These results are consistent with those of several previous studies showing that females were highly sensitive to their emotional experiences. The results are also in line with previous reports that females, compared with males, recognize more extreme emotions in emotional facial expressions. Our results suggest that females and males differ in their emotional reactions to facial expressions.
Furthermore, the results of the regression analyses revealed sex differences in the relationship between subjective ratings and RTs. The relationship between arousal and RTs, in which more arousing expressions were detected more rapidly, was more evident among females than among males. In contrast, the relationship between valence and RTs, in which more negatively experienced expressions were detected more rapidly, was more evident among males than among females. Taken together, these results indicate that females and males differ in their emotional reactions to others' facial expressions and that these differing reactions modulate the detection of facial expressions in different ways. Several neurocognitive models have proposed that the efficient detection of emotional facial expressions may initially involve processing the emotional significance of facial expressions in subcortical regions such as the amygdala. These models postulate that the result of such emotional processing then modulates the visual processing of facial expressions in cortical visual areas. Consistent with this assumption and with our results, several neuroimaging studies have shown that females and males exhibit different patterns of amygdala activation in response to emotional facial expressions. However, no study has shown the functional significance of such sex differences in amygdala activation with regard to emotional facial expressions. We believe the present study provides the first report of sex differences in the emotional modulation involved in the detection of emotional facial expressions. Our findings implicate sex differences in the neural mechanisms involved in the rapid detection of emotional facial expressions.
The sex difference in the relationship between subjective emotional ratings and the detection of emotional facial expressions appears consistent with evolutionary evidence regarding sex differences. Females have historically been more responsible for childrearing than have males, and it has been hypothesized that, as a result of this evolutionary role as primary caretakers, females show greater sensitivity to emotionally arousing facial expressions across the board, because mothers must respond rapidly to the emotional signals of their infants to increase the chances of infant survival. Our results showing sex differences in the psychological processes underlying the rapid detection of emotional facial expressions suggest that the enhanced emotional arousal demonstrated by females expedites their efficient detection of the important signals communicated by emotional facial expressions. In childrearing situations, these expressions are often related to the status of infants, and the ability of females to rapidly detect them helps to maintain infant health and produce prosocial outcomes. This quantitative modulation of the rapid detection of emotional facial expressions by subjective emotional processing in females may also be consistent with the general female advantage in processing emotional facial expressions shown in previous empirical studies. In contrast, from an evolutionary perspective, males are more likely than females to have been subjected to aggressive behavior from other males in the context of mating or hunting, and such situations can result in death or serious injury. In this context, some researchers have shown that males generally express and endorse emotions through their actions, including aggressive behaviors. Based on this literature, our results suggest that, among males, subjective negative feelings would accelerate the efficient detection of the aggressive signals communicated by others via emotional facial expressions.
This qualitative modulation of the detection of emotional facial expressions by subjective emotional processing is consistent with a previous study showing a male advantage in the rapid detection of angry faces and may account for discrepancies between our RT results and the female advantage in emotional processing noted in previous studies. Taken together, our findings suggest that differences in the evolutionary roles or traits of females and males may have led to the development of sex differences in the psychological processes underpinning the rapid detection of emotional facial expressions.
Irrespective of these possible evolutionary interpretations, it must be noted that learned factors may also account for the results. Some researchers have reported that social factors may contribute to sex differences in the psychological processes underlying the rapid detection of emotional facial expressions. For example, it has been shown that the intensity of reported emotions is correlated with belief in the stereotypical social roles of females and males. Specifically, females who believed more strongly in stereotypical role patterns reported more intense emotions in response to emotional scenes, whereas males who believed more strongly in stereotypical role patterns reported less intense emotions in response to such scenes. These data suggest that social factors, such as gender role stereotypes, may modulate the relationship between subjective emotional feelings and the rapid detection of emotional facial expressions.
Our results showed that detection RTs were not related to ratings of the familiarity or naturalness of facial targets, suggesting that these non-emotional processes cannot account for detection performance. However, irrespective of relationships involving facial expression detection, females reported more familiarity with the stimulus facial expressions than did males. This result suggests that females have a better memory for the various types of facial expressions they observe in daily life, which is consistent with a previous study reporting that females retained a better memory for emotional events than did males. These data suggest that sex differences in memory for emotional facial expressions may be a promising topic for future investigation.
Several limitations of the present study should be noted. First, we used stimulus faces of only one female and one male model from standardized materials of facial expressions, because the computer-morphing technique by which the anti-expressions were created can be applied only to faces with closed mouths. This approach may have confounded the effect of sex with that of the identity of the facial stimuli. Some previous studies have shown that the sex of the target stimuli affects the emotional processing of facial expressions, in that angry expressions depicted by male faces are recognized more accurately and rapidly, whereas happy expressions depicted by female faces are recognized more accurately and rapidly. Therefore, further investigations of the effect of the sex of target stimuli are warranted.
Second, we used only two emotional facial expressions, anger and happiness. Our primary purpose was to investigate sex differences in the detection of emotional compared with emotionally neutral facial expressions; for this purpose, target stimuli with both negative (anger) and positive (happiness) affect were appropriate. However, some researchers have found sex differences in the recognition of facial expressions depicting certain categories of emotion. Therefore, further investigations using more categories of emotion are needed to examine the effects of different emotional information on sex differences in detecting facial expressions.
In summary, our results showed no sex differences in the rapid detection of emotional compared with emotionally neutral expressions. However, we did observe sex differences in the subjective ratings of facial stimuli and in the relationship between ratings and RTs. Females reported stronger subjective emotional responses to the facial expressions of others than did males. Furthermore, emotional arousal enhanced the detection of facial expressions more strongly in females than in males, whereas negative feelings facilitated the detection of facial expressions more clearly in males than in females. These findings suggest that females and males differ in their subjective emotional reactions to facial expressions and that this difference leads to subsequent differences in the ways in which emotion modulates the detection of emotional facial expressions.
Mean (with SE) ratings of familiarity (A) and naturalness (B) for each target facial expression. AN = normal-anger; HA = normal-happiness; aAN = anti-anger; aHA = anti-happiness.
The authors would like to thank Kazusa Minemoto, Emi Yokoyama, and Akemi Inoue for help with participant recruitment and performance of the experiment.
Conceived and designed the experiments: RS WS TK SU YK SY MT. Performed the experiments: RS WS SU. Analyzed the data: RS WS. Contributed reagents/materials/analysis tools: RS WS. Wrote the paper: RS WS TK SU YK SY MT.
- 1. Ekman P (1997) Should we call it expression or communication? Innovation (Abingdon) 10: 333–344.
- 2. Williams MA, Moss SA, Bradshaw JL, Mattingley JB (2005) Look at me, I'm smiling: Visual search for threatening and nonthreatening facial expressions. Vis Cogn 12: 29–50.
- 3. Sato W, Yoshikawa S (2010) Detection of emotional facial expressions and anti-expressions. Vis Cogn 18: 369–388.
- 4. Kimura D (1999) Sex and cognition. Cambridge, MA: The MIT Press.
- 5. Babchuk WA, Hames RB, Thompson RA (1985) Sex differences in the recognition of infant facial expressions of emotion: The primary caretaker hypothesis. Ethol Sociobiol 6: 89–101.
- 6. Kret ME, De Gelder B (2012) A review on sex differences in processing emotional signals. Neuropsychologia 50: 1211–1221.
- 7. Katsikitis M, Pilowsky I, Innes JM (1997) Encoding and decoding of facial expression. J Gen Psychol 124: 357–370.
- 8. Thayer JF, Johnsen BH (2000) Sex differences in judgement of facial affect: A multivariate analysis of recognition errors. Scand J Psychol 41: 243–246.
- 9. Hall JA, Matsumoto D (2004) Gender differences in judgments of multiple emotions from facial expressions. Emotion 4: 201–206.
- 10. Hampson E, van Anders SM, Mullin LI (2006) A female advantage in the recognition of emotional facial expressions: test of an evolutionary hypothesis. Evol Hum Behav 27: 401–416.
- 11. Hoffmann H, Kessler H, Eppel T, Rukavina S, Traue HC (2010) Expression intensity, gender and facial emotion recognition: Women recognize only subtle facial emotions better than men. Acta Psychol 135: 278–283.
- 12. Hall JA (1978) Gender effects in decoding nonverbal cues. Psychol Bull 85: 845–857.
- 13. McClure EB (2000) A meta-analytic review of sex differences in facial expression processing and their development in infants, children, and adolescents. Psychol Bull 126: 424–453.
- 14. Goos LM, Silverman I (2002) Sex related factors in the perception of threatening facial expressions. J Nonverbal Behav 26: 27–41.
- 15. Montagne B, Kessels RPC, Frigerio E, De Haan EHF, Perrett DI (2005) Sex differences in the perception of affective facial expressions: Do men really lack emotional sensitivity? Cogn Process 6: 136–141.
- 16. Campbell R, Elgar K, Kuntsi J, Akers R, Terstegge J, et al. (2002) The classification of ‘fear’ from faces is associated with face recognition skill in women. Neuropsychologia 40: 575–584.
- 17. Grimshaw GM, Bulman-Fleming MB, Ngo C (2004) A signal-detection analysis of sex differences in the perception of emotional faces. Brain Cogn 54: 248–250.
- 18. Derntl B, Finkelmeyer A, Eickhoff S, Kellermann T, Falkenberg DI, et al. (2010) Multidimensional assessment of empathic abilities: Neural correlates and gender differences. Psychoneuroendocrinology 35: 67–82.
- 19. Wild B, Erb M, Bartels M (2001) Are emotions contagious? Evoked emotions while viewing emotionally expressive faces: Quality, quantity, time course and gender differences. Psychiatry Res 102: 109–124.
- 20. Barrett LF, Lane RD, Sechrest L, Schwartz GE (2000) Sex differences in emotional awareness. Pers Soc Psychol Bull 26: 1027–1035.
- 21. Fujita F, Diener E, Sandvik E (1991) Gender differences in negative affect and well-being: The case for emotional intensity. J Pers Soc Psychol 61: 427–434.
- 22. Seidlitz L, Diener E (1998) Sex differences in the recall of affective experiences. J Pers Soc Psychol 74: 262–271.
- 23. Hansen CH, Hansen RD (1988) Finding the face in the crowd: An anger superiority effect. J Pers Soc Psychol 54: 917–924.
- 24. Williams MA, Mattingley JB (2006) Do angry men get noticed? Curr Biol 16: R402–404.
- 25. Öhman A, Juth P, Lundqvist D (2010) Finding the face in a crowd: Relationships between distractor redundancy, target emotion, and target gender. Cogn Emot 24: 1216–1228.
- 26. Vuilleumier P (2009) The role of the human amygdala in perception and attention. The Human Amygdala. New York, NY: Guilford Press.
- 27. Pourtois G, Schettino A, Vuilleumier P (2013) Brain mechanisms for emotional influences on perception and attention: What is magic and what is not. Biol Psychol 92: 492–512.
- 28. Sato W, Kochiyama T, Uono S, Matsuda K, Usui K, et al. (2013) Rapid and multiple-stage activation of the human amygdala for processing facial signals. Commun Integr Biol 6: e24562.
- 29. Sato W, Yoshikawa S (2009) Anti-expressions: Artificial control stimuli for the visual properties of emotional facial expressions. Soc Behav Pers 37: 491–502.
- 30. Lang PJ, Bradley MM, Cuthbert BN (1998) Emotion and motivation: Measuring affective perception. J Clin Neurophysiol 15: 397–408.
- 31. Tong F, Nakayama K (1999) Robust representations for faces: Evidence from visual search. J Exp Psychol Hum Percept Perform 25: 1016–1035.
- 32. Oldfield RC (1971) The assessment and analysis of handedness: The Edinburgh inventory. Neuropsychologia 9: 97–113.
- 33. Ekman P, Friesen WV (1976) Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.
- 34. Kirk RE (1995) Experimental design: Procedures for the behavioral sciences (3rd). Pacific Grove, CA: Brooks/Cole Publishing Company.
- 35. Tabachnick BG, Fidell LS (2000) Computer-assisted research design and analysis. Boston, MA: Allyn and Bacon.
- 36. Canli T, Zhao Z, Brewer J, Gabrieli JDE, Cahill L (2000) Event-related activation in the human amygdala associates with later memory for individual emotional response. J Neurosci 20: RC99, 1–5.
- 37. Killgore WD, Yurgelun-Todd DA (2001) Sex differences in amygdala activation during the perception of facial affect. Neuroreport 12: 2543–2547.
- 38. Kempton MJ, Haldane M, Jogia J, Christodoulou T, Powell J, et al. (2009) The effects of gender and COMT Val158Met polymorphism on fearful facial affect recognition: A fMRI study. Int J Neuropsychopharmacol 12: 371–381.
- 39. Campbell A (1999) Staying alive: Evolution, culture, and women's intrasexual aggression. Behav Brain Sci 22: 203–214.
- 40. Joseph R (2000) The evolution of sex differences in language, sexuality, and visual–spatial skills. Arch Sex Behav 29: 35–66.
- 41. Buck R, Miller RE, Caul WF (1974) Sex, personality, and physiological variables in the communication of affect via facial expression. J Pers Soc Psychol 30: 587–596.
- 42. Grossman M, Wood W (1993) Sex differences in intensity of emotional experience: A social role interpretation. J Pers Soc Psychol 65: 1010–1022.
- 43. Becker DV, Kenrick DT, Neuberg SL, Blackwell KC, Smith DM (2007) The confounded nature of angry men and happy women. J Pers Soc Psychol 92: 179–190.
- 44. Brody LR, Hall JA (2008) Gender and emotion in context. Handbook of emotions: 395–408. New York, NY: Guilford Press.