
Individual differences in emoji comprehension: Gender, age, and culture

  • Yihua Chen ,

    Contributed equally to this work with: Yihua Chen, Xingchen Yang

    Roles Conceptualization, Data curation, Investigation, Writing – original draft

    Affiliation School of Psychology, University of Nottingham, University Park, Nottingham, United Kingdom

  • Xingchen Yang ,

    Contributed equally to this work with: Yihua Chen, Xingchen Yang

    Roles Conceptualization, Data curation, Investigation, Writing – original draft

    Affiliation School of Psychology, University of Nottingham, University Park, Nottingham, United Kingdom

  • Hannah Howman,

    Roles Conceptualization, Data curation, Formal analysis, Writing – review & editing

    Affiliation School of Psychology, University of Nottingham, University Park, Nottingham, United Kingdom

  • Ruth Filik

    Roles Conceptualization, Methodology, Project administration, Supervision, Writing – review & editing

    Affiliation School of Psychology, University of Nottingham, University Park, Nottingham, United Kingdom

Abstract

Emoji are an important substitute for non-verbal cues (such as facial expressions) in online written communication. So far, however, little is known about individual differences in how they are perceived. In the current study, we examined the influence of gender, age, and culture on emoji comprehension. Specifically, a sample of 523 participants across the UK and China completed an emoji classification task, in which they were presented with a series of emoji, each representing one of six facial emotional expressions, across four commonly used platforms (Apple, Android, WeChat, and Windows). Their task was to choose which of six emotion labels (happy, sad, angry, surprised, fearful, disgusted) was represented by each emoji. Results showed that all three factors (age, gender, and culture) had a significant impact on how emoji were classified. This has important implications for emoji use, for example, in conversations with partners from different cultures.

Introduction

Communication plays an essential role in our daily lives, during which we convey information to others using both verbal (e.g., speech) and non-verbal (e.g., facial expression) behaviour [1]. With the development of Information and Communication Technologies, computer-mediated communication (CMC) is widespread and growing exponentially, especially text-based communication [2]. There are many advantages of CMC, including facilitating communication regardless of time and location, facilitating the archiving of information, and maintaining the continuity of relationships [3]. Although online communication is convenient and efficient, Kaye et al. [1] found that many people would still choose to communicate face-to-face when they wish to convey emotions, since non-verbal information is lost in online communication, which may lead to unnecessary misunderstandings. Emoticons and emoji were introduced as a compensatory mechanism [4]. Given the increasing importance of emoji in everyday communication, the present study aims to investigate whether there are individual differences in the emotional labels that people assign to emoji representing facial emotional expressions (happy, sad, angry, fearful, surprised, disgusted). The specific individual differences examined in the current study are gender, age, and culture (UK and China).

Emoticons and emoji

Emoticons, such as :), are composed of keyboard characters, and were first designed by Scott Fahlman in 1982. Emoticons can be used to replace non-verbal communication, especially facial expressions; however, they may not be accurate enough to express emotional nuances, owing to the limited number of keyboard symbols [5]. Emoji are the next generation of emoticons, first developed by Japanese originator Shigetaka Kurita in the late 1990s. The main difference between emoticons and emoji is that emoji are colourful pictures (e.g., 😀 or 🐵) [6]. In those representing facial expressions, a circle is typically used as the boundary, and multiple facial expressions can be depicted within it. In other words, emoji are a highly simplified symbolic version of facial expressions which, compared to emoticons, can express more subtle and complex emotions and are richer in semantics [7]. By 2015, 92% of people communicating online were already using emoji [1]; thus, emoji have been described as “the world’s fastest-growing form of communication” [8, p.76]. In addition, they have arguably become a necessary component of online chat. For example, Oleszkiewicz et al. [9] analysed data from more than 86,000 Facebook users and found that 89.9% of users used at least one emoji in their posts.

Kaye et al. [1] conducted a survey to investigate the reasons why people were inclined to use emoji in online chat. The results demonstrated that sending emoji could promote positive interpersonal relationships. Specifically, emoji could create a relaxed chat atmosphere, promote communication and interaction, and reduce ambiguity in discourse to ensure that the recipient could understand the intended emotional meaning of the message. In addition, Prada et al. [7] argued that the main function of emoji is to express emotion and humour. Furthermore, Thompson et al. [10] found that participants tended to smile more when reading messages that were accompanied by an emoticon compared to the same message without an emoticon.

As well as facilitating online communication, emoji have been widely used in other fields such as marketing—for promotion and to attract attention [11], and to describe consumer emotions [12]. Other domains of application include medicine and public health, where emoji are used to guide people’s behaviour and improve doctor-patient communication [13, 14]. In addition, engineers in computer science use emoji for sentiment analysis to improve unsupervised machine learning [15], and educators apply emoji to improve learning efficiency [16].

However, some studies have shown that people’s interpretation of emoji may still be ambiguous [17, 18], suggesting not only differences in the reasons they use emoji, but also that the comprehension of emoji/emoticons may vary across individuals [19, 20]. Since emoji are such an important part of online communication and are increasingly used in many other fields, it is important to understand the factors that influence their interpretation.

Facial emotion and emoji

Results from brain imaging studies suggest that the brain areas activated when recognising real faces are similar to those activated when viewing more abstract faces such as emoticons [21–23]. Thus, the factors that lead to individual differences in the recognition of real faces may also influence people’s ability to understand emoji depicting facial expressions. Therefore, to determine which factors may cause individual differences in emoji identification, we first discuss results from experiments examining real facial emotion recognition.


Firstly, emoji comprehension may be influenced by gender. Some previous studies suggest that, on average, women demonstrate higher accuracy in emotion recognition than men [24, 25]. One potential explanation is the "primary caretaker hypothesis" [26]: accurate and rapid identification of infant emotions, especially facial expressions, is a very important part of infant care, as infant mortality has generally been high throughout human evolution. Regarding specific emotions, some researchers have argued that women are more sensitive to negative facial emotions than men, suggesting a negativity bias [27, 28]. Jones et al. [29] propose that this is partly because women have a higher rate of depression from adolescence onwards, and depressed individuals show a stronger negativity bias [30]. However, other researchers have suggested that there are no gender differences in emotion recognition or interpretation [31–33]. Research specifically examining emoji has mainly focused on emoji use rather than interpretation. For example, Chen et al.’s [34] study of 134,419 internet users suggested that males and females differ significantly in emoji usage, and emoji use has been applied in machine learning to predict users’ gender [35]. In the current study, we examine gender differences in emoji interpretation, rather than use. Given that most previous results suggest an accuracy advantage for women in recognising real facial emotions, we expect to observe higher accuracy for women than men in recognising facial expressions depicted by emoji, perhaps more so for negative facial expressions.


Age may also influence emoji comprehension. A large body of previous research suggests that young adults are more accurate in understanding the meaning of emotions than older adults [36, 37]. However, some researchers suggest that older adults are better at recognising emotional facial expressions of disgust than young adults [38, 39]. Hayes et al. [40] attributed older adults’ superior recognition of disgust to the stimuli used: they found that older adults were significantly better than young adults at recognising disgust from the Pictures of Facial Affect (POFA) [41], but worse than young adults on other image sets. The POFA image set is composed of black-and-white Ekman faces [41], which may improve the performance of older adults because they are more familiar with black-and-white images; notably, this advantage only appeared for disgust. This may explain why Calder et al. [38] and Orgeta and Phillips [39] found a tendency for older adults to be better at recognising disgust, since both studies showed participants static photographs from the POFA. Across studies, the dominant finding is that young adults are more accurate than older adults in recognising facial emotions. Therefore, in the present study we predict a negative association between age and accuracy in emoji recognition in general, with the question remaining as to whether this also holds for disgust.


In addition to gender and age, culture can play an important role in facial emotion recognition. Researchers initially believed that Westernised participants had higher accuracy in emotion recognition than participants from other countries [42]. However, Markham and Wang [43] suggested that there might be an in-group advantage, whereby participants identify faces from their own group better than faces from other groups. Furthermore, Elfenbein and Ambady [44] conducted a meta-analysis of these cross-cultural emotion recognition experiments. They verified that there is an in-group advantage in emotion recognition, and that this advantage becomes smaller when one group is exposed to another. More recently, Mandal and Awasthi [45] suggested that there may also be an out-group bias: individuals can apply decoding rules to out-group facial expressions, and may be less motivated to recognise the emotions of out-group members. Proposed underlying mechanisms include language [46], cognitive representations [47], and specific emotional and linguistic experiences in different cultures [48]. Elfenbein and Ambady’s [44] meta-analysis is consistent with this idea: they suggested that growing up in or being exposed to a culture means that people learn culture-specific elements of emotional behaviour, which results in an in-group advantage.

In relation to emoji, Kejriwal et al. [49] examined tens of millions of Twitter messages across 30 languages and countries and suggested that the usage and interpretation of emoji varies across countries. Guntuku et al. [50] analysed the use of emoji on Twitter and Weibo in China and the UK, and found that usage in China was significantly lower, indicating that Chinese people are less exposed to emoji. In the present study, we will examine two types of emoji that are frequently used on smartphones (Apple and Android) and one usually seen on computers (Windows). In addition, we will examine emoji from WeChat, which is widely used in China but rarely in the UK, allowing us to examine whether any cultural differences are mediated by familiarity and/or platform.

Aims and hypotheses

Although there is a wealth of literature on individual differences in real facial expression recognition, there are relatively few studies on individual differences in the comprehension of emotional facial expressions depicted by emoji. In the current study, in order to examine whether there are similarities between the classification of facial expressions depicted by emoji and those of real faces, we investigated individual differences relating to gender, age, and culture in recognising emoji representing six emotions (happy, disgusted, fearful, sad, surprised, and angry, following Ekman & Friesen [41]) on four different platforms (Apple, Android, Windows, and WeChat). Based on the literature reviewed above, if emoji recognition is similar to emotion recognition in real faces, our hypotheses are: (1) there will be gender differences in emoji classification accuracy, with an accuracy advantage predicted for women, potentially more so for negative emotions; (2) there will be age differences in emoji classification accuracy, with an accuracy advantage for younger participants (although this advantage may not be observed for disgust); and (3) there will be cultural differences in emoji recognition accuracy, with an accuracy advantage expected for UK participants, although this effect may be mediated by familiarity and/or platform.

Methods

Participants

A power analysis was conducted using the shiny package (version 1.7.1) [51]. As both familiarity and platform would be added into the model as mediators in the analysis of cultural differences, we calculated power using the ‘Two Parallel Mediators’ model, where the objective was set to ‘Set Power, Vary N’. A power level of 0.80, with a minimum sample size of 50, a maximum sample size of 530, and a step size of 10 was selected. In addition, the number of replications was set to 5000, with Monte Carlo Draws per Rep set to 20,000, with a 95% confidence interval, corresponding to an α of 0.05. The correlation coefficients were all set at 0.5, with standard deviations of 1.00. The power analysis demonstrated that a minimum of 130 participants would be required to ensure statistical power of at least 80% for detecting the hypothesised indirect effect of mediator 1, and a minimum of 130 for mediator 2. This adds up to a minimum requirement of 260 participants.
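The Monte Carlo logic behind this kind of power analysis can be sketched in a few lines. The sketch below is a hypothetical, simplified illustration, not the ‘Two Parallel Mediators’ model actually run: it uses a single mediator, made-up standardised path coefficients of 0.3, and the joint-significance criterion (both paths individually significant) as the test of the indirect effect.

```python
import random
import math

def slope_and_se(x, y):
    """Simple OLS slope of y on x, with its standard error."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / sxx
    resid = [c - my - b * (a - mx) for a, c in zip(x, y)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
    return b, se

def power_indirect(n, a=0.3, b=0.3, reps=300, seed=0):
    """Estimate power for an indirect effect via the joint-significance test."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        x = [rng.gauss(0, 1) for _ in range(n)]
        m = [a * xi + rng.gauss(0, 1) for xi in x]   # a-path
        y = [b * mi + rng.gauss(0, 1) for mi in m]   # b-path
        a_hat, a_se = slope_and_se(x, m)
        b_hat, b_se = slope_and_se(m, y)
        # Indirect effect counted as detected if both paths are significant.
        if abs(a_hat / a_se) > 1.96 and abs(b_hat / b_se) > 1.96:
            hits += 1
    return hits / reps

for n in (50, 250):
    print(n, power_indirect(n))
```

Running this over a grid of sample sizes, and stopping at the smallest n whose estimated power exceeds 0.80, reproduces the ‘Set Power, Vary N’ objective described above in miniature.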

A total of 523 participants were recruited through Prolific, SurveyCircle, and Facebook survey exchange groups. Participants consisted of 253 Chinese and 270 English adults (268 women and 255 men). Ages ranged from 18 to 84 years old (M = 36.72, SD = 13.39). Participation was voluntary, and participants had the chance to obtain a £20 Amazon voucher via a prize draw. The study was performed in accordance with the ethical standards of the Declaration of Helsinki (University of Nottingham School of Psychology ethics committee reference code: 720). Data were collected between 25th July 2020 and 31st January 2021.

Materials and design

A total of 24 emoji were used in the classification task (see Fig 1). They represented six emotions (happy, disgusted, fearful, sad, surprised, and angry) across four different formats (Apple, Android, Windows, and WeChat). All emoji were 72 × 72 pixels in size. Android, Apple, and Windows emoji were taken from the Unicode Consortium’s Common Locale Data Repository; WeChat emoji were taken from the WeChat official website, as this format is not provided in Unicode. The images provided on the WeChat website were smaller than those provided in Unicode (48 × 48 pixels), so they were magnified to 72 × 72 pixels to keep the size consistent across platforms. Each emoji was presented to participants once, in a random order.

Fig 1. Emoji stimuli for all six emotions across four platforms.

Procedure

There were two versions of the survey, one in English and one in Chinese, and participants were free to choose which one they completed. After giving informed consent, participants were asked for demographic information such as gender, age, and ethnicity. Participants who chose “Chinese” or “English” in the ethnicity column continued to the next step; those who chose “Other” were taken directly to the end page.

Participants then completed the emoji classification task. The 24 emoji were presented one at a time in random order. The task was to indicate which emotion each emoji represented by choosing from six emotional labels. After selecting the emotion label, participants also needed to indicate their familiarity with the emoji. Familiarity was assessed on a five-point scale: "not at all familiar", "slightly familiar", "somewhat familiar", "moderately familiar", "extremely familiar". Participants did not receive feedback about their accuracy. After completing the emoji classification task, participants completed the HEXACO-60 personality scale (as part of a student project). The HEXACO data are not reported here (as they are not relevant to our research questions) and will not be published separately elsewhere. At the end of the experiment, participants were shown a debrief sheet, and given the chance to enter a prize draw to win a £20 Amazon voucher. The study lasted approximately 10–15 minutes, and participants were free to withdraw at any point.

Data analysis

Data analysis was conducted in R (version 4.0.1) [52], where we used causal mediation analysis (lavaan package, version 0.6-9) [53] to investigate whether there was an effect of gender, age, or culture on emoji recognition, and whether these effects were mediated by familiarity. We examined the six types of emoji (happy, surprised, disgusted, fearful, sad, and angry) separately. We created familiarity scores by averaging participants’ familiarity ratings for each emoji type, with a higher score indicating higher familiarity with that emoji type (ratings were coded numerically before averaging: not at all familiar = 1, slightly familiar = 2, somewhat familiar = 3, moderately familiar = 4, extremely familiar = 5).
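Constructing the familiarity scores amounts to averaging each participant’s 1–5 ratings within an emoji type. A minimal illustration (the analyses themselves were run in R; the record layout and values here are made up):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical trial-level records: (participant, emotion, familiarity 1-5).
trials = [
    ("p1", "happy", 5), ("p1", "happy", 4), ("p1", "sad", 3),
    ("p2", "happy", 2), ("p2", "sad", 1), ("p2", "sad", 2),
]

# Collect ratings per participant-by-emotion cell, then average.
ratings = defaultdict(list)
for pid, emotion, fam in trials:
    ratings[(pid, emotion)].append(fam)

familiarity = {cell: mean(vals) for cell, vals in ratings.items()}
```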

The first step was to investigate whether there was a direct effect of gender, age, or culture on emoji recognition: for example, Accuracy ~ Gender. If there was no direct effect, the analysis ended there. If there was a direct effect, the second step was to compare the direct and indirect effects (e.g., Accuracy ~ Gender*Familiarity) on emoji recognition, where the effects were calculated with 5,000 bootstrapped samples. For the culture analyses, we also included platform as a second mediator, and post-hoc analysis for platform was conducted using the lsmeans package (version 2.30-0) [54], utilising a binomial generalised linear model with the significance threshold adjusted using the Bonferroni method. We report detailed results of the analyses, including the following model parameters: estimate (B), standard error (SE), standardised estimate (β), z-value, p-value (p), adjusted R-squared (R2), and 95% confidence intervals. During pre-processing, missing values were removed; this accounted for < 0.001% of the data.
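The two-step logic (test the direct effect, then bootstrap the indirect effect) can be illustrated outside lavaan. The sketch below is a hypothetical product-of-coefficients bootstrap for a single mediator on simulated data, not the study’s model or data: the a-path is the slope of the mediator on the predictor, and the b-path is the partial slope of the outcome on the mediator after residualising out the predictor (the Frisch–Waugh trick).

```python
import random

rng = random.Random(42)

# Simulated stand-ins for the real variables: x = gender (0/1),
# m = familiarity, y = accuracy. Effect sizes are made up.
n = 300
x = [float(rng.random() < 0.5) for _ in range(n)]
m = [2.0 + 0.8 * xi + rng.gauss(0, 1) for xi in x]            # a-path built in
y = [0.3 + 0.2 * mi + 0.05 * xi + rng.gauss(0, 0.5)
     for xi, mi in zip(x, m)]                                 # b-path + direct

def slope(u, v):
    """OLS slope of v regressed on u."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return (sum((a - mu) * (b - mv) for a, b in zip(u, v))
            / sum((a - mu) ** 2 for a in u))

def residuals(v, u):
    """Residuals of v after regressing it on u."""
    b = slope(u, v)
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return [bi - mv - b * (ai - mu) for ai, bi in zip(u, v)]

def indirect(xs, ms, ys):
    a = slope(xs, ms)                                # x -> m
    b = slope(residuals(ms, xs), residuals(ys, xs))  # m -> y, controlling x
    return a * b

# Percentile bootstrap CI (1,000 resamples here; the study used 5,000).
boots = []
for _ in range(1000):
    idx = [rng.randrange(n) for _ in range(n)]
    boots.append(indirect([x[i] for i in idx],
                          [m[i] for i in idx],
                          [y[i] for i in idx]))
boots.sort()
est, lo, hi = indirect(x, m, y), boots[24], boots[974]
print(f"indirect effect = {est:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

If the bootstrap confidence interval excludes zero, familiarity carries part of the effect; whether the remaining direct effect also stays significant determines partial versus full mediation, which is the distinction reported in Tables 1–4.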

Results

Overall emoji recognition accuracy

See Figs 2 and 3 for accuracy and familiarity descriptive statistics for all emoji. A one-way MANOVA was conducted to determine whether the type of emoji had an effect on accuracy and familiarity. Using Pillai’s trace, there was a significant difference in accuracy and familiarity based on the type of emoji, V = 0.89, F(10, 25074) = 307, p < .001, partial η² = .11. Separate univariate ANOVAs revealed significant effects of type of emoji on accuracy, F(5, 12537) = 527.79, p < .001, and familiarity, F(5, 12537) = 207.96, p < .001. To directly address our predictions, we now present separate analyses focusing on each individual-difference factor in turn (gender, age, and culture).

Fig 2. Mean accuracy scores for the six emoji (error bars represent SEM).

Fig 3. Mean familiarity scores for the six emoji (error bars represent SEM).

Gender effects

There were significant direct effects of gender for happy, fearful, sad, and angry emoji, showing higher classification accuracy for women than men. Mediation analyses showed that accuracy effects were not mediated by familiarity for happy, fearful, or sad emoji, but were partially (but not fully) mediated by familiarity for angry emoji. There were no direct effects of gender for surprised or disgusted emoji (see Figs 4 and 5 for descriptive statistics and Table 1 for mediation analysis).

Fig 4. Mean accuracy scores for the six emoji across genders (error bars represent SEM).

Fig 5. Mean familiarity scores for the six emoji across genders (error bars represent SEM).

Table 1. Gender: Models, parameters, and Confidence Intervals (CI).

Age effects

There were significant direct effects of age for surprised, fearful, sad, and angry emoji, such that the older the participant, the less accurate they were. Effects were not mediated by familiarity for surprised or fearful emoji, and were partially (but not fully) mediated by familiarity for sad and angry emoji. The direct effect of age for disgusted emoji was no longer significant once familiarity was included as a mediator, and there were no direct effects of age for happy emoji (see Table 2 for mediation analysis).

Table 2. Age: Models, parameters, and Confidence Intervals (CI).

Culture effects

There were significant direct effects of culture for happy, sad, surprised, fearful, and angry emoji, such that participants from the UK had higher accuracy scores than Chinese participants. These effects were not mediated by familiarity for surprised, fearful, or angry emoji, and were partially (but not fully) mediated by familiarity for happy and sad emoji. None of the direct effects were mediated by platform. There was no direct effect of culture for disgusted emoji (see Figs 6 and 7 and Table 3 for descriptive statistics, and Table 4 for mediation analysis).

Fig 6. Mean accuracy scores for the six emoji across cultures (error bars represent SEM).

Fig 7. Mean familiarity scores for the six emoji across cultures (error bars represent SEM).

Table 3. Descriptive statistics: Culture and platform–accuracy.

Table 4. Culture: Models, parameters, and Confidence Intervals (CI).

Discussion

The aim of the present study was to investigate individual differences relating to gender, age, and culture, in classifying emoji representing facial emotional expressions. Overall, the results suggested that all of the factors under investigation (gender, age, and culture) had a significant impact on how emoji were classified.

Gender effects

The findings relating to gender showed some classification accuracy advantages for women, which supports previous research on facial emotion recognition [24, 25]. However, it does not necessarily support the suggestion that this is due to women displaying a negativity bias [27, 28], as women also showed higher accuracy for ‘happy’ emoji in the current study.

Notably, women were not more accurate in recognising all emoji types; there were no gender differences for surprised or disgusted emoji. The identification of surprise necessitates processing of both the upper (the eyes) and lower (nose and mouth) parts of the face [55–57]. According to the findings of Abbruzzese et al. [58] and Sullivan et al. [57], females are more likely to look at the eyes and males look more at the mouth, which may result in similarly high accuracy in identifying surprised expressions for both genders.

Interestingly, recognition of disgust was not influenced by gender, age, or culture. Previous facial expression research [59, 60] and emoji research [61] suggested that disgust and fear are the least well recognised emotions, a finding mirrored in the present results. According to Lambrecht et al. [62], there is no difference between females and males in recognising disgust, possibly because disgust rarely appears in daily life [63].

Age effects

The results showed an overall accuracy advantage for younger participants, apart from happy and disgusted emoji. This finding partly converges with previous findings that young adults were more accurate than older adults in recognising emotions [36, 37]. It also accords with previous findings suggesting that the recognition of disgust is preserved during aging [38, 39], and further suggests that this finding may not entirely be due to the black and white stimuli used in previous studies. It may be that compared with younger adults, older adults were more likely to focus on the bottom half of the face (nose and mouth), leading to better recognition of disgusted and happy emotions, which could increase the accuracy of older adults and decrease the difference between younger and older individuals [55, 57]. The lack of an age-related decline for recognising happy emoji could also be due to the purported ‘positivity bias’ [64] observed in older adults, facilitating recognition in this case.

Although the current findings could be interpreted as suggesting that older adults may benefit less from the paralinguistic cues provided by emoji than young adults (at least for some emotions), it is important to note that some recent research has suggested that older adults may benefit from the use of emoji to clarify the meaning of more complex and ambiguous messages, such as those intended to be interpreted sarcastically [65]. Thus, further research is needed to investigate age-related differences in the interpretation of emoji both in isolation and in conjunction with written messages.

Culture effects

Results showed a general accuracy advantage for UK compared to Chinese participants, in that UK participants were more accurate in identifying all emotions except disgust. These effects were partially mediated by familiarity for happy and sad emoji only, suggesting that the effect cannot be fully explained by UK participants being more familiar with the emoji that were used. The lack of mediation by platform also supports this suggestion. However, even though Chinese people use WeChat much more frequently than people in the UK, they may nevertheless not be regular users of WeChat emoji: a recent report [66] showed that the range of emoji they used was very limited.

Additionally, Chinese internet users may use emoji for different communication purposes. For example, it has been suggested that they seldom use the happy emoji to express happiness; instead, they use it for negative meanings such as sarcasm [67]. Similarly, Guntuku et al. [50] analysed the emoji posted on Twitter and Weibo in four countries and demonstrated that in some areas, Eastern people tended to refer to the same thing with entirely different emoji compared to Western people, and this is also supported by Kejriwal et al. [49]. Thus, Chinese participants may show lower ‘accuracy’ because they tend to use these emoji in a different way. Finally, the differences observed between UK and Chinese participants may also be due to the distinctions between the six ‘universal’ emotions used in the current study (following Ekman & Friesen [41]) being less clear cut for Eastern than Western participants [68]. This suggests the need for future research focusing on a more nuanced examination of cultural differences in emoji identification and use across different contexts.

Limitations and future directions

There are some limitations to the current study which need to be considered. As discussed above, the current experiment examined the interpretation of emoji in isolation, to allow us to assess whether similar factors influence the recognition of facial expressions depicted by emoji as those that influence real facial expressions. Like real facial expressions, emoji are more frequently interpreted in context, which may lead to more nuanced effects. In addition, the current study focused on emoji representing the six ‘universal’ emotions, and interestingly, different effects were observed for different emotions, suggesting that any accuracy advantages observed may be related to specific emoji. Thus, future studies should examine individual differences in the interpretation of a wider selection of frequently used emoji both in and out of context. Interestingly, recent research investigating emotions that are evoked by more complex situations (using video-based stimuli) has developed cutting-edge techniques to examine individual differences in emotion profiles [69]. Such an approach could be adopted in future research on emoji, particularly when considering responses to emoji conveying more complex, context-dependent information than those investigated in the current study.

Finally, in our study, we did not ask participants for demographic information beyond gender, age, and culture. In future, it would be interesting to consider a more detailed profile of the participants, including further individual differences which might be relevant to facial emotion recognition ability. Indeed, there is emerging evidence that autistic traits, alexithymia, and attachment styles may interact to influence emoji classification accuracy [e.g., 70], but it is as yet unknown how these individual differences factors may interact with the factors investigated in the current study. In relation to this, a further interesting avenue for research would be to examine whether incorporating more visual cues can facilitate online communication in these diverse populations.

Conclusions

In conclusion, the results from the current study revealed a range of individual differences in how participants classified emoji representing facial emotional expressions. Although a number of findings were in line with those from previous studies on ‘real’ facial emotion recognition, suggesting that similar individual differences may come into play across the two domains, the whole story is likely to be more complex. For instance, our findings in relation to age and culture highlight the importance of context in emoji use, for example, the possibility that participants in China may commonly use the ‘smile’ emoji for purposes other than signifying happiness, which means some ‘universal’ facial emotions may not be ‘universal’ when transferred to emoji. The current results have important implications when considering emoji use in online communication, for example, with conversation partners from different cultures or of different ages. Given the broad and expanding use of emoji in other domains, the finding of individual differences in their interpretation also has more wide-reaching implications, for instance, in improving classification accuracy in sentiment analysis [71, 72], and in digital advertising, where multinational corporations may need to use different emoji for marketing purposes in different nations.

References

  1. Kaye LK, Malone SA, Wall HJ. Emojis: Insights, affordances, and possibilities for psychological science. Trends Cogn Sci. 2017 Feb;21(2):66–8. pmid:28107838
  2. Marengo D, Giannotta F, Settanni M. Assessing personality using emoji: An exploratory study. Pers Individ Dif. 2017;112:74–78.
  3. Juhasz A, Bradford K. Mobile phone use in romantic relationships. Marriage Fam Rev. 2016 Nov 16;52(8):707–21.
  4. Walther JB, D’Addario KP. The impacts of emoticons on message interpretation in computer-mediated communication. Soc Sci Comput Rev. 2001 Aug;19(3):324–47.
  5. Riordan MA, Kreuz RJ. Emotion encoding and interpretation in computer-mediated communication: Reasons for use. Comput Human Behav. 2010 Nov 1;26(6):1667–73.
  6. Rodrigues D, Prada M, Gaspar R, Garrido MV, Lopes D. Lisbon Emoji and Emoticon Database (LEED): Norms for emoji and emoticons in seven evaluative dimensions. Behav Res Methods. 2018 Feb;50:392–405. pmid:28364283
  7. Prada M, Rodrigues DL, Garrido MV, Lopes D, Cavalheiro B, Gaspar R. Motives, frequency and attitudes toward emoji and emoticon use. Telemat Inform. 2018 Oct 1;35(7):1925–34.
  8. Kerslake L, Wegerif R. The Semiotics of Emoji: The rise of visual language in the age of the internet. Media Commun. 2017 Dec 21;5(4):75–8.
  9. Oleszkiewicz A, Karwowski M, Pisanski K, Sorokowski P, Sobrado B, Sorokowska A. Who uses emoticons? Data from 86 702 Facebook users. Pers Individ Dif. 2017 Dec 1;119:289–95.
  10. Thompson D, Mackenzie IG, Leuthold H, Filik R. Emotional responses to irony and emoticons in written language: Evidence from EDA and facial EMG. Psychophysiology. 2016 Jul 1;53(7):1054–62. pmid:26989844
  11. Leung CH, Chan WTY. Using emoji effectively in marketing: An empirical study. Journal of Digital & Social Media Marketing. 2017;5(1):76–95.
  12. Li YM, Lin L, Chiu SW. Enhancing targeted advertising with social context endorsement. International Journal of Electronic Commerce. 2014 Oct 5;19(1):99–128.
  13. Gaube S, Tsivrikos D, Dollinger D, Lermer E. How a smiley protects health: A pilot intervention to improve hand hygiene in hospitals by activating injunctive norms through emoticons. PLOS ONE. 2018 May 21;13(5):e0197465. pmid:29782516
  14. Troiano G, Nante N. Emoji: What does the scientific literature say about them?-A new way to communicate in the 21st century. J Hum Behav Soc Environ. 2018 May 19;28(4):528–33.
  15. LeCompte T, Chen J. Sentiment Analysis of Tweets Including Emoji Data. In: 2017 International Conference on Computational Science and Computational Intelligence (CSCI). IEEE; 2017. p. 793–8.
  16. 16. Brody N, Caldwell L. Cues filtered in, cues filtered out, cues cute, and cues grotesque: Teaching mediated communication with emoji Pictionary. Commun Teach. 2019 Apr 3;33(2):127–31.
  17. 17. Miller HJ, Thebault-Spieker J, Chang S, Johnson I, Terveen L, Hecht B. “Blissfully happy” or “ready to fight”: Varying interpretations of emoji. Tenth International AAAI Conference on Web and Social Media. 2016 Mar 31;
  18. 18. Tigwell GW, Flatla DR. Oh that’s what you meant! In: Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct. New York, NY, USA: ACM; 2016. p. 859–66.
  19. 19. Howman HE, Filik R. The role of emoticons in sarcasm comprehension in younger and older adults: Evidence from an eye-tracking experiment. Q J Exp Psychol. 2020;73(11):1729–1744. pmid:32338575
  20. 20. Hand CJ, Kennedy A, Filik R, Pitchford M, Robus CM. Emoji identification and emoji effects on sentence emotionality in ASD-diagnosed adults and neurotypical controls. J Autism Dev Disord. 2023 Jun; 53(6): 2514–28 pmid:35415776
  21. 21. Embick D, Marantz A, Miyashita Y, O’Neil W, Sakai KL. A syntactic specialization for Broca’s area. Proc Natl Acad Sci U S A. 2000 May 23;97(11):6150–4. pmid:10811887
  22. 22. Yuasa M, Saito K, Mukawa N. Brain activity associated with graphic emoticons. The effect of abstract faces in communication over a computer network. Electrical Engineering in Japan. 2011 Nov 30;177(3):36–45.
  23. 23. Weiß M, Gutzeit J, Rodrigues J, Mussel P, Hewig J. Do emojis influence social interactions? Neural and behavioral responses to affective emojis in bargaining situations. Psychophysiology. 2019 Apr;56(4):e13321. pmid:30628097
  24. 24. Hampson E, Vananders S, Mullin L. A female advantage in the recognition of emotional facial expressions: test of an evolutionary hypothesis. Evol Hum Behav. 2006 Nov 1;27(6):401–16.
  25. 25. Thompson AE, Voyer D. Sex differences in the ability to recognise non-verbal displays of emotion: A meta-analysis. Cogn Emot. 2014 Oct 3;28(7):1164–95. pmid:24400860
  26. 26. Babchuk WA, Hames RB, Thompson RA. Sex differences in the recognition of infant facial expressions of emotion: The primary caretaker hypothesis. Ethol Sociobiol. 1985;6(2):89–101.
  27. 27. Connolly HL, Lefevre CE, Young AW, Lewis GJ. Sex differences in emotion recognition: Evidence for a small overall female superiority on facial disgust. Emotion. 2019 Apr 1;19(3):455–64. pmid:29781645
  28. 28. Hall JA, Matsumoto D. Gender differences in judgments of multiple emotions from facial expressions. Emotion. 2004 Jun;4(2):201–6. pmid:15222856
  29. 29. Jones LL, Wurm LH, Norville GA, Mullins KL. Sex differences in emoji use, familiarity, and valence. Comput Human Behav. 2020;108:106305.
  30. 30. Roiser JP, Elliott R, Sahakian BJ. Cognitive mechanisms of treatment in depression. Neuropsychopharmacology. 2012;37:117–36. pmid:21976044
  31. 31. Grimshaw GM, Bulman-Fleming MB, Ngo C. A signal-detection analysis of sex differences in the perception of emotional faces. Brain and Cognition. 2004 Apr 1;54(3):248–50. pmid:15050785
  32. 32. Rahman Q, Wilson GD, Abrahams S. Sex, sexual orientation, and identification of positive and negative facial affect. Brain and Cognition. 2004 Apr 1;54(3):179–85. pmid:15050772
  33. 33. Jaeger SR, Xia Y, Lee PY, Hunter DC, Beresford MK, Ares G. Emoji questionnaires can be used with a range of population segments: Findings relating to age, gender and frequency of emoji/emoticon use. Food Qual Prefer. 2018 Sep 1;68:397–410.
  34. 34. Chen Z, Lu X, Ai W, Li H, Mei Q, Liu X. Through a gender lens: Learning usage patterns of emojis from large-scale android users. In Proceedings of the 2018 world wide web conference 2018 Apr 23 (pp. 763–772).
  35. 35. Koch TK, Romero P, Stachl C. Age and gender in language, emoji, and emoticon usage in instant messages. Comput Human Behav. 2022 Jan 1;126:106990.
  36. 36. Ebner NC, Johnson MK. Young and older emotional faces: Are there age group differences in expression identification and memory? Emotion. 2009 Jun;9(3):329–39. pmid:19485610
  37. 37. Mill A, Allik J, Realo A, Valk R. Age-related differences in emotion recognition ability: A cross-sectional study. Emotion. 2009;9(5):619–30. pmid:19803584
  38. 38. Calder AJ, Keane J, Manly T, Sprengelmeyer R, Scott S, Nimmo-Smith I, et al. Facial expression recognition across the adult life span. Neuropsychologia. 2003 Jan 1;41(2):195–202. pmid:12459217
  39. 39. Orgeta V, Phillips LH. Effects of age and emotional intensity on the recognition of facial emotion. Exp Aging Res. 2007 Dec 26;34(1):63–79.
  40. 40. Hayes GS, McLennan SN, Henry JD, Phillips LH, Terrett G, Rendell PG, et al. Task characteristics influence facial emotion recognition age-effects: A meta-analytic review. Psychol Aging. 2020 Mar 1;35(2):295–315. pmid:31999152
  41. 41. Ekman P, Friesen W V. Measuring facial movement. Environmental Psychology and Nonverbal Behavior. 1976 Sep;1(1):56–75.
  42. 42. Boucher JD, Carlson GE. Recognition of facial expression in three cultures. J Cross Cult Psychol. 1980 Sep;11(3):263–80.
  43. 43. Markham R, Wang L. Recognition of emotion by Chinese and Australian children. J Cross Cult Psychol. 1996 Sep;27(5):616–43.
  44. 44. Elfenbein HA, Ambady N. On the universality and cultural specificity of emotion recognition: A meta-analysis. Psychol Bull. 2002 Mar;128(2):203–35. pmid:11931516
  45. 45. Mandal MK, Awasthi A, editors. Understanding facial expressions in communication: Cross-cultural and multidisciplinary perspectives. Springer; 2014 Oct 10.
  46. 46. Mesquita B, Frijda NH. Cultural variations in emotions: A review. Psychol Bull. 1992;112(2):179–204. pmid:1454891
  47. 47. Anthony T, Copper C, Mullen B. Cross-racial facial identification: A social cognitive integration. Pers Soc Psychol Bull. 1992 Jun 2;18(3):296–301.
  48. 48. Scherer KR, Wallbott HG. Evidence for universality and cultural variation of differential emotion response patterning. J Pers Soc Psychol. 1994;66(2):310–28. pmid:8195988
  49. 49. Kejriwal M, Wang Q, Li H, Wang L. An empirical study of emoji usage on Twitter in linguistic and national contexts. Online Soc Netw Media. 2021 Jul 1;24:100149.
  50. 50. Guntuku SC, Li M, Tay L, Ungar LH. Studying Cultural Differences in Emoji Usage across the East and the West. Proceedings of the 13th International Conference on Web and Social Media, ICWSM 2019. 2019 Apr 4;13:226–35.
  51. 51. Chang W, Cheng J, Allaire J, Sievert C, Schloerke B, Xie Y, et al. Shiny: Web application framework for R. R package version 1.7. 2.9000, Vol. 23, Retrieved February. 2021.
  52. 52. R Core Team. A language and environment for statistical computing. R Foundation for Statistical Computing. 2020;2.
  53. 53. Rosseel Y. Lavaan: An R package for structural equation modeling. J Stat Softw. 2012 May 24;48(2):1–36.
  54. 54. Lenth R V. Least-Squares Means: The R Package lsmeans. J Stat Softw. 2016 Jan 29;69(1):1–33.
  55. 55. Calder AJ, Keane J, Young AW, Dean M. Configural information in facial expression perception. J Exp Psychol Hum Percept Perform. 2000 Apr;26(2):527–51. pmid:10811161
  56. 56. Eisenbarth H, Alpers GW. Happy mouth and sad eyes: Scanning emotional facial expressions. Emotion. 2011 Aug;11(4):860–5. pmid:21859204
  57. 57. Sullivan S, Campbell A, Hutton SB, Ruffman T. What’s good for the goose is not good for the gander: Age and gender differences in scanning emotion faces. J Gerontol B Psychol Sci Soc Sci. 2017 May 1;72(3):441–7. pmid:25969472
  58. 58. Abbruzzese L, Magnani N, Robertson IH, Mancuso M. Age and gender differences in emotion recognition. Front Psychol. 2019 Oct;10:2371. pmid:31708832
  59. 59. Székely E, Tiemeier H, Arends LR, Jaddoe VWV, Hofman A, Verhulst FC, et al. Recognition of facial expressions of emotions by 3-year-olds. Emotion. 2011 Apr;11(2):425–35. pmid:21500910
  60. 60. Vicari S, Reilly JS, Pasqualetti P, Vizzotto A, Caltagirone C. Recognition of facial expressions of emotions in school-age children: The intersection of perceptual and semantic categories. Acta Paediatrica, International Journal of Paediatrics. 2000 Jul 1;89(7):836–45. pmid:10943968
  61. 61. Brants W, Sharif B, Serebrenik A. Assessing the meaning of emojis for emotional awareness—A pilot study. In: Companion Proceedings of the 2019 World Wide Web Conference. New York, NY, USA: ACM; 2019. pp. 419–23.
  62. 62. Lambrecht L, Kreifelts B, Wildgruber D. Gender differences in emotion recognition: Impact of sensory modality and emotional category. Cogn Emot. 2014 Apr;28(3):452–69. pmid:24151963
  63. 63. Myrtek M, Aschenbrenner E, Brügner G. Emotions in everyday life: An ambulatory monitoring study with female students. Bioll Psychol. 2005 Mar 1;68(3):237–55. pmid:15620793
  64. 64. Reed AE, Chan L, Mikels JA. Meta-analysis of the age-related positivity effect: Age differences in preferences for positive over negative information. Psychol Aging. 2014 Mar;29(1):1–15. pmid:24660792
  65. 65. Garcia C, Țurcan A, Howman H, Filik R. Emoji as a tool to aid the comprehension of written sarcasm: Evidence from younger and older adults. Comput Human Behav. 2022 Jan 1;126:106971.
  66. 66. Shi H, Liu X, Li K, Xie J. Emoji usage and interpersonal relationship in computer-mediated communication. International Joint Conference on Information, Media and Engineering (IJCIME), IEEE; 2019:262–6.
  67. 67. Wang S. Sarcastic meaning of the slightly smiling face emoji from Chinese Twitter users: When a smiling face does not show friendliness. International Journal of Languages, Literature and Linguistics. 2022 Jun;8:65–73.
  68. 68. Jack RE, Garrod OGB, Yu H, Caldara R, Schyns PG. Facial expressions of emotion are not culturally universal. Proc Natl Acad Sci U S A. 2012 May;109(19):7241–4. pmid:22509011
  69. 69. Hu X, Wang F, Zhang D. Similar brains blend emotion in similar ways: Neural representations of individual difference in emotion profiles. Neuroimage. 2022 Feb;247:118819. pmid:34920085
  70. 70. Taylor H, Hand CJ, Howman H, Filik R. Autism, attachment, and alexithymia: Investigating emoji comprehension. Int J Hum Comput Interact. In press.
  71. 71. Kraji Novak P, Smailović J, SLuban B, Mozeticč I. Sentiment of emojis. PLOS ONE. 2015; 10(12):e0144296. pmid:26641093
  72. 72. Liu C, Fang F, Lin X, Cai T, Tan X, Liu J, et al. Improving sentiment analysis accuracy with emoji embedding. Journal of Safety Science and Resilience. 2021;2:246–252