Studies of social dysfunction in patients with autism spectrum disorder (ASD) have generally focused on the perception of emotional words and facial affect. Brain imaging studies have suggested that the fusiform gyrus is associated with both language comprehension and face recognition. We hypothesized that patients with ASD would have a decreased ability to recognize affect conveyed by emotional words and facial emoticons, relative to healthy comparison subjects, and that this decreased ability would be associated with altered activity of the fusiform gyrus. Ten male adolescents with ASD and ten age- and sex-matched healthy comparison subjects were enrolled in this case-control study. The diagnosis of autism was further evaluated with the Autism Diagnostic Observation Schedule. Brain activity in response to the presentation of emotional words and facial emoticons was assessed using functional magnetic resonance imaging (fMRI). Sixty emotional words (45 pleasant words + 15 unpleasant words) were extracted from a report on Korean emotional terms and their underlying dimensions, and sixty emoticon faces (45 pleasant faces + 15 unpleasant faces) were extracted and modified from on-line sites. Relative to healthy comparison subjects, patients with ASD showed increased activation of the fusiform gyrus in response to the emotional aspects of words. In contrast, patients with ASD showed decreased activation of the fusiform gyrus in response to facial emoticons. We suggest that patients with ASD are more familiar with word descriptions than with facial expressions as depictions of emotion.
Citation: Han DH, Yoo HJ, Kim BN, McMahon W, Renshaw PF (2014) Brain Activity of Adolescents with High Functioning Autism in Response to Emotional Words and Facial Emoticons. PLoS ONE 9(3): e91214. https://doi.org/10.1371/journal.pone.0091214
Editor: Michelle Hampson, Yale University, United States of America
Received: September 26, 2013; Accepted: February 8, 2014; Published: March 12, 2014
Copyright: © 2014 Han et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Funding: This work was supported by Korean Game Culture Foundation and a grant of the Korean Health Technology R&D Project, Ministry of Health & Welfare, Republic of Korea (A120013). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing interests: The authors have declared that no competing interests exist.
Language comprehension and facial affect recognition in patients with autism spectrum disorder
Autism is a developmental disorder characterized by deficits in three domains: social reciprocity; early language and communication problems; and restrictive, repetitive, and stereotyped behaviors. Among these deficits, particular interest in social dysfunction has focused on the perception of emotional words and facial affect in patients with autism spectrum disorder (ASD). Social dysfunction has also been associated with impairment of social perception and cognition, as demonstrated by the observation of emotions, actions, body movements, hand gestures, and facial expressions. In addition, social dysfunction has been related to early language and communication impairments in children with ASD. In the domains of comprehension and use of pragmatic language, patients with ASD have been reported to demonstrate impairments in understanding non-literal meaning, such as emotion, humor, and metaphor.
Patients with ASD are also known to have impairments in their ability to recognize unfamiliar faces, compared to healthy subjects. Moreover, deficits in the perception of facial expression may be a core feature of the social deficits in patients with ASD. Patients with autism have been shown to be impaired in their ability to recognize appropriate facial expressions in videotaped gestures, vocalizations, and contexts. Similarly, patients with Asperger's syndrome had difficulty in the comprehension and production of facial and verbal expressions of emotion.
Facial Emoticons: simplified facial emotion expression
Emoticons in computer-mediated communication may be used to facilitate non-verbal emotional communication and to reinforce the verbal parts of a message. Although facial emoticons differ from facial expressions, simplified text or cartoon facial expressions (emoticons) are frequently used in the context of social communication. Lo reported that emoticons, as "quasi-nonverbal cues", help most people perceive appropriate emotion, attitude, and intention during internet use. Based on these observations, several empirical studies have shown that emoticons are associated with high levels of emotional abstraction and non-verbal information recognition. Interestingly, patients with ASD, relative to healthy comparison subjects, do not apply different strategies for perceiving cartoon faces, although they use different strategies during the perception of real human faces. While healthy comparison subjects used a configural strategy in assessing both real and cartoon faces, patients with ASD used a figural strategy in viewing cartoon faces and a local strategy in evaluating real faces. In an assessment of attentional bias using saccade-related, event-related potentials (ERPs), patients with ASD showed a smaller interval in attentional time between faces and objects, relative to healthy comparison subjects. Moreover, patients with ASD fixate on cartoon characters longer than on real objects. In an fMRI study of emoticon perception, the right inferior frontal gyrus of healthy volunteers, which has been associated with non-verbal communication, was activated in response to viewing facial emoticons.
Fusiform gyrus, language comprehension, and face recognition
Over the last decade, several neuroimaging studies have reported activation of focal brain areas in response to semantic comprehension of language and human facial affect recognition in healthy subjects. Bookheimer reported that the inferior frontal and temporal cortex play a crucial role in the comprehension of language in healthy subjects. Ghosh et al. noted that contextual integration in lexical processing was associated with activation of the left fusiform gyrus. In a review of studies of the role of the fusiform gyrus in face recognition, Kanwisher and Yovel suggested that the fusiform gyrus is specialized for face recognition, detecting and extracting the perceptual information necessary to recognize a face. Similarly, the fusiform gyrus is well known to be dysfunctional in patients with prosopagnosia. In a review of fMRI articles, Sabatinelli et al. suggested that early developmental failure of the ventral temporal area (amygdala-fusiform system) is associated with deficits in social perception and social cognition. However, fMRI studies have reported that the fusiform gyrus was not activated in response to viewing facial expressions of emotion in patients with ASD. Also, while a healthy control group showed activation of the right fusiform gyrus during a facial expression discrimination task, patients with autism did not show such activation.
We hypothesized that patients with ASD would have decreased ability to recognize affect via emotional words and facial emoticons, relative to healthy comparison subjects. In addition, we expected that this decreased ability would be represented by altered activity of the fusiform gyrus in patients with ASD.
In response to advertisements posted in Chung Ang University Hospital and other local hospitals, fifteen adolescents with autism spectrum disorders were screened. Inclusion criteria were: age between 13 and 18 years; a diagnosis of autism spectrum disorder; IQ ≥ 70; and an ADOS score of 4–7. Exclusion criteria were: comorbid Axis I disorders, as determined by the Korean Kiddie Schedule for Affective Disorders and Schizophrenia-Present and Lifetime version (K-SADS-PL); a history of head trauma with loss of consciousness, seizure disorder, multiple sclerosis, brain tumor, claustrophobia, metal implantation, or cerebrovascular accident; serious or chronic medical illness; IQ < 70; and a history of substance abuse. Of these fifteen adolescents, three had IQ < 70 and two were excluded due to comorbid major depressive disorder and/or attention deficit hyperactivity disorder. Finally, 10 adolescents with ASD and 10 age- and sex-matched healthy comparison subjects were recruited. The research protocol was approved by the Chung Ang University Hospital Institutional Review Board. Written informed assent was provided by the adolescents, and written informed consent was provided by their parents.
The diagnosis of autism was further evaluated with the ADOS by social worker B.G.Y., who holds both clinician and research certificates for the ADOS and the Autism Diagnostic Interview (ADI) based on training supervised by Catherine Lord. Clinical symptoms were assessed by each child's mother using the Childhood Autism Rating Scale (CARS), parent version. IQ scores were obtained for all participants using the Korean Wechsler Adult Intelligence Scale (K-WAIS). Social communication was evaluated by the mother using the Social Communication Questionnaire (SCQ), current form. The inter-item consistency of the Korean version of the SCQ has been reported to range from 0.84 to 0.93.
Brain activity in both ASD patients and healthy subjects was assessed, in two separate scan sequences, in response to (1) emotional words and (2) facial emoticons presented in the scanner. All MR imaging was performed using 3 T blood oxygen level dependent (BOLD) functional magnetic resonance imaging (fMRI; Achieva 3.0, Philips, Eindhoven, the Netherlands). Stimuli were presented through an IFIS-SA™ system (MRI Device Corporation, Waukesha, WI, USA) during a single fMRI scanning session. For the fMRI session, 180 echo planar images (EPI; 33 transverse slices, 4.0 mm thickness, voxel size 1.8×1.8×4.0 mm, TE = 30 ms, TR = 3000 ms, flip angle = 90°, in-plane resolution = 128×128 pixels, field of view (FOV) = 230×230 mm) were recorded at 3-second intervals. For anatomical imaging, 3D T1-weighted magnetization-prepared rapid gradient echo (MPRAGE) data were acquired with the following parameters: TR = 2000 ms, TE = 4.00 ms, FOV = 256×256 mm, 340 slices, 0.9×0.9×1.0 mm voxel size, flip angle = 30°.
Stimulation 1: Emotional words.
Sixty emotional words (45 pleasant words + 15 unpleasant words) were extracted from a report on Korean emotional terms and their underlying dimensions. In ratings by forty young college students on a seven-point analogue scale, the prototypicality and familiarity of these sixty words ranged from 4.01 to 5.77 and from 4.16 to 5.82, respectively. The pleasant score of the pleasant words was over 3.0, and the unpleasant score of the unpleasant words was over 3.0; both the pleasant and unpleasant scores of the non-emotional words were less than 1.0. While in the scanner, each subject in the patient and healthy control groups was asked to identify via microphone the one unpleasant emotional word presented among three pleasant words. Test-retest reliability for the identification of emotional words was 0.82. The percentage of correct answers for emotional words in seventy healthy subjects was 81.2±10.3% (minimum = 56.6%, maximum = 96.7%). A 450-second video was constructed, consisting of five continuous 90-second segments, each made up of three 30-second sub-segments: a white cross on a black background (B), a single non-emotional word (Match, M), and emotional word stimulation (Stimulation, S). The five segments were ordered as follows: B-M-S, B-S-M, M-B-S, S-B-M, and M-S-B. During each M sub-segment, three non-emotional words were presented, one every ten seconds. During each S sub-segment, three sets of four mixed emotional words were presented, one set every ten seconds (Figure 1).
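The segment ordering above can be made concrete with a short script. This is an illustrative sketch only, not the study's stimulus-presentation software; the event-tuple format is an assumption for illustration.

```python
# Sketch (illustrative, not the authors' code) of the 450-s block design:
# five 90-s segments, each composed of three 30-s sub-segments drawn from
# B (baseline cross), M (match/neutral), and S (emotional stimulation).
SEGMENT_ORDERS = ["BMS", "BSM", "MBS", "SBM", "MSB"]
SUB_SEGMENT_S = 30  # seconds per sub-segment

def build_schedule():
    """Return a list of (onset_s, duration_s, condition) events."""
    schedule = []
    t = 0
    for segment in SEGMENT_ORDERS:
        for condition in segment:
            schedule.append((t, SUB_SEGMENT_S, condition))
            t += SUB_SEGMENT_S
    return schedule

events = build_schedule()
total_s = events[-1][0] + events[-1][1]  # 5 segments x 90 s = 450 s
```

Each of the three conditions appears exactly once per 90-second segment, and each condition occupies exactly one third of the run, which is what allows the S-versus-M contrast described later in the analysis.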
Stimulation 2: Emoticon faces.
Sixty emoticon faces (45 pleasant faces + 15 unpleasant faces) were extracted and modified from on-line sites (http://image.search.naver.com/search.naver, http://forums.themavesite.com/index.php?topic=6103.0) (Figure S3 in File S1). While in the scanner, each subject in the patient and healthy control groups was asked to identify via microphone the one unpleasant facial emoticon presented among three pleasant facial emoticons. Test-retest reliability for facial emoticon identification was 0.84. The percentage of correct answers for facial emoticon identification in seventy healthy subjects was 86.6±9.2% (minimum = 66.7%, maximum = 100%). The video used was 450 seconds long and consisted of five continuous 90-second segments, each made up of three 30-second sub-segments: a white cross on a black background (B), a single neutral face (Match, M), and emoticon stimulation (Stimulation, S). The five segments were ordered as follows: B-M-S, B-S-M, M-B-S, S-B-M, and M-S-B. During each M sub-segment, three neutral faces were presented, one every ten seconds. During each S sub-segment, three sets of four mixed emoticon faces were presented, one set every ten seconds (Figure 1).
fMRI data analysis
Acquired fMRI data were analyzed with the BrainVoyager software package (BVQX 1.9, Brain Innovation, Maastricht, The Netherlands). Using a multi-scale algorithm, each fMRI time series was registered to the MPRAGE 3D data set. The anatomical images were spatially normalized to standard Talairach space, and the time series data were spatially normalized as well. Slice scan time correction and motion correction were applied during preprocessing, along with spatial smoothing with a 6 mm FWHM Gaussian kernel and temporal smoothing with a 4-second Gaussian kernel. The head motion algorithm represents head movements with six parameters: three translation (displacement) parameters and three rotation parameters. These six parameters are estimated iteratively by analyzing how a source volume should be translated and rotated to better align with the reference volume. Head motion of less than 3.5 mm (translation) or 3.5° (rotation) relative to the target volume was considered acceptable. No ASD or healthy subject was excluded due to excessive head movement in the present study, and there was no difference in head motion between ASD patients and healthy comparison subjects (Table S1 and Figure S1 in File S1).
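The motion-exclusion rule stated above can be sketched as a simple check over the six rigid-body parameters. This is a hypothetical illustration of the criterion, not BrainVoyager code; the parameter naming is an assumption.

```python
# Illustrative check of the exclusion rule above: a run is acceptable if
# every translation parameter stays under 3.5 mm and every rotation
# parameter under 3.5 degrees, across all volumes.
TRANS_LIMIT_MM = 3.5
ROT_LIMIT_DEG = 3.5

def motion_acceptable(params):
    """params: sequence of 6-tuples (tx, ty, tz, rx, ry, rz), one per volume."""
    for tx, ty, tz, rx, ry, rz in params:
        if max(abs(tx), abs(ty), abs(tz)) >= TRANS_LIMIT_MM:
            return False
        if max(abs(rx), abs(ry), abs(rz)) >= ROT_LIMIT_DEG:
            return False
    return True
```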
Differences in demographic data, including age, IQ, CARS scores, SCQ scores, and the percentage of correct answers in identifying emotional words/emoticons, between ASD and healthy subjects were compared using the Mann-Whitney U test and ANCOVA controlling for IQ. For the analysis of fMRI signal time-courses, the general linear model (GLM) and random effects analysis (RFX) were applied to construct individual and group statistical parametric maps of brain activation. Activations were computed by contrasting the S sub-segments (emotional words/emoticons) with their corresponding M sub-segments (neutral words/neutral faces). Using these parametric maps, ANCOVA tests controlling for IQ were applied to identify clusters at a threshold of FDR < 0.05 and cluster size > 40 voxels. The Talairach coordinates of clusters were identified by the nearest coordinate in the Talairach Daemon.
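The FDR criterion above maps a chosen false-discovery rate (q = 0.05) to a voxel-level p threshold. As a minimal illustration of that mapping (not BrainVoyager's implementation), a Benjamini-Hochberg step-up sketch:

```python
# Minimal Benjamini-Hochberg step-up procedure (illustrative only):
# return the largest p-value that survives an FDR criterion of q,
# i.e. the voxel-level p threshold implied by the FDR cutoff.
def bh_threshold(p_values, q=0.05):
    """Largest p passing p_(i) <= q * i / m over sorted p-values, or None."""
    m = len(p_values)
    passed = None
    for i, p in enumerate(sorted(p_values), start=1):
        if p <= q * i / m:
            passed = p  # keep the last (largest-rank) p that passes
    return passed
```

Under this rule, all tests with p at or below the returned threshold are declared significant, which is how an FDR < 0.05 criterion can correspond to the specific voxel-level p thresholds reported in the results (e.g. p < 0.00038).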
As a second-level analysis in subjects with ASD, the correlations between the mean β value in each cluster and the mean SCQ score were analyzed using partial correlations controlling for IQ, ADOS-communication score, and ADOS-social score. To account for multiple comparisons, we set the significance threshold at p < 0.0166 (0.05/3, for CARS, SCQ, and correct response rates).
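A partial correlation controlling for a covariate can be computed by correlating the residuals of each variable after regressing out that covariate. The sketch below (illustrative only, not the authors' analysis code) shows the single-covariate case (e.g. controlling for IQ), along with the Bonferroni-adjusted alpha of 0.05/3 from the text; all variable names and data are hypothetical.

```python
# Sketch of a first-order partial correlation via residualization:
# correlate the residuals of x and y after removing the linear effect
# of a single covariate from each.
def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def residualize(y, z):
    """Residuals of y after simple linear regression on covariate z."""
    n = len(y)
    my, mz = sum(y) / n, sum(z) / n
    slope = (sum((a - my) * (b - mz) for a, b in zip(y, z))
             / sum((b - mz) ** 2 for b in z))
    return [a - my - slope * (b - mz) for a, b in zip(y, z)]

def partial_corr(x, y, covariate):
    """Correlation between x and y, controlling for one covariate."""
    return pearson(residualize(x, covariate), residualize(y, covariate))

ALPHA = 0.05 / 3  # Bonferroni correction across CARS, SCQ, and accuracy tests
```

Controlling for several covariates at once, as in the analysis above, would residualize on all of them via multiple regression; the one-covariate case shown here carries the same idea.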
There were significant differences between ASD subjects and healthy comparison subjects in IQ (U = 3.5, z = 3.5, p<0.01), CARS score (F = 15.5, df = 1, p<0.01), SCQ score (F = 53.4, df = 1, p<0.01), and the percentage of correct answers in identifying emotional words (F = 32.7, df = 1, p<0.01) and emoticon faces (F = 10.1, df = 1, p<0.01) (Table 1, Figure S2 in File S1). The mean IQ values of the healthy subjects and the ASD subjects were 105.5±8.9 and 81.5±9.0, respectively. The correct response rates for emotional words in healthy subjects and ASD subjects were 86.4±8.4% and 40.3±11.6%, respectively. The correct response rates for facial emoticons in the healthy subjects and ASD subjects were 90.3±9.1% and 53.0±14.6%, respectively.
Correct responses to emoticons were positively correlated with correct responses to emotional words (r = 0.91, p<0.01). There were no significant correlations between IQ and correct responses to emotional words or emoticons. Controlling for IQ, the mean SCQ score was negatively correlated with correct responses to emotional words (r = −0.77, p = 0.01) in ASD patients. Controlling for IQ, the SCQ score was negatively correlated with correct responses to emoticons at a trend level (r = −0.73, p = 0.02).
Clusters in the interaction between emotional words and subject factors at baseline
In an F test of the interaction between group (ASD vs. healthy subjects) and stimulus (emotional words vs. neutral words), two clusters showed increased activity in patients with ASD, compared to healthy subjects (FDR<0.05, p<0.00038): Cluster 1 (CL1): Talairach x, y, z = 55, −39, −9; 242 voxels; right temporal lobe, fusiform gyrus, Brodmann area 20; Cluster 2 (CL2): Talairach x, y, z = 32, −80, −17; 70 voxels; right occipital fusiform gyrus, Brodmann area 19 (Figure 2, Figure 3, Table 2). In the same F test, no clusters showed decreased activity in patients with ASD, compared to healthy subjects (FDR<0.05, p<0.00038).
Brain activity in response to emotional words (ASD > healthy subjects): Cluster 1 (CL1): right temporal lobe, fusiform gyrus, Brodmann area 20; Cluster 2 (CL2): right cerebrum, occipital lobe, fusiform gyrus; right cerebrum, occipital lobe, inferior occipital gyrus; right cerebellum, posterior lobe. Brain activity in response to emoticons (ASD < healthy subjects): Cluster 3 (CL3): right cerebrum, temporal lobe, middle temporal gyrus; Cluster 4 (CL4): right cerebellum, posterior lobe; right cerebrum, occipital lobe, fusiform gyrus.
Cluster 1 (CL1, unfilled triangle): right temporal lobe, fusiform gyrus, Brodmann area 20; Cluster 2 (CL2, unfilled circle): right cerebrum, occipital lobe, fusiform gyrus; right cerebrum, occipital lobe, inferior occipital gyrus; right cerebellum, posterior lobe; Cluster 3 (CL3, filled triangle): right cerebrum, temporal lobe, middle temporal gyrus; Cluster 4 (CL4, filled circle): right cerebellum, posterior lobe; right cerebrum, occipital lobe, fusiform gyrus.
Clusters in the interaction between facial emoticons and subject factors at baseline
In an F test of the interaction between group (ASD vs. healthy subjects) and stimulus (emoticons vs. neutral faces), two clusters showed decreased activity in patients with ASD (FDR<0.05, p<0.000025): Cluster 3 (CL3): Talairach x, y, z = 61, −39, −4; right cerebrum, temporal lobe, middle temporal gyrus, Brodmann area 21; Cluster 4 (CL4): Talairach x, y, z = 31, −82, −20; right cerebellum, posterior lobe, and right occipital fusiform gyrus, Brodmann area 18 (Figure 2, Figure 3, Table 2). In the same F test, no clusters showed increased activity in patients with ASD, compared to healthy subjects (FDR<0.05, p<0.000025).
Correlations between Social Communication Questionnaire (SCQ) and brain activity
The mean SCQ score in ASD patients was negatively correlated with the mean beta value of the right temporal fusiform gyrus in response to emotional words (r = −0.82, p = 0.02) (Figure 4). There were no significant correlations between SCQ scores, CARS scores, or correct responses on the emotional word/emoticon tests and the beta values of the other clusters in response to emotional words. Similarly, there were no significant correlations between SCQ scores, CARS scores, or correct responses on the emotional word/emoticon tests and the beta values of the other clusters in response to facial emoticons. There was no significant correlation between CARS scores and any other cluster.
A: The correlation between mean SCQ score in ASD patients and the mean beta value within right temporal fusiform gyrus in response to emotional words, with a partial correlation controlling for IQ, r = −0.82, p = 0.015. B: The correlation between mean SCQ score in healthy control subjects and the mean beta value within right temporal fusiform gyrus in response to emotional words, with a partial correlation controlling for IQ, r = −0.16, p = 0.67.
As expected, patients with ASD showed lower correct response rates for recognizing affect via emotional words and facial emoticons. However, there were divergent patterns of brain activation for emotional words and facial emoticons: the fusiform gyrus in patients with ASD was activated to a greater extent during the emotional word task, relative to healthy comparison subjects, whereas fusiform gyrus activity in patients with ASD was lower in response to facial emoticons.
Difficulty in recognizing emotional and non-verbal meanings in words and facial affect in patients with ASD has been reported in several studies. Moreover, decreased recognition of emotion and non-verbal meanings has been associated with increased activity of the fusiform gyrus in patients with ASD. In solving verbal and visual problems, a dissociation between verbal and visuo-spatial abilities, in terms of reduced activation of fronto-temporal language areas and increased activation of occipito-parietal and ventral temporal circuits, has been noted in patients with ASD. In solving verbal problems, patients with ASD may use the right hemisphere as compensation for a dysfunctional left hemisphere. In a study of speaker-incongruent vs. speaker-congruent sentences, patients with ASD showed increased activation in the right inferior frontal gyrus, compared to healthy controls. During pragmatic language comprehension, increased activation in the right inferior frontal gyrus in patients with ASD has also been observed. Taken together, we hypothesize that the fusiform gyrus in patients with ASD may require greater activation in order to interpret the meaning of emotional words, compared to healthy subjects. The increased activation within the fusiform gyrus in response to emotional words in patients with ASD may reflect a compensatory mechanism for controlling social information processing. Dichter et al. reported that patients with ASD showed greater activation within the anterior cingulate in response to social target detection stimuli, compared to healthy subjects. In addition, mean SCQ scores (higher SCQ scores are associated with greater deficits in social communication) were negatively correlated with brain activity within the fusiform gyrus in response to emotional words. However, mean SCQ scores were not correlated with brain activity within any other cluster in response to facial emotion.
These observations suggest that patients with ASD may be more familiar with verbal expression than with facial expression during emotional communication. This pattern of greater activation for equivalent task demands may also be interpreted as cortical inefficiency during social information processing.
As an explanation for the social deficits in patients with ASD, the disability in facial emoticon recognition is thought to be associated with emotional salience or a lack of social insight. Simplified cartoon facial expressions (emoticons) are frequently used in the context of social communication, because emoticons can serve as a surrogate for nonverbal emotional expression. In a study of the role of emoticons in computer-mediated communication, Derks et al. suggested that emoticons are commonly used in a manner similar to facial behavior in face-to-face communication with respect to social context and interpersonal interaction. In experiments with real faces, Schultz et al. reported that patients with ASD had difficulty even with neutral face recognition. Among the various social brain areas, the lateral fusiform gyrus, or fusiform face area, has been thought to be important for the rapid recognition of faces. In face discrimination tasks, patients with ASD showed less activation of the fusiform gyrus, compared to healthy subjects. Patients with ASD are thought to recognize faces in a different manner from healthy subjects: they focus more on feature-based than configural analyses, and the fusiform gyrus is known to be associated with configural processing. The decreased activation within the fusiform gyrus in response to emoticons in patients with ASD may therefore be due to the use of different cognitive strategies to solve problems, relative to healthy subjects. These differences may be caused by differences in programmed cell death, a lack of functional specialization, or deviant myelination during synaptogenesis and brain development. Patients with ASD may have difficulty with face discrimination due not to emotional valence, but to different patterns of facial recognition, compared to healthy subjects.
We expect that ASD patients would show a similar neural response to the presentation of one pleasant emoticon among three unpleasant ones, compared to the current paradigm (one unpleasant emoticon among three pleasant ones).
In response to the emoticon task, the right middle temporal gyrus was less activated in ASD patients, compared to healthy subjects. This brain region has been implicated in the pathophysiology of ASD in several prior studies. Freitag et al. suggested that the interpretation of complex motion may be associated with gray matter volumes within the right medial temporal cortex. Alaerts et al. reported that hypoactivity of the posterior superior temporal sulcus might be linked with the social deficits characteristic of ASD patients. Herrington et al. reported that the detection of a point-light walking figure was associated with activity within the inferior frontal gyrus, fusiform gyrus, and amygdala. In an fMRI study using a biological motion task, Koldewyn et al. reported that ASD patients showed reduced posterior superior temporal sulcus, parietal, and frontal activity, relative to healthy comparison subjects. Taken together, these results may be associated with decreased perception of bodily motion or gesture; ASD patients are thought to have decreased visual sensitivity to human movements and gesture comprehension.
There are several limitations to the current study. First, the number of participants was too small to allow generalization of the results. Second, spoken responses during scanning may have increased subject motion and altered apparent brain activity. To address this issue, we monitored and corrected for possible motion artifacts. In addition, patients with ASD and healthy control subjects were asked to respond in the same way during the task (emotional word/emoticon) stimulation. However, the two groups performed this labeling task with different degrees of accuracy, which is a limitation in terms of comparing the extent of BOLD activation across groups. Finally, we did not measure response times in this experiment.
The brain response to emotional words in patients with ASD, although requiring a greater degree of fusiform gyrus activation relative to healthy comparison subjects, is similar to that of healthy comparison subjects in its pattern of activation. In contrast, the fusiform gyrus response to emoticon faces differs from that of healthy comparison subjects. Patients with ASD showed greater activation of the fusiform gyrus in response to the emotional aspects of words, compared to healthy subjects; however, relative to healthy comparison subjects, patients with ASD showed lower activation of the fusiform gyrus in response to emotional meaning in facial emoticons.
This file contains Table S1 and Figures S1-S3. Table S1, translation and rotation during scans. Figure S1, head motion correction. Figure S2, comparison of behavioral scores between ASD patients and healthy comparison subjects. Figure S3, face emoticons.
Conceived and designed the experiments: DHH HJY. Performed the experiments: DHH BNK. Analyzed the data: WM PFR. Contributed reagents/materials/analysis tools: WM PFR. Wrote the paper: DHH PFR.
- 1. American Psychiatric Association (1994) Diagnostic and statistical manual of mental disorders, fourth ed. (DSM-IV). Washington, DC: American Psychiatric Association.
- 2. Allison T, Puce A, McCarthy G (2000) Social perception from visual cues: role of the STS region. Trends Cogn Sci 4: 267–278.
- 3. Charman T (2003) Why is joint attention a pivotal skill in autism? Philos Trans R Soc Lond B Biol Sci 358: 315–324.
- 4. Hauck M, Fein D, Maltby N, Waterhouse L, Feinstein C (1998) Memory for faces in children with autism. Child Neuropsychol 4: 187–198.
- 5. Mundy P, Sigman M, Kasari C (1990) A longitudinal study of joint attention and language development in autistic children. J Autism Dev Disord 20: 115–128.
- 6. Scott DW (1985) Asperger's syndrome and non-verbal communication: a pilot study. Psychol Med 15: 683–687.
- 7. Happe FG (1993) Communicative competence and theory of mind in autism: a test of relevance theory. Cognition 48: 101–119.
- 8. Tager-Flusberg H, Joseph RM (2003) Identifying neurocognitive phenotypes in autism. Philos Trans R Soc Lond B Biol Sci 358: 303–314.
- 9. Grelotti DJ, Gauthier I, Schultz RT (2002) Social interest and the development of cortical face specialization: what autism teaches us about face processing. Dev Psychobiol 40: 213–225.
- 10. Hobson RP (1986) The autistic child's appraisal of expressions of emotion. J Child Psychol Psychiatry 27: 321–342.
- 11. Derks D, Bos AE, von Grumbkow J (2008) Emoticons in computer-mediated communication: social motives and social context. Cyberpsychol Behav 11: 99–101.
- 12. Lo SK (2008) The nonverbal communication functions of emoticons in computer-mediated communication. Cyberpsychol Behav 11: 595–597.
- 13. Yuasa M, Saito K, Mukawa N (2006) Emoticons convey emotions without cognition of faces: an fMRI study; Montreal, Canada.
- 14. Yuasa M, Keiichi S, Naoki M (2007) Brain Activity Associated with Emoticons: An fMRI Study. IEEJ Transactions on Electronics, Information and Systems 127: 1865–1870.
- 15. Yuasa M, Keiichi S, Naoki M (2011) Brain Activity When Reading Sentences and Emoticons: An fMRI Study of Verbal and Nonverbal Communication. Electronics and Communications in Japan 94: 17–24.
- 16. Rosset DB, Rondan C, Da Fonseca D, Santos A, Assouline B, et al. (2008) Typical emotion processing for cartoon but not for real faces in children with autistic spectrum disorders. J Autism Dev Disord 38: 919–925.
- 17. van der Geest JN, Kemner C, Camfferman G, Verbaten MN, van Engeland H (2002) Looking at images with human figures: comparison between autistic and normal children. J Autism Dev Disord 32: 69–75.
- 18. Kikuchi Y, Senju A, Akechi H, Tojo Y, Osanai H, et al. (2011) Atypical disengagement from faces and its modulation by the control of eye fixation in children with autism spectrum disorder. J Autism Dev Disord 41: 629–645.
- 19. Bookheimer S (2002) Functional MRI of language: new approaches to understanding the cortical organization of semantic processing. Annu Rev Neurosci 25: 151–188.
- 20. Ghosh S, Basu A, Kumaran SS, Khushu S (2010) Functional mapping of language networks in the normal brain using a word-association task. Indian J Radiol Imaging 20: 182–187.
- 21. Kanwisher N, Yovel G (2006) The fusiform face area: a cortical region specialized for the perception of faces. Philos Trans R Soc Lond B Biol Sci 361: 2109–2128.
- 22. Barton JJ, Press DZ, Keenan JP, O'Connor M (2002) Lesions of the fusiform face area impair perception of facial configuration in prosopagnosia. Neurology 58: 71–78.
- 23. Sabatinelli D, Fortune EE, Li Q, Siddiqui A, Krafft C, et al. (2011) Emotional perception: meta-analyses of face and natural scene processing. Neuroimage 54: 2524–2533.
- 24. Critchley HD, Daly EM, Bullmore ET, Williams SC, Van Amelsvoort T, et al. (2000) The functional neuroanatomy of social behaviour: changes in cerebral blood flow when people with autistic disorder process facial expressions. Brain 123 (Pt 11): 2203–2212.
- 25. Pierce K, Muller RA, Ambrose J, Allen G, Courchesne E (2001) Face processing occurs outside the fusiform ‘face area’ in autism: evidence from functional MRI. Brain 124: 2059–2073.
- 26. Schultz RT, Gauthier I, Klin A, Fulbright RK, Anderson AW, et al. (2000) Abnormal ventral temporal cortical activity during face discrimination among individuals with autism and Asperger syndrome. Arch Gen Psychiatry 57: 331–340.
- 27. Suzuki K, Matsuzaki H, Iwata K, Kameno Y, Shimmura C, et al. (2011) Plasma cytokine profiles in subjects with high-functioning autism spectrum disorders. PLoS One 6: e20470.
- 28. Kim YS, Cheon KA, Kim BN, Chang SA, Yoo HJ, et al. (2004) The reliability and validity of Kiddie-Schedule for Affective Disorders and Schizophrenia-Present and Lifetime Version- Korean version (K-SADS-PL-K). Yonsei Med J 45: 81–89.
- 29. Lord C, Risi S, Lambrecht L, Cook EH Jr, Leventhal BL, et al. (2000) The autism diagnostic observation schedule-generic: a standard measure of social and communication deficits associated with the spectrum of autism. J Autism Dev Disord 30: 205–223.
- 30. Yeom TH, Park YS, Oh KJ, Lee YH (1992) Korean version of the Wechsler Adult Intelligence Scale. Seoul: Korean Guidance.
- 31. Berument SK, Rutter M, Lord C, Pickles A, Bailey A (1999) Autism screening questionnaire: diagnostic validity. Br J Psychiatry 175: 444–451.
- 32. Yoo HJ (2008) The Social Communication Questionnaire. Seoul: Hakjisa Publisher.
- 33. Park IJ, Min KH (2005) Making a list of Korean Emotion Terms and Exploring Dimensions Underlying Them. Korean Journal of Social and Personality Psychology 19: 109–129.
- 34. Talairach J, Tournoux P (1988) Co-Planar Stereotactic Atlas of the Human Brain. New York: Thieme Medical Publishers, Inc.
- 35. Minshew NJ, Keller TA (2010) The nature of brain dysfunction in autism: functional brain imaging studies. Curr Opin Neurol 23: 124–130.
- 36. Sahyoun CP, Belliveau JW, Soulieres I, Schwartz S, Mody M (2010) Neuroimaging of the functional and structural networks underlying visuospatial vs. linguistic reasoning in high-functioning autism. Neuropsychologia 48: 86–95.
- 37. Mason RA, Williams DL, Kana RK, Minshew N, Just MA (2008) Theory of Mind disruption and recruitment of the right hemisphere during narrative comprehension in autism. Neuropsychologia 46: 269–280.
- 38. Tesink CM, Buitelaar JK, Petersson KM, van der Gaag RJ, Kan CC, et al. (2009) Neural correlates of pragmatic language comprehension in autism spectrum disorders. Brain 132: 1941–1952.
- 39. Wang AT, Lee SS, Sigman M, Dapretto M (2006) Neural basis of irony comprehension in children with autism: the role of prosody and context. Brain 129: 932–943.
- 40. Dichter GS, Felder JN, Bodfish JW (2009) Autism is characterized by dorsal anterior cingulate hyperactivation during social target detection. Soc Cogn Affect Neurosci 4: 215–226.
- 41. Buchsbaum MS, Buchsbaum BR, Hazlett EA, Haznedar MM, Newmark R, et al. (2007) Relative glucose metabolic rate higher in white matter in patients with schizophrenia. Am J Psychiatry 164: 1072–1081.
- 42. Baron-Cohen S, Leslie AM, Frith U (1985) Does the autistic child have a “theory of mind”? Cognition 21: 37–46.
- 43. Kanwisher N, McDermott J, Chun MM (1997) The fusiform face area: a module in human extrastriate cortex specialized for face perception. J Neurosci 17: 4302–4311.
- 44. Hobson RP, Ouston J, Lee A (1988) What's in a face? The case of autism. Br J Psychol 79 (Pt 4): 441–453.
- 45. Carper RA, Courchesne E (2005) Localized enlargement of the frontal cortex in early autism. Biol Psychiatry 57: 126–133.
- 46. Chugani DC, Muzik O, Rothermel R, Behen M, Chakraborty P, et al. (1997) Altered serotonin synthesis in the dentatothalamocortical pathway in autistic boys. Ann Neurol 42: 666–669.
- 47. Freitag CM, Konrad C, Haberlen M, Kleser C, von Gontard A, et al. (2008) Perception of biological motion in autism spectrum disorders. Neuropsychologia 46: 1480–1494.
- 48. Alaerts K, Woolley DG, Steyaert J, Di Martino A, Swinnen SP, et al. (2013) Underconnectivity of the superior temporal sulcus predicts emotion recognition deficits in autism. Soc Cogn Affect Neurosci.
- 49. Herrington JD, Nymberg C, Schultz RT (2011) Biological motion task performance predicts superior temporal sulcus activity. Brain Cogn 77: 372–381.
- 50. Koldewyn K, Whitney D, Rivera SM (2011) Neural correlates of coherent and biological motion perception in autism. Dev Sci 14: 1075–1088.
- 51. Kaiser MD, Delmolino L, Tanaka JW, Shiffrar M (2010) Comparison of visual sensitivity to human and object motion in autism spectrum disorder. Autism Res 3: 191–195.
- 52. Huang J, Francis AP, Carr TH (2008) Studying overt word reading and speech production with event-related fMRI: a method for detecting, assessing, and correcting articulation-induced signal changes and for measuring onset time and duration of articulation. Brain Lang 104: 10–23.