Abstract
Background
Recognizing familiar faces and identifying emotions through facial expressions are essential for social functioning. This study aimed to examine whether people with relapsing-remitting multiple sclerosis (PwMS) differ from healthy control individuals (HC) in their performance on different tasks related to facial emotion processing.
Methods
In a cross-sectional controlled study, 30 PwMS and 35 HC completed a baseline neuropsychological evaluation and experimental tasks assessing visual exploration of facial stimuli through eye tracking, facial emotion recognition, and facial memory recognition. The facial stimuli displayed either a neutral expression or an emotion (happiness, fear, or disgust).
Results
PwMS and HC performed comparably in facial emotion recognition. In facial memory recognition, HC were significantly more accurate in recognizing previously seen fearful faces compared to neutral faces (Wilcoxon test, Z = -2.26, P = 0.024), demonstrating emotional enhancement of memory. In contrast, PwMS did not exhibit a memory advantage for fearful faces over neutral faces (P > 0.05). Groups also differed in the eye-tracking task. In all but one condition (disgust), PwMS showed a significantly greater tendency to explore the eye area rather than the mouth area compared to HC.
Conclusions
Changes in visual exploration and a lack of emotional enhancement of memory are observed in PwMS, who otherwise demonstrate intact facial emotion recognition. These results suggest altered emotion-cognition interactions in PwMS. Early detection of subtle changes and targeted interventions may help prevent future debilitating impairments in social functioning.
Citation: Goettfried E, Barket R, Hershman R, Delazer M, Auer M, Berek K, et al. (2025) Face exploration, emotion recognition, and emotional enhancement of memory in relapsing-remitting multiple sclerosis. PLoS ONE 20(4): e0319967. https://doi.org/10.1371/journal.pone.0319967
Editor: Mohammad Reza Fattahi, King's College Hospital NHS Foundation Trust, UNITED KINGDOM OF GREAT BRITAIN AND NORTHERN IRELAND
Received: November 27, 2024; Accepted: February 12, 2025; Published: April 7, 2025
Copyright: © 2025 Goettfried et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: The data underlying the results presented in the study are available from https://doi.org/10.5281/zenodo.14707515.
Funding: The author(s) received no specific funding for this work.
Competing interests: EG has no conflicts of interest related to this article. RB has participated in meetings sponsored by, or received travel grants from, Novartis, Janssen-Cilag, and Sanofi-Genzyme. He has also received honoraria from Janssen-Cilag and Biogen, outside of the present work. RH has no conflicts of interest related to this article. MD has no conflicts of interest related to this article. MA has received speaker honoraria or travel grants and participated in meetings sponsored by Biogen, Merck, Novartis, Sanofi and Horizon Therapeutics, outside of the present work. KB has participated in meetings sponsored by and received travel funding or speaker honoraria from Roche, Teva, Merck, Biogen, Sanofi and Novartis. He is associate editor of Frontiers in Immunology / Neurology, Section Multiple Sclerosis and Neuroimmunology, outside of the present work. PE has participated in meetings sponsored by or received travel funding from AbbVie, Biogen, Bristol Myers Squibb, Roche and Zambon, outside of the present work. BS has no conflicts of interest related to this article. HH has participated in meetings sponsored by, or received speaker honoraria or travel funding from Bayer, Biogen, Bristol Myers Squibb, Horizon, Janssen, Merck, Novartis, Sanofi-Genzyme, Siemens, Teva. He has also received honoraria for acting as consultant for Biogen, Bristol Myers Squibb, Novartis, Roche, Sanofi-Genzyme and Teva, outside of the present work. FDP has participated in meetings sponsored by, or received honoraria (lectures, advisory boards, consultations) or travel funding from Bayer, Biogen, Celgene, Horizon, Janssen-Cilag, Merck, Novartis, Sanofi-Genzyme, Roche and Teva, outside of the present work. FD has participated in meetings sponsored by or received honoraria for acting as an advisor/speaker for Alexion, Almirall, Biogen, BMS, Sanofi, Horizon, Janssen, Merck, Novartis Pharma, Roche, Sandoz, and Teva. 
His institution has received research grants from Biogen, Novartis Pharma, and Sanofi. He is section editor of the MSARD Journal (Multiple Sclerosis and Related Disorders) and review editor of Frontiers in Neurology, outside of the present work. LZ reports honoraria from Novartis and Biogen as well as a research grant from EVTZ/Austrian Science Fund (FWF): IPN 135-B, outside of the present work.
Introduction
Skills such as the ability to recognize familiar faces or identify emotions in others through facial expressions are crucial for social functioning [1]. Difficulties in perceiving and interpreting non-verbal affective cues such as facial expressions, prosody, or bodily gestures have been linked to lower social competence, reduced social communication skills, and a poorer quality of life, significantly impacting social interactions and everyday functioning [2–4].
Emotion processing can significantly influence cognitive performance [5,6]. For example, emotional information is typically remembered better and more vividly than non-emotional information [5]. Studies using implicit memory paradigms have shown that healthy people recall fearful facial expressions better than neutral expressions, even when not explicitly instructed to do so [7]. This enhancement effect of emotions on memory is often disrupted in patients with dysfunctions of the limbic system and related structures [7]. Emotional information may also modulate attention, resulting in differences in the processing of emotionally arousing versus emotionally non-arousing (neutral) visual stimuli [6].
Multiple sclerosis (MS) is an inflammatory, demyelinating disease of the central nervous system [8]. Cognitive deficits are most commonly observed in information processing speed, episodic memory, executive functions, and attention, and are present across all MS phenotypes [9–11]. Impairments in emotion processing have also been reported [12–15]. Compared to healthy individuals, those with MS exhibit greater difficulties in tasks assessing Theory of Mind or facial emotion recognition [12]. Additionally, differences are observed in measures of alexithymia, which refers to difficulty in perceiving and understanding one’s own emotions, with people with MS often scoring higher on these measures than healthy individuals [13]. Impairments in facial emotion recognition affect all individual emotions to varying degrees and are observed across all MS phenotypes, even in the early stages of the disease, with more severe impairments noted in progressive forms [12–14]. While some studies suggest a correlation between impairments in facial emotion processing and factors such as cognitive deficits, disability level, disease duration, and fatigue [16], the evidence is not uniform, with some studies failing to find such associations [17,18].
Despite the known difficulties in emotion processing among individuals with MS, the interactions between emotion and cognition remain poorly understood. To investigate this further, our study aimed to explore these interactions by utilizing both explicit and implicit paradigms of facial emotion processing. We examined attention processing and visual exploration of facial stimuli through eye tracking, as well as facial emotion recognition and the memory advantage for fearful faces compared to neutral faces in people with relapsing-remitting MS (hereafter referred to as PwMS) and healthy control participants (hereafter referred to as HC). Since the eye-tracking task involved age estimation, for which the interpretation of facial expressions is not relevant to successful task performance, we hypothesized – consistent with the findings of Liao et al. [19] – that individuals would spend time examining the upper (eyes) and lower (mouth) parts of the face for signs of aging, regardless of the emotional valence of the facial stimuli. Additionally, we expected that the enhancement effect of emotions on memory would be reflected in the finding that fearful faces are better remembered than neutral faces [7]. Whether PwMS would differ from HC in any of the facial emotion processing tasks in this study remains an open question. There is significant variability in findings across different studies: some research has reported significant changes in emotion processing even in the early stages of the disease, while other studies have not [12–14].
Materials and methods
Study design, setting, and participants
This was a cross-sectional controlled study conducted between June 2020 and December 2023. A total of 30 people with a confirmed diagnosis of relapsing-remitting MS, according to the 2017 revised McDonald criteria [20], and 35 HC agreed to participate in this study. PwMS were prospectively recruited as outpatients from the Department of Neurology at the Medical University of Innsbruck. HC did not report any health-associated complaints and were recruited through acquaintances or word of mouth. The inclusion criteria for both groups required participants to be over 18 years of age and to have good (corrected) near visual acuity of 20/25 (0.8) or better, as assessed by a Snellen chart. For the patient group, additional inclusion criteria were a confirmed diagnosis of relapsing-remitting MS as per the 2017 revised McDonald criteria [20], an Expanded Disability Status Scale (EDSS) [21] score of 4 or lower, no recent clinical relapse, and no current optic neuritis that could hinder participation in the eye-tracking assessment.
The exclusion criteria were as follows: eye surgery; current optic neuritis; substance abuse; or a history of a major medical, psychiatric, or neurological disorder other than MS. We also excluded people experiencing a current major depressive episode. This information was collected through informal interviews for HC and from medical records for PwMS. All participants had an estimated verbal intelligence quotient [22] of at least 85.
Standard protocol approvals and informed consent
The study was approved by the ethics committee of the Medical University of Innsbruck (ethical approval number: 1047/2020) and conforms to the World Medical Association Declaration of Helsinki regarding studies involving human subjects. Written informed consent was obtained from all participants prior to their involvement in the study.
Study procedure and data collection
Participants underwent a baseline neuropsychological evaluation and experimental tasks assessing visual exploration of facial stimuli through eye tracking, facial emotion recognition, and facial memory recognition.
Baseline neuropsychological assessment
Participants completed a baseline neuropsychological assessment that included standardized cognitive tests measuring episodic memory [23,24], verbal attention span [25], verbal working memory [25], figural fluency [26], psychomotor speed [27], cognitive flexibility [27], information processing speed [28], visuo-construction [24], and facial emotion recognition [29]. Additionally, they completed questionnaires assessing alexithymia [30] and mental health [31]. PwMS, but not HC, answered questionnaires regarding quality of life [32] and fatigue [33,34]. They also rated their current perceived fatigue severity using a visual analogue scale [35].
Experimental tasks
Age estimation task (with eye tracking).
The task included 25 trials consisting of a bi-syllabic pseudo-word (e.g., KAMA) followed by a facial stimulus, while eye movements were recorded. The pseudo-words were displayed as white characters at the center of the top portion of the computer screen against a black background for 2000 msec. Facial stimuli, selected from a well-established affective stimuli database (www.emotionlab.se/kdef/) [36], were images of faces in a frontal view displaying either a neutral expression (n = 10) or one of three emotions (fear, disgust, or happiness; n = 5 each), with a balanced sex distribution. The pictures were aligned in terms of size, brightness, and eye position and were presented in the center of the computer screen for 5000 msec. Each picture was displayed only once, and the trials were presented in a random order that was consistent for every participant. Participants were asked to read the pseudo-word aloud and then inform the experimenter whether the displayed person appeared younger or older than 30 years. The reading task was designed to maintain participants’ attention and encourage them to refocus their gaze on the baseline (top centered) position. Answers were recorded by the experimenter but were not used for analysis purposes. Participants were unaware of the study’s purpose and were not informed that this task was part of an implicit memory paradigm.
Facial memory recognition task (without eye tracking).
After a delay of approximately 20 min, during which randomly selected tests from the baseline neuropsychological assessment battery were administered, participants were presented with 50 pictures. This set included the 25 facial stimuli from the age estimation task and 25 new stimuli matched for facial expression. Participants were asked to decide whether a specific picture had been shown previously. No time limit was given. Answers were recorded by the experimenter. The accuracy rate for recognizing previously seen pictures was entered into the analysis. Consistent with the study by Okruszek et al. [7], we also computed the difference in accuracy rates between fearful and neutral stimuli. A higher accuracy in recognizing previously seen fearful stimuli compared to neutral stimuli indicates an emotional enhancement of memory.
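The difference score described above can be illustrated with a minimal Python sketch. The function names, response format, and sample values are hypothetical and serve only to make the computation concrete; they are not the study's data or code.

```python
# Emotional enhancement of memory, following the logic of Okruszek et al. [7]:
# accuracy for previously seen fearful faces minus accuracy for neutral faces.

def recognition_accuracy(responses):
    """Proportion of previously seen items correctly judged as 'seen'.

    responses: list of booleans, one per old item (True = correct 'seen' answer).
    """
    return sum(responses) / len(responses)

def emotional_enhancement(fear_responses, neutral_responses):
    """Positive values indicate a memory advantage for fearful over neutral faces."""
    return recognition_accuracy(fear_responses) - recognition_accuracy(neutral_responses)

# Hypothetical participant: 5 fearful and 10 neutral old items (as in the task).
fear = [True, True, True, True, False]          # 4/5 correct = 0.80
neutral = [True, True, True, False, False,
           True, False, True, False, False]     # 5/10 correct = 0.50
print(emotional_enhancement(fear, neutral))     # positive score (about 0.30)
```

A participant-level score like this, computed per facial expression condition, is what the group comparisons (e.g., the fear-minus-neutral difference) operate on.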
Facial emotion recognition task (without eye tracking).
We presented the 25 facial stimuli from the age estimation task and asked participants to identify the facial expression from seven alternatives: disgust, fear, happiness, anger, surprise, sadness, and neutrality. No time limit was given. Answers were recorded by the experimenter and analyzed as accuracy rates.
Eye tracking
Eye movements were recorded with a TOBII Pro TX300 remote eye tracker at a sampling rate of 300 Hz. The screen resolution was 1920 × 1080 pixels. Participants were comfortably positioned approximately 64 cm from the computer screen, with their heads resting against the backrest. The TOBII Pro TX300 system is equipped with head-movement compensation algorithms that ensure high accuracy and precision in data collection, eliminating the need for a chinrest. We performed a 9-point calibration and visually checked its quality before starting the age estimation task; calibration was repeated if necessary. In this study, the rate of successfully recorded eye-tracking data was at least 60%. The lowest rates in the sample (between 60% and 79%) pertained to nine participants and were associated with reflective glasses, FFP2 masks worn too high on the nose, or drooping eyelids. For the analysis, the total viewing time was divided into five one-second intervals to investigate changes in visual scanning over time. The proportion of dwell time was measured for four areas: eyes, nose, mouth, and other. Consistent with the approach of Gomez-Ibanez et al. [37], our analysis focused on the difference in the proportion of dwell time between the eye and mouth areas. Positive difference scores indicate a greater predominance of visual scanning in the eye area compared to the mouth area.
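The per-interval dwell-time difference score can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the study's analysis code: the fixation tuples, area labels, and function name are hypothetical, and each fixation is assumed to carry a start time, a duration, and an area label.

```python
# Difference in the proportion of dwell time between the eye and mouth areas,
# computed per one-second interval over a 5000 ms presentation, in the spirit
# of Gomez-Ibanez et al. [37]. Fixation data below are made up for illustration.

def dwell_differences(fixations, interval_ms=1000, total_ms=5000):
    """fixations: list of (start_ms, duration_ms, area) tuples, with area in
    {'eyes', 'nose', 'mouth', 'other'}. Returns one difference score per
    interval: (eye dwell - mouth dwell) / interval length. Positive scores
    indicate a predominance of the eye area in that interval."""
    n = total_ms // interval_ms
    eyes = [0.0] * n
    mouth = [0.0] * n
    for start, dur, area in fixations:
        for i in range(n):
            lo, hi = i * interval_ms, (i + 1) * interval_ms
            # Portion of this fixation falling inside interval i.
            overlap = max(0, min(start + dur, hi) - max(start, lo))
            if area == 'eyes':
                eyes[i] += overlap
            elif area == 'mouth':
                mouth[i] += overlap
    return [(e - m) / interval_ms for e, m in zip(eyes, mouth)]

fix = [(0, 600, 'eyes'), (600, 400, 'mouth'), (1000, 1000, 'eyes')]
print(dwell_differences(fix))   # [0.2, 1.0, 0.0, 0.0, 0.0]
```

These five per-interval scores per trial are the dependent variable entered into the mixed ANOVA described in the Statistics section.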
Statistics
We used SPSS software version 29.0 (IBM Corp, Chicago, IL) for our analyses, and JASP software version 0.19.2 (JASP Team, 2024) and R version 4.4.2 (R Core Team, 2023) for creating the figures. Groups were compared using the Chi-squared test and the Mann-Whitney test, as appropriate. We also computed Cohen’s d effect sizes for demographic and neuropsychological data; following Cohen’s convention, effect sizes were interpreted as small (<0.50), medium (0.50-0.79), or large (≥0.80). A mixed analysis of variance (ANOVA) was performed on the difference scores in the proportion of dwell time between the eye and mouth areas, with time interval (sec 1, sec 2, sec 3, sec 4, sec 5) as a within-subjects factor and group (HC, PwMS) as a between-subjects factor. This analysis was performed separately for each facial expression condition. Post-hoc contrasts were performed with Bonferroni correction. Due to a technical problem, data from one healthy participant were not recorded properly and were therefore excluded from the analysis. A Spearman rank-order correlational analysis was conducted for PwMS, examining the relationships between visual scanning measures, performance in the recognition tasks, information processing speed, clinical variables, and fatigue scores; this analysis focused exclusively on the fear condition. To correct for multiple correlations, we used the false discovery rate (FDR) method. Significance was set at α = 0.05 (two-tailed).
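The FDR correction applied to the correlation P-values is standardly implemented as the Benjamini-Hochberg step-up procedure. A minimal pure-Python sketch is shown below; the P-values in the example are invented for illustration, and the authors' actual implementation (SPSS/R) may differ in details such as tie handling.

```python
# Benjamini-Hochberg false discovery rate (FDR) procedure: sort the m P-values,
# find the largest rank k with p_(k) <= (k/m) * alpha, and reject the k
# hypotheses with the smallest P-values.

def fdr_bh(pvalues, alpha=0.05):
    """Return a list of booleans, True where the null hypothesis is rejected
    after Benjamini-Hochberg correction at level alpha."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank / m * alpha:
            k_max = rank
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            reject[i] = True
    return reject

print(fdr_bh([0.001, 0.008, 0.039, 0.041, 0.20]))
# [True, True, False, False, False]
```

Note that 0.039 and 0.041 survive an uncorrected 0.05 threshold but not the BH-corrected one, which is exactly the kind of borderline correlation the correction is meant to screen out.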
Results
Participants’ characteristics are given in Table 1. PwMS had a median EDSS score of 1.5 (min-max 0-3.5). The two groups had comparable median ages (HC: median 37.0 years, min-max 18-64; PwMS: median 38.0 years, min-max 21-57) and comparable median education levels (HC: median 13.0 years, min-max 9-19; PwMS: median 12.0 years, min-max 9-18). There were significantly more female participants in the PwMS group (n = 25; 83%) than in the HC group (n = 21; 60%). Typically, MS is more prevalent in females than in males, with a sex ratio of approximately 3:1 [38], which aligns with our observations. It is important to note that, in this study, sex did not have a significant influence on performance.
Baseline neuropsychological assessment
Results are reported in Table 2. The median scores for both PwMS and HC fell within the average range of standardized norms across all tests. Significant group differences were observed in tests assessing verbal attention span (P = 0.014), verbal working memory (P = 0.026), and delayed recall of figural information (P = 0.047), with PwMS generally scoring lower than HC. Additionally, PwMS obtained significantly higher scores on the alexithymia questionnaire compared to HC (P = 0.017). Three PwMS (10%), but no HC, scored above the cut-off score of 60 on the alexithymia scale [30]. Cohen’s d indicated a medium effect size for the significant group differences. For other measures, group differences were not significant. It is important to note that the performance of PwMS on the measures where significant group differences were found did not correlate with disease severity (EDSS), disease duration, or fatigue scores (Spearman rank-order tests, all P > 0.05).
Experimental tasks
Facial memory recognition task (without eye tracking).
Results are shown in Fig 1a. HC were significantly more accurate in recognizing previously seen fearful stimuli compared to other stimuli (Wilcoxon tests, fear vs. neutrality: Z = -2.26, P = 0.024, fear vs. happiness: Z = -2.03, P = 0.042, fear vs. disgust: Z = -2.35, P = 0.019). No other significant differences were observed (all P > 0.05). In contrast, PwMS did not show any significant differences between facial expression conditions (all P > 0.05). A direct comparison of HC and PwMS across the individual facial expression conditions did not yield significant results (all P > 0.05).
Abbreviations: HC = healthy control participants; MS = people with multiple sclerosis; NEU = neutrality; HAP = happiness; FEA = fear; DIS = disgust.
We also compared the groups in terms of the memory enhancement effect of emotions. We found that the median difference in accuracy rates between fearful and neutral conditions was significantly smaller for PwMS than for HC (Z = -2.06, P = 0.040, Cohen’s d = 0.56; Fig 2). Moreover, 18 HC demonstrated higher accuracy with fearful stimuli compared to neutral stimuli (51%), while only 10 PwMS showed similar results (33%; χ2 = 6.65, P = 0.010).
Legend. Emotional enhancement of memory is defined as the difference in accuracy rates between fearful and neutral conditions in the facial memory recognition task. Abbreviations: HC = healthy control participants; MS = people with multiple sclerosis.
Facial emotion recognition task (without eye tracking).
Results are shown in Fig 1b. HC were significantly less accurate in identifying fear compared to other emotions (Wilcoxon tests, fear vs. neutrality: Z = -4.92, P < 0.001, fear vs. happiness: Z = -4.45, P < 0.001, fear vs. disgust: Z = -4.82, P < 0.001). Additionally, they were significantly less accurate in identifying neutrality and happiness compared to disgust (disgust vs. neutrality: Z = -3.03, P = 0.002, disgust vs. happiness: Z = -3.63, P < 0.001), whereas happiness and neutrality did not differ (P = 0.695). PwMS also demonstrated lower accuracy in identifying fear compared to other emotions (fear vs. neutrality: Z = -4.61, P < 0.001, fear vs. happiness: Z = -4.57, P < 0.001, fear vs. disgust: Z = -4.58, P < 0.001). No significant differences emerged for the other comparisons (all P > 0.05). PwMS and HC performed comparably across each facial expression condition (all P > 0.05).
Visual exploration of facial stimuli during the age estimation task (with eye tracking).
Results of an ANOVA carried out on difference scores in the proportion of dwell time between the eye and mouth areas showed that, across all facial expression conditions, the main effects of group and time interval were not significant (all P > 0.05). The interaction between time interval and group was significant for neutrality (F(4, 248) = 4.05, P = 0.003, ηp² = 0.06; Fig 3a), happiness (F(4, 248) = 2.44, P = 0.047, ηp² = 0.04; Fig 3b), and fear (F(4, 248) = 4.53, P = 0.002, ηp² = 0.07; Fig 3c), but not for disgust (P > 0.05; Fig 3d). Post-hoc contrasts indicated significant group differences in the first (neutrality: P = 0.022; happiness: P = 0.05; fear: P = 0.031) and second time intervals (neutrality: P = 0.042; happiness: P = 0.004; fear: P = 0.016), with PwMS generally showing larger differences in the proportion of dwell time between the eye and mouth areas compared to HC. Other contrasts were not significant (all P > 0.05).
Legend. Positive difference scores indicate a predominance of visual scanning in the eye area compared to the mouth area. Abbreviations: HC = healthy control participants; MS = people with multiple sclerosis; Sec = second.
In summary, significant differences were observed between PwMS and HC in their visual scanning of facial stimuli. In all but one condition (disgust), PwMS exhibited marked visual exploration of the eye area compared to the mouth area throughout the entire stimulus presentation. In contrast, HC explored the eye and mouth areas for nearly the same duration at the beginning of the stimulus presentation. However, from the third time interval onward, they also showed a predominance of visual scanning in the eye area, similar to what was observed in PwMS.
Correlation analysis
Fig 4 reports rho-values for PwMS. Higher accuracy in memory and emotion recognition tasks involving fearful stimuli significantly correlated with higher information processing speed and lower EDSS scores. Longer exploration of the eye and mouth areas of fearful stimuli significantly correlated with lower EDSS scores, shorter disease duration, and lower fatigue scores. These correlations remained significant after applying FDR correction. No significant correlations were found between visual scanning measures and performance in recognition tasks (data not shown).
(*) indicates statistical significance (two-sided, P ≤ 0.05, uncorrected). Abbreviations: PwMS = people with multiple sclerosis; SDMT = Symbol Digit Modality Test; EDSS = Expanded Disability Status Scale; NFI-MS = Neurological Fatigue Index – Multiple Sclerosis; MEM REC = Accuracy in memory recognition of fearful stimuli; EMOT REC = Accuracy in emotion recognition of fearful stimuli; MEM Diff = Difference in accuracy rates between fearful and neutral conditions in the memory recognition task (i.e., emotional enhancement of memory); DWT Eyes = Proportion of dwell time in the eye area (fear condition, first time interval); DWT Mouth = Proportion of dwell time in the mouth area (fear condition, first time interval); DWT Diff = Difference in the proportion of dwell time between the eye and mouth areas (fear condition, first time interval).
Discussion
In this study, we included PwMS who have minimal-to-moderate disability and found that HC, but not PwMS, exhibit a memory advantage in recognizing previously seen fearful faces compared to neutral faces. Moreover, we demonstrated that the visual scanning patterns of PwMS during an age estimation task, measured through eye tracking, differ from those of HC. In this task, participants were presented with facial stimuli depicting various emotions or neutrality, and were required to determine whether the displayed person appeared younger or older than a certain age. The emotional valence of the facial stimuli was not relevant to successful task performance. Significant differences between PwMS and HC were evident in all but one condition (disgust) during a very early phase of visual exploration. During this initial phase (the first two seconds of a five-second exploration window), HC spent almost comparable amounts of time exploring the eye and mouth areas, whereas PwMS spent less time on the lower area in favor of the upper area. In a later phase (from the third second onward), both HC and PwMS spent more time exploring the eye area than the mouth area. Since eye tracking is often used to assess visual attention, these results suggest differences in the distribution of attention and processing resources between PwMS and HC. Importantly, there was no significant difference between the groups in the time spent on the whole face (these results are not reported here), suggesting that peripheral elements such as hair or background did not attract more attention from PwMS compared to HC. Studies investigating visual scanning in PwMS during facial emotion recognition have reported heterogeneous findings [39,40]. 
In general, our findings, along with previously reported results [39], indicate that changes in visual scanning of facial stimuli – regardless of whether facial expressions need to be interpreted – may arise at very early stages of the disease and are not always associated with impairments in task performance. Indeed, in this study, both PwMS and HC performed comparably in the age estimation task (both groups provided similar estimates) and in the explicit facial emotion recognition task, despite their differing visual scanning patterns.
We used an age estimation task to assess the memory enhancement effect of emotions through an implicit memory paradigm. The perception and interpretation of facial characteristics, such as age or emotional expressions, are essential for social interaction [1,19]. In everyday situations, people often need to quickly gather information about others’ faces and form first impressions [19]. They may later need to remember whether they have seen a person before [1]. Previous studies have demonstrated a memory enhancement effect of emotions [5], with angry and fearful facial expressions being better remembered than neutral expressions [7]. In this study, we found that HC but not PwMS exhibit a memory advantage in recognizing previously seen fearful faces compared to neutral faces. Previous studies using verbal declarative memory tasks have shown that the performance of healthy individuals, unlike that of PwMS, is influenced by emotionally salient information [41]. This indicates that differences in the memory enhancement effect of emotions between HC and PwMS are evident for both verbal and non-verbal (facial) material. Emotional enhancement of memory has been associated with the functional integrity of a brain network that includes the amygdala, memory-related medial temporal lobe (MTL) regions such as the hippocampus, and the prefrontal cortex (PFC), particularly the ventrolateral PFC and the dorsolateral PFC [5]. Findings from different studies point to functional and structural abnormalities in MS within a cortico-subcortical network dedicated to emotion processing and, more generally, social cognition [42]. For example, in a study by Passamonti et al. [43], PwMS, compared to healthy individuals, showed enhanced activation in the ventrolateral PFC and a lack of functional connectivity between prefrontal regions (specifically, ventrolateral and medial PFC) and the amygdala when processing emotional versus neutral stimuli. 
Results of our study contribute to the understanding of how MS affects cognition and emotion processing by showing that the mechanisms underlying emotion-cognition interactions may be disrupted in MS, even in people with preserved facial emotion recognition, minimal-to-moderate disability, and only minor cognitive alterations. The fact that the groups did not differ in emotion recognition but showed differences in the memory enhancement effect of emotions also underscores the importance of investigating both explicit and implicit emotion processing.
In this study, PwMS showed only minor alterations in some measures of attention, memory, and alexithymia, while performing comparably to HC on the majority of neuropsychological tests, including those assessing facial emotion recognition. Findings regarding facial emotion recognition or alexithymia in MS are heterogeneous, with some studies reporting differences between PwMS and HC, while others do not [12–14]. This variability in performance, particularly in the early stages of the disease, has been discussed with reference to potential compensatory mechanisms underlying cognitive reserve and brain reserve [42].
We found that performance of PwMS in recognition tasks and their visual scanning patterns correlated with information processing speed, disease duration, physical disability, and fatigue. Previous studies have yielded mixed findings, with some indicating significant associations with cognitive and clinical variables [16,40], while others have not [17,18]. Our results provide valuable insights, suggesting that disease-related changes in PwMS may impact visual scanning, as well as emotion and memory recognition processes that are critical for social interactions and everyday functioning. Notably, we did not find a correlation between visual scanning measures and performance in recognition tasks, at least for fearful stimuli. It is possible that the observed changes in visual exploration and the lack of emotional enhancement of memory in PwMS are related to alterations in cognitive mechanisms that are at least partially independent. Future studies are needed to verify this speculation.
Different explanations can be proposed to account for the observed differences in visual scanning and the lack of emotional enhancement of memory in PwMS. MS is a neuroinflammatory, demyelinating disorder characterized by both functional and structural changes in the brain, which disrupt neural connectivity on a large scale [44]. This disruption may affect pathways responsible for visual scanning and emotion processing, potentially leading to altered visual scanning patterns and impacting how emotions influence memory. Fatigue is a common symptom of MS that can significantly impair an individual’s physical and cognitive functioning [45]. When individuals experience fatigue, their ability to effectively scan their environment or engage with complex stimuli, such as faces, may be diminished, resulting in less efficient visual and memory processing. Additionally, MS is associated with cognitive changes [9–11], which may further influence how effectively PwMS can visually scan and recall previously encountered stimuli. The significant correlations found in this study between information processing speed, disease-related variables including fatigue, and performance of PwMS on recognition tasks, as well as their visual scanning patterns, support these hypotheses. Finally, some PwMS may develop compensatory strategies to cope with their symptoms [46], which could lead to variations in visual scanning and memory performance. For example, they may rely more heavily on specific visual cues or focus on particular areas of interest, which can alter their overall scanning patterns and the way emotions influence memory. These explanations are not mutually exclusive; however, they remain speculative and require further verification.
Limitations
One limitation of our study is that our sample consisted of PwMS with minimal-to-moderate disability. As a result, we may have missed peculiarities related to disease severity or progression. Moreover, we did not record response times in the behavioral tasks and therefore cannot account for differences in processing speed during the execution of the memory and emotion recognition tasks. Future studies could explore the differences between memory and emotion recognition tasks by employing both implicit and explicit paradigms.
Conclusion
The results of this study make a significant contribution to previous findings [12–15] by demonstrating, for the first time, that PwMS with otherwise intact facial emotion recognition may still exhibit alterations in visual scanning and memory processing of socially relevant non-verbal information, such as faces. In daily life, these impairments can lead to inadequate processing of visual cues and difficulties in recognizing familiar faces, which may result in misunderstandings and hinder social interactions. Our findings indicate that healthcare professionals should be aware that PwMS in the early stages of the disease may experience subtle changes in some aspects of social functioning. This underscores the need to include in clinical assessments specific neuropsychological tests that focus on visual scanning and memory processing of facial stimuli. Such assessments could facilitate the early identification of changes in social functioning among PwMS. Early detection may, in turn, provide opportunities for targeted cognitive interventions that emphasize visual exploration of facial stimuli and memory strategies, potentially preventing or mitigating future impairments in psychosocial functioning and enhancing overall quality of life.
Supporting information
S1 Fig. Example of visual scanning patterns with a facial stimulus (fear condition) at different time intervals.
https://doi.org/10.1371/journal.pone.0319967.s001
(DOCX)
Acknowledgments
We thank M. Herzog and V. Mayr for their help in the data collection, and K. Thaler for her help in the preparation of the figures. We also thank all patients and controls for their participation in the study.
References
- 1. Connolly HL, Lefevre CE, Young AW, Lewis GJ. Emotion recognition ability: Evidence for a supramodal factor and its links to social cognition. Cognition. 2020;197:104166. pmid:31951857
- 2. Bornhofen C, McDonald S. Emotion perception deficits following traumatic brain injury: a review of the evidence and rationale for intervention. J Int Neuropsychol Soc. 2008;14(4):511–25. pmid:18577280
- 3. Sousa D, Ferreira A, Rodrigues D, Pereira H, Amaral J, Crisostomo J, et al. A neurophysiological signature of dynamic emotion recognition associated with social communication skills and cortical gamma-aminobutyric acid levels in children. Front Neurosci. 2023;17:1295608.
- 4. Phillips LH, Henry JD, Scott C, Summers F, Whyte M, Cook M. Specific impairments of emotion perception in multiple sclerosis. Neuropsychology. 2011;25(1):131–6. pmid:21090898
- 5. Dolcos F, Denkova E. Current emotion research in cognitive neuroscience: Linking enhancing and impairing effects of emotion on cognition. Emotion Review. 2014;6(4):362–75.
- 6. Schindler S, Bublatzky F. Attention and emotion: An integrative review of emotional face processing as a function of attention. Cortex. 2020;130:362–86. pmid:32745728
- 7. Okruszek Ł, Bala A, Dziekan M, Szantroch M, Rysz A, Marchel A, et al. Gaze matters! The effect of gaze direction on emotional enhancement of memory for faces in patients with mesial temporal lobe epilepsy. Epilepsy Behav. 2017;72:35–8. pmid:28575764
- 8. Dendrou CA, Fugger L, Friese MA. Immunopathology of multiple sclerosis. Nat Rev Immunol. 2015;15(9):545–58. pmid:26250739
- 9. Kalb R, Beier M, Benedict R, Charvet L, Costello K, Feinstein A, et al. Recommendations for cognitive screening and management in multiple sclerosis care. Mult Scler. 2018;24(13):1665–80.
- 10. Genova HM, Sumowski JF, Chiaravalloti N, Voelbel GT, Deluca J. Cognition in multiple sclerosis: a review of neuropsychological and fMRI research. Front Biosci (Landmark Ed). 2009;14(5):1730–44. pmid:19273158
- 11. Brochet B, Ruet A. Cognitive Impairment in Multiple Sclerosis With Regards to Disease Duration and Clinical Phenotypes. Front Neurol. 2019;10:261. pmid:30949122
- 12. Lin X, Zhang X, Liu Q, Zhao P, Zhong J, Pan P, et al. Social cognition in multiple sclerosis and its subtypes: A meta-analysis. Mult Scler Relat Disord. 2021;52:102973. pmid:33962135
- 13. Chalah MA, Ayache SS. Deficits in Social Cognition: An Unveiled Signature of Multiple Sclerosis. J Int Neuropsychol Soc. 2017;23(3):266–86. pmid:28069095
- 14. Doskas T, Vavougios GD, Karampetsou P, Kormas C, Synadinakis E, Stavrogianni K, et al. Neurocognitive impairment and social cognition in multiple sclerosis. Int J Neurosci. 2022;132(12):1229–44. pmid:33527857
- 15. Gury P, Moulin M, Laroye R, Montazel M, Trachino M, Narme P, et al. Explicit and implicit abilities in humor processing in patients with relapsing-remitting multiple sclerosis. Soc Neurosci. 2024;19(1):1–13. pmid:38424715
- 16. Berneiser J, Wendt J, Grothe M, Kessler C, Hamm AO, Dressel A. Impaired recognition of emotional facial expressions in patients with multiple sclerosis. Mult Scler Relat Disord. 2014;3(4):482–8. pmid:25877060
- 17. Pitteri M, Genova H, Lengenfelder J, DeLuca J, Ziccardi S, Rossi V, et al. Social cognition deficits and the role of amygdala in relapsing remitting multiple sclerosis patients without cognitive impairment. Mult Scler Relat Disord. 2019;29:118–23. pmid:30710839
- 18. Henry A, Tourbah A, Chaunu M-P, Rumbach L, Montreuil M, Bakchine S. Social cognition impairments in relapsing-remitting multiple sclerosis. J Int Neuropsychol Soc. 2011;17(6):1122–31. pmid:22014035
- 19. Liao D, Ishii LE, Chen J, Chen LW, Kumar A, Papel ID, et al. How Old Do I Look? Exploring the Facial Cues of Age in a Tasked Eye-Tracking Study. Facial Plast Surg Aesthet Med. 2020;22(1):36–41. pmid:32053421
- 20. Thompson AJ, Banwell BL, Barkhof F, Carroll WM, Coetzee T, Comi G, et al. Diagnosis of multiple sclerosis: 2017 revisions of the McDonald criteria. Lancet Neurol. 2018;17(2):162–73. pmid:29275977
- 21. Kurtzke JF. Rating neurologic impairment in multiple sclerosis: an expanded disability status scale (EDSS). Neurology. 1983;33(11):1444–52. pmid:6685237
- 22. Lehrl S. Mehrfach-Wortschatz-Intelligenztest [Multiple choice vocabulary intelligence test]. Balingen: Spitta Verlag; 1999.
- 23. Helmstaedter C, Lendt M, Lux S. Verbaler Lern- und Merkfähigkeitstest. Göttingen: Beltz Test GmbH; 2001.
- 24. Meyers JE, Meyers KR. Rey Complex Figure Test and recognition trial: professional manual. Lutz, FL: Psychological Assessment Resources.
- 25. Härting C, Markowitsch H, Neufeld H, Calabrese P, Deisinger K. WMS-R Wechsler Gedächtnistest – Revidierte Fassung. Bern: Hans Huber; 2000.
- 26. Haid T, Martl C, Schubert F, Wenzl M, Kofler M, Saltuari L. Hamasch 5 Point Test – Revised (H5PT-R): additional normative data. 2004.
- 27. Rodewald K, Bartolovic M, Debelak R, Aschenbrenner S, Weisbrod M, Roesch-Ely D. Eine Normierungsstudie eines modifizierten Trail Making Tests im deutschsprachigen Raum. Zeitschrift für Neuropsychologie. 2012;23(1):37–48.
- 28. Smith A. Symbol Digit Modalities Test. Hogrefe; 2013.
- 29. Ekman P, Friesen W. Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press; 1976.
- 30. Bagby RM, Taylor GJ, Parker JD. The Twenty-item Toronto Alexithymia Scale--II. Convergent, discriminant, and concurrent validity. J Psychosom Res. 1994;38(1):33–40. pmid:8126688
- 31. Herrmann C, Buss U, Snaith R. Hospital Anxiety and Depression Scale – German version (HADS-D). Bern: Hans Huber; 1995.
- 32. Flachenecker P, Vogel U, Simeoni MC, Auquier P, Rieckmann P. MusiQol: international questionnaire investigating quality of life in multiple sclerosis: validation results for the German subpopulation in an international comparison. Nervenarzt. 2011;82(10):1281–9. pmid:21472450
- 33. Mills RJ, Young CA, Pallant JF, Tennant A. Development of a patient reported outcome scale for fatigue in multiple sclerosis: The Neurological Fatigue Index (NFI-MS). Health Qual Life Outcomes. 2010;8:22. pmid:20152031
- 34. Seebacher B, Horton MC, Reindl M, Brenneis C, Ehling R, Deisenhammer F, et al. Psychometric Evaluation of the “German Neurological Fatigue Index for Multiple Sclerosis (NFI-MS-G)” in a Sample of Rehabilitation Patients with Multiple Sclerosis. Rehabilitation (Stuttg). 2023;62(1):31–9. pmid:36516968
- 35. Zamarian L, Delazer M, Ehling R, Pertl M-T, Bsteh G, Wenter J, et al. Improvement of medical judgments by numerical training in patients with multiple sclerosis. Eur J Neurol. 2019;26(1):106–12. pmid:30117230
- 36. Lundqvist D, Flykt A, Öhman A. The Karolinska Directed Emotional Faces – KDEF [CD-ROM]. Department of Clinical Neuroscience, Psychology Section, Karolinska Institutet; 1998. ISBN 91-630-7164-9.
- 37. Gomez-Ibañez A, Urrestarazu E, Viteri C. Recognition of facial emotions and identity in patients with mesial temporal lobe and idiopathic generalized epilepsy: an eye-tracking study. Seizure. 2014;23(10):892–8.
- 38. Coyle PK. What Can We Learn from Sex Differences in MS? J Pers Med. 2021;11(10):1006. pmid:34683148
- 39. Polet K, Hesse S, Joly H, Cohen M, Morisot A, Kullmann B, et al. Facial emotion impairment in multiple sclerosis is linked to modifying observation strategies of emotional faces. Mult Scler Relat Disord. 2023;69:104439. pmid:36525898
- 40. Pinto C, Gomes F, Moreira I, Rosa B, Santos E, Silva A, et al. Emotion recognition in multiple sclerosis. Journal of Eye Tracking, Visual Cognition and Emotion. 2012;2:1647–77.
- 41. Iaffaldano P, Viterbo RG, Goretti B, Portaccio E, Amato MP, Trojano M. Emotional and neutral verbal memory impairment in Multiple Sclerosis. J Neurol Sci. 2014;341(1–2):28–31. pmid:24713509
- 42. Chalah MA, Ayache SS. A Scope of the Social Brain in Multiple Sclerosis: Insights From Neuroimaging Studies. Cogn Behav Neurol. 2020;33(2):90–102. pmid:32496294
- 43. Passamonti L, Cerasa A, Liguori M, Gioia MC, Valentino P, Nisticò R, et al. Neurobiological mechanisms underlying emotional processing in relapsing-remitting multiple sclerosis. Brain. 2009;132(Pt 12):3380–91. pmid:19420090
- 44. Schoonheim MM, Broeders TAA, Geurts JJG. The network collapse in multiple sclerosis: An overview of novel concepts to address disease dynamics. Neuroimage Clin. 2022;35:103108. pmid:35917719
- 45. Wendebourg MJ, Poettgen J, Finlayson M, Gonzalez-Lorenzo M, Heesen C, Köpke S, et al. Education for fatigue management in people with multiple sclerosis: Systematic review and meta-analysis. Eur J Neurol. 2024;31(12):e16452. pmid:39225447
- 46. Randolph JJ, Randolph JS, Wishart HA. Subgroup Analysis of Individuals with Multiple Sclerosis Showing Cognitive Resilience. Arch Clin Neuropsychol. 2022;37(2):302–8. pmid:34386812