Abstract
In this study we evaluate the convergent validity of a new graphical self-report tool (the EmojiGrid) for the affective appraisal of perceived touch events. The EmojiGrid is a square grid labeled with facial icons (emoji) showing different levels of valence and arousal. The EmojiGrid is language independent and efficient (a single click suffices to report both valence and arousal), making it a practical instrument for studies on affective appraisal. We previously showed that participants can intuitively and reliably report their affective appraisal (valence and arousal) of visual, auditory and olfactory stimuli using the EmojiGrid, even without additional (verbal) instructions. However, because touch events can be bidirectional and dynamic, these previous results cannot be generalized to the touch domain. In this study, participants reported their affective appraisal of video clips showing different interpersonal (social) and object-based touch events, using either the validated 9-point SAM (Self-Assessment Manikin) scale or the EmojiGrid. The valence ratings obtained with the EmojiGrid and the SAM are in excellent agreement. The arousal ratings show good agreement for object-based touch and moderate agreement for social touch. For social touch and at more extreme levels of valence, the EmojiGrid appears more sensitive to arousal than the SAM. We conclude that the EmojiGrid can also serve as a valid and efficient graphical self-report instrument to measure human affective response to a wide range of tactile signals.
Citation: Toet A, van Erp JBF (2020) The EmojiGrid as a rating tool for the affective appraisal of touch. PLoS ONE 15(9): e0237873. https://doi.org/10.1371/journal.pone.0237873
Editor: Enzo Pasquale Scilingo, Universita degli Studi di Pisa, ITALY
Received: March 30, 2020; Accepted: August 4, 2020; Published: September 2, 2020
Copyright: © 2020 Toet, van Erp. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: All relevant data are within the paper and its Supporting Information files.
Funding: The authors received no specific funding for this work.
Competing interests: The authors have declared that no competing interests exist.
1 Introduction
Besides enabling us to discriminate material and object properties, our sense of touch also has hedonic and arousing qualities [1–4]. For instance, light, soft stroking and soft, smooth materials (e.g., fabrics) are typically perceived as pleasant and soothing, while heavy, hard stroking and stiff, rough, or coarse materials are experienced as unpleasant and arousing [2, 5]. Affective touch has been defined as tactile processing with a hedonic or emotional component [6]. It plays a significant role in social communication [7]. Interpersonal or social touch [8] has a strong emotional valence that can be either positive (when expressing support, reassurance, affection or attraction [9]) or negative (conveying anger, frustration, or disappointment [10]). Affective touch can significantly affect social interactions [11]. For example, a soft touch on the forearm can lead to more favorable evaluations of the toucher [12], a light touch on the back can help to persuade people [13], and caressing touch (e.g., holding hands, hugging, kissing, cuddling, and massaging) can influence our physical and emotional well-being [14].
Recent studies have shown that discriminative and affective touch are processed by orthogonal somatosensory subsystems [15, 16] that are already established early in life [17, 18]. The importance of touch in social communication is further highlighted by the fact that the human skin has receptors that appear specifically tuned to process some varieties of (caressing) affective social touch (“the skin as a social organ” [7]) in addition to those for discriminative touch [15, 16, 19–21], presumably as in all mammals [22]. These receptors are in fact so well suited to convey socially relevant touch that they have become known as a “social touch system” [16, 23]. Since it is always reciprocal, social touch emotionally affects not only the receiver [14] but also the touch giver [24]. Touch is the primary modality for conveying intimate emotions [7, 14, 25]. To study the emotional impact of touch, validated and efficient affective self-report tools are needed.
In our digital age, human social interaction, including haptic interaction, is becoming increasingly mediated. Most research on mediated haptic interaction has addressed the affective qualities of vibrotactile stimulation patterns [26–28]. However, recent technological developments, such as the embodiment of artificial entities [29] and advanced haptic and tactile display technologies and standards ([30], including initial guidelines for mediated social touch [31]), also enable mediated social touch [32]. Mediated social touch may serve to enhance the affective quality of communication between geographically separated partners [33] and foster trust in and compliance with artificial entities [34]. Social touch between humans and touch-enabled agents and robots will become more significant with their increasing deployment in healthcare, teaching and telepresence applications [31, 35]. For all these applications, a thorough understanding of the relation between tactile stimuli and the affective responses they evoke is crucial for designing effective stimuli. Again, validated and efficient tools are needed to measure human affective response to tactile signals.
In accordance with the circumplex model of affect [36], the affective responses elicited by tactile stimuli vary mainly over the two principal affective dimensions of valence and arousal [28]. Most studies on the emotional response to touch apply two individual one-dimensional Likert scales [37, 38] or SAM (Self-Assessment Manikin [39]) scales [33, 34, 40] to measure both affective dimensions. Although the SAM is a validated and generally accepted tool, it has some practical drawbacks. The emotions it depicts are often misunderstood [41, 42]. Although the valence scale of the SAM is quite intuitive (the manikin’s facial expression varies from a frown to a smile), its arousal dimension (depicted as an ‘explosion’ in the manikin’s stomach) is frequently misinterpreted [43–45]. When used to measure the affective appraisal of touch, the SAM’s arousal scale proved not to be self-evident for participants [40]. When participants do not fully understand the meaning of “arousal”, they tend to copy their valence response to the arousal scale (an anchoring effect) [46]. To remedy this effect, participants are sometimes provided with additional explanations about this scale, emphasizing that arousal refers to emotional arousal [40]. The use of Likert and visual analogue (VAS) scales involves cognitive effort, since they require the user to relate experienced emotions to numbers or labels on a scale. As a result, people with stroke-related impairments [47] and children [48] are often unable to correctly complete VAS scales, whereas they can successfully use scales based on iconic facial expressions [49, 50]. The cognitive effort required for the use of VAS scales can also result in an emotion-cognition interaction, such that participants react faster and more consistently to stimuli that are rated as more unpleasant or more arousing than to stimuli that are rated as more pleasant and less arousing [37].
Also, SAM and Likert scales both require two successive assessments of valence and arousal (along two individual scales), making their use more elaborate, error-prone and time-consuming. Previous studies have therefore recognized the need to develop new rating scales for affective touch [51].
We recently presented a new intuitive and language-independent self-report instrument called the EmojiGrid (Fig 1): a square grid labeled with facial icons (emoji) expressing different levels of valence (e.g., angry face vs. smiling face) and arousal (e.g., sleepy face vs. excited face) [46]. In previous studies we found that participants can intuitively and reliably report their emotional response with a single click on the EmojiGrid, even without any further instructions [46, 52–54]. This suggested that the EmojiGrid might also have more general validity as a self-report instrument to assess human affective responses. Given the intimate relationship between affective touch and affective facial expressions [55] we believe that the EmojiGrid may also be a useful tool to rate the affective appraisal of touch events.
The facial expressions of the emoji along the horizontal (valence) axis gradually change from unpleasant via neutral to pleasant, while the intensity of the facial expressions gradually increases in the vertical (arousal) direction.
In this study, we evaluated the convergent validity of the EmojiGrid as a self-report tool for the affective assessment of perceived touch events. To this end, we measured perceived valence and arousal for various touch events in video clips from the validated Socio-Affective Touch Expression Database (SATED: [40]) using both the EmojiGrid and the validated SAM affective rating tool, and we compared the results. While social and non-social touch events elicit different neural activation patterns [56], it appears that the brain activity patterns elicited by imagined, perceived and experienced (affective) touch are highly similar [7, 19, 57–59]. The mere observation of social touch (relative to observation of object touch) activates a number of brain regions (including primary and secondary somatosensory cortices) in a somatotopical way [59], resulting in similar pleasantness ratings [19]. Mental tactile imagery recruits partially overlapping neural substrates in both the primary and secondary somatosensory areas [57]. The anterior insula, which has been implicated in anticipating touch and coding its affective quality [60], responds both to experienced and imagined touch [58]. To some extent, people experience the same touches as the ones they see (they have the ability to imagine how an observed touch would feel [61]): seeing other people’s hands [62], legs [63], neck, or face [59] being touched activates brain regions that also respond when participants are touched on the same body part. Overall, these findings suggest that video clips showing touch actions are a valid means to study affective touch perception [64].
2 Methods and procedure
2.1 Stimuli
The stimuli used in this study are all 75 video clips from the validated Socio-Affective Touch Expression Database (SATED [40]). These clips represent 25 different dynamic touch events varying widely in valence and arousal. The interpersonal socio-affective touch events (N = 13) show people hugging, patting, punching, pushing, shaking hands or nudging each other’s arm (Fig 2A). The object-based (non-social) touch events (N = 12) represent human-object interactions with motions that match those involved in the corresponding social touch events, and show people touching, grabbing, carrying, shaking or pushing objects like bottles, boxes, baskets, or doors (Fig 2B). Each touch movement is performed three times (i.e., by three different actors or actor pairs) and for about three seconds, resulting in a total of 75 video clips. All video clips had a resolution of 640×360 pixels.
An interpersonal (socio-affective) touch event (left) and a corresponding object-based touch event (right).
2.2 Measures
2.2.2 Valence and arousal.
In two independent online experiments, valence and arousal ratings were obtained for each of the 75 SATED video clips using either a 9-point SAM scale (Experiment 1) or the EmojiGrid (Experiment 2). The SAM is a pictorial self-report tool that enables users to rate the valence and arousal dimensions of their momentary feelings by selecting those humanoid figures that best reflect their own feelings. The EmojiGrid (Fig 1) is a square grid that is labeled with emoji showing different facial expressions. Each outer edge of the grid is labeled with five emoji, and there is one (neutral) emoji located in its center. The facial expressions of the emoji along a horizontal (valence) edge vary from disliking (unpleasant) via neutral to liking (pleasant), and their expression gradually increases in intensity along a vertical (arousal) edge. Users can report their affective state by placing a checkmark at the appropriate location on the grid. The EmojiGrid was first introduced in a study on food evoked emotions [46].
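Because a single click on the EmojiGrid encodes both affective dimensions, converting a click position into valence and arousal scores is a simple linear rescaling. The following sketch illustrates the idea; the 500-pixel grid size, the screen-style coordinate origin (top-left, y growing downward), and the 1–9 target range (chosen here only for comparability with the 9-point SAM) are illustrative assumptions, not details of the actual implementation:

```python
def emojigrid_click_to_ratings(x_px, y_px, grid_size_px=500, scale=(1, 9)):
    """Map a click on the EmojiGrid to (valence, arousal) ratings.
    Assumes screen coordinates (origin top-left, y grows downward);
    the grid size and the 1-9 target scale are illustrative choices."""
    lo, hi = scale
    valence = lo + (x_px / grid_size_px) * (hi - lo)        # left -> right
    arousal = lo + (1.0 - y_px / grid_size_px) * (hi - lo)  # bottom -> top
    return valence, arousal
```

Under this mapping, a click in the center of the grid (the neutral emoji) yields the scale midpoint on both dimensions, mirroring the neutral anchors of the two SAM scales.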
2.3 Participants
English speaking participants were recruited via the Prolific database (https://prolific.ac). A total of 65 participants (40 females, 25 males) aged between 18 and 35 (M = 27.5; SD = 5.1) participated in Experiment 1 (SAM rating). A total of 65 participants (43 females, 22 males) aged between 18 and 35 (M = 29.2; SD = 5.2) participated in Experiment 2 (EmojiGrid rating). The experimental protocol was reviewed and approved by the TNO Ethics Committee (Ethical Approval Ref: 2017–011) and was in accordance with the Helsinki Declaration of 1975, as revised in 2013 [65]. Participation was voluntary. After completing the study, all participants received a small financial compensation for their participation.
2.4 Procedure
The experiments were performed as (anonymous) online surveys created with the Gorilla experiment builder [66]. In each experiment, the participants viewed 75 brief video clips, each showing a different touch event, and rated (using the SAM in Experiment 1 and the EmojiGrid in Experiment 2) for each video how the touch would feel. First, the participants provided informed consent and reported their demographic variables. Next, they were introduced to either the SAM (Experiment 1) or the EmojiGrid (Experiment 2) response tool and were instructed how to use this rating tool to rate their affective appraisal of each perceived touch event. For the SAM, we explained that the feelings expressed by the figures on the valence scale ranged from very unpleasant via neutral to very pleasant, while the figures on the arousal scale expressed feelings ranging from very relaxed/calm via neutral to very excited/stimulated. The instructions for the use of the SAM scale stated: “Click on the response scales to indicate how pleasant (upper scale) and arousing (lower scale) the touch event feels”. For the EmojiGrid, no explanation was offered for the meaning of the facial icons along its axes. The instructions for the use of the EmojiGrid stated: “Click on a point inside the grid that best matches how you think the touch event feels”. No further explanation was given. In each experiment, the participants performed two practice trials to get familiar with the SAM or EmojiGrid and its use. Immediately after these practice trials the actual experiment started. The video clips were presented in random order. The rating task was self-paced, without an imposed time limit. After seeing each video clip, the participants responded by successively clicking on the valence and arousal scales of the SAM tool (Experiment 1; see Fig 3) or with a single click on the EmojiGrid (Experiment 2; see Fig 4). Immediately after responding, the next video clip was presented. On average, the experiment lasted about 10 minutes.
Left: a screenshot of a movie clip showing an interpersonal (social) touch action. Right: The SAM valence (top) and arousal (bottom) rating scales.
Left: a screenshot of a movie clip showing an interpersonal (social) touch action. Right: The EmojiGrid rating tool.
2.4.1 Data analysis.
IBM SPSS Statistics 26 (www.ibm.com) for Windows was used to perform all statistical analyses. Intraclass correlation coefficient (ICC) estimates and their 95% confidence intervals were based on a mean-rating (k = 3), absolute-agreement, 2-way mixed-effects model [67, 68]. ICC values less than .5 are indicative of poor reliability, values between .5 and .75 indicate moderate reliability, values between .75 and .9 indicate good reliability, and values greater than .9 indicate excellent reliability [67]. For all other analyses, a probability level of p < .05 was considered statistically significant.
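The ICC variant used here (two-way model, absolute agreement, average of k raters, i.e. ICC(A,k) in McGraw and Wong's notation) can be computed directly from the two-way ANOVA mean squares. The analyses in this study were run in SPSS; the following numpy function is an illustrative reimplementation of the same definition, not the code used in the study:

```python
import numpy as np

def icc_a_k(X):
    """ICC(A,k): two-way model, absolute agreement, average of k raters
    (McGraw & Wong's ICC(A,k)). X is an (n targets x k raters) array."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    grand = X.mean()
    row = X.mean(axis=1)                              # per-target means
    col = X.mean(axis=0)                              # per-rater means
    msr = k * np.sum((row - grand) ** 2) / (n - 1)    # targets mean square
    msc = n * np.sum((col - grand) ** 2) / (k - 1)    # raters mean square
    resid = X - row[:, None] - col[None, :] + grand
    mse = (resid ** 2).sum() / ((n - 1) * (k - 1))    # error mean square
    return (msr - mse) / (msr + (msc - mse) / n)
```

The returned value can then be read against the thresholds above (poor below .5, moderate between .5 and .75, good between .75 and .9, excellent above .9).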
For each of the 25 different touch scenarios, we computed the mean valence and arousal responses over all three of its representations (three actor pairs) and over all participants. We used Matlab 2019b (www.mathworks.com) to investigate the relation between the (mean) valence and arousal ratings and to plot the data. The Curve Fitting Toolbox (version 3.5.7) in Matlab was used to compute a least-squares fit of a quadratic function to the data points.
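As an assumed Python equivalent of the Matlab Curve Fitting Toolbox computation (a sketch, not the authors' code), the least-squares quadratic fit and its adjusted R-squared can be obtained as follows:

```python
import numpy as np

def quadratic_fit_adj_r2(valence, arousal):
    """Least-squares quadratic fit of arousal on valence; returns the
    polynomial coefficients (highest degree first) and adjusted R-squared."""
    v = np.asarray(valence, dtype=float)
    a = np.asarray(arousal, dtype=float)
    coeffs = np.polyfit(v, a, deg=2)              # [c2, c1, c0]
    pred = np.polyval(coeffs, v)
    ss_res = np.sum((a - pred) ** 2)              # residual sum of squares
    ss_tot = np.sum((a - a.mean()) ** 2)          # total sum of squares
    n, p = len(v), 2                              # p predictors: v and v^2
    r2 = 1.0 - ss_res / ss_tot
    adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)
    return coeffs, adj_r2
```

A positive leading coefficient then corresponds to the U-shaped valence-arousal relation discussed in the Results.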
3 Results
Table 1 lists the median valence and arousal ratings obtained with the SAM and the EmojiGrid, for all touch events and for social and non-social events separately. First, we verified the results reported by Masson & Op de Beeck [40] for their SATED database.
A Mann-Whitney U test revealed that mean SAM valence ratings were indeed significantly higher for social touch scenarios classified in SATED as positive or pleasant (Mdn = 6.96, MAD = .33, n = 6) than for those classified as negative or unpleasant (Mdn = 3.11, MAD = .26, n = 6), U = 0, z = -2.89, p = .004. Also, mean SAM valence ratings for the positive social touch scenarios were indeed significantly higher than the corresponding ratings for object-based touch scenarios (Mdn = 5.12, MAD = .20, n = 6, U = 0, z = -2.89, p = .004), while mean SAM valence ratings for the negative social touch scenarios were significantly lower than the corresponding ratings for the object-based touch scenarios (Mdn = 4.04, MAD = .40, n = 6, U = 0, z = -2.88, p = .004).
Then, we compared the mean SAM arousal ratings between social touch and object touch. The results revealed that social touch was indeed overall rated as more arousing (Mdn = 4.80, MAD = .35) than object-based (non-social) touch (Mdn = 3.94, MAD = .18, U = 7.0, z = -3.75, p < .001), as reported by Masson & Op de Beeck [40]. Also, mean SAM arousal ratings for the positive social touch scenarios (Mdn = 4.55, MAD = .25) were significantly higher than the corresponding ratings for the object-based touch scenarios (Mdn = 3.76, MAD = .13, n = 6, U = 0, z = -2.88, p = .004), while mean SAM arousal ratings for the negative social touch scenarios (Mdn = 4.94, MAD = .21) were also significantly higher than the corresponding ratings for the object-based touch scenarios (Mdn = 4.26, MAD = .18, n = 6, U = 1, z = -2.72, p = .006). Thus, social touch appears more arousing than object-based (non-social) touch, suggesting that interpersonal touch is experienced as more intense. Summarizing, our current findings fully confirm the corresponding results reported previously by Masson & Op de Beeck [40].
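The comparisons above are standard two-sample Mann-Whitney U tests. They were run in SPSS; a scipy sketch with made-up per-scenario means (illustrative values only, not the actual SATED ratings) would look like this:

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical mean SAM valence ratings per scenario (n = 6 per group);
# these numbers are illustrative, not the actual data from this study.
positive_social = [6.5, 6.8, 7.0, 7.1, 6.9, 7.3]
negative_social = [2.9, 3.0, 3.1, 3.2, 3.3, 2.8]

u, p = mannwhitneyu(positive_social, negative_social,
                    alternative='two-sided')
# Complete separation of the two groups yields the extreme U statistic
# (0 or n1*n2 = 36, depending on the scipy version's convention).
```

With such small groups, scipy computes an exact p value when there are no ties; SPSS reports the normal-approximation z statistic, which is why the paper lists both U and z.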
To quantify the agreement between the SAM ratings obtained in this study and those reported by Masson & Op de Beeck [40], we computed intraclass correlation coefficient (ICC) estimates with their 95% confidence intervals for the mean valence and arousal ratings between both studies. For valence we find an ICC value of .98 [.96, .99], indicating excellent reliability. For arousal, the ICC value is .59 [.07, .82], indicating moderate reliability.
Next, we investigated the agreement between the ratings obtained in this study with the SAM and with the EmojiGrid self-report tools.
Fig 5 shows the correlation plots between the mean subjective valence and arousal ratings obtained with the EmojiGrid and with the SAM. The figure shows that the mean valence ratings for all touch events closely agree between both tools, and that the original classification of the social touch scenarios by Masson & Op de Beeck [40] into positive (scenarios 1–6), negative (scenarios 8–13) and neutral (scenario 7) scenarios also holds in our results. For social touch, the EmojiGrid appears more sensitive to arousal than the SAM: over the same valence range, the variation in mean arousal ratings is larger for the EmojiGrid than for the SAM.
The relationship between the valence (left) and arousal (right) ratings provided with a 9-point SAM scale and with the EmojiGrid. The numbers correspond to the original scenario identifiers in the SATED database [40]. The social touch scenarios (red labels) 1–6, 7 and 8–13 were originally classified as positive, neutral and negative, and the object touch scenarios (blue labels) 14–25 as neutral.
Fig 6 shows the relation between the mean valence and arousal ratings for all 25 different SATED scenarios, as measured with the 9-point SAM scale and with the EmojiGrid. The curves represent least-squares quadratic fits to the data points. The adjusted R-squared values are .74 and .88, respectively, indicating good fits. This figure shows that the relation between the mean valence and arousal ratings obtained with both self-assessment methods (SAM and EmojiGrid) is closely described by a quadratic (U-shaped) relation at the nomothetic (group) level: touch events scoring near neutral on mean valence have the lowest mean arousal ratings, while touch events scoring either high (pleasant) or low (unpleasant) on mean valence show higher mean arousal ratings. For increasing absolute valence (absolute difference from neutral valence) the EmojiGrid appears increasingly sensitive to variations in arousal (the length of the line segments in Fig 6 systematically increases with increasing absolute valence).
Mean valence and arousal ratings for affective touch video clips from the SATED database, obtained with a 9-point SAM rating scale (blue dots) and with the EmojiGrid (red dots). The curves represent fitted polynomial curves of degree 2, obtained by least-squares regression of arousal on valence. The line segments connect ratings for the same video clips.
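The systematic increase in segment length with absolute valence can also be quantified. As an illustrative analysis (not one performed in the paper), one could compute each scenario's SAM-to-EmojiGrid distance in valence-arousal space and correlate it with absolute valence; a positive Spearman correlation would support the observation that the two tools diverge most toward the valence extremes:

```python
import numpy as np
from scipy.stats import spearmanr

def sensitivity_gap(val_sam, aro_sam, val_emo, aro_emo, neutral=5.0):
    """Per-scenario Euclidean distance between the SAM and EmojiGrid
    ratings (the segment lengths in Fig 6), and its Spearman correlation
    with absolute valence (distance from the neutral scale midpoint)."""
    d = np.hypot(np.asarray(val_emo, float) - np.asarray(val_sam, float),
                 np.asarray(aro_emo, float) - np.asarray(aro_sam, float))
    abs_val = np.abs(np.asarray(val_sam, float) - neutral)
    rho, p = spearmanr(abs_val, d)
    return d, rho, p
```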
To quantify the agreement between the ratings obtained with the 9-point SAM scales and with the EmojiGrid we computed Intraclass Correlation Coefficient (ICC) estimates with their 95% confidence intervals for the mean valence and arousal ratings obtained with both tools, for all touch events and for social and non-social events separately (see Table 2). The valence ratings show excellent reliability in all cases. The arousal ratings show good reliability for all touch events and for object-based touch events, and moderate reliability for social touch events.
4 Conclusion and discussion
The valence ratings obtained with the EmojiGrid show excellent agreement with valence ratings obtained with a 9-point SAM scale, both for inter-human (social) and object-based touch events. The arousal ratings obtained with the EmojiGrid show good agreement with arousal ratings obtained with a 9-point SAM scale, both overall and for object-based touch events. However, for social touch events the arousal ratings show only moderate agreement. This results from the fact that the sensitivity of the EmojiGrid appears to increase with increasing absolute valence (deviation from neutral). The ratings obtained with the SAM tool appear to show a ceiling effect, possibly because participants hesitate to use the extreme ‘explosion’ icons near the higher end of the SAM arousal scale, while they are still comfortable using the more extreme facial expressions of the emoji on the arousal axis of the EmojiGrid.
Our results also replicate the U-shaped (quadratic) relation between the mean valence and arousal ratings, as reported in the literature [40].
Summarizing, we conclude that the EmojiGrid is a valid graphical self-report instrument for the affective appraisal of perceived social touch events, one that appears more sensitive to variations in arousal than the SAM at more extreme levels of valence. A previous study found that slow brush stroking on the forearm and the palm elicited significantly different brain activations, while individual VAS pleasantness and intensity scales were not sufficiently sensitive to distinguish between these two types of touch [69]. Since the EmojiGrid combines both valence and arousal, it would be interesting to investigate whether this instrument is sensitive enough to differentiate between these kinds of touch. However, more studies using different emotion elicitation protocols are required to fully assess the validity of this new tool.
A limitation of this study is that we did not investigate the affective appraisal of real touch events. However, given the ability of people to imagine how an observed touch would feel [61], we expect that the EmojiGrid will also be useful as a self-report tool for the affective appraisal of experienced touch events. Further research is needed to test this hypothesis.
Supporting information
S1 File. Mean and SD values measured with SAM and EmojiGrid for each of the 25 different dynamic touch events.
https://doi.org/10.1371/journal.pone.0237873.s001
(XLSX)
References
- 1. Drewing K, Weyel C, Celebi H, Kaya D. Systematic relations between affective and sensory material dimensions in touch. IEEE Transactions on Haptics. 2018;11(4):611–22. pmid:29994318
- 2. Essick GK, McGlone F, Dancer C, Fabricant D, Ragin Y, Phillips N, et al. Quantitative assessment of pleasant touch. Neuroscience & Biobehavioral Reviews. 2010;34(2):192–203.
- 3. Kirsch LP, Krahé C, Blom N, Crucianelli L, Moro V, Jenkinson PM, et al. Reading the mind in the touch: Neurophysiological specificity in the communication of emotions by touch. Neuropsychologia. 2018;116:136–49. pmid:28572007
- 4. McGlone F, Vallbo AB, Olausson H, Loken L, Wessberg J. Discriminative touch and emotional touch. Canadian Journal of Experimental Psychology. 2007;61(3):173–83. pmid:17974312
- 5. Yu J, Yang J, Yu Y, Wu Q, Takahashi S, Ejima Y, et al. Stroking hardness changes the perception of affective touch pleasantness across different skin sites. Heliyon. 2019;5(8):e02141. pmid:31453390
- 6. Morrison I. ALE meta-analysis reveals dissociable networks for affective and discriminative aspects of touch. Human Brain Mapping. 2016;37(4):1308–20. pmid:26873519
- 7. Morrison I, Löken L, Olausson H. The skin as a social organ. Experimental Brain Research. 2010;204(3):305–14. pmid:19771420
- 8. Gallace A, Spence C. The science of interpersonal touch: An overview. Neuroscience & Biobehavioral Reviews. 2010;34(2):246–59. pmid:18992276
- 9. Jones SE, Yarbrough AE. A naturalistic study of the meanings of touch. Communication Monographs. 1985;52(1):19–56.
- 10. Knapp ML, Hall JA. Nonverbal communication in human interaction (7th ed.). Boston, MA, USA: Wadsworth, CENGAGE Learning; 2010.
- 11. Hertenstein MJ, Verkamp JM, Kerestes AM, Holmes RM. The communicative functions of touch in humans, nonhuman primates, and rats: A review and synthesis of the empirical research. Genetic, Social, and General Psychology Monographs. 2006;132(1):5–94. pmid:17345871
- 12. Erceau D, Guéguen N. Tactile contact and evaluation of the toucher. The Journal of Social Psychology. 2007;147(4):441–4. pmid:17955753
- 13. Crusco AH, Wetzel CG. The Midas Touch: The effects of interpersonal touch on restaurant tipping. Personality and Social Psychology Bulletin. 1984;10(4):512–7.
- 14. Field T. Touch for socioemotional and physical well-being: A review. Developmental Review. 2010;30(4):367–83.
- 15. McGlone F, Wessberg J, Olausson H. Discriminative and affective touch: Sensing and feeling. Neuron. 2014;82(4):737–55. pmid:24853935
- 16. Gordon I, Voos AC, Bennett RH, Bolling DZ, Pelphrey KA, Kaiser MD. Brain mechanisms for processing affective touch. Human Brain Mapping. 2013;34(4):914–22. pmid:22125232
- 17. Bjornsdotter M, Gordon I, Pelphrey K, Olausson H, Kaiser M. Development of brain mechanisms for processing affective touch. Frontiers in Behavioral Neuroscience. 2014;8(24). pmid:24478655
- 18. Jönsson EH, Kotilahti K, Heiskala J, Wasling HB, Olausson H, Croy I, et al. Affective and non-affective touch evoke differential brain responses in 2-month-old infants. NeuroImage. 2018;169:162–71. pmid:29242105
- 19. Morrison I, Björnsdotter M, Olausson H. Vicarious responses to social touch in posterior insular cortex are tuned to pleasant caressing speeds. The Journal of Neuroscience. 2011;31(26):9554–62. pmid:21715620
- 20. Löken LS, Wessberg J, Morrison I, McGlone F, Olausson H. Coding of pleasant touch by unmyelinated afferents in humans. Nat Neurosci. 2009;12(5):547–8. pmid:19363489
- 21. Ackerley R, Wasling HB, Liljencrantz J, Olausson H, Johnson RD, Wessberg J. Human C-tactile afferents are tuned to the temperature of a skin-stroking caress. The Journal of Neuroscience. 2014;34(8):2879–83. pmid:24553929
- 22. Vrontou S, Wong AM, Rau KK, Koerber HR, Anderson DJ. Genetic identification of C fibres that detect massage-like stroking of hairy skin in vivo. Nature. 2013;493(7434):669–73. pmid:23364746
- 23. Olausson H, Wessberg J, Morrison I, McGlone F, Vallbo Å. The neurophysiology of unmyelinated tactile afferents. Neuroscience & Biobehavioral Reviews. 2010;34(2):185–91. pmid:18952123
- 24. Gentsch A, Panagiotopoulou E, Fotopoulou A. Active interpersonal touch gives rise to the social softness illusion. Current Biology. 2015;25(18):2392–7. pmid:26365257
- 25. App B, McIntosh DN, Reed CL, Hertenstein MJ. Nonverbal channel use in communication of emotion: How may depend on why. Emotion. 2011;11(3):603–17. pmid:21668111
- 26. van Erp JBF, Toet A, Janssen JB. Uni-, bi- and tri-modal warning signals: Effects of temporal parameters and sensory modality on perceived urgency. Safety Science. 2015;72(0):1–8.
- 27. Yoo Y, Yoo T, Kong J, Choi S. Emotional responses of tactile icons: Effects of amplitude, frequency, duration, and envelope. 2015 IEEE World Haptics Conference (WHC); 22–26 June 2015; Evanston, IL, USA; 2015. p. 235–40. https://doi.org/10.1109/WHC.2015.7177719
- 28. Hasegawa H, Okamoto S, Ito K, Yamada Y. Affective vibrotactile stimuli: Relation between vibrotactile parameters and affective responses. International Journal of Affective Engineering. 2019; Advance online publication. https://doi.org/10.5057/ijae.IJAE-D-18-00008
- 29. van Erp JBF, Krom BN, Toet A. Enhanced teleoperation through body ownership. Frontiers in Robotics and AI. 2020; In press. https://doi.org/10.3389/frobt.2020.00014
- 30. van Erp JBF, Kyung K-U, Kassner S, Carter J, Brewster S, Weber G, et al., editors. Setting the standards for haptic and tactile interactions: ISO's work. 2010; Vol. LNCS 6192, Vol. II; Heidelberg, Germany: Springer.
- 31. van Erp JBF, Toet A. How to touch humans: Guidelines for social agents and robots that can touch. The 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction; 2–5 Sept. 2013; Geneva, Switzerland: IEEE; 2013. p. 780–5. https://doi.org/10.1109/ACII.2013.77145
- 32. Huisman G. Social touch technology: A survey of haptic technology for social touch. IEEE Transactions on Haptics. 2017;10(3):391–408. pmid:28092577
- 33. Erk SM, Toet A, van Erp JBF. Effects of mediated social touch on affective experiences and trust. PeerJ. 2015;3(e1297):1–35. pmid:26557429
- 34. Willemse CJAM, Toet A, van Erp JBF. Affective and behavioral responses to robot-initiated social touch: Toward understanding the opportunities and limitations of physical contact in human–robot interaction. Frontiers in ICT. 2017;4(12).
- 35. van Erp JBF, Toet A. Social touch in human computer interaction. Frontiers in Digital Humanities. 2015;2(Article 2):1–13.
- 36. Russell JA. A circumplex model of affect. Journal of Personality and Social Psychology. 1980;39(6):1161–78.
- 37. Salminen K, Surakka V, Lylykangas J, Raisamo R, Saarinen R, Raisamo R, et al. Emotional and behavioral responses to haptic stimulation. SIGCHI Conference on Human Factors in Computing Systems (CHI'08); April 5–10, 2008; Florence, Italy. New York, NY, USA: ACM; 2008. p. 1555–62. https://doi.org/10.1145/1357054.1357298
- 38. Huisman G, Frederiks AD, van Erp JBF, Heylen DKJ. Simulating affective touch: Using a vibrotactile array to generate pleasant stroking sensations. In: Bello F, Kajimoto H, Visell Y, editors. Haptics: Perception, Devices, Control, and Applications: 10th International Conference, EuroHaptics 2016; Vol. Part II; July 4–7, 2016; London, UK. Cham: Springer International Publishing; 2016. p. 240–50. https://doi.org/10.1007/978-3-319-42324-1_24
- 39. Bradley MM, Lang PJ. Measuring emotion: the Self-Assessment Manikin and the semantic differential. Journal of Behavior Therapy and Experimental Psychiatry. 1994;25(1):49–59. pmid:7962581
- 40. Lee Masson H, Op de Beeck H. Socio-affective touch expression database. PLOS ONE. 2018;13(1):e0190921. pmid:29364988
- 41. Hayashi ECS, Gutiérrez Posada JE, Maike VRML, Baranauskas MCC. Exploring new formats of the Self-Assessment Manikin in the design with children. 15th Brazilian Symposium on Human Factors in Computer Systems; October 04–07, 2016; São Paulo, Brazil. New York, NY, USA: ACM; 2016. p. 1–10. https://doi.org/10.1145/3033701.3033728
- 42. Yusoff YM, Ruthven I, Landoni M. Measuring emotion: A new evaluation tool for very young children. 4th Int Conf on Computing and Informatics (ICOCI 2013); Aug 28–30, 2013; Kuching, Sarawak, Malaysia. Sarawak, Malaysia: Universiti Utara Malaysia; 2013. p. 358–63. http://repo.uum.edu.my/id/eprint/12040.
- 43. Broekens J, Brinkman WP. AffectButton: A method for reliable and valid affective self-report. Int J Hum Comput Stud. 2013;71(6):641–67.
- 44. Betella A, Verschure PFMJ. The Affective Slider: A digital self-assessment scale for the measurement of human emotions. PLOS ONE. 2016;11(2):e0148037. pmid:26849361
- 45. Chen Y, Gao Q, Lv Q, Qian N, Ma L. Comparing measurements for emotion evoked by oral care products. Int J Ind Ergonom. 2018;66:119–29.
- 46. Toet A, Kaneko D, Ushiama S, Hoving S, de Kruijf I, Brouwer A-M, et al. EmojiGrid: A 2D pictorial scale for the assessment of food elicited emotions. Front Psychol. 2018;9:2396. pmid:30546339
- 47. Price CIM, Curless RH, Rodgers H. Can stroke patients use visual analogue scales? Stroke. 1999;30(7):1357–61. pmid:10390307
- 48. Croy I, Sehlstedt I, Wasling HB, Ackerley R, Olausson H. Gentle touch perception: From early childhood to adolescence. Developmental Cognitive Neuroscience. 2019;35:81–6. pmid:28927641
- 49. Benaim C, Froger J, Cazottes C, Gueben D, Porte M, Desnuelle C, et al. Use of the Faces Pain Scale by left and right hemispheric stroke patients. Pain. 2007;128(1):52–8. https://doi.org/10.1016/j.pain.2006.08.029.
- 50. Jäger R. Construction of a rating scale with smilies as symbolic labels [Konstruktion einer Ratingskala mit Smilies als symbolische Marken]. Diagnostica. 2004;50(1):31–8.
- 51. Schneider OS, Seifi H, Kashani S, Chun M, MacLean KE. HapTurk: Crowdsourcing affective ratings of vibrotactile icons. 2016 CHI Conference on Human Factors in Computing Systems; San Jose, California, USA. New York, NY, USA: ACM; 2016. p. 3248–60. https://doi.org/10.1145/2858036.2858279
- 52. Toet A, van Erp JBF. The EmojiGrid as a tool to assess experienced and perceived emotions. Psych. 2019;1(1):469–81.
- 53. Toet A, Heijn F, Brouwer A-M, Mioch T, van Erp JBF. The EmojiGrid as an immersive self-report tool for the affective assessment of 360 VR videos. In: Bourdot P, Interrante V, Nedel L, Magnenat-Thalmann N, Zachmann G, editors. EuroVR 2019: Virtual Reality and Augmented Reality;Vol LNCS 11883; 23–25 October, 2019; Tallinn, Estonia: Springer International Publishing; 2019. p. 330–5. https://doi.org/10.1007/978-3-030-31908-3_24.
- 54. Kaneko D, Toet A, Ushiama S, Brouwer AM, Kallen V, van Erp JBF. EmojiGrid: a 2D pictorial scale for cross-cultural emotion assessment of negatively and positively valenced food. Food Res Int. 2018;115:541–51. pmid:30599977
- 55. Mayo LM, Lindé J, Olausson H, Heilig M, Morrison I. Putting a good face on touch: Facial expression reflects the affective valence of caress-like touch across modalities. Biological Psychology. 2018;137:83–90. pmid:30003943
- 56. Lee Masson H, Van De Plas S, Daniels N, Op de Beeck H. The multidimensional representational space of observed socio-affective touch experiences. NeuroImage. 2018;175:297–314. pmid:29627588
- 57. Yoo S-S, Freeman DK, McCarthy JJ, Jolesz FA. Neural substrates of tactile imagery: a functional MRI study. NeuroReport. 2003;14(4):581–5. pmid:12657890
- 58. Lucas MV, Anderson LC, Bolling DZ, Pelphrey KA, Kaiser MD. Dissociating the neural correlates of experiencing and imagining affective touch. Cerebral Cortex. 2014;25(9):2623–30. pmid:24700583
- 59. Blakemore S-J, Bristow D, Bird G, Frith C, Ward J. Somatosensory activations during the observation of touch and a case of vision–touch synaesthesia. Brain. 2005;128(7):1571–83. pmid:15817510
- 60. Lovero KL, Simmons AN, Aron JL, Paulus MP. Anterior insular cortex anticipates impending stimulus significance. NeuroImage. 2009;45(3):976–83. pmid:19280711
- 61. Keysers C, Gazzola V. Expanding the mirror: vicarious activity for actions, emotions, and sensations. Current Opinion in Neurobiology. 2009;19(6):666–71. pmid:19880311
- 62. Ebisch SJH, Perrucci MG, Ferretti A, Gratta CD, Romani GL, Gallese V. The sense of touch: Embodied simulation in a visuotactile mirroring mechanism for observed animate or inanimate touch. Journal of Cognitive Neuroscience. 2008;20(9):1611–23. pmid:18345991
- 63. Keysers C, Wicker B, Gazzola V, Anton J-L, Fogassi L, Gallese V. A touching sight: SII/PV Activation during the observation and experience of touch. Neuron. 2004;42(2):335–46. pmid:15091347
- 64. Willemse CJAM, Huisman G, Jung MM, van Erp JBF, Heylen DKJ. Observing touch from video: The influence of social cues on pleasantness perceptions. In: Bello F, Kajimoto H, Visell Y, editors. Haptics: Perception, Devices, Control, and Applications: 10th International Conference (EuroHaptics 2016);Vol Part II; July 4–7, 2016; London, UK: Springer International Publishing; 2016. p. 196–205. https://doi.org/10.1007/978-3-319-42324-1_20
- 65. World Medical Association. World Medical Association declaration of Helsinki: Ethical principles for medical research involving human subjects. Journal of the American Medical Association. 2013;310(20):2191–4. pmid:24141714
- 66. Anwyl-Irvine A, Massonnié J, Flitton A, Kirkham N, Evershed J. Gorilla in our Midst: An online behavioral experiment builder. bioRxiv. 2019;438242.
- 67. Koo TK, Li MY. A guideline of selecting and reporting intraclass correlation coefficients for reliability research. Journal of Chiropractic Medicine. 2016;15(2):155–63. pmid:27330520
- 68. Shrout PE, Fleiss JL. Intraclass correlations: Uses in assessing rater reliability. Psychol Bull. 1979;86(2):420–8. pmid:18839484
- 69. McGlone F, Olausson H, Boyle JA, Jones-Gotman M, Dancer C, Guest S, et al. Touching and feeling: differences in pleasant touch processing between glabrous and hairy skin in humans. European Journal of Neuroscience. 2012;35(11):1782–8. pmid:22594914