The distance individuals maintain between themselves and others can be defined as ‘interpersonal space’. This distance is modulated both by situational factors and by individual characteristics. Here we investigated the influence that the interpretation of other people’s interactions, in which one is not directly involved, may have on a person’s interpersonal space. In the current study we measured, for the first time, whether the size of interpersonal space changes after listening to other people’s conversations with neutral or aggressive content. The results showed that interpersonal space expands after listening to a conversation with aggressive content relative to a conversation with neutral content. This finding suggests that participants tend to distance themselves from an aggressive confrontation even when they are not involved in it. These results are in line with the view of interpersonal space as a safety zone surrounding one’s body.
Citation: Vagnoni E, Lewis J, Tajadura-Jiménez A, Cardini F (2018) Listening to a conversation with aggressive content expands the interpersonal space. PLoS ONE 13(3): e0192753. https://doi.org/10.1371/journal.pone.0192753
Editor: Suliann Ben Hamed, Centre de neuroscience cognitive, FRANCE
Received: September 28, 2017; Accepted: January 30, 2018; Published: March 28, 2018
Copyright: © 2018 Vagnoni et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: All relevant data are within the paper and its Supporting Information files.
Funding: ATJ was supported by RYC-2014-15421 and PSI2016-79004-R (“MAGIC SHOES: Changing sedentary lifestyles by altering mental body-representation using sensory feedback”; AEI/FEDER, UE), Ministerio de Economía, Industria y Competitividad of Spain.
Competing interests: The authors have declared that no competing interests exist.
The space close to our body is particularly important given that it is where we physically interact with stimuli in the external world. Several disciplines have investigated this space using different paradigms and terminology. In social psychology, ‘personal space’ is often used to define the emotionally-tinged zone around the human body that people experience as ‘their space’ and which others cannot intrude upon without causing discomfort. This definition suggests that personal space only exists during interaction with other people. Hence, the terms ‘interpersonal space’ (IPS) and ‘interpersonal distance’ (IPD), which define the space/distance individuals maintain between themselves and others, are often used as synonyms of ‘personal space’ [4, 5]. In the cognitive neuroscience tradition, instead, this space has been referred to as ‘peripersonal space’ (PPS) and has been defined as the space immediately surrounding our body. The discrete coding of peripersonal space in the brain was first revealed by single-cell recordings in monkeys, within a network of interconnected sensorimotor areas [7, 8, 9, 10, 11]. Later, neuroimaging and neurophysiological studies identified a similar fronto-parietal network in humans [12, 13, 14, 15, 16, 17]. Cognitive neuroscience studies have focused mainly on two interpretations of PPS: a space for action and a defensive space. According to the first interpretation, the PPS is where goal-directed actions occur and where objects can be grasped and manipulated, whereas objects located beyond this space cannot be [18, 19]. According to the second interpretation, the PPS is the space that we maintain between our body and dangerous objects, like a protective bubble that keeps a margin of safety around the body surface and coordinates defensive behaviours against potentially dangerous stimuli [20, 21, 22].
Recently, a dual model of PPS has been proposed, based on a functional distinction between a defensive and a goal-directed action space. According to this model, the two functions of PPS require distinct sensory and motor processes. For example, the goal-directed PPS generally requires finer and more controlled motor actions than the defensive PPS, whereas defensive actions are frequently, though not always, automatic.
Moreover, different factors seem to influence the two types of PPS, and sometimes the same factor modulates them in opposite ways. For example, whereas anxiety expands the defensive PPS [24, 25], it shrinks the goal-directed PPS. Indeed, in the first case the safety bubble around us becomes bigger, and objects that are normally considered innocuous, because they are far from our body, are instead treated as being within our PPS boundaries. In the studies mentioned, the ‘size’ of the PPS was measured with the line bisection paradigm and the hand-blink reflex (HBR). The line bisection paradigm measures the bias in a visual bisection task. When bisecting horizontal lines close to the body, observers show a slight leftward bias that shifts rightward when the line is presented in far space (see for review). Lourenco and colleagues found that subjects with a high level of claustrophobia showed a more gradual rightward shift over distance. The authors interpreted this result as evidence of a larger PPS representation due to claustrophobia-related anxiety. In the hand-blink reflex (HBR) paradigm, instead, participants receive a stimulation of the median nerve that produces an eye-blink reflex when the hand is located close to the face [29, 30]. With this paradigm the authors showed that in more anxious individuals the “safety margin” is located further away from the body than in less anxious individuals. In this case the boundaries of PPS are measured via the strength of the HBR in relation to the position of the stimulated hand relative to the face.
Conversely, anxiety seems to reduce the size of our goal-directed PPS, as it has been shown that anxious people perceive themselves as less able to perform a movement. In that study, participants were asked to judge whether they were able to reach an object on a table. When anxiety was experimentally induced, participants underestimated their ability to reach for the (inoffensive) objects on the table.
Both the social psychology and the cognitive neuroscience literature have shown that people’s mental representation of the space around their body is not fixed. Experimental evidence from cognitive neuroscience studies showed an expansion of the PPS representation after tool use: when, through a tool, people act upon far space, their representation of near and far space changes, with the far space being remapped as near space [31, 32, 33]. This effect was first described in monkeys: Iriki and colleagues analyzed the responses of neurons in the post-central gyrus of the monkey after training the animal to reach food with a rake. Interestingly, after the training, neurons in the post-central gyrus responded to visual stimulation in the monkey’s extrapersonal space. According to the authors, the visual receptive fields of neurons representing the PPS expanded following tool use. Similarly, in humans, Canzoneri and colleagues showed that a brief training with a tool induces plastic changes both in the representation of the body part using the tool and in the PPS. Interestingly, it is not only the hand-centered PPS that expands after training with tools. Indeed, Galli and colleagues have shown the effects of a special tool, the wheelchair, in extending the action possibilities of the whole body. Although the hand-centered PPS is the most investigated, there is evidence of at least three body-part-specific PPS representations (face-, hand- and trunk-centered) that differ in extension and directional tuning. These experiments used several versions of a well-validated bimodal paradigm [35, 36, 37, 38, 39]. In this paradigm, sounds approaching the participant’s body are presented while tactile stimuli are delivered to the participant’s hand at several time delays. Specifically, the tactile stimulus is delivered when the sound is perceived at several distances from the body.
The participants are asked to respond as quickly as possible to the tactile stimulus while ignoring the approaching sound. It has been shown that reaction times to the tactile stimuli are modulated by the simultaneous presentation of the to-be-ignored sound: reaction times become progressively faster as the sound is perceived closer to the body. This paradigm makes it possible to identify the PPS boundaries and to quantify their variation due to various factors (e.g., after tool use, as in). Moreover, the same paradigm has been used to show how multisensory inputs, even outside of awareness, are integrated within the PPS.
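In studies using this bimodal paradigm, the PPS boundary is typically estimated by fitting a sigmoid to tactile reaction times as a function of the sound’s perceived distance, with the inflection point taken as the boundary. The following is a minimal sketch of that analysis; the distances and RT values below are invented for illustration and are not data from the cited studies.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(d, rt_near, rt_far, boundary, slope):
    # Tactile RT as a function of sound distance d: fast when the sound
    # is near the body, slow in far space; the inflection point
    # ('boundary') is taken as the estimated PPS extent.
    return rt_near + (rt_far - rt_near) / (1.0 + np.exp(-(d - boundary) / slope))

# Hypothetical mean tactile RTs (ms) at six sound distances (cm)
distances = np.array([15.0, 30.0, 45.0, 60.0, 75.0, 90.0])
rts = np.array([310.0, 315.0, 330.0, 355.0, 362.0, 365.0])

params, _ = curve_fit(sigmoid, distances, rts,
                      p0=[300.0, 370.0, 50.0, 5.0], maxfev=10000)
pps_boundary_cm = params[2]  # distance at which RTs transition from fast to slow
```

With data shaped like the above, the fitted inflection point falls roughly midway between the near (fast) and far (slow) plateaus.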
The expansion of PPS after tool use relates to the interpretation of PPS as the space where goal-directed actions occur. The PPS representation seems to expand also in the presence of unpleasant or threatening stimuli, which relates to the interpretation of PPS as a protective space: when a threatening stimulus is approaching our body, we expand our safety zone [41, 42, 43, 44]. It has been suggested that coding a dangerous stimulus as inside our safety zone earlier than a non-dangerous one has an adaptive advantage, given that it allows more time to engage in a defensive response.
Recently, the influence of social interaction on PPS has also been investigated in cognitive neuroscience studies. Indeed, it has been shown that PPS boundaries shrink when subjects sit in front of another person, as compared to a mannequin, placed in far space. Interestingly, after playing an economic game with another person, PPS boundaries between self and other merge when performing a task together, but only if the other person behaves cooperatively. Moreover, a recent study demonstrated that shared sensory experiences between two people, induced by interpersonal multisensory stimulation, not only increase the remapping of the other’s sensory experiences onto the participant’s own body but also alter the way in which PPS is represented. In particular, using the bimodal paradigm previously described [35, 36, 37], Maister and colleagues showed a significant increase in audio-tactile integration in the space close to the confederate’s body after the shared experience. These results suggest that sharing multisensory experiences can induce a remapping of the other’s PPS onto our own PPS. In relation to shared PPS, Brozzoli and colleagues have identified neuronal populations in the human ventral premotor cortex that encode the space near both one’s own hand and another person’s hand. This suggests that we use a common spatial reference frame to code sensory events, actions and cognitive processing happening within the shared PPS.
Social psychology studies have shown that people tend to react to spatial violations by increasing the distance from intruders when they feel they are in hostile and uncomfortable situations and, vice versa, by reducing the distance when they feel they are in friendly and comfortable situations [49, 21]. A typical task to assess the size of IPS is based on comfort-distance judgments provided through the ‘stop-distance’ paradigm: participants are required to stop a person walking towards them when they start to feel uncomfortable with the other’s proximity (passive ‘stop-distance’ task; [2, 50, 51, 52, 53, 54]) or have to walk towards a person and stop when they start to feel uncomfortable with the other’s proximity (active ‘stop-distance’ task; [33, 54, 55, 56]). With this paradigm, it has been shown that the size of IPS is modulated by situational, emotional and individual characteristics [2, 51, 57]. For example, listening to positive, as compared to negative, emotion-inducing music reduces the representation of IPS, allowing others to come closer to us. Moral information about confederates modulates the IPS as well: participants choose to increase the distance between themselves and an immorally described confederate, while they reduce the distance with a morally described confederate.
From an ‘action-centered’ perspective, IPS can be seen as the physical space where social interactions occur. Lloyd and Morrison suggested that the nature of social interactions, and the characteristics of the person we are interacting with, may affect IPS. In their study, they used fMRI to measure brain activity while participants viewed photographs in which one person either posed a potential threat to another (threat condition) or did not (non-threat condition). Crucially, the two people were depicted either close to or far from each other. The temporal–occipital junction, the extrastriate and fusiform cortices and the right superior parietal lobe (BA7), which are visuospatial areas, responded when the threatening person was close to the other person (here the authors used the term ‘personal space’), but not when the two people were distant from each other. From these results the authors concluded that higher-level visual cortices seem to play a role in distinguishing social categories based on a person’s features, e.g. how dangerous a person looks, and that it is not only the presence of a personified threat, but the spatial distance between the people interacting as well, that together influence an observer’s interpretation of the interaction. Moreover, the results showed that posterior parietal areas, which code the space surrounding one’s own body (see), responded when the individuals were closer, regardless of whether one of them was depicted as threatening. Thus, observing interactions in which one is not directly involved seems to influence one’s own IPS. The authors referred here to the term ‘eavesdropping’, which ethologists use to describe the process of gaining relevant information about an individual (such as status, aggression potential or sexual desirability) by observing him/her interacting with others.
This information allows one to prepare for action before a direct interaction takes place, which may be especially important in situations that pose a potential threat to one’s body.
The concept of ‘eavesdropping’ is particularly relevant to the present study. Indeed, we aimed to directly investigate the influence that the interpretation of other people’s interactions, in which one is not directly involved, may have on a person’s IPS. In particular, we hypothesized that the emotional content of an overheard conversation would modulate the participant’s IPS representation, even if the participant was not involved in that conversation. In this experiment, participants listened to two different conversations between two people: one conversation had an aggressive content and the other a neutral content. After listening to each conversation, the participants’ comfort IPS was measured using the ‘stop-distance’ paradigm, which was described above and has been widely used to investigate the IPS representation [2, 50, 51, 52, 53, 54]. Usually in this paradigm the participants actively approach, or are approached by, another person whom they are looking at. Here we decided to measure the IPS when no visual information about the other person was available to the participants. Indeed, no one stood in front of the participants; instead, they listened to a recording of the footstep sounds of a person walking towards them. They were asked to stop the recording as soon as they started feeling uncomfortable, that is, when the footsteps were perceived as too close to them. Interestingly, this modified version of the classic paradigm proved effective in measuring the IPS. Using the sound of approaching footsteps has the advantage of eliminating any interactions with, and confounds arising from, the idiosyncrasies of the person approaching (actively or passively) the participants. Moreover, the content of the conversation modulated the IPS, with participants stopping the footsteps recording earlier after listening to an aggressive conversation than after a neutral one.
Therefore, listeners seem to distance themselves more from someone approaching after a conflictual discussion.
The present research involved human participants and was approved by the local ethics committee (i.e., the Faculty Research Ethics Panel at Anglia Ruskin University) and was conducted according to the principles expressed in the Declaration of Helsinki. Written informed consent was obtained from all participants.
Thirty-three participants (21 female) between 19 and 30 years of age (mean age 21.7) took part in the experiment. They were members of the Anglia Ruskin University community and participated in the experiment in exchange for course credit. All participants reported normal or corrected-to-normal vision.
Stimuli, design, and procedure
A pair of binaural microphones (Core Sound, frequency response 20 Hz–20 kHz) and an audio recorder (ZOOM ZH4N Handy Portable Digital Recorder) were used to record the sound stimuli used in the experiment. Specifically, the sound of “approaching footsteps” (i.e., the sounds produced by a person walking towards the listener) was recorded in an empty, quiet and large room, 17.4 m in length, with a wooden floor. The recorder was placed at one end of the room and the “walker”, a female wearing hard-soled shoes, positioned herself at the opposite end. She was instructed to walk towards the recorder at a natural gait speed, keeping a constant pace. This produced an “approaching footsteps” recording lasting 42.59 seconds. In order to ascertain that the footsteps sound was perceived as approaching, we asked 20 participants, who did not take part in the main experiment, to listen to the recording and rate the direction of the sound on a Likert scale from -5 (“receding”) to +5 (“approaching”), with 0 as “walking in place”. Participants’ ratings were all 4 or above (median = 5, range = 4–5) and every participant reported that the sound was clearly perceived as approaching their body.
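The pilot summary statistics (median and range of the direction ratings) can be reproduced as follows; the individual ratings below are hypothetical values consistent with the reported summary (all 20 ratings were 4 or 5), since the raw pilot ratings are not listed in the text.

```python
import statistics

# Hypothetical pilot ratings on the -5 (receding) to +5 (approaching) scale;
# invented for illustration, but matching the reported summary (all >= 4)
ratings = [5, 5, 4, 5, 5, 4, 5, 5, 5, 4, 5, 5, 5, 4, 5, 5, 5, 5, 4, 5]

assert len(ratings) == 20 and min(ratings) >= 4
median_rating = statistics.median(ratings)   # 5
rating_range = (min(ratings), max(ratings))  # (4, 5)
```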
The two conversations were performed by two actors, one male and one female, both drama students at Anglia Ruskin University (both actors gave informed consent before being recorded). The conversations were not scripted; the actors improvised from selected topics: a first date, a catch-up between old friends, infidelity within a relationship and a drunken fight. Two researchers independently chose the best recording for each condition, and they agreed on ‘first date’ for the neutral condition and ‘drunken fight’ for the aggressive condition. The conversations were recorded in an empty and quiet corridor of the same length as the room where the footsteps were recorded; the recorder was placed at one end of the corridor and the actors positioned themselves at the opposite end. Each conversation audio clip lasted 165 seconds.
We merged each conversation audio clip with the “approaching footsteps” clip using Audacity 2.1.2, leaving a one-second gap between the end of the conversation and the beginning of the footsteps. This produced two experimental stimuli, one for the neutral condition and one for the aggressive condition. For each condition, the resulting audio clip lasted a total of 208.59 seconds: 165.00 seconds of conversation, 1 second of pause and 42.59 seconds of approaching footsteps (Fig 1). During the last 42.59 seconds, participants could either produce a response by stopping the recording, if they felt uncomfortable, or let the recording play until its end, if they did not.
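The stimulus construction (conversation + 1 s of silence + footsteps) can be sketched as a simple waveform concatenation. This is an illustrative sketch, not the actual Audacity workflow used in the study; the 44.1 kHz sample rate is an assumption, and the waveforms are silent placeholders with the reported durations.

```python
import numpy as np

SR = 44100  # assumed sample rate (Hz); not stated in the paper

def concat_with_gap(conversation, footsteps, gap_s=1.0, sr=SR):
    """Concatenate two mono waveforms with a silent gap in between."""
    gap = np.zeros(int(round(gap_s * sr)), dtype=conversation.dtype)
    return np.concatenate([conversation, gap, footsteps])

# Placeholder silent waveforms with the durations reported in the paper
conversation = np.zeros(int(round(165.00 * SR)), dtype=np.float32)
footsteps = np.zeros(int(round(42.59 * SR)), dtype=np.float32)

stimulus = concat_with_gap(conversation, footsteps)
total_s = stimulus.size / SR  # 165.00 + 1.00 + 42.59 = 208.59 s
```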
The recording was created by concatenating two recordings (conversation and footsteps), leaving a 1-second silent gap in between. The conversation recording lasted 165.00 seconds (in both the aggressive and the control condition). This was followed by a silent pause lasting 1 second, after which the footsteps recording started. The footsteps recording lasted 42.59 seconds; this was the time window within which participants could respond.
The stimuli were presented through a tablet (Acer Aspire Switch 10) using Audacity 2.1.2. Participants were asked to wear a blindfold and noise-cancelling headphones. All participants listened to both the aggressive and the neutral conversation, in counterbalanced order. The sound level was approximately 65 dBA. After each conversation ended, the participants heard footsteps approaching them. They were asked to press the keyboard key “P” when they felt that the footsteps were too close and started to make them feel uncomfortable. If the footsteps did not make them feel uncomfortable, they could simply let the recording play until its end.
Results and discussion
We subtracted the participants’ response time (the moment they pressed the keyboard key) from the total duration of the recording (208.59 s), so that higher values indicate that the participants stopped the recording sooner (S1 File). In the aggressive condition (M = 6.96 s, SE = 1.13 s) participants stopped the recording, on average, 201.63 seconds into the clip, while in the neutral condition (M = 4.55 s, SE = 1.06 s) they stopped it 204.04 seconds into the clip (Fig 2). In other words, since the response window was the 42.59 seconds of approaching footsteps, the footsteps were stopped after 35.63 seconds in the aggressive condition and after 38.04 seconds in the neutral condition, showing that participants started feeling uncomfortable sooner in the former (after listening to an aggressive conversation) than in the latter (after the neutral conversation).
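Given the reported durations, the conversion from the dependent measure (seconds of footsteps remaining when the clip was stopped) to seconds elapsed since the footsteps began is straightforward:

```python
TOTAL_S = 208.59      # full clip duration (s), as reported
FOOTSTEPS_S = 42.59   # footsteps/response window duration (s), as reported

def seconds_into_footsteps(remaining_s):
    """Convert the dependent measure (time left in the clip when it was
    stopped) into time elapsed since the footsteps began."""
    return FOOTSTEPS_S - remaining_s

aggressive = seconds_into_footsteps(6.96)  # ~35.63 s into the footsteps
neutral = seconds_into_footsteps(4.55)     # ~38.04 s into the footsteps
difference = neutral - aggressive          # ~2.41 s sooner in the aggressive condition
```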
The right panel shows how the participants’ responses translate into the spatial domain.
A Kolmogorov–Smirnov test was used to test the data for normality. RTs in both conditions were not normally distributed (p < .05); therefore, we used non-parametric statistical tests to analyze the data.
As all participants listened to both the aggressive and the neutral conversation, in counterbalanced order, we first ran a Kruskal–Wallis test to compare the RTs in each condition (neutral and aggressive conversation) across the two groups of participants (i.e., those who listened to the neutral conversation first vs. those who listened to the aggressive conversation first), in order to test whether the order of presentation had any impact on the RTs. Results showed that RTs in the neutral conversation condition did not significantly differ between the two groups: χ2(1, N = 33) = 1.13, p = .28. Similarly, RTs in the aggressive conversation condition did not significantly differ between the two groups: χ2(1, N = 33) = .29, p = .58. Therefore, the type of conversation that participants listened to first did not affect the time at which they interrupted the “approaching footsteps” clip.
Second, we ran a Friedman test to compare the RTs after the two conversations (neutral vs. aggressive). Results showed that participants stopped the “approaching footsteps” clip earlier after listening to the aggressive conversation (M = 6.96 s, SE = 1.13 s) than after listening to the neutral conversation (M = 4.55 s, SE = 1.06 s): χ2(1, N = 33) = 8.53, p = .003. These results show that the mere sound of approaching footsteps, instead of visual information about another person, can be used to measure the IPS with the ‘stop-distance’ paradigm. Indeed, with this study we have shown for the first time that it is possible to measure the IPS representation even without an actual person standing in front of the participants. We presented a recording of approaching footsteps and asked the participants to stop it when the distance from the walker made them feel uncomfortable. This modified version of the classic ‘stop-distance’ task is not only effective but also eliminates any possible influence of the idiosyncrasies of the person standing in front of the participants.
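An analysis along these lines can be sketched with scipy. The per-participant scores below are randomly generated placeholders, not the study data (which are in the S1 File). Note one practical caveat: scipy’s friedmanchisquare requires at least three repeated measures, so for the two-condition comparison this sketch uses the Wilcoxon signed-rank test, the standard non-parametric test for two related samples.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-participant scores (seconds left when the clip was
# stopped) for 33 participants; invented for illustration only
neutral = rng.gamma(2.0, 2.3, size=33)
aggressive = neutral + rng.gamma(1.5, 1.6, size=33)  # stopped sooner on average

# Order check: compare one condition across the two counterbalancing groups
group_a, group_b = neutral[:17], neutral[17:]
h_stat, p_order = stats.kruskal(group_a, group_b)

# Condition comparison for two related samples (Wilcoxon signed-rank,
# used here in place of a two-condition Friedman test)
w_stat, p_cond = stats.wilcoxon(aggressive, neutral)
```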
Interestingly, after listening to a conversation with aggressive content, participants stopped the sound of approaching footsteps further away from their body than after listening to a conversation with neutral content. Therefore, after listening to an aggressive conversation the IPS representation expands, with participants setting a wider distance between themselves and the approaching footsteps. This increased distance could be interpreted as an attempt to avoid being involved in an aggressive confrontation, or to avoid an interaction with a person soon after a threatening social interaction. The results also showed that 11 participants in the control condition and 6 participants in the experimental condition did not stop the recording. On the one hand, this suggests that participants who stopped the recording did so because they felt uncomfortable; on the other hand, it shows that more people felt uncomfortable (i.e., stopped the recording) in the experimental condition, demonstrating the effectiveness of our manipulation.
Our result is consistent with previous findings from the social psychology literature [61, 62]. For example, Lieberz and colleagues showed that women detect implicit cues of aggressiveness in male faces and adjust their interpersonal distance behaviour accordingly. Moreover, a recent study found an increased interpersonal space when an angry confederate (a virtual character) approached participants. People tend to maintain a certain distance from threatening stimuli approaching the body as a self-protection response [49, 55, 63, 64]. Interestingly, we showed for the first time that the quality of social interactions influences our comfort boundaries even when we are not directly involved in the aggressive confrontation.
In this study we have shown how being a bystander to an aggressive confrontation influences the distance that we keep from other people. Perceived threat from others is a crucial factor in mediating the equilibrium between interpersonal space and social interaction [50, 56, 65, 66]. Moreover, information gained from observing, or in this case listening to, human interactions may contribute to an evaluation of both the situation and the people involved, thereby facilitating social learning, particularly in fearful or threatening situations [56, 67].
In the cognitive neuroscience literature, many studies have focused on the defensive aspect of PPS [20, 42, 43, 44]; however, the threatening objects used in these studies were rarely social. The social psychology literature, on the other hand, has focused on the distance between two people, but the interacting person was rarely represented as threatening. The study by Lloyd and Morrison is relevant to the present work given that it showed a network of areas involved in interpersonal spatial behaviour that is modulated not only by the distance between the interactants but also by the nature of the interaction. Moreover, as in our case, the participants were not directly involved in the social interaction but were observing an interaction between two other people. We believe that the IPS can be interpreted, as has been done for the PPS, as a margin of safety around the body. However, it is difficult to say whether IPS and PPS are simply two terms indicating the same portion of space [55, 56] or two functionally different and independent spatial representations [33, 68]. Patané and colleagues compared the effect of tool use on reaching distance and comfort distance, and found an effect of tool use on reaching distance but not on comfort distance. Moreover, in a subsequent study the authors showed a dissociation between PPS, operationalized as reachable space, and IPS, operationalized as comfort space. Specifically, the authors used a ‘social’ tool-use setting in which tools were not only bodily extensions but also instruments for social cooperation: the participants had to cooperate, using the tools, to complete the task. The results showed an effect of cooperative tool use on PPS but not on IPS. It would be interesting to investigate whether the defensive PPS and IPS share the same constraints, although one might argue that the notion of IPS as a ‘comfort zone’ is closer to the ‘margin of safety’ interpretation of PPS.
It is indeed impossible to feel comfortable when not safe.
Both PPS and IPS have been investigated in relation to anxiety and social disorders. As already mentioned, the size of PPS is modulated by an individual’s anxiety level, with more anxious individuals showing a larger PPS [29, 30]. Moreover, it has been shown that individuals with higher levels of claustrophobia show a larger and less flexible PPS [24, 69]. Regarding socio-communicative disorders, several studies have instead focused on the IPS of individuals with persistent difficulties in the domain of social behavior, such as children with autism spectrum disorder (ASD). Using the stop-distance paradigm, it has been shown that children with ASD feel comfortable at a greater distance relative to typically developing children (but see also and for different results on the size of PPS). Moreover, children with ASD seem to have a less flexible IPS, in accordance with and. This last finding agrees with the hypothesis that children with ASD show a steeper and less flexible gradient between self and other. Indeed, it has been proposed that schizophrenia and ASD can be considered two extremes of an element of self-representation, self-location, and, specifically, of the distance between self and other.
Our modified version of the stop-distance task, and the manipulation used in this study, could help to investigate the IPS of individuals with socio-communicative disorders or high levels of social anxiety. It would be interesting, for example, to investigate whether individuals with high social anxiety show flexibility of their IPS, or whether the mere presentation of a conversation, even one with neutral content, is enough to make them feel uncomfortable and avoid the cue of an approaching person (i.e., approaching footsteps).
Future work may also compare the effects on IPS of using visual versus auditory stimuli in ‘stop-distance’ paradigms. The visual and auditory systems differ in their processing of environmental signals. When considering the IPS as a ‘comfort zone’ or a ‘margin of safety’, one could hypothesize that auditory stimuli have a greater influence on IPS, as audition has been characterized as a ‘warning system’ (e.g., ). This characterization derives from a number of advantages that the auditory system displays compared to the visual and other sensory systems. These include acting as a ‘change detector’, with high temporal resolution and high sensitivity to structured motion, which allows it to quickly extract cues indicating a rapid change and to orient behaviour towards it faster than the visual system [75, 76]. A second advantage is that audition provides a continuous stream of information about distant and close stimuli: while we regularly block vision by closing our eyes, our ears cannot be ‘turned off’ in the same way. Finally, audition informs us about events taking place all around us, even those outside the visual field, and processes several streams of information in parallel, which provides an overall impression of the events around us, as well as an impression of the geometry and size of the space we are in, through the acoustic reflections from surrounding objects and walls [78, 79]. Given these differences in information processing, it may be hypothesized that the size of the IPS, as measured by the ‘stop-distance’ and other paradigms, is modulated differently by visual and auditory stimuli. Future work could test this hypothesis, while also considering that in complex real-life contexts we often encounter a combination of information from different sensory modalities, relating to different events.
Our brain needs to monitor, integrate and respond to all these different cues in an optimal way that keeps us safe and, if possible, comfortable.
- 1. Sommer R. (1959). Studies in personal space. Sociometry 22: 247–260.
- 2. Hayduk L. A. (1983). Personal space: Where we now stand. Psychol bull 94: 293.
- 3. Deus V., Jokic-Begic N. (2006) Personal space in schizophrenic patients. Psychiatr Danub 18: 150–158. pmid:17099605
- 4. Gifford R., & Sacilotto P. A. (1993). Social isolation and personal space: A field study. Can J Behav Sci 25: 165.
- 5. Evans G. W., & Howard R. B. (1973). Personal space. Psychol bull 80: 334. pmid:4590526
- 6. Rizzolatti G., Fadiga L., Fogassi L., & Gallese V. (1997). The space around us. Science 277: 190. pmid:9235632
- 7. Graziano M., Yap G., & Gross C. (1994). Coding of visual space by premotor neurons. Science 266: 1054–1057. pmid:7973661
- 8. Hyvärinen J., & Poranen A. (1974). Function of the parietal associative area 7 as revealed from cellular discharges in alert monkeys. Brain 97: 673–692. pmid:4434188
- 9. Rizzolatti G., Scandolara C., Matelli M., & Gentilucci M. (1981a). Afferent properties of periarcuate neurons in macaque monkeys. II. Visual responses. Behav Brain Res 2: 147–163.
- 10. Rizzolatti G., Scandolara C., Matelli M., & Gentilucci M. (1981b). Afferent properties of periarcuate neurons in macaque monkeys. I. Somatosensory responses. Behav Brain Res 2: 125–146.
- 11. Gentilucci M., Scandolara C., Pigarev I. N., & Rizzolatti G. (1983). Visual responses in the postarcuate cortex (area 6) of the monkey that are independent of eye position. Exp. Brain Res 50: 464–468. pmid:6641880
- 12. Bremmer F., Schlack A., Shah N. J., Zafiris O., Kubischik M., Hoffmann K. P., Ziles K., & Fink G. R. (2001). Polymodal motion processing in posterior parietal and premotor cortex: a human fMRI study strongly implies equivalencies between humans and monkeys. Neuron 29: 287–296. pmid:11182099
- 13. Brozzoli C., Gentile G., Petkova V. I., & Ehrsson H. H. (2011). FMRI adaptation reveals a cortical mechanism for the coding of space near the hand. J. Neurosci 31: 9023–9031. pmid:21677185
- 14. Gentile G., Petkova V. I., & Ehrsson H. H. (2011). Integration of visual and tactile signals from the hand in the human brain: an FMRI study. J. Neurophysiol 105: 910–922. pmid:21148091
- 15. Cardini F., Costantini M., Galati G., Romani G. L., Làdavas E., & Serino A. (2011). Viewing one’s own face being touched modulates tactile perception: an fMRI study. J Cogn Neurosci 23: 503–513. pmid:20350177
- 16. Makin T. R., Holmes N. P., & Zohary E. (2007). Is that near my hand? Multisensory representation of peripersonal space in human intraparietal sulcus. J. Neurosci 27: 731–740. pmid:17251412
- 17. Serino A., Canzoneri E., & Avenanti A. (2011). Fronto-parietal areas necessary for a multisensory representation of peripersonal space in humans: an rTMS study. J Cogn Neurosci 23: 2956–2967. pmid:21391768
- 18. Previc F. H. (1998). The neuropsychology of 3-D space. Psychol bull 124: 123–64. pmid:9747184
- 19. Holmes N. P., & Spence C. (2004). The body schema and the multisensory representation (s) of peripersonal space. Cogn Process 5: 94–105. pmid:16467906
- 20. Graziano M. S., & Cooke D. F. (2006). Parieto-frontal interactions, personal space, and defensive behavior. Neuropsychologia 44: 845–859. pmid:16277998
- 21. Hall E. T. (1966). The hidden dimension. New York, NY, US: Doubleday & Co.
- 22. Sommer R. (2002). From personal space to cyberspace. Handbook of environmental psychology 2: 1–10.
- 23. de Vignemont F., & Iannetti G. D. (2015) How many peripersonal spaces? Neuropsychologia 70: 327–334. pmid:25448854
- 24. Lourenco S. F., Longo M. R., & Pathman T. (2011). Near space and its relation to claustrophobic fear. Cognition 119: 448–53. pmid:21396630
- 25. Sambo C. F., & Iannetti G. D. (2013). Better safe than sorry? The safety margin surrounding the body is increased by anxiety. J. Neurosci 33: 14225–14230. pmid:23986256
- 26. Graydon MM, Linkenauger SA, Teachman BA, Proffitt DR (2012), Scared stiff: The influence of anxiety on the perception of action capabilities. Cogn Emot. 26: 1301–1315. pmid:22650350
- 27. Bowers D., & Heilman K. M. (1980). Pseudoneglect: effects of hemispace on a tactile line bisection task. Neuropsychologia 18: 491–498. pmid:6777712
- 28. Jewell G., & McCourt M. E. (2000). Pseudoneglect: a review and meta-analysis of performance factors in line bisection tasks. Neuropsychologia 38: 93–110. pmid:10617294
- 29. Sambo C. F., Forster B., Williams S. C., & Iannetti G. D. (2012a). To blink or not to blink: fine cognitive tuning of the defensive peripersonal space. J. Neurosci 32: 12921–12927.
- 30. Sambo C. F., Liang M., Cruccu G., and Iannetti G. D. (2012b). Defensive peripersonal space: the blink reflex evoked by hand stimulation is increased when the hand is near the face. J. Neurophysiol 107: 880–889.
- 31. Berti A., & Frassinetti F. (2000). When far becomes near: remapping of space by tool use. J Cogn Neurosci 12, 415–420. pmid:10931768
- 32. Longo M. R., & Lourenco S. F. (2006). On the nature of near space: Effects of tool use and the transition to far space. Neuropsychologia 44: 977–981. pmid:16243365
- 33. Patané I., Iachini T., Farnè A., & Frassinetti F. (2016). Disentangling action from social space: tool-use differently shapes the space around us. PloS one 11: e0154247. pmid:27144720
- 34. Iriki A., Tanaka M., & Iwamura Y. (1996). Coding of modified body schema during tool use by macaque postcentral neurones. Neuroreport, 7: 2325–2330. pmid:8951846
- 35. Canzoneri E., Ubaldi S., Rastelli V., Finisguerra A., Bassolino M., & Serino A. (2013b). Tool-use reshapes the boundaries of body and peripersonal space representations. Exp. Brain Res 228: 25–42.
- 36. Galli G., Noel J. P., Canzoneri E., Blanke O., & Serino A. (2015). The wheelchair as a full-body tool extending the peripersonal space. Front Psychol 6: 639.
- 37. Serino A., Noel J. P., Galli G., Canzoneri E., Marmaroli P., Lissek H., & Blanke O. (2015). Body part-centered and full body-centered peripersonal space representations. Sci Rep 5: 18603. pmid:26690698
- 38. Canzoneri E., Magosso E., & Serino A. (2012). Dynamic sounds capture the boundaries of peripersonal space representation in humans. PloS one 7: e44306. pmid:23028516
- 39. Canzoneri E., Marzolla M., Amoresano A., Verni G., & Serino A. (2013a). Amputation and prosthesis implantation shape body and peripersonal space representations. Sci rep 3.
- 40. Salomon R., Noel J. P., Łukowska M., Faivre N., Metzinger T., Serino A., & Blanke O. (2017). Unconscious integration of multisensory bodily inputs in the peripersonal space shapes bodily self-consciousness. Cognition, 166: 174–183. pmid:28577447
- 41. Vagnoni E., Lourenco S. F., & Longo M. R. (2012) Threat modulates perception of looming visual stimuli. Curr Bio 22: R826–R827.
- 42. Taffou M., & Viaud-Delmon I. (2014). Cynophobic fear adaptively extends peri-personal space. Front Psychiatry 5.
- 43. Ferri F., Tajadura-Jiménez A., Väljamäe A., Vastano R., & Costantini M. (2015). Emotion-inducing approaching sounds shape the boundaries of multisensory peripersonal space. Neuropsychologia 70: 468–475. pmid:25744869
- 44. de Haan A. M., Smit M., Stigchel S., & Dijkerman H. C. (2016). Approaching threat modulates visuotactile interactions in peripersonal space. Exp. Brain Res 234: 1875–1884. pmid:26894891
- 45. Teneggi C., Canzoneri E., Di Pellegrino G., & Serino A. (2013). Social modulation of peripersonal space boundaries. Curr Bio 23: 406–411.
- 46. Cardini F., Tajadura-Jiménez A., Serino A., & Tsakiris M. (2013). It feels like it’s me: interpersonal multisensory stimulation enhances visual remapping of touch from other to self. J Exp Psychol Hum Percept Perform 39: 630. pmid:23276110
- 47. Maister L., Cardini F., Zamariola G., Serino A., & Tsakiris M. (2015). Your place or mine: Shared sensory experiences elicit a remapping of peripersonal space. Neuropsychologia 70: 455–461. pmid:25447370
- 48. Brozzoli C., Gentile G., Bergouignan L., & Ehrsson H. H. (2013). A shared representation of the space near oneself and others in the human premotor cortex. Curr Bio 23: 1764–1768.
- 49. Kennedy D. P., Gläscher J., Tyszka J. M., & Adolphs R. (2009). Personal space regulation by the human amygdala. Nat. Neurosci. 12: 1226–1227. pmid:19718035
- 50. Dosey M. A., & Meisels M. (1969). Personal space and self-protection. J Pers Soc Psychol 11: 93. pmid:5778351
- 51. Aiello J. R. (1987). Human spatial behaviour. In Stokols D. & Altman I. (Eds.), Handbook of environmental psychology (Vol. 1, pp. 505–531). New York: Wiley Interscience.
- 52. Bailenson J. N., Blascovich J., Beall A. C., & Loomis J. M. (2003). Interpersonal distance in immersive virtual environments. Pers Soc Psychol Bull 29: 819–833. pmid:15018671
- 53. Gessaroli E., Santelli E., di Pellegrino G., & Frassinetti F. (2013). Personal space regulation in childhood autism spectrum disorders. PLoS One 8: e74959. pmid:24086410
- 54. Tajadura-Jiménez A., Pantelidou G., Rebacz P., Västfjäll D., & Tsakiris M. (2011). I-space: the effects of emotional valence and source of music on interpersonal distance. PloS one 6: e26083. pmid:22022516
- 55. Iachini T., Pagliaro S., & Ruggiero G. (2015). Near or far? It depends on my impression: Moral information and spatial behavior in virtual interactions. Acta psychol 161, 131–136.
- 56. Lloyd D. M. (2009). The space between us: A neurophilosophical framework for the investigation of human interpersonal space. Neurosci. Biobehav. Rev. 33: 297–304. pmid:18926850
- 57. Uzzell D., & Horne N. (2006). The influence of biological sex, sexuality and gender role on interpersonal distance. Br J Soc Psychol 45: 579–597. pmid:16984722
- 58. Lloyd D. M., & Morrison C. I. (2008). ‘Eavesdropping’on social interactions biases threat perception in visuospatial pathways. Neuropsychologia 46: 95–101. pmid:17897686
- 59. Iacoboni M. (2006). Visuo-motor integration and control in the human posterior parietal cortex: Evidence from TMS and fMRI. Neuropsychologia 44: 2691–2699. pmid:16759673
- 60. McGregor P. K. (1993). Signalling in territorial systems: a context for individual identification, ranging and eavesdropping. Philos Trans: Biol Sci 237–244.
- 61. Lieberz K. A., Windmann S., Geniole S. N., McCormick C. M., Mueller-Engelmann M., Gruener F., Pia Bornefeld-Ettmann P., & Steil R. (2017). The facial width-to-height ratio determines interpersonal distance preferences in the observer. Aggress Behav 43: 460–470. pmid:28261811
- 62. Ruggiero G., Frassinetti F., Coello Y., Rapuano M., di Cola A. S., & Iachini T. (2016). The effect of facial expressions on peripersonal and interpersonal spaces. Psychological research 1–9.
- 63. Horstmann (2003). What do facial expressions convey: feeling states, behavioral intentions, or action requests? Emotion 3: 150–66 pmid:12899416
- 64. Seidel E. M., Habel U., Kirschner M., Gur R. C., & Derntl B. (2010). The impact of facial emotional expressions on behavioral tendencies in women and men. J Exp Psychol Hum Percept Perform 36: 500. pmid:20364933
- 65. Argyle M., & Dean J. (1965). Eye-contact, distance and affiliation. Sociometry 289–304. pmid:14341239
- 66. Horowitz M. J., Duff D. F., & Stratton L. O. (1964). Body-buffer zone: exploration of personal space. Arch gen psych 11: 651–656.
- 67. Olsson A., & Phelps E. A. (2007). Social learning of fear. Nat neurosci 10: 1095. pmid:17726475
- 68. Patané I., Farnè A., & Frassinetti F. (2017). Cooperative tool-use reveals peripersonal and interpersonal spaces are dissociable. Cognition 166: 13–22. pmid:28554081
- 69. Hunley S. B., Marker A. M., & Lourenco S. F. (2017). Individual differences in the flexibility of peripersonal space. Exp psychol. 64: 49–55. pmid:28219262
- 70. Gessaroli E., Santelli E., di Pellegrino G., & Frassinetti F. (2013). Personal space regulation in childhood autism spectrum disorders. PLoS One, 8: e74959. pmid:24086410
- 71. Kennedy D. P., & Adolphs R. (2014). Violations of personal space by individuals with autism spectrum disorder. PloS one, 9: e103369. pmid:25100326
- 72. Parsons S., Mitchell P., & Leonard A. (2004). The use and understanding of virtual environments by adolescents with autistic spectrum disorders. Journal of Autism and Developmental disorders, 34: 449–466. pmid:15449520
- 73. Candini M., Giuberti V., Manattini A., Grittani S., di Pellegrino G., & Frassinetti F. (2017). Personal space regulation in childhood autism: Effects of social interaction and person’s perspective. Autism Research, 10: 144–154. pmid:27157094
- 74. Noel J. P., Cascio C. J., Wallace M. T., & Park S. (2017). The spatial self in schizophrenia and autism spectrum disorder. Schizophrenia research, 179: 8–12. pmid:27650196
- 75. Juslin P. N., & Västfjäll D. (2008). Emotional responses to music: The need to consider underlying mechanisms. Behav Brain Sci 31: 559–75. pmid:18826699
- 76. McDonald J. J., Teder-Sälejärvi W. A., & Hillyard S. A. (2000). Involuntary orienting to sound improves visual perception. Nature 407: 906–908. pmid:11057669
- 77. Larsson P. (2005). Virtually hearing, seeing, and being: Room acoustics, presence, and audiovisual environments. PhD thesis, Chalmers University of Technology, Gothenburg, Sweden.
- 78. Larsson P., Väljamäe A., Västfjäll D., Tajadura-Jiménez A., & Kleiner M. (2010). Auditory-induced presence in mixed reality environments and related technology. In: Dubois E., Gray P., Nigay L. (Eds.), The Engineering of Mixed Reality Systems. Springer, London, pp. 143–163.
- 79. Tajadura-Jiménez A., Väljamäe A., Asutay E., & Västfjäll D. (2010). Embodied auditory perception: the emotional impact of approaching and receding sound sources. Emotion 10: 216–229. pmid:20364898