Intercepting a sound without vision

  • Tiziana Vercillo, 
  • Alessia Tonelli, 
  • Monica Gori

Abstract

Visual information is extremely important for generating internal spatial representations. In the auditory modality, the absence of visual cues during early infancy does not preclude the development of some spatial strategies, but specific spatial abilities might be impaired. In the current study, we investigated the effect of early visual deprivation on the ability to localize static and moving auditory stimuli by comparing sighted and early blind individuals’ performance in different spatial tasks. We also examined perceptual stability in the two groups of participants by comparing localization accuracy between a static head condition and a dynamic head condition that involved rotational head movements. Sighted participants accurately localized static and moving sounds, and their localization ability remained unchanged after rotational movements of the head. Conversely, blind participants showed a leftward bias during the localization of static sounds and a small bias for moving sounds. Moreover, in blind participants head movements induced a significant bias in the direction of head motion during the localization of moving sounds. These results suggest that internal spatial representations might be body-centered in blind individuals and that in sighted people the availability of visual cues during early infancy may affect sensory-motor interactions.

1. Introduction

Sound localization is an important aspect of hearing, as it facilitates interactions with environmental stimuli. In real environments, sound sources are not always static but can be dynamic, and accounting for these spatial changes over time becomes crucial for accurate localization. Furthermore, humans constantly move, so a major challenge for the brain is to extrapolate the spatial positions of moving objects and preserve these representations despite self-movements.

Identifying sound source locations becomes particularly relevant when external stimuli are not accessible to the visual system; for example, when stimuli are located outside the visual field (as in the case of a car approaching from behind) or, more importantly, in the case of visual impairment. A number of studies have reported enhanced hearing abilities in blind individuals after early visual deprivation, suggesting that the brain might take advantage of the auditory modality to compensate for the lack of vision [1–4]. Some forms of sensory compensation seem to be driven by a cortical reorganization of the visual cortex, which preserves its computational processing but develops sensitivity to non-visual stimuli [5–11]. Other recent studies have reported task-specific spatial auditory impairments in blind individuals [12–15], supporting the idea that the visual system might be important for the spatial calibration of auditory space [16], [17]. Most of these studies used static auditory stimuli paired with a static body position of the participant, neglecting more natural conditions that involve body movements and dynamic auditory stimuli.

The localization of moving auditory stimuli may be poorly accurate, as objects in motion can be displaced toward the direction of motion [18–20]. This spatial displacement depends on a number of variables [19], such as target velocity [20], predictability and position of the target [21], and type of visual motion and response mode [22]. Although the idea of a perceptual spatial distortion for moving objects might seem detrimental, such displacement sometimes plays an important role in the organization of goal-directed movements. For instance, to catch a target in motion the perceptual system must compensate for the target’s movement occurring during visual processing and during the phase of motor preparation. From this perspective, a spatial displacement may reflect the ability of the brain to predict the future position of the target and facilitate rapid and accurate motor responses, bridging the gap between perception and action [19], [23]. This hypothesis is consistent with “forward models”, which posit an internal mechanism, triggered by the efference copy of the motor command, that predicts sensory consequences to improve sensory-motor interaction [24–26].

The localization of visual stimuli or sound sources apparently does not change under dynamic conditions [27], [28]. In fact, people make no directional errors when they have to report the final location of a moving auditory target after a rotational movement of the head. Perceptual stability occurs through a spatial remapping of the stimulus from an egocentric (eye-, head- and body-centered coordinates) to an allocentric, i.e. external, frame of reference that guarantees a stable representation of objects in world coordinates. In the acoustic domain, external stimuli are represented in both egocentric and allocentric frames of reference [29]. Indeed, the egocentric spatial representation of sound sources, originally derived from the processing of binaural cues such as the interaural time difference (ITD) and the interaural level difference (ILD), is afterward remapped into an allocentric frame of reference to ensure accurate multisensory perception and sensory-motor interaction.
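
As a rough illustration of how an egocentric azimuth estimate relates to the ITD, the sketch below uses the classic Woodworth spherical-head approximation; the head radius and speed of sound are illustrative assumptions, not values from this study.

```python
import numpy as np

def woodworth_itd(azimuth_deg, head_radius_m=0.0875, c_m_s=343.0):
    """ITD (seconds) predicted by the Woodworth spherical-head model.

    ITD = (r / c) * (theta + sin(theta)) for a far-field source at
    azimuth theta. The head radius and speed of sound are illustrative
    defaults, not parameters taken from this experiment.
    """
    theta = np.radians(azimuth_deg)
    return (head_radius_m / c_m_s) * (theta + np.sin(theta))

# A source 30 degrees off the midline leads at the nearer ear by ~260 us.
print(f"{woodworth_itd(30.0) * 1e6:.0f} microseconds")
```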

To overcome the limits of using static auditory stimuli and static body positions, we tested a group of sighted and a group of early blind individuals in different auditory tasks to investigate the localization of static and moving sound sources with and without head movements. This study explores: i) auditory processing in dynamic contexts; ii) the role of body movement in sound perception; and iii) the association between auditory processing and body movement (in terms of reference systems and interaction abilities). As mentioned above, moving stimuli are usually displaced toward the direction of motion, and this perceptual illusion might be important for sensory-motor interaction. Head movements affect the binaural cues adopted by the auditory system for localization, and these changes may result in a spatial displacement. Nevertheless, if self-movements trigger the spatial remapping of the sound into an allocentric frame of reference, the perceived location of the sound should not change. Our hypothesis is that visual deprivation might affect the localization of static and dynamic auditory stimuli and the remapping into allocentric coordinates that usually occurs during head movements.

2. Methods

Eight healthy volunteers (4 males, 4 females, mean age: 36 ± 6 years) and eight early blind individuals (3 males, 5 females, mean age: 40.12 ± 6 years) participated in the experiment. S1 Table (Supporting information) shows additional details about the age, pathology and residual vision of the blind participants. All participants were right-handed and had normal hearing. Sighted participants were blindfolded during the experiment. The study was conducted according to the principles of the Declaration of Helsinki. All testing procedures were approved by the ASL3 of Genoa (Italy). Participants provided signed informed consent after the experimental procedures were explained.

The experimental setup was composed of 18 speakers placed 5 cm apart (center to center) and arranged in an arc with a 57 cm radius (see Fig 1). Each speaker was covered with a 4×4 array of tactile sensors used to record participants’ responses. Participants sat at the center of the arc. Auditory stimuli were static or moving sounds (white noise bursts) presented at 70 dB sound pressure level. We recorded motor responses through the tactile sensors attached directly to the speaker surface, and head movements with the Vicon motion tracking system (Vicon Motion Systems, Ltd., UK), an infrared marker-tracking system that acquires live movements in 3D space with high temporal and spatial precision.
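
For reference, the angular layout implied by the reported geometry can be sketched as follows; the assumption that the 5 cm spacing is measured along the arc (rather than as a chord) is ours.

```python
import numpy as np

RADIUS_CM, SPACING_CM, N_SPEAKERS = 57.0, 5.0, 18  # values from the Methods

# Assuming 5 cm centre-to-centre spacing along the arc, each speaker
# subtends SPACING/RADIUS radians (~5 deg) of azimuth at the listener.
step_deg = np.degrees(SPACING_CM / RADIUS_CM)
azimuths = (np.arange(N_SPEAKERS) - (N_SPEAKERS - 1) / 2) * step_deg

print(f"angular step: {step_deg:.2f} deg")                    # ~5.03 deg
print(f"array span:   {azimuths[-1] - azimuths[0]:.1f} deg")  # ~85.5 deg
```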

Fig 1. Upper panels show the procedures for the three different localization tasks.

The lower panel shows a picture of the experimental setup.

https://doi.org/10.1371/journal.pone.0177407.g001

Participants performed three different auditory tasks in a random order: 1) a simple pointing task with static sound sources, to control for any bias in sound localization; 2) a localization task with moving sounds; and 3) a localization task with moving sounds while participants performed a head movement (see Fig 1 for methods and procedures). During the localization task with static auditory stimuli (task 1), a 300 ms sound was delivered by one of the 18 speakers of the setup. Participants had to identify and touch the speaker that produced the sound. We ran a single block in which every sound location was repeated 10 times, in a random order, for a total of 180 trials.

In the localization task with moving sound sources (task 2), participants kept their head straight while listening to a sound moving from left to right or from right to left. While the spatial displacement of the moving sound was fixed at 15 cm throughout the experiment, the duration of the sound, and consequently its velocity, took one of three values: 200, 300 and 500 ms (duration being inversely proportional to velocity). After the presentation of the moving sound, participants had to locate its endpoint by touching the last speaker that played the sound. We manipulated the direction of sound motion, generating a “rightward” and a “leftward” condition. For each direction of the moving sound, participants performed three experimental blocks, one for each stimulus duration. The endpoint of the moving sound was -7.5, 2.5, 12.5, 22.5 or 32.5 cm (given the fixed spatial displacement of the moving sound, its start point was -27.5, -17.5, -7.5, 5 or 12.5 cm) in the rightward condition, where negative values represent the left side and positive values the right side of the array, and -32.5, -22.5, -12.5, -2.5 or 7.5 cm in the leftward condition. Each endpoint location was repeated 10 times following the method of constant stimuli, in a random order, for a total of 100 trials.
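
Since the spatial displacement was fixed, the three durations map directly onto three velocities; a minimal check of the arithmetic:

```python
DISPLACEMENT_CM = 15.0          # fixed spatial extent of the moving sound
DURATIONS_MS = (200, 300, 500)  # the three duration conditions

# Velocity = displacement / duration: 75, 50 and 30 cm/s respectively.
for dur_ms in DURATIONS_MS:
    print(f"{dur_ms} ms -> {DISPLACEMENT_CM / (dur_ms / 1000.0):.0f} cm/s")
```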

In the localization task with moving sounds and head movements (task 3), we used the same auditory stimuli as in task 2, but this time participants had to perform a head rotation during sound presentation. Specifically, we asked participants to rotate their head in the direction opposite to the sound motion (for example, in the rightward condition, while the sound was moving to the right the head was rotating to the left) and to start the movement after a go signal simultaneous with the onset of the sound. At the end of the auditory stimulation, participants had to keep the head rotated and localize the endpoint of the moving sound by touching the speaker with their right hand. Also in this task, participants performed one experimental block for each stimulus duration (three blocks in total).

Before the experiment, we trained participants to perform precise head movements. We analyzed head movements with the Vicon motion capture system. For an accurate analysis, we used seven markers: three were placed on the participants’ shoulders to form a horizontal line, two were placed above the ears, one on the forehead and one above the inion. These last two markers generated a line along the antero-posterior axis of the head. We measured the intersection between the shoulder line and the antero-posterior line and calculated the amplitude of the angle produced by the rotation of the head, as well as the speed of the head movement.
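
One plausible way to recover the rotation angle from these markers is to project the forehead-inion line and the shoulder line onto the horizontal plane and take the angle between them; the function below is a sketch under that reading, and the marker names, axis conventions and sign convention are our assumptions.

```python
import numpy as np

def head_yaw_deg(forehead, inion, shoulder_l, shoulder_r):
    """Head yaw relative to the shoulder line, in degrees.

    Inputs are (x, y, z) marker positions. Both lines are projected
    onto the horizontal (x, y) plane before the angle is computed.
    Positive values indicate rotation toward the left shoulder; this
    sign convention is an assumption, not taken from the paper.
    """
    ap = np.asarray(forehead, float)[:2] - np.asarray(inion, float)[:2]
    sh = np.asarray(shoulder_r, float)[:2] - np.asarray(shoulder_l, float)[:2]
    fwd = np.array([-sh[1], sh[0]])  # shoulder line rotated 90 deg = "straight ahead"
    cos_a = fwd @ ap / (np.linalg.norm(fwd) * np.linalg.norm(ap))
    angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    # Sign from the 2D cross product: positive = counter-clockwise (leftward).
    return angle if fwd[0] * ap[1] - fwd[1] * ap[0] >= 0 else -angle

# Head turned ~30 deg to the left of the shoulder line (made-up markers).
print(head_yaw_deg((-0.10, 0.17, 1.7), (0.0, 0.0, 1.7),
                   (-0.20, 0.0, 1.4), (0.20, 0.0, 1.4)))
```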

3. Results

Results from the pointing task with static sound sources are shown in Fig 2. Average errors, calculated for each subject as the difference between the reproduced and the real location of the sound and then averaged across participants within each group, are plotted for each speaker location. Positive values represent a mislocalization to the right, negative values a displacement to the left. The dashed line represents the central position of the speaker array, so that bars on the right side of the line denote speakers on the right side of the array and bars on the left side of the line denote speakers on the left side of the array. Sighted participants had a small tendency to expand the auditory space, displacing sound locations on the right side of the array further to the right (positive errors) and locations on the left side further to the left (negative errors); however, their errors were not statistically different from zero. In contrast, the group of early blind participants showed a significant mislocalization to the left (one-sample, two-tailed t-test: t(17) = -4.53, p<0.001). A repeated-measures ANOVA revealed a significant effect of speaker location (F(17) = 4.43, p<0.0001, η2 = 0.38). Specifically, the error in localizing the last speaker on the right was significantly larger than all the others (all p<0.01).
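
A minimal sketch of the error measure and the group-level test described here, with made-up numbers standing in for the real data:

```python
import numpy as np
from scipy import stats

def signed_error_cm(responses, targets):
    """Per-trial signed error: positive = response to the right of the source."""
    return np.asarray(responses, float) - np.asarray(targets, float)

# Hypothetical per-location mean errors for one group (not the real data).
mean_errors = np.array([-1.8, -0.9, -1.4, -1.1, -0.6, -1.3])

# Two-tailed one-sample t-test of the mean error against zero bias.
t, p = stats.ttest_1samp(mean_errors, popmean=0.0)
print(f"t = {t:.2f}, p = {p:.4f}")
```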

Fig 2. Average errors for each speaker location (locations are shown at the top of the figure) for the sighted (black bars, left panel) and blind (green bars, right panel) groups, measured in the localization task with static sounds.

https://doi.org/10.1371/journal.pone.0177407.g002

Localization errors in task 2 were analyzed with a repeated-measures ANOVA (within factors: motion direction, speed and endpoint location; between factor: group). For this task, errors were calculated as the difference between the reproduced endpoint locations measured in task 2 and the reproduced locations measured in task 1. Moreover, we normalized errors to correct for the direction of motion, so that positive values of the error represent a bias in the direction of sound motion. Since endpoint locations were specular for the two motion direction conditions, we considered 5 endpoint locations, from 1 (most central) to 5 (most peripheral). We found a significant interaction between endpoint location and group (F(1,2,4,1) = 4.11, p = 0.006, η2 = 0.25), but no effect of speed or motion direction. Fig 3 shows the results for task 2 for sighted (black symbols and line) and blind (green symbols and line) participants. As we did not find any significant effect of speed, we averaged individual data across the three speed conditions. Group mean errors are plotted as a function of the endpoint location of the sound; positive values of the error represent a displacement toward the direction of sound motion. On average, errors were significantly different from zero (showing a bias in the direction of sound motion) only for blind participants when the sound ended 5 cm to the right (one-sample, two-tailed t-tests with Bonferroni correction for multiple comparisons, t(7) = 4.73, p = 0.002). Sighted controls’ error when the sound ended 25 cm to the right was only marginally significant (one-sample, two-tailed t-tests with Bonferroni correction for multiple comparisons, t(7) = 2.44, p = 0.04).
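
The direction correction described here amounts to flipping the sign of errors from leftward-motion trials; a sketch under that reading:

```python
import numpy as np

def align_with_motion(errors_cm, direction):
    """Flip errors so positive always means 'displaced with the sound motion'.

    direction: +1 for rightward trials, -1 for leftward trials.
    A sketch of the normalization described in the Results.
    """
    return np.asarray(errors_cm, float) * np.asarray(direction, float)

# A -2 cm error on a leftward trial is a 2 cm shift in the motion direction.
print(align_with_motion([-2.0, 1.5], [-1, +1]))  # -> [2.  1.5]
```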

Fig 3. Average errors (from the bias observed in task 1) measured in the localization task with moving sounds and static head for the sighted (black lines and symbols) and blind (green lines and symbols) groups of participants.

Data are averaged across the three speed conditions (200, 300 and 500 ms).

https://doi.org/10.1371/journal.pone.0177407.g003

The head rotation performed in task 3 affected the performance of the two groups of participants differently, with blind individuals showing a bias in the direction of head motion. Results for the localization task with moving sounds and a moving head are reported in Fig 4. Group mean errors were calculated as the difference between the reproduced endpoint locations measured in task 3 and the errors measured in task 1. Errors were normalized to correct for the direction of motion, so that positive values of the error represent a bias in the direction of sound motion and negative values a bias in the direction of head motion. A repeated-measures ANOVA (within factors: motion direction, speed and endpoint location; between factor: group) showed a significant interaction between motion direction, endpoint location and group (F(1,2,4,1) = 4.12, p = 0.008, η2 = 0.31); between motion direction and endpoint location (F(1,2,4,1) = 8.9, p<0.001, η2 = 0.49); between speed and endpoint location (F(1,2,4,1) = 3.28, p = 0.003, η2 = 0.26); between endpoint location and group (F(1,2,4,1) = 3.42, p = 0.01, η2 = 0.27); and a significant main effect of endpoint location (F(1,2,4,1) = 13.72, p<0.001, η2 = 0.6). Pairwise comparisons (corrected for multiple comparisons) revealed that the mislocalization of peripheral speakers was larger than that of the most central speakers (p<0.002).

Fig 4. Average errors (from the bias observed in task 1) measured in the localization task with moving sounds and moving head for the sighted (black lines and symbols) and blind (green lines and symbols) groups of participants, for the three speed conditions (200, 300 and 500 ms).

https://doi.org/10.1371/journal.pone.0177407.g004

For the rightward condition, differences in localization errors between the two groups of participants were significant only for the 500 ms condition (repeated-measures ANOVA for the 500 ms condition; within factor: endpoint location; between factor: group; F(4,1) = 9.28, p = 0.009, η2 = 0.39). We also compared the localization errors measured in this task with those observed in the task with moving sounds and a static head. A repeated-measures ANOVA (within factors: head, speed and endpoint; between factor: group) showed a significant effect of head (F(1,3,4,1) = 8.44, p = 0.01, η2 = 0.37) and of endpoint (F(1,3,4,1) = 6.57, p<0.001, η2 = 0.44), and significant interactions between head and group (F(1,3,4,1) = 4.66, p = 0.04, η2 = 0.25) and between head and endpoint (F(1,3,4,1) = 15.09, p<0.001, η2 = 0.40). Similarly, for the leftward condition, a repeated-measures ANOVA (within factors: head, speed and endpoint; between factor: group) showed a significant interaction between endpoint and group (F(1,3,4,1) = 3.88, p = 0.01, η2 = 0.30) and between head, endpoint and group (F(1,3,4,1) = 3.49, p = 0.01, η2 = 0.28).

4. Discussion

Results from this study showed that early blind individuals tend to displace static sounds toward the left side of space, and that rotational head movements affect auditory localization in early blind individuals but not in sighted people. Overall, these findings suggest that early visual deprivation affects auditory localization and that auditory spatial representations in blind individuals may be body-centered.

In the localization task with static sounds, blind participants showed a spatial bias to the left. This result might appear in conflict with previous studies reporting similar or even higher auditory spatial accuracy in blind as compared to sighted individuals [2], [3]. We believe that the high spatial resolution provided by our setup (the distance between the centers of two adjacent speakers was 5 cm) enabled us to highlight even small spatial errors that otherwise might not be detectable. Indeed, in a previous study [2] the distance between speakers was 10 cm, which is the largest error that we reported in this task.

By positioning speakers on a table (not at ear level) we might have added reverberation and affected sound perception. On the other hand, reverberation should have equally impaired the performance of both groups of participants, not only that of the blind group. Other studies support our result, showing a leftward bias in blind individuals in a tactile bisection task [30] that might depend on differences in the roles of the right and left hemispheres in the control of spatial attention [31].

Stimulus motion induced a small spatial bias in both groups, as previously reported in the visual modality [18], [19] and in the auditory modality [20]. Interestingly, blind and sighted participants showed opposite trends, with blind individuals displacing central stimuli and sighted controls only peripheral sounds. This might be explained by an enhanced auditory localization for peripheral stimuli in blind individuals [3].

The most interesting result of the current study is that rotational head movements impaired sound localization in blind individuals, inducing a bias in the direction of head motion, but did not affect the performance of sighted participants, consistent with previous reports [28]. This result suggests that in early blind individuals the spatial remapping of the auditory stimulus into an allocentric frame of reference does not occur, and that blind participants might be more susceptible to a motor bias. An egocentric representation of the surrounding space may be extremely functional for blind individuals during navigation, as it provides an additional “margin of safety” and can help to avoid obstacles [32]. Previous studies have already supported an egocentric spatial representation in blind individuals [33], [34]. This result is also consistent with a recent study by Finocchietti et al. [12] reporting a deficit in blind individuals in encoding sound motion in the lower part of the sagittal plane, with a spatial bias toward the head.

We propose that vision plays a crucial role in organizing auditory spatial processing. Developmental studies have already shown that before 10–14 years of age, audio-visual and visual-haptic stimuli are not integrated optimally as they are in adults; instead, vision dominates over audition and touch for spatial judgments [16], [35], supporting the idea that vision is important for the spatial calibration of the other sensory modalities [17]. In agreement with recent findings [12–14], [36–38], here we showed that the availability of visual cues during early infancy affects auditory spatial representations.

Vision, above all other sensory modalities, is usually defined as a spatial sense [39] because it represents proximal and distal items simultaneously, extracts spatial invariants despite self-motion, and provides the most precise information about the consequences of self-displacement. The severe reduction of distal information and the poor representation of external landmarks induced by the loss of vision may prompt the use of egocentric frames of reference, because allocentric representations become more difficult to process. Indeed, it has been argued that early blind spatial knowledge relies more on body-centered proprioceptive and kinesthetic information (Millar, 1994). In agreement with this hypothesis, we suggest that early visual deprivation shapes internal spatial representations, enhancing the salience of the body as a reference. Our findings suggest that visual information is important not only for the spatial calibration of the other sensory modalities, but also for the correct development of sensory-motor interactions.

Supporting information

S1 Table. Individual age, diagnosis and residual vision for each of the blind participants tested.

https://doi.org/10.1371/journal.pone.0177407.s001

(DOCX)

Author Contributions

  1. Conceptualization: TV MG.
  2. Data curation: TV AT.
  3. Formal analysis: TV.
  4. Investigation: TV AT.
  5. Methodology: TV.
  6. Project administration: MG.
  7. Resources: MG.
  8. Supervision: MG.
  9. Writing – original draft: TV.
  10. Writing – review & editing: TV MG AT.

References

  1. Doucet M, Guillemot J, Lassonde M, Gagné J, Leclerc C, and Lepore F, “Blind subjects process auditory spectral cues more efficiently than sighted individuals,” Exp. Brain Res., vol. 160, no. 2, pp. 194–202, 2005. pmid:15309355
  2. Lessard N, Paré M, Lepore F, and Lassonde M, “Early-blind human subjects localize sound sources better than sighted subjects,” Nature, vol. 395, no. 6699, pp. 278–280, 1998. pmid:9751055
  3. Röder B, Teder-Sälejärvi W, Sterr A, Rösler F, Hillyard SA, and Neville HJ, “Improved auditory spatial tuning in blind humans,” Nature, vol. 400, no. 6740, pp. 162–166, 1999. pmid:10408442
  4. Voss P, Lassonde M, Gougoux F, Fortin M, Guillemot JP, and Lepore F, “Early- and late-onset blind individuals show supra-normal auditory abilities in far-space,” Curr. Biol., vol. 14, no. 19, pp. 1734–1738, 2004. pmid:15458644
  5. Amedi A, Merabet LB, Bermpohl F, and Pascual-Leone A, “The occipital cortex in the blind: lessons about plasticity and vision,” Curr. Dir. Psychol. Sci., vol. 14, no. 6, pp. 306–311, 2005.
  6. Collignon O, Dormal G, Albouy G, Vandewalle G, Voss P, Phillips C, et al., “Impact of blindness onset on the functional organization and the connectivity of the occipital cortex,” Brain, vol. 136, no. 9, pp. 2769–2783, 2013.
  7. Collignon O, Vandewalle G, Voss P, Albouy G, Charbonneau G, Lassonde M, et al., “Functional specialization for auditory-spatial processing in the occipital cortex of congenitally blind humans,” Proc. Natl. Acad. Sci. U. S. A., vol. 108, no. 11, pp. 4435–4440, 2011. pmid:21368198
  8. Collignon O, Voss P, Lassonde M, and Lepore F, “Cross-modal plasticity for the spatial processing of sounds in visually deprived subjects,” Exp. Brain Res., vol. 192, no. 3, pp. 343–358, 2009. pmid:18762928
  9. Heimler B, Striem-Amit E, and Amedi A, “Origins of task-specific sensory-independent organization in the visual and auditory brain: neuroscience evidence, open questions and clinical implications,” Curr. Opin. Neurobiol., vol. 35, pp. 169–177, 2015. pmid:26469211
  10. Rauschecker JP, “Compensatory plasticity and sensory substitution in the cerebral cortex,” Trends Neurosci., vol. 18, no. 1, pp. 36–43, 1995. pmid:7535489
  11. Voss P and Zatorre RJ, “Organization and reorganization of sensory-deprived cortex,” Curr. Biol., vol. 22, no. 5, 2012.
  12. Finocchietti S, Cappagli G, and Gori M, “Encoding audio motion: spatial impairment in early blind individuals,” Front. Psychol., vol. 6, p. 1357, 2015. pmid:26441733
  13. Gori M, Sandini G, Martinoli C, and Burr DC, “Impairment of auditory spatial localization in congenitally blind human subjects,” Brain, vol. 137, no. Pt 1, pp. 288–293, 2014. pmid:24271326
  14. Vercillo T, Burr D, and Gori M, “Early visual deprivation severely compromises the auditory sense of space in congenitally blind children,” Dev. Psychol., vol. 52, no. 6, p. 847, 2016. pmid:27228448
  15. Vercillo T, Milne JL, Gori M, and Goodale MA, “Enhanced auditory spatial localization in blind echolocators,” Neuropsychologia, vol. 67, pp. 35–40, 2015. pmid:25484307
  16. Gori M, Sandini G, and Burr D, “Development of visuo-auditory integration in space and time,” Front. Integr. Neurosci., vol. 6, 2012.
  17. Gori M, “Multisensory integration and calibration in children and adults with and without sensory and motor disabilities,” Multisens. Res., vol. 28, pp. 71–99, 2015. pmid:26152053
  18. Freyd JJ and Finke RA, “Representational momentum,” J. Exp. Psychol. Learn. Mem. Cogn., vol. 10, no. 1, pp. 126–132, 1984.
  19. Hubbard TL, “Representational momentum and related displacements in spatial memory: a review of the findings,” Psychon. Bull. Rev., vol. 12, no. 5, pp. 822–851, 2005. pmid:16524000
  20. Perrott DR and Musicant AD, “Minimum auditory movement angle: binaural localization of moving sound sources,” J. Acoust. Soc. Am., vol. 62, pp. 1463–1466, 1977. pmid:591679
  21. Getzmann S, “Shifting the onset of a moving sound source: a Fröhlich effect in spatial hearing,” Hear. Res., vol. 210, pp. 104–111, 2005. pmid:16213116
  22. Kerzel D, “Mental extrapolation of target position is strongest with weak motion signals and motor responses,” Vision Res., vol. 43, no. 25, pp. 2623–2635, 2003. pmid:14552804
  23. Nijhawan R, “Motion extrapolation in catching,” Nature, vol. 370, no. 6487, pp. 256–257, 1994.
  24. Desmurget M and Grafton S, “Feedback or feedforward control: end of a dichotomy,” in Taking Action: Cognitive Neuroscience Perspectives on Intentional Acts, pp. 289–338, 2003.
  25. Sperry RW, “Neural basis of the spontaneous optokinetic response produced by visual inversion,” J. Comp. Physiol. Psychol., vol. 43, pp. 482–489, 1950. pmid:14794830
  26. Wolpert DM, Ghahramani Z, and Jordan MI, “An internal model for sensorimotor integration,” Science, vol. 269, no. 5232, pp. 1880–1882, 1995. pmid:7569931
  27. Medendorp WP, Smith MA, Tweed DB, and Crawford JD, “Rotational remapping in human spatial memory during eye and head motion,” J. Neurosci., vol. 22, no. 1, p. RC196, 2002. pmid:11756525
  28. Vliegen J, Van Grootel TJ, and Van Opstal AJ, “Dynamic sound localization during rapid eye-head gaze shifts,” J. Neurosci., vol. 24, no. 42, pp. 9291–9302, 2004. pmid:15496665
  29. Schechtman E, Shrem T, and Deouell LY, “Spatial localization of auditory stimuli in human auditory cortex is based on both head-independent and head-centered coordinate systems,” J. Neurosci., vol. 32, no. 39, pp. 13501–13509, 2012. pmid:23015439
  30. Cattaneo Z, Fantino M, Tinti C, Pascual-Leone A, Silvanto J, and Vecchi T, “Spatial biases in peripersonal space in sighted and blind individuals revealed by a haptic line bisection paradigm,” J. Exp. Psychol. Hum. Percept. Perform., vol. 37, no. 4, pp. 1110–1121, 2011. pmid:21517214
  31. Jewell G and McCourt ME, “Pseudoneglect: a review and meta-analysis of performance factors in line bisection tasks,” Neuropsychologia, vol. 38, no. 1, pp. 93–110, 2000. pmid:10617294
  32. Kolarik AJ, Pardhan S, Cirstea S, and Moore BCJ, “Auditory spatial representations of the world are compressed in blind humans,” Exp. Brain Res., pp. 1–10, 2016.
  33. Corazzini LL, Tinti C, Schmidt S, Mirandola C, and Cornoldi C, “Developing spatial knowledge in the absence of vision: allocentric and egocentric representations generated by blind people when supported by auditory cues,” Psychol. Belg., vol. 50, no. 3–4, p. 327, 2010.
  34. Iachini T, Ruggiero G, and Ruotolo F, “Does blindness affect egocentric and allocentric frames of reference in small and large scale spaces?,” Behav. Brain Res., vol. 273, pp. 73–81, 2014. pmid:25078290
  35. Gori M, Del Viva M, Sandini G, and Burr DC, “Young children do not integrate visual and haptic form information,” Curr. Biol., vol. 18, no. 9, pp. 694–698, 2008. pmid:18450446
  36. Cappagli G, Cocchi E, and Gori M, “Auditory and proprioceptive spatial impairments in blind children and adults,” Dev. Sci., 2015.
  37. Lewald J, “Vertical sound localization in blind humans,” Neuropsychologia, vol. 40, no. 12, pp. 1868–1872, 2002. pmid:12207985
  38. Zwiers MP, Van Opstal AJ, and Cruysberg JRM, “A spatial hearing deficit in early-blind humans,” J. Neurosci., vol. 21, p. RC142, 2001.
  39. Thinus-Blanc C and Gaunet F, “Representation of space in blind persons: vision as a spatial sense?,” Psychol. Bull., vol. 121, no. 1, pp. 20–42, 1997. pmid:9064698