
Seeing Circles and Drawing Ellipses: When Sound Biases Reproduction of Visual Motion

  • Etienne Thoret ,

    etienne.thoret@mcgill.ca

    Affiliation Laboratoire de Mécanique et d’Acoustique, CNRS, UPR 7051, Aix Marseille Université, Centrale Marseille, Marseille, France

  • Mitsuko Aramaki,

    Affiliation Laboratoire de Mécanique et d’Acoustique, CNRS, UPR 7051, Aix Marseille Université, Centrale Marseille, Marseille, France

  • Lionel Bringoux,

    Affiliation Aix-Marseille Université, CNRS, ISM, UMR 7287, Marseille, France

  • Sølvi Ystad,

    Affiliation Laboratoire de Mécanique et d’Acoustique, CNRS, UPR 7051, Aix Marseille Université, Centrale Marseille, Marseille, France

  • Richard Kronland-Martinet

    Affiliation Laboratoire de Mécanique et d’Acoustique, CNRS, UPR 7051, Aix Marseille Université, Centrale Marseille, Marseille, France


Abstract

The perception and production of biological movements are characterized by the 1/3 power law, a relation linking the curvature and the velocity of an intended action. In particular, motions are perceived and reproduced as distorted when their kinematics deviate from this biological law. Whereas most studies dealing with this perceptual-motor relation have focused on the visual or kinaesthetic modalities in a unimodal context, in this paper we show that auditory dynamics strikingly bias visuomotor processes. Biologically consistent or inconsistent circular visual motions were used in combination with circular or elliptical auditory motions. Auditory motions were synthesized friction sounds mimicking those produced by a pen rubbing on paper when someone is drawing. Sounds were presented diotically, and the auditory motion velocity was evoked through timbre variations of the friction sound, without any spatial cues. Remarkably, when subjects were asked to reproduce circular visual motion while listening to sounds that evoked elliptical kinematics without seeing their hand, they drew elliptical shapes. Moreover, distortions induced by inconsistent elliptical kinematics in the visual and auditory modalities added up linearly. These results bring to light the substantial role of auditory dynamics in visuo-motor coupling in a multisensory context.

Introduction

It is now well established that biological motion is characterized by specific kinematic properties, for instance by the 1/3 power law, which states that the tangential velocity v_t of the motion is constrained by the local curvature C of its geometrical trajectory: v_t = K·C^(-1/3), with K a constant [1]. Regarding the visual modality, it has been shown that the human ability to track a visual motion with a visible hand is facilitated when the motion complies with the 1/3 power law [2]. By contrast, the perceived geometry of a circular visual motion can be distorted if the motion does not comply with these biological rules [3]. More recently, we showed that reproductions of circular motions displayed with incongruent elliptical kinematics were distorted when subjects could not see their hand [4]. Regarding the kinaesthetic modality, Viviani, Baud-Bovy, and Redolfi [5] have also shown that the perception of movement geometry is constrained by the covariations between the movement kinematics and its curvature. Indeed, when a circular hand movement is guided by a mechanical arm whose kinematics do not comply with the 1/3 power law, its perceived geometry is distorted into an elliptical one. Taken together, these results confirm the relation between curvature and dynamics in the emergence of the perceived and/or reproduced motion geometry.
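
For intuition, the sketch below (ours, not from the cited studies; the semi-axes and the gain K are arbitrary) computes the velocity prescribed by the 1/3 power law along an ellipse, showing that the prescribed velocity is lowest where curvature is highest.

```python
import numpy as np

# Illustrative sketch (not from the cited studies): velocity prescribed by the
# 1/3 power law, v_t = K * C^(-1/3), along an ellipse with arbitrary semi-axes.
t = np.linspace(0.0, 2.0 * np.pi, 1000)
a, b, K = 6.92, 3.0, 22.0          # hypothetical semi-axes (cm) and gain

# Local curvature of the ellipse x = a*cos(t), y = b*sin(t)
C = (a * b) / (a**2 * np.sin(t)**2 + b**2 * np.cos(t)**2) ** 1.5
v_t = K * C ** (-1.0 / 3.0)        # slowest where curvature is highest
print(v_t.min(), v_t.max())
```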

In the field of multisensory research, many studies have demonstrated the ability of the auditory modality to modify the visual perception of motion [6,7] (see [8] for a review) and even to drive our motor behavior [9] (see [10] for a review). For instance, Brooks et al. [11] showed that the visual perception of a specific kind of biological movement, represented by a point-light walker, is affected by spatial auditory motion. In line with this observation, Arrighi et al. [12] demonstrated that sounds can even enhance the visual perception of point-light walkers when displayed with synchronous auditory motions. These studies lend support to a multisensory processing framework for biological motion.

The present study investigated the influence of movement dynamics on the reproduced geometry of biological motion by combining auditory and visual stimuli evoking circular or elliptical motions in consistent or inconsistent audiovisual situations. Whereas spatial (i.e. geometry-related) cues are intrinsically conveyed by visual (and kinaesthetic) sensory inputs, we here investigated purely dynamic (i.e., kinematics-related) cues without spatial information by using monophonic sounds [13]. Indeed, recent experiments revealed that time-varying, monophonic friction sounds produced by someone drawing evoke the movement kinematics of the drawer’s pencil and enable, to a certain extent, the recognition of the drawn shape [14]. In our experiment, subjects were asked to synchronize drawing movements on a graphic tablet with visual motions displayed on a screen, without seeing their hand. It was hypothesized that sounds evoking kinematics incongruent with the visual motion would affect the geometry of the reproduced shape resulting from the coupling between a visual biological motion and a drawing movement. This would highlight the role of movement dynamics in the emergence of the geometry, independently of spatial cues.

By investigating the effect of congruent and incongruent combinations of auditory and visual kinematics on the visuo-motor coupling of biological motion, we aimed to assess the crossmodal influences of these two modalities. If, as hypothesized above, movement dynamics alone influence the geometry of the motion induced by such visuo-motor coupling, this would allow the combined roles of the auditory and visual channels to be investigated in a multisensory context.

Materials & Methods

Participants

Seventeen right-handed subjects (2 women; M = 28.5 years, SD = 8 years) participated in the experiment. All subjects provided their consent prior to the study and were naive as to the specific purpose of the experiment. At the time the experiment was designed and conducted (winter 2012–2013), no ethics approval was required from Aix-Marseille University for behavioral studies such as those reported in this manuscript. The local ethics board at Aix-Marseille University subsequently approved highly similar experiments conducted in the same institute two years after the experiment reported in this manuscript. Neither set of experiments involved deception or stressful procedures. Participants were informed that they were free to leave the experiment at any time and that their data would be treated anonymously. The research reported in this manuscript was carried out according to the principles expressed in the 1964 Declaration of Helsinki. Participants were recruited on a voluntary basis from the students and staff of the Groupement des Laboratoires de Marseille. All data were anonymized before analysis.

Stimuli

Audiovisual stimuli evoking motions were generated from specific kinematic rules applied to both visual and auditory stimuli, leading to consistent or inconsistent situations. According to the kinematic behavior of planar motion, the dynamic behavior was modeled by a system of two harmonic oscillators of period T that differ by their relative phase, noted Φ, allowing for the simulation of specific kinematic properties ranging from circular to elliptical motion [13]:

x(t) = Amp·sin(2πt/T + Φ),  y(t) = Amp·sin(2πt/T),  v_t(t) = ds(t)/dt = √(ẋ(t)² + ẏ(t)²)

where Amp is the amplitude of the motion, x and y the coordinates of the motion in the (x(t), y(t)) plane, s(t) the curvilinear abscissa along the trajectory (so that v_t is the tangential velocity), and t the time. Interestingly, this model complies with biological rules, in particular the 1/3 power law. Two configurations were used here: 1) circular kinematics, with relative phase Φ = 90°, leading to a constant velocity profile matching the kinematic behavior associated with displacement along a circle, and 2) elliptical kinematics, with Φ = 45°, leading to a time-varying velocity profile matching the kinematics associated with displacement along an ellipse of eccentricity 0.9, which corresponds to the ellipse drawn in the most natural way [15].
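
As a rough illustration of this model, the sketch below (our reconstruction; Amp, T and the relative phase follow the text, while the sampling rate is arbitrary) generates the two phase-shifted oscillators and derives the tangential velocity profile used for both visual and auditory stimuli.

```python
import numpy as np

# Minimal sketch of the two-oscillator kinematic model described above
# (our reconstruction; Amp, T and phi follow the text, fs is arbitrary).
def planar_kinematics(phi_deg, Amp=6.92, T=1.8, fs=1000.0, n_cycles=1):
    t = np.arange(0.0, n_cycles * T, 1.0 / fs)
    phi = np.radians(phi_deg)
    x = Amp * np.sin(2 * np.pi * t / T + phi)   # first oscillator
    y = Amp * np.sin(2 * np.pi * t / T)         # second oscillator, phase-shifted
    v_t = np.hypot(np.gradient(x, t), np.gradient(y, t))  # tangential velocity
    return t, x, y, v_t

# phi = 90 deg -> circle, constant velocity; phi = 45 deg -> ellipse (e ~ 0.9)
_, _, _, v_circle = planar_kinematics(90)
_, _, _, v_ellipse = planar_kinematics(45)
```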

Visual motions consisted of a white moving dot, 6 mm in diameter, on a black background. The motion was generated according to the method proposed by Viviani, Baud-Bovy, and Redolfi [5]. The dot always followed the same trajectory, a geometric circle with a radius (R) of 6.36 cm corresponding to a circumference of 40 cm, while its velocity varied according to either the circular or the elliptical kinematics described above (Fig 1, Left). These motions defined what is referred to as the Visual Kinematics (VK), characterized by the relative phase ΦV (indexed V for the visual modality). For circular kinematics, obtained with ΦV = 90° and Amp = R and corresponding to constant velocity, the kinematics were consistent with biological motion along the circular geometric trajectory traveled by the dot (Fig 1, Left, Panel A). For elliptical kinematics, obtained with ΦV = 45° and Amp = 6.92 cm and corresponding to a time-varying velocity, the dot moved faster in the vertical parts of the circle and the motion was therefore biologically inconsistent with the circular trajectory traveled by the dot (Fig 1, Left, Panel B). Each type of visual motion contained 19 complete cycles of period T = 1.8 s and lasted 34.2 s. The four audiovisual stimuli are available in S1–S4 Videos.
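
The visual stimulus therefore dissociates geometry from kinematics: the dot always travels a circle, but its position along the circle follows the arc length of the chosen kinematic model. A plausible sketch of this construction (ours, not the authors’ code; the per-cycle normalization is an assumption) is given below.

```python
import numpy as np

# Sketch of the dot animation (our reconstruction, not the authors' code):
# the trajectory is always a circle of radius R, but the dot's position along
# it follows the curvilinear abscissa of the chosen kinematics.
def dot_on_circle(phi_deg, R=6.36, T=1.8, fs=60.0, n_cycles=19):
    t = np.arange(0.0, n_cycles * T, 1.0 / fs)
    phi = np.radians(phi_deg)
    x = np.sin(2 * np.pi * t / T + phi)
    y = np.sin(2 * np.pi * t / T)
    v = np.hypot(np.gradient(x, t), np.gradient(y, t))    # velocity profile
    s = np.cumsum(v) / fs                                  # curvilinear abscissa
    theta = 2 * np.pi * s / (s[-1] / n_cycles)             # one cycle = one full turn
    return R * np.cos(theta), R * np.sin(theta)            # dot coordinates (cm)
```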

Fig 1. Visual Stimuli.

Left: Kinematics of visual stimuli (dotted line) that either comply (Panel A) or not (Panel B) with biological rules for a geometric circle (solid line). The velocity of the dot in the biologically consistent motion condition is constant and equals 22.24 cm·s⁻¹; it varies between 13.81 cm·s⁻¹ and 31.62 cm·s⁻¹ in the biologically inconsistent condition. Right: Audiovisual stimuli consisting of combinations of visual and auditory motion characterized by their relative phases Φ (indexed V and A for vision and audition respectively).

https://doi.org/10.1371/journal.pone.0154475.g001

Auditory motion was evoked by synthesized friction sounds generated according to a physically informed model [16,17], which considers that a friction sound results from successive micro-impacts produced when a sharp object (exciter) interacts with the asperities of a rough surface (resonator). From a signal point of view, such friction sounds can be simulated by bandpass-filtered white noise in which the central filter frequency is controlled by the velocity profile. A biquad bandpass filter with a constant quality factor equal to 3 was used for this purpose. The relationship between the tangential velocity v_t and the central frequency f_c of the bandpass filter was defined by f_c(t) = α·v_t(t), where α is a constant proportionality coefficient. From a perceptual point of view, this mapping mainly influences the timbre, in particular the mean spectral centroid of the resulting friction sound: the higher the value of α, the higher the mean spectral centroid (perceived brightness). The most appropriate mapping (chosen as α = 20) was based on the results of a preliminary calibration experiment. For the sake of realism of the synthetic sound, the contribution of an interacting resonant object was taken into account, and the sounds were synthesized to evoke rubbing on a wooden plate [18,19]. Based on this model, synthesized friction sounds modulated by the velocity profile corresponding either to circular or to elliptical kinematics were generated. These sounds defined what is referred to as the Auditory Kinematics (AK), characterized by the relative phase ΦA (indexed A for the auditory modality). For circular kinematics (ΦA = 90°), the generated sound contained no modulation and evoked a uniform motion, whereas for elliptical kinematics (ΦA = 45°), the generated sound contained acoustic modulations induced by the time-varying velocity profile. These modulations reflect variations of specific acoustical features, such as the spectral centroid, which has been shown to adequately evoke perceived motion [20–22].
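
A simplified sketch of this velocity-to-timbre mapping (ours; it omits the wooden-plate resonator and filters the noise block by block, which a real implementation would refine) could look as follows. It assumes the velocity profile v_t is already sampled at the audio rate.

```python
import numpy as np
from scipy.signal import iirpeak, lfilter

# Simplified sketch of the friction-sound mapping f_c(t) = alpha * v_t(t)
# (our reconstruction: no resonator model, block-wise constant-Q bandpass filtering).
def friction_sound(v_t, alpha=20.0, Q=3.0, fs=44100, block=256):
    rng = np.random.default_rng(0)
    noise = rng.standard_normal(len(v_t))
    out = np.zeros_like(noise)
    for i in range(0, len(noise), block):
        fc = float(np.clip(alpha * v_t[i], 20.0, fs / 2 - 1.0))  # centre frequency (Hz)
        b, a = iirpeak(fc, Q, fs=fs)                # 2nd-order constant-Q bandpass
        out[i:i + block] = lfilter(b, a, noise[i:i + block])  # state not carried over blocks
    return out / np.max(np.abs(out))                # normalized mono signal
```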

Hence, 12 audiovisual motion (AVM) stimuli were obtained by generating the four combinations of constant and time-varying visual and auditory kinematics (4 conditions × 3 repetitions, see Fig 1, Right). The AVM had consistent kinematics when both visual and auditory kinematics matched uniform circular kinematics (ΦV = ΦA = 90°). In this case, the AVM corresponded to biological motion. In the three other cases, the AVM had kinematics inconsistent with the geometry, as elliptical kinematics were conveyed either in the auditory modality (ΦV = 90°; ΦA = 45°), in the visual modality (ΦV = 45°; ΦA = 90°), or in both modalities (ΦV = ΦA = 45°). These three types of AVM were therefore non-biological.

Task

Subjects were placed in front of a DELL 1907FP screen (1280 × 1024 pixels; 60 Hz refresh rate) that displayed the moving dot while they listened to the auditory stimuli, presented diotically without any spatial cues through headphones (Sennheiser HD650; soundcard sample rate of 44100 Hz with 16-bit resolution), and were asked to synchronize their hand movement with the visual moving dot on a graphic tablet (Wacom Intuos 5, sample rate of 129 Hz, spatial precision of 5·10⁻³ mm; see Fig 2). No specific information was given concerning the auditory kinematics or the geometry of the trajectory to be reproduced. In particular, the subjects did not know that the visual trajectory of the moving dot was always circular and that only the kinematics of the moving dot differed. The stimuli were presented according to two different pseudo-random series that were balanced across subjects: CC CE EE EE CE EC CC CE EE CC EC CE and EC CC CE EC EC CC EE CE CE EE EE CC, where the audiovisual conditions are denoted as follows: CC (ΦV = 90°; ΦA = 90°), CE (ΦV = 90°; ΦA = 45°), EC (ΦV = 45°; ΦA = 90°), and EE (ΦV = 45°; ΦA = 45°). A training session was conducted before the experiment to familiarize subjects with the task and with the use of the graphic tablet. During the training session, which contained up to 3 trials of the actual test, the same instructions as in the real test were given. In order to evaluate the visuo-motor coupling without visual feedback on the performed movement, the subjects could not see their hand during the experiment. In order to analyze the influence of the different types of AVM on motor performance, the drawn shapes were fitted to compute their relative phases Φdrawn.

Fig 2. Experimental set-up.

The subjects had to synchronize their hand movement with the moving dot on a graphic tablet without seeing their drawing hand.

https://doi.org/10.1371/journal.pone.0154475.g002

Data Analyses

Data collected on the graphic tablet were filtered using a Savitzky-Golay filter [23] with a 43-point temporal window and a third-order polynomial to remove digital noise due to the high sampling rate. High-pass Butterworth filtering (0.2 Hz cut-off frequency) was then applied to remove the spatial drift observed in the hand movements (which arose because subjects could not see their hand while drawing). Finally, data were analyzed with respect to the relative phase Φdrawn characterizing the geometry of the reproduced shape. First, the eccentricity edrawn (i.e. a variable characterizing the flatness of the drawn shape) was estimated for the last 10 of the 19 drawn shapes [4,5], and the relative phase Φdrawn was then calculated using the following formula: Φdrawn = 2·arctan(√(1 − edrawn²)).
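
A sketch of this analysis pipeline (ours; the Butterworth order and the PCA-based eccentricity estimate are assumptions, since the paper only specifies the cut-off frequency and that the drawn shapes were fitted) is given below.

```python
import numpy as np
from scipy.signal import savgol_filter, butter, filtfilt

# Sketch of the analysis pipeline (our reconstruction; the filter order and the
# PCA-based ellipse fit are assumptions).
def drawn_relative_phase(x, y, fs=129.0):
    x = savgol_filter(x, 43, 3)                      # 43-point, 3rd-order smoothing
    y = savgol_filter(y, 43, 3)
    b, a = butter(2, 0.2 / (fs / 2.0), btype='highpass')  # 0.2 Hz drift removal
    x, y = filtfilt(b, a, x), filtfilt(b, a, y)
    # Eccentricity of the drawn shape from the principal axes of the (x, y) cloud
    lam = np.sort(np.linalg.eigvalsh(np.cov(np.vstack([x, y]))))[::-1]
    ecc = np.sqrt(1.0 - lam[1] / lam[0])             # e = sqrt(1 - (b/a)^2)
    return np.degrees(2.0 * np.arctan(np.sqrt(1.0 - ecc**2)))  # relative phase (deg)
```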

The statistical design comprised 2 Visual Kinematics (VK) × 2 Auditory Kinematics (AK). A repeated-measures ANOVA was performed with Statistica software to evaluate the effects of each experimental factor on the motor performance described by the relative phase Φdrawn. The normality and the homogeneity of the distributions were assessed with a Lilliefors test and an O’Brien test, respectively. The distortion of the reproduced shapes with respect to the circular geometry of the visual motion, in terms of flatness, was assessed by means of a one-sample two-tailed t-test comparing the relative phase Φdrawn with the reference ΦV = 90° in each of the four audiovisual conditions. The significance level was set to 0.05 for all analyses.
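
The authors used Statistica; an equivalent sketch in Python (ours; the long-format column names and the file name are hypothetical) would be:

```python
import pandas as pd
from scipy.stats import ttest_1samp
from statsmodels.stats.anova import AnovaRM

# Equivalent analysis sketch (our assumption: one row per subject x condition,
# columns 'subject', 'VK', 'AK', 'phi_drawn'; the file name is hypothetical).
df = pd.read_csv('relative_phases.csv')

# 2 (VK) x 2 (AK) repeated-measures ANOVA on the drawn relative phase
print(AnovaRM(df, depvar='phi_drawn', subject='subject', within=['VK', 'AK']).fit())

# One-sample two-tailed t-tests against the circular reference (90 degrees)
for (vk, ak), grp in df.groupby(['VK', 'AK']):
    t_val, p_val = ttest_1samp(grp['phi_drawn'], 90.0)
    print(vk, ak, round(t_val, 2), round(p_val, 4))
```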

Results

The averaged relative phases per subject are provided in S1 Dataset.

The results showed that the geometry of the drawn shapes was noticeably distorted by elliptical kinematics conveyed by both the visual and auditory modalities (see Table 1). More precisely, a main effect of the visual kinematics (F(1,16) = 147.22, p < .001) as well as a main effect of the auditory kinematics (F(1,16) = 12.83, p < .01) on the flatness, characterized by the relative phase of the reproduced shape, was revealed, meaning that circles were reproduced flattened (i.e. elliptical) when the visual kinematics (ΦV = 45°) or the auditory kinematics (ΦA = 45°) were elliptical. However, the interaction between visual and auditory kinematics was not significant (F(1,16) = 0.01, p = 0.96; Figs 3 and 4). S1–S4 Videos display the 4 averaged reproduced ellipses along with the corresponding audiovisual stimuli.

Fig 3. Motor performances.

Mean of the motor reproduction in the four audiovisual conditions.

https://doi.org/10.1371/journal.pone.0154475.g003

Fig 4. Results.

Mean relative phase and 95% confidence intervals of the reproduced motion as a function of visual and auditory kinematics, illustrating the distinct influences of both factors (i.e., no interaction).

https://doi.org/10.1371/journal.pone.0154475.g004

Figs 3 and 4 show that even in the consistent situation, when both auditory and visual kinematics were circular, the reproduced circles were distorted into a more elliptical shape (Φdrawn = 77°). Globally, elliptical visual kinematics (ΦV = 45°) induced stronger distortions than elliptical auditory kinematics (ΦA = 45°). A distortion of 11.7° with respect to the congruent situation was observed when elliptical visual kinematics was combined with circular auditory kinematics (ΦV = 45°, ΦA = 90°), while a distortion of 5.5° was observed when elliptical auditory kinematics was combined with circular visual kinematics (ΦV = 90°, ΦA = 45°). When both visual and auditory kinematics were elliptical (ΦV = 45°, ΦA = 45°), a distortion of 17° was obtained, which corresponds to the sum of the distortions observed when either the visual or the auditory kinematics alone were elliptical. Hence, the distortion induced by elliptical auditory kinematics was not influenced by the visual kinematics (5.3° for ΦV = 45° and 5.5° for ΦV = 90°), and vice versa (11.5° for ΦA = 45° and 11.7° for ΦA = 90°). Finally, the reproduced motion was distorted in all situations (two-tailed t-tests between Φdrawn and 90°: ΦV = 90°, ΦA = 90°: t(16) = -18.09, p < .001; ΦV = 90°, ΦA = 45°: t(16) = -9.89, p < .001; ΦV = 45°, ΦA = 45°: t(16) = -21.37, p < .001; ΦV = 45°, ΦA = 90°: t(16) = -17.06, p < .001).

Discussion

In this study we investigated the influence of both visual and auditory dynamics on the reproduced geometry of visual motion by asking subjects to synchronize drawing movements on a graphic tablet. The results were analyzed in terms of relative phase distortion of the reproduced shape with respect to the visual (circular) trajectory displayed on a screen in front of the subjects.

First, it was found that in all of the audiovisual conditions, the motor reproductions were significantly flatter than the actual circular motions displayed on the screen. This observation is in line with studies on movement coordination showing that when someone repetitively draws a closed shape such as a circle, he/she naturally tends to draw an elliptical shape, which may be considered an attractor of such a dynamical system [15,24,25].

The results showed that elliptical visual kinematics flattened the reproduced circles regardless of the auditory kinematics. This effect of biologically inconsistent visual kinematics on the reproduced circle underlined the links between motor output and visual motion perception, confirming the results and expectations of Viviani and colleagues [2,3,5]. Our results, obtained in a situation without visual feedback from the drawing movement, thereby showed that the motor reproduction of visual biological motion is constrained by co-variations between curvature and kinematic properties in a feedforward manner, which could not have been fully demonstrated in the visual feedback condition investigated by Viviani et al. [2].

However, it must be noted that we cannot determine whether visual perception, motor processes, or both are affected by incongruent auditory motion, as our experiment was performed in a combined multisensory and motor context. Nevertheless, we recently showed [4] that distortions induced by elliptical visual kinematics (ΦV = 45°) are larger for the visuo-motor coupling (10.17%) than the perceptual distortions observed by Viviani and colleagues in purely visual (1.01% in [3]) or kinaesthetic (<1% in [5]) presentations. As we here observed that the motor reproduction was less distorted by elliptical auditory kinematics (ΦA = 45°) than by elliptical visual kinematics (ΦV = 45°), we may expect that if distorted, the perception of a visual circular motion (ΦV = 90°) would be less affected by elliptical auditory kinematics (ΦA = 45°) than by elliptical visual kinematics (ΦV = 45°).

The key point of this study is that continuous friction sounds also substantially interfered with visual motion cues and clearly modified the geometric properties of the motor reproduction, flattening the reproduced circles in the biologically inconsistent condition regardless of the visual kinematics. In line with our hypothesis, this effect of sound on the reproduced geometry strongly attests to the central role of dynamics per se in the visuo-motor coupling. While the influence of visual kinematics involves both geometric and kinematic cues, the influence of auditory dynamics evoked by timbre variations of friction sounds allows for the investigation of the role of kinematic cues alone, and reveals that the emergence of the geometry and the associated motor output is clearly driven by motion dynamics. Note that it would not have been possible to investigate the role of auditory dynamics with rhythmic discrete sounds, e.g. marking velocity minima or maxima, as such stimuli convey only the timing of the movement and not its continuous velocity variations. Hence, while the studies by Viviani et al. [2,3,5] supporting perceptual-motor interactions were performed in a purely unimodal context, the present study shows that such interactions can also occur in a multisensory context. In line with this, Varlet et al. [26] investigated sensorimotor coordination with audiovisual motion and reported that continuous sounds affect motor synchronization. Although their study focused not on geometry but on temporal aspects of motor synchronization with audiovisual stimuli along a horizontal axis, their results highlighted the influence of motor attractors and biomechanical constraints in sensorimotor coupling with audiovisual motions.

The absence of interaction between visual and auditory kinematics in our study suggests that the visual and auditory influences combined in a perfectly linear way. When both the visual and auditory motions were inconsistent, i.e. elliptical (ΦV = 45°, ΦA = 45°), the distortion of the reproduced shape equaled the sum of the individual distortions obtained when either the visual motion (ΦV = 45°, ΦA = 90°) or the auditory motion (ΦV = 90°, ΦA = 45°) alone was biologically inconsistent. Although the literature on multisensory integration most often claims that the effects of several sensory channels add up non-linearly and that synchronous sensory inputs amplify or inhibit perceptual and behavioral effects (e.g., [27]), some behavioral studies did not support this view and revealed additive effects [28–31]. Here, we provide a clear example of linear integration between visual and auditory motion-related cues. Beyond the additivity of the effects observed here, it is noticeable that the distortion induced by biologically inconsistent kinematics in the visual domain was stronger (about 12°) than the distortion induced by inconsistent auditory kinematics (about 5°). This suggests that the effect of dynamics per se is about half the combined effect of spatial and dynamic cues.

Taken together, these results can be interpreted in line with the Theory of Event Coding [32], which stresses the existence of unified percepts combining the sounds produced by objects and modulated by actions with their perceptual properties. Thoret et al. [13] suggested the existence of such a unified percept of biological movements from a purely auditory point of view, i.e. linking the properties of drawing movements to their evocation through timbre variations of friction sounds. Danna et al. [33] confirmed such binding between friction sounds and drawing movements by showing that the quality of handwriting can be accurately judged through friction sounds evoking the movement velocity. Such unified percepts linking sounds and actions have also been shown for footstep sounds [34]. Our results are clearly in line with the existence of this kind of unified percept in a multisensory context involving audio-visuo-motor coupling. In particular, the motor production appears to be driven by the linear integration of both geometric and dynamic cues in the visual modality and solely by dynamic cues evoked through timbre variations in the auditory modality.

Finally, these findings may help in designing new devices using sounds to enhance or substitute visual and, more generally, sensory information. In particular, training activities involving continuous auditory stimuli when treating diseases that affect motor function, such as dysgraphia [35] and Parkinson’s disease [36,37], or when guiding movements in sport [38], are believed to constitute powerful alternatives to existing methods. In this context, we suggest that conveying dynamic information through specific sound properties can substantially modify motor performance.

Conclusions

The experiment presented in this study provides evidence of the central role of dynamics in the reproduction of the geometry of a visual motion in an audiovisual context. We showed that both biologically inconsistent visual motion and inconsistent auditory motion flatten the reproduced geometry of circular visual motions. Interestingly, the combined effects of visual and auditory inputs observed here added up linearly. Nevertheless, in order to better understand these effects and how the two modalities are combined to plan motor actions, it might be interesting to manipulate the instructions given to the subjects. This could be done by asking subjects either to focus on the sound rather than on the visual motion, or to attend equally to the visual and auditory information. The experimental data could also be analyzed in terms of sensorimotor coordination, as in the study by Varlet et al. [26]. The relative phase, considered in our study as a geometrical descriptor characterizing the eccentricity of the ellipse, could then be used to reveal the temporal coordination of the movements in terms of negative versus non-negative lag and stability, which might also be affected by incoherent audiovisual motions. Moreover, purely perceptual experiments investigating whether incongruent audiovisual motions affect the visual perception of the geometry would help reveal the separate roles of perceptual and motor processes in such a task.

In a more general perspective, it might be of interest to evaluate whether the effects observed here are specific to biological motions or whether the motor reproduction of any kind of audiovisual motion might be affected by incongruent visual and auditory kinematics, for instance by considering physical motions such as those constrained by Newton’s laws [39].

Supporting Information

S1 Video. Audiovisual stimulus (ΦV = 90°; ΦA = 90°) and the corresponding averaged reproduced movement.

The original visual stimulus was downsampled to 30 Hz and the audio was encoded in AAC format owing to technical constraints.

https://doi.org/10.1371/journal.pone.0154475.s001

(MP4)

S2 Video. Audiovisual stimulus (ΦV = 90°; ΦA = 45°) and the corresponding averaged reproduced movement.

The original visual stimulus was downsampled to 30 Hz and the audio was encoded in AAC format owing to technical constraints.

https://doi.org/10.1371/journal.pone.0154475.s002

(MP4)

S3 Video. Audiovisual stimulus (ΦV = 45°; ΦA = 90°) and the corresponding averaged reproduced movement.

The original visual stimulus was downsampled to 30 Hz and the audio was encoded in AAC format owing to technical constraints.

https://doi.org/10.1371/journal.pone.0154475.s003

(MP4)

S4 Video. Audiovisual stimulus (ΦV = 45°; ΦA = 45°) and the corresponding averaged reproduced movement.

The original visual stimulus was downsampled to 30 Hz and the audio was encoded in AAC format owing to technical constraints.

https://doi.org/10.1371/journal.pone.0154475.s004

(MP4)

S1 Dataset. Averaged relative phases by subjects.

https://doi.org/10.1371/journal.pone.0154475.s005

(TXT)

Acknowledgments

The authors are thankful to Meghan Goodchild for revising the English of the manuscript.

Author Contributions

Conceived and designed the experiments: ET MA LB SY RKM. Performed the experiments: ET. Analyzed the data: ET MA LB SY RKM. Contributed reagents/materials/analysis tools: ET MA LB SY RKM. Wrote the paper: ET MA LB SY RKM.

References

  1. Lacquaniti F, Terzuolo C, Viviani P (1983) The law relating the kinematic and figural aspects of drawing movements. Acta Psychologica 54: 115–130. pmid:6666647
  2. Viviani P, Campadelli P, Mounoud P (1987) Visuo-manual pursuit tracking of human two-dimensional movements. Journal of Experimental Psychology: Human Perception and Performance 13: 62–78.
  3. Viviani P, Stucchi N (1989) The effect of movement velocity on form perception: Geometric illusions in dynamic displays. Perception & Psychophysics 46: 266–274.
  4. Thoret E, Aramaki M, Bringoux L, Ystad S, Kronland-Martinet R (2016) When eyes drive hand: Influence of non-biological motion on visuo-motor coupling. Neuroscience Letters 612: 225–230. pmid:26708633
  5. Viviani P, Baud-Bovy G, Redolfi M (1997) Perceiving and tracking kinesthetic stimuli: Further evidence of motor-perceptual interactions. Journal of Experimental Psychology: Human Perception and Performance 23: 1232–1252. pmid:9269735
  6. Kitagawa N, Ichihara S (2002) Hearing visual motion in depth. Nature 416: 172–174. pmid:11894093
  7. Sekuler R, Sekuler A, Lau R (1997) Sound alters visual motion perception. Nature 385: 308. pmid:9002513
  8. Soto-Faraco S, Kingstone A, Spence C (2003) Multisensory contributions to the perception of motion. Neuropsychologia 41: 1847–1862. pmid:14527547
  9. Repp B, Penel A (2003) Rhythmic movement is attracted more strongly to auditory than to visual rhythms. Psychological Research 68.
  10. Repp B, Su Y (2013) Sensorimotor synchronization: A review of recent research (2006–2012). Psychonomic Bulletin & Review 20: 403–452.
  11. Brooks A, van der Zwan R, Billard A, Petreska B, Clarke S, Blanke O (2007) Auditory motion affects visual biological motion processing. Neuropsychologia 45: 523–530. pmid:16504220
  12. Arrighi R, Marini F, Burr D (2009) Meaningful auditory information enhances perception of visual biological motion. Journal of Vision 9: 25.
  13. Thoret E, Aramaki M, Kronland-Martinet R, Velay JL, Ystad S (2014) From sound to shape: Auditory perception of drawing movements. Journal of Experimental Psychology: Human Perception and Performance 40: 983–994. pmid:24446717
  14. Hollerbach J (1981) An oscillation theory of handwriting. Biol Cybern 39: 139–156.
  15. Dounskaia N, Van Gemmert A, Stelmach G (2000) Interjoint coordination during handwriting-like movements. Experimental Brain Research 135: 127–140. pmid:11104134
  16. Gaver WW (1993) How Do We Hear in the World? Explorations in Ecological Acoustics. Ecological Psychology 5: 285–313.
  17. van den Doel K, Kry P, Pai D (2001) FoleyAutomatic: Physically-Based Sound Effects for Interactive Simulation and Animation. Proceedings of the 28th annual conference on Computer graphics and interactive techniques—SIGGRAPH '01. https://doi.org/10.1145/383259.383322
  18. Conan S, Thoret E, Aramaki M, Derrien O, Gondre C, Ystad S, et al. (2014) An Intuitive Synthesizer of Continuous-Interaction Sounds: Rubbing, Scratching, and Rolling. Computer Music Journal 38: 24–37.
  19. Aramaki M, Besson M, Kronland-Martinet R, Ystad S (2011) Controlling the Perceived Material in an Impact Sound Synthesizer. IEEE Transactions on Audio, Speech, and Language Processing 19: 301–314.
  20. Chowning J (1977) The Simulation of Moving Sound Sources. Computer Music Journal 1: 48–52.
  21. Kronland-Martinet R, Voinier T (2008) Real-Time Perceptual Simulation of Moving Sources: Application to the Leslie Cabinet and 3D Sound Immersion. EURASIP Journal on Audio, Speech, and Music Processing 2008: 849696.
  22. Merer A, Aramaki M, Ystad S, Kronland-Martinet R (2013) Perceptual characterization of motion evoked by sounds for synthesis control purposes. ACM Transactions on Applied Perception 10: 1–24.
  23. Savitzky A, Golay M (1964) Smoothing and Differentiation of Data by Simplified Least Squares Procedures. Analytical Chemistry 36: 1627–1639.
  24. Athènes S, Sallagoïty I, Zanone P, Albaret J (2004) Evaluating the coordination dynamics of handwriting. Human Movement Science 23: 621–641. pmid:15589625
  25. Danna J, Athènes S, Zanone P (2011) Coordination dynamics of elliptic shape drawing: Effects of orientation and eccentricity. Human Movement Science 30: 698–710. pmid:21524807
  26. Varlet M, Marin L, Issartel J, Schmidt R, Bardy B (2012) Continuity of Visual and Auditory Rhythms Influences Sensorimotor Coordination. PLoS One 7: e44082. pmid:23028488
  27. McGarry L, Russo F, Schalles M, Pineda J (2012) Audio-visual facilitation of the mu rhythm. Exp Brain Res 218: 527–538. pmid:22427133
  28. Angelaki D, Gu Y, DeAngelis G (2009) Multisensory integration: psychophysics, neurophysiology, and computation. Current Opinion in Neurobiology 19: 452–458. pmid:19616425
  29. Campos J, Byrne P, Sun H (2010) The brain weights body-based cues higher than vision when estimating walked distances. European Journal of Neuroscience 31: 1889–1898. pmid:20584194
  30. Campos J, Butler J, Bülthoff H (2012) Multisensory integration in the estimation of walked distances. Exp Brain Res 218: 551–565. pmid:22411581
  31. Campos J, Butler J, Bülthoff H (2014) Contributions of visual and proprioceptive information to travelled distance estimation during changing sensory congruencies. Exp Brain Res 232: 3277–3289. pmid:24961739
  32. Hommel B, Müsseler J, Aschersleben G, Prinz W (2001) The Theory of Event Coding (TEC): A framework for perception and action planning. Behavioral and Brain Sciences 24: 849–878. pmid:12239891
  33. Danna J, Paz-Villagrán V, Gondre C, Aramaki M, Kronland-Martinet R, Ystad S, et al. (2015) “Let Me Hear Your Handwriting!” Evaluating the Movement Fluency from Its Sonification. PLoS One 10: e0128388. pmid:26083384
  34. Young W, Rodger M, Craig C (2013) Perceiving and reenacting spatiotemporal characteristics of walking sounds. Journal of Experimental Psychology: Human Perception and Performance 39: 464–476. pmid:22866760
  35. Danna J, Fontaine M, Paz-Villagrán V, Gondre C, Thoret E, Aramaki M, et al. (2015) The effect of real-time auditory feedback on learning new characters. Human Movement Science 43: 216–228. pmid:25533208
  36. Rodger M, Young W, Craig C (2014) Synthesis of Walking Sounds for Alleviating Gait Disturbances in Parkinson's Disease. IEEE Trans Neural Syst Rehabil Eng 22: 543–548. pmid:24235275
  37. Young W, Rodger M, Craig C (2014) Auditory observation of stepping actions can cue both spatial and temporal components of gait in Parkinson’s disease patients. Neuropsychologia 57: 140–153. pmid:24680722
  38. Craig C, Delay D, Grealy M, Lee D (2000) Guiding the swing in golf putting. Nature 405: 295–296. pmid:10830947
  39. La Scaleia B, Zago M, Moscatelli A, Lacquaniti F, Viviani P (2014) Implied dynamics biases the visual perception of velocity. PLoS One 9(3): e93020. pmid:24667578