
Auditory rhythm complexity affects cardiac dynamics in perception and synchronization


Accurate perception and production of auditory rhythms are key for human behaviors such as speech and music. Auditory rhythms in music range in their complexity: complex rhythms (based on non-integer ratios between successive tone durations) are more difficult to perceive and produce than simple rhythms (based on integer ratios). The physiological activity supporting this behavioral difference is not well understood. In a within-subjects design, we addressed how rhythm complexity affects cardiac dynamics during auditory perception and production. Musically trained adults listened to and synchronized with simple and complex auditory rhythms while their cardiac activity was recorded. Participants identified missing tones in the rhythms during the Perception condition and tapped on a keyboard to synchronize with the rhythms in the Synchronization condition. Participants were equally accurate at identifying missing tones in simple and complex rhythms during the Perception condition. Tapping synchronization was less accurate and less precise with complex rhythms than with simple rhythms. Linear cardiac analyses showed a slower mean heart rate and greater heart rate variability during perception than synchronization for both simple and complex rhythms; only nonlinear recurrence quantification analyses reflected cardiac differences between simple and complex auditory rhythms. Nonlinear cardiac dynamics were also more deterministic (predictable) during rhythm perception than synchronization. Individual differences during tapping showed that greater heart rate variability was correlated with poorer synchronization. Overall, these findings suggest that linear measures of musicians’ cardiac activity reflect global task variability while nonlinear measures additionally reflect stimulus rhythm complexity.


Listeners are adept at hearing and reproducing a wide range of auditory rhythms, such as those contained in dance steps or in musical melodies. Auditory rhythms are composed of sequences of regularly spaced events over time [1]. Some rhythms (those defined by complex ratios between the sequential tone durations) are less common and more difficult to perceive and produce [2–5]. Behavioral differences between simple and complex rhythms and the physiological activities that support them are not well understood. Auditory rhythms such as musical rhythms are known to influence cardiac activity; previous research has largely focused on how acoustic features of musical rhythms, such as tempo or pitch, affect cardiac activity during perception [6]. A few studies have shown changes in the periodicities present in cardiac rhythms during perception [7] and production [8] of different auditory rhythms. Yet little research has addressed the specific relationship between auditory rhythm complexity and physiological activity. The current study investigates how auditory rhythm complexity affects behavior and cardiac dynamics during perception and production, using a synchronization paradigm.

Changes in heart rate mean and variability during cognitive-motor tasks are often attributed to changes in physiological arousal and task difficulty [9–11]. Slower heart rate and greater heart rate variability tend to correspond to lower arousal states [12]. Both musical acoustic features and musical task difficulty can influence cardiac rhythms. In perception-only tasks, some studies show that fast-tempo musical rhythms tend to increase listeners’ mean heart rate and decrease their heart rate variability, indicative of an arousal effect [6, 13–16, cf. 17]. Other studies show that faster-tempo auditory rhythms tend to elicit faster heart rates in listeners without changes in heart rate variability [16, 18–20]. Slow-tempo music has been shown to slow down listeners’ heart rates and increase heart rate variability [21, 22]. Task factors in music performance also influence physiological arousal: Performing music in front of an audience compared to performing alone increased a professional musician’s heart rate and decreased heart rate variability [23]. Musical task familiarity can influence cardiac rhythms: Pianists’ (≥ 6 years private piano instruction) heart rates increased during production of unfamiliar (more challenging) musical melodies compared with familiar melodies and a silent baseline condition [8]. Together, these studies suggest that auditory rhythms such as those found in music modulate listeners’ arousal levels, with task difficulty potentially contributing to this effect during rhythm production. Thus, we expected that changes in task difficulty based on auditory rhythm complexity might impact heart rate and heart rate variability during perception and synchronization, such that simple rhythms result in slower heart rate and greater heart rate variability than complex rhythms.

Nonlinear dynamical systems theory provides an explanation for the effects of rhythm complexity in music perception and production. Oscillators (signals with recurring cyclic patterns with a natural or default frequency) are central in this theory: Rhythm complexity depends on the amount of coupling or interaction between two oscillators, which is determined in part by the ratio of one oscillator frequency to the other oscillator frequency. Oscillators whose natural (default) frequencies form small integer ratio relationships, such as 1:2, show stronger coupling, compared to oscillators whose natural frequencies form non-integer ratios, such as 3:2 [24–26]. In this study, we compare Simple rhythms whose tone durations form small integer ratios (1:2) with Complex rhythms whose tone durations form non-integer ratios (3:2). Neural resonance theory extends this nonlinear dynamical systems perspective specifically to auditory rhythm perception, holding that rhythm perception is underpinned by internal neural oscillations that couple with external auditory rhythms; when neural oscillations and auditory rhythms form simple integer ratios, coupling is most likely to occur [24, 25]. The duration ratio distinction of simple and complex rhythms converges with auditory rhythm structures that are most frequently observed across musical cultures [27], and recent evidence suggests that small integer ratios primarily characterize the structure of certain non-human animal vocalizations [28, 29].

Production of auditory rhythms by humans supports the dynamical systems distinction between Simple and Complex rhythms. Individuals reproduced auditory rhythm sequences better when the successive tone intervals formed integer ratios such as 2:4 compared with 2:3 [2]. When asked to tap one hand in synchrony with non-integer ratio auditory rhythms, individuals with a range of musical experience tended to distort their taps toward simple integer ratios [4, 5, 30]. A similar effect has been found when individuals tap both hands with Simple and Complex polyrhythms (rhythms that are composed of more than one tone sequence) [31]. In an iterated learning paradigm in which individuals initially heard and reproduced random sequences of tones, individuals produced sequences that converged toward small integer ratio relationships between tones [32]. There is also some cross-cultural evidence for a bias toward producing simple auditory rhythms [33], although cultural familiarity with different musical rhythms may modulate these biases in both rhythm production [34] and rhythm perception [35].

A few studies directly assess Simple and Complex auditory rhythms in perception and production tasks. Repp et al. [36] measured musicians (graduate students in music performance) as they listened to and tapped in synchrony with simple and complex rhythms that contained temporal perturbations. They found evidence of better perception and production for small-integer ratio rhythms (such as 1:1, 1:2 ratios) than for complex rhythms (such as 4:5, 7:11, 5:13), suggesting similar constraints on rhythm perception and production. However, there was no clear advantage across perception and production for all of the small integer ratios used in the experiment [36]. As most studies address perception or production tasks but not both, it is unknown whether the tendency toward Simple over Complex rhythms is stronger in perception or production.

Neurophysiological studies have investigated how rhythm complexity affects oscillatory neural activity. Mathias et al. [37] directly compared neural oscillations with EEG while musicians (≥ 6 years private musical instruction) perceived and produced auditory stimuli containing simple (1:1), moderate (1:2), and complex (3:2) rhythms. During perceptual trials, participants detected omitted tones in the auditory rhythms; during production trials, participants tapped the second part of the ratio of each auditory rhythm to produce the full rhythm. Detection of missing tones was most accurate for the 1:1 rhythm, with no difference observed between the 1:2 and 3:2 rhythms. Tapping was more accurate and precise for the 1:1 and 1:2 rhythms compared to the 3:2 rhythms. Power spectral density measures (EEG) were greatest at frequencies that corresponded to the tapping frequency in the simple (1:1) rhythm compared to the moderate (1:2) and complex (3:2) rhythms [37]. These findings suggest that neurophysiological entrainment is stronger for simple rhythms than complex rhythms, particularly during production (synchronization) tasks compared to perceptual tasks.

One of the goals of the current study is to compare cardiac rhythms during auditory rhythm perception and production tasks in a synchronization paradigm. Previous studies of cardiac activity during rhythm perception and production tend to employ linear measures of cardiac activity, such as mean heart rate or R-R interval and heart rate variability. For example, one study reported no difference in linear measures of listeners’ heart rate variability during music perception of duple metre rhythms (a march) compared to triple metre rhythms (a waltz) [7]. Linear measures such as heart rate variability assume that the cardiac signal is stationary, meaning the mean and variance of the time series are relatively stable over time [38]. In reality, stationarity in cardiac activity is often not the case [39]; behavioral tasks are likely to have different effects on cardiac activity over time. Thus, it is possible that nonlinear cardiac measures, which address relationships within a single time series, may be more sensitive to rhythmic changes that take place within musical stimuli. Indeed, previous research has successfully applied the nonlinear analysis technique of multiscale sample entropy, a measure of the predictability of a signal in time, to performing musicians’ cardiac dynamics. Professional musicians showed more predictable, less complex cardiac dynamics during higher-stress music performances [23, 40] and during jazz music performance compared to classical music performance [41].

Recurrence quantification analysis (RQA) is a nonlinear analysis alternative that does not make assumptions about stationarity of the signal [42]. In RQA, a time-delayed copy of the cardiac time series is produced to reconstruct the signal in a multi-dimensional phase space [43, 44]. Points in the phase space are then assessed for their closeness; points that are sufficiently close to one another in the phase space are deemed to be recurrent [45]. These recurrent points are plotted in a 2-dimensional recurrence plot and quantified according to different metrics to describe the behavior of the system. RQA has been applied to identify disrupted cardiac dynamics in clinical populations such as syncope (fainting) [46, 47], sleep apnea [48], and ventricular tachyarrhythmia [49] as well as changes in cardiac rhythms that occur during behavioral tasks. Konvalinka et al. [50] reported increased predictability and stability in cardiac dynamics of firewalkers during a ritual fire walk compared to spectators’ cardiac dynamics. Wright and Palmer [8] reported increased predictability in pianists’ (≥ 6 years private piano instruction) cardiac dynamics when they performed simple melodies compared to a silent baseline; furthermore, pianists’ cardiac dynamics were most predictable when they performed novel (unfamiliar) melodies. It is unknown, however, how much cardiac rhythms change during perception or production of auditory rhythms and whether auditory rhythm complexity affects cardiac dynamics.

The current study investigated how auditory rhythm complexity affects cardiac dynamics during auditory perception and synchronization tasks. The first aim was to determine whether rhythmic complexity, measured by the ratio of tone durations in two simultaneously presented rhythms, influences behavior (perception and synchronization) and cardiac dynamics. The second aim was to compare how listening to auditory rhythms versus actively synchronizing a voice or part with those rhythms affects behavior and cardiac dynamics. Trained musicians (adults with ≥ 6 years private instruction on a musical instrument), who have had experience synchronizing their movements to rhythmic sound, listened to and tapped in synchrony with simple and complex auditory rhythms. Based on previous studies, our initial behavioral hypothesis was that tapping performance should be worse for the Complex rhythm (3:2 duration ratios) than for the Simple rhythm (1:2 duration ratios) [4, 5, 30, 37]. Also based on previous findings [8], our initial cardiac hypotheses were that greater recurrence and predictability of cardiac dynamics should be obtained during the synchronization task than during the perception task, and greater predictability should be obtained during the Complex rhythm compared to the Simple rhythm (across perception and synchronization tasks). A final aim was to test the hypothesis that individuals’ behavioral measures of auditory rhythm complexity would correspond to their cardiac measures, which we tested with simple correlations.



Twenty-five musically trained adults participated in the study (mean age = 22.24 years, sd = 3.78, Nfemale = 20). Participants had an average of 13.5 years of private musical instruction (range = 6–18 years; sd = 3.88 years). Participants were screened by email for eligibility; eligible participants had to be between 18 and 35 years of age and have at least 6 years of private musical instruction. Exclusion criteria included a history of cardiovascular, respiratory, neurological, or psychiatric disorders, a diagnosed hearing impairment, or having taken a transcontinental flight within 3 weeks of participating in the study. Participants were screened for normal hearing for the range of tone frequencies used in the auditory stimuli (< 30 dB HL threshold for 125–750 Hz frequencies), as determined by an audiometry screening at the beginning of the experiment. Participants were recruited via social media and flyer postings between May and October 2022; 10 participants were recruited for course credit in the Psychology Department at McGill University and 15 participants were recruited from the general adult population in Montreal. Individual participant identifiers available at the time of testing were removed following their participation. The study took place at McGill University and the protocol was reviewed by the Research Ethics Board (Ethics protocol #197–1018).

Stimulus materials and equipment

The auditory rhythms in all conditions were composed of two isochronous sequences of tones that differed in pitch and duration and formed specific temporal ratios with one another (Fig 1), similar to Mathias et al. [37]. The high-pitched tones were presented as 660 Hz sine tones and the low-pitched tones were presented as 392 Hz woodblock tones produced by a Roland Sound Canvas (SC-55) tone generator (timbre = 116). The high-pitched and low-pitched tones in the Simple rhythm formed a 1:2 ratio such that the high-pitched tone (intertone interval = 1090 ms) occurred once for every two low-pitched tones (intertone interval = 545 ms). The high-pitched and low-pitched tones in the Complex rhythm formed a 3:2 ratio such that the high-pitched tone (intertone interval = 363.33 ms) occurred three times for every two low-pitched tones (intertone interval = 545 ms). Thus, the low-pitched tone interonset interval (545 ms) was constant across both the Simple and the Complex rhythms; only the high-pitched tone interval changed across the rhythms. Each auditory rhythm was sounded for 1 minute and began with four woodblock tones (intertone interval = 545 ms). Each 1-minute auditory rhythm was repeated five times with a silent pause of 10 seconds between each repetition. Pilot testing with longer trials (five-minute duration) generated a substantial number of tapping errors as individuals stopped and restarted, which created missing data in the time series analysis. Five one-minute trials with short (10 s) gaps in between were thus chosen to balance the length of the physiological time series with the accuracy of behavioural performance.
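The timing relation between the two tone streams follows directly from the fixed low-pitched interval and the rhythm ratio. A minimal sketch (the function name is our own, not from the study) that recovers the high-pitched intertone intervals reported above:

```python
def high_pitch_interval(low_iti_ms, n_high, n_low):
    """Intertone interval (ms) for the high-pitched stream when n_high
    high tones span the same duration as n_low low tones (n_high:n_low
    ratio), given the fixed low-pitched intertone interval."""
    cycle_ms = low_iti_ms * n_low  # duration of one full rhythmic cycle
    return cycle_ms / n_high

print(high_pitch_interval(545, 1, 2))            # Simple (1:2): 1090.0 ms
print(round(high_pitch_interval(545, 3, 2), 2))  # Complex (3:2): 363.33 ms
```

This makes explicit why only the high-pitched interval differs between conditions: the two-low-tone cycle (1090 ms) is held constant and simply subdivided differently.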

Fig 1. Simple and complex auditory rhythms used in the perception and synchronization tasks.

Black and grey tones were sounded in the Perception condition. Black indicates sounded tones and grey indicates participants’ instructed taps that produced tones in the Synchronization task. Circles indicate stimulus beats on which asynchronies between sounded tones and produced taps were computed.

Auditory stimuli in the Perception condition contained one omitted tone in either the high-pitched or low-pitched tone stream on two of the five trials for each rhythm (Simple and Complex) that participants were asked to detect. Tone omissions were placed at either the middle or the end of the rhythm sequence; trials with tone omissions were randomized across rhythm conditions and participants. Auditory stimuli in the Synchronization condition contained only the high-pitched tone; each auditory rhythm began with four woodblock tones (presented with the same timbre as in the Perception condition) that served as the participants’ tempo cue for tapping the low-pitched tone part (set to 545 ms). Thus, the participants’ tapping rate in the Synchronization task was constant across all trials of the Simple and Complex rhythms.

Sound was delivered through AKG K271 Studio headphones and participants tapped their finger on a Roland RD-700 electronic piano keyboard. Auditory stimuli and participants’ auditory feedback were generated on the Roland Sound Canvas (SC-55) tone generator using MIDI (musical instrument digital interface) with 1-ms temporal resolution. Participant key taps were recorded in FTAP v.2.1.07b [51] on a Dell T3600 PC running Linux (Fedora 16). Cardiac activity was recorded with a Polar H10 chest strap heart rate monitor with 1-ms temporal resolution connected via Bluetooth to the application EliteHRV running on an iPad Mini. Questionnaires included a musical background questionnaire and a short questionnaire about participants’ physical activity in the hour prior to the experimental session.


Behavioral performance and cardiac activity were measured for all participants in all auditory rhythm conditions, making this a within-subjects Task (Perception and Synchronization) by Rhythm (Simple vs Complex) design. In addition, each participant had a 5-minute silent baseline measure of cardiac activity collected. The order of conditions in the experiment was kept constant across participants, such that all participants first completed the physiological baseline measurement followed by auditory rhythm perception, then synchronization. The Synchronization condition followed the Perception condition to avoid unintentional motor imagery or motor planning during the perceptual task. The Simple rhythm condition always preceded the Complex rhythm condition so that participants began with the easier task.

Dependent variables for the Perception task were hit rate (% correct detection) and false alarm rate (% incorrect detection) for the omitted tones. The dependent variables for the Synchronization task were the mean intertap interval (ITI) as well as the tapping accuracy (participant tap onset time–stimulus tone onset time) for participant taps that coincided with stimulus tones, indicated with circles in Fig 1. Tapping precision was measured by the standard deviation of the mean signed asynchronies. Dependent variables for the cardiac signals included mean R-R intervals (normal heart beat-to-heart beat intervals, ms) and the root mean square of successive R-R interval differences (RMSSD, ms), a measure of heart rate variability [52]. Nonlinear cardiac measures included Recurrence Rate (% Rec) and Determinism (% Det), described below (Data Analysis).


Participants were invited to the lab to complete the 1-hour in-person testing session between 09h and 17h. Upon arrival at the lab, participants provided written informed consent by reading and signing a consent form and then completed an audiometric screening in which pure tones were presented over sound-attenuating headphones (Maico MA40). Only participants who reported hearing the range of tones used in the experiment (125–750 Hz) at an average threshold of < 30 dB continued in the experiment.

Next, a 5-minute baseline cardiac recording was conducted. Participants attached the heart rate monitor around their chest and sat in a comfortable chair with their legs uncrossed. During the baseline recording, participants completed written questionnaires with minimal body movement. Participants then completed the Perception task. They were told that each rhythm they heard would have a high-pitched part and a low-pitched part and were asked to identify whether a trial had a missing tone by circling “yes” or “no” on a sheet of paper at the end of each trial. Participants listened to an example of a rhythm with and without a missing tone in initial practice trials for each rhythm. Participants completed five 1-minute Perception trials of the Simple (1:2) rhythm followed by five 1-minute perception trials of the Complex (3:2) rhythm.

Participants then completed the Synchronization task. They were asked to tap the low-pitched part of the rhythm on a single key of the piano keyboard using their dominant hand, so that their taps synchronized with the high-pitched part of the rhythm to form the intended rhythm ratio in each condition. Participants heard the high-pitched part of the rhythm as well as their own taps (low-pitch part of the rhythm) during the Synchronization condition. Participants received up to three practice trials tapping the rhythm before completing five 1-minute Synchronization trials for each rhythm condition. At the end of the Synchronization task, participants removed the heart rate monitor and were debriefed. The experimental session took approximately 1 hour to complete.

Data analysis

Behavioral data.

Each participant’s tapping data formed a time series of intertap intervals. Tap ITIs were examined for double taps (< 75 ms between two successive taps) and, when present (approximately 1.4% of all taps), the second tap was removed. The first 4 taps (indicating initial synchronization) and the last 4 taps were then removed from each trial, similar to Mathias et al. [37]. Intertap intervals greater than 3 standard deviations from the mean ITI were removed (Simple rhythm = 0.67% of all intervals; Complex rhythm = 0.65% of all intervals) and the mean ITI and CV were then calculated for each trial using Matlab (version 9.8.0, 2020). The mean absolute asynchrony, mean signed asynchrony, and standard deviation of mean signed asynchrony were calculated in each trial for each tap that aligned with a stimulus event, as shown by the circles in Fig 1. Asynchrony values were generated by subtracting the stimulus tone onset time from the participant tap onset time [53]. Smaller absolute asynchrony values indicate more accurate tapping; negative mean signed asynchrony values indicate participants’ taps anticipated the stimulus tone. A negative mean asynchrony between stimulus and response is commonly observed in auditory-motor synchronization tasks [54] such as the one used in the current study, and the measure captures phase synchronization between a tap and an auditory stimulus. A smaller standard deviation of the mean signed asynchronies indicates more precise tapping.
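The asynchrony measures described above can be sketched as follows. This is an illustration only (assuming taps have already been paired with their target stimulus tones; the study computed these measures in Matlab):

```python
import statistics

def asynchrony_measures(tap_onsets_ms, tone_onsets_ms):
    """Signed asynchrony = tap onset - stimulus tone onset, so negative
    values mean the tap anticipated the tone. Returns the mean signed
    asynchrony, mean absolute asynchrony (accuracy), and standard
    deviation of signed asynchronies (precision)."""
    signed = [tap - tone for tap, tone in zip(tap_onsets_ms, tone_onsets_ms)]
    return (statistics.mean(signed),
            statistics.mean(abs(a) for a in signed),
            statistics.stdev(signed))

# Hypothetical taps at -5, -2, and +2 ms relative to the stimulus tones:
mean_signed, mean_abs, sd_signed = asynchrony_measures(
    [1085, 2178, 3272], [1090, 2180, 3270])
```

In this toy example the mean signed asynchrony is negative (about -1.67 ms), illustrating the anticipatory tapping commonly observed in auditory-motor synchronization [54].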

Cardiac data.

Cardiac data were processed in Kubios HRV Premium (version 3.5.0). R-R interval series were generated for each trial, and mean R-R interval and RMSSD were calculated in Kubios. The RMSSD is primarily a measure of vagally-mediated beat-to-beat heart rate variability [55]. It was deemed an appropriate measure of heart rate variability as there is some evidence that RMSSD measures at rest are robust to differences in respiration rate [56, 57]. RMSSD has also been validated on 60-second durations taken from and compared with five-minute cardiac recordings [58], the same trial durations used in the current study.
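For reference, RMSSD follows directly from its definition on an R-R interval series. A minimal sketch (Kubios computes this internally; values here are hypothetical):

```python
import math

def rmssd(rr_ms):
    """Root mean square of successive R-R interval differences (ms),
    the beat-to-beat heart rate variability measure used in the study."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rmssd([800, 810, 790, 805])  # successive differences: +10, -20, +15 ms
```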

The R-R values were then converted into beats per minute (BPM) that resulted in values every 300 ms (i.e., a sampling rate faster than the maximal heart rate of all participants), following Wallot et al. [59]. Each R-R value in the time series (indicating onset times) was replaced with a string of BPM values. The BPM values were then averaged using a non-overlapping moving window of 300 ms, and the resultant time series served as input to the recurrence quantification analysis (RQA). The result of this transformation is an upsampling of the time series that creates a time series of 180 samples per trial, consistent across trials and participants.
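A sketch of this conversion, under one plausible reading of the procedure (expanding each R-R interval onto a 1-ms grid of instantaneous BPM values before the 300-ms windowed averaging; the exact grid resolution is our assumption, not stated in the text):

```python
def rr_to_bpm_series(rr_ms, window_ms=300):
    """Replace each R-R interval with a run of instantaneous BPM values
    (60000 / RR, one value per ms of the interval), then average in
    non-overlapping windows of window_ms to get an evenly sampled series."""
    grid = []
    for rr in rr_ms:
        grid.extend([60000.0 / rr] * int(round(rr)))  # one BPM value per ms
    return [sum(grid[i:i + window_ms]) / window_ms
            for i in range(0, len(grid) - window_ms + 1, window_ms)]

# Two 600-ms beats (100 BPM) yield four 300-ms samples of 100 BPM:
rr_to_bpm_series([600, 600])
```

The key property, as in the study, is that the output is evenly sampled in time, so a 54-second trial of cardiac data yields a fixed-length (180-sample) series regardless of each participant's heart rate.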

Auto recurrence quantification analysis (RQA) was applied to the cardiac time series and was performed in Matlab with the CRP Toolbox 5.22 [60]. In contrast to the linear cardiac measures, RQA captures recurring patterns in cardiac activity over time. RQA metrics have been shown to yield robust statistics for physiological time series as short as 100 data points [61], which are shorter than those used in the current study (180 samples). Symmetrical, binary (recurrent, non-recurrent) recurrence plots were generated, which capture the recurring patterns in the cardiac signal (see Fig S1 in S1 File). Recurrence plots were generated on normalized values of the cardiac timeseries. Two metrics were used to quantify the recurrence patterns observed in the recurrence plots. The first was Recurrence Rate, a measure of the total proportion of recurrent to non-recurrent points. The second was Determinism: The proportion of sequential recurrent points, corresponding to predictability of the cardiac signal. For a detailed description of RQA methodology and chosen parameters, see S1 Appendix in S1 File.
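The two RQA metrics can be illustrated with a deliberately simplified auto-recurrence sketch (embedding dimension 1, raw-distance threshold, line of identity included; the study's actual embedding and radius parameters are in its S1 Appendix, and the CRP Toolbox handles these properly):

```python
def rqa_metrics(x, radius, min_line=2):
    """Toy auto-RQA: Recurrence Rate = proportion of point pairs whose
    distance is within `radius`; Determinism = proportion of recurrent
    points lying on diagonal line segments of length >= min_line."""
    n = len(x)
    rec = [[1 if abs(x[i] - x[j]) <= radius else 0 for j in range(n)]
           for i in range(n)]
    total_rec = sum(sum(row) for row in rec)
    rr = total_rec / (n * n)
    in_lines = 0
    for d in range(-(n - 1), n):            # every diagonal of the plot
        diag = [rec[i][i + d] for i in range(max(0, -d), min(n, n - d))]
        run = 0
        for v in diag + [0]:                # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= min_line:
                    in_lines += run
                run = 0
    det = in_lines / total_rec if total_rec else 0.0
    return rr, det
```

High Determinism means recurrent points form diagonal lines, i.e., stretches of the cardiac trajectory revisit earlier stretches in the same order, which is the sense in which the signal is "predictable."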

Statistical tests for the behavioral and physiological measures were conducted in R Studio (version 4.2.0). Repeated-measures analyses of variance (using the function ezANOVA from the ez package in R) were used to investigate differences between more than two conditions, with a p-value below .05 determining statistical significance. Planned contrasts with the Holm-Bonferroni correction were used to follow up significant analyses of variance. Trial-level participant data were the input for all statistical tests. The number of participants was chosen based on previous findings and expected medium to large effect sizes for the within-subjects design [8, 37].


Perception of missing beats

The effect of Rhythm (Simple, Complex) on missing tone identification in the Perception condition was tested with paired t-tests. There were no significant differences between Rhythm conditions for the hit rate (t(24) = .681, p = .251, Cohen’s d = .138). Participants were equally good at detecting missing tones in the Simple rhythm (90.4%) and the Complex rhythm (88.0%). Similarly, there were no significant differences between rhythm conditions for the false alarm rate (t(24) = 0.0, p = .50, Cohen’s d = 0.0). Participants did not make more false detections in the Complex rhythm (7.2%) than the Simple rhythm (7.2%). Thus, the analyses supported a null hypothesis of no perceptual differences between the Simple and Complex rhythms.

Synchronization of taps

To confirm that participants maintained the same overall tapping rate across conditions, we first examined the mean intertap intervals (ITI) across Rhythms (Simple, Complex) in the Synchronization condition. As expected for the fixed tapping rates across stimuli in the Rhythm conditions, participants’ mean ITIs did not differ significantly between the Simple and Complex rhythms (F(1,24) = .522, p = .477, generalized η2 = .009). Thus, the results were consistent with the null hypothesis that there were no differences in tapping rate across Simple and Complex rhythms (and therefore, any asynchrony differences by Rhythm condition are not a function of differences in tapping rates).

Next, we assessed participants’ tapping synchronization accuracy in the Synchronization conditions. As shown in Fig 2A, mean absolute asynchrony values were significantly larger for the Complex rhythm than the Simple rhythm (F (1,24) = 21.1, p < .001, generalized η2 = .278). As shown in Fig 2B, the Complex rhythm yielded significantly larger positive signed asynchrony values whereas the Simple rhythm yielded smaller negative values (F (1,24) = 25.46, p < .001, generalized η2 = .330). These results constitute support for the alternative hypothesis that synchronization accuracy is worse for Complex rhythms than for Simple rhythms.

Fig 2. Behavioral synchrony measures by rhythm condition in the synchronization task.

A): Mean absolute asynchrony by Rhythm condition (in ms). B): Mean signed asynchrony by Rhythm condition (in ms). C): Mean standard deviation of signed asynchrony by Rhythm condition (in ms). Circles = individual participants’ values.

Finally, analyses of the synchronization tapping variability were conducted in the Synchronization condition. The standard deviation of the signed asynchronies was significantly larger for the Complex rhythm than the Simple rhythm (F (1,24) = 18.5, p < .001, generalized η2 = .255), as shown in Fig 2C, supporting the alternative hypothesis that synchronization precision was worse for the Complex rhythm than for the Simple rhythm. Overall, these results indicate that participants synchronized their taps less accurately and less precisely with the Complex rhythm, and they showed greater anticipation (i.e., their taps preceded the stimulus by more) with the Simple rhythm.

Linear measures of cardiac rhythms

We first compared the linear cardiac activity measures between Baseline (at rest) with the cardiac activity during the auditory rhythm tasks, to confirm that the presence of auditory rhythms was responsible for cardiac changes. A one-way ANOVA on mean R-R intervals by Task (Baseline, Perception, Synchronization) revealed a significant effect of Task (F(2,48) = 5.280, p = .008, generalized η2 = .012). The R-R intervals were larger (heart rate was slower) during the Perception condition (mean = 768.335 ms) compared to Baseline (mean = 744.161 ms) (Holm-Bonferroni corrected t(24) = 3.389, p = .002, Cohen’s d = .26) and during Perception compared to Synchronization (mean = 745.052 ms) (t(24) = 3.332, p = .003, Cohen’s d = .24). There was no significant difference between the Baseline and Synchronization conditions (t(24) = -.084, p = .934, Cohen’s d = .01). A one-way ANOVA on RMSSD by Task (Baseline, Perception, Synchronization) yielded no significant differences (F(2,48) = 1.360, p = .267, generalized η2 = .005), supporting the null hypothesis of no task effect on heart rate variability. Thus, the R-R intervals but not the HRV differed between Baseline (no auditory rhythms) and the other tasks (auditory rhythms present).

To test for task and rhythm complexity effects, we conducted two-way ANOVAs on the linear cardiac measures (R-R interval and RMSSD) by Task (Perception, Synchronization) and Rhythm (Simple, Complex). The ANOVA on mean R-R intervals showed a significant main effect of Task (F(1,24) = 11.103, p = .003, generalized η2 = .014), confirming that heart rates were slower during Perception than during Synchronization for both Simple and Complex rhythms. There were no significant effects of Rhythm or Task x Rhythm interaction. The two-way ANOVA on RMSSD indicated a significant main effect of Task (F(1,24) = 5.609, p = .026, generalized η2 = .075). There was greater heart rate variability during Perception than during Synchronization for both Simple (Perception = 34.78 ms, Synchronization = 32.17 ms) and Complex rhythms (Perception = 34.49 ms, Synchronization = 31.37 ms). There were no other significant main effects or interactions. Overall, the linear cardiac findings indicate that heart rate was faster and less variable during the Synchronization condition compared to the Perception condition, with no significant effects of rhythm complexity.

Nonlinear measures of cardiac rhythms

We first compared the nonlinear cardiac metrics for the Baseline condition (at rest) with the cardiac activity during the auditory rhythm tasks, to confirm that the cardiac measures differed in the presence of auditory rhythms (tasks) from the absence of auditory rhythms (Baseline). For Recurrence Rate, there was a significant effect of Task (F(1,24) = 10.40, p < .001, generalized η2 = .119), with greater cardiac recurrence during Perception (mean = 3.062%) than Baseline (mean = 1.803%) (pairwise comparisons, Holm-Bonferroni corrected t(24) = 4.156, p < .001, Cohen’s d = .96) and during Synchronization (mean = 3.283%) than Baseline (t(24) = 4.286, p < .001, Cohen’s d = .93). Perception and Synchronization conditions did not differ significantly (t(24) = -.546, p = .590, Cohen’s d = .08). These analyses demonstrate support for the alternative hypothesis that Task affects Recurrence Rate. Determinism values, based on fixed recurrence rates (5%), showed a similar main effect of Task (F(1,24) = 4.395, p = .018, generalized η2 = .027), indicating support for the alternative hypothesis that Task affects cardiac Determinism. Follow-up pairwise comparisons (Holm-Bonferroni corrected) indicated greater Determinism during Perception (mean = 65.71%) than Baseline (mean = 62.51%) (t(24) = 2.862, p = .009, Cohen’s d = .37) and during Perception than Synchronization (mean = 62.85%) (t(24) = 2.486, p = .02, Cohen’s d = .36). There was no significant difference between the Baseline and Synchronization conditions (t(24) = -.267, p = .792, Cohen’s d = .04). Thus, the nonlinear cardiac measures differed between the Baseline condition (no auditory rhythms) and the other tasks (auditory rhythms present).

To test for auditory rhythm effects on the nonlinear cardiac measures, a two-way ANOVA was conducted on Recurrence Rate by Task (Perception, Synchronization) and Rhythm (Simple, Complex). This yielded a significant Task x Rhythm interaction (F(1,24) = 11.542, p = .002, generalized η2 = .022), and no main effects. There was significantly more recurrence during perception of the Complex rhythm than during perception of the Simple rhythm, and significantly more recurrence during synchronization with the Simple rhythm compared to synchronization with the Complex rhythm (HSD = .039, p < .05; Fig 3). Fig 4 shows example recurrence plots for the same individual’s trials of Simple and Complex rhythms from the Perception and Synchronization tasks. The recurrence plot for Complex Rhythm Perception contains more recurrent points (black dots, greater Recurrence Rate) than does the plot for Simple Rhythm Perception by the same individual. The Synchronization condition plots show the opposite pattern: Synchronization with the Simple rhythm leads to more recurrent points than does Synchronization with the Complex rhythm.

Fig 3. Mean recurrence rate (%) in the cardiac time series (beats per minute) by task (perception, synchronization) and rhythm (simple, complex).

Circles = individual data points.

Fig 4. Sample cardiac time series (in beats per minute) and recurrence plots (in number of samples) from one participant for each auditory condition and rhythm condition.

A) Perception task: Simple rhythm (left) and Complex rhythm (right). B) Synchronization task: Simple rhythm (left) and Complex rhythm (right). Black dots in recurrence plots indicate points of recurring heartbeat intervals.
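The Recurrence Rate measure can be sketched as follows: time-delay embed the cardiac time series (Takens [43]), mark every pair of embedded points that falls within a radius of one another, and report the percentage of marked points. This is an illustrative pure-Python sketch with assumed parameter values (embedding dimension, delay, and radius are placeholders), not the CRP Toolbox [60] pipeline used in the study:

```python
def recurrence_rate(series, dim=3, delay=1, radius=0.1):
    """Percentage of recurrent point pairs in the embedded time series.

    dim, delay, and radius are illustrative defaults, not the study's settings.
    The main diagonal (self-recurrence) is included here for simplicity.
    """
    # Time-delay embedding: one dim-dimensional vector per admissible index.
    n = len(series) - (dim - 1) * delay
    vectors = [[series[i + k * delay] for k in range(dim)] for i in range(n)]

    def dist(u, v):  # Euclidean distance between embedded points
        return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

    # Count all pairs (i, j) closer than the radius threshold.
    recurrent = sum(1 for i in range(n) for j in range(n)
                    if dist(vectors[i], vectors[j]) <= radius)
    return 100.0 * recurrent / (n * n)
```

A perfectly repeating series yields a high recurrence rate, whereas irregular heartbeat intervals yield few recurrent points, which is what the black dots in Figs 3 and 4 quantify.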

The same two-way ANOVA on Determinism showed a significant main effect of Task (F(1,24) = 6.107, p = .021, generalized η2 = .029) with greater cardiac determinism (more predictability) during rhythm Perception than during Synchronization for both Simple (Perception = 65.4%, Synchronization = 63.3%) and Complex (Perception = 66.0%, Synchronization = 62.4%) auditory rhythms. This finding supports the alternative hypothesis that Task affects cardiac Determinism. There was no significant effect of Rhythm or Task x Rhythm interaction. Fig 5 shows example recurrence plots for a single participant during individual trials of the Simple rhythm in Perception and Synchronization conditions. There is greater determinism, or proportion of black dots forming diagonal lines (indicating sequential runs of recurrence in the time series), in the cardiac patterns during Perception compared to Synchronization.

Fig 5.

A) Mean % Determinism in the cardiac time series (beats per minute) by Task. Circles = individual data points. B) Sample cardiac time series (in beats per minute) and recurrence plots (in number of samples) from one participant for each Auditory Condition with the Simple rhythm. Left) Perception task. Right) Synchronization task. Diagonal lines of black dots indicate determinism (predictability), sequential recurring heartbeat intervals in the cardiac time series.
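Determinism, the proportion of recurrent points that fall on diagonal lines (sequential runs of recurrence), can be sketched from a binary recurrence matrix. This simplified illustration scans only the diagonals above the main one (assuming a symmetric matrix with the main diagonal excluded) and uses a minimum line length of 2; it is not the toolbox implementation used in the study:

```python
def determinism(R, lmin=2):
    """% of recurrent points lying on diagonal lines of length >= lmin.

    R is a square 0/1 recurrence matrix; main diagonal excluded, and only
    the upper triangle is scanned (assumes a symmetric matrix).
    """
    n = len(R)
    total = on_lines = 0
    for d in range(1, n):          # each diagonal above the main one
        run = 0                    # length of the current run of 1s
        for i in range(n - d):
            if R[i][i + d]:
                total += 1
                run += 1
            else:
                if run >= lmin:    # a completed diagonal line
                    on_lines += run
                run = 0
        if run >= lmin:            # line reaching the edge of the matrix
            on_lines += run
    return 100.0 * on_lines / total if total else 0.0
```

Isolated recurrent points lower Determinism; runs of recurrences along diagonals (the diagonal line structures visible in Fig 5B) raise it, indicating a more predictable cardiac series.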

Behavior-cardiac correlations

Simple correlations were used to test for individual differences in behavioral synchrony-cardiac relationships in the Synchronization condition at the participant level. We compared each participant’s mean heart rate variability (RMSSD) with the mean absolute asynchrony and the standard deviation of signed asynchronies within each rhythm condition. Fig 6 shows the significant correlations observed for Complex rhythms. Participants with larger mean absolute asynchrony values (less accurate synchronization) exhibited greater RMSSD values (r = .41, p = .042). Also in the Complex rhythm condition, participants with larger standard deviations of signed asynchronies (more variable synchronization) showed larger RMSSD values (r = .40, p = .0499). The same correlations for the Simple rhythm were small and did not reach significance (mean absolute asynchrony: r = .05, p = .821; standard deviation of signed asynchronies: r = .001, p = .998). In sum, these synchronization-cardiac correspondences indicate that poorer synchronization accuracy and greater synchronization variability were correlated with greater heart rate variability for the Complex rhythm.

Fig 6. Simple correlations.

A) Mean Absolute Asynchrony and RMSSD, in ms, and B) Mean Standard Deviation of Signed Asynchrony and RMSSD, in ms.
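The simple correlations reported above are Pearson coefficients computed across participants. A minimal sketch (hypothetical values, not study data):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Covariance term over the product of the standard deviation terms.
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical per-participant values: absolute asynchrony (ms) vs. RMSSD (ms).
r = pearson_r([22, 35, 41, 58, 63], [28, 30, 33, 36, 40])
```

A positive r, as in the Complex rhythm condition here, indicates that participants with larger asynchronies also tended to show larger RMSSD values.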


This study demonstrated effects of rhythm complexity on cardiac dynamics during perception and production (synchronization) of auditory rhythms. Musically trained adults (≥ 6 years private musical instruction) listened to and tapped with a Simple rhythm (that formed a 1:2 duration ratio between the two parts) and with a Complex rhythm (that formed a 3:2 duration ratio between the two parts) while their cardiac activity was measured. Linear and nonlinear analysis methods were applied to the participants’ cardiac dynamics measured during rhythm Perception and Synchronization. Listeners identified omitted tones accurately in both Simple and Complex rhythms, indicating successful perception. Complex auditory rhythms were more difficult to synchronize with than were Simple rhythms. Individual differences analyses indicated that more inaccurate and more variable synchronization corresponded to greater heart rate variability. Finally, nonlinear analyses of recurring patterns in cardiac dynamics were differentially modulated by both the task (Perception or Synchronization) as well as by stimulus rhythm complexity. Thus, nonlinear cardiac measures of recurrence rate and determinism were more sensitive to rhythm complexity than were linear measures.

Participants’ behavioral difficulties in synchronizing with Complex rhythms match predictions from dynamical systems theory [24–26] and other behavioral findings [2, 4, 5, 30, 37], which indicate that synchronizing with Complex rhythm ratios (such as the 3:2 rhythms used here) is more difficult than with Simple ratios (such as the 1:2 rhythms used here). Tappers in the current study showed less accurate and less precise synchronization of taps with the Complex rhythm than the Simple rhythm. Importantly, tapping rates were held constant between the Simple and Complex rhythms, and thus motor system demands were equated and cannot alone account for the poorer performance on the Complex rhythm. There was also less anticipatory synchrony for the Complex rhythm than the Simple rhythm, reflected in taps that lagged the stimulus tones for the Complex rhythm. Notably, there were no rhythm complexity effects for the missing tone detection task in the Perception condition, suggesting that the observed synchronization differences were not simply due to differences in perceptual processing of the Simple and Complex rhythms. It may be that rhythm complexity affects production more than perception because auditory-motor coupling is required, yielding greater task difficulty.

Perception/production differences in cardiac rhythms indicated that cardiac activity was modulated by auditory perception and synchronization tasks, above and beyond the complexity of the rhythm. Rhythmic synchronization resulted in faster heart rates (smaller R-R intervals) and decreased heart rate variability compared to rhythm perception. These findings are consistent with previous findings of faster heart rates and decreased heart rate variability when pianists performed a complex musical piece compared to when they listened to the same piece [62]. The Synchronization task may have been more difficult and resulted in greater physiological arousal, as it required individuals to simultaneously perceive one rhythm while producing another auditory rhythm. More difficult cognitive and motor tasks have been reported to result in increased autonomic arousal [9], including in the auditory domain [11]. Similar physiological patterns have also been observed with increased physical effort during exercise [63], leading to reductions in heart rate variability and complexity in the cardiac signal [64]. A related interpretation is that greater physiological arousal in the Synchronization condition reflected performance anxiety, leading to the observed changes in heart rate and heart rate variability [23, 41]. However, the linear cardiac metrics observed here were similar between the Baseline (no auditory stimulus) and Synchronization conditions, suggesting that performance anxiety cannot account for the observed differences.

Nonlinear measures of cardiac rhythms, including recurrence rates and determinism (predictability), showed that cardiac dynamics were more predictable during rhythm perception than during synchronization. These findings contrast with previous reports of increased cardiac predictability accompanying increased task difficulty [46, 50, 65]. Notably, Wright and Palmer [8] reported increased cardiac predictability when musicians performed unfamiliar (novel) musical rhythms compared to familiar musical rhythms; that study did not compare perception and performance conditions. The design of the current study allowed us to dissociate cardiac effects of simply perceiving auditory rhythms from perceiving while simultaneously producing the same rhythms. In the Perception task, the timing of the stimulus rhythms was computer-generated with no variability; in the Synchronization task, the timing of the auditory rhythms contained variability as participants produced one part of the rhythm. These findings thus suggest that nonlinear properties of cardiac dynamics may be modulated by stimulus timing and rhythmic motor timing.

Effects of rhythmic timing on cardiac dynamics were also evidenced in the striking Task (perception/synchronization) by Rhythm (simple/complex) interaction observed for cardiac recurrence rate (indicating repeating patterns). There was greater cardiac recurrence during Perception of the Complex rhythm (which presented different temporal intervals between the two parts), and greater cardiac recurrence during Synchronization for the Simple rhythm. The order of the conditions (Perception, then Synchronization) and rhythms (Simple, then Complex) was constant across participants, suggesting that the Task by Rhythm interaction cannot be explained simply by order effects. It also cannot be explained by participants’ tapping rates between conditions, as no tapping rate differences were observed. The RQA findings also do not fit with a straightforward interpretation of increased physical stress, which predicts decreases in physiological complexity, as reflected in lower heart rate variability and sample entropy [66] and greater cardiac recurrence and determinism [67]. It is possible that cardiac activity was less recurrent during Synchronization with the Complex rhythm due to increased motor noise associated with participants’ more variable asynchronies in this condition. Finally, cardiac recurrence may not reflect the same underlying causes in perception and synchronization/production of auditory rhythms, as motor timing and stimulus timing do not individually explain the observed interaction. Future research may disentangle this explanation by investigating how tapping rhythms in the absence of auditory feedback affect cardiac dynamics.

Correlations between individuals’ synchronization performance and linear cardiac measures support the interpretation that cardiac dynamics are sensitive to auditory and motor timing. Participants’ lower synchronization accuracy and greater synchronization variability were correlated with increased heart rate variability in the Complex rhythm condition only. This is interesting because the poorer performance indicates that synchronizing with the Complex rhythm was more difficult, yet heart rate variability increased, the opposite of typical task difficulty effects [68, 69]. Notably, those tasks did not manipulate the sequential stimulus timing, a factor unique to the current design. Thus, individual differences in synchronization and heart rate variability are consistent with the interpretation that cardiac activity may reflect increased noise in auditory and motor timing.

Only nonlinear cardiac analyses were sensitive to rhythm complexity, as expected for time series analyses that derive recurrences from successive items in a time series (such as the sequential ratios between successive intertap intervals that defined the Simple and Complex rhythms). The recurrence measures were able to capture the cardiac differences between the Simple and Complex auditory rhythms. Similar findings in ensemble musicians’ joint cardiac dynamics showed increased cardiac recurrence and predictability during quartet music performance [70]. The current study suggests that nonlinear analysis methods applied to R-R cardiac intervals may be well-suited for detecting changes in cardiac rhythms during temporally patterned tasks such as auditory rhythm production.

There are a few important limitations in the findings to consider. First, musical training may influence physiological measures such as cardiac dynamics in response to auditory rhythms. Nonmusicians tend to exhibit greater variability in rhythmic tapping performance than musicians [71, 72]; comparison of musically trained with untrained individuals may reveal different relationships. Second, natural music produced in richer contexts introduces many more acoustic features such as pitch, timbre and tempo; cardiac dynamics are likely to be affected by additional features of complex musical pieces. Future research may compare Simple and Complex musical rhythms in natural musical compositions to complement the constrained basic rhythm complexity findings reported here. Natural musical settings such as concerts would allow for investigation of affective influences (valence, arousal) of rhythm complexity on listeners and performers. Finally, the relationship of the R-R cardiac signals to other cardiac metrics (such as J-T intervals; [64]) may extend the findings reported here.

Overall, this study demonstrated that auditory rhythm perception and production influence cardiac activity in musically trained individuals. Complex auditory rhythms (those that formed non-integer duration ratios) were more difficult to synchronize with than Simple auditory rhythms (those that formed integer duration ratios). Synchronization with auditory rhythms resulted in faster heart rate, decreased heart rate variability, and more predictable cardiac dynamics than perception of the same rhythms. Nonlinear cardiac dynamics showed more recurrence when participants synchronized with Simple compared to Complex auditory rhythms; participants who were worse at producing Complex rhythms showed increased heart rate variability. To our knowledge, this is the first study to demonstrate altered cardiac dynamics while individuals perceived and synchronized with auditory rhythms varying in complexity. These findings suggest that auditory-motor coupling during rhythm production may be an important modulator of cardiac dynamics that extends beyond perceptual influences.


We gratefully acknowledge the technical assistance of Selva Asgharizadeh, Louis Deng, and Sandra Kuliver. Correspondence should be addressed to Shannon Wright or Caroline Palmer.


1. Fraisse P. Rhythm and tempo. In: Deutsch D. The psychology of music. New York (USA): Academic Press Inc; 1982. pp. 149–180.
2. Essens PJ. Hierarchical organization of temporal patterns. Percept Psychophys. 1986; 40(2): 69–73. pmid:3763362
3. Hannon EE, Soley G, Levine RS. Constraints on infants’ musical rhythm perception: effects of interval ratio complexity and enculturation. Dev Sci. 2011; 14(4): 865–872. pmid:21676105
4. Repp BH, London J, Keller PE. Production and synchronization of uneven rhythms at fast tempi. Music Percept. 2005 Sep; 23(1): 61–78.
5. Snyder JS, Hannon EE, Large EW, Christiansen MH. Synchronization and continuation tapping to complex meters. Music Percept. 2006 Dec; 24(2): 135–146.
6. Koelsch S, Jancke L. Music and the heart. Eur Heart J. 2015; 36(44): 3043–3049.
7. Zhao TC, Kuhl PK. Neural and physiological relations observed in musical beat and meter processing. Brain Behav. 2020. pmid:32920995
8. Wright SE, Palmer C. Physiological and behavioral factors in musicians’ performance tempo. Front Hum Neurosci. 2020. pmid:33192375
9. Fairclough SH, Venables L, Tattersall A. The influence of task demand and learning on the psychophysiological response. Int J Psychophysiol. 2005; 56(2): 171–84. pmid:15804451
10. Hidalgo-Muñoz AR, Mouratille D, Matton N, Causse M, Rouillard Y, El-Yagoubi R. Cardiovascular correlates of emotional state, cognitive workload and time-on-task effect during a realistic flight simulation. Int J Psychophysiol. 2018; 128: 62–69. pmid:29627585
11. Slade K, Kramer SE, Fairclough S, Richter M. Effortful listening: Sympathetic activity varies as a function of listening demand but parasympathetic activity does not. Hear Res. 2021. pmid:34543837
12. Murray NP, Raedeke TD. Heart rate variability as an indicator of pre-competitive arousal. Int J Sport Psychol. 2008; 39(4): 346–355.
13. Gomez P, Danuser B. Relationships between musical structure and psychophysiological measures of emotion. Emotion. 2007; 7(2): 377–387. pmid:17516815
14. Iwanaga M, Ikeda M, Iwaki T. The effects of repetitive exposure to music on subjective and physiological responses. J Music Ther. 1996; 33(3): 219–230.
15. Nomura S, Yoshimura K, Kurosawa Y. A pilot study on the effect of music-heart beat feedback system on human heart activity. J Med Inform Tech. 2013; 22: 251–256.
16. Watanabe K, Ooishi Y, Kashino M. Sympathetic tone induced by high acoustic tempo requires fast respiration. PLoS One. 2015 Aug 18. pmid:26284521
17. Mütze H, Kopiez R, Wolf A. The effect of a rhythmic pulse on the heart rate: Little evidence for rhythmical ‘entrainment’ and ‘synchronization’. Music Sci. 2020 Sep; 24(3): 377–400.
18. Bernardi L, Porta C, Sleight P. Cardiovascular, cerebrovascular, and respiratory changes induced by different types of music in musicians and non-musicians: the importance of silence. Heart. 2006; 92(4): 445–452. pmid:16199412
19. Chuen L, Sears D, McAdams S. Psychophysiological responses to auditory change. Psychophysiology. 2016; 53(6): 891–904.
20. Egermann H, Fernando N, Chuen L, McAdams S. Music induces universal emotion-related psychophysiological responses: comparing Canadian listeners to Congolese Pygmies. Front Psychol. 2015. pmid:25620935
21. Ooishi Y, Mukai H, Watanabe K, Kawato S, Kashino M. Increase in salivary oxytocin and decrease in salivary cortisol after listening to relaxing slow-tempo and exciting fast-tempo music. PLoS One. 2017. pmid:29211795
22. Van Dyck E, Six J, Soyer E, Denys M, Bardijn I, Leman M. Adopting a music-to-heart rate alignment strategy to measure the impact of music and its tempo on human heart rate. Music Sci. 2017 Dec; 21(4): 390–404.
23. Williamon A, Aufegger L, Wasley D, Looney D, Mandic DP. Complexity of physiological responses decreases in high-stress musical performance. J R Soc Interface. 2013. pmid:24068177
24. Large EW, Herrera JA, Velasco MJ. Neural networks for beat perception in musical rhythm. Front Syst Neurosci. 2015. pmid:26635549
25. Large EW, Snyder JS. Pulse and meter as neural resonance. Ann N Y Acad Sci. 2009 Jul; 1169(1): 46–57. pmid:19673754
26. Treffner PJ, Turvey MT. Resonance constraints on rhythmic movement. J Exp Psychol Hum Percept Perform. 1993 Dec; 19(6): 1221.
27. Savage PE, Brown S, Sakai E, Currie TE. Statistical universals reveal the structures and functions of human music. Proc Natl Acad Sci USA. 2015 Jul 21; 112(29): 8987–8992.
28. De Gregorio C, Valente D, Raimondi T, Torti V, Miaretsoa L, Friard O, et al. Categorical rhythms in a singing primate. Curr Biol. 2021; 31(20): R1379–R1380. pmid:34699799
29. Roeske T, Larrouy-Maestri P, Sakamoto Y, Poeppel D. Listening to birdsong reveals basic features of rate perception and aesthetic judgements. Proc R Soc B. 2020 Mar 25. pmid:32208834
30. Sakai K, Hikosaka O, Miyauchi S, Takino R, Tamada T, Iwata NK, et al. Neural representation of a rhythm depends on its interval ratio. J Neurosci. 1999 Nov 15; 19(22): 10074–10081. pmid:10559415
31. Dotov D, Trainor LJ. Cross-frequency coupling explains the preference for simple ratios in rhythmic behaviour and the relative stability across non-synchronous patterns. Philos Trans R Soc B. 2021. pmid:34420377
32. Ravignani A, Delgado T, Kirby S. Musical evolution in the lab exhibits rhythmic universals. Nat Hum Behav. 2016 Dec 19.
33. Polak R, Jacoby N, Fischinger T, Goldberg D, Holzapfel A, London J. Rhythmic prototypes across cultures: A comparative study of tapping synchronization. Music Percept. 2018 Sep.
34. Jacoby N, McDermott JH. Integer ratio priors on musical rhythm revealed cross-culturally by iterated reproduction. Curr Biol. 2017; 27(3): 359–370. pmid:28065607
35. Hannon EE, Soley G, Ullal S. Familiarity overrides complexity in rhythm perception: a cross-cultural comparison of American and Turkish listeners. J Exp Psychol Hum Percept Perform. 2012; 38(3): 543–548. pmid:22352419
36. Repp BH, London J, Keller PE. Perception–production relationships and phase correction in synchronization with two-interval rhythms. Psychol Res. 2011 May; 75: 227–242. pmid:20644955
37. Mathias B, Zamm A, Gianferrara PG, Ross B, Palmer C. Rhythm complexity modulates behavioral and neural dynamics during auditory-motor synchronization. J Cogn Neurosci. 2020; 32(10): 1864–1880. pmid:32644881
38. Manuca R, Savit R. Stationarity and nonstationarity in time series analysis. Physica D. 1996 Dec 15; 99(2–3): 134–161.
39. Berntson GG, Thomas Bigger J Jr, Eckberg DL, Grossman P, Kaufmann PG, Malik M, et al. Heart rate variability: Origins, methods, and interpretive caveats. Psychophysiology. 1997; 34(6): 623–648. pmid:9401419
40. Chanwimalueang T, Aufegger L, Adjei T, Wasley D, Cruder C, Mandic DP, et al. Stage call: Cardiovascular reactivity to audition stress in musicians. PLoS One. 2017.
41. Sebastiani L, Mastorci F, Magrini M, Paradisi P, Pingitore A. Synchronization between music dynamics and heart rhythm is modulated by the musician’s emotional involvement: A single case study. Front Psychol. 2022. pmid:36160502
42. Zbilut JP, Koebbe M, Loeb H, Mayer-Kress G. Use of recurrence plots in the analysis of heart beat intervals. Proc Comput Cardiol. 1991 Sep 23.
43. Takens F. Detecting strange attractors in turbulence. In: Rand D, Young L, editors. Dynamical systems and turbulence. Berlin: Springer; 1981. pp. 366–381.
44. Webber CL Jr, Zbilut JP. Recurrence quantification analysis of nonlinear dynamical systems. In: Riley MA, Van Orden GC. Tutorials in contemporary nonlinear methods for the behavioral sciences. National Science Foundation; 2005. pp. 26–94.
45. Webber CL, Marwan N. Recurrence quantification analysis: Theory and best practices. Switzerland: Springer International Publishing; 2015.
46. Javorka M, Turianikova Z, Tonhajzerova I, Javorka K, Baumert M. The effect of orthostasis on recurrence quantification analysis of heart rate and blood pressure dynamics. Physiol Meas. 2009; 30(1): 29–41. pmid:19039163
47. Javorka M, Trunkvalterova Z, Tonhajzerova I, Lazarova Z, Javorkova J, Javorka K. Recurrences in heart rate dynamics are changed in patients with diabetes mellitus. Clin Physiol Funct Imaging. 2008; 28(5): 326–31. pmid:18507669
48. Nguyen HD, Wilkins BA, Cheng Q, Benjamin BA. An online sleep apnea detection method based on recurrence quantification analysis. J Biomed Inform. 2014 Nov 27; 18(4): 1285–1293.
49. Marwan N, Wessel N, Meyerfeldt U, Schirdewan A, Kurths J. Recurrence-plot-based measures of complexity and their application to heart-rate-variability data. Phys Rev E. 2002. pmid:12241313
50. Konvalinka I, Xygalatas D, Bulbulia J, Schjodt U, Jegindo EM, Wallot S, et al. Synchronized arousal between performers and related spectators in a fire-walking ritual. Proc Natl Acad Sci USA. 2011; 108(20): 8514–8519.
51. Finney SA. FTAP: A Linux-based program for tapping and music experiments. Behav Res Methods Instrum Comput. 2001; 33(1): 65–72. pmid:11296721
52. Shaffer F, McCraty R, Zerr CL. A healthy heart is not a metronome: An integrative review of the heart’s anatomy and heart rate variability. Front Psychol. 2014. pmid:25324790
53. Dalla Bella S, Białuńska A, Sowiński J. Why movement is captured by music, but less by speech: Role of temporal regularity. PLoS One. 2013. pmid:23936534
54. Repp BH, Su YH. Sensorimotor synchronization: A review of recent research (2006–2012). Psychon Bull Rev. 2013. pmid:23397235
55. Shaffer F, Ginsberg JP. An overview of heart rate variability metrics and norms. Front Public Health. 2017. pmid:29034226
56. Penttilä J, Helminen A, Jartti T, Kuusela T, Huikuri HV, Tulppo MP, et al. Time domain, geometrical and frequency domain analysis of cardiac vagal outflow: Effects of various respiratory patterns. Clin Physiol. 2001. pmid:11380537
57. Saboul D, Pialoux V, Hautier C. The impact of breathing on HRV measurements: Implications for the longitudinal follow-up of athletes. Eur J Sport Sci. 2013. pmid:24050471
58. Esco MR, Flatt AA. Ultra-short-term heart rate variability indexes at rest and post-exercise in athletes: Evaluating the agreement with accepted recommendations. J Sports Sci Med. 2014. pmid:25177179
59. Wallot S, Fusaroli R, Tylen K, Jegindo EM. Using complexity metrics with R-R intervals and BPM heart rate measures. Front Physiol. 2013. pmid:23964244
60. Marwan N. Cross Recurrence Plot Toolbox for MATLAB® (Version 5.22 (R32.2)). Computer software. http://tocsy.pik-potsdam.de/CRPtoolbox. 2020.
61. Yargholi E, Hossein-Zadeh GA. Cross recurrence quantifiers as new connectivity measures for structure learning of Bayesian networks in brain decoding. Chaos, Solitons & Fractals. 2019.
62. Nakahara H, Furuya S, Masuko T, Francis PR, Kinoshita H. Performing music can induce greater modulation of emotion-related psychophysiological responses than listening to music. Int J Psychophysiol. 2011; 81(3): 152–158. pmid:21704661
63. Mongin D, Chabert C, Extremera MG, Hue O, Courvoisier DS, Carpena P, et al. Decrease of heart rate variability during exercise: An index of cardiorespiratory fitness. PLoS One. 2022. pmid:36054204
64. Ziaukas P, Alabdulgader A, Vainoras A, Navickas Z, Ragulskis M. New approach for visualization of relationships between RR and JT intervals. PLoS One. 2017. pmid:28379976
65. Schlenker J, Socha V, Riedlbauchová L, Nedělka T, Schlenker A, Potočková V, et al. Recurrence plot of heart rate variability signal in patients with vasovagal syncopes. Biomed Signal Process Control. 2016.
66. Javorka M, Zila I, Balharek T, Javorka K. Heart rate recovery after exercise: relations to heart rate variability and complexity. Braz J Med Biol Res. 2002. pmid:12185393
67. Zimatore G, Gallotta MC, Innocenti L, Bonavolontà V, Ciasca G, De Spirito M, et al. Recurrence quantification analysis of heart rate variability during continuous incremental exercise test in obese subjects. Chaos. 2020. pmid:32237785
68. Delliaux S, Delaforge A, Deharo JC, Chaumet G. Mental workload alters heart rate variability, lowering non-linear dynamics. Front Physiol. 2019 May 14. pmid:31156454
69. Luque-Casado A, Perales JC, Cardenas D, Sanabria D. Heart rate variability and cognitive processing: The autonomic response to task demands. Biol Psychol. 2016; 113: 83–90.
70. Høffding S, Yi W, Lippert E, Sanchez VG, Bishop L, Laeng B, et al. Into the hive-mind: Shared absorption and cardiac interrelations in expert and student string quartets. Music Sci. 2023.
71. Grahn JA, Schuit D. Individual differences in rhythmic ability: Behavioral and neuroimaging investigations. Psychomusicology. 2012; 22(2): 105–121.
72. Matthews TE, Witek MAG, Heggli OA, Penhune VB, Vuust P. The sensation of groove is affected by the interaction of rhythmic and harmonic complexity. PLoS One. 2019. pmid:30629596