Abstract
Sport psychology as an academic pursuit is nearly two centuries old. An enduring goal since inception has been to understand how psychological techniques can improve athletic performance. Although much evidence exists in the form of meta-analytic reviews related to sport psychology and performance, a systematic review of these meta-analyses is absent from the literature. We aimed to synthesize the extant literature to gain insights into the overall impact of sport psychology on athletic performance. Guided by the PRISMA statement for systematic reviews, we reviewed relevant articles identified via the EBSCOhost interface. Thirty meta-analyses published between 1983 and 2021 met the inclusion criteria, covering 16 distinct sport psychology constructs. Overall, sport psychology interventions/variables hypothesized to enhance performance (e.g., cohesion, confidence, mindfulness) were shown to have a moderate beneficial effect (d = 0.51), whereas variables hypothesized to be detrimental to performance (e.g., cognitive anxiety, depression, ego climate) had a small negative effect (d = -0.21). The quality rating of meta-analyses did not significantly moderate the magnitude of observed effects, nor did the research design (i.e., intervention vs. correlation) of the primary studies included in the meta-analyses. Our review strengthens the evidence base for sport psychology techniques and may be of great practical value to practitioners. We provide recommendations for future research in the area.
Citation: Lochbaum M, Stoner E, Hefner T, Cooper S, Lane AM, Terry PC (2022) Sport psychology and performance meta-analyses: A systematic review of the literature. PLoS ONE 17(2): e0263408. https://doi.org/10.1371/journal.pone.0263408
Editor: Claudio Imperatori, European University of Rome, ITALY
Received: September 28, 2021; Accepted: January 18, 2022; Published: February 16, 2022
Copyright: © 2022 Lochbaum et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: All relevant data are within the paper.
Funding: The author(s) received no specific funding for this work.
Competing interests: The authors have declared that no competing interests exist.
Introduction
Sport performance matters. Verifying its global importance requires no more than opening a newspaper to the sports section, browsing the internet, looking at social media outlets, or scanning abundant sources of sport information. Sport psychology is an important avenue through which to better understand and improve sport performance. To date, a systematic review of published sport psychology and performance meta-analyses is absent from the literature. Given the undeniable importance of sport, the history of sport psychology in academics since 1830, and the global rise of sport psychology journals and organizations, a comprehensive systematic review of the meta-analytic literature seems overdue. Thus, we aimed to consolidate the existing literature and provide recommendations for future research.
The development of sport psychology
The history of sport psychology dates back nearly 200 years. Terry [1] cites Carl Friedrich Koch’s (1830) publication titled [in translation] Calisthenics from the Viewpoint of Dietetics and Psychology [2] as perhaps the earliest publication in the field, and multiple commentators have noted that sport psychology experiments occurred in the world’s first psychology laboratory, established by Wilhelm Wundt at the University of Leipzig in 1879 [1, 3]. Konrad Rieger’s research on hypnosis and muscular endurance, published in 1884 [4], and Angelo Mosso’s investigations of the effects of mental fatigue on physical performance, published in 1891 [5], were other early landmarks in the development of applied sport psychology research. Following the efforts of Koch, Wundt, Rieger, and Mosso, sport psychology works appeared with increasing regularity, including Philippe Tissié’s publications in 1894 [6, 7] on psychology and physical training, and Pierre de Coubertin’s first use of the term sport psychology in his La Psychologie du Sport paper in 1900 [8]. In short, the history of sport psychology and performance research began as early as 1830 and picked up pace in the latter part of the 19th century. Early pioneers who helped shape sport psychology include Wundt, recognized as the “father of experimental psychology”; Tissié, the founder of French physical education and a Legion of Honor awardee in 1932; and de Coubertin, who became the father of the modern Olympic movement and founder of the International Olympic Committee.
Sport psychology flourished in the early 20th century [see 1, 3 for extensive historic details]. For instance, independent laboratories emerged in Berlin, Germany, established by Carl Diem in 1920; in St. Petersburg and Moscow, Russia, established respectively by Avksenty Puni and Piotr Roudik in 1925; and in Champaign, Illinois USA, established by Coleman Griffith, also in 1925. The period from 1950–1980 saw rapid strides in sport psychology, with Franklin Henry establishing this field of study as independent of physical education in the landscape of American and eventually global sport science and kinesiology graduate programs [1]. In addition, of great importance in the 1960s, three international sport psychology organizations were established: namely, the International Society for Sport Psychology (1965), the North American Society for the Psychology of Sport and Physical Activity (1966), and the European Federation of Sport Psychology (1969). Since that time, the Association of Applied Sport Psychology (1986), the South American Society for Sport Psychology (1986), and the Asian-South Pacific Association of Sport Psychology (1989) have also been established.
The global growth in academic sport psychology has seen a large number of specialist publications launched, including the following journals: International Journal of Sport Psychology (1970), Journal of Sport & Exercise Psychology (1979), The Sport Psychologist (1987), Journal of Applied Sport Psychology (1989), Psychology of Sport and Exercise (2000), International Journal of Sport and Exercise Psychology (2003), Journal of Clinical Sport Psychology (2007), International Review of Sport and Exercise Psychology (2008), Journal of Sport Psychology in Action (2010), Sport, Exercise, and Performance Psychology (2014), and the Asian Journal of Sport & Exercise Psychology (2021).
In turn, the growth in journal outlets has seen sport psychology publications burgeon. Indicative of the scale of the contemporary literature on sport psychology, searches completed in May 2021 within the Web of Science Core Collection identified 1,415 publications on goal setting and sport since 1985; 5,303 publications on confidence and sport since 1961; and 3,421 publications on anxiety and sport since 1980. In addition to academic journals, several comprehensive edited textbooks have been produced detailing sport psychology developments across the world, such as Hanrahan and Andersen’s (2010) Routledge Handbook of Applied Sport Psychology [9], Schinke, McGannon, and Smith’s (2016) Routledge International Handbook of Sport Psychology [10], and Bertollo, Filho, and Terry’s (2021) Advancements in Mental Skills Training [11], to name just a few. In short, sport psychology is global in both academic study and professional practice.
Meta-analysis in sport psychology
Several meta-analysis guides, computer programs, and sport psychology domain-specific primers have been popularized in the social sciences [12, 13]. Sport psychology academics have conducted quantitative reviews of much-studied constructs since the 1980s, with the first two appearing in 1983 in the form of Feltz and Landers’ meta-analysis on mental practice [14], which included 98 articles dating from 1934, and Bond and Titus’ cross-disciplinary meta-analysis on social facilitation [15], which summarized 241 studies including Triplett’s (1898) often-cited study of social facilitation in cycling [16]. Although much meta-analytic evidence exists for various constructs in sport and exercise psychology [12], including several related to performance [17], the evidence is inconsistent. For example, two meta-analyses, both ostensibly summarizing evidence of the benefits to performance of task cohesion [18, 19], produced very different mean effects (d = .24 vs d = 1.00), indicating that the true benefit lies somewhere in a wide range from small to large. Thus, the lack of a reliable evidence base for the use of sport psychology techniques represents a significant gap in the knowledge base for practitioners and researchers alike. A comprehensive systematic review of all published meta-analyses in the field of sport psychology has yet to be published.
Purpose and aim
We consider this review to be both necessary and long overdue for the following reasons: (a) the extensive history of sport psychology and performance research; (b) the prior publication of many meta-analyses summarizing various aspects of sport psychology research in a piecemeal fashion [12, 17] but not its totality; and (c) the importance of better understanding and hopefully improving sport performance via the use of interventions based on solid evidence of their efficacy. Hence, we aimed to collate and evaluate this literature in a systematic way to gain improved understanding of the impact of sport psychology variables on sport performance by construct, research design, and meta-analysis quality, to enhance practical knowledge of sport psychology techniques and identify future lines of research inquiry. By systematically reviewing all identifiable meta-analytic reviews linking sport psychology techniques with sport performance, we aimed to evaluate the strength of the evidence base underpinning sport psychology interventions.
Materials and methods
This systematic review of meta-analyses followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines [20]. We did not register our systematic review protocol in a database. However, we specified our search strategy, inclusion criteria, data extraction, and data analyses in advance of writing our manuscript. All details of our work are available from the lead author. Concerning ethics, this systematic review received a waiver from Texas Tech University Human Subject Review Board as it concerned archival data (i.e., published meta-analyses).
Eligibility criteria
Published meta-analyses were retained for extensive examination if they met the following inclusion criteria: (a) reported meta-analytic data, such as between-group or within-group mean differences or correlations; (b) published prior to January 31, 2021; (c) published in a peer-reviewed journal; (d) investigated a recognized sport psychology construct; and (e) meta-analyzed data concerned with sport performance. No restriction was placed on language of publication. To align with our systematic review objectives, we gave careful consideration to study participants and performance outcomes. Across multiple checks, all authors confirmed study eligibility. Three authors (ML, AL, and PT) completed the final inclusion assessments.
Information sources
The authors searched electronic databases, reviewed their own records of relevant meta-analyses, and checked with personal research contacts. Electronic database searches were conducted in EBSCOhost with the following individual databases selected: APA PsycINFO, ERIC, Psychology and Behavioral Sciences Collection, and SPORTDiscus. An initial search concluded on October 1, 2020. ML, AL, and PT rechecked the identified studies during the February–March 2021 period, which resulted in the identification of two additional meta-analyses [21, 22].
Search protocol
ML and ES initially conducted independent database searches. For the first search, ML used the following search terms: sport psychology with meta-analysis or quantitative review and sport and performance or sport* performance. For the second search, ES used the chapter titles of a sport psychology textbook as search terms (e.g., goal setting). In EBSCOhost, both searches used the advanced search option, which provided three separate boxes for search terms, for example, box 1 (sport psychology), box 2 (meta-analysis), and box 3 (performance). Specific details of our search strategy were:
Search by ML:
- sport psychology, meta-analysis, sport and performance
- sport psychology, meta-analysis or quantitative review, sport* performance
- sport psychology, quantitative review, sport and performance
- sport psychology, quantitative review, sport* performance
Search by ES:
- mental practice or mental imagery or mental rehearsal and sports performance and meta-analysis
- goal setting and sports performance and meta-analysis
- anxiety and stress and sports performance and meta-analysis
- competition and sports performance and meta-analysis
- diversity and sports performance and meta-analysis
- cohesion and sports performance and meta-analysis
- imagery and sports performance and meta-analysis
- self-confidence and sports performance and meta-analysis
- concentration and sports performance and meta-analysis
- athletic injuries and sports performance and meta-analysis
- overtraining and sports performance and meta-analysis
- children and sports performance and meta-analysis
The following specific search of EBSCOhost with the SPORTDiscus, APA PsycINFO, Psychology and Behavioral Sciences Collection, and ERIC databases returned six results from 2002–2020, of which three were included [18, 19, 23] and three were excluded because they were not meta-analyses:
- Box 1 cohesion
- Box 2 sports performance
- Box 3 meta-analysis
Study selection
As detailed in the PRISMA flow chart (Fig 1) and the specified inclusion criteria, a thorough study selection process was used. As mentioned in the search protocol, two authors (ML and ES) engaged independently with two separate searches and then worked together to verify the selected studies. Next, AL and PT examined the selected study list for accuracy. ML, AL, and PT, whilst rating the quality of included meta-analyses, also re-examined all selected studies to verify that each met the predetermined study inclusion criteria. Throughout the study selection process, disagreements were resolved through discussion until consensus was reached.
Data extraction process
Initially, ML, TH, and ES extracted data items 1, 2, 3, and 8 (see Data items). Subsequently, ML, AL, and PT extracted the remaining data (items 4–7, 9, 10). Checks for potential discrepancies occurred during the extraction process (e.g., verifying the number of primary studies in a meta-analysis). It was unnecessary to contact any meta-analysis authors for missing information or clarification because all studies reported the required information. All identified meta-analyses were published in English; thus, neither translation software nor the assistance of native speakers was required. All data extraction forms (e.g., data items and individual meta-analysis quality) are available from the first author.
Data items
To help address our main aim, we extracted the following information from each meta-analysis: (1) author(s); (2) publication year; (3) construct(s); (4) intervention based meta-analysis (yes, no, mix); (5) performance outcome(s) description; (6) number of studies for the performance outcomes; (7) participant description; (8) main findings; (9) bias correction method/results; and (10) author(s) stated conclusions. For all information sought, we coded missing information as not reported.
Individual meta-analysis quality
ML, AL, and PT independently rated the quality of each meta-analysis against the following 25 points from the PRISMA checklist [20]: title; abstract (structured summary); introduction (rationale, objectives, and protocol and registration); methods (eligibility criteria, information sources, search, study selection, data collection process, data items, risk of bias of individual studies, summary measures, synthesis of results, and risk of bias across studies); results (study selection, study characteristics, risk of bias within studies, results of individual studies, synthesis of results, and risk of bias across studies); discussion (summary of evidence, limitations, and conclusions); and funding. All meta-analyses were rated for quality by two coders to facilitate inter-coder reliability checks, and the mean quality ratings were used in subsequent analyses. One author (PT), having completed his own ratings, received the incoming ratings from ML and AL and ran the inter-coder analysis. Two rounds of ratings occurred because of discrepancies for seven meta-analyses, mainly between ML and AL. As no objective quality categorizations currently exist (i.e., a point system for grouping meta-analyses as poor, medium, or good), each meta-analysis was allocated a quality score up to a maximum of 25 points. All coding records are available upon request.
Planned methods of analysis
We carried out several preplanned analyses. First, we assessed the mean quality rating of each meta-analysis based on our 25-point PRISMA-based rating system. Next, we used a median split of quality ratings to determine whether standardized mean differences (SMDs) differed between the two resulting categories, higher and lower quality meta-analyses. Meta-analysis authors reported either of two effect size metrics (i.e., r or SMD); hence we converted all correlational effects to SMD (i.e., Cohen’s d) values using an online effect size calculator (www.polyu.edu.hk/mm/effectsizefaqs/calculator/calculator.html). We interpreted the meaningfulness of effects based on Cohen’s guidelines [24], with 0.20 as small, 0.50 as medium, 0.80 as large, and 1.30 as very large. As some psychological variables associate negatively with performance (e.g., confusion [25], cognitive anxiety [26]) whereas others associate positively (e.g., cohesion [23], mental practice [14]), we grouped meta-analyses according to whether the hypothesized effect on performance was positive or negative and summarized the overall effects separately. By doing so, we avoided a scenario whereby the demonstrated positive and negative effects canceled one another out when combined. The effect of somatic anxiety on performance, which is hypothesized to follow an inverted-U relationship, was categorized as neutral [35]. Last, we grouped the included meta-analyses according to whether the primary studies were correlational in nature or involved an intervention and summarized these two groups of meta-analyses separately.
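For illustration, the r-to-d conversion and effect size categorization described above can be expressed in a few lines of Python. This is a minimal sketch: we assume the online calculator implements the standard transformation d = 2r/√(1 − r²), and the numeric values are examples drawn from the Introduction rather than outputs of our analyses.

```python
import math

def r_to_d(r: float) -> float:
    """Convert a correlation coefficient r to Cohen's d (standardized mean difference)."""
    return 2 * r / math.sqrt(1 - r ** 2)

def label_effect(d: float) -> str:
    """Categorize |d| using the Cohen (1988) thresholds adopted in this review."""
    size = abs(d)
    if size < 0.20:
        return "negligible"
    if size < 0.50:
        return "small"
    if size < 0.80:
        return "moderate"
    if size < 1.30:
        return "large"
    return "very large"

# Illustrative correlational effect converted to d, then categorized
r_example = 0.24
d_example = r_to_d(r_example)
print(f"r = {r_example} -> d = {d_example:.2f} ({label_effect(d_example)})")

# The two task-cohesion effects cited in the Introduction, categorized directly
for d in (0.24, 1.00):
    print(f"d = {d} ({label_effect(d)})")
```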
Results
Study characteristics
Table 1 contains extracted data from 30 meta-analyses meeting the inclusion criteria, dating from 1983 [14] to 2021 [21]. The number of primary studies within the meta-analyses ranged from three [27] to 109 [28]. In terms of the description of participants included in the meta-analyses, 13 included participants described simply as athletes, whereas other meta-analyses identified a mix of elite athletes (e.g., professional, Olympic), recreational athletes, college-aged volunteers (many from sport science departments), younger children to adolescents, and adult exercisers. Of the 30 included meta-analyses, the majority (n = 18) were published since 2010. The decadal breakdown of meta-analyses was 1980–1989 (n = 1 [14]), 1990–1999 (n = 6 [29–34]), 2000–2009 (n = 5 [23, 25, 26, 35, 36]), 2010–2019 (n = 12 [18, 19, 22, 27, 37–43, 48]), and 2020–2021 (n = 6 [21, 28, 44–47]).
As for the constructs covered, we categorized the 30 meta-analyses into the following areas: mental practice/imagery [14, 29, 30, 42, 46, 47], anxiety [26, 31, 32, 35], confidence [26, 35, 36], cohesion [18, 19, 23], goal orientation [22, 44, 48], mood [21, 25, 34], emotional intelligence [40], goal setting [33], interventions [37], mindfulness [27], music [28], neurofeedback training [43], perfectionism [39], pressure training [45], quiet eye training [41], and self-talk [38]. Multiple effects were generated from meta-analyses that included more than one construct (e.g., tension, depression, etc. [21]; anxiety and confidence [26]). In relation to whether the meta-analyses included in our review assessed the effects of a sport psychology intervention on performance or relationships between psychological constructs and performance, 13 were intervention-based, 14 were correlational, two included a mix of study types, and one included a large majority of cross-sectional studies (Table 1).
A wide variety of performance outcomes across many sports was evident, such as golf putting, dart throwing, maximal strength, and juggling, as well as categorical outcomes such as win/loss and Olympic team selection. Given the extensive list of performance outcomes and the incomplete descriptions provided in some meta-analyses, a clear categorization or count of performance types was not possible. It is sufficient to conclude that researchers utilized many performance outcomes across a wide range of team and individual sports, motor skills, and strength and aerobic tasks.
Effect size data and bias correction
To best summarize the effects, we transformed all correlations to SMD values (i.e., Cohen’s d). Across all included meta-analyses shown in Table 2 and depicted in Fig 2, we identified 61 effects. After correcting for bias, we assessed the effect size values for meaningfulness [24]: 15 were categorized as negligible (< ±0.20), 29 as small (±0.20 to < 0.50), 13 as moderate (±0.50 to < 0.80), 2 as large (±0.80 to < 1.30), and 1 as very large (≥ 1.30).
Study quality rating results and summary analyses
Following our PRISMA quality ratings, intercoder reliability coefficients were initially .83 (ML, AL), .95 (ML, PT), and .90 (AL, PT), with a mean intercoder reliability coefficient of .89. To achieve improved reliability (i.e., mean r > .90), ML and AL re-examined their ratings. As a result, intercoder reliability increased to .98 (ML, AL), .96 (ML, PT), and .92 (AL, PT), for a mean intercoder reliability coefficient of .95. Final quality ratings (i.e., the mean of two coders) ranged from 13 to 25 (M = 19.03 ± 4.15). Our median split into higher (M = 22.83 ± 1.08, range 21.5–25, n = 15) and lower (M = 15.47 ± 2.42, range 13–20.5, n = 15) quality groups produced a significant between-group difference in quality (F(1, 28) = 115.62, p < .001); hence, the median split met our intended purpose. The higher quality meta-analyses were published from 2015 to 2021 (median 2018) and the lower quality meta-analyses from 1983 to 2014 (median 2000). It appears that meta-analysis standards have risen over the years since the PRISMA criteria were first introduced in 2009. All data for our analyses are shown in Table 2.
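A rough sketch of the reliability check and median split described above is shown below. It assumes the intercoder reliability coefficients are Pearson correlations and uses invented ratings for two coders; it is illustrative only and does not reproduce the actual study data or the software used.

```python
import numpy as np
from scipy.stats import pearsonr, f_oneway

# Hypothetical 25-point PRISMA quality ratings from two coders (illustrative values only)
coder_a = np.array([14, 16, 22, 25, 13, 21, 18, 24, 15, 23], dtype=float)
coder_b = np.array([13, 17, 23, 25, 14, 22, 19, 24, 15, 22], dtype=float)

# Inter-coder reliability expressed as a Pearson correlation between the two sets of ratings
r, _ = pearsonr(coder_a, coder_b)
print(f"inter-coder r = {r:.2f}")

# Final quality rating = mean of the two coders, then a median split into lower/higher groups
mean_rating = (coder_a + coder_b) / 2
cutoff = np.median(mean_rating)
lower = mean_rating[mean_rating <= cutoff]
higher = mean_rating[mean_rating > cutoff]

# One-way ANOVA confirming the two quality groups differ, as reported in the text
f_stat, p_val = f_oneway(lower, higher)
print(f"F = {f_stat:.2f}, p = {p_val:.4f}")
```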
Table 3 contains summary statistics with bias-corrected values used in the analyses. The overall mean effect for sport psychology constructs hypothesized to have a positive impact on performance was of moderate magnitude (d = 0.51, 95% CI = 0.42, 0.58, n = 36). The overall mean effect for sport psychology constructs hypothesized to have a negative impact on performance was small in magnitude (d = -0.21, 95% CI = -0.31, -0.11, n = 24). In both instances, effects were larger, although not significantly so, among meta-analyses of higher quality compared to those of lower quality. Similarly, mean effects were larger, but not significantly so, where reported effects in the original studies were based on interventional rather than correlational designs. This trend only applied to hypothesized positive effects because none of the original studies in the meta-analyses related to hypothesized negative effects used interventional designs.
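The sketch below shows one simple way to summarize a set of bias-corrected d values with a mean and 95% confidence interval. Because the exact pooling procedure (e.g., whether effects were weighted) is not detailed here, this unweighted, t-based summary and the example values are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

def mean_effect_ci(d_values, confidence=0.95):
    """Unweighted mean of effect sizes with a t-based confidence interval (illustrative only)."""
    d = np.asarray(d_values, dtype=float)
    mean = d.mean()
    se = d.std(ddof=1) / np.sqrt(d.size)
    half_width = stats.t.ppf((1 + confidence) / 2, df=d.size - 1) * se
    return mean, (mean - half_width, mean + half_width)

# Hypothetical bias-corrected d values for constructs hypothesized to benefit performance
positive_effects = [0.15, 0.22, 0.35, 0.49, 0.51, 0.62, 0.82, 1.00, 1.35]
mean_d, (ci_low, ci_high) = mean_effect_ci(positive_effects)
print(f"d = {mean_d:.2f}, 95% CI = {ci_low:.2f}, {ci_high:.2f}")
```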
Discussion
In this systematic review of meta-analyses, we synthesized the available evidence regarding effects of sport psychology interventions/constructs on sport performance. We aimed to consolidate the literature, evaluate the potential for meta-analysis quality to influence the results, and suggest recommendations for future research at both the single study and quantitative review stages. During the systematic review process, several meta-analysis characteristics came to light, such as the number of meta-analyses of sport psychology interventions (experimental designs) compared to those summarizing the effects of psychological constructs (correlation designs) on performance, the number of meta-analyses with exclusively athletes as participants, and constructs featuring in multiple meta-analyses, some of which (e.g., cohesion) produced very different effect size values. Thus, although our overall aim was to evaluate the strength of the evidence base for use of psychological interventions in sport, we also discuss the impact of these meta-analysis characteristics on the reliability of the evidence.
When seen collectively, results of our review are supportive of using sport psychology techniques to help improve performance and confirm that variations in psychological constructs relate to variations in performance. For constructs hypothesized to have a positive effect on performance, the mean effect strength was moderate (d = 0.51) although there was substantial variation between constructs. For example, the beneficial effects on performance of task cohesion (d = 1.00) and self-efficacy (d = 0.82) are large, and the available evidence base for use of mindfulness interventions suggests a very large beneficial effect on performance (d = 1.35). Conversely, some hypothetically beneficial effects (2 of 36; 5.6%) were in the negligible-to-small range (0.15–0.20) and most beneficial effects (19 of 36; 52.8%) were in the small-to-moderate range (0.22–0.49). It should be noted that in the world of sport, especially at the elite level, even a small beneficial effect on performance derived from a psychological intervention may prove the difference between success and failure and hence small effects may be of great practical value. To put the scale of the benefits into perspective, an authoritative and extensively cited review of healthy eating and physical activity interventions [49] produced an overall pooled effect size of 0.31 (compared to 0.51 for our study), suggesting sport psychology interventions designed to improve performance are generally more effective than interventions designed to promote healthy living.
Among hypothetically negative effects (e.g., ego climate, cognitive anxiety, depression), the mean detrimental effect was small (d = -0.21) although again substantial variation among constructs was evident. Some hypothetically negative constructs (5 of 24; 20.8%) were found to actually provide benefits to performance, albeit in the negligible range (0.02–0.12) and only two constructs (8.3%), both from Lochbaum and colleagues’ POMS meta-analysis [21], were shown to negatively affect performance above a moderate level (depression: d = -0.64; total mood disturbance, which incorporates the depression subscale: d = -0.84). Readers should note that the POMS and its derivatives assess six specific mood dimensions rather than the mood construct more broadly, and therefore results should not be extrapolated to other dimensions of mood [50].
Mean effects were larger among higher quality than lower quality meta-analyses for both hypothetically positive (d = 0.54 vs d = 0.45) and negative effects (d = -0.25 vs d = -0.17), but in neither case were the differences significant. It is reasonable to assume that the higher quality meta-analyses provide the better estimates of the true effects, although our conclusions remain the same regardless of study quality. Overall, our findings provide a more rigorous evidence base for the use of sport psychology techniques by practitioners than was previously available, representing a significant contribution to knowledge. Moreover, our systematic scrutiny of 30 meta-analyses published between 1983 and 2021 has facilitated a series of recommendations to improve the quality of future investigations in the sport psychology area.
Recommendations
The development of sport psychology as an academic discipline and area of professional practice relies on using evidence and theory to guide practice. Hence, a strong evidence base for the applied work of sport psychologists is of paramount importance. Although the beneficial effects of some sport psychology techniques are small, it is important to note the larger performance benefits for other techniques, which may be extremely meaningful for applied practice. Overall, however, especially given the heterogeneity of the observed effects, it would be wise for applied practitioners to avoid overpromising the benefits of sport psychology services to clients and perhaps underdelivering as a result [1].
The results of our systematic review can be used to generate recommendations for how the profession might conduct improved research to better inform applied practice. Much of the early research in sport psychology was exploratory, and potential moderating variables were not always sufficiently controlled. Terry [51] outlined this in relation to the study of mood-performance relationships, identifying that physical and skill factors will very likely exert a greater influence on performance than psychological factors. Further, type of sport (e.g., individual vs. team), duration of activity (e.g., short vs. long duration), level of competition (e.g., elite vs. recreational), and performance measure (e.g., norm-referenced vs. self-referenced) have all been implicated as potential moderators of the relationship between psychological variables and sport performance [51]. To detect the relatively subtle effects of psychological variables on performance, research designs need to be sufficiently sensitive to such potential confounds. Several specific methodological issues are worth discussing.
The first issue relates to measurement. Investigating the strength of a relationship requires the measured variables to be valid, accurate, and reliable. Psychological variables in the meta-analyses we reviewed relied primarily on self-report measures. The accuracy of self-report data requires detailed inner knowledge of thoughts, emotions, and behavior, and research shows that this accuracy is subject to substantial individual differences [52, 53]. Therefore, self-report data are, at best, an estimate of the construct being measured. Measurement issues are especially relevant to the assessment of performance, and considerable measurement variation was evident between meta-analyses. Some performance measures were more sensitive than others, especially those assessing physical performance relative to what is normal for the individual performer (i.e., self-referenced performance). Hence, having multiple baseline indicators of performance increases the probability of identifying genuine performance enhancement derived from a psychological intervention [54].
A second issue relates to clarifying the rationale for how and why specific psychological variables might influence performance. A comprehensive review of prerequisites and precursors of athletic talent [55] concluded that the superiority of Olympic champions over other elite athletes is determined in part by a range of psychological variables, including high intrinsic motivation, determination, dedication, persistence, and creativity, thereby identifying performance-related variables that might benefit from a psychological intervention. Identifying variables that influence the effectiveness of interventions is a challenging but essential issue for researchers seeking to control and assess factors that might influence results [49]. A key part of this process is to use theory to propose the mechanism(s) by which an intervention might affect performance and to hypothesize how large the effect might be.
A third issue relates to the characteristics of the research participants involved. Out of convenience, it is not uncommon for researchers to use undergraduate student participants for research projects, which may bias results and restrict the generalization of findings to the population of primary interest, often elite athletes. The level of training and physical conditioning of participants will clearly influence their performance. Highly trained athletes will typically make smaller gains in performance over time than novice athletes, due to a ceiling effect (i.e., they have less room for improvement). For example, consider Runner A, who takes 20 minutes to run 5 km one week but 19 minutes the next week, and Runner B, who takes 30 minutes one week and 25 minutes the next. If we compare the two, Runner A runs faster than Runner B on both occasions, but Runner B improved more, so whose performance was better? If we also consider Runner C, a highly trained athlete with a personal best of 14 minutes, running 1 minute quicker the following week would require close to a world record time, which is clearly unlikely. For this runner, an improvement of a few seconds would represent an excellent performance. Evidence shows that trained, highly motivated athletes may reach performance plateaus and as such are good candidates for psychological skills training. They are less likely to make performance gains due to increased training volume, and therefore the impact of psychological skills interventions may emerge more clearly. Therefore, both test-retest and cross-sectional research designs should account for individual difference variables. Further, the range of relevant individual difference factors will be context specific; for example, individual differences in strength will be more important in a study that uses weightlifting as the performance measure than in one that uses darts, where individual differences in skill would be more important.
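To make the runner example concrete, the short calculation below contrasts absolute and relative week-to-week improvement for the three hypothetical runners; Runner C's second time is our own illustrative assumption of a few seconds' gain.

```python
# Absolute and relative improvement for the three hypothetical runners in the text
runners = {
    "A": (20.0, 19.0),   # 5 km times in minutes: week 1, week 2
    "B": (30.0, 25.0),
    "C": (14.0, 13.9),   # assumed: a few seconds' gain for a highly trained athlete
}

for name, (week1, week2) in runners.items():
    absolute_gain = week1 - week2
    relative_gain = 100 * absolute_gain / week1
    print(f"Runner {name}: {absolute_gain:.1f} min faster ({relative_gain:.1f}% improvement)")
```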
A fourth factor that has not been investigated extensively relates to the variables involved in learning sport psychology techniques. Techniques such as imagery, self-talk, and goal setting all require cognitive processing, and as such some people will learn them faster than others [56]. Further, some people are intuitive, self-taught users of mood regulation strategies (e.g., abdominal breathing, listening to music) who, if recruited to participate in a study investigating the effects of learning such techniques on performance, would respond differently from novice users. Hence, a major challenge when testing the effects of a psychological intervention is to establish suitable controls. A traditional non-treatment group offers one option, but such an approach does not consider the influence of belief effects (i.e., placebo/nocebo), which can either add to or detract from the effectiveness of performance interventions [57]. If an individual believes that an intervention will be effective, this provides a motivating effect for engagement, and so performance may improve via increased effort rather than the effect of the intervention per se.
When there are positive beliefs that an intervention will work, it becomes important to distinguish belief effects from the proposed mechanism through which the intervention should be successful. Research has shown that field studies often report larger effects than laboratory studies, a finding attributed to higher motivation among participants in field studies [58]. If participants are motivated to improve, being part of an active training condition should be associated with improved performance regardless of any intervention. In a large online study of over 44,000 participants, active training in sport psychology interventions was associated with improved performance, but only marginally more than an active control condition [59]. The study involved four-time Olympic champion Michael Johnson narrating both the intervention and the active control, with motivational encouragement used in both conditions. Researchers should establish not only the expected size of an effect but also specify and assess why the intervention worked. Where researchers report performance improvement, it is fundamental to explain the proposed mechanism by which performance was enhanced and to test the extent to which the improvement can be explained by the proposed mechanism(s).
Limitations
Systematic reviews are inherently limited by the quality of the primary studies included. Our review was also limited by the quality of the meta-analyses that had summarized the primary studies. We identified the following specific limitations: (1) only 12 meta-analyses summarized primary studies that were exclusively intervention-based; (2) control groups in the intervention meta-analyses were described in little detail; (3) cross-sectional and correlation-based meta-analyses by definition do not test causation and therefore provide limited direct evidence of the efficacy of interventions; (4) the array of performance measures was extensive, even within a single meta-analysis; (5) mechanistic explanations for the observed effects were absent; and (6) intervention-based meta-analyses lacked detail regarding the number of sessions, participants’ motivation to participate, level of expertise, and how the intervention was delivered. To ameliorate these concerns, we included a quality rating for all included meta-analyses. Having created higher and lower quality groups using a median split of quality ratings, we showed that effects were larger, although not significantly so, in the higher quality group of meta-analyses, all of which were published since 2015.
Conclusions
Journals are full of studies that investigate relationships between psychological variables and sport performance. Since 1983, researchers have utilized meta-analytic methods to summarize these single studies, and the pace is accelerating, with six relevant meta-analyses published since 2020. Unquestionably, sport psychology and performance research is fraught with limitations related to unsophisticated experimental designs. In our aggregation of the effect size values, most were small-to-moderate in meaningfulness, with a handful of large values. Whether these moderate and large values could be replicated using more sophisticated research designs is unknown. We encourage the use of improved research designs, at a minimum the use of control conditions. Likewise, we encourage researchers to adhere to meta-analytic guidelines such as PRISMA, and for journals to insist on such adherence as a prerequisite for the acceptance of reviews. Although such guidelines can appear to be a ‘painting by numbers’ approach, while reviewing the meta-analyses we frequently encountered difficulty in locating the information needed for our study characteristics and quality ratings. In conclusion, much research exists in the form of quantitative reviews of studies published since 1934, just over a century after the very first publication about sport psychology and performance [2]. Sport psychology is now truly global in terms of academic pursuits and professional practice, and the need for best practice information plus a strong evidence base for the efficacy of interventions is paramount. We should strive as a profession to research and provide best practices to athletes and the general community of those seeking performance improvements.
Acknowledgments
We acknowledge the work of all academics since Koch in 1830 [2] for their efforts to research and promote the practice of applied sport psychology.
References
- 1. Terry PC. Applied sport psychology. In: IAAP Handbook of Applied Psychology. Wiley-Blackwell; 2011. pp. 386–410.
- 2. Koch CF. Die Gymnastik aus dem Gesichtspunkte der Diätetik und Psychologie [Calisthenics from the Viewpoint of Dietetics and Psychology]. Magdeburg, Germany: Creutz; 1830.
- 3. Chroni S, Abrahamsen F. History of Sport, Exercise, and Performance Psychology in Europe. Oxford Research Encyclopedia of Psychology. Oxford: Oxford University Press; 2017. https://doi.org/10.1093/acrefore/9780190236557.013.135
- 4. Rieger K. Der Hypnotismus: Psychiatrische Beiträge zur Kenntniss der Sogenannten Hypnotischen Zustände [Hypnotism: Psychiatric Contributions to the Knowledge of the So-called Hypnotic States]. Würzburg, Germany: University of Würzburg; 1884.
- 5. Mosso A. La fatica [Fatigue]. Milan, Italy: Treves; 1891 [trans. 1904].
- 6. Tissié P. Concernant un record velocipédique. Archives de Physiologie Normale et Pathologique. 1894a; 837.
- 7. Tissié P. L’entraînement physique. La Revue Scientifique. 1894b; 518.
- 8. de Coubertin P. La psychologie du sport. La Revue des Deux Mondes. 1900;70:161–179.
- 9. Hanrahan SJ, Andersen MB, editors. Routledge Handbook of Applied Sport Psychology. London: Routledge; 2010.
- 10. Schinke RJ, McGannon KR, Smith B, editors. Routledge International Handbook of Sport Psychology. London: Routledge; 2016.
- 11. Bertollo M, Filho E, Terry PC. Advancements in Mental Skills Training: International Perspectives on Key Issues in Sport and Exercise Psychology. London: Routledge; 2021.
- 12. Hagger MS. Meta-analysis in sport and exercise research: Review, recent developments, and recommendations. Euro J Sport Sci. 2006 Jun;6(2):103–15.
- 13. Chatzisarantis NLD, Stoica A. A primer on the understanding of meta-analysis. Psychol Sport Exer. 2009 Sep;10(5):498–501.
- 14. Feltz DL, Landers DM. The effects of mental practice on motor skill learning and performance: A meta-analysis. J Sport Psychol. 1983 Mar;5(1):25–57.
- 15. Bond CF, Titus LJ. Social facilitation: A meta-analysis of 241 studies. Psychol Bull. 1983;94(2):265–92. pmid:6356198
- 16. Triplett N. The dynamogenic factors in pace-making and competition. Amer J Psychol. 1898 Jul;9(4):507–533.
- 17. Lochbaum M. Understanding the meaningfulness and potential impact of sports psychology on performance. In: Milanović D, Sporiš G, Šalaj S, Škegro D, editors. Proceedings book of 8th International Scientific Conference on Kinesiology, Opatija. Zagreb, Croatia: University of Zagreb, Faculty of Kinesiology; 2017. pp. 486–489.
- 18. Castaño N, Watts T, Tekleab AG. A reexamination of the cohesion–performance relationship meta-analyses: A comprehensive approach. Group Dyn Theory Res Pract. 2013 Dec;17(4):207–31.
- 19. Filho E, Dobersek U, Gershgoren L, Becker B, Tenenbaum G. The cohesion–performance relationship in sport: A 10-year retrospective meta-analysis. Sport Sci Health. 2014 Jun 20;10(3):165–77.
- 20. Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ. 2021;372:n71. pmid:33782057
- 21. Lochbaum M, Zanatta T, Kirschling D, May E. The Profile of Mood States and athletic performance: A meta-analysis of published studies. Euro J Invest Health Psychol Edu. 2021 Jan 13;11(1):50–70. pmid:34542449
- 22. Harwood CG, Keegan RJ, Smith JMJ, Raine AS. A systematic review of the intrapersonal correlates of motivational climate perceptions in sport and physical activity. Psychol Sport Exer. 2015 May;18:9–25.
- 23. Carron AV, Colman MM, Wheeler J, Stevens D. Cohesion and performance in sport: A meta-analysis. J Sport Exer Psychol. 2002 Jun;24(2):168–88.
- 24. Cohen J. Statistical Power Analysis for the Behavioral Sciences. New York: Routledge Academic; 1988.
- 25. Beedie CJ, Terry PC, Lane AM. The profile of mood states and athletic performance: Two meta-analyses. J Appl Sport Psychol. 2000 Mar;12(1):49–68.
- 26. Woodman T, Hardy L. The relative impact of cognitive anxiety and self-confidence upon sport performance: A meta-analysis. J Sports Sci. 2003 Jan;21(6):443–57. pmid:12846532
- 27. Bühlmayer L, Birrer D, Röthlin P, Faude O, Donath L. Effects of mindfulness practice on performance-relevant parameters and performance outcomes in sports: A meta-analytical review. Sports Med. 2017 Jun 29;47(11):2309–21. pmid:28664327
- 28. Terry PC, Karageorghis CI, Curran ML, Martin OV, Parsons-Smith RL. Effects of music in exercise and sport: A meta-analytic review. Psychol Bull. 2020 Feb;146(2):91–117. pmid:31804098
- 29. Driskell JE, Copper C, Moran A. Does mental practice enhance performance? J Appl Psychol. 1994;79(4):481–92.
- 30. Hinshaw KE. The effects of mental practice on motor skill performance: Critical evaluation and meta-analysis. Imagin Cogn Pers. 1991 Sep;11(1):3–35.
- 31. Jokela M, Hanin YL. Does the individual zones of optimal functioning model discriminate between successful and less successful athletes? A meta-analysis. J Sport Sci. 1999 Jan;17(11):873–87. pmid:10585167
- 32. Kleine D. Anxiety and sport performance: A meta-analysis. Anxiety Res. 1990 Jan;2(2):113–31.
- 33. Kyllo LB, Landers DM. Goal setting in sport and exercise: A research synthesis to resolve the controversy. J Sport Exer Psychol. 1995 Jun;17(2):117–37.
- 34. Rowley AJ, Landers DM, Kyllo LB, Etnier JL. Does the iceberg profile discriminate between successful and less successful athletes? A meta-analysis. J Sport Exer Psychol. 1995 Jun;17(2):185–99.
- 35. Craft LL, Magyar TM, Becker BJ, Feltz DL. The relationship between the competitive state anxiety inventory-2 and sport performance: A meta-analysis. J Sport Exer Psychol. 2003 Mar;25(1):44–65.
- 36. Moritz SE, Feltz DL, Fahrbach KR, Mack DE. The relation of self-efficacy measures to sport performance: A meta-analytic review. Res Q Exer Sport. 2000 Sep;71(3):280–94. pmid:10999265
- 37. Brown DJ, Fletcher D. Effects of psychological and psychosocial interventions on sport performance: A meta-analysis. Sports Med. 2016 May 30;47(1):77–99. pmid:27241124
- 38. Hatzigeorgiadis A, Zourbanos N, Galanis E, Theodorakis Y. Self-talk and sports performance. Perspect Psychol Sci. 2011 Jul;6(4):348–56. pmid:26167788
- 39. Hill AP, Mallinson-Howard SH, Jowett GE. Multidimensional perfectionism in sport: A meta-analytical review. Sport Exer Perform Psychol. 2018 Aug;7(3):235–70.
- 40. Kopp A, Jekauc D. The influence of emotional intelligence on performance in competitive sports: A meta-analytical investigation. Sports. 2018 Dec 13;6(4):175. pmid:30551649
- 41. Lebeau J-C, Liu S, Sáenz-Moncaleano C, Sanduvete-Chaves S, Chacón-Moscoso S, Becker BJ, et al. Quiet eye and performance in sport: A meta-analysis. J Sport Exer Psychol. 2016 Oct;38(5):441–57. pmid:27633956
- 42. Paravlic AH, Slimani M, Tod D, Marusic U, Milanovic Z, Pisot R. Effects and dose–response relationships of motor imagery practice on strength development in healthy adult populations: A systematic review and meta-analysis. Sports Med. 2018 Mar 14;48(5):1165–87. pmid:29541965
- 43. Xiang M-Q, Hou X-H, Liao B-G, Liao J-W, Hu M. The effect of neurofeedback training for sport performance in athletes: A meta-analysis. Psychol Sport Exer. 2018 May;36:114–22.
- 44. Ivarsson A, Kilhage-Persson A, Martindale R, Priestley D, Huijgen B, Ardern C, et al. Psychological factors and future performance of football players: A systematic review with meta-analysis. J Sci Med Sport. 2020 Apr;23(4):415–20. pmid:31753742
- 45. Low WR, Sandercock GRH, Freeman P, Winter ME, Butt J, Maynard I. Pressure training for performance domains: A meta-analysis. Sport Exer Perform Psychol. 2021 Feb;10(1):149–63.
- 46. Simonsmeier BA, Androniea M, Buecker S, Frank C. The effects of imagery interventions in sports: A meta-analysis. Int Rev Sport Exer Psychol. 2020 Jun 23;1–22.
- 47. Toth AJ, McNeill E, Hayes K, Moran AP, Campbell M. Does mental practice still enhance performance? A 24 Year follow-up and meta-analytic replication and extension. Psychol Sport Exer. 2020 May;48:101672.
- 48. Lochbaum M, Gottardy J. A meta-analytic review of the approach-avoidance achievement goals and performance relationships in the sport psychology literature. J Sport Health Sci. 2015 Jun;4(2):164–73.
- 49. Michie S, Abraham C, Whittington C, McAteer J, Gupta S. Effective techniques in healthy eating and physical activity interventions: A meta-regression. Health Psychol. 2009 Nov;28(6):690–701. pmid:19916637
- 50. Ekkekakis P. The Measurement of Affect, Mood, and Emotion. Cambridge: Cambridge University Press; 2013.
- 51. Terry PC. The efficacy of mood state profiling among elite performers: A review and synthesis. Sport Psychol. 1995;9(3):309–324.
- 52. Kotsou I, Mikolajczak M, Heeren A, Grégoire J, Leys C. Improving emotional intelligence: A systematic review of existing work and future challenges. Emot Rev. 2019;11(2):151–165.
- 53. Thornton MA, Weaverdyck ME, Mildner JN, et al. People represent their own mental states more distinctly than those of others. Nat Commun. 2019;10:2117. pmid:31073156
- 54. Hopkins WG, Hawley JA, Burke LM. Design and analysis of research on sport performance enhancement. Med Sci Sports Exer. 1999 Mar;31(3):472–485. pmid:10188754
- 55. Issurin VB. Evidence-based prerequisites and precursors of athletic talent: A review. Sports Med. 2017 Oct;47(10):1993–2010. pmid:28493064
- 56. Cumming J, Eaves DL. The nature, measurement, and development of imagery ability. Imagin Cogn Pers. 2018;37(4):375–393.
- 57. Beedie CJ, Foad AJ. The placebo effect in sports performance: A brief review. Sports Med. 2009;39(4):313–29. pmid:19317519
- 58. Sheeran P, Webb TL. The intention-behavior gap. Soc Pers Psychol Comp. 2016 Sep;10(9):503–18.
- 59. Lane AM, Totterdell P, MacDonald I, et al. Brief online training enhances competitive performance: Findings of the BBC Lab UK Psychological Skills Intervention Study. Front Psychol. 2016;7:413. pmid:27065904