
Impact of a Multifaceted and Clinically Integrated Training Program in Evidence-Based Practice on Knowledge, Skills, Beliefs and Behaviour among Clinical Instructors in Physiotherapy: A Non-Randomized Controlled Study

  • Nina Rydland Olsen,

    Affiliations Centre for Evidence Based Practice, Faculty of Health and Social Sciences, Bergen University College, Bergen, Norway, Physiotherapy Research Group, Department of Global Public Health and Primary Care, University of Bergen, Bergen, Norway

  • Peter Bradley,

    Affiliation Public Health Wales, Cardiff, United Kingdom

  • Birgitte Espehaug,

    Affiliation Centre for Evidence Based Practice, Faculty of Health and Social Sciences, Bergen University College, Bergen, Norway

  • Monica Wammen Nortvedt,

    Affiliation Centre for Evidence Based Practice, Faculty of Health and Social Sciences, Bergen University College, Bergen, Norway

  • Hildegunn Lygren,

    Affiliations Department of Physiotherapy, Faculty of Health and Social Sciences, Bergen University College, Bergen, Norway, Department of Physiotherapy, Haukeland University Hospital, Bergen, Norway

  • Bente Frisk,

    Affiliations Centre for Evidence Based Practice, Faculty of Health and Social Sciences, Bergen University College, Bergen, Norway, Department of Physiotherapy, Haukeland University Hospital, Bergen, Norway

  • Jan Magnus Bjordal

    Affiliations Centre for Evidence Based Practice, Faculty of Health and Social Sciences, Bergen University College, Bergen, Norway, Physiotherapy Research Group, Department of Global Public Health and Primary Care, University of Bergen, Bergen, Norway


Background and Purpose

Physiotherapists practicing at clinical placement sites who are assigned the role of clinical instructor (CI) are responsible for supervising physiotherapy students. For CIs to serve as role models of evidence-based practice (EBP), they need EBP competence. The aim of this study was to assess the short- and long-term impact of a six-month multifaceted and clinically integrated training program in EBP on the knowledge, skills, beliefs and behaviour of CIs supervising physiotherapy students.

Methods

We invited 37 CIs to participate in this non-randomized controlled study. Three self-administered questionnaires were used pre- and post-intervention, and at six-month follow-up: 1) The Adapted Fresno test (AFT), 2) the EBP Belief Scale and 3) the EBP Implementation Scale. The analysis approach was linear regression modeling using Generalized Estimating Equations.

Results

In total, 29 CIs agreed to participate in the study: 14 were invited to the intervention group and 15 to the control group. One participant in the intervention group and five in the control group were lost to follow-up. At follow-up, the group difference was statistically significant for the AFT (mean difference = 37, 95% CI (15.9–58.1), p<0.001) and the EBP Beliefs scale (mean difference = 8.1, 95% CI (3.1–13.2), p = 0.002), but not for the EBP Implementation scale (mean difference = 1.8, 95% CI (-4.5 to 8.1), p = 0.574). Comparing measurements over time, we found a statistically significant increase in mean scores on all outcome measures for the intervention group only.

Conclusion

A multifaceted and clinically integrated training program in EBP was successful in improving EBP knowledge, skills and beliefs among CIs. Future studies need to address how to ensure long-term EBP behaviour change, in addition to assessing CIs’ abilities to apply EBP knowledge and skills when supervising students.

Introduction

Health care professionals are expected to make evidence-based clinical decisions. Evidence-based practice (EBP) involves integrating the best available research evidence with clinical expertise and patient values, within the context of available resources [1,2]. With the publication of the Sicily statement [1], it became clear that health care professionals should incorporate the necessary knowledge, skills and attitudes of EBP into their training and registration requirements. Curricula should be based on the five EBP steps and processes: asking clinical questions, searching for and appraising research evidence, integrating the evidence into clinical practice and evaluating this process [1]. To ensure that future health care graduates learn how to incorporate the EBP steps into their own life-long learning and patient care, EBP should be an integral part of learning throughout their training, including clinical education [1,3].

Clinical education is recognized as an important element in physiotherapy education ([4], p. 125, [5]) and provides an opportunity to learn EBP in authentic clinical settings. Learning in clinical settings requires total immersion to learn the culture and norms of the profession [6], by actively engaging and participating within a community of practice [7,8] comprised of physiotherapists, other professionals, other students, patients and their family members [9]. Engagement in the community of practice also depends on a formal mentor: the clinical instructor (CI) [10]. The CI is a practicing clinical physiotherapist at the clinical placement site and is responsible for supervising physiotherapy students [5]. Physiotherapists volunteer or are assigned to be CIs. The influence of the CI is recognized as central to students’ knowledge growth and professional development [11,12]. In addition to mentoring and supervising students, the CI should serve as a role model [5]. Results from several studies within different disciplines indicate that students lack role models with strong skills in EBP, in particular among their CIs [13–18]. CIs themselves recognize that they need training in EBP [15,19,20]. The effect of EBP training among undergraduates, postgraduates and practicing health care professionals has been studied extensively, and interventions of varying content, format and duration have been evaluated [21]. Multifaceted interventions that combine teaching strategies, are clinically integrated and involve assessment lead to improvements in EBP knowledge, skills and attitudes amongst learners [21]. Among practicing health professionals, such interventions also lead to improvements in EBP behaviour [21]. The impact of teaching EBP to CIs has been specifically addressed only in a small number of uncontrolled before-and-after studies [22–24]. Hagler et al. [22] tested the impact of an EBP workshop among staff nurse preceptors (CIs), and found that preceptors’ EBP beliefs improved. Kouhpayehzadeh et al. [23] tested the impact of an EBP workshop among clinical teachers in medicine, and found improvements relating to EBP attitudes and skills. Weberschock et al. [24] tested the effect of a web-based course on how to teach evidence-based medicine principles, and found that knowledge improved among clinical teachers in medicine. In addition to these uncontrolled before-and-after studies, Mohide & Matthew-Maich [25] piloted and evaluated an EBP workshop among nursing preceptor-student pairs. Telephone interviews indicated improvements with regard to EBP attitudes, skills, and sharing of EBP knowledge and skills with colleagues [25]. To the best of our knowledge, no controlled studies have been conducted on the impact of teaching EBP to CIs. The aim of this non-randomized controlled study was to assess the short- and long-term impact of an EBP program on the knowledge, skills, beliefs and behaviour of CIs supervising physiotherapy students.

Methods

Design and participants

We conducted a non-randomized controlled study, with a six-month follow-up, to assess the impact of an EBP program on CIs’ EBP knowledge, skills, beliefs and behaviour. Self-administered questionnaires were used among CIs before and after the intervention, and at six-month follow-up. The study was carried out from September 2008 to November 2009. Exemption from obtaining ethical approval was granted by the Regional Committee for Medical and Health Research Ethics, Western-Norway. The study was approved by the Norwegian Social Science Data Services (NSD). Written informed consent was obtained prior to the intervention. The study was supported by all of the involved institutions: Bergen University College (BUC), Haraldsplass Deaconess Hospital (HDS) and Haukeland University Hospital (HUS).

BUC is one of four university colleges in Norway that offers a three-year bachelor program (180 ECTS-credits) in physiotherapy [26,27]. During the three-year bachelor program, physiotherapy students spend 30 weeks in clinical work, in various clinical settings: primary health care, outpatient clinics, rehabilitation clinics, local hospitals and university hospitals. During clinical placement, students are supervised by CIs who are physiotherapists practicing at the clinical placement site.

At the start of the study (August 2008), 37 physiotherapists at different hospitals in Norway were assigned the role as CIs for 3rd year physiotherapy students (BUC), for the students’ final 10-week clinical placement in January 2009. These CIs (n = 37) were asked to volunteer for the study via e-mail.

We invited CIs at hospitals situated in Bergen to participate in the intervention group (n = 17), and CIs at hospitals situated outside Bergen to participate in the control group (n = 20) (Fig 1). This decision was based on the time and cost of travelling: hospitals outside Bergen are situated far apart, the distances from Bergen can be fairly large (up to 400 km), and often only one or two CIs work at each of these hospitals.

The intervention

Intervention content and delivery.

The intervention consisted of a multifaceted and clinically integrated training program in EBP (6 ECTS-credits), delivered to CIs over a six-month period (October 2008–April 2009) (Table 1). The training program was multifaceted in that it combined several teaching strategies: workshops, assignments, supervision and exams. The workshops were a mixture of lectures (didactic sessions) and small-group activities that required participants to be interactive. Four half-day workshop sessions covering the EBP steps and processes were held in classrooms at hospital settings (HUS) and delivered sequentially over a six-week period (Table 1). Between and after workshops, five individual written assignments (Week 2, 4, 7, 1, 21) were required from the participants. To ensure clinically integrated learning of EBP, each assignment required participants to reflect on and describe how to apply the EBP steps in real clinical situations. Assignments one and two required participants to write a coherent paper on an identified clinical information need and on question formulation. For the first three assignments they had to reflect on how to supervise students in using the EBP steps. Assignments three to five required participants to work through all EBP steps using the EBP tool (Table 2). The EBP tool can be used to document the EBP steps and process: the information need, the clinical question asked, the search strategy and its results, the critical appraisal of the research results found (validity, reliability and applicability), whether and how the research results are integrated or shared in clinical practice, and how the changes made to clinical practice are evaluated. The EBP tool is a learning tool intended to provide health care professionals with practical EBP skills. Registering the EBP steps and processes in one document facilitates the learning process and makes it possible to give and receive feedback. Development of this tool was inspired by working files developed for trainee doctors in Norway [28]. For each assignment, participants received supervision via phone and/or email, in addition to guidance from a librarian when necessary. All five assignments had to be completed before the final exam. The exam took the form of an individual oral presentation, where participants focused on how to apply the EBP steps to a real patient situation and how to supervise students in the EBP process (Week 27) (Table 1). The exam was assessed as pass/fail. The objectives of the program (Table 1) were related to the EBP steps and processes described previously [29].
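
As a rough illustration of the kind of record the EBP tool captures (the field names below are ours, not labels taken from the tool itself), one documented EBP process could be represented as a simple data structure:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class EBPToolEntry:
    """One documented EBP process; field names are illustrative, not the tool's own labels."""
    information_need: str        # the clinical information need that triggered the process
    clinical_question: str       # the focused clinical question asked
    search_strategy: str         # databases, keywords and limits used
    search_result: str           # what the search returned
    critical_appraisal: str      # validity, reliability and applicability of the evidence found
    integration: str             # if and how the evidence was integrated or shared in clinical practice
    evaluation: str              # how the resulting change in practice was evaluated
    supervision_notes: List[str] = field(default_factory=list)  # reflections on supervising students in the EBP steps
```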

Table 1. Training program in evidence-based practice (EBP): workshops, assignments and objectives (6 ECTS-creditsa).

The overall idea behind the intervention was that learning of EBP should be integrated into clinical workplaces, an idea in line with experiential learning theorists, in particular socio-cultural theorists such as Vygotsky, Lave and Wenger [30]. The intervention drew on several aspects of these theories: learning was viewed as situated and triggered by authentic practice-based experiences, and interaction was considered fundamental to learning [7,30,31].

The workshop delivery was inspired by the Critical Appraisal Skills Programme (CASP), in which workshops are problem-based, small-group oriented, enjoyable and grounded within the clinical decision-making process, with a focus on interactive teaching in a safe environment and the use of high-quality, user-friendly materials [32–34].

The program was delivered by a project group of five physiotherapists (including NRO, HL and BF) holding both academic and clinical positions, with a range of expertise in EBP, physiotherapy, higher education and research.

Data collection

Outcome measures.

We chose to focus on outcome measures defined by Shaneyfelt et al. [35] as the EBP domains for evaluating EBP in education, including knowledge, skills, attitudes and behaviour. Knowledge is defined as knowledge about EBP (e.g. the EBP steps), whereas skills refers to applying knowledge about EBP by performing the EBP steps to solve clinical scenarios, for example written patient cases (e.g. formulating a clinical question or finding the best evidence). Attitude is defined as attitudes towards EBP (e.g. beliefs about the value of EBP). EBP behaviour is defined as actual performance of EBP in practice, such as enacting the EBP steps in practice, performing evidence-based maneuvers in actual practice (e.g. following guideline prescriptions) or affecting patient outcomes. In this study, we focused on EBP behaviour as self-reported enactment of the EBP steps in practice.

A wide range of instruments for evaluating these four EBP domains was identified in a systematic review by Shaneyfelt et al. [35]. However, no valid instruments were identified for evaluating EBP in educational interventions among rehabilitation professionals. We were able to identify only three previously validated instruments for assessing the impact of the educational program in this study: 1) the Adapted Fresno test (AFT) [36,37], 2) the EBP Belief Scale [38], and 3) the EBP Implementation Scale [38].

Permission to translate and use these three different instruments was obtained from the respective authors [36,38]. The AFT, the EBP Belief Scale and the EBP Implementation Scale were translated to Norwegian using a forward and backward translation procedure as described by the World Health Organization [39]. In addition, we collected demographic data such as age and gender, and background information about type and size of position, postgraduate education and years of experience.

The Adapted Fresno test (AFT).

The AFT is a seven-item test developed for rehabilitation professionals, educators and researchers to measure change in EBP skills and knowledge (EBP competence) following training in EBP [36]. The AFT measures EBP knowledge about information sources, the hierarchy of evidence, the study design that best answers questions about effectiveness, keywords and limits to use when searching, and methodological biases in study designs. The AFT measures EBP skills related to the ability to write a focused clinical question, to reflect upon advantages and disadvantages of information sources, to describe an effective and efficient search strategy, and to interpret and critically appraise a published paper. The AFT is built around different clinical scenarios relevant to rehabilitation professionals. There are three versions of the AFT that include identical items but different sets of clinical scenarios, to help minimize practice effects when the AFT is used for pre-, post- and follow-up testing [36]. The total score ranges from 0 to 156, and the test takes 20 minutes to complete and 20 minutes to score using a scoring matrix developed by McCluskey and Bishop [36]. The AFT has been reported to have acceptable psychometric properties, with excellent inter-rater reliability for the AFT total score (ICC > 0.9) and acceptable internal consistency (Cronbach’s alpha 0.74) [36]. Improvements of 10% (15.6 points) in the mean total score post-workshop and 15% (23.4 points) at follow-up, compared to baseline, are considered educationally important changes [37]. The AFT is most useful for evaluating change in novice learners [36].
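
As a small worked example of the thresholds above (the helper below is ours, not part of the AFT), an observed change in mean total score can be checked against the educationally important cut-offs of 10% and 15% of the 156-point maximum:

```python
AFT_MAX = 156  # maximum total score on the Adapted Fresno Test

def educationally_important(mean_change: float, timepoint: str) -> bool:
    """True if a change in mean AFT total score reaches the threshold described
    by McCluskey and Lovarini [37] for the given timepoint."""
    thresholds = {
        "post": 0.10 * AFT_MAX,       # 15.6 points post-workshop
        "follow_up": 0.15 * AFT_MAX,  # 23.4 points at follow-up
    }
    return mean_change >= thresholds[timepoint]

print(educationally_important(37, "follow_up"))  # True: a 37-point change exceeds 23.4
```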

The EBP beliefs scale.

The EBP Beliefs scale was “…designed to measure clinicians’ beliefs about the value of EBP and their beliefs/confidence in implementing it in practice” ([38], p. 209), and Melnyk et al. found that the scale was sensitive to a wide range of attitudes ([38], p. 214). Melnyk et al. ([38], p. 210) defined EBP beliefs as “…endorsement of the premise that EBP improves clinical outcomes and confidence in one’s EBP knowledge/skills”. The test contains 16 statements addressing EBP beliefs on a continuum from 1 (strongly disagree) to 5 (strongly agree) (5-point Likert scale). Negatively phrased items (items 11 and 13) are reversed before summing responses to the 16 items, giving a total score that ranges between 16 and 80 [38]. Testing of the EBP Beliefs scale has shown that it has excellent internal consistency (Cronbach’s alpha 0.9) and measures a unidimensional construct [38].

The EBP implementation scale.

The EBP Implementation Scale is designed to measure clinicians’ implementation of essential components and steps of EBP [38]. Melnyk et al. ([38], p. 210) define EBP implementation as “engaging in relevant behaviours, including: (1) seeks and appraises scientific evidence, (2) shares evidence or data with colleagues or patients, (3) collects and evaluates outcome data, and (4) uses evidence to change practice.” The test contains 18 statements for which participants are asked to indicate how often in the past eight weeks they performed the item, on a 5-point frequency scale from 0 (“0 times”) to 4 (“>8 times”). Scoring the 18-item test involves summing the responses, and the total score ranges from 0 to 72. Testing of the EBP Implementation scale has shown that it has excellent internal consistency (Cronbach’s alpha 0.96) and measures a unidimensional construct [38].
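
For illustration, scoring of the two scales as described above could be implemented as follows (a minimal sketch assuming responses are stored as plain lists in item order; function and variable names are ours):

```python
def score_ebp_beliefs(responses):
    """16 items scored 1-5; items 11 and 13 (1-based) are negatively phrased and reverse-scored."""
    assert len(responses) == 16
    reversed_items = {11, 13}
    total = sum(6 - r if i in reversed_items else r
                for i, r in enumerate(responses, start=1))
    return total  # possible range: 16-80

def score_ebp_implementation(responses):
    """18 items scored 0-4 (frequency of each behaviour over the past eight weeks)."""
    assert len(responses) == 18
    return sum(responses)  # possible range: 0-72
```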

Data collection procedure.

All participants were asked to individually complete the paper and pencil versions of the EBP Beliefs scale, the EBP Implementation Scale and the AFT at three different measurement times: at pre-intervention in September 2008, at post-intervention in May 2009 and at follow-up in November 2009. In addition, all participants were asked to fill out demographic sheets at pre-intervention (September 2008). Questionnaires and demographic sheets from the control group were collected via mail. Participants in the intervention group were gathered at a meeting room at the hospital (HUS) where they filled out the questionnaires.

The AFT tests were scored over a two-week period. A training session was conducted using examples of scored and unscored copies, a procedure similar to that described by McCluskey and Bishop [36]. All the AFTs (n = 76) were scored independently by two raters with a good understanding of EBP (the first author, and a nurse (PhD) experienced in teaching EBP). Both raters were blinded to the status of the AFTs, that is, whether the tests were from the control or the intervention group. To measure agreement between raters we calculated intraclass correlation coefficients (ICC) type (2,1) (two-way random effects model) [40]. Agreement between raters was very good for all the total AFT scores (Version 1: ICC 0.89, 95% CI 0.74–0.95; Version 2: ICC 0.95, 95% CI 0.88–0.98; Version 3: ICC 0.97, 95% CI 0.92–0.99). For the analysis we used AFT consensus scores; disagreements between raters were resolved by question-by-question discussion.
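
The ICC computation itself was done in statistical software; purely as an illustration of the Shrout and Fleiss [40] type (2,1) coefficient referred to above, it could be computed from a tests-by-raters matrix of total AFT scores like this (a sketch assuming a complete two-way layout with no missing cells; names are ours):

```python
import numpy as np

def icc_2_1(scores: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater (Shrout & Fleiss [40]).
    `scores` is an (n tests x k raters) matrix with no missing cells."""
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)   # mean per test (target)
    col_means = scores.mean(axis=0)   # mean per rater

    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_error = ((scores - grand) ** 2).sum() - ss_rows - ss_cols

    bms = ss_rows / (n - 1)                 # between-targets mean square
    jms = ss_cols / (k - 1)                 # between-raters mean square
    ems = ss_error / ((n - 1) * (k - 1))    # residual mean square

    return (bms - ems) / (bms + (k - 1) * ems + k * (jms - ems) / n)
```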

Data analysis.

IBM SPSS Statistics version 20 (SPSS Inc., Chicago, USA, 2012) was used for the data analyses. Descriptive statistics, including frequencies, percentages, means, ranges and standard deviations (SD), were calculated to describe the characteristics of the participants. To account for the correlated data imposed by the study design, with repeated measures of the outcomes, we used generalized estimating equations (GEE) ([41], p. 62–77) to estimate differences in mean scores. An interaction term between group and measurement time was included in the regression models to investigate time-dependent group differences. In these analyses, an unstructured working correlation structure was applied and standard errors were calculated using robust estimates. Additional analyses were performed to adjust for potential confounding by age, gender, type of position, size of position, type of post-graduate education and years of experience. Estimated differences in outcomes were reported as mean differences (MD) with 95% confidence intervals (95% CI). P-values less than 0.05 were considered statistically significant for all analyses.
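
The analyses were run in SPSS; as a hedged sketch of an equivalent specification in another environment, the model described above (group-by-time interaction, unstructured working correlation, robust standard errors) might look roughly like this in Python/statsmodels, assuming a long-format data set with one row per participant per measurement time and columns `id`, `group`, `time` (coded 0, 1, 2) and `score` (none of these names or the file come from the study):

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical long-format file: one row per participant per measurement time
df = pd.read_csv("ebp_scores_long.csv")

model = smf.gee(
    "score ~ C(group) * C(time)",             # interaction term captures time-dependent group differences
    groups="id",                              # repeated measures clustered within participant
    time="time",                              # measurement occasion: 0 = pre, 1 = post, 2 = follow-up
    data=df,
    cov_struct=sm.cov_struct.Unstructured(),  # unstructured working correlation
    family=sm.families.Gaussian(),            # linear (identity link) model for the scale scores
)
result = model.fit()                          # robust ("sandwich") standard errors by default
print(result.summary())
```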

Internal consistency for all outcome measures was calculated using Cronbach’s α. Alpha values of ≥ 0.70 were regarded as satisfactory for comparing groups [42].
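
For reference, Cronbach's α as used here is computed from the item-level responses; a minimal sketch (with `items` as an n-participants by k-items array; names are ours):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for one instrument; one row per participant, one column per item."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # sample variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
```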


Results

From a total of 37 eligible CIs, 29 (78.4%) chose to participate in this study (Fig 1). Fourteen CIs were allocated to the intervention group and 15 to the control group. In total, 13 from the intervention group and nine from the control group contributed information at all measurement times.

Table 3 provides an overview of the participants’ characteristics. The intervention group and the control group were similar at baseline with respect to all participant characteristics. The majority were female (n = 26) and the mean age was 39.7 years (range 26–61, SD 9.9). Mean years of experience was 12.9 years (range 2–32, SD 8.6). More than half of the participants had some kind of postgraduate education. Only three of the participants held a leadership position, and nine of the participants were specialists. Only one participant was in a part-time position, and the rest of the participants held 80–100% positions.

Internal consistency

Internal consistency measured by Cronbach’s alpha at pre-intervention was satisfactory for all instruments, including the EBP Beliefs scale (0.85), the EBP Implementation scale (0.85) and the AFT (0.93).

Changes in scores related to Adapted Fresno Test, EBP Beliefs and Implementation scale

The GEE regression analyses showed statistically significant differences in favor of the intervention for all three outcome measures at post-intervention. At follow-up, the group difference was statistically significant for two of the outcome measures: the AFT (mean difference = 37, 95% CI (15.9–58.1), P <0.001) and the EBP Beliefs scale (mean difference = 8.1, 95% CI (3.1–13.2), P = 0.002) (Table 4).

Table 4. Adapted Fresno Test, EBP Beliefs Scale and EBP Implementation scale mean scores in the intervention and the control group.

Comparing measurements over time within groups, we found a statistically significant increase in mean scores on all outcome measures for the intervention group only. The GEE analysis was adjusted for gender and years of experience. Further adjustment for age, type of position, size of position and type of post-graduate education gave only marginal differences. The total number of observations included in the GEE analysis was 40 for the intervention group and 37 for the control group (Fig 1).

Discussion

This study has generated unique findings on the impact of a multifaceted and clinically integrated training program in EBP among CIs at hospitals in Norway. At follow-up we found statistically significant between-group differences in favor of the intervention group with regard to EBP knowledge, skills and beliefs, but not for behaviour. When comparing measurements over time, a statistically significant increase in mean scores with regard to knowledge, skills, beliefs and behaviour was found for the intervention group only.

Our findings concur with results from previous systematic reviews on the effect of teaching EBP, which conclude that interactive, multifaceted and clinically integrated interventions that involve assessment lead to improvements in EBP knowledge, skills, attitudes and behaviour among health professionals [21]. This is in line with the socio-cultural perspective on learning [7,8], which holds that learning takes place in authentic settings (clinically integrated) and through social interaction (interactive). Previous reviews on the impact of teaching EBP did not identify or include studies specifically focusing on CIs [21]. However, the impact of EBP training on CIs has been evaluated positively in three uncontrolled before-and-after studies within nursing and medicine [22–24]. Findings from these studies have to be interpreted with caution due to methodological limitations (e.g. the lack of a control group). Furthermore, these studies assessed knowledge and attitudes, not behaviour [22–24]. This is not unusual, as many studies on the impact of teaching EBP focus only on outcomes such as attitudes (beliefs about the value of EBP), beliefs/confidence in the ability to conduct EBP (self-efficacy), knowledge or skills [2]. However, when investigating the impact of teaching EBP, it is also essential to assess EBP behaviour, as we need to know whether learners apply their skills in actual practice [2,35,43]. Assessing only EBP attitudes or beliefs is not sufficient, as EBP attitudes do not necessarily lead to EBP behaviour [44]. Authors of a recent systematic review question the link between attitudes and behaviour, as they identified studies reporting that physiotherapists with positive attitudes failed to implement EBP [44]. In our study, we assessed all the typical outcome measures, including EBP behaviour, and found statistically significant between-group differences at post-intervention in favor of the intervention group for all outcome measures, but the changes related to behaviour were not sustained at six-month follow-up. After the intervention phase, these CIs may have felt left on their own, without the requirement to apply EBP and without the support they had received during the intervention phase. Possibly, they were also faced with barriers that hindered them from applying EBP. In this study, we did not aim to identify potential EBP barriers at the individual or organisational level during or after the intervention. However, lack of time, inability to understand statistics, lack of support from the employer, lack of resources, lack of interest, and lack of generalisability of results have been identified as frequently reported barriers among physiotherapists [45]. These barriers could explain why the differences in EBP behaviour were not sustained in our study.

It is essential to find ways of ensuring long-term change in EBP behaviour among CIs, particularly when considering the potential impact that CIs’ behaviour can have on students’ professional development. Cole and Wessel [46] found that CIs demonstrating professional behaviour related to EBP enhanced students’ learning experience in clinical placements. A non-evidence-based culture and CIs not practicing in an evidence-based manner at the placement site have been reported as barriers to the implementation of EBP among undergraduate health care students [14,15,17,29]. In contrast, CIs applying the principles of EBP encourage students to practice in an evidence-based manner [13,14]. No doubt, CIs are important role models for students and influence students’ visions of how to practice physiotherapy in the future ([4], p. 131). Also, when both students and CIs have EBP competence, there is room for dialogue, for example about real patient situations in which students and CIs can discuss the applicability of research evidence. Dialogue is essential for shared meaning to develop [6], and through this type of dialogue CIs and students could develop a shared meaning with regard to how to practice in an evidence-based manner. Developing such shared meaning could enable students to make decisions about which behaviours to imitate and to develop visions about their own professional identity (i.e. how they want to practice as future professionals) [6].

Few studies have, however, explored if and how EBP is taught or emphasised by clinicians during clinical education [19,20,47]. Kljakovic et al. [20] report that only a small percentage of clinicians had training in evidence-based medicine (EBM), and that only a small percentage taught it. Not surprisingly, clinicians with training were also more likely to teach EBM [20]. Meyer and Willett [47] explored which learning activities physical therapy students reported that their CIs used to promote the acquisition of different core competencies, including employing evidence-based practice. Students reported that promoting the EBP competencies, through skills such as applying quality improvement and utilizing informatics, was one of the learning activities that their CIs did not emphasise to a great extent [47]. The scarcity of examples of CIs teaching EBP could be explained by barriers that CIs experience in implementing EBP among students. Hankemeier and Van Lunen [48] report barriers among CIs within athletic training, such as time, equipment, access to literature, knowledge, negative attitudes among colleagues, poor integration between the clinical setting and the classroom, and EBP not being integrated across the athletic training program. Future studies on the impact of teaching EBP among CIs need to take these barriers into account.

Our EBP program also focused on how to supervise students in EBP, and CIs participating in the program had to reflect on how they applied EBP steps and principles when supervising students. However, we did not specifically assess if and how the CIs transferred their EBP knowledge and skills to supervisory situations with students. Nor has this been assessed in previous studies on the impact of teaching EBP to CIs. Future studies on the impact of teaching EBP to CIs need to consider whether CIs are able to transfer their EBP knowledge and skills to supervisory situations with their students. However, according to Walczak et al. [49] there is currently a lack of assessment tools for this type of outcome. There is therefore a need to develop assessment tools for measuring CIs’ teaching skills and their skills in supervising students in EBP.

Strengths and limitations

This study is the first to assess an interactive and clinically integrated training program in EBP among CIs in physiotherapy education. The controlled design allowed us to efficiently pilot the EBP program and the educational material, including the EBP tool. This study does, however, have several limitations. The feasibility of implementing this training program in EBP at other hospitals in Norway, or at hospitals outside Norway, remains uncertain. We conducted a non-randomized controlled study; true randomization was not performed, mainly because of geographical distances. Blinding of researchers and participants was not possible, as the tutors involved in the intervention and the CIs were aware of the allocation. Communication between CIs in the control group and the intervention group may have occurred, although allocation to different geographical areas (hospitals situated in Bergen and outside Bergen) most likely protected against contamination. There may have been a selection bias: CIs who volunteered to participate in the study could have been more motivated than those who did not volunteer. Our findings could also be biased by the fact that most of the participants in the intervention group worked at a university hospital, as it has been documented that physiotherapists in such settings feel part of a research-oriented culture [50,51]. Still, there were no statistically significant differences in scores between the groups at pre-intervention. Participants in both the control and the intervention group can be classified as novice learners, reflected by pre-intervention scores below 50% (78/156) for both groups [36].

The limited number of available participants led to low statistical power, increasing the risk of a type II error. Although the sample size limits the statistical power of the study and probably causes low precision, our results showed statistically significant changes at the 5% level for all pre- and post-intervention comparisons, indicating that the impact of training is reasonably large. With regard to the AFT, the reported mean change between groups was 37 points at follow-up when compared to baseline, which is considered an educationally important change [37]. The impact of the intervention was assessed using outcome measures that had been previously validated and psychometrically tested for this purpose. The fact that change was detected in this study indicates that each of these instruments is responsive to change, also among CIs in Norway. Still, these findings must be interpreted with caution; we need to question whether the change detected in EBP behaviour was reflected in real practice, as we measured self-reported EBP behaviour. Eccles et al. [52] argue that intention (self-reported measures) appears to be a valid proxy measure for behaviour for use in the development of implementation interventions, as measuring actual behaviour is challenging. Still, measuring actual behaviour might be worth the effort. As pointed out by Shaneyfelt et al. [35], EBP behaviour documented through retrospective self-report may be biased, as respondents tend to overestimate their actions. Ideally, behaviour should be measured using some form of activity monitoring [2], a consideration for future research. The translation of EBP knowledge, skills and attitudes or beliefs into real-time EBP behaviour could be explored, for example by observing and audiotaping practitioners and their interactions with students [44,53,54].

Scoring of the AFT was reliable, as indicated by excellent ICC values. The outcome assessors were blinded to the status of the AFTs, that is, whether the tests were from the control or the intervention group. The assessors were, however, not blinded when analyzing data related to the EBP Beliefs Scale and the EBP Implementation Scale.

Conclusion

In this study, we have demonstrated the success of an interactive, multifaceted and clinically integrated training program in EBP among CIs at hospitals in Norway. The intervention resulted in a statistically significant increase in mean scores with regard to knowledge, skills, beliefs and behaviour for the intervention group when comparing measurements over time. At follow-up, we found statistically significant between-group differences with regard to EBP knowledge, skills and beliefs in favor of the intervention group. Future research needs to investigate how to ensure long-term EBP behaviour change. Further investigations are also needed to establish whether similar findings can be achieved among other health professionals, and to assess the feasibility of implementing a similar program in a larger group using a stronger study design. Future studies also need to consider CIs’ abilities to transfer their EBP knowledge and skills to supervisory situations with their students.

Acknowledgments

The authors wish to thank all students, CIs and teachers who kindly participated in this study. We are also very grateful to a group of researchers and colleagues who took part in the process of planning and conducting this study: Mildrid Haugland (Bergen University College), Torunn Urnes Meyer (Haukeland University Hospital) and Bård Bogen (Haraldsplass Deaconess Hospital).

Author Contributions

Conceived and designed the experiments: NRO PB BE MWN HL BF JMB. Performed the experiments: NRO HL BF. Analyzed the data: NRO BE. Contributed reagents/materials/analysis tools: NRO PB BE MWN HL BF JMB. Wrote the paper: NRO PB BE MWN HL BF JMB.

References

  1. Dawes M, Summerskill W, Glasziou P, Cartabellotta A, Martin J, Hopayian K, et al. Sicily statement on evidence-based practice. BMC Med Educ. 2005;5: 1. pmid:15634359
  2. Tilson JK, Kaplan SL, Harris JL, Hutchinson A, Ilic D, Niederman R, et al. Sicily statement on classification and development of evidence-based practice learning assessment tools. BMC Med Educ. 2011;11: 78. pmid:21970731
  3. Glasziou P, Burls A, Gilbert R. Evidence based medicine and the medical curriculum. BMJ. 2008;337: a1253. pmid:18815165
  4. Frost JS. Preparation for teaching in clinical settings. In: Jensen GM, Mostrom E, editors. Handbook of teaching and learning for physical therapists. 3rd ed. St. Louis, Mo.: Elsevier Butterworth Heinemann; 2013. pp. 124–144.
  5. World Confederation of Physical Therapy. WCPT guideline for the clinical education component of physical therapist professional entry level education. Available: Accessed 20 March 2015.
  6. Plack M. The Development of Communication Skills, Interpersonal Skills, and a Professional Identity Within a Community of Practice. J Phys Ther Educ. 2006;20: 37–46.
  7. Lave J, Wenger E. Situated learning: legitimate peripheral participation. Cambridge: Cambridge University Press; 1991.
  8. Wenger E. Communities of practice: learning, meaning, and identity. Cambridge: Cambridge University Press; 1998.
  9. Skoien AK, Vagstol U, Raaheim A. Learning physiotherapy in clinical practice: student interaction in a professional context. Physiother Theory Pract. 2009;25: 268–278. pmid:19418364
  10. Plack M. The learning triad potential barriers and supports to learning in the physical therapy clinical environment. J Phys Ther Educ. 2008;22: 7–18.
  11. Spouse F. The effective mentor: a model for student-centred learning. Nurs Times. 1996;92: 32–35.
  12. Spouse J. Bridging theory and practice in the supervisory relationship: a sociocultural perspective. J Adv Nurs. 2001;33: 512–522. pmid:11251739
  13. Bradley P, Oterholt C, Nordheim L, Bjorndal A. Medical students' and tutors' experiences of directed and self-directed learning programs in evidence-based medicine: a qualitative evaluation accompanying a randomized controlled trial. Eval Rev. 2005;29: 149–177. pmid:15731510
  14. Ilic D, Forbes K. Undergraduate medical student perceptions and use of Evidence Based Medicine: a qualitative study. BMC Med Educ. 2010;10: 58. pmid:20718992
  15. Olsen NR, Bradley P, Lomborg K, Nortvedt MW. Evidence based practice in clinical physiotherapy education: a qualitative interpretive description. BMC Med Educ. 2013;13: 52. pmid:23578211
  16. Jonsen E, Melender HL, Hilli Y. Finnish and Swedish nursing students' experiences of their first clinical practice placement—A qualitative study. Nurse Educ Today. 2013;33: 297–302. pmid:22795745
  17. Stronge M, Cahill M. Self-reported knowledge, attitudes and behaviour towards evidence-based practice of occupational therapy students in Ireland. Occup Ther Int. 2012;19: 7–16. pmid:22183972
  18. Maben J, Latter S, Clark JM. The theory-practice gap: impact of professional-bureaucratic work conflict on newly-qualified nurses. J Adv Nurs. 2006;55: 465–477. pmid:16866842
  19. Kljakovic M. Practising GPs teaching medical students evidence based medicine-a questionnaire survey. Aust Fam Physician. 2006;35: 999–1002. pmid:17149476
  20. Kljakovic M, Love T, Gilbert A. Attitudes of teachers to evidence based medicine. Aust Fam Physician. 2004;33: 376–378. pmid:15227872
  21. Young T, Rohwer A, Volmink J, Clarke M. What are the effects of teaching evidence-based health care (EBHC)? Overview of systematic reviews. PLoS One. 2014;9: e86706. pmid:24489771
  22. Hagler D, Mays MZ, Stillwell SB, Kastenbaum B, Brooks R, Fineout-Overholt E, et al. Preparing clinical preceptors to support nursing students in evidence-based practice. J Contin Educ Nurs. 2012;43: 502–508. pmid:22908881
  23. Kouhpayehzadeh J, Baradaran H, Arabshahi KS, Knill-Jones R. Clinical teachers' attitudes toward the efficacy of evidence-based medicine workshop and self-reported ability in evidence-based practice in Iran. J Contin Educ Health Prof. 2006;26: 210–214. pmid:16986146
  24. Weberschock T, Sorinola O, Thangaratinam S, Oude Rengerink K, Arvanitis TN, Khan KS, et al. How to confidently teach EBM on foot: development and evaluation of a web-based e-learning course. Evid Based Med. 2013;18: 170–172. pmid:22864372
  25. Mohide EA, Matthew-Maich N. Engaging nursing preceptor-student dyads in an evidence-based approach to professional practice. Evid Based Nurs. 2007;10: 36–40. pmid:17431963
  26. Utdannings- og forskningsdepartementet [The Ministry of Education and Research]. Rammeplan for fysioterapiutdanningen [National Physiotherapy Curriculum]. Oslo, Norway: Utdannings- og forskningsdepartementet; 2005.
  27. Skoien AK, Vagstol U, Raaheim A. Learning physiotherapy in clinical practice: student interaction in a professional context. Physiother Theory Pract. 2009;25: 268–278. pmid:19418364
  28. Kongerud IC, Vandvik PO. Work files as learning tools in knowledge management. Tidsskr Nor Laegeforen. 2013;133: 1587–1590. pmid:23970272
  29. Olsen NR, Lygren H, Espehaug B, Nortvedt MW, Bradley P, Bjordal JM. Evidence-based Practice Exposure and Physiotherapy Students' Behaviour during Clinical Placements: A Survey. Physiother Res Int. 2014;19: 238–247. pmid:24664886
  30. Yardley S, Teunissen PW, Dornan T. Experiential learning: AMEE Guide No. 63. Med Teach. 2012;34: e102–115. pmid:22289008
  31. Bleakley A, Bligh J, Browne J. Socio-Cultural Learning Theories. In: Bleakley A, Bligh J, Browne J, editors. Medical Education for the Future: Identity, Power and Location. Dordrecht, Netherlands: Springer; 2011. pp. 43–60.
  32. Bradley P, Hill A. Critical Appraisal Skills Programme international network: making sense of the evidence. Eur J Public Health. 2001;2: 238.
  33. Bradley P. How to teach evidence-based practice? Two theory-based interventions and a systematic review. Doctoral Thesis, Faculty of Medicine, The University of Oslo, Norway. 2005.
  34. Taylor RS, Reeves BC, Ewings PE, Taylor RJ. Critical appraisal skills training for health care professionals: a randomized controlled trial [ISRCTN46272378]. BMC Med Educ. 2004;4: 30. pmid:15585061
  35. Shaneyfelt T, Baum KD, Bell D, Feldstein D, Houston TK, Kaatz S, et al. Instruments for evaluating education in evidence-based practice: a systematic review. JAMA. 2006;296: 1116–1127. pmid:16954491
  36. McCluskey A, Bishop B. The Adapted Fresno Test of competence in evidence-based practice. J Contin Educ Health Prof. 2009;29: 119–126. pmid:19530195
  37. McCluskey A, Lovarini M. Providing education on evidence-based practice improved knowledge but did not change behaviour: a before and after study. BMC Med Educ. 2005;5: 40. pmid:16364181
  38. Melnyk BM, Fineout-Overholt E, Mays MZ. The evidence-based practice beliefs and implementation scales: psychometric properties of two new instruments. Worldviews Evid Based Nurs. 2008;5: 208–216. pmid:19076922
  39. World Health Organization. Process of translation and adaptation of instruments. Available: Accessed 20 March 2015.
  40. Shrout PE, Fleiss JL. Intraclass correlations: uses in assessing rater reliability. Psychol Bull. 1979;86: 420–428. pmid:18839484
  41. Twisk JWR. Applied longitudinal data analysis for epidemiology: a practical guide. 2nd ed. Cambridge: Cambridge University Press; 2013.
  42. Bland JM, Altman DG. Cronbach's alpha. BMJ. 1997;314: 572. pmid:9055718
  43. Oude Rengerink K, Zwolsman SE, Ubbink DT, Mol BW, van Dijk N, Vermeulen H. Tools to assess evidence-based practice behaviour among healthcare professionals. Evid Based Med. 2013;18: 129–138. pmid:23349216
  44. Scurlock-Evans L, Upton P, Upton D. Evidence-Based Practice in physiotherapy: a systematic review of barriers, enablers and interventions. Physiotherapy. 2014;100: 208–219. pmid:24780633
  45. Mota da Silva T, da Cunha Menezes Costa L, Garcia AN, Costa LO. What do physical therapists think about evidence-based practice? A systematic review. Man Ther. 2014. Available:
  46. Cole B, Wessel J. How clinical instructors can enhance the learning experience of physical therapy students in an introductory clinical placement. Adv Health Sci Educ Theory Pract. 2008;13: 163–179. pmid:17120080
  47. Meyer KP, Willett G. Are physical therapy clinical instructors teaching the Institute of Medicine core competencies? An exploratory investigation using student perceptions. J Allied Health. 2007;36: e293–312. pmid:19759998
  48. Hankemeier DA, Van Lunen BL. Perceptions of approved clinical instructors: barriers in the implementation of evidence-based practice. J Athl Train. 2013;48: 382–393. pmid:23675798
  49. Walczak J, Kaleta A, Gabrys E, Kloc K, Thangaratinam S, Barnfield G, et al. How are "teaching the teachers" courses in evidence based medicine evaluated? A systematic review. BMC Med Educ. 2010;10: 64. pmid:20920240
  50. Barnard S, Wiles R. Evidence-based Physiotherapy: Physiotherapists' attitudes and experiences in the Wessex area. Physiotherapy. 2001;87: 115–124.
  51. Dannapfel P, Peolsson A, Nilsen P. What supports physiotherapists' use of research in clinical practice? A qualitative study in Sweden. Implement Sci. 2013;8: 31. pmid:23497502
  52. Eccles MP, Hrisos S, Francis J, Kaner EF, Dickinson HO, Beyer F, et al. Do self-reported intentions predict clinicians' behaviour: a systematic review. Implement Sci. 2006;1: 28. pmid:17118180
  53. Tilburt JC, Goold SD, Siddiqui N, Mangrulkar RS. How do doctors use information in real-time? A qualitative study of internal medicine resident precepting. J Eval Clin Pract. 2007;13: 772–780. pmid:17824871
  54. Tilburt JC, Mangrulkar RS, Goold SD, Siddiqui NY, Carrese JA. Do we practice what we preach? A qualitative assessment of resident-preceptor interactions for adherence to evidence-based practice. J Eval Clin Pract. 2008;14: 780–784. pmid:19018911