
Focus on Quality: Investigating Residents’ Learning Climate Perceptions

  • Milou E. W. M. Silkens ,

    Affiliation Professional Performance Research group, Center for Evidence-Based Education, Academic Medical Center/University of Amsterdam, Amsterdam, The Netherlands

  • Onyebuchi A. Arah,

    Affiliations Professional Performance Research group, Center for Evidence-Based Education, Academic Medical Center/University of Amsterdam, Amsterdam, The Netherlands, Department of Epidemiology, Fielding School of Public Health, University of California Los Angeles (UCLA), Los Angeles, California, United States of America, UCLA Center for Health Policy Research, Los Angeles, California, United States of America

  • Albert J. J. A. Scherpbier,

    Affiliation Department of Educational Development and Research, Maastricht University, Maastricht, The Netherlands

  • Maas Jan Heineman,

    Affiliation Professional Performance Research group, Center for Evidence-Based Education, Academic Medical Center/University of Amsterdam, Amsterdam, The Netherlands

  • Kiki M. J. M. H. Lombarts

    Affiliation Professional Performance Research group, Center for Evidence-Based Education, Academic Medical Center/University of Amsterdam, Amsterdam, The Netherlands

Abstract

Background
A department’s learning climate is known to contribute to the quality of postgraduate medical education and, as such, to the quality of patient care provided by residents. However, it is unclear how the learning climate is perceived over time.

Objective
This study investigated whether the learning climate perceptions of residents changed over time.

Methods
The context for this study was residency training in the Netherlands. Between January 2012 and December 2014, residents from 223 training programs in 39 hospitals completed the web-based Dutch Residency Educational Climate Test (D-RECT) to evaluate their clinical department’s learning climate. The D-RECT comprises 35 validated items answered on a five-point Likert scale. We analyzed the data using generalized linear mixed (growth) models.

Results
Overall, 3982 D-RECT evaluations were available for analysis. The overall mean D-RECT score was 3.9 (SD = 0.3). The growth model showed an increase in D-RECT scores over time (b = 0.03; 95% CI: 0.01–0.06; p < 0.05).

Conclusions
The observed increase in D-RECT scores implied that residents perceived an improvement in the learning climate over time. Future research could focus on factors that facilitate or hinder learning climate improvement, and investigate the roles that hospital governing committees play in safeguarding and improving the learning climate.

Introduction
Throughout the modernization of postgraduate medical education (PGME), quality assurance (QA) and continuous quality improvement (QI) of residency training have received considerable attention worldwide [1–3]. The department’s learning climate is considered an important indicator of PGME quality [2, 3] and is therefore often monitored as well as targeted during QI activities. However, whether these activities actually impact the learning climate is not clear.

The department’s learning climate (also known as the learning environment [4]) includes the formal and informal context in which learning takes place [5] and incorporates the perceived atmosphere of a department [6] as well as common perceptions of policies, practices and procedures [7]. It is acknowledged that a healthy learning climate contributes to the use of effective learning approaches [8], resident wellbeing [9–11] and training satisfaction [12]. Furthermore, the learning climate [10] is thought to influence residents’ perceptions of their own competencies [13], professional development [14] and resulting professional behavior. Hence, a healthy learning climate may benefit the development of the resident as a professional as well as the quality of care provided by the resident.

Recognition of the relevance of the learning climate has brought about an increase in QA/QI activities aimed at maintaining and improving learning climates in residency. As part of these activities, training programs often evaluate the learning climate by the administration of annual trainee surveys [1]. A widely used and well-researched questionnaire is the Dutch Residency Educational Climate Test (D-RECT) [15]. The D-RECT is a 35-item questionnaire evaluating nine domains of a department’s learning climate [16]. Repeated use of the D-RECT may provide insight into a department’s educational performance [17]. Ultimately, the results of a D-RECT evaluation might trigger QI initiatives aimed at enhancing the quality of the learning climate [15].

Given the importance of the learning climate for PGME quality and patient care, and the increase in QA/QI activities aiming to impact the learning climate, the aim of our study was to investigate whether the D-RECT scores of training departments change over time.

Methods

Setting
In the Netherlands PGME is regulated by the Royal Dutch Medical Association. One of the core aspects of the regulations is the responsibility of clinical departments offering residency training to guarantee trainees a supportive learning climate [18]. The D-RECT is widely used throughout the Netherlands to continuously monitor the learning climate as perceived by residents. The aim of the D-RECT is to identify areas for improvement and, as such, serve as a foundation for QI initiatives.

Study design
For this study we used D-RECT learning climate data that were collected by teaching departments to evaluate the learning climate and to fuel QA/QI initiatives. Departments that wished to evaluate the learning climate could request and perform a D-RECT evaluation via a web-based system. To investigate change in D-RECT scores over time, we assessed the longitudinal development of these scores.

Participants and data collection

All residents in clinical teaching departments that requested a D-RECT evaluation were invited to complete the questionnaire during a pre-determined period (commonly one month). Residents received a maximum number of three automatically generated reminders by e-mail. The number of residents differed per residency training program. We included departments that used the D-RECT at least once between January 2012 and December 2014. Departments originated from academic hospitals (providing top clinical care, scientific research and PGME as well as coordinating PGME for affiliated hospitals), top clinical teaching hospitals (providing top clinical care, scientific research and PGME) or general teaching hospitals (providing patient care and PGME). Participation in the D-RECT was voluntary and anonymous for all residents.

We obtained informed consent for the use of the D-RECT data. The institutional ethical review board of the Academic Medical Center of the University of Amsterdam provided a waiver stating the Medical Research Involving Human Subjects Act (WMO) did not apply to the current study.

The D-RECT
The D-RECT was originally developed and preliminarily validated by Boor et al. [15]. Recently, the original D-RECT was updated and extensively validated, leading to a 35-item questionnaire covering nine domains: educational atmosphere, teamwork, role of specialty tutor, coaching and assessment, formal education, resident peer collaboration, work is adapted to residents’ competence, accessibility of supervisors, and patient sign-out [16]. The items of the D-RECT are answered on a five-point Likert scale (1 = totally disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = totally agree). Additionally, a “not applicable” option is provided.

Data analysis

Descriptive statistics and frequencies were used to describe the main characteristics of the study population. Departments with fewer than three resident evaluations were removed from the analysis, since previous research showed that at least three evaluations are needed for a reliable mean total D-RECT score [16]. For departments that used the D-RECT more than once a year, only the most recent measurement period was included. Resident evaluations missing more than 17 of the 35 questions (>50%) were excluded from further analysis. For the remaining evaluations, missing values were assumed to be missing at random and were imputed using expectation-maximization (EM). An average composite score representing the overall learning climate was computed over the 35 items of the D-RECT.
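The exclusion and composite-score rules above can be sketched as follows. This is an illustrative outline, not the study's actual code: function and variable names are invented, and simple column-mean imputation stands in for the EM procedure the study used.

```python
import numpy as np

def composite_scores(evaluations, min_evals=3, max_missing=17):
    """Apply the study's inclusion rules to one department's evaluations.

    evaluations: 2-D array-like (residents x items); np.nan marks a
    missing or "not applicable" answer. Returns one composite score per
    retained evaluation, or None if fewer than `min_evals` remain.
    Note: the study imputed with expectation-maximization; column-mean
    imputation is used here only as a stand-in.
    """
    X = np.asarray(evaluations, dtype=float)
    # Drop evaluations missing more than `max_missing` items (>50% of 35).
    X = X[np.isnan(X).sum(axis=1) <= max_missing]
    if X.shape[0] < min_evals:  # reliability threshold from [16]
        return None
    # Impute remaining missing values with the item (column) mean.
    col_means = np.nanmean(X, axis=0)
    X = np.where(np.isnan(X), col_means, X)
    # Composite: mean over all items for each retained evaluation.
    return X.mean(axis=1)
```

With real data the thresholds would stay at the paper's values (3 evaluations, 17 missing items); smaller numbers are convenient for a toy check.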

To investigate whether the D-RECT scores of clinical departments change over time, individual resident evaluations were aggregated at the department level to obtain each department’s annual mean score. Unadjusted and adjusted linear growth models were used to assess the growth of D-RECT scores over time [19]. By using a linear mixed model with random intercepts, the analysis accounted for the hierarchical clustering of repeated scores within departments and of departments within teaching hospitals. The adjusted models included the type of hospital (academic, top clinical teaching, general), the number of resident evaluations aggregated into a department score, gender of respondents and year of training as covariates. Besides residents, we also included D-RECT evaluations provided by doctors not in training and by fellows: since program directors selected the respondents they considered relevant for evaluating the department's learning climate, we followed their selection. To check the robustness of the results, we repeated the analyses with additional summary outcome definitions, namely median composite scores and factor scores [20].
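Stripped of its random intercepts for department and hospital, the fixed-effect part of the growth model reduces to a straight-line fit of annual department means against time. A minimal numpy sketch of that simplification (illustrative names; the study itself fitted the full mixed model in SPSS):

```python
import numpy as np

def annual_trend(dept_year_means):
    """Fit score = intercept + b * t to department-year mean D-RECT scores.

    dept_year_means: iterable of (year, mean_score) pairs, one per
    department-year. Time is coded 0, 1, 2, ... from the first year, so
    b is the annualized change, analogous to the fixed slope of the
    mixed growth model (random intercepts are omitted in this sketch).
    """
    years = np.array([y for y, _ in dept_year_means], dtype=float)
    scores = np.array([s for _, s in dept_year_means], dtype=float)
    t = years - years.min()
    b, intercept = np.polyfit(t, scores, 1)  # least-squares line
    return intercept, b
```

In the full analysis the slope estimate is additionally shrunk and adjusted by the random intercepts and covariates, which is what yields the reported b = 0.03 per year.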

Resulting associations were reported as regression coefficients (b) with 95% confidence intervals (95% CI). All analyses were performed using SPSS Statistics version 20 (IBM Corp.).

Results
Study participants

In total, 4190 residents completed the D-RECT questionnaire between January 2012 and December 2014. After exclusions due to missing values, too few evaluations, or duplicate measurements within one year, a sample of 3982 evaluations remained. The overall response rate was 70%, varying from 24% to 100% between departments. The sample represented 223 training programs in 39 teaching hospitals. The number of training programs that participated per hospital varied between 1 and 27. Of the 3982 evaluations, 1244 (31.2%), 1376 (34.6%) and 1362 (34.2%) reflected the learning climate in 2012, 2013 and 2014 respectively. A detailed description of the study sample is provided in Table 1.

Learning climate change over time

The overall mean D-RECT score in the sample was 3.9 (Table 1). Mean D-RECT scores showed modest increases, from 3.83 in 2012 to 3.86 in 2013 and to 3.91 in 2014 in the overall sample (Table 2). The unadjusted and adjusted growth models indicated this change was statistically significant (b = 0.03; 95% CI = 0.01–0.06; p < 0.05) (Table 3).

Table 3. Unadjusted and adjusted growth models for annualized change in learning climate (D-RECT) scores.

Discussion
Main findings

Our study showed that from 2012 to 2014 the learning climate in residency training, as measured using the D-RECT questionnaire, improved significantly (from 3.83 to 3.91 on a 5-point scale) in an overall sample totaling 223 training programs.

Explaining learning climate change over time

Regarding our aim (to investigate whether the learning climate perceptions of residents change over time), our findings suggest a positive trend in D-RECT scores, implying that residents experienced an improvement in the learning climate. However, the absolute improvement is quite small. Both instrument-based and practice-based arguments can be provided to explain these results. Regarding the instrument, small effects might be caused by properties of the answer scale [21]. The D-RECT uses a 5-point Likert scale, which gives residents a limited range of responses. As a result, respondents are restricted in their ability to indicate improvement and have few options to discriminate between levels of performance [21]. Research into the evaluation of surgeons’ teaching performance with a 5-point Likert scale instrument demonstrated a comparably small but positive effect [22].

From a practice-based perspective, the progress measured may reflect the increased attention paid to the quality of residency training at various levels in the Dutch health system. The introduction of competency-based medical education has elicited numerous changes in PGME, including new regulations that have ratified the importance of healthy learning climates for residents' learning and for patient care. QA/QI efforts undertaken by hospital and departmental leadership over the past few years are likely contributors to the progress evidenced in this study. Moreover, climate is known to be relatively resistant to change. Although the use of the D-RECT in the Netherlands was initiated in 2009, wide-scale spread of the instrument only started in 2012. As demonstrated in previous research, it may take a couple of years after the start of evaluation before a convincing improvement in the learning climate becomes noticeable [23]. Therefore, the small but positive trend towards improvement of the learning climate within the studied time span can be considered encouraging.

A second practice-based explanation for the small change might be that departments are not (yet) acting from an improvement perspective. Ideally, departments will work towards improvement of the learning climate on a continuous basis. Whereas departments with lower scores might feel pressured to use their data to improve the learning climate, clinical departments with high D-RECT scores might have less incentive to initiate QI actions. The learning climate evaluations in our sample were quite high (mean D-RECT scores above 3.83) and suggested that most residents perceived the learning climate as healthy. Faculty and residents might have interpreted these high scores to mean that no further action towards improvement was needed, resulting in limited progress of the learning climate.

Strengths and limitations of the study

We were able to use the widely accepted [24, 25] and well-researched [16] D-RECT to assess the learning climate in PGME. Due to the widespread use of the instrument in the Netherlands, we could draw on a large pool of resident evaluations to assess change over time. Although there is no consistent sampling rule for multilevel models, a general recommendation is to have 20 higher- and 20 lower-level units of analysis [26]. Our sample had over 20 higher-level units, and we additionally used restricted maximum likelihood estimation to reduce the impact of the number of higher-level units [26].

Furthermore, a common statistical phenomenon when analyzing repeated measures is regression to the mean [27], which makes the observation of apparent change in the data more likely. The design of the current study limited our opportunities for analyzing whether regression to the mean is present in the data, so its possible contribution to the observed changes in D-RECT scores cannot be completely dismissed.
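Regression to the mean is easy to demonstrate with a toy simulation: give each hypothetical department a stable true climate plus measurement noise, select the departments that scored worst in year one, and they will appear to improve in year two even though nothing really changed. All numbers below are illustrative, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 1000  # hypothetical departments

# Stable "true" climate per department, plus independent yearly noise.
true_climate = rng.normal(3.9, 0.15, size=n)
year1 = true_climate + rng.normal(0.0, 0.2, size=n)
year2 = true_climate + rng.normal(0.0, 0.2, size=n)

# Select departments in the bottom quartile of year-1 scores ...
low = year1 < np.quantile(year1, 0.25)
# ... their year-2 mean is higher purely because extreme year-1 scores
# were partly bad luck (noise), not because of real improvement.
apparent_gain = year2[low].mean() - year1[low].mean()
print(f"apparent gain in low group: {apparent_gain:.2f}")  # positive
```

The overall means of the two years stay essentially equal; only the selected subgroup shows a spurious "improvement", which is why selective follow-up of low-scoring departments can mimic a real trend.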

With regard to generalizability, the multicenter approach of the current study, including academic, top clinical and general teaching hospitals from all over the Netherlands, contributed to the representativeness of the study population.

Implications for practice and future research

Our results suggest that the recent focus on the learning climate may have resulted in a small statistically significant improvement in the learning climate for PGME in the Netherlands. To continuously improve the learning climate, departments could use the D-RECT results for defining QA/QI initiatives and setting improvement goals. In the Netherlands, some departments have been successful by giving residents a leading role in these efforts.

In general, the frequent use of (various) feedback-generating tools can support a department's monitoring of the quality of residency training and of the learning climate in particular [23]. In the Netherlands, legislation enacted in 2011 underlines and ratifies this approach, and hospital-wide committees are held accountable for its execution. Hospital-wide monitoring committees (HMCs), which are mandatory for each teaching hospital, represent all residency program directors, residents, and the hospital board [18], but may also represent other staff. The aim of the HMC is to oversee PGME quality and support the QA/QI initiatives of all training programs in a teaching hospital [18]. The HMC may serve as a platform for sharing best practices regarding QA/QI initiatives and thus facilitate the exchange of ideas between departments [28]. A well-functioning HMC is expected to contribute to improved departmental learning climates and, ultimately, improved training outcomes. Future research will have to confirm these intended effects.

Although the relevance of bodies such as the HMC has been stressed, previous research showed that the systematic approaches used by HMCs in 2011 were still immature [29]. Since evidence shows that systematic organizational policies contribute to PGME quality [28, 30], more qualitative research should be undertaken to explore the factors that hinder or support the use of a systematic approach. Furthermore, we are aware that besides QA/QI initiatives there might be numerous other factors influencing the department’s learning climate (e.g. organizational culture). Therefore, investigating the mechanisms underlying learning climate change remains important.

Conclusion
This study provides insight into the development of the learning climate over time, suggesting that residents perceive an improvement in the learning climate. Future research could focus on factors that facilitate or hinder learning climate improvement, including the role of hospital governing committees.

Supporting Information

S1 Dataset.



Acknowledgments
The authors would like to thank the residents who completed the D-RECT evaluations. Special thanks go to the Professional Performance research group for their input on drafts.

Author Contributions

Conceived and designed the experiments: MS OA KL AS. Performed the experiments: MS OA MJH. Analyzed the data: MS OA. Wrote the paper: MS. Revised the manuscript: KL OA AS MJH. Approved the manuscript: MS KL OA AS MJH.

References
  1. Wall D, Goodyear H, Singh B, Whitehouse A, Hughes E, Howes J. A new tool to evaluate postgraduate training posts: the Job Evaluation Survey Tool (JEST). BMC medical education. 2014;14:210. doi: 10.1186/1472-6920-14-210 pmid:25277827; PubMed Central PMCID: PMC4200137.
  2. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—rationale and benefits. The New England journal of medicine. 2012;366(11):1051–6. doi: 10.1056/NEJMsr1200117 pmid:22356262.
  3. WFME. Postgraduate Medical Education: WFME Global Standards for Quality Improvement. Copenhagen: World Federation for Medical Education; 2003.
  4. Soemantri D, Herrera C, Riquelme A. Measuring the educational environment in health professions studies: a systematic review. Med Teach. 2010;32(12):947–52. doi: 10.3109/01421591003686229 pmid:21090946.
  5. Roff S, McAleer S. What is educational climate? Med Teach. 2001;23(4):333–4. doi: 10.1080/01421590120063312 pmid:12098377.
  6. Genn JM. AMEE Medical Education Guide No. 23 (Part 2): Curriculum, environment, climate, quality and change in medical education—a unifying perspective. Med Teach. 2001;23(5):445–54. doi: 10.1080/01421590120075661 pmid:12098364.
  7. Lombarts KM, Heineman MJ, Scherpbier AJ, Arah OA. Effect of the Learning Climate of Residency Programs on Faculty's Teaching Performance as Evaluated by Residents. PloS one. 2014;9(1):e86512. doi: 10.1371/journal.pone.0086512 pmid:24489734.
  8. Delva M, Kirby J, Schultz K, Godwin M. Assessing the Relationship of Learning Approaches to Workplace Climate in Clerkship and Residency. Academic Medicine. 2004;79(11):1120–6. pmid:15504785.
  9. Hobgood C, Hevia A, Tamayo-Sarver JH, Weiner B, Riviello R. The Influence of the Causes and Contexts of Medical Errors on Emergency Medicine Residents' Responses to Their Errors: An Exploration. Academic Medicine. 2005;80(8):758–64. pmid:16043533.
  10. Mareiniss DP. Decreasing GME Training Stress to Foster Residents' Professionalism. Academic Medicine. 2004;79(9):825–31. pmid:15326004.
  11. Tsai JC, Chen CS, Sun IF, Liu KM, Lai CS. Clinical learning environment measurement for medical trainees at transitions: relations with socio-cultural factors and mental distress. BMC medical education. 2014;14:226. doi: 10.1186/1472-6920-14-226 pmid:25335528; PubMed Central PMCID: PMC4287428.
  12. Daugherty SR, Baldwin DC Jr., Rowley BD. Learning, satisfaction, and mistreatment during medical internship: a national survey of working conditions. JAMA: the journal of the American Medical Association. 1998;279(15):1194–9. pmid:9555759.
  13. Busari JO, Verhagen EA, Muskiet FD. The influence of the cultural climate of the training environment on physicians' self-perception of competence and preparedness for practice. BMC medical education. 2008;8:51. doi: 10.1186/1472-6920-8-51 pmid:19025586; PubMed Central PMCID: PMC2596784.
  14. Brown J, Chapman T, Graham D. Becoming a new doctor: A learning or survival exercise? Medical Education. 2007;41(7):653–60. pmid:17614885.
  15. Boor K, van der Vleuten C, Teunissen P, Scherpbier A, Scheele F. Development and analysis of D-RECT, an instrument measuring residents' learning climate. Medical Teacher. 2011:820–7. doi: 10.3109/0142159X.2010.541533 pmid:21355691.
  16. Silkens ME, Smirnova A, Stalmeijer RE, Arah OA, Scherpbier AJ, Van Der Vleuten CP, et al. Revisiting the D-RECT tool: Validation of an instrument measuring residents' learning climate perceptions. Med Teach. 2015:1–6. doi: 10.3109/0142159x.2015.1060300 pmid:26172348.
  17. Piek J, Bossart M, Boor K, Halaska M, Haidopoulos D, Zapardiel I, et al. The work place educational climate in gynecological oncology fellowships across Europe: the impact of accreditation. International journal of gynecological cancer: official journal of the International Gynecological Cancer Society. 2015;25(1):180–90. doi: 10.1097/igc.0000000000000323 pmid:25525769.
  18. Directive of the Central College of Medical Specialists. Utrecht: Royal Dutch Medical Association; 2009. [In Dutch]
  19. Singer JD. Using SAS PROC MIXED to fit multilevel models, hierarchical models, and individual growth models. J Educ Behav Stat. 1998;23(4):323–55.
  20. Boerebach BC, Lombarts KM, Arah OA. Confirmatory Factor Analysis of the System for Evaluation of Teaching Qualities (SETQ) in Graduate Medical Training. Evaluation & the health professions. 2014. doi: 10.1177/0163278714552520 pmid:25280728.
  21. Boerebach BCM, Arah OA, Heineman MJ, Lombarts KMJMH. Embracing the Complexity of Valid Assessments of Clinicians' Performance: A Call for In-Depth Examination of Methodological and Statistical Contexts That Affect the Measurement of Change. Academic Medicine. Published ahead of print. doi: 10.1097/acm.0000000000000840.
  22. Boerebach BC, Arah OA, Heineman MJ, Busch OR, Lombarts KM. The impact of resident- and self-evaluations on surgeon's subsequent teaching performance. World journal of surgery. 2014;38(11):2761–9. doi: 10.1007/s00268-014-2655-3 pmid:24867473.
  23. Byrne JM, Loo LK, Giang D, Wittich CM, Beckman TJ, Drefahl MM, et al. Monitoring and improving resident work environment across affiliated hospitals: A call for a national resident survey. Academic Medicine. 2009;84(2):199–205. doi: 10.1097/ACM.0b013e318193833b pmid:19174665.
  24. Bennett D, Dornan T, Bergin C, Horgan M. Postgraduate training in Ireland: expectations and experience. Irish journal of medical science. 2014. doi: 10.1007/s11845-013-1060-5 pmid:24390312.
  25. Pinnock R, Welch P, Taylor-Evans H, Quirk F. Using the DRECT to assess the intern learning environment in Australia. Med Teach. 2013;35(8):699. doi: 10.3109/0142159X.2013.786175 pmid:23837553.
  26. Zyphur MJ, Kaplan SA, Islam G, Barsky AP, Franklin MS. Conducting multilevel analyses in medical education. Advances in health sciences education: theory and practice. 2008;13(5):571–82. doi: 10.1007/s10459-007-9078-y pmid:17932779.
  27. Barnett AG, van der Pols JC, Dobson AJ. Regression to the mean: what it is and how to deal with it. Int J Epidemiol. 2005;34(1):215–20. doi: 10.1093/ije/dyh299 pmid:15333621.
  28. Curry RH, Burgener AJ, Dooley SL, Christopher RP. Collaborative Governance of Multiinstitutional Graduate Medical Education: Lessons from The McGaw Medical Center of Northwestern University. Academic Medicine. 2008;83(6):568–73. doi: 10.1097/Acm.0b013e3181722fca pmid:18520462.
  29. Lombarts MJ, Scherpbier AJ, Heineman MJ. [Monitoring and improving the quality of residency training: the hospital's central teaching committee as director?]. Ned Tijdschr Geneeskd. 2011;155(49):A3793. pmid:22166178.
  30. Heard JK, O'Sullivan P, Smith CE, Harper RA, Schexnayder SM. An institutional system to monitor and improve the quality of residency education. Acad Med. 2004;79(9):858–64. pmid:15326012.