Abstract
The Progress Test (PT) is a valuable tool for the continuous assessment of medical students’ learning. This quantitative, cross-sectional study aimed to understand the perceptions of 908 medical students (702 from a private and 206 from a public institution) in São Paulo, Brazil, regarding the design and implementation of the PT. Analysis included descriptive statistics, Spearman’s correlation, Pearson’s Chi-square test, and Fisher’s exact test. Results revealed a strong positive correlation between academic progression and expected PT performance. Students from both institutions positively perceived PT construction and received adequate information about its importance. Significant differences were observed between institutions regarding classroom discussion of PT questions and content adequacy. The study highlights PT’s strengths and challenges, emphasizing the need for curriculum alignment, individualized feedback, and strategies to enhance student motivation. It contributes to the field of medical education assessment by providing insights for improving PT implementation and effectiveness. Future research directions include exploring stakeholder perspectives, conducting longitudinal studies, and examining PT’s impact across diverse educational contexts. This research underscores the value of student perspectives in developing effective assessment practices, supporting the evolution of competent medical professionals through meaningful educational strategies.
Citation: Gonçalves Cristóvão HL, Gonçalves Cristóvão NB, de Moraes Castellari LM, Gonçalves Cristóvão AB, dos Santos ER, Quitério AB, et al. (2024) Continuous assessment in medical education: Exploring students’ views on the progress test. PLoS ONE 19(12): e0314848. https://doi.org/10.1371/journal.pone.0314848
Editor: André Ricardo Ribas Freitas, Faculdade Sao Leopoldo Mandic, BRAZIL
Received: July 23, 2024; Accepted: November 12, 2024; Published: December 19, 2024
Copyright: © 2024 Gonçalves Cristóvão et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: All relevant data are within the article and its Supporting Information files.
Funding: The author(s) received no specific funding for this work.
Competing interests: The authors have declared that no competing interests exist.
Introduction
The 1970s saw numerous curricular changes in higher education. This educational movement resulted in the creation and introduction of the Progress Test (PT) at the end of that decade [1–3]. Initially developed at the University of Limburg in Maastricht, the Netherlands, the PT has since gained global recognition as a valuable tool for the continuous assessment of medical students’ learning [4].
The PT’s global adoption has been diverse, with implementations in countries such as the United Kingdom, Canada, Germany, and Brazil, each adapting the test to their specific educational contexts [5,6]. In Brazil, the first PT was applied in 1998, expanding to more than 60 medical school teaching institutions [7,8]. Currently, the PT in Brazil consists of 120 multiple-choice questions, with four response options, which assess students’ knowledge in the seven main areas of medicine: gynecology and obstetrics, surgery, internal medicine, pediatrics, public health, basic sciences, and bioethics [2,9].
From a theoretical perspective, the PT can be understood through the lens of Vygotsky’s concept of zones of proximal development [10]. This approach allows for the systematic design of curricular interfaces that guide students from their current developmental zone to their potential developmental zone [11]. This theoretical framework aligns with the PT’s goal of continuously and systematically monitoring students’ learning processes throughout their medical education [12–15].
While the benefits of the PT for the learning process are widely reported in the literature, few published studies explore the perceptions of the medical students who take it, especially in the Brazilian context [14,16–18]. Studies from other countries have provided valuable insights. For instance, undergraduate students in the United Kingdom reported that constant learning, intrinsic motivation, and the understanding and application of knowledge in clinical scenarios were important for success on the PT. Students in the Netherlands felt that, rather than being oriented towards the “mechanized” reproduction of knowledge, the PT is a meaningful assessment [3,19].
However, it is important to note that the reception of new assessment methods can vary across different cultural contexts, reinforcing the socio-interactionist stance that forms a fundamental element of the basic epistemological matrix for understanding developmental zones [20,21]. This cultural variability underscores the need for context-specific research on PT perceptions and effectiveness.
Recent research has highlighted the importance of understanding students’ beliefs and perceptions about assessment methods, including the PT, as these can significantly influence their learning approaches and outcomes [22]. This study, which explored medical students’ perceptions of continuous assessment methods, including the PT, found that students generally viewed such methods positively, perceiving them as beneficial for their learning and professional development.
Considering the originality of this study in the Brazilian context and its potential contributions to the improvement of PT and the advancement of knowledge in the area of educational assessment in medicine [23], and given that students are the main protagonists in the assessments used in institutional programs, it is imperative to evaluate their perceptions and experiences to inform the future development of assessment methods [24].
Thus, this study aims to address the following research questions:
- How do medical students from two distinct Brazilian institutions perceive the design and implementation of the PT?
- What are the students’ beliefs about the effectiveness of the PT in assessing their medical knowledge and skills?
- How do students’ perceptions of the PT align with or differ from those reported in international contexts?
To frame these questions theoretically, we draw upon the conceptual framework of "assessment for learning" [25], which emphasizes the formative role of assessment in guiding and enhancing student learning. This framework allows us to explore how students’ perceptions of the PT align with its intended purpose as a tool for continuous learning and development. By addressing these questions, this study aims to contribute to the growing body of literature on student perceptions of assessment methods in medical education, with a specific focus on the Brazilian context. The findings will provide valuable insights for educators and policymakers involved in the design and implementation of assessment strategies in medical education.
Materials and methods
Study design and setting
This study employed a quantitative, cross-sectional design to investigate medical students’ perceptions of the PT at two higher education institutions in São Paulo, Brazil: one public and one private. The choice of these institutions aimed to capture potential differences in perceptions related to the distinct academic contexts and student profiles often associated with public and private institutions in Brazil.
Participants and sample size
Participants included medical students from both institutions who had completed at least one PT and volunteered to participate in the study. To ensure representativeness, a sample size calculation was performed using a confidence level of 95% and a margin of error of 5%, considering the estimated student populations of each institution. This calculation resulted in a target sample size of 200 students for the public institution and 650 students for the private institution. The difference in sample sizes between the two institutions reflects the variation in their respective student populations.
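The article does not state the exact formula behind the sample size calculation. A common approach consistent with the stated parameters (95% confidence, 5% margin of error, known populations) is Cochran’s formula with a finite population correction; the sketch below is a minimal illustration under that assumption, using the enrollment figures reported in the Results (1,075 private; 446 public). The study’s stated targets (650 and 200) may additionally reflect practical adjustments not described in the text.

```python
import math

def cochran_sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Cochran's sample size with finite population correction.

    z: critical value for the confidence level (1.96 for 95%).
    margin: acceptable margin of error.
    p: assumed proportion (0.5 maximizes the required sample).
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2      # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)           # finite population correction
    return math.ceil(n)

# Illustrative runs with the enrollment figures from the Results section:
print(cochran_sample_size(446))    # public institution
print(cochran_sample_size(1075))   # private institution
```

These values land near, but not exactly on, the targets reported in the paper, which is why the formula above should be read as an assumed reconstruction rather than the authors’ documented procedure.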
Data collection instrument
Data were collected using a self-administered questionnaire specifically developed for this study. The questionnaire comprised three sections:
- Demographics: This section collected information on participants’ age, gender, year of medical school, and institution type (public or private).
- Perceptions of PT Construction and Institutional Movements: This section used Likert-type scales to assess students’ opinions on various aspects of the PT, including clarity of questions, relevance to the curriculum, and perceived impact on learning.
- Access to Commented Answer Key and PT Results: This section used both Likert-type scales and open-ended questions to explore students’ experiences with accessing and utilizing feedback provided through the PT.
Before implementation, the questionnaire underwent a pilot test with a small group of medical students (n = 50) to ensure clarity, comprehensibility, and relevance of the questions. Feedback from the pilot test was used to refine the questionnaire and ensure its suitability for the target population. The complete questionnaire, in both Portuguese (original language) and English, is available in S1 Appendix.
This study was conducted following the ethical guidelines outlined by the Research Ethics Committee (REC), which approved the research protocol (Protocol number: 3.570.977). Participation in the study was voluntary and anonymous. Informed consent was obtained from all participants, who were informed of the study’s objectives, procedures, and their right to withdraw at any time without penalty.
Data analysis
The data were collected, coded, and tabulated in Excel, then imported into IBM SPSS Statistics version 28 (IBM Corporation, NY, USA) for exploratory and comparative analysis between groups. Exploratory analysis included descriptive statistics for numerical variables, and counts and proportions for categorical variables [26]. Spearman’s non-parametric correlation analysis was performed to assess the correlation between discrete and ordinal variables (undergraduate semester and questionnaire item).
Results were reported as Spearman’s correlation coefficient (r), which ranges from -1 to +1. A value close to +1 indicates a perfect positive monotonic association (the higher one variable, the higher the other); a value close to -1 indicates a perfect negative monotonic association (the higher one variable, the lower the other); values close to zero indicate the absence of correlation. The strength of the correlation was interpreted as follows: |r| ≤ 0.25 = absence of correlation; 0.26 ≤ |r| ≤ 0.50 = weak; 0.51 ≤ |r| ≤ 0.75 = moderate; |r| > 0.75 = strong. The reported r values describe the sample, while the 95% confidence intervals (CI) extrapolate r to the population with 95% confidence [27,28].
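As an illustration of the measures described above, the minimal sketch below computes Spearman’s coefficient (Pearson’s correlation on average ranks), applies the study’s interpretation scale, and derives an approximate 95% CI via the Fisher z-transform — one common method; SPSS’s exact CI procedure may differ. The data here are hypothetical.

```python
import math

def rankdata(values):
    """1-based ranks, averaging ranks for tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman's r: Pearson's correlation applied to ranks."""
    return pearson(rankdata(x), rankdata(y))

def interpret(r):
    """The article's interpretation scale for |r|."""
    a = abs(r)
    if a <= 0.25:
        return "absence of correlation"
    if a <= 0.50:
        return "weak"
    if a <= 0.75:
        return "moderate"
    return "strong"

def fisher_ci(r, n, z=1.96):
    """Approximate 95% CI for r via the Fisher z-transform (requires n > 3, |r| < 1)."""
    zr = math.atanh(r)
    se = 1 / math.sqrt(n - 3)
    return math.tanh(zr - z * se), math.tanh(zr + z * se)
```

For a perfectly monotonic pair such as `x = [1, 2, 3, 4, 5]` and `y = [1, 4, 9, 16, 25]`, `spearman(x, y)` is 1 even though the relation is nonlinear — which is why the text above speaks of monotonic rather than linear association.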
The comparison between students from the two HEIs was performed using Pearson’s Chi-square test or Fisher’s exact test [26,27,29]. When statistical significance was found, pairwise multiple comparisons were performed using the z-test for the difference in proportions [30], with p-values adjusted by the Bonferroni correction, to identify which categories differed. Results are expressed in the tables as counts and percentages, with letter notations indicating similarities or differences between categories (IBM SPSS Statistics Algorithms, 2020).
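SPSS performs the pairwise follow-up procedure internally; the sketch below is a stdlib-only approximation of the two-proportion z-test with Bonferroni adjustment described above, using hypothetical counts. It assumes each observed count is strictly between 0 and its total (otherwise the pooled standard error is zero).

```python
import math
from itertools import combinations

def norm_sf(z):
    """Survival function of the standard normal, via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2))

def two_prop_ztest(x1, n1, x2, n2):
    """Two-sided z-test for the difference of two proportions (pooled SE)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, 2 * norm_sf(abs(z))                   # two-sided p-value

def pairwise_bonferroni(counts, totals, alpha=0.05):
    """All pairwise comparisons with Bonferroni-adjusted p-values."""
    pairs = list(combinations(range(len(counts)), 2))
    m = len(pairs)                                  # number of comparisons
    results = []
    for i, j in pairs:
        z, p = two_prop_ztest(counts[i], totals[i], counts[j], totals[j])
        p_adj = min(p * m, 1.0)
        results.append((i, j, z, p_adj, p_adj < alpha))
    return results

# Hypothetical response counts for one Likert category across three groups:
for row in pairwise_bonferroni([50, 25, 80], [100, 50, 100]):
    print(row)
```

With these hypothetical counts, only the pairs involving the third group (80/100 = 0.80 versus 0.50) survive the Bonferroni adjustment; groups one and two have identical proportions and yield z = 0.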
Results
Sample characteristics
The study encompassed 908 medical students, comprising 702 (65.3% of 1,075 enrolled students) from a private medical school and 206 (46.2% of 446 enrolled students) from a public medical school. Table 1 presents the demographic characteristics of the participants.
The predominant age group was 21–25 years, representing 350 (49.9%) participants from the private institution and 131 (63.6%) from the public institution. Female students constituted the majority in both institutions, accounting for 56.7% and 54.4% of participants in the private and public schools, respectively. The distribution of students across years of study was similar between the two institutions, with the highest proportion of participants in the first year (28.6%) and the sixth year (30.1%) of medical education.
Student perception and course progression
Non-parametric analysis revealed a strong positive correlation between the semester of study and students’ perception of the percentage of questions they expected to answer correctly in the Progress Test (PT) at both institutions (Table 2).
PT adequacy and institutional actions
The majority of students from both institutions strongly agreed with the adequacy of PT construction and institutional actions to promote student adherence. Correlation analysis (Table 3) demonstrated no significant correlation between responses and semester of study, except for the item "Completion time is adequate for the content," which exhibited a weak positive correlation.
Access to answer key and PT results
The majority of students expressed strong interest in accessing the commented answer key and PT results. Correlation analysis (Table 4) revealed a weak positive correlation between semester of study and interest in accessing the commented answer key.
Utilization of PT results by institutions
Comparative analysis of student responses regarding the use of PT results by the institution (Table 5) revealed significant differences between the two schools, particularly for the item "Questions are subsequently discussed in the classroom."
Correlation analysis (Table 6) demonstrated a weak positive correlation between semester of study and the perception that questions are discussed in the classroom.
Student motivation and use of PT results
Comparative analysis between students from the two institutions (Table 7) revealed significant differences in three of the four items related to motivation and use of PT results.
Correlation analysis (Table 8) showed no significant correlations between responses and semester of study.
Discussion
The present study aimed to understand the perceptions of medical students from two distinct institutions regarding the design and implementation of the Progress Test (PT). The results showed that the PT offers teachers and institutions the opportunity to review the teaching-learning processes related to their course, knowledge area, and competency maturation across the years of training. It also helps identify strengths and weaknesses in the curricular structure, which is useful for finding solutions to the obstacles identified and for improving teaching tools [12,14,31–33].
From the perspective of Vygotsky’s theory, the PT gives institutions an opportunity to monitor students’ current zones of development through scaffolding that tracks the progression of difficulty and integrates the focal knowledge and competencies of each learning cycle, while also accounting for the particularities of the seven main areas assessed [34]. This approach aligns with the longitudinal nature of the PT, allowing a more holistic view of student development throughout the course [35].
The results of the present study corroborate the effectiveness and reliability of the PT as an instrument for assessing learning during undergraduate training in health professions [36]. Given its broad scope, PTs can reveal curricular deficiencies in student preparation, knowledge gaps, or areas of low performance that need attention, offering realistic and contextualized content, oriented towards individual monitoring [16,37]. This finding is reinforced by the "strong positive correlation" found in the present research regarding students’ perception, from both HEIs, in the sense of answering a higher percentage of PT questions as the semester progresses, since their academic development can generate confidence according to the level of maturity achieved.
Cultural differences play a significant role in shaping students’ perceptions of the PT. A study by Verhoeven et al. [38] found that students from diverse cultural backgrounds may interpret and approach the PT differently, influenced by their educational experiences and cultural values. This cultural dimension adds complexity to the implementation and interpretation of PT results across different institutions and regions.
Considering the need to adapt the PT to varied local cultural and academic contexts [39,40], the participants of this study had a positive perception that the two HEIs adequately constructed the PT to promote student adherence to its completion. These data provide important feedback for the institutions, highlighting the importance of encouraging students to adopt a longitudinal, self-directed learning style and to understand the consolidation of knowledge required for the proper exercise of the profession [32,33,38]. Therefore, it falls to the pedagogical projects of the courses to specify, in regulatory terms, the tools through which this spiral curricular approach will be put into daily practice.
The analysis of students’ motivation to take the PT, the use of results for academic development, and the evolution of performance in the test to guide studies revealed significant statistical differences between the HEIs studied. Students from the private school demonstrated greater agreement regarding these items compared to students from the public institution. This divergence can be explained by the different profiles of students and the insertion of active methodologies in the institutions, which favor interest in formative assessments [41]. One strategy to address this situation would be to conduct the PT more frequently, around two to four times a year, potentially leading to a more positive impact on students’ perception of the test’s value [42].
Surprisingly, in the two medical schools studied, no correlation was observed between progression through the undergraduate semesters and responses about motivation and students’ use of PT results for their academic development. This lack of association may be related to high academic demands, pressure from summative assessments, anxiety about the risk of not progressing in the program, and concern about the proximity of the residency exam, which has a different format from the PT [43]. However, Norman et al. [44] showed that implementing the PT can reduce failures in medical licensing exams: conventional exams encourage mechanical memorization at the expense of comprehensive reasoning, whereas the PT interrupts this effect and allows for deep learning, which may favor performance in other types of exams, such as the medical residency exam [45].
The influence of summative assessments on students’ motivation to learn is another relevant aspect to consider [46]. Given the lack of direct consequences of formative test results, as is the case of PT in the HEIs analyzed in this study, students may be less motivated to put forth their best effort in this assessment. This phenomenon can be explained by the Expectancy-Value Theory (EVT), a conceptual framework frequently used in the context of test-taking motivation [47]. Future research could explore strategies to enhance students’ intrinsic motivation to engage with the PT, potentially improving its effectiveness as a learning tool.
Regarding access to the commented answer key and the PT results, most students from both schools strongly agreed with these items. Research indicates that many students wish to access commented answer keys after taking PTs, believing this practice helps them better understand their mistakes and direct their future studies more effectively [24,48]. However, a significant statistical difference was observed for the item “Questions are subsequently discussed in the classroom”: students from the private institution perceived greater use of PT results than students from the public HEI. This finding highlights the importance of understanding how different institutions implement the PT to promote learning within each specific reality. Beyond treating the PT as a subsidy for the institutional validation of curricular practices, it is necessary to promote dialogical reflection that generates longitudinal feedback for students, so that this tool can become a valuable resource for systematizing academic mentoring and monitoring the individual acquisition of expected competencies [31].
Agreement that curricular content adequately prepared students for the PT was also higher among students from the private HEI than the public one. Hence the importance of aligning the institutional pedagogical project with the PT matrix, so that students can track the development of their cognitive performance across the areas of the course and curriculum, identify potential problems, and thereby come to value this exam [7]. In addition, it is essential to consider the different assessment cultures in each institution, since schools with traditional teaching tend to value summative assessments, while those that employ active methodologies use a higher proportion of formative assessments [49].
The integration of PT into a programmatic assessment system, as suggested by Van der Vleuten et al. [50], could optimize both the learning function and decision-making in competency-based educational contexts, using well-founded assessment principles. Although this programmatic assessment model is well received in educational practice [51,52], many consider it complex and theoretical. To address this, Van der Vleuten et al. [53] described concrete tips for implementing programmatic assessment in educational institutions. Future research could explore the effectiveness of these implementation strategies across diverse educational settings.
The uniqueness of the present study is linked to its relevance in assessing medical students’ perceptions of PT, which can be used by academic managers to develop new interventions in the curriculum, improving the quality of health courses. A limitation of this study is the sample restricted to two institutions, which may limit the generalization of the results to other contexts. In addition, the research was based exclusively on student perceptions, not covering the perspective of other actors involved in the process, such as teachers and educational managers. Thus, as a future perspective, more research is needed in other educational institutions to compare and deepen the understanding of the impact of PT on the teaching-learning process of medical students, as well as to verify the effect that any changes in curricular design have on these indicators.
It is important to emphasize that structural modifications in the educational system are not sufficient; a fundamental reformulation of educational models is imperative. Higher education institutions must move beyond the model of transmitting information disconnected from practical reality. Pedagogical processes need to emphasize debate and critical reflection, with teachers acting as facilitators and catalysts, while students take active responsibility for their learning [54]. Students must understand the meaning and applicability of what they learn, using information critically and developing the skills and competencies necessary to become professionals capable of transforming their environment.
Conclusions
This study examined medical students’ perceptions of the Progress Test (PT) design and implementation across two distinct institutions. Our findings underscore the PT’s significance as a continuous, formative assessment tool that enables the monitoring of knowledge development throughout the medical curriculum.
The research revealed both strengths and challenges in PT implementation, highlighting the need for alignment between PT content and course curricula, as well as the importance of providing individualized feedback to students. These insights can inform strategies to optimize PT implementation and enhance its effectiveness as a learning tool.
By considering student perspectives, educational institutions can develop more meaningful assessment practices that support the development of competent medical professionals. This study contributes to the field of medical education assessment, offering evidence-based insights for enhancing Progress Tests and opening avenues for future research in this area.
Supporting information
S1 Appendix. The complete questionnaire, in both Portuguese (original language) and English.
https://doi.org/10.1371/journal.pone.0314848.s001
(DOCX)
References
- 1. Pinheiro CEA, Souza DOD. Avaliação longitudinal de estudantes de medicina: o teste de progresso é apropriado? Estud Em Aval Educ. 2023 Sep 12;34:e09220.
- 2. Rosa MI da, Isoppo CC, Cattaneo HD, Madeira K, Adami F, Ferreira OF. O Teste de Progresso como Indicador para Melhorias em Curso de Graduação em Medicina. Rev Bras Educ Médica. 2017 Mar;41:58–68.
- 3. Sattar K, Ahmad T, Khalil MS, Soliman MM, Punnamperuma GG, Abdulghani H. Medical students’ perception of the progress test as a quality-controlled assessment tool for improving learning and teaching, at a public sector medical college in Saudi Arabia. J Pak Med Assoc. 2017 Feb;67(2).
- 4. Wrigley W, Van Der Vleuten CP, Freeman A, Muijtjens A. A systemic framework for the progress test: Strengths, constraints and issues: AMEE Guide No. 71. Med Teach. 2012 Sep;34(9):683–97. pmid:22905655
- 5. Freeman A, Van Der Vleuten C, Nouns Z, Ricketts C. Progress testing internationally. Med Teach. 2010 Jan;32(6):451–5. pmid:20515370
- 6. Nouns ZM, Georg W. Progress testing in German speaking countries. Med Teach. 2010 Jan;32(6):467–70. pmid:20515374
- 7. Sakai MH, Ferreira Filho OF, Almeida MJ de, Mashima DA, Marchese M de C. Teste de progresso e avaliação do curso: dez anos de experiência da medicina da Universidade Estadual de Londrina. Rev Bras Educ Médica. 2008 Jun;32:254–63.
- 8. Vleuten C van der, Freeman A, Collares CF. Progress test utopia. 2018 Mar 9;7(2):136.
- 9. Matsuyama Y, Muijtjens AMM, Kikukawa M, Stalmeijer R, Murakami R, Ishikawa S, et al. A first report of East Asian students’ perception of progress testing: a focus group study. BMC Med Educ. 2016 Sep 22;16(1):245. pmid:27658501
- 10. Vygotsky LS. Mind in Society: Development of Higher Psychological Processes [Internet]. Cole M, Jolm-Steiner V, Scribner S, Souberman E, editors. Harvard University Press; 1980.
- 11. Chaiklin S. The Zone of Proximal Development in Vygotsky’s Analysis of Learning and Instruction. In: Kozulin A, Gindis B, Ageyev VS, Miller SM, editors. Vygotsky’s Educational Theory in Cultural Context [Internet]. 1st ed. Cambridge University Press; 2003.
- 12. Herrmann L, Beitz-Radzio C, Bernigau D, Birk S, Ehlers JP, Pfeiffer-Morhenn B, et al. Status Quo of Progress Testing in Veterinary Medical Education and Lessons Learned. Front Vet Sci [Internet]. 2020 Aug 21 [cited 2024 Jun 20];7. Available from: https://www.frontiersin.org/articles/10.3389/fvets.2020.00559 pmid:32974407
- 13. Möltner A, Wagener S, Burkert M. Measuring competency-relevant knowledge in the competency-oriented student progress test. GMS J Med Educ. 2020;37(1):Doc6. pmid:32270020
- 14. Reberti AG, Monfredini NH, Ferreira Filho OF, Andrade DF de, Pinheiro CEA, Silva JC. Progress Test in Medical School: a Systematic Review of the Literature. Rev Bras Educ Médica. 2020 Mar 13;44:e014.
- 15. Sartor LB, Rosa LL da, Madeira K, Uggioni MLR, Ferreira Filho OF, Rosa MI da. Undergraduate Medical Student’s Perception about the Progress Testing. Rev Bras Educ Médica. 2020 May 8;44:e062.
- 16. Couto LB, Durand MT, Wolff ACD, Restini CBA, Faria M, Romão GS, et al. Formative assessment scores in tutorial sessions correlates with OSCE and progress testing scores in a PBL medical curriculum. Med Educ Online. 2019 Jan 1;24(1):1560862. pmid:31023185
- 17. Yielder J, Wearn A, Chen Y, Henning MA, Weller J, Lillis S, et al. A qualitative exploration of student perceptions of the impact of progress tests on learning and emotional wellbeing. BMC Med Educ. 2017 Aug 29;17(1):148. pmid:28851341
- 18. Dion V, St-Onge C, Bartman I, Touchie C, Pugh D. Written-Based Progress Testing: A Scoping Review. Acad Med. 2022 May;97(5):747–57.
- 19. Zanella AV. Zona de desenvolvimento proximal: análise teórica de um conceito em algumas situações variadas. Temas Em Psicol. 1994;2(2):97–110.
- 20. Rogoff B. The cultural nature of human development. Oxford: Oxford University Press; 2003. 434 p.
- 21. Lave J, Wenger E. Situated learning: legitimate peripheral participation. Cambridge; New York: Cambridge University Press; 1991. 138 p. (Learning in doing).
- 22. Saha M, Islam S, Akhi AA, Saha G. Factors affecting success and failure in higher education mathematics: Students’ and teachers’ perspectives. Heliyon. 2024 Apr;10(7):e29173. pmid:38601642
- 23. Chinelato MMR, Martinez JE, Azevedo GRD. Teste de Progresso: a percepção do discente de Medicina. Rev Bras Educ Médica. 2022;46(suppl 1):e154.
- 24. Ali K, Cockerill J, Zahra D, Tredwin C, Ferguson C. Impact of Progress testing on the learning experiences of students in medicine, dentistry and dental therapy. BMC Med Educ. 2018 Dec;18(1):253. pmid:30413204
- 25. Schuwirth LWT, Van Der Vleuten CPM. Programmatic assessment: From assessment of learning to assessment for learning. Med Teach. 2011 Jun;33(6):478–85. pmid:21609177
- 26. Conover WJ. Practical nonparametric statistics. 3rd ed. New York: John Wiley & Sons; 1999. 584 p.
- 27. Siegel S, Castellan NJ. Estatística não paramétrica para ciências do comportamento. 2nd ed. Porto Alegre: Artmed; 2006. 448 p.
- 28. Bishara AJ, Hittner JB. Confidence intervals for correlations when data are not normal. Behav Res Methods. 2017 Feb 1;49(1):294–309. pmid:26822671
- 29. Field AF. Descobrindo a Estatística Usando o SPSS. 2nd ed. Porto Alegre: Artmed; 2009. 688 p.
- 30. Bussab W de O, Morettin PA. Estatística básica. 9th ed. São Paulo: Saraiva Uni; 2017. 568 p.
- 31. Oliveira SS de, Postal EA, Afonso DH, s CE, Cyrino EG, Abreu Junior AF de, et al. Teste de Progresso da Abem: consolidando uma estratégia de avaliação para o ensino médico. Rev Bras Educ Médica. 2022 Apr 4;46:e061.
- 32. Bicudo AM, Hamamoto PT, Abbade JF, Hafner M de LMB, Maffei CML. Teste de Progresso em Consórcios para Todas as Escolas Médicas do Brasil. Rev Bras Educ Médica. 2019 Oct 14;43:151–6.
- 33. Coombes L, Ricketts C, Freeman A, Stratford J. Beyond assessment: Feedback for individuals and institutions based on the progress test. Med Teach. 2010 Jan;32(6):486–90. pmid:20515378
- 34. Sanders D, Welk DS. Strategies to Scaffold Student Learning: Applying Vygotskyʼs Zone of Proximal Development. Nurse Educ. 2005 Sep;30(5):203–7. pmid:16170261
- 35. Gordon M, Patricio M, Horne L, Muston A, Alston SR, Pammi M, et al. Developments in medical education in response to the COVID-19 pandemic: A rapid BEME systematic review: BEME Guide No. 63. Med Teach. 2020 Nov 1;42(11):1202–15. pmid:32847456
- 36. Görlich D, Friederichs H. Using longitudinal progress test data to determine the effect size of learning in undergraduate medical education–a retrospective, single-center, mixed model analysis of progress testing results. Med Educ Online. 2021 Jan 1;26(1):1972505. pmid:34459724
- 37. Falcão FMV, Pereira DSM, Pêgo JM, Costa P. Progress is impossible without change: implementing automatic item generation in medical knowledge progress testing. Educ Inf Technol. 2024 Mar;29(4):4505–30.
- 38. Verhoeven BH, Snellen-Balendong HAM, Hay IT, Boon JM, Van Der Linde MJ, Blitz-Lindeque JJ, et al. The versatility of progress testing assessed in an international context: a start for benchmarking global standardization? Med Teach. 2005 Sep;27(6):514–20. pmid:16199358
- 39. Alavarce DC, De Medeiros ML, De Araújo Viana D, Abade F, Vieira JE, Machado JLM, et al. The progress test as a structuring initiative for programmatic assessment. BMC Med Educ. 2024 May 21;24(1):555. pmid:38773470
- 40. Khan AS, AlMulhem A, Refaie S, Dawood AFA, Angeles G, Pangan JK. Progress on Progress Test: International vs Local Experience. Int J Chang Educ. 2024 Mar 14;1(2):57–62.
- 41. Torralba KD, Doo L. Active Learning Strategies to Improve Progression from Knowledge to Action. Rheum Dis Clin N Am. 2020 Feb;46(1):1–19. pmid:31757278
- 42. Liao H, Zhang Q, Yang L, Fei Y. Investigating relationships among regulated learning, teaching presence and student engagement in blended learning: An experience sampling analysis. Educ Inf Technol. 2023 Oct;28(10):12997–3025. pmid:37361845
- 43. Hamamoto PT, Zeferino AMB. Cursinhos Preparatórios para Residência Médica: Reflexões sobre Possíveis Causas e Consequências. Rev Bras Educ Médica. 2011;35(4):550–6.
- 44. Norman G, Neville A, Blake JM, Mueller B. Assessment steers learning down the right road: Impact of progress testing on licensing examination performance. Med Teach. 2010 Jan;32(6):496–9. pmid:20515380
- 45. Albanese M, Case SM. Progress testing: critical analysis and suggested practices. Adv Health Sci Educ. 2016 Mar;21(1):221–34. pmid:25662873
- 46. Chaou CH, Yu SR, Chang YC, Ma SD, Tseng HM, Hsieh MJ, et al. The evolution of medical students’ preparedness for clinical practice during the transition of graduation: a longitudinal study from the undergraduate to postgraduate periods. BMC Med Educ. 2021 Dec;21(1):260. pmid:33957907
- 47. Hulleman CS, Barron KE, Kosovich JJ, Lazowski RA. Student Motivation: Current Theories, Constructs, and Interventions Within an Expectancy-Value Framework. In: Lipnevich AA, Preckel F, Roberts RD, editors. Psychosocial Skills and School Systems in the 21st Century [Internet]. Cham: Springer International Publishing; 2016. p. 241–78. (The Springer Series on Human Exceptionality).
- 48. Chen Y, Henning M, Yielder J, Jones R, Wearn A, Weller J. Progress testing in the medical curriculum: students’ approaches to learning and perceived stress. BMC Med Educ. 2015 Sep 11;15(1):147. pmid:26362199
- 49. Troncon LEDA, Elias LLK, Osako MK, Romão EA, Bollela VR, Moriguti JC. Reflexões sobre a utilização do Teste de Progresso na avaliação programática do estudante. Rev Bras Educ Médica. 2023;47(2):e076.
- 50. Van Der Vleuten CPM, Schuwirth LWT, Driessen EW, Dijkstra J, Tigelaar D, Baartman LKJ, et al. A model for programmatic assessment fit for purpose. Med Teach. 2012 Mar;34(3):205–14. pmid:22364452
- 51. Driessen EW, Van Tartwijk J, Govaerts M, Teunissen P, Van Der Vleuten CPM. The use of programmatic assessment in the clinical workplace: A Maastricht case report. Med Teach. 2012 Mar;34(3):226–31. pmid:22364455
- 52. Bok HG, Teunissen PW, Favier RP, Rietbroek NJ, Theyse LF, Brommer H, et al. Programmatic assessment of competency-based workplace learning: when theory meets practice. BMC Med Educ. 2013 Dec;13(1):123. pmid:24020944
- 53. Van Der Vleuten CPM, Schuwirth LWT, Driessen EW, Govaerts MJB, Heeneman S. Twelve Tips for programmatic assessment. Med Teach. 2015 Jul 3;37(7):641–6. pmid:25410481
- 54. De Oliveira MAC, Miles A, Asbridge JE. Modern medical schools curricula: Necessary innovations and priorities for change. J Eval Clin Pract. 2024 Mar;30(2):162–73. pmid:37656633