e-Learning for enhancement of medical student performance at the Objective Structured Clinical Examination (OSCE)

This study aimed to investigate the impact of student e-learning on the development of clinical competencies. The study participants were 3rd-year students (n = 43) in a four-year medical program at a private mid-sized medical school located in a South Korean suburb. An educational intervention was implemented to enhance student clinical performance: students engaged in learning activities intended to promote their self-directed learning abilities and clinical performances using e-learning resources. The intervention was conducted for six months during the 3rd year, and its effectiveness was investigated by comparing student performances in OSCEs in a pre/post comparison format and by comparing them with national scores. In addition, student perceptions of the impact of e-learning on their OSCE performances were assessed using a 36-item questionnaire that elicited student perceptions of their experiences of e-learning and their readiness for e-learning. Student OSCE scores improved significantly after the educational intervention in all domains of clinical competency assessed and for total scores (p < 0.001). Furthermore, students achieved higher OSCE scores than national average scores in the post-test, whereas they had performed below national average scores in the pre-test. Students showed neutral or slightly positive responses regarding the effectiveness of e-learning, and their perceptions of e-learning were not associated with their e-learning readiness scores. The study shows that student OSCE performance improved significantly after the educational intervention, which indicates the effectiveness of e-learning in supporting student learning of clinical performance. Despite significant improvements in student OSCE scores after e-learning, student perceptions of its effectiveness were neutral, and their perceptions of e-learning were not associated with their readiness for it.
Suggestions are made to help students use e-learning more effectively to enhance their clinical competencies.


Background
Clinical education is increasingly facing challenges that call for innovation. First, given the COVID-19 pandemic, there is an urgent need for innovation in the teaching and learning of medical material [1]. In particular, medical schools are obliged to respond to the need for clinical education among students with limited opportunities for first-hand patient experiences while clinical clerkships are suspended due to the pandemic. As such, online learning has attained a position of prominence in medical education [2].
Second, there is a mismatch between students' learning needs and the realities of today's clinical settings [3,4]. Clinical education at tertiary academic medical centers is not suitable for teaching medical students the competencies required of primary care physicians at the basic medical education phase [5]. Moreover, clinical experiences involving engagement in clinical encounters, ranging from patient presentation to making diagnoses and treatment plans, are limited for medical students during clerkships [6,7], and such problems are likely to be exacerbated during the COVID-19 pandemic. Third, although the Objective Structured Clinical Examination (OSCE) is an important tool for assessing student clinical competencies, students frequently lack opportunities to practice, other than in the high-stakes OSCE examination [8]. Furthermore, some medical schools have moved the teaching and assessment of clinical skills into online formats, including virtual OSCEs [9,10], in response to the COVID-19 pandemic.
Under these circumstances, it is likely that there will be fewer opportunities to teach and learn clinical performance, and thus, prepare students for clinical performance tests. Therefore, there is a need for educational interventions that supplement teaching and learning, promote student clinical competencies, and help students prepare for OSCEs.
The literature suggests that reflection and self-directed learning are of paramount importance for effective learning [11] and key to promoting the development of professional competencies [12]. Likewise, previous studies indicate that reflection and self-directed learning improve student OSCE scores [13,14]. e-Learning is becoming increasingly important in medical education and is an integral part of supporting medical students' self-directed learning, as in MOOCs (Massive Open Online Courses) [15] and other online learning resources [16][17][18]. Still, research is scant on the use of e-learning for self-directed learning to promote student clinical competencies and on its impact on student performance in OSCEs. Thus, this study aimed to investigate the impact of e-learning on student self-directed learning and the development of clinical competencies. e-Learning is a term often used interchangeably with online learning or distance learning, but there are subtle differences [19]. e-Learning is known to be at least as effective as conventional learning in medical education [20,21]. However, research on e-learning in the clinical education setting is scant, although the effectiveness of online videos for the learning of clinical skills has been well demonstrated [22,23]. The proliferation of technology use due to the COVID-19 pandemic has generated an increasing need for research on the effective use of e-learning in clinical education. Cook [24] asserts that research on e-learning needs to focus on when and how to use it effectively rather than on head-to-head comparisons of its effectiveness with traditional instruction.
Research indicates that individual factors need to be taken into account when designing and implementing effective online learning environments [25]. One of the factors that influences effective e-learning is learner readiness. Several instruments have been developed to measure college student readiness for online learning [26]. Hung and colleagues [27] developed and validated an instrument for measuring learner readiness for online learning that consisted of five domains, and found that college students' levels of online learning readiness were high for computer/internet self-efficacy, online communication self-efficacy, and motivation for learning, but low for learner control and self-directed learning. However, little research has been performed on learner preparedness for e-learning in medical education or on the impact student perceptions have on its effectiveness.
In this study, we asked two research questions: Does student OSCE performance improve after students engage in e-learning? Does student readiness for e-learning affect their perceptions of its effectiveness?

Methods
Study participants and the study setting
Study participants were 3rd-year students (n = 43) at a private mid-sized medical school located in a suburb in South Korea. The school has a four-year basic medical education curriculum, and the 3rd-year curriculum ran from January 2019 to February 2020, during which students attended core clerkships. Students underwent OSCEs twice during the Year 3 curriculum, first in June 2019 and again in February 2020. The OSCE comprised six stations, with 10 minutes allocated per station, at which students interacted with standardized patients. Students were assessed on their clinical competencies in history-taking, physical examinations, and patient-doctor relationships (patient-physician interactions).
We designed and implemented an educational intervention for Year 3 students in 2019 to promote the development of students' clinical competencies and enhance their clinical performance, as their performance in the first round of the OSCE fell short of the national average. This self-directed learning course ran for six months and was awarded one credit-hour. Students engaged in learning activities intended to promote their reflection and self-directed learning on clinical performance using e-learning resources. During the course, students wrote a reflective paper on their OSCE performance. The purpose of this reflective paper was to promote student self-assessment of their clinical performance. For this reflection, students first reviewed their clinical performances by watching video clips of themselves at OSCE stations and then rated their performances on a 5-point Likert-type scale in 14 areas spanning history-taking, physical examinations, patient-doctor relationships, and overall performance.
Based on this self-assessment, students wrote a reflective paper on their OSCE performances.
In addition to self-reflection, students engaged in self-directed e-learning of clinical performance. The e-learning resources were provided by the Korean Consortium for e-Learning in Medical Education (www.mededu.or.kr) and were developed and shared by medical schools nationwide. This website offers online clinical videos, approximately 10 minutes in duration each, that show patient encounters for over 40 clinical presentation topics and demonstrate the entire process of a patient encounter in a clinic setting. Further information on these video resources is provided elsewhere [18]. Students were asked to analyze the patient encounters shown in the video clips in terms of history-taking and physical examinations and to report the information obtained and the patient problems identified, based on the schema of clinical presentations. Students wrote a report on their analysis of patient encounters after watching at least six video clips and were free to choose any subject from the repository of over 40 clinical presentations based on their individual learning needs.

Study procedures
Students took the OSCE twice during the Year 3 curriculum. The first took place in June 2019, approximately five months after they had attended clinical core clerkship rotations. The second OSCE was taken in February 2020, when the Year 3 curriculum ended. Between the two OSCE rounds, students participated in the self-directed learning course while attending clinical rotations.
To investigate the effectiveness of the e-learning intervention, student performance on the OSCE tests in Year 3 was assessed using a pre/post comparative format. Student OSCE scores were also compared with national scores to investigate improvements in performance, and student perceptions of the impact of e-learning on their OSCE performances were assessed using a questionnaire. For this purpose, we developed a questionnaire of 36 items composed of three dimensions. The first dimension addressed demographic information, and the second consisted of 10 items that elicited participants' perceptions of their experiences of e-learning. The third dimension contained 12 items (adapted from Hung [27]) that assessed student readiness for e-learning across sub-scales addressing (a) self-directed learning, (b) learner control, and (c) motivation for learning. We translated the original questionnaire, which was written in English, into Korean, as we had experience in medical education research and Korean translation.
Students responded to the statements in the second and third dimensions using a five-point scale, where 1 = "Strongly Disagree" and 5 = "Strongly Agree." The questionnaire was self-administered and implemented using an online survey tool in June 2020.

Data collection and ethical considerations
Student OSCE scores, consisting of history-taking, physical examination, and patient-doctor relationship domains, were obtained for the first and second tests and were summed to produce total scores with a maximum possible score of 100. Student OSCE scores were compared with those of students at other medical schools who had taken OSCEs administered by a consortium of 18 medical schools in the Seoul metropolitan area.
Questionnaires were administered after obtaining permission from the Institutional Review Board of Inha University School of Medicine (2020-06-036), and all methods were carried out in accordance with relevant guidelines and regulations. We provided all participating students with a description of the purpose and methods of the study and stressed their rights regarding voluntary participation and personal confidentiality. Students who agreed to participate in the survey completed the questionnaire online. The survey data were retrieved anonymously from the online survey site.

Data analysis
Student OSCE performances were analyzed using descriptive statistics, and OSCE scores obtained for the first- and second-round tests were compared using the paired t-test. Responses to the questionnaire were analyzed using descriptive statistics, and Cronbach's alpha values were calculated to determine the internal consistency of items. Differences in survey responses between students with different backgrounds were analyzed using the independent t-test; for this purpose, students were divided into two age groups about the median age (24.0 years). Correlation coefficients were calculated to establish relationships between student perceptions of e-learning and e-learning readiness. Analyses were performed using SPSS Ver. 25.0, and statistical significance was accepted at p < 0.05.
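To make the pre/post comparison concrete, the paired t statistic underlying this analysis can be sketched in a few lines. This is an illustrative sketch only: the scores below are hypothetical, not study data, and the actual analysis was performed in SPSS as stated above.

```python
import math

def paired_t(pre, post):
    """Paired t-test: returns the t statistic and degrees of freedom
    for matched pre/post score pairs."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample variance of the differences (n - 1 in the denominator)
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    se = math.sqrt(var_d / n)  # standard error of the mean difference
    return mean_d / se, n - 1

# Hypothetical pre/post OSCE total scores for five students
pre = [58.0, 62.5, 55.0, 60.0, 57.5]
post = [68.0, 70.5, 63.0, 66.0, 65.5]
t, df = paired_t(pre, post)
print(round(t, 2), df)  # → 12.65 4
```

The t statistic is then compared against the t distribution with n - 1 degrees of freedom; in practice a library routine such as SciPy's `ttest_rel` would return the p-value directly.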

Results
Improvements in student performances
All Year 3 students participated in both OSCE rounds (n = 43). Sixteen (37%) were female, and 27 (63%) were male. Student ages ranged from 23 to 35 years (M = 26.56 years; SD = 3.48). Student grade point averages (GPAs) in the third-year curriculum ranged from 2.57 to 4.33 of a possible 4.5 (M = 3.50, SD = 0.45). Student OSCE performances in the pre-test were compared across different backgrounds as baseline measurements. Student OSCE scores in the pre-test differed by age and gender (p < 0.01): older and female students had better OSCE scores than their counterparts. However, post-test scores did not differ by gender or age. Note: Each assessment domain was awarded a maximum of 100 points, and total scores were calculated using average scores. Figure 1 shows comparisons of national average scores and student OSCE performances. Students' OSCE scores were higher than the national average in the post-test, whereas they were lower than the national average in the pre-test.
Students' perceived effectiveness of e-learning
Thirty-five students completed and returned the questionnaire (an 81.4% response rate). Cronbach's alpha for the items on student perceptions of e-learning was 0.90, which demonstrated a high level of reliability. Table 2 provides a summary of descriptive statistics of student responses to the questionnaire items. Students agreed with the statement that "the clinical videos aided learning clinical performance" (M = 3.66, SD = 0.76) and tended to agree with the statement that "the clinical videos helped me prepare for the OSCE" (M = 3.54, SD = 0.92). Students found that the videos aided learning because "I was able to repeatedly view aspects with which I was unfamiliar" (M = 3.57, SD = 0.917). In particular, students found the videos most helpful for understanding physical examination (M = 3.63, SD = 0.88) and history-taking skills (M = 3.57, SD = 0.78).

Student readiness for e-learning and its relationship with perceptions of e-learning
Table 3 shows descriptive statistics of student e-learning readiness. Total scores for student e-learning readiness ranged from 33.0 to 60.0 (M = 44.40, SD = 6.16), and the mean item score was 3.70 (SD = 0.51), where a score of 5 indicated the highest level of readiness. Cronbach's alpha for the items on student e-learning readiness was 0.85, which demonstrated acceptable reliability. Student e-learning readiness was higher for males than for females (t = 2.77, p < 0.05) but did not differ between age groups (t = 1.19, p = 0.24). Students' overall perceived effectiveness of e-learning was not associated with their total e-learning readiness scores (r = 0.20, p < 0.25) or with scores in any individual domain of student e-learning readiness.
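The Cronbach's alpha values reported above (0.90 and 0.85) come from the standard internal-consistency formula, which can be sketched directly. The Likert responses below are hypothetical examples, not the study's survey data, and the study's own computations were done in SPSS.

```python
def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance
    of total scores). `items` is a list of per-item score lists, each
    aligned across the same respondents."""
    k = len(items)          # number of items
    n = len(items[0])       # number of respondents

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(variance(item) for item in items)
    # Each respondent's total score across all items
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

# Three hypothetical 5-point Likert items answered by four respondents
items = [
    [4, 3, 5, 2],
    [4, 4, 5, 3],
    [3, 3, 4, 2],
]
print(round(cronbach_alpha(items), 2))  # → 0.96
```

Values above roughly 0.7 are conventionally taken to indicate acceptable internal consistency, which is why the reported 0.85 and 0.90 are described as acceptable and high, respectively.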

Conclusions
Student OSCE performances improved significantly after the e-learning intervention, and the results demonstrate that e-learning effectively supports medical students' self-directed learning of clinical performance. Our study illustrates that e-learning provides an effective means of supplementing the face-to-face teaching of clinical performance at a time when face-to-face contact in clinical settings is limited.
Despite the significant improvement in student OSCE scores after e-learning, student perceptions of its effectiveness were rather neutral. This finding may be because clinical performance involves the use of tacit knowledge, such as diagnostic reasoning, which is a requisite competence for doctors but is not easily understood by novices such as medical students [28]. Thus, such tacit knowledge was likely not easy for students to comprehend just by watching a patient encounter demonstrated in a video. Moreover, this finding is also likely due to students' orientation toward learning for assessment: although the clinical videos were not intended to teach test-taking skills, students tended to use them as a checklist to understand what is assessed in the OSCE and to learn how to perform correctly according to that checklist, rather than to gain an understanding of patient encounters and identify their learning needs.
Therefore, we make a few suggestions for improving the effectiveness of e-learning in enhancing student clinical performance. First, faculty debriefing is suggested to supplement student learning from the clinical videos. During these debriefings, faculty members review key points with students, either online or face-to-face, to help them understand the application of the tacit knowledge demonstrated in the videos. Second, clinical videos should not only demonstrate procedures during patient encounters but also provide information on the knowledge and skills underlying the encounter.
The student e-learning readiness scores obtained in the present study are similar to those of Taiwanese college students reported by Hung et al. [27]. We found that students generally felt they were prepared for e-learning but that their e-learning readiness scores were not associated with their perceptions of e-learning. These findings appear to reflect the backgrounds of Korean medical students, as all had experienced e-learning in high school as an important component of preparation for college entrance exams. Thus, it seems that student levels of e-learning readiness in the present study did not significantly impact their perceptions of e-learning. Therefore, our findings indicate that e-learning readiness does not significantly impact perceived learning effectiveness among those familiar with this type of learning.
We acknowledge several limitations of the present study. First, it was conducted on a relatively small number of students recruited at a single institution. Second, the participants had been exposed to extensive technology use and already had experience with e-learning; accordingly, we did not measure computer/Internet self-efficacy among our study participants, although it was included in Hung's [27] original scale of learner readiness for e-learning. Thus, our findings may not be generalizable to students with limited experience of e-learning or computer usage. A larger-scale study is therefore warranted to investigate the generalizability of our findings and to determine the effectiveness and impacts of e-learning among students with different levels of exposure to e-learning and computer usage.