Peer Review History

Original Submission: November 20, 2020
Decision Letter - Jenny Wilkinson, Editor

PONE-D-20-36536

Prioritizing topics for developing e-learning resources in healthcare curricula: a comparison between students and educators

PLOS ONE

Dear Dr. Ng,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Jun 04 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Jenny Wilkinson, PhD

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

  1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2.  Thank you for including your ethics statement: 'The participants read through the participant information sheet and signed the online consent form if they agreed to participate. Subsequently, the participants filled up the demographic data and the main questionnaire. The study was approved by the respective medical research ethics committees (MECID No 2019225-7166, JKEUPM-2019-103).'

a. Please amend your current ethics statement to include the full names of the ethics committees/institutional review board(s) that approved your specific study.

b. Once you have amended this/these statement(s) in the Methods section of the manuscript, please add the same text to the “Ethics Statement” field of the submission form (via “Edit Submission”).

For additional information about PLOS ONE ethical requirements for human subjects research, please refer to http://journals.plos.org/plosone/s/submission-guidelines#loc-human-subjects-research

3. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.

Additional Editor Comments:

Thank you for your submission. The reviewer comments are attached for your information, and I now invite you to provide a response to them.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: N/A

Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The work aim is valid and can be easily replicated in other settings. Nevertheless, there are many confusing points of concern in the manuscript that need to be explained and commented on and, if found necessary to improve the presentation, to make appropriate changes to improve the presented work.

[1] Why in case of TU, 2 different courses (pharmacy and Biomed) were separated in methods and tables but combined only in results? Was there any difference in results figures? Was this a reason to have educators make completely different choices than students when combined results were presented?

[2] In table S2 (B) and figure 3, it is scientifically unacceptable that 25/educator and 205/student agree (independently) to value 16 topics by only 2 values out of the 5-point Likert scale (4 by 13/16 and 3 by 3/16)!! This has produced an odd "scatter" plot as shown in figure 3. What went wrong here?

[3] Results, line 152 reads: "topics were sorted in descending order (the most selected to least selected topics) separately for each of student and educator groups. The first quartile of most selected topics from the students and the first quartile of most selected topics from educators were included in Round 2 survey." OK, why was item #2 (scoring 52/educator and 55.1/student) included as a quartile in R2 and not item #23 (scoring 60/educator and 61/student); see table S2 (B), Supplementary documents. How can the values of 52 and 55.1 be included in a quartile from descending values while, in the same table, values of 60 and 61 (respectively) are not included in the same quartile?? No clue can be discovered.

[4] Need to comment on the difference in results from TU in comparison with UM and UPM by referring to the difference in nature of the lists of "topics". The lists used in UM and UPM cover a number of "competencies" while those for TU merely listed subjects relating to purely factual knowledge, as detailed in the supplemental materials (tables S2 A-D). Accordingly, one major result should emphasize that the nature of topics can affect the results. That was compounded by the fact that in the case of TU there was a large number of topics with a small number of participants, especially educators.

[5] There is a need to emphasize that the students' role in identifying their "learning needs", which can improve their outcomes and those of forthcoming batches, is more relevant than the role of their educators in identifying their "teaching needs", which can easily be subjective depending on each educator's interest and subspecialty.

[6] You need to check values, as in table 2: row 1 indicates that (n=) for UPM is 25 and for TU-pharmacy is 3, but the following rows indicate that either the figure in row 1 is reversed or the details in the following rows are reversed between the two universities.

[7] In figure 2, I suggest using different shapes for the plotted values instead of colours, as printouts are usually made in B&W, rendering the figure of less value.

[8] Check the text carefully; there are several sentences containing doubled words and misspellings, e.g. lines 214, 249.

[9] Results, lines #177 to 178: the sentence states that fig 2 is for students but fell short of mentioning educators also.

I suggest rewriting and resubmitting as a study of student learning needs assessment of previous students for future classes' RLO. The needs of the students can give a more valid list than a list of the educators' choices of importance of the competencies. Meanwhile, the study should cover competencies in UM and UPM only (exclude factual knowledge of TU).

Reviewer #2: Very interesting idea - Does the difference in student level at the institutions play a role here? I am a bit unclear on your final conclusion. Please consider clarifying the take-away for the reader and perhaps increasing discussion a bit. Overall, very nicely done.

**********

6. PLOS authors have the option to publish the peer review history of their article. If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Ghanim Alsheikh

Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Revision 1

Response to reviewers

Reviewer #1:

The work aim is valid and can be easily replicated in other settings. Nevertheless, there are many confusing points of concern in the manuscript that need to be explained and commented on and, if found necessary to improve the presentation, to make appropriate changes to improve the presented work.

[1] Why in case of TU, 2 different courses (pharmacy and Biomed) were separated in methods and tables but combined only in results? Was there any difference in results figures? Was this a reason to have educator choose completely different choices than students when combined results were presented?

Response:

Thank you for highlighting this discrepancy. Although it was our original intention to analyse the two courses separately, we did not include TU Biomed data in our final results because only one educator responded in Round 2, which made a comparison of responses between students and educators unattainable. Therefore, only TU Pharmacy results were reported for TU in Figure 2C and Figure 3C.

We had previously stated this under Methods, lines 175-176: "The comparison between students and educators was not done for TU (Biomed) because there was only one educator who responded in Round 2." We realised that this was not clear and have revised the manuscript accordingly, under Methods, 2nd paragraph, lines 110-114 (please refer to the manuscript with tracked changes for the line numbering):

"For TU, two courses (Pharmacy and Biomedical) participated in this modified Delphi survey. As the response rate from TU Biomedical was low, where only one educator responded in Round 2 (Round 1: 3 educators, 48 students; Round 2: 1 educator, 40 students), we decided to exclude TU Biomedical from the results and only reported the findings from TU Pharmacy (Supplementary Table S1)."

[2] In table S2 (B) and figure 3, it is scientifically unacceptable that 25/educator and 205/student agree (independently) to value 16 topics by only 2 values out of the 5-point Likert scale (4 by13/16 and 3 by 3/16)!! This has produced an odd "scatter" plot as shown in figure 3. What went wrong here?

Response:

Thank you for pointing out the error in Supplementary Table S2(B) and Figure 3; we sincerely apologise for the mistakes. The values appeared this way because they had been rounded to whole numbers instead of being presented to two decimal places, as was done for the other institutions.

We have restored the values to two decimal places and amended Table S2(B) and Figure 3 accordingly. The revised scatterplot shows the spread of the agreement. The revised values do not affect the interpretation or conclusions of this study.

[3] Results, line 152 reads: "topics were sorted in descending order (the most selected to least selected topics) separately for each of student and educator groups. The first quartile of most selected topics from the students and the first quartile of most selected topics from educators were included in Round 2 survey." OK, why item #2 (scoring 52/educator and 55.1/student) was included as a quartile in R2 and not item #23 (scoring 60/educator and 61/student); see (table S2 (B): Supplementary documents). How can the value of 52 and 55.1 be included in a quartile from descending values and in same table values of 60 and 61 (respectively) are not included in the same quartile?? No clue can be discovered.

Response:

Thank you for pointing out the discrepancy; we sincerely apologise for the mistake. For item #23, the score from educators (n=12) was 48% (not 60%, n=15). The score for #23 from students was correct, i.e. 61% (n=125). Therefore, #23 was not in the first quartile of the educator scores and hence was not included in Round 2. We have made the amendment in Supplementary Table S2(B).

[4] Need to comment on the difference in results from TU in comparison with UM and UPM by referring to difference in nature of the lists of "topics". The lists used in UM and UPM cover number of "competencies" while those for TU covered merely listed subjects relating to purely factual knowledge as detailed in the supplemental materials (tables S2 A-D). Accordingly, one major result should emphasise that nature of topics can affect the results. That was also superimposed by the fact that in case of TU there was a large number of topics with a small number of participants especially educators.

Response:

Thank you for the suggestions. We have added the following points under Discussion, 4th paragraph, line 260-265 as shown below:

“Another possible explanation for the differences in prioritisation patterns across universities is the nature of the topics. For instance, TU’s topics were mainly on basic science (knowledge-based) while those of UM and UPM focused on clinical competencies (e.g. prescription and communication). In addition, TU had a large number of topics for selection but only a small number of participants, especially educators. This again might contribute to the difference in the prioritisation process for TU.”

[5] There is a need to emphasise that students' role as a participant in identifying their "learning needs" which can improve their outcomes and those of forthcoming batches, are more relevant than role of their educators in identifying their "teaching needs" which can easily be subjective depending on each educator's interest and subspecialty.

Response:

We fully agree with the reviewer and have added these points, with supporting references, under Discussion, 6th paragraph, lines 278-398, to emphasise the relevance of students' role in identifying their own learning needs.

"Lack of knowledge and understanding among the faculty and educators could be a factor that contributes to an unmet expectation among the students. Hence, it is important to take the students' preference of topics into consideration when developing RLOs. Students have their own experiences what topics that are difficult to understand where RLO would be helpful; or what topics require more multimedia interaction to enhance their learning processes. While educators' opinions on the selection of topics for RLO development should be considered as they are the content and education experts, their preference and prioritisation may be influenced by their personal interest and perceptions of the students. Afshar et al. have highlighted how educator’s preferred teaching approach resulted in the loss of interest and reluctance in learning biochemistry among medical students, and how this can be addressed by taking into considerations of the students' learning needs (34). We would, therefore, like to argue that teaching approaches should prioritise students’ needs and preferences over those of the educators. In addition, students' need assessment should be the cornerstone in curriculum planning and development to allow educators in identifying topics, skills and knowledge that address learning needs (35). With needs assessment, learning becomes more relevant to their clinical practice, which is more likely to lead to a change in their practice (36)."

[6] You need to check values as in table 2, row 1 indicates that (n=) for UPM is 25 and for TU-pharmacy is 3 but the following rows indicate that either the figure in row 1 is reversed or details in following rows are reversed between the two universities.

Response:

We apologise for the mistake and have amended the values accordingly in Table 2.

[7] In figure 2, I suggest using different shapes for the plotted values instead of colours as usually printouts are made in BW rendering the figure to be of less value.

Response:

Thank you for your suggestion. We have changed the colours to different shapes and shading in Figure 2.

[8] Check text carefully, there are several sentences containing doubled words and misspellings e.g. lines 214, 249.

Response:

We have combed through the manuscript and corrected all doubled words and misspellings accordingly.

[9] Results, line #177 to 178, the sentence states that fig 2 is for students and fell short to mention for educators also.

Response:

We have now added the word "educators" in the sentence.

I Suggest rewriting and resubmitting as a study of student learning needs assessment of previous students for future classes' RLO. The needs of the students can give a more valid list than a list of educator's choices of importance of the competencies. Meanwhile, the study should cover competencies in UM and UPM only (exclude factual knowledge of TU).

Response:

Thank you for the suggestion to improve the manuscript.

Although we acknowledge that the nature of the topics (clinical competencies vs basic science knowledge) might affect the students' topic selections, this study was intended to be exploratory, comparing the learning needs of the students with those perceived by the educators across different health disciplines. We hope this will identify gaps and inform future research, particularly on whether and how the nature of topics (and other factors) affects students' learning needs.

We have addressed this limitation under the Discussion section, last paragraph, lines 331-333, as below:

"The nature of topics was different between institutions where UM's and UPM's topics were related to clinical competencies while TU Pham's topics were basic science knowledge. This might contribute to the discrepancy of results between institutions."

Reviewer #2:

Very interesting idea - Does the difference in student level at the institutions play a role here?

Response:

We agree with the reviewer that the difference in student level at the institutions might influence the results of this study. We have acknowledged this limitation under Discussion, last paragraph, lines 334-335, as below:

"The students in this study were at different stages of their undergraduate programmes; this might have affected their learning needs and experience. This factor was not further explored in this study."

I am a bit unclear on your final conclusion. Please consider clarifying the take-away for the reader and perhaps increasing discussion a bit. Overall, very nicely done.

Response:

We have refined the conclusion to clarify the take-away message for readers, under Conclusion, lines 338-344, as below:

"This study showed the variations of opinions in topic selection between students and educators across institutions and health topics. Further research is needed to explore the factors influencing the discrepancy in students’ learning needs from the perspective of the students and educators. This study also highlighted the importance of conducting students' learning needs assessment before developing eLearning resources for effective implementation. Learning needs assessment should be the starting point when designing eLearning resources for healthcare curricula."

Attachment
Submitted filename: Response to reviewer_PlosOne_31.5.2021.docx
Decision Letter - Jenny Wilkinson, Editor

Prioritising topics for developing e-learning resources in healthcare curricula: a comparison between students and educators using a modified Delphi survey.

PONE-D-20-36536R1

Dear Dr. Ng,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Jenny Wilkinson, PhD

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Thank you for your responses and manuscript revisions, these have satisfactorily addressed reviewer comments.

Reviewers' comments:

Formally Accepted
Acceptance Letter - Jenny Wilkinson, Editor

PONE-D-20-36536R1

Prioritising topics for developing e-learning resources in healthcare curricula: a comparison between students and educators using a modified Delphi survey.

Dear Dr. Ng:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr Jenny Wilkinson

Academic Editor

PLOS ONE

Open letter on the publication of peer review reports

PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.

We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.

Learn more at ASAPbio.