Peer Review History
Original Submission: February 22, 2021
Transfer Alert
This paper was transferred from another journal. As a result, its full editorial history (including decision letters, peer reviews and author responses) may not be present.
PONE-D-21-05904
Medical ethical review of COVID-19 research in the Netherlands; a mixed-method evaluation among Medical Research Ethics Committees and investigators
PLOS ONE

Dear Dr. IJkema,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by May 23, 2021, 11:59 PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Prof. Ritesh G. Menezes, M.B.B.S., M.D., Diplomate N.B.
Academic Editor
PLOS ONE

Journal Requirements: When submitting your revision, we need you to address these additional requirements.
https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and
Furthermore, please provide additional information regarding the questionnaire development process, including the theories or frameworks which were employed. When reporting the results of qualitative research, we suggest consulting the COREQ guidelines: http://intqhc.oxfordjournals.org/content/19/6/349. In this case, please consider including more information on the number of interviewers, their training and characteristics; and please provide the interview guide used.

Please provide additional details regarding participant consent. In the ethics statement in the Methods and online submission information, please ensure that you have specified (1) whether consent was suitably informed and (2) what type you obtained (for instance, written or verbal). If your study included minors under age 18, state whether you obtained consent from parents or guardians. If the need for consent was waived by the ethics committee, please include this information.
In your revised cover letter, please address the following prompts:

3a) If there are ethical or legal restrictions on sharing a de-identified data set, please explain them in detail (e.g., data contain potentially identifying or sensitive patient information) and who has imposed them (e.g., an ethics committee). Please also provide contact information for a data access committee, ethics committee, or other institutional body to which data requests may be sent.

3b) If there are no restrictions, please upload the minimal anonymized data set necessary to replicate your study findings as either Supporting Information files or to a stable, public repository and provide us with the relevant URLs, DOIs, or accession numbers. Please see http://www.bmj.com/content/340/bmj.c181.long for guidelines on how to de-identify and prepare clinical data for publication. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories.

We will update your Data Availability statement on your behalf to reflect the information you provide.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly
Reviewer #2: Yes
Reviewer #3: Yes
Reviewer #4: Partly
Reviewer #5: Yes
Reviewer #6: No
Reviewer #7: Yes
Reviewer #8: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?
Reviewer #1: I Don't Know
Reviewer #2: Yes
Reviewer #3: I Don't Know
Reviewer #4: No
Reviewer #5: I Don't Know
Reviewer #6: No
Reviewer #7: I Don't Know
Reviewer #8: N/A

**********

3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #3: Yes
Reviewer #4: No
Reviewer #5: No
Reviewer #6: Yes
Reviewer #7: No
Reviewer #8: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #3: Yes
Reviewer #4: No
Reviewer #5: Yes
Reviewer #6: No
Reviewer #7: Yes
Reviewer #8: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: 1.
Box 1 states, "a so-called reasonable assessment period of 8 weeks (56 days) normally applies to medical and scientific research subject...." Comparing this to your data in Table 1, the median review time for regular research, 50.0 days (range 1-158), shows that the 56-day limit can be far exceeded. What is the cause?

2. I note in Table 1 that basically everything gets approved! Out of 471 submissions, only 2 were rejected. That is a rejection rate of 0.4% -- this is very, very low. Someone could interpret this as a low quality standard -- or are you sending these submissions back and forth over and over to the PI for corrections, until you give an approval?

3. Were revise-and-resubmit requests less or more frequent with the fast-track process? How many cycles were there in each (regular and fast track)?

4. What about the issue of submissions returned to the PI as out of scope (not WMO)? This is known to be a problem in the Netherlands because it is very difficult to determine what is in scope for the WMO, as the definition is vague and there is the ambiguity of biopsych/psych research. This would apply to COVID as well due to the mental health and isolation issues associated with the topic. Did MRECs find more of this happening with the COVID submissions?

5. Line 170, are you saying that a digital signature is not permitted (even before COVID)? This seems unusual and burdensome.

6. There is no real explanation of what the fast-track process is -- does it mean that it gave priority to COVID research (over regular research), or was there an additional reviewing mechanism (a funnel) that took those submissions while there was no change to how regular research submissions were managed?

7. I don't see a formal measure of review quality. How was that actually measured?

8. What about the issue of the capability of the MREC members to review COVID research? Did you need external experts? How did you find them? Did this impact review time (looking for them)?
Or did you have the on-site/internal COVID expertise to review those submissions? This is important to consider in light of the problem with HRECs approving research that is of low quality/poor design. See, and perhaps reference, this paper: https://jme.bmj.com/content/46/12/803

Reviewer #2: This is a timely and well-written paper on a relevant topic (ethics reviews of COVID research). I have some minor suggestions for improvement:

Introduction: A bit more information is required about the functioning of the MRECs in the Netherlands (e.g. what are typical members' profiles), so that readers can compare these to the working of the MRECs in their own country.

p6: "As an explanatory follow-up the quantitative results were further explained with in-depth qualitative data" --> 'explained' is not really an appropriate term, as qualitative and quantitative research have different purposes; I would suggest changing it to 'shed more light on'.

p6: How many people were there in each focus group? Best to name exact numbers per group here as well. It would be good to provide a table with demographics of focus group participants as well.

Am I correct that the factual data, such as the review times, were asked for in the questionnaire? So these were based on self-reporting by the participants rather than on actual administrative data? Best to make this clear in the text.

p10: "Working remotely from home and using video conferences probably contributed to a rapid scheduling of review meetings." --> Is this what the focus group participants said, or is it your own interpretation? Make sure to distinguish this in the results of the qualitative research.

p13: "The little decline" --> the small decline

The discussion of the qualitative research is very concise. I feel that this would be helped if some quotes from the transcripts were provided to illustrate the themes.

At the end of the paper, there are recommendations in a table, which is fine.
However, I would expect a bit more elaboration of these recommendations: what would they practically entail, and how do they relate to your research findings? There is already something in the text preceding the table, but I feel this needs a bit more structure, and the recommendations themselves have to be made more visible in the text.

Reviewer #3: This paper presents research focused on the experience of researchers and MRECs during the beginning of the COVID-19 pandemic in 2020. The researchers used both administrative data on review length and novel data (both quantitative and qualitative) focused on the experience of reviewers and submitters to understand how the fast-track review process (FTRP) impacted COVID-19 research. Overall, the paper does a nice job showing the need for the FTRP, as well as the experience on both sides. The paper should include more technical information, such as more detail about the questionnaire process, any relevant demographics for respondents, and how the data were analyzed. There are some interesting points in the data that are not reflected in the writing.

Major issues:

1. Please provide more detail about the questionnaire and the semi-structured interview guide; it’s very unclear how they were designed, what was asked, how long they took, what the hypotheses were, etc.

2. Some of the more interesting findings are not focused on as much – investigators unanimously thought their research was either equal or better than usual, while reviewers thought it was equal or worse (33%!); investigators also seemed to express frustration at legal, tech, and participating centers for not moving fast enough. This pokes at a really interesting ethical issue – you have researchers not spending enough time on their own research plans, believing they are as good or better than their other research, and feeling frustrated that legal is concerned…combined with reviewers feeling pressured to approve quickly.
All of the recommendations are focused on getting approvals done even more quickly, with nothing focused on the ethics of rushing approvals for protocols that are not adequate.

3. A few references to patients and patient burden – the authors should make clear why patients were not contacted for the study; it would be interesting to at least comment on patient experience here, or discuss it as a limitation.

Minor issues:

1. 115-118: Provide response rates for individuals, not by MRECs. Provide details on recruitment, follow-ups, and compensation.

2. In Implementation Process – it’s confusing to jump between MRECs as entities and individual respondents as entities.

3. "In total, four semi-structured group interviews were conducted with 10 MREC representatives and 9 investigators" – mixed or by group?

4. A few spelling and grammar errors throughout – nothing major, just recommend a close read to make sure all errors are caught.

Reviewer #4: Summary of the manuscript: This paper aims to evaluate the fast-track review procedures (FTRP) set up by medical research ethics committees (MRECs) in the Netherlands during the COVID-19 pandemic in an effort to meet the urgent need for accelerated review of research proposals. It is an exploratory sequential mixed-method study that uses online questionnaires and in-depth interviews. The authors found that the total number of review days was shorter in the FTRP than in regular review procedures (RRP), but this did not seem to impact the quality of review. The main difficulties of the FTRP were the heavy workload for MREC members, the lack of accessibility/coordination between stakeholders, and the lack of “overview of COVID-19 research”. The authors end the paper with a table of recommendations for more efficient and less burdensome FTRPs in the future. Furthermore, in the discussion, the aim is re-phrased so that the goal of the study is to identify differences between the FTRP and RRP, rather than to evaluate the FTRP in a medical ethical context.
Overall, the paper succeeds in giving a description of the FTRP process used by MRECs during the COVID-19 pandemic, but not an evaluation or analysis.

Strengths: This paper is ambitious and its objective is very important, as 2020 has been a unique opportunity for examining medical ethical review processes. It also appears methodologically sound. The study received ethical approval and participants provided informed consent. The results are quite striking and thought-provoking, and there is a good, broad summary of the qualitative results.

Weaknesses: The results are straightforward but lacking in in-depth analysis and ethical reflection. It is also unclear what questions were asked in the questionnaire and interviews. The language of the paper is overall moderately difficult to follow, so the paper would benefit from thorough copyediting.

Major issues:

1) It is unclear what questions were asked in the online questionnaire (lines 84-85) and how the interviews were structured (e.g. the topics that were planned to be discussed) and coded (lines 99-105). It would be helpful if the authors could include a sample of the questionnaire and the coding, e.g. as part of an appendix/supplementary information.

2) The quantitative results section would benefit from a more rigorous statistical analysis, e.g. to identify patterns or trends, and/or differences between investigator findings and MREC findings. Currently, the results are stated without detailed explanation or comparison, e.g. lines 143-155.

3) The qualitative results section would benefit from a more detailed description of the interview findings. For example, line 169: explore the “leniency” further. Lines 179-183: How much impact did the delay of regular research have on the investigators? It would further be important to analyse the ethical implications of delaying appraisals of regular research, perhaps in the discussion section.

4) The discussion section seems to re-state the results.
It would benefit from an in-depth ethical analysis, e.g. the implications of “leniency” regarding required documents, or the increased need for human tissues (lines 211-212). The paper would also benefit from a discussion of the strengths/weaknesses of the study, e.g. sample size, the subjective nature of the data (like the results mentioned in lines 194-195), and the applicability of the findings.

5) The discussion section would benefit from a description of the recommendations before outlining them in a table, as well as giving examples and evaluating the benefits, costs, feasibility, and justification of the recommendations.

6) While some of the findings are endorsed by previous papers (line 260), the points in the discussion section need to be tied back into the existing literature to place the findings in a wider context, especially when analysing the ethical implications or providing recommendations.

7) Overall, the language, sentence structure, and (lack of) punctuation decrease the readability of this manuscript. This also makes some of the results unclear. For example, lines 210-211: the protocols did not have a “good overview of the availability of patients and possibilities in the hospital”?

Minor issues:

8) The tables are not very easy to read; they could benefit from more formatting.

Reviewer #5: This manuscript reports the results of a mixed-method study of the fast-track review procedures that medical research ethics committees (MRECs) in the Netherlands implemented during the COVID-19 pandemic in 2020. Overall, the research design is sound, the data reported support the conclusions, and the results are of interest to an international audience, as human research ethics committees worldwide have implemented similar fast-track review procedures throughout the pandemic. However, several issues need attention to bring the manuscript up to a standard suitable for an international journal and audience.
I have made many suggestions in the comments of the PDF manuscript file (attached). Notably, the authors should append to the Supporting Materials a completed COREQ checklist for qualitative research, as well as the survey questions and interview guide. In the manuscript they should provide an assessment of response/compliance rates and address some discrepancies in responses shown in Table 2. Substantively, while I concur with the authors' recommendations for improving the fast-track procedures of MRECs, it is unclear why they could not also apply to regular review processes, as it seems that this system is in need of improvement as well – 90 days for regular review seems excessive, and a national approach may help to reduce duplication. I’m unsure if an exceptional case can be made – why not just improve the whole review system while we are here? Finally, the manuscript contains some awkward phrasing throughout and should be thoroughly edited for an English-language international journal.

Reviewer #6:

Introduction

1. It is not clear why the information about the Dutch setting has been put in a box. I would suggest this material simply be described in the introduction section, and that Dutch names of legislation be excluded.

2. Although the literature on ethics review of research in pandemics may be limited, it would be helpful if the authors outlined the key findings of this research and WHO recommendations in more detail in the introduction section, including the few publications regarding ethics review in the current pandemic.

3. The aims and importance of the study are not clearly stated at the end of the introduction section.

Methods

4. The methods are currently described in insufficient detail. More details are needed regarding the development, contents, implementation, and analysis of the survey, and regarding the design, implementation, and analysis of the group interviews. It would be preferable if the authors followed reporting guidelines.

Results

5.
The quantitative results section is difficult to follow, and I would suggest that the authors revise it to improve clarity and readability. It would be helpful if the authors provided denominators and numerators, and a table with full results.

6. The authors should consider adding a table that provides an overview of the interview codes and subcodes identified.

Discussion

7. The discussion section should begin with a clear statement of the key findings of the study. Currently, there is too much repetition of study aims and results.

8. The discussion of the results in the context of existing literature is not sufficient, partly reflected by the article citing a total of 11 publications. Although there may be limited research focusing on ethics review of research in pandemics, there is certainly relevant research that the authors should draw on.

9. The practical implications of the research need to be discussed in more detail. Recommendations are listed in Table 3, but the authors should discuss these in the discussion section. As these recommendations are from the authors, the summary should also be in a box and not a table.

10. The limitations of the study are not described.

Reviewer #7: In this mixed-methods study, the authors evaluate how expedited research review in the Netherlands during the COVID-19 pandemic affected processes and perceptions from both investigators and review board members. This is one of the first studies of its kind to use mixed-methods techniques to study the implementation of expedited research review during COVID, and as such is likely to be of significant interest to the research community. The generalizability of this study is questionable given that nation-wide expedited review processes were rare in countries like the US, where research review is usually performed by single institutions. However, as these institutions likely had COVID fast-track policies, there are still lessons to be learned.
A few comments:

INTRODUCTION

1) Some further detail on the FTRP in the introduction could be helpful. Did the FTRP apply only to COVID-related research? Did it apply only to prospective studies, or to retrospective/observational studies as well? Furthermore, was other non-COVID research allowed to be submitted for review during the COVID period? Did these policies vary by institution or were they set country-wide?

METHODS

My biggest issues apply to the qualitative portion.

2) The authors should follow a published checklist, such as the COREQ checklist, for reporting their qualitative study.

3) The authors should describe how saturation of interview themes during the interview period was arrived at.

4) In the methods, the number of interview guide questions should be included, even though the guide itself is provided as a supplement.

5) A weakness in this paper is the lack of a cited theoretical model or foundation for the qualitative interviews. How were codes arrived at for the qualitative portion? Was a grounded theory approach, or another method, used?

6) To calculate review timelines, were institutions required to send all of their COVID reviews for analysis, or was there some sort of convenience sampling where institutions were allowed to choose which COVID reviews to send for analysis?

RESULTS

7) The manuscript would significantly benefit from the inclusion of direct quotes from interview transcripts to support some of the general points made. Alternatively, a table of representative quotes within each theme would be beneficial.

8) How was the number of 4 total interview groups arrived at? How many participants were in each group?

Reviewer #8: 1. I would first like to note the importance of this topic and the potential impact of this research. I especially respect the integration of qualitative data here, and I appreciate this research team’s identification and exploration of such an important issue.

2.
I’m concerned that the level of detail provided in the Methods section is insufficient to understand the data; in particular, expanding the description of the instrumentation would be helpful to understand the meaning of the data. E.g., how were these surveys/interview guides developed? Was there cognitive interviewing or pilot testing? Were they based on any validated instruments, literature, etc.? What data were collected via survey versus secondary analysis of CCMO records, and how were these integrated?

3. Further, additional explanation and description of the coding and thematic analysis would be helpful to understand the level of rigor involved in the methodology; e.g., did the coders measure any sort of inter-coder agreement? I do see that the coders “independently” applied the codebook to the transcripts, but am not clear on whether they each coded all 4 transcripts in their entirety. And were these a priori structural themes? Emergent thematic ones?

4. At several points throughout, there was some confusion regarding the extent to which statements were actual presentations of data, versus conjecture/observation/opinion of the authors; e.g., “More central management and collaboration between different research groups and MRECs could improve this.” [216-7] -- it is unclear if this is an actual finding emerging from the data or just an argument from the authors. There are several similar instances throughout.

5. The presentation of quantitative results as separate from qualitative results—and the descriptor of “mixed method study”—is confusing. According to the authors’ statement that “as an explanatory follow-up the quantitative results were further explained with in-depth qualitative data” [81-2], it seems more informative/appropriate to present these results in a cohesive, synthesized analysis.
Some qualitative researchers would argue that collecting follow-up qualitative data from participants to further inform their prior quantitative responses does not constitute a “mixed methods” study design. I don’t have a strong opinion on this, but I wanted to point this out so the authors can revise to clarify or preemptively defend against potential critiques/confusion.

6. I’m somewhat concerned about the framing of investigators being “satisfied” with the review process (e.g., [143-4])—investigator satisfaction is not the goal of any ethics review in any circumstances, and should not be messaged as a measure of quality, success, or efficiency.

7. I’m also struck by the reporting of quantitative/qualitative data regarding MREC reps’ quality assessment of reviews, insofar as an MREC rep could reasonably have patent or latent bias in that assessment; i.e., if standard review is the minimum threshold for ethics review, why would an MREC rep ever consider their review to be below the ethical standard, much less discuss that openly with other MREC reps? For the sake of clarity: theoretically, the MREC reps who reviewed these studies would *by definition* believe that their review was appropriate, otherwise they would not have done it that way. Further, given the method chosen here (group interviews), a rep who questioned the quality of a review would have little-to-no incentive to disclose that doubt amongst their peers. But perhaps I’m missing something here.

8. Overall, I worry that the authors may have limited the value/applicability/impact of their study by presenting what appears to be a cursory analysis, with the results section comprising little more than a recitation of (presumably structural) themes. Additional themes, further exploration, or just a more extensive analysis would be helpful to better support the study’s overall impact.

9.
The discussion of other review timelines seems out of place and uninformative considering the wide range of legal/regulatory requirements in oversight/review, additional systemic burdens from the pandemic, etc.

10. It would be helpful to provide some amount of qualitative data to make sense of the themes presented; without this, readers cannot know how salient a theme was, whether 2 or more themes came from the same groups/participants, etc.

11. “These changes did not affect the quality of the review. Moreover, the FTRP is associated with a high degree of satisfaction among both investigators as well as MREC members.” [252-3] Is this first statement demonstrably true based on the data? Or is it an argument by the authors? There doesn’t seem to have been any actual measurement of quality beyond the perceptions of those who were (i) being reviewed or (ii) doing the very review whose quality is in question.

12. The overall framing of the results and discussion is focused on investigator and MREC rep perspectives, with little-to-no acknowledgment of competing priorities (e.g., clinical care, allocation of limited resources); i.e., to recommend any action/policy without representation of additional stakeholders seems unwise at best.

13. Some attention is needed for minor grammatical and typographical errors.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.
Reviewer #1: No
Reviewer #2: No
Reviewer #3: No
Reviewer #4: No
Reviewer #5: No
Reviewer #6: No
Reviewer #7: No
Reviewer #8: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
Revision 1
Medical ethical review of COVID-19 research in the Netherlands; a mixed-method evaluation among Medical Research Ethics Committees and investigators
PONE-D-21-05904R1

Dear Dr. IJkema,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance.

To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Prof. Ritesh G. Menezes, M.B.B.S., M.D., Diplomate N.B.
Academic Editor
PLOS ONE

Additional Editor Comments:

- Title: Replace ''Medical ethical review'' with ''Ethical review''.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1.
If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed
Reviewer #2: All comments have been addressed
Reviewer #4: All comments have been addressed
Reviewer #6: All comments have been addressed
Reviewer #7: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #4: Yes
Reviewer #6: (No Response)
Reviewer #7: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #4: I Don't Know
Reviewer #6: (No Response)
Reviewer #7: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #4: Yes
Reviewer #6: (No Response)
Reviewer #7: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #4: Yes
Reviewer #6: (No Response)
Reviewer #7: Yes

**********

6. Review Comments to the Author Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: great work to satisfy 8 peer reviewers! best of luck with your continued research -- this is a very important topic.

Reviewer #2: I am satisfied with the changes that the authors have made, it is now ready for publication. Thank you.

Reviewer #4: My previous comments have been addressed. However the paper would benefit from more copy-editing to address minor issues.

Reviewer #6: (No Response)

Reviewer #7: The authors have sufficiently responded to my critiques. This paper is likely to be of significant relevance to the international research community.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: Yes: Kristien Hens
Reviewer #4: No
Reviewer #6: No
Reviewer #7: No
|
| Formally Accepted |
|
PONE-D-21-05904R1

Ethical review of COVID-19 research in the Netherlands; a mixed-method evaluation among Medical Research Ethics Committees and investigators

Dear Dr. IJkema:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of Prof. Dr. Ritesh G. Menezes
Academic Editor
PLOS ONE
|
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.