Peer Review History
Original Submission: October 3, 2019
PONE-D-19-26633
Error rates of human reviewers during abstract screening in systematic reviews
PLOS ONE

Dear Dr. Wang,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE's publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

==============================

This is an interesting study. However, as pointed out by two reviewers, there are some issues that need to be addressed, in particular regarding the methodology. Since the work is quite unique, more explanation is needed in the Background (see Reviewer #1) as well as in the Methods (see Reviewers #1 and #3).

==============================

We would appreciate receiving your revised manuscript by Jan 26, 2020, 11:59 PM. When you are ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

To enhance the reproducibility of your results, we recommend that, if applicable, you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

Please include the following items when submitting your revised manuscript:
Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

We look forward to receiving your revised manuscript.

Kind regards,

Sompop Bencharit, DDS, MS, PhD, FACP
Academic Editor
PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at http://www.journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and http://www.journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

1. Please ensure that you include a title page within your main document. You should list all authors and all affiliations as per our author instructions and clearly indicate the corresponding author.

2. Please upload a copy of the figure to which you refer in your text on page 3. If the figure is no longer to be included as part of the submission, please remove all reference to it within the text.

3. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #3: No

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: I Don't Know
Reviewer #3: No

**********

3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data (e.g., participant privacy or use of data from a third party), those must be specified.

Reviewer #1: No
Reviewer #2: Yes
Reviewer #3: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #3: Yes

**********
5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters.)

Reviewer #1: Thank you for this study, which contributes to our knowledge of the process of conducting systematic reviews.

Data Availability: The list of systematic reviews examined is provided; however, there are no data regarding the full sets of references obtained from the original searches of the examined systematic reviews, nor the inclusion/exclusion decisions of each reviewer for each systematic review. Therefore, readers would not be able to replicate this analysis with the data currently provided.

Statistical Analysis: The descriptive statistics were calculated correctly.

Background: You mention that systematic reviews employ suboptimal methodological approaches and note the potential for human errors; however, you do not acknowledge that automated systems and algorithms could introduce errors of their own, potentially systematic errors that could introduce bias into a systematic review, such as that found in this study: https://science.sciencemag.org/content/366/6464/447

Methods: Consider explaining/justifying eligibility criterion #3, the use of a single software program, DistillerSR. Were reviews excluded because they used a different software program? For eligibility criterion #2, did citation inclusion by one reviewer prompt automatic inclusion only at the abstract screening level (vs. full-text screening)? Under what circumstances were disagreements between reviewers reconciled via consensus or arbitration?

Results: Consider breaking out the error rate into errors of exclusion and errors of inclusion, as these may be different and of interest to readers; this would also allow you to provide rates of specificity and sensitivity, typical metrics for classification problems, as you mention in the Background section and as was done in the study by Bannach-Brown et al. referenced below. Further, I would like to see a breakdown of errors from the abstract screening vs. the full-text screening (and within this stratification, report exclusion vs. inclusion errors). This is important and interesting because, as you note in the limitations, the errors of inclusion at the abstract screening level reflect the fact that more information is needed to make a decision, rather than commission of an error by the reviewer.

Limitations: You note that "In our practice, citations from abstract screening were automatically included when conflicts between two independent reviewers emerged." Perhaps note that this may have resulted in a spuriously lower error rate than the "truth."

Consider referencing the following article in the background or implications, as this study calculated error rates from a machine learning screening algorithm: Bannach-Brown, A., Przybyła, P., Thomas, J., Rice, A. S., Ananiadou, S., Liao, J., & Macleod, M. R. (2019). Machine learning algorithms for systematic review: reducing workload in a preclinical review of animal studies and reducing human screening error. Systematic Reviews, 8(1), 23.
Limitations, or perhaps Conclusions: As this is a novel study, consider contextualizing the findings as an initial calculation of human errors observed in a small number of systematic reviews, and discussing the need for replication of this study using reviews conducted by other EPCs or non-EPC institutions, using different software, and, most importantly, with a larger sample of studies, to inform our knowledge of valid human error rates.

Typo: In the 3rd paragraph of the Background section, "In recent year" should be "In recent years".

Reviewer #2: Dear authors, thank you for investigating this question. It's of interest to me. I agree that the sample is small, and I wondered how meaningful the 10% error rate is. It's expected that human reviewers will make mistakes, and if each reviewer errs on 10% of decisions, then... what? That's one reason why there are two reviewers. However, the use of that number as a benchmark to test automated systems puts it in an interesting context. If the automated process is as effective as a human reviewer, then we can save significant human time by eliminating one of the reviewers. But, again, the sample is small, and I imagine the type of question can influence the error rate; some screening questions might be more conducive to human review or to automation. In other words, a 10% error rate for one question may vary drastically for another. Some of this is touched on in the 'implications' section, which I'd like to see expanded, but I understand that such discussion can easily go beyond the study. Overall, the small sample, the topic variability in the sample, and the variability of the screening questions in each of the reviews in the sample lead me to question the significance of the 10%. And that makes me question its utility as a benchmark. But it's a thought-provoking topic, and the authors use an interesting method to address it (Distiller data). Thank you, by the way, for including the complete list of systematic reviews included in the analysis. This helps with reproducibility.

Reviewer #3: Dear Authors, I have attached a revised version of your manuscript and a PDF file including my specific comments. My major concerns are related to the methodology you followed in your study. Please check both. Best regards.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose "no", your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: Yes: Mark MacEachern
Reviewer #3: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files to be viewed.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, log in and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool.
If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org. Please note that Supporting Information files do not need this step.
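An aside on Reviewer #1's Results suggestion: splitting the overall error rate into errors of exclusion and errors of inclusion is exactly what makes sensitivity and specificity computable. Below is a minimal sketch of that calculation; the counts are entirely hypothetical, and the function name is ours, since the study's real numbers would have to come from its DistillerSR records.

```python
# Illustrative only: sensitivity and specificity for abstract screening,
# per Reviewer #1's suggestion. All counts below are hypothetical.

def screening_metrics(true_pos, false_neg, true_neg, false_pos):
    """Return (sensitivity, specificity) for one reviewer's decisions.

    true_pos:  relevant citations correctly included
    false_neg: errors of exclusion (relevant citations excluded)
    true_neg:  irrelevant citations correctly excluded
    false_pos: errors of inclusion (irrelevant citations included)
    """
    sensitivity = true_pos / (true_pos + false_neg)  # share of relevant citations caught
    specificity = true_neg / (true_neg + false_pos)  # share of irrelevant citations rejected
    return sensitivity, specificity

# Hypothetical example: 90 of 100 relevant citations included,
# 850 of 900 irrelevant citations excluded.
sens, spec = screening_metrics(true_pos=90, false_neg=10, true_neg=850, false_pos=50)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")  # 0.90, 0.94
```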
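Similarly, Reviewer #2's remark that mistakes are "one reason why there are two reviewers" has a simple arithmetic reading: a relevant citation is lost only when both reviewers exclude it. A back-of-envelope sketch follows; the independence assumption is ours, not the study's, and correlated reviewer errors would push the dual rate higher.

```python
# Back-of-envelope arithmetic behind Reviewer #2's point about dual screening.
# Assumption (ours): the two reviewers' errors are statistically independent.

p = 0.10          # per-reviewer error rate discussed in the review
single = p        # chance a lone reviewer wrongly excludes a relevant citation
dual = p ** 2     # chance both reviewers wrongly exclude it
print(f"single-reviewer miss rate: {single:.1%}")  # 10.0%
print(f"dual-review miss rate:     {dual:.1%}")    # 1.0%
```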
Revision 1
Error rates of human reviewers during abstract screening in systematic reviews
PONE-D-19-26633R1

Dear Dr. Wang,

We are pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it complies with all outstanding technical requirements. Within one week, you will receive an e-mail containing information on the amendments required prior to publication. When all required modifications have been addressed, you will receive a formal acceptance letter and your manuscript will proceed to our production department and be scheduled for publication.

Shortly after the formal acceptance letter is sent, an invoice for payment will follow. To ensure an efficient production and billing process, please log into Editorial Manager at https://www.editorialmanager.com/pone/, click the "Update My Information" link at the top of the page, and update your user information. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to enable them to help maximize its impact. If they will be preparing press materials for this manuscript, you must inform our press team as soon as possible and no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

With kind regards,

Sompop Bencharit, DDS, MS, PhD, FACP
Academic Editor
PLOS ONE

Additional Editor Comments (optional): The authors have sufficiently addressed all comments from the reviewers.
Formally Accepted
PONE-D-19-26633R1
Error rates of human reviewers during abstract screening in systematic reviews

Dear Dr. Wang:

I am pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please notify them about your upcoming paper at this point, to enable them to help maximize its impact. If they will be preparing press materials for this manuscript, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

For any other questions or concerns, please email plosone@plos.org.

Thank you for submitting your work to PLOS ONE.

With kind regards,

PLOS ONE Editorial Office Staff
on behalf of Dr. Sompop Bencharit
Academic Editor
PLOS ONE
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.