Peer Review History
Original Submission: January 14, 2021
Transfer Alert
This paper was transferred from another journal. As a result, its full editorial history (including decision letters, peer reviews and author responses) may not be present.
PONE-D-21-01387
Automated detection of superficial fungal infections from microscopic images through a regional convolutional neural network
PLOS ONE

Dear Dr. Jue,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Apr 22 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,
Mohd Nadhir Ab Wahab, Ph.D.
Academic Editor
PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

Additional Editor Comments:

Please address all the comments given by the reviewers before your manuscript can be considered for publication.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: No

**********

3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics.
(Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This paper presents a novel, deep learning-based approach for detecting fungal infections in microscopic images via the autodetection of long branch-like structures (called hyphae) appearing through KOH examination. The proposed method consists of both location determination of hyphae (object detection) and image classification (a positive or negative result for the presence of infection). The authors achieve promising results with rather high sensitivity and specificity, which are claimed to be higher than those of known experts.

In my view the paper provides a valid and valuable contribution towards more efficient detection of fungal infections, which is certainly not a trivial task. The results are good; nevertheless, one major limitation the authors did not dwell on is that the current results are based on images collected from a single imaging and sample preparation setup, so it is difficult to see whether the proposed model would achieve similarly strong results on a more general basis. In fact, there is a tell-tale sign of this problem in the paper: when the datasets with 40x and 100x magnification are mixed, there is a significant drop in precision and F1 scores. For this reason I would like the authors to discuss this issue further, particularly regarding what could be expected (or what modifications to their method are anticipated) if the deep-learning-based autodetection were applied to a more complete set of images originating from various imaging setups and settings, taking into account potential variability in sample preparation as well. For this very reason I feel that the authors need to be more cautious with the conclusion that the presented results demonstrate higher sensitivity and specificity than those achieved by manual detection of experts, since experts can clearly gather practice on a wide variety of setups with differing sample and image quality.

Other points that require clarification:

(1) The authors use IoU=0.25 as a reference value, while it is well known that an IoU of at least 0.5 is considered a good result in object detection. Therefore I contend that using IoU=0.25 needs more specific justification in the paper.

(2) Pg. 6, lines 140-141, it is written that: "After calculating the minimum, maximum, and average probabilities, we obtain the final probability for a given microscopic image." Here and throughout the subsequent evaluation of results it is not clear what 'final probability' means and how the authors arrive at it from the defined min, max, and avg probabilities. By the same token, the usage of these probabilities seems rather vague throughout the paper, and it does not stand out clearly what their purpose is. Could you please clarify?

(3) Pg. 7, lines 166-167: "The optimal setting of the Fungus hyphae database YOLO v4 model occurs when the IoU value is 0.25 and the detection probability threshold value is 0.244". How do you arrive at the optimal probability threshold of 0.244?

A minor note: please provide a proper table legend for Table 2 in which all abbreviated measures are spelled out in the text (e.g. TP and FP written as true and false positives, which are nowhere else referenced).
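Reviewer #1's point (1) concerns the IoU threshold used to decide whether a predicted box counts as a correct detection. The sketch below is a generic illustration, not code from the manuscript: the box coordinates are hypothetical, and it simply shows how intersection-over-union is computed for axis-aligned boxes and how a loosely localized detection can pass an IoU threshold of 0.25 while failing the more conventional 0.5.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Hypothetical ground-truth and predicted boxes for one elongated hypha.
ground_truth = (100, 100, 300, 180)
prediction = (180, 110, 360, 190)

score = iou(ground_truth, prediction)
print(f"IoU = {score:.2f}")                              # about 0.38 for these boxes
print("counted as TP at IoU >= 0.25:", score >= 0.25)    # True
print("counted as TP at IoU >= 0.50:", score >= 0.50)    # False
```

Because hyphae are thin and elongated, even small localization offsets reduce box overlap sharply, which is one plausible line of justification the reviewer may be asking the authors to spell out.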
Reviewer #2: The authors have recorded a data set of microscopic images for a clinically relevant application case, showing that hyphae detection can be performed well enough for practical purposes using a state-of-the-art object detection network architecture. While there is no methodological advance, I would argue that a well-executed application, along with an annotated data set (which is interesting per se for the image analysis community), merits publication. I have, however, the following comments/requests for revision:

- You mention how many images are contained in the data set, but not how many different skin samples (each of which probably leads to several images). This needs to be clarified. If training and test data contain images taken from the same sample, we would overestimate the performance. For a real application, the model would need to generalize to completely new samples. Also, you write that samples from both skin and nails were used. Are both distributed equally among training and test data?

- YOLO is a standard object detection network, but far from the only network suitable for this task. It would clearly strengthen the evaluation if you could report the performance of another method, e.g. a two-stage detector such as Mask R-CNN, as a reference.

- "We also created dataset-N (100, 40, all), which included microscopic images without dermatophyte hyphae, for testing." Does this mean that the negative cases available for 40 and 100 are only used during training and are not part of the test data?

- "All dataset": Pooling images from different magnifications is not a practically relevant scenario. In an automated system, all images would be recorded at the same magnification (or, if different magnifications are supported, one would have separate training sets). I would rather analyze the effect of including/excluding negative cases that will likely occur in a real application.

- The images in Fig. 3 differ a lot in their appearance/color. Can you discuss this and maybe provide a few more example images in a supplementary figure? Can you show that positive/negative cases do not differ with respect to their background color, but only with respect to the existence of the target objects?

- Do all hyphae at x100 magnification look like those in Fig. 3a, and do all hyphae at x40 look like those in Fig. 3b? Or is there more variability within x100 and x40, respectively? Again, showing more example images would help.

- Also, does Fig. 3 show full-resolution images, or can you show them at a higher resolution? The hyphae in Fig. 3b are really hard to see, and some are actually covered by the labels/boxes: can this be changed?

- Can you clarify how the data was recorded? What does it mean that "the images were recorded by rotating the screen at a 360 degrees angle where the hyphae were observed"? Why do you record a video instead of still images?

- You claim, already in the abstract, that the network performed better than human experts. In the discussion it then becomes clear that this statement is based on a comparison to values reported in a different context by Jacob et al. Is this really comparable? It is ok to mention their results as a reference, but I would be careful making strong claims in the abstract that are then not fully supported by the results in the paper. In order to make such a claim, you would need an independent ground truth for your data, which would then serve to judge both the network and the human experts.
- The annotated image data set is a valuable contribution of this work and will be of interest for the image analysis community (target objects in front of complex background; negative examples also provided, which is rare). If you plan to publish images and annotations along with the paper (as indicated in the data availability section), I would mention this in the paper, referring to the respective supplementary file/web link.

- Minor points:
- reference [14] contains two papers and should be split
- Fig 4: "contusion matrix": ouch, better write "confusion matrix"

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
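Reviewer #2's first request, that images derived from the same physical sample must not be split between training and test data, can be enforced with a group-aware split. The following sketch uses scikit-learn's GroupShuffleSplit; the image records, labels, and sample identifiers are made up for illustration and are not the authors' actual data or pipeline.

```python
from sklearn.model_selection import GroupShuffleSplit

# Hypothetical records: (image_id, label, sample_id). Several images can
# originate from the same physical KOH preparation (skin or nail sample).
images = [
    ("img_001", 1, "sample_A"), ("img_002", 1, "sample_A"),
    ("img_003", 0, "sample_B"), ("img_004", 1, "sample_C"),
    ("img_005", 0, "sample_C"), ("img_006", 0, "sample_D"),
    ("img_007", 1, "sample_E"), ("img_008", 0, "sample_F"),
]
labels = [label for _, label, _ in images]
groups = [sample for _, _, sample in images]

# Roughly 6:4 split at the sample level: every image of a given sample
# ends up entirely in the training set or entirely in the test set.
splitter = GroupShuffleSplit(n_splits=1, test_size=0.4, random_state=42)
train_idx, test_idx = next(splitter.split(images, labels, groups=groups))

train_samples = {groups[i] for i in train_idx}
test_samples = {groups[i] for i in test_idx}
assert train_samples.isdisjoint(test_samples)  # no sample appears on both sides
print("train images:", [images[i][0] for i in train_idx])
print("test images: ", [images[i][0] for i in test_idx])
```

If the split should additionally be balanced across skin and nail samples, as the reviewer asks, StratifiedGroupKFold in recent scikit-learn releases combines group integrity with stratification.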
Revision 1
PONE-D-21-01387R1
Automated detection of superficial fungal infections from microscopic images through a regional convolutional neural network
PLOS ONE

Dear Dr. Jue,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Jun 19 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Mohd Nadhir Ab Wahab, Ph.D.
Academic Editor
PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Additional Editor Comments (if provided):

Please address all the comments given by the reviewers.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed
Reviewer #2: (No Response)

**********

2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: N/A

**********

4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.
Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The authors have definitely improved their paper with this revision; the issues raised regarding the description of the applied methods have been well addressed and clarified. The only question I still find somewhat ambiguous, and for which some claims remain unwarranted in my view, is the performance of the authors' model in terms of (1) its comparability to that of dermatologists in real-world practice and (2) whether it generalizes to imaging setups other than the specific one used in this study. The authors themselves acknowledge in their reply to the reviewers' comments that, while one could expect their model to retain good performance in these respects, there is no hard evidence for it at this stage. Therefore, a cautious approach to drawing conclusions in that regard should be reflected throughout the paper.

Regarding (1), the authors have already softened their claim in the Abstract ("The performance of our model had high sensitivity and specificity, indicating that hyphae can be detected with reliable accuracy."). This, however, is in contrast to what is written in the Discussion, lines 256-257: "The performance of our model is higher than the sensitivity and specificity of the known experts, indicating that it is possible to detect hyphae with reliable accuracy. Accordingly, diagnosis can be made more efficiently and in a more straightforward manner." I would suggest that these parts conform to what is written in the Abstract.

As for (2), I consider it necessary to insert a few sentences in the Discussion making it clear that, although the applied methodology certainly demonstrates very promising performance, it has only been tested with a single imaging setup, and future work requires more thorough testing with a bigger dataset from more variable imaging settings in order to establish how reliable the model's performance is and how it compares to that of real-world dermatologists in such a broader context.

Reviewer #2: The authors have addressed my comments and now provide additional information, figures and access to the data. There are still a few smaller issues that can be resolved by a minor revision:

- Thanks for explaining that you are developing "an automatic hyphae detection system that can be utilized in the field". I would actually mention this in the abstract and make it more clear in the introduction. This would help to introduce your application case to the reader, and it also motivates your decision to prefer a fast network over a potentially more accurate one. Fig. 1 illustrates the training process with a heavy multi-GPU machine. Maybe something like the image you have appended at the end of the reviewer PDF could be used to illustrate that the final system with the trained model should require only a smaller mobile device?

- I would move the new paragraph between lines 93 and 99 to the section "dataset generation". You could also include the information about samples and skin/nail in Table 1.

- "We split the labeled dataset into training and test sets at a 6:4 ratio." I understand this was stratified by sample and skin/nail. Apart from that, was the 6:4 split random?

- Regarding the sentence "the images were recorded by rotating the screen at a 360 degrees angle where the hyphae were observed": with your explanations, I now see what you mean. But the sentence is hard to understand and could be improved: do you rotate a screen or actually a camera? Rotating by 360 degrees would mean no rotation?

- It would be helpful to also show an example with ground truth annotations. There are some dubious cases where the non-expert reader wonders whether some faint structures are true negatives or actually hyphae that were missed.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
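Reviewer #1's earlier points (2) and (3) and the sensitivity/specificity discussion above hinge on how per-box detection confidences become a per-image positive/negative call. The sketch below is an assumption about how such a step could work, not the authors' published code: the per-box confidences are invented, the maximum is arbitrarily chosen as the image-level 'final probability', and the manuscript's threshold of 0.244 is reused purely as an illustrative number.

```python
def image_score(box_confidences):
    """Summarize per-box detection confidences for one microscopic image.

    Which statistic acts as the 'final probability' is an assumption here; the
    maximum is a common choice (an image is positive if any box is confident).
    """
    if not box_confidences:
        return {"min": 0.0, "max": 0.0, "avg": 0.0}
    return {
        "min": min(box_confidences),
        "max": max(box_confidences),
        "avg": sum(box_confidences) / len(box_confidences),
    }

def classify_image(box_confidences, threshold=0.244):
    """Image-level decision: positive if the aggregated score clears the threshold."""
    return image_score(box_confidences)["max"] >= threshold

# Invented detector outputs for a small test set: (box confidences, true label).
test_images = [
    ([0.91, 0.40], 1), ([0.30], 1), ([], 0),
    ([0.18], 0), ([0.75], 1), ([0.26], 0),
]

tp = fp = tn = fn = 0
for confidences, label in test_images:
    predicted = classify_image(confidences)
    if predicted and label:
        tp += 1
    elif predicted and not label:
        fp += 1
    elif not predicted and label:
        fn += 1
    else:
        tn += 1

sensitivity = tp / (tp + fn)  # true positive rate over positive images
specificity = tn / (tn + fp)  # true negative rate over negative images
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```

Sweeping the image-level threshold over the test scores and picking the value that maximizes a criterion such as Youden's J (sensitivity + specificity - 1) is one standard way a specific threshold like 0.244 could be selected; making that procedure explicit would answer the reviewer's question directly.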
Revision 2
Automated detection of superficial fungal infections from microscopic images through a regional convolutional neural network
PONE-D-21-01387R2

Dear Dr. Jue,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance.

To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Mohd Nadhir Ab Wahab, Ph.D.
Academic Editor
PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed
Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: N/A

**********

4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?
PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The authors have addressed all of the concerns raised and I find their answers and modifications to the manuscript satisfactory. The paper in its current form is acceptable for publication in my view.

Reviewer #2: (No Response)

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No
Formally Accepted
PONE-D-21-01387R2
Automated detection of superficial fungal infections from microscopic images through a regional convolutional neural network

Dear Dr. Jue:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of
Dr. Mohd Nadhir Ab Wahab
Academic Editor
PLOS ONE
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.