Peer Review History
Original Submission (May 19, 2020)
Pécs, Hungary
June 22, 2020

PONE-D-20-14769
OrganoidTracker: efficient cell tracking using machine learning and manual error correction
PLOS ONE

Dear Dr. Kok,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised by the Reviewers, listed below.

Please submit your revised manuscript by Aug 07 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,
Joseph Najbauer, Ph.D.
Academic Editor
PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?
The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Kok et al. describe a method based on convolutional neural networks (CNNs) for the segmentation of cell nuclei, applying it to 3D organoid data. The problem is in general a challenging one, and the authors' tools will likely be useful. The paper is very clearly written, in terms of both text and figures. The methods used are not particularly novel, but this is not a criterion for PLOS ONE, which I fully support. Moreover, the methods are quite recent in their origin, so having additional examples of their use in the literature is valuable.
The one major flaw of the paper, however, is that it does not adequately survey past work, or explain similarities and differences between this approach and other approaches. It is very hard for the reader to get a sense of whether they should adopt these methods without a comparison. More specifically:

- The authors note that cell trackers exist [2, 8, 9], but this only gets one sentence. Ref 8 compares several methods: how do these differ from the authors'? Ref 8 also provides 'test data,' I believe -- how do the authors' methods perform when applied to this data?
- The authors dismiss Ref. 10 (Amat 2014) with "For example, for tracking cells in fruit flies, zebrafishes and mouse embryos there are existing software packages" -- I fail to see how the nature of the organism matters. Is there something different about the *images* that makes these tools fail for organoid images? The images in Amat et al.'s paper are, in fact, 3D images of cell nuclei -- the same thing being imaged here.
- Some of the papers [11-17] seem to be about similar sorts of images to the authors'. What is different about either the images, the algorithm performance on similar data, or other features?
- There are many recent papers that are likely relevant, but not discussed; the authors should describe several of these, and compare with at least a few if these also involve similar microscopies and cell types.
There are for example OrgaQuant: Human Intestinal Organoid Localization and Quantification Using Deep Convolutional Neural Networks (Kassis et al. 2019, https://www.nature.com/articles/s41598-019-48874-y); Nucleus segmentation across imaging experiments: the 2018 Data Science Bowl (Caicedo 2019, https://www.nature.com/articles/s41592-019-0612-7.pdf?draft=collection); Volumetric Segmentation of Cell Cycle Markers in Confocal Image (Khan 2019, https://www.biorxiv.org/content/10.1101/707257v1.full.pdf); DeepSynth: Three-dimensional nuclear segmentation of biological images using neural networks trained with synthetic data (Dunn et al. 2019).

Reviewer #2: This is a very nice, straightforward paper. Rutger et al. propose here a method to detect, "segment" and track nuclei in time-lapse videos displaying the growth of intestinal organoids. The detection is based on a convolutional network that is trained to produce small bright spots near nuclei centres; nuclei are then fitted with Gaussians to estimate their volumes (a precise boundary is not extracted nor required). Then, a published solver is used to establish the tracking links as a solution to a set of straightforward rules. Finally, a set of rules is considered to alert a user to potentially spurious (rule-violating) tracking loci and allow them to be curated. The method is described nicely, and I think I could reproduce it; besides, it is publicly available on GitHub. The paper reads very well: the problem is clearly given, well motivated, well explained and evaluated. I could only find a few typos (please see below).

I have, unfortunately, a major concern about the scientific novelty of the paper. Pardon me for saying so: it "only" describes a success story -- an application of a carefully tuned pipeline of otherwise known steps applied to one type of data. While I appreciate (and see very well) the non-trivial amount of work beneath it all, the nice software and the manuals for it, what new lesson can we learn after reading it?
There is, for instance, no comparison against any other state-of-the-art method. Also, since manual curation is an acknowledged part of the method, I was missing an evaluation of how efficient the quality-checking rules are after the tracking. Do they emit a lot of false positives (requests to check what is already good)? Do they have false negatives (fail to report what should have been reported)? How much does the result improve after the curation? Could the method/SW be compared against, e.g., TrackMate or MaMuT, which both also do some cell detection and tracking and offer GUI curation?

I don't want to kill the paper. I'm therefore trying to propose relevant topics that could, in my opinion, enrich the study:

Regarding the evaluation of the detection and tracking, and since the authors claim the availability of an export module to the Cell Tracking Challenge data format, could the to-some-extent standard measures DET and TRA also be used to evaluate the tracking? This could illustrate the difficulty of the problem in the context of other cell tracking tasks.

I think the strength of the paper is not necessarily the tracking itself; it is more the environment around it, the SW: its ease of use, its flexibility, how to train the detector for other data, how to obtain some initial annotation (can the proposed SW do it at all?), whether one can re-track only a subset of the data, or just one lineage tree (starting from one cell, the daughter cells may end up around each other or in some symmetry pattern, and anything different is likely a reason for inspection of the tracking?), how easily one can manipulate the created lineage, etc. There was, I believe, a passage about re-training the network after a first round of detections -- does it help at all?

I have also tried, some months back, to implement rules describing what a good track is and what would allow me to point at problematic tracks.
I've tried looking for a sudden change of trajectory direction, or a sudden change in the displacement distance of a cell between consecutive time points -- none of that worked well (lots of false positives) for my embryonic developmental data. The best test, still not ideal, for me was to detect how many different n-nearest neighbors there are between consecutive time points. A comparison of similar rules, or a proposal of some good options, would indeed be a novelty... I am not aware of literature about precisely this. Could it be formulated for neural networks, having them detect anomalies?

What I also found an interesting aspect of the work is the proposed solution for evaluating the performance of a method on only partially annotated real data. Immediately a question arises: to what extent does the choice of the annotation subset influence the obtained performance level? In other words, how well does the obtained performance represent the performance measured against a full annotation? The authors have 6 frames with full annotation; perhaps that could be used as a basis for such an experiment.

Minor typos:

- Fig 2: "corrected data _that_ can be"
- L(ine)77: "ground-truth data, which _is_ done using"
- L128: "apply a a 3D"
- L193: spacing between words
- Fig 4D: an intensity profile of the data and of the fit would be worthwhile to appreciate the accuracy of the fit

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.
Reviewer #1: No
Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
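Reviewer #2's neighbourhood-consistency idea above (flagging a tracking link when many of a cell's n nearest neighbours change identity between consecutive time points) could be sketched roughly as below. This is a hypothetical illustration of the reviewer's suggestion only; the function names and data layout are invented and are not code from OrganoidTracker or any other package.

```python
# Hypothetical sketch of Reviewer #2's suggested quality rule: flag a
# tracking link when many of a cell's n nearest neighbours change
# identity between consecutive time points.
import math

def nearest_neighbours(positions, cell, n):
    """Ids of the n nearest neighbours of `cell` in one frame.
    `positions` is a dict {cell_id: (x, y, z)} in physical units."""
    centre = positions[cell]
    others = [(math.dist(centre, p), cid)
              for cid, p in positions.items() if cid != cell]
    return {cid for _, cid in sorted(others)[:n]}

def neighbour_turnover(frame_t, frame_t1, links, cell, n=4):
    """Fraction of `cell`'s n nearest neighbours (frame t) whose
    identities differ in frame t+1. `links` maps cell ids in frame t
    to cell ids in frame t+1 (the tracking result)."""
    nb_t = nearest_neighbours(frame_t, cell, n)
    nb_t1 = nearest_neighbours(frame_t1, links[cell], n)
    # map frame-t neighbours forward through the links before comparing
    nb_t_mapped = {links[c] for c in nb_t if c in links}
    return 1.0 - len(nb_t_mapped & nb_t1) / n
```

A curation tool could sort links by this turnover score and present the highest-scoring ones first; as the reviewer notes for related displacement-based rules, any fixed threshold would likely need per-dataset tuning to keep false positives manageable.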
Revision 1
Pécs, Hungary
September 11, 2020

PONE-D-20-14769R1
OrganoidTracker: efficient cell tracking using machine learning and manual error correction
PLOS ONE

Dear Dr. Kok,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised by Reviewer #2, listed below.

Please submit your revised manuscript by Oct 26 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,
Joseph Najbauer, Ph.D.
Academic Editor
PLOS ONE

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed
Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?
The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The authors have done a good job of clarifying the relationship between this work and past work, as well as describing the particular challenges of organoid images. I wish the discussion in the text had the level of detail of the "response to reviewers," though I realize that this would perhaps have been distracting. I support the publication of the paper.

Reviewer #2: The authors have addressed all questions, issues and suggestions of this reviewer in their revised manuscript (and I thank you for that). The manuscript is now even more clear. Well done.
The changed parts align well within the manuscript. If I were to suggest a final set of changes, I would ask for the following minor improvements or clarifications:

Regarding the requested comparison to TrackMate and MaMuT: I thank the authors for doing this extra work and for sharing the new findings with me. I would like to encourage them to actually include this piece of information in the manuscript. Not only does it show more "plus points" for the authors' SW; I think it is a fair (despite being "only" quantitative) comparison of SW packages that include the same stages (detection, linking _and_ curation) -- and that is a useful piece of information for the community.

Regarding the conversion of nuclei detections (blobs) into pseudo-segmentation masks (by placing spheres of 5 um radius): why were some detections (and segmentation masks) removed? It biases the measurement, no? Please clarify in the manuscript whether the filtering was due to the fact that only six time points are fully annotated (in which case the filtering makes sense). What is the average diameter of the nuclei; is it less than 10 um? Otherwise, the spheres are over-segmenting and bias the DET score.

In Fig 2, perhaps the first parentheses on line 2 could be "(obtained from manual detection)"... detection is a less difficult activity than tracking, while it is enough for creating training data for the network... just an idea.

Line 8 (of the document with marked changes): "the the historical"

Line 48: "in in time-lapse movies."

PS: I apologize to the first author for mixing up his first name and surname (in my first review I wrongly wrote "Rutger et al.").

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public.
Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
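The sphere-based pseudo-segmentation that Reviewer #2 queries above (placing a fixed-radius sphere at each detected nucleus centre so that mask-based measures such as the Cell Tracking Challenge DET score can be computed on point detections) might be implemented along the following lines. This is a speculative sketch with invented names and placeholder parameters, not the authors' actual conversion code.

```python
# Hypothetical sketch: convert point detections into a 3D label volume
# of fixed-radius spheres, so point-based tracking output can be scored
# with mask-based measures such as the DET metric. The radius and voxel
# size are placeholders, not values taken from the manuscript.
import math

def spheres_to_labels(centres_um, shape_zyx, voxel_um, radius_um=5.0):
    """Build a 3D label volume (nested lists, label 0 = background)
    with one sphere per detection. `centres_um` holds (z, y, x)
    positions in micrometres; `voxel_um` is the voxel size along
    (z, y, x). Later spheres overwrite earlier ones where they overlap,
    one possible source of the bias the reviewer asks about."""
    nz, ny, nx = shape_zyx
    vz, vy, vx = voxel_um
    labels = [[[0] * nx for _ in range(ny)] for _ in range(nz)]
    for label, centre in enumerate(centres_um, start=1):
        for z in range(nz):
            for y in range(ny):
                for x in range(nx):
                    if math.dist((z * vz, y * vy, x * vx), centre) <= radius_um:
                        labels[z][y][x] = label
    return labels
```

Because the radius is fixed, mask size is independent of the true nucleus size, which is exactly the over- or under-segmentation concern the reviewer raises about the 5 um choice.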
Revision 2
Pécs, Hungary
October 5, 2020

OrganoidTracker: efficient cell tracking using machine learning and manual error correction
PONE-D-20-14769R2

Dear Dr. Kok,

We’re pleased to inform you that your manuscript (R2 version) has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance.

To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Joseph Najbauer, Ph.D.
Academic Editor
PLOS ONE
Formally Accepted
PONE-D-20-14769R2
OrganoidTracker: efficient cell tracking using machine learning and manual error correction

Dear Dr. Kok:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of Dr. Joseph Najbauer
Academic Editor
PLOS ONE
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.