Peer Review History
| Original Submission: November 22, 2019 |
|---|
|
PONE-D-19-32488
Reproducibility, repeatability and agreement of corneal topography measured by Revo NX, Galilei G6 and Casia 2.
PLOS ONE

Dear MD Wylęgała,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

We would appreciate receiving your revised manuscript by Feb 22 2020 11:59PM. When you are ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

To enhance the reproducibility of your results, we recommend that, if applicable, you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

Please include the following items when submitting your revised manuscript:
Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

We look forward to receiving your revised manuscript.

Kind regards,
Andrzej Grzybowski
Academic Editor
PLOS ONE

Additional Editor Comments:

Thank you for submitting this interesting manuscript. Please revise the manuscript according to the reviewers' comments.

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at http://www.journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and http://www.journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Thank you for stating the following financial disclosure: "Adam Wylęgała is The Kosciuszko Foundation Scholar."

a. Please state what role the funders took in the study. If the funders had no role, please state: "The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript." If this statement is not correct you must amend it as needed.

b. Please include this amended Role of Funder statement in your cover letter; we will change the online submission form on your behalf.

3. Please provide details of the obtained participant consent in the ethics statement on the online submission form. Currently this information is only available in the methods section of your manuscript.

4. Please carefully proofread your manuscript for typographical errors.
For example, in the Methods section “… had been informed of the experimental procedure.We included 94 eyes of 35 Males and 59 Females …” should be written as “… had been informed of the experimental procedure. We included 94 eyes of 35 males and 59 females …”

5. Please provide further details regarding how participants were recruited, including the participant recruitment date.

6. We note that you have a patent relating to material pertinent to this article.

a. Please provide an amended statement of Competing Interests to declare this patent (with details including name and number), along with any other relevant declarations relating to employment, consultancy, patents, products in development or modified products etc. Please confirm that this does not alter your adherence to all PLOS ONE policies on sharing data and materials, as detailed online in our guide for authors http://journals.plos.org/plosone/s/competing-interests by including the following statement: "This does not alter our adherence to PLOS ONE policies on sharing data and materials.” If there are restrictions on sharing of data and/or materials, please state these. Please note that we cannot proceed with consideration of your article until this information has been declared.

We note that the Revo NX equipment used in this study was provided by Optopol Technology Ltd. Please state this information in your competing interests statement and clarify whether Optopol Technology Ltd. played any further role in the study.

b. This information should be included in your cover letter; we will change the online submission form on your behalf. Please know it is PLOS ONE policy for corresponding authors to declare, on behalf of all authors, all potential competing interests for the purposes of transparency.
PLOS defines a competing interest as anything that interferes with, or could reasonably be perceived as interfering with, the full and objective presentation, peer review, editorial decision-making, or publication of research or non-research articles submitted to one of the journals. Competing interests can be financial or non-financial, professional, or personal. Competing interests can arise in relationship to an organization or another person. Please follow this link to our website for more details on competing interests: http://journals.plos.org/plosone/s/competing-interests

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: I Don't Know

**********

3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.
Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: In general, this study is very interesting and timely! I strictly recommend publication of this manuscript after a thorough revision! Here are some comments and recommendations for improvement of the manuscript:

Title: The title should be reformulated: What do the authors mean by agreement? There should be a statement that measurements were performed in 'normal' eyes.

Abstract: What does 'correct measurements' mean? Use corneal power of the anterior/posterior surface instead of anterior/posterior keratometry! Keratometry always means the anterior surface radius converted to diopters using an artificial keratometer index! Please consider this throughout the manuscript! How were K values averaged? Separately for the flat and steep meridians? What do measures such as 0.975 for repeatability stand for? Cronbach's alpha? Is this a value for the steep or the flat meridian? What is the repeatability for thickness measurement? Anterior corneal power should be around 49 diopters instead of 43 diopters; maybe the authors converted the front surface radius with a keratometry index instead of real air-cornea interface data?
Introduction: Please mention the benefits AND DRAWBACKS of combined posterior/anterior segment OCTs and dedicated anterior segment OCTs! The major drawback of combined systems is the lack of collimated light at the cornea, which means that the measurement results depend on the measurement distance. Another major drawback of all combined systems is the small diameter of the measurement volume, here 8 mm for the Revo instead of 16 mm for the Casia.

Methods: males and females instead of Males and Females. Did the authors check all volunteers for ectatic diseases such as KC? 6 measurements FROM each operator... Why didn't the authors record corneal thickness at the thinnest point? Instead of simply evaluating the data of the steep and flat meridians, the data should be assessed using vector decomposition, e.g. the classical Humphrey notation. This is necessary because, besides the flat and steep meridians, the orientation of the axes may vary between measurements and devices! Does the 5 µm axial resolution refer to air or medium? Please clarify: Keratometry...calculated in the 3 mm zone: Where exactly? At a ring with a diameter of 3 mm, or including all data from the central 3 mm zone? This is important because classical keratometry does not consider the entire 3 mm zone but distinct measurement points at a diameter of around 3 mm, depending on the device. Conversion of radius of curvature to dioptric power using a keratometer index is not appropriate for anterior surface power. Instead, evaluate keratometric power OR anterior and posterior surface power. Keratometric power somehow mimics the behaviour of the entire cornea. Why is Galilei using posterior mean K instead of evaluating both meridians separately? What does 'Thus, unlike the (anterior) SimK, it includes the central 1 mm in diameter' mean? standard deviation instead of Standard deviation...

Results: Interoperator repeatability and reproducibility??? Surely you refer to reproducibility only when comparing the results of different operators...
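For context, the vector decomposition the reviewer recommends (the classical Humphrey/power-vector notation) can be sketched as follows. This is an illustrative sketch only: the function and variable names are hypothetical, and the axis/sign convention shown is one common choice, not necessarily the one the study's authors would use.

```python
import math

def power_vectors(k1, k2, axis_deg):
    """Decompose keratometry (flat K1, steep K2, in diopters, with the
    steep-meridian axis in degrees) into power-vector components.

    Returns (M, J0, J45): the mean power and the two orthogonal
    Jackson-cross-cylinder components. Averaging and comparing these
    components, rather than raw K1/K2, accounts for astigmatism axes
    that differ between measurements and devices.

    NOTE: sign conventions for J0/J45 vary with whether the axis refers
    to the flat or steep meridian; the convention here is an assumption.
    """
    c = k2 - k1                       # astigmatism magnitude (D)
    m = (k1 + k2) / 2.0               # mean corneal power (D)
    a = math.radians(axis_deg)
    j0 = (c / 2.0) * math.cos(2 * a)  # with/against-the-rule component
    j45 = (c / 2.0) * math.sin(2 * a) # oblique component
    return m, j0, j45
```

For example, K1 = 42 D, K2 = 44 D at axis 0° yields M = 43 D, J0 = 1 D, J45 = 0 D; the same cornea measured with the axis reported 90° away flips the sign of J0, which is exactly the information a flat/steep-only comparison discards.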
For all results I recommend considering vector components instead of flat/steep meridional data, because otherwise different orientations of astigmatism are ignored. Why is corneal thickness missing in tables 1 and 2?

Discussion: is launched to the market instead of brought to the market. tomographic systems instead of keratometric systems... Scheimpflug camera instead of Sheimpflug camera. Please state that it is simply a question of using a specific eye model, or assuming an average speed of sound or refractive index, if ultrasound, Scheimpflug or OCT (with different wavelengths) yield different results for central corneal thickness... Scheimplfug should read Scheimpflug. 'Measuring anterior corneal surface is easier than measuring the posterior[21]. In order to measure the latter, sophisticated mathematic algorithms have to be implemented, which is why there is a significant difference between the recordings of the devices.' might be half of the truth!! At the front surface there is a very strong reflex due to the air-cornea interface, which makes exact and reliable detection of the edge difficult. On the other hand, back surface evaluation (not measurement!!) requires inverse raytracing, which means that all errors of front surface measurement affect back surface measurements!

Figures: The resolution of the figures is weak. The authors should use other image formats with a higher resolution for the upload of the revised version!

In general, in times where topographers and tomographers are more and more used for diagnostics and surgery planning, such evaluations of systems which are newly on the market are more than welcome and very important for the reader! My congratulations on this interesting manuscript!

Reviewer #2: The goal of this study is to compare the repeatability and inter-operator reproducibility of a new corneal topographer module of the Revo NX SD-OCT with the Galilei G6 Scheimpflug camera and the Casia 2 SS-OCT in normal eyes.
I suggest several things that could make the presentation clearer.

1. Put the description of the devices before the description of the measurement techniques. Just switch the order.

2. There are several places in the text and in the footnotes of the tables in which punctuation (a comma) is needed. Also, the references in the footnotes do not totally match what they are referring to. For example in Table 1, Standard deviation Sw = within-subject standard.

3. This section below is not clear. Clarify what the 6 measurements are. Also put commas and an “and” in to clarify the paragraph. Every participant had 6 measurements (for each operator) starting with 3 Topo scan program on the Revo NX carried out by each operator to measure repeatability and reproducibility, followed by one corneal map measurement on Galilei G6, and? Corneal Map scan on Casia 2. For every device, anterior and posterior K1 and K2 values were recorded as well as apical CCT. Only measurements well centered and with high quality indexes were included in the study. I can only count 5 measurements if you are referring to devices: 3 Topo scan, one Galilei G6 and one Casia 2. I can only count 5 measurements if you are referring to variables: anterior and posterior K1 and K2 and CCT.

4. This section needs clarification. You don’t need to capitalize mean and standard deviation. Use a comma to separate items in a list. Note that “Within-subject standard deviation” is just hanging at the end of the paragraph. It is not a sentence. Numerical results for repeatability and reproducibility contain six quantities computed for observers separately and respectively for the entire dataset: mean, standard deviation, within-subject standard deviation, test-retest repeatability, within-subject coefficient of variation, and intraclass correlation coefficient were calculated for repeatability and reproducibility of the Revo NX. Within-subject standard deviation.

5. The statistical methods need more elaboration.
Define how each of these measurements was computed:

• within-subject standard deviation (Is this the root mean square error from the model?)

• test-retest repeatability

• CoV% (this appears to be within-subject standard deviation divided by the mean rather than the standard deviation divided by the mean. Is this computed using the RMSE from the model?)

• ICC—was this computed based on a random effects model? How many measurements were included? If this was done for each operator, was it the 3 measures each?

6. Interoperator and Reproducibility Section and Table 1 issues:

• It is not clear which devices are being compared in the table. Is it all three? Or just the new device? Based on the comments below the table, it appears to be an assessment of the Revo NX. Labels for the table and section would help the reader.

• It is labeled as “interoperator repeatability,” yet the table contains information for each operator (A and B). Therefore, this should be labeled as “intra-operator.”

• A column in the table says observator. Observator is not an English word. Use “operator” to match the title of the table.

• I don’t understand how there is an ICC for each operator. If the two operators are being compared, there should be only one ICC.

• OR are you saying that the 3 measurements for each operator are being compared separately? If that is the case, the title should be ‘Intra-operator.’

• How many measurements were used? This needs to be specified in the table.

• From the text below the table, it appears that the way the operators are being compared is by taking the difference between values in the table. To carry out a true INTER-operator assessment, the measurements from both operators need to be in the model. For example, the data would need to be laid out like this:

Operator  Measurement  K1  K2 …
A         1
A         2
A         3
B         1
B         2
B         3

• Note that the computations from the model need to use the number of repetitions used.
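For readers unfamiliar with the indices the reviewer is questioning, one common set of definitions can be sketched as follows. This is an illustrative convention (function names are hypothetical), not necessarily the exact computation used in the manuscript under review:

```python
import statistics

def repeatability_indices(data):
    """data: list of per-subject lists, each holding k repeated measurements.

    Returns (Sw, TRT, CoV%, ICC) under one standard convention:
      Sw   = sqrt(mean within-subject variance)
      TRT  = 2.77 * Sw   (test-retest repeatability, i.e. 1.96*sqrt(2)*Sw)
      CoV% = 100 * Sw / grand mean  (note: Sw, not SD, in the numerator)
      ICC  = one-way random-effects intraclass correlation,
             (MSB - MSW) / (MSB + (k - 1) * MSW),
             where the divisor uses k, the number of repetitions.
    """
    n = len(data)                 # number of subjects (eyes)
    k = len(data[0])              # repetitions per subject
    within_vars = [statistics.variance(s) for s in data]
    sw = (sum(within_vars) / n) ** 0.5
    trt = 2.77 * sw
    grand = sum(sum(s) for s in data) / (n * k)
    cov = 100.0 * sw / grand
    # One-way ANOVA mean squares
    subj_means = [sum(s) / k for s in data]
    msb = k * sum((m - grand) ** 2 for m in subj_means) / (n - 1)
    msw = sum(within_vars) / n    # pooled within-subject mean square
    icc = (msb - msw) / (msb + (k - 1) * msw)
    return sw, trt, cov, icc
```

This also makes the reviewer's point concrete: the ICC denominator depends on k, so an ICC computed per operator (k = 3 repetitions) and an ICC comparing two operators are different models and should be labeled accordingly.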
In the models that appear in Table 1, I assume that there were 3 measurements per operator. So, in the computation of the ICC the number of measurements must be used: BET = (MSB - MSE)/3. NOTE: this denominator must match the number of observers/measurements.

7. Table 2 issues

• I am not sure what is being compared in this table. At least the table title specifies that it is for the Revo NX. Define the reproducibility. Is this the result of having both operators in the same model?

• The footnote also needs to be cleaned up.

8. Table 3

• What is the rationale for comparing the Galilei G6 to the other two devices?

• If the Galilei G6 is the “gold standard,” it would be helpful to state that in the methods.

9. Comparison section of results

• This section is comparing pair-wise differences between devices with paired t-tests. What is a little confusing is whether just one measurement per device is being used. It appears that 3 measures per operator were used for the Revo NX and one measurement for each of the other devices (based on my understanding of the methods). So, which of the three Revo NX measures are being compared with the other devices, and for which operator? Or was this done separately for each operator and only results for one operator presented?

• This is just one example of how the lack of explanation and clarity shows up in this paper.

10. Plots

• Plots are hard to read. They seem blurry. A better rendering should be done for the manuscript.

• Note that all the plots say “P=0.000.” This should be corrected to match the text that describes the paired t-test, or use P<0.001 for labels on plots if the values are actually that low.

• Bland-Altman plots are a visual indication of agreement. In addition to the paired t-tests, the limits of agreement should be mentioned. These are presented in the plots. One would state within which limits one would consider devices to be interchangeable, regardless of the statistical significance of the tests.

11.
Overall, this paper may offer new information that is useful. However, clearing up some of the questions above would help readers understand it better.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: Yes: Sandra Stinnett

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files to be viewed.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org. Please note that Supporting Information files do not need this step. |
| Revision 1 |
|
Reproducibility, and repeatability of corneal topography measured by Revo NX, Galilei G6 and Casia 2 in normal eyes.
PONE-D-19-32488R1

Dear Dr. Wylęgała,

We are pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it complies with all outstanding technical requirements. Within one week, you will receive an e-mail containing information on the amendments required prior to publication. When all required modifications have been addressed, you will receive a formal acceptance letter and your manuscript will proceed to our production department and be scheduled for publication.

Shortly after the formal acceptance letter is sent, an invoice for payment will follow. To ensure an efficient production and billing process, please log into Editorial Manager at https://www.editorialmanager.com/pone/, click the "Update My Information" link at the top of the page, and update your user information. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to enable them to help maximize its impact. If they will be preparing press materials for this manuscript, you must inform our press team as soon as possible and no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

With kind regards,
Andrzej Grzybowski
Academic Editor
PLOS ONE

Additional Editor Comments (optional):

Thank you for preparing a high quality revision and including all reviewers' comments. Thank you also for submitting your interesting paper to our journal.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1.
If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous.
Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thank you for this detailed revision of the manuscript! In the present version this manuscript is ready for publication!

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No |
| Formally Accepted |
|
PONE-D-19-32488R1
Reproducibility, and repeatability of corneal topography measured by Revo NX, Galilei G6 and Casia 2 in normal eyes.

Dear Dr. Wylęgała:

I am pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please notify them about your upcoming paper at this point, to enable them to help maximize its impact. If they will be preparing press materials for this manuscript, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

For any other questions or concerns, please email plosone@plos.org.

Thank you for submitting your work to PLOS ONE.

With kind regards,
PLOS ONE Editorial Office Staff
on behalf of Dr. Andrzej Grzybowski
Academic Editor
PLOS ONE |
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.