Peer Review History
Original Submission: April 7, 2021 |
PONE-D-21-11491 Evaluating the Quality of Remote Sensing Products for Agricultural Index Insurance PLOS ONE Dear Dr. Kenduiywo, Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.
Please submit your revised manuscript by Jul 08 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter. If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols. We look forward to receiving your revised manuscript. Kind regards, Gerald Forkuor Academic Editor PLOS ONE Journal Requirements: When submitting your revision, we need you to address these additional requirements. 1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and [Note: HTML markup is below. Please do not edit.] Reviewers' comments: Reviewer's Responses to Questions Comments to the Author 1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented. Reviewer #1: Partly Reviewer #2: Yes ********** 2. Has the statistical analysis been performed appropriately and rigorously? 
Reviewer #1: No Reviewer #2: Yes ********** 3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified. Reviewer #1: Yes Reviewer #2: Yes ********** 4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here. Reviewer #1: Yes Reviewer #2: Yes ********** 5. Review Comments to the Author Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters) Reviewer #1: Review: Evaluating the Quality of Remote Sensing Products for Agricultural Index Insurance, submitted to PLOS ONE. The paper addresses a relevant question, i.e. how to improve (the assessment of) the quality of remote sensing-based agricultural insurance. The paper is well-written, informative, and technically solid. However, I have several concerns with the framing and analysis. For example, the embedding in the existing literature is not yet fully developed.
Moreover, some expansion of the empirical analysis may be needed. I summarize key points below. The paper by Vroege et al. (2021) gives a recent overview of, and reflection on, the high potential of remote sensing for agricultural insurance. The key claim and hook of this paper is that previous research in this field usually focused on measures like R2 or RMSE to evaluate index insurance schemes. This is wrong. Below is a quickly compiled list of exceptions to your claim. For example: Heimfarth and Musshoff (2011) use value at risk, Musshoff et al. (2011) use a hedging effectiveness metric, Vedenov and Barnett (2004) use semi-variances, Bucheli et al. (2021) use expected utility and lower partial moments, and Dalhaus et al. (2020) use cumulative prospect theory. Along these lines, Conradt et al. (2015) even suggest an econometric approach based on quantile regression to better account for low-yield events in designing index insurance schemes. You really need to incorporate this previous literature in your paper. The arising question is: what is the value added of your manuscript, and how does the suggested approach improve on what already exists? I still think there is potential value added, but this needs to be developed clearly. In any case, you need to evaluate your proposed metric against more than R2; we already know the limitations of that. You may also use the other metrics listed above for such a comparison. The developed metric implicitly assumes an expected utility framework. We know that not all farmers have the same risk preferences, and aspects like loss aversion may also differ substantially across farmers. Thus, is creating ‘one’ metric really realistic? Shouldn’t this depend on preferences and be flexible enough to incorporate them? Again, you clearly need to motivate why this new approach adds value.
Empirical analysis: i) if you focus especially on extreme events, quantile regression may be much more valuable (Conradt et al. 2015); ii) you conduct your analysis in sample, as far as I can see from the paper. This has clear limitations, i.e. it leads to biased results. Please provide an out-of-sample analysis. Minor comments: i) the dataset is well known in the literature, but using it still deserves more details – maybe add an appendix; ii) estimation results should be provided in tables; figures alone are not sufficient; iii) how do you define the ‘seasons’ for which you make your analysis (page 8)? Be more specific about what you do. Again, maybe add an appendix with more details. References: Bucheli, J., Dalhaus, T., & Finger, R. (2021). The optimal drought index for designing weather index insurance. European Review of Agricultural Economics, in press. https://doi.org/10.1093/erae/jbaa014 Conradt, S., Finger, R., & Bokusheva, R. (2015). Tailored to the extremes: Quantile regression for index-based insurance contract design. Agricultural Economics, 46: 1-11. Dalhaus, T., Barnett, B. J., & Finger, R. (2020). Behavioral weather insurance: Applying cumulative prospect theory to agricultural insurance design under narrow framing. PLOS ONE, 15(5): e0232267. Heimfarth, L. E., & Musshoff, O. (2011). Weather index-based insurances for farmers in the North China Plain. Agricultural Finance Review. Musshoff, O., Odening, M., & Xu, W. (2011). Management of climate risks in agriculture – will weather derivatives permeate? Applied Economics, 43(9): 1067-1077. Vedenov, D. V., & Barnett, B. J. (2004). Efficiency of weather derivatives as primary crop insurance instruments. Journal of Agricultural and Resource Economics, 387-403. Vroege, W., Vrieling, A., & Finger, R. (2021). Satellite support to insure farmers against extreme droughts. Nature Food, 2: 215-217. Reviewer #2: This paper develops an alternative to a standard R-squared fit to assess the functioning of index-based insurance products.
To this end, the authors create a new metric (0 < RIB < 1) that assesses the functioning of the insurance product. The paper is extremely well written, and fills a clear void in the literature and in the instruments available to researchers and the industry for assessing the functioning of index insurance products. This is a very welcome and timely contribution. I have several comments, which I present below. 1. An alternative metric used in insurance is the so-called Catastrophic Performance Ratio (CPR), or the return to insurance relative to the premium paid, at different levels of the underlying index metric (e.g. different rainfall levels). The CPR is calculated by multiplying the probability of receiving a claim in case the farmer has a catastrophic crop loss by the average amount of the claim she receives in these cases, and dividing this by the commercial premium. Given that, to the best of my knowledge, the CPR is the only alternative to date for assessing the performance of an insurance in negative states, it would be important for this paper to acknowledge it and either 1) do some comparative analysis in terms of performance between CPR and RIB, or 2) at least describe how the RIB is different from, and supposedly better than, the CPR. 2. The paper is very concise, which is appreciated, but there are two additions to the analysis that in my view can help the reader better understand the usefulness of the metric in question, and would really augment the contribution of the paper. The paper presents two NDVI measures and compares and contrasts their ability to generate functioning insurance as measured through the RIB. It also uses one rainfall dataset to the same end. It finds that the RIB helps distinguish ‘better’ indexes, and shows clearly that it does so better than a simple R-squared, as it puts more weight on severe false negatives, for instance. It finds that the segmented-regression log-transformed MODIS NDVI performs better than the other hypothetical indexes.
This is quite illustrative of the power of the RIB, as it shows for instance that, between MODIS NDVI and NOAA NDVI, the preferred dataset to use is the former, using segmented regression modelling. It would be great to see a similar parallel analysis for rainfall, in the sense that for a similar proxy of weather we clearly see that certain models and certain datasets work better than others. To this end, I am wondering if it would be possible to complement the analysis thus far with an additional rainfall dataset: namely the ARC2 dataset that is used by other rainfall-based insurance products in Kenya. 3. Likewise, I would like to see at least one application of the RIB to an actual index. It would be great to have some insights into how the 24 (simplified) hypothetical models designed here work in relation to a real insurance. In fact, insurance payout triggers often conceal checks and balances that might be designed to avoid indemnifying policies that are the result of strategic behaviour, etc. More generally speaking, actual insurance contracts might be more complex than the ones exposed in this paper. If the authors have access to them, it would be great to have at least one ‘true’ index applied as a ‘dry run’ on the same livestock mortality rates. Given that the context of the study is clearly pointing to IBLI, perhaps this real insurance could be a version of the IBLI index in Kenya, perhaps its very first iteration. Even more ideally, a first version of the IBLI could be compared to a more recent one (using the same mortality data), to show how the RIB can also pick up incremental improvements in the way real indexes are designed. 4. The livestock mortality dataset is described too hastily. How many surveys were collected in the 15 locations in total? How many data points are there, and what is the average mortality rate in the dataset? More time should be spent describing the nature of these data. 5.
The point above may also help to explain the extent to which differences in the RIB are to be expected if the same indexes are applied to the same locations but in 5 years’ time. In other words, to what extent can one say with certainty that a higher RIB means an index always functions better, and to what extent may this just be chance? More time should be spent discussing how the RIB can be used to discern index quality (some tests of significance?) and especially how it should NOT be used. This may require entering a discussion around the power of the dataset you currently have. I am not asking you here to prove that these differences are systematic; I am asking mostly to discuss the issue openly and to caution against using the RIB to make assumptions that might be context- and time-specific. 6. If the data allow it (if the dataset is large enough), I would suggest randomly censoring half the data and calculating the RIB on the remaining random portion. Iterating this process several times and showing the RIB distribution can provide a useful impression of how data-dependent the RIB is, and how much we can expect it to vary by removing a few outliers. 7. Another important sensitivity analysis that should be performed and shown is that with respect to rho. It is acceptable that the authors pick a rho from the literature and use it throughout their analysis, but they should show, at least in an appendix, the range of RIB values that we should expect when rho is varied within a reasonable range. Most importantly, they should discuss the likelihood that the relative RIB values of different indexes swap magnitude as the assumptions about rho vary. 8. It would be good to have a summary table somewhere describing the insurance performance in layman’s terms. What percentage of losses is covered by each index? Payout rates? True negatives? What is the percentage of false negatives and false positives?
This will also help the reader understand the improvement of the RIB over R-squared more immediately. Ideally these rates can be presented both overall and at higher loss states. 9. The example with livestock mortality is compelling. However, I am missing the complete picture that I was expecting from the title, which refers to agricultural insurance. The authors should provide a discussion of how a similar index could be used for crop loss analysis. At the moment this is completely absent, other than a brief mention in the discussion relative to the cost-benefit of crop cuts. I might be wrong, but the nature of livestock mortality makes it very suitable for a RIB-type index, while crop losses are much less lumpy and might be harder to assess, a prerequisite to make a functioning RIB. This should be discussed more, even in case the authors disagree, and the ‘complexities’ of constructing a RIB index should be weighed against its evident advantages. It is my hope that the RIB will become a standard in the industry, well beyond livestock insurance, but for that to happen the necessary toolkit should be made clear across a variety of indexes, or at least the most common ones. 10. Equation 9, page 5, might need to be revised. Possibly the last superscript is N instead of P. ********** 6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy. Reviewer #1: No Reviewer #2: Yes: Francesco Cecchi [NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site.
Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.] While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step. |
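Editor's note: Reviewer #2's verbal definition of the Catastrophic Performance Ratio (point 1 above) maps onto a short calculation. The sketch below is an illustration of that definition only, not code from the manuscript; the function name, the mortality threshold of 0.3, and the example numbers are all hypothetical.

```python
import numpy as np

def catastrophic_performance_ratio(losses, payouts, premium, loss_threshold=0.3):
    """Sketch of the CPR as described by Reviewer #2:
    P(claim | catastrophic loss) * E[claim | claim paid, catastrophic loss] / premium.
    `losses` are per-season loss rates (e.g. livestock mortality),
    `payouts` the corresponding insurance payouts, `premium` the commercial premium.
    """
    losses = np.asarray(losses, dtype=float)
    payouts = np.asarray(payouts, dtype=float)
    catastrophic = losses >= loss_threshold          # seasons with catastrophic loss
    if not catastrophic.any():
        return np.nan                                # CPR undefined without such seasons
    p_claim = np.mean(payouts[catastrophic] > 0)     # probability of receiving a claim
    paid = payouts[catastrophic & (payouts > 0)]
    avg_claim = paid.mean() if paid.size else 0.0    # average claim when one is paid
    return p_claim * avg_claim / premium
```

With four hypothetical seasons, losses of 0.5, 0.4, 0.1, 0.6, payouts of 100, 0, 0, 200, and a premium of 50, three seasons are catastrophic, two of them pay (probability 2/3, average claim 150), giving a CPR of 2.0.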
Revision 1 |
PONE-D-21-11491R1 Evaluating the Quality of Remote Sensing Products for Agricultural Index Insurance PLOS ONE Dear Dr. Kenduiywo, Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.
Please submit your revised manuscript by Sep 16 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter. If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols. We look forward to receiving your revised manuscript. Kind regards, Gerald Forkuor Academic Editor PLOS ONE Journal Requirements: Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice. [Note: HTML markup is below. Please do not edit.] Reviewers' comments: Reviewer's Responses to Questions Comments to the Author 1. 
If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation. Reviewer #1: (No Response) ********** 2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented. Reviewer #1: Partly ********** 3. Has the statistical analysis been performed appropriately and rigorously? Reviewer #1: Yes ********** 4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified. Reviewer #1: Yes ********** 5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here. 
Reviewer #1: Yes ********** 6. Review Comments to the Author Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters) Reviewer #1: Big thanks to the authors for the thorough revision. That helped clarify many things. I have two remaining issues, where I am not satisfied with the responses given. First, the authors acknowledge that similar lines of thinking (using alternative metrics compared to R2) have already been used widely in the literature on index insurance. This is standard in many ag econ journals. BUT: the big gap is that it has not been done in remote-sensing-related insurance applications. I do not see the difference here. Technically, it does not matter what index I use if I talk about assessment in economic terms. Please tone down the description of your own contribution with this paper, and adjust the language. Second, I do not fully get your response on out-of-sample performance. It does not matter what estimator etc. you use. Within-sample assessment (as you do it) is biased in terms of assessing the economic performance of an insurance solution, i.e. it will usually be too optimistic if everything is done in-sample. Thus, the comparisons and assessments presented here are also likely biased. An out-of-sample procedure (again, widely used in ag econ journals) can address these concerns and add robustness to your conclusions. As you have sufficient data, this should not be an issue, I believe. ********** 7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public.
Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy. Reviewer #1: No [NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.] While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step. |
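Editor's note: both Reviewer #1's out-of-sample concern and Reviewer #2's point 6 (randomly censoring half the data and recomputing the metric) describe the same resampling idea. The sketch below is an editor's illustration under stated assumptions: the function name is hypothetical, the index-loss relationship is simplified to a linear fit, and the quality metric is passed in as a callable (the RIB itself is not reproduced here).

```python
import numpy as np

def split_half_scores(index, losses, metric, n_splits=200, train_frac=0.5, seed=0):
    """Repeatedly fit a simple linear index-loss relationship on a random
    half of the observations and score `metric` on the held-out half.
    The distribution of held-out scores indicates how data-dependent a
    quality measure (e.g. the RIB, or R-squared) is, and guards against
    the in-sample optimism Reviewer #1 warns about."""
    rng = np.random.default_rng(seed)
    index = np.asarray(index, dtype=float)
    losses = np.asarray(losses, dtype=float)
    n = index.size
    scores = np.empty(n_splits)
    for i in range(n_splits):
        perm = rng.permutation(n)                     # random split of seasons
        k = int(train_frac * n)
        train, test = perm[:k], perm[k:]
        slope, intercept = np.polyfit(index[train], losses[train], 1)
        predicted = slope * index[test] + intercept   # out-of-sample prediction
        scores[i] = metric(losses[test], predicted)
    return scores
```

Plotting the returned score distribution (rather than a single in-sample value) is what Reviewer #2 asks for in point 6; a wide spread would caution against ranking indexes on small RIB differences.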
Revision 2 |
Evaluating the Quality of Remote Sensing Products for Agricultural Index Insurance PONE-D-21-11491R2 Dear Dr. Kenduiywo, We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org. If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible, no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org. Kind regards, Gerald Forkuor Academic Editor PLOS ONE Additional Editor Comments (optional): Thank you for paying attention to the comments and performing extra analysis. Reviewers' comments: |
Formally Accepted |
PONE-D-21-11491R2 Evaluating the Quality of Remote Sensing Products for Agricultural Index Insurance Dear Dr. Kenduiywo: I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department. If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org. If we can help with anything else, please email us at plosone@plos.org. Thank you for submitting your work to PLOS ONE and supporting open access. Kind regards, PLOS ONE Editorial Office Staff on behalf of Dr. Gerald Forkuor Academic Editor PLOS ONE |
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.