Peer Review History
Original Submission: April 28, 2021
PONE-D-21-14058
Data uncertainty in LU/LC data is unevenly distributed and has disproportionately high impacts on assessment results in environmental planning
PLOS ONE

Dear Dr. Neuendorf,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE's publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

The study seems to have concentrated overly on spatial resolution as the main source of uncertainty in environmental modelling, to the neglect of equally important sources and factors, including those arising from the modelling process itself. Please provide a justification for this, or explore other sources in line with the reviewers' recommendations. In addition, kindly explore possibilities for improving the clarity of the structure and text, as well as strengthening the discussion based on recent works. Please pay attention to comments on terminology and ensure consistency. Further, ensure that you have provided and/or corrected the relevant information in the supporting data.

Please submit your revised manuscript by Jul 07 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Gerald Forkuor
Academic Editor
PLOS ONE

Journal Requirements: When submitting your revision, we need you to address these additional requirements.
https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and
[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: No

**********

3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data (e.g. participant privacy or use of data from a third party), those must be specified.

Reviewer #1: No
Reviewer #2: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above.
You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1:

General comments: In this manuscript, the authors assess the implications of data uncertainty on assessments in environmental planning based on two practical case studies. Uncertainties are important but often ignored in environmental assessments, so this manuscript addresses a relevant gap in the literature. However, I find that the clarity of both the structure and text could be improved, and the discussion could be strengthened with reference to existing work on uncertainties in general and on data uncertainties in spatial data more specifically. In addition, I think a stronger focus on the criteria for when the uncertainty is too high for specific applications would highlight the novelty of this paper. I would also suggest a language edit, as there are several typos and unclear formulations (see details in specific comments below).

1) In this manuscript, the authors mainly address data uncertainty. In particular, they focus on thematic accuracy and spatial resolution. While these are important sources of uncertainty, it would be helpful to put them in context with other types of uncertainties that affect assessments and decisions in environmental planning (such as model uncertainties, the inherent variability of environmental systems, or ambiguity among views of different stakeholders). How important are data uncertainties in relation to other sources of uncertainty? Addressing this question would add some depth to the discussion.

2) A clearer structure would help to make the paper easier to follow. In particular, I think it would be important to clearly define the criteria of when uncertainty is "too high" at an early stage in the manuscript, as this is a particularly novel and relevant aspect.
Then, the results could be presented in relation to these criteria. Furthermore, I would suggest organizing the Methods section by case studies, and clearly describing the specific questions (as currently described in the discussion), data, and type of uncertainty addressed in each case study. It might also be helpful to have a short reference for each case study, e.g. Case study A and B or "carbon mapping" and "renewable energies", which would be used consistently throughout the paper.

3) It seems to me that especially the results of the second case study (choice of suitable areas for RE) mainly highlight the need to select the appropriate spatial and thematic resolution of data, rather than the uncertainty of the data itself. The importance of choosing the appropriate resolution is well known in the spatial modelling/assessment community, but its implications for practical decisions in environmental planning are still interesting. I think it would therefore be useful to explicitly state the types of data uncertainty that are addressed in each case study (e.g. misclassification, loss of information due to spatial or thematic aggregation and insufficient resolution, etc.).

4) There has been a lot of work in the remote sensing community about various dimensions and metrics of data quality, and their effect on the usability of geospatial data. I think that referring to this literature would help strengthen the content of this paper, see e.g.:

• GEOSS Data Quality guidelines (https://www.earthobservations.org/documents/dsp/GEOSS_Data_Quality_Guidelines.pdf)

• Batini C, Blaschke T, Lang S, Albrecht F, Abdulmutalib HM, Barsi, et al. Data quality in remote sensing. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives. 2017;42(2W7):447–53.

Specific comments

5) Lines 44-46: "Often…" this sentence is overly complex and difficult to understand, please consider rephrasing.
6) Lines 54-55: It would be helpful to define what the authors mean by terms such as "data quality" and "preciseness", as data quality includes both accuracy and precision, as well as some other attributes (e.g. appropriate temporal, spatial, and thematic resolution).

7) Line 100: An "error matrix" is commonly called a "confusion matrix" in the field of remote sensing classifications, which are a common source of LU/LC data, so it might be useful to mention both terms here.

8) Table 1: To make it easier to connect this table to the case studies, please consider adding a column or label to indicate which datasets are used for which case study.

9) Table 1 footnote: Typo "statesuse". What do you mean by topicality in this context?

10) Lines 139-141: I don't understand this sentence – why is biotope mapping useless because landscape planners are obliged to use appropriate data? Please clarify.

11) Lines 170-171: This sentence seems like it would fit better in the introduction, as it stresses the general relevance of choosing the appropriate spatial and thematic resolution.

12) Line 209: By accuracy, do you mean producer's or user's accuracy? Please specify.

13) Line 243: Phrases such as "dramatic effect" would be more appropriate in the discussion.

14) Lines 264-268: These results highlight the need to choose an appropriate spatial resolution, but do not really inform about uncertainty.

15) Lines 281-282: "When looking at…" this sentence is strangely formulated, please consider rephrasing.

16) Line 284: Typo "compatbible" should be "compatible".

17) Lines 294-295: Here, you might want to mention that certain land-cover classifications (e.g. random forest-based classifications) can in fact provide spatially explicit information about uncertainties.

18) Lines 315-321: These criteria are very relevant. For a clearer structure, I would suggest that these criteria already be mentioned earlier in the manuscript (e.g.
in the introduction or methods section), so that the reader can keep them in mind when presented with the results.

19) Lines 322-325: These questions should be presented when the case study is introduced in the methods section.

Reviewer #2:

This paper addresses an important topic of data spatial scale and its influence on landscape modeling. The paper shows that using data of different spatial scales (and from different sources) can produce different results when used in modeling. The general conclusion that uncertainty in results is often overlooked in modeling studies is an important one. Below I list some major concerns. I think a simple solution to addressing my concerns would be to change the theme of the paper away from providing a method of characterizing uncertainty (which the paper does not explore), and instead to focus on a comparison of the dependence of modeling results on the spatial resolution and other properties of the input data. I think the latter approach will still allow you to achieve your main aim of highlighting the importance of considering how the data used affect the resulting model.

Major concerns

1. The paper makes broad and sweeping claims, including that it provides a simple and practical method for characterizing uncertainty, that are not supported by the analysis or the results. There is no investigation of the uncertainty of any of the data sets, or in the modeling itself. Also, as a proposed method for characterizing data suitability by comparing the results with higher spatial resolution data, it does not seem useful to me, because why not use the higher spatial resolution data in the first place?

2. There are multiple sources of uncertainty in comparing data sets – spatial resolution is simply one of the sources of differences. Even something as subtle as the difference in how the classes are defined can be important.
Other issues include spatial location accuracy, thematic classification accuracy, date of acquisition, etc. The mismatch between the classes required by the model and the classes in the geospatial data may be even greater than the issue of the spatial resolution.

3. After making the correct statement that "There is no spatial environmental planning without uncertainties", the paper then goes on to treat the biotope mapping as "ground truth", and to call differences between the biotope data and other data sets uncertainty in the other data, not in the biotope data. I think it is a fundamental flaw to suggest the biotope data are the truth and, by implication, so are the model results using the biotope data.

4. It would be useful to provide a great deal more information on how each data set was generated, and to explain the methods of analysis in more detail. In particular, information on the biotope data seems important – what scale is it produced at, and more detail on how it is mapped? How were the DEM products generated (e.g. photogrammetry, lidar, etc.), and what is their source? In Table 1, you could separate out the minimum mapping unit area and the pixel size for raster data sets. What is the spatial uncertainty in the location of the data? For the data overlay for generating the confusion matrix, it is useful to state the resampling method used (presumably nearest neighbor). What is the cross-walk for the conversion of the data legends between the different data sets? The concept of the emission level classes (Figure 4) and their definitions should be explained in the methods.

5. There appear to be some errors in the supporting document's calculation of the confusion matrices' user's accuracies. For example, for Saxony-Anhalt, I get different user's accuracy values for both Mining and landfill sites (I get 0.0) and Areas of sparse vegetation (I get 0.06). I suggest checking all the numbers.

6.
I suggest presenting both user's and producer's accuracies in Figure 2 and the text, and not just user's accuracies, since producer's accuracies are equally important. (Though see comments below about the term "accuracy".)

Minor issues

1. As explained above, I don't think it is conceptually correct to refer to the differences in the maps as "accuracy" or even "uncertainty". A better term would be agreement or consistency. For example, you could use the terms "user's consistency" and "producer's consistency" instead of "user's accuracy" and "producer's accuracy" – though you'd need to define the terms since they are not commonly used. Using agreement or consistency would not preclude making an argument that one data set more closely matches the spatial scale (or other properties) assumed by the model. For example, if the model has wetlands as a key class, and wetlands are generally smaller than the 25 ha MMU of the CORINE land cover, then the use of the finer spatial scale data will allow the model to incorporate these smaller wetlands.

2. Avoid "error matrix" for the above reason and instead use a term like "confusion matrix".

3. Avoid the term "ground truth" – this is generally thought not to be a good term in remote sensing, and in a paper on uncertainty in data layers it seems strange to imply that any data could be the "truth".

4. I suggest reserving "significant" for "statistically significant". Instead, for general use, use terms like notable.

5. L28. I don't see an analysis of the propagation of inaccuracy.

6. L30. I don't see any "differences in uncertainty", but instead differences in results.

7. L31. I think you are promising too much in claiming to have demonstrated uneven distribution in uncertainties. However, you have shown that the magnitude of the differences in results between using different data sets can vary, depending on the application and even location.

8. Use a subscript for the 2 in CO2 throughout.

9.
L92 – I can't understand this sentence – perhaps rephrase?

10. L115 – "no significant missing data" – does this mean there is some missing data? If so, please explain.

11. Figure 1. Please add (a) a scale bar (in m or km, as may be appropriate) and (b) a legend.

12. I think it would be useful to point out to the reader that the issue is not just the pixel size of raster data, but the minimum mapping unit area – it is convenient that the CORINE data specifies this so clearly.

13. L189, 190 and 195/196. The details of the grid cell size should be in Table 1. (I suggest using "grid cell size" or "grid cell dimension", or even "pixel size", not "grid width".)

14. L195 – In my experience, even 50 m is still very coarse for this type of modeling. I think this reinforces the idea that often in modeling we have to use data that don't entirely fit our purpose.

15. L198 – "user's accuracy", not "user accuracy" (but see my comment above about not using the term accuracy).

16. L199 – what is "combined accuracy" – do you just mean "the range in the user's accuracy"? But that doesn't make sense, since the lowest value seems to be 0%, not 82%. Do you perhaps mean the "overall accuracy" (which is defined as the sum of the correctly classified areas in each class, divided by the total area of the map)?

17. Figure 2. The y-axis should be "Agreement", not "Spatial accuracy" (the latter would be a measure of the spatial registration of the two data sets, which I don't believe is what you mean). Is it possible to use color – I struggled to tell the different grays apart.

18. L219 – "correlation" – do you perhaps mean "confusion"?

19. Figure 3. Are the legends correct? For example, for Figure 3 (a), my interpretation from the legend ("misclassified GRASSLAND") is that these are areas that are in reality GRASSLAND but have been misclassified. But I think you actually might mean "Proportion of pixels misclassified as GRASSLAND"?

20. L260 "rundue" should be two words.

21.
L274 – replace "geometric" with "spatial resolution".

22. L294/295 – I don't understand the sentence (or the subsequent discussion), and my best interpretation doesn't seem to be correct. The confusion matrix is a spatial overlay and produces a summary of agreement. It is true, however, that it does not consider the contiguity of the distribution of pixels in any one class, but I don't see why that should matter.

23. L300 – yes, it has uncertainty. Since "to a certain degree" is not defined, it can be removed.

24. L302 – The term "geodata uncertainty marker" is not defined, and it is not clear in any case that one can generalize for other applications – your work shows that even for different areas using the same dataset, the agreement varies. I suggest dropping this.

25. L321 – I don't understand why you are saying "the uncertainty is too high" when the "effect is small"?

26. L332 – I don't understand why small landholders would be affected more than large ones. Wouldn't small ones be too small to show up in the analysis, and therefore be ignored? Perhaps you could elaborate.

27. L345. I suggest not using a term like "better" (which is not defined), but instead simply stating the properties that you mean by "better" – e.g. data with a spatial resolution that matches the modeling requirements, or whatever you mean.

28. L359. I don't think you have analyzed the uncertainty in the products you have generated, as there are many other sources of uncertainty, including in the model itself, not to mention uncertainty in the "ground truth" data. I think instead you have shown that different data sets can produce different results when applied to a model, and therefore it is important to consider what spatial scale (as well as other data characteristics, such as classes mapped, etc.) the model assumes for input data.

29. Figure 4. Could you use color here, too, to make it easier to interpret?

30. Supporting data.
What are the units of measurement in the supporting confusion matrix data tables? Ha, pixels, etc.? (If pixels, what size?) The tables are very hard to read because of the decimals (it is hard to get a visual idea of where the confusions lie) and the size of the numbers. Units like square km would make the table a lot more visually understandable, and perhaps make your overall points easier to make.

31. Supporting data. Very important – all the user's and producer's accuracies don't seem to be %, though they are labeled as such. I think they are just straight proportions (on a 0-1 scale).

32. Supporting data – what does "balance" mean? What is "sum/balance"?

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose "no", your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Ana Stritih
Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org.
Please note that Supporting Information files do not need this step.
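The confusion-matrix terminology that recurs in the reviews above (user's accuracy, producer's accuracy, and the reviewer's definition of overall accuracy) can be illustrated with a minimal sketch. All numbers below are hypothetical and are not taken from the manuscript's supporting data:

```python
# Illustrative confusion matrix (rows = mapped class, columns = reference class),
# areas in square km, as Reviewer #2 suggests. Values are made up for this sketch.
cm = [
    [50.0,  5.0,  5.0],  # mapped as class A
    [ 2.0, 30.0,  8.0],  # mapped as class B
    [ 3.0,  5.0, 42.0],  # mapped as class C
]

n = len(cm)
row_totals = [sum(row) for row in cm]                              # area mapped as each class
col_totals = [sum(cm[i][j] for i in range(n)) for j in range(n)]   # reference area of each class
diag = [cm[i][i] for i in range(n)]                                # correctly labelled area

# User's accuracy: of the area mapped as class k, the proportion that truly is class k
users = [diag[k] / row_totals[k] for k in range(n)]

# Producer's accuracy: of the reference area of class k, the proportion mapped as class k
producers = [diag[k] / col_totals[k] for k in range(n)]

# Overall accuracy (the reviewer's definition): correctly classified area / total map area
overall = sum(diag) / sum(row_totals)

print([round(u, 3) for u in users])   # proportions on a 0-1 scale, as Reviewer #2 notes
print(round(overall, 3))
```

Note that these values come out as straight proportions on a 0-1 scale, which is exactly the mislabeling Reviewer #2 flags in the supporting data (comment 31): they must be multiplied by 100 before being reported as percentages.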
Revision 1
PONE-D-21-14058R1
Uncertainties in land use data may have substantial effects on environmental planning recommendations: a plea for careful consideration
PLOS ONE

Dear Dr. Neuendorf,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE's publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

The revisions required are minor. Please make the necessary changes or provide the required clarifications. If a particular recommendation is not feasible, please provide cogent explanations as to why.

Please submit your revised manuscript by Oct 15 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Gerald Forkuor
Academic Editor
PLOS ONE

Journal Requirements: Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article's retracted status in the References list and also include a citation and full reference for the retraction notice.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1.
If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the "Comments to the Author" section, enter your conflict of interest statement in the "Confidential to Editor" section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed
Reviewer #2: (No Response)

**********

2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data (e.g. participant privacy or use of data from a third party), those must be specified.

Reviewer #1: Yes
Reviewer #2: No

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous.
Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: No

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The revised version of the manuscript is greatly improved in terms of clarity and provides an important message for the planning community.

Reviewer #2: The authors have made a good-faith effort to take into account my concerns. My suggestions below are mainly minor, and mostly a matter of clarifying wording or terms that I find confusing, as unfortunately the revised paper is often a bit confusing.

1. In a number of locations "topographical" is used where it appears "land cover" would be a better term (topographical means the form of the land – e.g. hill, valley, etc.). See L57, L137

2. L81-85 – The meaning of this is not clear – can you rephrase?

3. L107 – Adressees – not sure what this means – do you mean "Potential users"?

4. L135 – Actuality – do you mean "currency" (as in the data is up to date)? Or something else?

5. L145 – Pictures – replace with "images"

6. Methods – do you explain how you handle your spatial overlay operations evaluating "consistency" where the pixel sizes differ? Specifically, how do you resize the pixels?

7. L191 onwards – in the results you refer to "After Walter" and "After Thiele" – I suggest making this clear by including those names in the text here, and also explaining explicitly how your methods relate to those sources – e.g. are you following their procedures, using their output, or what?

8. L215, Figure 1 and Table 1. The word "consistent" alone does not explicitly explain the meaning that I think you are applying to this term.
If I understand it correctly, you do two comparisons: an aspatial comparison of total areas mapped as a certain class, and a spatial overlay operation where you look for agreement in class labels of pixels. You appear to use "consistent" to mean the latter only. I think to make this usage clearer it would help a lot to say "consistent labels" or "consistent labeling" or "consistency in labeling of pixels" or "spatial agreement". I also think you should state in the methods that you have these two ways of evaluating agreement.

9. L221 and also Figure 2. In some cases you use the term "area" where I think you mean "class" or "label". For example, in L221 – I think the point you are making is that errors in the delineation of certain classes can have a large effect on the model outcome.

10. L227 – "this case" – do you mean "this class"? I'm also confused about "deficits" and "untapped" – it would be clearer to simply talk about "disagreements" and "differences" or "not mapped".

11. L231 – perhaps rephrase along the lines of "In contrast, the use of ATKIS data results in an underestimation of about 5.8 million tons of potential CO2 emissions compared to the reference data."

12. In the next sentence, replace with something like "This is because a total of 3,025 fewer ha were mapped as Level 1…"

13. Table 2. This is a tricky table to read and absorb. I think it would help a lot to eliminate the use of decimals for the t and ha values. This level of precision (1/100 of 1 ha!) is of little purpose when the numbers differ as much as they do (thousands to millions of ha). Important: the "Potential CO2 emission or retention" cell label for the reference Biotope needs units (presumably t).

14. L248 – add a caveat "if we assume the Biotope mapping is correct."

15. L251 – It wouldn't be potentially disregarded, but would be disregarded – i.e.
rephrase something like "This would lead to a total of 255 ha of areas important to conservation not being recognized in the planning recommendations due to the lower detail of the data used."

16. Table 3 – indicate in the table that Biotope is the reference mapping.

17. Table 3 – need to clarify in the headings that it is the "class label" that matches, or that the "class is not mapped" – rather than the area that is missed or consistent (which would mean something else).

18. L302 – thematic attributes – do you mean fewer classes?

19. L307 – thematic information is lacking – do you mean that the thematic classes do not match the classes of interest?

20. L312 – 2980 km2 – the significance of the area is hard to understand – I recommend also specifying this as a percentage.

21. L323 – easy data – do you mean "easily obtained data"?

22. L344 – albeit this would not be necessary – do you mean "that are not actually helpful based on the real land use of the area"?

23. L347 – I don't understand what this fourth constellation is – do you mean something like "when the potential effort or response required to address the issues raised by the model outcome is large, but the data uncertainty is also large"?

24. L375-6 – I can't understand this sentence.

25. L394 – I don't understand the sentence ("in how far"?) and what does "behavioral mechanisms" mean? Can you rephrase this?

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose "no", your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.
Reviewer #1: Yes: Ana Stritih

Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
| Revision 2 |
Uncertainties in land use data may have substantial effects on environmental planning recommendations: a plea for careful consideration

PONE-D-21-14058R2

Dear Dr. Neuendorf,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance.

To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible, and no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Gerald Forkuor, Ph.D.
Academic Editor
PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:
| Formally Accepted |
PONE-D-21-14058R2

Uncertainties in land use data may have substantial effects on environmental planning recommendations: a plea for careful consideration

Dear Dr. Neuendorf:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of Dr. Gerald Forkuor
Academic Editor
PLOS ONE
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.