Peer Review History

Original Submission - October 31, 2023
Decision Letter - Elias Kaiser, Editor
Transfer Alert

This paper was transferred from another journal. As a result, its full editorial history (including decision letters, peer reviews and author responses) may not be present.

PONE-D-23-35924
Computationally reproducing results from meta-analyses in Ecology and Evolutionary Biology using shared code and data
PLOS ONE

Dear Dr. Kambouris,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Feb 09 2024 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Elias Kaiser

Academic Editor

PLOS ONE

Journal Requirements:

1. When submitting your revision, we need you to address these additional requirements.

Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please update your submission to use the PLOS LaTeX template. The template and more information on our requirements for LaTeX submissions can be found at http://journals.plos.org/plosone/s/latex.

3. Did you know that depositing data in a repository is associated with up to a 25% citation advantage (https://doi.org/10.1371/journal.pone.0230416)? If you’ve not already done so, consider depositing your raw data in a repository to ensure your work is read, appreciated and cited by the largest possible audience. You’ll also earn an Accessible Data icon on your published paper if you deposit your data in any participating repository (https://plos.org/open-science/open-data/#accessible-data).

4. Thank you for stating in your Funding Statement: 

SK received support from a Melbourne Research Scholarship (https://scholarships.unimelb.edu.au/awards/melbourne-research-scholarship) and an Australian Government Research Training Program (RTP) Scholarship (https://www.education.gov.au/research-block-grants/research-training-program). FF received funding from Australian Research Council Future Fellowship FT150100297 (https://www.arc.gov.au/). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Please provide an amended statement that declares all the funding or sources of support (whether external or internal to your organization) received during this study, as detailed online in our guide for authors at http://journals.plos.org/plosone/s/submit-now.  Please also include the statement “There was no additional external funding received for this study.” in your updated Funding Statement. 

Please include your amended Funding Statement within your cover letter. We will change the online submission form on your behalf.

5. Thank you for stating the following in the Acknowledgments Section of your manuscript: 

SK received support from a Melbourne Research Scholarship and an Australian Government Research Training Program (RTP) Scholarship. FF received funding from Australian Research Council Future Fellowship FT150100297

We note that you have provided funding information that is not currently declared in your Funding Statement. However, funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form. 

Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement. Currently, your Funding Statement reads as follows: 

SK received support from a Melbourne Research Scholarship (https://scholarships.unimelb.edu.au/awards/melbourne-research-scholarship) and an Australian Government Research Training Program (RTP) Scholarship (https://www.education.gov.au/research-block-grants/research-training-program). FF received funding from Australian Research Council Future Fellowship FT150100297 (https://www.arc.gov.au/). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Please include your amended statements within your cover letter; we will change the online submission form on your behalf.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: No

Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This paper touches on an interesting topic: the reproducibility of ecological and evolutionary meta-analysis papers. The authors found that code and data sharing in ecological and evolutionary meta-analyses was alarmingly low (26/177; 15%), and that only 4-13% of papers were successfully reproduced. Generally, I like the idea of this paper; it addresses a timely topic and will generate discussion once formally released. The paper has many merits, which I will not enumerate here; instead, I will focus on my concerns and on ways the paper can be improved.

Major comments:

1. The conclusions are somewhat arbitrary and not drawn in a rigorous way. The way the success rate was calculated is quite unusual. The core conclusion is that “The low overall success rate was primarily driven by the low rate of code sharing”. This is a conclusion anyone could make without any research or analysis. How can the authors simply equate “no data and code sharing” with “not reproducible”? A proper approach is to use the subset of papers with data and code available (in this case, 26 papers) to calculate the success rate; a minimal numerical sketch of the two denominators follows these major comments. I expect this paper will reach a wide audience, and a “4-13%” computational reproducibility rate conveys misleading information to the community. Taken literally, it means that at best only 13% of meta-analyses can be reproduced using software such as R, and at worst only 4%. Do the authors believe this?

2. The authors appear to have “hacked” the abstract. The abstract is written with the intention of conveying how poor data sharing and reproducibility are in the field of ecology and evolution. Yet the authors actually found a decent data sharing rate, described as follows: “sharing rate among this sample of meta-analysis articles was 75% (133 out of 177)”. This key finding should be emphasized in the abstract. The authors state in the abstract that 26 articles (15%) were found to have obtainable data and code files. This sentence gives the audience the illusion that only 15% of meta-analyses shared data and code, when in fact only 15% shared both data and code at the same time. This is not a transparent way to report your results! Please make it clear.

3. The authors reproduced only the meta-analyses with both data and code available, and calculated the success rate from this subset. This is not the proper approach. Following the definition of computational reproducibility, the authors should attempt to reproduce the results of all meta-analyses with data available. Availability of the original code is not a criterion by which to filter the sample before calculating the success rate.
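A minimal sketch, in Python and with an assumed success count used purely as a placeholder (not a figure reported in the manuscript), of how the choice of denominator drives the reported reproducibility rate:

    # Hypothetical illustration of the denominator issue raised above.
    # n_success is an assumed placeholder, not a number from the manuscript.
    n_total = 177     # meta-analysis articles surveyed
    n_code_data = 26  # articles with both data and code obtainable
    n_success = 10    # assumed count of fully reproduced articles

    rate_vs_all = n_success / n_total         # ~6%: treats every non-sharing article as a failure
    rate_vs_subset = n_success / n_code_data  # ~38%: conditional on data and code being shared

    print(f"vs. all articles: {rate_vs_all:.0%}; vs. shared subset: {rate_vs_subset:.0%}")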

Minor concerns:

Abstract:

The last sentence does not make sense. Please conclude the paper in a broader sense and discuss the implications of the results.

Introduction:

The introduction is not well framed. It lacks a description of recent research progress in meta-science in ecology; the current version only briefly touches on, or simply mentions, meta-science in ecology and evolution. Quite a few recent works have not been properly credited. For example:

Kimmel, K., Avolio, M. L. & Ferraro, P. J. Empirical evidence of widespread exaggeration bias and selective reporting in ecology. Nature Ecology & Evolution 7, 1525-1536 (2023).

Yang, Y. et al. Publication bias impacts on effect size, statistical power, and magnitude (Type M) and sign (Type S) errors in ecology and evolutionary biology. BMC Biology 21, 1-20 (2023).

The majority of the introduction is about data and code sharing/archiving. The paper is about reproducibility, yet little of the relevant reproducibility literature is mentioned, even from other fields. Without that context, it is hard to judge the significance of the paper.

Methods:

The authors used a very outdated dataset: they searched for meta-analysis papers published between 2015 and 2017, using the Scopus abstract and citation database on 20 December 2017. I suggest the authors update their data, as open science is a fast-moving movement in ecology and evolution. What the community wants to know is the “current” reproducibility of meta-analyses, or how that reproducibility has changed over time. My intuition is that data and code sharing and reproducibility have increased considerably in meta-analyses from the past five years. As this is a meta-scientific work, the information and conclusions disseminated in it should not contain misleading elements. A reader skimming the paper will come away with the impression that the success rate of reproducing ecological and evolutionary meta-analyses is 4-13%, which is misleading both because of the way the success rate is calculated (the wrong number used as the denominator!) and because of the outdated data.

I did not get the point of the Box on “Definitions of data, code, and sharing”. Any researcher should know what data, code, and sharing are, even if they do not know the exact definitions. The purpose of a box should be to explain jargon or terminology that is unfamiliar to a general audience.

Although I tried, I could not find the code and data needed to reproduce the paper itself. Please clearly indicate where they are and provide a publicly accessible repository and link.

Results

What is the point of lines 161-163: “The practice of including some kind of supplemental information alongside a published article was very common in this sample. The vast majority (168/177, or 95%) of meta-analysis articles included some kind of supplementary or supporting document (regardless of whether or not they also shared data or code).”? Are these sentences relevant to any of the aims defined in the introduction of this paper?

As noted above, the authors should include a subsection reporting the results of attempting to reproduce the target results whenever data are available.

Discussions

The Discussion is poorly written. The proper approach is to put your results in the context of the current literature: compare your results with others, interpret them, and discuss their implications and limitations. The authors compare their results to only one paper and then hold a very general discussion on a topic that is not especially relevant to the paper, namely the widespread use of R for meta-analysis in ecology and evolution. Please rewrite the Discussion with proper comparisons.

Reviewer #2: This paper presents the results of a study of the quality of data and code archiving from a sample of meta-analyses in ecology and evolutionary biology.

Understanding the status of data and code archiving in subsets of the literature is important to building an understanding of the implementation of open science practices more generally, and is essential for identifying areas needing improvement.

The methods employed in this study appear appropriate and thorough.

The paper presents sufficient background information, describes methods in detail, provides a thorough accounting of results, and places the work in context.

I have no substantive comments about the science.

The one broad comment I wish to make is that the writing could be improved.

------------------------

comments below are organized by line number

------------------------

My first set of comments does not address the quality of the science or the reliability of the presentation (both of which are excellent), but rather the readability of the prose. I make these comments about readability in an attempt to be useful to the primary author, not to demand edits. I think your paper will be better if you take my suggestions, but please invest your time as you see fit.

I provide a few specific suggestions that I hope will serve as examples of edits that can be made throughout the text.

2: Vary your sentence structure. In this case, you start two sentences in a row with “This study”. One possible edit would be to begin the second of these sentences with “We surveyed the data…”

5: Look for superfluous words to cut. In this case, you could say “Twenty-six articles (15%) had obtainable data and code files.”

6: Use active voice. For instance, this sentence could read: “Using these data and code files, we attempted to computationally reproduce the published results.”

Seek to promote coherence between sentences. In other words, help readers link the ideas in sequential sentences, either by using transition phrases (where relevant; examples include ‘Also’, ‘In contrast’, and ‘For example’), or more commonly, by beginning each new sentence with an explicit link to the content in the prior sentence. To see an example of such a transition, see my prior comment where I suggest starting your sentence with “Using these data and code…”, which obviously is a direct link to the data and code you just mentioned in the prior sentence.

161: Adverbs used for emphasis (such as ‘very’) are typically neither necessary nor sufficient to change the reader’s understanding of whatever you’re describing. In this case, saying something was “very common” will not generate a different understanding than saying it was “common”. Instead, the “very” just clutters the sentence.

174-175: avoid including too many modifiers before the noun. Too many is typically two or more. In this case, you have “data-sharing” (2) before “articles.” This sentence would read much better as something like “The majority of articles that shared data shared some or all of the data files on the journal publisher’s website.”

Personally, I think it would be even better as something like: “The majority of articles that shared data did so on the journal publisher’s website.”

------------------------

other miscellaneous comments

51: I encourage you to use the past tense to describe your project. This is a philosophical point more than one of writing style. You planned and implemented the project in the past, and your observations were made in the past, and so the writing should reflect this.

Fig 2 caption: you may wish to avoid contractions

147: It might be helpful to readers to define these errors as percentages, and to include the ‘%’ symbol to remind readers what this number is

174: incomplete sentence

242: the ‘,’ should be replaced by a ‘;’, or else place ‘refer to the Supplementary Information (S9) for details’ in brackets.

Table 9: This table is confusing without consulting the text. The table caption needs a clear explanation of what the ‘N’ and ‘%’ columns refer to.

443 – ‘ans’ typo

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Yefeng Yang

Reviewer #2: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Attachments
Submitted filename: comments_PONE-D-23-35924.docx
Revision 1

All responses to reviewers are contained within the Cover Letter (file "Response to Reviewers.pdf"); please refer to that document.

Decision Letter - Elias Kaiser, Editor

Computationally reproducing results from meta-analyses in ecology and evolutionary biology using shared code and data

PONE-D-23-35924R1

Dear Dr. Kambouris,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Elias Kaiser

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

Reviewer #2: (No Response)

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The authors did a good job of addressing my comments. Although the analysis is not perfect, it is of publishable quality.

Reviewer #2: I have read the revised manuscript. The writing is substantially improved. I was confident in the research before, and I remain so.

I have two trivial copy edits:

453 – insert comma after ‘study’

556 – delete first “the”

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Yefeng Yang

Reviewer #2: No

**********

Formally Accepted
Acceptance Letter - Elias Kaiser, Editor

PONE-D-23-35924R1

PLOS ONE

Dear Dr. Kambouris,

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now being handed over to our production team.

At this stage, our production department will prepare your paper for publication. This includes ensuring the following:

* All references, tables, and figures are properly cited

* All relevant supporting information is included in the manuscript submission

* There are no issues that prevent the paper from being properly typeset

If revisions are needed, the production department will contact you directly to resolve them. If no revisions are needed, you will receive an email when the publication date has been set. At this time, we do not offer pre-publication proofs to authors during production of the accepted work. Please keep in mind that we are working through a large volume of accepted articles, so please give us a few weeks to review your paper and let you know the next and final steps.

Lastly, if your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

If we can help with anything else, please email us at customercare@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Elias Kaiser

Academic Editor

PLOS ONE

Open letter on the publication of peer review reports

PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.

We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.

Learn more at ASAPbio.