Peer Review History

Original Submission - June 14, 2022
Decision Letter - Niklas Bobrovitz, Editor

PONE-D-22-16736

Data sharing upon request and statistical consistency errors in Psychology: A replication of Wicherts, Bakker and Molenaar (2011)

PLOS ONE

Dear Dr. Claesen,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

==============================

Dear Authors,

Thank you for your submission. 

This is a comprehensive and well reported replication study on an important topic. 

It provides valuable information on the relevance of data sharing practices and policies with respect to the reproducibility of research results. 

The reviewers have suggested several minor revisions. Please address these. 

I also have one suggested revision. Reviewer 2 made a comment about acknowledgement of the different journals included in the replication study as this may be helpful to further provide context for the degree of inference that can be made from the replication results. As part of this acknowledgment you may wish to highlight the distinction between a "retest (direct) reproduction attempt" and an "approximate (conceptual) reproduction attempt". This is a distinction described by: Zwaan, R., Etz, A., Lucas, R., & Donnellan, M. (2018). Making replication mainstream. Behavioral and Brain Sciences, 41, E120. doi:10.1017/S0140525X17001972. It is also summarized well by Niven, D.J., McCormick, T.J., Straus, S.E. et al. Reproducibility of clinical research in critical care: a scoping review. BMC Med 16, 26 (2018). https://doi.org/10.1186/s12916-018-1018-6. 

This differentiation may help to also frame the methodological approach you have taken.

==============================

Please submit your revised manuscript by Oct 26 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Niklas Bobrovitz

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please provide additional details regarding participant consent. In the ethics statement in the Methods and online submission information, please ensure that you have specified (1) whether consent was informed and (2) what type you obtained (for instance, written or verbal, and if verbal, how it was documented and witnessed). If your study included minors, state whether you obtained consent from parents or guardians. If the need for consent was waived by the ethics committee, please include this information.

If you are reporting a retrospective study of medical records or archived samples, please ensure that you have discussed whether all data were fully anonymized before you accessed them and/or whether the IRB or ethics committee waived the requirement for informed consent. If patients provided informed written consent to have data from their medical records used in research, please include this information.

Once you have amended this/these statement(s) in the Methods section of the manuscript, please add the same text to the “Ethics Statement” field of the submission form (via “Edit Submission”).

For additional information about PLOS ONE ethical requirements for human subjects research, please refer to http://journals.plos.org/plosone/s/submission-guidelines#loc-human-subjects-research.

3. We note that the grant information you provided in the ‘Funding Information’ and ‘Financial Disclosure’ sections does not match.

When you resubmit, please ensure that you provide the correct grant numbers for the awards you received for your study in the ‘Funding Information’ section.

4. We note that you have stated that you will provide repository information for your data at acceptance. Should your manuscript be accepted for publication, we will hold it until you provide the relevant accession numbers or DOIs necessary to access your data. If you wish to make changes to your Data Availability statement, please describe these changes in your cover letter and we will update your Data Availability statement to reflect the information you provide.

5. Your ethics statement should only appear in the Methods section of your manuscript. If your ethics statement is written in any section besides the Methods, please move it to the Methods section and delete it from any other section. Please ensure that your ethics statement is included in your manuscript, as the ethics statement entered into the online submission form will not be published alongside your manuscript.

6. Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

********** 

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

********** 

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

Reviewer #2: Yes

********** 

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

********** 

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: In this submission, the authors report assessments of the reproducibility, robustness, and replicability of the association between data sharing upon request and misreported statistical results as documented by me and my co-authors in 2011. This work is meticulous, reported well, and the conclusions are clearly supported by the data and analyses. This submission is of very high quality and meets all PLOS ONE publication criteria. Just a few minor issues.

1) Neither the original study nor the replication was preregistered, while two of the studies in Nuijten et al. (2017) were preregistered. Although the multiverse analyses shed some light on the robustness of findings, the use of preregistration would have improved these studies. This could be discussed.

2) I couldn't access the OSF files without requesting access. I agree with the authors that the sharing of data on data sharing behaviours represents an ethical risk and so it is understandable that they could not put all the data in the open.

3) multiverse analyses presume that analytic choices made are mostly arbitrary. This perhaps warrants some discussion.

Signed,

Jelte Wicherts

Reviewer #2: PONE-D-22-16736

Thank you for the opportunity to review this manuscript. Replication studies, especially those regarding issues with data sharing, are both timely and important. The manuscript was well-written and replication was comprehensive while being transparent. Overall, the manuscript would benefit from minimal revisions mainly aiming at contextualizing the impact of the study.

1. One reason Wicherts et al. (2011) conducted their study is because it is an ethical imperative for psychologists to share their data upon request by other psychologists. Noting why authors' reluctance to share data and syntax/code for verifying results matters would strengthen the impact of the manuscript. From an ethics standpoint, Bosma & Granger (2022) provide a recent commentary on the ethical implications related to data sharing, which could be helpful for contextualizing why this manuscript is important.

Bosma, C. M., & Granger, A. M. (2022). Sharing is caring: Ethical implications of transparent research in psychology. American Psychologist.

2. Although it is noted in the discussion that Wicherts et al. results did not replicate with the new sample, more discussion can be provided on considering potential cohort effects. Awareness of open data practices has been growing over the past ~15 years, and researchers may be more careful about reporting results since Wicherts et al. and others began shedding more light on this issue. Though this cannot be tested given the data available to the authors, it is worth discussing.

3. More discussion should be provided regarding sampling bias. Transparency in research varies greatly between sub-disciplines in psychology, yet the authors attempted to replicate Wicherts et al. using convenience data from different journals. Further, although APA journals fall under one organization, the policies for each journal differ greatly in addition to sub-discipline norms. Greater acknowledgement of the different journals included in the replication study would be helpful to further provide context for the degree of inference that can be made from the replication results.

4. Building on comment 1, the authors can add several statements in the introduction and conclusion for why the study broadly matters. Psychologists are ethically supposed to share their data for verification as part of ensuring that what is reported is accurate. The bigger story is that psychologists are still reluctant to share their data upon request, but it is helpful to know that perhaps reluctance to share is not clearly connected with consistency/decision errors.

********** 

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Jelte Wicherts

Reviewer #2: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Attachments
Attachment
Submitted filename: CB_pone-d-22-16726_review.rtf
Revision 1

Response to reviewers

Dear editor,

Thank you for the opportunity to submit a revised version of our manuscript. With the help of the constructive comments, we improved our manuscript. We provide a detailed response to your and the reviewers’ comments below.

Academic editor: Niklas Bobrovitz

This is a comprehensive and well reported replication study on an important topic.

It provides valuable information on the relevance of data sharing practices and policies with respect to the reproducibility of research results.

Thank you for your encouraging words.

The reviewers have suggested several minor revisions. Please address these.

I also have one suggested revision. Reviewer 2 made a comment about acknowledgement of the different journals included in the replication study as this may be helpful to further provide context for the degree of inference that can be made from the replication results. As part of this acknowledgment you may wish to highlight the distinction between a "retest (direct) reproduction attempt" and an "approximate (conceptual) reproduction attempt". This is a distinction described by: Zwaan, R., Etz, A., Lucas, R., & Donnellan, M. (2018). Making replication mainstream. Behavioral and Brain Sciences, 41, E120. doi:10.1017/S0140525X17001972. It is also summarized well by Niven, D.J., McCormick, T.J., Straus, S.E. et al. Reproducibility of clinical research in critical care: a scoping review. BMC Med 16, 26 (2018). https://doi.org/10.1186/s12916-018-1018-6.

This differentiation may help to also frame the methodological approach you have taken.

Thank you for your suggestion. We removed the term “exact replication” from the manuscript. However, whether this study is a direct or conceptual replication depends on which elements of the study are seen as critical to producing the original result, according to Zwaan et al. (2018). This distinction makes sense; however, we did not have a particular a priori expectation, and even if we had, we could not claim it since we did not preregister this study. Therefore, we opted for referring to Zwaan et al. (2018) and emphasizing the similarities and differences between the studies.

We added a paragraph on limitations to the discussion section:

“In our analyses, we did not exploit researcher degrees of freedom, because we applied the same analyses as Wicherts et al. (2011), and even investigated the robustness of the outcomes to alternative analytical choices. Nevertheless, clear preregistered expectations regarding the hypotheses under test would have allowed for stronger theoretical conclusions. For instance, we did not specify whether we expected the relationship between data sharing and consistency errors to change over time or whether the method by which consistency errors are retrieved would matter. In other words, there are no critical elements predefined under which the hypotheses would theoretically hold. This complicates the distinction between direct and conceptual replication (Zwaan et al., 2018), even though we tried to stay as close as possible to the original methods.

In the following subsections, we delve into two major differences between the original and the replication study that can explain the differences in results. First, the two samples differed in the papers they included, and are possibly subject to sampling bias. Second, there were differences between the detection and evaluation of triplets of the test statistic, the degrees of freedom, and the p-value.”

Reviewer 1: Jelte Wicherts

In this submission, the authors report assessments of the reproducibility, robustness, and replicability of the association between data sharing upon request and misreported statistical results as documented by me and my co-authors in 2011. This work is meticulous, reported well, and the conclusions are clearly supported by the data and analyses. This submission is of very high quality and meets all PLOS ONE publication criteria. Just a few minor issues.

We wish to thank Dr. Wicherts for the constructive comments. We address the issues raised below.

1) Neither the original study nor the replication was preregistered, while two of the studies in Nuijten et al. (2017) were preregistered. Although the multiverse analyses shed some light on the robustness of findings, the use of preregistration would have improved these studies. This could be discussed.

We added a paragraph to the discussion section, which contains the following on preregistration: “In our analyses, we did not exploit researcher degrees of freedom, because we applied the same analyses as Wicherts et al. (2011), and even investigated the robustness of the outcomes to alternative analytical choices. Nevertheless, clear preregistered expectations regarding the hypotheses under test would have allowed for stronger theoretical conclusions.”

2) I couldn't access the OSF files without requesting access. I agree with the authors that the sharing of data on data sharing behaviours represents an ethical risk and so it is understandable that they could not put all the data in the open.

Thank you for checking; something went wrong with including the links. This problem should be fixed now. Note that, in the submission form, we stated that we will provide repository information for the data at acceptance. The OSF project is still private, and you need a separate link (provided in the manuscript) to access each component.

3) multiverse analyses presume that analytic choices made are mostly arbitrary. This perhaps warrants some discussion.

In the introduction we removed the following: “We then continue with an investigation of the robustness of their results. Because their analysis, like all statistical analyses, featured a number of arbitrary decisions, we carried out several alternative, analytical approaches to evaluate the robustness of the results against other justifiable analytical decisions (i.e., a multiverse analysis; see Steegen et al., 2016).”

We replaced it with (and added a reference): “Because there are often numerous viable ways to conduct a data analysis (i.e., researcher degrees of freedom, Wicherts et al., 2016), any data analysis which reports a single pathway provides an incomplete picture. We continue with an investigation of the robustness of their results against other reasonable analytical decisions, by carrying out several alternative analytical approaches (i.e., a multiverse analysis; see Steegen et al., 2016).”

Reviewer 2

Thank you for the opportunity to review this manuscript. Replication studies, especially those regarding issues with data sharing, are both timely and important. The manuscript was well-written and replication was comprehensive while being transparent. Overall, the manuscript would benefit from minimal revisions mainly aiming at contextualizing the impact of the study.

We wish to thank Reviewer 2 for the constructive comments. We address their suggestions below.

1. One reason Wicherts et al. (2011) conducted their study is because it is an ethical imperative for psychologists to share their data upon request by other psychologists. Noting why authors' reluctance to share data and syntax/code for verifying results matters would strengthen the impact of the manuscript. From an ethics standpoint, Bosma & Granger (2022) provide a recent commentary on the ethical implications related to data sharing, which could be helpful for contextualizing why this manuscript is important.

Bosma, C. M., & Granger, A. M. (2022). Sharing is caring: Ethical implications of transparent research in psychology. American Psychologist.

Thank you for the suggested reference. We included it in the beginning of the introduction: “A major benefit of sharing research data is that it allows the scientific community to verify the empirical results and scientific findings, and to further build upon the published work. Moreover, as Bosma and Granger (2022) showed, there are also ethical implications of data sharing. Despite all these benefits, Wicherts et al. (2006) found that researchers are not keen on sharing their data: when they requested data for reanalysis to investigate the influence of outliers, they received data from only 27% of the contacted authors.”

We further emphasized the ethical implication in the discussion. We added a sentence regarding the original study by Wicherts et al. (2011): “Their study was of great importance, not only because of the relationship they found, but also because of the emphasis it put on the ethical imperative for open data.”

2. Although it is noted in the discussion that Wicherts et al. results did not replicate with the new sample, more discussion can be provided on considering potential cohort effects. Awareness of open data practices has been growing over the past ~15 years, and researchers may be more careful about reporting results since Wicherts et al. and others began shedding more light on this issue. Though this cannot be tested given the data available to the authors, it is worth discussing.

We could indeed have discussed this a bit more. We adapted the discussion section (see our reply to the next comment), and added: “Awareness of (the need for) transparent research practices has been growing in the past years, so the replication sample might show a cohort effect.”

3. More discussion should be provided regarding sampling bias. Transparency in research varies greatly between sub-disciplines in psychology, yet the authors attempted to replicate Wicherts et al. using convenience data from different journals. Further, although APA journals fall under one organization, the policies for each journal differ greatly in addition to sub-discipline norms. Greater acknowledgement of the different journals included in the replication study would be helpful to further provide context for the degree of inference that can be made from the replication results.

This is a good point, although we do not believe that the replication data are the result of convenience sampling. Vanpaemel et al. (2015) contacted authors in the same way as Wicherts et al. (2011) in similar APA journals.

In the discussion section, we added: “In the following subsections, we delve into two major differences between the original and the replication study that can explain the differences in results. First, the two samples differed in the papers they included, and are possibly subject to sampling bias. Second, there were differences between the detection and evaluation of triplets of the test statistic, the degrees of freedom, and the p-value.”

In our manuscript, we already specified the differences in topics. Based on this comment, we also acknowledge that the specific journal policies could have differed at the time. We added the following to the subsection about the papers included in the samples: “Besides topics, the specific journal policies (including guidelines regarding data sharing) probably differed between journals.”

4. Building on comment 1, the authors can add several statements in the introduction and conclusion for why the study broadly matters. Psychologists are ethically supposed to share their data for verification as part of ensuring that what is reported is accurate. The bigger story is that psychologists are still reluctant to share their data upon request, but it is helpful to know that perhaps reluctance to share is not clearly connected with consistency/decision errors.

We added the following to the conclusions: “Overall, in our work, we did not find evidence for a strong, general relationship between sharing research data upon request and consistency (and decision) errors. However, this should not lead to the conclusion that sharing research data does not matter; on the contrary. Without raw data, detectable errors in a paper can have many possible underlying causes. For a thorough verification of the reported results, raw data are necessary.”

Attachments
Attachment
Submitted filename: Response to reviewers.docx
Decision Letter - Niklas Bobrovitz, Editor

Data sharing upon request and statistical consistency errors in Psychology: A replication of Wicherts, Bakker and Molenaar (2011)

PONE-D-22-16736R1

Dear Dr. Claesen,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Niklas Bobrovitz

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Formally Accepted
Acceptance Letter - Niklas Bobrovitz, Editor

PONE-D-22-16736R1

Data sharing upon request and statistical consistency errors in Psychology: A replication of Wicherts, Bakker and Molenaar (2011)

Dear Dr. Claesen:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Niklas Bobrovitz

Academic Editor

PLOS ONE

Open letter on the publication of peer review reports

PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.

We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.

Learn more at ASAPbio.