Peer Review History

Original Submission: April 21, 2023
Decision Letter - Simon Porcher, Editor

PONE-D-23-09667

Care to share? Experimental evidence on code sharing behavior in the social sciences

PLOS ONE

Dear Dr. Krähmer,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. Overall, the reviewers are very positive about your work, but they ask for some additional precision or alternative presentations of the results to clarify certain points.

Please submit your revised manuscript by Jul 01 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Simon Porcher

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. We note that you have stated that you will provide repository information for your data at acceptance. Should your manuscript be accepted for publication, we will hold it until you provide the relevant accession numbers or DOIs necessary to access your data. If you wish to make changes to your Data Availability statement, please describe these changes in your cover letter and we will update your Data Availability statement to reflect the information you provide.

3. Thank you for stating the following in the Acknowledgments Section of your manuscript: 

"This research has been generously funded by the German Research Foundation through the Priority Program META-REP (Project 464507200). We thank Katrin Auspurg, who has been intimately involved in planning and conducting the experiment. Thanks to Richard Vielberg for superb research assistance. Lastly, we are indebted to all researchers who participated in our experiment and went to great lengths to share their research code with us"

We note that you have provided funding information that is not currently declared in your Funding Statement. However, funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form. 

Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement. Currently, your Funding Statement reads as follows: 

"This research has been funded by the German Research Foundation (www.dfg.de/en) through the Priority Program META-REP (Project 464507200). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript."

Please include your amended statements within your cover letter; we will change the online submission form on your behalf.

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This is a review for "Care to share? Experimental evidence on code sharing behavior in the social sciences." The submission contains a 2x4x2 audit-style nudge experiment designed to elicit the reasons why authors of research using observational data do or do not share their code.

My recommendation for this experiment is a positive revise and resubmit. I consider the questions addressed (the sharing of code, as distinct from data sharing and replicability) and the methodology (audit study, causal interpretation, multiple treatment arms, sufficient power) to be of acceptable quality. I find that solely targeting research articles using a single dataset is an important innovation, and a sample size of 1206 responses is large. I am further encouraged by the fact that this analysis was pre-registered.

My comments below are split into major and minor.

Major Comments

1. Please include a discussion of the duplicitous researcher. If I am a duplicitous researcher and I receive an email (as described) stating the following: "our project aims to assess the reproducibility of randomly selected articles from the European Social Survey’s bibliographic database. Would you mind sharing your code with us to make sure your article can be included in our analysis?", I will simply ignore it. What implications does this have for the paper's results? To my reading, it means that those who reply are more likely to be honest researchers. The estimated nudge effects are then lower bounds, which would tighten should the proportion of duplicitous researchers (who could never be nudged) shrink in the future.

2. While Table D of the supplementary information is almost comprehensive, I would like to see the following: line 234 mentions a strong assumption: "Researchers who promised to share their code but failed until our deadline were coded as unwilling to share." Are the results robust to their exclusion? Are the results robust to coding them as 1? This proportion of researchers seemed non-trivial.

3. Please answer: does the response rate differ by treatment? I understand that the final dummy variable is 0 when the code was not shared and 1 when it was, but the response rate could tell the reader something more. The data is already well anonymized, so I do not see this posing a privacy issue for this manuscript's authors.

4. What proportion of authors needed clarification about what research code is (line 268)? Was this extra coaching applied evenly across treatments?

5. There is an assumption being made here of no interactivity between nudges. I would like to see a simple regression examining this, using interaction effects. A simple linear probability model would suffice, in my opinion.
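As a sketch of what such a check could look like, a linear probability model with an interaction term can be fit by ordinary least squares. The data and variable names below are simulated and purely illustrative (they are not the authors' actual treatment coding); the coefficient on the interaction column is the quantity of interest.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1206  # matches the reported number of responses

# Two illustrative binary nudge indicators (names are hypothetical)
anonymity = rng.integers(0, 2, n)
framing = rng.integers(0, 2, n)
# Simulated outcome: baseline 35% sharing, +5pp for the framing nudge,
# no true interaction built in
shared = (rng.random(n) < 0.35 + 0.05 * framing).astype(float)

# Design matrix: intercept, main effects, and the interaction term
X = np.column_stack([np.ones(n), anonymity, framing, anonymity * framing])
beta, *_ = np.linalg.lstsq(X, shared, rcond=None)
print(dict(zip(["const", "anonymity", "framing", "anonymity:framing"],
               beta.round(3))))
```

A near-zero interaction coefficient would be consistent with the no-interactivity assumption; in a real analysis one would also want heteroskedasticity-robust standard errors, since the outcome is binary.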

6. The post-hoc achieved power for the framing treatment is 62.7%. However, this includes non-response. From G*Power we have...

z tests - Proportions: Difference between two independent proportions

Analysis:  Post hoc: Compute achieved power

Input:     Tail(s) = Two
           Proportion p2 = 0.409
           Proportion p1 = 0.340
           α err prob = 0.05
           Sample size group 1 = 508
           Sample size group 2 = 520

Output:    Critical z = 1.9599640
           Power (1-β err prob) = 0.6277340

This is for the framing statistical test. I would like to see calculations of post-hoc achieved power for the remaining hypotheses.
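The G*Power figure above can be approximately reproduced with Python's standard library. The sketch below uses Cohen's h (the arcsine-transformed difference in proportions) as the effect size; the function name is mine, and the result agrees with G*Power's 0.6277 to three decimal places under these inputs.

```python
from math import asin, sqrt
from statistics import NormalDist

def posthoc_power(p1, p2, n1, n2, alpha=0.05):
    """Post-hoc power for a two-sided, two-proportion z test."""
    # Cohen's h: arcsine-transformed difference in proportions
    h = 2 * asin(sqrt(p2)) - 2 * asin(sqrt(p1))
    # Noncentrality: effect size scaled by the standard-error factor
    ncp = abs(h) / sqrt(1 / n1 + 1 / n2)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)  # 1.9599640 for alpha=0.05
    # Two-sided power: mass beyond either critical value
    return (1 - NormalDist().cdf(z_crit - ncp)) + NormalDist().cdf(-z_crit - ncp)

print(round(posthoc_power(0.340, 0.409, 508, 520), 4))  # ≈ 0.6278
```

Rerunning the same function with the proportions and group sizes of the other treatment arms would give the requested remaining power calculations.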

Minor Comments

1. I downloaded the code and data provided by the authors and was able to successfully replicate the figures and tables. The data has a last-modified date of 2023-03-30. I then followed up on the pre-registration at https://osf.io/bqjcz. The registration was created on 2022-07-06. The only unfortunate thing I can note (and this is by no means unique to this project) is that it is not possible for the reviewer to verify, from the replication package as is, that the registration was done prior to sample collection. To this end, I must attach less meaning to the pre-registration than I otherwise would. This can be remedied easily. Line 188 of the manuscript does mention that data collection began on 2022-07-06. I would appreciate some method of verifying that the emails/replies were sent/received after the pre-registration was released.

2. Quotation marks, particularly opening quotation marks, are often "backwards".

3. Line 13: I believe the authors mean "on the authors' side". Typo on line 303: "a long time ago".

4. Line 20: The following should be mentioned and cited here, as it is a well-known and well-publicized example of this:

Silberzahn, R., & Uhlmann, E. L. (2015). Crowdsourced research: Many hands make tight work. Nature, 526(7572), 189–191.

Silberzahn, R., Uhlmann, E. L., Martin, D. P., Anselmi, P., Aust, F., Awtrey, E., ... Nosek, B. A. (2018). Many analysts, one data set: Making transparent how variations in analytic choices affect results. Advances in Methods and Practices in Psychological Science, 1(3), 337–356.

Reviewer #2: This is an interesting and very innovative paper on an important topic. In a large randomized field experiment, the authors investigate how they can get researchers to share their analysis code upon request. Very neatly, the authors only contact researchers who have worked with data from the European Social Survey, which means that data sharing is not an issue. That’s really great! In this pre-registered experiment, the authors randomly vary various aspects of the code requests and largely find null effects of their treatments, with the exception of a positive effect of negative wording alluding to the replication crisis, which ran against the authors' prior.

I am very positive about this paper. It was great to see the actual emails sent out, as this makes it clear to me that the message must have come across to those who read the email. Most of my comments are minor and are written below in order of appearance in the paper:

p.2: You write “In practice, replications hinge on the availability of an original study’s code and data.” I would argue that this is about reproducibility, and that replication involves new data. I think this is how most psychologists would define replication (including the work of Brian Nosek and others). Some discussion of reproducibility vs replicability might make sense to include.

p.2: When writing about publishing research data as a step towards more open science, I guess this also depends on how “true” we think the data is, in terms of whether we only get to see the p-hacked data or not (i.e., outcome variables that were not significant are not included, etc.).

p.3, first paragraph in “state of the art”: Here it could be interesting to note that journals increasingly have Data Editors (at least in economics and political science) who go through the data and code to make sure that the code runs and leads to the same results as in the paper. For example, the journals of the American Economic Association only conditionally accept papers before the data editors have approved the data and code, which timing-wise makes sense.

Theoretical background: Do you consider all of these framing nudges? I personally do not really see the need to call the different versions in your experiment nudges, but this is of course up to you. When citing references 49–51, it looks in the main text like those studies were on code-sharing public goods games (PGGs), which is not the case, since they are on standard ones; perhaps that could be clarified?

In relation to H2.3: Perhaps a bit related to this is the paper by Christensen et al. who find that data sharing is positively related to citations (Christensen G, Dafoe A, Miguel E, Moore DA, Rose AK (2019) A study of the impact of data sharing on article citations using journal policies as a natural experiment. PLoS ONE, 14(12): e0225883. https://doi.org/10.1371/journal.pone.0225883).

p.7: I very much appreciate the transparency with which deviations from the pre-analysis plan are listed. From my reading, they are pretty minor, which was reassuring; but when reading the main text I didn’t realize this, so I was initially a bit more worried (perhaps due to having seen so many papers in the past with serious deviations that are neither justified nor discussed). Perhaps this could be clarified in a footnote in the main text where the deviations are very briefly described?

p.9: Given potential power problems, why did you decide to have so many treatments? Some more discussion here would be great.

p.11, line 262: What is the size of this share?

p.13, line 319: Maybe I misunderstand something, but a z-value of 2.287 should have a low p-value; is there an error here? (And maybe elsewhere? I haven’t checked the other ones.) The null hypothesis is no effect, so even though the result is in the wrong direction, it is an effect that is statistically significant.
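For reference, the two-sided p-value implied by the reviewer's z-value can be checked against the standard normal CDF (Python standard library), confirming that it falls below the conventional 0.05 threshold:

```python
from statistics import NormalDist

# Two-sided p-value for the z-value quoted by the reviewer
z = 2.287
p = 2 * (1 - NormalDist().cdf(abs(z)))
print(round(p, 4))  # ≈ 0.0222
```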

**********

6. PLOS authors have the option to publish the peer review history of their article. If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Revision 1

Dear Editors,

We thank you and the two anonymous reviewers for the helpful comments, which substantially improved our manuscript. We have edited our submission to address all suggestions and concerns that were raised during the review process.

Our resubmission includes a detailed point-by-point response.

We hope the revised manuscript is now suitable for publication in PLOS ONE and are very much looking forward to your decision.

Thank you and best wishes also on behalf of my co-authors,

Daniel Krähmer

Attachments

Submitted filename: Rebuttal.pdf
Decision Letter - Simon Porcher, Editor

Care to share? Experimental evidence on code sharing behavior in the social sciences

PONE-D-23-09667R1

Dear Dr. Krähmer,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Simon Porcher

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Formally Accepted
Acceptance Letter - Simon Porcher, Editor

PONE-D-23-09667R1

Care to share? Experimental evidence on code sharing behavior in the social sciences

Dear Dr. Krähmer:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Pr. Simon Porcher

Academic Editor

PLOS ONE

Open letter on the publication of peer review reports

PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.

We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.

Learn more at ASAPbio.