Peer Review History
Original Submission: September 23, 2019
PONE-D-19-26742
Economically rational sample-size choice and irreproducibility
PLOS ONE

Dear Dr. Braganza,

I write to you in regards to the manuscript PONE-D-19-26742 entitled “Economically rational sample-size choice and irreproducibility” which you submitted to PLOS ONE. I have solicited advice from two expert Reviewers, who have returned the reports shown below. The two reviewers provide positive recommendations, but both agree that the paper would benefit from a round of revisions before acceptance.

Reviewer #1 suggests that “the manuscript would be stronger if more specific, concrete, prospective predictions were made” and that “it would have been interesting to explore the impact of modifying the input parameters on the predicted outputs of the model”. I encourage you to explore some of the modifications proposed by the reviewer. Reviewer #2 also suggests going beyond the analysis provided in the current version of the paper and exploring some modifications of the model. Additionally, (s)he raises a number of minor issues that could strengthen the paper.

Based on the Reviewers' reports and my own reading of the paper, I came to the decision to offer you the opportunity to revise the manuscript. If you decide to prepare a substantially revised version of the paper, please provide a detailed response to both Reviewers regarding how you have addressed their concerns. If you resubmit, I would ask the same two Reviewers to review the paper again.

We would appreciate receiving your revised manuscript by Jan 25 2020 11:59PM. When you are ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.
To enhance the reproducibility of your results, we recommend that, if applicable, you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols Please include the following items when submitting your revised manuscript:
Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

We look forward to receiving your revised manuscript.

Kind regards,
Luis M. Miller, Ph.D.
Academic Editor
PLOS ONE

Journal requirements: When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at http://www.journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and http://www.journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please consider changing the title so as to meet our title format requirement (https://journals.plos.org/plosone/s/submission-guidelines). In particular, the title should be "Specific, descriptive, concise, and comprehensible to readers outside the field" and in this case it is not informative and specific about your study's scope and methodology.

3. Please do not include funding sources in the Acknowledgments or anywhere else in the manuscript file. Funding information should only be entered in the financial disclosure section of the submission system. https://journals.plos.org/plosone/s/submission-guidelines#loc-acknowledgments

4. Please upload a copy of Figure 4, to which you refer in your text on page 6. If the figure is no longer to be included as part of the submission please remove all reference to it within the text.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?
The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics.
(Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The author presents a relatively simple model, whereby scientists choose “economically rational” sample sizes on the basis of the trade-off between the costs of collecting data (with larger studies being more expensive) and the value of publications (i.e., resulting grant income). The author acknowledges the relative simplicity of the model, but suggests that the model performs reasonably well in terms of predicting the distribution of statistical power in a way that mirrors empirical efforts to estimate this.

The work is a valuable addition to the literature, and complements previous efforts that have taken a similar approach (i.e., using modelling) to understand the impact of incentive structures on the behaviour of scientists, and the resulting quality of published research. However, I felt that (notwithstanding the virtues of simplicity) it would have been interesting to explore the impact of modifying the input parameters on the predicted outputs of the model (i.e., on scientists’ behaviour). For example, the drivers of small sample size in the model are low base probability, small effect size, and/or low grant income per publication. One could argue that at least two of these are shaped by funding agencies – low base probability may result from an emphasis on novelty and “groundbreaking” research in funding applications, whilst a low grant income per publication may result from low funding rates. The latter, in particular, varies considerably (from ~5% to ~40% across countries), although I am not aware of any studies that have examined the relationship between funding rate and power distribution. Similarly, certain fields may be more likely to have a high base rate probability than others – clinical trials of medical interventions, for example, represent the culmination of a long process of discovery, experimentation, validation, etc.
Indeed, the principle of equipoise should suggest that only 50% of trials should demonstrate a benefit for the intervention over the comparator (which is borne out by empirical work). What would the model predict should be the difference between research of this kind compared with more blue-sky discovery research where the base probability might be considerably lower?

In other words, for these models to be more than useful descriptions they should provide us with insights into how changing incentive structures might shape scientists’ behaviour, and the resulting quality of research. There may be opportunities here to make prospective (perhaps even quantitative) predictions that could then be tested – either as part of this paper, if the author has the resources to do so, or in future studies. The author briefly touches on these issues, but I think the manuscript would be stronger if more specific, concrete, prospective predictions were made.

Reviewer #2: This is a very timely and important article. A few minor revisions could improve the manuscript.

• “IF is a positive constant reflecting mean grant income per publication.” Can we also think about this parameter as a reflection of the “cost to collect data”? If “IF” is large, this translates to the potential profit being higher. Does this imply that the relative cost per sample is small? Conversely, if “IF” is small, does this translate to a situation where data is relatively expensive? Can you discuss how your results reflect on fields where collecting data is expensive versus fields where collecting data is relatively inexpensive? On a related note, see Sassenberg et al. (2019) who conclude that a journal’s “demand for higher statistical power [...] evoked strategic responses among researchers. [...] [R]esearchers used less costly means of data collection, namely, more online studies and less effortful measures.”

• Consider exploring (or at the very least commenting on) a nonlinear cost for sample size.
In reality, the cost of increasing one’s sample size from 10 to 20 might be different than increasing the sample size from 100 to 110.

• “Which sample sizes are scientifically ideal is of course a complex question in itself, and will depend not only on the cost of sampling but also on the scientific values of true and false positives as well as true and false negatives.” I suggest a comment or a reference on the meta-analytic impact of studies with low power. Consider for example the conclusions of Stanley et al. (2017) in “Finding the power to reduce publication bias.”

• “For instance, Campbell and Gustafson [43] propose conditional equivalence testing to increase the publication value of negative findings.” I suspect that if, in your model, both positive and negative findings were given equal value, the optimal sample size would still be very low. If, regardless of the outcome, the study will be published and the “IF” received, won’t it be optimal to conduct a large number of very small studies? With this in mind, could you elaborate on “increase the publication value of negative findings”? In other words, for the equation “Profit = IF * TPR – s”, what could we consider for replacing the TPR term? How should we be compensating researchers for their work? While I understand that your model is only an approximation of the complicated research economy, can you point to any alternatives to giving researchers a certain amount of grant money per (positive) publication?

Small things:

• “For simplicity we assume they receive funding only, if they publish and they can publish only positive results.” This is an important idea and it needs to be made crystal clear. Please consider rewriting this and perhaps elaborating.

• “To compute the implied distribution of emergent power and positive predictive values, the corresponding values for each ESS were weighted by its TPR/ESS.” This is another very important idea.
I suggest writing out an example to make sure the reader understands. For instance: “For example, with the same total amount of resources at hand, a researcher could conduct 10 small studies with a sample size of 10 or 2 large studies with a sample size of 50, … the emergent studies (i.e., the published literature) will then have….”

• “Two, now prominent, studies from the pharmaceutical industry suggested reproducibility rates of 11 and 22% [1,2, respectively].” Consider adding here a comment/reference to Johnson et al. (2017) “On the Reproducibility of Psychological Science.”

• Fix punctuation and spacing: “chosen for a set of parameters (b, d, IF , see table1)”

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Marcus Munafo
Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files to be viewed.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool.
If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org. Please note that Supporting Information files do not need this step.
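The profit function that Reviewer #2 quotes, Profit = IF * TPR − s, can be sketched numerically. The Python snippet below is an illustrative reconstruction, not the manuscript's actual code: the normal-approximation power formula, the simplification TPR = base probability b × power, and all parameter values (b, d, IF) are assumptions made here purely to show the qualitative behaviour the reviewers discuss.

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power(n, d, z_crit=1.96):
    """Approximate power of a two-sample z-test with n subjects per group,
    standardized effect size d, two-sided alpha = 0.05 (z_crit = 1.96).
    Normal approximation, for illustration only."""
    return normal_cdf(d * math.sqrt(n / 2.0) - z_crit)

def expected_profit(n, b, d, IF):
    """Profit = IF * TPR - s, taking sample size n as the cost s and
    approximating TPR as base probability b times power (an assumption)."""
    return IF * b * power(n, d) - n

def optimal_n(b, d, IF, n_max=500):
    """Sample size (per group) that maximizes expected profit."""
    return max(range(2, n_max + 1), key=lambda n: expected_profit(n, b, d, IF))

# Hypothetical parameters: base probability b = 0.1, effect size d = 0.5,
# grant income per publication IF expressed in cost-per-subject units.
n_rich = optimal_n(b=0.1, d=0.5, IF=1000)
n_poor = optimal_n(b=0.1, d=0.5, IF=300)
print(n_rich, power(n_rich, 0.5))  # rational n and its power under high IF
print(n_poor, power(n_poor, 0.5))  # rational n and its power under low IF
```

Under these assumed parameters the profit-maximizing sample size yields power well below the conventional 0.8 threshold, and shrinking IF shrinks the rational sample size further, matching the reviewer's point that low grant income per publication drives small studies.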
Revision 1
A simple model suggesting economically rational sample-size choice drives irreproducibility
PONE-D-19-26742R1

Dear Dr. Braganza,

We are pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it complies with all outstanding technical requirements. Within one week, you will receive an e-mail containing information on the amendments required prior to publication. When all required modifications have been addressed, you will receive a formal acceptance letter and your manuscript will proceed to our production department and be scheduled for publication.

Shortly after the formal acceptance letter is sent, an invoice for payment will follow. To ensure an efficient production and billing process, please log into Editorial Manager at https://www.editorialmanager.com/pone/, click the "Update My Information" link at the top of the page, and update your user information. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to enable them to help maximize its impact. If they will be preparing press materials for this manuscript, you must inform our press team as soon as possible and no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

With kind regards,
Luis M. Miller, Ph.D.
Academic Editor
PLOS ONE

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1.
If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed
Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous.
Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The author has addressed all the comments raised in the previous round; I have no further suggestions for improving the manuscript.

Reviewer #2: Two very minor suggestions:

Line 280 – “In the following, we explore the effects of CET given either = 0.5d or = d (Fig. 4).” You should add a note to clarify that you have alpha = 0.05 for both the testing of the null and the testing of equivalence. (Campbell & Gustafson suggest that other options could be considered.) Also, perhaps you might want to add that there is a one-to-one correspondence between the alpha and the width of the equivalence margin (inverse relationship).

Line 486 – “have explored the the effect of various” Remove second “the”

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Marcus Munafo
Reviewer #2: No
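Reviewer #2's note about using alpha = 0.05 for both the null test and the equivalence test can be illustrated with a generic conditional equivalence testing (CET) decision rule in the spirit of Campbell and Gustafson. The sketch below is an assumption-laden illustration, not the paper's implementation: it assumes a z-statistic, a two-sided alpha = 0.05 for the null test, one-sided alpha = 0.05 for each of the two one-sided equivalence tests (TOST), and a user-supplied equivalence margin.

```python
def cet_decision(estimate, se, margin, z_null=1.96, z_equiv=1.645):
    """Conditional equivalence testing, sketched (hypothetical rule).

    1. Test the null hypothesis of no effect (two-sided, alpha = 0.05,
       critical value z_null = 1.96). If rejected: 'positive'.
    2. Otherwise run two one-sided tests (TOST) against the equivalence
       margin, each at alpha = 0.05 (z_equiv = 1.645). If the effect is
       significantly inside (-margin, +margin): 'negative (equivalent)'.
    3. Otherwise the study is 'inconclusive'.
    """
    if abs(estimate / se) > z_null:
        return "positive"
    within_lower = (estimate + margin) / se > z_equiv
    within_upper = (estimate - margin) / se < -z_equiv
    if within_lower and within_upper:
        return "negative (equivalent)"
    return "inconclusive"

# Illustrative calls; all numbers are hypothetical:
print(cet_decision(1.00, 0.20, margin=0.5))  # clear effect
print(cet_decision(0.05, 0.10, margin=0.5))  # precise estimate near zero
print(cet_decision(0.05, 1.00, margin=0.5))  # imprecise estimate
```

Widening the margin, or raising the alpha used for the equivalence tests, makes the publishable "negative (equivalent)" outcome easier to reach, which is the alpha-versus-margin trade-off the reviewer alludes to.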
Formally Accepted
PONE-D-19-26742R1
A simple model suggesting economically rational sample-size choice drives irreproducibility

Dear Dr. Braganza:

I am pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please notify them about your upcoming paper at this point, to enable them to help maximize its impact. If they will be preparing press materials for this manuscript, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

For any other questions or concerns, please email plosone@plos.org.

Thank you for submitting your work to PLOS ONE.

With kind regards,
PLOS ONE Editorial Office Staff
on behalf of Dr. Luis M. Miller
Academic Editor
PLOS ONE
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.