Peer Review History

Original Submission: June 28, 2020
Decision Letter - Jonathan Jong, Editor

PONE-D-20-19925

Honest signaling in academic publishing

PLOS ONE

Dear Dr. Tiokhin,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

As you can see, the paper received two divergent reviews, though both were fundamentally positive about the paper. I too am fundamentally positive about the paper, but some of R1's concerns about the assumptions of the model did occur to me too. However, I am not suggesting that you change your assumptions, merely that you acknowledge their limitations. Of course, you may be persuaded by R1's challenge: in which case, by all means change your assumptions and modify the model--or better yet, run multiple models with slightly different assumptions. I will probably not send the next version of this paper out for further review, but will make the decision myself.

Please submit your revised manuscript by Nov 30 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,

Jonathan Jong, PhD

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2.  Thank you for stating the following in the Competing Interests section:

"The authors have declared that no competing interests exist.".

We note Simine Vazire's PLOS Board of Directors membership.

Please confirm that this does not alter your adherence to all PLOS ONE policies on sharing data and materials, by including the following statement: "This does not alter our adherence to PLOS ONE policies on sharing data and materials." (as detailed online in our guide for authors http://journals.plos.org/plosone/s/competing-interests). If there are restrictions on sharing of data and/or materials, please state these. Please note that we cannot proceed with consideration of your article until this information has been declared.

Please include your updated Competing Interests statement in your cover letter; we will change the online submission form on your behalf.

Please know it is PLOS ONE policy for corresponding authors to declare, on behalf of all authors, all potential competing interests for the purposes of transparency. PLOS defines a competing interest as anything that interferes with, or could reasonably be perceived as interfering with, the full and objective presentation, peer review, editorial decision-making, or publication of research or non-research articles submitted to one of the journals. Competing interests can be financial or non-financial, professional, or personal. Competing interests can arise in relationship to an organization or another person. Please follow this link to our website for more details on competing interests: http://journals.plos.org/plosone/s/competing-interests

3. Thank you for stating the following in the Acknowledgments Section of your manuscript:

"LT and DL were supported by the Netherlands

Organization for Scientific Research (NWO) VIDI grant 452-17-01. KZ was supported by the

National Science Foundation (NSF) grant SES 1254291. The funders had no role in any aspects

of this study, the preparation of the manuscript, or the decision to publish."

We note that you have provided funding information that is not currently declared in your Funding Statement. However, funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form.

Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement. Currently, your Funding Statement reads as follows:

"The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript."

Please include your amended statements within your cover letter; we will change the online submission form on your behalf.

4. In your Data Availability statement, you have not specified where the minimal data set underlying the results described in your manuscript can be found. PLOS defines a study's minimal data set as the underlying data used to reach the conclusions drawn in the manuscript and any additional data required to replicate the reported study findings in their entirety. All PLOS journals require that the minimal data set be made fully available. For more information about our data policy, please see http://journals.plos.org/plosone/s/data-availability.

Upon re-submitting your revised manuscript, please upload your study’s minimal underlying data set as either Supporting Information files or to a stable, public repository and include the relevant URLs, DOIs, or accession numbers within your revised cover letter. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories. Any potentially identifying patient information must be fully anonymized.

Important: If there are ethical or legal restrictions to sharing your data publicly, please explain these restrictions in detail. Please see our guidelines for more information on what we consider unacceptable restrictions to publicly sharing data: http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions. Note that it is not acceptable for the authors to be the sole named individuals responsible for ensuring data access.

We will update your Data Availability statement to reflect the information you provide in your cover letter.

5. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: N/A

Reviewer #2: N/A

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This paper discusses an issue in academic publishing – the information asymmetry between authors and editors – and considers how the logic of costly signaling theory can be applied to ensure that only high-quality papers are submitted to high-quality journals. Their model considers two types of papers and two types of journals: high quality and low quality. It argues that in the absence of costs, authors will always be incentivized to submit all papers to high-quality journals, but that low-quality papers being reviewed by or published in high-quality journals is a problem. The authors consider how costs to submit papers can change the decision calculus so that rational authors will only submit high-quality papers to high-quality journals.

I found the paper to be an interesting exercise, but I wasn’t particularly convinced by its arguments. The model is very simple and relies on a number of assumptions that either rarely hold or, in the case of submission costs, *always* hold, raising questions about the conclusions one can draw. I found the discussion around the model to be cursory and at times naïve, with some occasionally problematic recommendations. That said, I thought the approach was interesting, and I found no major errors in the analysis. Connecting signaling theory to academia is a good idea, and I found there to be useful food for thought in this paper. It could also provide the foundation for more detailed modeling work on the subject. Given the broad appeal of the topic, it seems like PLOS ONE could provide a good home for the paper, pending some consideration of the comments below.

I first want to talk about the basic assumptions of the model. The model assumes that

1) Scientists should be honest

2) Scientists can accurately assess the quality of their research

3) Journal rank is a mark of quality, not specificity.

4) All research must be submitted as a paper.

The focus is on (1), and on mechanisms to enforce honesty. The others are implied and not explored, but I think there are some real issues there.

First, the extent to which researchers accurately assess the value of their work is highly questionable. It may also be adaptive to overestimate the value of one’s work – scholars that undervalue their work may be selected against in a competitive institutional system.

Second, the distinction between high- and low-quality journals seems very artificial. Certainly some journals are known to be crap (e.g. some predatory journals), and some are very prestigious, but there are also many journals that are excellent in terms of editorial curation and reputation but somewhat narrow in scope – what are often called technical or specialty journals. The dilemma many authors face is whether their paper has broad enough appeal to be submitted to a fancy interdisciplinary journal like Nature or a more solid but niche society journal.

Third, the authors write that in their ideal scenario, “Scientists would then submit high-quality research to high-ranking journals and low-quality research to lower-ranking ones” (lines 9-10). But why? Why publish low-quality research at all, especially if it’s readily identifiable as low quality? I kept thinking about this throughout the analysis, especially in terms of the cost tradeoffs. It seems like a crucial but missing component was the decision to not submit anything at all. This would carry no cost (except maybe the sunk cost of having done the work) and provide no benefit. This also speaks to something that is glossed over but important. ALL paper submissions incur some cost. It is not trivial to write up the paper and give up time to have the paper under review. The fact that most journals require papers to not be under submission elsewhere is a real tangible cost for all submissions.

Another assumption made is that “high-quality” journals are always incentivized to publish quality research. However, they are also incentivized to publish things that will get attention, and to minimize the risk of needing a retraction due to quality. This probably correlates with quality as the authors describe it, but not always – it would be good to address this, since the model strongly relies on this assumption and might run into problems if journal editors are also acting strategically.

The authors' main conclusions are that high-quality journals should impose (or continue to impose) costs for submission to dissuade trivial submission of low-quality papers. There are a few concerns about this. First, as noted, all submission is at least somewhat costly. Second, journals have desk rejection, which is a relatively low-cost way to weed out the more obviously low-quality papers. And third, and most importantly, imposing high costs to submission has a potential social consequence, which is that it will tend to preferentially favor researchers who can afford to pay such costs (and so for whom the costs are effectively less). This favors senior researchers at wealthy, prestigious institutions doing normal paradigmatic science. This seems to me to be a major downside and needs to be addressed in the authors’ discussion.

_______OTHER COMMENTS_______

The section “Relation to existing models in economics” is useful but probably belongs much earlier in the paper, before the detailed model description. Also, an important but missing reference the authors might consider engaging with is:

Crawford and Sobel (1982) Strategic information transmission. Econometrica 50.

In the description of high vs. low quality research, they write (lines 112-116):

“A high-quality paper might be one that thoroughly describes prior research, is methodologically rigorous, conducts appropriate statistical analyses and sensitivity checks, honestly reports all analyses and measures (e.g., no p-hacking or selective reporting of positive results), and clearly distinguishes between exploratory and confirmatory findings (see (5) for other factors that affect research quality). In contrast, a low-quality paper may have fewer or none of these qualities.”

I would argue that you can have low-quality research that has all these properties but fails to ask meaningful or important questions. This isn’t just about impact, it’s about insight. Indeed, many papers are likely rejected from “high quality journals” not because the research isn’t rigorous, but because they fail to ask sufficiently probing questions. Also, something to think about might be the fact that some research (often but not always “high quality”) may itself be more costly to conduct, and therefore authors may feel “entitled” to publish in more prestigious journals to justify the investment, while they may feel more comfortable publishing “low-quality” research in third-tier journals because it doesn’t represent a substantial investment.

Lines 144-145: “papers are randomly determined to be high- or low-quality.”

In reality, there may be many more low quality than high quality papers. Would a skewed ratio change the calculus?

Lines 165: “This illustrates a key conflict of interest in academic publishing.”

The conflict here is that without costs, scientists are always incentivized to submit everything to “high-quality” journals, but HQ journals only want to publish HQ work. This being a conflict is contingent on a number of assumptions holding, including the absence of costs. But as noted above, there are almost always costs. There are costs to writing up results, for example, so another option is to just not submit. Consider this paper, which found that most null results are never even written up:

Franco et al. (2014) Publication bias in the social sciences: Unlocking the file drawer. Science 345.

This comment also bears on the assumption that the cost of submission to LQ journals is c = 0, which they say “does not affect our model’s generality” (line 179). This holds only because the choice is between submitting high or submitting low. If another option is to submit nowhere, which imposes no cost and confers no benefit, there might be cases in which a viable strategy is “go big or go home” — submit to a high-ranking journal and, if rejected, submit nowhere. This reduces the generalizability of c = 0.

In terms of deriving values of C to separate HQ and LQ papers, “The key insight is that imposing costs can promote honesty” (line 200). Maybe, but it also seems likely that calibrating those costs will often be quite difficult, especially since the costs and benefits are not constant across individuals but vary by career stage, current prestige of position, submission and resubmission workflow, etc. Further, if P_h is similar to P_l, there is only a very narrow range of C that separates the two, which interacts with the variation between individuals in both B and C.

Lines 207-209: “If high-ranking publications are worth much more than low-ranking publications (large values of B – b), large submission costs are required to ensure honest submission; otherwise, scientists are tempted to submit all papers to high-ranking journals.”

Not necessarily, if P_h is low. In that case, costs could still be low. Perhaps this suggests that HQ journals should just be more selective?
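To make this concern concrete, here is a minimal sketch in Python under simplifying assumptions of my own (not necessarily the authors' exact model): a one-shot choice in which a low-ranking submission is accepted for sure and yields b, while a high-ranking submission costs C and is accepted with probability P_h for a high-quality paper or P_l for a low-quality one, yielding B. Honest sorting then requires P_l*B - C < b < P_h*B - C, i.e. P_l*B - b < C < P_h*B - b, so the workable range of C narrows as P_h approaches P_l and can sit at low values when P_h itself is low.

# Sketch of the separating-cost range under the simplified assumptions above,
# not the authors' exact model.
def separating_cost_range(B, b, P_h, P_l):
    """Costs C for which low-quality authors prefer the low-ranking journal
    while high-quality authors still prefer the high-ranking one,
    i.e. P_l*B - C < b < P_h*B - C."""
    C_min = P_l * B - b   # below this, low-quality authors also submit high
    C_max = P_h * B - b   # above this, even high-quality authors submit low
    return C_min, C_max

# Similar acceptance probabilities leave only a narrow window of workable costs:
print(separating_cost_range(B=10, b=1, P_h=0.6, P_l=0.5))  # (4.0, 5.0)
# A more selective journal (low P_h, much lower P_l) separates with modest costs:
print(separating_cost_range(B=10, b=1, P_h=0.2, P_l=0.1))  # (0.0, 1.0)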

Regarding limiting the number of submissions, they cite suggestions to limit scientists’ lifetime number of publications or to limit scientists to one publication per year (lines 289-291). This example isn’t really appropriate, because it’s talking about limiting the number of publications, not the number of submissions.

In the Implications, the authors first reiterate the main conclusion: “the costs associated with publishing reduce the incentive to submit low-quality research to high-ranking journals” (lines 334-336).

They seem to have ignored desk rejection, which is the main weapon that journals have against wasting time on low-quality submissions. In truth, that might be enough. You haven’t talked at all about the decision calculus for the journals. Why should they become more or less selective? What criteria should they use to select papers?

Lines 337-339: “If the benefits of high-ranking publications remained large, scientists would have even larger incentives to submit low-quality research to high-ranking journals, because the costs of doing so would be trivial.”

An interesting counterexample might be Sociological Science, which guarantees up/down decisions in 30 days and does not allow major revisions. Their rejection rate is high, and they have established a good reputation.

Lines 405-406: “When submissions and resubmissions are cost free…”

Again, I think this is almost never the case in practice. Especially if there’s the additional choice of not submitting at all.

Lines 409-411: “If journals preferentially reject low-quality papers, editors can wait some time before sending authors “reject” decisions, thereby causing disproportionate delays for low-quality submissions.”

Jesus Christ. No. Editors that do this should be punched in their stupid faces.

Lines 424-426: “Limiting the number of times that papers can be submitted or rejected…”

Beyond this requiring an impossible level of top-down control and being overly paternalistic, such a limitation already occurs in practice because most journals have a policy that papers should not be under submission elsewhere.

Lines 472-473: “we note several extensions.”

Rather than simply listing extensions, which is about what you could have done but didn’t, why not spend some more time talking about the limitations and caveats of your conclusions based on your current model’s assumptions?

As a final note, something I kept asking myself while reading it was: Is this a high quality paper? Is PLOS ONE a high quality journal? Were the authors able to accurately assess the quality of their own research and use their model to help them decide where to submit? This is asked mostly rhetorically, but also seriously to the extent that it forces the question: are there real lessons to be drawn from this?

Reviewer #2: I really liked this paper. I thought it was clear and simple, had a valuable insight, and offered useful prescriptions.

The basic idea is straightforward (and well explained): there is a fundamental information and incentives problem in publishing—authors have an incentive to over-sell their work and have information about its value that readers cannot as easily access, such as how much they had to twist the results, model, or citations to make the work sound compelling, novel, and interesting. The authors do a good job summarizing some of the inefficiencies this information+incentives problem creates, and then discuss how it can be understood and targeted using a standard costly signaling framework. Namely, in order to help readers differentiate between high- and low-quality papers, it’s essential that high-quality papers can get into higher-quality journals at lower relative cost—e.g., by having more attentive or better-equipped referees, an academic system that gives less credence to bad papers in good journals, or mechanisms that make journal submission more onerous, misrepresentation of results harder, or submissions less frequent. I think these are all nice insights and valuable prescriptions, and costly signaling proves to be a useful perspective from which to look at this problem.

Perhaps I missed something significant that other referees will catch, but on my reading, I couldn’t think of anything I would want changed in this paper or any reason to prevent its publication.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: Yes: Moshe Hoffman

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Revision 1

Response to reviewers has been uploaded along with the other materials.

Attachments
Submitted filename: Response to reviewers.docx
Decision Letter - Wing Suen, Editor

Honest signaling in academic publishing

PONE-D-20-19925R1

Dear Dr. Tiokhin,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Wing Suen

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

The paper provides a nice application of signaling theory to academic publishing. The authors have adequately addressed the referees' comments from the earlier round. However, I have two simple suggestions to make before the paper goes into production.

1. Figure 1 of the paper is not particularly helpful. In the interest of brevity, I would suggest taking out the figure.

2. Referee 1 had some concerns about the suggestion that "editors could wait before sending authors 'reject' decisions, thereby causing disproportionate delays for low-quality submissions" (lines 439-440, page 17). I share a similar concern. I understand the underlying logic of making submissions differentially costly to induce a separating equilibrium, but the suggestion borders on unethical editorial behavior. Inducing a separating equilibrium may be desirable, but it is not an objective that overrides all other concerns. I would suggest that the authors either remove this recommendation or clearly spell out the competing ethical issues that need to be considered.

Reviewers' comments:

Formally Accepted
Acceptance Letter - Wing Suen, Editor

PONE-D-20-19925R1

Honest signaling in academic publishing

Dear Dr. Tiokhin:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Professor Wing Suen

Academic Editor

PLOS ONE

Open letter on the publication of peer review reports

PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.

We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.

Learn more at ASAPbio.