Peer Review History
Original Submission: August 19, 2022
PONE-D-22-23245
Public Perception of Scientists: Partisan and Non-partisan Thinking
PLOS ONE

Dear Dr. Sonmez,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. Please read the comments from Reviewers 1 and 2 carefully and address all of them.

Please submit your revised manuscript by Jan 26 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Celia Andreu-Sánchez
Academic Editor
PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf.

2. Thank you for stating the following in the Acknowledgments Section of your manuscript: "The authors thank participants of the 2019 meeting of ESSEXLab Behavioural Mini." We note that you have provided funding information that is not currently declared in your Funding Statement. However, funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form. Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement.
Currently, your Funding Statement reads as follows: "N.A received funding from the University of Essex as part of his personal research account. https://www.essex.ac.uk/. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript." Please include your amended statements within your cover letter; we will change the online submission form on your behalf.

3. We note that you have stated that you will provide repository information for your data at acceptance. Should your manuscript be accepted for publication, we will hold it until you provide the relevant accession numbers or DOIs necessary to access your data. If you wish to make changes to your Data Availability statement, please describe these changes in your cover letter and we will update your Data Availability statement to reflect the information you provide.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: I Don't Know
Reviewer #2: I Don't Know

**********

3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file).
The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The technical bit of this seems really interesting and I'm glad someone has tried this. I've had a book sitting in my office on conjoint experiments for a related question. Nevertheless, there are a couple of conceptual things here that I think are really important to integrate before this is published.

1. First, I think the authors need to do an edit in which they are more careful in how they use words such as trust, trustworthiness, and 'dimension.' In this regard, the manuscript often suggests that expertise and 'fairness' are dimensions of trust, but that's not quite right. Per the widely accepted 'integrative model of organizational trust' (as well as Fiske's BIAS model, to some extent), it's quite important to distinguish between trustworthiness perceptions and behavioral trust.
In this regard, behavioral trust involves actually making oneself vulnerable to a trustee. The behavior part is important. Trustworthiness, in contrast, is a word often used to refer to perceptions, and thus ability (or expertise, if you use Fiske's terminology, or competence if you use Hovland's), benevolence (an aspect of warmth for Fiske; part of trustworthiness for Hovland), and integrity (an aspect of warmth for Fiske; part of trustworthiness for Hovland) can be called dimensions of trustworthiness but not dimensions of trust. This distinction is obnoxious, but it matters because we really need to make sure we're distinguishing between perceptions/beliefs and behaviors when we do trust-related research. This distinction is also important because it suggests that just asking people 'how much do you trust X' -- as you do -- will give you really messy data, as we have little idea what people are actually thinking about when they respond. Even better is if we ask people more specific questions such as 'to what degree do you think X has expertise in ...' or 'to what degree would you be willing to take the advice of X on Y'. Beyond just usage, this edit should thus influence how you understand your results. In any case, I just think you need to clean this up.

2. My other ask is that you specifically address the validity/utility of the approach you used. I understand using this type of experiment when you're trying to decide whether a product with X1 characteristics is more/less appealing than a product with X2 characteristics. Indeed, I first learned of the approach in the context of cigarette labeling, and it seems reasonable to ask people to pick between two similar options. What I don't get here is why it makes sense to ask someone to pick between two scientists. When would someone ever do that unless they were on a hiring committee? And wouldn't doing that make the differences especially salient?
Practically, for most times that we ask people to trust scientists, I don't see why we wouldn't be better off doing things like presenting systematically randomized scientists and asking people to rate them on relevant characteristics (e.g., see the work of Shupei Yuan). Practically, I also can't help believing that your results are contaminated by substantial desirability bias.

3. I'm a little sad that you only varied things over which people have little control. From a communication perspective, I can't do much about my race, gender, or employer. I suppose you could choose a spokesperson for an issue in some cases based on these factors but, in general, the scientist you have is the scientist you get. It is true, however, that a scientist can choose to disclose their political ideology (or religiosity, in the case of someone like Katherine Hayhoe or James Watson; or values, more generally, as in the case of some of Kahan's research), but this isn't exactly a common approach. Training programs (and Hayhoe) often encourage scientists to identify values they share with their audience, but I've never heard anyone advocate sharing party ID.

4. I think you also need to be a little more careful about how you talk about motivated reasoning. The differentiation between 'motivated' reasoning and heuristic reasoning seems important here. I'm not sure you're invoking motivated reasoning here inasmuch as you don't have a stimulus that seeks to make a specific identity salient and then ask people to evaluate something using effortful processing. Instead, you're just asking people to make judgements based on heuristic cues, it seems.

5. The figures need extensive notes that include things like range and the meaning of everything. In this regard, I note that figures should generally be understandable without substantial reference to the originating text (at least according to APA).
Ultimately, what I'd like to see happen is a more targeted manuscript that focuses on the potential of the method in this space but recognizes the odd, unrealistic context and danger of desirability bias of the current attempt. I would also like to see something that's much more careful with the trust/trustworthiness language used.

Reviewer #2: Thank you for the opportunity to review this manuscript on how the characteristics of scientists themselves shape public perceptions of scientists. Please note that I do not consider myself to be an expert in conjoint designs, and thus my observations focus on the introduction/argument, sample and measurement (rather than analysis), and discussion/implications. I hope the authors find these comments helpful as they continue their work in this space.

1. I appreciated the authors’ thorough literature review and argument that informed the selection of the five characteristics of scientists. With regards to the fields of study, I’m curious why the authors did not assess other social/behavioral sciences beyond economics (e.g., psychology, sociology, communication)? These other disciplines would seem germane.

2. For H5.2, can the authors specify the nature of the interaction effect they expected (i.e., “differ” in what way, as far as directionality and strength of association are concerned)?

3. One could imagine a situation in which the scientist does not affiliate themselves directly with a political party or stance, but instead with being supportive of or opposed to a given behavioral strategy, policy, etc. I realize the authors would not, with these data, be in a position to assess what effects this positioning might have on public trust, but perhaps this is worth considering in future research (and thus could be considered in the discussion; see comment #7).
My instinct is that this sort of proxy alignment may be more prevalent than direct alignment/declarations of affiliation with a particular political party or stance (i.e., scientists may have more of an implied vs. overt ideological bent).

4. In the abstract and methods, the authors describe the sample as “representative of U.S. adults,” which is misleading, as Prolific does not maintain a panel created via RDD and/or address-based sampling methods (compared to, say, NORC’s AmeriSpeak panel). It is certainly fair to note the quota sampling.

5. This study was fielded during the early days of the COVID-19 pandemic. This was a rather exceptional time, of course, in which to be evaluating trust in scientists—especially in the U.S., where there is considerable evidence that the pandemic was politicized early and often (see, for example, https://press.princeton.edu/books/hardcover/9780691218991/pandemic-politics and https://read.dukeupress.edu/jhppl/article-abstract/45/6/967/165291/The-Emergence-of-COVID-19-in-the-US-A-Public). Can the authors speak to the implications of the timing of data collection for interpreting and contextualizing study results?

6. With regards to DV measurement, I’m not sure that U.S. participants would know what a “Board of Scientific Councillors” is in “their district.” This seems like UK terminology that might not translate. Was this defined for participants? What assurances do we have that participants did in fact understand this reference? Has this measure been used in other U.S.-based contexts?

7. For all the strengths of this study, it is not without limitations, and those should be acknowledged in what is, at present, quite a short discussion section (e.g., see above note about the sample, which has implications for generalizability). Similarly, at present there is a solitary sentence that speaks to the implications of this research for science communication (lines 473-475).
Further elaboration here would be helpful: What are the implications for journalistic practice, specifically; for scientists, in their public-facing work; and so forth?

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
Revision 1
PONE-D-22-23245R1
Public perception of scientists: Partisan and non-partisan thinking
PLOS ONE

Dear Dr. Sonmez,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Jun 18 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Celia Andreu-Sánchez
Academic Editor
PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #3: All comments have been addressed
Reviewer #4: (No Response)

**********

2.
Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #3: Yes
Reviewer #4: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #3: Yes
Reviewer #4: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #3: Yes
Reviewer #4: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #3: Yes
Reviewer #4: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics.
(Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #3: You did an adequate job responding to the suggestions of the reviewers. However, the quality of the figures should be addressed before publication.

Reviewer #4: Thank you for the opportunity to review “Public perceptions of scientists: partisan and non-partisan thinking.” I was not an initial reviewer on this piece, but I appreciated the previous reviewers’ very thoughtful suggestions and the authors’ comprehensive work to be responsive. I think it’s an important analysis and a solid study design that contributes new insights about how the public understands scientists and the heterogeneity in perspectives attributable to different characteristics of both scientists and of the public evaluating scientists. I have mainly minor comments, and some reinforcing of previous reviewers’ comments.

First, I’m not sure the title is accurate – or at least, it does not convey the main points of this paper. I wasn’t sure what “partisan and non-partisan thinking” meant when I started the paper, and I’m still not sure this well captures the paper. It reduces the paper to the partisan cues/heuristics and I think doesn’t convey much of what the paper does. Perhaps “Public perception of scientists: partisan and non-partisan characteristics contributing to perceptions” or something similar?

In some of the rationale and background supporting why the authors tested different characteristics of scientists, they conflate attributes of the scientists with those same attributes (or stereotypes related to those attributes) among the general public. This is most salient in the paragraph about race (111-122), where some of the justifications are about racialized stereotypes about scientists and some, such as racial differences in science literacy, are about the general public. I suggest the authors edit this paragraph accordingly to make clearer that they are referring to racialized perceptions of scientists.
Also in the background (lines 168-176), could an alternative explanation for why some scientific fields might be regarded with less trust or credibility be how directly the science is applied in policy and politics? I think vaccination, infectious disease, and climate change can also be characterized by how closely adjacent those fields are to relevant policy decisions that get embroiled in politics (mandates, restrictions, and taxes or other consumer restrictions), and not only by how “permeated” they are by conspiratorial thinking.

I know the interaction between partisanship and institution/place of work is only exploratory, but I still think it needs more explanation. For instance, the authors note that “Republicans and Democrats have contrasting views on institutions” (line 201) but do not explain what these views are.

The study was fielded in March of 2020. I know one of the previous reviewers asked the authors to spell out how this timing might have been related to trust, and the authors note that trust declines began later. However, there were other elements of partisan gaps in views about the pandemic that emerged as early as February and early March (see https://fivethirtyeight.com/features/how-concerned-are-americans-about-coronavirus-so-far/). I would like to see the authors engage a bit more with how the salience of the emerging pandemic might have shaped how respondents viewed reference to medical science in particular as a field, or whether and how the stage of the pandemic could have affected survey response or other aspects of survey administration. (Please also note the actual dates that the study was fielded, not just “March of 2020”).

Like Reviewer 2, I too was surprised by the language of “Board of Scientific Councillors in your district.” This is not a concept I have ever heard of (as a U.S. citizen who studies science communication!), so I’m doubtful that most respondents had heard of this or knew what they were responding about.
Further, the concept of “district” is not commonly used in the U.S. to refer to local government (as opposed to county, municipality, or town/city). District normally refers to schools, not other governmental entities. While I see the authors’ response to the initial reviewer who raised this, I would push back gently and suggest that there may be a limitation in the study design wherein there could be some unmeasured error in how respondents understood this concept that they were rating. I would suggest more acknowledgement or a footnote about this concern.

For the 3 key dependent variables, please state the measurement of the outcomes outright. The authors say they are measured on a Likert scale “where 7 is strongly trust and 1 is strongly mistrust,” but this measurement does not make sense for the first DV (“where would you place your assessment”) nor the second (“How much would you agree..”). It would be clearest to just note the response options for each item.

In the limitations, I would like to see the authors emphasize Reviewer 1’s very important concern about ecological validity more forcefully (lines 527-528). They acknowledge rather obliquely that the task is artificial, but I think they should acknowledge more explicitly that the task they are asking respondents to do (select a scientist to serve in a particular role) is something the general public will rarely if ever actually do. I think the tradeoffs between the validity of this task and the causal inference of the design are justified, but it does bear more explicit discussion.

Overall, I learned a lot from this study and found the findings interesting, thought-provoking, and well supported by the rigorous study design. I also appreciated the comprehensive literature review, which I suspect will also be useful for future research teams when the study is published.

Minor points: I found some examples of awkward language and language that could be more concise.
Of course, the authors can decide not to edit these, but just flagging:

Line 73 – “public preference for science is unequally distributed” – unequal distribution sounds like a moral claim, when I think the authors just mean “heterogeneous based on…”

Line 77 – the reference to messages that scientists communicate is confusing, as this study does not address messaging at all. And yet in lines 77 through the Fauci example, the authors describe messages that trigger reactions based on audience predispositions. I would cut or edit this content, as it made me think the paper was going to address communication and source x message interactions.

Lines 95-96 – “a considerable fraction of stereotypes concerning sex roles conveys a denigrating message about women and often denies that they possess certain traits” – this is awkward and the subject-verb agreement is off (i.e., stereotypes convey messages; stereotypes deny that they possess); I would rephrase.

Line 133 – “depending on whether they are state, industry, or academic” – should “state” be government, or is the reference to the U.S. states?

Line 146 – “are open to being mistrusted from a different perspective.” This took me a while to understand. Perhaps another way to word this is that “scientists working for government…potentially face mistrust from a different perspective”?

Lines 165-167 – can the authors clarify what they mean by autonomy-heteronomy?

Figures – like Reviewer 1, I want to make sure that the figures have comprehensive notes attached. For instance, I know what the Partisan column is referring to in the first figure, and what the non-partisan column is, but what is “Nonpartisan-Partisan”? Double-checking that there are complete notes and legends for all exhibits will be important, as this was hard for me to assess in how the figures appeared in the reviewer copy.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?).
If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #3: No
Reviewer #4: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
Revision 2
Public perception of scientists: Experimental evidence on the role of sociodemographic, partisan, and professional characteristics
PONE-D-22-23245R2

Dear Dr. Sonmez,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance.

To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Celia Andreu-Sánchez
Academic Editor
PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.
Reviewer #3: All comments have been addressed
Reviewer #4: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #3: Yes
Reviewer #4: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #3: Yes
Reviewer #4: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #3: Yes
Reviewer #4: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #3: Yes
Reviewer #4: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above.
You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #3: I think you have addressed the reviewers' comments adequately. For future research on this topic, it might be worth incorporating Douglas and Wildavsky's Cultural Theory, which has been used mostly in the risk perception literature, but I think would also help predict the public's perceptions of scientists.

Reviewer #4: The authors have done an excellent job responding to my concerns and suggestions, and the paper is improved as a result.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #3: No
Reviewer #4: No

**********
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.