Peer Review History
Original Submission: November 16, 2023
PONE-D-23-35664
A local community on a global collective intelligence platform: a case study of individual preferences and collective bias in ecological citizen science
PLOS ONE

Dear Dr. Arazy,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Apr 24 2024 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Hong Qin
Academic Editor
PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

2. Note from Emily Chenette, Editor in Chief of PLOS ONE, and Iain Hrynaszkiewicz, Director of Open Research Solutions at PLOS: Did you know that depositing data in a repository is associated with up to a 25% citation advantage (https://doi.org/10.1371/journal.pone.0230416)? If you’ve not already done so, consider depositing your raw data in a repository to ensure your work is read, appreciated and cited by the largest possible audience. You’ll also earn an Accessible Data icon on your published paper if you deposit your data in any participating repository (https://plos.org/open-science/open-data/#accessible-data).

3. Please note that PLOS ONE has specific guidelines on code sharing for submissions in which author-generated code underpins the findings in the manuscript. In these cases, all author-generated code must be made available without restrictions upon publication of the work. Please review our guidelines at https://journals.plos.org/plosone/s/materials-and-software-sharing#loc-sharing-code and ensure that your code is shared in a way that follows best practice and facilitates reproducibility and reuse.

4. Thank you for stating the following financial disclosure: "This research was supported in part by the University of Haifa’s Data Science Research Center." Please state what role the funders took in the study. If the funders had no role, please state: "The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript." If this statement is not correct you must amend it as needed. Please include this amended Role of Funder statement in your cover letter; we will change the online submission form on your behalf.

5. When completing the data availability statement of the submission form, you indicated that you will make your data available on acceptance. We strongly recommend all authors decide on a data sharing plan before acceptance, as the process can be lengthy and hold up publication timelines. Please note that, though access restrictions are acceptable now, your entire data will need to be made freely accessible if your manuscript is accepted for publication. This policy applies to all data except where public deposition would breach compliance with the protocol approved by your research ethics board. If you are unable to adhere to our open data policy, please kindly revise your statement to explain your reasoning and we will seek the editor's input on an exemption. Please be assured that, once you have provided your new statement, the assessment of your exemption will not hold up the peer review process.

6. Please amend your list of authors on the manuscript to ensure that each author is linked to an affiliation. Authors’ affiliations should reflect the institution where the work was done (if authors moved subsequently, you can also list the new affiliation stating “current affiliation:….” as necessary).

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly
Reviewer #2: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: I Don't Know
Reviewer #2: N/A

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No
Reviewer #2: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous.
Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Dear colleagues, thank you very much for the possibility of reviewing this manuscript about potential biases induced by individual and personal attitudes and preferences in an ecological citizen science project. I enjoyed reading the paper, and I find the subject very topical in many respects, from the evaluation of citizen science approaches to the indirect relevance of directly involving people in monitoring environmental changes. In general I find the paper well written and the supporting literature exhaustively cited. However, I also find the text flow sometimes unnecessarily redundant (e.g., similar considerations appear in both the introduction and the discussion) and to some extent confusing as far as the section headings are concerned: for instance, on page 4 (apparently the end of the introduction) there is a hint at the results ("Our results reveal four primary factors that ...") before the research questions and/or the methodological approach have even been mentioned (at this point there is a heading called 'Background', which should perhaps appear earlier). Please check again the sequence and contents of the individual headings; this will help the reader to follow the study.

My major concern regards the method: the paper states that "roughly 40,000 observations were reported on Tatzpiteva by 400 observers, making up roughly half of all the iNaturalist observations in Israel. Most of Tatzpiteva’s observations were contributed by the community’s core members, where most members are peripheral and contribute only occasionally." Would it be possible to be more precise about the so-called core members (how many are they, and how many observations did they deliver?) as well as about the so-called peripheral members? Are the core members the 38 persons who were interviewed? It would be appreciated if the entire questionnaire were made available as supplementary material instead of listing the questions in the text flow. Regarding the questionnaire, it is not fully clear to me why only one species was chosen as highly likely to be observed whereas three species were listed as highly unlikely; why not have two likely and two unlikely species?

Table 1 shows a very broad distribution of the age and the contributions (number of reported observations) of the 27 participants. I wonder (i) whether it would be possible to add the standard error (or standard deviation) for both of these measurements and, more importantly, (ii) how representative the sample is (27 out of 400 observers in total, which is less than 10%; a maximum of approx. 3,500 reported observations out of roughly 40,000, likewise). I would appreciate reading some considerations about this in the discussion. I am not a social scientist and am not accustomed to citing answers from questionnaires in a paper, so, begging your understanding for my ignorance, I wonder (again) to what extent citing single statements from a small number of participants is representative of the study topic. In the discussion I would also appreciate some consideration of possible biases in studies on similar topics conducted by scientists instead of interested volunteers, and of how combining the two approaches could perhaps generate a win-win situation for everybody.

Minor comments:
- Please check the references once more; some of them did not get a number and are included in the text flow (see for instance Hochmair et al., 2020, on page 2, and Arazy & Malkinson, 2021, on page 27 in the discussion).
- Please carefully check the language again; sometimes I had the feeling that sentences were incomplete (but I am not a native speaker, so I might be biased). E.g., page 2: "In particular, and the local organization of these communities could possibly hinder their ability to produce unbiased outcomes."

Reviewer #2: I read this manuscript with great interest. The authors report qualitatively on a multi-year citizen science project to collect data about the flora and fauna of Israel. The key focus of this manuscript is on biases in data arising from characteristics of contributors in deciding which observations to report. Overall, this manuscript provides useful qualitative insights into what makes citizen scientists "tick" in the context of a specific project. The four characteristics of recordability, collective considerations, personal preferences, and convenience are intuitive and useful. The detailed comments from contributors help contextualize the findings. I do have several concerns and suggestions for improvement:

1. There seems to be an underlying assumption that, collectively, contributions will "cancel out" the biases of individual contributors. However, I do not see the validity of using this as a starting point. Geographic biases (e.g., concentrations near towns, roads, etc.) will not necessarily be overcome by having more contributors, nor will temporal biases, or biases toward reporting rare species or species that are hard to locate (e.g., nocturnal). These limitations seem built into many CS projects. It would be useful, though, if the paper proposed strategies to mitigate such biases (e.g., design decisions, instructions to contributors).

2. The project seems to require participants to upload photos, with a reason for not contributing being things like animals appearing fleetingly so that photos could not be captured. Why not allow contributors to report observations using text? A fast-moving mammal or bird might not afford a photo opportunity, but contributors certainly could make such reports (thereby mitigating bias!).

3. The paper seems to be hastily written. There are numerous typos (no space to list them here), and the paper at times uses strange word choices (notably "slant", which seems a very informal word choice for a scientific paper; why not stick to "bias"?).

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then log in and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org.
Please note that Supporting Information files do not need this step.
Revision 1
PONE-D-23-35664R1
A local community on a global collective intelligence platform: a case study of individual preferences and collective bias in ecological citizen science
PLOS ONE

Dear Dr. Arazy,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Jul 14 2024 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:
If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Hong Qin
Academic Editor
PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed
Reviewer #2: (No Response)

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: N/A

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics.
(Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The manuscript has consistently improved since last time, and I acknowledge the thorough revision done by the authors. There are no further comments from my side, and I am looking forward to seeing the work published.

Reviewer #2: I thank the authors for their revision and the clear response to the reviews in the previous round. I continue to like the paper and have only three remaining substantive comments, all arising from the response/revisions:

1. Framework for biases: The topic of bias in citizen-generated data is important. A recent article proposed a framework for understanding "socio-ecological biases" in citizen science data (Carlen et al., 2024). Given the apparent relevance to your work, it might help to relate your perspective to their framework. I don't think they have "scooped" you, but on first glance, your in-depth qualitative work can provide support for aspects of their framework and perhaps challenge others.

2. Data quality: As noted on page 4, traditional approaches to data quality focus on fitness-for-use and, to assess this, "we must consider the possible uses." However, uses may not be fully known at the time a project is designed and launched, and/or they may change (consider, e.g., Pharr et al., 2023, where data from a CS project are combined with US government light and noise data to answer questions not considered when the original CS project was designed). As you note, "CS data can be valuable for addressing a variety of research questions." In that context, it has been suggested that data quality in citizen science (and crowdsourcing more broadly) can sometimes be considered from a "use-agnostic" perspective (Wiersma et al., 2024; Lukyanenko et al., 2014). It is therefore unclear how contributors' preferences (versus the goals of project sponsors) affect the nature of bias and the resulting impacts on data quality. For example, project sponsors likely have specific goals (e.g., understanding the distribution and prevalence of focal species), whereas contributors may produce biases along the dimensions identified in your analysis. Some discussion of the inherent tension between viewing data quality from a fitness-for-use perspective and the way in which data can become biased as a result of the focus imposed by project goals (in addition to, and perhaps in different ways from, the biases you discuss in the paper based on the behavior and preferences of contributors) would deepen the analysis of the relationship between bias and data quality.

3. Platform-design-induced bias: The only response to my previous comments that was less than satisfying was the one regarding the photo requirement. I realize this is iNaturalist-driven rather than Tatzpiteva-driven, but it seems clear that such a requirement is a big source of systematic bias. One of your key findings is that "recordability" drives the decision to (not) report, or in the words of one of your respondents, "the ability to take a picture." Unlike issues related to personal preferences or convenience, this seems an artifact of underlying decisions of the platform. While a photo might be useful in enabling verification, it seems a mechanism to induce severe bias in certain contexts (e.g., low light, fast-moving animals) by preventing participants from reporting observations they might otherwise be inclined to report. As Lukyanenko et al. (2019) show, it is possible to perform post hoc processing on textual data that lacks species identification (no photos) to achieve a high level of classification performance.

In conclusion, this paper stands to make a strong contribution to our understanding of preferences and biases among contributors to citizen science projects. The comments above are intended in the spirit of better refining the message and contextualizing it with respect to related work.
References:

Carlen, Elizabeth J., Cesar O. Estien, Tal Caspi, Deja Perkins, Benjamin R. Goldstein, Samantha E. S. Kreling, Yasmine Hentati, Tyus D. Williams, Lauren A. Stanton, Simone Des Roches, Rebecca F. Johnson, Alison N. Young, Caren B. Cooper, Christopher J. Schell (2024). A framework for contextualizing social-ecological biases in contributory science. People and Nature, 6(2), 377-390.

Lukyanenko, Roman, Jeffrey Parsons, Yolanda Wiersma, Mahed Madadah (2019). Expecting the unexpected: effects of data collection design choices on the quality of crowdsourced user-generated content. MIS Quarterly, 43(2), 623-647.

Pharr, Lauren D., Caren B. Cooper, Brian Evans, Christopher E. Moorman, Margaret A. Voss, Jelena Vukomanovic, Peter P. Marra (2023). Using citizen science data to investigate annual survival rates of resident birds in relation to noise and light pollution. Urban Ecosystems, 26(6), 1629-1637.

Wiersma, Yolanda F., Tom Clenche, Mardon Erbland, Gisela Wachinger, Roman Lukyanenko, Jeffrey Parsons (2023). Advantages and Drawbacks of Open-Ended, Use-Agnostic Citizen Science Data Collection: A Case Study. Citizen Science: Theory and Practice, 9(1), 5pp.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]
While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then log in and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
Revision 2
A local community on a global collective intelligence platform: a case study of individual preferences and collective bias in ecological citizen science
PONE-D-23-35664R2

Dear Dr. Arazy,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice will be generated when your article is formally accepted. Please note, if your institution has a publishing partnership with PLOS and your article meets the relevant criteria, all or part of your publication costs will be covered. Please make sure your user information is up-to-date by logging into Editorial Manager and clicking the ‘Update My Information' link at the top of the page. If you have any questions relating to publication charges, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Hong Qin
Academic Editor
PLOS ONE

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #2: N/A

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #2: The authors have addressed the remaining concerns I identified in the previous round. This is a solid paper with the potential to make an important impact in understanding bias in citizen science data.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #2: No

**********
Formally Accepted
PONE-D-23-35664R2
PLOS ONE

Dear Dr. Arazy,

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now being handed over to our production team.

At this stage, our production department will prepare your paper for publication. This includes ensuring the following:

* All references, tables, and figures are properly cited
* All relevant supporting information is included in the manuscript submission
* There are no issues that prevent the paper from being properly typeset

If revisions are needed, the production department will contact you directly to resolve them. If no revisions are needed, you will receive an email when the publication date has been set. At this time, we do not offer pre-publication proofs to authors during production of the accepted work. Please keep in mind that we are working through a large volume of accepted articles, so please give us a few weeks to review your paper and let you know the next and final steps.

Lastly, if your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

If we can help with anything else, please email us at customercare@plos.org. Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of Dr. Hong Qin
Academic Editor
PLOS ONE
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.