Peer Review History
| Original Submission: November 5, 2023 |
|---|
PONE-D-23-35578
Seek and you may (not) find: A multi-institutional analysis of where research data are shared
PLOS ONE

Dear Dr. Herndon,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

The study explores how academic research data are shared, focusing on the processes and expenses involved in making those data accessible to the public, and takes a close look at where scholars commonly share their data. The article is intriguing because it takes a practical approach to the problem, but there is room for improvement. For instance, the methods used in the study could be explained more fully to give readers a clearer understanding of how the research was conducted, and the findings could be expanded to provide more detailed insights. Refining these aspects would make the study even more valuable in shedding light on scholarly publishing practices and the accessibility of research data.

Please submit your revised manuscript by Mar 30 2024 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:
If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Ayesha Maqbool, PhD
Academic Editor
PLOS ONE

Journal requirements:

1. When submitting your revision, we need you to address these additional requirements. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

2. Thank you for stating the following financial disclosure: [CHV NSF award #2135874 National Science Foundation No]. Please state what role the funders took in the study. If the funders had no role, please state: "The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript." If this statement is not correct you must amend it as needed. Please include this amended Role of Funder statement in your cover letter; we will change the online submission form on your behalf.

3. Thank you for stating the following in the Acknowledgments Section of your manuscript: [Funding for this research and the Realities of Academic Data Sharing (RADS) Initiative was provided by the National Science Foundation (NSF), award #2135874, EAGER grant: Completing the Lifecycle: Developing Evidence Based Models of Research Data Sharing.
We would especially like to thank Martin Halbert, Program Director at NSF, for his considerations regarding interoperable research infrastructure, data, and metadata. We would also like to thank Ted Habermann, of Metadata Game Changers, for his work on the RADS study and contributions to this research, and to DataCite and Crossref for providing their APIs. We would also like to thank the members of the Data Curation Network (DCN) for engagement and discussion around these topics. Finally, we thank Mikala Narlock, Megan O’Donnell, Sara Mannheimer, and Kristin Briney for providing early feedback on a draft of this paper.]

We note that you have provided funding information that is not currently declared in your Funding Statement. However, funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form. Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement. Currently, your Funding Statement reads as follows: [CHV NSF award #2135874 National Science Foundation No]. Please include your amended statements within your cover letter; we will change the online submission form on your behalf.

4. Thank you for stating the following in the Competing Interests section: [I have read the journal's policy and the authors of this manuscript have the following competing interests: Institutions involved in this study maintain paid memberships in either the CrossRef or DataCite data sharing services.]. Please confirm that this does not alter your adherence to all PLOS ONE policies on sharing data and materials, by including the following statement: "This does not alter our adherence to PLOS ONE policies on sharing data and materials." (as detailed online in our guide for authors http://journals.plos.org/plosone/s/competing-interests).
If there are restrictions on sharing of data and/or materials, please state these. Please note that we cannot proceed with consideration of your article until this information has been declared. Please include your updated Competing Interests statement in your cover letter; we will change the online submission form on your behalf.

5. Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly
Reviewer #2: Yes
Reviewer #3: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #3: N/A

**********

3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file).
The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #3: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #3: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This paper focuses on answering three research questions in order to shed light on (i) where research data are shared, (ii) how many datasets are shared, and (iii) the completeness of datasets' metadata. The research conducted in this paper is well motivated and seems very relevant, since it allows the reader to understand how different stakeholders (not only researchers) are engaged with open science. Another strong point of the paper is that a method is proposed for other researchers and academic institutions to better understand how data sharing practices are being applied (beyond the institutions considered in this paper). A further strength is that the authors take a bottom-up approach, which yields a realistic overview of the current application of open science practices.
However, the paper has some weak points that the authors need to address in a revised manuscript:

1. The chart in figure 5 (a line chart) is not the best way to visualize percentage data. I recommend the authors consider other kinds of charts that better fit percentage data, such as radar charts or perhaps pie charts.

2. The metadata considered are not explicitly described in detail. Some concepts are crystal clear, such as DOI, but others need further description, such as "Related Identifiers". Please provide a more detailed explanation of the metadata considered (highlighted concepts in Table 4).

3. The authors provide good recommendations, but I suggest creating a specific section for recommendations (rather than mixing conclusions and recommendations in a single section). This new recommendations section should also be longer, with more complete explanations.

4. The research questions stated at the beginning of the paper must be explicitly answered once the data have been analyzed.

5. The authors should consider and summarize other related work in order to explicitly state their contributions with regard to it. They may also consider other related initiatives around the world, such as EOSC (https://eosc-portal.eu/).

Reviewer #2: This is an interesting paper about the use of repositories for sharing research data, based on a bibliographic analysis. It provides data on which repositories are used for data sharing by researchers from six US institutions and on the completeness of metadata fields for top DataCite repositories. The paper is well written but needs some improvement.

Methods: From the viewpoint of the reviewer, the presentation of the methods could be made clearer. It should be stated whether a study protocol was available before the start of the study and whether such a protocol was registered or not.
It should also be made clear whether the addition of a third group occurred after the initial analysis (as it seems to have) and whether that resulted in an updated study protocol. The section "Data collection – institutional repositories" is difficult to understand. There is a mix of datasets related to the listed institutional repositories (2164, see Table 1) and a remaining number of institutional repository records (2390 minus 2164, Table 2) running through a cleaning/curating process. If the reviewer understood it correctly, no cleaning/curating was necessary for the institutional dataset because of its pre-selection according to the defined and listed repositories. Taking into consideration that different search strategies were used for the DataCite/Crossref and institutional samples, it could be advantageous to separate the analyses rather than merge all the data into one sample. In figure 1, filtering by resource type = dataset is missing for Institutional Repositories (IRs) and Crossref.

Results: In Table 3 the institutional repositories are ranked no. 5. As explained, the datasets here are mainly from the listed institutional repositories, with some additional datasets retrieved from the DataCite and Crossref searches. Because a different search strategy was used for the DataCite/Crossref and institutional samples, it would be clearer to separate this row into two subgroups.

Discussion: The discussion about the unit of analysis and its consequences (e.g. DOI per file, DOI per study) is important. There is still no common, standardised definition of a research study or research project to which a dataset can be linked as a research output. This issue is discussed, for example, in the section "Data analysis", referring to individual studies that may have many DOIs and where datasets falling into the same container were collapsed.
This point is very relevant for any bibliographic analysis and, from the viewpoint of the reviewer, needs more discussion. One approach is the OpenAIRE Knowledge Graph, which includes metadata and links between scientific products (e.g. literature, datasets, software, and other research outputs), organizations, funders, funding streams, projects, communities, and data sources. Another approach is a proposed metadata framework for contextual metadata that defines research projects and links them to research outputs (e.g. datasets), taking into consideration that relationships between projects/studies, funding/grants, and research outputs (e.g. datasets) can be difficult.

The reviewer could imagine that comparing the completeness of the DOI metadata of the institutional sample with the metadata available in the institutional repositories would be of major interest and worth investigating in the future. If it turns out that metadata completeness in the institutional repositories is much higher than in the DOI metadata, it is primarily a transfer problem rather than a documentation problem.

The reviewer agrees with most of the conclusions and recommendations, especially with respect to guidance on how DOIs are applied and referenced within components of a study. This is truly an area in need of clarification. But it requires a broader discussion involving frameworks for representing research projects and research outputs, ontologies/terminologies (including crosswalks), and requirements for machine-actionable metadata (relationships between entities/records, unique identifiers, research graphs).

Reviewer #3: This paper is a study of metadata quality on a subset of research data. The results are not surprising (metadata are incomplete), but it is still good to put numbers on them.
Since these results are limited to the six universities in question, some information about them would help determine just how representative (or not) these results are of the entirety of the academic research enterprise - something similar to a summary of the demographics of study subjects: perhaps budget, size of faculty, grant dollars, etc. My guess is that some of these results are not very generalizable, which is itself an interesting result. I like the recommendations at the end.

Line 333: There is a spurious "of".

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose "no", your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No
Reviewer #3: Yes: Anne E Thessen

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
| Revision 1 |
|---|
Seek and you may (not) find: A multi-institutional analysis of where research data are shared
PONE-D-23-35578R1

Dear Dr. Herndon,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice will be generated when your article is formally accepted. Please note, if your institution has a publishing partnership with PLOS and your article meets the relevant criteria, all or part of your publication costs will be covered. Please make sure your user information is up-to-date by logging into Editorial Manager® and clicking the ‘Update My Information' link at the top of the page. If you have any questions relating to publication charges, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Ayesha Maqbool, PhD
Academic Editor
PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.