Peer Review History

Original Submission
May 6, 2020
Decision Letter - Nguyen Tien Huy, Editor

PONE-D-20-13400

Analysis of Data Dictionary Formats of HIV Clinical Trials

PLOS ONE

Dear Dr. Mayer,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Aug 24 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,

Nguyen Tien Huy, Ph.D., M.D.

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. In your Data Availability statement, you have not specified where the minimal data set underlying the results described in your manuscript can be found. PLOS defines a study's minimal data set as the underlying data used to reach the conclusions drawn in the manuscript and any additional data required to replicate the reported study findings in their entirety. All PLOS journals require that the minimal data set be made fully available. For more information about our data policy, please see http://journals.plos.org/plosone/s/data-availability.

Upon re-submitting your revised manuscript, please upload your study’s minimal underlying data set as either Supporting Information files or to a stable, public repository and include the relevant URLs, DOIs, or accession numbers within your revised cover letter. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories. Any potentially identifying patient information must be fully anonymized.

Important: If there are ethical or legal restrictions to sharing your data publicly, please explain these restrictions in detail. Please see our guidelines for more information on what we consider unacceptable restrictions to publicly sharing data: http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions. Note that it is not acceptable for the authors to be the sole named individuals responsible for ensuring data access.

We will update your Data Availability statement to reflect the information you provide in your cover letter.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Partly

Reviewer #3: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: N/A

Reviewer #2: Yes

Reviewer #3: N/A

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

Reviewer #2: Yes

Reviewer #3: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The topic is important, as biomedical research in general needs to keep pressing on improving data sharing in a way that can enhance reproducible research.

A few comments:

1. It may help to expand somewhat on the reasons for standards of data sharing, e.g. reproducible research. The article jumps into the review without giving as much background as could be helpful.

2. If we think of the studies that were not included as part of the analyzed 18 studies as 'missing data', then this manuscript could do a more thorough job in explaining the sampling frame. Some expansion on the filtering process from n studies to 18 studies, and additional information on reasons why some studies were not included (e.g., in a table) would help. If we want to make inference for a population of studies, then greater understanding of how the 18 studies are a biased sample of the population of studies would help.

3. The manuscript mentions dozens of data sharing platforms. I wonder about the level of diversity in the data standards across these platforms? Is part of the issue that the different platforms do not all adopt the same 'constitution' of data standards principles?

4. In some places, qualitative statements are made about studies having 100% of data elements with information included in a data dictionary, vs. <100% of data elements with information included in a data dictionary. Is it possible to quantify this issue better, e.g. by providing the percentage instead of the binary 100% vs. <100%?

5. Adding a few more tables and figures may help provide substantive results in a convenient manner to review and come back to.

6. I think the recommendations should be strengthened by being more specific (e.g., regarding the data element standards system) and more complete; the recommendations will be the take-home items of the manuscript, I believe. Also, linking the work more to other standardization papers or documents would help.

Reviewer #2: It is very interesting research, and the creation of a single information model is a new idea. However, there are some mistakes in using words in an improper scientific way, and I recommend enhancing the language.

Reviewer #3: The manuscript “Analysis of Data Dictionary Formats of HIV Clinical Trials” by Mayer et al. appears to be an interesting analysis of data collection in clinical trials of HIV studies. Their analysis and recommendations might be meaningful and helpful for data sharing in future original studies, which could facilitate researchers re-using data in their work. However, I have some concerns that I think the authors should clarify in their manuscript:

1. For the method, the authors said that they included and analyzed HIV studies, but "HIV studies" is not clear to me. Does it mean all studies with an HIV population? For example, could a study of another infectious disease in which HIV is a comorbidity be included?

2. The databases/platforms/networks that the authors searched to find included studies are not clear in the method. Although they presented several names of databases/platforms in the results, these are not all of the sources used. It makes the method hard for readers to repeat their steps. I recommend that they present all of these database/platform/network names in the method.

3. In the method, they mentioned the definitions sufficiently, but no analysis strategy was presented. That makes it hard to follow their results, especially the differentiation between the section Data Element Dictionaries and the section Data Element Description. In fact, I saw overlapping results when they mentioned studies that fully or partially missed data element descriptions.

4. For the results, Page 8, lines 186-187, the following sentence is confusing: “One study from HPTN and the 5 studies acquired from the NIDA Data Archive used a CDISC format, which we analyzed separately.” At first, I thought that they would analyze these studies separately from the 18 other studies. But no analysis of these studies was done, which means they were excluded. The authors should rewrite the sentence.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

Reviewer #3: Yes: Dao Ngoc Hien Tam

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Revision 1

Also available as an attached file:

Dear Dr. Nguyen Tien Huy,

We thank you and the reviewers for the input and constructive comments on our manuscript. We have thoroughly reviewed each comment and our manuscript, and have edited our paper taking each comment into account. We are now submitting the edited and improved version of our manuscript.

Below is our response (in bold font) to each reviewer comment. Each response follows the reviewer’s comment (shown in regular, un-bolded text).

To ensure transparency and reproducibility, all underlying data used in our analysis can be found in our aggregated data element file in our GitHub repository at https://github.com/lhncbc/CDE/tree/master/hiv/datadictionary

Thank you,

Craig Mayer, Nick Williams and Vojtech Huser

Reviewer #1:

The topic is important, as biomedical research in general needs to keep pressing on improving data sharing in a way that can enhance reproducible research.

We greatly appreciate each of the comments; we considered each one and made many additions, specified below, to improve how the paper articulates its meaning and results.

A few comments:

1. It may help to expand somewhat on the reasons for standards of data sharing, e.g. reproducible research. The article jumps into the review without giving as much background as could be helpful.

We added a paragraph in the middle of the Introduction section. The added text expands on the usefulness of CDEs, data standards, and good data sharing practices. The added text also comments on how such standards improve the re-usability of the data and enable the use of existing tools and techniques.

2. If we think of the studies that were not included as part of the analyzed 18 studies as 'missing data', then this manuscript could do a more thorough job in explaining the sampling frame. Some expansion on the filtering process from n studies to 18 studies, and additional information on reasons why some studies were not included (e.g., in a table) would help. If we want to make inference for a population of studies, then greater understanding of how the 18 studies are a biased sample of the population of studies would help.

We have revised section 3.1 based on this comment. We expanded on our filtering and exclusion criteria in results section 3.1. We stated that, due to our analysis strategy, we excluded studies that did not provide a data dictionary in a machine-readable format or as documents that could be converted into a machine-readable format.

3. The manuscript mentions dozens of data sharing platforms. I wonder about the level of diversity in the data standards across these platforms? Is part of the issue that the different platforms do not all adopt the same 'constitution' of data standards principles?

Our observation was that only NIDA Data Share used any globally adopted data standard, while the other platforms allowed studies to use whatever format was preferred. The revised manuscript adds a sentence stating this in section 3.1 of the results.

4. In some places, qualitative statements are made about studies having 100% of data elements with information included in a data dictionary, vs. <100% of data elements with information included in a data dictionary. Is it possible to quantify this issue better, e.g. by providing the percentage instead of the binary 100% vs. <100%?

Thank you for the comment and a good new idea. We agree that quantifying this statement regarding the number of data elements in the data vs. data elements in the data dictionary is desirable. We have conducted an additional analysis and modified the manuscript. We used a 50% sample of studies that could be used for this investigation (where we have both the data dictionary [DD] and individual participant data [IPD]). The revised manuscript now has a new Table 4 with the results of this analysis and added discussion of the results. We no longer just report 100% vs. <100% but provide exact numbers for all studies we analyzed for this indicator (see revised section 3.2.1). The new table clearly shows that the completeness of the DD ranges from 45.4% to 100%. The low value of 45.4% is an outlier; four of the five studies analyzed had completeness >85%.

5. Adding a few more tables and figures may help provide substantive results in a convenient manner to review and come back to.

To improve the readability of the results, we added four new tables. The first, in response to the previous comment, shows the percentage of data elements in the data dictionary (Table 4). The second is the breakdown of data elements by data type (Table 5). The third is the list of very common form names (Table 6), and the fourth is the list of trials using the CDISC standard (Table 7).

6. I think the recommendations should be strengthened by being more specific (e.g., regarding the data element standards system) and more complete; the recommendations will be the take-home items of the manuscript, I believe. Also, linking the work more to other standardization papers or documents would help.

We are glad that reviewer #1 has the same stance on standardization (that it should be strengthened). In the revised manuscript, we have expanded section 4.2 (Recommendations). We also added 5 new references to the revised Introduction section that point to relevant papers arguing for CDEs and advanced data sharing. The revised recommendations section 4.2 now comments specifically on the choice of standard and also links to 2 additional new references.

Reviewer #2:

It is very interesting research, and the creation of a single information model is a new idea. However, there are some mistakes in using words in an improper scientific way, and I recommend enhancing the language.

We carefully reread the manuscript and made revisions where we encountered improper scientific language (e.g., changes to the Introduction and Trial Acquisition sub-sections of both the Methods and Results sections). We made additional edits to improve the language and better explain the reasons for certain term usage. Additional revisions to correct improper scientific language were made in response to reviewer #1 and reviewer #3.

Reviewer #3:

The manuscript “Analysis of Data Dictionary Formats of HIV Clinical Trials” by Mayer et al. appears to be an interesting analysis of data collection in clinical trials of HIV studies. Their analysis and recommendations might be meaningful and helpful for data sharing in future original studies, which could facilitate researchers re-using data in their work. However, I have some concerns that I think the authors should clarify in their manuscript:

Thank you for your comments and input. We have considered them and edited our paper, as stated below, to incorporate the comments and address any concerns.

1. For the method, the authors said that they included and analyzed HIV studies, but "HIV studies" is not clear to me. Does it mean all studies with an HIV population? For example, could a study of another infectious disease in which HIV is a comorbidity be included?

We clarified this statement by adding to section 2.1 in the methods, where we explained that "HIV studies" describes any study with HIV-positive patients or any study relating to contracting HIV, such as HIV vaccine and prevention studies.

2. The databases/platforms/networks that the authors searched to find included studies are not clear in the method. Although they presented several names of databases/platforms in the results, these are not all of the sources used. It makes the method hard for readers to repeat their steps. I recommend that they present all of these database/platform/network names in the method.

We have modified the manuscript and added the list of all searched platforms (see the methods in sub-section 2.1). This should help anyone who is trying to reproduce our search results. The revised expanded text also describes which sources we acquired data from and which sources did not have any studies that we included.

3. In the method, they mentioned the definitions sufficiently, but no analysis strategy was presented. That makes it hard to follow their results, especially the differentiation between the section Data Element Dictionaries and the section Data Element Description. In fact, I saw overlapping results when they mentioned studies that fully or partially missed data element descriptions.

Thank you for the input. We agreed that the information regarding missing data descriptions was repeated, so we removed it from the Data Element Dictionary section (section 3.2) and expanded on it in Data Element Description (section 3.2.3). We also added more specific sentences regarding our analysis techniques in the methods, which can be found throughout the revised section 2.2.

4. For the results, Page 8, lines 186-187, the following sentence is confusing: “One study from HPTN and the 5 studies acquired from the NIDA Data Archive used a CDISC format, which we analyzed separately.” At first, I thought that they would analyze these studies separately from the 18 other studies. But no analysis of these studies was done, which means they were excluded. The authors should rewrite the sentence.

We clarified this sentence to articulate that these CDISC studies were separated and not included in any analysis with the other studies; their presence was the result itself. The emphasis of the paper was to learn mostly from the ad-hoc data dictionaries rather than simply re-discover the benefits of the CDISC standard. In our future work (which is not limited to the HIV clinical domain), we hope to collect as large a sample as possible of CDISC-formatted studies and analyze those in a future publication.

Attachments
Attachment
Submitted filename: Response-to-reviewers.docx
Decision Letter - Nguyen Tien Huy, Editor

Analysis of data dictionary formats of HIV clinical trials

PONE-D-20-13400R1

Dear Dr. Mayer,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Nguyen Tien Huy, Ph.D., M.D.

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

Reviewer #3: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #3: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #3: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #3: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #3: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thank you for the revision which was responsive. I especially appreciate the addition of tables that provide greater quantification of the data issues.

Reviewer #3: (No Response)

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #3: No

Formally Accepted
Acceptance Letter - Nguyen Tien Huy, Editor

PONE-D-20-13400R1

Analysis of data dictionary formats of HIV clinical trials

Dear Dr. Mayer:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Nguyen Tien Huy

Academic Editor

PLOS ONE

Open letter on the publication of peer review reports

PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.

We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.

Learn more at ASAPbio.