Peer Review History

Original Submission: December 12, 2021
Decision Letter - Mehdi Rahimi, Editor

PONE-D-21-39215
A nonparametric alternative to the Cochran-Armitage trend test in genetic case-control association studies: the Jonckheere-Terpstra trend test
PLOS ONE

Dear Dr. Zhou,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Apr 08 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Mehdi Rahimi, Ph.D.

Academic Editor

PLOS ONE

Journal Requirements:

1. When submitting your revision, we need you to address these additional requirements.

Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Thank you for stating in your Funding Statement:

"This work was supported by National Institute on Minority Health and Health Disparities 5U54MD013376-8281 (ZZ). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript."

Please provide an amended statement that declares *all* the funding or sources of support (whether external or internal to your organization) received during this study, as detailed online in our guide for authors at http://journals.plos.org/plosone/s/submit-now.  Please also include the statement “There was no additional external funding received for this study.” in your updated Funding Statement. 

Please include your amended Funding Statement within your cover letter. We will change the online submission form on your behalf.

3. We note you have included a table to which you do not refer in the text of your manuscript. Please ensure that you refer to Table 2 in your text; if accepted, production will need this reference to link the reader to the Table.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: I have reviewed the manuscript entitled "A nonparametric alternative to the Cochran-Armitage trend test in genetic case-control association studies: the Jonckheere-Terpstra trend test" (Manuscript Number: PONE-D-21-39215). The manuscript addresses an interesting topic that could help researchers avoid biased results in this kind of analysis, but it still needs major and minor revisions, which I detail in the comments below:

# Abstract:

The Abstract is too general. It should be more specific about the definition and importance of the problem, and the authors should support their claims with quantitative evidence such as sample sizes.

#Introduction:

In Line 47, the sentence appears to be incomplete: "...is not additive (e.g.,(Gonzalez et al., 2008;…"

Generally, this section in its current form is not very informative. The authors should provide more evidence from reported studies, highlight the weaknesses of existing methods (including the loss of power), and then discuss the importance and objectives of the proposed method in more detail.

# Simulation:

In Lines 113~120, the authors should present more supporting data across sample sizes and MAFs to show the advantages and disadvantages of both methods for analyzing genome-wide association studies.

# Real Data Analysis:

In this section, the authors are strongly advised to use more real data sets to compare the efficiency of the two tests. With only a few cases, the power or failure of each method cannot be observed.

# Tables:

In Table 2, could the authors show whether the calculated statistics at different sample sizes are significantly different? Also, the table is not referenced in the main text.

With best regards.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Ali Moumeni

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Attachments
Attachment
Submitted filename: Reviewer-Points-Comments.docx
Revision 1

Please see the attached "Plos One response letter.docx" in this resubmission. The content is attached here too:

Point-by-point response to comments from Reviewer

Thank you very much for reviewing our paper and for the detailed and helpful review report. We greatly appreciate your time, effort, encouragement, and insight. We have revised our paper to address all issues raised in your report. The following is our point-by-point response to your comments. For convenience, your original comments are copied and our replies follow in blue. The associated revisions in the manuscript are highlighted with tracked changes.

Comments:

I have reviewed the manuscript entitled "A nonparametric alternative to the Cochran-Armitage trend test in genetic case-control association studies: the Jonckheere-Terpstra trend test" (Manuscript Number: PONE-D-21-39215). The manuscript addresses an interesting topic that could help researchers avoid biased results in this kind of analysis, but it still needs major and minor revisions, which I detail in the comments below:

Thank you very much for your concise summary and encouraging comment. We greatly appreciate your time and effort.

# Abstract:

The Abstract is too general. It should be more specific about the definition and importance of the problem, and the authors should support their claims with quantitative evidence such as sample sizes.

Thank you for the comment. In this revision, we have provided more related materials and details in the Abstract.

#Introduction:

In Line 47, the sentence appears to be incomplete: "...is not additive (e.g.,(Gonzalez et al., 2008;…"

In this sentence, the period comes after the long list of citations, so the sentence may appear to be incomplete:

However, it can suffer from power loss when the true genetic model is not additive (e.g., Gonzalez et al., 2008; Kuo & Feingold, 2010; Li, Zheng, Liang, & Yu, 2009; Loley, König, Hothorn, & Ziegler, 2013).

When reviewing it, however, we did find an extra “(” after “e.g.,”, and we have removed it in this revision (line 55, page 4).

Generally, this section in its current form is not very informative. The authors should provide more evidence from reported studies, highlight the weaknesses of existing methods (including the loss of power), and then discuss the importance and objectives of the proposed method in more detail.

We appreciate this suggestion to add more detail to the Introduction. In this revision, we expanded the Introduction with more context and details on the tests and on the significance of the research question.

# Simulation:

In Lines 113~120, the authors should present more supporting data across sample sizes and MAFs to show the advantages and disadvantages of both methods for analyzing genome-wide association studies.

In the original simulation, we considered sample sizes of N∈{200, 500, 1000} and MAFs of q∈{0.05, 0.1, 0.2, 0.3}. To evaluate the relative performance of the CA and JT trend tests over a wider range of data scenarios, we additionally considered sample sizes of N=1500 and N=2000 as well as q=0.4. Because these sample sizes and MAF are fairly large, we reduced the effect size under the alternative hypothesis to λ=0.5 to keep the power comparison meaningful (otherwise the power of both tests would be close to 1, making it difficult to evaluate their relative performance).

The empirical power of the two tests for the additional simulation settings is summarized in Table S1 in the Supplementary Materials. The results are consistent with the original conclusion: compared with the CA trend test, the JT trend test is more powerful when the underlying genetic model is dominant. The power advantage of T_JT diminishes as the genetic model moves toward the additive model, the two tests have approximately equivalent power when the underlying model is additive, and T_JT becomes less powerful than T_CA^Add, with the disadvantage growing, as the genetic model moves further toward the recessive end.
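For readers who wish to reproduce this kind of comparison, both statistics can be computed directly from a 2×3 genotype table. The sketch below is illustrative only and is not the code used for the manuscript's simulations: the example table is hypothetical, the CA statistic uses the standard standardized form, and the JT p-value is obtained by Monte Carlo permutation of case/control labels rather than from the asymptotic null distribution.

```python
import numpy as np

def ca_trend_stat(cases, controls, scores=(0, 1, 2)):
    """Standardized Cochran-Armitage trend statistic for a 2x3 table.
    cases, controls: genotype counts (AA, Aa, aa); additive scores by default."""
    r, s = np.asarray(cases, float), np.asarray(controls, float)
    n = r + s
    R, S, N = r.sum(), s.sum(), n.sum()
    t = np.asarray(scores, float)
    T = np.sum(t * (r * S - s * R))
    var = (R * S / N) * (np.sum(t ** 2 * n * (N - n))
                         - 2 * sum(t[i] * t[j] * n[i] * n[j]
                                   for i in range(3) for j in range(i + 1, 3)))
    return T / np.sqrt(var)

def jt_stat(cases, controls):
    """Jonckheere-Terpstra statistic for a binary outcome over the three
    ordered genotype groups, counting tied pairs as 1/2."""
    r, s = np.asarray(cases, float), np.asarray(controls, float)
    return sum(s[i] * r[j] + 0.5 * (r[i] * r[j] + s[i] * s[j])
               for i in range(3) for j in range(i + 1, 3))

def jt_perm_pvalue(cases, controls, n_perm=2000, seed=0):
    """Two-sided Monte Carlo permutation p-value for the JT statistic,
    shuffling case/control labels while keeping genotype counts fixed."""
    rng = np.random.default_rng(seed)
    r, s = np.asarray(cases, int), np.asarray(controls, int)
    n = r + s
    labels = np.concatenate([np.r_[np.ones(r[g]), np.zeros(s[g])] for g in range(3)])
    geno = np.repeat([0, 1, 2], n)                 # genotype of each subject
    mu = 0.5 * sum(n[i] * n[j] for i in range(3) for j in range(i + 1, 3))  # null mean
    obs = jt_stat(r, s)
    hits = 0
    for _ in range(n_perm):
        perm = rng.permutation(labels)             # shuffle case/control labels
        rp = np.array([perm[geno == g].sum() for g in range(3)])
        if abs(jt_stat(rp, n - rp) - mu) >= abs(obs - mu):
            hits += 1
    return (hits + 1) / (n_perm + 1)

# illustrative table with a strong increasing trend in the case proportion
cases, controls = [10, 20, 30], [30, 20, 10]
z_ca = ca_trend_stat(cases, controls)   # approx. 4.47
j = jt_stat(cases, controls)            # 3200.0
p_jt = jt_perm_pvalue(cases, controls)
```

The permutation approach avoids the tie-corrected JT variance formula, which is needed for the asymptotic test because the binary case/control outcome is heavily tied.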

# Real Data Analysis:

In this section, the authors are strongly advised to use more real data sets to compare the efficiency of the two tests. With only a few cases, the power or failure of each method cannot be observed.

Thank you for the suggestion. In this revision, we compared the two tests on additional studies of SNPs previously reported to be associated with hypertension. The results are reported in Table S2 in the Supplementary Materials. The conclusion still holds in this real data analysis: T_JT and T_CA^Add had similar power when the genetic model tended to be additive (rs7961152, rs1937506, rs6997709), and T_JT was more powerful than T_CA^Add when the genetic model tended to be dominant (rs2398162).

# Tables:

In Table 2, could the authors show whether the calculated statistics at different sample sizes are significantly different? Also, the table is not referenced in the main text.

Thank you for the comment, and we apologize for not referencing Table 2 in the main text of the original submission. This table presents the verification of the simulation findings using expected theoretical tables for each simulation setting (last paragraph of the Simulation section), and we now reference and explain Table 2 in the appropriate place in the manuscript.

We would like to explain that the test statistics (T_JT and T_CA^Add) in Table 2 were calculated from "expected theoretical tables". Specifically, in each simulation setting, the fixed combination of sample size, MAF, and genetic model determines the cell probabilities of each genotype for cases and controls in the genotype distribution table (Table 1); the expected cell values are then obtained by multiplying these probabilities by the sample size. We refer to such a table of expected cell values as an "expected" table, which allows us to evaluate the relative performance of the two tests by comparing their theoretical test statistics (T_JT and T_CA^Add) in each simulation setting. The relative difference between the theoretical test statistics (ΔT=(T_JT-T_CA^Add)/T_CA^Add×100%) is then reported in Table 2 and used to verify the empirical findings observed in the simulation. To fully explain this, we have added more details and clarifications to this paragraph in this revision. However, since each simulation setting corresponds to a single, non-random ΔT, we cannot assess the statistical significance of ΔT. Instead, readers may refer to Figure 1 for the actual power comparison from the simulation, which can serve as a proxy for the relative statistical significance between the two tests.
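As a concrete illustration of how such an "expected" table can be built, the sketch below derives case and control genotype probabilities from Hardy-Weinberg genotype frequencies and per-genotype penetrances, then scales them by the row sample sizes. The MAF and penetrance values are hypothetical placeholders, not the settings used in the manuscript; ΔT would then be obtained by evaluating both test statistics on the resulting table.

```python
import numpy as np

def expected_table(n_cases, n_controls, maf, penetrances):
    """Expected 2x3 genotype table (rows: cases, controls; columns: AA, Aa, aa)
    under Hardy-Weinberg genotype frequencies and per-genotype penetrances.
    Each cell is a conditional genotype probability times the row sample size."""
    p = 1.0 - maf
    geno_freq = np.array([p * p, 2 * p * maf, maf * maf])  # HWE frequencies
    f = np.asarray(penetrances, float)                     # P(case | genotype)
    p_case = geno_freq * f                                 # joint P(genotype, case)
    p_ctrl = geno_freq * (1.0 - f)                         # joint P(genotype, control)
    cases = n_cases * p_case / p_case.sum()                # P(genotype | case) * n_cases
    controls = n_controls * p_ctrl / p_ctrl.sum()
    return np.vstack([cases, controls])

# hypothetical dominant model: heterozygote and rare homozygote share the same risk
tbl = expected_table(500, 500, maf=0.3, penetrances=[0.10, 0.20, 0.20])
```

By construction each row sums to the corresponding sample size, and the cell values are deterministic, which is why a single ΔT results per setting.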

Attachments
Attachment
Submitted filename: Plos One response letter.docx
Decision Letter - Mehdi Rahimi, Editor

PONE-D-21-39215R1
A nonparametric alternative to the Cochran-Armitage trend test in genetic case-control association studies: the Jonckheere-Terpstra trend test
PLOS ONE

Dear Dr. Zhou,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Jul 07 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.


We look forward to receiving your revised manuscript.

Kind regards,

Mehdi Rahimi, Ph.D.

Academic Editor

PLOS ONE

Additional Editor Comments:

Dear Author

The reviewer(s) have recommended major revisions to your manuscript. Therefore, I invite you to respond to the reviewer(s)' comments and revise your manuscript.

With Thanks


Reviewers' comments:

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]


Revision 2

Attachments
Attachment
Submitted filename: Plos One response letter.docx
Decision Letter - Mehdi Rahimi, Editor

PONE-D-21-39215R2
A nonparametric alternative to the Cochran-Armitage trend test in genetic case-control association studies: the Jonckheere-Terpstra trend test
PLOS ONE

Dear Dr. Zhou,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Dec 08 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.


We look forward to receiving your revised manuscript.

Kind regards,

Mehdi Rahimi, Ph.D.

Academic Editor

PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #2: (No Response)

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #2: No

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #2: No

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #2: This article concerns a power comparison of the Cochran-Armitage trend test and the non-parametric Jonckheere-Terpstra trend test, under common genetic models (additive, dominant, recessive), different minor allele frequencies and sample sizes, and for bi-allelic genetic variants. The article is concise and well written, and the conclusions are clear. I have some major and minor concerns detailed below.

Major concerns:

The example in the Real data analysis section on page 9 is not clearly a case of dominance; in fact, a co-dominant model seems to be the best, where the heterozygote has the highest risk. Variant rs20541 is thus a poor example to make the case of an advantage of the Jonckheere-Terpstra test. In fact, the clearest case of dominance is rs2398162 in Table S2. If the authors wish to practically illustrate the advantage of the Jonckheere-Terpstra test, then rs2398162 would be a better choice.

On line 192 the authors state “variant rs10900589 in ATP2B4 was associated with the disease in the Ghanaian samples and the association was replicated in the Gambian samples”. This statement is obviously false: Figure 2 shows a highly significant association for the Ghanaian sample but a clearly non-significant association for the Gambian sample. Please correct the sentence.

In genetic association studies it is common to test the SNPs involved for Hardy-Weinberg equilibrium (HWE). If there are significant deviations from HWE, the results of the association tests may be questioned, because disequilibrium is potentially indicative of genotyping errors (Hosking et al., 2004; Leal, 2005). HWE testing can be done with many genetic data analysis programs, such as PLINK (Purcell et al., 2007) or the R package HardyWeinberg (Graffelman, 2015). It is recommended to test cases and controls separately and to report exact Hardy-Weinberg p-values for each group; exact testing is the preferred approach, as it has the highest power. I suggest that the authors include HWE test results in the paper, as this can only improve the credibility of their conclusions.
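The exact test recommended above conditions on the observed allele counts and sums the probabilities of all genotype configurations no more probable than the observed one. As a minimal, self-contained sketch (the function name and interface here are illustrative; in practice one would use PLINK or the HardyWeinberg R package as suggested):

```python
from math import comb

def hwe_exact_pvalue(n_AA, n_Aa, n_aa):
    """Exact two-sided test for Hardy-Weinberg equilibrium.

    Sums, over all heterozygote counts compatible with the observed
    allele counts, the probabilities that do not exceed the probability
    of the observed genotype configuration."""
    n = n_AA + n_Aa + n_aa
    n_a = 2 * n_aa + n_Aa          # total count of the 'a' allele
    n_A = 2 * n - n_a              # total count of the 'A' allele

    def prob(het):
        # P(het heterozygotes | allele counts) = multinomial * 2^het / C(2n, n_a)
        hom_a = (n_a - het) // 2
        hom_A = n - het - hom_a
        num = comb(n, hom_A) * comb(n - hom_A, het) * 2 ** het
        return num / comb(2 * n, n_a)

    obs = prob(n_Aa)
    p = 0.0
    # heterozygote counts share the parity of n_a and cannot exceed
    # either allele count
    for het in range(n_a % 2, min(n_a, n_A) + 1, 2):
        ph = prob(het)
        if ph <= obs + 1e-12:      # tolerance for float comparison
            p += ph
    return min(p, 1.0)
```

For the degenerate case of one AA and one aa individual, the only configurations are zero or two heterozygotes, with conditional probabilities 1/3 and 2/3, so the p-value for the observed (0-heterozygote) table is 1/3.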

The authors emphasize the power gain of the Jonckheere-Terpstra test under the dominant model. However, the power loss of this test under the recessive model is generally much larger than the gain under the dominant model (see Figure 1, Table 2). In practice one often does not know the correct genetic model. One could thus say that, a priori and overall, the Cochran-Armitage test may be the best choice, at least if the MAF is not low. This conclusion should be added to the evaluation of the tests in the Discussion section.

In genetic association studies with SNPs, it is also common to test for association not only at the genotype level but also at the level of alleles. A standard test for this purpose is the so-called alleles test (Laird & Lange, 2011). For all nine empirical SNPs (Figure 2, Table S2), the alleles test leads to the same conclusion as the Jonckheere-Terpstra test. This could at least be stated, as it strengthens the conclusions of the “Real data analysis” section.
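The alleles test mentioned above reduces to a chi-square test (1 df) on the 2×2 table of allele counts in cases and controls. A minimal sketch, assuming genotype counts are supplied in the order (n_AA, n_Aa, n_aa) and the helper names are hypothetical:

```python
def allelic_chi2(case, control):
    """Alleles test: chi-square statistic (1 df) for the 2x2 table of
    allele counts, given case and control genotype counts
    (n_AA, n_Aa, n_aa)."""
    def allele_counts(g):
        n_AA, n_Aa, n_aa = g
        # each homozygote carries two copies, each heterozygote one of each
        return 2 * n_AA + n_Aa, 2 * n_aa + n_Aa

    a, b = allele_counts(case)      # case A alleles, case a alleles
    c, d = allele_counts(control)   # control A alleles, control a alleles
    n = a + b + c + d
    # standard 2x2 chi-square without continuity correction
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
```

The corresponding p-value follows from the chi-square distribution with one degree of freedom (e.g. `scipy.stats.chi2.sf(stat, 1)`).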

Minor issues:

L59: “recource” -- “resource”

L138: “set of simulation” -- “set of simulations”

L161: “We refer this table consisted of” -- “This table consists of”

L162: delete “as an expected table”.

L167, L202: “were reported” -- “are reported”

L262: capitalize “Kendall”

References:

Graffelman, J. (2015) Exploring diallelic genetic markers: the HardyWeinberg package. Journal of Statistical Software 64(3): 1–23.

Hosking, L., Lumsden, S., Lewis, K., Yeo, A., McCarthy, L., Bansal, A., Riley, J., Purvis, I. and Xu, C. (2004) Detection of genotyping errors by Hardy-Weinberg equilibrium testing. European Journal of Human Genetics 12: 395–399.

Laird, N. M. and Lange, C. (2011) The Fundamentals of Modern Statistical Genetics. Springer.

Leal, S. M. (2005) Detection of genotyping errors and pseudo-SNPs via deviations from Hardy-Weinberg equilibrium. Genetic Epidemiology 29: 204–214.

Purcell, S., Neale, B., Todd-Brown, K., Thomas, L., Ferreira, M. A. R., Bender, D., Maller, J., Sklar, P., de Bakker, P. I. W., Daly, M. J. and Sham, P. C. (2007) PLINK: a tool set for whole-genome association and population-based linkage analyses. American Journal of Human Genetics 81(3): 559–575.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #2: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Revision 3

The Word document of the response letter is also included in this submission.

Reviewer #2: This article concerns a power comparison of the Cochran-Armitage trend test and the non-parametric Jonckheere-Terpstra trend test, under common genetic models (additive, dominant, recessive), different minor allele frequencies and sample sizes, and for bi-allelic genetic variants. The article is concise and well written, and the conclusions are clear. I have some major and minor concerns detailed below.

Thank you very much for your concise summary and encouraging comments. We greatly appreciate your time and effort.

Major concerns:

The example in the Real data analysis section on page 9 is not clearly a case of dominance; in fact, a co-dominant model seems to be the best, where the heterozygote has the highest risk. Variant rs20541 is thus a poor example to make the case of an advantage of the Jonckheere-Terpstra test. In fact, the clearest case of dominance is rs2398162 in Table S2. If the authors wish to practically illustrate the advantage of the Jonckheere-Terpstra test, then rs2398162 would be a better choice.

Thank you for this suggestion. In this revision we updated the Real data analysis section to use variant rs2398162 from the hypertension study as the example of a dominant model. The relevant text and Table S2 were updated to reflect this change.

On line 192 the authors state “variant rs10900589 in ATP2B4 was associated with the disease in the Ghanaian samples and the association was replicated in the Gambian samples”. This statement is obviously false: Figure 2 shows a highly significant association for the Ghanaian sample but a clearly non-significant association for the Gambian sample. Please correct the sentence.

Thank you for this observation, and we apologize that the previous statement was inaccurate. Loley et al. (2013, EJHG 21:1442-1448) showed that this signal was significant in the Gambian group only when coded under a recessive model (their Table 1). In this revision we corrected the sentence to state that “This association was also evaluated in the Gambian samples and it was significant under a recessive model but insignificant under dominant and additive models.”

In genetic association studies it is common to test the SNPs involved for Hardy-Weinberg equilibrium (HWE). If there are significant deviations from HWE, the results of the association tests may be questioned, because disequilibrium is potentially indicative of genotyping errors (Hosking et al., 2004; Leal, 2005). HWE testing can be done with many genetic data analysis programs, such as PLINK (Purcell et al., 2007) or the R package HardyWeinberg (Graffelman, 2015). It is recommended to test cases and controls separately and to report exact Hardy-Weinberg p-values for each group; exact testing is the preferred approach, as it has the highest power. I suggest that the authors include HWE test results in the paper, as this can only improve the credibility of their conclusions.

Thank you for this suggestion. In this revision we conducted the exact test for HWE among the cases and the controls for each variant using the suggested R package (HardyWeinberg). The p-values are summarized in S3 Table, and we concluded that “Results showed that the p-values of the HWE tests for all the variants were larger than 0.01, with only two between 0.01 and 0.05, suggesting that there was little evidence of genotyping error among the variants”.

The authors emphasize the power gain of the Jonckheere-Terpstra test under the dominant model. However, the power loss of this test under the recessive model is generally much larger than the gain under the dominant model (see Figure 1, Table 2). In practice one often does not know the correct genetic model. One could thus say that, a priori and overall, the Cochran-Armitage test may be the best choice, at least if the MAF is not low. This conclusion should be added to the evaluation of the tests in the Discussion section.

Thank you for raising this excellent point. In this revision, we added guidance on the choice between the JT and CA trend tests when the true genetic model is unknown at the end of the Discussion section.

In genetic association studies with SNPs, it is also common to test for association not only at the genotype level but also at the level of alleles. A standard test for this purpose is the so-called alleles test (Laird & Lange, 2011). For all nine empirical SNPs (Figure 2, Table S2), the alleles test leads to the same conclusion as the Jonckheere-Terpstra test. This could at least be stated, as it strengthens the conclusions of the “Real data analysis” section.

Thank you for this suggestion. We conducted the allelic test for the variants included in the Real data analysis section and confirmed that its results were consistent with those of the discussed methods. The results of the allelic test are included in S3 Table.

Minor issues:

L59: “recource” -- “resource”

L138: “set of simulation” -- “set of simulations”

L161: “We refer this table consisted of” -- “This table consists of”

L162: delete “as an expected table”.

L167, L202: “were reported” -- “are reported”

L262: capitalize “Kendall”

Thank you for these observations. We have made the corresponding changes in this revision.

Attachment
Submitted filename: Response letter.docx
Decision Letter - Mehdi Rahimi, Editor

A nonparametric alternative to the Cochran-Armitage trend test in genetic case-control association studies: the Jonckheere-Terpstra trend test

PONE-D-21-39215R3

Dear Dr. Zhou,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Mehdi Rahimi, Ph.D.

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #2: This article concerns a power comparison of the Cochran-Armitage trend test and the non-parametric Jonckheere-Terpstra trend test, under common genetic models (additive, dominant, recessive), different minor allele frequencies and sample sizes, and for bi-allelic genetic variants. The article has improved after the previous round of review. The authors have addressed most of my concerns satisfactorily. I have only some minor points for improvement left which are detailed below.

L75: “as Table” --- “as in Table”

L82: The null hypothesis is supposed to apply to all i (0, 1 or 2); that should be stated.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #2: No

**********

Formally Accepted
Acceptance Letter - Mehdi Rahimi, Editor

PONE-D-21-39215R3

A nonparametric alternative to the Cochran-Armitage trend test in genetic case-control association studies: the Jonckheere-Terpstra trend test

Dear Dr. Zhou:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Associate Prof. Mehdi Rahimi

Academic Editor

PLOS ONE

Open letter on the publication of peer review reports

PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.

We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.

Learn more at ASAPbio.