Peer Review History

Original Submission: November 29, 2021
Decision Letter - Richard Huan XU, Editor

PONE-D-21-37787
Discrete choice experiment versus swing-weighting: A head-to-head comparison
PLOS ONE

Dear Dr. Whichello,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Aug 26 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Richard Huan XU

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please include a complete ethics statement in the Methods section, including the name of the IRB, the approval number, and a statement on whether the study was approved or whether approval was waived. We note that the current statement in the Appendices appears to state both that the study was approved and that approval was waived - please specify which of these is correct. Please also clarify how participants provided consent.

3. Please provide additional details regarding participant consent. In the ethics statement in the Methods and online submission information, please ensure that you have specified what type you obtained (for instance, written or verbal, and if verbal, how it was documented and witnessed). If your study included minors, state whether you obtained consent from parents or guardians. If the need for consent was waived by the ethics committee, please include this information.

4. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thanks to the authors for this interesting paper. This study compares a discrete choice experiment and swing-weighting in quantifying the relative importance of attributes of glucose-monitoring devices. The study design and implementation are robust, but a few places need further clarification or additional details.

1. I would suggest that the title specify that this DCE/SW comparison was made for glucose-monitoring devices among patients with diabetes.

2. Could the authors show the initial 12 attributes derived from the interviews in the supplementary file and the primary reasons why some of them were excluded? It would help the audience understand how the seven attributes were selected.

3. It seems the order of the attributes shown in the DCE choice sets was not randomized across respondents, which may lead to position bias (cost and precision, the last and first attributes in the choice tasks, happened to be the two most important attributes identified in the DCE). Please clarify whether randomization was done, why not if it was not, and how this may affect the outcome (in comparison with the SW, in which the attributes were randomized).

4. What is the response rate of the survey? Was the randomization to the three blocks and to the order of DCE and SW stratified by age, type of diabetes, and/or current glucose-monitor usage, etc.? Could the authors also report whether there were differences in key characteristics of the patients across DCE blocks and DCE/SW orders?

5. For the DCE part, what percentage of respondents in the analytical dataset (n=459) consistently chose the alternative on the left-hand or right-hand side in all 12 choice tasks? Such behaviour may indicate that respondents did not actually focus on the choice tasks.

6. Is there any difference in the relative importance of the attributes (for both methods, as shown in Figure 3) for patients with different glucose-monitoring device usage and health literacy/numeracy? This could show whether the DCE and SW results are sensitive to respondents' relevant knowledge and experience, which can help other researchers interpret their results from these methods.

7. Could the authors provide more details on the sensitivity analysis of respondent-level rankings of the attributes using the ordered logit models, e.g. how the respondent-level ranking of attributes was defined in the DCE, the model specification, etc.? In addition, the sensitivity analysis showed that finger-prick frequency was more likely to rank highest in the DCE than in the SW, while Figure 3 shows the proportion of importance of finger-pricks as 9.0% in the DCE, lower than 15.4% and 17.7% in the SW. Could the authors explain this inconsistency?

8. The relative importance identified by ROC and point allocation seems very similar in Table 3 and Figure 3. Could the authors clarify how they drew the conclusion that “The findings of this study support the conclusion that point allocation is a more robust weight calculation method than ROC” (lines 372-373), apart from the evidence from previous studies?

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Revision 1

Dear PLOS ONE Editors,

We would like to re-submit our paper PONE-D-21-37787, entitled ‘Discrete choice experiment versus swing-weighting: A head-to-head comparison’, to PLOS ONE, because we believe this article could be of great interest to you and your readers. We sincerely appreciate the comments from your reviewer, and we address each of them below.

1) Comment: I would suggest that the title should specify this DCE/SW comparison was made for glucose monitoring device among patients with diabetes.

i. Response: Thank you for the suggestion; we have re-titled the paper “Discrete choice experiment versus swing-weighting: A head-to-head comparison of diabetic patient preferences for glucose-monitoring devices”.

2) Comment: Could the authors show the initial 12 attributes derived from the interviews in the supplementary file and the primary reasons why some of them were excluded? It would help the audience understand how the seven attributes were selected.

i. Response: In step 3, the list of 12 attributes was rated by the research team according to relevance, completeness, non-redundancy, operationality, and preferential independence. This process resulted in five attributes being removed (for failing these criteria) or combined with other, similar attributes, yielding a final list of seven attributes that were used for the DCE and SW (Table 1).

3) Comment: It seems the order of the attributes shown in the DCE choice sets was not randomized across respondents, which may lead to position bias (cost and precision, the last and first attributes in the choice tasks, happened to be the two most important attributes identified in the DCE). Please clarify whether randomization was done, why not if it was not, and how this may affect the outcome (in comparison with the SW, in which the attributes were randomized).

i. Response: It is correct that attribute randomization was not implemented for the DCE; this is addressed, and its impact discussed, in paragraph 3 of the limitations section. As discussed there, lexicographic behaviour was very low in the dataset, so the impact was minimal.

4) Comment: What is the response rate of the survey? Was the randomization to the three blocks and to the order of DCE and SW stratified by age, type of diabetes, and/or current glucose-monitor usage, etc.? Could the authors also report whether there were differences in key characteristics of the patients across DCE blocks and DCE/SW orders?

i. Response: Of 5,620 invited participants, 500 completed the survey, a response rate of 8.9%, which we have added to the supplemental file. Randomization to the elicitation-method order (50:50) and to the three DCE blocks (33:33:33) was performed across the entire sample without stratification, and there were no meaningful differences in sociodemographic or clinical characteristics between the assignments.

5) Comment: For the DCE part, what percentage of respondents in the analytical dataset (n=459) consistently chose the alternative on the left-hand or right-hand side in all 12 choice tasks? Such behaviour may indicate that respondents did not actually focus on the choice tasks.

i. Response: As reported in the section “Discrete choice experimental results”, there was a slight left-right bias, which is also reported in the last row of Table 2, with a mean coefficient of 0.359 (p<0.01). However, this is very common in DCE surveys in languages that read from left to right, and it is in line with other DCE studies.

6) Comment: Is there any difference in the relative importance of the attributes (for both methods, as shown in Figure 3) for patients with different glucose-monitoring device usage and health literacy/numeracy? This could show whether the DCE and SW results are sensitive to respondents' relevant knowledge and experience, which can help other researchers interpret their results from these methods.

i. Response: It is possible that preferences vary between subgroups, such as those with high or low health numeracy/literacy. However, we feel that exploring this would be outside the scope of this paper and not aligned with the research objectives. Since this paper is methodological in focus, we prioritised comparing the DCE and SW using a single, identical sample; comparing the relative attribute importance of various sociodemographic or clinical subgroups would shift that focus.

7) Comment: Could the authors provide more details on the sensitivity analysis of respondent-level rankings of the attributes using the ordered logit models, e.g. how the respondent-level ranking of attributes was defined in the DCE, the model specification, etc.? In addition, the sensitivity analysis showed that finger-prick frequency was more likely to rank highest in the DCE than in the SW, while Figure 3 shows the proportion of importance of finger-pricks as 9.0% in the DCE, lower than 15.4% and 17.7% in the SW. Could the authors explain this inconsistency?

i. Response: Thank you for this comment; we have provided more detail about the ordered logit model under the heading “Sensitivity analysis”. Figure 3 shows the relative importance of a single attribute as a proportion of total attribute importance, i.e. the share of choice-task decisions driven by that attribute, identifying which attribute had the largest influence on patients’ choices of treatments. The differences between the DCE and SW are discussed in detail in the section “Comparison of weight distribution between the DCE and SW”. The sensitivity analysis in Appendix VI does not show that finger-prick frequency would be “more likely” to rank higher overall in the DCE than in the SW: merely that, for example, this attribute had a 47% chance of being ranked #1 in the DCE, compared to a 19% chance in the SW. The sensitivity analysis strictly compared individual rankings (1 to 7) against other individual rankings (1 to 7).
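To illustrate the "proportion of total attribute importance" calculation described above, the following sketch uses hypothetical coefficients (not the study's actual estimates): each attribute's importance is taken as its utility range (best-level coefficient minus worst-level coefficient), expressed as a share of the summed ranges across all attributes.

```python
# Illustrative sketch with made-up marginal-utility coefficients,
# one list of level coefficients per attribute (worst -> best level).
coefficients = {
    "precision":     [0.0, 0.45, 0.90],
    "finger-pricks": [0.0, 0.30],
    "cost":          [0.0, 0.40, 0.85],
    "alarm":         [0.0, 0.15],
}

# Utility range per attribute (difference between best and worst level).
ranges = {attr: max(levels) - min(levels) for attr, levels in coefficients.items()}
total = sum(ranges.values())

# Relative importance as a share of total importance (shares sum to 100%).
relative_importance = {attr: 100 * r / total for attr, r in ranges.items()}

for attr, ri in sorted(relative_importance.items(), key=lambda kv: -kv[1]):
    print(f"{attr}: {ri:.1f}%")
```

With these hypothetical numbers, precision and cost dominate, mirroring how a single attribute's share (e.g. 9.0% for finger-pricks) is read off Figure 3.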

8) Comment: The relative importance identified by ROC and point allocation seems very similar in Table 3 and Figure 3. Could the authors clarify how they drew the conclusion that “The findings of this study support the conclusion that point allocation is a more robust weight calculation method than ROC” (lines 372-373), apart from the evidence from previous studies?

i. Response: Thank you for this helpful comment. You are correct that this is an overstatement and we have adjusted it to be more in line with our previous statements: that there are indeed slight differences between point allocation and the ROC method, but point allocation was as robust as ROC for this study.

Thank you for your attention, and we look forward to hearing from you,

Chiara Lauren Whichello, PhD

Attachments
Attachment
Submitted filename: PONE-D-21-37787 Diabetes Response Letter.docx
Decision Letter - Richard Huan XU, Editor

PONE-D-21-37787R1
Discrete choice experiment versus swing-weighting: A head-to-head comparison of diabetic patient preferences for glucose-monitoring devices
PLOS ONE

Dear Dr. Whichello,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Apr 15 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Richard Huan XU

Academic Editor

PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: (No Response)

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thanks to the authors for providing revisions to the manuscript, which address most of the comments.

There is one more comment from me. For the sensitivity analysis of attribute rankings between the DCE and SW, while the authors have added a few details to its method, it is still unclear how the respondent-level ranking of attributes was derived from the DCE. Could the authors explain more specifically how the DCE marginal utilities (which I assume refers to the model estimates in Table 2) were used to derive the attribute rankings of each individual respondent?

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Revision 2

Dear PLOS ONE Editors,

We would like to re-submit our paper PONE-D-21-37787, entitled ‘Discrete choice experiment versus swing-weighting: A head-to-head comparison of diabetic patient preferences for glucose-monitoring devices’, to PLOS ONE, because we believe this article could be of great interest to you and your readers. We sincerely appreciate the minor revision requested by your reviewer, which we address in this resubmission.

1) Reviewer #1: Thanks to the authors for providing revisions to the manuscript, which address most of the comments. There is one more comment from me. For the sensitivity analysis of attribute rankings between the DCE and SW, while the authors have added a few details to its method, it is still unclear how the respondent-level ranking of attributes was derived from the DCE. Could the authors explain more specifically how the DCE marginal utilities (which I assume refers to the model estimates in Table 2) were used to derive the attribute rankings of each individual respondent?

i. Response: Thank you for this comment; we have provided more detail under the heading “Sensitivity analyses” on the methodology of the (generalised) ordered logit model. We hope that this explanation provides greater clarity:

“The respondent-level rankings of the attributes in the DCE and the SW were identified by determining how each individual participant ranked the attributes: by the participant's own rankings for the SW, or by ranked marginal utilities for the DCE. These individual rankings were then compared using a (generalised) ordered logit model. For the SW, these were the individual-specific ranking outputs derived from Step 1 of the SW method, ranking the attributes from 1 to 7. For the DCE, the individual-specific ranking of each attribute was determined by first examining the patient uptake rates for the most preferred device (high precision, zero finger-pricks, low effort, low skin irritability, 25 euro, plain information, no alarm), replicating the same device hypothetically created during the SW exercise (i.e. the swing from ‘worst’ to ‘best’ attributes). As detailed above, the systematic utility is defined as an additive function consisting of marginal utilities. The coefficients for each corresponding attribute level for each individual were therefore ranked from lowest to highest and given a corresponding value (1 to 7). Each participant’s DCE rankings were then compared against their SW rankings through the ordered logit model, determining the probability that an attribute had of being ranked 1-7 in either the DCE or the SW.”
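The ranking step quoted above can be sketched as follows, using hypothetical individual-level utility swings (the attribute names follow the device description in the quote; the numbers are illustrative, not the study's estimates). Each respondent's attributes are ranked 1-7 by the size of the marginal-utility swing, and the resulting DCE ranks are paired with that respondent's direct SW ranks before being fed into the ordered logit model.

```python
ATTRIBUTES = ["precision", "finger-pricks", "effort", "skin irritation",
              "cost", "information", "alarm"]

def rank_attributes(swing_utilities):
    """Map each attribute to a rank, 1 = largest utility swing."""
    ordered = sorted(swing_utilities, key=swing_utilities.get, reverse=True)
    return {attr: rank for rank, attr in enumerate(ordered, start=1)}

# Hypothetical individual-level utility swings for one respondent (DCE).
dce_swings = {"precision": 0.90, "finger-pricks": 0.30, "effort": 0.25,
              "skin irritation": 0.20, "cost": 0.85, "information": 0.10,
              "alarm": 0.15}
dce_ranks = rank_attributes(dce_swings)

# The same respondent's direct ranking from Step 1 of the SW exercise.
sw_ranks = {"precision": 2, "finger-pricks": 1, "effort": 4,
            "skin irritation": 5, "cost": 3, "information": 7, "alarm": 6}

# These paired ranks per attribute are what the (generalised) ordered
# logit model would compare to estimate rank probabilities per method.
for attr in ATTRIBUTES:
    print(f"{attr}: DCE rank {dce_ranks[attr]}, SW rank {sw_ranks[attr]}")
```

The model specification itself (cut points, link function) is as described in the manuscript's "Sensitivity analyses" section; this sketch only shows how the 1-7 inputs are constructed.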

We sincerely hope that this submission will be accepted upon resubmission. Thank you for your attention, and we look forward to hearing from you,

Chiara Lauren Whichello, PhD

Attachments
Attachment
Submitted filename: PONE-D-21-37787 Response to Reviewers 17Mar2023.docx
Decision Letter - Richard Huan XU, Editor

Discrete choice experiment versus swing-weighting: A head-to-head comparison of diabetic patient preferences for glucose-monitoring devices

PONE-D-21-37787R2

Dear Dr. Whichello,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Richard Huan XU

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Formally Accepted
Acceptance Letter - Richard Huan XU, Editor

PONE-D-21-37787R2

Discrete choice experiment versus swing-weighting: A head-to-head comparison of diabetic patient preferences for glucose-monitoring devices

Dear Dr. Whichello:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Richard Huan XU

Academic Editor

PLOS ONE

Open letter on the publication of peer review reports

PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.

We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.

Learn more at ASAPbio.