Peer Review History
| Original Submission: April 26, 2023 |
|---|
|
PONE-D-23-12291
Cost-effectiveness of incorporating Ebola prediction score tools and rapid diagnostic tests into a screening algorithm: a decision analytic model
PLOS ONE

Dear Dr. Tshomba,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Jul 07 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Jan Rychtář
Academic Editor
PLOS ONE

Journal Requirements: When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf.

2. Please provide additional details regarding participant consent. In the ethics statement in the Methods and online submission information, please ensure that you have specified (1) whether consent was informed and (2) what type you obtained (for instance, written or verbal, and if verbal, how it was documented and witnessed). If your study included minors, state whether you obtained consent from parents or guardians. If the need for consent was waived by the ethics committee, please include this information.
If you are reporting a retrospective study of medical records or archived samples, please ensure that you have discussed whether all data were fully anonymized before you accessed them and/or whether the IRB or ethics committee waived the requirement for informed consent. If patients provided informed written consent to have data from their medical records used in research, please include this information.

3. In your Data Availability statement, you have not specified where the minimal data set underlying the results described in your manuscript can be found. PLOS defines a study's minimal data set as the underlying data used to reach the conclusions drawn in the manuscript and any additional data required to replicate the reported study findings in their entirety. All PLOS journals require that the minimal data set be made fully available. For more information about our data policy, please see http://journals.plos.org/plosone/s/data-availability.

Upon re-submitting your revised manuscript, please upload your study’s minimal underlying data set as either Supporting Information files or to a stable, public repository and include the relevant URLs, DOIs, or accession numbers within your revised cover letter. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories. Any potentially identifying patient information must be fully anonymized.

Important: If there are ethical or legal restrictions to sharing your data publicly, please explain these restrictions in detail. Please see our guidelines for more information on what we consider unacceptable restrictions to publicly sharing data: http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions. Note that it is not acceptable for the authors to be the sole named individuals responsible for ensuring data access.
We will update your Data Availability statement to reflect the information you provide in your cover letter.

Additional Editor Comments: The reviewer makes several suggestions for revisions and improvements, and I urge the authors to take all of these suggestions into consideration and to make appropriate changes to their manuscript.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous.
Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Tshomba et al performed a computational decision tree analysis to compute the cost-effectiveness of different Ebola screening algorithms. They determined that combining the WHO case definition with the ECPS screening procedure and additionally screening negative suspect cases with the QuickNavi molecular rapid test is the most cost-effective strategy. Additional sensitivity analyses indicate that this recommendation holds across the range of plausible parameter measurement error. The approach used is sensible and is applied reasonably. However, I have a few technical questions/comments that are relevant to the overall conclusions; in particular, I wonder whether the cost of missing an infected individual during screening is appropriately estimated here. The manuscript’s findings are relevant to healthcare workers and public-health officials involved in Ebola outbreak response and address a very important issue with real-world implications. Therefore, I recommend publication after the technical issues have been addressed.

MAJOR COMMENTS:

- Should the potential for additional infections in the community be incorporated into the cost of missing a case (false negative error)? The payoff for this outcome is set to 0 “by assumption” in Table 3, which I’m guessing was chosen to simply represent the lack of benefit acquired from correctly isolating a case.
However, missing a true Ebola infection may have additional public-health consequences, as the infected individual may infect several others in the community, which is not considered here. In the current analysis, the cost of erroneous isolation (false positive) is greater than the cost of missing a true case, which would not make sense if an infected individual allowed to remain in the community is expected to infect 2 or more others. Please consider incorporating these secondary effects into your analysis, as the risk of infecting others due to a false negative screening result may outweigh the risk of erroneous isolation, which is quite costly in the current analysis.

- Can the cost of erroneously isolating an uninfected individual (given in Table 3) be reduced by improving isolation practices to reduce the probability of becoming infected as a result of isolation?

- It is not clear to me what the precise definitions of screening algorithms 3 and 4 are. They are simply described as “2) ECPS as a join [sic] test, 3) ECPS as a conditional test” (lines 144-145). The tree in Fig. 1 does not clarify this point, but rather simply labels a branch as “ECPS as a conditional test”. From that description, I’m not sure what other test(s) are performed in addition to the ECPS. Since the molecular RDT is described separately and the only other screening algorithm mentioned is the WHO criteria, I assume the ECPS is being used joint/conditionally with the WHO criteria. Also, in terms of the conditional version of the test, I’m not sure which screening protocol is being applied first. Please clarify further in the text.

- The results in Tables 4 and 5 for algorithms 3 and 4 are exactly the same, which seems unlikely given their different sensitivities/specificities in Table 2. Is this expected?

- I’m confused as to why the sensitivity analyses in Fig. 2B-C show no change in cost-effectiveness for any condition tested.
For example, a back-of-the-envelope calculation suggests that in 2C, if the sensitivity of the conditional test increases by ~12% (as it does over the range of the x-axis, 0.61 to 0.69), then the cost-effectiveness ratio of algorithm 4 should decrease substantially, from ~$106/case to ~$95/case, since the number of cases identified should go up by 12% as well (from Equation 4 in S1 Text). This change should be visible in Fig. 2C but is not. Please explain.

- The sensitivities of the ECPS joint and conditional tests (algorithms 3 and 4) are presumably not independent variables, since both are functions of the sensitivities of the ECPS and the other screening method with which the ECPS was combined (again, possibly the WHO definition?). Therefore, if the joint sensitivity changes, the conditional sensitivity will also change, and treating these two values as separate quantities in the sensitivity analysis doesn’t make much sense. A better choice might be to perform the sensitivity analysis in Fig. 2 on the sensitivities of the individual tests making up the joint/conditional screening protocols (i.e., the ECPS and WHO protocol sensitivities).

- Is the WTP value of $50,000 used in Fig. 6 and specified as the “default” value in the Methods appropriate in any circumstance? It seems to me that the country-specific WTP threshold (used in Fig. 7, for example) is much more relevant and that the $50,000 value was chosen arbitrarily and is far too high for the likely use contexts of these screening algorithms.

MINOR COMMENTS:

- The description of algorithm 4 in Table 1 should indicate that this is a conditional test, not a joint test.

- “Joint test” is sometimes written erroneously as “join test”.

- The resolution of some of the original figure files (e.g., Fig. 2) is quite low, making them difficult to read.

- A couple of clauses end with “or so” (e.g., lines 536, 545), which I don’t believe is standard English; perhaps they can be replaced with “etc”?
- Perhaps defining the term “incremental” (e.g., cost vs. incremental cost) in the text would help readers who are unfamiliar with formal cost-effectiveness analysis.

- Table 5 includes a lot of numerical data and is difficult to interpret quickly. Perhaps plotting these data as a heatmap, for example, would make them more interpretable? The raw tabular data can be included in the supplement.

- Raw numerical outputs from key parts of the cost-effectiveness analysis are included in the main text (Tables 4 and 5). However, raw data from most of the sensitivity analyses (Figs. 3-7) are not available. These data are probably reproducible using the commercial software used by the authors, but this would not be accessible to many readers.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Debra Van Egeren

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool.
If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step. |
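The reviewer's back-of-the-envelope check on Fig. 2C can be sketched in a few lines. This is a minimal illustration, not Equation 4 from S1 Text: `ce_ratio` is a hypothetical helper that assumes the total screening cost stays roughly constant, so the cost-per-case ratio scales inversely with sensitivity; the numbers come from the reviewer's comment.

```python
def ce_ratio(cost_per_case_at_base, base_sens, new_sens):
    """Rescale a cost-per-case ratio when only sensitivity changes,
    assuming the total cost (the numerator) is roughly fixed."""
    return cost_per_case_at_base * base_sens / new_sens

# Algorithm 4 at the left edge of the Fig. 2C x-axis: ~$106/case at sensitivity 0.61.
base = 106.0

# Move to the right edge of the x-axis (sensitivity 0.69, a ~12% relative increase).
rescaled = ce_ratio(base, 0.61, 0.69)
print(round(rescaled, 1))  # roughly $94/case, in line with the reviewer's ~$95 estimate
```

This is why the reviewer expects a visible downward slope in Fig. 2C rather than a flat line.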
| Revision 1 |
|
PONE-D-23-12291R1
Cost-effectiveness of incorporating Ebola prediction score tools and rapid diagnostic tests into a screening algorithm: a decision analytic model
PLOS ONE

Dear Dr. Tshomba,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

==============================

While the reviewer appreciates that most of the previous concerns were addressed, there are still several issues that need to be improved/addressed. Please revise your manuscript accordingly, addressing all of the reviewer's comments.

==============================

Please submit your revised manuscript by Aug 16 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
We look forward to receiving your revised manuscript.

Kind regards,
Jan Rychtář
Academic Editor
PLOS ONE

Additional Editor Comments: While the reviewer appreciates that most of the previous concerns were addressed, there are still several issues that need to be improved/addressed.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: (No Response)

**********

2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions.
Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented. Reviewer #1: Partly ********** 3. Has the statistical analysis been performed appropriately and rigorously? Reviewer #1: Yes ********** 4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified. Reviewer #1: Yes ********** 5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here. Reviewer #1: Yes ********** 6. Review Comments to the Author Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters) Reviewer #1: Thank you for addressing the majority of my questions and comments. 
However, I still have concerns about how the costs of false positives and false negatives are being estimated, which were further exposed while I was reviewing the changes made during this revision.

I think additional clarification and/or justification is required for the term of the form (1-(1-θ)^δ) in equations 12-15 in S2 File, which estimates the “probability of infection given random contact with an EVD patient”. The secondary attack rate θ is estimated in the relevant reference (Okware et al) as the fraction of contacts of an EVD case that get infected. Given that definition, I don’t understand why the duration of infectiousness (1/δ) is being used here in this way. I suppose if you’re assuming that the risk to each suspected case entering isolation is equivalent to the risk of coming into contact with a single EVD patient for one day, and you are assuming the published secondary attack rate is being estimated from household contacts who are exposed continuously for the entire duration of infectiousness, there is some possible justification for this equation, which would need to be further explained in the manuscript.

Unfortunately, I believe that this risk estimate cannot be appropriately justified using this reasoning: it’s not clear from the reference for the secondary attack rate value exactly how a contact is defined, and there are published estimates for the household attack rate that are much higher than 2.5% (see the meta-analysis in Dean et al. Clin Infect Dis 2016 https://doi.org/10.1093/cid/ciw114, which includes the reference cited here).
A better way to estimate infection risk for false positives would probably be to use results from a study specifically designed to measure the risk of nosocomial infection in patients erroneously isolated in EVD units (e.g., Arkell et al Tropical Medicine and International Health 2016 https://doi.org/10.1111/tmi.12802), which seem to estimate a higher risk of infection than calculated in this manuscript (though the absolute risk of exposure is still reasonably low).

The estimate of risk of community transmission from a false negative (which was calculated using the same reasoning) is even more problematic. Again, the assumption being made seems to be that the expected number of cases resulting from a true EVD case being allowed to remain in the community is equivalent to the risk of a single person coming into contact with that EVD patient for one day, except now the secondary attack rate is 50% lower than was estimated previously (presumably to account for lower risk from “asymptomatic” EVD patients). This seems to me to be quite an underestimation of the expected number of cases. First, I think these patients are not truly “asymptomatic” and instead may have nonspecific symptoms or be presymptomatic, so I don’t think there is good justification for the assumption that their infectiousness is lower (especially by an arbitrary 50%, which the authors do not justify). Also, I’m guessing that many of these patients will return to their homes after a negative screening, potentially exposing multiple people (possibly in higher-risk caregiving roles) over multiple days, making the expected number of new cases resulting from a single false negative much higher than the 0.00125 calculated here.
Using the R0 value for EVD community transmission (while perhaps a bit of an overestimate) for the effect of a false negative is likely more correct and more easily justified, and would result in a false negative penalty that is orders of magnitude higher than the value currently used here. Since these false negative/positive risks are very important factors in the cost-effectiveness analysis (and thus the conclusions of the whole manuscript), they require suitable justification before I can recommend publication. I apologize for not noticing these issues in my first review.

I have a couple of additional minor comments:

- I think Eq. 13 in S2 File is missing a 1-Prev term?

- Please explicitly state in the data availability statement that the raw data are available in the Supporting Outputs Data so readers know exactly where to find them.

- Figs. 7 and 8 legends: define the red/green colors of points.

- It’s still a little unclear in the text what “joint” and “conditional” mean when referring to the ECPS test. The authors define these terms very clearly in their previous publication (ref. [26] in this manuscript, relevant passage copied below), and I’d suggest adding a similar description here or at least referencing this publication at the point in the text where these algorithms are defined.

“Finally, we evaluated our prediction models according to two additional clinical practice approaches in healthcare settings: joint and conditional tests or approaches. In both approaches, the suspects with no reported risk of exposure would be considered to not have the disease and the clinical team would act accordingly. No additional action, e.g., isolation, would be required.
In the joint approach, the clinical team should clinically examine all suspects at low-, intermediate-, and high-risk reported exposure and recommend for isolation only those with a predicted probability of EVD greater than 5% (the cut-off chosen to maximize sensitivity, about 90 percent, in disease adverse context). In the conditional approach, the clinical team should isolate all suspects with high-risk reported exposure irrespective of their predicted probability of the disease and then suspects at low and intermediate reported exposure having an EVD-predicted probability greater than 5%.”

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Debra Van Egeren

**********
|
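To make the scale of the disagreement over the (1-(1-θ)^δ) term concrete, the sketch below compares the per-contact, per-day probability with what multiple contacts over multiple days would imply. All parameter values here are illustrative assumptions, not the manuscript's actual inputs.

```python
theta = 0.025   # secondary attack rate: fraction of contacts infected over the
                # whole infectious period (illustrative value from the discussion)
delta = 1 / 8   # recovery rate, assuming an ~8-day infectious period

# Per-contact, per-unit-time infection probability, the form of the term
# debated above (as used inside the Gilbert et al. SIR formulation):
p_per_contact_day = 1 - (1 - theta) ** delta

# Expected new cases if a missed (false negative) case instead makes
# c contacts per day for d days, each carrying that per-day risk
# (a hypothetical contact pattern):
c, d = 5, 8
expected_cases = c * d * p_per_contact_day

print(p_per_contact_day)  # much smaller than theta itself
print(expected_cases)     # tens of times larger than the single-contact figure
```

The gap between the single-contact probability and the multi-contact expectation is the reviewer's point: treating a false negative as one transient contact can understate the community-transmission risk by orders of magnitude.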
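The joint and conditional approaches quoted above amount to two decision rules. The sketch below restates them as code for clarity; the function names and the exposure-category encoding are illustrative assumptions, not the authors' implementation, while the exposure tiers and the 5% cut-off come from the quoted passage.

```python
def isolate_joint(exposure, evd_probability):
    """Joint approach: any suspect with reported exposure (low, intermediate,
    or high) is isolated only if their predicted EVD probability exceeds 5%."""
    if exposure == "none":
        return False  # no reported exposure: considered not to have the disease
    return evd_probability > 0.05

def isolate_conditional(exposure, evd_probability):
    """Conditional approach: high-risk-exposure suspects are isolated outright;
    the 5% cut-off applies only to low/intermediate-risk suspects."""
    if exposure == "none":
        return False
    if exposure == "high":
        return True   # isolated regardless of predicted probability
    return evd_probability > 0.05
```

For example, a high-exposure suspect with a 2% predicted probability is isolated under the conditional rule but not under the joint rule, which is the practical difference between the two algorithms.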
| Revision 2 |
|
PONE-D-23-12291R2
Cost-effectiveness of incorporating Ebola prediction score tools and rapid diagnostic tests into a screening algorithm: a decision analytic model
PLOS ONE

Dear Dr. Tshomba,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

The reviewer continues to raise a number of substantial issues. Given this is already a second revision, you will have only one more chance to revise the manuscript. The reviewer is willing to communicate with you directly to go over the methodology rather than continue back and forth with the revisions. If you are agreeable to this, please reach out to me directly via email (rychtarj@vcu.edu) and I will connect you with the reviewer.

Please submit your revised manuscript by Sep 14 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
We look forward to receiving your revised manuscript.

Kind regards,
Jan Rychtář
Academic Editor
PLOS ONE

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: (No Response)

**********

2.
Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented. Reviewer #1: Partly ********** 3. Has the statistical analysis been performed appropriately and rigorously? Reviewer #1: Yes ********** 4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified. Reviewer #1: Yes ********** 5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here. Reviewer #1: Yes ********** 6. Review Comments to the Author Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. 
(Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Unfortunately, I think there are still two major issues with the core methodology of the study that have not been addressed.

First, I don’t think my concerns from the previous revision have been adequately addressed, so please allow me to clarify. I appreciate your response and your update to the SAR values used. However, I still believe that the formulae used to estimate the false positive and false negative costs do not accurately reflect the conceptual description of these costs given in the text or author response, do not correctly use the SAR values as consistently defined in both Gilbert et al and Dean et al, and may underestimate the expected number of cases resulting from a false positive/negative.

The authors state that “[a]ccording to the paper by Gilbert et al., this random probability (1-(1-θ)^δ) measures the risk of transmission during an infectious patient's entire period of time” in their response. This is not consistent with the passage describing that expression in the Gilbert et al reference, which describes it as “the probability of infection given contact with an infectious individual” and uses it as a part of a differential equations SIR model. There are two differences between this definition and the authors’ interpretation. First, this expression represents risk of infection per contact, rather than the absolute risk for each infected individual regardless of contact rate. This is why, in this reference, this expression is multiplied by the contact rate. Second, the expression represents the probability of infection within one unit of time (here, 1 day or one contact), not the entire period of time (hence its inclusion in a system of differential equations).
Therefore, by using this expression (without accounting for the number of contacts or amount of time each contact lasts) to represent the expected number of new infections that an infected individual will subsequently cause, the authors are implicitly assuming that each uninfected individual erroneously isolated in an EVD ward only experiences a single instance of direct contact with an infected patient (in the case of an FP) and that each infected individual erroneously screened as negative only has a single, transient contact after returning to the community (in the case of an FN). This assumption should at least be explicitly stated in the manuscript, and likely should be reevaluated, particularly for FNs. I can see how there would be little contact with infected individuals for uninfected patients in an EVD ward, but I have a hard time believing that an infected individual returning to a community after a false negative screening would have the same total risk over the entire period they are in the community as an infected individual with a single transient contact. That seems to be an underestimate, as I’m guessing most of these individuals will return home for multiple days and be in contact with multiple family members. However, the authors’ definition reproduced above describing the overall risk of transmission does not match the expression they used but better matches either the SAR alone (defined in Gilbert et al as “the proportion of individuals who will become infected upon contact with an infectious individual during the total infectious period”, which agrees with the definition employed in Dean et al, the reference from which the parameter values were taken), which still doesn’t take into account the contact rate, or R0, which does take into account the contact rate and seems to be the closest metric to what the authors intend this cost to represent. 
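The reviewer's contact-scaling concern can be made concrete with a small numeric sketch. All parameter values below are hypothetical, chosen only for illustration: theta plays the role of a per-contact, per-day transmission probability and delta a contact duration, as in the 1 - (1 - theta)^delta expression discussed above.

```python
# Numeric sketch (hypothetical values): why a single-contact infection
# probability can underestimate the expected secondary cases from a
# false-negative patient who returns to the community.

def p_infection(theta, delta):
    """Probability that one contact becomes infected over delta days
    of exposure, given per-day transmission probability theta."""
    return 1 - (1 - theta) ** delta

theta = 0.05  # hypothetical per-day transmission probability

# The manuscript's implicit assumption: one single, transient contact.
single_contact_risk = p_infection(theta, delta=1)

# A false negative returning home plausibly exposes several household
# contacts over several days; the expected number of secondary cases
# then scales with both the number of contacts and the exposure time.
contacts = 5        # hypothetical number of household contacts
days_exposed = 7    # hypothetical days spent in the community

expected_secondary_cases = contacts * p_infection(theta, days_exposed)

print(f"single transient contact:          {single_contact_risk:.3f}")
print(f"{contacts} contacts over {days_exposed} days (expected): "
      f"{expected_secondary_cases:.3f}")
```

Even with these modest placeholder values, the expected secondary cases exceed the single-contact risk by more than an order of magnitude, which is the reviewer's point about underestimation.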
As the authors correctly state in their response, “R0 is the number of secondary cases that one case would cause in a population that is completely susceptible”, which seems to be exactly what is intended with this cost: the expected number of new cases resulting from a single infected individual returning to the community without isolation. I agree that there are issues with the estimation of R0, but the assumptions underlying the strategy the authors are currently using to estimate the FN risk are at the very least not justified in the manuscript and are likely underestimating the risk by multiple orders of magnitude. Admittedly, with the current approach the FN risk hardly seems to affect the results at all (during the last revision this risk increased by about an order of magnitude and the results in Table 4 are almost the same), but I think this may reflect an issue with the overall approach as well, as outlined below. Second, I have very serious concerns about how the effectiveness/payoff of a screening strategy is being defined more generally here. The authors repeatedly state that the effectiveness is defined as the number or fraction of true EVD cases isolated (e.g., in the abstract line 54, header row in Table 4, Methods lines 210-211); however, this is not the definition implemented in the payoff matrix (Table 3) or described later in the Methods (lines 213-215), where true positive and true negative outcomes are in fact weighted equally. The effectiveness is therefore nearly equal to the accuracy of the algorithm (i.e., (TP+TN)/(total screened)), with a negligible contribution from the penalties from FPs and FNs (which are 2 orders of magnitude smaller than the payoffs given to TN and TP, “by assumption”). This can lead to very problematic conclusions since the overwhelming majority of subjects being screened don’t have EVD (prevalence ~6% as given in Table 2), causing the specificity of the test to dominate the effectiveness metric. 
For example, consider the trivial screening algorithm where all individuals being screened are given a negative test result and sent home. This procedure has a sensitivity of 0 and specificity of 1, leading to an effectiveness payoff of approximately 0.94 (calculated as (1-Prev)*(payoff of TN) + Prev*(payoff of FN)). This effectiveness value is higher than that of the best real algorithm tested in the manuscript! In fact, since the cost of each of these “screenings” would be very low (presumably $0), the methodology used in this manuscript would identify this dummy procedure as the best possible screening test in terms of cost, effectiveness, and the cost-effectiveness ratio. Obviously, this is undesirable behavior (just shutting down EVD isolation units is not a good strategy!). Therefore, it is necessary to come up with a different payoff matrix that doesn’t just mostly optimize the test specificity. What would be a better payoff matrix? One option would be to implement the payoff matrix that is repeatedly described in the text but not actually used (“the number of isolated EVD cases (true positives)” lines 210-211), which would be assigning a value of 1 to TP and 0 to all other outcomes. Of course, this would just be proportional to the sensitivity of the screening algorithm and would not account for the specificity at all, so the strategy of isolating everyone who comes in would have the highest value (though also the highest cost). A better strategy, however, might be to quantify all payoffs in terms of monetary cost or benefit. This actually solves two serious issues with the current approach. First, it naturally solves the problem with deciding on how to weight TP and TN benefits that is explained above. 
TP events will have a benefit defined as the financial benefit to society for their isolation, which I’d suggest defining as the value generated by the effects of the patient receiving supportive care in the unit and extending their lifespan (e.g., see Bartsch et al Pathog Glob Health 2015 https://doi.org/10.1179%2F2047773214Y.0000000169 for estimates of the financial costs of EVD). TN events can be defined as having a payoff of $0. The second problem that this proposed redefinition into monetary units solves is the current mismatch in units between the TP or TN benefits and the FP and FN costs. TP and TN payoffs represent the number of correct screening calls made (assigning them each a value of 1) while FP and FN costs represent the number of additional expected EVD cases generated by the outcome (assigning a value of -1 per expected new EVD case). Therefore, the approach gives a correct negative screening result the same weight as generating a new EVD case that wouldn’t have occurred without an FP/FN result. These two events don’t seem like they should be on the same scale or be equivalent: causing a new infection that wouldn’t have otherwise occurred should be quite a bit more costly, I think! Instead, this current approach makes the FP and FN costs essentially negligible. Assigning FP and FN events monetary costs instead (perhaps by estimating the financial cost of each new EVD case) would put them on the same scale as TP/TN payoffs. Overall, I think the conceptual issues and problematic limiting behavior of the current approach are serious and warrant major changes before publication. However, it’s possible that even after these changes, the conclusions of the manuscript will remain the same. 
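The reviewer's degenerate "send everyone home" example can be checked with a few lines of arithmetic. The prevalence (~6%) and the two-orders-of-magnitude gap between the TP/TN payoffs and the FP/FN penalties are taken from the review; the exact penalty value of -0.01 and the "real algorithm" operating point are hypothetical placeholders for illustration.

```python
# Sanity check of the degenerate case under a payoff matrix of the
# kind the review describes: TP and TN each worth 1, FP and FN
# penalties two orders of magnitude smaller (placeholder -0.01).

PREV = 0.06           # EVD prevalence among screened subjects (~6%)
PAYOFF_TP = 1.0
PAYOFF_TN = 1.0
PENALTY_FP = -0.01    # hypothetical small penalty
PENALTY_FN = -0.01    # hypothetical small penalty

def effectiveness(sens, spec):
    """Expected payoff per screened individual for a test with the
    given sensitivity and specificity."""
    return (PREV * (sens * PAYOFF_TP + (1 - sens) * PENALTY_FN)
            + (1 - PREV) * (spec * PAYOFF_TN + (1 - spec) * PENALTY_FP))

# Trivial algorithm: everyone is screened negative (sens 0, spec 1).
eff_trivial = effectiveness(sens=0.0, spec=1.0)

# A hypothetical real screening algorithm for comparison.
eff_real = effectiveness(sens=0.86, spec=0.80)

print(f"do-nothing 'algorithm': {eff_trivial:.3f}")  # close to 0.94
print(f"hypothetical real test: {eff_real:.3f}")
assert eff_trivial > eff_real  # specificity dominates the metric
```

Because the vast majority of screened subjects are uninfected, the do-nothing procedure scores higher on this metric than the hypothetical real test, reproducing the problematic limiting behavior described above.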
Currently, the general conclusion seems to be that algorithms with higher specificity (i.e., those that use the ECPS as a joint or conditional test) perform better than lower specificity algorithms, not only because they have higher “effectiveness” as currently defined, but also because they have significantly lower cost since they isolate fewer people. The differences in specificity between these two groups of algorithms are so great (~35% vs. >80%) that the conclusion that these tests are more cost-effective will almost certainly continue to hold. ********** 7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy. Reviewer #1: Yes: Debra Van Egeren ********** [NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.] While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step. |
| Revision 3 |
|
PONE-D-23-12291R3 Cost-effectiveness of incorporating Ebola prediction score tools and rapid diagnostic tests into a screening algorithm: a decision analytic model PLOS ONE Dear Dr. Tshomba, Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. ============================== The reviewer is happy with the revision and recommends only minor changes before the manuscript can be accepted for publication. ============================== Please submit your revised manuscript by Nov 11 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols. We look forward to receiving your revised manuscript. Kind regards, Jan Rychtář Academic Editor PLOS ONE Journal Requirements: Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice. Additional Editor Comments: The reviewer is happy with the revision and recommends only minor changes before the manuscript can be accepted for publication. Reviewers' comments: Reviewer's Responses to Questions Comments to the Author 1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation. 
Reviewer #1: All comments have been addressed ********** 2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented. Reviewer #1: Yes ********** 3. Has the statistical analysis been performed appropriately and rigorously? Reviewer #1: Yes ********** 4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified. Reviewer #1: Yes ********** 5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here. Reviewer #1: Yes ********** 6. Review Comments to the Author Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. 
(Please upload your review as an attachment if it exceeds 20,000 characters) Reviewer #1: At this point I think the manuscript can be published if the language describing the definition of effectiveness is corrected, and I do not think I need to see this manuscript again. In the abstract (and in many other places in the text, all of which need to be changed), the effectiveness is described as follows: Our analysis found dual ECPS as a conditional test with the QuickNavi™-Ebola RDT algorithm to be the most cost-effective screening algorithm for EVD, with an effectiveness of 0.86 (e.g., isolating 86% of EVD cases). The last part ("isolating 86% of EVD cases") is incorrect. The provided definition (isolating X% of true cases) I quoted above is actually the sensitivity of the test, not the effectiveness. The effectiveness formula now being used still has no real-world meaning, but I believe it is mathematically equivalent to using the expected number of EVD cases prevented per individual screened (do NOT use this as the definition either though, as stated this is not the actual definition of that number as it stands). I would instead not provide any real-world definition and simply indicate that it is a metric of test effectiveness. ********** 7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy. Reviewer #1: Yes: Debra Van Egeren ********** [NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". 
If this link does not appear, there are no attachment files.] |
| Revision 4 |
|
Cost-effectiveness of incorporating Ebola prediction score tools and rapid diagnostic tests into a screening algorithm: a decision analytic model PONE-D-23-12291R4 Dear Dr. Tshomba, We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org. If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org. Kind regards, Jan Rychtář Academic Editor PLOS ONE Additional Editor Comments (optional): Reviewers' comments: |
| Formally Accepted |
|
PONE-D-23-12291R4 Cost-effectiveness of incorporating Ebola prediction score tools and rapid diagnostic tests into a screening algorithm: a decision analytic model Dear Dr. Tshomba: I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department. If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org. If we can help with anything else, please email us at plosone@plos.org. Thank you for submitting your work to PLOS ONE and supporting open access. Kind regards, PLOS ONE Editorial Office Staff on behalf of Dr. Jan Rychtář Academic Editor PLOS ONE |
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.