
Diagnostic accuracy of DPP Fever Panel II Asia tests for tropical fever diagnosis

  • Sandhya Dhawan,

    Roles Conceptualization, Data curation, Formal analysis, Software, Visualization, Writing – original draft, Writing – review & editing

    Affiliation Mahidol-Oxford Tropical Research Medicine Unit, Faculty of Tropical Medicine, Mahidol University, Bangkok, Thailand

  • Sabine Dittrich,

    Roles Conceptualization, Funding acquisition, Investigation, Project administration, Resources, Supervision, Validation, Visualization, Writing – original draft

    Current address: European Campus Rottal-Inn, Deggendorf Institute of Technology, Pfarrkirchen, Germany

    Affiliations FIND, Campus Biotech, Geneva, Switzerland, Centre for Tropical Medicine and Global Health, Nuffield Department of Medicine, University of Oxford, Oxford, United Kingdom

  • Sonia Arafah,

    Roles Data curation, Formal analysis

    Affiliation FIND, Campus Biotech, Geneva, Switzerland

  • Stefano Ongarello,

    Roles Software, Writing – review & editing

    Affiliation FIND, Campus Biotech, Geneva, Switzerland

  • Aurelian Mace,

    Roles Data curation, Methodology, Validation

    Affiliation Lao-Oxford-Mahosot Hospital-Wellcome Trust Research Unit, Microbiology Laboratory, Mahosot Hospital, Vientiane, Lao PDR

  • Siribun Panapruksachat,

    Roles Investigation, Validation

    Affiliation Lao-Oxford-Mahosot Hospital-Wellcome Trust Research Unit, Microbiology Laboratory, Mahosot Hospital, Vientiane, Lao PDR

  • Latsaniphone Boutthasavong,

    Roles Investigation, Validation

    Affiliation Lao-Oxford-Mahosot Hospital-Wellcome Trust Research Unit, Microbiology Laboratory, Mahosot Hospital, Vientiane, Lao PDR

  • Aphaphone Adsamouth,

    Roles Investigation, Validation

    Affiliation Lao-Oxford-Mahosot Hospital-Wellcome Trust Research Unit, Microbiology Laboratory, Mahosot Hospital, Vientiane, Lao PDR

  • Soulignasak Thongpaseuth,

    Roles Investigation, Validation

    Affiliation Lao-Oxford-Mahosot Hospital-Wellcome Trust Research Unit, Microbiology Laboratory, Mahosot Hospital, Vientiane, Lao PDR

  • Viengmon Davong,

    Roles Investigation, Validation

    Affiliation Lao-Oxford-Mahosot Hospital-Wellcome Trust Research Unit, Microbiology Laboratory, Mahosot Hospital, Vientiane, Lao PDR

  • Manivanh Vongsouvath,

    Roles Investigation, Validation

    Affiliation Lao-Oxford-Mahosot Hospital-Wellcome Trust Research Unit, Microbiology Laboratory, Mahosot Hospital, Vientiane, Lao PDR

  • Elizabeth A. Ashley,

    Roles Conceptualization, Funding acquisition, Project administration, Resources, Supervision, Writing – review & editing

    Affiliations Centre for Tropical Medicine and Global Health, Nuffield Department of Medicine, University of Oxford, Oxford, United Kingdom, Lao-Oxford-Mahosot Hospital-Wellcome Trust Research Unit, Microbiology Laboratory, Mahosot Hospital, Vientiane, Lao PDR

  • Matthew T. Robinson,

    Roles Conceptualization, Data curation, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliations Centre for Tropical Medicine and Global Health, Nuffield Department of Medicine, University of Oxford, Oxford, United Kingdom, Lao-Oxford-Mahosot Hospital-Wellcome Trust Research Unit, Microbiology Laboratory, Mahosot Hospital, Vientiane, Lao PDR

  • Stuart D. Blacksell

    Roles Conceptualization, Data curation, Funding acquisition, Methodology, Project administration, Resources, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    stuart.blacksell@ndm.ox.ac.uk

    Affiliations Mahidol-Oxford Tropical Research Medicine Unit, Faculty of Tropical Medicine, Mahidol University, Bangkok, Thailand, Centre for Tropical Medicine and Global Health, Nuffield Department of Medicine, University of Oxford, Oxford, United Kingdom, Lao-Oxford-Mahosot Hospital-Wellcome Trust Research Unit, Microbiology Laboratory, Mahosot Hospital, Vientiane, Lao PDR

Abstract

Background

Fever is the most frequent symptom in patients seeking care in South and Southeast Asia. The introduction of rapid diagnostic tests (RDTs) for malaria continues to drive patient management and care. Malaria-negative cases are commonly treated with antibiotics without confirmation of bacteraemia. Conventional laboratory tests for differential diagnosis require skilled staff and appropriate access to healthcare facilities. In addition, introducing single-disease RDTs instead of conventional laboratory tests remains costly. To overcome some of the delivery challenges of multiple separate tests, a multiplexed RDT with the capacity to diagnose a diverse range of tropical fevers would be a cost-effective solution. In this study, a multiplex lateral flow immunoassay (DPP Fever Panel II Assay) that can detect serum immunoglobulin M (IgM) and specific microbial antigens of common fever agents in Asia (Orientia tsutsugamushi, Rickettsia typhi, Leptospira spp., Burkholderia pseudomallei, Dengue virus, Chikungunya virus, and Zika virus), was evaluated.

Methodology/Principal findings

Whole blood (WB) and serum samples from 300 patients with undefined febrile illness (UFI) recruited in Vientiane, Lao PDR were tested using the DPP Fever Panel II, which consists of an Antibody panel and an Antigen panel. To compare reader performance, results were recorded using two DPP readers, the DPP Micro Reader (Micro Reader 1) and the DPP Micro Reader Next Generation (Micro Reader 2). WB and serum samples were run on the same fever panel and read on both micro readers. ROC analysis and equal variance analysis were performed to assess the diagnostic validity of the test against the respective reference standards for each fever agent (S1 Table). Overall, better AUC values were observed with whole blood. No significant difference in AUC performance was observed between whole blood and serum sample testing, except for R. typhi IgM (p = 0.04), Leptospira IgM (p = 0.02), and Dengue IgG (p = 0.03). Linear regression showed ~70% agreement between WB and serum samples, except for leptospirosis and Zika, where the R2 values were 0.37 and 0.47, respectively. No significant difference was observed between the performance of Micro Reader 1 and Micro Reader 2, except when testing for Zika IgM, Zika IgG, and B. pseudomallei CPS Ag.

Conclusions/Significance

These results demonstrate that the diagnostic accuracy of the DPP Fever Panel II is comparable to that of commonly used RDTs. The optimal cut-off would depend on the use of the test and the desired sensitivity and specificity. Further studies are required to authenticate the use of these cut-offs in other endemic regions. This multiplex RDT offers diagnostic benefits in areas with limited access to healthcare and has the potential to improve field testing capacities. This could improve tropical fever management and reduce the public health burden in endemic low-resource areas.

Author summary

Tropical fevers, specifically those caused by non-malarial infectious agents, contribute to considerable morbidity and mortality in the Asia-Pacific region. Diagnosis of these pathogens is challenging since the clinical signs are often indistinguishable. Conventional laboratory tests to differentiate between tropical diseases require substantial infrastructure and experienced staff, limiting access to accurate tests in low-resource endemic regions. Rapid diagnostic tests (RDTs) offer an affordable solution for disease management and patient care. Although RDTs are available for detecting non-malarial pathogens, there are financial and accessibility issues in establishing multiple separate tests in resource-constrained regions. To overcome these challenges, a multi-detection diagnostic platform with the capacity to diagnose a diverse range of tropical fevers would be a solution. This study aimed to evaluate the accuracy of an easy-to-use multiplex lateral flow immunoassay (DPP Fever Panel II Assay) that can detect IgM antibodies and specific antigens of common tropical diseases in Asia (scrub typhus, murine typhus, leptospirosis, melioidosis, dengue fever, chikungunya, and Zika virus). The test offers diagnostic accuracy comparable to commercially available tests, as well as some reference tests, and performs with equivalent accuracy on both blood and serum samples. If the fever panel were used as a stand-alone test for acute febrile illness diagnosis, cut-offs would need to be adjusted depending on the use of the test and the desired sensitivity and specificity. The use of these cut-offs should be investigated in other endemic regions, which could improve the rate of tropical fever diagnosis in low-resource settings.

Introduction

Tropical fever diagnosis has long perplexed healthcare professionals [1,2]. It is well-established that infectious agents are the primary cause of fever-related illness worldwide. In addition to globally prevalent agents, various pathogens are restricted to specific geographical regions and largely contribute to fever epidemiology in resource-limited settings [3]. In South and Southeast Asia, most of the population lives in rural areas, where poverty rates are high, and healthcare access is limited [4]. Diagnosing and treating diseases in these areas can be challenging due to the limited data available on the causes, resulting in incorrect treatment, including the unnecessary use of antimicrobials. However, it is well documented that febrile illnesses account for substantial morbidity and mortality in these regions [5,6].

While fever is the most frequent and debilitating clinical symptom in the tropics, measures to identify the spectrum of tropical fever aetiology and implement appropriate management have been limited [2]. This is especially true for non-malarial febrile illnesses. Clinically differentiating between common tropical diseases is challenging because the clinical presentations of fever-causing pathogens are similar. The lack of specific early presentation confounds diagnosis and subsequent treatment [2,7].

The use of rapid diagnostic tests (RDTs) for the early detection of malaria parasites has become common practice over the last decade and aided in improving malaria point-of-care testing globally [8]. As a result, improved case management and control measures significantly decreased the incidence of malarial fever [9], whereas other fever aetiologies proportionally increased [10]. Although single-plex qualitative RDTs for detecting non-malarial fevers are available, there are significant financial and access issues in establishing RDTs for numerous tropical pathogens, both at the patient management and healthcare system level [7]. Once malaria is ruled out, healthcare practitioners are unable to provide further testing and treatment because they receive insufficient training, support, and compensation [2,4,11,12]. As such, curable bacterial infections are often missed during diagnosis [4,13,14], and empiric antibiotic treatments are routinely administered [10,14]. Unnecessary antibiotic use acts as a driver for antimicrobial resistance across communities [15,16]. In low-resource settings where access to laboratory and human resource capacity is constrained, RDTs are preferred for diagnosis because of their affordability and ease of use.

However, RDT kits are designed with set cut-off values that often compromise sensitivity for specificity; in fact, this is a challenge for many serological tests [17]. Thresholds are often selected based on limited samples from one or two regions and often do not take into account varying background seropositivity across different countries, resulting in suboptimal test performance when tests are used outside the regions in which they were developed [7]. The use of RDTs of unknown quality is also a common problem. While highly sensitive RDTs are vital, tests with low specificity have limited utility in clinical and public health decision-making. Low specificity can lead to high misdiagnosis rates, inappropriate use of antibiotics, and undertreatment of bacterial infections [18–21]. In addition, tests with low specificity can distort the accuracy of disease estimates, which further hinders the effectiveness of public health response measures [7,18–21].

To overcome some of the delivery challenges of multiple separate tests, a multiplexed RDT with the capacity to diagnose a diverse range of tropical fevers would be a solution. A multiplex assay could deliver significant advantages over current single-plex qualitative RDTs, as it would enable the simultaneous detection and differentiation of numerous infections with comparable clinical manifestations. Additionally, if such a tool were quantitative rather than qualitative, as current RDTs are, region-specific cut-offs could be used to accomplish defined objectives. Quantitative readings for specific antigens can also serve as indices of severity, as has been shown for histidine-rich protein 2 (HRP2) in malaria [22], capsular polysaccharide (CPS) in melioidosis [23], and non-structural protein 1 (NS1) in dengue [24].

In this study, a multiplex lateral flow immunoassay (DPP Fever Panel II Assay Asia, Chembio, Inc.) that can detect serum immunoglobulin M (IgM) and specific microbial antigens of common fever agents in Asia (Orientia tsutsugamushi, Rickettsia typhi, Leptospira spp., Burkholderia pseudomallei, Dengue virus, Chikungunya virus, and Zika virus) was evaluated. The objectives were to (i) assess the diagnostic accuracy of the test in a clinical setting representative of the intended use setting, (ii) compare test performance across whole blood and serum samples, and (iii) assess reader performance variability between two types of micro readers, the DPP Fever Panel II Asia Micro Reader (Micro Reader 1) and the DPP Fever Panel II Asia Micro Reader Next Generation (Micro Reader 2).

Methods

Ethics statement

The UI-study was approved by the Oxford Tropical Research Ethics Committee (OxTREC, 006–07), and the National Ethics Committee for Health Research in Lao PDR (049/NECHR and 046/NECHR), with approval to use leftover specimens for further research. All patients provided written consent for use of leftover specimens.

Study population

Specimens were obtained from adult patients (>15 years old) enrolled in the “Prospective study of the causes of fever amongst patients admitted to Mahosot Hospital, Vientiane, Lao PDR” (UI-study) between November 2019 and October 2020. Mahosot Hospital is a main primary-tertiary public hospital in Vientiane (the capital of Laos) and receives referrals from across the country. Patients were enrolled if they had fever (≥38°C) within 24 hours of admission or at enrolment, an illness duration of <1 week, a request for blood culture, and leftover paired whole blood and serum volumes of >250 μl (following standard diagnostic testing). Leftover samples were collected on the day they were received and stored at 4°C for a maximum of 24 hours prior to testing in this study.

DPP fever panel investigation

Whole blood (WB) and serum samples from 300 patients recruited in Vientiane, Lao PDR were tested using the DPP Fever Panel II test, consisting of an Antibody panel and an Antigen panel. DPP tests were repeated on samples if they failed. For each patient, testing followed the manufacturer’s instructions and was performed with both paired blood and serum specimens (to compare specimen suitability), using 50 μl of sample for the antigen panel and 10 μl for the antibody panel. To compare reader performance, results were recorded using two DPP readers, the DPP Micro Reader (Micro Reader 1) and the DPP Micro Reader Next Generation (Micro Reader 2). Although diagnostic staff were not blinded to the results of the comparator (S1 Table) and DPP tests, review bias was minimised: DPP test results do not require interpretation by an operator (the reader displays only numerical values), result interpretation was done during data analysis and was not given to the operator, and pre-specified thresholds for positivity were used for ELISA tests. Targets tested included O. tsutsugamushi IgM, R. typhi IgM, Leptospira spp. IgM, B. pseudomallei CPS Ag, Dengue IgM, Dengue IgG, Dengue NS1, Chikungunya IgM, Zika IgM, and Zika IgG (Table 1). The test is not yet commercially available, and the cut-off values have not been finalised.

Table 1. Diagnostic performance of the DPP II Fever Panel Asia on serum versus whole blood.

Summary statistics (cut off, sensitivity, specificity, and AUC values) for the diagnostic performance of whole blood and serum samples run on the panel are depicted. True positives, referring to positives by reference test, have been included as well. Results from both micro readers are shown.

https://doi.org/10.1371/journal.pntd.0012077.t001

Reference diagnostics

True positives were determined as positives by reference diagnostic tests. The reference diagnostic methods for each pathogen are outlined in S1 Table. For O. tsutsugamushi IgM and R. typhi IgM detection, an in-house ELISA was used as the reference assay, while for Leptospira spp. IgM detection, the SERION ELISA classic Leptospira IgM test was used. The reference test for B. pseudomallei CPS detection was blood culture. For both Dengue IgM and NS1 detection, the SD Bioline Dengue Duo IgM/IgG/NS1 (CE-marked) assay was used as the reference, while for Chikungunya IgM, Zika IgM, and Dengue IgG, the DPP Zika/Chikungunya/Dengue multiplex test (CE-marked) was used as the reference assay. Where reference testing was carried out retrospectively, diagnostic staff were blinded to DPP results.

Statistical analysis

Data and statistical analyses for this study were performed using Stata/BE 17.0 and R (version 4.1.0). The diagnostic performance of the assays was assessed via sensitivity and specificity. Receiver operating characteristic (ROC) curves were created using the pROC and ROCR packages in R. The area under the curve (AUC) was examined to inform the diagnostic validity of the test and to advise an appropriate region-specific diagnostic cut-off. An optimal cut-off was selected by maximising both sensitivity and specificity indices from the ROC analysis [25]. A test of equal variance on the AUCs was performed using the ‘roccomp’ and ‘rocgold’ commands in Stata to assess performance variability between WB and serum samples. Chi-squared hypothesis testing and linear regression were also conducted to assess the statistical significance of sample-type variability and reader variability.
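The cut-off selection described above can be sketched in code. The study itself used R (pROC/ROCR) and Stata; the pure-Python sketch below is only an illustrative equivalent, showing how a region-specific cut-off can be chosen by maximising Youden's J (sensitivity + specificity − 1) along the ROC curve. The function names and the reader values are hypothetical, not from the study data.

```python
# Illustrative sketch (not the study's code): ROC curve, trapezoidal AUC,
# and an "optimal" cut-off that maximises Youden's J.

def roc_points(scores, labels):
    """Return (cut-off, sensitivity, specificity) for each candidate cut-off.

    labels: 1 = positive by the reference test, 0 = negative.
    A sample is called positive when its reader value >= cut-off.
    """
    pos = sum(labels)
    neg = len(labels) - pos
    pts = []
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= t)
        tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < t)
        pts.append((t, tp / pos, tn / neg))
    return pts

def auc(pts):
    """Trapezoidal area under the (FPR, TPR) curve, endpoints included."""
    coords = sorted([(1 - spec, sens) for _, sens, spec in pts]
                    + [(0.0, 0.0), (1.0, 1.0)])
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(coords, coords[1:]))

def optimal_cutoff(pts):
    """Cut-off maximising Youden's J = sensitivity + specificity - 1."""
    return max(pts, key=lambda p: p[1] + p[2] - 1)

# Hypothetical reader values: three reference-positive, three reference-negative.
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
labels = [1, 1, 1, 0, 0, 0]
pts = roc_points(scores, labels)
print(auc(pts))             # AUC for this toy data
print(optimal_cutoff(pts))  # (cut-off, sensitivity, specificity) at maximum J
```

In R, the equivalent one-liner on real data would be pROC's `coords(roc_obj, "best")`, which applies the same Youden criterion by default.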

Results

Diagnostic accuracy of the DPP Fever Panel in whole blood samples

At an optimal cut-off, where sensitivity and specificity are at a suitable compromise, the DPP II O. tsutsugamushi IgM test sensitivity was between 57.1–66.7%, while the specificity was approximately 59.6–63.0% (Table 1). R. typhi IgM sensitivity was 72.9–76.3%, and specificity was 73.3–77.6% at an optimal cut-off value. Leptospira IgM sensitivity at an optimal cut-off was 55.8–57.7%, and specificity was low at 50.2–63.6% (Table 1). Dengue IgM sensitivity at its optimal range was between 77.8–83.3%, while the specificity was 74.1–78.1%. In comparison, sensitivity for Dengue IgG detection was between 60.7–70.8% and specificity 54.9–67.0% at the optimal cut-off. At an optimal cut-off, Chikungunya IgM detection provided a sensitivity and specificity of 71.4–78.6% and 78.1–85.0%, respectively. Zika IgM detection had a sensitivity of 100%, at 76.7–84.5% specificity. Sensitivity and specificity for Zika IgG detection were both compromised, at 50.0–59.0% and 49.3–57.5%, respectively, at the optimal cut-off (Table 1).

Diagnostic accuracy of the DPP Fever Panel in serum samples

At similar cut-off ranges (Table 1), O. tsutsugamushi IgM detection in serum samples had an optimal sensitivity of 42.9–57.1% and a specificity of 55.2–59.1%. R. typhi IgM performance was similar, with a sensitivity of 69.5–72.9% and a specificity of 67.5–70.7%. Leptospira IgM detection sensitivity at an optimal cut-off was 50.0–51.9%, and specificity was 52.9–54.0%. However, test performance for B. pseudomallei CPS Ag displayed greater variance, with sensitivity ranging from 25–75% and specificity ranging from 53–72% (Table 1). Dengue IgM sensitivity and specificity at its optimal range were 75.0–80.6% and 74.5–76.8%, respectively, while Dengue IgG sensitivity at the optimal cut-off was 60.7% and specificity was 58.3–61.7%. Dengue NS1, on the other hand, provided a sensitivity of 83–86% and a specificity of 93.4%. Chikungunya IgM detection had a sensitivity of 71.4–78.6% and a specificity of 72.4–88.2%. At the optimal cut-off, Zika IgM detection was 75.0–78.6% sensitive and 90.8–98.6% specific. For Zika IgG detection, sensitivity and specificity were both compromised, at 50–53.9% and 52.1–53.4%, respectively (Table 1).

AUROC analysis

ROC analysis was performed to assess the diagnostic accuracy of the DPP test performance in whole blood samples against serum samples (Fig 1). O. tsutsugamushi IgM detection via the DPP Fever Panel II provided an AUC value of 0.61 and 0.71 when run using WB samples and 0.49 and 0.59 with serum samples (Fig 1A). R. typhi IgM detection had an AUC of 0.79 across both readers for WB, with an AUC of 0.76 and 0.75 for serum detection (Fig 1B). The AUC for Leptospira IgM detection was 0.60 and 0.59 in WB, while in serum it was 0.53 across both readers (Fig 1C). Dengue IgM detection in WB resulted in an AUC of 0.85 and 0.84 and was 0.81 and 0.84 using serum samples (Fig 1E). In comparison, Dengue IgG had an AUC of 0.66 and 0.68 in WB, while IgG detection in serum provided an AUC value of 0.64 (Fig 1F). Chikungunya IgM performed adequately, with an AUC of 0.82 across both readers for WB detection and 0.86 and 0.82 for serum detection (Fig 1H). Zika IgM detection in WB provided an AUC of 0.97 and 0.94 and AUC values of 0.91 and 0.94 in serum detection (Fig 1I). On the other hand, Zika IgG had an AUC value of 0.64 and 0.51 using WB samples and 0.53 when conducted on serum samples (Fig 1J).

Fig 1. Receiver Operating Characteristic (ROC) analysis for WB and serum samples.

Area under the curve (AUC) values for WB and serum samples across both readers are shown. No WB samples were available for the Dengue NS1 and B. pseudomallei CPS Ag assays. Legend: embedded in the graph.

https://doi.org/10.1371/journal.pntd.0012077.g001

B. pseudomallei CPS antigen detection resulted in an AUC of 0.65 and 0.71 on readers 1 and 2, respectively (Fig 1D). The serum samples were also tested for Dengue NS1 detection via the DPP Fever Panel, which provided an AUC value of 0.88 with Micro Reader 1 and 0.93 using Micro Reader 2 (Fig 1G).

Pairwise comparison of whole blood and serum test performance

Overall, better AUC values were observed when WB samples were tested (Table 1). AUC values for whole blood and serum were compared against the gold standard reference results, and a test of equality of ROC areas was performed. AUC values for whole blood and serum samples ranged from 0.51 to 0.95, with the difference between sample types within ±0.1 units for each pathogen (Table 2). No significant difference in AUC performance was observed when comparing whole blood and serum sample testing, except for R. typhi IgM (p = 0.04), Leptospira IgM (p = 0.02), and Dengue IgG (p = 0.03) (Table 2). The AUC for R. typhi IgM was 0.79 in WB samples and 0.75 in serum; for Leptospira IgM, 0.59 in WB and 0.53 in serum; and for Dengue IgG, 0.67 in WB and 0.64 in serum. Linear regression analysis was also conducted to compare WB and serum sample result variance; all outputs were significant. The R2 values generally indicated ~70% agreement between WB and serum samples, except for leptospirosis and Zika, where the R2 values were 0.37 and 0.47, respectively (Table 3).
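The paired WB-versus-serum AUC comparison can be illustrated with a short sketch. The study used Stata's `roccomp` (a chi-squared test of equality of correlated ROC areas); the pure-Python code below shows a bootstrap alternative under the same pairing assumption, resampling patients so that each draw keeps a patient's WB and serum values together. All names and values are hypothetical, not study data.

```python
# Illustrative sketch (not the study's code): comparing two paired AUCs
# (e.g. whole blood vs serum on the same patients) by patient-level bootstrap.
import random

def auc_mann_whitney(scores, labels):
    """AUC as the probability a positive outranks a negative (ties count 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def paired_auc_pvalue(scores_a, scores_b, labels, n_boot=2000, seed=42):
    """Two-sided bootstrap p-value for a difference in paired AUCs.

    Patients are resampled with replacement; both score vectors share each
    draw, which preserves the pairing between sample types.
    """
    rng = random.Random(seed)
    n = len(labels)
    diffs = []
    while len(diffs) < n_boot:
        idx = [rng.randrange(n) for _ in range(n)]
        ys = [labels[i] for i in idx]
        if 0 < sum(ys) < n:  # resample must contain both classes
            a = auc_mann_whitney([scores_a[i] for i in idx], ys)
            b = auc_mann_whitney([scores_b[i] for i in idx], ys)
            diffs.append(a - b)
    tail = min(sum(d <= 0 for d in diffs), sum(d >= 0 for d in diffs)) / n_boot
    return min(1.0, 2 * tail)

# Hypothetical paired reader values for eight patients.
labels = [1, 1, 1, 1, 0, 0, 0, 0]   # reference-test results
wb     = [9, 8, 7, 6, 4, 3, 2, 1]   # whole-blood reader values
serum  = [9, 6, 7, 5, 4, 8, 2, 1]   # serum reader values
print(auc_mann_whitney(wb, labels), auc_mann_whitney(serum, labels))
print(paired_auc_pvalue(wb, serum, labels))
```

`roccomp` instead uses the DeLong variance estimate for correlated AUCs; the bootstrap above trades exactness for transparency but answers the same question.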

Table 2. Analysis of Equal Variance of WB and serum AUC values.

Pairwise comparison of the area under the curve values for whole blood and serum was performed via a chi-squared test to assess variance in performance. Results from both readers (Micro Reader 1 and 2) were compiled for robustness.

https://doi.org/10.1371/journal.pntd.0012077.t002

Table 3. Linear regression analysis of WB and serum diagnostic performance.

WB and serum sample results were directly compared via linear regression to assess test performance variance across both sample types. Results from both micro readers (1 and 2) were compiled for robustness.

https://doi.org/10.1371/journal.pntd.0012077.t003

Pairwise comparison of reader performance

Whole blood.

Linear regression analysis and a ROC test of equal variance were performed to compare performance across both readers. There was no significant difference between reader performances for O. tsutsugamushi IgM (p = 0.046), R. typhi IgM (p = 0.872), Leptospira IgM (p = 0.317), Dengue IgM (p = 0.466), Dengue IgG (p = 0.209), Chikungunya IgM (p = 0.930), or Zika IgM (p = 0.200). There was a significant difference in reader performance for Zika IgG detection (p = 0.004), although linear regression of the reader results suggests similar R2 values for Zika IgM and IgG detection, at 0.662 and 0.664, respectively (Table 4).

Table 4. Linear regression analysis of reader performance.

Reader results from both Micro Reader 1 and Micro Reader 2 were directly compared via linear regression to deduce test performance variance.

https://doi.org/10.1371/journal.pntd.0012077.t004

Serum.

There was no significant difference between reader performances for R. typhi IgM (p = 0.114), Leptospira IgM (p = 0.910), Dengue IgM (p = 0.08), Dengue IgG (p = 0.904), Dengue NS1 (p = 0.124), Chikungunya IgM (p = 0.525), Zika IgM (p = 0.550), or Zika IgG (p = 0.944) (Table 4). There was 94.1% agreement between reader results (R2 = 0.941) for O. tsutsugamushi IgM detection; however, a significant difference between reader performances was detected (p = 0.05). B. pseudomallei CPS antigen showed no significant difference across reader performance (p = 0.411); linear regression revealed an R2 value of 0.0002, although this output was not significant (p = 0.811). While there was no statistical difference between reader performances for Zika IgM and IgG detection, agreement between readers was limited: linear regression displayed R2 values of 0.435 and 0.439 for Zika IgM and IgG detection, respectively (Table 4).

Discussion

This study evaluated the DPP Fever Panel II for the multi-analyte detection of scrub typhus, murine typhus, leptospirosis, melioidosis, dengue fever, chikungunya, and Zika virus. The two micro readers (Micro Reader 1 and 2) were screened for performance variability, and the diagnostic platform was assessed using both whole blood and serum samples. Test performance was assessed using cut-offs recommended by the manufacturer and region-specific cut-offs calibrated for an optimal level of sensitivity and specificity in endemic settings.

The DPP assay performed poorly when compared to established O. tsutsugamushi RDTs, which had greater overall sensitivity (66–84%) and specificity (93–99%) [26–28]. Since it remains unclear how long IgM and IgG antibodies persist in human scrub typhus, samples taken early after symptom onset may not have detectable levels of IgM antibodies [28–30]. Due to the antigenic diversity of O. tsutsugamushi strains, cut-offs should be re-evaluated regionally, and local strains included in the antigen pool should be continually updated for accurate clinical diagnosis [31,32].

The DPP assay component for R. typhi performed comparably to other RDTs (sensitivity: ~51–60%, specificity: ~94–100%) [33–36], though on the lower end of specificity (67–78%). Few advances have been made in rapid tests for murine typhus diagnosis [34,37], and it is speculated that the cause of low sensitivity could be the geographical antigenic diversity of R. typhi strains, as is the case for O. tsutsugamushi [38].

The DPP Leptospira spp. IgM assay performed similarly to other RDTs available for leptospirosis diagnosis (sensitivity: 17.9–75%, specificity: 62.1–97.7%) [39–43], albeit at the lower end of specificity. Despite this, the DPP assay obtained consistent sensitivity (~50–58%) and specificity (~50–63%) across sample types, and its diagnostic performance was comparable to diagnostic tools used previously to detect leptospirosis on admission among healthy slum populations [44]. Commercially available RDTs for the detection of Leptospira spp. remain limited in their diagnostic accuracy, with none reliably delivering a sensitivity or specificity of >80% on admission [39]. According to published studies, the circulation of location-specific leptospiral serovars contributes to regional variances in background antibody levels [41,45,46], and some serovars may impact the diagnostic accuracy of RDTs [47]. However, the reason region-specific serovars cause more severe illness remains unknown. It is also important to note that anti-Leptospira IgM antibodies are not detectable until 4–5 days after symptom onset (S2 and S3 Tables) [48,49], and IgM can persist in the blood for years after infection [50,51]. Assays should be adjusted to local settings, and samples collected after a period of seroconversion, to avoid false positive results and ensure higher diagnostic accuracy.

The sensitivity of the DPP B. pseudomallei CPS Ag test (25%) was comparable to commercially used RDTs for melioidosis (31%) [52], although Micro Reader 2 provided a higher sensitivity (75%) using the regional cut-off. It is well described that antigen test accuracy in unamplified blood is limited compared to blood culture [53], and only serum samples were tested for B. pseudomallei CPS Ag in this study. However, as demonstrated by the DPP test performance, the CPS antigen is not recommended for melioidosis serodiagnosis, as its sensitivity remains lower than that of culture, the current gold standard (60%) [54,55].

It should be noted that previous studies demonstrate clear associations between CPS positivity and fatality among melioidosis patients [7,23]. By examining the relationship between CPS positivity and disease severity/mortality, the biomarker capacity of the DPP CPS antigen test can be investigated further. The Chembio-recommended cut-offs provide a CPS test of higher specificity (95–96%), which could be of high utility in clinical settings to distinguish mild self-limiting illness from severe disease, if validated and studied further.

The sensitivity and specificity of the DPP dengue NS1 antigen and IgM antibody tests were equivalent to those of other RDTs. The Dengue NS1 assay provided greater sensitivity (83–90%) than commercially available tests (~45–85%) [56–62]. Commercially available Dengue IgM RDTs provide a diverse range of sensitivity (~20–82%), but studies generally demonstrate diagnostic sensitivity at the lower end of the spectrum [56–59,61]. The DPP Dengue IgM assay specificity is comparable to other IgM RDTs; however, specificity is reduced to ~70% if sensitivity is prioritised. The variability in diagnostic accuracy of the DPP Dengue IgM target across WB and serum samples was inconclusive (Table 4).

Further validation studies must be done to characterise the disparities between using whole blood and serum samples. Cut-offs should be adjusted appropriately to represent the region’s background seropositivity and achieve desired clinical outcomes. The DPP Dengue IgG assay does not perform as well as the IgM assay and is not comparable to the sensitivity and specificity of readily available Dengue IgG RDTs. There was also a significant difference in assay performance across WB and serum samples. This may be attributed to the average duration of illness, which was ~6.4 days (S2 and S3 Tables). IgM antibodies are detectable in only ~50% of patients 3–5 days after symptom onset [63], and IgG develops later and may not be detectable for up to 2 weeks after symptom onset. A combination of the NS1, IgM, and IgG tests could provide higher accuracy for dengue fever diagnosis. Consistent with previous research, pooling all three analytes, or a combination of two or three, provided optimal diagnostic performance (sensitivity ~90%, specificity ~89%) and proved to be of great clinical utility in many low-technology settings [57,61,64–66].

The DPP Chikungunya IgM assay performed above average, within the range of sensitivity (20–100%) and specificity (73–100%) reported for commercially available Chikungunya IgM RDTs [60,67]. Previous studies document that the sensitivity of Chikungunya IgM detection increases in the second week after symptom onset [68,69]. Sample collection time is therefore paramount to valid test performance and should be considered in future evaluations.

The DPP Zika IgM assay performed as well as other Chembio Zika IgM RDT assays, showing similar levels of sensitivity (~79–86%) and specificity (~87–100%) [70,71]. The DPP Zika IgG test did not offer high diagnostic accuracy even when cutoffs were optimised for regional settings, although IgG detection across other RDTs is effective (~90–99%) [71]. IgG in Zika infections is often used as a marker of exposure, since it develops weeks after onset and can persist in the body for 5–6 months [72,73]. Samples in this study were collected <24 h after hospitalisation, which may have contributed to the lower sensitivity and specificity. Further investigation into coinfection rates and cross-reactivity between ZIKV and DENV antigens and antibodies [74,75] is required to diagnose Zika infections with higher confidence.

The main limitation of the study is its restricted sample size; a scarcity of true positives (Table 1) can bias sensitivity and specificity estimates and does not parallel real-life settings. The overall diagnostic accuracy of the DPP Fever Panel II Asia was limited; the sensitivity of the panel was lower than its specificity, which is likely attributable to low antibody levels during the acute phase of infection [51]. Repeating the test after a period allowing seroconversion is recommended to increase confidence in pathogen detection [40]. Further validation should also explore cross-reactivity rates [76].
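
The impact of few true positives on accuracy estimates can be quantified with a confidence interval. Below is a sketch using the standard Wilson score interval and hypothetical counts: the same observed sensitivity of 80% is far less certain when it rests on 5 positives than on 100.

```python
# Why few true positives inflate uncertainty: 95% Wilson score interval
# for an estimated sensitivity (standard formula, hypothetical counts).
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Same observed sensitivity (80%), very different certainty.
for tp, n in ((4, 5), (80, 100)):
    lo, hi = wilson_ci(tp, n)
    print(f"{tp}/{n} positives detected: 95% CI {lo:.2f}-{hi:.2f}")
```

With only 5 reference-positive samples the interval spans roughly 0.38–0.96, so a reported point estimate of 80% sensitivity carries little information; with 100 positives it narrows to roughly 0.71–0.87.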

Further validation studies are recommended to ensure consistent performance of the two readers, which disagreed for specific analytes (Zika IgM, Zika IgG, and B. pseudomallei CPS Ag). Further research into why WB samples overall provided better diagnostic accuracy than serum samples may also be of interest; such findings would be particularly informative for the manufacturer when deciding which sample type to list in the IFU.
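
Inter-reader agreement of the kind discussed above is commonly summarised with Cohen's kappa, which corrects raw agreement for chance. The readings below are invented for illustration; they are not the study's reader data.

```python
# Hypothetical sketch: quantifying agreement between two micro readers
# on binary test calls with Cohen's kappa (readings invented).

def cohens_kappa(a, b):
    """Cohen's kappa for two binary raters over the same samples."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    p_a, p_b = sum(a) / n, sum(b) / n
    # Chance agreement: both call positive, or both call negative.
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)

reader1 = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0]
reader2 = [1, 1, 0, 1, 1, 0, 0, 0, 0, 0]
print(f"kappa = {cohens_kappa(reader1, reader2):.2f}")  # kappa = 0.58
```

A kappa near 1 indicates near-perfect agreement, while values around 0.4–0.6 (as in this toy example) indicate only moderate agreement, the kind of discrepancy that would warrant harmonising the readers before deployment.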

The DPP Fever Panel II Asia offers the opportunity for highly specific rapid multiplex diagnosis of bacterial and arboviral infections. In many low-resource settings, where access to diagnostic infrastructure is limited, introducing an adequately sensitive and specific tool would afford immense benefits for point-of-care clinical management and outbreak surveillance. The DPP Fever Panel II Asia provides quick results without requiring specialised equipment. Given the ease with which the test can be performed, it offers both clinical and field utility, especially where health workers may have limited training [40]. Point-of-care diagnostic tools, particularly biomarker-based and multi-pathogen detection assays, must be prioritised to help guide treatment decisions in decentralised settings [77].

Supporting information

S3 Table. Summary statistics for serum assay.

https://doi.org/10.1371/journal.pntd.0012077.s003

(DOCX)

S1 Fig. Estimated accuracies and 95%-confidence intervals for reader performance.

Error bars are shown. Legend: x-axis, Micro Reader 1 and Micro Reader 2; light pink circles, WB; green circles, serum.

https://doi.org/10.1371/journal.pntd.0012077.s004

(DOCX)

S1 Data. Supporting information (raw data).

https://doi.org/10.1371/journal.pntd.0012077.s005

(XLSX)

Acknowledgments

The authors would like to thank the patients and staff of Mahosot Hospital, Vientiane. We thank the staff of the Microbiology Laboratory, Mahosot Hospital, and Dr Susath Vongphachanh, Director of Mahosot Hospital, and the directors’ team. We further thank the team from Chembio, particularly Dr Angelo Gunasekera, for their technical support and onsite training of staff. We also thank Dr Danoy Chammanam for helping enrol patients.

References

  1. Oldach DW, Richard RE, Borza EN, Benitez RM. A mysterious death. N Engl J Med. 1998;338(24):1764–9. pmid:9625631
  2. Bottieau E, Yansouni CP. Fever in the tropics: the ultimate clinical challenge? Clin Microbiol Infect. 2018;24(8):806–7. pmid:29940346
  3. Prasad N, Murdoch DR, Reyburn H, Crump JA. Etiology of Severe Febrile Illness in Low- and Middle-Income Countries: A Systematic Review. PLoS One. 2015;10(6):e0127962. pmid:26126200
  4. Chandna A, Chew R, Shwe Nwe Htun N, Peto TJ, Zhang M, Liverani M, et al. Defining the burden of febrile illness in rural South and Southeast Asia: an open letter to announce the launch of the Rural Febrile Illness project. Wellcome Open Res. 2021;6:64. pmid:34017924
  5. Crump JA, Kirk MD. Estimating the Burden of Febrile Illnesses. PLoS Negl Trop Dis. 2015;9(12):e0004040. pmid:26633014
  6. Shrestha P, Dahal P, Ogbonnaa-Njoku C, Das D, Stepniewska K, Thomas NV, et al. Non-malarial febrile illness: a systematic review of published aetiological studies and case reports from Southern Asia and South-eastern Asia, 1980–2015. BMC Med. 2020;18(1):299. pmid:32951591
  7. Amornchai P, Hantrakun V, Wongsuvan G, Boonsri C, Yoosuk S, Nilsakul J, et al. Sensitivity and specificity of DPP® Fever Panel II Asia in the diagnosis of malaria, dengue and melioidosis. J Med Microbiol. 2022;71(8).
  8. WHO. World Malaria Report 2020. WHO; 2020.
  9. Landier J, Parker DM, Thu AM, Lwin KM, Delmas G, Nosten FH, et al. Effect of generalised access to early diagnosis and treatment and targeted mass drug administration on Plasmodium falciparum malaria in Eastern Myanmar: an observational study of a regional elimination programme. Lancet. 2018;391(10133):1916–26. pmid:29703425
  10. Hopkins H, Bruxvoort KJ, Cairns ME, Chandler CI, Leurent B, Ansah EK, et al. Impact of introduction of rapid diagnostic tests for malaria on antibiotic prescribing: analysis of observational and randomised studies in public and private healthcare settings. BMJ. 2017;356:j1054. pmid:28356302
  11. McLean ARD, Wai HP, Thu AM, Khant ZS, Indrasuta C, Ashley EA, et al. Malaria elimination in remote communities requires integration of malaria control activities into general health care: an observational study and interrupted time series analysis in Myanmar. BMC Med. 2018;16(1):183. pmid:30343666
  12. Lubell Y, Chandna A, Smithuis F, White L, Wertheim HFL, Redard-Jacot M, et al. Economic considerations support C-reactive protein testing alongside malaria rapid diagnostic tests to guide antimicrobial therapy for patients with febrile illness in settings with low malaria endemicity. Malar J. 2019;18(1):442. pmid:31878978
  13. Mayxay M, Castonguay-Vanier J, Chansamouth V, Dubot-Peres A, Paris DH, Phetsouvanh R, et al. Causes of non-malarial fever in Laos: a prospective study. Lancet Glob Health. 2013;1(1):e46–54. pmid:24748368
  14. Lubell Y, Blacksell SD, Dunachie S, Tanganuchitcharnchai A, Althaus T, Watthanaworawit W, et al. Performance of C-reactive protein and procalcitonin to distinguish viral from bacterial and malarial causes of fever in Southeast Asia. BMC Infect Dis. 2015;15:511. pmid:26558692
  15. Ashley EA, Dhorda M, Fairhurst RM, Amaratunga C, Lim P, Suon S, et al. Spread of artemisinin resistance in Plasmodium falciparum malaria. N Engl J Med. 2014;371(5):411–23. pmid:25075834
  16. Hamilton WL, Amato R, van der Pluijm RW, Jacob CG, Quang HH, Thuy-Nhien NT, et al. Evolution and expansion of multidrug-resistant malaria in southeast Asia: a genomic epidemiology study. Lancet Infect Dis. 2019;19(9):943–51. pmid:31345709
  17. Blacksell SD, Bryant NJ, Paris DH, Doust JA, Sakoda Y, Day NP. Scrub typhus serologic testing with the indirect immunofluorescence method as a diagnostic gold standard: a lack of consensus leads to a lot of confusion. Clin Infect Dis. 2007;44(3):391–401. pmid:17205447
  18. Hinjoy S, Hantrakun V, Kongyu S, Kaewrakmuk J, Wangrangsimakul T, Jitsuronk S, et al. Melioidosis in Thailand: Present and Future. Trop Med Infect Dis. 2018;3(2):38. pmid:29725623
  19. Tiono AB, Diarra A, Sanon S, Nebie I, Konate AT, Pagnoni F, et al. Low specificity of a malaria rapid diagnostic test during an integrated community case management trial. Infect Dis Ther. 2013;2(1):27–36. pmid:25135821
  20. Ishengoma DS, Francis F, Mmbando BP, Lusingu JP, Magistrado P, Alifrangis M, et al. Accuracy of malaria rapid diagnostic tests in community studies and their impact on treatment of malaria in an area with declining malaria burden in north-eastern Tanzania. Malar J. 2011;10:176. pmid:21703016
  21. Chinkhumba J, Skarbinski J, Chilima B, Campbell C, Ewing V, San Joaquin M, et al. Comparative field performance and adherence to test results of four malaria rapid diagnostic tests among febrile patients more than five years of age in Blantyre, Malawi. Malar J. 2010;9:209. pmid:20646312
  22. Hendriksen IC, Mwanga-Amumpaire J, von Seidlein L, Mtove G, White LJ, Olaosebikan R, et al. Diagnosing severe falciparum malaria in parasitaemic African children: a prospective evaluation of plasma PfHRP2 measurement. PLoS Med. 2012;9(8):e1001297. pmid:22927801
  23. Amornchai P, Hantrakun V, Wongsuvan G, Wuthiekanun V, Wongratanacheewin S, Teparrakkul P, et al. Evaluation of antigen-detecting and antibody-detecting diagnostic test combinations for diagnosing melioidosis. PLoS Negl Trop Dis. 2021;15(11):e0009840. pmid:34727111
  24. Paranavitane SA, Gomes L, Kamaladasa A, Adikari TN, Wickramasinghe N, Jeewandara C, et al. Dengue NS1 antigen as a marker of severe clinical disease. BMC Infect Dis. 2014;14:570. pmid:25366086
  25. Habibzadeh F, Habibzadeh P, Yadollahie M. On determining the most appropriate test cut-off value: the case of tests with continuous results. Biochem Med (Zagreb). 2016;26(3):297–307. pmid:27812299
  26. Kannan K, John R, Kundu D, Dayanand D, Abhilash KPP, Mathuram AJ, et al. Performance of molecular and serologic tests for the diagnosis of scrub typhus. PLoS Negl Trop Dis. 2020;14(11):e0008747. pmid:33180784
  27. Kim YJ, Park S, Premaratna R, Selvaraj S, Park SJ, Kim S, et al. Clinical Evaluation of Rapid Diagnostic Test Kit for Scrub Typhus with Improved Performance. J Korean Med Sci. 2016;31(8):1190–6. pmid:27478327
  28. Saraswati K, Day NPJ, Mukaka M, Blacksell SD. Scrub typhus point-of-care testing: A systematic review and meta-analysis. PLoS Negl Trop Dis. 2018;12(3):e0006330. pmid:29579046
  29. Paris DH, Chattopadhyay S, Jiang J, Nawtaisong P, Lee JS, Tan E, et al. A nonhuman primate scrub typhus model: protective immune responses induced by pKarp47 DNA vaccination in cynomolgus macaques. J Immunol. 2015;194(4):1702–16. pmid:25601925
  30. Chattopadhyay S, Jiang J, Chan TC, Manetz TS, Chao CC, Ching WM, et al. Scrub typhus vaccine candidate Kp r56 induces humoral and cellular immune responses in cynomolgus monkeys. Infect Immun. 2005;73(8):5039–47. pmid:16041019
  31. Batra HV. Spotted fevers & typhus fever in Tamil Nadu. Indian J Med Res. 2007;126(2):101–3.
  32. Blacksell SD, Jenjaroen K, Phetsouvanh R, Wuthiekanun V, Day NP, Newton PN, et al. Accuracy of AccessBio Immunoglobulin M and Total Antibody Rapid Immunochromatographic Assays for the Diagnosis of Acute Scrub Typhus Infection. Clin Vaccine Immunol. 2010;17(2):263–6. pmid:20016046
  33. Blacksell SD, Jenjaroen K, Phetsouvanh R, Tanganuchitcharnchai A, Phouminh P, Phongmany S, et al. Accuracy of rapid IgM-based immunochromatographic and immunoblot assays for diagnosis of acute scrub typhus and murine typhus infections in Laos. Am J Trop Med Hyg. 2010;83(2):365–9. pmid:20682883
  34. Kelly DJ, Chan CT, Paxton H, Thompson K, Howard R, Dasch GA. Comparative evaluation of a commercial enzyme immunoassay for the detection of human antibody to Rickettsia typhi. Clin Diagn Lab Immunol. 1995;2(3):356–60. pmid:7664182
  35. Tay ST, Kamalanathan M, Rohani MY. Antibody prevalence of Orientia tsutsugamushi, Rickettsia typhi and TT118 spotted fever group rickettsiae among Malaysian blood donors and febrile patients in the urban areas. Southeast Asian J Trop Med Public Health. 2003;34(1):165–70. pmid:12971530
  36. Saunders JP, Brown GW, Shirai A, Huxsoll DL. The longevity of antibody to Rickettsia tsutsugamushi in patients with confirmed scrub typhus. Trans R Soc Trop Med Hyg. 1980;74(2):253–7. pmid:6770503
  37. Yuhana Y, Tanganuchitcharnchai A, Sujariyakul P, Sonthayanon P, Chotivanich K, Paris DH, et al. Diagnosis of Murine Typhus by Serology in Peninsular Malaysia: A Case Report Where Rickettsial Illnesses, Leptospirosis and Dengue Co-Circulate. Trop Med Infect Dis. 2019;4(1). pmid:30708964
  38. Parola P, Blacksell SD, Phetsouvanh R, Phongmany S, Rolain JM, Day NP, et al. Genotyping of Orientia tsutsugamushi from humans with scrub typhus, Laos. Emerg Infect Dis. 2008;14(9):1483–5. pmid:18760027
  39. Dittrich S, Boutthasavong L, Keokhamhoung D, Phuklia W, Craig SB, Tulsiani SM, et al. A Prospective Hospital Study to Evaluate the Diagnostic Accuracy of Rapid Diagnostic Tests for the Early Detection of Leptospirosis in Laos. Am J Trop Med Hyg. 2018;98(4):1056–60. pmid:29488460
  40. Rao M, Amran F, Aqilla N. Evaluation of a Rapid Kit for Detection of IgM against Leptospira in Human. Can J Infect Dis Med Microbiol. 2019;2019:5763595. pmid:30881530
  41. Dinhuzen J, Limothai U, Tachaboon S, Krairojananan P, Laosatiankit B, Boonprasong S, et al. A prospective study to evaluate the accuracy of rapid diagnostic tests for diagnosis of human leptospirosis: Result from THAI-LEPTO AKI study. PLoS Negl Trop Dis. 2021;15(2):e0009159. pmid:33606698
  42. Amran F, Liow YL, Halim NAN. Evaluation of a Commercial Immuno-Chromatographic Assay Kit for Rapid Detection of IgM Antibodies against Leptospira Antigen in Human Serum. J Korean Med Sci. 2018;33(17):e131. pmid:29686599
  43. Alia SN, Joseph N, Philip N, Azhari NN, Garba B, Masri SN, et al. Diagnostic accuracy of rapid diagnostic tests for the early detection of leptospirosis. J Infect Public Health. 2019;12(2):263–9. pmid:30502041
  44. Nabity SA, Ribeiro GS, Aquino CL, Takahashi D, Damiao AO, Goncalves AH, et al. Accuracy of a dual path platform (DPP) assay for the rapid point-of-care diagnosis of human leptospirosis. PLoS Negl Trop Dis. 2012;6(11):e1878. pmid:23133686
  45. Chadsuthi S, Bicout DJ, Wiratsudakul A, Suwancharoen D, Petkanchanapong W, Modchang C, et al. Investigation on predominant Leptospira serovars and its distribution in humans and livestock in Thailand, 2010–2015. PLoS Negl Trop Dis. 2017;11(2):e0005228. pmid:28182662
  46. Blacksell SD, Smythe L, Phetsouvanh R, Dohnt M, Hartskeerl R, Symonds M, et al. Limited diagnostic capacities of two commercial assays for the detection of Leptospira immunoglobulin M antibodies in Laos. Clin Vaccine Immunol. 2006;13(10):1166–9. pmid:17028219
  47. Goris MG, Leeflang MM, Loden M, Wagenaar JF, Klatser PR, Hartskeerl RA, et al. Prospective evaluation of three rapid diagnostic tests for diagnosis of human leptospirosis. PLoS Negl Trop Dis. 2013;7(7):e2290. pmid:23875034
  48. Picardeau M, Bertherat E, Jancloes M, Skouloudis AN, Durski K, Hartskeerl RA. Rapid tests for diagnosis of leptospirosis: current tools and emerging technologies. Diagn Microbiol Infect Dis. 2014;78(1):1–8. pmid:24207075
  49. Silva MV, Camargo ED, Batista L, Vaz AJ, Brandao AP, Nakamura PM, et al. Behaviour of specific IgM, IgG and IgA class antibodies in human leptospirosis during the acute phase of the disease and during convalescence. J Trop Med Hyg. 1995;98(4):268–72. pmid:7636924
  50. Budihal SV, Perwez K. Leptospirosis diagnosis: competancy of various laboratory tests. J Clin Diagn Res. 2014;8(1):199–202. pmid:24596774
  51. Haake DA, Levett PN. Leptospirosis in humans. Curr Top Microbiol Immunol. 2015;387:65–97. pmid:25388133
  52. Wongsuvan G, Hantrakun V, Teparrukkul P, Imwong M, West TE, Wuthiekanun V, et al. Sensitivity and specificity of a lateral flow immunoassay (LFI) in serum samples for diagnosis of melioidosis. Trans R Soc Trop Med Hyg. 2018;112(12):568–70. pmid:30219869
  53. Hoffmaster AR, AuCoin D, Baccam P, Baggett HC, Baird R, Bhengsri S, et al. Melioidosis diagnostic workshop, 2013. Emerg Infect Dis. 2015;21(2). pmid:25626057
  54. Limmathurotsakul D, Jamsen K, Arayawichanont A, Simpson JA, White LJ, Lee SJ, et al. Defining the true sensitivity of culture for the diagnosis of melioidosis using Bayesian latent class models. PLoS One. 2010;5(8):e12485. pmid:20830194
  55. Suttisunhakul V, Wuthiekanun V, Brett PJ, Khusmith S, Day NP, Burtnick MN, et al. Development of Rapid Enzyme-Linked Immunosorbent Assays for Detection of Antibodies to Burkholderia pseudomallei. J Clin Microbiol. 2016;54(5):1259–68. pmid:26912754
  56. Pan-ngum W, Blacksell SD, Lubell Y, Pukrittayakamee S, Bailey MS, de Silva HJ, et al. Estimating the true accuracy of diagnostic tests for dengue infection using bayesian latent class models. PLoS One. 2013;8(1):e50765. pmid:23349667
  57. Blacksell SD, Jarman RG, Bailey MS, Tanganuchitcharnchai A, Jenjaroen K, Gibbons RV, et al. Evaluation of six commercial point-of-care tests for diagnosis of acute dengue infections: the need for combining NS1 antigen and IgM/IgG antibody detection to achieve acceptable levels of accuracy. Clin Vaccine Immunol. 2011;18(12):2095–101. pmid:22012979
  58. Blacksell SD, Jarman RG, Gibbons RV, Tanganuchitcharnchai A, Mammen MP Jr., Nisalak A, et al. Comparison of seven commercial antigen and antibody enzyme-linked immunosorbent assays for detection of acute dengue infection. Clin Vaccine Immunol. 2012;19(5):804–10. pmid:22441389
  59. Raafat N, Blacksell SD, Maude RJ. A review of dengue diagnostics and implications for surveillance and control. Trans R Soc Trop Med Hyg. 2019;113(11):653–60. pmid:31365115
  60. Moreira J, Brasil P, Dittrich S, Siqueira AM. Mapping the global landscape of chikungunya rapid diagnostic tests: A scoping review. PLoS Negl Trop Dis. 2022;16(7):e0010067. pmid:35878158
  61. Mahajan R, Nair M, Saldanha AM, Harshana A, Pereira AL, Basu N, et al. Diagnostic accuracy of commercially available immunochromatographic rapid tests for diagnosis of dengue in India. J Vector Borne Dis. 2021;58(2):159–64. pmid:35074951
  62. Wiwanitkit S, Wiwanitkit V. Rapid diagnosis of dengue infection in acute phase. J Vector Borne Dis. 2015;52(1):110. pmid:25815877
  63. WHO. Dengue: Guidelines for Diagnosis, Treatment, Prevention and Control. 2009.
  64. Macedo JVL, Frias IAM, Oliveira MDL, Zanghelini F, Andrade CAS. A systematic review and meta-analysis on the accuracy of rapid immunochromatographic tests for dengue diagnosis. Eur J Clin Microbiol Infect Dis. 2022;41(9):1191–201. pmid:35988010
  65. Fry SR, Meyer M, Semple MG, Simmons CP, Sekaran SD, Huang JX, et al. The diagnostic sensitivity of dengue rapid test assays is significantly enhanced by using a combined antigen and antibody testing approach. PLoS Negl Trop Dis. 2011;5(6):e1199. pmid:21713023
  66. Wang SM, Sekaran SD. Evaluation of a commercial SD dengue virus NS1 antigen capture enzyme-linked immunosorbent assay kit for early diagnosis of dengue virus infection. J Clin Microbiol. 2010;48(8):2793–7. pmid:20573879
  67. Andrew A, Navien TN, Yeoh TS, Citartan M, Mangantig E, Sum MSH, et al. Diagnostic accuracy of serological tests for the diagnosis of Chikungunya virus infection: A systematic review and meta-analysis. PLoS Negl Trop Dis. 2022;16(2):e0010152. pmid:35120141
  68. Kosasih H, Widjaja S, Surya E, Hadiwijaya SH, Butarbutar DP, Jaya UA, et al. Evaluation of two IgM rapid immunochromatographic tests during circulation of Asian lineage Chikungunya virus. Southeast Asian J Trop Med Public Health. 2012;43(1):55–61. pmid:23082554
  69. Rianthavorn P, Wuttirattanakowit N, Prianantathavorn K, Limpaphayom N, Theamboonlers A, Poovorawan Y. Evaluation of a rapid assay for detection of IgM antibodies to chikungunya. Southeast Asian J Trop Med Public Health. 2010;41(1):92–6. pmid:20578487
  70. Boeras D, Pelegrino JL, Grandadam M, Duong V, Dussart P, Brey P, et al. Evaluation of Zika rapid tests as aids for clinical diagnosis and epidemic preparedness. Lancet. 2022;49.
  71. Kim YH, Lee J, Kim YE, Chong CK, Pinchemel Y, Reisdorfer F, et al. Development of a Rapid Diagnostic Test Kit to Detect IgG/IgM Antibody against Zika Virus Using Monoclonal Antibodies to the Envelope and Non-structural Protein 1 of the Virus. Korean J Parasitol. 2018;56(1):61–70. pmid:29529852
  72. Munoz-Jordan JL. Diagnosis of Zika Virus Infections: Challenges and Opportunities. J Infect Dis. 2017;216(suppl_10):S951–S6.
  73. Petersen LR, Jamieson DJ, Honein MA. Zika Virus. N Engl J Med. 2016;375(3):294–5.
  74. Endale A, Medhin G, Darfiro K, Kebede N, Legesse M. Magnitude of Antibody Cross-Reactivity in Medically Important Mosquito-Borne Flaviviruses: A Systematic Review. Infect Drug Resist. 2021;14:4291–9. pmid:34703255
  75. Waggoner JJ, Pinsky BA. Zika Virus: Diagnostics for an Emerging Pandemic Threat. J Clin Microbiol. 2016;54(4):860–7. pmid:26888897
  76. Blacksell SD, Lim C, Tanganuchitcharnchai A, Jintaworn S, Kantipong P, Richards AL, et al. Optimal Cutoff and Accuracy of an IgM Enzyme-Linked Immunosorbent Assay for Diagnosis of Acute Scrub Typhus in Northern Thailand: an Alternative Reference Method to the IgM Immunofluorescence Assay. J Clin Microbiol. 2016;54(6):1472–8. pmid:27008880
  77. Hanson KE, Couturier MR. Multiplexed Molecular Diagnostics for Respiratory, Gastrointestinal, and Central Nervous System Infections. Clin Infect Dis. 2016;63(10):1361–7. pmid:27444411