Abstract
Background
The demand for rapid molecular diagnostics for respiratory viruses has increased substantially. Several point-of-care PCR platforms have become available, yet comparative performance data remain limited.
Objectives
To evaluate the diagnostic accuracy and operational reliability of four rapid PCR platforms for detection of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), influenza A and B viruses (IAV, IBV), and respiratory syncytial virus (RSV), in comparison with the GeneXpert (Cepheid, USA) platform.
Methods
Nasopharyngeal swabs from patients with respiratory symptoms were tested using the GeneXpert; positive samples were subsequently analysed on four alternative systems: the 30-minute and 1-hour M10 (SD Biosensor, South Korea) assays, FlashDetect™ Flash10 (Coyote Bioscience, China), Vivalytic (Bosch Healthcare Solutions, Germany), and Galaxy Lite (Igenesis, China). Additional lower viral load samples and cultured IAV/IBV strains were included.
Results
A total of 223 GeneXpert-positive samples were prospectively analysed. Flash10 showed 94.6% overall agreement, missing SARS-CoV-2 (n = 4; GeneXpert cycle threshold (Ct) range 37.7–42.2), IAV (n = 5; Ct 32.8–37.7), and IBV (n = 3; Ct 26.5–36.7). Vivalytic showed 83.0% overall agreement, missing SARS-CoV-2 (n = 16; Ct 30.4–42.2), IAV (n = 9; Ct 26.8–37.7), IBV (n = 9; Ct 27.2–36.7), and RSV (n = 4; Ct 31.5–37.0). Galaxy Lite achieved 88.2% overall agreement but failed in 27.2% of test runs. With a smaller sample size, the M10 (30-minute) assay showed 98.6% overall agreement with GeneXpert, missing one SARS-CoV-2 case (Ct 39.7).
Citation: Smit PW, Man Pd, Brand H, Breijer S, Wijk Mv, Meijer A, et al. (2025) Comparative evaluation of five rapid PCR platforms for respiratory virus detection. PLoS One 20(12): e0338716. https://doi.org/10.1371/journal.pone.0338716
Editor: Minal Dakhave, Dognosis India Private Limited, INDIA
Received: August 12, 2025; Accepted: November 26, 2025; Published: December 11, 2025
Copyright: © 2025 Smit et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: All relevant data are within the manuscript and its Supporting information files.
Funding: The author(s) received no specific funding for this work.
Competing interests: The authors have declared that no competing interests exist.
Introduction
Respiratory viral infections, particularly those caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), influenza A and B viruses (IAV and IBV), and respiratory syncytial virus (RSV), continue to impose a significant global health burden. These pathogens are associated with considerable morbidity and mortality, placing immense pressure on healthcare systems and underscoring the need for rapid and reliable diagnostic strategies [1–3]. Timely identification of these infections is critical not only for initiating appropriate clinical management but also for implementing effective infection control measures and public health interventions [2–4].
Although laboratory-based reverse transcription polymerase chain reaction (RT-PCR) assays have long been regarded as the diagnostic gold standard, their reliance on centralized laboratory infrastructure and extended turnaround times can delay critical decision-making [2]. In response to these challenges, there has been a notable shift toward the development and deployment of point-of-care (POC) molecular diagnostics. These rapid PCR platforms are designed to deliver accurate results within a significantly reduced timeframe outside of a laboratory setting, facilitating prompt treatment decisions and improving patient management in both hospital and community settings [5–7].
Among the available POC solutions, the GeneXpert system (Cepheid, USA) has established itself as a robust reference platform for the simultaneous detection of multiple respiratory pathogens. However, the diagnostic landscape is rapidly evolving, with several new platforms entering the market that claim comparable or superior performance [6–8]. Despite this progress, comprehensive head-to-head evaluations comparing a multitude of platforms remain scarce, highlighting the need for comparative studies assessing their analytical performance, sensitivity, specificity, and operational feasibility. Additionally, the economic burden associated with large-scale testing has grown substantially, owing to more frequent testing for respiratory viruses since the emergence of coronavirus disease 2019 (COVID-19) and greater recognition of the importance of detecting not only SARS-CoV-2 and IAV/IBV but also RSV in older adults and in those with pulmonary or cardiac comorbidities [9]. The high costs associated with established platforms such as GeneXpert necessitate the exploration of cost-effective alternatives without compromising diagnostic accuracy.
In this study, we compared four rapid PCR platforms—M10 (SD Biosensor, South Korea), Vivalytic (Bosch, Germany), FlashDetect™ Flash10 (Coyote Bioscience, China), and Galaxy Lite (Igenesis, China)—against the GeneXpert (Cepheid, USA) as the reference standard for the detection of SARS-CoV-2, IAV/IBV, and RSV. The M10 and Vivalytic platforms have been available for a few years, while the FlashDetect™ Flash10 and Galaxy Lite were introduced in 2024. Our primary objective was to evaluate these platforms in terms of their diagnostic accuracy, usability, and operational costs to inform clinical and laboratory practices.
Materials and methods
Study design and setting
This prospective study was conducted at the Franciscus Hospital between November 2024 and March 2025. Patients presenting with respiratory symptoms were consecutively tested for SARS-CoV-2, IAV/IBV, and RSV using the Xpert Xpress SARS-CoV-2/Flu/RSV Plus assay on the GeneXpert platform (Cepheid, USA) as part of routine diagnostic care. Nasopharyngeal swabs (MANTACC, Miraclean Technology Co, China) were placed in Viral Transport Medium (HiViral Transport Medium, HiMedia Laboratories, India). No specific inclusion criteria regarding patient characteristics were applied, ensuring a broad and representative sample of the patient population. Testing was performed as part of the triage protocol for patients presenting with respiratory symptoms at the emergency department. A sample size calculation, aimed at detecting at least a 15% difference in agreement per viral target with 80% power and an α of 0.05 (McNemar's test), showed that 53 samples were required per viral target.
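The sample-size calculation described above can be approximated with a standard power formula for McNemar's test (Connor's approximation). The sketch below is purely illustrative: the proportion of discordant pairs (`psi`) is an assumption of ours and is not stated in the study protocol.

```python
from math import ceil, sqrt
from statistics import NormalDist


def mcnemar_sample_size(delta, psi, alpha=0.05, power=0.80):
    """Connor's approximation for the paired-proportions (McNemar) test.

    delta: minimum detectable difference in agreement between platforms
    psi:   expected proportion of discordant pairs (an assumption here)
    """
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_b = NormalDist().inv_cdf(power)          # desired power
    n = (z_a * sqrt(psi) + z_b * sqrt(psi - delta**2)) ** 2 / delta**2
    return ceil(n)


# Assuming all discordance lies in one direction (psi = delta = 0.15):
print(mcnemar_sample_size(delta=0.15, psi=0.15))  # -> 50
```

With these assumptions the formula yields 50 samples per target; the 53 used in the study presumably reflects a slightly different discordance assumption or a different variant of the formula.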
To ensure adequate representation of each targeted virus, at least 50 positive samples per respiratory virus were prospectively included. Samples positive according to GeneXpert were enrolled consecutively until the target number per virus was reached. No separate group of negative samples was included; single-target positives were considered negative for the other viral targets. A small number of co-infections were observed during routine testing; these were included under the dominant target for which they were selected. An additional group of 30 IAV-positive samples with GeneXpert Ct values >27 was prospectively collected during the same study period to evaluate samples with lower viral loads. The study aimed to evaluate and compare the diagnostic performance of four rapid PCR platforms—M10 (SD Biosensor), Vivalytic (Bosch), FlashDetect™ Flash10 (Coyote Bioscience), and Galaxy Lite (Igenesis)—with the GeneXpert assay as the reference method. Analyses on the evaluation platforms were mostly performed on the same day as initial testing, with a maximum delay of 48 hours for clinical samples collected during weekends. To reflect most clinical settings, invalid test results were repeated, but test results discrepant with other platforms were not.
Midway through the study period, SD Biosensor introduced a new 30-minute assay, the STANDARD M10 FLU/RSV/SARS-CoV-2 FAST, replacing the original 60-minute version. Samples processed after this transition were analysed using the new 30-minute protocol cartridges, which at the time carried a Research Use Only (RUO) label. This change was documented and accounted for in the data analysis to assess any differences in diagnostic performance between the two versions.
The Institutional Review Board (IRB) approved the study protocol (IRB protocol number 2024−075) and declared that this study does not fall within the scope of the Dutch Medical Research Involving Human Subjects Act.
Specimen collection and inclusion of cultured influenza virus strains
Nasopharyngeal swabs were collected following standardized procedures and were immediately transported to the microbiology laboratory located within the hospital. Each sample was typically processed within one hour after arrival.
To further assess the platforms' capability to detect various influenza virus strains (given the virus's high genetic diversity), a panel of cultured IAV and IBV viruses was included, representing different genetic clades and H- and N-subtypes of IAV and different genetic clades of B/Victoria lineage viruses. This strain set comprised human seasonal A(H1N1)pdm09 clade 5a.2a (C.1.9.1), A(H3N2) clade 2a.3a.1 (J.2.1, J.2, and J.1.1), and B/Victoria clade V1A.3a.2 (C.5.6, C.5.1, and C.5.7) lineage viruses from the same period in which the clinical study samples were collected. The set also contained specimens from a 2023 External Quality Assessment (EQA) on zoonotic influenza viruses carried out among clinical diagnostic laboratories in the Netherlands, comprising human seasonal A(H1N1)pdm09 and A(H3N2) strains with mutations in the M-genome segment that had previously caused increased false-negative rates with some commercial assays, as well as avian (H7N2, H5N6, and H5N1 (HA clade 2.3.4.4b, in two concentrations)) and swine (H1N1v and H1N2v, isolated from human cases) influenza A virus subtypes that previously caused, or might cause, zoonotic infections.
Statistical analysis
The diagnostic performance of each rapid PCR platform was determined by comparing its results to those obtained with the GeneXpert assay. Key performance metrics, including concordance and percentage agreement, were calculated for each platform, alongside qualitative assessments of user experience and operational reliability.
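The percent-agreement calculation described above can be sketched in a few lines. The paired results below are invented for illustration only; they are not study data.

```python
# Minimal sketch of the percent-agreement calculation against a reference.
# Each pair holds the (GeneXpert, evaluation platform) qualitative result
# for one sample on one viral target; values here are illustrative.
paired = [
    ("pos", "pos"), ("pos", "pos"), ("pos", "pos"), ("pos", "neg"),
    ("neg", "neg"), ("neg", "neg"), ("neg", "neg"), ("neg", "neg"),
]


def percent_agreement(pairs):
    """Share of samples where the test platform matches the reference."""
    return 100 * sum(ref == test for ref, test in pairs) / len(pairs)


overall = percent_agreement(paired)
ppa = percent_agreement([p for p in paired if p[0] == "pos"])  # positive agreement
npa = percent_agreement([p for p in paired if p[0] == "neg"])  # negative agreement
print(f"overall {overall:.1f}%, PPA {ppa:.1f}%, NPA {npa:.1f}%")
```

Stratifying agreement into positive and negative components, as sketched here, mirrors how the study treats single-target positives as negatives for the remaining targets.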
Results
A total of 223 nasopharyngeal samples were prospectively included from patients presenting to the hospital (Table 1). Prospective samples were categorized into target groups based on GeneXpert results: positive for SARS-CoV-2, IAV, IBV, or RSV. The group labeled "negative" refers to samples that tested negative for the virus in question (e.g., SARS-CoV-2) but may have tested positive for another target. Most platforms demonstrated high reliability in processing samples, generating valid results in nearly all cases. However, the Galaxy Lite platform showed a markedly higher error rate, even after the system was replaced: of the 195 samples tested on this platform, 53 (27.2%) failed to produce a valid result. Owing to this high failure rate, Galaxy Lite was excluded from further testing midway through the study.
The agreement of each platform with the GeneXpert reference method is summarized in Table 1. Although fewer samples were tested on the M10 (30-minute version) than on Flash10, both platforms consistently demonstrated high agreement percentages (Table 1). Vivalytic demonstrated variable performance on prospective samples, with high agreement for RSV (93.5%) and negative samples (100%) but lower concordance for SARS-CoV-2 (72.4%) and IAV (84.7%) compared with the other platforms. The M10 in its earlier 60-minute configuration showed high agreement with the reference method but missed low-viral-load SARS-CoV-2 and IAV samples, while the updated 30-minute version performed comparably or slightly better across most targets (Table 1).
Among the GeneXpert-positive samples not detected by the test platforms, GeneXpert Ct values were generally high, though some platforms failed to detect targets at lower Ct values. Focusing on prospective samples: for M10 (1-hour), missed detections included SARS-CoV-2 (n = 8; GeneXpert Ct range 33.0–42.2), IAV (n = 2; Ct 36.7–37.7), and IBV (n = 1; Ct 36.7). The updated 30-minute version of the M10 test missed only one SARS-CoV-2 case (Ct 39.7). Flash10 failed to detect SARS-CoV-2 (n = 4; Ct 37.7–42.2), IAV (n = 5; Ct 32.8–37.7), and IBV (n = 3; Ct 26.5–36.7). For Galaxy Lite, missed detections included SARS-CoV-2 (n = 10; Ct 32.9–42.2), IAV (n = 3; Ct 29.2–35.3), IBV (n = 4; Ct 14.8–36.7), and RSV (n = 6; Ct 20.3–35.4). Vivalytic showed the highest number of missed detections, for SARS-CoV-2 (n = 16; Ct 30.4–42.2) and IAV (n = 9; Ct 26.8–37.7); IBV (n = 9; Ct 27.2–36.7) and RSV (n = 4; Ct 31.5–37.0) samples were also missed.
None of the ‘false’ positives detected by any single platform relative to GeneXpert was confirmed by the other platforms; these were therefore regarded as true false positives (Table 1).
The reference influenza strains (see Materials and methods) were detected by all platforms, except for Vivalytic, which repeatedly missed H1N1v, a swine influenza virus strain isolated from a human case.
For the IAV-positive clinical samples collected in the same study period with GeneXpert Ct > 27, agreement ranged from 31% to 100% (Table 1), indicating performance ranging from reduced to comparable relative to GeneXpert.
To compare performance in terms of usability, machine size, storage requirements, and functionality, an overview of the characteristics of each platform and test is presented in Table 2. According to laboratory technicians’ experience during the evaluation, the Galaxy Lite platform was less user-friendly than the other systems, requiring additional manual steps and being the only assay that could not be stored at room temperature. In the Dutch laboratory setting, all newly introduced platforms were less expensive per test than the GeneXpert platform, although such price differences may vary across countries and healthcare systems.
Discussion
The use of rapid PCR platforms for respiratory virus detection has expanded significantly since the COVID-19 pandemic, driven by the need to improve patient flow, reduce patient isolation durations, and support efficient patient triage in emergency departments. Concurrently, many new platforms have entered the market, often promising faster turnaround times and reduced costs. This highlights the importance of comparative evaluations to guide evidence-based implementation.
In this study, we assessed four rapid PCR systems against GeneXpert as the reference test for the detection of SARS-CoV-2, IAV/IBV, and RSV in clinical specimens. Compared with the 60-minute version, the M10 30-minute assay showed higher agreement, despite the limited number of samples tested. Flash10 also performed well, though with slightly lower agreement for SARS-CoV-2 and IAV, mainly in samples with low viral loads. Vivalytic showed more variable performance, with high agreement for RSV and negative samples but lower agreement for SARS-CoV-2 (72.4%) and IAV (84.7%). It should be noted, however, that according to the manufacturer the sample medium used (VTM) has not been validated for this test. Galaxy Lite was excluded midway through the study due to persistent technical issues, including high error rates in two separate instruments. According to the manufacturer, these issues have since been resolved, but this occurred too late to include the modified test in our comparative analysis. Our findings for GeneXpert and M10 (60-minute) align with previously published studies evaluating similar assays [4,5,7,8,10,11]. The lower performance of Vivalytic for SARS-CoV-2 and IAV in our evaluation is also consistent with an earlier report describing reduced agreement, particularly in samples with low viral loads [10].
With the exception of M10, all platforms showed a modest reduction in agreement for low-viral-load samples, particularly evident in IAV cases with GeneXpert Ct values above 30. This difference was deliberately explored further through the inclusion of additional samples with high Ct values. Although relevant from an analytical perspective, this reduction may be of limited clinical impact in emergency settings, where rapid turnaround and decision-making are prioritized over detection of very low viral loads of uncertain clinical relevance [12].
A critical factor contributing to platform-specific performance is assay design, particularly the number of PCR targets and the different target genes used per virus (Table 2). All platforms use two or more targets for SARS-CoV-2 (e.g., the E gene and N2 or ORF1ab). This deliberate redundancy offers a benefit in case of target dropout due to viral mutation. It can also increase analytical sensitivity in clinical specimens, especially if the PCR amplifies multi-copy RNAs, such as mRNAs or intermediate viral RNA copies. However, with the exception of GeneXpert and M10 (Table 2), the remaining platforms use only a single target for RSV and for each of IAV and IBV. While this simplifies assay design, it introduces a vulnerability to false-negative results if mutations occur in the targeted region. This risk is especially relevant for genetically dynamic viruses such as SARS-CoV-2 and has also been described for IAV, where mutations in the matrix or hemagglutinin gene regions have occasionally led to reduced assay performance or target failure in single-target designs [13,14]. The inclusion of genetically diverse IAV and IBV strains strengthens the analytical evaluation of the platforms and reflects broader diagnostic challenges, including preparedness for zoonotic variants, as emphasized in recent national EQA studies [13].
Strengths of this study include its prospective design, parallel testing of identical clinical specimens across platforms, and the inclusion of both routine and high-Ct IAV samples as well as cultured influenza virus strains (from different genetic clades and H- and N-subtypes, including potential zoonotic viruses). Limitations include the absence of blinding, the predefined group sizes, and limited sample numbers for certain platforms (e.g., SD Biosensor with two different kits) or subgroups of respiratory viruses. A separate group of samples negative for all viral targets was not included in this study. However, for each pathogen, negative results were inherently represented by samples positive for other viral targets, so each assay's specificity could still be evaluated within the tested cohort. Nevertheless, the absence of a distinct all-negative control group may limit a full assessment of assay performance in populations with low pathogen prevalence. While the inclusion of genetically diverse IAV and IBV strains strengthens the analytical evaluation of the platforms, the continual emergence of viral variants poses an ongoing challenge to assay robustness. Under the EU In Vitro Diagnostic Regulation (IVDR), manufacturers of high-risk diagnostic tests are required to demonstrate sustained performance, including in the face of viral evolution. This process involves extensive documentation, validation, and in some cases collaboration with EU Reference Laboratories (EURLs), which may delay rapid assay adaptation. Ultimately, platform resilience depends on the manufacturer's ability to proactively update assay design and meet evolving regulatory demands in a timely and efficient manner. This is particularly relevant for newly introduced platforms, which often rely on single-target assay designs and have not yet demonstrated long-term performance across diverse clinical settings.
Ongoing independent evaluations and transparent dissemination of performance data will be essential to ensuring sustained diagnostic reliability.
In conclusion, several of the rapid PCR platforms evaluated in this study show strong diagnostic potential and may serve as reliable alternatives in clinical workflows using standard laboratory-based NAAT, provided platform-specific limitations are considered. Future studies should continue to assess not only analytical performance but also test stability, workflow integration, and resilience to viral genetic variability.
Acknowledgments
We would like to thank Gabriel Goderski, Sharon van de Brink, Samantha Zoomer, John Sluimer (NIC location RIVM, Bilthoven), Ron Fouchier, Oanh Vuong (NIC location Erasmus MC, Rotterdam), Erhard van der Vries, and Manon Houben (Royal GD, Deventer) for sharing the cultured IAV and IBV viruses.
References
- 1. Lakshmanan K, Liu BM. Impact of Point-of-Care Testing on Diagnosis, Treatment, and Surveillance of Vaccine-Preventable Viral Infections. Diagnostics (Basel). 2025;15(2):123. pmid:39857007
- 2. Clark TW, Beard KR, Brendish NJ, Malachira AK, Mills S, Chan C, et al. Clinical impact of a routine, molecular, point-of-care, test-and-treat strategy for influenza in adults admitted to hospital (FluPOC): a multicentre, open-label, randomised controlled trial. Lancet Respir Med. 2021;9(4):419–29. pmid:33285143
- 3. Tan C, Chan CK, Ofner M, O’Brien J, Thomas NR, Callahan J, et al. Implementation of point-of-care molecular testing for respiratory viruses in congregate living settings. Infect Control Hosp Epidemiol. 2024;45(9):1–5. pmid:38659123
- 4. Johnson G, Gregorchuk BSJ, Zubrzycki A, Kolsun K, Meyers AFA, Sandstrom PA, et al. Clinical evaluation of the GeneXpert® Xpert® Xpress SARS-CoV-2/Flu/RSV PLUS combination test. Can J Microbiol. 2023;69(3):146–50. pmid:36657122
- 5. Wolters F, Grünberg M, Huber M, Kessler HH, Prüller F, Saleh L, et al. European multicenter evaluation of Xpert® Xpress SARS-CoV-2/Flu/RSV test. J Med Virol. 2021;93(10):5798–804. pmid:34050951
- 6. Domnich A, Bruzzone B, Trombetta C-S, De Pace V, Ricucci V, Varesano S, et al. Rapid differential diagnosis of SARS-CoV-2, influenza A/B and respiratory syncytial viruses: Validation of a novel RT-PCR assay. J Clin Virol. 2023;161:105402. pmid:36805601
- 7. Jensen CB, Schneider UV, Madsen TV, Nielsen XC, Ma CMG, Severinsen JK, et al. Evaluation of the analytical and clinical performance of two RT-PCR based point-of-care tests; Cepheid Xpert® Xpress CoV-2/Flu/RSV plus and SD BioSensor STANDARD™ M10 Flu/RSV/SARS-CoV-2. J Clin Virol. 2024;172:105674. pmid:38643722
- 8. Han E, Kim J, Kim YJ, Choi HJ, Bae MH. Clinical performance of a rapid RT-PCR assay using STANDARD™ M10 SARS-CoV-2 between July 2022 and January 2023 in Korea. Diagn Microbiol Infect Dis. 2024;110(4):116523. pmid:39244844
- 9. Heger LA, Elsen N, Rieder M, Gauchel N, Sommerwerck U, Bode C, et al. Clinical analysis on diagnostic accuracy of Bosch Vivalytic SARS-CoV-2 point-of-care test and evaluation of cycle threshold at admission for COVID-19 risk assessment. BMC Infect Dis. 2022;22(1):486. pmid:35606698
- 10. De Pace V, Caligiuri P, Ricucci V, Nigro N, Galano B, Visconti V, et al. Rapid diagnosis of SARS-CoV-2 pneumonia on lower respiratory tract specimens. BMC Infect Dis. 2021;21(1):926. pmid:34493222
- 11. Fragkou PC, Moschopoulos CD, Dimopoulou D, Ong DSY, Dimopoulou K, Nelson PP, et al. Performance of point-of care molecular and antigen-based tests for SARS-CoV-2: a living systematic review and meta-analysis. Clin Microbiol Infect. 2023;29(3):291–301. pmid:36336237
- 12. Zoomer S, Goderski G, van den Brink S, Sluimer J, Fouchier R, Vuong O, et al. Medical microbiology laboratories in the Netherlands can detect animal type A influenza viruses well. Bilthoven: External Quality Assessment; 2023.
- 13. Stellrecht KA. The Drift in Molecular Testing for Influenza: Mutations Affecting Assay Performance. J Clin Microbiol. 2018;56(3):e01531-17. pmid:29305549
- 14. Yang J-R, Kuo C-Y, Huang H-Y, Wu F-T, Huang Y-L, Cheng C-Y, et al. Newly emerging mutations in the matrix genes of the human influenza A(H1N1)pdm09 and A(H3N2) viruses reduce the detection sensitivity of real-time reverse transcription-PCR. J Clin Microbiol. 2014;52(1):76–82. pmid:24153120