Health management information system (HMIS) data verification: A case study in four districts in Rwanda

  • Alphonse Nshimyiryo ,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Resources, Software, Visualization, Writing – original draft, Writing – review & editing

    anshimyiryo@pih.org

    Affiliation Maternal and Child Health Program, Partners In Health/Inshuti Mu Buzima, Kigali, Rwanda

  • Catherine M. Kirk,

    Roles Conceptualization, Investigation, Methodology, Project administration, Resources, Validation, Writing – original draft, Writing – review & editing

    Affiliation Maternal and Child Health Program, Partners In Health/Inshuti Mu Buzima, Kigali, Rwanda

  • Sara M. Sauer,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Resources, Validation, Visualization, Writing – review & editing

    Affiliation Department of Biostatistics, Harvard T.H. Chan School of Public Health, Boston, MA, United States of America

  • Emmanuel Ntawuyirusha,

    Roles Data curation, Investigation, Resources, Validation, Writing – review & editing

    Affiliation Planning, Health Financing and Information Systems, Ministry of Health, Kigali, Rwanda

  • Andrew Muhire,

    Roles Data curation, Investigation, Resources, Validation, Writing – review & editing

    Affiliation Planning, Health Financing and Information Systems, Ministry of Health, Kigali, Rwanda

  • Felix Sayinzoga,

    Roles Investigation, Methodology, Resources, Validation, Writing – review & editing

    Affiliation Maternal, Child and Community Health Division, Rwanda Biomedical Center, Kigali, Rwanda

  • Bethany Hedt-Gauthier

    Roles Conceptualization, Investigation, Methodology, Resources, Supervision, Validation, Writing – review & editing

    Affiliations Maternal and Child Health Program, Partners In Health/Inshuti Mu Buzima, Kigali, Rwanda, Department of Biostatistics, Harvard T.H. Chan School of Public Health, Boston, MA, United States of America, Department of Global Health and Social Medicine, Harvard Medical School, Boston, MA, United States of America

Abstract

Introduction

Reliable Health Management Information System (HMIS) data can be used at minimal cost to identify areas for improvement and to measure the impact of healthcare delivery. However, variable HMIS data quality in low- and middle-income countries limits its value in monitoring, evaluation and research. We aimed to review the quality of Rwandan HMIS data for maternal and newborn health (MNH) based on the consistency of HMIS reports with facility source documents.

Methods

We conducted a cross-sectional study in 76 health facilities (HFs) in four Rwandan districts. For 14 MNH data elements, we compared HMIS data to facility register data recounted by study staff for a three-month period in 2017. An HF was excluded from a specific comparison if the service was not offered, source documents were unavailable, or at least one HMIS report was missing for the study period. World Health Organization guidelines on HMIS data verification were used: a verification factor (VF) was defined as the ratio of register data to HMIS data. A VF<0.90 or VF>1.10 indicated over- and under-reporting in HMIS, respectively.

Results

High proportions of HFs achieved acceptable VFs for data on the number of deliveries (98.7%; 75/76), antenatal care (ANC1) new registrants (95.7%; 66/69), live births (94.7%; 72/76), and newborns who received first postnatal care within 24 hours (81.5%; 53/65). This was slightly lower for the number of women who received iron/folic acid (78.3%; 47/60) or were tested for syphilis at ANC1 (67.6%; 46/68), and was lowest for the number of women with an ANC1 standard visit (25.0%; 17/68) and a fourth standard visit (ANC4) (17.4%; 12/69). The majority of HFs over-reported on ANC4 (76.8%; 53/69) and ANC1 (64.7%; 44/68) standard visits.

Conclusion

HMIS data quality varied by data element: some indicators were of high quality, and reporting trends were consistent across districts. Over-reporting was observed for ANC-related data elements requiring more complex calculations, i.e., knowledge of gestational age and of the visit schedule used to determine ANC standard visits, as well as for quality-of-care indicators in ANC. Ongoing data quality assessments and training to address gaps could help improve HMIS data quality.

Introduction

National health management information systems (HMIS) have been established in many low- and middle-income countries (LMICs) for routine collection and management of facility-based data on health care service delivery [1,2]. When the data are of good quality, they can be used, at little to no cost, to identify areas that need improvement, to evaluate health interventions, to inform evidence-based health policies, and to design programs and allocate resources at all levels of the health system [3–5]. However, there is evidence of variable quality of HMIS data [6–13], limiting its value in monitoring, evaluation and research in LMICs.

Regular data quality assessment is one strategy for improving the quality of HMIS data in LMICs [14,15]. The World Health Organization (WHO) provides guidelines on data quality review (DQR) through a desk review [16] of data previously reported to the HMIS and through verification of HMIS data quality via a facility survey [17]. The desk review assesses HMIS data quality along four dimensions: 1) completeness and timeliness of data, 2) internal consistency of reported data, 3) external consistency, and 4) external comparisons with population data. Detailed information on the definitions and requirements for calculating these dimensions, and on applying HMIS data quality assessment through the desk review, can be found elsewhere [13,16,18]. The WHO toolkit for data quality review defines a verification factor (VF) as the ratio of recounted data from facility source documents to HMIS data [17]. The application of the VF has been more limited, and studies have used variable definitions. However, there is evidence that the level of agreement between HMIS data and records in facility source documents can vary depending on the type of data being collected and can be rooted in the early stages of collecting those data from facility source documents [11,19–24]. Over- or under-reporting in HMIS data can result from human errors that occur when counting events from source documents, or from not including all events for the reporting period, either by omitting some of the necessary source documents or by not covering the entire reporting period [11,24]. In addition, intentional over- or under-reporting in HMIS data at the facility level can be motivated by pressure to meet national targets, whereas inaccuracies in transferring data from facility source documents to the electronic database can be associated with excessive staff workload combined with pressure to meet reporting deadlines [23].

The Rwanda HMIS was established in 1998. In 2012, with the goal of improving the quality of routinely collected health data from community health workers (CHWs) and all health facilities (HFs) across the country, the Rwanda Ministry of Health (MoH) upgraded the HMIS to a web-based system known as the District Health Information System version 2 (DHIS2) [25]. Recent assessments of Rwanda HMIS data quality using WHO guidelines have been limited to a desk review of data available in the HMIS [13,26]. These assessments found that the quality of Rwanda HMIS data is high with respect to completeness and internal consistency of reported data for the studied HMIS indicators [26]. However, findings from two small-scale studies that compared HMIS data and records in source documents, using reporting accuracy definitions different from the WHO data verification definition, suggest a variable level of agreement between HMIS data and records in source documents [27,28]. One of these studies, which defined accuracy of reporting in HMIS data as a deviation of 5% or less between HMIS and facility source document data on family planning, antenatal care and delivery, found an overall reporting accuracy of 70.6% for 37 HFs sampled in three districts of the Eastern Province of Rwanda [28].

Good quality HMIS data play an important role in identifying areas that need improvement, monitoring progress, and evaluating interventions designed to address identified gaps [29–31]. Rwanda has made remarkable progress in reducing under-five mortality over the past two decades; however, the reduction in neonatal deaths (those occurring within 28 days of birth, mainly in health facilities) has been slower [32]. Good quality HMIS data on maternal and newborn healthcare are needed to identify gaps in existing facility-based care, to design appropriate interventions, and to monitor progress [33]. Since 2013, the “All Babies Count” (ABC) intervention has been implemented to accelerate the reduction of preventable neonatal deaths in Rwanda [34]. The ABC intervention focuses on improving the coverage and quality of antenatal, maternity and postnatal care services. ABC is implemented through a collaboration between Partners In Health/Inshuti Mu Buzima (PIH/IMB), an international non-profit organization, and the Rwanda MoH. However, evaluating the impact of these programs has been costly, with parallel collection of data on program indicators through the HMIS and through review of facility source documents, given concerns about poor HMIS data quality [34]. Therefore, this study uses the WHO guidelines for HMIS data verification to provide evidence on the level of agreement between Rwanda HMIS data and records in the source documents of reporting HFs. We calculate VFs for fourteen HMIS data elements, identified jointly by PIH/IMB and the MoH as priority indicators for quality improvement in maternal and newborn health care, for 76 HFs (7 hospitals and 69 health centers) that received the ABC intervention between 2017 and 2019 in four districts of Rwanda. The criteria for indicator selection included clinical relevance to neonatal survival, designation as government priority indicators, and/or inclusion in the WHO standards for improving quality of maternal and newborn care in health facilities [35].

Methods

Study design

This was a cross-sectional study assessing the quality of Rwanda HMIS data, measured as agreement between HMIS data and facility source document data, for 76 HFs across 14 HMIS data elements related to maternal and newborn health that were used to monitor quality and progress and to inform quality improvement efforts through the ABC intervention (Table 1). The antenatal care (ANC) register was the source document for 5 data elements that were reported only by health centers (HCs), while the maternity and postnatal care (PNC) registers were the source documents for 7 data elements that were reported by both HCs and hospitals. Two data elements related to neonatal admissions and deaths were recounted from the neonatology care unit (NCU) register and were reported only by hospitals.

Table 1. Description of data elements and source of data.

https://doi.org/10.1371/journal.pone.0235823.t001

Study setting

This study included 48 HFs in Gakenke and Rulindo districts in Northern Rwanda and 28 HFs in Gisagara and Rusizi districts in Southern and Western Rwanda, respectively. The 76 HFs were grouped into seven hospital catchment areas (HCAs): Nemba District Hospital (15 HFs), Ruli District Hospital (10 HFs), Kinihira Provincial Hospital (9 HFs), Rutongo District Hospital (14 HFs), Gakoma District Hospital (6 HFs), Kibilizi District Hospital (10 HFs) and Mibilizi District Hospital (12 HFs). These MoH-operated facilities were included because they received the ABC intervention and represented 14% (69/499) of health centers and 15% (7/48) of hospitals across all 30 districts in Rwanda [36].

The ABC project was originally implemented in 2013 by PIH/IMB in Kayonza and Kirehe districts in Eastern Rwanda, in partnership with the Rwanda MoH [34]. ABC was later scaled up to Gakenke and Rulindo (July 2017) and then to Gisagara and Rusizi (October 2017). The ABC scale-up project used health facility-based data, collected monthly through the Rwanda HMIS, to monitor indicator progress from baseline to the end of the project and to evaluate its impact. In addition, the project underwent the HMIS data verification process by recounting data in standardized HF registers for the same data elements and reporting periods. Five data elements related to ANC services (ANC new registrants, syphilis testing and iron/folic acid distribution at the first ANC visit, and the number of women with first and fourth ANC standard visits) were reported only by health centers, whereas two data elements on the number of neonatal admissions and neonatal deaths in the hospital neonatology care unit were specific to hospitals [37]. Data are recorded using standardized registers developed by the MoH and provided to all HFs (see Table 1 for data sources); women attending ANC are recorded at their first ANC visit and provided an ANC card and an ANC number that facilitate continuity of data recording at the individual level for ANC. This study reports data from the baseline period prior to the ABC intervention in the 7 HCAs.

Sources of data

HMIS data.

The ABC team worked with the MoH national HMIS team to extract HMIS report data for the study periods. HMIS data collection starts at the reporting facility, with clinical staff in each care service registering patients/clients and the care provided to them in standardized registers and/or medical files [38,39]. For monthly reporting to the HMIS, the facility data manager distributes paper HMIS reporting forms to heads of services by the 25th of each month. Each head of service collects the data relevant to their service and submits a completed HMIS report for the previous month to the facility data manager by the 3rd day of the month following the reporting month. For timely reporting, the facility data manager should upload all facility data into DHIS2 by the 5th day of every month. Data verification by the facility team and corrections in the system are only allowed between the 5th and 15th of each month. Any request to change data in the system after the 15th of the month must be submitted to the central MoH, and access is granted only upon strong justification of the request.
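To make this reporting calendar concrete, the sketch below encodes the deadlines described above as simple date checks in R (the language used for the study's analysis). It is an illustration only, not part of the Rwanda HMIS or DHIS2 tooling; the function names, the lubridate dependency and the example dates are assumptions for the example.

```r
# Illustrative sketch (not part of the Rwanda HMIS/DHIS2 tooling) of the monthly
# reporting deadlines described above: facility uploads are due by the 5th of the
# month following the reporting month, and facility-level corrections are allowed
# only between the 5th and the 15th of that month.
library(lubridate)

# The reporting month is passed as the first day of that month, e.g. "2017-04-01".
upload_deadline <- function(reporting_month) {
  # 5th day of the month following the reporting month
  as.Date(reporting_month) %m+% months(1) + days(4)
}

is_timely_upload <- function(upload_date, reporting_month) {
  as.Date(upload_date) <= upload_deadline(reporting_month)
}

in_correction_window <- function(correction_date, reporting_month) {
  # corrections allowed from the 5th through the 15th of the following month
  window_start <- upload_deadline(reporting_month)
  window_end <- as.Date(reporting_month) %m+% months(1) + days(14)
  as.Date(correction_date) >= window_start & as.Date(correction_date) <= window_end
}

# Example: an April 2017 report uploaded on 4 May 2017 is timely, while a
# correction attempted on 20 May 2017 would need a request to the central MoH.
is_timely_upload("2017-05-04", "2017-04-01")      # TRUE
in_correction_window("2017-05-20", "2017-04-01")  # FALSE
```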

Recounted data from facility source documents.

A specialized team of two trained ABC data collectors visited all HFs under study and recounted the same data in the standardized HF source documents for the same reporting periods. ABC baseline data (April-June 2017 for the 48 HFs in Gakenke and Rulindo districts and July-September 2017 for the 28 HFs in Gisagara and Rusizi) were collected during July 31-September 19, 2017 and November 14, 2017-January 11, 2018, respectively. Because gestational age (GA) was recorded in the ANC register in either weeks or months, depending on the facility and care provider, ABC data collectors worked with the midwives or nurses responsible for providing ANC to standardize the calculation of GA in weeks before recounting data on ANC1 and ANC4 standard visits, as HMIS reporting on these data elements is based on GA calculated in weeks. The data collection team used a pregnancy wheel together with the recorded date of the last menstrual period and the dates of ANC visits for individual women who attended ANC to determine GA at each visit. For all data elements, data collection was done in consultation with the health facility staff responsible for routine reporting of data into the HMIS.
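As a concrete illustration of this GA standardization (not the study's actual procedure or code), the R sketch below derives GA in completed weeks from a recorded last menstrual period and a visit date and flags whether a first visit qualifies as an ANC1 standard visit, i.e., before 16 weeks of gestation. The function names and dates are hypothetical.

```r
# Illustrative sketch: derive gestational age (GA) in completed weeks from the
# recorded last menstrual period (LMP) and the ANC visit date, then flag whether
# the first visit meets the ANC1 standard-visit definition (GA < 16 weeks).
ga_weeks <- function(lmp_date, visit_date) {
  days <- as.numeric(difftime(as.Date(visit_date), as.Date(lmp_date), units = "days"))
  days %/% 7  # completed weeks
}

is_anc1_standard <- function(lmp_date, anc1_date) {
  ga_weeks(lmp_date, anc1_date) < 16
}

# Example: an LMP on 10 Jan 2017 and a first ANC visit on 20 Apr 2017 give a GA
# of 14 completed weeks, so the visit counts as an ANC1 standard visit.
ga_weeks("2017-01-10", "2017-04-20")          # 14
is_anc1_standard("2017-01-10", "2017-04-20")  # TRUE
```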

Data analysis and presentation

We used the WHO DQR guidelines on data verification and system assessment to calculate verification factors (VFs) for each data element [17]. A VF was defined as the ratio of register data to HMIS data (Eq 1). ABC baseline data were aggregated over the three-month reporting period, and HMIS and facility source document data were compared by data element and HF. At the HCA level, non-missing values for each data element were summed across all reporting HFs in that HCA during the study period, and the HCA-level VF was calculated as the ratio of the aggregated recounted data to the aggregated HMIS data. For data elements with rare events, a VF was calculated only at the HCA level, where aggregated data were compared to avoid denominators with true zero values that would be expected if these data were compared at the HF level. For each data element, we excluded from our analyses any HF that was not eligible to report on that data element or that had either incomplete HMIS data or missing source document data for any month during the reporting period.

VF = (data recounted from facility source documents) / (data reported to HMIS)  (Eq 1)

A VF of 1.00 indicated a perfect match between recounted data and HMIS data. Based on the WHO DQR guidelines, the acceptable margin of error for the discrepancy between HMIS report data and data recounted from facility registers was 0.90≤VF≤1.10. A VF<0.90 or VF>1.10 indicated over-reporting and under-reporting in HMIS data, respectively. We used Stata v.15.1 (Stata Corp, College Station, TX, USA) and the R Language and Environment for Statistical Computing [40] for analysis and visual presentation of data.
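To illustrate how Eq (1) and the WHO thresholds translate into the facility- and HCA-level results reported below, the following R sketch applies them to hypothetical counts. The data frame, facility names and values are made up for the example (this is not the study's analysis code), and it assumes the dplyr and tibble packages.

```r
# Illustrative sketch (hypothetical data, not the study's code): compute the
# verification factor (VF) of Eq (1) as recounted register data / HMIS data,
# classify it against the WHO DQR thresholds, and aggregate to the hospital
# catchment area (HCA) level by summing non-missing values before dividing.
library(dplyr)

classify_vf <- function(vf) {
  case_when(
    vf < 0.90 ~ "over-reported in HMIS",
    vf > 1.10 ~ "under-reported in HMIS",
    TRUE      ~ "acceptable"
  )
}

facility_data <- tibble::tibble(
  hca          = c("HCA 1", "HCA 1", "HCA 2"),
  facility     = c("HF A", "HF B", "HF C"),
  data_element = "ANC4 standard visit",
  recounted    = c(40, 55, 30),  # three-month totals recounted from registers
  hmis         = c(50, 56, 29)   # three-month totals reported to HMIS
)

# Facility-level VF and classification
facility_vf <- facility_data %>%
  mutate(vf = recounted / hmis, status = classify_vf(vf))

# HCA-level VF: aggregate non-missing counts across facilities, then take the ratio
hca_vf <- facility_data %>%
  group_by(hca, data_element) %>%
  summarise(vf = sum(recounted, na.rm = TRUE) / sum(hmis, na.rm = TRUE),
            .groups = "drop") %>%
  mutate(status = classify_vf(vf))
```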

Ethics

The ABC scale-up project received approval from the Rwanda MoH to access HMIS data for the project’s indicators for all HFs that received the intervention. This study was approved by the Rwanda National Ethics Committee (Kigali, Rwanda, protocol #0067/RNEC/2017). As this study was completed using de-identified, routinely collected aggregate data, informed consent was not required.

Results

The proportion of HFs with all data sources available for the three-month period was lowest for iron/folic acid (87.0%; 60/69), while the proportion of facilities with complete reporting in HMIS was between 90.8% and 100% for all other data elements and was lowest for the first postnatal care visit (PNC1) (85.5%; 65/76) (Table 2). A facility-level HMIS data VF was calculated for eight data elements (Table 2 and Fig 1). A high proportion of HFs achieved acceptable VFs for the following data elements: the number of deliveries (98.7%; 75/76), ANC1 new registrants (95.7%; 66/69), live births (94.7%; 72/76), and newborns who received a PNC1 visit within 24 hours (81.5%; 53/65). The median VF was 1.00 (interquartile range [IQR]: 0.99–1.00) for HMIS data on deliveries, 1.00 (IQR: 1.00–1.00) for ANC1 data, 1.00 (IQR: 0.99–1.00) for HMIS data on live births, and 1.00 (IQR: 0.97–1.02) for PNC1 data.

Fig 1. Facility-level verification factors (VF) by data element.

The median is not visible when it is very close to or equal to 1.

https://doi.org/10.1371/journal.pone.0235823.g001

Table 2. Facility-level verification factors (VF) by data element.

https://doi.org/10.1371/journal.pone.0235823.t002

The proportion of HFs with acceptable VFs was lower for the number of women who received iron/folic acid (78.3%; 47/60) and the number of women tested for syphilis at ANC1 (67.6%; 46/68). The median VF was 1.00 (IQR: 0.99–1.00) for iron/folic acid distribution and 1.00 (IQR: 0.89–1.00) for syphilis testing. Among HFs with a VF outside the acceptable range, 9 of 13 (69.2%) over-reported on iron/folic acid distribution, while 18 of 22 (81.8%) over-reported on syphilis testing at ANC1.

The indicators for which the lowest proportion of HFs obtained acceptable VFs were the number of women with an ANC1 standard visit (at less than 16 weeks’ gestational age (GA)) (25.0%; 17/68) and the number of women who completed 4 standard ANC visits (ANC4 standard: ANC1 at less than 16 weeks GA, ANC2 at 24–28 weeks, ANC3 at 28–32 weeks and ANC4 at 36–38 weeks) (17.4%; 12/69). The median VF was 0.85 (IQR: 0.67–0.99) for ANC1 standard visit data and 0.75 (IQR: 0.56–0.88) for ANC4 standard visit data. Most HFs had VFs outside the acceptable range for ANC1 standard (75.0%; 51/68) and ANC4 standard (82.6%; 57/69) visits; among these, over-reporting accounted for 86.3% (44/51) for ANC1 and 93.0% (53/57) for ANC4 standard visits.

When data were aggregated at the HCA level (Table 3), all 7 HCAs achieved acceptable VFs for HMIS data on the number of ANC1 registrants, deliveries, live births and newborns who received PNC1 within 24 hours of birth. Five and four of the seven HCAs achieved acceptable VFs for HMIS data on the number of pregnant women who received iron/folic acid and who were tested for syphilis at ANC1, respectively. Two of the seven HCAs achieved acceptable VFs for HMIS data on the number of women with an ANC1 standard visit. For the number of women with an ANC4 standard visit, no HCA obtained an acceptable VF (VFs ranged from 0.59 to 0.86), and all HCAs over-reported in HMIS.

Table 3. Hospital catchment-level verification factor (VF) for aggregated data by data element; a VF >1.10 or <0.90 is highlighted.

https://doi.org/10.1371/journal.pone.0235823.t003

A VF was calculated at the HCA level only for the six HMIS data elements that involved rare events or that concerned a service provided only at the hospital level (Table 4). Six of the seven hospitals achieved acceptable VFs for the number of admissions to the hospital NCU, and 5 of 7 obtained an acceptable VF for the number of neonatal deaths in the NCU. The majority (5/7) of sites also achieved acceptable VFs for the number of stillbirths and the number of babies with low birth weight. Fewer than half (3/7) of sites obtained acceptable VFs for the number of live newborns who needed resuscitation and were resuscitated successfully. Only 2 of 7 sites achieved an acceptable VF for the number of live newborns who did not cry at birth, and the majority (4/7) of sites under-reported on this data element.

Table 4. Hospital catchment-level verification factors (VF) for data elements with rare events; a VF >1.10 or <0.90 is highlighted.

https://doi.org/10.1371/journal.pone.0235823.t004

Discussion

In this study, we assessed the quality of Rwanda HMIS data, measured as the level of agreement between HMIS data and records in facility source documents, using data from 76 public HFs in Northern, Southern and Western Rwanda. Fourteen HMIS data elements were selected for this study, given their importance in identifying gaps and monitoring progress toward improving maternal and newborn health and reducing preventable neonatal deaths in Rwanda. Our findings indicate several strengths as well as variation in the quality of HMIS data by type of data element, which is consistent with other HMIS data quality assessment studies in Rwanda and other sub-Saharan African countries [27,41].

Notably, this verification showed a high level of agreement between data reported to the HMIS and records in facility source documents for the number of ANC1 registrants, deliveries and live births. These data elements are among the WHO-recommended core indicators for DQR of HMIS data on maternal health [17]. The quality of Rwanda HMIS data on these data elements is higher than that found in HMIS data verifications for the same data in Ethiopia [41] and Nigeria [42], and similar for the ANC1 indicator in Malawi [22]. This verification of Rwanda HMIS data also showed similar patterns of data quality at the facility level and when HFs were grouped by HCA. This finding may indicate that challenges to accurate HMIS reporting were common across facilities, regardless of geographical location. It differs from other studies, which found that the quality of routinely collected health data in Africa varied by the geographical location of reporting HFs [18,43]. The consistent level of HMIS data quality across Rwanda may result from HFs’ adherence to the Rwanda MoH’s existing standard operating procedures for high-quality HMIS data [25] and from the performance-based financing system, which includes regular review of facility records with a specific focus on maternal and child health [44].

However, the quality of HMIS data on the number of ANC1 and ANC4 standard visits was poor, with a general trend of over-reporting. A systematic review of immunization data quality identified insufficient human resources and limited healthcare worker capacity for reporting and using data as key issues contributing to poor data quality [23]. The accuracy of reporting on these data elements may depend on the health care provider’s knowledge of how to calculate gestational age in weeks and how to correctly schedule ANC standard visits, as well as on the availability of tools, mainly pregnancy wheels, that facilitate these calculations. The study data collection team observed that data in registers were recorded in ways that were not compatible with HMIS reporting; for example, gestational age at the first visit was recorded in the register in months but reported to HMIS based on cutoffs in weeks, which may contribute to errors when transferring information from source documents into the HMIS. In addition, an analysis of ABC baseline data on the availability of essential medical equipment and supplies at facilities that received the ABC intervention revealed that only 44.9% (31/69) of health centers had a pregnancy wheel in the pre-intervention period. The low quality of HMIS data on these data elements may also be explained by the findings of a recent study on the quality of ANC service provision in HFs in 13 sub-Saharan African countries, including Rwanda [45]. Using data from Service Provision Assessments (SPA) and Service Availability and Readiness Assessments (SARA) surveys, that study found that the proportion of HFs providing ANC services with ANC-trained staff was less than half in 6 of the 13 countries, and that the median proportion of HFs with ANC guidelines was 62.3%. In Rwanda in particular, only 31.2% of 432 facilities providing ANC services had ANC guidelines, and only 79.8% of these facilities had staff trained in ANC [45]. This may particularly explain the observed over-reporting of the number of women who completed four ANC standard visits, which requires knowing the standard ANC visit schedule to determine whether women completed the four visits at the correct times, rather than simply reporting the number of women who attended four visits at any time.

Further, the over-reporting of key quality-of-care elements, such as iron/folic acid supplementation and syphilis testing in ANC, is concerning. In addition to the challenges of correctly reporting ANC coverage identified in this study, these data elements [46] are included to help understand the content and quality of the ANC visits that women receive. These are critical interventions for the prevention of stillbirths, genetic abnormalities, and poor neonatal outcomes [47,48]. Over-reporting masks an important problem that has major implications for the health of women and newborns.

Study limitations

First, this study included only 14 Rwanda HMIS data elements related to maternal and newborn care and only HFs selected to receive the ABC scale-up intervention, and this non-random sample might not be representative of all facilities in Rwanda. However, given that these facilities were located in different parts of the country and that the results show similar variations in data quality by HMIS data element across all geographical locations, we are confident that the findings can help in understanding the level of agreement between HMIS data and facility source document records for the data elements considered. We also believe that these findings will be useful for verification of additional HMIS data elements or for verification on a larger scale in Rwanda. Second, this study assessed the quality of HMIS data only in terms of concordance between HMIS and facility source document data. The true accuracy of the source documents is not known and is a critical component of data quality that requires further evaluation.

Conclusions

The findings of this study suggest that HMIS data quality varies by data element, with similar patterns of reporting accuracy irrespective of a health facility’s geographical location. Reporting to the HMIS was less accurate for some data elements, particularly those that are more complex to generate. This challenge to accurate reporting by HFs has implications for decision-making on key interventions affecting maternal and newborn outcomes. Ongoing regular data quality assessments, promoting the use of HMIS data for quality improvement in health care delivery at the facility level, and training to address gaps could help improve HMIS data for use in program evaluations.

Acknowledgments

We acknowledge the contributions of Robert M. Gatsinzi and Ibrahim Hakizimana from the monitoring and evaluation team for data collection in health facility source documents. We are also grateful to the leadership, nurses and midwives of the health facilities involved in this study for their critical support in facilitating data collection.

References

  1. MEASURE Evaluation. Using DHIS 2 to Strengthen Health Systems. Chapel Hill, NC: MEASURE Evaluation, University of North Carolina; 2017. Available: https://www.measureevaluation.org/resources/publications/fs-17-212
  2. University of Oslo. DHIS2 factsheet. 2018. Available: https://s3-eu-west-1.amazonaws.com/content.dhis2.org/general/dhis-factsheet.pdf
  3. AbouZahr C, Boerma T. Health information systems: the foundations of public health. Bulletin of the World Health Organization. 2005;83: 578–583. pmid:16184276
  4. Nyamtema AS. Bridging the gaps in the Health Management Information System in the context of a changing health sector. BMC Medical Informatics & Decision Making. 2010;10.
  5. Wickremasinghe D, Hashmi IE, Schellenberg J, Avan BI. District decision-making for health in low-income settings: a systematic literature review. Health Policy and Planning. 2016;31. pmid:27591202
  6. Ahanhanzo YG, Ouedraogo LT, Kpozehouen A, Coppieters Y, Makoutode M, Wilmet-Dramaix M. Factors associated with data quality in the routine health information system of Benin. Archives of Public Health. 2014;72.
  7. Foster M, Bailey C, Brinkhof MW, Graber C, Boulle A, Spohr M, et al. Electronic medical record systems, data quality and loss to follow-up: survey of antiretroviral therapy programs in resource-limited settings. Bulletin of the World Health Organization. 2008;86: 939–947. pmid:19142294
  8. Garrib A, Stoops N, McKenzie A, Dlamini L, Govender T, Rohde J, et al. An evaluation of the District Health Information System in rural South Africa. South African Medical Journal. 2008;98: 549–522. pmid:18785397
  9. Makombe SD, Hochgesang M, Jahn A, Tweya H, Hedt B, Chuka S, et al. Assessing the quality of data aggregated by antiretroviral treatment clinics in Malawi. Bulletin of the World Health Organization. 2008;86: 310–314. pmid:18438520
  10. Maokola W, Willey BA, Shirima K, Chemba M, Armstrong Schellenberg JRM, Mshinda H, et al. Enhancing the routine health information system in rural southern Tanzania: successes, challenges and lessons learned. Tropical Medicine and International Health. 2011;16: 721–730. pmid:21395928
  11. Mate KS, Bennett B, Mphatswe W, Barker P, Rollins N. Challenges for Routine Health System Data Management in a Large Public Programme to Prevent Mother-to-Child HIV Transmission in South Africa. PLoS ONE. 2009;4. pmid:19434234
  12. Mavimbe JC, Braa J, Bjune G. Assessing immunization data quality from routine reports in Mozambique. BMC Public Health. 2005;5. pmid:16219104
  13. Maïga A, Jiwani SS, Mutua MK, Porth TA, Taylor CM, Asiki G, et al. Generating statistics from health facility data: the state of routine health information systems in Eastern and Southern Africa. BMJ Global Health. 2019;4: e001849. pmid:31637032
  14. Mutale W, Chintu N, Amoroso C, Awoonor-Williams K, Phillips J, Baynes C, et al. Improving health information systems for decision making across five sub-Saharan African countries: Implementation strategies from the African Health Initiative. BMC Health Services Research. 2013;13. pmid:23819699
  15. Mphatswe W, Mate K, Bennett B, Ngidi H, Reddy J, Barker P, et al. Improving public health information: a data quality intervention in KwaZulu-Natal, South Africa. Bulletin of the World Health Organization. 2012;90: 176–182. pmid:22461712
  16. World Health Organization. Data quality review: a toolkit for facility data quality assessment. Module 2. Desk review of data quality. 2017.
  17. World Health Organization. Data quality review: a toolkit for facility data quality assessment. Module 3. Data verification and system assessment. Geneva: World Health Organization; 2017.
  18. Ouedraogo M, Kurji J, Abebe L, Labonté R, Morankar S, Bedru KH, et al. A quality assessment of Health Management Information System (HMIS) data for maternal and child health in Jimma Zone, Ethiopia. Ginsberg SD, editor. PLOS ONE. 2019;14: e0213600. pmid:30856239
  19. Bosch-Capblanch X, Ronveaux O, Doyle V, Remedios V, Bchir A. Accuracy and quality of immunization information systems in forty-one low income countries. Tropical Medicine and International Health. 2009;14: 2–10. pmid:19152556
  20. Ronveaux O, Rickert D, Hadler S, Groom H, Lloyd J, Bchir A, et al. The immunization data quality audit: verifying the quality and consistency of immunization monitoring systems. Bulletin of the World Health Organization. 2005;83: 503–510. pmid:16175824
  21. Cambodia. Assessment of health facility data quality: Data quality report card. Cambodia; 2012. Available: http://www.who.int/healthinfo/KH_DataQualityReportCard_2012.pdf
  22. O’Hagan R, Marx MA, Finnegan KE, Naphini P, Ng’ambi K, Laija K, et al. National Assessment of Data Quality and Associated Systems-Level Factors in Malawi. Global Health: Science and Practice. 2017;5.
  23. Wetherill O, Lee C, Dietz V. Root Causes of Poor Immunisation Data Quality and Proven Interventions: A Systematic Literature Review. 2017;2: 7.
  24. Venkateswaran M, Mørkrid K, Abu Khader K, Awwad T, Friberg IK, Ghanem B, et al. Comparing individual-level clinical data from antenatal records with routine health information systems indicators for antenatal care in the West Bank: A cross-sectional study. Agyepong I, editor. PLOS ONE. 2018;13: e0207813. pmid:30481201
  25. Republic of Rwanda. Data Quality Assessment Procedures Manual. Ministry of Health; 2016.
  26. Nisingizwe MP, Iyer HS, Gashayija M, Hirschhorn LR, Amoroso C, Wilson R, et al. Toward utilization of data for program management and evaluation: quality assessment of five years of health management information system data in Rwanda. Global Health Action. 2014;7. pmid:25413722
  27. Mitsunaga T, Hedt-Gauthier BL, Ngizwenayo E, Bertrand Farmer D, Gaju E, Drobac P, et al. Data for Program Management: An Accuracy Assessment of Data Collected in Household Registers by Community Health Workers in Southern Kayonza, Rwanda. Journal of Community Health. 2015;40: 625–632. pmid:25502593
  28. Karengera I, Onzima RAD, Katongole S-P, Govule P. Quality and Use of Routine Healthcare Data in Selected Districts of Eastern Province of Rwanda. International Journal of Public Health Research. 2016;4: 5–13.
  29. Brandrud AS, Schreiner A, Hjortdahl P, Helljesen GS, Nyen B, Nelson EC. Three success factors for continual improvement in healthcare: an analysis of the reports of improvement team members. BMJ Quality and Safety. 2011;20: 251–259. pmid:21209149
  30. Tunçalp Ö, Were W, MacLennan C, Oladapo O, Gülmezoglu A, Bahl R, et al. Quality of care for pregnant women and newborns-the WHO vision. BJOG. 2015.
  31. Iyer HS, Hirschhorn LR, Nisingizwe MP, Kamanzi E, Drobac PC, Rwabukwisi FC, et al. Impact of a district-wide health center strengthening intervention on healthcare utilization in rural Rwanda: Use of interrupted time series analysis. PLoS ONE. 2017;12. pmid:28763505
  32. National Institute of Statistics of Rwanda (NISR) [Rwanda], Ministry of Health (MOH) [Rwanda], ICF International. Rwanda Demographic and Health Survey 2014–15. Rockville, Maryland, USA: NISR, MOH and ICF International; 2015.
  33. Day LT, Ruysen H, Gordeev VS, Gore-Langton GR, Boggs D, Cousens S, et al. “Every Newborn-BIRTH” protocol: observational study validating indicators for coverage and quality of maternal and newborn health care in Bangladesh, Nepal and Tanzania. Journal of Global Health. 2019;9.
  34. Magge H, Chilengi R, Jackson EF, Wagenaar BH, Kante AM, AHI PHIT Partnership Collaborative. Tackling the hard problems: implementation experience and lessons learned in newborn health from the African Health Initiative. BMC Health Services Research. 2017;17. pmid:29297352
  35. World Health Organization. Standards for improving quality of maternal and newborn care in health facilities. Geneva: World Health Organization; 2016.
  36. Rwanda Ministry of Health. Fourth Health Sector Strategic Plan. Kigali, Rwanda: Republic of Rwanda; 2018.
  37. Rwanda Ministry of Health. Health Service Packages for Public Health Facilities. 2017.
  38. Rwanda Ministry of Health. Standard Operating Procedures for Management of Routine Health Information at Referral/Provincial and District Hospitals (Public and Privates). Rwanda Ministry of Health; 2019.
  39. Rwanda Ministry of Health. Standard Operating Procedures for Management of Routine Health Information at Health Centers/Posts/Private Health Facilities. Rwanda Ministry of Health; 2019.
  40. R Core Team. R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing; 2017. Available: https://www.R-project.org/
  41. Ethiopian Public Health Institute, Federal Ministry of Health, World Health Organization. Ethiopia Health Data Quality Review: System Assessment and Data Verification for Selected Indicators. Addis Ababa, Ethiopia: Ethiopian Public Health Institute; 2016.
  42. Bhattacharya AA, Umar N, Audu A, Felix H, Allen E, Schellenberg JRM, et al. Quality of routine facility data for monitoring priority maternal and newborn indicators in DHIS2: A case study from Gombe State, Nigeria. Bazzano AN, editor. PLOS ONE. 2019;14: e0211265. pmid:30682130
  43. Nicol E, Dudley L, Bradshaw D. Assessing the quality of routine data for the prevention of mother-to-child transmission of HIV: An analytical observational study in two health districts with high HIV prevalence in South Africa. International Journal of Medical Informatics. 2016;95: 60–70. pmid:27697233
  44. Gergen J, Josephson E, Coe M, Ski S, Madhavan S, Bauhoff S. Quality of Care in Performance-Based Financing: How It Is Incorporated in 32 Programs Across 28 Countries. Global Health: Science and Practice. 2017;5: 90–107. pmid:28298338
  45. Kanyangarara M, Munos MK, Walker N. Quality of antenatal care service provision in health facilities across sub-Saharan Africa: Evidence from nationally representative health facility assessments. Journal of Global Health. 2017;7. pmid:29163936
  46. Hodgins S, D’Agostino A. The quality-coverage gap in antenatal care: toward better measurement of effective coverage. Global Health: Science and Practice. 2014;2. pmid:25276575
  47. Newman L, Kamb M, Hawkes S, Gomez G, Say L, Seuc A, et al. Global Estimates of Syphilis in Pregnancy and Associated Adverse Outcomes: Analysis of Multinational Antenatal Surveillance Data. PLoS Medicine. 2013;10. pmid:23468598
  48. Black RE, Victora CG, Walker SP, Bhutta ZA, Christian P, de Onis M, et al. Maternal and child undernutrition and overweight in low-income and middle-income countries. Lancet. 2013; 427–51.