Health management information system (HMIS) data verification: A case study in four districts in Rwanda

Introduction Reliable Health Management Information System (HMIS) data can be used with minimal cost to identify areas for improvement and to measure the impact of healthcare delivery. However, variable HMIS data quality in low- and middle-income countries limits its value in monitoring, evaluation and research. We aimed to review the quality of Rwandan HMIS data for maternal and newborn health (MNH) based on the consistency of HMIS reports with facility source documents. Methods We conducted a cross-sectional study in 76 health facilities (HFs) in four Rwandan districts. For 14 MNH data elements, we compared HMIS data to facility register data recounted by study staff for a three-month period in 2017. A HF was excluded from a specific comparison if the service was not offered, source documents were unavailable, or at least one HMIS report was missing for the study period. World Health Organization guidelines on HMIS data verification were used: a verification factor (VF) was defined as the ratio of register data to HMIS data. A VF<0.90 or VF>1.10 indicated over- and under-reporting in HMIS, respectively. Results High proportions of HFs achieved acceptable VFs for data on the number of deliveries (98.7%; 75/76), antenatal care (ANC1) new registrants (95.7%; 66/69), live births (94.7%; 72/76), and newborns who received first postnatal care within 24 hours (81.5%; 53/65). This was slightly lower for the number of women who received iron/folic acid (78.3%; 47/60) and who were tested for syphilis at ANC1 (67.6%; 46/68), and was lowest for the number of women with an ANC1 standard visit (25.0%; 17/68) and a fourth standard visit (ANC4) (17.4%; 12/69). The majority of HFs over-reported on ANC4 (76.8%; 53/69) and ANC1 (64.7%; 44/68) standard visits. Conclusion HMIS data quality varied by data element, with some indicators of high quality, and reporting trends were consistent across districts.
Over-reporting was observed for ANC-related data requiring more complex calculations, i.e., knowledge of gestational age and of visit scheduling to determine ANC standard visits, as well as for quality indicators in ANC. Ongoing data quality assessments and training to address gaps could help improve HMIS data quality.


Introduction

National health management information systems (HMIS) have been established in many low- and middle-income countries (LMICs) for routine collection and management of facility-based data on health care service delivery [1,2]. When the data are of good quality, they can be used, with little to no cost, to identify areas that need improvement, to evaluate various health interventions, to inform evidence-based health policies, and to design programs and allocate resources at all levels of the health system [3-5]. However, there is evidence of variable quality of HMIS data [6-13], limiting its value in monitoring, evaluation and research in LMICs. Regular data quality assessment is one of the strategies that can be used to improve the quality of HMIS data in LMICs [14,15]. The World Health Organization (WHO) provides guidelines on data quality review (DQR) through a desk review of data previously reported to the HMIS [16] and through verification of HMIS data quality via a facility survey [17]. The desk review assesses HMIS data quality in four dimensions: 1) completeness and timeliness of data, 2) internal consistency of reported data, 3) external consistency, and 4) external comparisons with population data. Detailed information on definitions and requirements for the calculation of these dimensions, as well as the application of HMIS data quality assessment through the desk review, can be reviewed elsewhere [13,16,18].
The WHO toolkit for data quality review defines a verification factor (VF) as the ratio of recounted data from facility source documents to HMIS data [17]. The application of the VF has been more limited, and studies have used variable definitions. However, there is evidence that the level of agreement between HMIS data and records in facility source documents can vary depending on the type of data being collected, and that discrepancies can be rooted in the early stages of collecting those data from facility source documents [11,19-24]. Over- or under-reporting in HMIS data can result from human errors that occur when counting events from source documents, or simply from not including all events for the reporting period, either by omitting some of the necessary source documents or by not covering the entire reporting period [11,24]. In addition, intentional over- or under-reporting in HMIS data at the facility level can be motivated by pressure to meet national targets, whereas inaccuracies in transferring data from facility source documents to the electronic database can also be associated with the excessive workload of staff combined with pressure to meet reporting deadlines [23].
The Rwanda HMIS was established in 1998. In 2012, with the goal of improving the quality of routinely collected health data from community health workers (CHWs) and all HFs across the country, the Rwanda Ministry of Health (MoH) upgraded the HMIS to a web-based system known as the District Health Information System version 2 (DHIS2) [25]. Recent assessments of Rwanda HMIS data quality using WHO guidelines have been limited to a desk review of available data in the HMIS [13,26]. These assessments revealed that the quality of Rwanda HMIS data is high regarding completeness and internal consistency of reported data for the studied HMIS indicators [26]. However, findings from two small-scale studies that compared HMIS data and records in source documents, using a reporting accuracy definition different from the WHO data verification definition, suggest a variable level of agreement between HMIS data and records in source documents [27,28]. One of these two studies, which defined accuracy of reporting in HMIS data as a deviation of 5% or less between HMIS data and facility source document data on family planning, antenatal care and delivery, found an overall reporting accuracy of 70.6% for 37 HFs sampled in three districts in the Eastern Province of Rwanda [28].
Good quality HMIS data can play an important role in identifying areas that need improvement, monitoring progress, and evaluating interventions that are designed to address those gaps [29-31]. Rwanda experienced remarkable progress in reducing mortality among children aged under five in the past two decades; however, the reduction in neonatal deaths (those occurring within 28 days of birth and mainly in health facilities) has been slow [32]. Good quality HMIS data on maternal and newborn healthcare are needed to identify gaps in existing facility-based care, to design appropriate interventions and to monitor progress [33]. Since 2013, the "All Babies Count" (ABC) intervention has been implemented to accelerate the reduction in preventable neonatal deaths in Rwanda [34]. The ABC intervention focuses on improved coverage and quality of antenatal, maternity and postnatal care services. ABC is implemented through a joint collaboration of Partners In Health/Inshuti Mu Buzima (PIH/IMB), an international non-profit organization, and the Rwanda MoH. However, the evaluation of the impact of these programs has been costly, with parallel collection of data on program indicators through HMIS and through review of facility source documents, given concerns about the poor quality of HMIS data [34]. Therefore, this study uses the WHO guidelines for HMIS data verification to provide evidence on the level of agreement between Rwanda HMIS data and records in source documents of reporting HFs. We calculate VFs for fourteen HMIS data elements that were identified jointly by PIH/IMB and the MoH as priority indicators for quality improvement in maternal and newborn health care, for 76 HFs (7 hospitals and 69 health centers) that received the ABC intervention between 2017 and 2019 in four districts of Rwanda.
The criteria for indicator selection included clinical relevance to neonatal survival, status as government priority indicators, and/or inclusion within the WHO standards for improving quality of maternal and newborn care in health facilities [35].

Study design
This was a cross-sectional study conducted to assess the quality of Rwanda HMIS data, measured as agreement between HMIS and facility source document data, for 76 HFs on 14 HMIS data elements related to maternal and newborn health that were used to monitor quality and progress and to inform quality improvement efforts through the ABC intervention (Table 1). The antenatal care (ANC) register was the source document for 5 data elements that were only reported by health centers (HCs), while the maternity and postnatal care (PNC) registers were the source documents for 7 data elements that were reported by both HCs and hospitals. Two data elements related to neonatal admissions and deaths were recounted from the neonatology care unit (NCU) register and were only reported by hospitals.
The ABC project was originally implemented in 2013 by Partners In Health/Inshuti Mu Buzima (PIH/IMB) in Kayonza and Kirehe districts in Eastern Rwanda, in partnership with the Rwanda MoH [34]. Later, ABC was scaled up to Gakenke and Rulindo (July 2017) and then to Gisagara and Rusizi (October 2017). The ABC scale-up project used health facility-based data, collected monthly through the Rwanda HMIS, to monitor indicator progress from baseline to the end of the project and to evaluate its impact. In addition, the project underwent the HMIS data verification process by recounting the same data in standardized HF registers for the same data elements and reporting periods. Five data elements related to antenatal care (ANC) services (ANC new registrants, syphilis testing and iron/folic acid distribution at the first ANC visit, and the number of women with first and fourth ANC standard visits) were only reported by health centers, whereas two data elements on the number of neonatal admissions and neonatal deaths in the hospital neonatology care unit were specific to hospitals [37]. Data are recorded using standardized registers developed by the MoH and provided to all HFs (see Table 1 for data sources); women attending ANC are recorded at their first ANC visit and provided an ANC card and an ANC number that facilitates continuity of data recording at the individual level for ANC. This study reports data from the baseline period prior to the ABC intervention in the 7 hospital catchment areas (HCAs).
Sources of data

HMIS data. The ABC team worked with the MoH national HMIS team to extract HMIS report data for the study periods. HMIS data collection starts at the reporting facility, with clinical staff in each care service registering patients/clients and the care provided to them in standardized registers and/or medical files [38,39]. Then, for monthly reporting to HMIS, the facility data manager ensures the distribution of paper HMIS reporting forms to heads of services by the 25th of each month. Each head of service collects the data relevant to their specific service and submits a completed HMIS report for the previous month to the facility data manager by the 3rd day of the following month. For timely reporting, the facility data manager should upload all facility data into DHIS2 by the 5th day of every month. Data verification by the facility team and corrections in the system are only allowed between the 5th and 15th of each month. Any request for changes to the data in the system beyond the 15th of each month must be submitted to the central MoH, and access is only granted upon strong justification of the request.
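The monthly reporting cycle described above can be expressed as a simple date check. This is an illustrative sketch of the rules as described in this study, not an MoH tool; the function name and return strings are hypothetical:

```python
from datetime import date

def reporting_status(today: date) -> str:
    """Classify a day within the monthly HMIS cycle described for Rwanda:
    facility upload is due by the 5th, facility-level corrections are allowed
    from the 5th through the 15th, and changes after the 15th require
    central MoH approval."""
    if today.day <= 5:
        return "upload window (facility data manager uploads to DHIS2)"
    if today.day <= 15:
        return "correction window (facility team may verify and correct)"
    return "locked (changes require central MoH approval)"

# Hypothetical dates within one reporting month
print(reporting_status(date(2017, 8, 4)))   # upload window
print(reporting_status(date(2017, 8, 10)))  # correction window
print(reporting_status(date(2017, 8, 20)))  # locked
```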
Recounted data from facility source documents. A specialized team of two trained ABC data collectors visited all HFs under study and recounted the same data in the standardized HF source documents for the same reporting periods. ABC baseline data (April-June 2017 for 48 HFs in Gakenke and Rulindo districts, and July-September 2017 for 28 HFs in Gisagara and Rusizi) were collected during the periods July 31-September 19, 2017 and November 14, 2017-January 11, 2018, respectively. In particular, because the unit used to record gestational age (GA) in the ANC register (weeks or months) varied by facility and care provider, ABC data collectors worked with the midwives or nurses responsible for providing ANC to standardize the calculation of GA in weeks before recounting data on ANC1 and ANC4 standard visits, as reporting to HMIS on these data elements is based on GA calculated in weeks. The data collection team used a pregnancy wheel and the recorded date of the last menstrual period and dates of ANC visits for individual women who attended ANC to determine GA at each visit. For all data elements, data collection was done in consultation with the health facility staff responsible for routine reporting of data into HMIS.
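The GA standardization described above amounts to converting the interval between the last menstrual period and the visit date into completed weeks, as a pregnancy wheel does. A minimal sketch under that assumption; the dates and function name are hypothetical:

```python
from datetime import date

def gestational_age_weeks(lmp: date, visit: date) -> int:
    """Completed weeks of gestation at a visit, counted from the
    date of the last menstrual period (LMP)."""
    return (visit - lmp).days // 7

# Hypothetical example: LMP on 2017-01-10, ANC visit on 2017-04-04
ga = gestational_age_weeks(date(2017, 1, 10), date(2017, 4, 4))
print(ga)  # 12 completed weeks
```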

Data analysis and presentation
We used the WHO DQR guidelines on data verification and system assessment to calculate verification factors (VFs) for each data element [17]. A VF was defined as the ratio of register data to HMIS data (Eq 1). ABC baseline data were aggregated over the three-month reporting period, and HMIS and facility source document data were compared by data element and HF. At the HCA level, a VF was calculated by summing all the non-missing values for each data element across all the reporting HFs under that HCA during the study period; the HCA-level VF was then the ratio of the aggregated recounted data to the aggregated HMIS data. For data elements with rare events, a VF was calculated only at the HCA level, where aggregated data were compared, to avoid denominators with true zero values that would be expected if these data were compared at the HF level. For each data element, we excluded from our analyses any HF that was not eligible for reporting on that data element or that had either incomplete HMIS data or missing source document data for any month during the reporting period.

VF = (recounted number of events from facility source documents) / (reported number of events from the HMIS)    (Eq 1)
A VF of 1.00 indicated a perfect match between recounted data and HMIS data. The acceptable margin of error for the discrepancy between HMIS report data and recounted data in facility registers was 0.90 ≤ VF ≤ 1.10, based on the WHO DQR guidelines. A VF<0.90 or VF>1.10 indicated over-reporting and under-reporting in HMIS data, respectively. We used Stata v.15.1 (Stata Corp, College Station, TX, USA) and the R Language and Environment for Statistical Computing for analysis and visual presentation of data [40].
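The verification logic above, Eq 1 plus the WHO thresholds and the HCA-level aggregation, can be sketched in a few lines. This is a minimal illustration, not the study's actual analysis code (which used Stata and R); the facility names and counts are hypothetical:

```python
def verification_factor(recounted: int, reported: int) -> float:
    """VF = recounted events from source documents / events reported to HMIS (Eq 1)."""
    return recounted / reported

def classify(vf: float) -> str:
    """Apply the WHO DQR thresholds: 0.90 <= VF <= 1.10 is acceptable;
    VF < 0.90 means HMIS over-reported; VF > 1.10 means HMIS under-reported."""
    if vf < 0.90:
        return "over-reported"
    if vf > 1.10:
        return "under-reported"
    return "acceptable"

# Hypothetical three-month totals for one data element at three HCs in one HCA
facilities = {
    "HC-A": {"recounted": 80, "hmis": 100},   # VF 0.80 -> over-reported
    "HC-B": {"recounted": 98, "hmis": 100},   # VF 0.98 -> acceptable
    "HC-C": {"recounted": 115, "hmis": 100},  # VF 1.15 -> under-reported
}
for name, d in facilities.items():
    vf = verification_factor(d["recounted"], d["hmis"])
    print(name, round(vf, 2), classify(vf))

# HCA-level VF: sum the non-missing values across all reporting HFs, then take the ratio
hca_vf = (sum(d["recounted"] for d in facilities.values())
          / sum(d["hmis"] for d in facilities.values()))
print("HCA-level VF:", round(hca_vf, 2))  # 293/300 -> 0.98, acceptable
```

Note that, as in the study's rare-event analysis, the HCA-level ratio of sums can fall in the acceptable range even when individual facilities over- and under-report, which is why both levels are reported.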

Ethics
The ABC scale-up project received approval from the Rwanda MoH to access HMIS data for the project's indicators for all HFs that received the intervention. This study was approved by the Rwanda National Ethics Committee (Kigali, Rwanda, protocol #0067/RNEC/2017). As this study was completed using de-identified, routinely collected aggregate data, informed consent was not required.
Results

The proportion of HFs with acceptable VFs was lower for the number of women who received iron/folic acid (78.3%; 47/60) and the number of women tested for syphilis at ANC1 (67.6%; 46/68). The median VF was 1.00 (IQR: 0.99-1.00) for iron/folic acid distribution and 1.00 (IQR: 0.89-1.00) for syphilis testing. Among HFs with a VF outside the acceptable range, 9 of 13 (69.2%) over-reported on iron/folic acid distribution, while 18 of 22 (81.8%) over-reported on syphilis testing at ANC1.
When data were aggregated at the HCA level (Table 3), all 7 HCAs achieved acceptable VFs for HMIS data on the number of ANC1 registrants, deliveries, live births and newborns who received PNC1 within 24 hours of birth. Five of seven and four of seven HCAs achieved acceptable VFs for HMIS data on the number of pregnant women who received iron/folic acid and who were tested for syphilis at ANC1, respectively. Two of seven HCAs achieved acceptable VFs for HMIS data on the number of women with an ANC1 standard visit. For the number of women with an ANC4 standard visit, none of the HCAs obtained an acceptable VF: with VFs ranging between 0.59 and 0.86, all HCAs over-reported to HMIS.
A VF was calculated at the HCA level only for the six HMIS data elements with rare events or concerning a service provided only at the hospital level (Table 4). Six of seven hospitals achieved acceptable VFs for the number of admissions to the hospital NCU, and 5 of 7 obtained an acceptable VF for the number of neonatal deaths in the NCU. The majority (5/7) of sites also achieved acceptable VFs for the number of stillbirths and the number of babies with low birth weight. Fewer than half (3/7) of sites obtained acceptable VFs for the number of live newborns who needed resuscitation and were resuscitated successfully. Only 2 of 7 sites achieved an acceptable VF for the number of live newborns who did not cry at birth, and the majority (4/7) of sites under-reported on this data element.

Discussion
In this study, we assessed the quality of Rwanda HMIS data, measured as the level of agreement between HMIS data and records in facility source documents, using data from 76 public HFs in Northern, Southern and Western Rwanda. Fourteen HMIS data elements were selected for this study, considering their importance in identifying gaps and monitoring progress towards the improvement of maternal and newborn health for reducing preventable neonatal deaths in Rwanda. Our findings suggested several strengths while also revealing variation in the quality of HMIS data by type of data element, which is consistent with other HMIS data quality assessment studies in Rwanda and other Sub-Saharan African countries [27,41].
Notably, this verification showed a high level of agreement between data reported to HMIS and records in facility source documents for the number of ANC1 registrants, deliveries and live births. These data elements are among the WHO-recommended core indicators for DQR of HMIS data on maternal health [17]. The quality of Rwanda HMIS data on these data elements is higher than what was found in HMIS data verifications for the same data in Ethiopia [41] and Nigeria [42], and similar to that found for the ANC1 indicator in Malawi [22]. This verification of Rwanda HMIS data also showed similar patterns of data quality across HMIS data elements at the facility level and when HFs were grouped by HCA. This finding may indicate that there were common challenges to accurate reporting to HMIS, regardless of a HF's geographical location. This differs from other studies, which revealed that the quality of routinely collected health data in Africa varied by the geographical location of reporting HFs [18,43]. A consistent level of HMIS data quality across Rwanda may result from the adherence of HFs to the Rwanda MoH's existing standard operating procedures for high-quality HMIS data [25] and from the performance-based financing system, which includes regular review of facility records with a specific focus on maternal and child health [44].
However, the quality of HMIS data on the number of ANC1 and ANC4 standard visits was poor, with a general trend of over-reporting. A systematic review of immunization data quality identified insufficient human resources and limited healthcare worker capacity for reporting and using data as key issues contributing to poor data quality [23]. The accuracy of reporting on these data elements may depend on the health care provider's knowledge of how to calculate gestational age in weeks and how to correctly schedule the ANC standard visits, as well as on the availability of tools, mainly pregnancy wheels, that facilitate these calculations. The study data collection team observed that data in registers were recorded in ways that were not compatible with HMIS reporting; for example, gestational age at the first visit was recorded in the register in months but reported to HMIS based on cutoffs in weeks, which may contribute to challenges in transferring information from source documents into HMIS. In addition, an analysis of the ABC baseline data on the availability of essential medical equipment and supplies at facilities that received the ABC intervention revealed that only 44.9% (31/69) of health centers had a pregnancy wheel in the pre-intervention period. The low quality of HMIS data on these data elements might be explained by the findings of a recent study on the quality of ANC service provision in HFs conducted in 13 sub-Saharan African countries, including Rwanda [45]. Using data from Service Provision Assessments (SPA) and Service Availability and Readiness Assessments (SARA) surveys, that study revealed that the proportion of HFs providing ANC services with ANC-trained staff was less than half in 6 of the 13 countries. In addition, the median proportion of HFs with ANC guidelines was 62.3%.
In Rwanda in particular, only 31.2% of 432 facilities providing ANC services had ANC guidelines and only 79.8% of these facilities had staff trained in ANC [45]. This may particularly affect the observed over-reporting on the number of women who completed four ANC standard visits, which also requires knowing the standard visit schedule for ANC to determine whether women completed the four visits at the correct times, rather than simply reporting the number of women who came for four visits at any time. Further, the over-reporting of key quality-of-care elements, such as iron/folic acid supplementation and syphilis testing in ANC, is concerning. In addition to the challenges of correct reporting of ANC coverage identified in this study, these data elements [46] are included to help understand the content and quality of the ANC visits that women receive. These are critical interventions for the prevention of stillbirths, congenital abnormalities, and poor neonatal outcomes [47,48]. Over-reporting masks an important problem that has major implications for the health of women and newborns.

Study limitations
First, this study included only 14 Rwanda HMIS data elements related to maternal and newborn care and only HFs that were selected to receive the ABC scale-up intervention; this non-random sample might not be representative of all facilities in Rwanda. However, given that these facilities were located in different parts of the country and that the results of this study show similar variation in data quality by HMIS data element across all geographical locations, we are confident that the findings can help with understanding the level of agreement between HMIS data and facility source documents' records for the considered data elements. We also believe that the findings of this study will be useful to other studies verifying additional HMIS data elements or working at a larger scale in Rwanda. Second, this study assessed the quality of HMIS data only by focusing on the concordance between HMIS and facility source document data. The true accuracy of the source documents is not known and is a critical component of data quality that requires further evaluation.

Conclusions
Findings of this study suggest variation in HMIS data quality by data element and similar patterns of reporting accuracy irrespective of a health facility's geographical location. Reporting to HMIS was less accurate for some data elements, particularly those that are more complex to generate. This challenge to accurate reporting by HFs has implications for decision-making on key interventions affecting maternal and newborn outcomes. Ongoing regular data quality assessments, promoting the use of HMIS data for quality improvement in health care delivery at the facility level, and training to address gaps could help improve HMIS data for use in program evaluations.