
Facility management associated with improved primary health care outcomes in Ghana

  • Erlyn K. Macarayan ,

    Contributed equally to this work with: Erlyn K. Macarayan, Hannah L. Ratcliffe

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Software, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliations Ariadne Labs, Brigham and Women’s Hospital & Harvard T.H. Chan School of Public Health, Boston, MA, United States of America, Department of Global Health and Population, Harvard T.H. Chan School of Public Health, Boston, MA, United States of America

  • Hannah L. Ratcliffe ,

    Contributed equally to this work with: Erlyn K. Macarayan, Hannah L. Ratcliffe

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Writing – original draft, Writing – review & editing

    Affiliation Ariadne Labs, Brigham and Women’s Hospital & Harvard T.H. Chan School of Public Health, Boston, MA, United States of America

  • Easmon Otupiri,

    Roles Conceptualization, Data curation, Project administration, Writing – review & editing

    Affiliation Kwame Nkrumah University of Science and Technology, Kumasi, Ghana

  • Lisa R. Hirschhorn,

    Roles Conceptualization, Supervision, Validation, Writing – review & editing

    Affiliations Ariadne Labs, Brigham and Women’s Hospital & Harvard T.H. Chan School of Public Health, Boston, MA, United States of America, Feinberg School of Medicine, Northwestern University, Chicago, IL, United States of America

  • Kate Miller,

    Roles Conceptualization, Formal analysis, Methodology, Supervision, Validation, Visualization, Writing – review & editing

    Affiliation Ariadne Labs, Brigham and Women’s Hospital & Harvard T.H. Chan School of Public Health, Boston, MA, United States of America

  • Stuart R. Lipsitz,

    Roles Conceptualization, Formal analysis, Methodology, Supervision, Validation, Visualization, Writing – review & editing

    Affiliations Ariadne Labs, Brigham and Women’s Hospital & Harvard T.H. Chan School of Public Health, Boston, MA, United States of America, Center for Surgery and Public Health, Brigham and Women’s Hospital, Harvard Medical School, Boston, MA, United States of America

  • Atul A. Gawande,

    Roles Conceptualization, Supervision, Writing – review & editing

    Affiliations Ariadne Labs, Brigham and Women’s Hospital & Harvard T.H. Chan School of Public Health, Boston, MA, United States of America, Department of Surgery, Brigham and Women’s Hospital, Boston, MA, United States of America

  • Asaf Bitton

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Software, Supervision, Validation, Visualization, Writing – review & editing

    Affiliations Ariadne Labs, Brigham and Women’s Hospital & Harvard T.H. Chan School of Public Health, Boston, MA, United States of America, Center for Primary Care, Harvard Medical School, Boston, MA, United States of America, Department of Health Care Policy, Harvard Medical School, Boston, MA, United States of America, Division of General Medicine, Brigham and Women’s Hospital, Boston, MA, United States of America



Abstract

Strong primary health care (PHC) is essential for achieving universal health coverage, but in many low- and middle-income countries (LMICs) PHC services are of poor quality. Facility management is hypothesized to be critical for improving PHC performance, but evidence about management performance and its associations with PHC in LMICs remains limited.


We quantified management performance of PHC facilities in Ghana and assessed the experiences of women who sought care at sampled facilities. Using multi-level models, we examined associations of facility management with five process outcomes and eight experiential outcomes.


On a scale of 0 to 1, the average overall management score in Ghana was 0·76 (IQR = 0·68–0·85). Facility management was significantly associated with one process outcome and three experiential outcomes. Controlling for facility characteristics, facilities with management scores at the 90th percentile (management score = 0·90) had 22% more essential drugs compared to facilities with management scores at the 10th percentile (0·60) (p = 0·002). Positive statistically non-significant associations were also seen with three additional process outcomes—integration of family planning services (p = 0·054), family planning types provided (p = 0·067), and essential equipment availability (p = 0·104). Compared to women who sought care at facilities with management scores at the 10th percentile, women who sought care at facilities at the 90th percentile reported 8% higher ratings of trust in providers (p = 0·028), 15% higher ratings of ease of following provider’s advice (p = 0·030), and 16% higher quality rating (p = 0·020). However, women who sought care in the 90th percentile facilities rated their waiting times as worse (22% lower, p = 0·039).


Higher management scores were associated with higher scores for some process and experiential outcomes. Large variations in management performance indicate the need to strengthen management practices to help realize the full potential of PHC in improving health outcomes.


Introduction

In October 2018, the world celebrated the fortieth anniversary of the Alma Ata Declaration, which established primary health care (PHC) as the key mechanism for achieving the ambitious goal of “Health for All.”[1] The anniversary came at an auspicious time, with the global community turning its focus to achieving Universal Health Coverage (UHC).[2] While the centrality of PHC in this effort is recognized,[3] PHC services in many low- and middle-income countries (LMICs) are often of strikingly poor quality.[4] Furthermore, significant knowledge gaps remain on determinants of improved PHC facility performance in LMICs.[5] To achieve UHC and deliver on the promise of Alma Ata, more evidence is needed about which bottlenecks are most detrimental to PHC system performance and the best strategies to overcome these barriers.

Facility management is assumed to be an important contributor to PHC performance.[6] In high-income countries, substantial evidence exists that better management is associated with improved facility performance and health outcomes, particularly in hospitals.[7–9] Across nine mostly high-income countries, management practices in inpatient settings were strongly associated with better clinical outcomes, including lower mortality rates from myocardial infarction and surgery, shorter waiting lists, and reduced staff turnover.[9] However, what constitutes “good” PHC management, and its impact on performance, is less well understood in LMICs. Available evidence comes almost exclusively from hospital settings,[10] with little known about the management of PHC facilities or its effect on PHC quality or patient experience. Our understanding of how PHC facility management affects care delivery in LMICs is further limited by the lack of validated measurement tools that can be implemented at scale. The best available evidence from both low- and high-income countries relies primarily on detailed case studies,[11] extensive 360° reviews of individual managers’ performance,[12] or in-depth qualitative interviews[13]; these methods generate rich data but are costly, time-consuming, and difficult to replicate at scale.

In this paper, we examined the associations of facility management with service delivery process outcomes and women’s experience of care, using a new survey methodology for assessing the management of PHC facilities. The survey was implemented in a national sample of facilities across Ghana, a middle-income country of 28 million people with an average life expectancy of 66 years,[14] to determine the level of, and variations in, management performance by region and facility type. Ghana’s health system includes both private and public facilities. The public sector provides approximately 65% of care delivery,[15] with PHC services delivered through a range of facilities, including Community-Based Health Planning and Services (CHPS) compounds, Ghana’s most basic, community-based PHC facilities; health centers; and district hospitals. District hospitals provide comprehensive health care and are responsible for partnering with the District Health Administration and local government to plan, supervise, monitor, and coordinate service delivery, while health centers are responsible for planning, developing, monitoring, and evaluating community-based service delivery. CHPS facilities work at the community level to provide promotive, preventive, and basic curative care through facility- and home-based care.[16] To our knowledge, this is the first national survey to quantify management practices and explore their associations with PHC service quality in Sub-Saharan Africa.


Methods

Adaptation of a management framework

To identify a framework for measuring PHC facility management performance, we undertook an extensive scoping effort, the methods and results of which are described in the S1 File. We identified the World Management Survey (WMS) as a particularly well-validated and influential framework for measuring management performance across sectors.[13,17] The WMS identifies four domains of management: Operations, Performance Monitoring, Target Setting, and Human Resources.[17] We added a fifth domain—Community Engagement—that is essential for high-quality management of PHC facilities in LMICs.[11]

Data sources

Survey design.

The WMS employs an intensive, qualitative methodology that is difficult to implement at scale in LMIC PHC facilities, so we used our adapted WMS framework to guide the development of a new, quantitative, closed-ended facility survey suitable for this setting. We also designed a household survey to assess women’s experience of PHC services in Ghana, drawing extensively on validated survey questions, including measures of responsiveness from the WHO World Health Survey Responsiveness Module[18] and measures of respectfulness of care and future care-seeking intentions from work in Tanzania and Ethiopia.[19,20] The surveys were administered in 2016 in partnership with the Performance Monitoring and Accountability 2020 (PMA2020) platform and the Kwame Nkrumah University of Science and Technology (KNUST), through integration into existing PMA2020 facility and household surveys of women of reproductive age designed to track progress towards family planning targets.[21] Surveys were administered in English, with translation into local languages as needed for the household survey. Because not all local languages in Ghana have a written form, hard-copy translations of the survey tools were not developed for every language; instead, enumerators with full English and local-language fluency undertook a structured, guided process to ensure consistency of verbal translations. English-language surveys are available in S2 File and S3 File.

Study sites.

The Ghana Statistical Service selected 100 enumeration areas across Ghana’s ten regions with probability proportional to size using a master sampling frame stratified by urban-rural areas. All public health facilities that served each enumeration area and any private facilities (hospitals, polyclinics, and clinics) within its boundaries were included in the sample. If the sampled facility had inpatient services, questions related to service readiness and service delivery were confined to outpatient services only. In total, 142 facilities offering PHC services were surveyed with at least one facility representing each enumeration area and six to 25 PHC facilities per region. Within each of these same enumeration areas, 42 households were selected using a random number generator to complete the household survey. The household survey sample size was calculated to enable nationally-representative estimates of modern contraceptive prevalence rates within three percentage points.[21]
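The enumeration-area selection step can be sketched as follows. This is a minimal illustration of systematic probability-proportional-to-size sampling under stated assumptions; the Ghana Statistical Service's actual frame and procedure are not reproduced here, and all names are placeholders.

```python
import random

def sample_pps(areas, sizes, n, seed=1):
    """Systematic probability-proportional-to-size (PPS) selection: walk down
    the cumulative-size list in equal steps from one random start, so larger
    areas are more likely to be hit."""
    rng = random.Random(seed)
    total = float(sum(sizes))
    step = total / n                      # sampling interval
    start = rng.uniform(0, step)          # single random start
    points = [start + i * step for i in range(n)]
    chosen, cum, idx = [], 0.0, 0
    for area, size in zip(areas, sizes):
        cum += size
        while idx < n and points[idx] <= cum:
            chosen.append(area)           # an area can repeat if its size > step
            idx += 1
    return chosen
```

In a stratified design like the one described above, a selection of this kind would be run separately within the urban and rural strata.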

Facility survey administration.

At each facility, trained interviewers asked to speak with the head of the facility. Eligible respondents included the Medical Director/Superintendent, Director of Nursing, or Nursing Matron at hospitals; Nurse, Midwife, Physician Assistant, or Physician In-Charge at health centers; and midwife or Community Health Nurse within CHPS facilities. At private facilities, eligible respondents included the owner, managing partner, administrator, and/or highest-ranking doctor. Respondents were allowed to refer the interviewer to additional facility staff to answer specialized questions, as needed. Facility-level data were collected from 19 September to 14 December 2016.

Household survey and linking to facility survey.

All sampled households with at least one woman of reproductive age (aged 15–49 years) were included. All women of reproductive age residing in the household were identified, and efforts were made to interview all women who consented, including returning at different times of day to identify a time that respondents were available. Women who reported seeking care for themselves or a family member within the last six months were asked to identify which facility they went to for their most recent visit. If the facility was also included in our facility sample, the respondents and the facility surveys were matched, allowing us to make a direct link between women’s reports of their care experience and facility-level data. The household survey data were collected from 24 August to 23 November 2016, overlapping with facility-level data collection for three months.


Outcome measures

We assessed three categories of pre-specified outcomes, following STROBE guidelines (see S4 File): facility management scores, process outcomes at the facility level, and women-reported experiential outcomes.


Management scores.

We categorized 27 indicators from the facility survey into the five management domains. All indicators were rescaled from 0 (lowest) to 1 (highest) (see S5 File). “Do not know” and missing responses for an indicator were treated as zero when the research team determined that the respondent was responsible for knowing the answer.
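As a sketch of this scoring rule (the domain and indicator names below are placeholders, not the actual 27 indicators listed in S5 File):

```python
# Hypothetical mini-example with two of the five management domains.
DOMAINS = {
    "operations": ["stock_review_done", "opening_hours_posted"],
    "community_engagement": ["community_meetings_held"],
}

def score_facility(responses):
    """Average the rescaled (0-1) indicators within each domain; missing or
    'do not know' answers (None here) count as zero."""
    scores = {}
    for domain, indicators in DOMAINS.items():
        vals = [responses.get(ind) or 0.0 for ind in indicators]
        scores[domain] = sum(vals) / len(vals)
    # Overall score: unweighted mean of the domain scores.
    scores["overall"] = sum(scores[d] for d in DOMAINS) / len(DOMAINS)
    return scores
```

The overall score follows the unweighted domain averaging described under Statistical analysis.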

Process outcomes: Facility level.

Five process outcomes were selected based on their feasibility of assessment through the established survey platform and the data available through the core PMA2020 family planning survey. All outcomes were scaled from 0 (low) to 1 (high).

  1. Essential Drug Index—proportion of up to 21 drugs available, selected from the Service Delivery Indicators essential drugs list[22] and adjusted for the drugs expected to be available at each facility type according to the Ghana essential medicines list (see S6 File).[23] Information about drug availability was missing for five facilities, which were excluded from the drug availability analysis.
  2. Equipment Index—proportion of six basic pieces of equipment (stethoscope, sphygmomanometer, child and adult weighing scales, thermometer, and any form of sterilization equipment) available and functional, selected from the Service Delivery Indicators list of essential equipment.[22]
  3. Integration of family planning services into maternal and child health (MCH) services and HIV services—if a facility offered both MCH and HIV services, a score of 1 was given if family planning was integrated into all of these services; otherwise, the facility was given a score of 0. Twenty-one facilities were excluded from this analysis for not offering MCH services (n = 16), HIV services (n = 1), or both (n = 4).
  4. Index for family planning types provided—proportion of family planning types provided, out of up to 13 types that should be available at each facility type based on national guidelines (see S7 File).[24]
  5. Index for family planning types counseled—proportion of family planning types counseled, out of up to 16 types (including natural family planning methods) that should be available based on national guidelines (see S7 File).[24]
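Each of these indices reduces to the proportion of items expected for the facility type that are present; a minimal sketch (the drug names are placeholders, not entries from the actual S6 File list):

```python
def availability_index(observed, expected):
    """Proportion of the items expected at this facility type (e.g. drugs on
    the Ghana essential medicines list) that were observed as available."""
    expected = set(expected)
    if not expected:
        raise ValueError("no items expected for this facility type")
    return len(expected & set(observed)) / len(expected)
```

For example, a facility stocking 15 of its 21 expected drugs would score 15/21 ≈ 0·71 on the Essential Drug Index.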

Experiential outcomes: Woman respondent level.

We assessed six outcomes related to responsiveness—as defined by the World Health Survey Responsiveness Module [18]—and two quality of care outcomes based on respondents’ ratings of their most recent experience seeking care for themselves or a family member within the last six months. All ratings were on a 5-point Likert scale and outcomes were scaled from 0 (low) to 1 (high).

  1. Prompt attention to health needs–indicated by ratings for waiting times for consultation and treatment;
  2. Basic amenities of health services–indicated by ratings of facility cleanliness;
  3. Trust in the skills and abilities of facility health providers;
  4. Dignity–indicated by ratings of level of respect shown by facility health providers;
  5. Ease of understanding information from health provider;
  6. Ease of following health advice of provider;
  7. Likelihood of returning to the facility for future care;
  8. Overall rating of quality of care received.

Control factors.

Facility characteristics captured included: facility type; region; managing authority (public or private, including faith-based organizations); approval status to receive National Health Insurance Scheme (NHIS) reimbursements, a government social intervention program to provide financial access to health care; and facility size, defined by the number of beds. Household survey respondent sociodemographic information captured included age, educational attainment, marital status, insurance coverage, borrowing money or selling something to afford the cost of care, and region of residence.

Statistical analysis

Using facility-level data, scores for each of the five management domains were calculated as unweighted averages of the indicators in that domain, with scores ranging from 0 (lowest) to 1 (highest). The scores of all five management domains were then averaged to calculate an overall management score. We measured the reliability of the domain scores by computing Cronbach’s alpha.

Generalized linear models with a log link were used to model the facility-level process outcomes and individual-level experience outcomes as a function of overall management score (a continuous covariate on the scale from 0 to 1) and all facility- and/or individual-level control factors. A machine learning technique—supervised principal components[25]—was used to incorporate highly correlated predictors into the model and minimize confounding bias. Use of the log link allowed us to interpret the effect of management scores on each outcome as a ratio of outcome means (or proportions); for ease of interpretation, we display the ratio of adjusted outcome means for the 90th percentile management score (0·90) versus the 10th percentile management score (0·60). In a regression model with a log link, this ratio is estimated by multiplying the regression coefficient of the continuous management score by the difference between the 90th and 10th percentiles of management score and then exponentiating the product.

PMA2020 employs a complex survey design, with survey weights, stratification by enumeration area, and clustering by service delivery points.[26] All analyses accounted for this design by adjusting for stratification, clustering, and weighting. Because the woman respondent-level analysis was restricted to those who sought care at a sampled facility, we adjusted the standard PMA2020 survey weights[26] by the inverse probability of seeking care. The probability of seeking care was estimated by fitting a logistic regression model with respondent- and facility-level characteristics from our sample as covariates.[27] The facilities in our sample are an unweighted, stratified (by enumeration area), random sample of facilities in Ghana.[26] All analyses were conducted in Stata version 15·0 (StataCorp, LP, College Station, TX). AB, LRH, SRL, KM, EKM, and HLR had full access to the data. AB, EKM, and HLR had full responsibility for final submission of the manuscript.
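The ratio-of-means and weight adjustments described above reduce to simple arithmetic. As a sketch (the coefficient in the example is hypothetical, chosen only to show the mechanics, not a fitted value from this study):

```python
import math

def ratio_of_means(beta, hi=0.90, lo=0.60):
    """Under a log link, the ratio of adjusted outcome means at the 90th vs
    10th percentile of management score is exp(beta * (hi - lo))."""
    return math.exp(beta * (hi - lo))

def adjusted_weight(pma_weight, p_seek_care):
    """Inverse-probability adjustment: divide the standard PMA2020 survey
    weight by the estimated probability of seeking care."""
    return pma_weight / p_seek_care

# A hypothetical coefficient of 0.663 corresponds to a ratio of about 1.22,
# i.e. a roughly 22% higher adjusted mean at the 90th vs 10th percentile.
```

A coefficient of zero gives a ratio of exactly 1, i.e. no difference in adjusted means between the two percentiles.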


Ethics statement

All study participants provided informed, written consent. Non-literate respondents were asked to have a witness present to review the consent form; the witness provided written consent alongside the respondent’s thumbprint. Participants under 18 years of age were consented alongside a parent or guardian.

This study was approved by School of Medical Sciences/Komfo Anokye Teaching Hospital Committee on Human Research Publications and Ethics (protocol CHRPE/AP/740/1.3), the Johns Hopkins School of Public Health Institutional Review Board (protocol 7238), and the Partners Human Research Committee (protocol 2016P002284).

Role of the funding source

Funding was provided by the Bill & Melinda Gates Foundation. The funders played no role in study design; collection, analysis, and interpretation of data; writing of the paper; or decision to submit for publication.


Results

One hundred and forty-two facilities providing PHC services were included in the management analysis (Table 1). Hospitals and polyclinics made up half of the sample (n = 71), followed by health centers/clinics (n = 48) and CHPS facilities (n = 23). Of the 142 facilities, 16·2% were private and 97·2% were approved for reimbursement from the NHIS. The average number of beds per facility was 51 (SD = 63).

Table 1. Characteristics of facilities offering primary health care services and of women who sought primary health care services at a sampled facility.

Fig 1. Analysis and linking of facility- and woman respondent-level datasets.

Cronbach’s alpha values for the five management domains (target setting, operations, human resources, monitoring, and community engagement) were 0·06, 0·56, 0·61, 0·85, and 0·66, respectively. The average overall management score was 0·76 (SD = 0·12) (Table 2), with significant variation across management domains. Human Resources was the highest-scoring domain (mean = 0·89, SD = 0·17), while Community Engagement (mean = 0·65, SD = 0·20) was the lowest.

Table 2. Indicators of management performance and average scores of all facilities per management domain.

Regional disparities existed in overall management performance and specific domain performance (Fig 2); facilities in the Central region scored lowest (mean = 0·64, SD = 0·10) and facilities in Greater Accra region scored highest (mean = 0·90, SD = 0·05) in most domains.

Fig 2. Regional variations in overall management and for each management domain in Ghana (n = 142).

Regions are colored based on average management scores (in absolute values) with the lowest regional score in brown to the highest regional score in green. Values shown below each map are the national averages and standard deviations.

Management performance varied significantly by facility type (p<0·0001), with hospitals and polyclinics performing better overall than CHPS and health centers/clinics, whose performance was more variable (Fig 3). Additionally, significant variations existed by facility type in performance of individual management domains (see S8 File).

Fig 3. Differences in overall management of primary care facilities in Ghana by facility type and region (n = 142).

Box plots show the median for each facility type: Community-based Health Planning and Services (CHPS) facilities (mean = 0·67), health centers and clinics (0·74), and hospitals and polyclinics (0·81). Numbers in parentheses beside each region name refer to the number of sampled facilities per region. CHPS facilities in Greater Accra were not represented in the sample. The box edges span the 25th to 75th percentiles, and the whiskers extend to the 95th percentiles. The lines on the y-axes represent the 10th and 90th percentiles of management score (blue) and the mean overall management score (red) for each facility type.

We also found significant differences in process and experiential outcomes by region and facility type. The average essential drug index was 0·74, ranging from 0·60 in the Northern and Upper East regions to 0·88 in Greater Accra (Table 3 and S9 File). Hospitals/polyclinics had the highest average essential drug index (mean = 0·88) compared to health centers/clinics and CHPS facilities. Facilities across all regions generally had a high equipment index (mean = 0·97). Although integration of family planning into MCH and HIV services was high (mean = 0·88), on average facilities offered only 59% of the family planning types included in national policies and counseled on only 73% of methods. CHPS facilities had the highest scores for family planning types counseled (mean = 0·75) but the lowest scores for family planning types provided (mean = 0·50).

Table 3. Differences in essential supplies and women respondent’s experience of care in Ghana by region and facility type.

The average waiting time reported was 9·32 minutes, and the average acceptability rating of these waiting times was 0·65. Ratings of wait times were significantly better at CHPS facilities than hospitals/polyclinics (mean = 0·76 versus mean = 0·59, p<0·0001) despite slightly higher average wait times (11·91 minutes versus 8·96 minutes). The average rating of facility cleanliness was 0·60, with substantial regional variation (p<0·0001) but little variation between facility types. Ratings for trust in providers were significantly higher at both CHPS and hospitals/polyclinics (mean = 0·72) than health centers/clinics (mean = 0·66) (p = 0·024). Among all experiential outcomes assessed, respect shown by providers had the lowest overall average (mean = 0·57), with ratings highest in CHPS. Ratings for both ease of understanding and following provider’s advice were moderately high (0·69 and 0·71, respectively), and significantly higher at CHPS facilities than other facility types (p = 0·001, p = 0·015) with significant regional variation (p<0·001, p = 0·003). Similarly, reported likelihood of returning to the facility was high at 0·70, with significantly higher ratings in CHPS facilities (p = 0·024) and significant regional variation (p<0·001). Overall quality of care ratings were moderate (0·61), with significant regional variation (p<0·0001) but little variation between facility types.

Controlling for facility characteristics, facilities with management scores at the 90th percentile (management score = 0·90) had 22% more essential drugs compared to facilities with management scores at the 10th percentile (management score = 0·60) (p = 0·002) (Table 4). Although we found no statistically significant differences for other process outcomes, positive associations occurred with three additional process outcomes—integration of family planning services (p = 0·054), family planning types provided (p = 0·067), and essential equipment availability (p = 0·104). Controlling for facility characteristics and women’s sociodemographic characteristics, women who sought care at facilities at the 90th versus the 10th percentile of management scores reported 8% higher ratings of trust in providers (p = 0·028), 15% higher ratings of ease of following provider’s advice (p = 0·030), and 16% higher overall quality rating (p = 0·020). Additionally, women in the 90th versus 10th percentile facilities rated their waiting times 22% lower (p = 0·039).

Table 4. Models of management performance, essential supplies and women respondent experience of care in Ghana.


Discussion

This was the first known national study to quantify management performance in PHC facilities in an LMIC and to examine its associations with process and experiential outcomes. Higher facility management scores were independently associated with higher essential drug availability, higher ratings on three of the eight experiential outcomes assessed, and lower ratings of waiting times.

Elements of experiential quality associated with better-managed facilities included provider-client interactions (trust, communication) and overall user ratings of quality, which were higher, as well as promptness of care, which was rated lower. Trust and overall perceived quality of care have been shown to be important predictors of care-seeking behavior and of bypassing health facilities for maternal and child health services, indicating that management is an important improvement target.[28] CHPS facilities had higher ratings on most experiential outcomes, potentially reflecting their strong focus on community engagement,[29] but lower scores on most process outcomes. Significant variability in experiential quality was seen across regions, with some regions tending to score lower across multiple domains of experiential quality. Together, these findings indicate that a more systemic focus on improving experiential quality—rather than facility-by-facility efforts—is needed to ensure access to high-quality experiential care across all regions and facility types.

Our finding that better-managed facilities have higher essential drug availability is consistent with other available evidence. For example, Mabuchi et al. found that the key PHC facility characteristics associated with better performance in a Performance-Based Financing scheme in Nigeria are closely aligned with the management categories described here, including setting targets, monitoring progress towards targets, and strong community engagement.[11] A randomized controlled trial of 80 PHC facilities in Nigeria found that a management consulting program led to improvements in outcomes such as supply availability and facility cleanliness, though these changes were not sustained one year later.[30]

We found significant differences in management performance across management domains, regions, and facility types in Ghana. The Human Resources domain was nearly uniformly high-scoring, while Community Engagement performance was low overall and highly variable across regions. Higher-level facilities had better management than lower-level facilities, though for each facility type the performance spectrum was wide and overlapping. Greater Accra was the highest-performing region, while Northern and Central were the lowest; however, all regions performed variably across management domains, and each had a different performance profile. Together, these findings indicate that gaps in management performance are not confined to specific domains, regions, or facilities, and they highlight the need to improve facility management across all facility types, and in CHPS facilities in particular, given the central role of CHPS in PHC service delivery.[16]

However, there is relatively limited evidence about which improvement strategies are best suited to the PHC context. In Ethiopia, a multipronged initiative aimed at improving management practices by strengthening management personnel through practice-based training, on-site mentorship, and a Master of Hospital Administration program improved attainment of 86 hospital performance standards from 27% at baseline to 51%.[31] Further work is needed to understand whether interventions effective at the hospital level in Ethiopia will also strengthen management performance at the PHC level and result in improvements in facility readiness, patient experience, or health outcomes. Evidence also suggests that management practices at the district level are key determinants of performance[32] and that management of PHC facilities may matter most in settings where PHC facility managers have at least a baseline level of autonomy to enact their agendas.[11] The regional variation documented in this study highlights the need to understand the broader subnational systems and context which may influence management culture and effectiveness at the facility level.

Our study had several limitations. Although the facility survey was a stratified, random sample of facilities, estimates from 2014 show that approximately 35% of health care services are provided in the private sector in Ghana,[15] suggesting that private sector facilities may be under-represented in our sample. However, we found no statistical difference in management scores between sampled private and public-sector facilities. Further, we were unable to assess how financial commitments from governmental and non-governmental sources or variable human resource capacities affect management performance; these areas should be the focus of future exploration. Additionally, our selection of facility process outcomes was limited by the data and respondents available through the PMA2020 survey platform and did not include measures of technical quality, assess the experiences of users other than women of reproductive age, or externally validate reported management capacity. The survey also did not capture women’s expectations of care, which may have confounded their reported experiences. Additionally, while we conducted pre-testing of the facility survey in multiple facilities in Ghana to ensure feasibility and acceptability, it was not previously formally validated for use in PHC settings in LMICs. The reliability scores of the management domains reported here indicate that improvements to the tool will be necessary. Further work is underway to repeat a modified version of this survey in Ghana and other LMIC contexts to assess the generalizability and reliability of the survey, including measuring experience in populations other than women of reproductive age.

Conclusion
Our results suggest that higher PHC facility management scores are significantly and independently associated with essential drug availability as well as overall quality and components of responsiveness of care as reported by patients and families. The results have important implications for Ghana and the broader research community. For Ghana, the significant variations in performance across region and facility type highlight that a one-size-fits-all improvement approach is unlikely to succeed and that improvements will need to be targeted to the specific context and performance profile in question. Further work is also needed to examine how existing policies, governance systems, and national and sub-national quality infrastructure may affect facility management, and how this management impacts health outcomes over time. At the global level, our results are part of a growing body of evidence highlighting the need for increased research and policy to better measure key service delivery functions, including facility management, to inform improvement work critical to the achievement of quality PHC necessary for effective UHC.[2]

Supporting information

S1 File. Additional methodological reports.

Additional information about the selection and modification of a management framework and model equations for the survey-weighted generalized linear models employed in this analysis.


S2 File. Facility survey.

Complete survey used to collect data at health facilities.


S3 File. Household survey—female questionnaire.

Complete survey used to collect data from individual women in their households.


S5 File. Management component indicators based on the World Management Survey framework.

Details of the 27 indicators which comprise the management index.


S6 File. Essential drugs assessed.

List of essential drugs assessed by facility type.


S7 File. Family planning types assessed.

List of family planning types assessed by facility type.


S8 File. Management domains by region and facility type.

Figure showing variation in performance of management overall as well as each management domain, by region and facility type.


S9 File. Process outcomes by region.

Maps showing regional variation across Ghana in process outcomes.


Acknowledgments
We acknowledge the enumerators in Ghana who patiently collected data for this study and the survey respondents who dedicated their time to this study. We also gratefully acknowledge collaborators at Johns Hopkins University, including Hannah Olson, Blake Zachary, Linnea Zimmerman, Shulin Jiang, and Scott Radloff, for their support in fielding the surveys and collecting and cleaning data, and Brandon Neal of Ariadne Labs for his assistance with data analysis. The Lancet High-Quality Health Systems Secretariat also provided useful methodological feedback. Dr. Koku Awoonor-Williams provided essential input to the design of the survey, and Dr. Awoonor-Williams and Vicky Okine provided invaluable feedback to help clarify and contextualize our results. Funding for this work was provided by the Bill & Melinda Gates Foundation.

References
  1. World Health Organization, The United Nations Children’s Fund. Primary Health Care: Report of the International Conference on Primary Health Care, Alma-Ata, USSR, 6–12 September 1978 [Internet]. Geneva, Switzerland: World Health Organization; 1978. Available:
  2. World Health Organization. Draft thirteenth general programme of work, 2019–2023. Seventy-First World Health Assembly, Provisional agenda item 11.1, A71/4. Geneva, Switzerland; 2018. Available:
  3. Pettigrew LM, De Maeseneer J, Anderson MP, Essuman A, Kidd MR, Haines A. Primary health care and the Sustainable Development Goals. Lancet. 2015;386: 2119–2121. pmid:26638948
  4. Das J, Hammer J, Leonard K. Quality of Primary Care in Low-Income Countries: Facts and Economics. Annu Rev Econom. 2014;6: 525–553.
  5. Bitton A, Ratcliffe HL, Veillard JH, Kress DH, Barkley S, Kimball M, et al. Primary Health Care as a Foundation for Strengthening Health Systems in Low- and Middle-Income Countries. J Gen Intern Med. 2016;32: 566–571. pmid:27943038
  6. Lega F, Prenestini A, Spurgeon P. Is management essential to improving the performance and sustainability of health care systems and organizations? A systematic review and a roadmap for future studies. Value Health. 2013;16: S46–S51. pmid:23317645
  7. McConnell KJ, Hoffman KA, Quanbeck A, McCarty D. Management practices in substance abuse treatment programs. J Subst Abuse Treat. 2009;37: 79–89. pmid:19195813
  8. McConnell KJ, Lindrooth RC, Wholey DR, Maddox TM, Bloom N. Management practices and the quality of care in cardiac units. JAMA Intern Med. 2013;173: 684–692. pmid:23552986
  9. Bloom N, Sadun R, Van Reenen J. Does Management Matter in Healthcare? Cambridge, Massachusetts; 2013.
  10. McNatt Z, Linnander E, Endeshaw A, Tatek D, Conteh D, Bradley EH. A national system for monitoring the performance of hospitals in Ethiopia. Bull World Health Organ. 2015;93: 719–726. pmid:26600614
  11. Mabuchi S, Sesan T, Bennett SC. Pathways to high and low performance: factors differentiating primary care facilities under performance-based financing in Nigeria. Health Policy Plan. 2017;33: 41–58. pmid:29077844
  12. Munyewende PO, Levin J, Rispel LC. An evaluation of the competencies of primary health care clinic nursing managers in two South African provinces. Glob Health Action. 2016;9: 32486. pmid:27938631
  13. Bloom N, Sadun R, Van Reenen J. Management as a Technology? NBER Working Paper Series. Boston, Massachusetts; 2017. Report No.: 16–133.
  14. The World Bank. Ghana | Data [Internet]. [cited 15 Jun 2018]. Available:
  15. Ghana Statistical Service, Ghana Health Service, The DHS Program, ICF International. Ghana Demographic and Health Survey 2014 [Internet]. Accra, Ghana and Rockville, Maryland, USA; 2015. Available:
  16. Ministry of Health of Ghana, Ghana Health Service. National Community-Based Health Planning and Services (CHPS) Policy [Internet]. 2016. Available:
  17. Bloom N, Lemos R, Sadun R, Van Reenen J. Healthy Business? Managerial Education and Management in Healthcare [Internet]. Boston, Massachusetts; 2017. Report No.: 18–025. Available:
  18. Darby C, Valentine N, Murray CJ, De Silva A. World Health Organization (WHO): Strategy on Measuring Responsiveness. Geneva, Switzerland; 2003. Report No.: 23.
  19. Ratcliffe HL, Sando D, Lyatuu GW, Emil F, Mwanyika-Sando M, Chalamilla G, et al. Mitigating disrespect and abuse during childbirth in Tanzania: an exploratory study of the effects of two facility-based interventions in a large public hospital. Reprod Health. 2016;13: 79. pmid:27424608
  20. Banks KP, Karim AM, Ratcliffe HL, Betemariam W, Langer A. Jeopardizing quality at the frontline of healthcare: prevalence and risk factors for disrespect and abuse during facility-based childbirth in Ethiopia. Health Policy Plan. 2017;33: 317–327. pmid:29309598
  21. What We Do | PMA2020 [Internet]. [cited 2 Mar 2018]. Available:
  22. Service Delivery Indicators | Health Indicators [Internet]. [cited 2 Apr 2018]. Available:
  23. Ministry of Health (GNDP) Ghana. Ghana Essential Medicines List. 2010.
  24. Ministry of Health of Ghana. National reproductive health research strategy. 2000.
  25. Bair E, Hastie T, Paul D, Tibshirani R. Prediction by Supervised Principal Components. J Am Stat Assoc. 2006;101: 119–137.
  26. Zimmerman L, Ahmed S. Creation of household and female weights in PMA2020 [Internet]. Baltimore, MD; 2017 [cited 2 Mar 2018]. Available:
  27. Moore CG, Lipsitz SR, Addy CL, Hussey JR, Fitzmaurice G, Natarajan S. Logistic Regression with Incomplete Covariate Data in Complex Survey Sampling: Application of Reweighted Estimating Equations. Epidemiology. 2009;20: 382–390. pmid:19289959
  28. Kruk ME, Mbaruku G, McCord CW, Moran M, Rockers PC, Galea S. Bypassing primary care facilities for childbirth: a population-based study in rural Tanzania. 2009; 279–288. pmid:19304785
  29. Awoonor-Williams JK, Tadiri E, Ratcliffe H. Translating research into practice to ensure community engagement for successful primary health care service delivery: the case of CHPS in Ghana. In: Primary Health Care Performance Initiative [Internet]. [cited 9 Aug 2017]. Available:
  30. Dunsch FA, Evans DK, Eze-Ajoku E, Macis M. Management, Supervision, and Health Care: A Field Experiment [Internet]. Cambridge, Massachusetts; 2017. Report No.: 23749. Available:
  31. Kebede S, Abebe Y, Wolde M, Bekele B, Mantopoulos J, Bradley EH. Educating leaders in hospital management: a new model in Sub-Saharan Africa. Int J Qual Health Care. 2010;22: 39–43. pmid:19951963
  32. Kwamie A, van Dijk H, Agyepong IA. Advancing the application of systems thinking in health: realist evaluation of the Leadership Development Programme for district manager decision-making in Ghana. Health Res Policy Syst. 2014;12. pmid:24935521