Systematic Monitoring of Male Circumcision Scale-Up in Nyanza, Kenya: Exploratory Factor Analysis of Service Quality Instrument and Performance Ranking

  • Dickens S. Omondi Aduda ,

    Affiliation School of Public Health and Community Development, Maseno University, Maseno, Kenya

  • Collins Ouma,

    Affiliation Department of Biomedical Sciences and Technology, School of Public Health and Community Development, Maseno University, Maseno, Kenya

  • Rosebella Onyango,

    Affiliation Department of Public Health, School of Public Health and Community Development, Maseno University, Maseno, Kenya

  • Mathews Onyango,

    Affiliation FHI360, Kisumu Office, Kisumu, Kenya

  • Jane Bertrand

    Affiliation Department of Global Health Systems and Development, Tulane University, New Orleans, Louisiana, United States of America



Background

Considerable conceptual and operational complexities in measuring service quality, together with variability in the delivery contexts of scaled-up medical male circumcision, pose real challenges to monitoring the quality and safety of implementation. Clarifying the latent factors of the quality instruments can enhance contextual applicability and the likelihood that observed service outcomes are appropriately assessed.


Objectives

To explore the factors underlying the SYMMACS service quality assessment tool (adopted from the WHO VMMC quality toolkit), and to determine service quality performance using a composite quality index derived from the latent factors.

Study design

Using a comparative process evaluation of the Voluntary Medical Male Circumcision (VMMC) Scale-Up in Kenya, site-level data were collected over two years from health facilities providing VMMC. The Systematic Monitoring of the Medical Male Circumcision Scale-Up quality instrument was used to assess availability of guidelines, supplies and equipment, infection control, and continuity of care services. Exploratory factor analysis was performed to clarify the quality structure.


Results

Fifty-four items and 246 responses were analyzed. Based on an eigenvalue >1.00 cut-off, factors 1, 2 and 3 were retained, with eigenvalues of 5.78, 4.29 and 2.99 respectively. These cumulatively accounted for 29.1% of the total variance (12.9%, 9.5% and 6.7%), with final communality estimates of 13.06. Using a cut-off factor loading of ≥0.4, fifteen items loading on factor 1, five on factor 2 and one on factor 3 were retained. Factor 1 closely relates to preparedness to deliver safe male circumcisions, while factor 2 depicts skilled task performance and compliance with protocols. Of the 28 facilities, 32% attained between the 90th and 95th percentile (excellent), 45% between the 50th and 75th percentiles (average) and 14.3% below the 25th percentile (poor).


Conclusion

The service quality assessment instrument may be simplified to about 20 items that relate more closely to service outcomes. Ranking of facilities and circumcision procedures using a composite index based on these items indicates that the majority performed above average.


Introduction

Hitherto, a precise definition of quality remains elusive [1], although there is consensus about its multidimensionality as a service production variable [2]. The Institute of Medicine (IOM) defines quality as “the degree to which health services for individuals and populations increase the likelihood of desired health outcomes and are consistent with current professional knowledge” [3]. Inherent in this definition are dimensions related to health experiences and anticipated outcomes for individuals and population groups. Guidelines for program implementation ought to address how these dimensions can be correctly assessed across diverse care settings. Accordingly, valid data tools are needed to capture objective service quality information during routine program practice. Such tools would enhance operational decision-making [4], [5], [6]; determine the scope for resource allocation and improvement tasks [5], [7]; enhance accountability for service delivery tasks planned or accomplished; inform guideline revisions [8], [9]; and support facility accreditation [10]. Furthermore, the structure of data tools should be designed to improve their effectiveness [5], [11], [12], [13].

Voluntary medical male circumcision (VMMC) service delivery occurs across multiple service levels in diverse contexts, which poses considerable potential for variability in service quality [14], [15], [16], [17], [18], [19], [20], [21]. Specific issues relate to lack of adherence to guidelines [22], [23]; low levels of support supervision; and constrained documentation [24], [25], reporting and uptake of feedback [13], [14], [17], [18]. These have implications for service uptake, safety and improvement decisions [5], [26]. Furthermore, hurdles in service quality assessment occur in relation to conceptualizing comprehensive measures [5], [13]; conducting appropriate evaluations; aggregating and reporting assessment information; attributing variability to specific service quality measures [27]; and applying the results to improve program goals [10].

Considering these challenges, and as part of efforts to ensure sustained quality of services, the national VMMC program adapted for use in 2009 [16], [28], [29], [30] the World Health Organization (WHO) toolkit [6] for monitoring a range of quality standards at facility level. The instrument is a comprehensive checklist comprising 10 standards and 36 criteria. Additionally, the Kenya Quality Model for Health (KQMH), launched in 2012, provides a broad framework for sectoral integration of service quality improvement and management.

While the WHO quality toolkit is a useful guide for internal and external assessment of VMMC service activities across multiple levels, there is a need to make it more user-friendly and to assess its use across different service dimensions and locales, since quality presentations may be influenced in part by constructs that relate closely to cultural context [13]. To date, anecdotal field reports indicate that the toolkit is too laborious and the information collected is often complex to interpret, making its use problematic.

This paper applies exploratory factor analysis (EFA) to simplify the quality instrument, elicit variable interrelationships, identify latent dimensions and clarify content structure [31]. The principal factors explaining substantial variability are used to construct a quality index. This is a composite of the observed items that can be used for routine service quality assessment.

The aims were to: (i) explore the underlying dimensions and interrelationships of the items comprising the SYMMACS service quality assessment tool (adopted from the WHO quality toolkit [6]); (ii) identify key constructs that demonstrate optimal performance; and (iii) derive a quality index based on the observed quality factor scores and use it to categorize the performance of the service facilities. The study outcomes can be adopted by team managers for routine service quality assessment.

Materials and Methods

Study design

This was a comparative process evaluation of the voluntary medical male circumcision scale-up in Kenya over two years. The SYMMACS (Systematic Monitoring of the Medical Male Circumcision Scale-Up) quality instrument was used to assess the respective facilities providing VMMC for availability of guidelines, supplies and equipment, infection control, and continuity of care services, as well as through direct observation of VMMC surgeries.

Context of primary study and data collection.

The SYMMACS study was conducted in Kenya (Nyanza region), Tanzania, Zimbabwe and South Africa. One of the objectives was to evaluate the evolution of safety and efficiency during VMMC scale-up from the management perspective. The SYMMACS service quality assessment tool was adopted from the WHO quality toolkit, which also serves as the national reference for VMMC service quality evaluation [21]. While comprehensive voluntary medical male circumcision is a standards-based HIV prevention service, no systematic evaluation of service quality had been conducted since its roll-out in 2008. In Kenya, the study provided the first opportunity for the national program to systematically evaluate performance.

Sampling and data collection methodology.

Bertrand and colleagues [24] have described the SYMMACS sampling procedure in detail. Thirty fixed, outreach or mobile VMMC sites (15/12/3) out of the 235 operational by December 2010 were randomly selected for 2011 data collection. In 2012, four of the outreach sites were replaced because of programmatic changes and one outreach site was dropped for lack of clients, resulting in 29 sites in 2012. The four facilities were replaced with facilities of similar categories by random selection from among the functional sites in the original sampling frame. Field staff sampled all clinical VMMC service providers per site over two days of data collection (a total of 86 in 2011 and 82 in 2012). Ten VMMC procedures were observed per site where feasible, starting with the first operation on Day 1 and continuing with each subsequent one available for observation. In total, 151 and 218 circumcisions were observed in 2011 and 2012 respectively.

Measurement instruments and measures.

Site-level data were collected using the SYMMACS quality assessment tools, modified from the WHO quality assessment toolkit. The aspects considered were availability of guidelines, supplies and equipment, infection control, and continuity of care services. Specifically, 29 variables targeted the facility service setting and another 29 items the circumcision procedure [25]. A total of 167 clinical providers were interviewed and 369 circumcisions observed. With at least 50 variables and factor loadings of 0.40 required, the sample size obtained was considered sufficient to produce stable outcomes.

Statistical Analysis

Binary interval data from the two instruments were merged and analyzed using SAS (SAS Institute Inc., USA). Principal component analysis (PCA) was used to identify a smaller set of components that account for most of the variability observed in the dataset, based on the variance within and correlation across the variables [32], [33], [34], [35]. In this process, after the first component is defined, consecutive components are extracted from each subsequent residual variance until virtually all variance of the measured items is accounted for.
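The extraction process described above can be sketched in a few lines of numpy. This is an illustrative reconstruction only (the study used SAS, and the matrix below is simulated rather than the study data): PCA amounts to an eigendecomposition of the item correlation matrix, with components ordered by the variance each one explains.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for the merged site-level responses: 246
# observations on 54 binary quality items (simulated, not the study data).
X = rng.integers(0, 2, size=(246, 54)).astype(float)

R = np.corrcoef(X, rowvar=False)        # correlation matrix across the items
eigvals, eigvecs = np.linalg.eigh(R)    # principal components of R
order = np.argsort(eigvals)[::-1]       # order by variance explained
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Each successive component accounts for the largest share of the variance
# left unexplained by the preceding ones; together they account for all of
# it (the trace of R, i.e. the number of standardized items).
explained = eigvals / eigvals.sum()
```

The eigenvalues sum to the number of standardized items, which is why eigenvalue magnitudes can be read directly as "items' worth" of variance.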

A scree plot of the eigenvalues of the unrotated factors displays a steep “cliff” of the curve representing the initial factors extracted (‘latent variables’ or constructs) from the observed variables, which maximize the variance accounted for (Fig. 1), while the shallow “scree” demonstrates the small extent of variance accounted for by the subsequent minor factors [36]. Conventionally, the cut-off point is where the slope forms an ‘elbow’, being ‘the point at which the slope approaches zero’. Factors with values above this point are retained, while those below it are deleted, given that the variance accounted for is almost zero [35], [37]. Exploratory factor analysis was used to examine the latent structure of the extracted components and identify associations among the multiple variables comprising each one [34], [35].
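The retention rule can be sketched numerically using the three eigenvalues reported in the Results (the tail values below are illustrative, not the study's actual minor eigenvalues; the denominator of 45 is the initial common-variance estimate reported in the Results):

```python
import numpy as np

# The three retained eigenvalues reported in the text, followed by
# illustrative minor "scree" values standing in for the deleted factors.
eigvals = np.array([5.78, 4.29, 2.99, 0.95, 0.80, 0.60])

retained = eigvals[eigvals > 1.0]   # eigenvalue > 1.00 retention criterion
# Shares of the initial common-variance estimate of 45 (see Results):
shares = retained / 45.0            # roughly the 12.9%, 9.5% and 6.7% reported
cumulative = shares.sum()           # ~0.29, i.e. about 29% of the variance
```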

Figure 1. Scree plot showing distribution of factors by their eigenvalues.

A scree plot of eigenvalues of the unrotated factors displaying an ‘elbow’ of the plot (shown by the red arrow). This point of the curve represents the threshold chosen for retention of the initial factors extracted from the observed variables, which maximize the variance accounted for. Three factors, with eigenvalues of 5.78, 4.29 and 2.99 respectively, were retained. These factors cumulatively accounted for 29.1% of the total variance (12.9%, 9.5% and 6.7%), with final communality estimates of 13.06. The shallow “scree” distal to the arrow demonstrates the small extent of variance accounted for by the subsequent minor factors, which were deleted.

Rotating factors.

To simplify the structure of the variables, Varimax rotation was used, since it maximizes the variability of loadings between factors. A simple, meaningful structure is achieved when items cluster exclusively or load highly on as few of the retained factors as possible, but primarily one [34].
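A minimal numpy sketch of varimax rotation is given below, assuming the standard SVD-based formulation of Kaiser's criterion (the study itself performed the rotation in SAS, and the loading matrix here is made up). The key property it illustrates is that an orthogonal rotation redistributes loadings across factors without changing any item's communality.

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-6):
    """Orthogonally rotate a loading matrix toward varimax simple structure.
    Rows are items, columns are factors."""
    p, k = loadings.shape
    rotation = np.eye(k)
    total = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # Gradient of the varimax criterion (Kaiser's formulation):
        grad = loadings.T @ (
            rotated**3 - rotated @ np.diag((rotated**2).sum(axis=0)) / p
        )
        u, s, vt = np.linalg.svd(grad)
        rotation = u @ vt                 # nearest orthogonal rotation
        new_total = s.sum()
        if new_total <= total * (1.0 + tol):
            break                         # criterion no longer improving
        total = new_total
    return loadings @ rotation

# Illustrative unrotated loadings for 6 items on 2 factors (made up):
L = np.array([[0.7, 0.3], [0.6, 0.4], [0.5, 0.5],
              [0.3, 0.7], [0.2, 0.6], [0.4, 0.5]])
rotated = varimax(L)
# Rotation is orthogonal, so each item's communality (row sum of squared
# loadings) is unchanged even though individual loadings shift.
```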

Factor loading.

The maximum number of iterations was set at 25 to identify variables in each dimension. Factor loadings with absolute values ≥0.4 were considered to contribute sufficiently to the overall variability accounted for by the factor [38], [39]. Cross-loading items with values >0.3 were removed to improve consistency.
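The two item-selection rules (a salient loading of at least 0.4, and removal of items that load above 0.3 on more than one factor) can be expressed as simple boolean filters over the rotated loading matrix. The loadings below are hypothetical, chosen only to exercise each rule:

```python
import numpy as np

# Hypothetical rotated loading matrix: rows are items, columns are factors.
loadings = np.array([
    [0.62, 0.10, 0.05],   # loads cleanly on factor 1 -> retained
    [0.45, 0.35, 0.02],   # cross-loads (>0.3 on factor 2) -> removed
    [0.12, 0.58, 0.08],   # loads cleanly on factor 2 -> retained
    [0.25, 0.22, 0.18],   # no loading >= 0.4 -> not retained
])

salient = np.abs(loadings).max(axis=1) >= 0.4          # >= 0.4 loading rule
# An item "cross-loads" if more than one factor exceeds 0.3 in magnitude:
cross = (np.abs(loadings) > 0.3).sum(axis=1) > 1
retained_items = np.where(salient & ~cross)[0]         # items kept
```

Here only the first and third items survive both filters.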

Quality weights for constructing the index were obtained from the first component, as it accounted for the most variability in the items observed [33]. The constructed quality index was used to rank the facilities as excellent, good, average or poor based on cut-off scores corresponding to the 90th, 75th, 50th and 25th percentiles (0.867, 0.491, −0.219 and −0.667 respectively).
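The construction of the composite index and the percentile-based ranking can be sketched as follows. The weights and responses are simulated (the real weights are the first-component coefficients from the PCA), and the label for the 25th–50th percentile band is a placeholder, since the text names only the excellent, good, average and poor bands:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical first-component weights for 15 retained items, and simulated
# binary item responses for 28 facilities (not the study data).
weights = rng.random(15)
responses = rng.integers(0, 2, size=(28, 15)).astype(float)

scores = responses @ weights                     # weighted composite per facility
cuts = np.percentile(scores, [25, 50, 75, 90])   # cut-offs as in the text

def grade(score):
    # Bands follow the text's percentile cut-offs; "fair" for the 25th-50th
    # band is an assumed label, as the text does not name that band.
    if score >= cuts[3]:
        return "excellent"
    if score >= cuts[2]:
        return "good"
    if score >= cuts[1]:
        return "average"
    if score >= cuts[0]:
        return "fair"
    return "poor"

grades = [grade(s) for s in scores]
```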

Ethical Considerations

Ethics approval for the SYMMACS study was obtained from the Tulane University Institutional Review Board (IRB) and the local IRB, the Kenya Medical Research Institute. Academic approval was obtained from Maseno University. All study participants provided written informed consent.


Results

Principal component analysis

A total of 54 item measures and 246 responses with normal distribution were analyzed. In this stepwise approach, highly correlated items cluster around respective common factors (latent variables), such that the first few components successively account for most of the variation in the original observed set of variables and are retained to form new dimensions for the measures. The initial estimate of common variance among all 58 variables was 45, accounting for 77.6% of the total variance. Fifteen components with eigenvalues ≥1.00 accounted for 73.6% of the total variance. Based on the eigenvalue >1.00 criterion [35], three factors, with eigenvalues of 5.78, 4.29 and 2.99 respectively, were retained (Fig. 1). These factors cumulatively accounted for 29.1% of the total variance (12.9%, 9.5% and 6.7%), with final communality estimates of 13.06.

Respective factor items with values ≥0.4 are displayed list-wise in Table 1. Based on this cut-off, fifteen (15) items loading on factor 1, five (5) on factor 2 and one (1) on factor 3 were retained.

Table 1. Rotated factor loadings of factor 1 and 2 relating to VMMC service quality dimensions.

Factor 1 items relate to different aspects of VMMC service delivery quality indicators, focusing broadly on preparedness to provide VMMC in terms of physical infrastructure, guidelines and the interactive elements of circumcision service. Hence it is labeled ‘preparedness to deliver safe male circumcisions’. Items converging on factor 1 can be categorized further into: safety reliability (availability of basic life support equipment, eligibility assessment, observation of vital signs and other events post-operatively to identify potential harms, and availability of antibiotics for treatment of adverse events); appropriateness (using guidelines in performing necessary pre-, intra- and post-operative tasks); communication interaction (pre- and post-operative information-giving on HIV and circumcision); access to the minimum service package (syndromic management of STIs, individualized confidential HTC and condom distribution); and staff competence (correct surgical knot-tying technique).

Factor 2 is labeled ‘performance-safety’, being related to skill-compliance issues and safety of the surgical procedure. It comprises variables related to continuity of care (discharge care and interactive follow-up instructions); staff safety (eye wear to prevent splashes to the eyes); and acute care (oxygen as basic life support). The only item loading on factor 3 that met the retention threshold was ‘Appropriate antibiotics in stock to treat infection-related AEs’. However, since this single item also cross-loaded on factor 1, factor 3 was considered weak and was not considered further.

Facility-based quality of care ranking by the composite quality index derived from the principal factors

The weighted factor coefficients from the first component were used to rank the facilities and the cases performed (Table 2). Out of the 28 facilities, 32% (9/28) had scores between the 90th and 95th percentile, and 45% between the 50th and 75th percentiles. In four of the facilities, scores for all cases observed were poor, falling in the lower 25th percentile. Quality scores for more than half of the cases were rated as good or excellent, while almost a quarter were poor (Table 3).

Table 2. Overall facility ranking by weighted quality scores.

Table 3. Service quality ranking by cases in 2011 and 2012 and by facility in 2012.


Discussion

The factor analysis of the SYMMACS quality instrument reveals two discrete factors. The value of the extracted factors in measuring service quality is, however, contingent on the observed data and the relationships between the variables under consideration, as well as on the validity and reliability of the variables retained [40]. It should be noted that rotation of factors to simplify structure may cause loss of variance on individual dominant sources [40].

The factor loadings show the hierarchical importance of items within the factors, in terms of both component availability and task performance [34]. The observed variability in quality of VMMC service delivery is best explained by the dimension ‘preparedness to deliver safe male circumcisions’, complemented by ‘performance-safety’. Implicit in these dimensions are the technical and functional requirements necessary for accomplishing VMMC service delivery tasks correctly. The current study was undertaken from the management perspective to objectively evaluate provider performance, whereas existing studies on health service quality largely focus on the consumers' viewpoint, which is less likely to accurately reflect aspects such as provider competence [41], [42], [43], [44].

The SYMMACS quality instruments exhibit underlying concepts similar to those described in existing health service quality studies [5], [44], [45], [46], [47], [48]. The factors align well with the WHO toolkit criteria [6], [49] for assessing VMMC service quality and with the domains (effective, appropriate, safe, efficient, responsive, accessible, continuous, capable, sustainable) in other health systems quality performance frameworks [7], [50]. The principal component represents the diversity of service quality sources and reflects the multidimensionality of VMMC services, similar to other public health interventions.

The factor structure elicited demonstrates safety aspects and provider-client interactions as key quality considerations. This would potentially assist program managers to understand the scope encompassed by the VMMC quality assessment tool, recognize its importance and progressively build into the delivery system the capacities for proper service performance. The observations also indicate that staff performed poorly mainly in tasks related to spontaneous patient-staff communication, particularly engagement in the post-operative period, in contrast to the availability of equipment and supplies, which by default are provided by the program [25]. These performance tasks are inherently related to individual competency and responsiveness, which, if emphasized, would greatly improve patient safety.

As part of quality improvement plans, operational guidelines ought to subsequently clarify systemic approaches to health practice safety. Likewise, emphasizing compliance with operational guidelines will ensure desired service quality outcomes are obtained [6], [43]. Refresher staff training and support supervision are helpful in enhancing progressive learning of skills for target tasks and responsibilities, besides monitoring how well these are performed [43], [47], [51]. These capacities include communication abilities and interpersonal skills to improve information-giving [52].

Historically, service quality has been assessed via similar basic dimensions [43], [44]. The Donabedian framework, for example, proposes a ‘structure-process-outcome’ approach [47]. In this approach, the structure dimension relates to the care setting (including facility characteristics, equipment, training and special skills). The process dimension comprises aspects related to the provider-patient interaction and is considered a function of technical and interpersonal skills. The outcome dimension reflects the immediate/intermediate and long-term changes occurring to the patient's status based on services provided [3], [8], [12], [53]. Another model, developed by Parasuraman et al. (1985), has 10 dimensions: tangibles, reliability, responsiveness, competence, courtesy, credibility, assurance, access, communication, and customer understanding. Brown and Swartz (1989) applied this model to assess the quality of medical-surgical service delivery and determined that the list is reasonably applicable to health service settings. The Bruce-Jain framework, however, consists of six dimensions: needs assessment, choice of contraceptive methods, information given to users, interpersonal relations, constellation of services, and continuity mechanisms; it has been used to assess contraceptive services for adolescents in Uganda [49]. Given that the dimensions apparent in the SYMMACS service quality instruments are reasonable, a simplified version based on the principal factors can be adopted for routine quality assessment and monitoring.


Conclusions

Using exploratory factor analysis, it was possible to empirically discern the multidimensionality of VMMC service delivery by eliciting three principal factors of service quality. Whereas the quality assessment tool contains globally useful items, only twenty of these were more closely related to service quality performance outcomes in the local context. Future research should focus on defining other conceptually different item combinations for the toolkit.

The observed factor structure can be a realistic guide to quality performance improvement efforts, despite inherent potential limitations related to its structure. In this study, the performance of the majority of facilities was rated as above average based on the derived composite quality index scores, indicating the respective levels of improvement effort needed.

Study Limitations

A key limitation of the study is the lack of the client perspective. However, it is unknown how its inclusion would alter the characteristics of the derived factor constructs, given that, theoretically, this aspect as an outcome is difficult to link with the structure and process that produce it unless comprehensive, reliable information is available.


Acknowledgments

The authors wish to acknowledge the Kenya team that collected the project data: Nicholus Pule, Rosemary Owigar and Dr. Violet Naanyu. We thank Linea Perry and Margaret Farrell of Tulane University for processing the project data, and Vincent Were for technical assistance with the factor analysis and construction of the composite index score used to rank the facilities. We also thank Larissa Jennings for reviewing the initial draft of this manuscript and making useful contributions.

Author Contributions

Conceived and designed the experiments: DA JB. Performed the experiments: DA JB MO. Analyzed the data: DA. Contributed reagents/materials/analysis tools: DA MO. Wrote the paper: DA CO RO MO JB. Criteria for authorship read and met: DA JB MO CO RO. Agreed with manuscript results and conclusions: DA JB MO CO RO.


References

1. Parasuraman A, Zeithaml VA, Berry LL (1985) A conceptual model of service quality and its implications for future research. Journal of Marketing 49: 41–50.
2. Hathorn E, Land L, Ross JDC (2011) How to assess quality in your sexual health service. Sexually Transmitted Infections 87: 508–510.
3. Derose SF, Schuster MA, Fielding JE, Asch SM (2002) Public health quality measurement: concepts and challenges. Annu Rev Public Health 23: 1–21.
4. Bellows J, Michael PS (2004) Background paper: Could a quality index help us navigate the chasm? “Facilitating Use of Clinical Quality Measures through Aggregation”. April 21–22, 2004, Washington, DC: Kaiser Permanente; Institute for Health Policy.
5. Hong R, Montana L, Mishra V (2006) Family planning services quality as a determinant of use of IUD in Egypt. BMC Health Services Research 6.
6. World Health Organization (2009) Male Circumcision Services: Quality Assessment Toolkit. Geneva, Switzerland: WHO.
7. World Health Organization (2006) Quality of Care: A Process for Making Strategic Choices in Health Systems. Geneva, Switzerland: WHO Press.
8. Derose SF, Petitti DB (2003) Measuring quality of care and performance from a population health care perspective. Annual Review of Public Health 24: 363–384.
9. Rapkin B, Weiss E, Chhabra R, Ryniker L, Patel S, et al. (2008) Beyond satisfaction: using the Dynamics of Care assessment to better understand patients' experiences in care. Health and Quality of Life Outcomes 6: 20.
10. Bellows J, Sullivan MP (2004) Background paper: Could a quality index help us navigate the chasm? “Facilitating Use of Clinical Quality Measures through Aggregation”. April 21–22, 2004, Washington, DC: Kaiser Permanente; Institute for Health Policy.
11. Brown LD, Franco LM, Rafeh N, Hatzell T (n.d.) Quality assurance of health care in developing countries. Quality Assurance Methodology Refinement Series. Bethesda, MD: Quality Assurance Project, funded by the U.S. Agency for International Development, Office of Health, Bureau for Science and Technology.
12. Lazar EJ, Fleischut P, Regan BK (2013) Quality measurement in healthcare. Annual Review of Medicine 64: 485–496.
13. Kredo T, Gerritsen A, Heerden JV, Conway S, Siegfried N (2012) Clinical practice guidelines within the Southern African Development Community: a descriptive study of the quality of guideline development and concordance with best evidence for five priority diseases. Health Research Policy and Systems.
14. Mahler HR, Kileo B, Curran K, Plotkin M, Adamu T, et al. (2011) Voluntary medical male circumcision: matching demand and supply with quality and efficiency in a high-volume campaign in Iringa Region, Tanzania. PLoS Med: e1001131.
15. Mangham LJ, Hanson K (2010) Scaling up in international health: what are the key issues? Health Policy and Planning.
16. National AIDS Control Council (2009) Kenya National AIDS Strategic Plan 2009/10–2012/13: Delivering on Universal Access to Services. Nairobi: Ministry of Public Health and Sanitation.
17. Herman-Roloff A, Bailey RC, Agot K (2012) Factors associated with the safety of voluntary medical male circumcision in Nyanza Province, Kenya. Bull World Health Organ 90: 773–781.
18. MOPHS (2011) Ministry of Public Health and Sanitation. Provincial MMC Task Force report, Nyanza Province; April 2011.
19. NASCOP (2009) Kenya National Strategy for Voluntary Medical Male Circumcision. Nairobi: Ministry of Public Health and Sanitation.
20. NASCOP (2008) National Guidance for Voluntary Male Circumcision in Kenya. Nairobi: Ministry of Health, Republic of Kenya.
21. NASCOP (2008) Manual for Male Circumcision under Local Anaesthesia, Version 2.5C. Nairobi: National AIDS and STI Control Programme.
22. Campbell SM, Roland MO, Buetow SA (2000) Defining quality of care. Social Science & Medicine 51: 1611–1625.
23. Nietert P, Wessell A, Jenkins R, Feifer C, Nemeth L, et al. (2007) Using a summary measure for multiple quality indicators in primary care: the Summary QUality InDex (SQUID). Implementation Science 2: 11.
24. Bertrand JT, Rech D, Omondi Aduda D, Frade S, Loolpapit M, et al. (2014) Systematic monitoring of voluntary medical male circumcision scale-up: adoption of efficiency elements in Kenya, South Africa, Tanzania, and Zimbabwe. PLoS ONE 9: e82518.
25. Jennings L, Bertrand J, Rech D, Harvey SA, Hatzold K, et al. (2014) Quality of voluntary medical male circumcision services during scale-up: a comparative process evaluation in Kenya, South Africa, Tanzania and Zimbabwe. PLoS ONE 9: e79524.
26. Bickler SW, Spiegel D (2010) Improving surgical care in low- and middle-income countries: a pivotal role for the World Health Organization. World J Surg 34: 386–390.
27. Shaller D (2004) Implementing and using quality measures for children's health care: perspectives on the state of the practice. Pediatrics 113: 217–227.
28. English M, Esamai F, Wasunna A, Were F, Ogutu B, et al. (2004) Delivery of paediatric care at the first-referral level in Kenya. The Lancet 364: 1622–1629.
29. Ministry of Health (2005) Reversing the Trends: The Second National Health Sector Strategic Plan (NHSSP II, 2005–2010). Nairobi.
30. National AIDS and STI Control Program (NASCOP) (2009) Kenya National Strategy for Voluntary Medical Male Circumcision. Nairobi: Ministry of Public Health and Sanitation.
31. Floyd FJ, Widaman KF (1995) Factor analysis in the development and refinement of clinical assessment instruments. Psychological Assessment 7: 286–299.
32. Matsunaga M (2010) How to factor-analyze your data right: do's and don'ts. International Journal of Psychological Research 3: 97–110.
33. Vyas S, Kumaranayake L (2006) How to do (or not to do)... Constructing socio-economic status indices: how to use principal components analysis. Health Policy Unit, London School of Hygiene and Tropical Medicine; Oxford University Press.
34. Landrum M, Bronskill S, Normand S (2000) Analytic methods for constructing cross-sectional profiles of health care providers. Health Serv Outcomes Res Methodol 1: 23–47.
35. Frank JF, Keith FW (1995) Factor analysis in the development and refinement of clinical assessment instruments. Psychological Assessment 7: 286–299.
36. Zaslavsky A, Shaul J, Zaborski L, Cioffi M, Cleary P (2002) Combining health plan performance indicators into simpler composite measures. Health Care Financ Rev 23: 101–115.
37. Grimshaw J, McAuley L, Bero L, Grilli R, Oxman A, et al. (2003) Systematic reviews of the effectiveness of quality improvement strategies and programmes. Qual Saf Health Care 12: 298–303.
38. Stevens JP (1992) Applied Multivariate Statistics for the Social Sciences (2nd edition). Hillsdale, NJ: Erlbaum.
39. Costello AB, Osborne JW (2005) Best practices in exploratory factor analysis: four recommendations for getting the most from your analysis. Practical Assessment, Research & Evaluation 10.
40. Jolliffe IT (2002) Principal Component Analysis, Second Edition. Springer Series in Statistics. London: Springer.
41. Tabrizi J (2011) Quality of delivered care for people with type 2 diabetes: a new patient-centred model. Journal of Research in Health Sciences 9: 1–9.
42. Babikako H, Neuhauser D, Katamba A, Mupere E (2011) Patient satisfaction, feasibility and reliability of satisfaction questionnaire among patients with pulmonary tuberculosis in urban Uganda: a cross-sectional study. Health Research Policy and Systems 9: 6.
43. Zaneta P, Ilona B (2008) Different perspectives on health care quality: is the consensus possible? Engineering Economics 1: 104–110.
44. Parasuraman A, Zeithaml VA, Berry LL (1985) A conceptual model of service quality and its implications for future research. Journal of Marketing 49: 41–50.
45. Casey S, Mitchell K, Amisi I, Haliza M, Aveledi B, et al. (2009) Use of facility assessment data to improve reproductive health service delivery in the Democratic Republic of the Congo. Conflict and Health 3: 12.
46. Shahidzadeh-Mahani A, Omidvari S, Baradaran H-R, Azin S-A (2008) Factors affecting quality of care in family planning clinics: a study from Iran. International Journal for Quality in Health Care 20: 284–290.
47. Donabedian A (1988) The quality of care: how can it be assessed? JAMA 260: 1743–1748.
48. Sower V, Duffy J, Kilbourne W, Kohers G, Jones P (2001) The dimensions of service quality for hospitals: development and use of the KQCAH. Health Care Management Review 26: 47–59.
49. Nalwadda G, Tumwesigye NM, Faxelid E, Byamugisha J, Mirembe F (2011) Quality of care in contraceptive services provided to young people in two Ugandan districts: a simulated client study. PLoS ONE 6.
50. Landgren F, Murray J (2008) A guide to using data for health care quality improvement. Melbourne: Victorian Quality Council Secretariat, Victorian Government Department of Human Services.
51. van Duong D, Binns CW, Lee AH, Hipgrave DB (2004) Measuring client-perceived quality of maternity services in rural Vietnam. International Journal for Quality in Health Care 16: 447–452.
52. Sofaer S, Firminger K (2005) Patient perceptions of the quality of health services. Annual Review of Public Health 26: 513–559.
53. Mangione-Smith R, DeCristofaro AH, Setodji CM, Keesey J, Klein DJ, et al. (2007) The quality of ambulatory care delivered to children in the United States. New England Journal of Medicine 357: 1515–1523.