
Implementation of Departmental Quality Strategies Is Positively Associated with Clinical Practice: Results of a Multicenter Study in 73 Hospitals in 7 European Countries

  • Rosa Sunol (rsunol@fadq.org). Affiliations: Avedis Donabedian Research Institute (FAD), Universitat Autonoma de Barcelona, Barcelona, Spain; Red de investigación en servicios de salud en enfermedades crónicas REDISSEC, Barcelona, Spain

  • Cordula Wagner. Affiliations: NIVEL, Netherlands Institute for Health Services Research, Utrecht, The Netherlands; Department of Public and Occupational Health, EMGO Institute for Health and Care Research, VU University Medical Center, Amsterdam, The Netherlands

  • Onyebuchi A. Arah. Affiliation: Department of Epidemiology, Fielding School of Public Health, University of California Los Angeles (UCLA), Los Angeles, California, United States of America

  • Solvejg Kristensen. Affiliations: Danish Clinical Registries, Aarhus, Denmark; Department of Health Science and Technology, Aalborg University, Aalborg, Denmark

  • Holger Pfaff. Affiliations: Institute for Medical Sociology, Health Services Research and Rehabilitation Science, University of Cologne, Cologne, Germany; Center for Health Services Research Cologne, University of Cologne, Cologne, Germany

  • Niek Klazinga. Affiliation: Department of Public Health, Academic Medical Center, University of Amsterdam, Amsterdam, The Netherlands

  • Caroline A. Thompson. Affiliation: Palo Alto Medical Foundation Research Institute (PAMFRI), Palo Alto, California, United States of America

  • Aolin Wang. Affiliation: Department of Epidemiology, Fielding School of Public Health, University of California Los Angeles (UCLA), Los Angeles, California, United States of America

  • Maral DerSarkissian. Affiliation: Department of Epidemiology, Fielding School of Public Health, University of California Los Angeles (UCLA), Los Angeles, California, United States of America

  • Paul Bartels. Affiliations: Danish Clinical Registries, Aarhus, Denmark; Department of Health Science and Technology, Aalborg University, Aalborg, Denmark

  • Philippe Michel. Affiliation: Hospices Civils de Lyon, Université Claude Bernard Lyon 1, Lyon, France

  • Oliver Groene. Affiliation: Department of Health Services Research and Policy, London School of Hygiene and Tropical Medicine, London, United Kingdom

  • DUQuE Project Consortium. Membership of the DUQuE Project Consortium is provided in the Acknowledgments.

All listed authors contributed equally to this work.

Abstract

Background

Despite the considerable time and resources invested in implementing quality programs in hospitals, few studies have investigated their clinical impact or the strategies that could be recommended to enhance their effectiveness.

Objective

To assess variations in clinical practice and explore associations with hospital- and department-level quality management systems.

Design

Multicenter, multilevel cross-sectional study.

Setting and Participants

Seventy-three acute care hospitals with 276 departments managing acute myocardial infarction, deliveries, hip fracture, and stroke in seven countries.

Intervention

None.

Measures

Predictor variables included 3 hospital- and 4 department-level quality measures. Six measures were collected through direct observation by an external surveyor and one was assessed through a questionnaire completed by hospital quality managers. Dependent variables included 24 clinical practice indicators based on case note reviews covering the 4 conditions (acute myocardial infarction, deliveries, hip fracture and stroke). A directed acyclic graph was used to encode relationships between predictors, outcomes, and covariates and to guide the choice of covariates to control for confounding.

Results and Limitations

Data were provided on 9021 clinical records by 276 departments in 73 hospitals. There were substantial variations in compliance with the 24 clinical practice indicators. Weak associations were observed between hospital quality systems and 4 of the 24 indicators, but on analyzing department-level quality systems, strong associations were observed for 8 of the 11 indicators for acute myocardial infarction and stroke. Clinical indicators supported by higher levels of evidence were more frequently associated with quality systems and activities.

Conclusions

There are significant gaps between recommended standards of care and clinical practice in a large sample of hospitals. Implementation of department-level quality strategies was significantly associated with good clinical practice. Further research should aim to develop clinically relevant quality standards for hospital departments, which appear to be more effective than generic hospital-wide quality systems.

Introduction

Substantial research has been undertaken on the assessment and improvement of the quality of health care delivery in the past 30 years. While considerable progress has been made, quality and safety problems persist and the debate on how to accelerate and sustain quality management is more relevant than ever [1–4]. In response to this debate, a new research line related to the effectiveness of quality management has emerged in the last 10 to 15 years, with a focus on questions such as "Does quality management lead to better quality of care?", "Which quality tools are most effective?", and "What factors are associated with the effective implementation of quality management systems?" [5–11]. These questions are highly relevant for hospital managers and clinicians aiming to implement quality management systems targeting organization-specific quality and safety issues.

Quality management systems can be understood as sets of interacting activities, methods, and procedures used to direct, monitor, control, and improve quality of care [12]. They usually exist at the highest levels of the organization and are operationalized into specific quality improvement activities within smaller organizational units. It is assumed that quality management systems are a prerequisite for the systematic application and sustainability of quality improvement activities through smaller units, which are needed to reduce undesired variations in clinical practice and to improve the effectiveness, safety, and patient-centeredness of care [13].

While variations in clinical practice are well documented, there is little evidence on factors associated with the uptake of quality management activities by hospitals or on their impact on clinical practice. A recent systematic review identified a number of studies exploring factors positively associated with the implementation of quality management systems [14]. Previous reports have identified positive associations between engagement in quality improvement activities and a range of perceived outcomes (e.g., patient satisfaction, productivity, and physician-nurse relations) and quality improvement outputs (e.g., use of ID bracelets) [6, 13, 15]. However, none of these studies have investigated the impact of quality management systems on clinical practice. Our study seeks to address this gap.

The study was conducted as part of the “Deepening our understanding of quality improvement in Europe (DUQuE)” project, which was funded by the European Union’s Seventh Research Framework Program. The overall aim of the project was to study relationships between organizational quality improvement systems and organizational culture, professional involvement, and patient empowerment at hospital and department levels, and to analyze the quality of health care delivery in terms of clinical effectiveness, patient safety, and patient experiences. In previous work, we explored the effects of hospital-level quality management systems on quality activities at the department level, and found mixed results, with positive but weak associations [16]. Our specific objectives in the present study were to assess performance based on clinical practice indicators and to explore their association with the implementation of hospital and departmental quality management systems in 4 clinical settings in a large sample of European hospitals.

Methods

Study Design, Setting and Population

In this multicenter cross-sectional study, data were collected at the hospital, department, and patient levels through questionnaires administered to managers and health professionals, retrospective case note reviews, and direct observation. Hospitals with more than 130 beds were randomly selected in each of the participating countries: Czech Republic, France, Germany, Poland, Portugal, Spain, and Turkey. The inclusion criteria targeted general acute care hospitals with a minimum size of 130 beds and a volume of care sufficient to ensure recruitment of 30 patients per condition over a 4-month period (a sampling frame of at least 90 patients). Specialty hospitals, hospital units within larger hospitals, and hospitals not providing care for the four clinical conditions of study were excluded. In each hospital, detailed information was collected on 4 conditions: acute myocardial infarction (AMI), vaginal deliveries, hip fractures, and stroke. For each condition, we reviewed 35 consecutive cases that fulfilled the study's inclusion and exclusion criteria. Data for the clinical practice indicators were retrieved through retrospective case note review using a standardized data collection sheet and a manual translated into the local language. The review of case notes was organized locally according to protocol and performed by trained hospital staff with clinical background knowledge and experience with local clinical and documentation practice who were not involved in the practice being assessed. Department-level data were retrieved from the department seeing the most patients with the studied condition and from the emergency area (if applicable). All data were collected between May 2011 and February 2012. The data collection strategy was informed by a sample size calculation that took into account the multilevel structure of the study [17]. The broader theoretical conceptual framework, detailed objectives and methods, and different constructs assessed within the DUQuE project are described in detail elsewhere [18].

Outcomes, Predictors and Covariates

The outcome variables consisted of a set of 19 single clinical practice indicators selected based on a review of the literature and a systematic rating procedure by individual experts and relevant European scientific societies to assess their relevance, applicability, and feasibility. In addition, 5 clinical practice summary measures were constructed (see Textbox 1). Each clinical indicator was assigned a level of evidence rating (A, B, C, etc.) based on the scientific evidence quoted in the pertinent clinical practice guidelines [18].

Textbox 1. Clinical practice summary indicators and predictor variables (measuring the implementation of quality management systems at the hospital and department levels)

Clinical practice summary indicators*

  • Acute myocardial infarction (AMI): a) Was reperfusion therapy given on time (fibrinolytic agent administered within 75 minutes of hospital arrival or primary percutaneous coronary intervention within 90 minutes)? b) Were all appropriate medications (including anti-platelet drugs, beta-blockers, statins, and ACE inhibitors) prescribed (or contraindicated) at discharge?
  • Routine vaginal deliveries: Were there any birth-related complications (blood transfusion, acute C-section (planned vaginal delivery that finished in a C-section), instrumentation needed during vaginal delivery, third- or fourth-degree laceration, or newborn Apgar score <7 at 5 minutes)?
  • Hip fractures: What percentage of recommended care was delivered? (This was calculated from the following measures: prophylactic antibiotic within 60 minutes prior to surgical incision, prophylactic thrombolytic treatment received on the same day as admission, patient mobilized within 24 hours after surgery, and in-hospital surgical waiting time <48 hours; a computation sketch follows immediately after this textbox.)
  • Stroke: Did stroke patients receive appropriate care? (Appropriate care included treatment with an antiplatelet agent within 48 hours, performance of a CT or MRI within 24 hours, and mobilization of the patient within 48 hours.)

*Clinical practice summary indicators had 3 possible values: “Yes”, “No”, or “Not applicable”.

Predictors at the hospital level**

  1. Quality Management Systems Index (QMSI). The QMSI is an overall measure of the extent of implementation of quality management systems and has 9 dimensions: 1) quality policy documents, 2) quality monitoring by hospital board, 3) training of professionals, 4) formal protocols for infection control, 5) formal protocols for medication and patient handling, 6) analysis of performance of care processes, 7) analysis of performance of professionals, 8) analysis of patient feedback, and 9) evaluation of results.
  2. Quality Management Compliance Index (QMCI). The QMCI is a measure of how the hospital management oversees hospital quality program initiatives and has 4 dimensions: 1) quality planning, 2) monitoring of patient/professional opinions, 3) monitoring of quality systems, and 4) improvement of quality through staff development activities.
  3. Clinical Quality Implementation Index (CQII). The CQII measures the implementation of quality efforts throughout the hospital and analyzes continuous improvement in clinical areas with 7 dimensions: 1) prevention of hospital infections, 2) medication management, 3) prevention of patient falls, 4) prevention of patient ulcers, 5) routine testing of elective surgery patients, 6) safe surgical practices, and 7) prevention of patient deterioration.

**All quality management measures are rated from 0 (lowest possible score) to 10 (highest possible score). Methods for the construction of all measures, including results from the factor analysis and the rationale for the dimensions selected, are provided in detail elsewhere [19–21].

Predictors at the department level***

  • Specialized expertise and responsibility (SER). This assesses professional expertise and the allocation of clinical responsibilities [21].
  • Evidence-based organization of pathways (EBOP). This assesses the extent to which departmental organizational processes incorporate evidence-based care recommendations (as expressed in NICE quality standards and SIGN audit tools) [22–26].
  • Patient Safety Strategies (PSS). These identify the application of strategies for ensuring patient safety recommended by international agencies, including patient identification, hand hygiene, medication management, resuscitation processes, and adverse event declaration [21, 26].
  • Clinical Review (CR). This assesses the use of clinical audit and systematic monitoring as part of department-level quality management activities [21].

***All department quality measures were rated on a 5-point Likert scale and were common to all conditions, except EBOP, which was based on specific evidence findings for each condition.
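
The hip fracture summary measure above lends itself to a simple computation. The following is a minimal sketch in R (the environment the paper reports using for its plots); the data frame, column names, and values are hypothetical, with NA marking a measure that does not apply to a given patient:

```r
# Hypothetical records: 1 = recommended care item delivered, 0 = not delivered,
# NA = not applicable for that patient.
hip <- data.frame(
  antibiotic_within_60min = c(1, 0, 1),
  thromboprophylaxis_day0 = c(1, 1, NA),
  mobilized_within_24h    = c(0, 1, 1),
  surgery_within_48h      = c(1, 1, 0)
)

# Percentage of applicable recommended-care measures each patient received.
hip$pct_recommended <- 100 * rowMeans(hip, na.rm = TRUE)
hip$pct_recommended
#> [1] 75.00000 75.00000 66.66667
```

Note that averaging over only the applicable measures (na.rm = TRUE) keeps patients comparable when one of the four items does not apply.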

Predictor variables measuring characteristics of quality management system implementation included 3 quality index scores at the hospital level and 4 quality measures at the department level, covering complementary aspects of quality management systems. These measures were built on previously validated tools, and their detailed (psychometric) properties are described elsewhere [19–21] (see Textbox 1). One of the measures (the Quality Management Systems Index) was derived from a questionnaire administered to hospital quality managers. The other 6 measures were collected through direct observation by an external trained surveyor.

Additional variables collected and included in multivariate analyses included the country in which the hospital was based, hospital teaching status (teaching versus non-teaching), hospital size (<200, 200–500, 501–1000, or >1000 beds), hospital ownership (public versus private), and the gender and age of each patient included in the chart review population.

Hypothesis and analytical strategy

We hypothesized that the implementation of hospital-level quality management systems and department-level quality activities positively predicted clinical practice for all 4 conditions analyzed (AMI, vaginal deliveries, hip fractures, and stroke). We first described the hospitals in the sample, the mean scores for hospital- and department-level quality measures, the characteristics of the patients in the chart review population, and compliance with clinical indicators for each pathway analyzed. For categorical variables, we calculated frequencies and percentages. For continuous variables, we calculated the mean and standard deviation. We provide details on missing data in the descriptive tables, but multivariable analyses excluded patients with incomplete records for any of the variables (exposures, confounders, and clinical practice indicators).

A directed acyclic graph (DAG) was used to depict our knowledge and assumptions about the (plausible) relations between predictors: hospital quality management measures, quality activities at the department level, and clinical practice indicators. Variable selection for the statistical models in this paper was guided by the DAG shown in Fig 1.

Fig 1. Directed acyclic graph (DAG) guiding the analysis and showing hypothesized relationships between predictors, outcome, and covariates in this study.

Unidirectional arrows show an effect and bidirectional dashed arrows show a correlation. Black unidirectional arrows show the relationships tested and quantified in this article, whereas the gray arrows show relationships between other variables that guided the choice of confounding variables to control for in the analyses.

https://doi.org/10.1371/journal.pone.0141157.g001

The edges in this graph encode relationships between predictors, outcomes, and covariates, and are governed by rules that affect the choice of covariates to control for confounding [27–29]. For example, because we assumed that QMSI predicted QMCI, when estimating the relationship between QMCI and a patient-level measure we additionally controlled for QMSI, as it would be a confounding variable for the relation of interest.
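
To illustrate this covariate-selection logic, here is a minimal sketch in R using the dagitty package (an assumption on our part; the paper does not name the software used to build its DAG). The graph below is a toy fragment in the spirit of Fig 1, not the study's full DAG, and all node names are illustrative:

```r
# Toy fragment of a DAG in the spirit of Fig 1, encoded with dagitty.
library(dagitty)

g <- dagitty("dag {
  Country  -> QMSI
  Country  -> QMCI
  Country  -> ClinicalPractice
  Hospital -> QMSI
  Hospital -> QMCI
  Hospital -> ClinicalPractice
  QMSI     -> QMCI
  QMSI     -> ClinicalPractice
  QMCI     -> ClinicalPractice
}")

# Minimal adjustment set(s) for the QMCI -> clinical practice effect;
# QMSI appears because it confounds this relation, as argued above, and
# country- and hospital-level characteristics enter for the same reason.
adjustmentSets(g, exposure = "QMCI", outcome = "ClinicalPractice")
#> { Country, Hospital, QMSI }
```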

We estimated multivariate binomial logistic models with a random intercept by hospital to account for clustering of patients within hospitals for all clinical indicators except the percentage of recommended care delivered in the hip fracture pathway, for which we used a multivariate ordinal logistic mixed model (also with a random intercept by hospital). Our models were adjusted for potential confounders, modeled as fixed effects at the country level (country), hospital level (number of beds, teaching status, ownership), and patient level (age, gender, education level), in addition to quality measures as dictated by the DAG in Fig 1. Results of associations are presented in clustered forest plots for each type of hospital- or department-level quality management measure (exposure variables). All statistical analyses were carried out in SAS (version 9.3, SAS Institute Inc., Cary, NC, 2012) and the forest plots were created using R [30].
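
For readers who want to reproduce this modeling approach, the following is a minimal sketch in R (the published analyses were run in SAS, so this is not the authors' code); the data frames and all column names are assumptions:

```r
# Sketch of the random-intercept models described above; 'duque',
# 'duque_hip', and all column names are hypothetical.
library(lme4)     # binomial logistic mixed models (glmer)
library(ordinal)  # ordinal logistic mixed models (clmm)

# Binary clinical indicator (e.g., timely reperfusion in AMI), with a
# random intercept by hospital and DAG-dictated covariates as fixed effects.
fit_bin <- glmer(
  indicator_met ~ QMCI + QMSI + country + beds + teaching + ownership +
    age + gender + education + (1 | hospital_id),
  data = duque, family = binomial
)

# Ordinal outcome: banded percentage of recommended hip fracture care;
# 'care_band' must be an ordered factor.
fit_ord <- clmm(
  care_band ~ QMCI + QMSI + country + beds + teaching + ownership +
    age + gender + education + (1 | hospital_id),
  data = duque_hip
)

# Odds ratios and Wald 95% CIs for the fixed effects, of the kind shown
# in the clustered forest plots.
exp(cbind(OR = fixef(fit_bin),
          confint(fit_bin, parm = "beta_", method = "Wald")))
```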

DUQuE fulfilled the ethical requirements for research projects as described in the EU's Seventh Framework Programme (DG Research). Ethics approval was granted through the Bioethical Committee of the Department of Health of the Government of Catalonia, Spain, on 12 February 2010, in writing by Dr Josep M Busquets, responsible for bioethics at the Department of Health. Data collection in each country complied with confidentiality requirements according to national legislation or standards of practice of that country. Patient records and other information were anonymized and de-identified prior to analysis.

Results

Overall, 276 departments from 73 hospitals in 7 countries (87% of expected) provided valid data. The hospital and department characteristics are shown in Table 1. Most hospitals were public (n = 58, 79.4%) and almost half had a teaching function (n = 33, 45.2%).

Table 1. Characteristics of Hospitals That Participated in the Study.

https://doi.org/10.1371/journal.pone.0141157.t001

A total of 9021 clinical records (77% of expected) were analyzed. Their characteristics are shown in Table 2.

Table 2. Characteristics of Patients in the Chart Review.

https://doi.org/10.1371/journal.pone.0141157.t002

Table 3 shows descriptive statistics for the predictor variables, which reveal that departments dealing with hip fractures had the lowest implementation scores for quality management activities, with the exception of patient safety strategies. The greatest variation in quality between departments was observed for evidence-based organization of pathways and clinical review.

Table 3. Descriptive Statistics for Hospital- and Department-Level Quality Management Measures.

https://doi.org/10.1371/journal.pone.0141157.t003

Descriptive results for the clinical practice indicators are shown in Table 4. Mean compliance for single indicators was 79.8%, but rates varied considerably, for example from 42.7% for mobilization of hip fracture patients within 24 hours of surgery to 97.7% for prescription of anti-platelets at discharge in AMI. Although some level A evidence-based practice indicators seem to be fully implemented (e.g., anti-platelet prescription at discharge for AMI), performance was generally low in other areas, such as admission to a specialized stroke unit (or specific area) within 24 hours of arrival for stroke patients (compliance of 50.5%) or appropriate use of antibiotic or thrombotic prophylaxis in hip fracture patients (70.3% and 69.8%, respectively). Analysis of compliance with the 5 clinical indicator summary measures showed compliance with two measures in approximately two-thirds of patients: appropriate prescription of all medication at discharge for AMI and complication-free births. Compliance was over 50% for the appropriate management of stroke and under 50% for 2 measures: administration of appropriate, timely reperfusion therapy for AMI patients (40%) and delivery of 75% of recommended care for hip fracture patients (30.7%). Table 4 also reports the range of country-specific values for the percentage of positive achievement of each clinical indicator.

Table 4. Descriptive Statistics for Clinical Practice Indicators (*).

https://doi.org/10.1371/journal.pone.0141157.t004

At the hospital level, positive associations between quality management systems and clinical indicators were found for only 4 of the 24 indicators analyzed: reperfusion therapy given in AMI (OR, 1.20; 95% CI 1.03 to 1.41), delivery of 75% of recommended care to hip fracture patients (OR, 1.17; 95% CI 1.03 to 1.33), admission to a specialized stroke unit (OR, 1.45; 95% CI 1.04 to 2.03), and administration of aspirin or another antiplatelet drug within 48 hours of arrival at hospital (OR, 1.14; 95% CI 1.02 to 1.28). For this last indicator, a weak negative association was also found with the QMSI (OR, 0.94; 95% CI 0.88 to 1.00).

At the department level (Fig 2), we found substantially more significant associations between department-level quality activities and the clinical indicators analyzed, with a positive association observed for 50% of the indicators (12/24).

Fig 2. Clustered forest plot showing associations (OR and 95% CI) between department quality and clinical practice indicators, with level of evidence shown in brackets.

Abbreviations: (1) SER = Specialized expertise and responsibility; (2) EBOP = Evidence-based organization of pathways; (3) PSS = Patient Safety Strategies; (4) CR = Clinical Review.

https://doi.org/10.1371/journal.pone.0141157.g002

We also observed positive associations for a majority of the single indicators (5/6 indicators in AMI and 3/5 in stroke). Negative associations were observed only for one quality measure for 1 of 4 indicators in both deliveries and hip fracture. The percentage of positive associations between quality management systems and clinical indicators was substantially greater for indicators with level A evidence (7/10). The full table of associations between hospital-level quality measures and clinical practice indicators is provided in Table A in S1 File, and the full table of associations between department-level quality measures (SER, PSS, and CR) and clinical practice indicators in the 4 departments is provided in Table B in S2 File, both in the Supporting Information files.

Discussion

Our study is the largest of its kind in Europe to examine the impact of quality management on clinical practice. The results demonstrate wide variations in the implementation of hospital- and department-level quality management systems and in performance assessed by clinical practice indicators. Mean compliance with clinical practice indicators was 76.3%, demonstrating substantial room for improvement in EU hospitals. Previous studies have found similar variations in clinical practice based on the analysis of care received by both the general population [31, 32] and hospitalized patients in different countries [33]. Our results for AMI were similar to average accomplishment rates reported by the GRACE study, a cohort study of outcomes for 9557 hospitalized patients with an acute coronary syndrome for the 2001–2007 period, but lower than the rates reported for 2007 [34]. This discrepancy could be linked to methodological differences, since the 2007 study was based on self-reporting by hospitals. Our study adds to the body of knowledge on variations in clinical practice by shedding light on associations between the implementation of quality management strategies and performance in clinical practice.

We found limited and weak associations between hospital-level quality management implementation and clinical indicators for all 4 conditions analyzed, but strong and clinically relevant associations between department-level quality management and clinical indicators for AMI and stroke. The observed effects remained strong and robust for both conditions after formal sensitivity analysis for uncontrolled confounding, selection bias, and measurement error using modern bias modeling methods [35–38] (results available from the authors upon request). The observed higher effectiveness of department quality activities could be linked to their proximity to the patient-provider interaction. Quality activities carried out close to the clinical decision level appear to have a greater impact on the implementation of evidence-based processes, leading to better clinical practice.

Our findings raise questions about the effectiveness of current quality efforts widely implemented in hospitals in high-income countries. Hospital programs should provide both methodological guidelines and support to departments to guide the design of quality programs, while continuing to focus on overall hospital responsibilities, such as policies, overall monitoring, and infection control.

Despite the recommendations of Shortell et al. [39] in a study published over 15 years ago, hospital quality research is still largely focused on the impact of management at the hospital level. In a systematic review by Hearld et al. in 2008 [40], over two-thirds of the studies analyzed focused on hospital-level relationships, and in a recent literature review by our team [14], we found only 1 article that focused on quality management systems at the department level [41]. This predominant focus on hospital-level management systems is at odds with the increasing evidence that microsystem management has an important impact on clinical behavior and outcomes [42–44], with bottom-up organizational and clinical interventions applied in the front line appearing to lead to better results.

The finding that department-level quality measures seem to be more strongly associated with clinical indicators in the 2 medical pathways than in the surgical pathways could be linked to the type of care provided in these departments and/or to issues related to professional culture and practice, supporting the view that hospitals are loosely coupled systems whose components have limited or no knowledge of each other's practices, resulting in little mutual influence and weak relationships [45, 46]. This loosely coupled structure could be due to the tendency of hospitals to differentiate into subsystems with their own "laws". Top-down governance in hospitals seems to be increasingly losing force as professional autonomy is reinforced by the use of specific clinical practice guidelines that govern the organization of labor at the departmental level.

The above differences, however, could also be linked to the level of evidence of the indicators chosen and the adoption of evidence-based recommendations by departments [47–49]. In our study, positive associations were found between quality management activities at the department level and almost all indicators supported by level A evidence (7/10), which mostly applied to medical conditions. Furthermore, more and better-quality evidence is currently available for AMI and stroke than for vaginal deliveries and hip fracture surgery.

The findings of our study could also have an important impact on the organization of clinical departments. To evaluate the quality of activities at the department level, we chose 3 measures that were common to all departments (allocation of clinical responsibilities, implementation of patient safety strategies, and clinical review activities) and 1 specific measure (evidence-based organization of pathways), which, while distinct for each condition, had the same structure in terms of patient flow (admission, acute care, rehabilitation [if appropriate], and discharge). These 4 department-level quality measures seem to be associated with clinical practice indicators in AMI and stroke. Further studies are needed to investigate whether implementation of the above quality activities could be used to develop best practice recommendations applicable across hospital departments. If this were possible, a quality model for clinical departments combining department responsibilities, evidence-based organization, patient safety strategies, and quality review (audit and feedback) could be drawn up to provide guidance for clinical leaders when designing the organization of their departments.

Limitations of the Study

This study has a number of limitations that need to be highlighted. First, we cannot draw conclusions on causality due to the cross-sectional nature of the study. Second, the medical records analyzed may have been more complete and comprehensive in some countries than in others [18]. Third, countries differed in regulatory requirements, national quality standards, public reporting policies, and other characteristics of their national health systems. We addressed this possible shortcoming by using a DAG to control for confounding in the development of our statistical models, incorporating theory and knowledge derived from previous research. We were therefore able to adjust for different country and hospital characteristics in ways that allowed us to address competing explanations and plausible (non)causal associations, while minimizing sources of bias. We also used random-intercept mixed modeling with fixed effects to account for contextual features shared by hospitals within countries and for effects due to between-country differences. Further research and detailed measurements will be needed to tease out which specific characteristics might explain and modify between-country variations; other contextual factors from the external and internal environments will also need to be studied in the future. Another limitation is related to the hospital sampling strategy employed. Although sampling was random, generalization to participating countries and hospitals is limited because of possible self-selection by the hospitals that participated in the project. This is reflected in the different acceptance rates seen across the 7 countries analyzed. The reasons given for not participating in the study by hospitals in the countries with the lowest participation rates were related mainly to research fatigue, burnout with regard to quality management issues, time constraints, and competing interests with regard to efficiency and productivity targets. A final limitation is the large number of hypothesis tests, which were not corrected for in our statistical analyses because of sample size constraints; the statistically significant relationships identified should therefore be interpreted with caution.
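
For context, a multiplicity correction, had sample sizes permitted one, could take a form like the following minimal R sketch (the study did not perform this step, and the p-values below are simulated placeholders, not study results):

```r
# Illustrative only: Benjamini-Hochberg false discovery rate adjustment over
# 24 hypothetical indicator-level p-values (simulated; not data from this study).
set.seed(1)
p_raw <- runif(24, min = 0, max = 0.2)   # placeholder p-values, one per indicator
p_bh  <- p.adjust(p_raw, method = "BH")  # FDR-adjusted p-values
sum(p_bh < 0.05)                         # associations surviving correction
```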

Conclusions

This study demonstrates significant gaps between evidence and practice for 4 common clinical conditions in a large sample of European hospitals. Our findings suggest that the implementation of department-level quality strategies is significantly associated with good clinical practice. Further research should aim to develop clinically relevant quality standards for hospital departments, as these appear to be more effective than the current widespread investment in generic, hospital-wide quality management systems.

Acknowledgments

We would like to thank Nuria Mora for her help in calculating and drawing Fig 2 and Anne Murray for her revision of the language in the final English version of the text. (*) The DUQuE Project Consortium comprises N. Klazinga, D. S. Kringos, M. J. M. H. Lombarts, and T. Plochg (Academic Medical Centre—AMC, University of Amsterdam, The Netherlands); M. A. Lopez, M. Secanell, R. Sunol, and P. Vallejo (Avedis Donabedian University Institute-Universitat Autónoma de Barcelona FAD, Red de investigación en servicios de salud en enfermedades crónicas REDISSEC, Spain); P. D. Bartels and S. Kristensen (Central Denmark Region and Center for Healthcare Improvements, Aalborg University, Denmark); P. Michel and F. Saillour-Glenisson (Comité de la Coordination de l'Evaluation Clinique et de la Qualité en Aquitaine, France); F. V. (Czech Accreditation Committee, Czech Republic); M. Car, S. Jones, and E. Klaus (Dr Foster Intelligence—DFI, UK); S. Bottaro and P. Garel (European Hospital and Healthcare Federation-HOPE, Belgium); M. Saluvan (Hacettepe University, Turkey); C. Bruneau and A. Depaigne-Loth (Haute Autorité de la Santé-HAS, France); C. D. Shaw (University of New South Wales, Australia); A. Hammer, O. Ommen, and H. Pfaff (Institute of Medical Sociology, Health Services Research and Rehabilitation Science, University of Cologne-IMVR, Germany); O. Groene (London School of Hygiene and Tropical Medicine, UK); D. Botje and C. Wagner (The Netherlands Institute for Health Services Research—NIVEL, The Netherlands); H. Kutaj-Wasikowska and B. Kutryba (Polish Society for Quality Promotion in Health Care-TPJ, Poland); A. Escoval and A. Lívio (Portuguese Association for Hospital Development-APDH, Portugal); M. Eiras, M. Franca, and I. Leite (Portuguese Society for Quality in Health Care-SPQS, Portugal); F. Almeman, H. Kus, and K. Ozturk (Turkish Society for Quality Improvement in Healthcare-SKID, Turkey); R. Mannion (University of Birmingham, UK); O. A. Arah, M. DerSarkissian, C. A. Thompson, and A. Wang (University of California, Los Angeles—UCLA, USA); and A. Thompson (University of Edinburgh, UK).

Author Contributions

Conceived and designed the experiments: RS CW OAA SK HP NK CAT AW MD PB PM OG. Analyzed the data: RS CW OAA SK HP NK CAT AW MD PB PM OG. Contributed reagents/materials/analysis tools: RS CW OAA SK HP NK CAT AW MD PB PM OG. Wrote the paper: RS CW OAA SK HP NK CAT AW MD PB PM OG.

References

  1. Donabedian A. Evaluating the quality of medical care. 1966. Milbank Q 2005;83:691–729. pmid:16279964
  2. Berwick DM, James B, Coye MJ. Connections between quality measurement and improvement. Med Care 2003;41:I30–8. pmid:12544814
  3. Fisher ES, Wennberg DE, Stukel TA, Gottlieb DJ, Lucas FL, Pinder EL. The implications of regional variations in Medicare spending. Part 1: the content, quality, and accessibility of care. Ann Intern Med 2003;138:273–287. pmid:12585825
  4. Fisher ES, Wennberg DE, Stukel TA, Gottlieb DJ, Lucas FL, Pinder EL. The implications of regional variations in Medicare spending. Part 2: health outcomes and satisfaction with care. Ann Intern Med 2003;138:288–298. pmid:12585826
  5. Greenfield D, Braithwaite J. Health sector accreditation research: a systematic review. Int J Qual Health Care 2008;20:172–184. pmid:18339666
  6. Cohen AB, Restuccia JD, Shwartz M, Drake JE, Kang R, Kralovec P, et al. A survey of hospital quality improvement activities. Med Care Res Rev 2008;65:571–95. pmid:18511811
  7. Wagner C, De Bakker DH, Groenewegen PP. A measuring instrument for evaluation of quality systems. Int J Qual Health Care 1999;11:119–30. pmid:10442842
  8. Weiner BJ, Alexander JA, Shortell SM, Baker LC, Becker M, Geppert JJ. Quality improvement implementation and hospital performance on quality indicators. Health Serv Res 2006;41:307–34. pmid:16584451
  9. Weiner BJ, Alexander JA, Baker LC, Shortell SM, Becker M. Quality improvement implementation and hospital performance on patient safety indicators. Med Care Res Rev 2006;63:29–57. pmid:16686072
  10. Alexander JA, Hearld LR. The science of quality improvement implementation: developing capacity to make a difference. Med Care 2011;49:S6–20. pmid:20829724
  11. Makai P, Klazinga N, Wagner C, Boncz I, Gulacsi L. Quality management and patient safety: survey results from 102 Hungarian hospitals. Health Policy 2008;90:175–80.
  12. Dückers M, Makai P, Vos L, Groenewegen P, Wagner C. Longitudinal analysis on the development of hospital quality management systems in the Netherlands. Int J Qual Health Care 2009;21:330–40. pmid:19689988
  13. Sunol R, Vallejo P, Thompson A, Lombarts MJMH, Shaw CD, Klazinga N. Impact of quality strategies on hospital outputs. Qual Saf Health Care 2009;18:62–8.
  14. Groene O, Botje D, Sunol R, Lopez MA, Wagner C. A systematic review of instruments that assess the implementation of hospital quality management systems. Int J Qual Health Care 2013;25(5):525–41. pmid:23970437
  15. Macinati MS. The relationship between quality management systems and organizational performance in the Italian National Health Service. Health Policy 2008;85:228–41. pmid:17825941
  16. Wagner C, Groene O, Thompson CA, DerSarkissian M, Klazinga NS, Arah OA, et al. DUQuE quality management measures: associations between quality management at hospital and pathway levels. Int J Qual Health Care 2014;26(suppl 1):66–73. pmid:24615597
  17. Groene O, Klazinga NS, Wagner C, Arah OA, Thompson A, Bruneau C, et al. Investigating organizational quality improvement systems, patient empowerment, organizational culture, professional involvement and the quality of care in European hospitals: the "Deepening our Understanding of Quality Improvement in Europe (DUQuE)" project. BMC Health Serv Res 2010;10(1):281.
  18. Secanell M, Groene O, Arah OA, Lopez MA, Kutryba B, Pfaff H, et al. Deepening our understanding of quality improvement in Europe (DUQuE): overview of a study of hospital quality management in seven countries. Int J Qual Health Care 2014;26(suppl 1):5–15. pmid:24671120
  19. Wagner C, Groene O, Thompson CA, Klazinga NS, DerSarkissian M, Arah OA, et al. Development and validation of an index to assess hospital quality management systems. Int J Qual Health Care 2014;26(suppl 1):16–26. pmid:24618212
  20. Wagner C, Groene O, DerSarkissian M, Thompson CA, Klazinga NS, Arah OA, et al. The use of on-site visits to assess compliance and implementation of quality management at hospital level. Int J Qual Health Care 2014;26(suppl 1):27–35. pmid:24671121
  21. Wagner C, Thompson CA, Arah OA, Groene O, Klazinga NS, DerSarkissian M, et al. A checklist for patient safety rounds at the care pathway level. Int J Qual Health Care 2014;26(suppl 1):36–46. pmid:24615594
  22. NICE clinical guideline 68: diagnosis and initial management of acute stroke and transient ischaemic attack (TIA). http://www.nice.org.uk/guidance/index.jsp?action=download&o=42264
  23. SIGN guideline 93: acute coronary syndromes. www.sign.ac.uk/pdf/sign93.pdf
  24. NICE clinical guideline 55: audit criteria for intrapartum care, 2007. www.nice.org.uk/nicemedia/pdf/CG55AuditCriteria.doc
  25. SIGN guideline 56: prevention and management of hip fracture in older people. http://www.sign.ac.uk/pdf/sign111.pdf
  26. Sunol R, Wagner C, Arah OA, Shaw C, Kristensen S, Thompson CA, et al. Evidence-based organization and patient safety strategies in European hospitals. Int J Qual Health Care 2014;26(suppl 1):47–55. pmid:24578501
  27. Pearl J. Causality: Models, Reasoning and Inference. Cambridge: Cambridge University Press; 2009.
  28. Greenland S, Pearl J, Robins JM. Causal diagrams for epidemiologic research. Epidemiology 1999;10(1):37–48. pmid:9888278
  29. Arah OA. The role of causal reasoning in understanding Simpson's paradox, Lord's paradox, and the suppression effect: covariate selection in the analysis of observational studies. Emerg Themes Epidemiol 2008;5:5. pmid:18302750
  30. R Core Team. R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing; 2014. http://www.R-project.org/
  31. McGlynn EA, Asch SM, Adams J, Keesey J, Hicks J, DeCristofaro A, et al. The quality of health care delivered to adults in the United States. N Engl J Med 2003;348:2635–45. pmid:12826639
  32. Runciman WB, Hunt TD, Hannaford N, Hibbert PD, Westbrook JI, Coiera EW, et al. CareTrack: assessing the appropriateness of health care delivery in Australia. Med J Aust 2012;197(2):100–5. pmid:22794056
  33. Häkkinen U, Iversen T, Peltola M, Seppälä TT, Malmivaara A, Belicza É, et al. Health care performance comparison using a disease-based approach: the EuroHOPE project. Health Policy 2013;112(1–2).
  34. Goodman SG, Huang W, Yan AT, Budaj A, Kennelly BM, Gore JM, et al. The expanded Global Registry of Acute Coronary Events: baseline characteristics, management practices, and hospital outcomes of patients with acute coronary syndromes. Am Heart J 2009;158(2):193–201. pmid:19619694
  35. Arah OA, Chiba Y, Greenland S. Bias formulas for external adjustment and sensitivity analysis of unmeasured confounders. Ann Epidemiol 2008;18(8):637–46. pmid:18652982
  36. Thompson CA, Arah OA. Selection bias modeling using observed data augmented with imputed record-level probabilities. Ann Epidemiol 2014;24(10):747–53. pmid:25175700
  37. Goto A, Arah OA, Goto M, Terauchi Y, Noda M. Severe hypoglycaemia and cardiovascular disease: systematic review and meta-analysis with bias analysis. BMJ 2013;347:f4533. pmid:23900314
  38. VanderWeele TJ, Arah OA. Bias formulas for sensitivity analysis of unmeasured confounding for general outcomes, treatments, and confounders. Epidemiology 2011;22(1):42–52. pmid:21052008
  39. Shortell SM, Bennett CL, Byck GR. Assessing the impact of continuous quality improvement on clinical practice: what it will take to accelerate progress. Milbank Q 1998;76:593–624. pmid:9879304
  40. Hearld LR, Alexander JA, Fraser I, Jiang HJ. Review: how do hospital organizational structure and processes affect quality of care? A critical review of research methods. Med Care Res Rev 2008;65:259–299. pmid:18089769
  41. Kunkel S, Rosenqvist UF, Westerling R. The structure of quality systems is important to the process and outcome, an empirical study of 386 hospital departments in Sweden. BMC Health Serv Res 2007;7:104. pmid:17620113
  42. Nelson EC, Batalden PB, Huber TP, Mohr JJ, Godfrey MJ, Headrick LA, et al. Improving health care, part 1: the clinical value compass. Jt Comm J Qual Improv 1996;22:243–258. pmid:8743061
  43. Mohr JJ, Batalden PB. Improving safety on the front lines: the role of clinical microsystems. Qual Saf Health Care 2002;11(1):45–50. pmid:12078369
  44. Barach P, Johnson JK. Understanding the complexity of redesigning care around the clinical microsystem. Qual Saf Health Care 2006;15 Suppl 1:10–6.
  45. Orton JD, Weick KE. Loosely coupled systems: a reconceptualization. Acad Manage Rev 1990;15(2):203–23.
  46. Weick KE. Sensemaking in Organizations. Thousand Oaks, CA: Sage; 1995.
  47. Grimshaw J, Eccles M, Thomas R, MacLennan G, Ramsay C, Fraser C, et al. Toward evidence-based quality improvement: evidence (and its limitations) of the effectiveness of guideline dissemination and implementation strategies 1966–1998. J Gen Intern Med 2006;21(Suppl 2):14–20.
  48. Francke AL, Smit MC, de Veer AJ, Mistiaen P. Factors influencing the implementation of clinical guidelines for health care professionals: a systematic meta-review. BMC Med Inform Decis Mak 2008;8:38. pmid:18789150
  49. Grol R, Grimshaw J. From best evidence to best practice: effective implementation of change in patients' care. Lancet 2003;362:125–30.