Understanding how and why audits work in improving the quality of hospital care: A systematic realist review

  • Lisanne Hut-Mossel ,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Writing – original draft, Writing – review & editing

    p.a.mossel@umcg.nl

    Affiliation Centre of Expertise on Quality and Safety, University Medical Centre Groningen, University of Groningen, Groningen, The Netherlands

  • Kees Ahaus ,

    Contributed equally to this work with: Kees Ahaus, Gera Welker, Rijk Gans

    Roles Investigation, Methodology, Supervision, Writing – original draft, Writing – review & editing

    Affiliation Department Health Services Management & Organisation, Erasmus School of Health Policy & Management, Erasmus University, Rotterdam, The Netherlands

  • Gera Welker ,

    Contributed equally to this work with: Kees Ahaus, Gera Welker, Rijk Gans

    Roles Investigation, Methodology, Supervision, Writing – original draft, Writing – review & editing

    Affiliation University Medical Center Groningen, University of Groningen, Groningen, The Netherlands

  • Rijk Gans

    Contributed equally to this work with: Kees Ahaus, Gera Welker, Rijk Gans

    Roles Investigation, Methodology, Supervision, Writing – original draft, Writing – review & editing

    Affiliation Department of Internal Medicine, University of Groningen, University Medical Centre Groningen, Groningen, The Netherlands

Abstract

Background

Several types of audits have been used to promote quality improvement (QI) in hospital care. However, in-depth studies into the mechanisms responsible for the effectiveness of audits in a given context are scarce. We sought to understand the mechanisms and contextual factors that determine why audits might, or might not, lead to improved quality of hospital care.

Methods

A realist review was conducted to systematically search and synthesise the literature on audits. Data from individual papers were synthesised by coding, iteratively testing and supplementing initial programme theories, and refining these theories into a set of context–mechanism–outcome configurations (CMOcs).

Results

From our synthesis of 85 papers, seven CMOcs were identified that explain how audits work: (1) externally initiated audits create QI awareness although their impact on improvement diminishes over time; (2) a sense of urgency felt by healthcare professionals triggers engagement with an audit; (3) champions are vital for an audit to be perceived by healthcare professionals as worth the effort; (4) bottom-up initiated audits are more likely to bring about sustained change; (5) knowledge-sharing within externally mandated audits triggers participation by healthcare professionals; (6) audit data support healthcare professionals in raising issues in their dialogues with those in leadership positions; and (7) audits legitimise the provision of feedback to colleagues, which flattens the perceived hierarchy and encourages constructive collaboration.

Conclusions

This realist review has identified seven CMOcs that should be taken into account when seeking to optimise the design and usage of audits. These CMOcs can provide policy makers and practice leaders with an adequate conceptual grounding to design contextually sensitive audits in diverse settings and advance the audit research agenda for various contexts.

PROSPERO registration

CRD42016039882.

Introduction

In recent years, quality and safety issues have become increasingly important in hospital care, following an increased focus on both clinical outcomes and patient satisfaction. Health authorities and organisations prioritise audits as a quality improvement (QI) approach in which delivered care is systematically evaluated, areas for improvement are identified and changes for the better are implemented [1]. Several types of audits, including external audits, internal audits, peer reviews and clinical audits, have been used, but all share the problem that the implementation of suggested improvements often fails to close the quality gap they exposed [1–3].

The limited effectiveness of audits suggests that carrying out audits and implementing improvements is not a straightforward process [4–6]. Although various explanations for how audits work have been offered, there has been little in-depth theorising about the causal mechanisms that determine the effectiveness of audits in a given context [7, 8].

The audits used in the area of improving healthcare can roughly be divided into: (1) external audits, used to gain insight into a hospital’s compliance with external criteria (e.g. accreditation, certification, external peer reviews); (2) internal audits, often in preparation for an external audit; and (3) clinical audits, carried out as a local initiative by healthcare professionals [1, 3] (Table 1). Although there are differences in the scope and approaches used in audits, they all share the objective of improving the quality of hospital care.

Audits are posited to increase accountability and improve the quality of hospital care through systematic monitoring and evaluation. However, many audits are designed without explicitly building on previous research or being guided by theory [16–18]. As a result, there has been little progress in identifying the key ingredients of successful audits. The variety in the levels of audits, together with the heterogeneity of their contexts, suggests that it is unlikely that audits work in the same way in every setting. Given this situation, a detailed understanding of the contextual factors and the causal mechanisms that influence the effectiveness of audits is necessary if one is to improve the design and optimisation of the audit process.

The aim of the current study is to understand the mechanisms and contextual factors that determine why audits might, or might not, lead to improved quality of hospital care. Two related research questions were formulated:

  1. Through which mechanisms do audits deliver their intended outcomes?
  2. What contextual factors determine whether the identified mechanisms result in the intended outcomes?

Methods

We adopted a realist review approach to address our research questions. We were guided by the RAMESES publication standards for realist reviews and we followed the PRISMA guidelines for systematic searching of the literature (S3 Table) [19, 20].

In addition to following the approach of a systematic review, we chose a realist review approach because it permits us to understand in what circumstances and through what processes audits might, or might not, lead to improved quality of hospital care, and why. This approach recognises that the success of audits is shaped by the way in which they are implemented and by the contexts in which they are implemented. Realist reviews belong to the school of theory-driven inquiry and are concerned with how an intervention works, rather than focussing solely on whether an intervention works. Furthermore, the realist review methodology is specifically designed to cope with the complexity and heterogeneity (e.g. in a study’s design and context [21]) identified in previous research on audits [5, 9, 22, 23]. Table 2 provides definitions of realist concepts. Realist reviews recognise that interventions are complex and can rarely be delivered consistently due to differences in context [19]. The unit of analysis in this realist review was not the audit itself, but the programme theories about how, why and in what circumstances audits might work. Thus, the focus of this review is on the mechanisms that affect change and not on the type of audit conducted.

A realist review examines the interaction between an intervention, the context (C) or ‘setting’ in which it is applied, the mechanism that describes how the participants use the intervention’s resources (Mresource) and how they respond to these resources (Mreasoning), and the intended or unintended outcome (O) in a set of primary studies [24, 25]. Mresource and Mreasoning together constitute a mechanism, but explicitly disaggregating them helps in operationalising the difference between the intervention strategy and the mechanisms. As recalled by Pawson and Tilley (2004), mechanisms are linked to, but not synonymous with, the intervention strategy [24]. Mechanisms and the intervention strategy are located at different levels of abstraction [26]. The attributes of the intervention refer to strategies and implemented activities, whereas the attributes of mechanisms are centred on the elements of individual or collective reasoning, or the reactions of participants, in regard to the resources offered by the intervention (Table 2) [24–27]. As Dalkin et al. put it, “resources must be introduced into a pre-existing context, which in collaboration induces an individual’s reasoning, leading to an outcome.” [27] (p.5). Data were collected and combined to identify context–mechanism–outcome configurations (CMOcs).
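
To make the structure of a CMOc concrete, the sketch below represents its elements as a simple data structure. This is purely illustrative: the class, field names and example content are ours and are not part of the review or of realist methodology itself.

    from dataclasses import dataclass

    @dataclass
    class CMOConfiguration:
        """Illustrative container for one context-mechanism-outcome configuration.

        Splitting mechanism_resource from mechanism_reasoning mirrors the
        disaggregation described above: what the audit offers versus how
        participants respond to it.
        """
        context: str              # the setting into which the audit is introduced (C)
        mechanism_resource: str   # the resources the audit makes available (M-resource)
        mechanism_reasoning: str  # participants' reasoning in response (M-reasoning)
        outcome: str              # the intended or unintended outcome (O)

    # Hypothetical example, loosely paraphrasing one of the configurations reported
    # in the Results (CMOc3, champions); the wording is illustrative only.
    example = CMOConfiguration(
        context="a supportive organisational culture",
        mechanism_resource="a peer acts as champion and drives the audit",
        mechanism_reasoning="colleagues perceive the audit as relevant and worth the effort",
        outcome="changes intended to improve care are implemented",
    )
    print(example)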

The protocol for this research, with a detailed description of the research team composition, search process and selection of primary studies, has been published previously (S1 Protocol) [9]. The first step of our synthesis sought to identify initial programme theories and was guided by the question ‘What is the intrinsic logic of audits?’, i.e. what explains why audits are assumed to be a good idea [19]. Drawing on the literature on the effectiveness of QI strategies, we developed several initial programme theories in the form of ‘if-then’ statements (Table 3) [19, 29]. Next, a systematic literature search was conducted in MEDLINE, Embase, PsycINFO, Academic Search Premier, Business Source Premier, Emerald Insight, the Cochrane Library and Web of Science, covering the period 2005–August 2020 (S1 Box). No changes were made to the inclusion criteria as formulated in the review protocol (S1 Protocol) [9]. Eligible studies were empirical studies that evaluated the effects of audits in hospital settings in high-income countries; no restrictions were placed on the type of study design (Box 1). Articles were excluded if they did not meet the inclusion criteria [9]. Furthermore, articles reported in languages other than English were excluded to avoid misinterpretation of their content due to language barriers.

Table 3. Key findings.

Initial programme theory in the form of ‘If-Then’ statements in relation to the CMO configurations (observed associations), key quotations and reflections of the focus group meeting.

https://doi.org/10.1371/journal.pone.0248677.t003

Box 1. Inclusion criteria

Studies on accreditation, certification, peer review/Dutch visitatie model (external)ᵃ

or local clinical audit (internal)

Hospital settingᵇ

High-income country

Published in English

English abstract available

Description of the medical or technical content

Description of the process of how the audit was conducted

Description of the impact of audit on medical and/or process outcomes

ᵃ This is a doctor-led and doctor-owned system of peer review designed to assess the quality of care provided by groups of hospital-based medical specialists. Practices are surveyed every 3–5 years by a group of peers.

ᵇ The rationale for focusing on hospital care is that these organisations share challenges regarding safety, effectiveness and values. In addition, these organisations can be quite similar in organisational structure.

Each paper’s quality was appraised by two reviewers (LH plus KA, GW or RG). To determine rigour, the quality of the evidence in each paper was presented in the form of an evidence-level table following the criteria established by the Cochrane ‘Effective Practice and Organisation of Care’ (EPOC) review group [30]. In addition, the ‘Quality Improvement Minimum Quality Criteria Set’ (QI-MQCS) was used to assess the completeness of the reporting of each paper [31]. Cut-off scores for what may be considered low, medium or high quality have not yet been established. To interpret the QI-MQCS score, we therefore used the criteria previously applied by Kampstra et al. [32]: a study was considered to be of perfect quality if > 15 items were ranked yes, good quality if > 12 items were ranked yes, moderate quality if > 9 items were ranked yes, and insufficient quality if ≤ 9 items were ranked yes.
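
As a minimal illustration of how these cut-offs translate a QI-MQCS count of items ranked ‘yes’ into the quality categories used in this review, the sketch below encodes the thresholds of Kampstra et al. [32]; the function name and the example scores are ours and purely hypothetical.

    def qi_mqcs_category(yes_count: int) -> str:
        """Map a QI-MQCS count of items ranked 'yes' (0-16) to the quality
        categories applied in this review (thresholds from Kampstra et al.)."""
        if yes_count > 15:
            return "perfect"
        if yes_count > 12:
            return "good"
        if yes_count > 9:
            return "moderate"
        return "insufficient"  # 9 or fewer items ranked 'yes'

    # Hypothetical example scores (the QI-MQCS comprises 16 domains, so counts range from 0 to 16).
    for score in (4, 10, 13, 16):
        print(score, qi_mqcs_category(score))  # insufficient, moderate, good, perfect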

Although the process of analysis and synthesis is reported in two stages, in practice it was an iterative process guided by the review questions. The key analytical process in a realist review involves iterative testing and refinement of the initial programme theories using empirical findings in the data sources [28]. This process was informed by the realist synthesis approach described by Rycroft-Malone et al. [28]. During the first stage of the data extraction and synthesis process, sections of text relating to context, Mresource, Mreasoning and produced outcomes were extracted (Fig 1). Four articles were independently coded for C, Mresource, Mreasoning and O by each member of the research team; this exercise achieved a Krippendorff’s α of 0.79, which is considered an acceptable level of consistency [33, 34]. To further ensure consistent judgement, two reviewers independently coded and discussed the first 12 articles, resolving differences by consensus. This coding process was both deductive (codes based on the initial programme theories) and inductive (codes identified during data extraction). Thereafter, the remaining 73 papers were coded by the first author with the aim of confirming, adjusting, supplementing or refuting our initial programme theories. Details of the coding list for this review are provided in S2 Table.

During the second stage, we compared and contrasted the evidence to identify patterns in the mechanisms across different contexts that related to diverse outcomes. The research team regularly discussed the patterns that emerged and their compatibility with the initial programme theories. Eventually, this iterative process allowed us to refine and advance the initial programme theories into a set of CMOcs that provide an explanatory account of how audits might, or might not, lead to improved quality of hospital care.
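
For readers who wish to reproduce such an agreement check, the sketch below shows one way to compute Krippendorff’s α for nominal codes, assuming the third-party Python package krippendorff; the coder-by-fragment matrix is invented for illustration and is not the review’s coding data.

    import numpy as np
    import krippendorff  # third-party package, e.g. installed via `pip install krippendorff`

    # Rows = coders, columns = coded text fragments; nominal codes, e.g.
    # 1 = context, 2 = M-resource, 3 = M-reasoning, 4 = outcome.
    # np.nan marks a fragment that a coder did not rate.
    reliability_data = np.array([
        [1, 2, 3, 3, 4, 1, 2, np.nan],
        [1, 2, 3, 4, 4, 1, 2, 2],
        [1, 2, 2, 3, 4, 1, np.nan, 2],
        [1, 3, 3, 3, 4, 1, 2, 2],
    ])

    alpha = krippendorff.alpha(reliability_data=reliability_data,
                               level_of_measurement="nominal")
    print(f"Krippendorff's alpha: {alpha:.2f}")  # values around 0.8 are commonly treated as acceptable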

Pawson et al. [21] argue that stakeholders should be involved both in confirming the emerging findings and in dissemination activities. To that end, the recognisability of the CMOcs was discussed during a focus group meeting, which was led by an independent and experienced moderator. The nine participants were purposively chosen based on their relevant experience with audits. Under the Dutch Law on Medical Research Involving Human Subjects (WMO), we were not required to seek ethical approval, as the focus group would not promote clinical medical knowledge and there was no participation by patients or use of patients’ data. Informed consent was obtained from all participants.

Results

Of the 13,709 potentially relevant records, 85 primary papers met our eligibility criteria (Box 1). During the full-text screening stage, more than 400 papers were excluded, often because the audit process was not described in sufficient detail to identify mechanisms or because outcomes were lacking (Fig 2). Of the included papers, 61 focussed on clinical audits, 17 on accreditation/certification and 7 on peer reviews. In terms of the evidence levels established by the EPOC review group [30], we found one randomised trial [35], two controlled studies [36, 37], 66 case studies [38–103] and 16 descriptive studies [104–119].

Using the QI-MQCS, the completeness-of-reporting scores ranged from 4 to 16 (16 being the highest possible score) (S1 Table). Of the 85 included papers, 41 scored more than 12 items yes and were thus considered to be of good quality; 30 scored ≤ 9 and were ranked as low quality. Only 61 of the 85 papers met the criteria for domain 12 (organisational readiness), which describes the QI culture and resources present in the organisation and helps to assess the transferability of results. In addition, 52 of the 85 papers met the sustainability domain by referring to the organisational resources and policy changes needed to sustain the audit results after withdrawal of study personnel and resources.

The synthesis began with evidence from the most rigorous studies (> 12 items ranked yes according to the QI-MQCS), as these studies provided the most relevant and detailed evidence about C, Mresource, Mreasoning or O. Less rigorous studies were used to test the emerging CMOcs, but also to test alternative explanations. The findings are organised across seven CMOcs describing how audits might work to improve the quality of hospital care. These are presented below and described in more detail in Table 3. These explanatory CMO configurations were constructed by the research team based on the realist analysis. The inputs from the focus group did not materially change the CMO configurations but triangulated and enriched the literature findings.

CMOc1. Externally initiated audits create QI awareness although their impact on improvement diminishes over time [37, 39, 51, 52, 68, 73, 83, 93, 102, 104, 110, 112, 114, 117]

Of the 17 studies that addressed externally mandated audits, eight reported on the ongoing improvement process after the initial audit had taken place. Although externally initiated audits do stimulate hospitals to improve their quality of healthcare [39, 44, 51, 68, 83, 93, 104], there seems to be a tendency towards “complacency with past improvements” and a diminishing urgency for continued QI [68, 73, 83, 104, 117]. Similarly, it seems that the rate of improvement levels off once the external pressure diminishes [51]. Several authors argue that a hospital’s focus on QI was influenced by perceived external pressure: if this pressure diminished, the healthcare professionals’ and organisational members’ motivation also eased [51, 73].

The number of years that an organisation had participated in audits affected the extent of the improvements that took place [73, 83, 104]. When a hospital first started audit activities, the process was perceived as a QI endeavour and a challenge, as the organisation attempted to align its care activities with best practices. However, after a few years, the extent of improvement reached a plateau and healthcare professionals and organisational members no longer considered the audit to be a driver of change [83, 93, 104]. Notwithstanding these observations, the fact that organisational members had to comply with the audit requirements, participate in the audit process and collect data for the audit fuelled the awareness of, and interest in, QI among healthcare professionals and organisational members [73, 83].

CMOc2. A sense of urgency felt by healthcare professionals triggers engagement with an audit [36, 38, 40–42, 44, 45, 48, 50, 53–56, 58, 61–65, 69, 70, 75, 77, 79, 81–84, 87, 88, 93, 97–100, 102–109, 113, 117, 118]

Having a sense of urgency seems to be an important precondition for healthcare professionals to engage: if healthcare professionals perceive the current situation as untenable for themselves or for the safety of patients, this will urge them to take action [82, 105]. Several studies have described local audits that were started by intrinsically motivated healthcare professionals on issues they felt had an impact on patient care [48, 53, 61, 63, 69, 82, 88, 97, 99, 103]. In contrast, if healthcare professionals perceive an audit as a ‘side-line’ or ‘mandatory exercise’, rather than as a process that contributes to improved quality of care, they will not feel engaged and will not put significant effort into the change [77, 118]. This is especially true when audits are externally initiated and have a strong organisational focus, sometimes without links to clinical care [102, 104, 106–108, 117]. Additionally, some studies showed that healthcare professionals reported reduced time for patient care when preparing for an external audit [38, 62, 93, 117]. Other QI initiatives, which they perceived as more relevant to patient care, were sometimes paused or stopped during the external audit [83, 117]. It was also noted that issues might arise during an external audit that were not in line with healthcare professionals’ priorities for QI [102, 107].

CMOc3. Champions are vital for an audit to be perceived by healthcare professionals as worth the effort [36, 43, 47, 49, 53, 54, 57, 59, 69, 74, 82, 85, 86, 89–92, 95, 101, 103, 104, 109, 119]

Champions are those healthcare professionals who are committed to implementing change, act as role models for the intended change and are enthusiastic in convincing others in the organisation of the necessity of change [69, 82, 86, 89, 101, 103, 109]. Two kinds of champions were identified: those who initiated an audit [36, 49, 54, 59, 69, 74, 82, 89] and those nominated by the leadership of an organisation and given responsibility for carrying the audit forward [43, 47, 53, 57, 85, 86, 90, 91, 95, 101, 103, 104, 109, 119]. In successful audits, champions challenge, educate and lead their colleagues to improve the quality of care [69, 90, 91, 95, 101, 119]. Furthermore, healthcare professionals consider an audit more relevant if one of their peers is the driving force, and this fosters trust [36]. This encourages a more critical attitude towards, and attention to, the quality of the care they deliver, increases their commitment to QI [36, 69, 82, 89], and leads to changes intended to improve care being successfully implemented [36, 53, 57, 59]. A supportive organisational culture is an important contextual factor in enabling champions to perform their role [36, 54, 69, 86, 89, 90, 103].

CMOc4. Bottom-up initiated audits are more likely to bring about sustained change [36, 48, 53, 57, 59, 60, 63, 67, 76, 80, 82, 91, 96]

In bottom-up initiated audits, healthcare professionals take the lead. These healthcare professionals work closely together to define appropriate improvements for their own local situation, and this ownership helps to develop informal social ties and to ensure the acceptance of sustainable improvements in practice [53, 57, 59, 60, 80, 91]. For example, during an audit of medication registration errors, nursing teams were empowered to develop their own strategies to improve practice [91]. Healthcare professionals are able to collaborate to improve the quality of care because they respect each other’s competences and are sensitive to each other’s work and needs [53, 57, 59]. Moreover, working together during an audit improves communication, reflection, feedback and collaboration among professionals, and this encourages them to think and work as a cohesive group in the search for solutions on how to arrange and deliver multidisciplinary care [57, 59, 80]. As such, bottom-up initiated audits are more ‘natural’ and, ultimately, more meaningful and appropriate for the local situation.

CMOc5. Knowledge-sharing within externally mandated audits triggers participation by healthcare professionals [35, 59, 66, 73, 80, 83, 88, 106, 116]

In contexts where an audit is initiated by an external party, healthcare professionals are often obliged to work together in a working group. Several studies describe how healthcare professionals see external audits as learning opportunities, as they are able to exchange ideas and knowledge [35, 73, 83, 88, 106]. Healthcare professionals find it rewarding to share knowledge about quality and care processes in working groups, and this results in a better understanding of the challenges facing the organisation. As a consequence, changes tend to be implemented and spread throughout the organisation more quickly [73]. In this way, working groups act as forums for knowledge exchange, helping to bring together different professionals or various parts of an organisation [83]. Working groups have also been shown to have other positive impacts, such as boosting the cohesion of medical and paramedical teams through better communication within and between teams [59, 66, 106, 116].

CMOc6. Audit data support healthcare professionals in raising issues in their dialogues with those in leadership positions [63, 71, 82, 83, 88]

Audit data enable healthcare professionals to identify shortcomings in their local patient care and strengthen their confidence in discussing requests for changes with “those in positions of leadership” [82]. Even when an audit is mandated by an external party, healthcare professionals will use the information provided by the audit to convince management of the need for change [83].

Healthcare professionals tend to voice opportunities for quality improvement initiatives in contexts that feel safe, such as a culture with a focus on collective learning and improvement, as opposed to a culture in which speaking up is suppressed and mistakes are punished [63, 71, 82, 83]. In addition, by presenting audit data to leaders, healthcare professionals gained the opportunity to become involved in the governance of their hospital [88]. This has enabled healthcare professionals and their leaders to share responsibility for allocating sufficient resources to improve care within the overall budgetary limits of the organisation [63, 71, 82, 83].

CMOc7. Audits legitimise the provision of feedback to colleagues, which flattens the perceived hierarchy and encourages constructive collaboration [46, 61, 72, 94, 111, 115]

Audits create feedback opportunities in multidisciplinary teams and make it easier to have conversations among professionals from different backgrounds. Based on shared decisions over the healthcare processes to be audited, professionals feel legitimised in giving each other feedback: they feel justified and empowered in challenging each other, including professionals who are perceived as higher up the hierarchy [72]. In some cases, a lack of improvement was the result of dissonant relationships between departments that were often unaware of each other’s duties and responsibilities [94]. The audit process thus flattens the perceived hierarchy and encourages constructive collaboration [46, 61, 94, 115]. It should be noted that this mechanism of a flattened perceived hierarchy only ‘works’ in a context where the learning culture is considered safe. In a context where career progression depends on hierarchical power relationships, with little room for mistakes, healthcare professionals (especially junior doctors) did not feel engaged in the audit as they perceived it to be a threat to their career prospects [111].

Discussion

Undertaking a realist review allowed us to identify seven CMOcs that highlight a range of enabling and constraining contextual factors and mechanisms that are fundamental to the ways in which audits can improve the quality of hospital care. This realist review indicates that externally initiated audits create QI awareness and that knowledge-sharing within these audits is important as it triggers the participation of healthcare professionals. However, bottom-up initiated audits are more likely to bring about sustained change. A sense of urgency felt by healthcare professionals triggers engagement with an audit. This review also showed that champions are vital for audits to be perceived as worth the effort they involve. In addition, an audit can be an instrument that encourages healthcare professionals to provide each other with feedback, as well as to raise issues in dialogue with leaders and become engaged in the governance of the organisation.

The audit as a platform to raise issues and provide feedback

From a social perspective, an audit can be viewed as a platform for the internal bonding of individuals. Bonding has been shown to be key in building a sense of ‘community’ that contributes to implementation effectiveness [120, 125, 126]. Here, audit meetings provide opportunities for interaction, discussion and the sharing of ideas about changing practices, and for improved communication within and between teams.

Importantly, an audit creates opportunities for feedback on multidisciplinary issues that affect all professionals involved in the care process. Studies of collaboration within multidisciplinary teams have highlighted that the medical profession remains dominant, even when there have been acknowledged attempts to “democratise” teams [127–129]. Nurses might witness and experience a variety of problems, but do not generally communicate these to medical specialists perceived to be higher up the hierarchy [130]. This study, however, has been able to demonstrate that healthcare professionals were empowered by the audit to challenge their peers and also staff who are perceived as more senior or higher up the hierarchy. Collaborative learning and speaking up to others in the hierarchy do not occur naturally in healthcare, despite their importance for improving care delivery. Therefore, creating a safe and open audit environment in which all professionals feel safe to provide feedback to each other is of utmost importance. Hence, audits might be a next step towards resolving a widespread issue in healthcare related to communication and collaboration between nurses and doctors.

Our findings regarding the importance of champions resonate with, and add to, the findings of a recent systematic review and meta-synthesis by Brown et al. on the functioning of clinical performance feedback [131]. Their review reported that feedback from a clinical supervisor is more likely to be accepted because the supervisor is perceived as having greater knowledge and skill [131]. Furthermore, we found that healthcare professionals consider an audit and its outcomes relevant for adoption in practice when one of their peers is the driving force. It has also been suggested that the audit process creates networks of like-minded collaborators across the organisation [132, 133]. In contrast to Brown et al., our findings emphasise the role of peer involvement and collaborative practice over that of a clinical supervisor.

The audit as a platform for continuing professional development through the critical contributions of healthcare professionals

Externally initiated audits can be effective, but only, as reported above, if the audit is led by a champion or if the healthcare professionals see the audit as a learning opportunity rather than as an external evaluation. This element was also recognised in a previous study into the barriers and facilitators of internal audits conducted in preparation for an external audit: when healthcare professionals perceive internal audits as an examination tool that is implemented only because of external obligations, they are less motivated to use the internal audit to drive improvements and their motivation will be lower during a follow-up audit [134]. Moreover, previous research has shown that involving healthcare professionals can be a critical factor in implementing and sustaining change. For example, a study by Hovlid et al. (2020) showed that if organisations involve healthcare professionals in assessing care delivery for external audits, they gain a better understanding of their current practice and consequently initiate action to improve the quality of care. In organisations where the involvement of healthcare professionals was low, no actual improvements were initiated or realised apart from updating the written guidelines describing how care should be delivered [135].

The audit as a platform for collaborative practice

Audits can also be initiated in a bottom-up way by intrinsically motivated healthcare professionals who collaborate with one another to improve their own local care practices. Previous research has highlighted the importance of a local, bottom-up, practice-based improvement approach [136, 137]. We add depth to these findings by showing that the strength of bottom-up QI comes from the fact that healthcare professionals are in the lead, share the same language and feel a shared ownership of the incremental changes, aspects that are lacking when an audit is imposed upon them [136].

Different approaches to QI

In our review, we positioned audits as a QI approach to improve patient care and outcomes. Previous studies have often recommended a balanced or hybrid approach to audits, in which organisations balance top-down control with empowering healthcare professionals to enable bottom-up improvements [138–140]. Surprisingly, we did not find any evidence to either support or reject a balanced or hybrid approach to audits. Despite this lack of evidence, we did observe that audits themselves are becoming increasingly hybrid and balanced, with different techniques and approaches being passed back and forth between different types of audits [38, 134]. This strengthens the claim that the CMOcs we identified are valid across the spectrum of audits.

Is there such a thing as audit tiredness?

This review shows that with each year that an organisation participates in externally initiated audits, further improvements appear to tail off. While organisations initially invest heavily in order to satisfy the first accreditation visit, and maximise the benefits from the ensuing changes, after 3–10 years the learning curve levels off [83]. It seems that healthcare professionals no longer consider the audit to be a driver of change, and organisations then need to find other initiatives to revive the process [3, 4, 6]. Whether a similar levelling off occurs with bottom-up initiatives led by the healthcare professionals themselves is uncertain, but it is certainly plausible that here also an audit ‘tiredness’ may ensue after some time.

Practical implications

The CMOcs that were identified offer policymakers and practice leaders an understanding of the mechanisms that promote the successful implementation of audit activities and of the contextual factors that can either impede or support these mechanisms. Several ways to overcome the challenges facing audits are suggested, focusing on areas to consider when designing and optimising audit activities (Box 2). These recommendations are sufficiently generic to allow local tailoring to specific types of audits in varying contexts.

Box 2. How to make audits work effectively

  1. Build the audit on teamwork and engage all healthcare professionals involved (identification of stakeholders). Utilise the shared commitment of the entire team working together to identify and spread effective practice. This can be achieved more easily within local, bottom-up initiated audits, as professionals already recognise issues of interest from their own practice, share the same language and frame of reference and feel ownership of the incremental changes, rather than having an audit topic imposed on them.
  2. Focus on knowledge-sharing within externally motivated audits. Healthcare professionals need to perceive the audit as a learning opportunity. This will only occur if attention and dedicated time are given to sharing knowledge with colleagues about the quality of care as it relates to the design of the care pathway. Preferably, this is blended with the approach described under 1.
  3. Ensure there are local champions. If the driving force behind the audit is a local champion and peer, healthcare professionals are more likely to display ownership and see the audit process as worth the effort. Also, champions will challenge their colleagues to implement changes in practice.
  4. Encourage, during the audit, the provision of feedback between healthcare professionals from all levels of the organisation. The audit needs to be positioned in such a way that all professionals feel it is safe to give feedback, including to someone who is perceived to be higher up the hierarchy.

Our study suggests that healthcare needs to find additional ways to sustain the motivation and engagement of healthcare professionals in QI, as is also suggested by the fact that the sustainability item of the QI-MQCS was lacking in 39% of papers (S1 Table). Other, largely untested, approaches could include feedback and public reporting of patient-reported outcome and experience measures (PROMs and PREMs) and the use of patient narratives and patient journeys to inform change and the redesign of care pathways [4, 141, 142], without any increase in the current administrative burden on healthcare professionals [143, 144].

This study is of use to other researchers, since it provides a framework for conceptualizing audits in their specific organisational context. We strongly urge authors of primary studies in this field to include detailed descriptions of the audits, the context in which they take place (including institutional, system and process aspects that could influence the adoption of audits), the audit process itself and how to sustain relevant outcomes in practice. This will permit a more fine-grained analysis of the mechanisms and contexts that impede or support the intended outcome.

Limitations and strengths

This review has two fundamental limitations. First, most of the included articles report a positive response, which suggests a publication bias in that failed audits are likely not reported. Despite this possible bias, we were able to refine and advance our initial programme theories such that they can help to explain how audits might, or might not, lead to improved quality of hospital care.

Second, while this review has not examined interactions between the various CMOcs, they do seem to be interrelated. For example, those in leadership positions play an important role in ensuring a safe and trusting environment for the execution of audits, which in turn is a precondition for several of the mechanisms to come into play (i.e. for champions to play an active role in the audit, for giving and receiving feedback, for healthcare professionals to take ownership of the delivery of QI activities). Consequently, collecting primary data to explore these contextual factors and CMOcs would be an important step in further advancing our understanding of how and why audits might work. Also, we saw within our first CMOc (that externally initiated audits create QI awareness but their impact on improvement diminishes over time) that successful audits can change the conditions that make them work in the first place. As such, CMOcs can be linked, with the outcome of one phase of an audit becoming a contextual aspect for the next phase [145]. In this study, this ripple effect is premised on the idea that audit activities are “a series of events in the life course of a system, leading to the evolution of new structures of interaction and new shared meanings” [146] (p. 267).

In terms of strengths, this study contributes to the growing use of realist approaches in evidence synthesis. The realist approach is still developing, and key concepts are not always explained or applied in the same manner. For example, challenges have been identified in defining and operationalising mechanisms [24, 26, 27, 147, 148]. In this regard, Pawson (2012) observed that mechanisms “capture the many different ways in which the resources on offer may affect the stakeholders’ reasoning” [149] (p.187). Given that resources and reasoning are both constituent parts of a mechanism, explicitly disaggregating them has helped to understand the ways in which mechanisms affect outcomes. By systematically applying methodological guidelines and describing our understanding of the key concepts, we have stuck closely to the realist synthesis approach [19]. We have provided a detailed account of our search methodologies and the process of theory elicitation and selection (S1 Protocol) [9]. We believe that these strategies have enhanced the transparency, and thus improved the validity, of our review.

Conclusions

This realist review has indicated that champions are vital for an audit to be perceived by healthcare professionals as worth the effort. In addition, an audit can be an instrument that encourages healthcare professionals to provide each other with feedback, as well as to raise issues with leaders and become engaged in the governance of the organisation. As the broader learning and QI infrastructure continues to mature, it will become increasingly important to think of audits in their wider context, especially one as complex and dynamic as a hospital care system, when designing and optimising audit processes. We believe that this work on theorising audits using a realist review methodology will provide policy makers and practice leaders with sufficient conceptual grounding to design contextually sensitive audits in a wide variety of settings. Future research could test the validity of our configurations through empirical studies that include detailed process evaluations of audits in order to provide further insights into the mechanisms and contextual factors through which audits produce their results.

Acknowledgments

We wish to thank the participants of the focus group for their participation and insight. We would like to thank Johanna Schönrock-Adema for her constructive comments.

References

  1. 1. Spencer E, Walshe K. National quality improvement policies and strategies in European healthcare systems. Qual Saf Health Care. 2009;18 Suppl 1: i22–7. pmid:19188457
  2. 2. McDonald KM, Chang C, Schultz E. Closing the quality gap: revisiting the state of the science. Rockville (MD); 2013.
  3. 3. Bohigas L, Heaton C. Methods for external evaluation of health care institutions. Int J Qual Health Care. 2000;12: 231–238. pmid:10894195
  4. 4. Greenfield D, Braithwaite J. Health sector accreditation research: a systematic review. Int J Qual Health Care. 2008;20: 172–183. pmid:18339666
  5. 5. Ivers N, Jamtvedt G, Flottorp S, Young JM, Odgaard-Jensen J, French SD, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Libr. 2012. pmid:22696318
  6. 6. Flodgren G, Gonçalves-Bradley DC, Pomey M. External inspection of compliance with standards for improved healthcare outcomes. Cochrane Database Syst Rev. 2016. pmid:27911487
  7. 7. Kaplan HC, Brady PW, Dritz MC, Hooper DK, Linam WM, Froehle CM, et al. The influence of context on quality improvement success in health care: a systematic review of the literature. Milbank Q. 2010;88: 500–559. pmid:21166868
  8. 8. Colquhoun H, Michie S, Sales A, Ivers N, Grimshaw JM, Carroll K, et al. Reporting and design elements of audit and feedback interventions: a secondary review. BMJ Qual Saf. 2017;26: 54–60. pmid:26811541
  9. 9. Hut-Mossel L, Welker G, Ahaus K, Gans R. Understanding how and why audits work: protocol for a realist review of audit programmes to improve hospital care. BMJ Open. 2017;7: e015121-2016-015121.
  10. 10. Klazinga N. Re-engineering trust: the adoption and adaption of four models for external quality assurance of health care services in western European health care systems. International Journal for Quality in Health Care. 2000;12: 183–189. pmid:10894189
  11. 11. International Organization for Standardization. Quality Management Systems-Fundamentals and Vocabulary (ISO 9000: 2015).: ISO Copyright office; 2015.
  12. 12. Walshe K, Freeman T, Latham L, Wallace L, Spurgeon P. Clinical governance: from policy to practice. Birmingham: Health Services Management Centre. 2000.
  13. 13. Dixon N. What is clinical audit’s purpose: quality assurance or quality improvement? Faculty Dental Journal. 2011;2: 79–83.
  14. 14. Dixon N. Getting clinical audit right to benefit patients: Getting Clinical Audit Right; 2007.
  15. 15. National Institute for Clinical Excellence (Great Britain). Principles for best practice in clinical audit: Radcliffe publishing; 2002.
  16. 16. Ivers NM, Sales A, Colquhoun H, Michie S, Foy R, Francis JJ, et al. No more ‘business as usual’with audit and feedback interventions: towards an agenda for a reinvigorated intervention. Implement Sci. 2014;9: 14. pmid:24438584
  17. 17. Colquhoun HL, Brehaut JC, Sales A, Ivers N, Grimshaw J, Michie S, et al. A systematic review of the use of theory in randomized controlled trials of audit and feedback. Implement Sci. 2013;8: 66. pmid:23759034
  18. 18. Foy R, Eccles M, Jamtvedt G, Young J, Grimshaw J, Baker R. What do we know about how to do audit and feedback? Pitfalls in applying evidence from a systematic review. BMC health services research. 2005;5: 50. pmid:16011811
  19. 19. Wong G, Greenhalgh T, Westhorp G, Buckingham J, Pawson R. RAMESES publication standards: realist syntheses. BMC Med. 2013;11: 21. pmid:23360677
  20. 20. Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Int J Surg. 2010;8: 336–341. pmid:20171303
  21. 21. Pawson R, Greenhalgh T, Harvey G, Walshe K. Realist review-a new method of systematic review designed for complex policy interventions. J Health Serv Res Policy. 2005;10: 21–34. pmid:16053581
  22. 22. Brubakk K, Vist GE, Bukholm G, Barach P, Tjomsland O. A systematic review of hospital accreditation: the challenges of measuring complex intervention effects. BMC health services research. 2015;15: 280. pmid:26202068
  23. 23. Walshe K. Understanding what works and why in quality improvement: the need for theory-driven evaluation. International Journal for Quality in Health Care. 2007;19: 57–59. pmid:17337518
  24. 24. Pawson R. Evidence-based policy: a realist perspective: Sage; 2006.
  25. 25. Emmel N, Greenhalgh J, Manzano A, Monaghan M, Dalkin S. Doing Realist Research: SAGE Publications; 2018.
  26. 26. Lacouture A, Breton E, Guichard A, Ridde V. The concept of mechanism from a realist approach: a scoping review to facilitate its operationalization in public health program evaluation. Implement Sci. 2015;10: 153. pmid:26519291
  27. 27. Dalkin SM, Greenhalgh J, Jones D, Cunningham B, Lhussier M. What’s in a mechanism? Development of a key concept in realist evaluation. Implement Sci. 2015;10: 49-015-0237-x.
  28. 28. Rycroft-Malone J, McCormack B, Hutchinson AM, DeCorby K, Bucknall TK, Kent B, et al. Realist synthesis: illustrating the method for implementation research. Implement Sci. 2012;7: 33. pmid:22515663
  29. 29. Pearson M, Brand S, Quinn C, Shaw J, Maguire M, Michie S, et al. Using realist review to inform intervention development: methodological illustration and conceptual platform for collaborative care in offender mental health. Implement Sci. 2015;10: 134. pmid:26415961
  30. 30. [Anonymous]. The Effective Practice and Organisation of Care (EPOC) Group.
  31. 31. Hempel S, Shekelle PG, Liu JL, Sherwood Danz M, Foy R, Lim YW, et al. Development of the Quality Improvement Minimum Quality Criteria Set (QI-MQCS): a tool for critical appraisal of quality improvement intervention publications. BMJ Qual Saf. 2015;24: 796–804. pmid:26311020
  32. 32. Kampstra NA, Zipfel N, van der Nat Paul B, Westert GP, van der Wees Philip J, Groenewoud AS. Health outcomes measurement and organizational readiness support quality improvement: a systematic review. BMC health services research. 2018;18: 1–14.
  33. 33. Krippendorff K. Reliability in content analysis: Some common misconceptions and recommendations. Human communication research. 2004;30: 411–433.
  34. 34. Hayes AF, Krippendorff K. Answering the call for a standard reliability measure for coding data. Communication methods and measures. 2007;1: 77–89.
  35. 35. Roberts CM, Stone RA, Buckingham RJ, Pursey NA, Lowe D, Potter JM. A randomized trial of peer review: the UK National Chronic Obstructive Pulmonary Disease Resources and Outcomes Project: three-year evaluation. J Eval Clin Pract. 2012;18: 599–605. pmid:21332611
  36. 36. Reznek MA, Barton BA. Improved incident reporting following the implementation of a standardized emergency department peer review process. Int J Qual Health Care. 2014;26: 278–286. pmid:24771402
  37. 37. Kilsdonk MJ, van Dijk BAC, Otter R, Siesling S, van Harten WH. The impact of organisational external peer review on colorectal cancer treatment and survival in the Netherlands. Br J Cancer. 2014;110: 850–858. pmid:24423922
  38. 38. Hanskamp-Sebregts M, Zegers M, Boeijen W, Wollersheim H, van Gurp PJ, Westert GP. Process evaluation of the effects of patient safety auditing in hospital care (part 2). Int J Qual Health Care. 2018.
  39. 39. Hanskamp-Sebregts M, Zegers M, Westert GP, Boeijen W, Teerenstra S, van Gurp PJ, et al. Effects of patient safety auditing in hospital care: results of a mixed-method evaluation (part 1). Int J Qual Health Care. 2018.
  40. 40. Aldridge P, Horsley E, Rosettenstein K, Wallis T, Payne B, Pallot J, et al. What the FLOQ? A quality improvement project to reduce unnecessary paediatric respiratory viral swabs in a peripheral metropolitan hospital. J Paediatr Child Health. 2018;54: 416–419. pmid:29105978
  41. 41. Chua CC, Hutchinson A, Tacey M, Parikh S, Lim WK, Aboltins C. A physician targeted intervention improves prescribing in chronic heart failure in general medical units. BMC Health Serv Res. 2018;18: 206. pmid:29566753
  42. 42. Counihan T, Gary M, Lopez E, Tutela S, Ellrodt G, Glasener R. Surgical Multidisciplinary Rounds: An Effective Tool for Comprehensive Surgical Quality Improvement. Am J Med Qual. 2016;31: 31–37. pmid:25210093
  43. 43. Dafoe S, Chapman MJ, Edwards S, Stiller K. Overcoming barriers to the mobilisation of patients in an intensive care unit. Anaesth Intensive Care. 2015;43: 719–727. pmid:26603796
  44. 44. Johnson S, McNeal M, Mermis J, Polineni D, Burger S. Chasing Zero: Increasing Infection Control Compliance on an Inpatient Cystic Fibrosis Unit. J Nurs Care Qual. 2018;33: 67–71. pmid:28658183
  45. 45. Leung S, Leyland N, Murji A. Decreasing Diagnostic Hysteroscopy Performed in the Operating Room: A Quality Improvement Initiative. J Obstet Gynaecol Can. 2016;38: 351–356. pmid:27208604
  46. 46. Myers MK, Jansson-Knodell CL, Schroeder DR, O’Meara JG, Bonnes SL, Ratelle JT. Using knowledge translation for quality improvement: an interprofessional education intervention to improve thromboprophylaxis among medical inpatients. J Multidiscip Healthc. 2018;11: 467–472. pmid:30271162
  47. 47. Shadman KA, Wald ER, Smith W, Coller RJ. Improving Safe Sleep Practices for Hospitalized Infants. Pediatrics. 2016;138: e20154441. pmid:27482058
  48. 48. Stewart C, Bench S. Evaluating the implementation of confusion assessment method-intensive care unit using a quality improvement approach. Nurs Crit Care. 2018;23: 172–178. pmid:29766622
  49. 49. Wood SD, Candeland JL, Dinning A, Dow S, Hunkin H, McHale S, et al. Our approach to changing the culture of caring for the acutely unwell patient at a large UK teaching hospital: A service improvement focus on Early Warning Scoring tools. Intensive Crit Care Nurs. 2015;31: 106–115. pmid:25604030
  50. 50. Benitez-Rosario MA, Castillo-Padros M, Garrido-Bernet B, Ascanio-Leon B. Quality of care in palliative sedation: audit and compliance monitoring of a clinical protocol. J Pain Symptom Manage. 2012;44: 532–541. pmid:22795052
  51. 51. Bogh SB, Falstie-Jensen AM, Hollnagel E, Holst R, Braithwaite J, Johnsen SP. Improvement in quality of hospital care during accreditation: A nationwide stepped-wedge study. Int J Qual Health Care. 2016. pmid:27578631
  52. 52. Braithwaite J, Greenfield D, Westbrook J, Pawsey M, Westbrook M, Gibberd R, et al. Health service accreditation as a predictor of clinical and organisational performance: a blinded, random, stratified study. Qual Saf Health Care. 2010;19: 14–21. pmid:20172877
  53. 53. Clarke J, Deakin A, Dillon J, Emmerson S, Kinninmonth A. A prospective clinical audit of a new dressing design for lower limb arthroplasty wounds. J Wound Care. 2009;18: 5–11. pmid:19131911
  54. 54. Cohen M, Kimmel N, Benage M, Cox M, Sanders N, Spence D, et al. Medication safety program reduces adverse drug events in a community hospital. Qual Saf Health Care. 2005;14: 169–174. pmid:15933311
  55. 55. Cosgrove JF, Gaughan M, Snowden CP, Lees T. Decreasing delays in urgent and expedited surgery in a university teaching hospital through audit and communication between peri-operative and surgical directorates. Anaesthesia. 2008;63: 599–603. pmid:18477270
  56. 56. Dinescu A, Fernandez H, Ross JS, Karani R. Audit and feedback: an intervention to improve discharge summary completion. J Hosp Med. 2011;6: 28–32. pmid:21241038
  57. 57. Dupont C, Deneux-Tharaux C, Touzet S, Colin C, Bouvier-Colle MH, Lansac J, et al. Clinical audit: a useful tool for reducing severe postpartum haemorrhages? Int J Qual Health Care. 2011;23: 583–589. pmid:21733978
  58. 58. Easterlow D, Hoddinott P, Harrison S. Implementing and standardising the use of peripheral vascular access devices. J Clin Nurs. 2010;19: 721–727. pmid:20500315
  59. 59. Esposito P, Benedetto AD, Tinelli C, De Silvestri A, Rampino T, Marcelli D, et al. Clinical audit improves hypertension control in hemodialysis patients. Int J Artif Organs. 2013;36: 305–313. pmid:23504809
  60. 60. Gallagher-Swann M, Ingleby B, Cole C, Barr A. Improving transfusion practice: ongoing education and audit at two tertiary speciality hospitals in Western Australia. Transfus Med. 2011;21: 51–56. pmid:21039980
  61. 61. Gommans J, McIntosh P, Bee S, Allan W. Improving the quality of written prescriptions in a general hospital: the influence of 10 years of serial audits and targeted interventions. Intern Med J. 2008;38: 243–248. pmid:18298560
  62. 62. Gunningberg L, Stotts NA. Tracking quality over time: what do pressure ulcer data show? Int J Qual Health Care. 2008;20: 246–253. pmid:18390902
  63. 63. Hall A, Blanchford H, Chatrath P, Hopkins C. A multi-centre audit of epistaxis management in England: is there a case for a national review of practice? J Laryngol Otol. 2015;129: 454–457. pmid:25816868
  64. 64. Halpape K, Sulz L, Schuster B, Taylor R. Audit and Feedback-Focused approach to Evidence-based Care in Treating patients with pneumonia in hospital (AFFECT Study). Can J Hosp Pharm. 2014;67: 17–27. pmid:24634522
  65. 65. Hunter M, Kelly J, Stanley N, Stilley A, Anderson L. Pressure injury prevention success in a regional hospital. Contemp Nurse. 2014;49: 75–82. pmid:25549747
  66. 66. Ingen-Housz-Oro S, Amici JM, Roy-Geffroy B, Ostojic A, Domergue Than Trong E, Buffard V, et al. Dermatosurgery: total quality management in a dermatology department. Dermatology. 2012;225: 204–209. pmid:23128401
  67. 67. Jain N, Symes T, Doorgakant A, Dawson M. Clinical audit of the management of stable ankle fractures. Ann R Coll Surg Engl. 2008;90: 483–487. pmid:18765028
  68. 68. Johnson AM, Goldstein LB, Bennett P, O’Brien EC, Rosamond WD, investigators of the Registry of the North Carolina Stroke Care Collaborative. Compliance with acute stroke care quality measures in hospitals with and without primary stroke center certification: the North Carolina Stroke Care Collaborative. J Am Heart Assoc. 2014;3: e000423. pmid:24721795
  69. 69. Kalanithi L, Coffey CE, Mourad M, Vidyarthi AR, Hollander H, Ranji SR. The effect of a resident-led quality improvement project on improving communication between hospital-based and outpatient physicians. Am J Med Qual. 2013;28: 472–479. pmid:23526358
  70. 70. Kennedy NA, Rodgers A, Altus R, McCormick R, Wundke R, Wigg AJ. Optimisation of hepatocellular carcinoma surveillance in patients with viral hepatitis: a quality improvement study. Intern Med J. 2013;43: 772–777. pmid:23611607
  71. 71. Kurmis R, Heath K, Ooi S, Munn Z, Forbes S, Young V, et al. A Prospective Multi-Center Audit of Nutrition Support Parameters Following Burn Injury. J Burn Care Res. 2015;36: 471–477. pmid:25094004
  72. 72. Langston M. Effects of peer monitoring and peer feedback on hand hygiene in surgical intensive care unit and step-down units. J Nurs Care Qual. 2011;26: 49–53. pmid:22914666
  73. 73. Lanteigne G, Bouchard C. Is the introduction of an accreditation program likely to generate organization-wide quality, change and learning? Int J Health Plann Manage. 2016;31: e175–91. pmid:26358969
  74. 74. Lewis CM, Monroe MM, Roberts DB, Hessel AC, Lai SY, Weber RS. An audit and feedback system for effective quality improvement in head and neck surgery: Can we become better surgeons? Cancer. 2015;121: 1581–1587. pmid:25639485
  75. 75. Li C., Smith R., Berry B. Retrospective Clinical Audit of Adherence to a Protocol for Prophylaxis of Venous Thromboembolism in Surgical Patients. Can J Hosp Pharm. 2009;61.
  76. McLiesh P, Mungall D, Wiechula R. Are we providing the best possible pain management for our elderly patients in the acute-care setting? Int J Evid Based Healthc. 2009;7: 173–180. pmid:21631858
  77. Mills JK, Minhas JS, Robotham SL. An assessment of the dementia CQUIN – An audit of improving compliance. Dementia. 2014;13: 697–703. pmid:24445398
  78. Munn Z, Scarborough A, Pearce S, McArthur A, Kavanagh S, Girdler M, et al. The implementation of best practice in medication administration across a health network: a multisite evidence-based audit and feedback project. JBI Database System Rev Implement Rep. 2015;13: 338–352. pmid:26455947
  79. Nardini S, Cicchitto G, De Benedetto F, Donner CF, Polverino M, Sanguinetti CM, et al. Audit on the appropriateness of integrated COPD management: the "ALT-BPCO" project. Multidiscip Respir Med. 2014;9: 40. pmid:25097757
  80. Numan RC, Klomp HM, Li W, Buitelaar DR, Burgers JA, Van Sandick JW, et al. A clinical audit in a multidisciplinary care path for thoracic surgery: an instrument for continuous quality improvement. Lung Cancer. 2012;78: 270–275. pmid:22999081
  81. Oliveri A, Howarth N, Gevenois PA, Tack D. Short- and long-term effects of clinical audits on compliance with procedures in CT scanning. Eur Radiol. 2016;26: 2663–2668. pmid:26577376
  82. Perkins JN, Chiang T, Ruiz AG, Prager JD. Auditing of operating room times: A quality improvement project. Int J Pediatr Otorhinolaryngol. 2014;78: 782–786. pmid:24612553
  83. Pomey M, Lemieux-Charles L, Champagne F, Angus D, Shabah A, Contandriopoulos A. Does accreditation stimulate change? A study of the impact of the accreditation process on Canadian healthcare organizations. Implement Sci. 2010;5: 31. pmid:20420685
  84. Radford A, Undre S, Alkhamesi NA, Darzi AW. Recording of drug allergies: are we doing enough? J Eval Clin Pract. 2007;13: 130–137. pmid:17286735
  85. Sheena Y, Fishman JM, Nortcliff C, Mawby T, Jefferis AF, Bleach NR. Achieving flying colours in surgical safety: audit of World Health Organization ‘Surgical Safety Checklist’ compliance. J Laryngol Otol. 2012;126: 1049–1055. pmid:22892105
  86. Stephenson M, McArthur A, Giles K, Lockwood C, Aromataris E, Pearson A. Prevention of falls in acute hospital settings: a multi-site audit and best practice implementation project. Int J Qual Health Care. 2016;28: 92–98. pmid:26678803
  87. Ursprung R, Gray JE, Edwards WH, Horbar JD, Nickerson J, Plsek P, et al. Real time patient safety audits: improving safety every day. Qual Saf Health Care. 2005;14: 284–289. pmid:16076794
  88. Vanoli M, Traisci G, Franchini A, Benetti G, Serra P, Monti MA. A program of professional accreditation of hospital wards by the Italian Society of Internal Medicine (SIMI): self- versus peer-evaluation. Intern Emerg Med. 2012;7: 27–32. pmid:21833771
  89. Wright KM. Falls prevention strategies among acute neurosurgical and aged care inpatients in a tertiary hospital in Sydney: A best practice implementation report. JBI Libr Syst Rev. 2014;12: 199–217.
  90. Albornos-Muñoz L, Melián-Correa E, Acosta-Arrocha A, Gallo-Blanco C, Béjar-Bacas F, Alonso-Poncelas E, et al. Falls assessment and interventions among older patients in two medical and one surgical hospital wards in Spain: A best practice implementation project. JBI Database System Rev Implement Rep. 2018;16: 247–257. pmid:29324564
  91. Alomari A, Sheppard-Law S, Lewis J, Wilson V. Effectiveness of Clinical Nurses’ interventions in reducing medication errors in a paediatric ward. J Clin Nurs. 2020.
  92. Conaty O, Gaughan L, Downey C, Carolan N, Brophy MJ, Kavanagh R, et al. An interdisciplinary approach to improve surgical antimicrobial prophylaxis. Int J Health Care Qual Assur. 2018;31: 162–172. pmid:29504869
  93. Ellis LA, Nicolaisen A, Bie Bogh S, Churruca K, Braithwaite J, von Plessen C. Accreditation as a management tool: a national survey of hospital managers’ perceptions and use of a mandatory accreditation program in Denmark. BMC Health Serv Res. 2020;20: 306. pmid:32293445
  94. Gazarin M, Mulligan E, Davey M, Lydiatt K, O’Neill C, Weekes K. Improving patient preparedness for the operating room: A quality improvement study in Winchester District Memorial Hospital—A rural hospital in Ontario. Can J Rural Med. 2019;24: 44–51. pmid:30924460
  95. Gude WT, Roos-Blom MJ, van der Veer SN, Dongelmans DA, de Jonge E, Peek N, et al. Facilitating action planning within audit and feedback interventions: a mixed-methods process evaluation of an action implementation toolbox in intensive care. Implement Sci. 2019;14: 90. pmid:31533841
  96. Healy K, O’Sullivan A, McCarthy L. A nurse-led audit on the incidence and management of inadvertent hypothermia in an operating theatre department of an Irish hospital. J Perioper Pract. 2019;29: 54–60. pmid:30062928
  97. Mughal Z, Al-Jazieh I, Zaidi H. Development of a proforma to improve quality of handover of surgical patients at the weekend. J Eval Clin Pract. 2019;25: 456–462. pmid:30411446
  98. Rohweder C, Wangen M, Black M, Dolinger H, Wolf M, O’Reilly C, et al. Understanding quality improvement collaboratives through an implementation science lens. Prev Med. 2019;129: 105859. pmid:31655174
  99. Smiddy MP, Murphy OM, Savage E, Fitzgerald AP, O’Sullivan B, Murphy C, et al. Efficacy of observational hand hygiene audit with targeted feedback on doctors’ hand hygiene compliance: A retrospective time series analysis. J Infect Prev. 2019;20: 164–170. pmid:31428196
  100. Smith L, Chapman A, Flowers K, Wright K, Chen T, O’Connor C, et al. Nutritional screening, assessment and implementation strategies for adults in an Australian acute tertiary hospital: A best practice implementation report. JBI Database System Rev Implement Rep. 2018;16: 233–246. pmid:29324563
  101. Tíscar-González V, Uriarte-Diaz A, Morales-Boiza N, Linaza-Arriola MB, García-Guevara N, Izquierdo-García MJ. Postoperative pain management in a surgical unit in a Basque Country hospital: a best practice implementation project. JBI Database System Rev Implement Rep. 2019;17: 614–624. pmid:30973528
  102. Weske U, Boselie P, van Rensen ELJ, Schneider MME. Using regulatory enforcement theory to explain compliance with quality and patient safety regulations: the case of internal audits. BMC Health Serv Res. 2018;18: 62. pmid:29382331
  103. Wooller KR, Backman C, Gupta S, Jennings A, Hasimja-Saraqini D, Forster AJ. A pre and post intervention study to reduce unnecessary urinary catheter use on general internal medicine wards of a large academic health science center. BMC Health Serv Res. 2018;18: 642. pmid:30115051
  104. Desveaux L, Mitchell JI, Shaw J, Ivers NM. Understanding the impact of accreditation on quality in healthcare: A grounded theory approach. Int J Qual Health Care. 2017;29: 941–947. pmid:29045664
  105. Dunne D, Lal N, Pranesh N, Spry M, Mcfaul C, Rooney P. Surgical audit: are we not closing the loop? Int J Health Care Qual Assur. 2018;31: 966–972. pmid:30415615
  106. Kilsdonk MJ, Siesling S, Otter R, van Harten WH. Two decades of external peer review of cancer care in general hospitals; the Dutch experience. Cancer Med. 2016;5: 478–485. pmid:26714788
  107. Nicolaisen A, Bogh S, Churruca K, Ellis L, Braithwaite J, von Plessen C. Managers’ perceptions of the effects of a national mandatory accreditation program in Danish hospitals. A cross-sectional survey. Int J Qual Health Care. 2018.
  108. Sinuff T, Muscedere J, Rozmovits L, Dale CM, Scales DC. A qualitative study of the variable effects of audit and feedback in the ICU. BMJ Qual Saf. 2015;24: 393–399. pmid:25918432
  109. Looper K, Winchester K, Robinson D, Price A, Langley R, Martin G, et al. Best practices for chemotherapy administration in pediatric oncology: Quality and safety process improvements (2015). J Pediatr Oncol Nurs. 2016;33: 165–172. pmid:26668214
  110. Greenfield D, Hinchcliff R, Banks M, Mumford V, Hogden A, Debono D, et al. Analysing ‘big picture’ policy reform mechanisms: the Australian health service safety and quality accreditation scheme. Health Expect. 2015;18: 3110–3122. pmid:25367049
  111. Owen C, Mathews PW, Phillips C, Ramsey W, Corrigan G, Bassett M, et al. Intern culture, internal resistance: uptake of peer review in two Australian hospital internship programs. Aust Health Rev. 2011;35: 430–435. pmid:22126945
  112. Thornlow DK, Merwin E. Managing to improve quality: the relationship between accreditation standards, safety practices, and patient outcomes. Health Care Manage Rev. 2009;34: 262–272. pmid:19625831
  113. Anderson P, Fee P, Shulman R, Bellingan G, Howell D. Audit of audit: review of a clinical audit programme in a teaching hospital intensive care unit. Br J Hosp Med (Lond). 2012;73: 526–529.
  114. Canitano S, Di Turi A, Caolo G, Pignatelli AC, Papa E, Branca M, et al. The Regina Elena National Cancer Institute process of accreditation according to the standards of the Organisation of European Cancer Institutes. Tumori. 2015;101 Suppl 1: S51–4.
  115. Iyer RS, Swanson JO, Otto RK, Weinberger E. Peer review comments augment diagnostic error characterization and departmental quality assurance: 1-year experience from a children’s hospital. Am J Roentgenol. 2013;200: 132–137. pmid:23255752
  116. Mazzini E, Cerullo L, Mazzi G, Costantini M. The experience of accreditation of the Reggio Emilia Research Hospital with the OECI model. Tumori. 2015;101 Suppl 1: S42–6.
  117. Bogh SB, Blom A, Raben DC, Braithwaite J, Thude B, Hollnagel E, et al. Hospital accreditation: staff experiences and perceptions. Int J Health Care Qual Assur. 2018;31: 420–427. pmid:29865965
  118. Currie K, Laidlaw R, Ness V, Gozdzielewska L, Malcom W, Sneddon J, et al. Mechanisms affecting the implementation of a national antimicrobial stewardship programme; multi-professional perspectives explained using normalisation process theory. Antimicrob Resist Infect Control. 2020;9: 99. pmid:32616015
  119. Dixon-Woods M, Campbell A, Aveling EL, Martin G. An ethnographic study of improving data collection and completeness in large-scale data exercises. Wellcome Open Res. 2019;4: 203. pmid:32055711
  120. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82: 581–629. pmid:15595944
  121. Power D, Terziovski M. Quality audit roles and skills: Perceptions of non-financial auditors and their clients. J Oper Manage. 2007;25: 126–147.
  122. Johnston G, Crombie I, Alder E, Davies H, Millard A. Reviewing audit: barriers and facilitating factors for effective clinical audit. BMJ Qual Saf. 2000;9: 23–36. pmid:10848367
  123. Spurgeon P, Mazelan PM, Barwell F. Medical engagement: a crucial underpinning to organizational performance. Health Serv Manage Res. 2011;24: 114–120. pmid:21840896
  124. Braithwaite J, Runciman WB, Merry AF. Towards safer, better healthcare: harnessing the natural properties of complex sociotechnical systems. Qual Saf Health Care. 2009;18: 37–41. pmid:19204130
  125. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4: 50. pmid:19664226
  126. Edmondson AC, Bohmer RM, Pisano GP. Disrupted routines: Team learning and new technology implementation in hospitals. Adm Sci Q. 2001;46: 685–716.
  127. Lingard L, Vanstone M, Durrant M, Fleming-Carroll B, Lowe M, Rashotte J, et al. Conflicting messages: examining the dynamics of leadership on interprofessional teams. Acad Med. 2012;87: 1762–1767. pmid:23095927
  128. Nugus P, Greenfield D, Travaglia J, Westbrook J, Braithwaite J. How and where clinicians exercise power: interprofessional relations in health care. Soc Sci Med. 2010;71: 898–909. pmid:20609507
  129. Snelling I, Benson LA, Chambers N. How trainee hospital doctors lead work-based projects. Leadersh Health Serv. 2019.
  130. Tucker AL, Edmondson AC. Why hospitals don’t learn from failures: Organizational and psychological dynamics that inhibit system change. Calif Manage Rev. 2003;45: 55–72.
  131. Brown B, Gude WT, Blakeman T, van der Veer SN, Ivers N, Francis JJ, et al. Clinical performance feedback intervention theory (CP-FIT): a new theory for designing, implementing, and evaluating feedback in health care based on a systematic review and meta-synthesis of qualitative research. Implement Sci. 2019;14: 40. pmid:31027495
  132. Greenfield D, Pawsey M, Braithwaite J. What motivates professionals to engage in the accreditation of healthcare organizations? Int J Qual Health Care. 2011;23: 8–14. pmid:21084322
  133. Dixon-Woods M, Bosk CL, Aveling EL, Goeschel CA, Pronovost PJ. Explaining Michigan: developing an ex post theory of a quality improvement program. Milbank Q. 2011;89: 167–205. pmid:21676020
  134. van Gelderen SC, Hesselink G, Westert GP, Robben PB, Boeijen W, Zegers M, et al. Optimal governance of patient safety: A qualitative study on barriers to and facilitators for effective internal audit. J Hosp Adm. 2017;6: 15.
  135. Hovlid E, Teig IL, Halvorsen K, Frich JC. Inspecting teams’ and organisations’ expectations regarding external inspections in health care: a qualitative study. BMC Health Serv Res. 2020;20: 1–12. pmid:32641038
  136. Braithwaite J, Churruca K, Long JC, Ellis LA, Herkes J. When complexity science meets implementation science: a theoretical and empirical analysis of systems change. BMC Med. 2018;16: 63. pmid:29706132
  137. Veenstra GL, Ahaus K, Welker GA, Heineman E, van der Laan MJ, Muntinghe FL. Rethinking clinical governance: healthcare professionals’ views: a Delphi study. BMJ Open. 2017;7: e012591. pmid:28082364
  138. Thor J, Herrlin B, Wittlöv K, Skår J, Brommels M, Svensson O. Getting going together: can clinical teams and managers collaborate to identify problems and initiate improvement? Qual Manag Health Care. 2004;13: 130–142. pmid:15127692
  139. Staines A, Thor J, Robert G. Sustaining improvement? The 20-year Jönköping quality improvement program revisited. Qual Manag Health Care. 2015;24: 21–37. pmid:25539488
  140. Bodenheimer T, Bojestig M, Henriks G. Making systemwide improvements in health care: Lessons from Jönköping County, Sweden. Qual Manag Health Care. 2007;16: 10–15. pmid:17235247
  141. Greenhalgh J, Dalkin S, Gooding K, Gibbons E, Wright J, Meads D, et al. Functionality and feedback: a realist synthesis of the collation, interpretation and utilisation of patient-reported outcome measures data to improve patient care. Health Serv Deliv Res. 2017;5: 1–280. pmid:28121094
  142. Greenhalgh J, Gooding K, Gibbons E, Dalkin S, Wright J, Valderas J, et al. How do patient reported outcome measures (PROMs) support clinician-patient communication and patient care? A realist synthesis. J Patient Rep Outcomes. 2018;2: 42. pmid:30294712
  143. Botje D, Ten Asbroek G, Plochg T, Anema H, Kringos DS, Fischer C, et al. Are performance indicators used for hospital quality management: a qualitative interview study amongst health professionals and quality managers in The Netherlands. BMC Health Serv Res. 2016;16: 574. pmid:27733194
  144. Wallenburg I, Weggelaar AM, Bal R. Walking the tightrope: how rebels “do” quality of care in healthcare organizations. J Health Organ Manag. 2019.
  145. Jagosh J, Bush PL, Salsberg J, Macaulay AC, Greenhalgh T, Wong G, et al. A realist evaluation of community-based participatory research: partnership synergy, trust building and related ripple effects. BMC Public Health. 2015;15: 725. pmid:26223523
  146. Hawe P, Shiell A, Riley T. Theorising interventions as events in systems. Am J Community Psychol. 2009;43: 267–276. pmid:19390961
  147. Marchal B, Van Belle S, Van Olmen J, Hoerée T, Kegels G. Is realist evaluation keeping its promise? A review of published empirical studies in the field of health systems research. Evaluation. 2012;18: 192–212.
  148. Astbury B, Leeuw FL. Unpacking black boxes: mechanisms and theory building in evaluation. Am J Eval. 2010;31: 363–381.
  149. Pawson R, Manzano-Santaella A. A realist diagnostic workshop. Evaluation. 2012;18: 176–191.