Mechanism mapping to advance research on implementation strategies

To advance as a science, implementation research must build an evidence base using, in part, experimental designs. Yet trials of implementation strategies—which are often complex health systems interventions—frequently fail to demonstrate anticipated overall effects [1,2]. As a consequence, studies of whether a strategy works are strongest when they can also reveal how it succeeds or fails. The trial by Sarkies and colleagues published recently in PLOS Medicine illustrates this need [3].

Previous randomized trials and a meta-analysis provided evidence that weekend staffing with allied healthcare professionals (e.g., physical therapists) decreased length of stay in rehabilitation units, but such weekend staffing had no effects in general medical and surgical units [4]. Because staffing in real-world practice does not reflect these data [5], Sarkies and colleagues cluster-randomized 45 healthcare units to one of 3 conditions: usual practice (the control); written materials about staffing evidence; or a knowledge broker (KB)—defined in this study as “intermediary agents who build relationships between decision makers and researchers, by sharing expert knowledge and establishing communication channels to convey evidence.” The study found that neither written dissemination nor a KB had an effect on staffing or patient length of stay compared to usual practice. Why? Data about implementation showed that managers assigned to a KB had very limited actual interaction with the KB and attended very few meetings. But additional questions remain. Why was KB attendance poor? Was the knowledge itself or the broker of the knowledge unconvincing? Which contextual enablers were missing? The answers to these questions are crucial for understanding the prospects of KB as a useful implementation strategy. Understanding how strategies fail is no less important than identifying success.

One way to dissect how implementation strategies fail is through formal exploration of anticipated mechanisms. Several approaches in implementation science offer methods that can help conceptualize mechanisms systematically. The Intervention Mapping approach asks those planning health promotion activities to conceive of the multilevel, theory-based targets for behavior change that underlie a program and how they interact to produce desired effects [6]. The Theory of Change approach encourages programs to make their intentions, activities, steps behind change, and assumptions explicit through a visual diagram [7], drawing from literature review, contextual knowledge, and stakeholder-engaged methods. “Mechanism mapping” (a concept from policy analysis) [8] advocates decomposing an effect of interest into its component steps and asking how those steps interact with context. Directed acyclic graphs, which have become widely used in epidemiology, can be conceived of as a tool for representing how an effect occurs mechanistically. A mechanistic exploration of the KB strategy tested by Sarkies and colleagues reveals potential routes of effect, along with accompanying assumptions and contextual influences, and thereby points us toward several key scientific opportunities.

First, decomposing the overall effects of an implementation strategy—in this case KB—into its hypothesized mechanistic components allows investigators to be more precise about where it may have broken down. Drawing from previous literature [9], we suggest KB acts through 3 potential pathways (Fig 1): forming social links with the target managers (step a); offering curated “evidence” of staffing to managers (step b); and conveying skills to enable use of staffing evidence (step c). In turn, these influence the manager’s capability, opportunity, and motivation (steps e and f) to use an evidence-based intervention (step g). Such a mapping conceptualizes KB as a multipathway approach that acts through affective (forming a bond), cognitive (brokered knowledge), social (linking to a network), and instrumental (capacity building) means. Visual mapping also reveals temporal dependencies. For example, forming a social bond (step a) likely needs to come first to magnify subsequent effects of conveying knowledge of evidence (step b). Mapping also makes explicit the need to draw from social network theory, facilitation, organizational theory, and diffusion of innovation, among others [10,11]. In fact, such a decomposition can be seen as a “theory of the strategy” that helps researchers interrogate, revise, and build toward a general, scientific understanding of KB. In this case, mapping identifies several points of potential weakness in this version of the KB strategy: limited intention to first cultivate a relationship between KB and managers (step a), possible attenuated rapport formation via web-based connection, and less attention to skills development (step d).

Fig 1. The mechanism of the effect of an implementation strategy can be conceived of as being composed of a chain of events (in the larger ellipse below) underlying a larger phenomenon (smaller ellipse above) [18].

In this case, we synthesize and simplify existing literature to suggest that knowledge brokering potentially occurs through (a) developing a relationship with the knowledge recipient; (b) offering evidence to the recipient; and (c) building capabilities in the recipient to use that evidence. These activities, in turn, are hypothesized to increase the capability, motivation, and opportunity of the manager to apply an evidence-based intervention, which may then lead to changes in behavior (step i). Contextual effects are represented as ellipses pointing into specific nodes in the chain of events (steps k, l, and m). We use a convention from some causal diagrams in which 2 arrows pointing into a single node imply effect modification. Image credit: Raymond Craver.
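As an illustrative sketch (ours, not the authors'), the hypothesized chain in Fig 1 can be encoded as a directed acyclic graph and checked for internal consistency—a valid causal ordering of steps exists only if the mapped mechanism contains no cycles. The node labels below are hypothetical shorthand for the steps described in the caption, not terms from the study.

```python
from collections import deque

# Hypothetical encoding of the Fig 1 chain: each step maps to the steps it
# is posited to influence (labels are our own shorthand, for illustration).
edges = {
    "relationship (a)": ["evidence offered (b)", "capability/opportunity/motivation"],
    "evidence offered (b)": ["capability/opportunity/motivation"],
    "capacity building (c)": ["capability/opportunity/motivation"],
    "capability/opportunity/motivation": ["manager applies evidence (i)"],
    "manager applies evidence (i)": [],
}

def topological_order(graph):
    """Kahn's algorithm: return a topological order, or None if the graph is cyclic."""
    indegree = {node: 0 for node in graph}
    for targets in graph.values():
        for t in targets:
            indegree[t] += 1
    queue = deque(n for n, d in indegree.items() if d == 0)
    order = []
    while queue:
        node = queue.popleft()
        order.append(node)
        for t in graph[node]:
            indegree[t] -= 1
            if indegree[t] == 0:
                queue.append(t)
    # If some nodes were never reached, a cycle exists in the mapping.
    return order if len(order) == len(graph) else None

order = topological_order(edges)
print(order)
```

Because the mapping is acyclic, the ordering exists and places relationship formation (step a) before the downstream steps—mirroring the temporal dependency noted in the text.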

A visual mapping of the putative mechanism also highlights underlying assumptions and their uncertainties and therefore informs what to measure. The effect of conveying evidence (step b) assumed the evidence is perceived as credible. A mixed-methods manuscript [12] from the same study (an approach that should be considered best practice) investigates these assumptions and finds this one wanting: Managers in the targeted hospitals did not accept the “evidence” about staffing in part because they felt the “average effects” in trials were not applicable to their specific settings. Skepticism about the evidence likely explained the observed limited participation by managers in KB webinars after initial contact. Another assumption is that the manager would accept knowledge from the particular broker in this strategy (a postdoctoral scholar), who may not be of high status in their eyes (steps a and b) [13]. The study reveals little about whether this assumption held. Finally, mapping demonstrates the assumption that the manager (the action target) has sufficient authority to make changes (step i)—another unknown [14]. Additional details confirming or contesting these assumptions would further help explain the lack of effects but were not present in the study.

Mechanism mapping also increases the rigor of a study by making explicit where and when particular elements of the context act on particular steps in the hypothesized chain of events. Several elements of context potentially differ between the study by Sarkies and colleagues and other settings using KB, differences that could inform the external validity of null effects if better measured. For example, in this case, the nature of the evidence offers one important contextual element: It could be that evidence of a managerial practice in this study is perceived as fundamentally different from evidence of a clinical treatment and thereby attenuated the effect of brokered knowledge on target manager beliefs (step k). Additionally, the organizational relationship between the KB agency and the hospital may have a critical influence on the success of the social and affective link between the KB and the manager (step l) [15]. Other potential contextual factors exist (step m). Decomposing a hypothetical effect enables us to conceive of context with greater precision by naming its influences [16].

Sarkies and colleagues’ new paper moves implementation science ahead in 2 ways. First, it suggests that KB depends on the status of the actor, the means of contact, the perceived applicability of evidence, the affective bond created by the KB, and the interactions between each of these factors. Future studies of KB must emphasize these elements. Second, and more broadly, the study reminds us that trials of complex health systems interventions must allow us to explain whether a strategy was used and, if not, why not [1,2,17]. In this study, a rigorous conceptualization of the strategy and a clear description of actor, action, and other components allowed us to see that poor engagement with the KB explained the null effects and also that the credibility of the evidence to managers was in part to blame. Systematic mechanism mapping could potentially yield additional insights. A failed intervention may yet be a successful study when it provides generalizable insights. Explicitly specifying and measuring hypothesized mechanisms of implementation strategies can help us make the most of our investments in implementation research.


  1. Bhasin S, Gill TM, Reuben DB, Latham NK, Ganz DA, Greene EJ, et al. A Randomized Trial of a Multifactorial Strategy to Prevent Serious Fall Injuries. N Engl J Med. 2020 Jul 9;383(2):129–40. pmid:32640131
  2. Berwick DM. The Science of Improvement. JAMA. 2008 Mar 12;299(10):1182–4. pmid:18334694
  3. Sarkies MN, Robins LM, Williams CM, Taylor NF, O’Brien L, et al. Effectiveness of knowledge brokering and recommendation dissemination for influencing healthcare resource allocation decisions: A cluster randomised controlled implementation trial. PLoS Med. 2021.
  4. Sarkies MN, White J, Henderson K, Haas R, Bowles J. Additional weekend allied health services reduce length of stay in subacute rehabilitation wards but their effectiveness and cost-effectiveness are unclear in acute general medical and surgical hospital wards: a systematic review. J Physiother. 2018 Jul;64(3):142–58. pmid:29929739
  5. Lane H, Sturgess T, Philip K, Markham D, Martin J, Walsh J, et al. What Factors Do Allied Health Take Into Account When Making Resource Allocation Decisions? Int J Health Policy Manag. 2017 Sep 12;7(5):412.
  6. Eldredge L, Markham CM, Ruiter R, Fernandez M, Kok G, Parcel G. Planning health promotion programs: an intervention mapping approach. 4th ed. Hoboken, New Jersey: John Wiley & Sons, Inc; 2016.
  7. Dhillon L, Vaca S. Refining theories of change. Evaluation. 2018;14:30.
  8. Williams MJ. External Validity and Policy Adaptation: From Impact Evaluation to Policy Design. World Bank Res Obs. 2020 Aug 1;35(2).
  9. Dobbins M, Robeson P, Ciliska D, Hanna S, Cameron R, O’Mara L, et al. A description of a knowledge broker role implemented as part of a randomized controlled trial evaluating three knowledge translation strategies. Implement Sci. 2009 Dec 27;4(1):1–9. pmid:19397820
  10. Rogers E. Diffusion of Innovations. Simon & Schuster; 2010.
  11. Valente TW. Social network thresholds in the diffusion of innovations. Soc Networks. 1996 Jan;18(1):69–89.
  12. White J, Grant K, Sarkies M, Haines T, Morris ME, Carey L, et al. Translating evidence into practice: a longitudinal qualitative exploration of allied health decision-making. Health Res Policy Syst. 2021 Dec 18;19(1):1–11. pmid:33388085
  13. Rogers L, de Brún A, Birken SA, Davies C, McAuliffe E. The micropolitics of implementation; a qualitative study exploring the impact of power, authority, and influence when implementing change in healthcare teams. BMC Health Serv Res. 2020 Dec 23;20(1):1–13.
  14. White J, Grant K, Sarkies M, Haines T, Morris ME, Carey L, et al. Translating evidence into practice: a longitudinal qualitative exploration of allied health decision-making. Health Res Policy Syst. 2021 Dec 18;19(1). pmid:33736670
  15. Lengnick-Hall R, Stadnick NA, Dickson KS, Moullin JC, Aarons GA. Forms and functions of bridging factors: specifying the dynamic links between outer and inner contexts during implementation and sustainment. Implement Sci. 2021 Dec 1;16(1):1–13. pmid:33413491
  16. Mehrotra ML, Petersen ML, Geng EH. Understanding HIV Program Effects: A Structural Approach to Context Using the Transportability Framework. J Acquir Immune Defic Syndr. 2019;82(Suppl 3):S199.
  17. MERIT Study Investigators. Introduction of the medical emergency team (MET) system: a cluster-randomised controlled trial. Lancet. 2005 Jun;365(9477):2091–7. pmid:15964445
  18. Craver CF, Bechtel W. Top-down Causation Without Top-down Causes. Biol Philos. 2007 Jul 30;22(4):547–63.