
The behavioural and cognitive impacts of digital educational interventions in the emergency department: A systematic review

  • Sophie Cleff ,

    Contributed equally to this work with: Sophie Cleff, Shubhang Sreeranga

    Roles Conceptualization, Investigation, Writing – original draft, Writing – review & editing

    sophie.cleff@mail.mcgill.ca (SC); jennifer.turnbull@mcgill.ca (JT)

    Affiliation McGill University Health Centre, Montreal, Quebec, Canada

  • Shubhang Sreeranga ,

    Contributed equally to this work with: Sophie Cleff, Shubhang Sreeranga

    Roles Conceptualization, Formal analysis, Investigation, Visualization

    Affiliation McGill University Health Centre, Montreal, Quebec, Canada

  • Ibtisam Mahmoud,

    Roles Investigation

    Affiliation McGill University Health Centre, Montreal, Quebec, Canada

  • Abdullatif Hassan ,

    Roles Data curation

    ‡ AH, LGN and EF also contributed equally to this work. JT and EO also contributed equally to this work.

    Affiliation McGill University Health Centre, Montreal, Quebec, Canada

  • Laury Gueyie Noutiamo ,

    Roles Data curation


    Affiliation McGill University Health Centre, Montreal, Quebec, Canada

  • Elie Fadel ,

    Roles Data curation


    Affiliation McGill University Health Centre, Montreal, Quebec, Canada

  • Jennifer Turnbull ,

    Roles Conceptualization, Investigation, Writing – review & editing



    Affiliations McGill University Health Centre, Montreal, Quebec, Canada, Department of Pediatrics, Division of Emergency Medicine, McGill University Health Centre, Montreal, Quebec, Canada

  • Esli Osmanlliu

    Roles Conceptualization, Investigation, Writing – review & editing


    Affiliations McGill University Health Centre, Montreal, Quebec, Canada, Department of Pediatrics, Division of Emergency Medicine, McGill University Health Centre, Montreal, Quebec, Canada

Abstract

Ensuring patients and their caregivers understand the health information they receive is an important part of every clinical visit. Digital educational interventions like video discharge instructions, follow-up text messaging, or interactive web-based modules (WBMs) have the potential to improve information retention and influence behaviour. This study aims to systematically evaluate the impact of these interventions on patient and caregiver cognition and behaviour, as well as identify the characteristics of successful interventions and examine how success is measured. In December of 2022, a systematic literature search was conducted in several databases (Cochrane, Embase, MEDLINE (Ovid), Web of Science, ClinicalTrials.gov, and Google Scholar) for randomized controlled trials (RCTs) published between 2012 and 2022. In 2024, an identical search was performed for articles published between 2022 and 2024. Studies testing patient- and caregiver-facing digital educational interventions in the emergency department for behavioural and cognitive outcomes were included. Data from 35 eligible studies encompassing 12,410 participants were analyzed and assessed for bias using the Cochrane RoB2.0 tool. Video was used in 22 studies (63%), making it the most common modality. Seventy-three percent (16/22) of these studies reported statistically significant improvements in their primary outcomes. Text messaging was used in eight studies, with two (25%) reporting significant improvement in their primary outcomes. WBMs and apps were used in seven studies, 71% (5/7) of which reported statistically significant improvements in primary outcomes. Statistically significant improvements in cognitive outcomes were reported in 64% (18/28) of applicable studies, compared with 17% (4/23) for behavioural outcomes. The results suggest that digital educational interventions can positively impact cognitive outcomes in the emergency department. Video, WBM, and app modalities appear particularly effective. However, digital educational interventions may not yet effectively change behaviour. Establishing guidelines for evaluating the quality of digital educational interventions, and the formal adoption of existing reporting guidelines, could improve study quality and consistency in this emerging field.

Registration: The study is registered with PROSPERO, ID #CRD42023338771.

Author summary

In our study, we explored how emergency department (ED) patients and their caregivers can be educated about important health information through digital tools. We reviewed 35 articles on different methods – such as videos, apps, and text messages – aimed at improving patients’ health-related behaviours or their understanding of health issues. Our research is important for several reasons. Firstly, it’s crucial that patients and caregivers have a clear understanding of information that could impact their health when they leave the ED – this knowledge supports their future health and well-being. Secondly, digital tools are increasingly being used to convey health information to patients. Rigorous review of these tools supports the creation of effective and accessible interventions for all patients. This not only helps patients and caregivers, but also helps to ensure that the resources to develop such tools are used wisely. Our findings contribute to a broader understanding of how digital educational tools can enhance patient education in the ED.

Introduction

Background

Apparent gaps in patient understanding of health information have been studied for decades [1]. Comprehension of standard verbal discharge instructions has been found to be deficient in up to 78% of patients [2], and one study estimated that 60% of Canadian adults have low health literacy [3]. When patients have difficulty understanding complex health information during a clinical encounter, it can lead to improper subsequent care, increased risk of return visits to the hospital, and other poor health outcomes [4–7].

Digital educational interventions show promise for improving health-related outcomes, and may help bridge gaps in understanding for patients who struggle to understand and retain health information relayed through traditional methods (i.e., verbal discharge instructions or a brochure) [8–10]. Text messaging, video instructions, apps, and web-based modules (WBMs) may allow for a more interactive experience with the material, reliable access to important information without a physician present, and a more tailored learning experience than traditional verbal discharge instructions [11]. Apps and WBMs specifically allow for a variety of interactive features such as chatbots, embedded text and videos, behavioural tracking, and questionnaires that help patients engage with the material. Patients can also complete a survey to establish what may help them the most, and the app or WBM can then tailor the information it presents to the needs of the patient. However, the development of new digital educational tools also runs the risk of leaving some groups with low digital literacy even further behind, exacerbating existing disparities [12]. It is thus important to study how these interventions are being implemented and evaluated.

To our knowledge, there has been no comprehensive review of both the behavioural and cognitive impacts of digital educational interventions in emergency medicine. Previous studies have examined solely cognitive impacts [13,14] or impacts in a limited disease area [14], commonly finding evidence of some benefits of digital education as compared to usual care. This study will evaluate the cognitive impacts (e.g., information comprehension or confidence in disease management) and behavioural impacts (e.g., changes in health-seeking behaviour or adherence to a medical plan or advice) to determine how digital educational interventions are affecting patients, and guide future development and evaluative efforts in this field.

Theoretical frameworks

In deciding to focus on behavioural and cognitive impacts, we drew from several theoretical frameworks. The development of our cognitive analysis framework was influenced by the Health Belief Model. This framework proposes that individual perceptions of disease severity, belief in the efficacy of the prescribed action, and confidence in one's ability to complete that action can influence health-related behaviours [15]. This corresponds directly to several of the subcategories we created for analysis, namely: confidence with disease management, motivation to make behavioural changes, and disease awareness and understanding.

However, there are differences between thought and action; high motivation to change a behaviour does not always result in a behaviour change [16]. Our decision to also investigate changes to health-seeking behaviour is supported by the Normalization Process Theory, which evaluates the success of digital health interventions by emphasizing people’s actions over their intentions [17].

Furthermore, some research suggests that improvements to the quality of care are vitally important in determining the success or failure of a digital health intervention. This contributed to our inclusion of clinical outcomes, healthcare facility use, and patient satisfaction as categories for investigation [18].

Objectives

We pursued two primary objectives. Firstly, to explore how digital educational interventions in the emergency department impact the cognition and behaviour of patients and caregivers. Secondly, to identify the characteristics of successful interventions and determine how success is measured in the studies we examined.

Materials and methods

We conducted a systematic literature review to fulfill both of our objectives, adhering to the Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) guidelines [19]. The protocol for this review can be found in S2 Appendix.

Inclusion criteria

Randomized Controlled Trials (RCTs) were the sole study design included in the primary analysis. RCTs were included if they tested a digital educational intervention in comparison with usual care, with an analog intervention, or through pre/posttest analysis. Digital interventions are defined in the WHO’s third global survey on eHealth as the “use of information and communication technologies in support of health services” [20]. We use this definition with the additional specification that the intervention must be caregiver and/or patient-facing and educational in nature. RCTs were eligible for inclusion if conducted in the emergency department.

Exclusion criteria

Studies were excluded if they (1) used a study population other than patients or caregivers, (2) used an analog-only intervention, (3) used a digital intervention that was not educational in nature (e.g., telemedicine or telemonitoring), (4) did not aim to investigate behavioural or cognitive impacts of their intervention (e.g., investigated the administrative cost of an intervention), or (5) were conducted without a control or comparator.

Search strategy

A medical librarian at the McGill University Health Centre searched the following databases: Cochrane, Embase, MEDLINE (Ovid), Web of Science, ClinicalTrials.gov, and Google Scholar. The first search was performed on December 2, 2022, restricted to studies published between December 2012 and December 2022. A second, identical search was performed on June 28, 2024 to include studies published between December 2022 and June 2024. The search was restricted to studies available in French or English. We restricted the initial search to the most recent 10 years to keep digital educational interventions relevant to our current context, given the rapid pace of innovation in this area of study. The detailed search strategy is included in S1 Appendix.

Study selection, data extraction, and data synthesis

The results of the database search were transferred into Rayyan [21] for screening and recording decisions about eligibility. Two blinded student researchers (SC, SS) independently screened all articles for pre-determined eligibility criteria, and conflicts were resolved by two senior researchers (EO, JT). We initially screened articles based on title and abstract. Included articles were then screened based on the full text. Finally, the two student researchers checked the reference lists from all included papers for any additional studies that may have been missed in the initial search. Data extraction was divided between the two student researchers for independent completion. We created an extraction table in Excel to record relevant data on study characteristics and results related to our objectives.

We defined cognitive impacts as relating to changes in patient thought and feeling. This category was subdivided into information comprehension, motivation to make behavioural changes, confidence with disease management, and patient satisfaction. Behavioural outcomes were defined as being related to patient behaviour and were subdivided into health-seeking behaviour (e.g., return visits to the ED) and adherence and concordance to medical plan or advice (e.g., contraception use or medication adherence). We also collected data on clinical outcomes, which were reported in several studies in our review. Results were also subdivided by study population age; adult populations were defined as being 18 years or older. Data were recorded qualitatively in the spreadsheet and later converted to quantitative data to describe the characteristics of studies in terms of cognitive or behavioural outcomes across the intervention modalities of video, text, WBM, or app. Bar graphs were generated from the quantitative analysis to visually represent our pool of studies.

To accurately represent the reported primary outcomes of eligible studies, we categorized studies as showing (1) significant improvement (statistically significant improvement in a given outcome for the intervention group); (2) non-significant change (conflicting findings, or no statistically significant difference in the outcome between the intervention and control groups); or (3) significant worsening (a statistically significant worsening of a given outcome for the intervention group). Statistical significance was determined by examining the reported data in published articles. The percentages of studies that fell into each of these categories, stratified by modality, cognitive vs behavioural outcome, and adult vs pediatric population, were then presented.

Risk of bias assessment

To assess the risk of bias in our group of studies, one student researcher (SS) independently used the Cochrane RoB2.0 tool for RCTs. The overall risk of bias of each study was calculated based on the Cochrane guidelines and then visually represented using the RoBvis tool [22].

Results

Study selection, data extraction, and data synthesis

The initial search in 2022 yielded 6,389 articles. Following duplicate removal, 6,151 articles remained. The second search in 2024 yielded 1,508 articles with no duplicates. The title and abstract screen yielded 53 studies eligible for inclusion from the first search, and 18 articles from the second search. After full-text screening, 24 studies were included in the review from the first search, and 8 articles from the second search. We found three additional papers to be included after searching through the references, for a total of 35 included studies (Fig 1) [23–57]. These studies encompass a total of 12,410 participants, with between 33 and 2,521 participants per study. More than half of the studies (20/35) were published in 2020 or after. Most studies (25/35) were published in the USA. The most frequently used digital intervention was video (22/35), followed by text messaging (8/35), WBMs (4/35), and apps (3/35). Out of 35 articles, 23 assessed behavioural impacts, while 28 assessed cognitive impacts (Table 1).

Fig 1. Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flow diagram for this review.

https://doi.org/10.1371/journal.pdig.0000772.g001

Table 1. Characteristics of studies in this systematic review.

https://doi.org/10.1371/journal.pdig.0000772.t001

Risk of bias assessment

The results of the risk of bias assessment showed that 25 studies (71%) had some concerns of bias; this was most often due to deviations from the intended intervention. The second most frequent reasons for some risk of bias were missing outcome data (often due to loss to follow-up) or bias in the outcome measurement. Eight studies (23%) had a high risk of bias, and only two studies had a low risk of bias (Fig 2).

Fig 2. Results of the Risk of Bias (RoB) assessment, shown through the RobVis tool.

https://doi.org/10.1371/journal.pdig.0000772.g002

Intervention modalities

Videos.

Video was the most frequently used intervention modality (22/35) (Table 1). Videos were either animated or made with live actors, usually presenting adaptations of the standard written discharge instructions or depictions of a fictional scenario. Videos were shown to patients while in the ED, and patients were sometimes given a link to the video so it could be watched again later. Of the 22 studies using video interventions, 16 (73%) reported significant improvements in cognitive, behavioural, or clinical primary outcomes (Tables 2–4).

Table 2. A tabular summary visually depicting the outcomes for every study in this review.

https://doi.org/10.1371/journal.pdig.0000772.t002

Table 3. A tabular summary describing the outcomes of studies examining behavioural outcomes in this review. Bolded text indicates the primary outcome for a given study.

https://doi.org/10.1371/journal.pdig.0000772.t003

Table 4. A tabular summary describing the outcomes of studies examining cognitive outcomes in this review. Bolded text indicates the primary outcome for a given study.

https://doi.org/10.1371/journal.pdig.0000772.t004

Text messaging.

Text messaging interventions were initiated in the ED, but often (7/8) [26,30,31,47,48,56,57] extended for days, weeks, or months past the patient’s stay. Six of these studies used unidirectional texting [29–31,47,56,57] and two used texting with an interactive component [23,45]. The frequency of texting campaigns ranged from daily to once a week. Three studies used strictly educational content in their texts [29,30,47], and five used both motivational and educational content [26,31,48,56,57]. Of the eight studies on texting interventions, two (25%) showed significant improvement in their primary outcomes, specifically in disease awareness and confidence with the treatment plan (Tables 2–4).

Apps and web-based modules.

Apps and WBMs were the least used intervention modalities (3/35 and 4/35, respectively). The rate of significant improvement in the primary outcome for these modalities was 71% (5/7), specifically for cognitive outcomes such as disease awareness/knowledge and confidence with the treatment plan [33,37,43,47,55]. All seven of these studies examined cognitive outcomes, and six of the seven included information comprehension as an outcome [33,37,43,47,53,55]. The one remaining study included only patient satisfaction as a cognitive outcome and reported significant worsening in its primary outcome (Tables 2–4).

Pediatric and adult populations

About half of included studies (51%, 18/35) had exclusively adult (18+ y/o) patients as their study population, 20% (7/35) had a mix of adult and pediatric patients, and 29% (10/35) had only pediatric patients or their caregivers (Table 1). In the studies involving pediatric patients, the child’s parent or caregiver received the intervention and was tested for changes in cognitive and behavioural outcomes. The caregiver could also report on clinical and behavioural outcomes for the child. No study tested an intervention exclusively on pediatric (<18 y/o) patients themselves. Studies with adult-only populations reported significant improvement in their primary outcomes for the intervention group in 61% of studies (11/18) [26,27,32,35,41,43,47,50,52,54,55]. In studies with mixed-age patients, significant improvement in the primary outcome was reported in 43% of studies (3/7) [23,25,39]. Finally, studies with pediatric patients and their caregivers showed significant improvement in the primary outcome in 70% of studies (7/10) [24,28,33,36,37,45,51] (Fig 3 and Tables 3–4).

Fig 3. Primary outcomes of digital educational interventions in adult, age-mixed, and pediatric populations (n = 35). Percentages indicate the proportion within a given age category.

https://doi.org/10.1371/journal.pdig.0000772.g003

Outcomes

Behavioural impacts.

There were 23 studies that examined behavioural outcomes (Tables 2, 3). These outcomes were measured through self-reporting and were sometimes confirmed through electronic medical records [35,50,52,56,57]. Significant improvement in behavioural outcomes was observed in four studies (17%), and non-significant change was observed in 82% of studies (19/23). Two studies showed significant worsening of their behavioural outcomes: one used a texting intervention and the other an app (Fig 4 and Tables 2–3). Note that two studies (Omaki et al. and Golden-Plotnik et al.) were included in multiple intervention modality categories, resulting in 25 data points for Fig 4.

Fig 4. Stacked bar chart displaying the behavioural outcomes of digital educational interventions in the ED (n = 23). Percentages indicate the proportion within a given modality.

https://doi.org/10.1371/journal.pdig.0000772.g004

Cognitive impacts.

There were 28 studies that examined cognitive outcomes (Fig 5 and Tables 2, 4). The outcomes in this category were mostly measured through knowledge questionnaires and surveys.

Fig 5. Stacked bar chart displaying the cognitive outcomes of digital educational interventions in the ED (n = 28). Percentages indicate the proportion within a given modality.

https://doi.org/10.1371/journal.pdig.0000772.g005

Knowledge questionnaires were used in all studies measuring information comprehension (Table 4). They ranged from as few as three questions to as many as 25, varying between multiple choice, true or false, and short answer. Questionnaires were scored by healthcare workers or research assistants. In 12 out of the 21 studies measuring information comprehension, knowledge questionnaires were administered both before and after the intervention [25,27,33,34,36,37,42,45,47,51,53,55]. In the other studies, questionnaires were only administered after the intervention [23,24,28,32,41,43,44,49,54]. In three studies, the questionnaires were administered a third time, up to six weeks post-discharge, to evaluate longer-term knowledge retention [27,47,55]. One of these studies, using a combined app and texting intervention, did not report a significant difference in knowledge retention between the intervention and control group immediately after the intervention or at follow-up [47]. The other two studies, using video [27] and WBM [55] interventions, showed significantly better recall in the intervention group both immediately after discharge and at follow-up.

To evaluate cognitive impacts other than disease awareness (i.e., patient confidence in disease management, motivation to make behavioural changes, and satisfaction), 5-point Likert scales were most often used. Patient-Reported Experience Measures (PREMs) like the Emergency Room Patient Satisfaction Survey were also used to evaluate patient satisfaction [43].

Sixty-four percent of studies (18/28) demonstrated significant improvements in their cognitive outcomes, and in 32% of studies (9/28) there was no change, or there were conflicting findings, for the cognitive outcomes between the intervention and control groups. One study, which used an app intervention, demonstrated significant worsening in its cognitive outcome (Tables 2, 4). As with the behavioural outcomes, two studies (Omaki et al. and Golden-Plotnik et al.) were included in multiple intervention modality categories, resulting in 30 data points for Fig 5.

Clinical outcomes.

When reporting on clinical outcomes, researchers occasionally (2/8) used Patient-Reported Outcome Measures (PROMs) [43,44]. Examples include the Hospital Anxiety and Depression Scale [44] and the Safety and Imminent Distress Questionnaire [43], but no single PROM was used across multiple studies. In other studies, physicians evaluated physiological aspects of clinical outcomes, for example by administering a test for HIV [25] or a blood pressure test [30]. Forty-four percent of studies examining clinical outcomes (4/9) showed significant improvement in their primary outcomes. The rest reported no significant change in their clinical outcomes (Table 2).

Discussion

Principal results

In this systematic review, most health-related digital educational interventions in the emergency department led to positive outcomes. The evidence base was stronger for cognitive outcomes (e.g., information comprehension) than for behavioural outcomes (e.g., hospital return visits). Videos were the most frequently used intervention, followed by text messages and WBMs. The heterogeneous evaluation frameworks used across studies limit the comparison of these tools. A few studies included clinical outcomes, and a minority described any PROMs, which are essential for value-based, patient-partnered digital innovation.

Our primary objective was to explore how digital educational interventions in the ED impact the cognition and behaviour of caregivers and patients. Many of the interventions in these studies focused on educating patients and caregivers; it follows that the researchers would investigate comprehension of the information given, leading to a stronger evidence base for cognitive outcomes. We observed that most interventions investigating cognitive outcomes led to significant improvement (statistically significant improvements for the intervention groups as compared to the control) in their primary outcomes. Many other studies reported non-significant change or conflicting findings in their primary outcomes. There was a notable trend of interventions that had some observable positive effect but did not achieve statistical significance. While this lack of significance could indicate that a given intervention truly has no effect, real effects may also have been missed owing to the lack of power and high dropout rates observed in many of the included studies. Overall, when utilizing digital educational interventions, the likelihood of improved cognitive outcomes appears high.

Interestingly, our analysis demonstrated a difference in efficacy of digital educational interventions for cognitive and behavioural outcomes: interventions appear less effective in influencing behavioural outcomes than cognitive outcomes. Most studies researching behavioural changes reported non-significant change in their primary outcomes, compared with significant improvement in most studies examining cognitive outcomes. These results may be due to complexities behind health-seeking behaviour that go beyond knowledge and understanding. For example, even if a patient understands the health information relayed to them during a visit, their actual health-seeking behaviour may be influenced by practical barriers like access to primary care [59].

In terms of age stratification, statistically significant improvements in both behavioural and cognitive outcomes were achieved in most studies with a solely adult study population, and even more frequently in studies involving pediatric populations (<18 y/o) and their caregivers. In studies involving pediatric patients, parents were almost always the ones who received the intervention [23,24,26,28,33,36–38,45,51]. Thus, the success in studies with pediatric populations should be compared with research involving parents or caregivers of pediatric patients, not research testing interventions on pediatric patients themselves. Previous reviews have found digital health interventions to be effective in improving parents’ health literacy and changing their behaviour [60,61], although consensus has not been reached [62]. There are several potential explanations for why parents and caregivers of pediatric patients showed greater improvement in primary outcomes than adults participating on their own behalf. Although the exact age distribution was not reported in most studies, the average age of parents taking part in a study on behalf of their children may have been lower than the average age of patients in adult-only studies, where participants could be in their 70s and 80s [41,56,57]. If this were the case, the success of interventions in studies using parents and caregivers of children might be explained by higher digital literacy and lower digital exclusion as compared to older populations [63,64]. Furthermore, parents may be more motivated to engage in an intervention for the sake of their child’s health than adults are for their own sake [65]. In some of the adult-only studies, a caregiver could answer on behalf of the adult patient, but this was rare [23,54]. However, in the broader literature, there is evidence that educational interventions aimed at caregivers of adult and elderly patients could be effective as well [66,67].

As for interventions in age-mixed populations, significant improvement was reported in less than half of the papers included in our review. These age-mixed studies most often targeted teenagers and young adults with interventions related to sexual and reproductive health. Other research in this field has drawn diverse conclusions. A previous analysis of digital sexual health interventions for young adults found 75% of studies to be effective in changing behaviour and cognition [68]; this discrepancy could suggest that the emergency department is not the optimal setting for delivering such interventions. Yet other research has found that digital health interventions are not effective in influencing the behaviour of adolescents and has called for greater participation of children and adolescents in the co-design of future interventions [69]. Interventions designed (or co-designed) specifically for children could be an interesting avenue of future research, especially given that none of the studies in this review directed their interventions solely at a pediatric population.

Our second objective was to identify the characteristics of successful interventions and determine how success is measured in the studies we examined. Among intervention types, video interventions, closely followed by WBM and app interventions, had the highest rates of significant improvement, with greater improvement in cognitive outcomes than in behavioural outcomes. Text message interventions also led to improvement in cognitive outcomes in most studies. This corroborates previous reviews, which have found web-based [70], video-based [71], and text message-based [72] educational interventions to be effective at positively influencing cognitive outcomes. There are several potential explanations for why videos were the most effective modality. Videos can provide dynamic visualization of complex concepts while allowing the patient to pause, go back, and generally engage with the material at their own pace. Videos also provide a standardized learning environment where the information is relayed accurately every time, reducing opportunities for human error in demonstration [73]. Furthermore, the Cognitive Theory of Multimedia Learning suggests that media combining visual and auditory modalities can improve understanding compared with words alone [74].

Regarding outcome measurement, the most common metric for success was an improved score on a knowledge questionnaire after receipt of the intervention. Patient satisfaction was also a common measure, although rarely the primary outcome.

Future directions

Standardized reporting and evaluation.

We identified several opportunities for improvement in the future development, implementation, and evaluation of digital educational interventions in the emergency department. There is an overarching lack of standardization in the analysis of intervention quality and in the reporting of results. A standardized, widely adopted evaluation framework for such interventions would facilitate the assessment of their quality, a factor that can greatly impact study outcomes and on which data are currently difficult to gather. Some studies reference known issues with their interventions, such as the use of jargon, but many make no reference to the perceived quality of their intervention. Several sets of reporting guidelines for digital health interventions exist (e.g., mERA [75], iCHECK-DH [76], and STEDI [77]) to supplement the CONSORT guidelines for RCTs. More widespread adoption of these guidelines would likely increase study quality and integrity, and would make future studies more easily comparable.

Intervention accessibility.

It is also worth noting that many of these studies excluded participants with limited digital literacy or without phone or internet access, since the interventions themselves relied on phones or the internet. This serves as a reminder of the risk of exacerbating disparities in health access and outcomes when deployed technologies are not accessible and inclusive by design, an effect that is often pronounced for people who are underserved, people with disabilities, and those with lower incomes [78]. One strategy to tackle these disparities is to encourage the participation of people with low technological literacy in studies, giving them controlled opportunities to engage with digital interventions [79]. Increased inclusion of diverse populations in future research would also aid the generalizability of results and help address accessibility issues early in the design process.

Generally, co-design with patients is an important element to consider in future research on digital educational interventions. Meaningful patient engagement adds value at every step of the research process: formulating a research question, outlining a methodology, designing an intervention, conducting an analysis, co-interpreting results, and mobilizing new knowledge [80,81]. Many of the studies in this review described having their interventions reviewed by physicians or other experts for accuracy, but not by patients for clarity. Retroactive evaluation by patients via satisfaction measures is useful, but future digital educational interventions should engage patients with lived experience of the ED in their co-design and evaluation [81,82]. These interventions are ultimately for the benefit of patients; patients should be partners in the research efforts that create them and measure their impact.

Limitations

The results of this systematic review must be interpreted in the context of some limitations. First, due to the heterogeneity of the articles in our sample, we were unable to perform a meta-analysis. We worked to mitigate this through descriptive quantitative analyses and a thorough qualitative discussion. There was an overall lack of consistency in study methods and in the reporting of outcomes, which limits cohesive comparison across articles, but this is in itself a notable finding for future researchers to build on.

Conclusions

Digital educational interventions have demonstrated the ability to positively impact cognitive outcomes in the emergency department, from information comprehension to patient satisfaction. Videos, WBM, and app interventions seem particularly impactful for caregivers of pediatric populations. However, digital educational interventions may not yet be effective at changing behaviour.

These interventions are important discharge tools. Increasing patients’ understanding of their condition and the care they should take after leaving the ED is important to reduce future injury, reduce return visits to the ED, and improve health outcomes [4,5,6,7]. Future research should work towards developing and enforcing guidelines for the conduct and reporting of digital educational interventions to standardize this quickly growing area of study. Ultimately, these tools should be used to empower patient decision-making and improve health-related outcomes.

Supporting information

S1 Appendix. Search strategies for this systematic review.

https://doi.org/10.1371/journal.pdig.0000772.s001

(DOCX)

S2 Appendix. Original Protocol for this systematic review.

https://doi.org/10.1371/journal.pdig.0000772.s002

(DOCX)

S1 Checklist. PRISMA checklist for this systematic review.

https://doi.org/10.1371/journal.pdig.0000772.s003

(DOCX)

References

  1. Williams DM, Counselman FL, Caggiano CD. Emergency department discharge instructions and patient literacy: a problem of disparity. Am J Emerg Med. 1996;14(1):19–22. pmid:8630148
  2. Engel KG, Heisler M, Smith DM, Robinson CH, Forman JH, Ubel PA. Patient comprehension of emergency department care and instructions: are patients aware of when they do not understand? Ann Emerg Med. 2009;53(4):454–461.e15. pmid:18619710
  3. Murray TH, Hagey J, Willms D, Shillington R, Desjardins R. Health Literacy in Canada: A Healthy Understanding. Ottawa: Canadian Council on Learning; 2008.
  4. Aaby A, Friis K, Christensen B, Rowlands G, Maindal HT. Health literacy is associated with health behaviour and self-reported health: A large population-based study in individuals with cardiovascular disease. Eur J Prev Cardiol. 2017;24(17):1880–8. pmid:28854822
  5. Matsuoka S, Tsuchihashi-Makaya M, Kayane T, Yamada M, Wakabayashi R, Kato NP, et al. Health literacy is independently associated with self-care behavior in patients with heart failure. Patient Educ Couns. 2016;99(6):1026–32. pmid:26830514
  6. Barton AJ, Allen PE, Boyle DK, Loan LA, Stichler JF, Parnell TA. Health Literacy: Essential for a Culture of Health. J Contin Educ Nurs. 2018;49(2):73–8. pmid:29381170
  7. Balakrishnan MP, Herndon JB, Zhang J, Payton T, Shuster J, Carden DL. The Association of Health Literacy With Preventable Emergency Department Visits: A Cross-sectional Study. Acad Emerg Med. 2017;24(9):1042–50. pmid:28646519
  8. Park T, Muzumdar J, Kim H. Digital Health Interventions by Clinical Pharmacists: A Systematic Review. Int J Environ Res Public Health. 2022;19(1):532. pmid:35010791
  9. Knop MR, Nagashima-Hayashi M, Lin R, Saing CH, Ung M, Oy S, et al. Impact of mHealth interventions on maternal, newborn, and child health from conception to 24 months postpartum in low- and middle-income countries: a systematic review. BMC Med. 2024;22(1):196.
  10. Erku D, Khatri R, Endalamaw A, Wolka E, Nigatu F, Zewdie A, et al. Digital Health Interventions to Improve Access to and Quality of Primary Health Care Services: A Scoping Review. Int J Environ Res Public Health. 2023;20(19):6854. pmid:37835125
  11. Saidinejad M, Zorc J. Mobile and web-based education: delivering emergency department discharge and aftercare instructions. Pediatr Emerg Care. 2014;30(3):211–6. pmid:24589814
  12. Viswanath K, Kreuter MW. Health disparities, communication inequalities, and eHealth. Am J Prev Med. 2007;32(5 Suppl):S131–3. pmid:17466818
  13. Sherifali D, Ali MU, Ploeg J, Markle-Reid M, Valaitis R, Bartholomew A, et al. Impact of Internet-Based Interventions on Caregiver Mental Health: Systematic Review and Meta-Analysis. J Med Internet Res. 2018;20(7):e10668. pmid:29970358
  14. Verweel L, Newman A, Michaelchuk W, Packham T, Goldstein R, Brooks D. The effect of digital interventions on related health literacy and skills for individuals living with chronic diseases: A systematic review and meta-analysis. Int J Med Inform. 2023;177:105114. pmid:37329765
  15. LaMorte WW. The Health Belief Model. Boston University School of Public Health; 2022. Available from: https://sphweb.bumc.bu.edu/otlt/mph-modules/sb/behavioralchangetheories/behavioralchangetheories2.html.
  16. Glanz K, Bishop DB. The role of behavioral science theory in development and implementation of public health interventions. Annu Rev Public Health. 2010;31:399–418. pmid:20070207
  17. May CR, Cummings A, Girling M, Bracher M, Mair FS, May CM, et al. Using Normalization Process Theory in feasibility studies and process evaluations of complex healthcare interventions: a systematic review. Implement Sci. 2018;13(1):80. pmid:29879986
  18. Granja C, Janssen W, Johansen MA. Factors Determining the Success and Failure of eHealth Interventions: Systematic Review of the Literature. J Med Internet Res. 2018;20(5):e10235. pmid:29716883
  19. Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. Syst Rev. 2021;10(1):89. pmid:33781348
  20. Global diffusion of eHealth: making universal health coverage achievable. Report of the third global survey on eHealth. Geneva: World Health Organization; 2016.
  21. Ouzzani M, Hammady H, Fedorowicz Z, Elmagarmid A. Rayyan: a web and mobile app for systematic reviews. Syst Rev. 2016;5(1):210. pmid:27919275
  22. McGuinness LA, Higgins JPT. Risk-of-bias VISualization (robvis): An R package and Shiny web app for visualizing risk-of-bias assessments. Res Synth Methods. 2021;12(1):55–61. pmid:32336025
  23. Atzema CL, Austin PC, Wu L, Brzozowski M, Feldman MJ, McDonnell M, et al. Speak fast, use jargon, and don’t repeat yourself: a randomized trial assessing the effectiveness of online videos to supplement emergency department discharge instructions. PLoS One. 2013;8(11):e77057. pmid:24244272
  24. Bloch SA, Bloch AJ. Using video discharge instructions as an adjunct to standard written instructions improved caregivers’ understanding of their child’s emergency department visit, plan, and follow-up: a randomized controlled trial. Pediatr Emerg Care. 2013;29(6):699–704. pmid:23714763
  25. Calderon Y, Cowan E, Leu C-S, Brusalis C, Rhee JY, Nickerson J, et al. A human immunodeficiency virus posttest video to increase condom use among adolescent emergency department patients. J Adolesc Health. 2013;53(1):79–84. pmid:23582525
  26. Suffoletto B, Kristan J, Callaway C, Kim KH, Chung T, Monti PM, et al. A text message alcohol intervention for young adult emergency department patients: a randomized clinical trial. Ann Emerg Med. 2014;64(6):664–72.e4. pmid:25017822
  27. Chan Y-FY, Richardson LD, Nagurka R, Hao K, Zaets SB, Brimacombe MB, et al. Stroke education in an emergency department waiting room: a comparison of methods. Health Promot Perspect. 2015;5(1):34–41. pmid:26000244
  28. Ismail S, McIntosh M, Kalynych C, Joseph M, Wylie T, Butterfield R, et al. Impact of Video Discharge Instructions for Pediatric Fever and Closed Head Injury from the Emergency Department. J Emerg Med. 2016;50(3):e177–83. pmid:26806318
  29. Olives T, Patel R, Thompson H, Joing S, Miner J. Seventy-two-hour antibiotic retrieval from the ED: a randomized controlled trial of discharge instructional modality. Am J Emerg Med. 2016;34(6):999–1005.
  30. Buis L, Hirzel L, Dawood RM, Dawood KL, Nichols LP, Artinian NT, et al. Text Messaging to Improve Hypertension Medication Adherence in African Americans From Primary Care and Emergency Department Settings: Results From Two Randomized Feasibility Studies. JMIR Mhealth Uhealth. 2017;5(2):e9. pmid:28148474
  31. Chernick LS, Stockwell MS, Wu M, Castaño PM, Schnall R, Westhoff CL, et al. Texting to Increase Contraceptive Initiation Among Adolescents in the Emergency Department. J Adolesc Health. 2017;61(6):786–90. pmid:29056437
  32. Chakravarthy B, Somasundaram S, Mogi J, Burns R, Hoonpongsimanont W, Wiechmann W, et al. Randomized pilot trial measuring knowledge acquisition of opioid education in emergency department patients using a novel media platform. Subst Abus. 2018;39(1):27–31. pmid:28873050
  33. Golden-Plotnik S, Ali S, Drendel AL, Wong T, Ferlisi F, Todorovich S, et al. A Web-based module and online video for pain management education for caregivers of children with fractures: A randomized controlled trial. CJEM. 2018;20(6):882–91. pmid:29041997
  34. Ong TE-L, Kua JPH, Yiew LJ, Lim ZY, Thia MXH, Sung SC. Assessing effective methods to educate caregivers on fever in children aimed at reducing input to the paediatric emergency department. Proc Singap Healthc. 2017;27(2):73–84.
  35. Platts-Mills TF, Hollowell AG, Burke GF, Zimmerman S, Dayaa JA, Quigley BR, et al. Randomized controlled pilot study of an educational video plus telecare for the early outpatient management of musculoskeletal pain among older emergency department patients. Trials. 2018;19(1):10. pmid:29304831
  36. Belisle S, Dobrin A, Elsie S, Ali S, Brahmbhatt S, Kumar K, et al. Video Discharge Instructions for Acute Otitis Media in Children: A Randomized Controlled Open-label Trial. Acad Emerg Med. 2019;26(12):1326–35. pmid:31742809
  37. Hart L, Nedadur R, Reardon J, Sirizzotti N, Poonai C, Speechley KN, et al. Web-Based Tools for Educating Caregivers About Childhood Fever: A Randomized Controlled Trial. Pediatr Emerg Care. 2019;35(5):353–8. pmid:27749811
  38. Lepley BE, Brousseau DC, May MF, Morrison AK. Randomized Controlled Trial of Acute Illness Educational Intervention in the Pediatric Emergency Department: Written Versus Application-Based Education. Pediatr Emerg Care. 2020;36(4):e192–e8.
  39. Vayngortin T, Bachrach L, Patel S, Tebb K. Adolescents’ Acceptance of Long-Acting Reversible Contraception After an Educational Intervention in the Emergency Department: A Randomized Controlled Trial. West J Emerg Med. 2020;21(3):640–6. pmid:32421513
  40. Walsh K, Gilmore A, Schumacher J, Coffey S, Frazier P, Ledray L. Post-sexual assault cigarette smoking: Findings from a randomized clinical trial of a video-based intervention. Addict Behav. 2020;100:106121.
  41. Wilkin ZL. Effects of Video Discharge Instructions on Patient Understanding: A Prospective, Randomized Trial. Adv Emerg Nurs J. 2020;42(1):71–8. pmid:32000193
  42. Merchant RC, Marks SJ, Clark MA, Carey MP, Liu T. Comparison of a video to a pictorial brochure in improving HIV/AIDS and HIV testing knowledge and increasing HIV testing motivation and behavioral skills among adult emergency department patients. J Am Coll Emerg Physicians Open. 2020;1(3):202–13. pmid:33000035
  43. Dimeff LA, Jobes DA, Koerner K, Kako N, Jerome T, Kelley-Brimer A, et al. Using a Tablet-Based App to Deliver Evidence-Based Practices for Suicidal Patients in the Emergency Department: Pilot Randomized Controlled Trial. JMIR Ment Health. 2021;8(3):e23022. pmid:33646129
  44. Hoek AE, Joosten M, Dippel DWJ, van Beeck EF, van den Hengel L, Dijkstra B, et al. Effect of Video Discharge Instructions for Patients With Mild Traumatic Brain Injury in the Emergency Department: A Randomized Controlled Trial. Ann Emerg Med. 2021;77(3):327–37. pmid:33618811
  45. Jové-Blanco A, Solís-García G, Torres-Soblechero L, Escobar-Castellanos M, Mora-Capín A, Rivas-García A, et al. Video discharge instructions for pediatric gastroenteritis in an emergency department: a randomized, controlled trial. Eur J Pediatr. 2021;180(2):569–75. pmid:33029683
  46. McElhinny M, Chea K, Carter-Powell A, Mishler A, Bhattarai B, Geren K. Adult emergency department naloxone education and prescription program: Video and pamphlet education comparison. J Subst Abuse Treat. 2021;127:108346. pmid:34134864
  47. Omaki E, Castillo R, McDonald E, Eden K, Davis S, Frattaroli S, et al. A patient decision aid for prescribing pain medication: Results from a pilot test in two emergency departments. Patient Educ Couns. 2021;104(6):1304–11. pmid:33280968
  48. Chernick LS, Santelli J, Stockwell MS, Gonzalez A, Ehrhardt A, Thompson JLP, et al. A multi-media digital intervention to improve the sexual and reproductive health of female adolescent emergency department patients. Acad Emerg Med. 2022;29(3):308–16. pmid:34738284
  49. Meisel ZF, Shofer F, Dolan A, Goldberg EB, Rhodes KV, Hess EP, et al. A Multicentered Randomized Controlled Trial Comparing the Effectiveness of Pain Treatment Communication Tools in Emergency Department Patients With Back or Kidney Stone Pain. Am J Public Health. 2022;112(S1):S45–55. pmid:35143273
  50. Rodriguez RM, Nichol G, Eucker SA, Chang AM, O’Laughlin KN, Pauley A, et al. Effect of COVID-19 Vaccine Messaging Platforms in Emergency Departments on Vaccine Acceptance and Uptake: A Cluster Randomized Clinical Trial. JAMA Intern Med. 2023;183(2):115–23. pmid:36574256
  51. Lin Y-K, Yeh Y-S, Chen C-W, Lee W-C, Lin C-J, Kuo L-C, et al. Parental Educational Intervention to Facilitate Informed Consent for Pediatric Procedural Sedation in the Emergency Department: A Parallel-Group Randomized Controlled Trial. Healthcare (Basel). 2022;10(12):2353. pmid:36553877
  52. Rodriguez RM, Eucker SA, Rafique Z, Nichol G, Molina MF, Kean E, et al. Promotion of Influenza Vaccination in the Emergency Department. NEJM Evid. 2024;3(4):EVIDoa2300197. pmid:38776635
  53. Zhang AY, Leviter J, Baird J, Charles-Chauvet D, Frackiewicz LM, Duffy S, et al. Buckle me up! A randomised controlled trial using a tablet-based emergency department intervention for child car safety education. Inj Prev. 2024;30(4):334–40. pmid:38302281
  54. Di Pietro S, Ferrari I, Bulgari G, Muiesan ML, Falaschi F, De Silvestri A, et al. Video clips for patient comprehension of atrial fibrillation and deep vein thrombosis in emergency care. A randomised clinical trial. NPJ Digit Med. 2024;7(1):107. pmid:38688958
  55. Alqaydi A, Williams E, Nanji S, Zevin B. Optimizing the consent process for emergent laparoscopic cholecystectomy using an interactive digital education platform: a randomized control trial. Surg Endosc. 2024;38(5):2593–601.
  56. Adler DH, Wood N, Fiscella K, Rivera MP, Hernandez-Romero B, Chamberlin S, et al. Increasing Uptake of Lung Cancer Screening Among Emergency Department Patients: A Pilot Study. J Emerg Med. 2024;67(2):e164–76. pmid:38839453
  57. Abar B, Park CS, Wood N, Marino D, Fiscella K, Adler D. Intervention to increase colorectal cancer screening among emergency department patients: results from a randomised pilot study. Emerg Med J. 2024;41(7):422–8. pmid:38777559
  58. Patient-reported outcome measures (PROMs). Canadian Institute for Health Information. Available from: https://www.cihi.ca/en/patient-reported-outcome-measures-proms.
  59. Sanmartin C, Ross N. Experiencing difficulties accessing first-contact health services in Canada. Healthc Policy. 2006;1(2):103–19.
  60. Park J, Jeon H, Choi E. Digital health intervention on patient safety for children and parents: A scoping review. J Adv Nurs. 2023.
  61. Mörelius E, Robinson S, Arabiat D, Whitehead L. Digital Interventions to Improve Health Literacy Among Parents of Children Aged 0 to 12 Years With a Health Condition: Systematic Review. J Med Internet Res. 2021;23(12):e31665. pmid:34941559
  62. Peyton D, Goods M, Hiscock H. The Effect of Digital Health Interventions on Parents’ Mental Health Literacy and Help Seeking for Their Child’s Mental Health Problem: Systematic Review. J Med Internet Res. 2022;24(2):e28771. pmid:35142623
  63. Seifert A, Reinwand DA, Schlomann A. Designing and Using Digital Mental Health Interventions for Older Adults: Being Aware of Digital Inequality. Front Psychiatry. 2019;10:568. pmid:31447716
  64. Wale A, Everitt J, Ayres T, Okolie C, Morgan H, Shaw H. A rapid review of the effectiveness of interventions for addressing digital exclusion in older adults. medRxiv [Preprint]. 2024;2024:24304670.
  65. Aarthun A, Øymar KA, Akerjordet K. Parental involvement in decision-making about their child’s health care at the hospital. Nurs Open. 2018;6(1):50–8. pmid:30534394
  66. Evans I, Patel R, Stoner C, Melville M, Spector A. A systematic review of educational interventions for informal caregivers of people living with dementia in low and middle-income countries. Behav Sci (Basel). 2024;14(3).
  67. Lee J, Yeom I, Yoo S, Hong S. Educational intervention for family caregivers of older adults with delirium: An integrative review. J Clin Nurs. 2023;32(19–20):6987–97. pmid:37370251
  68. Sewak A, Yousef M, Deshpande S, Seydel T, Hashemi N. The effectiveness of digital sexual health interventions for young adults: a systematic literature review (2010–2020). Health Promot Int. 2023;38(1):1–10.
  69. The Lancet Digital Health. Children must co-design digital health research. Lancet Digit Health. 2023;5(5):e248. pmid:37032201
  70. de Sousa D, Fogel A, Azevedo J, Padrão P. The Effectiveness of Web-Based Interventions to Promote Health Behaviour Change in Adolescents: A Systematic Review. Nutrients. 2022;14(6):1258. pmid:35334915
  71. Deshpande N, Wu M, Kelly C, Woodrick N, Werner DA, Volerman A, et al. Video-Based Educational Interventions for Patients With Chronic Illnesses: Systematic Review. J Med Internet Res. 2023;25:e41092. pmid:37467015
  72. Hall AK, Cole-Lewis H, Bernhardt JM. Mobile text messaging for health: a systematic review of reviews. Annu Rev Public Health. 2015;36:393–415. pmid:25785892
  73. Krumm IR, Miles MC, Clay A, Carlos WG II, Adamson R. Making Effective Educational Videos for Clinical Teaching. Chest. 2022;161(3):764–72. pmid:34587482
  74. Morgado M, Botelho J, Machado V, Mendes JJ, Adesope O, Proença L. Video-based approaches in health education: a systematic review and meta-analysis. Sci Rep. 2024;14(1):23651. pmid:39384592
  75. Agarwal S, Lefevre AE, Labrique AB. A Call to Digital Health Practitioners: New Guidelines Can Help Improve the Quality of Digital Health Evidence. JMIR Mhealth Uhealth. 2017;5(10):e136. pmid:28986340
  76. Perrin Franck C, Babington-Ashaye A, Dietrich D, Bediang G, Veltsos P, Gupta PP, et al. iCHECK-DH: Guidelines and Checklist for the Reporting on Digital Health Implementations. J Med Internet Res. 2023;25:e46694. pmid:37163336
  77. Posadzki PCL, Jarbrink K, Bajpai R, Semwal M, Kyaw B. Protocol for development of the Standards for Reporting of Digital Health Education Intervention Trials (STEDI) statement; 2017.
  78. Saeed SA, Masters RM. Disparities in Health Care and the Digital Divide. Curr Psychiatry Rep. 2021;23(9):61. pmid:34297202
  79. Latulippe K, Hamel C, Giroux D. Social Health Inequalities and eHealth: A Literature Review With Qualitative Synthesis of Theoretical and Empirical Studies. J Med Internet Res. 2017;19(4):e136. pmid:28450271
  80. Osmanlliu E, Paquette J, Grenier A-D, Lewis P, Bouthillier M-E, Bédard S, et al. Fantastic perspectives and where to find them: involving patients and citizens in digital health research. Res Involv Engagem. 2022;8(1):37. pmid:35918730
  81. McDonald IR, Blocker ES, Weyman EA, Smith N, Dwyer AA. What Are the Best Practices for Co-Creating Patient-Facing Educational Materials? A Scoping Review of the Literature. Healthcare (Basel). 2023;11(19):2615. pmid:37830651
  82. Bombard Y, Baker GR, Orlando E, Fancott C, Bhatia P, Casalino S, et al. Engaging patients to improve quality of care: a systematic review. Implement Sci. 2018;13(1):98. pmid:30045735