What is the value and impact of the adaptation process on quality indicators for local use? A scoping review

  • Siyi Zhu ,

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Writing – original draft, Writing – review & editing

    hxkfzsy@scu.edu.cn (SZ); hxkfhcq2015@126.com (CH); 5648197@qq.com (LY)

    Affiliations Rehabilitation Medicine Center, West China Hospital, Sichuan University, Chengdu, China, Rehabilitation Key Laboratory of Sichuan Province, West China Hospital, Sichuan University, Chengdu, China, Arthritis Research Canada, Vancouver, British Columbia, Canada

  • Tao Wu,

    Roles Data curation, Formal analysis, Methodology, Resources, Visualization, Writing – review & editing

    Affiliations Rehabilitation Medicine Center, West China Hospital, Sichuan University, Chengdu, China, Rehabilitation Key Laboratory of Sichuan Province, West China Hospital, Sichuan University, Chengdu, China

  • Jenny Leese,

    Roles Data curation, Formal analysis, Investigation, Resources, Validation

    Affiliations Arthritis Research Canada, Vancouver, British Columbia, Canada, Faculty of Medicine, School of Epidemiology and Public Health, University of Ottawa, Ottawa, Ontario, Canada

  • Linda C. Li,

    Roles Conceptualization, Investigation, Methodology, Resources, Supervision, Validation, Visualization, Writing – review & editing

    Affiliations Arthritis Research Canada, Vancouver, British Columbia, Canada, Department of Physical Therapy, University of British Columbia, Vancouver, British Columbia, Canada

  • Chengqi He ,

    Roles Conceptualization, Funding acquisition, Methodology, Project administration, Resources, Supervision, Validation, Writing – review & editing

    hxkfzsy@scu.edu.cn (SZ); hxkfhcq2015@126.com (CH); 5648197@qq.com (LY)

    Affiliations Rehabilitation Medicine Center, West China Hospital, Sichuan University, Chengdu, China, Rehabilitation Key Laboratory of Sichuan Province, West China Hospital, Sichuan University, Chengdu, China

  • Lin Yang

    Roles Funding acquisition, Investigation, Software, Supervision, Validation, Visualization, Writing – review & editing

    hxkfzsy@scu.edu.cn (SZ); hxkfhcq2015@126.com (CH); 5648197@qq.com (LY)

    Affiliations Rehabilitation Medicine Center, West China Hospital, Sichuan University, Chengdu, China, Rehabilitation Key Laboratory of Sichuan Province, West China Hospital, Sichuan University, Chengdu, China

Abstract

Background

Quality indicators (QIs) are designed to improve the quality of care, but their development is resource-intensive and time-consuming.

Objective

To describe and identify the impact and potential attributes of the adaptation process for the local use of existing QIs.

Data sources

EMBASE, MEDLINE, CINAHL and grey literature were searched.

Study selection

Studies operationalizing or implementing QIs in a jurisdiction different from the one in which the QIs were developed were included.

Results

Of 7704 citations identified, 10 of 33 articles were included. Our results revealed a lack of definition and conceptualization of the adaptation process through which an existing set of QIs is applied. Four of the ten studies involved a consensus process (e.g., a Delphi or RAND process) to determine the suitability of QIs for local use. QIs for chronic conditions in primary and secondary care settings were most commonly adapted. Of the studies that used a consensus process, 56.3% to 85.7% of the original QIs were considered valid for local use, and 2% to 21.8% of the proposed QIs were newly added. Four attributes should be considered in the adaptation: 1) identifying areas/conditions; 2) a consensus process; 3) proposing adapted QIs; and 4) operationalization and evaluation.

Conclusion

Existing QIs, although serving as a good starting point, have not been adequately adapted before use in a jurisdiction different from their origin. Adapting QIs through a systematic approach is critical for informing future research planning for QI adaptation and may establish a new pathway for healthcare improvement.

Introduction

Quality improvement is a cornerstone of health care and clinical practice, and the Institute of Medicine defined quality of care as “the degree to which health services for individuals and populations increase the likelihood of desired health outcomes and are consistent with current professional knowledge” [1]. Performance measurement is one of the five domains involved in improving quality of care [2]. In reality, however, practice and quality of care vary remarkably. For example, divergence from evidence-based care is commonly seen in the management of osteoarthritis and other conditions, where only around half of patients receive guideline-recommended treatment [3]. Therefore, tremendous efforts have been made to investigate how and for which conditions quality deficiencies arise, so as to target where improvements are needed. It is increasingly common for countries around the world to set national benchmarks and public standards to measure or monitor the quality of care by means of indicators [4].

Quality indicators (QIs) are “standardized, evidence-based measures of healthcare quality that can be used to measure and track health status and health system performance, and characteristics across different populations, between jurisdictions, or over time” [5]. Researchers in the RAND group developed a consensus method (similar to the development of clinical guidelines) for systematically combining best evidence with expert opinion to assess the appropriateness of healthcare, leading the way in the use of QIs [6]. Using a modified version of this method, QI statements specifying the care structure, care process and expected outcomes are generated from synthesized evidence or clinical guideline recommendations [7]. Typically, QIs are statements about the structure, process and outcomes of care in the ‘IF-THEN-BECAUSE’ format [8, 9]. The “IF” statement determines eligibility for the care process in question, the “THEN” statement specifies what care process should be performed, and the “BECAUSE” statement describes the expected health impact where the indicator is implemented [10]. An example of a quality indicator is as follows: “IF a patient has symptomatic osteoarthritis of the knee or hip and has been overweight for 3 years THEN he/she should receive referral to a weight loss program” [11]. QIs have been developed and applied at different levels and for various stakeholders to facilitate regulation, measure performance, advance accountability and improve the quality of care. Policymakers can use QIs as a tool to track and evaluate the impact of policies. Public and private payers are interested in QIs to measure the cost-effectiveness of treatment before approving payment, as in pay-for-performance schemes. Health professionals can use QIs to monitor their practice closely and improve performance accordingly [12]. QIs are also an important source of consumer information for patients considering treatment choices. For patients with osteoarthritis, for example, QIs were developed in 2004 by the Arthritis Foundation, representing the minimal, evidence-based care that the targeted population should receive [11].
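
To make the ‘IF-THEN-BECAUSE’ format concrete, the sketch below represents a single indicator as a data record; it is an illustration only, with field names chosen by us and a hypothetical “BECAUSE” wording added to the osteoarthritis example quoted above.

```python
from dataclasses import dataclass

@dataclass
class QualityIndicator:
    """Illustrative container for a QI stated in the 'IF-THEN-BECAUSE' format."""
    if_clause: str       # eligibility: who the care process applies to
    then_clause: str     # the care process that should be performed
    because_clause: str  # the expected health impact of that process
    domain: str          # structure, process, or outcome of care

    def statement(self) -> str:
        return (f"IF {self.if_clause} THEN {self.then_clause} "
                f"BECAUSE {self.because_clause}")

# Built from the osteoarthritis indicator quoted in the text;
# the BECAUSE wording is hypothetical, added for illustration.
qi = QualityIndicator(
    if_clause=("a patient has symptomatic osteoarthritis of the knee or hip "
               "and has been overweight for 3 years"),
    then_clause="he/she should receive referral to a weight loss program",
    because_clause="weight loss is expected to reduce symptoms and disability",
    domain="process",
)
print(qi.statement())
```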

Large-scale efforts to develop and implement QIs for healthcare across many conditions have been made mostly in the US and Europe [5, 13–15]. However, the development process for QIs is time-consuming and comes at a high cost, so it is reasonable for countries to use existing QIs rather than develop their own. In the US, developing the RAND QIs took four years, from 1995 to 1999, whereas researchers in the UK adapted this set of QIs for local use in less than two years [16]. Transferring QIs between countries through an intermediate process is therefore feasible, but the rate of agreement on accepting QIs for local use differs remarkably. Care of the elderly is considered an increasingly important component of healthcare services worldwide. The Assessing Care Of Vulnerable Elders (ACOVE) QIs were developed by RAND and UCLA to evaluate and optimize care for elderly patients in the US [17]. Several studies adapted the ACOVE QIs for local use in the UK and the Netherlands, with approval rates (the number of original QIs approved relative to the number of QIs proposed in the adaptation) varying from 56% to 86% [16, 18, 19]. There have also been examples in which QIs developed in one jurisdiction were used in another region without adaptation, under a direct-adoption process [20, 21]. An expert panel from England rated 79 of 93 chosen ACOVE QIs as valid for local use without any amendment [18]. Although there are potential benefits in using an existing set of QIs as a starting point for local adaptation, it is unclear what an adaptation process should involve when transferring a set of QIs for local use, and whether such a process is always appropriate or desirable. It is worth noting that cross-cultural adaptation of a quality measure should produce versions equivalent to the original QIs while being linguistically and culturally adjusted to the local context, and the first step is to conduct a scoping review to understand the state of the art in QI selection and development [22]. Therefore, the aim of this study was to review the literature on the processes used for adapting an existing set of QIs for local use.

Materials and methods

The scoping review protocol was guided by the work of Arksey and O’Malley [23], as further refined by Peters et al. under the auspices of the Joanna Briggs Institute [24]. The draft protocol was revised based on feedback received from co-authors and registered in an international prospective registry of systematic reviews (PROSPERO; reference number: CRD42018096844) [25]. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) guidelines were followed [26].

For the purpose of the review, a quality indicator was defined as a measure transformed into a validated statement, presented as a percentage or proportion, describing the quality of care delivered [27]. We defined an adaptation process as a standardized process that purposefully selects a set of existing QIs (with or without an intermediate process, e.g., a RAND or Delphi process) and transforms them into a set deemed suitable for use in the local context, before pilot testing or full-scale implementation.
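
As a minimal sketch of the kind of consensus step referred to above, the snippet below classifies a candidate indicator from panellists’ ratings. The 1–9 scale and the cut-offs follow conventions commonly used in RAND/UCLA-style appropriateness ratings; the thresholds are illustrative assumptions, not rules reported by the included studies.

```python
from statistics import median

def classify_indicator(ratings: list[int], valid_cutoff: int = 7, reject_cutoff: int = 3) -> str:
    """Classify a candidate QI from panel ratings on a 1-9 validity scale.

    A median at or above `valid_cutoff` is treated as valid for local use and
    a median at or below `reject_cutoff` as discarded; anything in between is
    carried into another rating round. The cut-offs are illustrative only.
    """
    m = median(ratings)
    if m >= valid_cutoff:
        return "valid for local use"
    if m <= reject_cutoff:
        return "discard"
    return "discuss in a further rating round"

# Hypothetical ratings from a nine-member multidisciplinary panel
print(classify_indicator([8, 7, 9, 6, 8, 7, 7, 9, 8]))  # valid for local use
print(classify_indicator([4, 5, 3, 6, 4, 5, 4, 3, 5]))  # discuss in a further rating round
```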

Search strategy

Search strategies were developed by combining subject headings and text words related to the concepts of QIs and adaptation processes (Supplementary materials). EMBASE, MEDLINE and CINAHL were searched from January 1990 to September 2019. To identify grey literature, Google Scholar was searched using keywords including “quality indicators”, “quality improvement”, “healthcare”, “cross-cultural adaptation” and “transferability”, together with websites of quality improvement research. A librarian specializing in medical information helped revise the search strategies. The reference lists of included studies were reviewed to identify additional eligible studies. An additional search was performed, following the approach of a living systematic review [28], to identify relevant studies published from October 2019 to October 2022 using the databases and keywords described above.
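
As an illustration of how such a strategy is assembled, the sketch below joins synonyms for each concept with OR and combines the two concepts with AND. The term lists are abbreviated examples drawn from the keywords mentioned above, not the full strategy reported in the Supplementary materials.

```python
# Illustrative only: abbreviated term lists, not the published search strategy.
qi_terms = ['"quality indicator*"', '"quality measure*"', '"performance measure*"']
adaptation_terms = ['"cross-cultural adaptation"', 'transferab*', '"local use"']

def or_block(terms: list[str]) -> str:
    """Join synonyms for one concept with OR, wrapped in parentheses."""
    return "(" + " OR ".join(terms) + ")"

# Combine the two concept blocks with AND to form the final query string
query = " AND ".join([or_block(qi_terms), or_block(adaptation_terms)])
print(query)
```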

Study selection

Endnote X9 was used to manage citations retrieved from the searches and to remove duplicates. Studies were eligible if they 1) adapted or operationalized a set of existing QIs for local use; 2) assessed the acceptability or usability of the adapted or operationalized QIs, with a definition of the extent to which an indicator was judged acceptable or practical for local use by those being assessed or those performing the assessment; and 3) were published in English or Chinese. Two reviewers independently conducted the level 1 (titles and abstracts) and level 2 (full text) screenings. A screening tool was developed and tested on a sample of retrieved citations prior to the full review (Supplementary materials). Any disagreement between the two reviewers was discussed and, if needed, resolved through discussion with a third party.

Data collection and synthesis

For included studies, data were extracted on study characteristics (e.g., first author, year and country of publication), participant/population description, healthcare settings, the process involved in adaptation, objectives and key conclusions. QIs were classified into the original set and the adapted set based on the aspects of structure, process and outcome assessed. Information on the QIs used and adapted, test results where applicable, and any findings related to the adaptation were reviewed and mapped. A data extraction form was developed and piloted on a sample of citations prior to data extraction. Two reviewers extracted data independently; disagreements were resolved by consensus or, where needed, by arbitration by a third party.

Quality assessment

The quality of included studies was assessed by two reviewers using the Revised Standards for Quality Improvement Reporting Excellence (SQUIRE 2.0), a framework with 18 items for reporting new knowledge about how to improve healthcare [29]. A quality score out of eighteen was given to each included study. The two reviewers scored the studies independently, and agreement was reached by consensus or, where needed, by arbitration by a third party.
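
A minimal sketch of this scoring step is given below: each of the 18 SQUIRE 2.0 items is marked as reported or not, and the marks are summed. The item keys are placeholders, not the wording of the checklist.

```python
# Illustrative tally of one study's SQUIRE 2.0 appraisal; item keys are placeholders.
appraisal = {f"item_{i:02d}": True for i in range(1, 19)}  # 18 checklist items
appraisal["item_17"] = False  # e.g., limitations not described
appraisal["item_18"] = False  # e.g., funding source not reported

score = sum(appraisal.values())  # True counts as 1, False as 0
print(f"SQUIRE 2.0 score: {score}/18")  # -> SQUIRE 2.0 score: 16/18
```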

Results

The search yielded 7704 articles, plus 687 records retrieved from the additional search and grey literature, of which 56 articles were identified as eligible for full-text screening after removal of duplicates and review of titles and abstracts. Of these, 10 articles [16, 18–21, 30–34] were included in this study for data extraction and synthesis, 6 of which were identified manually by the authors (Fig 1). For 90% of the included studies, the primary aim was to assess the acceptability of QIs transferred from one country to another through an intermediate or direct-adoption process, and one study [32] was conducted to assess quality of care by operationalizing QIs locally (Table 2). Studies were excluded if they did not involve adapting or operationalizing a set of existing QIs for local use. The main reasons for exclusion were that no QI was used in the study (n = 9) or no adaptation process was described (n = 20). Of the latter, 10 were QI development studies, eight were QI implementation studies, and two were methodology studies.

Quality of included studies

The included articles were assessed as high quality, with SQUIRE 2.0 scores [29] of 18/18 for 4 studies, 17/18 for 3 studies, 16/18 for 2 studies and 15/18 for one study. Ethical considerations, description of limitations and indication of funding sources were identified as the three main weaknesses in the quality assessment. The findings are summarized in Table 1.

Table 1. Summary of the SQUIRE statement quality appraisal results by criterion.

https://doi.org/10.1371/journal.pone.0278379.t001

Study characteristics

Table 2 summarizes the characteristics of the included studies. The years of publication ranged from 2002 to 2017. Seven of the 10 studies were undertaken in Europe, including the UK (2), a European consortium (2), the Netherlands (1), Denmark (1) and Italy (1). Two studies [21, 30] were from the Asia-Pacific region and one was from Canada. The participants involved in the included studies varied with respect to clinical discipline and how the QIs were adapted. The disciplines included geriatric medicine (3), headache conditions (3), palliative care (1), mental health (1), respiratory medicine (1) and rheumatology (1). Four studies [16, 18, 19, 30] formed a multidisciplinary expert panel to assess the transferability of QIs between countries through an intermediate process supported by the RAND/UCLA and Delphi methods. The other studies tested the QIs in at least one of the target populations for which the QIs were originally developed, including healthcare professionals, health services managers and patients. In this latter group of studies, the acceptability of QIs was also assessed by reviewing medical records or survey results. Regarding the setting in which QIs were adapted, six studies were conducted in general practice or primary and secondary care settings; the others were conducted in secondary or tertiary care settings.

Adaptation of quality indicators

Table 3 maps the use of the existing QIs, with or without adaptation. Seven separate sets of original QIs were employed in the included studies, of which three were from the USA, two were from Europe, and two were developed by an international consortium. Of the existing QIs targeted, the process domain was covered in all studies, the structure domain was addressed in four studies, and the outcome domain was addressed in one study.

The total number of proposed QIs across the included studies ranged from 4 to 174, combining the original set and newly added QIs. Three studies included new QIs in addition to the original set, accounting for 2% to 21.8% of the proposed QIs. After the adaptation process in the four studies that used one, at least half of the proposed QIs were approved as valid for use; the approval rate for proposed QIs ranged from 56.3% to 85.7%, indicating that 17 to 76 QIs were discarded in the process. Of the proposed QIs, 51 to 98 from the original sets were retained nearly identically, without any change, and only 4 QIs were reported as adapted with major changes [19]. Only two studies [16, 19] reported reasons or provided comments on why QIs were discarded or changed in the adaptation. Fig 2 maps the reasons or comments for discarding or changing a QI in the adaptation.
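
To make the counts above explicit, the approval rate used here is simply the number of QIs judged valid divided by the number of QIs proposed for local use, expressed as a percentage. The sketch below illustrates the arithmetic with hypothetical counts chosen only to reproduce the two ends of the reported range.

```python
def approval_rate(valid: int, proposed: int) -> float:
    """Share of proposed QIs rated valid for local use, as a percentage."""
    return 100 * valid / proposed

# Hypothetical counts; chosen only to reproduce the reported range endpoints
print(f"{approval_rate(98, 174):.1f}%")  # 56.3%
print(f"{approval_rate(18, 21):.1f}%")   # 85.7%
```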

Of the four studies [16, 18, 19, 30], only one reported that transfer of the existing QIs was feasible; the others indicated that several issues, including cultural context, differences in practice and methodological issues, should be considered in the adaptation process. None of these studies formulated a systematic or consistent adaptation approach addressing the issues arising from the adaptation process. On the other hand, the six studies that did not adopt an adaptation process all operationalized the existing QIs without any change. Three of these six studies reported that the original QIs were easy to apply and acceptable, with over 80% approval for each QI used; one reported that the QIs were operationalized to assess quality of care; and two found that QIs without adaptation were not fully acceptable. As in the studies adopting an adaptation process, issues including structural differences, differing interpretations of QIs and cross-cultural validation should be further considered when operationalizing QIs. Judged against the steps recommended by Beaton et al. [35] and the methodology proposed in recent research [22], none of the included studies used a context framework or programme theory (e.g., the Consolidated Framework for Implementation Research or the Theoretical Domains Framework) to evaluate the adaptation process for local implementation or to provide a mechanism for adaptation. Fig 2 maps the main issues mentioned in the studies that may affect the process of adapting QIs for local use.

We mapped the attributes of the adaptation process in Fig 2. A systematic adaptation approach essentially includes four attributes: 1) identification of areas/conditions for indicator adaptation and operationalization; 2) an intermediate process combining and considering the components listed in Fig 2; 3) introduction of a set of proposed indicators for local use; and 4) operationalization of the adapted QIs to evaluate their applicability, incorporating implementation methods where applicable.
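
As a compact restatement, the sketch below lists these four attributes as an ordered sequence; the step names are our paraphrases of the attributes above, and the structure is illustrative rather than a prescribed implementation.

```python
from enum import Enum

class AdaptationStep(Enum):
    """Four attributes of a systematic QI adaptation approach (paraphrased)."""
    IDENTIFY_AREAS = 1        # areas/conditions for adaptation and operationalization
    INTERMEDIATE_PROCESS = 2  # expert panel, consensus technique, wording review
    PROPOSE_INDICATORS = 3    # introduce the set proposed for local use
    OPERATIONALIZE = 4        # evaluate applicability, incorporating implementation methods

for step in AdaptationStep:
    print(f"{step.value}. {step.name.replace('_', ' ').title()}")
```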

Discussion

This scoping review was conducted to examine evidence specifically on adapting or operationalizing an existing set of QIs for local use. Our results highlight the essential role and four attributes of the adaptation process in transferring QIs between countries and different healthcare settings. Four of the ten included studies used a consensus process, such as RAND/UCLA or Delphi, to evaluate the acceptability or usability of QIs for local use. The remaining six studies operationalized the QIs without an adaptation process, involving a multidisciplinary staff team and medical record review to assess the validity and acceptability of these measures. No included study used implementation methods to assess the adaptation process. Most studies with good-quality reporting were from European countries and involved QIs developed and implemented within similar settings. However, remarkable variability was observed in how an existing set of QIs was applied locally, reflecting the lack of a systematic approach to adapting and operationalizing existing QIs and underscoring the need for adaptation.

To apply QIs developed in a particular setting or for a particular condition, adaptation to the local context and to a specific purpose, even within one country, is needed and should be conducted in a systematic manner [4]. Some studies have suggested essential steps to be followed in the process of cross-cultural adaptation, combining different approaches such as evidence-based methods, consensus techniques and qualitative or implementation research [22, 35]. Furthermore, the widespread and well-established methods for developing clinical guidelines should also be an important source to help shape QI adaptation and dissemination [36]. However, the absence of a standard, systematic adaptation approach posed challenges in retrieving citations from databases. Accordingly, we found that very few studies applied existing QIs, considered a useful evidence-based quality improvement tool, outside the setting in which they were developed. About 62% of the articles were excluded from this study because they developed and implemented QIs using their own resources and at high cost [16]. Specifically, the development of several sets of QIs for populations with osteoarthritis represented duplicate efforts to assess the same process, such as the weight management recommendations that patients should receive from physicians [37].

Similar to the systematic process of guideline development, the adaptation process should ideally take place iteratively, combining input from researchers and other stakeholders [38]. Based on what we found in this study, we propose a systematic approach with four potential attributes to be addressed during the adaptation process. First, areas/conditions of interest for QI adaptation should be identified through a systematically conducted literature review, which serves as an input for the selection. With reference to the adaptation and cross-validation of the US Primary Care Assessment Tools in different countries [22, 39–42], several steps should be taken in the intermediate process, including but not limited to 1) identification of the original set of QIs; 2) an expert panel, user focus groups and interviews; 3) consensus techniques (e.g., the Delphi process, the RAND/UCLA method or the Nominal Group Technique); and 4) review of QI wording for local use. Although a review of the scientific evidence is not imperative in the adaptation process, it serves to promote mutual understanding across different contexts and to inform the development and adaptation process. After the introduction of a set of proposed QIs for local use, the degree to which the QIs have been approved and whether the QI statements are relevant to clinical guidelines need to be assessed using the rigorous methods validated in the development process, to facilitate decision making [35, 43]. In this assessment, implementation theories, models and frameworks should be used to gain a deeper understanding of the fundamentals, processes and contexts that shape how the adaptation process works, as well as the interactions between different stakeholders [44]. The qualitative content extracted from this process serves to elicit and analyze the impact and value of QI adaptation and to inform future implementation [45].

Furthermore, several issues arising from the existing evidence should be taken into account during the intermediate process for the adaptation of QIs. In our study, we found that the approval rate of QIs in the included studies ranged from 56.3% to 85.7%, and no study could conclude that introducing existing QIs without adaptation was simply a matter of copy and paste [4]. Marshall et al. [16] and van der Ploeg et al. [19], who adapted US QIs for local use in the UK and the Netherlands respectively, both indicated that issues such as differences in professional practice, expert opinion, panel process, integration of literature sources and healthcare context often lead to controversy and inconsistency in the approval of QIs for local use. For example, they found that the body of evidence reviewed by US researchers differed in content and literature sources from that reviewed in the UK and the Netherlands, and some QIs were discarded by panellists for this reason. In addition, the information infrastructure used to extract data and test indicators, as well as the healthcare systems themselves, differ between regions [4]. Several of the included studies stated that regional and contextual variations in the structure and process of healthcare systems are major contributors to the failure of directly adopting QIs without any adaptation. In Asia-Pacific countries, more than half of the 12 European QIs for dementia were found to be problematic for use in residential long-term care facilities, a setting different from the one for which the QIs were developed [21].

On the other hand, contextual factors should be emphasized from the start, when identifying areas or conditions of interest for QI adaptation. In the conceptual model of health care, although outcomes are of greatest interest to patients, improving processes, and understanding how processes improve outcomes, has become the primary target for quality improvement [27, 43, 46]. Process QIs were found in all studies included in this review, consistent with findings that the volume of process QIs in the National Quality Measures Clearinghouse increased almost tenfold in the decade between 2003 and 2013 [27]. Process measurement of healthcare involving QIs has become increasingly common practice and a statutory obligation for administrations worldwide to identify gaps, advance decision-making and inform healthcare policy and delivery [12, 47]. In our study, we found studies in both primary and secondary care settings [20]. Consistent with current knowledge of the development and implementation of QIs, QIs are mostly designed for preventive care and care for chronic conditions, and there is a paucity of QIs for emergency outbreaks and infectious diseases [43]. For example, the Coronavirus disease 2019 (Covid-19) pandemic has demonstrated health system failures in terms of outbreak preparedness and coping with a sudden surge in demand for services such as acute care and rehabilitation [48]. How to provide a consistent level of care to patients, whether or not they are infected with Covid-19, poses great challenges to healthcare systems during the pandemic and beyond. Therefore, adapting QIs for intensive care units and respiratory medicine and implementing them in the response to Covid-19 may provide enormous opportunities to reshape the healthcare delivery system [49, 50].

Strengths and limitations

The study has several strengths. First, our scoping review was conducted in compliance with the guidance endorsed by the Joanna Briggs Institute [24] and was supported by a research librarian with expertise in search strategy development and evidence synthesis. Second, our review serves as a good starting point for providing data on using and adapting existing QIs for local use, in order to inform policymakers, healthcare professionals and other stakeholders on how to introduce a set of QIs in a cost-effective manner. Third, the findings of this study identify the need to adapt existing QIs for local use, which will inform future research on this topic. Lastly, we worked closely with a multidisciplinary and cross-cultural research team in conducting the review and interpreting the findings.

It is worth noting that this study has some limitations. First, although quality assessment is not required for a scoping review, the quality of studies may not have been appropriately assessed by SQUIRE 2.0, as the checklist is intended for reports that describe system-level work and attribute observed outcomes to interventions. Second, only studies published in English or Chinese were searched, so not all relevant studies may have been identified. Third, the lack of a systematic adaptation approach for QIs may have led to some relevant studies being excluded.

Conclusions

Evidence gaps and future research trends can be identified by conducting a scoping review such as this one (with at least 10 studies available on a specific topic) [51]. In summary, existing QIs serve as a good starting point for introducing a set of QIs for local use; however, we found that QIs developed in different contexts for different purposes were not adequately adapted before being applied and operationalized in other settings. Adaptation of QIs under a systematic approach is critical for local use, and our review is the first to identify four attributes of a systematic adaptation approach, together with several issues that need to be addressed during adaptation, to inform future research planning for QI adaptation. In the future, adapting QIs in a systematic manner may provide policymakers, care providers, stakeholders and patients with a new pathway to initiate quality improvement in healthcare and gradually reshape healthcare systems.

Acknowledgments

The authors are grateful to Charlotte Beck for helping develop and revise the search strategies, and to Clayon Hamilton, Halima Elmi, and other colleagues and patient partners from Arthritis Research Canada for providing critical comments on the study design.

References

  1. US Department of Health and Human Services. National healthcare disparities report. Rockville, MD: Agency for Healthcare Research and Quality; 2003.
  2. Batalden PB, Davidoff F. What is “quality improvement” and how can it transform healthcare? BMJ Publishing Group Ltd; 2007.
  3. Hunter DJ, Neogi T, Hochberg MC. Quality of osteoarthritis management and the need for reform in the US. Arthritis Care & Research. 2011;63(1):31–8. pmid:20583113
  4. Delnoij DM, Westert GP. Assessing the validity of quality indicators: keep the context in mind! Oxford University Press; 2012.
  5. Farquhar M. AHRQ quality indicators. 2008.
  6. Brook RH, Chassin MR, Fink A, Solomon DH, Kosecoff J, Park RE. A method for the detailed assessment of the appropriateness of medical technologies. International Journal of Technology Assessment in Health Care. 1986;2(1):53–63. pmid:10300718
  7. Petzold T, Deckert S, Williamson PR, Schmitt J. Quality measurement recommendations relevant to clinical guidelines in Germany and the United Kingdom: (what) can we learn from each other? Inquiry: The Journal of Health Care Organization, Provision, and Financing. 2018;55:0046958018761495. pmid:29591538
  8. Lohr KN. Medicare: a strategy for quality assurance. National Academies Press; 1990.
  9. Lohr KN, Schroeder SA. A strategy for quality assurance in Medicare. New England Journal of Medicine. 1990;322(10):707–12. pmid:2406600
  10. Ganz DA, Chang JT, Roth CP, Guan M, Kamberg CJ, Niu F, et al. Quality of osteoarthritis care for community-dwelling older adults. Arthritis Care & Research: Official Journal of the American College of Rheumatology. 2006;55(2):241–7. pmid:16583414
  11. MacLean CH, Saag KG, Solomon DH, Morton SC, Sampsel S, Klippel JH. Measuring quality in arthritis care: methods for developing the Arthritis Foundation’s quality indicator set. Arthritis Care & Research. 2004;51(2):193–202.
  12. Westby MD, Klemm A, Li LC, Jones CA. Emerging role of quality indicators in physical therapist practice and health service delivery. Physical Therapy. 2016;96(1):90–100. pmid:26089040
  13. Arah OA, Westert GP, Hurst J, Klazinga NS. A conceptual framework for the OECD health care quality indicators project. International Journal for Quality in Health Care. 2006;18(suppl_1):5–13. pmid:16954510
  14. Doran T, Fullwood C, Gravelle H, Reeves D, Kontopantelis E, Hiroeh U, et al. Pay-for-performance programs in family practices in the United Kingdom. New England Journal of Medicine. 2006;355(4):375–84. pmid:16870916
  15. Kramers PG. The ECHI project: health indicators for the European Community. The European Journal of Public Health. 2003;13(suppl_1):101–6. pmid:14533758
  16. Marshall MN, Shekelle PG, McGlynn EA, Campbell S, Brook RH, Roland M. Can health care quality indicators be transferred between countries? Quality & Safety in Health Care. 2003;12(1):8–12. pmid:12571338
  17. Wenger NS, Shekelle PG. Assessing care of vulnerable elders: ACOVE project overview. Annals of Internal Medicine. 2001;135(8_Part_2):642–6. pmid:11601946
  18. Steel N, Melzer D, Shekelle PG, Wenger NS, Forsyth D, McWilliams BC. Developing quality indicators for older adults: transfer from the USA to the UK is feasible. Quality and Safety in Health Care. 2004;13(4):260–4. pmid:15289628
  19. van der Ploeg E, Depla MFIA, Shekelle P, Rigter H, Mackenbach JP. Developing quality indicators for general practice care for vulnerable elders; transfer from US to The Netherlands. Quality & Safety in Health Care. 2008;17(4):291–5. pmid:18678728
  20. Hansen MP, Bjerrum L, Gahrn-Hansen B, Christensen RDP, Davidsen JR, Munck A, et al. Quality indicators for treatment of respiratory tract infections? An assessment by Danish general practitioners. European Journal of General Practice. 2013;19(2):85–91. pmid:23072550
  21. Jeon Y-H, Chien WT, Ha J-Y, Ibrahim R, Kirley B, Tan LL, et al. Application of the European quality indicators for psychosocial dementia care in long-term care facilities in the Asia-Pacific region: a pilot study. Aging & Mental Health. 2017;0(0):1–8. pmid:28714742
  22. Taveira A, Macedo AP, Rego N, Crispim J. Assessing equity and quality indicators for older people – Adaptation and validation of the Assessing Care of Vulnerable Elders (ACOVE) checklist for the Portuguese care context. BMC Geriatrics. 2022;22(1):1–14.
  23. Arksey H, O’Malley L. Scoping studies: towards a methodological framework. International Journal of Social Research Methodology. 2005;8(1):19–32.
  24. Peters MD, Marnie C, Tricco AC, Pollock D, Munn Z, Alexander L, et al. Updated methodological guidance for the conduct of scoping reviews. JBI Evidence Synthesis. 2020;18(10):2119–26. pmid:33038124
  25. Zhu S, Li L, He C. The effects of the active adaption process on the suitability and usability of existing QIs for local use. PROSPERO 2018 CRD42018096844. 2021 [cited 2021 May 20]. https://www.crd.york.ac.uk/prospero/display_record.php?ID=CRD42018096844.
  26. Tricco AC, Lillie E, Zarin W, O’Brien KK, Colquhoun H, Levac D, et al. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Annals of Internal Medicine. 2018;169(7):467–73. pmid:30178033
  27. Shekelle PG. Quality indicators and performance measures: methods for development need more standardization. Journal of Clinical Epidemiology. 2013;66(12):1338. pmid:24018346
  28. Cochrane Library. Living systematic reviews. 2021 [cited 2021 May 16]. https://community.cochrane.org/review-production/production-resources/living-systematic-reviews#:~:text=Living%20Evidence%20Network-,What%20is%20a%20living%20systematic%20review%3F,the%20evidence%20(i.e.%20monthly%20searches).
  29. Ogrinc G, Davies L, Goodman D, Batalden P, Davidoff F, Stevens D. SQUIRE 2.0 (Standards for QUality Improvement Reporting Excellence): revised publication guidelines from a detailed consensus process. American Journal of Critical Care. 2015;24(6):466–73. pmid:26523003
  30. Effendy C, Vissers K, Woitha K. Face-validation of quality indicators for the organization of palliative care in hospitals in Indonesia: a contribution to quality improvement. Supportive Care in Cancer. 2014;22(12):3301–10. pmid:25091055
  31. Katsarava Z, Gouveia RG, Jensen R, Gaul C, Schramm S, Schoppe A, et al. Evaluation of headache service quality indicators: pilot implementation in two specialist-care centres. Journal of Headache and Pain. 2015;16(1):1–8. pmid:26059349
  32. Li LC, Sayre EC, Kopec JA, Esdaile JM, Bar S, Cibere J. Quality of nonpharmacological care in the community for people with knee and hip osteoarthritis. Journal of Rheumatology. 2011;38(10):2230–7. pmid:21807776
  33. Pellesi L, Benemei S, Favoni V, Lupi C, Mampreso E, Negro A, et al. Quality indicators in headache care: an implementation study in six Italian specialist-care centres. Journal of Headache and Pain. 2017;18(1). pmid:28477307
  34. Schramm S, Uluduz D, Gouveia RG, Jensen R, Siva A, Uygunoglu U, et al. Headache service quality: evaluation of quality indicators in 14 specialist-care centres. The Journal of Headache and Pain. 2016. pmid:27933580
  35. Beaton D, Bombardier C, Guillemin F, Ferraz MB. Recommendations for the cross-cultural adaptation of the DASH & QuickDASH outcome measures. Institute for Work & Health. 2007;1(1):1–45.
  36. Mourad S, Hermens R, Nelen W, Braat D, Grol R, Kremer J. Guideline-based development of quality indicators for subfertility care. Human Reproduction. 2007;22(10):2665–72. pmid:17664242
  37. Strömbeck B, Petersson IF, Vliet Vlieland TP, EUMUSC.net Working Group. Health care quality indicators on the management of rheumatoid arthritis and osteoarthritis: a literature review. Rheumatology. 2013;52(2):382–90. pmid:23086518
  38. Delnoij DM, Rademakers JJ, Groenewegen PP. The Dutch consumer quality index: an example of stakeholder involvement in indicator development. BMC Health Services Research. 2010;10(1):88. pmid:20370925
  39. Bresick G, Sayed A-R, le Grange C, Bhagwan S, Manga N. Adaptation and cross-cultural validation of the United States Primary Care Assessment Tool (expanded version) for use in South Africa. African Journal of Primary Health Care & Family Medicine. 2015;7(1):1–11. pmid:26245610
  40. Jeon K-Y. Cross-cultural adaptation of the US consumer form of the short Primary Care Assessment Tool (PCAT): the Korean consumer form of the short PCAT (KC PCAT) and the Korean standard form of the short PCAT (KS PCAT). Quality in Primary Care. 2011;19(2):85–103. pmid:21575331
  41. Dullie L, Meland E, Hetlevik Ø, Mildestvedt T, Gjesdal S. Development and validation of a Malawian version of the primary care assessment tool. BMC Family Practice. 2018;19(1):63. pmid:29769022
  42. Hoa NT, Tam NM, Peersman W, Derese A, Markuns JF. Development and validation of the Vietnamese primary care assessment tool. PLoS One. 2018;13(1):e0191181. pmid:29324851
  43. Campbell S, Braspenning J, Hutchinson A, Marshall M. Research methods used in developing and applying quality indicators in primary care. Quality and Safety in Health Care. 2002;11(4):358–64. pmid:12468698
  44. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implementation Science. 2009;4(1):1–15.
  45. Assarroudi A, Heshmati Nabavi F, Armat MR, Ebadi A, Vaismoradi M. Directed qualitative content analysis: the description and elaboration of its underpinning methods and data analysis process. Journal of Research in Nursing. 2018;23(1):42–55. pmid:34394406
  46. Brook RH, McGlynn EA, Cleary PD. Measuring quality of care. Massachusetts Medical Society; 1996.
  47. Rubin HR, Pronovost P, Diette GB. Methodology Matters. From a process of care to a measure: the development and testing of a quality indicator. International Journal for Quality in Health Care. 2001;13(6):489–96.
  48. Hick JL, Biddinger PD. Novel coronavirus and old lessons—preparing the health system for the pandemic. New England Journal of Medicine. 2020;382(20):e55. pmid:32212515
  49. Flaatten H. The present use of quality indicators in the intensive care unit. Acta Anaesthesiologica Scandinavica. 2012;56(9):1078–83. pmid:22339772
  50. Grace SL, Poirier P, Norris CM, Oakes GH, Somanader DS, Suskin N. Pan-Canadian development of cardiac rehabilitation and secondary prevention quality indicators. Canadian Journal of Cardiology. 2014;30(8):945–8. pmid:25064585
  51. Tricco AC, Lillie E, Zarin W, O’Brien K, Colquhoun H, Kastner M, et al. A scoping review on the conduct and reporting of scoping reviews. BMC Medical Research Methodology. 2016;16(1):15. pmid:26857112