
Primary care quality for older adults: Practice-based quality measures derived from a RAND/UCLA appropriateness method study

  • Rebecca H. Correia,

    Roles Conceptualization, Formal analysis, Investigation, Methodology, Project administration, Writing – original draft

    correirh@mcmaster.ca

    Affiliation Department of Health Research Methods, Evidence and Impact, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada

  • Darly Dash,

    Roles Investigation, Validation, Writing – review & editing

    Affiliation Department of Health Research Methods, Evidence and Impact, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada

  • Aaron Jones,

    Roles Conceptualization, Investigation, Methodology, Writing – review & editing

    Affiliation Department of Health Research Methods, Evidence and Impact, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada

  • Meredith Vanstone,

    Roles Conceptualization, Methodology, Writing – review & editing

    Affiliation Department of Family Medicine, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada

  • Komal Aryal,

    Roles Investigation, Validation, Writing – review & editing

    Affiliation Department of Health Research Methods, Evidence and Impact, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada

  • Henry Yu-Hin Siu,

    Roles Conceptualization, Methodology, Writing – review & editing

    Affiliation Department of Family Medicine, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada

  • Aquila Gopaul,

    Roles Investigation, Validation, Writing – review & editing

    Affiliation Department of Family Medicine, Western University, London, Ontario, Canada

  • Andrew P. Costa

    Roles Conceptualization, Methodology, Supervision, Writing – review & editing

    Affiliation Department of Health Research Methods, Evidence and Impact, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada

Abstract

We established consensus on practice-based metrics that characterize quality of care for older primary care patients and can be examined using secondary health administrative data. We conducted a two-round RAND/UCLA Appropriateness Method (RAM) study and recruited 10 Canadian clinicians and researchers with expertise relevant to the primary care of elderly patients. Informed by a literature review, the first RAM round evaluated the appropriateness and importance of candidate quality measures in an online questionnaire. Technical definitions were developed for each endorsed indicator to specify how the indicator could be operationalized using health administrative data. In a virtual synchronous meeting, the expert panel offered feedback on the technical specifications for the endorsed indicators. Panelists then completed a second (final) questionnaire to rate each indicator and corresponding technical definition on the same criteria (appropriateness and importance). We used statistical integration to combine technical expert panelists’ judgements and content analysis of open-ended survey responses. Our literature search and internal screening resulted in 61 practice-based quality indicators for rating. We developed technical definitions for indicators endorsed in the first questionnaire (n = 55). Following the virtual synchronous meeting and second questionnaire, we achieved consensus on 12 practice-based quality measures across four Priority Topics in Care of the Elderly. The endorsed indicators provide a framework to characterize practice- and population-level encounters of family physicians delivering care to older patients and will offer insights into the outcomes of their care provision. This study presented a case of soliciting expert feedback to develop measurable practice-based quality indicators that can be examined using administrative data to understand quality of care within population-based data holdings. Future work will refine and operationalize the technical definitions established through this process to examine primary care provision for older adults in a particular context (Ontario, Canada).

Introduction

Community-based primary healthcare is often patients’ first point-of-contact with health services [1]. Older adults (aged 65+) are the highest users of primary care services [2], and their interactions with, and dependence on, health systems increase with age [3]. Aging demographics [4], compounded by older adults’ growing social vulnerabilities [5–10] and preference to remain in the community [11, 12], portend even greater demands on primary care systems in the coming years [3, 13]. Primary care is currently grappling with substantial healthcare provider workforce shortfalls [14–16], posing significant challenges for patient access [17, 18]. Improvements in primary care structures and processes are essential to create the necessary capacity for family physicians to adequately address the complex and emerging needs of older adults [1, 19, 20].

In Canada, family physicians are central to primary care teams and are the most frequent physician providers of medical services to older adults [2]. Although all family physicians must achieve a baseline level of competence to care for elderly patients to be certified for independent practice, there is a range of competence across family physicians [21]. Variability in knowledge and clinical skills of providers may influence quality of care and health outcomes for older patients. Therefore, developing indicators to assess, measure, and compare the medical practice structures and services provided by family physicians–in the context of caring for older patients–may inform quality improvement activities and priorities for continuing professional education [22, 23].

Inferences about quality of care are limited by the availability of data, since reliable and valid data holdings are needed to apply indicators [23]. Information systems, such as population-based health administrative data sources, enable the accurate and consistent measurement of indicators by providing access to data at the macro (e.g., healthcare organizations) and micro (e.g., individual patients, providers) levels. Research to date on the development of elderly-focused quality indicators for primary care has not been concerned with operationalizing them [21]. In this context, “operationalize” refers to developing technical specifications to apply, measure, and examine indicators using available data holdings. Despite the limitations of using secondary data to make inferences about quality [21], indicators developed without reference to information sources, or those that are not feasible to operationalize within a particular context, are limited in their utility and capacity to promote change.

Therefore, we sought to establish consensus on measurable practice-based process metrics that characterize quality of care for older primary care patients. We focus on quality indicators that are directly relevant to the clinical activities of physicians (practice-based) and classified as acts of healthcare service delivery (processes). Along with clinical structures and medical outcomes, healthcare service delivery processes are one of three interrelated components comprising the Donabedian model–the dominant quality improvement paradigm that enables evaluations of medical care quality and the performance of health systems [21, 24]. Our study is situated within the province of Ontario, Canada to develop technical definitions for each indicator using available administrative data holdings. We organized our work around the research question: Within the framework of secondary, administrative data as a lens to understand primary care practice, can a technical expert panel establish consensus on which practice-based process metrics suggest better versus worse quality of care for older patients? We hypothesized that derived indicators would be clinically meaningful and feasible to measure, but not comprehensive of all clinical activities relevant to older adult-focused primary care. Our endorsed indicator set will support future work examining the medical practice of family physicians and quality of care for older patients.

Methods

Study design

We conducted a two-round RAND/UCLA Appropriateness Method (RAM) study, which has been described previously [21]. The Delphi technique and its derivatives, including RAM, have been widely applied in health services research for quality indicator development. We report our study using the CREDES guideline [25] (S1 Table) and describe its four major stages below.

Stage 1: Literature review to identify candidate indicators

We conducted a literature review to inform the RAM questionnaire items. We reviewed existing literature obtained through informal literature searches while preparing the study protocol. The titles and abstracts were scanned for relevance and screened against a set of inclusion and exclusion criteria [21]. We formally searched three online databases (i.e., PubMed, MEDLINE via Ovid, Google Scholar) to identify academic and grey literature. Lastly, we scanned the reference lists of included literature to obtain any additional materials. The full search strategy is detailed in our study protocol [21].

From the included texts, we extracted any quality indicators, metrics, measures, or processes to generate a candidate list. We assessed the relevance of candidate indicators for inclusion in our questionnaire by applying screening criteria (Table 1). Indicators must have satisfied all four criteria for inclusion.

Table 1. Criteria for screening the relevance of candidate indicators.

https://doi.org/10.1371/journal.pone.0297505.t001

Candidate quality indicators were then organized by the 18 “Priority Topics and Key Features for the Assessment of Competence in Care of the Elderly” (FM-COE Priority Topics) [29], which outline the bounds of best practices in caring for older adults [21]. For each Priority Topic, we grouped similar indicators to streamline our initial list and remove any duplicates. To finalize our RAM questionnaire items, we translated each indicator into a set of “if” and “then” statements and conducted an internal screening process to assess measurement feasibility and streamline the number of questionnaire items.

Stage 2: Recruitment of technical expert panel

We described the procedures to identify and recruit prospective panelists previously [21]. Panelists included individuals with extensive knowledge about primary care for older persons, evidenced by at least five years of clinical practice experience or activity with older patients and/or at least two relevant academic publications. Prospective panelists reviewed our letter of information and completed our demographic survey in April 2023. We used findings from the demographic survey to tailor our sampling approach and ensure diverse representation.

Stage 3: Achieving consensus in two RAM rounds

Round #1 questionnaire.

Panelists completed the first questionnaire over a two-week period in May 2023 to rate candidate indicators on two criteria: appropriateness and importance. We defined and rationalized our modified RAM criteria in our study protocol [21]. Each indicator received ratings for each criterion on a 9-point Likert scale, ranging from 1 (extremely inappropriate; extremely not important) to 9 (extremely appropriate; extremely important). Participants had the opportunity to provide comments about the quality statements and their ratings within free-text comment boxes. All study data were collected and managed using the secure, web-based data capture platform, Research Electronic Data Capture (REDCap), hosted at McMaster University [30, 31].

As per RAM guidelines, indicators advanced to the second round if they achieved a median score between 7 and 9 on both criteria without disagreement, where disagreement was defined as three or more panelists rating in each of the extreme ranges (i.e., 1–3 and 7–9) for either criterion [32]. Among indicators deemed eligible for inclusion in the second round, we rank-ordered the indicators so that corresponding technical definitions could be developed efficiently and feasibly. We summed the median scores for each indicator on both criteria and computed interquartile ranges (IQRs). We sequenced indicators from the highest summed median score downward; for indicators with equal summed median scores, we prioritized those with the lowest summed IQR.
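
As a concrete illustration, the round-1 advancement rule and the rank ordering described above can be expressed in a few lines of code. The sketch below is written in Python for readability (the study analyses were performed in SAS); the function names, data structures, and quartile convention are our own assumptions rather than the authors’ implementation.

    import numpy as np

    def _disagreement(ratings):
        # RAM disagreement: three or more panelists in each extreme range (1-3 and 7-9).
        low = sum(r <= 3 for r in ratings)
        high = sum(r >= 7 for r in ratings)
        return low >= 3 and high >= 3

    def round1_advances(appropriateness, importance):
        # Advance the indicator if the median of BOTH criteria lies between 7 and 9
        # and neither criterion shows disagreement.
        medians_ok = all(7 <= np.median(r) <= 9 for r in (appropriateness, importance))
        no_disagreement = not (_disagreement(appropriateness) or _disagreement(importance))
        return medians_ok and no_disagreement

    def rank_endorsed(indicators):
        # `indicators` is a list of dicts, each holding the panelists' ratings for one
        # endorsed indicator under the keys "appropriateness" and "importance".
        # Sort by highest summed median score; break ties with the lowest summed IQR.
        def iqr(ratings):
            q1, q3 = np.percentile(ratings, [25, 75])
            return q3 - q1
        def key(ind):
            summed_median = np.median(ind["appropriateness"]) + np.median(ind["importance"])
            summed_iqr = iqr(ind["appropriateness"]) + iqr(ind["importance"])
            return (-summed_median, summed_iqr)
        return sorted(indicators, key=key)

For example, round1_advances([8, 7, 9, 8, 7, 8, 9, 7, 8, 6], [7, 8, 8, 9, 7, 7, 8, 8, 9, 7]) returns True, since both medians fall in the 7–9 range and neither criterion shows disagreement.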

Developing technical definitions.

We developed technical definitions for each endorsed indicator by suggesting approaches to operationalize the numerator (“then” portion of quality statement) and denominator (“if” portion). We anticipated prioritizing a subset of highly rated indicators to specify and present at our panel meetings. Within the previously determined meeting duration of two hours, we planned to allocate at least five minutes per indicator to gather feedback on each proposed technical definition. Based on the number of endorsed items from the first questionnaire, we planned to specify a threshold that would allow for a reasonable number of items to be discussed during our allotted meeting time (e.g., top 25% of the most highly rated endorsed indicators).

Health administrative data holdings at ICES in Ontario, Canada were referenced to identify relevant datasets and variables that expressed each indicator [33]. ICES is a central data repository for publicly funded administrative health services records in Ontario that supports population-based health research [34]. We identified, referenced, and modified pre-existing and validated indicators, specifications, classifications, procedures, outcomes, and derived cohorts, where possible [35–42]. We developed a standard template for each quality statement detailing its conceptual definition, interpretation, inclusion and exclusion criteria, data source (including relevant datasets and variable names), computation, and references. The proposed technical definitions were reviewed by two health administrative data experts (APC and AJ) in advance of the panel meetings.
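
To illustrate the general shape of such a specification, the sketch below computes one hypothetical indicator rate from two tabular extracts: a denominator cohort (the “if” population) and a claims table (the “then” events). It is a minimal illustration only; the table, column, and fee-code names are placeholders and do not reflect the actual ICES schema or any endorsed indicator’s final definition.

    import pandas as pd

    def indicator_rate(cohort_df, claims_df, qualifying_fee_codes, year):
        # Denominator: unique patients meeting the "if" condition
        # (here, all rows of cohort_df, assumed to be one row per patient).
        # Numerator: denominator patients with at least one qualifying claim
        # (the "then" activity) during the measurement year.
        claims = claims_df[
            (claims_df["service_date"].dt.year == year)
            & (claims_df["fee_code"].isin(qualifying_fee_codes))
        ]
        denominator = cohort_df["patient_id"].nunique()
        numerator = cohort_df["patient_id"].isin(claims["patient_id"]).sum()
        return numerator / denominator if denominator else float("nan")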

Synchronous panel meeting.

The purpose of the synchronous panel meeting was to review the endorsed indicators and gather feedback on their proposed technical definitions [21]. Panelists attended one of two virtual meetings (via Zoom) in June 2023. We were unable to identify a meeting time that satisfied the availability of all panelists, warranting two separate group meetings. Indicators were presented for discussion according to rank order; limited time and resources necessitated prioritizing a subset of indicators to present in each meeting.

Round #2 questionnaire.

Following the group meetings, panelists rated the quality statements and corresponding technical definitions in a second questionnaire in June 2023. Panelists were asked to rate only those indicators that were discussed during their panel meeting. We provided summary notes from the group meetings for panelists to consider while completing their ratings. As in the first questionnaire, panelists rated each indicator on both criteria and could optionally provide free-text comments. Indicators comprising our final endorsed set included those rated between 7 and 9 on both criteria by more than 60% of panel members. Questionnaire responses were collected over a two-week period in June 2023.
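
The round-2 endorsement rule can likewise be stated compactly. The following is a minimal sketch, assuming the “more than 60%” threshold is applied to panelists who rated an indicator 7–9 on both criteria (a per-criterion reading is also possible); the names are illustrative and not the authors’ implementation.

    def round2_endorsed(appropriateness, importance, threshold=0.60):
        # Endorse if more than `threshold` of panelists rated the indicator 7-9
        # on BOTH criteria; ratings are paired per panelist.
        pairs = list(zip(appropriateness, importance))
        n_high = sum(7 <= a <= 9 and 7 <= i <= 9 for a, i in pairs)
        return n_high / len(pairs) > threshold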

Stage 4: Analysis

Our data sources comprised the demographic survey and two RAM questionnaires. We summarized the demographic characteristics of panelists using descriptive statistics. Panelists reported their race/ethnicity, age, and gender within free-text comment boxes; we categorized participants’ race and gender based on the responses provided verbatim and did not impose socially constructed groupings. After each questionnaire period concluded, we combined the de-identified judgements of panelists using statistical integration of Likert-scale ratings and conventional content analysis of open-ended responses [43]. We computed the median and IQR for each indicator on both criteria. We conducted the Wilcoxon matched-pairs signed-rank test to measure changes in consensus between rounds [44]. Qualitative data were used to revise quality statements and technical definitions. We shared individualized feedback with panelists after each round, including summaries of their ratings relative to others. Statistical analyses were performed using Statistical Analysis Software (SAS) version 14 (Cary, North Carolina).
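
For the between-round comparison, the Wilcoxon matched-pairs signed-rank test pairs each indicator’s summary score in round 1 with its score in round 2. The analyses were run in SAS; the snippet below is an equivalent illustration in Python, using made-up paired scores rather than study data.

    from scipy.stats import wilcoxon

    # Hypothetical paired summary scores (e.g., summed medians) for the same
    # indicators in rounds 1 and 2; values are illustrative, not study data.
    round1_scores = [16.0, 16.5, 17.0, 15.5, 16.0, 18.0, 16.5, 15.0, 17.5, 16.0, 15.5, 17.0]
    round2_scores = [16.5, 17.0, 17.5, 16.0, 16.5, 18.5, 17.0, 16.0, 18.0, 16.5, 16.5, 17.5]

    statistic, p_value = wilcoxon(round1_scores, round2_scores)  # two-sided by default
    print(f"Wilcoxon signed-rank: W = {statistic}, p = {p_value:.4f}")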

Ethics and registration

Our study was approved by the Hamilton Integrated Research Ethics Board (#15545) and prospectively registered with ISRCTN (#17074347). All panelists provided written consent before data collection.

Results

Literature review

The search, screening, and extraction results of our literature review yielded 36 included texts and 500 candidate indicators (S1 Fig). We list and summarize the texts from which these indicators were obtained in S2 Table. Upon screening the extracted indicators, 282 were eligible for inclusion (n = 58 indicators obtained from the previously collected literature, n = 52 from the new literature search results, and n = 127 of those obtained from the reference lists of included literature).

We organized the 282 quality indicators by the 18 FM-COE Priority Topics and streamlined indicators within each Topic (S1 Fig), resulting in 178 items for internal screening. Most indicators aligned with “Medical Conditions” (n = 83) or “Appropriate Prescribing” (n = 73), while some did not align with any areas (n = 12). We grouped similar items and removed duplicates within each Priority Topic; this process eliminated 183 indicators. For example, nine items from different information sources pertained to the process of medication review and reconciliation and were collapsed into a single quality statement. The considerable degree of duplication in our candidate list suggested saturation, defined as information redundancy, across information sources [45].

We translated the eligible indicators into 178 quality statements using the “if”/“then” format. Of these, 112 statements were deemed “definitely not feasible” to measure within ICES data holdings via independent review in duplicate by health administrative data experts (APC and AJ) and were subsequently removed. Five quality statements were removed after a review for clinical accuracy by a physician (AG) and for alignment with clinical practice guidelines (S1 Fig). Therefore, the literature review process resulted in 61 quality statements for inclusion in our first questionnaire, aligning with 10 FM-COE Priority Topics.

Technical expert panel

We recruited 10 participants for our technical expert panel. All panelists reported involvement with health research for a median of 14 years (IQR: 8.75). Eight panelists held graduate degrees (i.e., master’s degree or PhD) and nine were medical doctors (MDs); six physicians held “Care of the Elderly” Certificates of Added Competence. Clinicians had been in independent practice for a median of 22 years (IQR: 14). Panelists had a median age of 49 years (IQR: 11.25) and 70% were female (n = 7). Nine panelists were affiliated with institutions or organizations within Ontario; one resided and worked in another province. Seventy percent of panelists self-identified as white and 30% as East Asian, South Asian, or Southeast Asian. Eight panelists primarily worked or practiced in urban areas, one primarily in rural areas, and one in both urban and rural areas.

Round #1 questionnaire

Across 61 questionnaire items, median scores ranged from 5.5 to 9.0 for appropriateness and 6.0 to 9.0 for importance (S3 Table). Fifty-five indicators met our pre-specified threshold and advanced to the second round; six indicators were eliminated. Indicators that did not advance were rated poorly because they did not pertain to the role or scope of family physicians or because they described inappropriate clinical activities or timeframes.

Panelists’ open-ended responses justified or clarified their ratings, suggested revisions to the wording of quality statements, or shared other reflections. For example, a participant rationalized their rating as follows: “I rated importance lower because sometimes the info found in a CGA [Comprehensive Geriatric Assessment] would be included as part of the knowledge from continuity of care, not as a discrete assessment” (Panelist #3, Indicator #2). Panelists confirmed activities that were “standard of care” (Panelist #10, Indicator #52) and criticized statements that were inaccurate or irrelevant. For example, a panelist questioned the role of family physicians in cardiac rehabilitation activities (Indicator #21),

“While it is important that cardiac rehab be offered, should this be done by cardiology? Maybe the PCP [Primary Care Provider] role is to verify that it has been offered.”

–Panelist #5

Participants often noted “it depends” (Panelist #4, Indicator #40) when rating statements, and clarified that the importance of performing any given clinical activity depended on the patient’s severity, health status, goals of care, available resources, and other factors. Where relevant, panelists noted discrepancies between current practice guidelines and the clinical scenario or activity in question, such as: “B12 is not in Canadian consensus guidelines” (Panelist #2, Indicator #23). Some panelists shared references to jurisdictional and organizational guidelines, frameworks, policies, or research studies that facilitated or impeded the specified activities. For example, “Offer all appropriate vaccines is consistent with the Canadian Frailty Network’s AVOID framework” (Panelist #5, Indicator #10). When diagnostic criteria or drugs were listed in the quality statement, participants frequently shared comments about adding or removing some of the specified items, such as: “Consider adding sglt2 inhibitors” (Panelist #2, Indicator #16).

Technical definitions

We developed technical definitions for the 55 endorsed quality statements; an example is provided in S4 Table. We documented our efforts to operationalize each quality statement and posed discussion questions within the technical definition workbook used to guide discussion in the synchronous panel meetings. For example, if the frequency of a health service encounter or timeframe for follow-up was not specified within the quality statement, we noted these outstanding aspects as points of discussion.

We referenced four ICES datasets and three derived cohorts to draft technical definitions. The Primary Care Population (PCPOP) is an ICES dataset of eligible primary care patients in Ontario and contains information on demographics, primary care enrollment, healthcare utilization, specialist visits, and continuity of care. The Ontario Health Insurance Plan (OHIP) dataset contains information on all billing claims submitted by physicians for consultations and procedures, including shadow billings for physicians practicing in non-fee-for-service models. The Ontario Laboratories Information System (OLIS) is an information repository containing lab test orders and results from hospitals, community labs, and public health labs. The Ontario Drug Benefit (ODB) dataset contains community-based, outpatient drug orders for products listed on the ODB formulary, notably for residents aged 65 years and older. We also referenced three ICES-derived cohorts that were previously developed using validated algorithms based on diagnostic codes, hospitalization records, physician claims, and/or drug reimbursements for persons with dementia (DEMENTIA), chronic obstructive pulmonary disease (COPD), or congestive heart failure (CHF) [46–48].

We rank-ordered endorsed indicators from the first questionnaire to pragmatically determine a subset that would be reasonable to present in our allotted panel meeting times; it would not have been possible to discuss all 55 endorsed indicators during the predetermined two-hour timeframe. We considered what threshold of the most highly rated indicators would be feasible to present (e.g., top 10% = 6 indicators, top 25% = 14 indicators, etc.). We allotted 10 minutes to review the project objectives and facilitate introductions, 15 minutes to review ratings from the first questionnaire, and five minutes per indicator to gather feedback on the proposed technical definitions, resulting in sufficient time to review 19 indicators (the top 35% most highly rated) over 95 minutes (S5 Table; S2 Fig). In round 1, 14 indicators were rated most highly, with summed median scores ranging from 16.5 to 18. A further 14 indicators achieved the next highest summed median score of 16; from these, five were prioritized based on having the lowest summed IQR.

Synchronous panel meeting

We gathered feedback on the technical definitions during our synchronous panel meetings. Six panelists attended the first group meeting; four attended the second. The facilitator (RC) presented the technical definitions individually, and panelists provided feedback on each clinical scenario or activity. Panelists often commented on the frequency with which some physician fee codes were utilized, the diagnostic codes used to specify clinical conditions, and medication groupings or classifications.

Through these discussions, we unanimously omitted four quality statements; two items were duplicates and two were not possible to accurately specify using administrative data. Panelists also suggested revising the language of eleven quality statements; these revised statements were incorporated into the second questionnaire and rated. Following the meetings, we finalized our technical definition workbook to reflect group feedback.

Round #2 questionnaire

Panelists rated 15 quality statements and their corresponding technical definitions in the second questionnaire. Median scores ranged from 6.0 to 9.0 for appropriateness and 6.5 to 8.5 for importance (S6 Table). Twelve indicators met our threshold for inclusion in the final indicator set (Table 2); three indicators were eliminated. Drafted technical definitions for the endorsed indicators are available upon request.

Table 2. Final endorsed quality indicator set by FM-COE Priority Topic.

https://doi.org/10.1371/journal.pone.0297505.t002

Fig 1 illustrates the flow of candidate indicators across the study phases into our final set, broken down by FM-COE Priority Topics. The 12 endorsed quality statements expressed four Priority Topics: “Appropriate Prescribing” (n = 5), “Medical Conditions” (n = 4), “Cognitive Impairment” (n = 2), and “Driving Issues” (n = 1). The Wilcoxon matched-pairs signed-rank test achieved a P value of 0.0391, suggesting a significant difference in consensus scores between the two RAM rounds (S3 Fig).

Fig 1. Flow diagram of endorsed indicators, by FM-COE Priority Topic.

Legend: 61 indicators across 10 FM-COE Priority Topics were identified in the literature review for rating; 55 were endorsed in the first questionnaire. Of the 19 indicators reviewed in the panel meeting, 12 indicators across four Priority Topics were endorsed in the second questionnaire.

https://doi.org/10.1371/journal.pone.0297505.g001

Panelists’ open-ended responses offered suggestions on the proposed technical definitions accompanying the indicators. Some comments affirmed the importance of indicators: “A staple indicator in primary care” (Panelist #10, Indicator #1). Others noted the limitations of technical definitions, such as physician fee codes used as a proxy measure of vaccination status. Participants noted regional differences inhibiting measurement in some jurisdictions (Indicators #7, 8). Both supportive feedback and criticism were received for indicators that were re-worded during the synchronous panel meeting, including “the revision is too broad and lacks context” (Panelist #8, Indicator #13). Panelists also expressed challenges of measuring some indicators: “This indicator is appropriate, but as the discussion suggest[ed], will be very hard to measure/assess so my rating goes down” (Panelist #3, Indicator #2). Panelists also referenced guidelines, reports, research articles, and other documents to refine specifications, and suggested revisions to how some quality statements were worded. For example,

“To be future oriented, should the item read THEN "the PCP should perform tests aligned with the MOST CURRENT Canadian Consensus on Dementia". Presumably, the science/evidence will evolve and this language will prevent the indicator from becoming stale dated.”

–Panelist #5, Indicator #9

Discussion

We established consensus on a set of 12 practice-based process metrics and technical definitions to characterize quality of care for older primary care patients through the use of secondary health administrative data. The endorsed statements express four of 18 FM-COE Priority Topics and achieved progressively higher ratings on “appropriateness” (round 1: 5.5 to 9.0; round 2: 6.0 to 9.0) and “importance” (round 1: 6.0 to 9.0; round 2: 6.5 to 8.5) across rounds. While the literature review revealed that many relevant indicators exist, our feasibility screening affirmed that most cannot be measured at scale to assess performance; we were unable to operationalize all candidate or endorsed indicators due to limitations within the available information sources. High ratings for the appropriateness and importance of indicators were achieved in both questionnaires, affirming the relevance of the included literature from which candidate indicators were extracted. Findings from this consensus study will support future work testing the endorsed indicators within population-based data holdings for their ability to distinguish the medical practice of family physicians and quality of care for older patients.

While prior RAM studies have established quality standards or priorities to improve care for older patients in different care settings, we intended to develop a measurable indicator set; limited efforts to date have yielded real-world indicator translation supporting quality improvement [49, 50]. This process revealed that many indicators, both those identified from the literature and those endorsed through our consensus process, were concentrated on the biomedical aspects of caring for elderly patients, namely the “Medical Conditions” (n = 4) and “Appropriate Prescribing” (n = 5) FM-COE Priority Topics. While developing measurable definitions for established indicators is not a required activity [23, 51], we intended to develop a feasible indicator set that references readily collected data to assess performance; measurement is essential to inform and support quality improvement. Whereas some Priority Topics lend themselves well to measurement using administrative data (e.g., by specifying diagnostic codes or drug identification numbers), others were not possible to specify (e.g., “Communication,” “Decision making and capacity,” “Family and informal care supports”). More work can be done to develop approaches to measure other facets of elderly-focused care.

Further, our panelists with demonstrated clinical and research expertise likely prioritized indicators differently than older adults or informal caregivers would have. Patients are increasingly involved in quality improvement initiatives, and their first-hand experiences and inputs into problem-solving have demonstrated value in redesigning care aligned with patient priorities [52–54]. For example, indicators relevant to “organizing care using community resources,” a FM-COE Priority Topic, were identified in our literature review but not endorsed. Therefore, by not including patients or caregivers on our expert panel, the endorsed indicators may constitute only some of the primary care activities that are important or meaningful to patients. Through a separate but complementary study, we will engage older adults to compare and establish patient-important indicators. This approach will extend how we conceptualize and assess quality primary care for older adults and maximize patient perspectives and feedback in an open-ended format, rather than limiting their viewpoints to the constraints of feasible secondary data.

Measurable quality indicators are limited by the information sources available, and technical definitions, which are necessary to operationalize indicators, pertain to particular settings, thereby affecting their generalizability [21]. There is substantial diversity in how clinical scenarios and activities are defined in different healthcare contexts and reported in administrative data sources, which prohibits the development of universal technical definitions. In an effort to establish measurable indicators to support future work, the technical definitions developed through this process are contextualized to community-based primary care settings in Ontario, Canada, and the relevant information sources. While the specific dataset and variable names of our endorsed indicators may not directly apply to other information sources, they can be adapted to the local context if studied in other settings [55].

Given current and projected demands on the geriatric healthcare workforce [56, 57], efforts are needed to improve the quality of medical care delivered to complex older patients [58]. This study focused on testing whether technical definitions can be established that define quality primary care for this vulnerable population. Quality measures do not function well at the individual (in this case, provider) level, since other factors (e.g., setting, resources) may impact processes or resulting health outcomes. Therefore, we do not suggest that these indicators should be implemented in practice (such as to generate individual-level reports); rather, our findings may promote education and awareness about approaches to measuring practice-based primary care encounters and support quality improvement. Ultimately, the use of indicators must be driven by the professional community.

Limitations

The technical expert panel was composed mostly of primary care clinicians and researchers who worked or practiced in urban areas and lacked representation of some ethnic minorities and of experts located outside Ontario. Although we recruited fewer panelists than anticipated [21], all participants completed both questionnaires and actively contributed to the synchronous panel meetings. Due to the volume of indicators identified in our literature review, we restricted items for rating in the first questionnaire to those that were potentially feasible to measure using administrative data. Our focus on developing a measurable indicator set excluded many items that are not currently collected or available in data holdings; future work could examine excluded indicators as new measures become available or within other information sources. For example, we could not specify physicians’ referrals to social services or resources (e.g., adult day programs) or patients’ symptoms (e.g., insomnia, agitation, delirium), and we used physician fee codes as a proxy for some clinical activities (e.g., immunizations).

We presented only a subset of the indicators endorsed in the first questionnaire, and their technical definitions, in the group panel meetings. Although excluding indicators for pragmatic reasons (i.e., the predetermined two-hour panel meeting length) may have biased our findings, our study demonstrated the utility of a consensus process for deriving a measurable indicator set from the most highly rated items. We prioritized indicators for discussion in the panel meetings by rank-ordering endorsed indicators from round 1 with the highest summed median scores and, for those tied, the lowest summed IQR. We did not initially set out to establish a comprehensive indicator set; rather, we aimed to determine whether it was possible to establish consensus on measurable indicators by referencing administrative data holdings [21]. Lastly, we did not test our technical specifications for the endorsed indicators; these definitions will be operationalized and refined in future work. If a useful profile can be created, we can re-engage panelists to solicit feedback on the outstanding indicators (S5 Table).

Conclusion

We produced a measurable set of quality indicators that will support work to examine primary care provision for older adults using health administrative data. While not comprehensive of all 18 FM-COE Priority Topics, the endorsed indicators provide a framework to characterize practice- and population-level encounters of family physicians delivering care to older patients and will offer insights into the outcomes of their care provision. The technical definitions established through this study provide a basis for trialling their measurement within population-based data sources in Ontario, Canada. Work remains to understand the feasibility of operationalizing the drafted specifications. Ultimately, these efforts may identify areas where high-quality care for elderly patients is consistently provided, as well as systematic challenges in delivering elderly-focused care. Measuring primary care quality for older adults using these metrics is a starting point for determining opportunities for resources, education, incentives, interventions, and policies to support quality improvement.

Supporting information

S1 Fig. Literature search, organization, and screening results.

Legend: Steps 1 to 4 outline the literature search results of 282 candidate indicators. Steps 5 to 7 illustrate how the candidate indicators were organized by the FM-COE Priority Topics and streamlined into 178 quality statements. Steps 8 and 9 display the results of internal feasibility screening and clinical review, resulting in 61 quality statements for inclusion in the first questionnaire.

https://doi.org/10.1371/journal.pone.0297505.s001

(TIFF)

S2 Fig. Distribution of summed median scores and IQRs for indicators that advanced from round 1.

Legend: 19 indicators whose summed median scores were rated most highly were prioritized for discussion in the group panel meeting. 14 indicators had equal summed median scores and were rank-ordered by the lowest summed IQR for prioritization.

https://doi.org/10.1371/journal.pone.0297505.s002

(TIFF)

S3 Fig. Distribution of Wilcoxon scores (matched-pairs signed-rank test).

Legend: The p-value obtained by the Wilcoxon matched-pairs signed-rank test suggests a significant difference in consensus scores between the two RAM rounds.

https://doi.org/10.1371/journal.pone.0297505.s003

(TIFF)

S1 Table. Recommendations for the conducting and REporting of DElphi studies (CREDES) checklist.

https://doi.org/10.1371/journal.pone.0297505.s004

(DOCX)

S4 Table. Example of a technical definition for an endorsed quality statement.

https://doi.org/10.1371/journal.pone.0297505.s007

(DOCX)

S5 Table. Prioritization of endorsed items from round 1 to round 2.

https://doi.org/10.1371/journal.pone.0297505.s008

(DOCX)

References

  1. 1. Boeckxstaens P, De Graaf P. Primary care and care for older persons: Position Paper of the European Forum for Primary Care. Qual Prim Care. 2011 Jan 1;19:369–89. pmid:22340900
  2. 2. Slade S, Shrichand A, Dimillo S. Health Care for an Aging Population: A Study of how Physicians Care for Seniors in Canada [Internet]. Ottawa, Ontario: Royal College of Physicians and Surgeons of Canada; 2019. https://www.royalcollege.ca/content/dam/documents/about/health-policy/royal-college-seniors-care-report-e.pdf
  3. 3. Vegda K, Nie JX, Wang L, Tracy CS, Moineddin R, Upshur RE. Trends in health services utilization, medication use, and health conditions among older adults: a 2-year retrospective chart review in a primary care practice. BMC Health Serv Res [Internet]. 2009 Nov 30 [cited 2023 Sep 19];9(1):217. Available from: pmid:19948033
  4. 4. Statistics Canada. Population Projections for Canada (2018 to 2068), Provinces and Territories (2018 to 2043) [Internet]. Her Majesty the Queen in Right of Canada as represented by the Minister of Industry; 2019 Sep. https://www150.statcan.gc.ca/n1/en/pub/91-520-x/91-520-x2019001-eng.pdf?st=OOnjLsP7
  5. 5. Curtis J, McMullin J. Dynamics of Retirement Income Inequality in Canada, 1991–2011. J Popul Ageing [Internet]. 2019 Mar 1 [cited 2023 Jul 5];12(1):51–68. Available from:
  6. 6. Khan MN, Ferrer I, Lee Y, Deloria R, Kusari K, Migrino L, et al. “We’re Always Looking at the Dollars and cents”: The Financial well-being of Racialized Older Immigrants in Canada Through the lens of Service Providers. J Fam Econ Issues [Internet]. 2023 Jan 31 [cited 2023 Jul 5]; Available from:
  7. 7. Johnson S, Bacsu J, McIntosh T, Jeffery B, Novik N. Social isolation and loneliness among immigrant and refugee seniors in Canada: a scoping review. Int J Migr Health Soc Care [Internet]. 2019 Jan 1 [cited 2023 Jul 5];15(3):177–90. Available from:
  8. 8. Salma J, Salami B. “We Are Like Any Other People, but We Don’t Cry Much Because Nobody Listens”: The Need to Strengthen Aging Policies and Service Provision for Minorities in Canada. The Gerontologist [Internet]. 2020 Feb 24 [cited 2023 Jul 5];60(2):279–90. Available from: pmid:31944237
  9. 9. Sibley LM, Weiner JP. An evaluation of access to health care services along the rural-urban continuum in Canada. BMC Health Serv Res [Internet]. 2011 Jan 31 [cited 2023 Jun 20];11(1):20. Available from: pmid:21281470
  10. 10. Ravichandiran N, Mathews M, Ryan BL. Utilization of healthcare by immigrants in Canada: a cross-sectional analysis of the Canadian Community Health Survey. BMC Prim Care [Internet]. 2022 Apr 6 [cited 2023 Jun 20];23(1):69. Available from: pmid:35387597
  11. 11. Pani-Harreman KE, Bours GJJW, Zander I, Kempen GIJM, van Duren JMA. Definitions, key themes and aspects of ‘ageing in place’: a scoping review. Ageing Soc [Internet]. 2021 Sep [cited 2023 Sep 19];41(9):2026–59. Available from: https://www.cambridge.org/core/journals/ageing-and-society/article/definitions-key-themes-and-aspects-of-ageing-in-place-a-scoping-review/91206401288CAB4E9AB97EDB5D479AB8
  12. 12. Channer NS, Hartt M, Biglieri S. Aging-in-place and the spatial distribution of older adult vulnerability in Canada. Appl Geogr [Internet]. 2020 Dec 1 [cited 2023 Jul 5];125:102357. Available from: https://www.sciencedirect.com/science/article/pii/S0143622820304008
  13. 13. Lafortune C, Huson K, Santi S, Stolee P. Community-based primary health care for older adults: a qualitative study of the perceptions of clients, caregivers and health care providers. BMC Geriatr [Internet]. 2015 Dec [cited 2020 Aug 25];15(1):57. Available from: http://bmcgeriatr.biomedcentral.com/articles/10.1186/s12877-015-0052-x pmid:25925552
  14. 14. Rudoler D, Peterson S, Stock D, Taylor C, Wilton D, Blackie D, et al. Changes over time in patient visits and continuity of care among graduating cohorts of family physicians in 4 Canadian provinces. CMAJ [Internet]. 2022 Dec 12 [cited 2023 Jun 21];194(48):E1639–46. Available from: https://www.cmaj.ca/content/194/48/E1639 pmid:36511867
  15. 15. Lavergne MR, Rudoler D, Peterson S, Stock D, Taylor C, Wilton AS, et al. Declining Comprehensiveness of Services Delivered by Canadian Family Physicians Is Not Driven by Early-Career Physicians. Ann Fam Med [Internet]. 2023 [cited 2023 Jun 21];21(2):151–6. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10042570/ pmid:36973051
  16. 16. Canadian Institute for Health Information (CIHI). Supply, Distribution and Migration of Physicians in Canada, 2021 — Data Tables [Internet]. CIHI; 2022 [cited 2023 Jun 20]. https://secure.cihi.ca/estore/productFamily.htm?pf=PFC4984&lang=en&media=0
  17. 17. CIHI. 88% of Canadians have a regular health provider but others struggle to access care [Internet]. Primary health care. 2023 [cited 2023 Sep 19]. https://www.cihi.ca/en/taking-the-pulse-a-snapshot-of-canadian-health-care-2023/88-of-canadians-have-a-regular-health
  18. 18. Lawson E. The Global Primary Care Crisis. Br J Gen Pract [Internet]. 2023 Jan 1 [cited 2023 Jun 20];73(726):3–3. Available from: https://bjgp.org/content/73/726/3 pmid:36543556
  19. 19. Pond CD, Regan C. Improving the delivery of primary care for older people. Med J Aust [Internet]. 2019 Jul 15 [cited 2023 Sep 19];211(2). Available from: https://www.mja.com.au/journal/2019/211/2/improving-delivery-primary-care-older-people pmid:31206179
  20. 20. Warshaw G. Providing Quality Primary Care to Older Adults. J Am Board Fam Med [Internet]. 2009 May 1 [cited 2023 Jun 20];22(3):239–41. Available from: https://www.jabfm.org/content/22/3/239 pmid:19429728
  21. 21. Correia R, Siu H, Vanstone M, Jones A, Gopaul A, Costa A. Development of Practice-based Quality Indicators for the Primary Care of Older Adults: A RAND/UCLA Appropriateness Method Study Protocol. BMJ Open [Internet]. 2023;13(9):e072232. Available from: pmid:37699633
  22. 22. Donabedian A. Explorations in Quality Assessment and Monitoring: The Definition of Quality and Approaches to its Assessment [Internet]. 1980 [cited 2023 Sep 19]. https://philpapers.org/rec/DONEIQ
  23. 23. Campbell SM. Research methods used in developing and applying quality indicators in primary care. Qual Saf Health Care [Internet]. 2002 Dec 1 [cited 2022 Oct 15];11(4):358–64. Available from: https://qualitysafety.bmj.com/lookup/doi/10.1136/qhc.11.4.358 pmid:12468698
  24. 24. Kringos DS, Boerma WG, Bourgueil Y, Cartier T, Hasvold T, Hutchinson A, et al. The european primary care monitor: structure, process and outcome indicators. BMC Fam Pract [Internet]. 2010 Dec [cited 2022 Oct 15];11(1):81. Available from: https://bmcfampract.biomedcentral.com/articles/10.1186/1471-2296-11-81 pmid:20979612
  25. 25. Jünger S, Payne SA, Brine J, Radbruch L, Brearley SG. Guidance on Conducting and REporting DElphi Studies (CREDES) in palliative care: Recommendations based on a methodological systematic review. Palliat Med [Internet]. 2017 Sep [cited 2022 Sep 21];31(8):684–706. Available from: http://journals.sagepub.com/doi/10.1177/0269216317690685 pmid:28190381
  26. 26. Health Canada. About primary health care [Internet]. 2005 [cited 2021 Mar 29]. https://www.canada.ca/en/health-canada/services/primary-health-care/about-primary-health-care.html
  27. 27. CanMEDS-FM 2017: A competency framework for family physicians across the continuum. Mississauga, ON: The College of Family Physicians of Canada; 2017.
  28. 28. Family Medicine Professional Profile. 2018;
  29. 29. Arcand M, Charles L, Feldman S, Frank C, Lam R, Mehta P, et al. Priority Topics and Key Features for the Assessment of Competence in Care of the Elderly [Internet]. The College of Family Physicians of Canada; 2017 [cited 2021 Nov 3]. https://www.cfpc.ca/CFPC/media/Resources/Education/COE_KF_Final_ENG.pdf
  30. 30. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)—A metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform [Internet]. 2009 Apr 1 [cited 2023 Jul 18];42(2):377–81. Available from: https://www.sciencedirect.com/science/article/pii/S1532046408001226 pmid:18929686
  31. 31. Harris PA, Taylor R, Minor BL, Elliott V, Fernandez M, O’Neal L, et al. The REDCap consortium: Building an international community of software platform partners. J Biomed Inform [Internet]. 2019 Jul 1 [cited 2023 Jul 18];95:103208. Available from: https://www.sciencedirect.com/science/article/pii/S1532046419301261 pmid:31078660
  32. 32. Fitch K, editor. The Rand/UCLA appropriateness method user’s manual. Santa Monica: Rand; 2001. 109 p.
  33. 33. ICES Data Dictionary [Internet]. [cited 2023 Jun 16]. https://datadictionary.ices.on.ca/Applications/DataDictionary/Default.aspx
  34. 34. ICES | About Us | Community of Research, Data & Clinical Experts [Internet]. ICES. [cited 2023 Jun 20]. https://www.ices.on.ca/our-organization/
  35. 35. Schedule of Benefits: Physician Services Under the Health Insurance Act, March 9, 2023 (effective April 1, 2023).
  36. 36. Schedule of benefits for Laboratory Services. Lab Med. 2020;
  37. 37. CIHI. Pan-Canadian Primary Health Care Indicators (Vol. 2). Ottawa: CIHI; 2006.
  38. 38. Ontario Health Teams (OHTs) [Internet]. [cited 2023 Jun 16]. https://www.ontariohealthprofiles.ca/ontarioHealthTeam.php
  39. 39. Formulary Search [Internet]. [cited 2023 Jun 16]. https://www.formulary.health.gov.on.ca/formulary/
  40. 40. Indicator Library keyword search—Health Quality Ontario (HQO) [Internet]. [cited 2023 Jun 16]. http://indicatorlibrary.hqontario.ca/Indicator/Search/EN
  41. 41. Government of Ontario, Ministry of Health and Long-Term Care. Resource for Indicator Standards—Health Care Professionals—MOHLTC [Internet]. [cited 2023 Jun 16]. https://www.health.gov.on.ca/en/pro/programs/ris/
  42. 42. Ministry of Health. Diagnostic Codes [Internet]. 2023 Apr p. 39. https://www.health.gov.on.ca/en/pro/programs/ohip/claims_submission/diagnostic_codes.pdf
  43. 43. Hsieh HF, Shannon SE. Three Approaches to Qualitative Content Analysis. Qual Health Res [Internet]. 2005 Nov [cited 2021 Apr 12];15(9):1277–88. Available from: http://journals.sagepub.com/doi/10.1177/1049732305276687 pmid:16204405
  44. 44. von der Gracht HA. Consensus measurement in Delphi studies. Technol Forecast Soc Change [Internet]. 2012 Oct [cited 2022 Oct 20];79(8):1525–36. Available from: https://linkinghub.elsevier.com/retrieve/pii/S0040162512001023
  45. 45. Sandelowski M. Theoretical Saturation. In: The SAGE Encyclopedia of Qualitative Research Methods. 2nd ed. Sage, Thousand Oaks; 2008. 875–876 p.
  46. 46. Yeung DF, Boom NK, Guo H, Lee DS, Schultz SE, Tu JV. Trends in the incidence and outcomes of heart failure in Ontario, Canada: 1997 to 2007. CMAJ [Internet]. 2012 Oct 2 [cited 2023 Jun 7];184(14):E765–73. Available from: https://www.cmaj.ca/content/184/14/E765 pmid:22908143
  47. 47. Jaakkimainen RL, Bronskill SE, Tierney MC, Herrmann N, Green D, Young J, et al. Identification of Physician-Diagnosed Alzheimer’s Disease and Related Dementias in Population-Based Administrative Data: A Validation Study Using Family Physicians’ Electronic Medical Records. J Alzheimers Dis [Internet]. 2016 Aug 23 [cited 2021 May 4];54(1):337–49. Available from: https://www.medra.org/servlet/aliasResolver?alias=iospress&doi=10.3233/JAD-160105 pmid:27567819
  48. 48. Gershon AS, Wang C, Guan J, Vasilevska-Ristovska J, Cicutto L, To T. Identifying individuals with physician diagnosed COPD in health administrative databases. COPD [Internet]. 2009 Oct [cited 2023 Jun 6];6(5). Available from: https://pubmed.ncbi.nlm.nih.gov/19863368/
  49. 49. Mulhall CL, Lam JMC, Rich PS, Dobell LG, Greenberg A. Enhancing quality care in Ontario long-term care homes through audit and feedback for physicians. J Am Med Dir Assoc [Internet]. 2020 Mar [cited 2020 Nov 5];21(3):420–5. Available from: https://linkinghub.elsevier.com/retrieve/pii/S1525861019308266 pmid:31974064
  50. 50. Hirdes JP, Major J, Didic S, Quinn C, Mitchell L, Chen J, et al. A Canadian Cohort Study to Evaluate the Outcomes Associated with a Multicenter Initiative to Reduce Antipsychotic Use in Long-Term Care Homes. J Am Med Dir Assoc. 2020 Jun;21(6):817–22. pmid:32493650
  51. 51. Evans SM, Lowinger JS, Sprivulis PC, Copnell B, Cameron PA. Prioritizing quality indicator development across the healthcare system: identifying what to measure. Intern Med J [Internet]. 2009 [cited 2023 Apr 21];39(10):648–54. Available from: https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1445-5994.2008.01733.x pmid:19371394
  52. 52. Baker GR, Fancott C, Judd M, O’Connor P. Expanding patient engagement in quality improvement and health system redesign: Three Canadian case studies. Healthc Manage Forum [Internet]. 2016 Sep 1 [cited 2023 Nov 24];29(5):176–82. Available from: pmid:27576853
  53. 53. Armstrong N, Herbert G, Aveling EL, Dixon-Woods M, Martin G. Optimizing patient involvement in quality improvement. Health Expect [Internet]. 2013 [cited 2023 Nov 24];16(3):e36–47. Available from: https://onlinelibrary.wiley.com/doi/abs/10.1111/hex.12039 pmid:23374430
  54. 54. Morassaei S, Campbell M, Di Prospero L. Measuring the Impact of Patient Engagement From the Perspective of Health Professionals Leading Quality Improvement Projects. J Contin Educ Health Prof [Internet]. 2021 [cited 2023 Nov 24];41(4):247–52. Available from: https://journals.lww.com/10.1097/CEH.0000000000000405 pmid:34825900
  55. 55. Delnoij DMJ, Westert GP. Assessing the validity of quality indicators: keep the context in mind! Eur J Public Health. 2012 Aug;22(4):452–3. pmid:22829489
  56. 56. Correia RH, Dash D, Hogeveen S, Woo T, Kay K, Costa AP, et al. Applicant and Match Trends to Geriatric-Focused Postgraduate Medical Training in Canada: A Descriptive Analysis. Can J Aging Rev Can Vieil [Internet]. 2023 Apr 17 [cited 2023 May 3];1–8. Available from: https://www.cambridge.org/core/product/identifier/S071498082200054X/type/journal_article pmid:37066844
  57. 57. Borrie M, Seitz D, Basu M, Cooper T, Kay K. Specialized Geriatric Services in Ontario: Human Resources Mapping Geriatricians, Care of the Elderly Physicians, and Geriatric Psychiatrists. St Joseph’s Health Care London; 2018 p. 19.
  58. 58. Rowe JW, Fried L, Jackson J, Novelli W, Stone R. Preparing for better health and health care for an aging population. Natl Acad Med. 2016;9. pmid:27668895