
‘More than just numbers on a page?’ A qualitative exploration of the use of data collection and feedback in youth mental health services

  • Craig Hamilton,

    Roles Conceptualization, Formal analysis, Investigation, Project administration, Writing – original draft, Writing – review & editing

    craig.hamilton@orygen.org.au

    Affiliation Orygen, Melbourne, Australia

  • Kate Filia,

    Roles Project administration, Writing – review & editing

    Affiliations Orygen, Melbourne, Australia, Centre for Youth Mental Health, University of Melbourne, Melbourne, Australia

  • Sian Lloyd,

    Roles Writing – review & editing

    Affiliation Orygen, Melbourne, Australia

  • Sophie Prober,

    Roles Writing – review & editing

    Affiliation Orygen, Melbourne, Australia

  • Eilidh Duncan

    Roles Conceptualization, Formal analysis, Methodology, Supervision, Writing – review & editing

    Affiliation Health Services Research Unit, Institute of Applied Health Sciences, University of Aberdeen, Aberdeen, United Kingdom

Abstract

Objectives

This study aimed to explore current data collection and feedback practice, in the form of monitoring and evaluation, among youth mental health (YMH) services and healthcare commissioners; and to identify barriers and enablers to this practice.

Design

Qualitative semi-structured interviews were conducted via Zoom videoconferencing software. Data collection and analysis were informed by the Theoretical Domains Framework (TDF). Data were deductively coded to the 14 domains of the TDF and inductively coded to generate belief statements.

Setting

Healthcare commissioning organisations and YMH services in Australia.

Participants

Twenty staff from healthcare commissioning organisations and twenty staff from YMH services.

Results

The umbrella behaviour ‘monitoring and evaluation’ (ME) can be sub-divided into 10 specific sub-behaviours (e.g. planning and preparing, providing technical assistance, reviewing and interpreting data) performed by healthcare commissioners and YMH services. One hundred belief statements relating to individual, social, or environmental barriers and enablers were generated. Both participant groups articulated a desire to improve the use of ME for quality improvement and had particular interest in understanding the experiences of young people and families. Identified enablers included services and commissioners working in partnership, data literacy (including the ability to set appropriate performance indicators), relational skills, and provision of meaningful feedback. Barriers included data that did not adequately depict service performance, problems with data processes and tools, and the significant burden that data collection places on YMH services, which have limited resources to undertake it.

Conclusions

Importantly, this study illustrated that the use of ME could be improved. YMH services and healthcare commissioners should collaborate on ME plans and meaningfully involve young people and families where possible. Targets, performance indicators, and outcome measures should explicitly link to YMH service quality improvement; and ME plans should include qualitative data. Streamlined data collection processes will reduce unnecessary burden, and YMH services should have the capability to interrogate their own data and generate reports. Healthcare commissioners should also ensure that they provide meaningful feedback to their commissioned services, and local and national organisations collecting youth mental health data should facilitate the sharing of this data. The results of the study should be used to design theory-informed strategies to improve ME use.

Introduction

The collection, analysis, and feedback of health services data play an essential role in the improvement of health care [1–5]. Globally, shortcomings in the quality of mental health care have been identified and there is substantial interest in enhancing the use of data to address these. Opportunities for this include strategies designed to bring about changes in healthcare provider behaviour such as routine outcome measurement [6]; audit and feedback [5]; and monitoring and evaluation [2,7–13]. These strategies can improve care and patient outcomes but the effects are highly variable [5] and their potential has not been fully realised [14]. Knowing more about the conditions under which collection and feedback of data works to change practice, and identifying the barriers to its effective use, helps us to understand how to optimise it [15]. There is a recognised risk within the healthcare improvement field that the “effort invested in collecting information (which is essential) is not matched by effort in making improvement” [16].

This paper focuses on the use of monitoring and evaluation (ME) in the context of Australian youth mental health care. ME involves the systematic collection and analysis of program data (e.g. program activity, patient outcomes) to provide strategic information, which can be used for decision-making by program managers and healthcare commissioners. Monitoring is a continuous process which tracks progress in implementation and performance, often against indicators and targets [17]. Evaluation is a periodic activity, which can identify the extent to which intended objectives have been achieved and can provide insight into what has contributed to their achievement or non-achievement [17].

Youth mental health care in Australia

In response to the high burden and incidence of mental ill-health among young people, and inadequacies of the mental health system to meet their needs, numerous countries have developed and implemented youth mental health (YMH) services targeted to young people aged 12 to 25 years [18–22]. In Australia, YMH services are typically commissioned by 31 Primary Health Networks (PHNs) [23] and delivered by local or national non-government organisations. A significant proportion of services operate as part of a national franchise led by the headspace National Youth Mental Health Foundation (110 centres by 2019) [24,25].

There is recognition of the importance of the collection and feedback of data within mental health services [2,6,7,9,26], and the practice of ME is perceived as an integral component of the commissioning process and contemporary YMH service provision [18,24,27–29]. Despite this, healthcare commissioners report that they find it challenging to make meaningful use of ME data collected from YMH services [30]. Little is known about how YMH services and healthcare commissioners currently use ME and given its potential to contribute to the improvement of mental health care provided to young people, it is essential that we understand what helps and hinders its use. This study aimed to explore current ME practice among YMH services and healthcare commissioners in Australia, and to identify individual and environmental barriers and enablers to these practices.

Methods

Sampling and recruitment

Participants were purposively sampled to ensure representation across healthcare commissioning organisations and YMH services, from a variety of roles/responsibilities, and with good coverage of geographical areas within Australia.

The research team, members of which were employed at Orygen, a national youth mental health organisation [31], sent an email invitation to an existing network of contacts working in healthcare commissioning organisations and YMH services (n = 240). Snowball sampling was also used, whereby recipients of the original email invitation were asked to forward it to contacts they deemed relevant (based on the information provided in the email). A flowchart detailing participant recruitment to the two sample groups is provided in Fig 1.

Fig 1. Flow chart of participants’ recruitment to the study.

https://doi.org/10.1371/journal.pone.0271023.g001

Interviews

Semi-structured interviews were conducted by the lead author (CH), a male completing a Master of Public Health, with supervision provided by a senior researcher (ED). Data collection and analyses were informed by the Theoretical Domains Framework (TDF) [32]. The TDF incorporates 33 theories of behaviour change and is used to explore and identify factors which inhibit or enable professional behaviour change [32–34]. CH and ED met frequently, with ED providing TDF expertise where required, including in reviewing the topic guide, coding guidelines, interview recordings, and all coding. While an interview topic guide informed by the TDF [32] was developed, the researcher encouraged a natural flow to the interviews; as such, they were semi-structured depending on when and how topics were raised by the participant. The topic guide was piloted with two mental health professionals with experience of managing YMH services and amended to improve clarity and reduce length. The topic guide is provided in S1 Appendix.

Interviews were conducted between June 2020 and August 2020 using Zoom videoconferencing software, apart from one telephone interview. Due to the COVID-19 pandemic, most participants were at home during their interview but a small minority were in a private office at their usual place of work. Notably, there were very few internet connectivity issues during the Zoom interviews, with visuals and audio remaining largely stable throughout. Each interview was audio recorded and transcribed verbatim by a specialist transcription service. CH checked the transcripts to ensure accuracy.

Data analysis

Following guidance [34] on using the TDF in qualitative studies and under the supervision of ED, CH developed coding guidelines (“a set of explicit statements of how the TDF is to be applied to a specific data set” [34]). Transcripts were imported into QSR NVivo 12 [35] for analysis.

A deductive approach was initially taken in which the researchers read the transcripts, considered the relevance of the data to the TDF’s domains and theoretical constructs, and then coded the data into one or more of the 14 theoretical domains [32,34]. This was followed by thematically analysing [36] the data coded to each theoretical domain to generate belief statements. A belief statement is a “collection of responses with a similar underlying belief that suggest a problem and/or influence of the beliefs on the target implementation problem” [34]. In line with standard practice with TDF studies [34,37], once coding was complete, three criteria were considered when judging the relevance of the TDF domains and associated belief statements to the target behaviour: (1) a high frequency of coding (≥80% participants), (2) presence of conflicting belief statements, and (3) presence of strong beliefs which may impact behaviour.

All transcripts were analysed by CH, while ED independently coded a subset of transcripts to check for consistency of coding. Differences in coding were discussed and the coding guidelines were iteratively revised until coding was acceptably consistent.

Ethical considerations

The study was approved by The University of Melbourne Centre for Youth Mental Health Human Ethics Advisory Group (Ethics ID: 2056869). All participants were provided with written study information and signed a consent form prior to interview.

Results

Participant overview

A total of 40 participants were recruited across both sample groups and data saturation of themes was achieved. The healthcare commissioners sample included staff responsible for the youth mental health portfolio, or staff involved in analysing YMH service data. The YMH services sample included management and other staff involved in ME from a commissioned YMH service.

Interviews lasted between 40 and 70 minutes (M = 56.63). Participant characteristics are summarised in Table 1.

Current practice

The types of ME behaviours performed by healthcare commissioners and YMH services are shown in Table 2. Involvement in evaluation was mentioned by a few participants, but most ME activity related to monitoring only.

Table 2. ME behaviours performed by healthcare commissioners and YMH services.

https://doi.org/10.1371/journal.pone.0271023.t002

Although there are commonalities in the types of behaviours, there is variation in how these behaviours are performed by services and commissioning organisations. While all healthcare commissioners require services to collect data, there are differences in the extent of these requirements. Some commissioners only require services to collect a nationally mandated primary mental health care minimum data set [38]. However, most commissioners require services to collect data in addition to this mandated data set and to provide monitoring reports (usually quarterly) which include quantitative data on service activities and outcomes, and qualitative data such as case studies or narrative. In addition to these formal monitoring mechanisms, many commissioners maintain informal communication with services to ensure they are up to date with what is happening and aware of any potential issues.

The degree to which commissioners and YMH services partner on ME varies. The ME planning process appears to be highly collaborative in some cases, while highly prescriptive in others. Similarly, some commissioners actively engage services in data-informed discussions (e.g. service development workshops), while other services report receiving little to no feedback on the reports they provide to their commissioners.

Who performs ME behaviours varies across services. For example, some services have specific data or ME-related staff who can retrieve data from data systems, and analyse and visualise data. However, in other services, staff may perform these behaviours on top of their formal job role (e.g. clinicians preparing commissioner reports). Services that operate as part of the headspace franchise are supported by the headspace National Office, which collects and analyses data from all centres, and provides centres and their commissioners with data reports and access to an online data visualisation tool.

Domains analysis

Table 3 provides an overview of which TDF domains were relevant for the behaviours. Twelve of the 14 domains were relevant to healthcare commissioner behaviours and 11 to YMH service behaviours. S2 Appendix provides detailed information regarding the frequency of TDF coding and belief statements, the rationale for relevance, and illustrative quotes.

Table 3. TDF domains [32] and reasons for relevance/irrelevance.

https://doi.org/10.1371/journal.pone.0271023.t003

Belief statements shared by healthcare commissioners and YMH services

In total, 100 belief statements relating to healthcare commissioners and/or YMH service behaviours were generated. All belief statements, reasons for relevance, and illustrative quotes can be found in S2 Appendix. There were 26 belief statements that applied to both healthcare commissioner and YMH service behaviours, which are summarised in Table 4. Each of these belief statements is subsequently described in further detail (with the relevant TDF domains in bold), as are those that were only held by one sample group.

Table 4. Belief statements shared by healthcare commissioners and YMH services.

https://doi.org/10.1371/journal.pone.0271023.t004

Participants in both groups regarded ME as integral to their work (intentions). Many believed that ME should be primarily used to drive quality improvement (intentions) so that young people receive the best care possible and experience improved outcomes (goals). Numerous participants expressed a desire to improve the use of ME (intentions), and improving ME planning was seen as a key enabler of this (behavioural regulation). However, it was also widely acknowledged that ME is burdensome for services (beliefs about consequences) and that there are limited funds for them to allocate to it (environmental context and resources).

ME helps participants to understand what is happening in services, identifies service risks and gaps, informs service improvements, and guides healthcare commissioners on how they can support services (beliefs about consequences). Participants also had particular interest in using ME to understand the experiences of young people and families accessing services (goals). The inclusion of qualitative data in monitoring reports was regarded as essential by many, as it helps to contextualise quantitative data (beliefs about consequences).

“We receive monthly data and they’ve got a target and an achievement. Really they’re only numbers on paper, until you understand what they actually mean. So I find that the qualitative stuff behind the data is of equal importance, because it speaks to the data. I think that tells us the richest information.”

(Participant 24, healthcare commissioner).

“I think it’s really the case studies that are particularly useful, because we can really get a good sense, ourselves, around what the presentations were for young people, what their goals were, what our evidence-based approaches were to meeting those goals, where the young person came to in their trajectory, and what the outcomes were for good, for bad, for otherwise, and also what the service impacts have been within service.”

(Participant 6, YMH service).

Being data literate, inquisitive, and open-minded were regarded as important ME skills by numerous participants (skills). Similarly, having a good understanding of the YMH service context (skills) was seen as important, as was being able to empathise with YMH service staff, and being able to build relationships with organisations (e.g. service providers, healthcare commissioning organisations) (skills). One healthcare commissioner reflected on the value of having previously worked in a service:

“I understand the tensions within the work. Sure every service is different and I could never possibly say that I understand exactly what they’re encountering on a day to day basis… but as a general rule, having service delivery experience really does help you when you’re collaborating with providers.”

(Participant 11, healthcare commissioner).

Both groups acknowledged that ME data does not always accurately reflect what happens on the ground in services (beliefs about consequences). Several YMH service participants noted that reporting data from a single or limited number of outcome measures only provides a partial insight into the difference their service makes.

“I don’t think that any of those measures should be taken individually. I think that would be reductionistic… they all need to be collected and viewed as a whole. I think to take any one of them individually and use that as the basis for the outcome is totally not valid.”

(Participant 8, YMH service).

Commissioners provided a different perspective on this issue. One participant spoke of finding out that a service had withheld important information about challenges they were experiencing from the commissioner, while others spoke of the integrity of data sets being reduced by data entry issues within services. Problems with data processes and tools were also widely cited by both groups as barriers to ME (environmental context and resources).

“We’re just trying to enter things into multiple platforms and you do see differences in different platforms with even just caseloads and occasions of service numbers. They are slightly different and I think that’s because we’re trying to work across too many systems.”

(Participant 28, YMH service).

“We have a database that the PHN [healthcare commissioning organisation] manages, which all of the service providers enter into… it does come with some challenges because the service providers often have a lot of difficulty—it’s not the best system. It’s quite limited in what it can do with reporting. So the service providers often have challenges in being able to export and being able to filter according to the KPIs…”

(Participant 3, healthcare commissioner).

Healthcare commissioner belief statements

The value of engaging with services on an ongoing, informal basis was raised by many (beliefs about consequences), and there was a strong desire among participants to develop stronger partnerships with services, so they can support them with quality improvement (goals). It was, however, mentioned that ME can identify service issues which the commissioner may not be able to help resolve (beliefs about capabilities).

“I think the downside will probably be if I’ve found a need and I can’t support that… So if I’m aware of a gap or if I’m aware that someone is struggling and I can’t assist, I think that’s sort of a negative of evaluation.”

(Participant 34, healthcare commissioner).

While some reported having little contact with other healthcare commissioning organisations regarding ME, many spoke of how they learn from and collaborate with other commissioners (social influences).

“I actually spoke to four other PHNs [healthcare commissioning organisations] to get their data to see what they collected and what some of their turnaround times were, which was fantastic. So we’ve done our own little benchmark study.”

(Participant 12, healthcare commissioner).

The ability to develop appropriate expectations and performance indicators for commissioned YMH services was viewed as a vital skill by several participants (skills). One commissioner spoke of the dangers of setting inappropriate performance indicators:

“I think people underestimate how hard it is to develop a really good indicator… You have to be really careful because you can create perverse incentives.”

(Participant 16, healthcare commissioner).

Some mentioned that the way in which the government measures healthcare commissioner performance incentivises a focus on service activity rather than service outcomes (reinforcement). Others spoke of how the national primary mental health care minimum data set is of limited use when monitoring and evaluating commissioned services (environmental context and resources).

“The PMHC-MDS [national primary mental health care minimum data set] is not fit for purpose. It has too many fields. It collects information that we don’t necessarily use or value. It creates a reporting burden for provider organisations that’s unnecessary and unwarranted.”

(Participant 17, healthcare commissioner).

YMH service belief statements

Several service participants indicated that doing ME helps to ensure their service retains funding from their commissioner because it is a contractual obligation (reinforcement), while others spoke of wanting to use ME to demonstrate the difference their service makes (goals). However, most participants indicated that ME often takes a backseat to other priorities, such as attending to the needs of young people and staff (goals).

Many felt that their commissioner actively supported them, but this feeling was not shared by all (social influences):

“The PHNs [healthcare commissioners] that I find helpful are the ones who are willing to work in partnership… there are commissioners who have described themselves as like an ATM: ‘you complete the transaction and we give you the money’. Whereas others are more likely to work in partnership, so really collaborative kind of decision making.”

(Participant 31, YMH service).

Numerous participants expressed that they felt their commissioner’s expectations of their service were unrealistic (reinforcement). This related to either the volume of ME activity (i.e. data collection, reporting) required of services or expectations about service performance. Some participants said they were worried about the potential consequences of not meeting the commissioner’s expectations (emotion).

“It can also make me feel nervous. I guess I had a lot of anxiety when we’d had to do the Q3 report when I’d first started and I had to put zero next to a lot of our KPIs. That was very anxiety provoking.”

(Participant 10, YMH service).

Participants suggested that commissioners could help YMH services with ME by collaborating with them (and young people and families) on decisions about ME planning, streamlining reporting requirements, and improving feedback (behavioural regulation).

It was widely reported that staff need to feel that data collection is meaningful for them to actively engage in it, and it was beneficial to create formal opportunities to discuss data with staff (behavioural regulation).

While participants asserted that ME should benefit clinical practice (goals), there were mixed views about its impact (beliefs about consequences), particularly in regard to using outcome measures with young people. While several participants spoke about the value of using measures, some felt that using measures that focus on symptoms and problems can inhibit recovery-orientated practice. Participants also spoke of how clinicians value the use of data in their practice to varying degrees (social influences). For many, the use of ME helps to ensure that their service operates in an evidence-based way (beliefs about consequences).

“Without evaluation and reflection and looking at ourselves and looking at what we’re doing, we could be in the dark ages. We could be providing a service that is unhelpful… Evaluation means that we can’t not be focused on outcomes in the participant and their needs, and keeps us ethical, and keeps us up-to-date with best practice.”

(Participant 33, YMH service).

Discussion

This study sought first to explore how data collection and feedback practice, in the form of monitoring and evaluation (ME), is used by YMH services and the organisations that commission them. Secondly, the study aimed to identify the barriers to and enablers of ME use from the perspectives of both YMH services and healthcare commissioners. We found that ME is a complex set of behaviours (e.g. planning and preparing for ME; entering data into data systems; providing technical assistance to YMH services; retrieving data from data systems; preparing reports for healthcare commissioners; analysing and visualising data; providing feedback; reviewing and interpreting data; making decisions and taking action; and informal communication between healthcare commissioners and YMH services). While there were commonalities in the types of behaviours performed, there was variation in how they were performed by commissioning organisations and YMH services. Both groups identified numerous individual, social, and environmental barriers and enablers. Many of these have the potential to be modified so that ME activity better supports improvements in the quality of service provision.

It was important for both commissioners and YMH services that data should drive service quality improvement. However, both groups raised concerns that data does not provide a fully accurate picture of service performance, and YMH services also felt that commissioners’ expectations of service performance were sometimes unrealistic or not meaningful. Difficulties with measuring quality in mental health care have been raised in previous literature [7,9,10,39]. In one study, mental health service managers reported that because performance indicators set for them did not obviously relate to service performance, data collection was regarded as a compliance activity rather than an opportunity to identify potential service improvements [39]. Beliefs articulated by participants in the present study can also be related to Mannion and Braithwaite’s [40] taxonomy of dysfunctional outcomes of health performance measurement. The authors identify 21 unintended or adverse consequences relating to poor measurement, misplaced incentives or sanctions, breach of trust, and politicisation of performance systems [40].

Young people having a positive experience of care was of the utmost importance to commissioners and services alike. Many thus believed that data should provide meaningful insights that support them to improve patient experiences and outcomes. Literature suggests that providing clinicians with actionable feedback that presents aspects of care delivery that are under their control and relevant to their job has the greatest chance of making a difference to practice [41,42]. Yet, in this context, data collection focused primarily at the patient-level without a strong focus on clinician-level activities that contribute to patient outcomes. A greater focus on clinician-level data in ME plans may help to ensure that data optimally contributes to improving the experiences of young people receiving care.

While clinician-level data is critical to actionable quality improvement, patient-level data is also important in measuring quality of care and clinical decision-making [2,6,7,13]. The role of outcome measurement in this was a topic of contention in this study. Participants reported variability in the value that clinicians place on using measurement in their practice and regarded mandated outcome measures as being of limited clinical utility or even a potential impediment to recovery-orientated practice. These issues are consistent with the literature on implementing outcome measurement in mental health settings [6,43–48]. To avoid the risk of it becoming a purely bureaucratic exercise, outcome measurement should be meaningful to clinicians, young people, and families and carers [6,45,49–52]. The dearth of clinically meaningful outcome measures designed for young people has been previously highlighted [53], but such measures are being developed [54,55]. It has also been advocated that using idiographic outcome measures (e.g. Goal-Based Outcome Tool) [56] can help to facilitate person-centred care [51,57,58] and has been associated with improvements in young people’s satisfaction and engagement with mental health services [59,60].

Challenges relating to data processes and systems, and minimum data sets are well documented in the literature [7,9,39,43,61,62], and consistent with the results of the present study. It was common for participants to speak of the burden of having to use multiple systems because of a lack of interoperability between systems or because data were needed that was not available in the national primary mental health care minimum data set. It is a priority for commissioners that YMH services collect the minimum data set because that data is used by the Australian government to measure commissioner performance, but for many commissioners, the minimum data set does not adequately capture service performance. This places commissioners in a challenging position. They are mindful that ME places a significant burden on YMH services but it is difficult for them to meaningfully monitor and evaluate services without requiring the collection of additional data.

Lastly, commissioners and YMH services want to work in partnership and such an approach may help to address some of the challenges. Services spoke of the benefits of commissioners being collaborative and forthcoming with meaningful feedback and commissioners spoke of how valuable they found it to communicate with services on an ongoing and informal basis (‘soft governance’), which aligns with the commissioning literature [62–64]. Both groups also regarded interpersonal skills such as the abilities to empathise and build relationships as essential. This corroborates existing research, which emphasises that a trusting relationship between the provider and recipient of feedback improves the likelihood that the feedback will inform learning and improvement [65,66].

Strengths and limitations

The inclusion of both YMH service and healthcare commissioner perspectives from a good spread of geographical regions is a strength of the study. A limitation, however, is that a significant number of participants were recruited through the researcher’s professional network, so there is a potential risk of self-selection bias. Given the significance that participants placed on understanding the experiences of young people and families, future research should include their views on ME activity.

The use of the Theoretical Domains Framework also has strengths and limitations. The TDF’s 14 domains, underpinned by 33 behavioural theories, enabled the identification of a wide range of barriers and enablers to ME. Systematically exploring each of the TDF’s domains in the interviews may have unveiled barriers and enablers that would otherwise have been missed. The framework also allows the results of this study to be used in developing strategies to enhance ME use, by mapping the relevant TDF domains to behaviour change theory (i.e. the behaviour change wheel approach to intervention design) [34,67]. However, the data should be critically appraised when designing these strategies, as prior research shows that people tend to emphasise external factors (environment, social influences) rather than internal ones (knowledge, skills) as barriers to their own behaviour [68,69]. Finally, while the TDF is extensive in scope, it is possible that some barriers and enablers are not covered by its 14 domains.

Implications for practice

Several strategies emerge from this research that healthcare commissioners and YMH services should implement to ensure ME is meaningful.

Firstly, ME plans should be co-designed [70] and should meaningfully involve young people and families whenever possible. The targets, performance indicators and outcome measures should explicitly link to YMH service quality improvement and, where possible, provide clear examples to demonstrate how improvements can be achieved. ME plans should also include qualitative data such as case studies.

Streamlined data collection processes will reduce unnecessary burden, and YMH services should have the capability to interrogate their own data and generate reports. Healthcare commissioners should also ensure that they provide meaningful feedback to their commissioned services, and local and national organisations collecting youth mental health data should facilitate the sharing of this data.

Organisations with relevant expertise should provide YMH services and commissioners with opportunities to build their ME capacity. Finally, it must be noted that additional investment will likely be required for YMH services and commissioners to implement these recommendations.

Implications for future research

Future research could identify what targets, performance indicators and outcome measures would be most appropriate to use in youth mental healthcare. Young people and families should be meaningfully involved in this research, particularly in the development and ongoing validation of outcome measures.

Conclusions

By using a theory-informed behavioural approach to explore the use of ME in youth mental health care, we found that current practice comprises numerous interrelated behaviours performed by YMH services and healthcare commissioners, and that there are many barriers and enablers to this activity at the individual, organisational, and broader environmental levels. Importantly, this study illustrated scope for improvement. The results of the study should be used to design theory-informed strategies to improve ME use. This would help to ensure that the use of ME produces ‘more than just numbers on a page’ and leads to continuous improvements in the quality of mental health care provided to young people.

Supporting information

Acknowledgments

We would like to thank the participants who generously shared their time and experience for the purposes of this study.

References

  1. Institute of Medicine (US). Crossing the quality chasm: a new health system for the 21st century. Washington, DC: National Academy Press 2001.
  2. Institute of Medicine (US) Committee on Crossing the Quality Chasm: Adaptation to Mental Health and Addictive Disorders. Improving the quality of health care for mental and substance-use conditions. Washington, DC: National Academies Press 2006.
  3. Smith PC, Mossialos E, Papanicolas I, et al., editors. Performance measurement for health system improvement: Experiences, challenges and prospects. Cambridge: Cambridge University Press 2010.
  4. Shah A. Using data for improvement. BMJ 2019;364:l189. pmid:30770353
  5. Ivers N, Jamtvedt G, Flottorp S, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database of Systematic Reviews Published Online First: 2012. pmid:22696318
  6. Mackrill T, Sørensen KM. Implementing routine outcome measurement in psychosocial interventions—a systematic review. European Journal of Social Work 2019;:1–19.
  7. Kilbourne AM, Beck K, Spaeth-Rublee B, et al. Measuring and improving the quality of mental health care: a global perspective. World Psychiatry 2018;17:30–8. pmid:29352529
  8. Russ TC, Woelbert E, Davis KAS, et al. How data science can advance mental health research. Nat Hum Behav 2019;3:24–32. pmid:30932051
  9. Lora A, Lesage A, Pathare S, et al. Information for mental health systems: an instrument for policy-making and system service quality. Epidemiology and Psychiatric Sciences 2017;26:383–94. pmid:27780495
  10. Rosenberg SP, Hickie IB, McGorry PD, et al. Using accountability for mental health to drive reform. The Medical Journal of Australia 2015;203:328–30. pmid:26465695
  11. Urbanoski K, Inglis D. Performance measurement in mental health and addictions systems: A scoping review. J Stud Alcohol Drugs Suppl 2019;:114–30. pmid:30681956
  12. Fleming I, Jones M, Bradley J, et al. Learning from a Learning Collaboration: The CORC Approach to Combining Research, Evaluation and Practice in Child Mental Health. Administration and Policy in Mental Health and Mental Health Services Research 2016;43:297–301. pmid:25234345
  13. Hickie IB, Scott EM, Cross SP, et al. Right care, first time: a highly personalised and measurement-based care model to manage youth mental health. Medical Journal of Australia 2019;211. pmid:31679171
  14. Foy R, Skrypak M, Alderson S, et al. Revitalising audit and feedback to improve patient care. BMJ 2020;368:m213. pmid:32107249
  15. Ivers NM, Grimshaw JM, Jamtvedt G, et al. Growing Literature, Stagnant Science? Systematic Review, Meta-Regression and Cumulative Analysis of Audit and Feedback Interventions in Health Care. J GEN INTERN MED 2014;29:1534–41. pmid:24965281
  16. Dixon-Woods M. How to improve healthcare improvement—an essay by Mary Dixon-Woods. BMJ 2019;367:l5514. pmid:31575526
  17. Markiewicz A, Patrick I. Developing monitoring and evaluation frameworks. Los Angeles: Sage 2016.
  18. Malla A, Iyer S, McGorry P, et al. From early intervention in psychosis to youth mental health reform: a review of the evolution and transformation of mental health services for young people. Soc Psychiatry Psychiatr Epidemiol 2016;51:319–26. pmid:26687237
  19. McGorry PD, Goldstone SD, Parker AG, et al. Cultures for mental health care of young people: an Australian blueprint for reform. The Lancet Psychiatry 2014;1:559–68. pmid:26361315
  20. Hetrick SE, Bailey AP, Smith KE, et al. Integrated (one-stop shop) youth health care: best available evidence and future directions. The Medical Journal of Australia 2017;207:S5–18. pmid:29129182
  21. Malla A, Purcell R. Foreword: The real world of implementing early intervention services for young people, their families and their communities. Early Intervention in Psychiatry 2019;13:5–7. pmid:31243905
  22. Illback RJ, Bates T. Transforming youth mental health services and supports in Ireland. Early Intervention in Psychiatry 2011;5:22–7. pmid:21208387
  23. Australian Government Department of Health. PHN Background. 2018. http://www.health.gov.au/internet/main/publishing.nsf/Content/PHN-Background (accessed 1 Apr 2018).
  24. Rickwood D, Paraskakis M, Quin D, et al. Australia’s innovation in youth mental health care: The headspace centre model. Early Intervention in Psychiatry 2018;0. pmid:30311423
  25. McGorry P, Trethowan J, Rickwood D. Creating headspace for integrated youth mental health care. World Psychiatry 2019;18:140–1. pmid:31059618
  26. Directorate for Health and Social Care Integration. Mental health strategy: 2017–2027. Scotland: 2017. http://www.gov.scot/Publications/2017/03/1750/0 (accessed 20 Jun 2021).
  27. Australian Government Department of Health. Designing and Contracting Services Guidance. Australian Government Department of Health 2016. https://www.health.gov.au/internet/main/publishing.nsf/Content/B16B1F0B2EDE5E00CA2582E900037062/$File/PHN%20Designing%20and%20Contracting%20Services%20v0.1.pdf (accessed 7 Mar 2019).
  28. Howe D, Batchelor S, Coates D, et al. Nine key principles to guide youth mental health: development of service models in New South Wales. Early Interv Psychiatry 2014;8:190–7. pmid:24251956
  29. Department of Health and Department of Education. Transforming Children and Young People’s Mental Health Provision: a Green Paper. 2017. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/664855/Transforming_children_and_young_people_s_mental_health_provision.pdf (accessed 20 Jun 2021).
  30. Oostermeijer S, Bassilios B, Nicholas A, et al. Implementing child and youth mental health services: early lessons from the Australian Primary Health Network Lead Site Project. Int J Ment Health Syst 2021;15:16. pmid:33622372
  31. Orygen. https://www.orygen.org.au/ (accessed 13 Sep 2020).
  32. Cane J, O’Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implementation Science 2012;7:37. pmid:22530986
  33. Michie S, Johnston M, Abraham C, et al. Making psychological theory useful for implementing evidence based practice: a consensus approach. BMJ Quality & Safety 2005;14:26–33. pmid:15692000
  34. Atkins L, Francis J, Islam R, et al. A guide to using the Theoretical Domains Framework of behaviour change to investigate implementation problems. Implementation Science 2017;12:77. pmid:28637486
  35. QSR International. NVivo Qualitative Data Analysis Software. 2020. https://www.qsrinternational.com/nvivo-qualitative-data-analysis-software/home (accessed 13 Sep 2020).
  36. Braun V, Clarke V. Using thematic analysis in psychology. Qualitative Research in Psychology 2006;3:77–101.
  37. Duncan EM, Francis JJ, Johnston M, et al. Learning curves, taking instructions, and patient safety: using a theoretical domains framework in an interview study to investigate prescribing errors among trainee doctors. Implementation Science 2012;7:86. pmid:22967756
  38. Australian Department of Health. Primary Mental Health Care Minimum Data Set: Overview of purpose, design, scope and key decision issues. Canberra: Australian Department of Health 2016. https://pmhc-mds.com/doc/pmhc-mds-overview.pdf (accessed 5 Nov 2018).
  39. Holloway D, Alam M, Griffiths A, et al. Performance Management in Australia’s Public Mental Health Service: A State Based Perspective. Australian Journal of Public Administration 2012;71:20–32.
  40. Mannion R, Braithwaite J. Unintended consequences of performance measurement in healthcare: 20 salutary lessons from the English National Health Service. Internal Medicine Journal 2012;42:569–74. pmid:22616961
  41. Hysong SJ, Best RG, Pugh JA. Audit and feedback and clinical practice guideline adherence: Making feedback actionable. Implementation Science 2006;1:9. pmid:16722539
  42. Brown B, Gude WT, Blakeman T, et al. Clinical Performance Feedback Intervention Theory (CP-FIT): a new theory for designing, implementing, and evaluating feedback in health care based on a systematic review and meta-synthesis of qualitative research. Implementation Science 2019;14:40. pmid:31027495
  43. Lewis CC, Boyd M, Puspitasari A, et al. Implementing Measurement-Based Care in Behavioral Health: A Review. JAMA Psychiatry 2019;76:324–35. pmid:30566197
  44. Mellor-Clark J, Cross S, Macdonald J, et al. Leading Horses to Water: Lessons from a Decade of Helping Psychological Therapy Services Use Routine Outcome Measurement to Improve Practice. Administration and Policy in Mental Health and Mental Health Services Research 2016;43:279–85. pmid:25179755
  45. Wolpert M. Uses and Abuses of Patient Reported Outcome Measures (PROMs): Potential Iatrogenic Impact of PROMs Implementation and How It Can Be Mitigated. Adm Policy Ment Health 2014;41:141–5. pmid:23867978
  46. Boswell JF, Kraus DR, Miller SD, et al. Implementing routine outcome monitoring in clinical practice: Benefits, challenges, and solutions. Psychotherapy Research 2015;25:6–19. pmid:23885809
  47. Sharples E, Qin C, Goveas V, et al. A qualitative exploration of attitudes towards the use of outcome measures in child and adolescent mental health services. Clinical Child Psychology and Psychiatry 2017;22:219–28. pmid:27340237
  48. Kwan B, Rickwood DJ, Brown PM. Factors affecting the implementation of an outcome measurement feedback system in youth mental health settings. Psychotherapy Research 2021;31:171–83. pmid:33040708
  49. Thornicroft G, Slade M. New trends in assessing the outcomes of mental health interventions. World Psychiatry 2014;13:118–24. pmid:24890055
  50. Jacob J, Napoleone E, Zamperoni V, et al. How Can Outcome Data Inform Change? Experiences from the Child Mental Health Context in Great Britain, Including Barriers and Facilitators to the Collection and Use of Data. In: Tilden T, Wampold BE, eds. Routine Outcome Monitoring in Couple and Family Therapy: The Empirically Informed Therapist. Cham: Springer International Publishing 2017. 261–79.
  51. Law D, Wolpert M. Guide to using outcomes and feedback tools with children, young people and families. 2nd ed. CAMHS Press 2014.
  52. Grundy A, Keetharuth AD, Barber R, et al. Public involvement in health outcomes research: lessons learnt from the development of the recovering quality of life (ReQoL) measures. Health Qual Life Outcomes 2019;17:60. pmid:30975153
  53. Kwan B, Rickwood DJ. A systematic review of mental health outcome measures for young people aged 12 to 25 years. BMC Psychiatry 2015;15:279. pmid:26573269
  54. Kwan B, Rickwood DJ. A routine outcome measure for youth mental health: Clinically interpreting MyLifeTracker. Early Intervention in Psychiatry Published Online First: 14 July 2020. pmid:32662215
  55. Filia KM, Jackson HJ, Cotton SM, et al. Developing and testing the F-SIM, a measure of social inclusion for people with mental illness. Psychiatry Research 2019;279:1–8. pmid:31276963
  56. Law D, Jacob J. Goals and goal-based outcomes: Some useful information. 3rd ed. London: CAMHS Press 2015. https://www.corc.uk.net/media/1219/goalsandgbos-thirdedition.pdf.
  57. Ashworth M, Guerra D, Kordowicz M. Individualised or Standardised Outcome Measures: A Co-habitation? Adm Policy Ment Health Published Online First: 5 March 2019. pmid:30838500
  58. Wolpert M, Vostanis P, Martin K, et al. High integrity mental health services for children: focusing on the person, not the problem. BMJ 2017;:j1500. pmid:28373178
  59. Cairns AJ, Kavanagh DJ, Dark F, et al. Goal setting improves retention in youth mental health: a cross-sectional analysis. Child and Adolescent Psychiatry and Mental Health 2019;13:31. pmid:31320924
  60. Jacob J, De Francesco D, Deighton J, et al. Goal formulation and tracking in child mental health settings: when is it more likely and is it associated with satisfaction with care? Eur Child Adolesc Psychiatry 2017;26:759–70. pmid:28097428
  61. Valenstein M, Mitchinson A, Ronis DL, et al. Quality Indicators and Monitoring of Mental Health Services: What Do Frontline Providers Think? AJP 2004;161:146–53. pmid:14702263
  62. Campbell DA, Lambright KT. Struggling to Get It Right: Performance Measurement Challenges and Strategies for Addressing Them among Funders of Human Services. Nonprofit Management and Leadership 2017;27:335–51.
  63. Meurk C, Harris M, Wright E, et al. Systems levers for commissioning primary mental healthcare: a rapid review. Australian Journal of Primary Health 2018;24:29. pmid:29338836
  64. Davidson Knight A, Lowe T, Brossard M, et al. A Whole New World: Funding and Commissioning in Complexity. London: Collaborate and Newcastle Business School, Northumbria University 2019. http://wordpress.collaboratei.com/wp-content/uploads/A-Whole-New-World-Funding-Commissioning-in-Complexity.pdf (accessed 24 Jul 2019).
  65. Cooke LJ, Duncan D, Rivera L, et al. The Calgary Audit and Feedback Framework: a practical, evidence-informed approach for the design and implementation of socially constructed learning interventions using audit and group feedback. Implementation Science 2018;13:136. pmid:30376848
  66. Sargeant J, Armson H, Driessen E, et al. Evidence-Informed Facilitated Feedback: The R2C2 Feedback Model. MedEdPORTAL 2016;12.
  67. Michie S, van Stralen M, West R. The behaviour change wheel: A new method for characterising and designing behaviour change interventions. Implementation Science 2011;6:1–12. pmid:21513547
  68. Zuckerman M. Attribution of success and failure revisited, or: The motivational bias is alive and well in attribution theory. Journal of Personality 1979;47:245–87.
  69. Miller DT, Ross M. Self-serving biases in the attribution of causality: Fact or fiction? Psychological Bulletin 1975;82:213–25. pmid:1099222
  70. Co-designing with young people: the fundamentals. Orygen 2019. https://www.orygen.org.au/Training/Resources/Service-knowledge-and-development/Guidelines/Co-designing-with-young-people-The-fundamentals/Orygen-Co-designing-with-YP-the-fundamentals?ext=. (accessed 29 Sep 2020).