Facilitating Patient and Public Involvement in basic and preclinical health research

Involving patients in research broadens a researcher's field of influence and may generate novel ideas. Preclinical research is integral to the progression of innovative healthcare, yet it is not a patient-facing discipline, and implementing meaningful PPI can be a challenge. A discussion forum and thematic analysis identified the key challenges of implementing PPI for preclinical researchers. In response, we developed a "PPI Ready" planning canvas. For contemporaneous evaluation of PPI, a psychometric questionnaire and an open-source tool for its evaluation were developed. The questionnaire measures informational, procedural and quality aspects of PPI. Combined with the open-source evaluation tool, researchers are notified if PPI is unsatisfactory in any of these areas. The tool is easy to use and adapts a psychometric test into a format familiar to preclinical scientists. Designed to be used iteratively across a research project, it provides a simple reporting grade to document the satisfaction trend over the research lifecycle.


Introduction
Involving patients or the interested public in research broadens a researcher's field of influence, generating novel ideas, challenges and discussions. Basic, translational and preclinical research (hereafter preclinical) is integral to the progression of innovative healthcare; indeed, the majority of National Institutes of Health (NIH) funds in the USA are focused on preclinical research (1). Preclinical research is not a traditionally patient-facing discipline and implementing meaningful patient and public involvement (PPI) can be a serious challenge in the absence of well-defined support structures.
Increasingly in healthcare, patients and the interested public are sought as partners in study design and governance. This trend is growing due to an increasing requirement by national, international and charitable funding bodies to include PPI as a condition of funding (2-6). We use the INVOLVE definition of PPI, whereby PPI in research is defined as research carried out with or by patients and those who have experience of a condition, rather than for, to, or about them (7). PPI has multiple demonstrated positive impacts on research, including the potential to reduce waste in the research landscape (4,8,9). Furthermore, if a project or researcher is funded by a public body there is a duty to demonstrate accountability for the expenditure of supporting funding. Thus, incorporating PPI should be beneficial to both researcher and research outputs. There is a growing recognition within the biomedical community that PPI can be beneficial to all stakeholders (10). Awareness of PPI is certainly increasing; however, the incorporation of valuable and meaningful PPI as standard research practice is progressing more slowly than expected (11,12).
PPI is collaborative by its very nature, and its impact on its key stakeholders should be assessed when considering PPI success.
Preclinical researchers are not patient-facing by nature and, as such, some of the basic aspects of implementing meaningful PPI can be challenging without the relevant support structures. Understanding the perceived challenges and barriers as viewed by preclinical researchers is a first step in the development of appropriate resources and guidance documents to facilitate valuable collaboration and involvement between the public and the largest recipient cohort of publicly funded health research (13). Here we describe an analysis of the views of preclinical researchers into the challenges regarding PPI. In partner: "Worried that a mistake I make will upset a patient". Communication concerns included being able to explain and justify the need for preclinical and basic research, which has a longer-term and less direct research impact. There was concern around public representatives bringing personal judgement upon research methods: "Patients may be against some aspects of research such as animal testing. We need [. . .] to convey the importance of animal testing when it comes to the progression of research." There was concern that the time spent justifying research that has already been through rigorous ethics application to public partners with alternative agendas could slow the research process and be counter-productive.

Communicating with vulnerable groups is not in the preclinical researchers' toolkit
Preclinical research training may include scientific communications, presentation skills and even media training. Seldom, however, is training provided for communicating with vulnerable groups such as patient cohorts. The worry of appropriately communicating with vulnerable groups was particularly prominent among less experienced researchers and among researchers investigating fatal and life-limiting illnesses: "The unfortunate reality of PPI is that some patients may have life-threatening diseases. What happens if a patient dies? As a researcher I don't know what to do in that situation and I don't know how to handle it". More generally, there was a recurrent worry of upsetting the public, a fear of "saying the wrong thing" or causing upset or public disengagement. The perceived lack of guidance, combined with the bespoke nature of PPI, resulted in many researchers feeling the burden of publicly representing their research and their institute. Whereas they may be comfortable presenting a fully peer-reviewed "sanctioned" piece of research output, there was a personal and professional vulnerability associated with sharing early-stage and ongoing research with the public. Researchers felt they would be guarded and could not discuss preliminary research fully if a public member was present, for fear of the research being misconstrued or offering false hope.

Time consumption is a major concern hindering PPI implementation
The largest concern about implementing PPI was, overwhelmingly, the time required for the engagement, recruitment, procedural and ongoing management of public involvement. This may be a function of the relatively recent strategic shift towards PPI in Ireland. "I simply don't have enough time to implement PPI. I can't delegate any work in relation to PPI because I don't think enough people know what it is and what to do." In particular, researchers who had no experience in PPI had the strongest sense that the time-consuming nature of PPI outweighed its benefit to preclinical research: "Time spent engaging with patients is time spent away from doing research". Again, this could be influenced by the lack of experience in engaging patients, the perceived lack of guidance and clarity around PPI (15) and the dissociation between preclinical research and policy setting (16-18).

Development of a tool to facilitate an individual researcher to become PPI ready
PPI is bespoke; there is no standard formula for a good PPI initiative. However, one can be as prepared as possible to implement a bespoke PPI initiative. By reflecting on the main theoretical challenges of implementing PPI in advance of starting a research project, an individual researcher can facilitate the downstream success of their PPI initiative. An individual researcher's theoretical challenges will stem from the uncertain boundaries of the concept of PPI. Therefore, we have adapted a standard business tool, the business model canvas (BMC), which is designed to enhance strategic thinking for business innovation and inform the exploitation of emerging opportunities in changing environments (19,20). The PPI Ready Researcher Planning Canvas (figure 1B) is designed to enhance strategic thinking about PPI and enable a researcher to prepare themselves for PPI. The PPI Ready Canvas is a visual plan with elements describing a researcher's challenges, proposed solutions, the solutions' value proposition, research impact, and financial and time costs. It is designed to assist researchers in aligning their activities by illustrating potential benefits and trade-offs.
The PPI Ready Researcher Planning Canvas elements allow researchers to consider the challenges that are hindering their implementation of PPI. It is broken down into Barriers: institutional policies, procedures or situations that may hinder PPI; Worries: personal hurdles that cause anxiety or unease about PPI; and Concerns: perceived impediments to research value or practice. There is corresponding space to consider the response to each challenge. To support a cost/benefit analysis of implementing PPI, there is a module for Value: what additional benefit will PPI bring to your research; and Impact: what effect on impact will inclusion of PPI bring? Consider cultural, economic, health, political, social and knowledge impacts. To ground the proposed solutions or responses, there are modules to consider both the timeframe required to prepare for PPI and the associated costs and funding required.

A mechanism for iterative PPI assessment to allow incremental revisions and PPI development
Formalised PPI is a new concept in preclinical research. All initiatives, especially paradigm shifts, carry associated risk. The earlier in the project lifecycle a risk can be avoided, the more accurate the design can be made. Many risks are not discovered until they have been integrated into a system, and it is not feasible to identify all risks. Therefore, just like any other area of the research cycle and design process, PPI should be assessed, evaluated and improved iteratively in response to experience and lessons learned. This may be tricky considering that PPI, by its very nature, is not prescriptive. The form that PPI takes will be bespoke, depending upon the needs of the research. However, the underlying concepts for valuable and meaningful PPI remain constant and can be measured and assessed.
We have developed a PPI satisfaction assessment survey to be used for iterative and contemporaneous PPI assessment throughout the life cycle of a research project (table 2). Question wording and scale were refined in response and reassessed by a subset of the PIP cohort. The refined questionnaire and the general self-efficacy scale (GSE) were piloted on a cohort of patients and public who had attended a research involvement or engagement event. N=60 responses passed the consent check. Oblique factor analysis identified co-linearity between PPI questions, which led to question exclusion based on quantitative analysis and informed by qualitative input from PIPs. The refined questionnaire passed content validity analysis, including internal validity and discriminant and convergent validity analysis (RSq 0.835 and adjusted RSq 0.832, figure 2).

An open-source simple analysis and reporting tool for the PPI assessment survey
Time consumption was highlighted as a major barrier to implementing PPI in the preclinical sciences. Therefore, development of a PPI assessment survey (PAS) would be redundant unless it was quick and easy to use. The tool is simple by design, at the suggestion of preclinical researchers who did not want to require any further training in order to use it. An Excel spreadsheet was selected because this programme is widely used and available, and researchers were comfortable with it. We have therefore developed a series of simple Excel templates with pre-formulated inputs for the analysis of the survey (figure 3). Furthermore, there are linked templates that can be sent to the PIP, which allow the researcher's file to be updated and analysed as the PIP fills in their survey response in a separate Excel document.
The analysis is multi-dimensional and uses a flagging system that preclinical researchers will be familiar with from the standard quality control measures used by laboratory equipment. The analysis tool uses a flagging system for each question answered at the individual PIP level. The banding for our flag system was based upon a previously published large analysis of different response scales (24). If a PIP scores a question ≥7, no flag is generated. If a question is scored between 4 and 6, a yellow warning flag, "Some attention required", is generated. A score of ≤3 on any individual question generates a red warning flag, "Immediate attention required". These flags alert the researcher to risks associated with their PPI initiative and allow them to put control measures in place. The iterative use of the PAS over time will facilitate trend analysis to determine the success of the control measures.
The GAS question is included in the PAS (question 9). For validation of the survey during use, the analysis template has an inbuilt content validation (figure 3).
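The per-question banding described above can be re-expressed as a minimal Python sketch. The function name is illustrative, the flag labels mirror the template's wording, and the explicit 0-10 bounds check is an added assumption:

```python
# Hypothetical re-implementation of the per-question flag banding used in
# the Excel template: >=7 no flag, 4-6 yellow warning, <=3 red warning.
def flag(score: int) -> str:
    """Map a single 0-10 PIP response to a quality-control flag."""
    if not 0 <= score <= 10:
        raise ValueError("responses use an 11-point (0-10) scale")
    if score >= 7:
        return "No Attention Required"
    if score >= 4:
        return "Some Attention Required"   # yellow warning flag
    return "Immediate Attention Required"  # red warning flag
```

Applied per question and per PIP, this reproduces the flag grid a researcher sees in the template.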
If the GAS response is within the range of the mean of Q1-8 +/- two standard errors, the PAS is consistent in its measurement of satisfaction and a "PASS" flag is produced. If the GAS is outside this range, a "FAIL" flag is produced and caution should be used in interpreting the results, as there may be an underlying construct skewing the PAS response.
There is also an overall PAS assessment grade, produced from the mean PIP response across all questions. This can be used as an easy reporting measure for annual grant reports and for assessing a PPI initiative across the lifecycle of the project. The overall grade uses categorisation based upon standard risk assessment, a concept familiar to most preclinical researchers (25). An overall grade ≥7 is acceptable, 4-6 highlights a moderate risk to PPI success, and a grade <4 highlights a substantial risk to PPI success. The overall global grade is a useful tool for the measurement and reporting of PAS. However, if used in isolation there is a risk of masking potential subscale concerns. The PAS assesses three value levels: informational (quality and clarity of the information and communication provided, n=2 questions), procedural fairness (consideration of the PIP's needs, n=2 questions), and quality (how valued and valuable a PIP perceives the PPI initiative to be, n=4 questions). The analysis tool generates a per-category summary, which is the mean response from PIPs within each value category. This allows easy identification of the improvements required between value categories and allows the researcher to pinpoint the area requiring refinement in order to improve their PAS grade (figure 3B).
[Figure 3 caption fragment: (C) The scores can be plotted over time to visualise the PPI satisfaction trend. As illustrated, the overall score is relatively flat, but plotting each value score shows that the informational assessment required attention and improved upon implementation of changes (after the second quarter) to increase clarity in communications.]
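The per-category summary can be sketched as follows. The mapping of specific questions to categories (Q1-2 informational, Q3-4 procedural fairness, Q5-8 quality) is an assumption for illustration; the text specifies only the category sizes:

```python
from statistics import mean

# Illustrative question-to-category mapping (indices into a Q1-8 response
# row); the paper specifies category sizes (2, 2, 4) but not which
# questions belong to each category.
CATEGORIES = {
    "informational": [0, 1],
    "procedural": [2, 3],
    "quality": [4, 5, 6, 7],
}

def category_summary(responses):
    """responses: one list of eight 0-10 scores (Q1-8) per PIP.
    Returns the mean PIP response per value category plus the overall score."""
    summary = {
        cat: mean(row[i] for row in responses for i in idx)
        for cat, idx in CATEGORIES.items()
    }
    summary["overall"] = mean(s for row in responses for s in row)
    return summary
```

A flat overall score with a low informational mean, as in the worked figure, would show up here as one depressed category value.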

Limitations of the PAS
The PAS is designed to facilitate the assessment of PPI for preclinical researchers, who tend to have little to no experience in qualitative, semi-quantitative or psychometric research. It is designed specifically for those with limited experience of, or access to, established PPI resources. It addresses the key concepts underlying successful PPI collaborations between patients/public and researchers and ensures they are
The circumvention of the qualitative analysis of other assessment tools, such as feedback forms which allow free-text input, reduces the time and the requirement for specialist knowledge and training for PPI assessment. It also allows the satisfaction trend to be easily tracked across time and in response to improvement measures. This can be beneficial, but it is also a limitation, as there is a loss to the extent of information that can be assessed. A quantitative method cannot provide in-depth information on the motivations behind the levels of satisfaction, nor provide details of PIP experiences with PPI (26). The use of the PAS with informal PIP feedback, in response to issues identified in PAS analysis, provides a realistic and feasible solution given the researcher challenges described above. However, for the most robust development and progression of PPI as a research and impact tool, a mixed-methods approach using the PAS and qualitative in-depth exploratory PPI narrative research and analysis could enable new topics and insights to emerge, which could also be confirmatory for the quantitative findings.

Concluding Remarks
Preclinical researchers are increasingly required to wear multiple hats as part of the standard expectations of a research career. Established roles include study design, conduct and experimental analysis, but researchers are also now expected to be professional teachers, presenters, public speakers, writers, reviewers, editors, managers and mentors. There is increasing pressure to commercialise research in-house, act as innovators and become social and political influencers (27,28). It is therefore unsurprising that time commitment, communication and lack of guidance were considered major challenges to the implementation of PPI in preclinical research. Understanding that researchers face personal challenges, anxieties and worries, as well as institutional barriers and research concerns, is a first step in encouraging meaningful PPI collaborations. Addressing the challenges faced by preclinical researchers will greatly help the development of resources and guidance documents to facilitate PPI for one of the largest recipient cohorts of publicly funded health research (1,29).
The research described here will aid preclinical researchers in the formulation of PPI. Adequate preparation in advance of a research project with public involvement should improve the success of that involvement. All new concepts carry challenges and risk, and involving the public in research is no different. Challenges are inevitable; however, assessing public involvement iteratively through the use of the PAS described herein allows those challenges to be addressed and managed contemporaneously. In conclusion, we have developed a tool specifically for the needs of preclinical researchers to facilitate the implementation, assessment and improvement of PPI throughout the lifecycle of a research project.

Ethics Statement
Ethical approval was obtained from the local Institutional Review Board (UCD Human Research Ethics Committee). Survey responses were collected anonymously, with the informed consent of the participant for non-commercial use of the data provided.

Patient and Public Involvement Statement
Patients were engaged through The Patient Voice in Arthritis Research PPI group. People with experience of living with any rheumatic disease, from any area of Ireland, were invited to apply to be involved in this study. Communication was remote, via email and phone. The patient insight partner group of 12 were sent a review guide and a structured template for their review (supplemental information). PIPs were asked specifically about: language accessibility, relevance, usefulness, necessity of the questionnaire, missing aspects, the scale used, the likelihood of intended use, questionnaire length, overall views and alternative assessment methods. The questionnaire was adjusted in response to PIP feedback. Telephone follow-up was used to obtain views on questionnaire refinement. PIPs were also asked to share the survey pilot within their relevant networks. The results of the study will be shared with PIPs via email and through the patient/researcher co-produced newsletter of the UCD Centre for Arthritis Research, News Rheum.

Researcher View on Public and Patient Involvement
Basic, translational and preclinical health researchers within the UCD College of Health and Agricultural Sciences were invited to a discussion forum to express their views on implementing PPI. All attendees consented to the non-commercial use of the data provided. A facilitated semi-structured discussion format was used, consisting of three groups, each with a note-taker and a facilitator. There was also a roaming central facilitator. An option for written contribution was provided for those who did not wish to voice their opinions. Notes were combined and a qualitative textual analysis was performed.

Planning Canvas Development
In response to the preclinical researcher views on implementing PPI, we proposed that reflecting on the main theoretical challenges of implementing PPI, which stem from the uncertain boundaries of the concept, in advance of starting a research project would facilitate downstream success for PPI. A common tool to help businesses rethink their strategy in a fast-evolving landscape is the Business Model Canvas (BMC). The BMC is used to enhance strategic thinking about business innovation (20). We used the theory and design concept of the BMC, informed by researcher views, to develop the PPI Ready: Researcher Planning Canvas. The PPI Ready Canvas is designed to facilitate the researcher's preparedness for PPI, rather than to plan an individual PPI activity.

Questionnaire Development
A review of the health, public engagement and marketing research literature was conducted (30-44). A long questionnaire of 15 questions was developed in response to the researcher discussion forum (supplemental information). Three key processes for assessment were information (n=4 questions), procedural fairness (n=4) and quality (n=7). The global assessment question, "Overall, how satisfied/dissatisfied are you with your involvement in this project", was included as question 16 for convergent validity.

Face Validity and Questionnaire Accessibility
A voluntary, community-recruited panel of patient insight partners (PIPs) (n=12) reviewed the questionnaire for face validity and language accessibility. The PIP review document can be found in the supplementary methods. Questions were simplified and refined in response, and changes were discussed with PIPs. An 11-point satisfaction scale with 3 anchor points (at 0, 5 and 10) was used for all questions (22).

Survey Pilot
A survey containing the 15 public involvement (PI) questions, one global assessment of satisfaction (GAS) question and the well-characterised 10-question general self-efficacy (GSE) scale (23) was piloted on a cohort of 63 adults (the full survey can be found in the supplementary information). All respondents self-reported as having attended a meeting or event(s) that gave them the opportunity to discuss or express their views about health research or to share their experience with researchers. Three did not consent to data storage and use; their responses were therefore excluded from analysis. Of the 60 respondents, 72% (n=43) were patients, 8% (n=5) were carers, 17% (n=10) were family members and 3% (n=2) were other members of the public. All surveys were complete and there were no missing entries.

Data Analysis
Data were analysed in IBM SPSS v24. Factor analysis with a correlation matrix was used to identify co-linearity and redundancy within the 15-question PI questionnaire. As expected, there was a high degree of co-linearity, and variables were removed based on a correlation greater than 0.8. To inform refinement, PIP insight, views and reported relevance were considered. Eight questions remained after refinement (8QPI). Factor analysis was repeated on the 8QPI data, which met all requirements for the determinant (>0.0001), KMO (>0.8) and Bartlett's test (p<0.05) for sampling adequacy. Cronbach's alpha with a cut-off of 0.7 was used for internal consistency reliability testing. Discriminant validity was analysed via factor analysis, component analysis and the correlation between 8QPI and GSE. Convergent validity was established via linear regression between the average 8QPI and the GAS question, using a cut-off value of 1-(2SE) (two standard errors).
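The internal-consistency statistic named above can be illustrated with a minimal, self-contained sketch of Cronbach's alpha (the analysis itself was run in SPSS; this is only a reference implementation of the formula, with `items` given as respondent rows of item scores):

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha: (k/(k-1)) * (1 - sum(item variances)/variance(totals)).
    items: list of respondent rows, each a list of k item scores."""
    k = len(items[0])                                  # number of items
    cols = list(zip(*items))                           # per-item score columns
    item_var_sum = sum(variance(col) for col in cols)  # sum of item variances
    total_var = variance([sum(row) for row in items])  # variance of total scores
    return (k / (k - 1)) * (1 - item_var_sum / total_var)
```

Against the 0.7 cut-off used here, a questionnaire whose items move together scores near 1, while weakly related items pull alpha down.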

Modelling the 8QPI into a Quality Control Framework
The validated PI questionnaire includes two questions for information assessment, two questions for assessment of procedural fairness, four questions for quality assessment and the GAS question. We developed a simple flagging system for each of the three key assessment categories, measured on a per-individual basis. Furthermore, there is also an overall assessment grade that provides a global measure of PIP satisfaction with an individual PPI scheme.
The Excel analysis template can be found in the supplementary materials. PAS results are entered directly or via a linked file. For each question, responses of 0-3 issue an 'Immediate Attention Required' flag, responses of 4-6 issue a 'Some Attention Required' flag and responses of 7-10 issue a 'No Attention Required' flag (24). Content validity is automatically tested by comparing the GAS response to the overall mean PAS (Q1-8) response. If the GAS is within the mean +/- 2 standard errors (as determined in the PAS pilot), a 'PASS' flag is generated; otherwise a 'FAIL' flag is produced and the data must be interpreted with caution.
An output summary table is generated on tab 2. Each category (communication, procedural and quality) receives a score based on the mean responses for all PIPs. An overall score, based on the mean response for all questions (Q1-8), and an associated PPI Grade are generated for simple PPI satisfaction reporting. Based on the risk matrix concept, overall scores of 0-3 are 'High Risk' for PIP dissatisfaction/PPI failure, scores of 4-6 represent a 'Moderate Risk' and scores of 7-10 are 'Acceptable' (45).
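The automatic content-validity check can be sketched as follows. Deriving the standard error from the eight question responses themselves is an illustrative assumption; the published template uses a band determined from the PAS pilot data:

```python
from math import sqrt
from statistics import mean, stdev

def content_validity_flag(q1_8, gas):
    """Compare the GAS response to the mean PAS (Q1-8) response.
    Returns 'PASS' if the GAS falls within the mean +/- 2 standard errors,
    otherwise 'FAIL' (interpret the PAS with caution)."""
    m = mean(q1_8)
    se = stdev(q1_8) / sqrt(len(q1_8))  # standard error of the mean (assumed band)
    return "PASS" if (m - 2 * se) <= gas <= (m + 2 * se) else "FAIL"
```

A GAS response far outside the band suggests an underlying construct is skewing the PAS, which is exactly when the 'FAIL' flag tells the researcher to pause.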