Peer Review History

Original Submission: July 6, 2023
Decision Letter - Dorothy Lall, Editor

PONE-D-23-20754
Improving the pragmatic usefulness of the scoring matrix for the Consolidated Framework for Implementation Research (CFIR). A proposal for a more frequency-based approach: the CFIR-f.
PLOS ONE

Dear Dr. Economidis,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

==============================

ACADEMIC EDITOR: Thank you for this submission. It is clear and well written. Please respond point by point to the comments raised by the reviewer. Thank you.

==============================

Please submit your revised manuscript by Nov 02 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Dorothy Lall

Academic Editor

PLOS ONE

Journal requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Thank you for stating the following in the Competing Interests section:

“Anthony Shakeshaft was the lead investigator on a competitive tender (FACS.17.266) awarded by the NSW Government Department of Family and Community Services (now Department of Communities and Justice) to evaluate the Functional Family Therapy – Child Welfare [FFT-CW®] and Multisystemic Therapy for Child Abuse and Neglect [MST-CAN] programs in New South Wales, Australia (2018-2020). George Economidis was employed part-time at the National Drug and Alcohol Research Centre as a Project Coordinator for the FFT-CW® and MST-CAN program evaluation from May 2018 to August 2020.”

Please confirm that this does not alter your adherence to all PLOS ONE policies on sharing data and materials, by including the following statement: "This does not alter our adherence to PLOS ONE policies on sharing data and materials." (as detailed online in our guide for authors http://journals.plos.org/plosone/s/competing-interests). If there are restrictions on sharing of data and/or materials, please state these. Please note that we cannot proceed with consideration of your article until this information has been declared.

Please include your updated Competing Interests statement in your cover letter; we will change the online submission form on your behalf.

3. Thank you for stating the following in the Acknowledgments Section of your manuscript:

“The research team acknowledges and thanks all policy experts and service providers who participated in data collection for this study. We acknowledge the contribution of Dr. Paula Jops and Dr. Hueiming Liu to conducting interviews with some of the policy experts in this study. NDARC receives funding from the Australian Government’s Department of Health.”

We note that you have provided funding information that is currently declared in your Funding Statement. However, funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form.

Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement. Currently, your Funding Statement reads as follows:

“GE was supported by an Australian Government Research Training Program (RTP) Scholarship via the University of New South Wales (UNSW), Sydney, Australia, and a Higher Degree Research scholarship from the National Drug and Alcohol Research Centre (NDARC), UNSW, Sydney. The evaluation of the MST-CAN program in New South Wales, Australia (2018-2020) was funded via a competitive tender process (FACS.17.266) by the NSW Government Department of Family and Community Services (now Department of Communities and Justice). Their Futures Matter and Department of Communities and Justice staff provided feedback on the evaluation plan, participated in interviews as key stakeholders in this evaluation, and reviewed the manuscript.”

Please include your amended statements within your cover letter; we will change the online submission form on your behalf.

4. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.

5. Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: N/A

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thank you for the opportunity to review this paper, “Improving the pragmatic usefulness of the scoring matrix for the Consolidated Framework for Implementation Research (CFIR). A proposal for a more frequency-based approach: the CFIR-f.”

This paper reports and proposes a pragmatic, structured, frequency-based approach to applying the Consolidated Framework for Implementation Research +2/-2 scoring system. Given the number of domains, constructs and subconstructs (n=67) that the updated CFIR contains, innovative and pragmatic ways to analyse, and interpret, data obtained from applying the updated CFIR (and implementation theories, models and frameworks in general) are important. At the same time, the authors identify several valid issues associated with previously published criteria to support researchers and practitioners in applying the +2/-2 scoring system.

Specific questions and critiques follow in the order that they appear in the manuscript.

Abstract

Line 61: The CFIR-f engenders a list of themes, ranked by key experts according to their implied importance, to inform policymakers and services about priorities for action. I have serious concerns regarding this conclusion; is it not more accurate to conclude that the proposed scoring system generates a list of barriers and facilitators ranked by frequency, rather than 'according to their implied importance'?

Background

Line 67, missing references: Programs that simultaneously intervene at the individual, community and societal level are increasingly being utilised, both to optimise their effectiveness and improve the sustainability with which they are integrated into real-world service delivery.

Line 72: Identifying the enablers of, and barriers to, the delivery of multi-level programs at an early stage of their implementation is especially important because it is a particularly volatile stage in the program delivery process [1, 2]. I think it is important to add that identifying implementation determinants at an early stage can help inform implementation efforts (e.g., adaptations and modifications to the intervention and/or implementation strategy).

Line 77, missing reference: One such framework is the Consolidated Framework for Implementation Research (CFIR).

Line 87: In brief, implementation evaluators would typically convene a group of people involved in the delivery of a program and, using an established qualitative method (e.g., semi-structured interviews), ask a series of questions that align with some or all of CFIR’s constructs. This is one way CFIR can be applied, but I’m not convinced it can be described as the ‘typical’ way. I suggest the authors read/cite the following paper and provide a more accurate summary of how CFIR has been used: Kirk, M.A., Kelley, C., Yankey, N. et al. A systematic review of the use of the Consolidated Framework for Implementation Research. Implementation Sci 11, 72 (2016). https://doi.org/10.1186/s13012-016-0437-z

Line 96: The answers to each of these questions would then be analysed using a standardised thematic analysis, and each theme is allocated a score by the evaluators ranging from +2 to -2: themes rated +1 or +2 are classified as enablers; themes rated -1 or -2 are classified as barriers; and themes rated as 0 are classified as being neither an enabler nor a barrier to implementation [4, 12]. Related to the previous point, I think it needs to be made clear that the +2/-2 scoring system is frequently applied.

Line 101: I suggest rewording ‘actual barriers and enablers’ to ‘more influential barriers and enablers’: The more extreme scores (i.e., +2 and -2) are interpreted as actual barriers and enablers, while +1 and -1 are interpreted as being generally influential. The word ‘actual’ implies that barriers and enablers scored +1 or -1 are not ‘actual’ barriers and enablers, which is not the case.

Methods

Table 1 is very useful and clear.

Results

Table 2 title: add ‘and neither enablers nor barriers’. Please make it clear in the column heading what the number in brackets in the first column represents, e.g., ‘Technical, logistical and referral challenges (n=19)’. This is not obvious.

Line 224: The logic is to provide services, funders and policymakers with a list of barriers and enablers that are weighted according to their importance, as determined by the frequency with which they are mentioned by the experts participating in each group. I disagree with the following part of this statement: ‘a list of barriers and enablers that are weighted according to their importance’. I strongly suggest changing to ‘a list of barriers and enablers by the frequency with which they are mentioned…’ This issue arises throughout the paper, please review and amend each statement accordingly (e.g., Line 387: the CFIR-f allocated a continuous score to each theme which can be interpreted as a list of themes prioritised by experts). The themes were not ‘prioritised’ by experts.

Discussion

Line 443. A major limitation of the CFIR-f approach is that identified barriers and enablers are ranked and ‘prioritised’ based solely on frequency, rather than frequency AND strength of influence on implementation. I don’t think the authors go far enough in highlighting and discussing the implications of this limitation. I think it would be useful to compare this approach to the original CFIR scoring approach, which determines strength of influence on implementation by considering a number of factors, including level of agreement among participants, strength of language, and use of concrete examples. Would a matrix approach, considering both strength and frequency, be the most desirable scoring approach? Perhaps this is something to highlight that should be explored in future research.

Line 457. I disagree that it is reasonable to conclude ‘that technical, logistical and referral challenges [score -11] is regarded by experts as being about three times more problematic than the need for clarification and amendment of the eligibility criteria of the programs ([score -3]).’ At best, the authors can state that technical, logistical and referral challenges were identified about three times more frequently as a barrier to implementation than the need for clarification and amendment of the eligibility criteria of the programs. Frequency and the degree to which a barrier is ‘problematic’ are different.

How is the CFIR-f more pragmatic than the CFIR approach? It would be useful for the authors to define what they mean by ‘pragmatic usefulness’.

**********

6. PLOS authors have the option to publish the peer review history of their article. If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Louise Hull

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Revision 1

Please see attached 'Response to Reviewers' uploaded file articulating my detailed responses to reviewer/editor comments. Thank you.

Attachments

Submitted filename: Response to Reviewers.docx
Decision Letter - Dorothy Lall, Editor

Improving the pragmatic usefulness of the scoring matrix for the Consolidated Framework for Implementation Research (CFIR). A proposal for a more frequency-based approach: the CFIR-f.

PONE-D-23-20754R1

Dear Dr. Economidis,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Dorothy Lall

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Thank you; all comments have been addressed and the manuscript has improved.


Formally Accepted
Acceptance Letter - Dorothy Lall, Editor

PONE-D-23-20754R1

Improving the pragmatic usefulness of the scoring matrix for the Consolidated Framework for Implementation Research (CFIR). A proposal for a more frequency-based approach: the CFIR-f.

Dear Dr. Economidis:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at customercare@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Dorothy Lall

Academic Editor

PLOS ONE

Open letter on the publication of peer review reports

PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.

We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.

Learn more at ASAPbio.