Peer Review History
| Original Submission: January 5, 2025 |
|---|
|
PONE-D-24-58659

Evaluation capacity building in a rural Victorian community service organisation: A formative evaluation

PLOS ONE

Dear Dr. Kavanagh,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Both reviewers have provided detailed comments about your manuscript, which we would appreciate you addressing in full. One thing I would like to highlight is Reviewer 1's comments about anonymity for the case study institution: is it necessary to name the institution? It is not clear from the ethical approval whether this is necessary; however, some clarity on whether the institution is happy to be named would be beneficial. My assumption is that by partaking in the study they were happy with this, but it would be good to clarify.

Please submit your revised manuscript by Jun 08 2025 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org.

When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Daniel Parkes, PhD
Staff Editor
PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf.

2. Please note that funding information should not appear in any section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form. Please remove any funding-related text from the manuscript.

3. We note that you have indicated that there are restrictions to data sharing for this study. PLOS only allows data to be available upon request if there are legal or ethical restrictions on sharing data publicly.
For more information on unacceptable data access restrictions, please see http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions. Before we proceed with your manuscript, please address the following prompts:

a) If there are ethical or legal restrictions on sharing a de-identified data set, please explain them in detail (e.g., data contain potentially identifying or sensitive patient information, data are owned by a third-party organization, etc.) and who has imposed them (e.g., a Research Ethics Committee or Institutional Review Board, etc.). Please also provide contact information for a data access committee, ethics committee, or other institutional body to which data requests may be sent.

b) If there are no restrictions, please upload the minimal anonymized data set necessary to replicate your study findings to a stable, public repository and provide us with the relevant URLs, DOIs, or accession numbers. For a list of recommended repositories, please see https://journals.plos.org/plosone/s/recommended-repositories. You also have the option of uploading the data as Supporting Information files, but we would recommend depositing data directly to a data repository if possible.

We will update your Data Availability statement on your behalf to reflect the information you provide.

4. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions.
Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: I Don't Know
Reviewer #2: N/A

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The article is interesting, timely and well presented.
Here are some aspects that could be clarified:

* The MEL concept (page 5) needs to be presented in a little more depth, as it is a concept that appears in research, policy documents, and NGO reports alike. Some say Measurement, not Monitoring, etc.

* The authors could show how this topic is relevant across geographical and professional boundaries, as it might seem a bit 'limited' to the Australian context and to the local CSOs there.

* In line with this: the authors could show more familiarity with research in the field. For example, when they show the relevance of an embedded researcher, they could also link this to major discussions in the field regarding the need for research-minded practitioners, as Mike Austin (Berkeley) and Bowen McBeath (Portland, USA) have published about.

* The authors could also refer a bit more to the extensive literature on barriers to organizational learning, to avoid the discussion and "problem" being only about experiences from this particular context. The article has the potential to be an important addition to this research in the international context, and should therefore not be limited only to the Australian and this particular rural setting. The topic is familiar across professions and globally. The literature on evaluation research could also be mentioned (evaluation as a research field).

STUDY CONTEXT (pages 6-7) - here some clarifications are needed:

First of all: the place is not anonymized. Is that OK by the informants? The authors should reflect on this.

1. Are the authors both employed at and researching their own workplace and among their colleagues? If that is the case, it must be reflected on, as it may affect biases in data collection, mixed roles, etc.

2. Who designed the program of evaluation capacity (page 6, line 109)?

3. Explain the selection criteria: why were the 4 programs selected (page 7, line 118), what area do they cover, and why these four of all 38 programs?

4.
What do you mean by 'the wider program' (page 7, line 119)?

5. Members of the teams were selected 'based on their role and interest' (lines 119-120). How can this affect the data and results? (Will they be more positive towards the research?)

6. Were clients involved at any stage of this research process (why/why not)?

7. The sentence on page 8, lines 147-149 is nearly identical to the sentence in Material and Methods on the same page, lines 157-159, and one of them should be deleted.

MATERIAL AND METHODS

1. Consent is described; however, the ethical challenge of not anonymizing the place must be reflected on. Do the workers know, and is it OK? What are the challenges of not concealing where this research is taking place?

2. The qualitative data - how are they being used in the Results section? How are they analysed? The section on Data analysis needs to be strengthened. It is only 5 lines, and in the name of transparency in research, it would be helpful if the authors could explain in some more detail how they analysed the diverse data collected by ethnographic and interview methods, and show more clearly how these are used and presented in the Results and Discussion.

3. I recommend that the information on the informants' varied backgrounds on pages 13-14 (lines 234-238) be moved to the Study context section, where data and selection are mentioned.

RESULTS

Some parts could be considered for moving to the Discussion/Implications, for example the reflection on page 18, lines 302-309. Some results from the qualitative data are missing.

DISCUSSION

The section sums up the study and facilitators/barriers well, and succeeds in showing why this under-researched topic is crucial for the enhancement of local services. The authors also bring interesting topics to the discussion regarding how to develop capacity building for this work. The important potential of a link officer, embedded researcher, developing research-minded practitioners, etc. is of high interest in this paper.
The dynamic sustainability model is introduced, which contributes to the discussion. It is good that the authors finally link their discussion to a wider context (page 21). The section is quite long, and some of its parts are about lessons learned and, consequently, implications for further practice. These parts could be structured and presented in a separate section following the Discussion (e.g. development of shared understanding, how to facilitate more reflection as a pre-condition for success, etc.). Limitations and strengths could also be presented in a separate section.

Thank you for the opportunity to read this interesting study, and good luck in finalizing the article.

Reviewer #2: The manuscript presents valuable and original research regarding evaluation capacity building (ECB) within a rural, place-based community service organisation (CSO) in Victoria, Australia. Focusing on the early implementation of Monitoring, Evaluation, and Learning (MEL) pilots, the study addresses a documented gap in the literature, particularly for rural and resource-constrained settings. The use of the RE-AIM framework to structure this formative evaluation is appropriate and well justified. The paper is timely, aligning with current scholarly interest in strengthening ECB practice, as demonstrated by a recent special issue of New Directions for Evaluation (2024, Issue 183) dedicated entirely to exploring best practices and effective methodologies for evaluating ECB itself.

While the topic is important and the initial work valuable, major revisions are required to improve methodological transparency, address the data gaps appropriately, and ensure conclusions are robustly supported by the available evidence. I offer the following suggestions for the authors’ consideration:

General and Conceptual Suggestions

- The study makes a strong case for the importance of MEL in CSOs, but the link between rurality and the specific challenges of MEL implementation could be strengthened.
The Modified Monash (MM) classification is used to define rurality, yet the implications of this classification for workforce, infrastructure, and program delivery are not clearly connected to the evaluation design or the observed challenges.

- The manuscript notes that Brophy identified a need for MEL integration; however, it would be helpful to clarify how and by whom this need was identified—e.g., via leadership, internal audit, funder requirements, or staff demand.

- The pedagogical approach to MEL training could be described more clearly. The authors imply an experiential learning model (participants applied tools to their own programs), but this is not explicitly stated. Clarifying whether the approach was experiential, practice-based, or otherwise grounded in adult learning theory would help contextualize the training’s design and intent.

- The authors mention that five MEL modules were delivered between June and October 2023. Specifying the approximate time commitment (e.g., total hours, duration of sessions) and whether participants received materials such as workbooks, templates, or pre/post tasks would be useful.

- The “evaluation champions” strategy is briefly noted as an implementation enabler. It would be helpful to indicate whether this was a formal designation or an informally observed role, and what functions these champions were expected or observed to play in supporting MEL adoption.

Methods & Results

- The supplementary materials, including the RE-AIM operationalization (Supplementary Material 1), questionnaire (Supplementary Material 2), and interview guide (Supplementary Material 3), were reviewed and provide useful context for the study methods (though the questionnaire warrants revision as noted below).

- The first author was embedded in the organization and facilitated the intervention.
While this position likely offered advantages, the discussion does not adequately reflect on potential biases or power dynamics that may have influenced participant responses or engagement. A brief reflexive comment on this dual role would add methodological integrity.

- While the use of the RE-AIM framework is a strength, some domains are addressed more robustly than others. For instance, “Reach” is primarily described in terms of team selection, with limited analysis of participant engagement or reasons for non-participation. Similarly, “Implementation” focuses mainly on fidelity to the protocol (e.g., number of outcomes developed), but does not address the cost or resource requirements of the MEL intervention—an element sometimes included within the RE-AIM framework (and noted as potentially relevant by the authors in Supplementary Material 1). The authors should clarify which RE-AIM domains were prioritized or fully evaluated at this early implementation stage.

- The manuscript would benefit from a clearer explanation of how data source triangulation was undertaken. While the study is described as using a mixed methods approach—including self-report questionnaires, administrative data, and (planned) qualitative interviews—it remains unclear how these data sources were compared, integrated, or analyzed in relation to one another. For example, were questionnaire findings cross-validated against administrative records or implementation logs? Were efforts made to identify convergence or divergence across data types? The strength of a mixed methods design often lies in triangulating multiple perspectives or forms of evidence. In its current form, the manuscript does not fully demonstrate how this integration occurred, weakening the mixed methods claim. Brief clarification would enhance methodological transparency and credibility.
- A self-report questionnaire (Supplementary Material 2), adapted from a tool on evidence-informed practice, was used to assess MEL-related constructs (e.g., awareness, beliefs, attitudes, knowledge, skills, implementation). However, the tool appears to have several weaknesses. Firstly, several items seem to measure overlapping constructs—e.g., “beliefs” (that MEL improves outcomes) and “attitudes” (willingness to support MEL)—which may limit conceptual distinction. Secondly, the tool mixes response formats (Likert scales, dichotomous items), which can undermine consistency and comparability. Thirdly, based on the items provided, it appears each dimension (Awareness, Beliefs, Attitudes, Knowledge, Skills) is assessed using only a single item. Relying on single-item measures can significantly limit reliability and validity compared to multi-item scales. The authors should clarify how the original tool was adapted for this study and describe the adaptation process, acknowledging the limitations of the measurement approach used. Future iterations should consider a more rigorously developed instrument, potentially using multi-item scales aligned with established ECB frameworks (e.g., Preskill & Boyle, Nielsen).

- The data collection using administrative records is appropriate, but the authors could better explain how these data were analyzed—e.g., what criteria determined protocol adherence, and how deviations were classified.

- The description of short evaluation cycles (SECs) would benefit from more detail. How were lessons from each SEC integrated into MEL refinements? Including an example of how adaptations occurred in real time would reinforce the developmental nature of the work.

- The participant sample size for the questionnaires is very limited (n=6 post-training, n=0 follow-up). Consider discussing whether additional data collection efforts (e.g., reminders, in-person data capture, brief intercept interviews) were attempted or might be feasible in the future.
- Several implementation barriers are reported (e.g., limited staff capacity, organizational change, unclear value of MEL), but it is often unclear which data source supports each conclusion. Were these insights drawn from survey responses, meeting logs, informal reports, or researcher observation? For example, the barrier “waxing and waning engagement” would be more compelling if grounded in specific data (e.g., frequency of missed meetings, analysis of communication logs).

Discussion

- The manuscript includes a theory of change model (Figure 1), but the discussion does not return to it sufficiently. There is limited reflection on whether initial assumptions about mechanisms of change were confirmed, challenged, or refined based on the evaluation findings. This is a missed opportunity, especially in a formative evaluation context.

- While many barriers are discussed, enabling factors (e.g., physical presence of the evaluation lead, supportive leadership, flexible design) are mentioned relatively briefly. A more balanced discussion synthesizing what worked could be useful for practitioners reading this paper for guidance.

- Although the authors acknowledge the low response rate and lack of interview data in the Results section, these limitations are not sufficiently explored in the Discussion. The manuscript requires a more critical reflection on how these data gaps affect the credibility, depth, and validity of the conclusions. For instance, the absence of follow-up survey responses prevents robust assessment of changes in MEL-related knowledge, attitudes, or practices over time. Similarly, the exclusion of qualitative interview data—originally planned to contextualize implementation challenges—leaves an important gap in understanding participants’ perspectives. The authors should more clearly articulate how the small sample and missing data limit the robustness and generalizability of the findings.
They could also suggest realistic data collection strategies for future iterations (e.g., embedded follow-up questions during workshops, brief written reflections, group interviews, or analysis of participant-generated outputs such as MEL plans) to help mitigate such issues. Strengthening this aspect of the discussion would improve transparency and help other practitioners or researchers facing similar constraints.

Overall, this study addresses an important topic in a relevant context. If the authors can address the methodological and reporting issues outlined above, particularly regarding the data limitations and transparency, the manuscript could make a valuable contribution to the literature on evaluation capacity building in community service organisations.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: Yes: David Buetti

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free.
Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step. |
| Revision 1 |
|
Evaluation capacity building in a rural Victorian community service organisation: A formative evaluation

PONE-D-24-58659R1

Dear Dr. Kavanagh,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice will be generated when your article is formally accepted. Please note, if your institution has a publishing partnership with PLOS and your article meets the relevant criteria, all or part of your publication costs will be covered. Please make sure your user information is up-to-date by logging into Editorial Manager® and clicking the ‘Update My Information’ link at the top of the page. If you have any questions relating to publication charges, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible, and no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Haitao Shi
Academic Editor
PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1.
If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed
Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous.
Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: (No Response)

Reviewer #2: I believe all comments have been addressed by the authors, providing clarifications on ECB intervention methods, data collection, and other relevant details. This paper makes a significant contribution to existing ECB literature.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: Yes: David Buetti

**********
|
| Formally Accepted |
|
PONE-D-24-58659R1

PLOS ONE

Dear Dr. Kavanagh,

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now being handed over to our production team.

At this stage, our production department will prepare your paper for publication. This includes ensuring the following:

* All references, tables, and figures are properly cited
* All relevant supporting information is included in the manuscript submission
* There are no issues that prevent the paper from being properly typeset

You will receive further instructions from the production team, including instructions on how to review your proof when it is ready. Please keep in mind that we are working through a large volume of accepted articles, so please give us a few days to review your paper and let you know the next and final steps.

Lastly, if your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

If we can help with anything else, please email us at customercare@plos.org. Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of Dr. Haitao Shi
Academic Editor
PLOS ONE |
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.