Peer Review History

Original Submission
October 11, 2022
Decision Letter - Samane Shirahmadi, Editor

PONE-D-22-28106
Developing an organizational capacity assessment tool and capacity-building package for the National Center for Prevention and Control of Noncommunicable Diseases in Iran
PLOS ONE

Dear Dr. Amirhossein Takian,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by 23 March 2023. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Samane Shirahmadi, PhD

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf.

2. Thank you for stating the following in the Competing Interests section:

“AT and AO are members of the INCDC at the MoHME- Iran. We (AT and AO) have no conflict of interest to disclose. AB, EM, MB, and MR declare that they have no competing interests.”

We note that one or more of the authors are employed by a commercial company: INCDC at the MoHME- Iran

a. Please provide an amended Funding Statement declaring this commercial affiliation, as well as a statement regarding the Role of Funders in your study. If the funding organization did not play a role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript and only provided financial support in the form of authors' salaries and/or research materials, please review your statements relating to the author contributions, and ensure you have specifically and accurately indicated the role(s) that these authors had in your study. You can update author roles in the Author Contributions section of the online submission form.

Please also include the following statement within your amended Funding Statement.

“The funder provided support in the form of salaries for authors [insert relevant initials], but did not have any additional role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript. The specific roles of these authors are articulated in the ‘author contributions’ section.”

If your commercial affiliation did play a role in your study, please state and explain this role within your updated Funding Statement.

b. Please also provide an updated Competing Interests Statement declaring this commercial affiliation along with any other relevant declarations relating to employment, consultancy, patents, products in development, or marketed products, etc. 

Within your Competing Interests Statement, please confirm that this commercial affiliation does not alter your adherence to all PLOS ONE policies on sharing data and materials by including the following statement: "This does not alter our adherence to PLOS ONE policies on sharing data and materials.” (as detailed online in our guide for authors http://journals.plos.org/plosone/s/competing-interests). If this adherence statement is not accurate and there are restrictions on sharing of data and/or materials, please state these. Please note that we cannot proceed with consideration of your article until this information has been declared.

Please include both an updated Funding Statement and Competing Interests Statement in your cover letter. We will change the online submission form on your behalf.

3. We note that you have indicated that data from this study are available upon request. PLOS only allows data to be available upon request if there are legal or ethical restrictions on sharing data publicly. For more information on unacceptable data access restrictions, please see http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions.

In your revised cover letter, please address the following prompts:

a) If there are ethical or legal restrictions on sharing a de-identified data set, please explain them in detail (e.g., data contain potentially sensitive information, data are owned by a third-party organization, etc.) and who has imposed them (e.g., an ethics committee). Please also provide contact information for a data access committee, ethics committee, or other institutional body to which data requests may be sent.

b) If there are no restrictions, please upload the minimal anonymized data set necessary to replicate your study findings as either Supporting Information files or to a stable, public repository and provide us with the relevant URLs, DOIs, or accession numbers. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories.

We will update your Data Availability statement on your behalf to reflect the information you provide.

4. Your ethics statement should only appear in the Methods section of your manuscript. If your ethics statement is written in any section besides the Methods, please move it to the Methods section and delete it from any other section. Please ensure that your ethics statement is included in your manuscript, as the ethics statement entered into the online submission form will not be published alongside your manuscript.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: No

Reviewer #2: Yes

Reviewer #3: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: No

Reviewer #2: Yes

Reviewer #3: No

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

Reviewer #2: Yes

Reviewer #3: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: No

Reviewer #2: Yes

Reviewer #3: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Capacity assessment tools are increasingly, and effectively, used by health systems around the globe to assess and improve their capacities to respond to public health emergencies and other problems. Given the prominence of non-communicable diseases, a similar approach to NCD capacities, as described in this paper, is a promising idea. If it were validated, the authors’ tool could be useful in many countries beyond Iran, where it was developed and tested.

However, despite the title (“Developing an organizational capacity assessment tool …”), the parts of paper addressing the performance of the proposed tool are poorly described and I believe under-developed. The methods for assessing the tool’s validity and reliability are given in one-paragraph (Section 2.1) and the related results in another paragraph (the first paragraph in the Results section). Appendix A has only two paragraphs (those immediately preceding Table 2) on methods and results.

Overall, I find these paragraphs difficult to follow, both because of substantive reasons and also poor writing. In particular:

• Section 2.1 describes two expert teams, noting that “For the first expert team (18 experts), we selected 12 experts …” The second team has 6 experts, so perhaps that’s the difference, but it’s very hard to follow.

• More substantively, these teams are described in different ways in the main paper and the appendix, but I believe that some work within the relevant ministries and others do not. If this is true, the first group would be evaluating their own work, so there is a potential for bias. It would be interesting to see if both categories of experts had similar or different ratings, but it is not clear to me that this is done.

• The paper describes formal interviews with the six experts from the second team, but it is not clear whether or how they were used to assess the validity or reliability of the tool. Presumably the discussions on pp. 9-16 are based on these interviews, but these seem to be substantive, and unrelated to assessing the tool’s performance.

• The paragraph in the Results section refers to Appendix A, Table 1-6. The appendix has only 2 numbered tables, so I can only guess that “Table 1-6” refers to the six un-numbered domain-specific tables.

• Referring to “Table 2-Appendix A”, the Results section reports on “kappa values for all of the arias (areas?) we tested show”, but one sees only in the appendix that the results are based on only three substantive areas (physical activity, cardiovascular disease, and tobacco).

Because of the lack of clarity about methods and results, it is difficult for me to know that the tool is valid and performs well even in the setting in which it was tested.

Rather than validating the tool they developed, the authors spend the bulk of the paper in substantive discussion of the six domains (pp. 9-16), providing specific recommendations for organization change. It is not clear to me that these discussions are based on the questionnaire results; rather they seem to be based on the qualitative interviews. The Discussion and Conclusions (pp. 17-18) also primarily address substantive issues. As a methodologist who is not familiar with Iran, I have no way to assess the validity of this discussion. But even if they were valid, they would be of little interest to researchers or practitioners outside of Iran.

Reviewer #2: Thanks for choosing me as a reviewer.

The article is very valuable and constructively written; however, the following suggestions are made for improvement.

The study was written using a structure that made sense.

Introduction: Given the high prevalence of NCDs noted in the introduction, could you perhaps elaborate on how these conditions interact with the COVID-19 pandemic? Make an effort to use numbers to demonstrate the subject's importance.

More information can be found in the WHO research at https://www.who.int/publications-detail-redirect/9789240010291.

Even when the questions on the COREQ checklist have been addressed, it is still advised to complete and submit this checklist. The table title ("Reported on Page #") invites the authors to provide the page numbers that correspond to each specific item on the checklist.

Best Regards

Reviewer #3: Thank you for giving me the opportunity to read this valuable research. In this research, a tool has been designed to assess the capacity of the National Center for Non-Communicable Disease, and in another part of the study, the NCNCD's organizational capacity has been assessed. The design of the study is mixed-methods, with qualitative and quantitative parts.

Here are some comments:

Abstract:

- Please do not use abbreviations in the abstract.

- Please describe clearly and separately, in the design section, what you did in the quantitative and qualitative parts of your study.

- What is the type of your mixed method design (exploratory? explanatory?)?

- How did you use and combine the findings of the two parts of the study? Describe this in the methods and results.

- The conclusion did not cover all the findings of your study.

Introduction

- The knowledge gap is not well discussed in the introduction section.

- You should report the previous related tools in your research area and explain why you wanted to design a new tool.

- The goal is not clearly stated.

Method

- What is the type of your mixed-methods design (exploratory? explanatory?)? Why did you select a mixed-methods approach for your research? Please describe it. What is the qualitative phase, what is the quantitative phase, and how are these two phases related and integrated?

- It is not clear what the goal of the literature review (step 1 in the method) is, or how this step relates to the other steps of the study. Is your goal item generation for the tool?

- 2. Designing Tool (Organizational Capacity Assessment Tool, OCAT):

Is there a previously published related tool?

What was your search strategy? How did you check the quality of the studies? What were your keywords? What documents did you search? What languages did the documents have? What were your time limits? Which studies were searched? What were your inclusion and exclusion criteria for the studies? You should provide the PRISMA checklist for your research.

Why did you check only the CVI? Why didn’t you check other validity properties? Why did you select 18 experts? How did you calculate the CVI? Why did you select only 6 experts for the reliability assessment? How did you check the inter-rater agreement, and which coefficient did you calculate?

Which type of tool did you design (checklist, questionnaire, scale, …)?

- The descriptions you have provided in Sections 3.1 and 3.3 relate to the tool-design step. Instead of these two steps, you could provide a single step for data gathering via the tool designed in the previous section.

- Interviews with NCNCD and expert team two:

How did you ensure data saturation? What was the method of thematic analysis? Were the interviews recorded? How did you check the rigor of the qualitative phase of your study?

- Data analysis: please explain thematic content analysis in more detail, with a reference.

Result:

- You reported the CVR in tables, but you did not describe its assessment method in the method section.

- You should report how many items you generated initially, and how many items were deleted in the process of content validity assessment.

- You should describe in the method section how, and which, kappa you assessed, and then report the result only in the results section.

- You should report how many codes, subcategories, main categories, and themes you obtained, and how the findings were abstracted (it is better to report this process as a table or figure).

- You should also show in the findings section how the findings of the two parts of your study relate to each other.

- You should report the analysis of the findings gathered with the tool.

Discussion:

- In the first paragraph of the discussion section, restate the goal of your study and summarize the main findings.

- What were your limitations? What are your recommendations for further research? What are the implications of your findings for practice?

Conclusion

- The conclusion did not cover all the findings of your study.

Appendix:

- The tables do not have numbers.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Michael A Stoto

Reviewer #2: Yes: Dr Samad Azari

Reviewer #3: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Revision 1

April 20, 2023

Dear Editors,

PLOS ONE

Re: PONE-D-22-28106

Developing an organizational capacity assessment tool and capacity-building package for the National Center for Prevention and Control of Noncommunicable Diseases in Iran

Dear Editor,

Thank you and the reviewers very much for your recent comments, which provided us the opportunity to improve our manuscript. We are pleased to inform you that we have addressed all the comments in the order in which they were raised, as you will find below.

--------------------------------

Comments to the Author

Reviewer #1:

1. Capacity assessment tools are increasingly, and effectively, used by health systems around the globe to assess and improve their capacities to respond to public health emergencies and other problems. Given the prominence of non-communicable diseases, a similar approach to NCD capacities, as described in this paper, is a promising idea. If it were validated, the authors’ tool could be useful in many countries beyond Iran, where it was developed and tested.

Re: Thank you. Non-communicable diseases are the leading cause of premature death, as you mentioned, so researchers, managers, and policymakers must pay close attention to this issue. In WHO statements, this need has been referred to numerous times. The Validity section has been revised to accommodate this comment.

2. However, despite the title (“Developing an organizational capacity assessment tool …”), the parts of paper addressing the performance of the proposed tool are poorly described and I believe under-developed. The methods for assessing the tool’s validity and reliability are given in one-paragraph (Section 2.1) and the related results in another paragraph (the first paragraph in the Results section). Appendix A has only two paragraphs (those immediately preceding Table 2) on methods and results. Overall, I find these paragraphs difficult to follow, both because of substantive reasons and also poor writing. In particular:

• Section 2.1 describes two expert teams, noting that “For the first expert team (18 experts), we selected 12 experts …” The second team has 6 experts, so perhaps that’s the difference, but it’s very hard to follow.

• More substantively, these teams are described in different ways in the main paper and the appendix, but I believe that some work within the relevant ministries and others do not. If this is true, the first group would be evaluating their own work, so there is a potential for bias. It would be interesting to see if both categories of experts had similar or different ratings, but it is not clear to me that this is done.

• The paper describes formal interviews with the six experts from the second team, but it is not clear whether or how they were used to assess the validity or reliability of the tool. Presumably the discussions on pp. 9-16 are based on these interviews, but these seem to be substantive, and unrelated to assessing the tool’s performance.

Re: Thank you for your comments.

One of the crucial steps in the development of a capacity assessment tool is, as you mentioned, ensuring its validity and reliability. Perhaps because we combined three phases (listed below), the tool's reliability and validity were not stated clearly:

• Design and validity of the tool

• Tool scoring by stakeholders and the results

• Discussing challenges, opportunities, and solutions using a qualitative approach

Each of these might stand alone as a separate article.

This study is the third part of a larger study; the first two parts have been published and can be found at the links below.

1. Assessment and prioritization of the WHO “best buys” and other recommended interventions for the prevention and control of non-communicable diseases in Iran, (Link)

2. Intersectoral collaboration in the management of non-communicable disease’s risk factors in Iran: stakeholders and social network analysis (Link)

We used multiple expert teams during the various phases of the study:

• During tool design (expert team number one) (Table 1 appendix A)

• During the tool validity and reliability assessment (expert team number two) (Table 2 appendix A)

• While identifying challenges and opportunities (qualitative phase-expert team number three) (Table 3 appendix A).

Cognitive interviews, the CVI, and the CVR were used to assess validity. We added the profiles of the experts who participated in the validity and reliability assessment of the tool. The validity and reliability section was revised in the main text and Appendix A; please see the validity sections of Appendix A and the main text. We selected 12 experts for validity and 3+6 experts for reliability. The kappa values show moderate (0.4–0.6) to strong (0.6 and higher) inter-rater agreement. In the designed tool, experts assign a score to each of the sub-dimensions or questions. Tables 6–11 of Appendix A show the degree of validity for each subdomain. Appendix A, Table 5 shows the reliability values. Many capacity assessment tools use the kappa test for reliability measurement. Please see the link below for more information.

Link : https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3900052/
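For reference, the reply above cites the CVI, CVR, and kappa without spelling out the formulas. The standard definitions (assuming Lawshe's CVR, the conventional item-level CVI, and Cohen's kappa; the manuscript may use different variants) are:

```latex
% Lawshe's content validity ratio for one item, where N experts rate
% the item and n_e of them rate it "essential":
\mathrm{CVR} = \frac{n_e - N/2}{N/2}

% Item-level content validity index: the fraction of the N experts who
% rate the item relevant (3 or 4 on a 4-point relevance scale):
\mathrm{I\text{-}CVI} = \frac{n_{3,4}}{N}

% Cohen's kappa for inter-rater agreement, from observed agreement p_o
% and expected chance agreement p_e:
\kappa = \frac{p_o - p_e}{1 - p_e}
```

For example, with the 12 validity experts mentioned above, an item rated essential by 10 of the 12 would give CVR = (10 − 6)/6 ≈ 0.67, which would then be compared against Lawshe's critical value for a 12-member panel.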

Self-assessment has been widely used in the past for measuring organizational capacity. The evaluated organization's human resources (experts and managers) examine the capacity of the organization to fulfill its duties, and the capacity of the organization, rather than individual performance, is measured. Examples can be found in the links below.

1. https://usaidlearninglab.org/resources/organizational-capacity-assessment (Description section)

2. https://www.ngoconnect.net/sites/default/files/resources/Organizational%20Capacity%20Self-Assessment%20Tool%20-%20Training%20Guidelines.pdf

3. https://www.jsi.com/resource/organizational-capacity-assessment-oca-tool-participants-copy/

As you can see, many tools share a common set of main domains, and in the initial cycle of capacity assessment the subdomains focus on the key areas. Subdomain questions may change in subsequent cycles of capacity assessment, because it is a continuous process.

We have made the boundaries between the quantitative and qualitative phases clearer. This is an explanatory sequential design, with a quantitative phase followed by a qualitative phase.

3. The paragraph in the Results section refers to Appendix A, Table 1-6. The appendix has only 2 numbered tables, so I can only guess that “Table 1-6” refers to the six un-numbered domain-specific tables.

Re: Thank you. Every table title in Appendix A has been revised. Please see Appendix A, Tables 6, 7, 8, 9, 10, and 11. Additionally, these tables report the validity (CVI and CVR) of each subdomain, based on the 12 experts.

4. Referring to “Table 2-Appendix A”, the Results section reports on “kappa values for all of the arias (areas?) we tested show”, but one sees only in the appendix that the results are based on only three substantive areas (physical activity, cardiovascular disease, and tobacco).

Re: Thank you. Revised. Since the framework, questions, and scoring processes in the domains and subdomains are the same across all 7 areas, the study team believed that repetition within an area (two experts) was a more important criterion for measuring reliability. Therefore, we ran the test in these three areas, twice in each area (two experts scoring each area separately).

5. Because of the lack of clarity about methods and results, it is difficult for me to know that the tool is valid and performs well even in the setting in which it was tested.

Re: We revised the method and findings sections and hope that they will be convincing to the reviewers. Please see the Method and Results sections and Appendix A. We are pleased to inform you that the Ministry of Health and Medical Education of Iran has put some of our recommended interventions (from the qualitative phase) on its agenda, including strengthening the organizational structure.

6. Rather than validating the tool they developed, the authors spend the bulk of the paper in substantive discussion of the six domains (pp. 9-16), providing specific recommendations for organization change. It is not clear to me that these discussions are based on the questionnaire results; rather they seem to be based on the qualitative interviews. The Discussion and Conclusions (pp. 17-18) also primarily address substantive issues. As a methodologist who is not familiar with Iran, I have no way to assess the validity of this discussion. But even if they were valid, they would be of little interest to researchers or practitioners outside of Iran.

Re: We hope the revisions we made have been able to address your concerns. Pages 9–16 contain information from the study's qualitative phase. The qualitative phase was carried out in order to closely examine the organizational weaknesses (as determined by the tool). To make the distinction clear to readers, we changed the headings and defined the boundaries between the quantitative and qualitative phases.

The recommended interventions in the findings and discussion section seem more appropriate for developing countries, where the structures are not yet fully developed and there are severe financial and human resource shortages.

Reviewer #2:

1. Introduction: Given the high prevalence of NCDs noted in the introduction, could you perhaps elaborate on how these conditions interact with the COVID-19 pandemic? Make an effort to use numbers to demonstrate the subject's importance.

More information can be found in the WHO research at https://www.who.int/publications-detail-redirect/9789240010291.

Re: Thank you for your thoughtful comment. Unfortunately, the COVID-19 crisis caused significant disruption in the delivery of NCD prevention and treatment services. Furthermore, many of those who died as a result of COVID-19 had NCDs or related risk factors. This was mentioned briefly in the introduction. Please see paragraph 3 on page 3.

2. Even when the questions on the COREQ checklist have been addressed, it is still advised to complete and submit this checklist. The table title ("Reported on Page #") invites the authors to provide the page numbers that correspond to each specific item on the checklist.

Re: Thank you. We completed the checklist for the qualitative parts of the study as Appendix B. Please see Appendix B.

Reviewer #3:

Abstract:

1. Please do not use abbreviations in the abstract.

Re: Thank you; we revised the abstract. Please see the abstract.

2. Please describe what you had done in quantitative and qualitative sections of your study in clear form in design section, separately.

- What is the type of your mixed method design (exploratory? explanatory?)?

Re: Revised in the abstract and method section. Please see the abstract and method on page 2, and the first paragraph of the method section on page 4.

3. How do you use and combine the findings of the two parts of the study? Describe it in the method and result.

Re: Please see the method part of the abstract, the first paragraph of the method, the first paragraph of Section 4 (Qualitative phase), and the first paragraph of each domain in the results section.

4. The conclusion did not cover all the findings of your study.

Re: Revised; please see the conclusion of the abstract.

Introduction

5. The knowledge gap is not well discussed in the introduction section.

- You should report the previous related tools in your research area and explain why you wanted to design a new tool.

- The goal is not clearly stated.

Re: Done. Please see the last paragraph of the introduction.

Method

6. What is the type of your mixed-methods design (exploratory? explanatory?)? Why did you select a mixed-methods approach for your research? Please describe it. What is the qualitative phase, what is the quantitative phase, and how are these two phases related and integrated?

Re: Revised in the abstract and method section. Please see the abstract and method on page 2, and the first paragraph of the method section on page 4.

7. It is not clear what the goal of the literature review (step 1 in the method) is, or how this step relates to the other steps of the study. Is your goal item generation for the tool?

Re: One of the first steps in capacity assessment and capacity building, recommended in many tools, is initial familiarization with the dimensions of capacity assessment, general knowledge of the related subject (here, NCDs), and basic knowledge of the organization under investigation. The literature review was carried out as a step in that direction.

8. Designing Tool (Organizational Capacity Assessment Tool, OCAT):

Is there a previously existing related tool?

Re: There is no comprehensive tool available in the field of NCDs for measuring the organizational capacity of the institution in charge of this topic.

There is a survey questionnaire on the topic of national capacity, but its dimensions differ from those of the tool created for this study, and it does not emphasize organizational capability.

The WHO questionnaire inquires about the existence of a response strategy for a disease, while our tool looks for the team responsible for developing the strategy and the degree of adherence to it.

While our tool asked about the approved organizational structure, parallel institutions, and the institution's position in the overall structure, the WHO questionnaire only inquired about the existence of an institution for NCDs within the ministry. The WHO survey questionnaire was reviewed as one of the key documents, and its most crucial points were covered in more depth in our tool from the organizational capacity perspective.

Please see the following link.

Link click here

9. What was your search strategy? How did you check the quality of the studies? What were your keywords? What documents did you search for? In what language were the documents? What were your time limits? Which studies were searched? What were your inclusion and exclusion criteria for the studies? You should provide the PRISMA checklist for your research.

Re: Revised. Please see Appendix A, Box B.

10. Why did you check only the CVI? Why didn't you check other validity properties? Why did you select 18 experts? How did you calculate the CVI? Why did you select only 6 experts for reliability assessment? How did you check the inter-rater agreement, and which coefficient did you calculate?

which type of the tool did you design? (checklist, questionnaire, scale,…)

Re: Cognitive interviews, CVI, and CVR were used to assess validity. We added the profiles of the experts who participated in assessing the validity and reliability of the tool. The validity and reliability section was revised in the main text and Appendix A; please see the validity sections of Appendix A and the main text. We selected 12 experts for validity and 3+6 experts for reliability. The kappa values show moderate (0.4–0.6) to strong (0.6 and higher) inter-rater agreement. In the designed tool, experts assign a score to each of the sub-dimensions or questions.

11. The descriptions you have provided in sections 3.1 and 3.3 relate to the tool-design step. Instead of these two steps, you could provide a single step for data gathering via the tool designed in the previous section.

Re: Revised. Please see section 3.1, first paragraph.

12. Interviews with NCNCD and expert team two:

How did you ensure data saturation? What was the method of thematic analysis? Were the interviews recorded? How did you check the rigor of the qualitative phase of your study?

- Data analysis: please explain the thematic content analysis in more detail, with a reference.

Re: Revised; please see section 4, Qualitative phase. After including the expert teams from the earlier phases, the third expert team is now relevant for this phase.

Result:

13. You reported the CVR in the tables, but you did not describe its assessment method in the method section.

Re: Revised; please see the validity section of the method.

14. You should report how many items you generated initially and how many items were deleted during the content validity assessment.

Re: Thank you. Revised; please see Appendix A, Validity section: "Three of the 18 sub-dimensions or tool questions were modified and their validity was assessed again (including subdomains 3.3, 5.1, and 5.3)."

15. You should describe how and which kappa you assessed in the method section, and then report the result only in the result section.

Re: Please see the reliability section of Appendix A.

16. You should report how many codes, subcategories, main categories, and themes you gathered, and describe the process of abstracting the findings (it is better to report this process as a table or figure).

Re: Revised; please see page 8: "Six main themes (domains) and 18 sub-themes (subdomains) were utilized for arranging 438 codes."

17. You should also show, in the findings section, how the findings of the two parts of your study relate to each other.

Re: Thank you. Revised. Please see the first paragraph on page 10

18. You should report the analysis of the findings gathered with the tool.

Re: Thank you. Revised. Please see the last paragraph on page 8

Discussion:

19. In the first line and paragraph of the discussion section, summarize the goal of your study and the main findings.

Re: Thank you. Revised. Please see first line and paragraph of the discussion section.

20. What were your limitations? What are your recommendations for further research? What are the implications of your findings for practice?

Re: As previously stated, capacity assessment and capacity building are dynamic processes that must be repeated after the initial intervention. For example, given the changes likely to occur in the structure of the Ministry of Health, the capacity assessment must be repeated.

Conclusion

21. The conclusion did not cover all the findings of your study.

Re: Revised. Please see Conclusion.

Appendix:

22. The table did not have numbers.

Re: Thank you. Revised. Please see Appendix A.

------------------------------------------------------------------------------------------

Again, thank you very much for providing us with the opportunity to improve our work. We hope that you and your team will find the revisions up to your satisfaction and look forward to your decision in due course.

Yours sincerely,

Amirhossein Takian, MD MPH Ph.D. FHEA

Professor and Head, Department of Global Health and Public Policy

School of Public Health, Tehran University of Medical Sciences, Tehran, Iran

Corresponding author

Attachments
Attachment
Submitted filename: Response to Reviewers 1.docx
Decision Letter - Samane Shirahmadi, Editor

Developing an organizational capacity assessment tool and capacity-building package for the National Center for Prevention and Control of Noncommunicable Diseases in Iran

PONE-D-22-28106R1

Dear Dr. Amirhossein Takian,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Samane Shirahmadi, PhD

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #2: All comments have been addressed

Reviewer #3: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #2: Yes

Reviewer #3: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #2: Yes

Reviewer #3: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #2: Yes

Reviewer #3: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #2: Yes

Reviewer #3: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #2: (No Response)

Reviewer #3: Please edit Appendix A: some numbers are written in Farsi.

All the comments have been addressed

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #2: Yes: Samad Azari

Reviewer #3: No

**********

Formally Accepted
Acceptance Letter - Samane Shirahmadi, Editor

PONE-D-22-28106R1

Developing an organizational capacity assessment tool and capacity-building package for the National Center for Prevention and Control of Noncommunicable Diseases in Iran

Dear Dr. Takian:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Samane Shirahmadi

Academic Editor

PLOS ONE

Open letter on the publication of peer review reports

PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.

We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.

Learn more at ASAPbio.