Peer Review History
Original Submission (May 26, 2024)
PONE-D-24-20917
Protocol for Developing a COnsolidated Checklist for Reporting MIXed-Methods Research (CORMIX) Using Modified Delphi
PLOS ONE

Dear Dr. Jaam,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Sep 06 2024 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Sergi Fàbregues
Academic Editor
PLOS ONE

Journal requirements:

1. When submitting your revision, we need you to address these additional requirements. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf.

2. Please provide additional details regarding participant consent. In the ethics statement in the Methods and online submission information, please ensure that you have specified (1) whether consent will be informed and (2) what type of consent you will obtain (for instance, written or verbal, and if verbal, how it will be documented and witnessed).

3. When completing the data availability statement of the submission form, you indicated that you will make your data available on acceptance. We strongly recommend all authors decide on a data sharing plan before acceptance, as the process can be lengthy and hold up publication timelines.
Please note that, though access restrictions are acceptable now, your entire data will need to be made freely accessible if your manuscript is accepted for publication. This policy applies to all data except where public deposition would breach compliance with the protocol approved by your research ethics board. If you are unable to adhere to our open data policy, please kindly revise your statement to explain your reasoning and we will seek the editor's input on an exemption. Please be assured that, once you have provided your new statement, the assessment of your exemption will not hold up the peer review process.

Editor Comments:

- On page 4, lines 62-66, the authors imply that mixed methods research (MMR) leads to a better understanding of the phenomenon under study and increases the quality of the results. Conversely, mixed methods do not increase the quality of a study but rather allow researchers to answer complex questions, among other things.
- Throughout the paper, the authors use the term triangulation as a synonym for integration. The MMR literature describes triangulation as one of the possible rationales for MMR; therefore, using the term to describe integration is inaccurate.
- The authors do not cite the journal article reporting criteria for MMR developed by the American Psychological Association (Levitt et al., 2018). This omission is concerning because these standards fill two important gaps identified by the authors: (1) they were developed by a task force group assembled specifically for this purpose and (2) they provide a detailed and comprehensive set of reporting guidelines for MMR.
- The Methods section needs to be much more detailed, especially the literature review section, which is weak in terms of reporting. The authors must specify the exact search query that they will use. Additionally, what are the inclusion and exclusion criteria?
- It is unclear why the terms quantitative and qualitative will be used as separate terms in the search (instead of: (quantitative AND qualitative)), and why articles discussing checklists for quantitative and qualitative research will also be included in the review. There are a large number of checklists in both methodologies and reviewing them all will certainly involve a lot of unnecessary work. Moreover, why review these checklists if the goal is to develop reporting guidelines for MMR?
- As noted by one of the reviewers, the authors seem to confuse reporting quality with methodological quality. For instance, the MMAT is a tool to assess the methodological quality of MMR studies, not the reporting quality, as the authors currently imply.
- On page 8 the authors describe the weaknesses of existing reporting tools. Specifically, they mention that they are based on the views of the authors, have limited guidance for use, and do not use a robust methodology. These statements should be substantiated. For example, the MMAT is based on a Delphi study with experts (not the views of the authors) and has a detailed manual describing how to use the tool (therefore, it does not have limited guidance for use).
- As one of the reviewers argues, what is the function of ChatGPT?
- The sampling strategy has several important weaknesses. Selecting the sample from personal contacts and snowball sampling may miss MMR researchers with important expertise on the topic. Instead, a strategy to identify researchers based on a literature search would be much more appropriate and accurate. Furthermore, why should content experts have a rank of associate professor or higher? How will consumer experience with mixed methods be determined? In addition, we know from the literature that researchers conceptualize and operationalize MMR quality differently across disciplines. Furthermore, each discipline has different reporting standards.
How will the issue of disciplinary differences be addressed in the sample?
- In conclusion, this protocol has a number of important issues that need to be addressed by the authors. Acceptance of the paper will depend on the authors' ability to address these issues and to make substantial changes in some parts of the study design.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Does the manuscript provide a valid rationale for the proposed study, with clearly identified and justified research questions? The research question outlined is expected to address a valid academic problem or topic and contribute to the base of knowledge in the field.

Reviewer #1: Yes
Reviewer #2: Partly
Reviewer #3: Partly

**********

2. Is the protocol technically sound and planned in a manner that will lead to a meaningful outcome and allow testing the stated hypotheses? The manuscript should describe the methods in sufficient detail to prevent undisclosed flexibility in the experimental procedure or analysis pipeline, including sufficient outcome-neutral conditions (e.g. necessary controls, absence of floor or ceiling effects) to test the proposed hypotheses and a statistical power analysis where applicable. As there may be aspects of the methodology and analysis which can only be refined once the work is undertaken, authors should outline potential assumptions and explicitly describe what aspects of the proposed analyses, if any, are exploratory.

Reviewer #1: Yes
Reviewer #2: Partly
Reviewer #3: No

**********

3. Is the methodology feasible and described in sufficient detail to allow the work to be replicable? Descriptions of methods and materials in the protocol should be reported in sufficient detail for another researcher to reproduce all experiments and analyses.
The protocol should describe the appropriate controls, sample size calculations, and replication needed to ensure that the data are robust and reproducible.

Reviewer #1: Yes
Reviewer #2: No
Reviewer #3: No

**********

4. Have the authors described where all data underlying the findings will be made available when the study is complete? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception, at the time of publication. The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No
Reviewer #2: No
Reviewer #3: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #3: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above and, if applicable, provide comments about issues authors must address before this protocol can be accepted for publication. You may also include additional comments for the author, including concerns about research or publication ethics. You may also provide optional suggestions and comments to authors that they might find helpful in planning their study. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: I am of two minds about this article.
First, I am not a supporter of checklists in non-quantitative research. Second, if one were to develop such a checklist for mixed methods research, the Delphi procedure that the authors propose would be well-suited to that task.

The origin of my doubts with checklists comes from their use in qualitative research. In particular, the authors refer to SRQR and COREQ without considering the controversy surrounding such “check the box” ratings of qualitative research (for an early critique, see Barbour, 2001, which has over 4,000 citations in Google Scholar). The underlying problem is that qualitative methods are so diverse that no one set of criteria can apply to even a majority of the most widely used approaches (e.g., grounded theory, interpretive phenomenological analysis, reflexive thematic analysis, and qualitative content analysis). The broad range of criteria that these methods use to assess rigor is quite different from the far more limited criteria for quantitative methods such as surveys and experiments. This debate around the use of checklists in qualitative methods poses at least two challenges for these authors: Will they limit their Delphi participants to researchers who agree with their preference for universal standards? And, will they try to apply the same standards to both the qualitative and quantitative methods in their instrument?

Moving on to issues related to the authors' proposed methods, I am quite familiar with Delphi studies, and I found the design and description of the current study to be very thorough, with one small exception. On p. 11, lines 222-223, there is a reference to having the second round of the expert panel review comments from the first round, but there was no mention of collecting such comments in the first round. I assume this is just an omission in describing the first round, since it is traditional in a Delphi to collect such comments along with the numeric ratings.
In contrast, I was less satisfied with the section on Finalizing the instrument (p. 13), which relies on a sample of five articles from “different fields.” First, five seems like a very low number, given that the authors expect their checklist to have widespread utilization. Second, I would prefer to see some systematic procedure for selecting those articles.

Overall, the nature of this article as a protocol leads me to emphasize the strengths of the proposed methods for meeting the authors' stated goals. Still, I think the authors should be more reflexive in their thinking about whether a set of check-the-box criteria are indeed adequate for such a complex task as assessing reporting quality in mixed methods research.

Reference: Barbour, R.S. (2001). Checklists for improving rigour in qualitative research: A case of the tail wagging the dog? BMJ, 322:1115-1117.

Reviewer #2:

General Comments/Summary of the paper

This manuscript presents a protocol for the development of a reporting guideline for mixed methods research. The guideline that will be developed from this project can be relevant for those interested in mixed methods research. The main comments concern the confusion between reporting and methodological quality and the lack of information in the methods section. The comments are detailed below.

Main comments

1. The authors use the terms “mixed methods”, “mixed-method” and “mixed-methods” interchangeably throughout the manuscript (text, table and figure). Textbooks usually use “mixed methods” without the hyphen and with an S at the end of method.

2. There is a confusion between reporting and methodological quality tools in the introduction (page 5) and initial phase (page 8). For example, they cited the MMAT, QATSDD and QuADS as existing tools that they found (line 137). Yet, these tools were not developed as reporting guidelines but for appraising the methodological quality of studies.
Reporting guidelines will usually include more criteria to ensure that the information is complete and clear. There are more than 600 reporting guidelines on the EQUATOR Network. More than a dozen mixed methods reporting tools can be found on the EQUATOR Network, such as the MMR-RHS checklist and ASSESS tool. The authors should focus on these tools to justify their project and develop their questionnaire. There are also other reporting guidances for mixed methods that the authors can rely on, such as:
- Brown, K., Elliott, S., Leatherdale, S., & Robertson-Wilson, J. (2015). Searching for rigour in the reporting of mixed methods population health research: A methodological review. Health Education Research, 30(6), 811-839.
- Leech, N., & Onwuegbuzie, A. (2010). Guidelines for conducting and reporting mixed research in the field of counseling and beyond. Journal of Counseling and Development, 88(1), 61-70.
- Burrows, T. J. (2013). A preliminary rubric design to evaluate mixed methods research. Ph.D. diss., Virginia Polytechnic Institute and State University. https://www.proquest.com/dissertations-theses/preliminary-rubric-design-evaluate-mixed-methods/docview/1513244026/se-2

The authors should consider reviewing this sentence: “…these tools use different sets of criteria based on author's views and experts opinions rather than robust methodology. Standardized scoring system is lacking, and subjective judgement is made by accessors, hence, inter-rater reliability may be inconsistent. (lines 141-143)”. Some tools have been developed using a rigorous process (including Delphi and qualitative interviews). Also, inter-rater reliability is not necessarily a recommended step for the development of a reporting guideline (Moher et al., 2010).

3. More information on the research team is needed (page 8). How many persons are part of the team? Are they junior/senior researchers or trainees? What is the expertise of each member? Do they all have expertise/experience in mixed methods?
Based on the reference list, it seems that only the last author has expertise in mixed methods.

4. In figure 1 (page 7), there is a step (“Step 6: Conduct a face-to-face meeting with the research team and all experts to finalise the CORMIX tool”) that is not described in the text. More information is needed on how this step will be conducted. Who are “all the experts” (e.g., those that participated in the Delphi?)?

5. Initial phase (page 9):
1) Seven databases are listed on line 152. Yet, ProQuest is a platform that provides access to several databases. The authors should indicate which databases from ProQuest have been used;
2) The search strategy is not detailed. Did they consult a specialized librarian to develop the search strategy? Did they use Boolean, truncation or proximity operators? What time frame was covered in the search strategy?;
3) The authors mentioned searching the grey literature (line 153). Is the search of the grey literature limited to Google Scholar? If not, please add the other sources used. Also, the “major journals” searched should be listed (line 153);
4) The authors mentioned that the data will be extracted using a “Standardised data extraction sheet” (line 161). More information on what data will be extracted is needed;
5) The authors mentioned, “The use of AI to generate item Chatbot, equipped with language processing, can develop items that can be beneficial for CORMIX. Items generated by ChatBot will be compared with items found from the literature search to supplement the findings. To ensure comprehensiveness, the prompt questions will be repeated in different timestamps and contexts.” (lines 164-167). This method is innovative but would benefit from more precision. For example, what will be asked to generate the items? Is this process replicable by other researchers (e.g., if another researcher uses ChatBot with the same questions, will the same items be generated?)?
Also, what is meant by “repeated in different timestamps and contexts”?

6. Modified Delphi phase (page 10):
1) More information on how the experts will be identified and recruited is needed;
2) The authors mentioned aiming for a minimum of 20 experts. What is the rationale behind this number? Is this number after the 1st round or after three rounds? They mentioned that more than 20 will be recruited (line 181). How many more?;
3) The recruitment is expected to start in July 2024 and data collection in August 2024. Thus, the questionnaire and phase 1 should by then be completed. The authors could provide more detailed information on the Initial phase and questionnaire development;
4) The authors mentioned that “(6) An online meeting will be held with the experts between the Delphi rounds to share the main results” (lines 188-189). This is not a typical step in a Delphi method. The results of a round are usually shared at the beginning of the questionnaire of the next round. Could this online meeting influence anonymity (a characteristic of a Delphi method)?;
5) It is mentioned that the research team will perform a face and content validation of the questionnaire (line 197). How many persons will rate each item? What is their expertise?

7. Finalizing the CORMIX Phase (page 13): How will the raters (2 PhD students and 5 mixed methods researchers) be recruited? How will the 5 published mixed methods studies be identified? Will they choose well- and poorly-reported papers? How will the issues or limitations identified by the raters be collected (e.g., using a pre-developed questionnaire vs during an interview or focus group)?

8. Guidance statement: It is mentioned that the guidance statement “will undergo pilot testing and feedback from intended end-users” (lines 248-249). More information on how this testing will be conducted is needed (how will they be recruited and contacted, how many, how will data be collected, …).

9.
Explanation and Elaboration (E&E) Document: Will the E&E also be submitted to testing and feedback like the guidance statement?

10. The authors should be careful in their use of the word “triangulation” (lines 66, 76, 147). Triangulation refers to the use of multiple methods or sources of data. In the context of their sentences, they seem to refer more to integration than triangulation. For example, in this sentence, “These reported limitations highlight that researchers merely use mixed methods as a means to collect large volume of qualitative and quantitative data with limited triangulation.”, the word triangulation could be replaced by “integration”.

Minor comments
- Line 80: write EQUATOR with capital letters.
- Line 112: replace EQUATER by EQUATOR.
- Line 118: remove “1.” at the end of the sentence.
- Line 281: Reference #5 is incomplete.

Reviewer #3: More information is needed about the initial phase to clarify the process for the reader, including what the Rayyan tool is, how using ChatGPT benefits the project at this phase, and what you will do with the ChatGPT results.

Expert selection: The expert selection process is a bit confusing. First, just because someone is an editor of a journal that publishes mixed methods research does not mean that they are an expert in mixed methods quality (unless you are only recruiting from methodological journals such as JMMR, but that does not appear to be the case). Additionally, requiring that content experts/academics be at the rank of associate professor or higher may be limiting. In some fields, mixed methods research is relatively new and assistant professors may know more about it than full professors. For the consumers, how will you know they have a minimum of 3 years of experience in using and publishing mixed methods research?

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?).
If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No
Reviewer #3: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
Revision 1
PONE-D-24-20917R1
Protocol for Developing a COnsolidated Checklist for Reporting MIXed-Methods Research (CORMIX) Using Modified Delphi
PLOS ONE

Dear Dr. Jaam,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

==============================
The authors have addressed most of the issues raised in the first round of reviews. However, several important points still need to be clarified. Reviewer 2 highlights a number of issues in the reporting of the first phase, while reviewer 1 notes others specifically related to the Delphi phase.
==============================

Please submit your revised manuscript by Dec 01 2024 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Sergi Fàbregues
Academic Editor
PLOS ONE

Additional Editor Comments:

- Provide a rationale for the feasibility of the review and questionnaire development according to the stated timetable and clarify whether these stages have been completed.
- On line 94 you say: “Attempts have been made to formulate reporting guidelines for mixed methods research to improve quality, but comprehensive standards are still lacking”. What do you mean by comprehensive? Please explain further why the published reporting frameworks are not comprehensive.
- On line 106, when discussing the JARS, you state: “While more comprehensive than GRAMMAS, it does not provide the level of detail some researchers might need for comprehensive reporting across all aspects of a mixed methods study”. Please explain this statement further, as we believe that the JARS is quite detailed.
- On line 124, you state that the tool will address “the philosophical underpinnings of mixed methods”. What are these underpinnings? How will they be addressed?
- Include the full search strategy used or planned as a supplementary file.
- Clarify the issues raised by reviewer 2 regarding the screening process and the selection of checklists for review.
- If the review has been completed, please indicate the implications of not following the reviewers' suggestions and how these issues will be addressed. We believe that the reviewers' comments are important enough to be considered.
- Provide more details on how mixed methods experts will be identified, both in terms of the literature to be searched and the criteria for selecting experts.
- Consider reviewer 1's suggestions regarding measurement tools, criteria for agreement, and other methodological issues.
- Clarify how the 10 articles for the pilot will be selected.
- Respond to any other issues raised by the reviewers.
- There are several typographical and grammatical errors throughout the paper. These need to be corrected: e.g. lines 64-65, 101-102.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Does the manuscript provide a valid rationale for the proposed study, with clearly identified and justified research questions? The research question outlined is expected to address a valid academic problem or topic and contribute to the base of knowledge in the field.

Reviewer #1: Yes
Reviewer #2: Yes

**********

2. Is the protocol technically sound and planned in a manner that will lead to a meaningful outcome and allow testing the stated hypotheses? The manuscript should describe the methods in sufficient detail to prevent undisclosed flexibility in the experimental procedure or analysis pipeline, including sufficient outcome-neutral conditions (e.g. necessary controls, absence of floor or ceiling effects) to test the proposed hypotheses and a statistical power analysis where applicable.
As there may be aspects of the methodology and analysis which can only be refined once the work is undertaken, authors should outline potential assumptions and explicitly describe what aspects of the proposed analyses, if any, are exploratory.

Reviewer #1: Yes
Reviewer #2: Partly

**********

3. Is the methodology feasible and described in sufficient detail to allow the work to be replicable? Descriptions of methods and materials in the protocol should be reported in sufficient detail for another researcher to reproduce all experiments and analyses. The protocol should describe the appropriate controls, sample size calculations, and replication needed to ensure that the data are robust and reproducible.

Reviewer #1: Yes
Reviewer #2: Yes

**********

4. Have the authors described where all data underlying the findings will be made available when the study is complete? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception, at the time of publication. The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: No

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

6.
Review Comments to the Author

Please use the space provided to explain your answers to the questions above and, if applicable, provide comments about issues authors must address before this protocol can be accepted for publication. You may also include additional comments for the author, including concerns about research or publication ethics. You may also provide optional suggestions and comments to authors that they might find helpful in planning their study. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: At this point, I still have a number of comments, most of which are rather small.

On lines 117-118, the authors state that their tool will address the “philosophical underpinnings of mixed methods,” and on 178-179 they state the need to prompt “researchers to report their philosophical or paradigmatic stance.” But isn't this something that the experts should decide, rather than having the authors make prior assumptions about the centrality of this topic?

On line 200 they mention searching in the Journal of Mixed Methods Research, and I would add the International Journal of Multiple Research Approaches.

On line 267 the authors state that they will conduct “a targeted literature search to identify additional mixed methods experts.” I find this to be too vague, since it doesn't say anything about which literature will be searched or what criteria will be used for selection. I have personally conducted several citation searches that begin with “mixed method*” and the results are so widely scattered that they would be useless for locating experts in the field.

On lines 272-273 the authors say they will use snowball sampling to expand their list of experts. This is a useful strategy, but snowball sampling is highly sensitive to the initial set of “seeds” that are used. So, this reinforces the point that I just made above about the importance of an explicit strategy for selecting the first set of experts.
Line 303 states that the items will be evaluated “using a 5-point Likert scale from highly relevant to non-relevant.” In my experience, this will lead to a highly skewed set of results, because very few of the items will be “non-relevant” after the authors’ intense screening process. I suggest that the scale run from “Minor Relevance” to “Highest Relevance” and that the authors use a 10-point rating scale. The usual argument for using 5-point and shorter scales is that it reduces respondent “burden,” but that is not a major issue when working with a highly educated panel who have all volunteered to engage in a rating process.

On line 303 the rating scale is described as “highly relevant to non-relevant,” but on lines 320-321 it is described as “Strongly disagree” to “Strongly Agree.” Again, no one is likely to strongly disagree with the relevance of these items. If items “pile up” at the top of the rating scale, then almost all of them will meet the authors’ criteria for consensus, which would lead to a very lengthy instrument. As above, I would recommend a 10-point scale, and then use scores of either 7-10 or 8-10 as a cut-off for consensus.

Line 323: I disagree that there is agreement on the figure of 85% for consensus. One common joke among Delphi researchers is that “there is no consensus about consensus.” I personally would recommend that 85% be set as an aspirational goal, but that 75% after the second round be considered adequate.

On line 345, I have found that one of the most useful aspects of the comments made by the experts is to adjust the wording of items. Will this be possible?

In the section on lines 347 to 367 it is unclear whether the 10 people are each going to rate all 10 articles. If so, I would prefer to have a sub-sample of the panel rate a larger number of articles. You would have the same number of ratings if five people each rated 20 articles.
The reason for this recommendation is that variability among the articles is at least as important as variability among the raters.

Line 360: a random sample of “mixed method studies published in the last three years” will have a population of about 3,000 articles. How will the authors “stratify” such a large population to get just 10 (or 20) matching articles? I also worry that relying on PubMed for half the sample frame will lead to an over-emphasis on health-related topics. What about substituting Scopus instead?

Reviewer #2: The authors have addressed the comments of the reviewers. Yet, there are still some remaining issues.

The main comment is about the timeline suggested. The authors are planning a search period of the literature until September 2024 (line 225), a recruitment period of experts in September 2024 (line 298), and the start of the Delphi phase in October 2024 (line 299). Considering that the results of the review will be used to develop the Delphi questionnaire, it does not seem feasible to complete all the steps of the review, and develop and validate the questionnaire, within one month. If the authors have already completed the review and questionnaire, they should include the actual date of the search and state in the protocol that the review is completed.

Other comments are listed below:

• Line 106: replace “GRAMMAS” by “GRAMMS”.

• Line 166: Spell out “MMAT” the first time it appears in the text.

• Lines 182-184: The authors mentioned “The MMR-RHS checklist is tailored for rehabilitation and health sciences. While it may be adaptable to other fields, it might not fully capture the nuances of mixed methods research in other disciplines”. To help readers understand, the authors should add an example of nuances of mixed methods not captured in the MMR-RHS.

• Line 219: The authors mentioned using free keywords and controlled vocabulary (MeSH and EMTREE). Yet, it is not clear what controlled vocabulary could be used for this topic.
As mentioned previously, if the review is completed, the authors could add the actual search strategy used.

• Lines 238-241: “A two-stage screening process will be employed, first: initial screening will focus on mixed methods-specific guidelines. Second, quantitative and qualitative checklists will be reviewed selectively, prioritizing those frequently cited in mixed methods literature or those that have significantly influenced reporting practices”. This description of the screening process is different from what is usually found in reviews (i.e., first, screen titles and abstracts and then full-text papers). The process suggested will involve first identifying guidelines and then checklists. This could be done at the same time (i.e., identifying guidelines and checklists from titles/abstracts and then from full-text papers).

• Lines 241-243: “We will limit our in-depth review to the top 10 most relevant checklists for each methodology (quantitative and qualitative) based on citations.” It is not clear why the authors will limit to the top 10 checklists. Also, using number of citations to judge which ones are the “most relevant” can be misleading. For example, more recent checklists are likely to be less cited but can be more relevant compared to a checklist developed in the 80s.

• Line 282: “This will be supplemented by a targeted literature search to identify additional mixed methods experts.” The authors should describe how they will proceed (e.g., where will they find the literature, how will they select the literature, etc.).

• Lines 381-388: “These studies will be systematically selected to represent a range of disciplines and reporting quality. The ten articles will be identified through a systematic search in PubMed and PsycINFO for mixed method studies published in the last three years.
Stratified random sampling of the ten articles will be conducted to ensure representation across disciplines.” How will the authors assess the reporting quality of the 10 papers? Also, considering the latter sentence (random sampling), it seems that the authors will first identify all published mixed methods studies from the past 3 years indexed in PubMed and PsycINFO. Then, they will appraise the reporting quality of all identified mixed methods studies and classify them into disciplines. They will then randomly select 10 based on disciplines and reporting quality. This seems to be very time consuming. For example, in Medline alone, there are more than 7000 papers published since 2022 that have used mixed methods in their titles.

• Reference #1 refers to two different books by Creswell (Designing and Conducting Mixed Methods Research vs. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches). The authors should check which one they wanted to cite.

• Reference #5 is incomplete. Please check.

• Figure 1: There is a missing word at the end of step 6.

• The authors should avoid using contractions in the manuscript (e.g., there’s (line 81), It’s (line 118), …).

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: David L. Morgan

Reviewer #2: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments".
If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/ . PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, log in and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org . Please note that Supporting Information files do not need this step. |
| Revision 2 |
|
PONE-D-24-20917R2 Protocol for Developing a COnsolidated Checklist for Reporting MIXed Methods Research (CORMIX) Using Modified Delphi PLOS ONE

Dear Dr. Jaam,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Feb 10 2025 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org . When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols . Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols .

We look forward to receiving your revised manuscript.

Kind regards,

Sergi Fàbregues Academic Editor PLOS ONE

Journal Requirements: Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Additional Editor Comments: While the authors have addressed most of the reviewers’ comments, the reviewers note that a few minor issues still need to be addressed. These issues, with which I agree, include:

- Clarify how all the steps of the review will be completed in the short period of time mentioned between the date of the search and the start of data collection.
- Correct several typos and add some examples, as suggested by the reviewers.

- Clarify and add additional information about the methods employed (especially regarding the search process for identifying experts) and their limitations.

- Correct several problems in the list of references.

- In the response letter to the reviewers, provide a justification for procedures and steps that were performed in a different way than suggested by the reviewers.

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/ . PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, log in and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org . Please note that Supporting Information files do not need this step. |
| Revision 3 |
|
PONE-D-24-20917R3 Protocol for Developing a COnsolidated Checklist for Reporting MIXed Methods Research (CORMIX) Using Modified Delphi PLOS ONE

Dear Dr. Jaam,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. The reviewer reports are attached. Please provide a point-by-point response to the reviewers' comments. Many thanks.

Please submit your revised manuscript by Mar 21 2025 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org . When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols . Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols .

We look forward to receiving your revised manuscript.

Kind regards,

Sergi Fàbregues Academic Editor PLOS ONE

Journal Requirements: Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Additional Editor Comments: The Revision 2 reviewer reports are attached. Please provide a point-by-point response to the reviewers' comments. Many thanks. Sergi

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site.
Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/ . PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, log in and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org . Please note that Supporting Information files do not need this step.
|
| Revision 4 |
|
Protocol for Developing a COnsolidated Checklist for Reporting MIXed Methods Research (CORMIX) Using Modified Delphi PONE-D-24-20917R4

Dear Dr. Jaam,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice will be generated when your article is formally accepted. Please note, if your institution has a publishing partnership with PLOS and your article meets the relevant criteria, all or part of your publication costs will be covered. Please make sure your user information is up-to-date by logging into Editorial Manager at Editorial Manager® and clicking the ‘Update My Information' link at the top of the page. If you have any questions relating to publication charges, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Sergi Fàbregues Academic Editor PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments: |
| Formally Accepted |
|
PONE-D-24-20917R4 PLOS ONE

Dear Dr. Jaam,

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now being handed over to our production team. At this stage, our production department will prepare your paper for publication. This includes ensuring the following:

* All references, tables, and figures are properly cited

* All relevant supporting information is included in the manuscript submission

* There are no issues that prevent the paper from being properly typeset

If revisions are needed, the production department will contact you directly to resolve them. If no revisions are needed, you will receive an email when the publication date has been set. At this time, we do not offer pre-publication proofs to authors during production of the accepted work. Please keep in mind that we are working through a large volume of accepted articles, so please give us a few weeks to review your paper and let you know the next and final steps.

Lastly, if your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org. If we can help with anything else, please email us at customercare@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff on behalf of Dr. Sergi Fàbregues, Academic Editor, PLOS ONE |
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.