Peer Review History
| Original Submission: April 15, 2020 |
|---|
|
PONE-D-20-10819 A systematic review of simulation modeling to assess health system performance: characterization of the field and visual aid to guide model selection. PLOS ONE Dear Dr. Larrain, Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. Thank you for submitting your work to PLOS ONE. Although the reviewers and I see some merit in the study, there are major issues that need to be addressed before the article can be considered for publication, particularly related to the paper's contribution and theoretical positioning. The reviewers provide detailed suggestions for improvement, which I hope will guide you in revising your work. Please submit your revised manuscript by Nov 06 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter. If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols We look forward to receiving your revised manuscript. Kind regards, Federica Angeli Academic Editor PLOS ONE Journal Requirements: When submitting your revision, we need you to address these additional requirements. 1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and 2. Thank you for stating the following in the Competing Interests section: "The authors have declared that no competing interests exist." We note that one or more of the authors are employed by a commercial company: OptiMedis AG. 2.1. Please provide an amended Funding Statement declaring this commercial affiliation, as well as a statement regarding the Role of Funders in your study. If the funding organization did not play a role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript and only provided financial support in the form of authors' salaries and/or research materials, please review your statements relating to the author contributions, and ensure you have specifically and accurately indicated the role(s) that these authors had in your study. You can update author roles in the Author Contributions section of the online submission form. 
Please also include the following statement within your amended Funding Statement. “The funder provided support in the form of salaries for authors [insert relevant initials], but did not have any additional role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript. The specific roles of these authors are articulated in the ‘author contributions’ section.” If your commercial affiliation did play a role in your study, please state and explain this role within your updated Funding Statement. 2.2. Please also provide an updated Competing Interests Statement declaring this commercial affiliation along with any other relevant declarations relating to employment, consultancy, patents, products in development, or marketed products, etc. Within your Competing Interests Statement, please confirm that this commercial affiliation does not alter your adherence to all PLOS ONE policies on sharing data and materials by including the following statement: "This does not alter our adherence to PLOS ONE policies on sharing data and materials.” (as detailed online in our guide for authors http://journals.plos.org/plosone/s/competing-interests) . If this adherence statement is not accurate and there are restrictions on sharing of data and/or materials, please state these. Please note that we cannot proceed with consideration of your article until this information has been declared. Please include both an updated Funding Statement and Competing Interests Statement in your cover letter. We will change the online submission form on your behalf. Please know it is PLOS ONE policy for corresponding authors to declare, on behalf of all authors, all potential competing interests for the purposes of transparency. 
PLOS defines a competing interest as anything that interferes with, or could reasonably be perceived as interfering with, the full and objective presentation, peer review, editorial decision-making, or publication of research or non-research articles submitted to one of the journals. Competing interests can be financial or non-financial, professional, or personal. Competing interests can arise in relationship to an organization or another person. Please follow this link to our website for more details on competing interests: http://journals.plos.org/plosone/s/competing-interests [Note: HTML markup is below. Please do not edit.] Reviewers' comments: Reviewer's Responses to Questions Comments to the Author 1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented. Reviewer #1: Partly Reviewer #2: Partly ********** 2. Has the statistical analysis been performed appropriately and rigorously? Reviewer #1: N/A Reviewer #2: N/A ********** 3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified. 
Reviewer #1: Yes Reviewer #2: Yes ********** 4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here. Reviewer #1: Yes Reviewer #2: Yes ********** 5. Review Comments to the Author Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters) Reviewer #1: The authors introduce the topics very clearly, exploring the whole health care performance context thoroughly. They illustrate the issue the paper should contribute to addressing: how to take into account the complex interactions among the factors that impact the triple aim. They also specify that their review aims to explore possible simulations for improving “what if” and “how to” scenario processes, listing the major limits of current models proposed by some institutions. The final aim – presenting a visual aid to select the most appropriate simulation – is clearly expressed. The Methods section details the search strategy. Inclusion and exclusion criteria are clear. The authors discuss the data extraction and analysis approach they adopted. They carefully show the process they followed, focusing on how each step contributes to an integrated frame that looks coherent with the paper’s aims. Results: The description of Areas of Assessment is readable, as it connects its themes with the topics in a structured way. The PRISMA diagram contributes to clarity. Table 1 classifies the selected papers by the described criteria. 
The description of simulation modelling techniques starts by classifying them into five categories and adding a sixth one, which includes studies based on three or more models (hybrid). The section structure is coherent, but it could have started by explaining the reasons behind the authors’ choice. Even though it should be considered a minor issue, it would add value to adopt an established classification for simulation models. If the available criteria do not fit due to the specificity of the authors’ aims, it could help to briefly explore the connection between this classification and those often adopted in decision-making under uncertainty (e.g., see Kochenderfer, M.J., 2015, Decision Making Under Uncertainty, MIT Lincoln Laboratory Series, The MIT Press). The table offers a synthetic view that allows readers to get the big picture. Once again, as a minor issue, the authors should have quoted the sources upon which they based the stated strengths and limitations of the different types of models. This could be especially helpful for readers who do not have a specific background. The Complexity section is especially interesting and central to the authors’ aims. Lack of complexity in evaluating health care performance is one of the issues the authors want to address, so this section is expected to be rigorous and original. The latter expectation is quite satisfied, while it is not possible to evaluate the former. The authors list nine complexity features that are present in the 27 simulation models they had previously selected. These features sound impactful and relevant, but the authors should quote studies to help readers see them in the broader frame of system modelling theories. This would also allow readers to better understand both the connections among these features and how they impact the estimation of health care system performance. Some features look ambiguous if considered outside a frame of reference, as in different fields they refer to different phenomena. E.g. 
dynamism in dynamic system theory could refer to different meanings, ranging from the presence of state variables to the time-variant characteristic of the system itself, or both. Again, the authors connect Adaptation with intelligence as the ability to make decisions following specific rules. While this is a possible option, the definition of dynamism a few rows above could lead someone to wonder whether these rules must vary over time for a model to show dynamism. In summary, while this section is relevant and innovative, the authors should better explain their references and help readers place these features in a unitary frame. Discussion: The discussion starts by exploring why a family of models can be helpful in modelling a specific system. While the intent is correct, this part sounds a little narrative. Maybe a more schematic description could help it stay connected with both the nine features and the five types of models described above. This part reads as if it were written around a few partial examples, so the reader could wonder which other considerations were neglected by the authors, and why. In the second and third parts, the authors discuss the core of the paper: how to improve the choice of a performance simulation model for evaluating health care performance in light of the “triple aim”. They begin by explaining how different models can cope with different health care systems, then they explain how they applied these concepts to design the visual tool, shown in Fig 2. The visual tool is exciting in its simplicity. The authors illustrate through one example how to apply it to choose a simulation model that fits both one’s needs and constraints. Nonetheless, it shows a major weakness: the authors do not specify how to integrate its different components. E.g., if the system under scrutiny calls for more than one relevant feature, should the user follow different connectors, probably ending up in more than one loop? In this case, should users integrate different models? 
The authors should be more systematic in discussing the tool, which should be considered a model itself, as it offers a way to choose simulation models by matching the features proposed by the authors, as a result of their review, with the needs coming from the triple aim approach. More examples should be proposed to help readers understand how to use the tool when more than one feature is necessary. Limitations are expressed clearly. The conclusion suffers from the weakness noted in the discussion section. Summary Positive: • Relevant goals and clear expression of the issues to address • Impactful content • High-quality review process • Clear language To improve: • Clarify the classification criteria • Better connect the nine features in a common frame • Explore and explain how to use the visual tool when the health system calls for multiple features and needs Reviewer #2: While this paper is well written, and discusses a timely topic that in principle is worthy of academic investigation, I do not think that the paper should be accepted for publication. In my view, the paper provides an interesting overview of recent simulation modeling efforts in health systems (with a special emphasis on complexity, being an important characteristic of such systems). But it does not go beyond that, and indeed more is needed to justify publication in a scholarly journal. First, the paper misses a scientific motivation. The justification for the paper is given in lines 70-72 and can be summarized by saying that simulation models have recently gained more recognition. But this in itself is no (scholarly) motivation for reviewing the related literature. A proper motivation would include a discussion of why and to whom such a review would be beneficial, i.e., what is the aim of doing such a review? Who would use it, and to what aim? What goes wrong when the study (review) is not done? 
Also, a scientific motivation would include the identification of a clear knowledge gap: what other review efforts have been published in the academic literature? What is missing in those reviews, making this review necessary? Can the health domain learn from reviews of simulation models in other fields? Especially in light of various recently published review papers (14, 24, 65), it is of paramount importance that the authors justify the need for another review paper on this topic. Without such a clear motivation, the paper reads more like an (interesting) policy report, not a scholarly article. Second, although the paper provides an interesting overview of relevant papers and the properties of the models described in them, it does not offer as much as I had hoped for in terms of deep reflection and discussion of subtle connections between the models. This may be due to the fact that an enormously wide net is cast across a variety of modeling approaches, each of which has been studied in thousands of academic papers. Describing such canonical and grand modeling traditions in so few words inescapably leads to gross simplifications. Although understandable, this does not do justice to the subtleties and complexities of each of these models, let alone the nuanced relations between them. Similarly, the applications of the discussed models in the healthcare domain are so diverse (see column IC Topic-of-interest in Table 1) that drawing commonalities and generic lessons from them is hardly possible. What can one learn from comparing the use of model A in context X with the use of model B in context Y? For this to lead to credible results, many more data points are needed. As a consequence, the paper results in a well-structured but unavoidably shallow discussion. 
These two issues together mean that this reviewer, having read the paper, feels that the paper never really explained what its added value to the academic literature would be (point 1) and that it is indeed difficult to identify key lessons learned from the material that is presented (point 2). In combination, this makes the paper rather unsuitable for publication in a scholarly journal, although I can imagine that with some extensions the manuscript could be made suitable as a report for practitioners and policy makers, especially those with limited knowledge of simulation models and a wish to have a systematic, practical guide to help them navigate such models in the healthcare domain. ********** 6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy. Reviewer #1: Yes: Andrea Montefusco Reviewer #2: Yes: Caspar Chorus [NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.] While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. 
If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step. |
| Revision 1 |
|
PONE-D-20-10819R1 A systematic review of simulation modeling to assess health system performance: Characterization of the field and visual aid to guide model selection. PLOS ONE Dear Dr. Larrain, Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. Please submit your revised manuscript by Mar 21 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter. If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols We look forward to receiving your revised manuscript. Kind regards, Yong-Hong Kuo Academic Editor PLOS ONE Additional Editor Comments (if provided): The referees from the last round were invited to review this revision. One of them agreed and returned the review report. The reviewer still had some comments for the authors to address in their work. The other was not available to review for this round. Thus, I have gone through his/her comments and the revision. Below is my evaluation. 1. As the reviewers suggested, the academic value of the work shall be strengthened. Currently, the research questions are unclear, as is how the state of the art is advanced by this work. The scientific motivation is still missing. 2. The analysis part shall be strengthened. What are the key messages resulting from the analysis? Also, it would be nice to have this article discuss the research trends and shed light on future research directions. 3. There have been dozens of literature review / survey papers on simulation models of healthcare applications. The position of this paper is unclear. How is this paper different from the existing review papers? 4. The number of papers on healthcare simulation is tremendous. Currently, only 27 papers were analyzed. This coverage is much narrower than that of the existing review papers. 
I suggest the authors have a more comprehensive review of the studies, particularly the recent ones. To my knowledge, the below studies are relevant to this review work. However, the list below is not exhaustive and the authors shall identify further related studies: • Abramovich, M. N., Hershey, J. C., Callies, B., Adalja, A. A., Tosh, P. K., & Toner, E. S. (2017). Hospital influenza pandemic stockpiling needs: a computer simulation. American journal of infection control, 45(3), 272-277. • Chen, Y., Kuo, Y. H., Balasubramanian, H., & Wen, C. (2015, December). Using simulation to examine appointment overbooking schemes for a medical imaging center. In 2015 Winter Simulation Conference (WSC) (pp. 1307-1318). IEEE. • Gul, M., & Guneri, A. F. (2015). A comprehensive review of emergency department simulation applications for normal and disaster conditions. Computers & Industrial Engineering, 83, 327-344. • Kuo, Y. H. (2014). Integrating simulation with simulated annealing for scheduling physicians in an understaffed emergency department. HKIE transactions, 21(4), 253-261. • Kuo, Y. H., Leung, J. M., Graham, C. A., Tsoi, K. K., & Meng, H. M. (2018). Using simulation to assess the impacts of the adoption of a fast-track system for hospital emergency services. Journal of Advanced Mechanical Design, Systems, and Manufacturing, 12(3), JAMDSM0073-JAMDSM0073. • Kuo, Y. H., Rado, O., Lupia, B., Leung, J. M., & Graham, C. A. (2016). Improving the efficiency of a hospital emergency department: a simulation study with indirectly imputed service-time distributions. Flexible Services and Manufacturing Journal, 28(1-2), 120-147. • Hu, X., Barnes, S., & Golden, B. (2018). Applying queueing theory to the study of emergency department operations: a survey and a discussion of comparable simulation studies. International transactions in operational research, 25(1), 7-49. • Moeke, D., van de Geer, R., Koole, G., & Bekker, R. (2016). 
On the performance of small-scale living facilities in nursing homes: a simulation approach. Operations research for health care, 11, 20-34. • Niessner, H., Rauner, M. S., & Gutjahr, W. J. (2018). A dynamic simulation–Optimization approach for managing mass casualty incidents. Operations research for health care, 17, 82-100. • Ordu, M., Demir, E., Tofallis, C., & Gunal, M. M. (2020). A novel healthcare resource allocation decision support tool: A forecasting-simulation-optimization approach. Journal of the operational research society, 1-16. • Roy, S. N., Shah, B. J., & Gajjar, H. (2020). Application of Simulation in Healthcare Service Operations: A Review and Research Agenda. ACM Transactions on Modeling and Computer Simulation (TOMACS), 31(1), 1-23. • Roy, S., Prasanna Venkatesan, S., & Goh, M. (2020). Healthcare services: A systematic review of patient-centric logistics issues using simulation. Journal of the Operational Research Society, 1-23. • Salleh, S., Thokala, P., Brennan, A., Hughes, R., & Booth, A. (2017). Simulation modelling in healthcare: an umbrella review of systematic literature reviews. PharmacoEconomics, 35(9), 937-949. • Vanbrabant, L., Braekers, K., Ramaekers, K., & Van Nieuwenhuyse, I. (2019). Simulation of emergency department operations: A comprehensive review of KPIs and operational improvements. Computers & Industrial Engineering, 131, 356-381. • Vanbrabant, L., Martin, N., Ramaekers, K., & Braekers, K. (2019). Quality of input data in emergency department simulations: framework and assessment techniques. Simulation Modelling Practice and Theory, 91, 83-101. • Weissman, G. E., Crane-Droesch, A., Chivers, C., Luong, T., Hanish, A., Levy, M. Z., ... & Halpern, S. D. (2020). Locally informed simulation to predict hospital capacity needs during the COVID-19 pandemic. Annals of internal medicine, 173(1), 21-28. • Yousefi, M., Yousefi, M., & Fogliatto, F. S. (2020). 
Simulation-based optimization methods applied in hospital emergency departments: A systematic review. Simulation, 96(10), 791-806. • Zhang, C., Grandits, T., Härenstam, K. P., Hauge, J. B., & Meijer, S. (2018). A systematic literature review of simulation models for non-technical skill training in healthcare logistics. Advances in Simulation, 3(1), 1-16. Based on the reviewer's and my own evaluations, I recommend major revision. I hope that the authors will find the comments constructive. The revision will go through a rigorous review process again. An unsuccessful revision can lead to rejection of the work. [Note: HTML markup is below. Please do not edit.] Reviewers' comments: Reviewer's Responses to Questions Comments to the Author 1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation. Reviewer #1: (No Response) ********** 2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented. Reviewer #1: Yes ********** 3. Has the statistical analysis been performed appropriately and rigorously? Reviewer #1: N/A ********** 4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). 
The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified. Reviewer #1: Yes ********** 5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here. Reviewer #1: Yes ********** 6. Review Comments to the Author Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters) Reviewer #1: The following comments refer directly to those in the previous review and the authors’ answers. The present review did not limit itself to verifying whether the authors had addressed the issues, but reconsidered the entire paper. Comment 1 No comment Comment 2 The authors profoundly modified section 2.4. While it previously explained the data analysis approach only weakly, it now meets expectations. The authors fully reformulated sentences, also enabling a better understanding of the paper’s aims. The section specifies, step by step, the rationale behind every item’s choice. It allows the reader to navigate the paper with a clean schema. The connections between the items and the research question are now clear. Though this section calls for attention, it progressively helps readers build a first representation of how the items integrate with each other. 
It results in a sort of “integrated variable space”, which appears to be the real novelty of this paper. Comment 3 The authors followed the suggestion and added the reference column in Table 1. Though it was a minor issue, expert readers can now connect it with their knowledge, while everyone can search the sources directly and efficiently. Comment 4 The authors rewrote section 3.3 completely, and now the reader can find proper references. This helps in understanding the rationale behind the nine aspects of complex relations and the way to model them. It is easy to go through the authors’ statements, as the text and the table together provide a clean, integrated view. Comment 5 While the authors analyzed the complexity dimensions well, section 3.3 still misses why the nine dimensions impact health system performance. They go through that in section 4, Discussion. That is correct, as this way they directly connect their framework with the research question and the paper’s final goal. It could help to anticipate straightforward examples here of each complexity component's impact on health care system performance. Although this could be considered a minor issue, it can nonetheless improve clarity considerably. Perhaps it is not easy to connect these concepts with health care system performance for those who have no in-depth knowledge of complex system theory and health care. Despite being a minor issue, this could limit the paper’s practical impact. Delivering examples in section 3.3 could increase the readability of section 4, as the readers would have examples in their minds. Comment 6 See comment 4. Comment 7 See comment 5. The authors hit the nail on the head, though they could have delivered examples of impacts in this section too. Comment 8-9 The authors provided the suggested schematic connection between the nine features and health care. The section is informative now, though it still lacks a synthesis. Maybe its presence – a drawing? A flow diagram?
- could help readers generalize the application of the framework. Although the authors commented on the examples in more depth than in the first submission, readers may still find it challenging to apply the tool to a specific case. With their thorough revision, the authors improved the first and second parts of the paper. The final sections improve after the revision, but further work is required to keep the previous sections' promises. While reading them, it is hard to focus on the research question, as the authors still do not explicitly connect the discussion of their results with keywords like the triple aim and system performance, only quickly quoting them in the Limitations and Conclusion sections. The Conclusion and Limitations sections must exploit the potential that is now shown by the previous sections. At the moment, they seem far weaker than the rest of the paper. ********** 7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy. Reviewer #1: Yes: Andrea Montefusco [NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.] While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. 
Then, log in and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step. |
| Revision 2 |
|
PONE-D-20-10819R2 Simulation modeling to assess performance of integrated healthcare systems: Systematic literature review to characterize the field and visual aid to guide model selection. PLOS ONE Dear Dr. Larrain, Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. Please submit your revised manuscript by Jun 03 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter. If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols. We look forward to receiving your revised manuscript. Kind regards, Yong-Hong Kuo Academic Editor PLOS ONE

Additional Editor Comments (if provided): The revision has been reviewed by the reviewer from the last round. The reviewer has made a favorable recommendation. I have gone through the revision again. I highly appreciate that the authors have seriously addressed some of the concerns. The scope of the work and research question are now clear. I believe the work has a certain degree of merit and the potential to be published. However, since PLOS ONE publishes scientifically rigorous studies, there are still two major issues which have to be addressed: 1. Compared with other similar studies on simulation modeling in healthcare systems, the studies reviewed in this work (only 27) are significantly fewer. The arguments and conclusions made in this study are therefore not convincing. This problem is particularly clear in Section 3.2. 
For example, the results presented are based on only 2 studies for Markov models, 1 study for microsimulation, 2 studies for agent-based models, 1 study for hybrid simulations, etc. The claims made in those sections are neither comprehensive nor convincing. 2. The problem stated in point 1 is probably caused by a subjective selection of articles (on p. 10) by only the two authors. This work is titled a "systematic" review, but the process seems to be non-systematic. I suggest providing a clear description of how the reviewers determined whether a study was included. A quality score is given to each article by the two authors; however, is this quality score reliable? (Do the authors imply that out of 2271 articles, only 27 are of quality, and the others are not?) Another issue is that a systematic literature review is meant to identify trends in the research area. It would be necessary to include studies not only on the basis of the subjectively determined quality but also to identify the overall trends. Reviewers' comments: Reviewer's Responses to Questions Comments to the Author 1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation. Reviewer #1: All comments have been addressed ********** 2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented. Reviewer #1: Yes ********** 3. 
Has the statistical analysis been performed appropriately and rigorously? Reviewer #1: N/A ********** 4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified. Reviewer #1: Yes ********** 5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here. Reviewer #1: Yes ********** 6. Review Comments to the Author Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters) Reviewer #1: Is the manuscript technically sound, and do the data support the conclusions? The authors have progressively addressed the issues in the original version. Their paper is now informative and illustrates its aims and methods sharply. The authors present their framework clearly. The flow is now logical, and the connections between scope, data, and conclusions are evident. 
Have the authors made all data underlying the findings in their manuscript fully available? Yes, they did. Is the manuscript presented in an intelligible fashion and written in standard English? The authors have progressively evolved their manuscript through discourse with the reviewers. After the last revision, the paper will offer a novel and informative perspective to health care scholars and especially professionals. It will help readers look at the “Triple Aim” model through a broader framework for assessing health care performance while considering complexity accurately and robustly. Though the proposed tool has limits, it shows readers the relevance of the systemic approach in evaluating a health care system. The paper starts a practical conversation that goes beyond simple improvement in health care assessment techniques. This proposal falls within the field of accurate and relevant feedback conversations (e.g., see Zollo, 2009, on superstitious learning). Despite its limits, the tool connects the diverse elements that contribute to performance and allows health care professionals to take them and their complex interactions into account when evaluating and comparing performance. This reviewer cannot judge the English level of the manuscript. ********** 7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy. Reviewer #1: Yes: Andrea Montefusco [NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". 
If this link does not appear, there are no attachment files.] |
| Revision 3 |
|
Simulation modeling to assess performance of integrated healthcare systems: Literature review to characterize the field and visual aid to guide model selection. PONE-D-20-10819R3 Dear Dr. Larrain, We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org. If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org. Kind regards, Yong-Hong Kuo Academic Editor PLOS ONE Additional Editor Comments (optional): Reviewers' comments: |
| Formally Accepted |
|
PONE-D-20-10819R3 Simulation modeling to assess performance of integrated healthcare systems: Literature review to characterize the field and visual aid to guide model selection. Dear Dr. Larrain: I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department. If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org. If we can help with anything else, please email us at plosone@plos.org. Thank you for submitting your work to PLOS ONE and supporting open access. Kind regards, PLOS ONE Editorial Office Staff on behalf of Dr. Yong-Hong Kuo Academic Editor PLOS ONE |
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.