Peer Review History
Original Submission: March 2, 2022
Dear Mr. Mercier,

Thank you very much for submitting your manuscript "Effective resistance against pandemics: Mobility network sparsification for high-fidelity epidemic simulations" for consideration at PLOS Computational Biology. As with all papers reviewed by the journal, your manuscript was reviewed by members of the editorial board and by several independent reviewers. The reviewers appreciated the attention to an important topic. Based on the reviews, we are likely to accept this manuscript for publication, provided that you modify the manuscript according to the review recommendations.

Please prepare and submit your revised manuscript within 30 days. If you anticipate any delay, please let us know the expected resubmission date by replying to this email. When you are ready to resubmit, please upload the following:

[1] A letter containing a detailed list of your responses to all review comments and a description of the changes you have made in the manuscript. Please note, while forming your response, that if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

[2] Two versions of the revised manuscript: one with either highlights or tracked changes denoting where the text has been changed; the other a clean version (uploaded as the manuscript file).

Important additional instructions are given below your reviewer comments.

Thank you again for your submission to our journal. We hope that our editorial process has been constructive so far, and we welcome your feedback at any time. Please don't hesitate to contact us if you have any questions or comments.
Sincerely,

Feng Fu
Associate Editor
PLOS Computational Biology

Virginia Pitzer
Deputy Editor-in-Chief
PLOS Computational Biology

***********************

A link appears below if there are any accompanying review attachments. If you believe any reviews to be missing, please contact ploscompbiol@plos.org immediately: [LINK]

Reviewer's Responses to Questions

Comments to the Authors: Please note here if the review is uploaded as an attachment.

Reviewer #1: In this manuscript, the authors discuss a network sparsification method applied in the context of epidemiology. The method builds upon the earlier algorithm of Spielman and Srivastava, which uses effective resistance to sample edges. In this paper, the authors simulate the SIR model on a real-world mobility network and show that effective-resistance sampling preserves the behavior of the SIR model. In addition, the authors examine other, simpler sparsification algorithms and find that effective-resistance sampling provides the most accurate simulations. Network sparsification is an interesting and important topic, especially when simulating large-scale epidemiological models, so the findings in this paper are of potentially great practical importance.

Detailed comments:

1. In the introduction, the authors state, "It outperforms the simpler edge-sampling methods based on uniform probabilities and edge weights, as well as the naïve thresholding approach." Simply saying "outperform" is a little vague. It would be better if the authors could explain precisely how effective-resistance sampling is better than the other methods. It seems that effective-resistance sampling captures the behavior of the SIR model more accurately, but is also more computationally expensive than the simpler sparsification algorithms.

2. Figure 1: Is the color coded on the same scale for the figures on the left and right? The edges appear to be the same darker blue in the figure on the left.

3. It would be better if the authors provided some details of the network in the caption of Figure 1. For example, note that each node represents a census tract, define q as the fraction of edges to preserve, etc.

4. Figure 2: There are blue and black dots in the figure. What does the color represent? Moreover, the weight-based sampling method performed almost as well as the effective-resistance method, at least when measured by R-squared. The authors should mention this observation in the text.

5. The authors chose the same q values for the three edge-sampling methods and ran simulations for each value of q. So in Figures 3 and 5, if I draw a vertical line at each chosen value of q, should three different dots lie on the vertical line? It seems that the dots are aligned for q values below 0.01 but not for values above 0.01.

6. Figure 3: In the caption, the authors mention: "the shaded regions corresponds to one standard deviation of the average." However, I did not see the shaded regions in the figure on my end. Also, "corresponds" should be "correspond".

7. Figure 4 is not easy to understand. Please provide a better description. What does the y-axis represent? How are these two nodes chosen? Is it meaningful that all sampling methods have overlapping curves for one node under the localized initial condition but not the dispersed initial condition?

Reviewer #2: The authors propose a new statistical method for weighted graph sparsification, with a focus on preserving epidemic spread dynamics. The paper is very well written and indeed a pleasure to read. The method is of clear relevance to network science and computational biology, as so many of the latter's analytical pipelines depend on large networks that benefit from sparsification---not just mobility or other networks involved in epidemic simulations. As such, I recommend publication with some minor suggested edits.
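[Editorial note: the Spielman-Srivastava effective-resistance sampling discussed in these reviews can be illustrated with a minimal sketch. This is a hypothetical implementation for small dense graphs, not the authors' code: edges are drawn with probability proportional to weight times effective resistance, and kept edges are reweighted so the weighted adjacency matrix, and hence the Laplacian, is preserved in expectation.]

```python
import numpy as np

def effective_resistance_sparsify(W, q, rng=None):
    """Spielman-Srivastava-style sparsifier for a small dense graph W:
    sample edges with probability proportional to weight x effective
    resistance, then reweight kept edges so the expected weighted
    adjacency matrix (and Laplacian) equals the original."""
    rng = np.random.default_rng(rng)
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W           # graph Laplacian
    Lp = np.linalg.pinv(L)                   # Moore-Penrose pseudoinverse
    iu, ju = np.triu_indices(n, k=1)
    mask = W[iu, ju] > 0                     # existing edges only
    i, j, w = iu[mask], ju[mask], W[iu, ju][mask]
    R = Lp[i, i] + Lp[j, j] - 2 * Lp[i, j]   # effective resistances
    p = w * R / (w * R).sum()                # sampling distribution
    k = max(1, int(round(q * len(w))))       # number of draws
    picks = rng.choice(len(w), size=k, p=p)  # sample with replacement
    Wt = np.zeros_like(W, dtype=float)
    for e in picks:
        Wt[i[e], j[e]] += w[e] / (k * p[e])  # unbiased reweighting
    return Wt + Wt.T                         # symmetric sparsified matrix
```

The exact pseudoinverse makes this O(n^3) and suitable only for illustration; practical implementations approximate effective resistances rather than computing them exactly.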
My main comment relates to a comparison with methods we developed in our group. Thus, for maximum transparency, I sign this review.

I could not agree more with the authors about the need for principled network reduction methods that preserve essential network dynamics and the hierarchical structure of networks. Two of the methods we have proposed are cited by the authors: distance backbones [Ref. 16] and the effective graph [Ref. 17]. The most related to this paper is the former methodology. While the effective resistance method is statistically principled ("the weighted adjacency matrix and graph Laplacian of \tilde{G} are equal, in expectation, to those of G"), distance backbones are algebraically principled (and parameter-free). Indeed, distance backbones are unique for a given distance function (typically in network science we sum distance edges, resulting in the unique metric backbone, but other distances are possible), while effective resistance leads to different sparsifications in each run (similarly to the disparity filter proposed by Serrano, Boguna, and Vespignani [Ref. 15], but likely more efficient in preserving spreading dynamics given the preservation of essential connectivity), and it also changes the original edge weights. Perhaps more importantly, the distance backbone is guaranteed to preserve the entire distribution of shortest paths and the edge weights intact, not only the network connectivity---even though distance backbones are also typically very small [Ref. 16].

The authors say (page 3) that their "strategy helps keep the network connected and preserves its global structure." So it would be interesting to understand how effective resistance sparsification affects the original distribution of shortest paths. Figure 4 tallies the fraction of disconnected nodes, which is related, but not the distribution of shortest paths. This suggests that effective resistance preserves the distance backbone for a large range of the fraction of edges removed (unlike the other methods), as the backbone would also keep all edges connected, but the impact on shortest paths may occur for smaller fractions of edges removed. So, while the effective resistance "sparsifier preserves the linear properties of the original networks in expectation," the distance backbone does not affect any shortest path on the network (nor the original distance weights). This speaks to the synergy between the two concepts, and of course I do not expect the authors to run simulations to compare the two methods in this paper. However, the impact of effective resistance sparsification on the distribution of shortest paths is a reasonable question to consider---in addition to the uniqueness of the distance backbone.

Another related question is what to do with the results of the effective resistance sparsifier, since it changes the original weights. In the case of epidemic models this is (more or less) clear, because we tend to be more interested in the dynamical observables reported (time to infection, infection probability, etc.). But what about cases where the original weights are meaningful? For instance, in brain connectome or gene regulation networks (as in [Ref. 16]) the original edge weights have specific experimental significance. The distance backbone preserves the edge weights and thus their experimental significance. The authors could discuss or suggest how that would be handled with their sparsifier.

The authors say, in regard to Refs. 16 and 17, that they "know of no rigorous results on whether these techniques preserve dynamical behavior." The effective graph methodology [Ref. 17] is also principled (logically principled, based on the Quine-McCluskey algorithm), but it applies to networks with node dynamics (such as automata networks used in systems biology). The effective graph is unique because our measure of effective connectivity is a parameter (not a statistic) of the dynamics of Boolean functions. Preserving dynamical behavior is the whole point of Ref. 17, indeed of the whole approach. Because the networks analyzed in this paper have no node dynamics and are rather used to study spreading dynamics (dynamics on networks rather than dynamics of networks), the effective graph is not directly comparable to the effective resistance methodology (despite the similar name). Still, one cannot say the latter was not studied with regard to preserving dynamical behavior (our recent discussion in https://doi.org/10.1093/bioinformatics/btac360 could be relevant here).

As for whether the distance backbone methodology [Ref. 16] has been studied for preserving dynamical behavior (in the sense of dynamics on networks), this is a curious situation, since https://doi.org/10.1101/2022.02.02.478784 is under review in this same journal, and of course the authors would not know about it. Still, the utility of the distance backbone in epidemic models has also been studied. I would venture that the effective resistance sparsifier assumes propagation by a particular distance (a resistance distance), which makes a lot of sense for epidemic spread, whereas the distance backbone methodology at large can consider any distance (our under-review work on epidemic spread considers only the metric distance). I posit that it should be possible to derive a "resistance backbone" that is unique (not sampled), but that is clearly an idea for future work---again supporting the synergy and complementarity between the two methods, which in my view are both very relevant and useful.

Minor comments:

I would welcome a little more justification for the values of the edge-sampling proportion q used. In particular, why only 0.1, 0.55, and 1 for anything above 10%? Probably because there is not much difference between the methods tested (save thresholding) in that range?
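[Editorial note: the distance-backbone concept discussed at length in this review can be made concrete with a minimal sketch. This is a generic illustration of the metric backbone, not the implementation from Ref. 16: an edge survives iff no indirect path is shorter than its direct distance, so every shortest path and every surviving edge weight is preserved exactly.]

```python
import numpy as np

def metric_backbone(D):
    """Boolean mask of edges in the metric backbone of a distance
    matrix D (np.inf marks absent edges, 0 on the diagonal).
    An edge (i, j) survives iff its direct distance equals the
    all-pairs shortest-path distance, i.e. no indirect path beats it."""
    S = D.copy()
    for k in range(D.shape[0]):               # Floyd-Warshall closure
        S = np.minimum(S, S[:, k:k+1] + S[k:k+1, :])
    keep = np.isfinite(D) & np.isclose(D, S)  # direct edge is already shortest
    np.fill_diagonal(keep, False)
    return keep
```

Because kept edges retain their original distances, this sketch also shows why backbone reduction preserves experimentally meaningful weights, unlike samplers that reweight surviving edges.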
By the way, since other figures are derived for q = 0.1, which led to nearly 7% of edges preserved (right?), a figure of q vs. the actual % of edges that remain would be useful, even if in the supporting materials, as I imagine this may differ from network to network depending on how many (essential) bridges they have (and other properties).

I don't quite get what all four panels are in Figure 4. The A and B sides are explained, but not the upper and lower panels.

Really liked the use of the Wasserstein distance for comparisons.

Luis M. Rocha

**********

Have the authors made all data and (if applicable) computational code underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data and code underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data and code should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data or code —e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose "no", your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: Yes: Luis M. Rocha

Figure Files:

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org.

Data Requirements:

Please note that, as a condition of publication, PLOS' data policy requires that you make available all data used to draw the conclusions outlined in your manuscript. Data must be deposited in an appropriate repository, included within the body of the manuscript, or uploaded as supporting information. This includes all numerical values that were used to generate graphs, histograms etc. For an example in PLOS Biology see here: http://www.plosbiology.org/article/info%3Adoi%2F10.1371%2Fjournal.pbio.1001908#s5.

Reproducibility:
To enhance the reproducibility of your results, we recommend that you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. Additionally, PLOS ONE offers an option to publish peer-reviewed clinical study protocols. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols

References:

Review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article's retracted status in the References list and also include a citation and full reference for the retraction notice.
Revision 1
Dear Mr. Mercier,

We are pleased to inform you that your manuscript 'Effective resistance against pandemics: Mobility network sparsification for high-fidelity epidemic simulations' has been provisionally accepted for publication in PLOS Computational Biology.

Before your manuscript can be formally accepted, you will need to complete some formatting changes, which you will receive in a follow-up email. A member of our team will be in touch with a set of requests. Please note that your manuscript will not be scheduled for publication until you have made the required changes, so a swift response is appreciated.

IMPORTANT: The editorial review process is now complete. PLOS will only permit corrections to spelling, formatting or significant scientific errors from this point onwards. Requests for major changes, or any which affect the scientific understanding of your work, will cause delays to the publication date of your manuscript.

Should you, your institution's press office or the journal office choose to press release your paper, you will automatically be opted out of early publication. We ask that you notify us now if you or your institution is planning to press release the article. All press must be coordinated with PLOS.

Thank you again for supporting Open Access publishing; we are looking forward to publishing your work in PLOS Computational Biology.

Best regards,

Feng Fu
Academic Editor
PLOS Computational Biology

Virginia Pitzer
Section Editor
PLOS Computational Biology

***********************************************************

Reviewer's Responses to Questions

Comments to the Authors: Please note here if the review is uploaded as an attachment.

Reviewer #1: I thank the authors for their hard work. It's a pleasure reading this manuscript. All my previous comments have been addressed.

Reviewer #2: Thank you so much for addressing all my comments. I am fully satisfied with the responses and excited about the possibilities this research (and its synergy with our own methods) brings to network science and the study of the dynamics of epidemic spread on networks.

**********

Have the authors made all data and (if applicable) computational code underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data and code underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data and code should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data or code —e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose "no", your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: Yes: Luis M. Rocha
Formally Accepted
PCOMPBIOL-D-22-00323R1

Effective resistance against pandemics: Mobility network sparsification for high-fidelity epidemic simulations

Dear Dr Mercier,

I am pleased to inform you that your manuscript has been formally accepted for publication in PLOS Computational Biology. Your manuscript is now with our production department and you will be notified of the publication date in due course. The corresponding author will soon be receiving a typeset proof for review, to ensure errors have not been introduced during production. Please review the PDF proof of your manuscript carefully, as this is the last chance to correct any errors. Please note that major changes, or those which affect the scientific understanding of the work, will likely cause delays to the publication date of your manuscript.

Soon after your final files are uploaded, unless you have opted out, the early version of your manuscript will be published online. The date of the early version will be your article's publication date. The final article will be published to the same URL, and all versions of the paper will be accessible to readers.

Thank you again for supporting PLOS Computational Biology and open-access publishing. We are looking forward to publishing your work!

With kind regards,

Zsofia Freund
PLOS Computational Biology | Carlyle House, Carlyle Road, Cambridge CB4 3DN | United Kingdom
ploscompbiol@plos.org | Phone +44 (0) 1223-442824 | ploscompbiol.org | @PLOSCompBiol
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.