Peer Review History
| Original Submission: August 24, 2023 |
|---|
|
Dear Dr. Kozielska,

Thank you very much for submitting your manuscript "A neural network model for the evolution of learning in changing environments" for consideration at PLOS Computational Biology. As with all papers reviewed by the journal, your manuscript was reviewed by members of the editorial board and by several independent reviewers. The reviewers appreciated the attention to an important topic. Based on the reviews, we are likely to accept this manuscript for publication, provided that you modify the manuscript according to the review recommendations. Please take the comments of the three reviewers into consideration (in particular the issues pointed out by Reviewer 1).

Please prepare and submit your revised manuscript within 30 days. If you anticipate any delay, please let us know the expected resubmission date by replying to this email. When you are ready to resubmit, please upload the following:

[1] A letter containing a detailed list of your responses to all review comments, and a description of the changes you have made in the manuscript. Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

[2] Two versions of the revised manuscript: one with either highlights or tracked changes denoting where the text has been changed; the other a clean version (uploaded as the manuscript file).

Important additional instructions are given below your reviewer comments. Thank you again for your submission to our journal. We hope that our editorial process has been constructive so far, and we welcome your feedback at any time. Please don't hesitate to contact us if you have any questions or comments.

Sincerely,

Xingru Chen, Ph.D.
Guest Editor
PLOS Computational Biology

Zhaolei Zhang
Section Editor
PLOS Computational Biology

***********************

A link appears below if there are any accompanying review attachments. If you believe any reviews to be missing, please contact ploscompbiol@plos.org immediately.

Please take the comments of the three reviewers into consideration (in particular the issues pointed out by Reviewer 1).

Reviewer's Responses to Questions

Comments to the Authors: Please note here if the review is uploaded as an attachment.

Reviewer #1: Generally speaking, I am really enthusiastic about the work presented in this MS. As the authors point out, there are relatively few formal models of the evolution of learning and none that explicitly consider how network-based learning is likely to evolve. In addition, the analyses are well executed and clearly discussed, and so the work will almost certainly have an impact on the field. I have only one main issue I feel needs to be explicitly considered in the MS to maximise its potential. Most of the functional ("behavioural gambit") logic underpinning existing understanding about the selective advantages of learning emphasises the key role of within-generation (within-lifetime) patterns of variation, with the received wisdom being that learning is most strongly selected for when within-generation variation is substantial, varies between generations, and is potentially trackable. However, the authors only consider ecological change happening between generations here. That learning is still favoured at all seems to me to be because there is a noisy cue that can be at least partially resolved via repeated sampling. Now, none of this renders the analysis uninteresting - the selective value of being able to resolve such cue error via learning is not something that has been widely considered in the evolutionary literature to my knowledge.
Nevertheless, I do feel that the authors need to acknowledge this issue and discuss how their formulation relates to such "classic" ideas about how ecological change affects the evolutionary value of learning. This received wisdom is derived from Dave Stephens' early discussions and analyses, culminating in his seminal model in Stephens (1991), which I note the authors do not cite.

Stephens, D. W. (1991). Change, regularity, and value in the evolution of animal learning. Behavioral Ecology, 2(1), 77–89. https://doi.org/10.1093/beheco/2.1.77

Reviewer #2: See attached review report.

Reviewer #3: This study introduces a novel approach to modeling learning using small neural networks and a biologically inspired learning algorithm that selectively impacts part of the network based on the disparity between expectations and reality. This model offers insights into how natural selection shapes learning, and the authors applied it to explore the evolution of learning under diverse environmental conditions, considering different trade-offs between exploration (learning) and exploitation (foraging). In the individual-based simulations, efficient learning regularly evolved. However, consistent environments with minimal change and short-lived organisms, unable to dedicate much time to exploration, posed challenges to the evolution of learning, consistent with previous findings. Once learning evolved, surprisingly, the characteristics of the learning strategy (learning period duration and learning rate) and average performance after learning were minimally influenced by the frequency and magnitude of environmental changes. Conversely, an organism's lifespan and the resource distribution in the environment significantly impacted the evolved learning strategy. Notably, a longer learning period did not universally lead to improved performance, suggesting variability in the effectiveness of evolved neural networks.
The results demonstrate that a relatively simple, biologically inspired learning mechanism can evolve to facilitate efficient adaptation in dynamic environments.

**********

Have the authors made all data and (if applicable) computational code underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data and code underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data and code should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data or code (e.g., participant privacy or use of data from a third party), those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #3: Yes

**********

PLOS authors have the option to publish the peer review history of their article. If published, this will include your full peer review and any attached files. If you choose "no", your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: Yes: Xin Wang
Reviewer #3: No

Figure Files: While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool.
If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org.

Data Requirements: Please note that, as a condition of publication, PLOS' data policy requires that you make available all data used to draw the conclusions outlined in your manuscript. Data must be deposited in an appropriate repository, included within the body of the manuscript, or uploaded as supporting information. This includes all numerical values that were used to generate graphs, histograms, etc. For an example in PLOS Biology, see http://www.plosbiology.org/article/info%3Adoi%2F10.1371%2Fjournal.pbio.1001908#s5.

Reproducibility: To enhance the reproducibility of your results, we recommend that you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. Additionally, PLOS ONE offers an option to publish peer-reviewed clinical study protocols. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols

References: Review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article's retracted status in the References list and also include a citation and full reference for the retraction notice.
|
| Revision 1 |
|
Dear Dr. Kozielska,

We are pleased to inform you that your manuscript 'A neural network model for the evolution of learning in changing environments' has been provisionally accepted for publication in PLOS Computational Biology.

Before your manuscript can be formally accepted, you will need to complete some formatting changes, which you will receive in a follow-up email. A member of our team will be in touch with a set of requests. Please note that your manuscript will not be scheduled for publication until you have made the required changes, so a swift response is appreciated.

IMPORTANT: The editorial review process is now complete. PLOS will only permit corrections to spelling, formatting or significant scientific errors from this point onwards. Requests for major changes, or any which affect the scientific understanding of your work, will cause delays to the publication date of your manuscript.

Should you, your institution's press office or the journal office choose to press release your paper, you will automatically be opted out of early publication. We ask that you notify us now if you or your institution is planning to press release the article. All press must be co-ordinated with PLOS.

Thank you again for supporting Open Access publishing; we are looking forward to publishing your work in PLOS Computational Biology.

Best regards,

Xingru Chen, Ph.D.
Guest Editor
PLOS Computational Biology

Zhaolei Zhang
Section Editor
PLOS Computational Biology

***********************************************************

Reviewer's Responses to Questions

Comments to the Authors: Please note here if the review is uploaded as an attachment.

Reviewer #1: I am happy with the authors' response to my comments and the changes to the MS they made. I look forward to seeing this in print.

Reviewer #2: I appreciate the authors' great efforts in addressing all my concerns, and I believe their work will attract much attention in the field.
Therefore, I'm happy to recommend accepting it for publication in PLOS Computational Biology.

Reviewer #4: This study introduces a novel approach to modeling learning using small neural networks and a simple, biologically inspired learning algorithm, offering insights into the evolution of this crucial adaptation mechanism. Unlike many existing models that focus on simple phenotypes or make biologically unrealistic assumptions, the approach incorporates the complexities of neural networks and considers evolutionary questions. In the simulations, the paper explores the evolution of learning under diverse environmental conditions, examining different trade-offs between exploration (learning) and exploitation (foraging). The results show that efficient learning readily evolves in the individual-based simulations. However, consistent environments or short-lived organisms pose challenges to the evolution of learning, consistent with previous findings. Surprisingly, once learning evolves, the characteristics of the learning strategy (e.g., learning period duration and learning rate) and average performance after learning are minimally affected by the frequency and magnitude of environmental changes. Instead, an organism's lifespan and the resource distribution in the environment significantly impact the evolved learning strategy. Shorter lifespans and narrow resource distributions reduce the likelihood of learning evolution. Notably, a longer learning period does not universally lead to better performance, suggesting variability in the effectiveness of evolved neural networks. Overall, this study demonstrates that a biologically inspired, relatively simple learning mechanism can evolve to facilitate efficient adaptation in dynamic environments. The authors have also addressed the questions and concerns raised in the previous round of review well; thus, I suggest acceptance.
**********

Have the authors made all data and (if applicable) computational code underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data and code underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data and code should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data or code (e.g., participant privacy or use of data from a third party), those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #4: Yes

**********

PLOS authors have the option to publish the peer review history of their article. If published, this will include your full peer review and any attached files. If you choose "no", your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: Yes: Xin Wang
Reviewer #4: No |
| Formally Accepted |
|
PCOMPBIOL-D-23-01364R1
A neural network model for the evolution of learning in changing environments

Dear Dr Kozielska,

I am pleased to inform you that your manuscript has been formally accepted for publication in PLOS Computational Biology. Your manuscript is now with our production department and you will be notified of the publication date in due course. The corresponding author will soon be receiving a typeset proof for review, to ensure errors have not been introduced during production. Please review the PDF proof of your manuscript carefully, as this is the last chance to correct any errors. Please note that major changes, or those which affect the scientific understanding of the work, will likely cause delays to the publication date of your manuscript.

Soon after your final files are uploaded, unless you have opted out, the early version of your manuscript will be published online. The date of the early version will be your article's publication date. The final article will be published to the same URL, and all versions of the paper will be accessible to readers.

Thank you again for supporting PLOS Computational Biology and open-access publishing. We are looking forward to publishing your work!

With kind regards,

Zsofia Freund
PLOS Computational Biology

PLOS Computational Biology | Carlyle House, Carlyle Road, Cambridge CB4 3DN | United Kingdom
ploscompbiol@plos.org | Phone +44 (0) 1223-442824 | ploscompbiol.org | @PLOSCompBiol |
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.