Peer Review History

Original Submission
October 31, 2022
Decision Letter - Melissa J. Coleman, Editor

PONE-D-22-30015
Contrast Normalization Affects Response Time-Course of Visual Interneurons
PLOS ONE

Dear Dr. Pirogova,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Both reviewers had similar comments about the submitted manuscript that should be addressed. Specifically, please provide additional statistical analyses and clarification of some of the analyses done. Please provide the additional background and context raised by both reviewers.

Please submit your revised manuscript by Mar 03 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Melissa J. Coleman

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. PLOS requires an ORCID iD for the corresponding author in Editorial Manager on papers submitted after December 6th, 2016. Please ensure that you have an ORCID iD and that it is validated in Editorial Manager. To do this, go to ‘Update my Information’ (in the upper left-hand corner of the main menu), and click on the Fetch/Validate link next to the ORCID field. This will take you to the ORCID site and allow you to create a new iD or authenticate a pre-existing iD in Editorial Manager. Please see the following video for instructions on linking an ORCID iD to your Editorial Manager account: https://www.youtube.com/watch?v=_xcclfuvtxQ

3. In your Data Availability statement, you have not specified where the minimal data set underlying the results described in your manuscript can be found. PLOS defines a study's minimal data set as the underlying data used to reach the conclusions drawn in the manuscript and any additional data required to replicate the reported study findings in their entirety. All PLOS journals require that the minimal data set be made fully available. For more information about our data policy, please see http://journals.plos.org/plosone/s/data-availability.

Upon re-submitting your revised manuscript, please upload your study’s minimal underlying data set as either Supporting Information files or to a stable, public repository and include the relevant URLs, DOIs, or accession numbers within your revised cover letter. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories. Any potentially identifying patient information must be fully anonymized.

Important: If there are ethical or legal restrictions to sharing your data publicly, please explain these restrictions in detail. Please see our guidelines for more information on what we consider unacceptable restrictions to publicly sharing data: http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions. Note that it is not acceptable for the authors to be the sole named individuals responsible for ensuring data access.

We will update your Data Availability statement to reflect the information you provide in your cover letter.

4. Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: No

Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Major comments

1) The introduction to the paper is missing previous work that has been done both in larger flies (e.g. Harris Neuron 2000) and in Drosophila (e.g. recent papers from the Silies lab). Additionally, the Matulis 2019 paper should be mentioned already in the introduction as part of the background for the current work.

2) The paper is missing statistical tests. All claims regarding changes in amplitude and kinetics should be explicitly tested with the appropriate statistical test (e.g. Drews 2020 paper, Fig 5).

3) Authors’ choice of variability measure throughout the paper is a bit unorthodox (68% bootstrapped confidence interval). If the authors choose to continue to use it, they should both justify the selection and present at least some of their results in the supplementary material with a more conventional statistic of variability.

4) Authors should present the entire stimulus trace, from before the surround presentation to after its disappearance. This should be done at least once for each cell type so that it will be clear to readers that the surround is indeed a classic surround that does not evoke a response.

Comments

1) State explicitly that the numbers in the figure legends are for neurons and flies.

2) In Figures 2, 3 and 5, the figure legend claims that responses are normalized to the cell’s maximum, but the y-axis seems to represent a non-normalized response (simple df/f).

3) Please clarify if the Tm3 dataset in Fig 3 contains the neurons from fig 2 or if it is completely independent.

4) Please address the large difference in the numbers of cells imaged for the different cell types (i.e. 98 Mi1s vs. 22 Tm2)

5) The Matulis 2020 paper does contain local stimuli as part of their stimulus set. The explanation for the discrepancy in the discussion should be corrected.

6) The included code for figure replication uses a function that was removed in 2020 (tsplot - https://seaborn.pydata.org/whatsnew/v0.10.0.html?highlight=tsplot). Please update to current versions of Python and the associated packages.

7) Line 170 states that "on average, over 90% of all cells measured passed the SNR criterion". Please clarify the sentence.

Reviewer #2: The manuscript by Pirogova and Borst describes contrast normalization in several medulla neurons (Mi1, Tm3, Tm1, Tm2) that feed into the motion-sensitive direction-selective T4 and T5 neurons in Drosophila. While contrast normalization in these medulla neurons (specifically, the effects on response amplitudes) has been described by Drews et al., 2020, the present study demonstrates the effects on temporal properties, or response dynamics, exerted only by dynamic (but not stationary) stimuli in the visual surround. To account for these effects, the authors proposed two models, one with a stationary nonlinearity and the other with a dynamic nonlinearity. By showing that the temporal properties of the responses were affected by the stimulus surround and were not simply a consequence of response saturation, the authors excluded the first model and favored the dynamic nonlinearity model, with the hypothesis that contrast normalization affects the membrane time constant.

Overall, the results represent an interesting extension of Drews et al., 2020. The data are of high quality. I have a number of minor comments/suggestions but would otherwise support the publication of this manuscript.

(1) Temporal dynamics are the focus of this study but were only presented in a qualitative way. It would be very helpful if the authors could quantify the effects or parameterize the temporal dynamics, and potentially compare them with the modeling results.

(2) I am not sure if the presentation of the stationary nonlinearity model is really necessary. On the other hand, must the dynamic nonlinearity model utilize the alteration of input resistance? I suppose not. The comparison of these two very different types of models seems odd. The authors could do a better job of introducing the models.

(3) The receptive field sizes for the medulla neurons (described in lines 241-243) appear to be different from those described in Arenz et al., 2017.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Revision 1

Dear Editor,

Thank you for considering our manuscript entitled ‘Contrast Normalization Affects Response Time-Course of Visual Interneurons’ for publication. We appreciate the reviewers' thoughtful comments and suggestions on our work. We have addressed each of the reviewers' comments below.

Reviewer #1: Major comments

1) The introduction to the paper is missing previous work that has been done both in larger flies (e.g. Harris Neuron 2000) and in Drosophila (e.g. recent papers from the Silies lab). Additionally, the Matulis 2019 paper should be mentioned already in the introduction as part of the background for the current work.

We added citations to the previous work ([37] Harris et al., 2000; [38] Ketkar et al., 2022) in the introduction and revised the introduction to include Matulis et al. (2020) as part of the background for the current work.

2) The paper is missing statistical tests. All claims regarding changes in amplitude and kinetics should be explicitly tested with the appropriate statistical test (e.g. Drews 2020 paper, Fig 5).

We have added a figure (S3) that shows the results of a Mann-Whitney U test (as in Drews et al., 2020) for each of the existing figures, included indications of significance in the manuscript, and referred to figure S3 in the text.
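For illustration, a minimal sketch of such a comparison (synthetic data and placeholder names; not our actual analysis code) is shown below:

```python
# Minimal sketch: comparing per-cell peak responses between two surround
# conditions with a Mann-Whitney U test, following the approach of
# Drews et al. (2020). The data here are synthetic placeholders.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
peaks_grey = rng.normal(1.0, 0.2, size=30)    # hypothetical peak dF/F, grey surround
peaks_moving = rng.normal(0.7, 0.2, size=30)  # hypothetical peak dF/F, moving surround

stat, p = mannwhitneyu(peaks_grey, peaks_moving, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.3g}")
```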

3) Authors’ choice of variability measure throughout the paper is a bit unorthodox (68% bootstrapped confidence interval). If the authors choose to continue to use it, they should both justify the selection and present at least some of their results in the supplementary material with a more conventional statistic of variability.

We use the 68% bootstrapped confidence interval in the figures when the whole response trace is presented, as in Drews et al. (2020). We have now added figure S3 with additional statistical analysis for each of the main and supplementary figures that show the 68% bootstrapped confidence interval as a measure of variability.
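For normally distributed data, a 68% confidence interval of the mean corresponds roughly to ±1 standard error, which motivates this choice. A minimal sketch of how such an interval can be computed (illustrative only; array shapes and names are placeholders, not our exact implementation):

```python
# Sketch of a 68% bootstrapped confidence interval around the mean response
# trace; 'traces' is a hypothetical (n_cells x n_timepoints) dF/F array.
import numpy as np

def bootstrap_ci(traces, n_boot=1000, ci=68, seed=0):
    rng = np.random.default_rng(seed)
    n_cells = traces.shape[0]
    boot_means = np.empty((n_boot, traces.shape[1]))
    for i in range(n_boot):
        idx = rng.integers(0, n_cells, size=n_cells)  # resample cells with replacement
        boot_means[i] = traces[idx].mean(axis=0)
    alpha = (100 - ci) / 2
    lower, upper = np.percentile(boot_means, [alpha, 100 - alpha], axis=0)
    return lower, upper
```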

4) Authors should present the entire stimulus trace, from before the surround presentation to after its disappearance. This should be done at least once for each cell type so that it will be clear to readers that the surround is indeed a classic surround that does not evoke a response.

We have added a figure (S4) that shows extended traces of the responses from Fig. 3 for each cell type and referred to figure S4 in the text.

Comments

1) State explicitly that the numbers in the figure legends are for neurons and flies.

The legends have been corrected to explicitly state the number of cells and flies in the dataset.

2) In Figures 2, 3 and 5, the figure legend claims that responses are normalized to the cell’s maximum, but the y-axis seems to represent a non-normalized response (simple df/f).

We apologise for the mistake. We have corrected the legends accordingly.

3) Please clarify if the Tm3 dataset in Fig 3 contains the neurons from fig 2 or if it is completely independent.

The Tm3 and Mi1 datasets in Fig 3 include the neurons from Fig 2, i.e., they are not completely independent. The legend for Fig 3 was corrected accordingly.

4) Please address the large difference in the numbers of cells imaged for the different cell types (i.e. 98 Mi1s vs. 22 Tm2)

Mi1 was imaged with several stimulus protocols containing different combinations of stimuli, all of which included the ‘grey surround’ and the ‘moving grating’ conditions presented in Fig 3. As a result, fewer Mi1 cells were stimulated with the stochastic surround, which was part of only one stimulus set, whereas responses to the two conditions shown in Fig 3 were obtained from all imaged Mi1 cells. These different protocols were combined into a single one when the other cell types were recorded, which is why fewer cells of the other types were imaged.

5) The Matulis 2020 paper does contain local stimuli as part of their stimulus set. The explanation for the discrepancy in the discussion should be corrected.

We apologise for the mistake. We have corrected the explanation accordingly.

6) The included code for figure replication uses a function that was removed in 2020 (tsplot - https://seaborn.pydata.org/whatsnew/v0.10.0.html?highlight=tsplot). Please update to current versions of Python and the associated packages.

We have now provided a yaml file in the GitHub repository to recreate a Python environment that contains all the Python packages needed to reproduce the figures.
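The yaml file pins the package versions under which the original scripts run. For readers who prefer current package versions, an alternative (sketched below with synthetic data; this is not the repository code) is to replace the removed tsplot call with seaborn.lineplot, which in seaborn ≥ 0.12 draws the same mean-trace-with-CI plot:

```python
# Illustrative migration only: seaborn.tsplot was removed in seaborn 0.10;
# lineplot with errorbar=("ci", 68) produces an equivalent mean trace with a
# 68% bootstrapped confidence-interval band.
import numpy as np
import pandas as pd
import seaborn as sns

rng = np.random.default_rng(0)
time = np.linspace(0, 2, 50)
# Hypothetical long-format data: one row per (cell, time point)
df = pd.DataFrame({
    "time": np.tile(time, 10),
    "dff": np.tile(np.sin(np.pi * time), 10) + rng.normal(0, 0.1, 500),
    "cell": np.repeat(np.arange(10), 50),
})
sns.lineplot(data=df, x="time", y="dff", errorbar=("ci", 68))
```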

7) Line 170 states that "on average, over 90% of all cells measured passed the SNR criterion". Please clarify the sentence.

To pass the SNR criterion and be included in the dataset, the inter-trial variance of a cell’s responses had to be smaller than the cell’s average response. Large inter-trial variance was typically caused by movement artefacts. Averaged over all recorded cell types, fewer than 10% of cells were discarded by the SNR criterion.
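For illustration, the criterion can be sketched as follows (placeholder array layout; not our exact implementation):

```python
# Sketch of the SNR criterion described above: a cell is kept if the
# inter-trial variance of its responses is smaller than its average response.
import numpy as np

def passes_snr(trials):
    """trials: hypothetical (n_trials, n_timepoints) array of dF/F responses."""
    mean_response = trials.mean()                # average response over trials and time
    inter_trial_var = trials.var(axis=0).mean()  # variance across trials, averaged over time
    return inter_trial_var < mean_response
```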

Reviewer #2: Minor comments/suggestions

(1) Temporal dynamics are the focus of this study but were only presented in a qualitative way. It would be very helpful if the authors could quantify the effects or parameterize the temporal dynamics, and potentially compare them with the modeling results.

We parameterized and quantified the temporal dynamics of the response by measuring the level to which the response had decayed by the end of the luminance step, expressed as a percentage of the peak response. We have now also added a figure (S3) that shows the results of a Mann-Whitney U test (as in Drews et al., 2020) for each of the existing figures, included indications of significance in the manuscript, and referred to figure S3 in the text.
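A sketch of this measure (placeholder names; computed here on the trial-averaged trace, which is an assumption about the exact procedure):

```python
# Illustrative parameterization of the temporal dynamics: the response level
# remaining at the end of the luminance step, as a percentage of the peak.
import numpy as np

def decay_at_step_end(trace, step_end_idx):
    """trace: trial-averaged dF/F time course; step_end_idx: last sample of the step."""
    peak = np.max(trace)
    return 100.0 * trace[step_end_idx] / peak  # percent of the peak remaining at step offset
```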

(2) I am not sure if the presentation of the stationary nonlinearity model is really necessary. On the other hand, must the dynamic nonlinearity model utilize the alteration of input resistance? I suppose not. The comparison of these two very different types of models seems odd. The authors could do a better job of introducing the models.

We still think that our attempt to explain the phenomenon with a stationary nonlinearity first makes sense, since this is the classical way to describe contrast normalization. Furthermore, even this stationary saturating nonlinearity affects the time-course of the signal due to signal compression at large amplitudes. We actually learned a lot from it: if this were the explanation, we should obtain the same signal amplitude and time course with a static surround as we observed with a moving surround, simply by reducing the stimulus amplitude accordingly. Clearly, the results contradicted this explanation. On the other hand, the biophysical model could reproduce the experimental results well.

To be clearer on this, we now explicitly introduce an inhibitory input which acts as a shunt, instead of saying that the leak conductance is enlarged by the moving surround.
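To illustrate the underlying intuition (a sketch with arbitrary parameter values, not our actual model): in a one-compartment neuron, a surround-driven inhibitory conductance that reverses at rest acts as a shunt, so it both divides the response amplitude and shortens the effective membrane time constant tau = C / (g_leak + g_inh + g_exc).

```python
# One-compartment sketch of shunting inhibition (illustrative parameters):
# increasing g_inh scales the response down and speeds up its time course.
import numpy as np

def simulate(g_inh, g_exc_amp=5.0, g_leak=1.0, C=10.0,
             dt=0.1, t_on=20.0, t_off=120.0, t_end=200.0):
    E_exc = 1.0   # excitatory reversal potential (normalized); inhibition reverses at rest
    V = 0.0       # membrane potential relative to rest
    trace = []
    for t in np.arange(0.0, t_end, dt):
        g_exc = g_exc_amp if t_on <= t < t_off else 0.0   # center stimulus as a step
        dV = (-(g_leak + g_inh) * V + g_exc * (E_exc - V)) / C
        V += dV * dt
        trace.append(V)
    return np.array(trace)

grey_surround = simulate(g_inh=0.0)    # larger, slower response
moving_surround = simulate(g_inh=5.0)  # smaller, faster response
```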

(3) The receptive field sizes for the medulla neurons (described in lines 241-243) appear to be different from those described in Arenz et al., 2017.

The numbers are taken from tables S1 and S2 in Arenz et al., 2017 (Mi1 - 28.81°, Tm3 - 11.91°, Tm1 - 27.14°, Tm2 - 30.52°) and rounded to the nearest integer, resulting in the values used (29° for Mi1 cells, 12° for Tm3 cells, 27° for Tm1 cells, and 31° for Tm2 cells).

We clarified the way the values were rounded in the manuscript.

Attachments
Submitted filename: Response to Reviewers.docx
Decision Letter - Melissa J. Coleman, Editor

Contrast Normalization Affects Response Time-Course of Visual Interneurons

PONE-D-22-30015R1

Dear Dr. Pirogova,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Melissa J. Coleman

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: (No Response)

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

**********

Formally Accepted
Acceptance Letter - Melissa J. Coleman, Editor

PONE-D-22-30015R1

Contrast Normalization Affects Response Time-Course of Visual Interneurons

Dear Dr. Pirogova:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Melissa J. Coleman

Academic Editor

PLOS ONE

Open letter on the publication of peer review reports

PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.

We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.

Learn more at ASAPbio.