Peer Review History

Original Submission
June 15, 2021
Decision Letter - Kewei Chen, Editor

PONE-D-21-19322
Comparing the reliability of different ICA algorithms for fMRI analysis
PLOS ONE

Dear Dr. Wei,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

We apologize for the long review process so far. Although we received feedback from only one reviewer, the expert's comments are comprehensive and in-depth, so I feel it is fair to move forward with the revision process. I personally went over the manuscript a couple of times and concur with the reviewer.

Please submit your revised manuscript by Mar 11 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Kewei Chen, Ph.D.

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at 

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and 

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. We note that the grant information you provided in the ‘Funding Information’ and ‘Financial Disclosure’ sections do not match. 

When you resubmit, please ensure that you provide the correct grant numbers for the awards you received for your study in the ‘Funding Information’ section.

3. Thank you for stating the following in the Acknowledgments Section of your manuscript: 

"This research was funded by the National Key R&D Program of China (Grant Nos. 2018YFC2001400 and 2018YFC2001700), the National Natural Science Foundation of China (Grant No. 81972160), and the Beijing Natural Science Foundation (Grant No. 17L20019)."

We note that you have provided funding information that is not currently declared in your Funding Statement. However, funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form. 

Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement. Currently, your Funding Statement reads as follows: 

"This research was funded by the National Key R&D Program of China (Grant Nos. 2018YFC2001400 and 2018YFC2001700 by WP and ZL), the National Natural Science Foundation of China (Grant No. 81972160 by WP), and the Beijing Natural Science Foundation (Grant No. 17L20019 by WP).

The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript."

Please include your amended statements within your cover letter; we will change the online submission form on your behalf.

4. In your Data Availability statement, you have not specified where the minimal data set underlying the results described in your manuscript can be found. PLOS defines a study's minimal data set as the underlying data used to reach the conclusions drawn in the manuscript and any additional data required to replicate the reported study findings in their entirety. All PLOS journals require that the minimal data set be made fully available. For more information about our data policy, please see http://journals.plos.org/plosone/s/data-availability.

Upon re-submitting your revised manuscript, please upload your study’s minimal underlying data set as either Supporting Information files or to a stable, public repository and include the relevant URLs, DOIs, or accession numbers within your revised cover letter. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories. Any potentially identifying patient information must be fully anonymized.

Important: If there are ethical or legal restrictions to sharing your data publicly, please explain these restrictions in detail. Please see our guidelines for more information on what we consider unacceptable restrictions to publicly sharing data: http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions. Note that it is not acceptable for the authors to be the sole named individuals responsible for ensuring data access.

We will update your Data Availability statement to reflect the information you provide in your cover letter.

5. Please include your full ethics statement in the ‘Methods’ section of your manuscript file. In your statement, please include the full name of the IRB or ethics committee who approved or waived your study, as well as whether or not you obtained informed written or verbal consent. If consent was waived for your study, please include this information in your statement as well. 

6. Please include a copy of Table S8, S9, S2 and S4 which you refer to in your text on pages 6, 11 and 12.

7. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information. 


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: No

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This is an interesting article. There is real value. However, I have several concerns regarding missing details and technical questions that are unanswered in the article in its present form. The article is generally well written, but not all acronyms are fully spelled out on first use, and software packages and links are not always provided in the methods. The authors would have made both the review and their revision tasks easier if they had added line numbers in the draft.

I will detail the various data and methodology concerns below.

1. ACRONYMS must be spelled out on first use, and links to software and citations provided. The omissions are numerous. A few (not a complete list): MDL, AMUSE, JADE, ERICA, and, surprisingly, even ICASSO, which is a pivotal method here.

2. Only two human subject data sets are used in this paper. This might be acceptable if we were convinced that the data was sufficiently rich, but this information is missing. The reader needs to know the full data set sizes: numbers of imaging sessions, full number of tasks, full number of stimuli in tasks, etc. The richness of data is a factor in ICA outcomes, so these aspects MUST be reported. Similarly, the method of data setup for ICA needs better description. Was the data analyzed in each subject ultimately as a single concatenated ensemble (and if so, of what matrix dimensions), or as a number of subset ensembles (and if so, of what dimensions, and how many such ensembles)? Because only two individuals' data were used, there remains the unfortunate possibility that there are structural individual-based idiosyncrasies in the data here that favor Infomax. I do not believe this likely, but it cannot be fully discounted given the data. Other research applying ICA or testing dimensionality separation methods including ICA has used larger numbers of individuals, and/or artificial data generation under constraints, to explore comparisons or reliability. Examples in the motor field are the Tresch, D'Avella, and Cheung papers in J Neurophysiology for synergy separation analyses, and Yang, Logan and Giszter in PNAS exploring SCC-like measures on motor synergy outcomes across individual animals. An aspect not explored by the authors of this paper (and given the lack of detail provided, this may or may not be possible in their data) that might be used to enrich the analysis they perform is bootstrapping or jack-knifing the data sets used, in addition to subsequent multiple iterations of ICA methods on each subject's subset data. At the very least, there needs to be discussion of the data limits in this paper resulting from only 2 subjects and only 1 subject per type of experiment, and the resulting caveats, i.e., the possibility of individual idiosyncrasy of the data favoring specific ICA methods.

3. The SCC method as described in section 2.4 is unclear. It is possible to do this in two ways: correlate the individual IC spatial components, picking best correlations (where pairings of an IC with other sets may not be unique; the same IC may be best in two or more correlations), or correlate the spatial matrix (e.g., in MATLAB using matperm, matcorr), in which case IC correlations will be unique, based on the matrix permutation used in the unique IC matchings. The authors used the former, non-unique method, I believe, but this might have biased results. If they did, with the possibility of non-unique matching of ICs, then I think the reader needs to know if there were non-unique best IC spatial correlations in each method, and if so, how many. This might be an additional metric on ICA algorithm quality.

4. Finally, in relation to the SCC statistics used: Correlation, especially here, is non-gaussian if used directly. This may present difficulties in interpreting the statistics in tables 6 and 7 that could be avoided. Using the Fisher-Z transform (see Yang, Logan, Giszter, PNAS 2019 for example with Infomax ICA data stats) the correlations are transformed to normal variables, improving the interpretation of parametric methods.

5. Equations would be helpful throughout the methods if very clearly written. The goal is reproducibility of methods.

6. The data sharing statement is insufficient, for this study an anonymized repository should be chosen - there are many.

This paper is a solid contribution, but marred by the omissions in detail, and possibly improved by attention to the technical points noted.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Revision 1

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. We note that the grant information you provided in the ‘Funding Information’ and ‘Financial Disclosure’ sections do not match.

When you resubmit, please ensure that you provide the correct grant numbers for the awards you received for your study in the ‘Funding Information’ section.

3. Thank you for stating the following in the Acknowledgments Section of your manuscript:

"This research was funded by the National Key R&D Program of China (Grant Nos. 2018YFC2001400 and 2018YFC2001700), the National Natural Science Foundation of China (Grant No. 81972160), and the Beijing Natural Science Foundation (Grant No. 17L20019)."

We note that you have provided funding information that is not currently declared in your Funding Statement. However, funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form.

Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement. Currently, your Funding Statement reads as follows:

"This research was funded by the National Key R&D Program of China (Grant Nos. 2018YFC2001400 and 2018YFC2001700 by WP and ZL), the National Natural Science Foundation of China (Grant No. 81972160 by WP), and the Beijing Natural Science Foundation (Grant No. 17L20019 by WP).

The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript."

Please include your amended statements within your cover letter; we will change the online submission form on your behalf.

4. In your Data Availability statement, you have not specified where the minimal data set underlying the results described in your manuscript can be found. PLOS defines a study's minimal data set as the underlying data used to reach the conclusions drawn in the manuscript and any additional data required to replicate the reported study findings in their entirety. All PLOS journals require that the minimal data set be made fully available. For more information about our data policy, please see http://journals.plos.org/plosone/s/data-availability.

Upon re-submitting your revised manuscript, please upload your study’s minimal underlying data set as either Supporting Information files or to a stable, public repository and include the relevant URLs, DOIs, or accession numbers within your revised cover letter. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories. Any potentially identifying patient information must be fully anonymized.

Important: If there are ethical or legal restrictions to sharing your data publicly, please explain these restrictions in detail. Please see our guidelines for more information on what we consider unacceptable restrictions to publicly sharing data: http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions. Note that it is not acceptable for the authors to be the sole named individuals responsible for ensuring data access.

We will update your Data Availability statement to reflect the information you provide in your cover letter.

5. Please include your full ethics statement in the ‘Methods’ section of your manuscript file. In your statement, please include the full name of the IRB or ethics committee who approved or waived your study, as well as whether or not you obtained informed written or verbal consent. If consent was waived for your study, please include this information in your statement as well.

6. Please include a copy of Table S8, S9, S2 and S4 which you refer to in your text on pages 6, 11 and 12.

7. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.

We prepared the new version on the basis of “Journal Requirements” mentioned in the Decision letter:

1. We followed the PLOS ONE style templates.

2. The correct grant numbers for the Funding are those in the Acknowledgments Section of the first version. That is, "This research was funded by the National Key R&D Program of China (Grant Nos. 2018YFC2001400 and 2018YFC2001700), the National Natural Science Foundation of China (Grant No. 81972160), and the Beijing Natural Science Foundation (Grant No. 17L20019)."

3. We have removed any funding-related text from the manuscript. Please move the following contents to the Funding Statement: "This research was funded by the National Key R&D Program of China (Grant Nos. 2018YFC2001400 and 2018YFC2001700), the National Natural Science Foundation of China (Grant No. 81972160), and the Beijing Natural Science Foundation (Grant No. 17L20019)."

4. The minimal data set has been included in the Supporting information.

5. The information on IRB has been added.

6. Tables S8, S9, S2 and S4 should be Tables 8, 9, 2 and 4. We have corrected this error in the revised version.

7. We have included captions for the Supporting Information files at the end of the manuscript, and updated in-text citations.

=========================

Reviewer #1: This is an interesting article. There is real value. However, I have several concerns regarding missing details and technical questions that are unanswered in the article in its present form. The article is generally well written, but not all acronyms are fully spelled out on first use, and software packages and links are not always provided in the methods. The authors would have made both the review and their revision tasks easier if they had added line numbers in the draft.

Response: We really appreciate the comments and suggestions. We hope the revised manuscript resolves the issues raised and is clearly expressed.

I will detail the various data and methodology concerns below.

1. ACRONYMS must be spelled out on first use, and links to software and citations provided. The omissions are numerous. A few (not a complete list): MDL, AMUSE, JADE, ERICA, and, surprisingly, even ICASSO, which is a pivotal method here.

Response: We could not find the origin of the acronym “ICASSO” when we prepared the manuscript. The developers provide a link, http://research.ics.aalto.fi/ica/icasso/, where two publications (http://research.ics.aalto.fi/ica/icasso/publications.shtml) can be found. However, neither the content on the website nor the publications explains the origin of ICASSO, and we also failed to find the origin of the acronym in other publications. To find the answer, we asked the developer, Prof. Aapo Hyvärinen, and learned that it is not a real acronym: it is a kind of joke combining the acronym "ICA" with the name of the artist Picasso. We think this information is useful and have added it to the Acknowledgments, since the answer cannot be found publicly elsewhere.

The origins of other acronyms, links to software, and citations have been added.

2. Only two human subject data sets are used in this paper. This might be acceptable if we were convinced that the data was sufficiently rich, but this information is missing. The reader needs to know the full data set sizes: numbers of imaging sessions, full number of tasks, full number of stimuli in tasks, etc. The richness of data is a factor in ICA outcomes, so these aspects MUST be reported. Similarly, the method of data setup for ICA needs better description. Was the data analyzed in each subject ultimately as a single concatenated ensemble (and if so, of what matrix dimensions), or as a number of subset ensembles (and if so, of what dimensions, and how many such ensembles)? Because only two individuals' data were used, there remains the unfortunate possibility that there are structural individual-based idiosyncrasies in the data here that favor Infomax. I do not believe this likely, but it cannot be fully discounted given the data. Other research applying ICA or testing dimensionality separation methods including ICA has used larger numbers of individuals, and/or artificial data generation under constraints, to explore comparisons or reliability. Examples in the motor field are the Tresch, D'Avella, and Cheung papers in J Neurophysiology for synergy separation analyses, and Yang, Logan and Giszter in PNAS exploring SCC-like measures on motor synergy outcomes across individual animals. An aspect not explored by the authors of this paper (and given the lack of detail provided, this may or may not be possible in their data) that might be used to enrich the analysis they perform is bootstrapping or jack-knifing the data sets used, in addition to subsequent multiple iterations of ICA methods on each subject's subset data. At the very least, there needs to be discussion of the data limits in this paper resulting from only 2 subjects and only 1 subject per type of experiment, and the resulting caveats, i.e., the possibility of individual idiosyncrasy of the data favoring specific ICA methods.

Response: The fMRI data sets covered three task conditions: sensory stimulation, imagined movement, and a motor execution task. The imagined movement is in fact a type of cognitive task; we used these data to represent several different data types. Based on the suggestion, missing information such as the number of imaging sessions has been added.

More details of the method of data setup for ICA have been added. Except for the algorithm and the number of ICs, the default settings and parameters defined by the GIFT software were used during the analysis.

There was a single ensemble for each subject, since each data set contained only one subject. During the data reduction step for one subject and one session, data reduction is effectively disabled because the number of principal components extracted from the data equals the number of independent components, as described in the manual of the GIFT software; i.e., the matrix dimensions are not changed. (In a group analysis containing a number of subjects, the GIFT software can also concatenate the subjects in a group into a single ensemble.)

As you pointed out, some published studies, such as those of Tresch, D'Avella, and Cheung and of Yang, Logan, and Giszter, have successfully performed ICA with Infomax. We suppose that many study groups have found merits of Infomax empirically. The contribution of this study is, with respect to reliability, to show the superiority of Infomax and to identify the indices in which Infomax is superior to the other tested ICA algorithms at the single-subject level.

We used the RandInit mode (the algorithm starts by Randomizing different Initial values) to run ICASSO. This information has been added to the manuscript. The RandInit mode was chosen because 1) it uses the original data, whereas the data would be resampled in the bootstrapping method; and 2) it generates correlation coefficients with straightforward calculations, whereas some extra normalization is necessary for bootstrapping [J. Himberg, A. Hyvärinen and F. Esposito. NeuroImage 2004(3):1214-1222.]. If we run ICASSO 10 times, the algorithm (e.g., Infomax) runs 10 times; each time, it starts from different randomized initial conditions.
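The RandInit idea, i.e., repeated ICA runs on the same unresampled data with only the random initialization changing, can be sketched as follows. This is an illustrative Python sketch, not the GIFT/ICASSO implementation: scikit-learn's FastICA stands in for Infomax, and the data are synthetic.

```python
import numpy as np
from sklearn.decomposition import FastICA

# RandInit-style reliability check: run ICA repeatedly on the SAME
# data, changing only the random initialization (no bootstrap
# resampling of the data between runs).
rng = np.random.default_rng(0)
s = rng.laplace(size=(3, 2000))          # synthetic non-Gaussian sources
x = (rng.standard_normal((3, 3)) @ s).T  # mixed observations, (2000, 3)

runs = []
for seed in range(10):                   # 10 ICASSO-style repetitions
    ica = FastICA(n_components=3, random_state=seed, max_iter=1000)
    runs.append(ica.fit_transform(x).T)  # (3, 2000) estimated sources
```

The stability of the decomposition can then be judged by how well components agree across the 10 runs, e.g., via their mutual correlations.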

This study used fMRI data from two subjects. A discussion of this limitation has been added at the end of the manuscript. These fMRI data sets covered three task conditions, i.e., sensory stimulation, imagined movement, and a motor execution task. Single-subject ICA analysis is the foundation of group-level analysis. However, results from two single-subject analyses may not fully represent the performance of an ICA algorithm on all types of data. Whether the reliability of Infomax is superior to that of other algorithms needs to be further verified with other types of data.

3. The SCC method as described in section 2.4 is unclear. It is possible to do this in two ways: correlate the individual IC spatial components, picking best correlations (where pairings of an IC with other sets may not be unique; the same IC may be best in two or more correlations), or correlate the spatial matrix (e.g., in MATLAB using matperm, matcorr), in which case IC correlations will be unique, based on the matrix permutation used in the unique IC matchings. The authors used the former, non-unique method, I believe, but this might have biased results. If they did, with the possibility of non-unique matching of ICs, then I think the reader needs to know if there were non-unique best IC spatial correlations in each method, and if so, how many. This might be an additional metric on ICA algorithm quality.

Response: The SCC was the correlation coefficient between two IC spatial maps, that is, between two spatial matrices. When comparing two groups of ICs, A and B, the correlation coefficient between each IC in group A and each IC in group B was calculated.

The best-matched ICs must be the pair showing the highest correlation value. This best-matched pair is unique, since the corresponding r value is the highest one; we did not find any IC in one group that had two equal highest r values with two ICs in the other group.
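The best-match procedure described in this response can be sketched as follows. This is an illustrative Python sketch with a hypothetical function name, not the authors' GIFT/MATLAB code: each IC in one run is paired with its highest-|r| partner in the other run, and any doubly claimed partner is counted as a non-unique match, the case the reviewer asks about.

```python
import numpy as np

def scc_best_matches(maps_a, maps_b):
    """Pair each IC in run A with its best-correlated IC in run B.

    maps_a, maps_b: (n_ics, n_voxels) arrays of flattened spatial maps.
    Returns the cross-correlation matrix, the best-match index in B
    for each IC in A, and the number of ICs in B claimed by more than
    one IC in A (the "non-unique" case).
    """
    n_a = maps_a.shape[0]
    # Cross-correlation between every IC map in A and every map in B.
    corr = np.corrcoef(maps_a, maps_b)[:n_a, n_a:]
    best = np.abs(corr).argmax(axis=1)       # best partner in B per IC in A
    _, counts = np.unique(best, return_counts=True)
    n_nonunique = int(np.sum(counts > 1))    # B-side ICs matched twice or more
    return corr, best, n_nonunique
```

With well-separated components, `n_nonunique` is 0 and the pairing coincides with the unique permutation-based matching the reviewer mentions.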

In our data, we only found lower thresholds: 1) (section 3.4, Difference in SCC values between the most reliable results and the other results for each non-deterministic algorithm) For the sensory data, if an SCC value was ≤ 0.88 (found in the results of COMBI), there would be an IC (in the corresponding ICASSO result) that matched two IC maps of the most reliable result (i.e., presenting similar SCC values), and another IC (in the corresponding ICASSO result) that did not match any IC in the most reliable result. In other words, when compared with the most reliable result, the other nine results with an SCC ≤ 0.88 indicated unreliable performance. 2) (section 3.5, Comparing SCC values among the nine ICA algorithms) If there was an SCC value less than 0.669816, there would be at least one “unmatched” IC in the result of that algorithm, meaning that this IC could not exclusively match any IC map in the results of Infomax, which suggests poor spatial consistency between the results of that algorithm and Infomax. This information has been presented in the manuscript.

4. Finally, in relation to the SCC statistics used: Correlation, especially here, is non-gaussian if used directly. This may present difficulties in interpreting the statistics in tables 6 and 7 that could be avoided. Using the Fisher-Z transform (see Yang, Logan, Giszter, PNAS 2019 for example with Infomax ICA data stats) the correlations are transformed to normal variables, improving the interpretation of parametric methods.

Response: We used correlation coefficient values to compare different conditions. The correlation values can be transformed to normal variables with the Fisher-Z transform.

The population correlation coefficient is ρ, and the sample correlation coefficient is r. The sample is a part of the population.

When the Fisher-Z transformation is applied to the sample correlation coefficient r, the sampling distribution of the transformed variable is approximately normal. Without the Fisher transformation, the variance of r (between two variables X and Y) grows smaller as |ρ| gets closer to 1, and r is not normally distributed (see the publications listed at https://en.wikipedia.org/wiki/Fisher_transformation).

In our case, however, for each comparison between two groups of SCC values, a normal distribution of the transformed SCC values within each group cannot be assured. The reason is that the SCC values (r values) come from different populations. For example, an SCC value between the j-th IC in group A and the k-th IC in group B can be Fisher-Z transformed, which yields a normally distributed transformed value; that normality refers to the two populations from which the two samples (the j-th IC in group A and the k-th IC in group B) are drawn. But we also have SCC values between other pairs of ICs (e.g., the m-th IC in group A and the n-th IC in group B), which are samples from other populations. Therefore, the transformed r values may not be normally distributed when pooled into one group. For a large data set normality might hold approximately, but we have only a small sample size here (12 ICs for the sensory data and 8 ICs for the motor data). As a solution, we used the Kruskal–Wallis test in Tables 6 and 7. The Kruskal–Wallis test is a nonparametric test that does not rely on a normal distribution.
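The two analysis routes discussed here, the Fisher-Z transform and the nonparametric alternative actually used, can be illustrated with NumPy and SciPy. The SCC arrays below are hypothetical values for demonstration only, not the study's data.

```python
import numpy as np
from scipy import stats

# Fisher-Z transform: z = arctanh(r) maps correlations in (-1, 1)
# to an approximately normal variable (variance ~1/(n-3) for a
# sample of size n from a bivariate normal population).
r = np.array([0.88, 0.92, 0.95, 0.97])
z = np.arctanh(r)

# Nonparametric comparison of SCC values between algorithms, with
# no normality assumption (the route taken for Tables 6 and 7).
scc_algo1 = [0.95, 0.93, 0.97, 0.94]   # hypothetical SCC samples
scc_algo2 = [0.70, 0.72, 0.68, 0.75]
h_stat, p_val = stats.kruskal(scc_algo1, scc_algo2)
```

The Kruskal–Wallis test ranks all values jointly, so it remains valid even when the pooled SCC values mix samples from different populations.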

5. Equations would be helpful throughout the methods if very clearly written. The goal is reproducibility of methods.

Response: We supplemented an equation in section 2.3:

In ICASSO, the similarity between a pair of ICs (i and j) is quantified by the absolute value of their mutual correlation coefficient, σ_ij = |r_ij|. The clustering is then performed on distances between ICs, obtained by transforming the similarity matrix into a dissimilarity (distance) matrix: d_ij = 1 - σ_ij.

There are two commonly used methods to transform a similarity matrix into a distance matrix: d_ij = 1 - σ_ij or d_ij = 1/σ_ij. The supplemented information is therefore helpful if someone wants to verify the results, since the two methods generate different values. There are some long and complex equations in the cited literature; we do not present them here because they require detailed introduction and would distract from the topic of the manuscript.
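A minimal sketch (ours, not from the manuscript) of the two similarity-to-distance conversions, assuming the similarity matrix holds absolute correlation coefficients between ICs:

```python
import numpy as np

# Hypothetical symmetric similarity matrix: |correlations| between 4 ICs.
sigma = np.array([[1.0, 0.8, 0.3, 0.1],
                  [0.8, 1.0, 0.2, 0.4],
                  [0.3, 0.2, 1.0, 0.5],
                  [0.1, 0.4, 0.5, 1.0]])

# Method used in the response: d_ij = 1 - sigma_ij (zeros on the diagonal).
d_subtract = 1.0 - sigma

# Alternative mentioned: d_ij = 1 / sigma_ij (ones on the diagonal).
d_inverse = 1.0 / sigma

# The two methods give clearly different values off the diagonal,
# e.g. for sigma = 0.8: 1 - 0.8 = 0.2 versus 1 / 0.8 = 1.25.
```

This is why stating which conversion was used matters for anyone reproducing the clustering.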

6. The data sharing statement is insufficient, for this study an anonymized repository should be chosen - there are many.

Response: We have added the fMRI data to the Supporting information so that other groups can verify the results.

This paper is a solid contribution, but it is marred by omissions of detail and could be improved by attention to the technical points noted.

Response: Thanks very much again for the comments and suggestions. We hope the new version responds well to the concerns.

Attachments
Attachment
Submitted filename: Response to Reviewers.docx
Decision Letter - Kewei Chen, Editor

PONE-D-21-19322R1Comparing the reliability of different ICA algorithms for fMRI analysisPLOS ONE

Dear Dr. Wei,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Apr 30 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Kewei Chen, Ph.D

Academic Editor

PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: No

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This is a nice revision. Thank you for the work on the term ICASSO origin. This revision is coupled with a very responsive letter, but not all comments in the letter appear in the revised manuscript as I read it. Since readers may have the same concerns as reviewers it is crucial to include these in the manuscript, unless the full review history is also going to be published. In particular, the issue I could not find in the materials and methods:

It must be stated that the component correlation method used (in section "Comparing the reliability among four ICA algorithms with SCC") did not, in the present data, produce any double-use correlations, and that this was observed to be true, since it is not guaranteed by the method. I still wasn't clear whether the authors relied on other code here or checked this directly in their own work, and this should be clearly stated.

Line 331 - probably, term 'well-performing' is better than 'well-performed' here

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Revision 2

Thank you very much for the comments and suggestions. The explanations given in our previous letter have been added to the manuscript wherever they were not already included.

In section "Comparing the reliability among four non-deterministic ICA algorithms with SCC" (and the other sections where SCCs were calculated), the SCC was computed with the MATLAB function corrcoef. We have added this information to the manuscript.

In this section, the SCC values are used in two steps.

Step 1 is a simple maximum-picking process, introduced in the second paragraph of this section. It is not a statistical comparison; it simply selects the highest SCC value. As a result, we obtain a list of SCC values for each pair of groups.

Step 2 is a statistical comparison between the different lists of SCCs using the Kruskal–Wallis test, introduced in the last paragraph of this section.

Thus, this is not a double use of correlations. Step 1 provides values for statistical analysis in Step 2.
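The two steps described above can be sketched as follows (a Python approximation of the described procedure; the manuscript used MATLAB's corrcoef, and the IC spatial maps, group count, and sizes below are synthetic assumptions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def max_scc_per_pair(group_a, group_b):
    """Step 1: for each IC in group A, keep the highest |SCC| against group B."""
    best = []
    for ic_a in group_a:
        sccs = [abs(np.corrcoef(ic_a, ic_b)[0, 1]) for ic_b in group_b]
        best.append(max(sccs))
    return best

# Synthetic spatial maps: 3 groups of 8 ICs, each IC a 500-voxel map.
groups = [rng.normal(size=(8, 500)) for _ in range(3)]

# Step 1 output: one list of maximum SCCs per pair of groups.
lists = [max_scc_per_pair(groups[i], groups[j])
         for i, j in [(0, 1), (0, 2), (1, 2)]]

# Step 2: compare the lists with the nonparametric Kruskal-Wallis test.
h_stat, p_value = stats.kruskal(*lists)
```

Each correlation feeds the statistical test only once, via the list built in Step 1, which is the sense in which there is no double use.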

In the new version, we have added this information and highlighted the supplemented text. We hope the content is now expressed more clearly.

The word 'well-performed' has been changed to 'well-performing' based on your suggestion.

Attachments
Attachment
Submitted filename: Response to Reviewers.docx
Decision Letter - Pew-Thian Yap, Editor

Comparing the reliability of different ICA algorithms for fMRI analysis

PONE-D-21-19322R2

Dear Dr. Wei,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Pew-Thian Yap

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: (No Response)

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: No

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: I think the use of Fisher Z transforms on your data would be necessary for parametric tests. Your argument that the i-th and j-th correlation pairings potentially have different distributions is not particularly good, and is only meaningful insofar as your number of subjects is too small. The correlations have different normal distributions, as you correctly state, but the central limit theorem indicates a sum of normal distributions will tend to normal. The distribution problems you may have with normality in your study are primarily due to the very small subject sample size (2 subjects): the pooled distribution might be multimodal if variances are low for the different correlations across your analysis runs, because not enough subject variation is contributing. It would be better to directly acknowledge this as the basis for the nonparametric testing. Since you are choosing to use nonparametric rather than parametric tests here, the ordinal testing you used is unaffected by the z-transform, and the transform is not needed. The best study would have 6-10 subjects per group and use z-transformed correlation data.

I suggest a note to this effect, but it is your choice.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

**********

Formally Accepted
Acceptance Letter - Pew-Thian Yap, Editor

PONE-D-21-19322R2

Comparing the reliability of different ICA algorithms for fMRI analysis

Dear Dr. Wei:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Pew-Thian Yap

Academic Editor

PLOS ONE

Open letter on the publication of peer review reports

PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.

We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.

Learn more at ASAPbio.