Peer Review History
| Original Submission: July 14, 2020 |
|---|
PONE-D-20-20804

Equivalent current dipole sources of neurofeedback training-induced alpha activity through temporal/spectral analytic techniques

PLOS ONE

Dear Dr. Shaw,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Nov 06 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,
Alice Mado Proverbio
Academic Editor
PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

2. In your Methods section, please provide additional information about the participant recruitment method and the demographic details of your participants. Please ensure you have provided sufficient details to replicate the analyses such as: a) the recruitment date range (month and year), b) a description of any inclusion/exclusion criteria that were applied to participant recruitment, c) a table of relevant demographic details, d) a statement as to whether your sample can be considered representative of a larger population, e) a description of how participants were recruited, and f) descriptions of where participants were recruited and where the research took place.

3. Thank you for your ethics statement: 'The experimental procedure was reviewed and approved by a local research ethics committee. Informed consent was provided and signed by all participants before the experiment.' Please amend your current ethics statement to include the full name of the ethics committee/institutional review board(s) that approved your specific study. Once you have amended this/these statement(s) in the Methods section of the manuscript, please add the same text to the “Ethics Statement” field of the submission form (via “Edit Submission”). For additional information about PLOS ONE ethical requirements for human subjects research, please refer to http://journals.plos.org/plosone/s/submission-guidelines#loc-human-subjects-research.

4. Thank you for stating the following in the Acknowledgments Section of your manuscript: "The authors thank the following institutions for their research support: Ministry of Science and Technology, Taiwan (108-2410-H-006-112-MY3 and 108-2634-F-006-012), and the Mind Research and Imaging Center, National Cheng Kung University, Taiwan." We note that you have provided funding information that is not currently declared in your Funding Statement. However, funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form. Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement. Currently, your Funding Statement reads as follows: "The author(s) received no specific funding for this work." Please include your amended statements within your cover letter; we will change the online submission form on your behalf.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions.
Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly
Reviewer #2: Partly
Reviewer #3: No
Reviewer #4: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: No
Reviewer #3: I Don't Know
Reviewer #4: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No
Reviewer #2: Yes
Reviewer #3: Yes
Reviewer #4: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #3: Yes
Reviewer #4: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics.
(Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The manuscript titled “Equivalent current dipole sources of neurofeedback training-induced alpha activity through temporal/spectral analytic techniques” touches upon the evergreen subject of EEG neurofeedback. It comes with a reasonable number of participants (63) and an interesting attempt to localize the sources of the observed EEG activity. Although interesting, some major questions need to be answered before final conclusions can be drawn:

1. The training protocol was based on the alpha band, which is highly susceptible to various manipulations (Williams, 1977; Quandt et al. 2012), and the training protocol included instructions for participants. How many trainers provided those instructions, and how did the authors ensure mitigation of a trainer effect?
2. Were the participants’ personality profiles investigated (using psychological tests such as EPQ-R, NEO-FFI, …)?

The authors’ results are based only on the “responders” subgroup of the “alpha” (experimental) group:

1. How many “responders” exactly were finally identified?
2. Are there any specific features that distinguish responders from non-responders other than the increase in alpha amplitude?
3. Are there any relations between responders, non-responders and the instructions, trainers, or personality traits?
4. What would the result be for a comparison of the whole “alpha” (experimental) group vs. the “ctrl” (control) group?

The authors used a bipolar electrode montage, which is very rarely used in this kind of investigation and makes the study difficult to compare to similar ones:

1. Why did the authors decide to use a bipolar montage?
2. What would the results be when recalculated to the average reference?
3. What were the results of the whole-head analyses for the “non-responders”?

Source analyses:

1. Sources were calculated using the Curry software, whose procedure is not publicly available; could you provide a detailed methodology, please?
2. From my understanding, sources were calculated using ICA. Since ICA will return different clusters for each participant, how were the results averaged: calculated individually, by finding common clusters and then averaging, or were all subjects analysed together?
3. Without individual electrode positions and/or individual T1s, the localization results will not be accurate enough to allow anatomical considerations.

Other methodological questions:

4. The authors also investigated the delta, theta, and beta (13-30 Hz) bands; what were those results and how do they compare to the alpha-band results (especially the theta band, which could possibly drive the alpha one)?
5. How exactly was the ERSPA calculated?

Reviewer #2: The authors aimed to explore the neurophysiological sources of alpha increases induced by neurofeedback training. Therefore, they calculated the power increases during 12 sessions of alpha neurofeedback training using four different analysis techniques, namely the maximum peak average, the positive average, the negative average, and the event-related spectral perturbation average. All analysis methods showed a high agreement on localized dipole sources in the precuneus, posterior cingulate cortex, and middle temporal gyrus. The authors conclude that all measures are suitable for investigating dipole sources during neurofeedback training. The research question itself is an interesting one and worth investigating. However, it is unclear how the authors derived the research question and why the described methods were chosen to answer it. The method section lacks important descriptions, and it is not comprehensible why, and on the basis of which criteria, the authors ended up with a source localization analysis of 14 participants out of the 63 that were tested.
It is questionable why four analysis techniques were compared, considering that the research question focused on exploring the source regions of alpha activity, and it is not reported whether source localization would have been significant using solely one of these methods. Furthermore, the source localization results differ between the text and the table. Altogether, it remains unclear what the results contribute to the understanding and future conduct of alpha neurofeedback training and its effectiveness in enhancing cognitive performance. Therefore, the manuscript needs to be substantially revised before it can be considered for publication.

Major points:

1. It is not clear on what basis the research question was derived and why it is important to explore it. The introduction is rather confusing, and it is not convincingly reported why the authors investigated dipole sources in EEG activity that was measured after neurofeedback training and not during training. Transfer of neurofeedback training to resting-state activity in previous studies has been mixed, and EEG activity after training could simply reflect fatigue and not the training effect itself (as there was no feedback during the alpha blocks of the whole-head EEG). Moreover, it is not apparent why and how the authors chose the four analysis techniques and compared them. I was wondering why the authors did not use one of the more conventional source localization methods like beamforming or LORETA. The abstract even mentions that alpha neurofeedback training is important for memory function, but this focus (or generally transfer of neurofeedback training) is not explored at all (only mentioned in the discussion; please see point 6).

2. Many questions remain throughout the methods section that make me wonder whether the methods used were the most appropriate ones to answer the research question:
- What is the three-dipole model that was mentioned in the abstract but never explained in the manuscript?
- Was the neurofeedback training on fixed days, or were the three sessions per week conducted randomly?
- Why was feedback to EEG activity over central sites given during neurofeedback training? On what basis were these electrodes chosen? What does the chosen neurofeedback protocol mean for other alpha neurofeedback studies that give feedback to other electrode sites?
- Why was resting-state EEG only investigated for responders? How many participants were responders?
- Did the study have a different aim during conduction – why would you need the control group then?
- Did participants use strategies during resting-state (p. 5, lines 103-105), and if so, why?
- Was the frequency band for the control group different for every session, or did it repeat by chance?
- What were the criteria for artifact removal (p. 6, line 163)? How were eye and muscle artifacts controlled for during neurofeedback training? How much data had to be removed for analysis?
- What did participants do during the 5-minute breaks of the whole-head EEG measurement?
- What does a positive/negative alpha peak mean (p. 7, lines 171-178)? Would an alpha increase/decrease be a more fitting description?
- How many trials of the whole-head EEG measurement were used for each analysis method? Why did the number of trials differ for the four analysis methods (p. 16)?

3. Topographic results are reported only on a visual basis (p. 9)! Please report statistical analyses if you want to discuss differing topographies. Similarly, the source localization results were statistically compared between methods (pp. 10-11), but localized brain regions were not tested statistically against each other within the same method (e.g., is the source localization of the precuneus, PCC, and middle temporal gyrus detected by the positive average significantly greater than zero, or greater than the detection of other regions using the same method?).

4. The source localization results reported in the results section (p. 10) differ from the results displayed in Table 1 (p. 11). The authors state that 32%-84% of the participants showed dipoles in the reported regions of the precuneus, PCC, and middle temporal gyrus. However, the results in the text are bilateral and the results in the table are split for left and right brain regions. How can you exclude that a participant showing a dipole in the right precuneus didn’t also show a dipole in the left precuneus? The authors simply added the numbers of participants with left and right dipoles together. Looking at the results split for left and right brain regions, some regions barely showed a dipole (4%-16%, which corresponds to 1-4 participants).

5. Considering that 63 participants took part in the study in total, the source localization results rely on a very small number of participants (1-14 participants for the localized brain regions). What does that mean for the generalization of the results? I doubt that the effect sizes (which were not reported) are considerably large.

6. The authors tried to integrate their results into the current literature in the discussion. However, it is not clear how the source localization in the present study specifically contributes to the understanding of alpha neurofeedback training. Specifically, the conclusion about a “global alpha activity” (p. 17) that spreads from parietal to frontal regions is highly speculative and is not supported by the results (sources in parietal regions solely). Furthermore, the interpretation that the sources of alpha activity after neurofeedback training are regions important for memory because of the strategies that participants used lacks any evidential basis. If a conclusion about memory performance based on neurofeedback is drawn, the authors should systematically investigate transfer to memory tasks or explore the nature of the strategies used during training and the following whole-head EEG measurement.
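Reviewer #2's point 4 (adding left- and right-hemisphere participant counts together) can be made concrete with a small sketch. The data below are purely hypothetical, not the study's actual counts; the sketch only illustrates why summing per-hemisphere tallies overestimates the number of distinct participants whenever anyone shows bilateral dipoles:

```python
# Hypothetical dipole findings: participant ID -> hemispheres in which a
# precuneus dipole was found (illustrative only, not the study's data).
dipoles = {
    "P01": {"left", "right"},  # bilateral dipoles
    "P02": {"right"},
    "P03": {"left"},
    "P04": {"right"},
}

# Naive tally: add the per-hemisphere counts together
# (what the reviewer says was done for the reported percentages).
left_count = sum("left" in hemis for hemis in dipoles.values())
right_count = sum("right" in hemis for hemis in dipoles.values())
naive_total = left_count + right_count  # double-counts bilateral participants

# Tally of distinct participants showing a dipole in either hemisphere.
unique_total = sum(len(hemis) > 0 for hemis in dipoles.values())

print(naive_total, unique_total)
```

Counting distinct participant IDs (or reporting bilateral cases separately) avoids the inflation the reviewer describes.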
Reviewer #3: Dear authors, I am sorry to inform you that I did not recommend your study for publication. I realise that you performed an extensive study with 12 sessions of neurofeedback training, and your neurofeedback data look very promising (although you should use an identical instruction for the experimental and the control groups, to make sure the alpha increase originates from NFT and not from your instruction to think of positive memories, which in itself leads to relaxation and to increased alpha). I also really liked your camera installation during neurofeedback training; that is a great method! The main concern I have with your study is the attempt to perform EEG source localisation with a 32-electrode EEG cap. Here is a paper you may want to look at (https://www.sciencedirect.com/science/article/pii/S0165027015003064), in which EEG data were simulated and the validity of source localisation with different electrode densities was tested. Everything below 64 electrodes was simply inaccurate, and even 64 electrodes weren't good. Source localisation should not be performed with anything below 128 electrodes (and even then, T1 scans of each participant are usually used). If you want to use your dataset for future publications, please also make sure you present correct units in every figure and do not mix up amplitude and power in the manuscript. Best wishes and all the best, Marion Brickwedde

Reviewer #4: The present study aimed to identify sources of training-induced alpha activity through four different temporal/spectral analytic techniques, relating the effects of the training to ”memory function”. As I understood, the authors have already revised their paper. In my opinion, the authors did very good work, I guess also thanks to the valuable reviewers' feedback. The paper is now very well structured, with a suitably deep introduction and extensive, detailed discussions.
Undoubtedly, I believe some weaknesses still affect the work, above all the generic linking of alpha power in their study to memory functions without directly testing this link by means of a specific memory task of the kind used in memory studies in the cognitive neuroscience literature on this matter. Indeed, I was most surprised to find this reference in light of the kind of procedure to which the experimental participants were submitted, namely a neurofeedback training (NFT) to self-regulate their own brain activity. Indeed, rather than a mnemonic task, this can be referred to as learning to self-regulate one's own brain alpha activity by means of a progressive increase of individual cognitive control over the brain neural networks subserving alertness and the focusing of attention. However, I believe that this may not be hard to solve now. Indeed, I think that the hard work done so far by the authors deserves one more chance to achieve an acceptable level and be published. Indeed, the EEG literature is full of studies directly relating alpha power to the functions of alerting, orienting, and cognitive control characterizing the different neural networks of the visual selective attention system. Below, for instance, I report a short list of recent and less recent studies in the literature on alpha power and visual attention to be used as a starting point for a critical review of their findings and of those papers referred to by these studies:

Zani, A.; Tumminelli, C.; Proverbio, A.M. Electroencephalogram (EEG) Alpha Power as a Marker of Visuospatial Attention Orienting and Suppression in Normoxia and Hypoxia. An Exploratory Study. Brain Sci. 2020, 10, 140.

Misselhorn, J.; Friese, U.; Engel, A.K. Frontal and parietal alpha oscillations reflect attentional modulation of cross-modal matching. Sci. Rep. 2019, 9, 5030.

Rihs, T.A.; Michel, C.M.; Thut, G. Mechanisms of selective inhibition in visual spatial attention are indexed by alpha-band EEG synchronization. Eur. J. Neurosci. 2007, 25, 603–610.

Poch, C.; Carretie, L.; Campo, P. A dual mechanism underlying alpha lateralization in attentional orienting to mental representation. Biol. Psychol. 2017, 128, 63–70.

Foxe, J.J.; Snyder, A.C. The role of alpha-band brain oscillations as a sensory suppression mechanism during selective attention. Front. Psychol. 2011, 2, 154.

Kelly, S.P.; Lalor, E.C.; Reilly, R.B.; Foxe, J.J. Increases in Alpha Oscillatory Power Reflect an Active Retinotopic Mechanism for Distracter Suppression during Sustained Visuospatial Attention. J. Neurophysiol. 2006, 95, 3844–3851.

As I already wrote, I believe that if the authors are able to eliminate any reference to generic and abstract “memory functions” in the Abstract and in every other section of the paper, above all the Introduction and Discussion, and to provide a short critical discussion of their NFT alpha findings in light of the attentional findings advanced in the indicated studies, the paper will be more than worthy of publication. As a more specific point, please provide a very short explanation of the NFT procedure already in the Abstract.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No
Reviewer #3: Yes: Marion Brickwedde, PhD
Reviewer #4: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site.
Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
| Revision 1 |
|---|
PONE-D-20-20804R1

Equivalent current dipole sources of neurofeedback training-induced alpha activity through temporal/spectral analytic techniques

PLOS ONE

Dear Dr. Shaw,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses all the points raised during the review process. In particular, I require that you provide full analyses including the non-responders.

Please submit your revised manuscript by Jun 07 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Francesco Di Russo, Ph.D.
Academic Editor
PLOS ONE

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: (No Response)
Reviewer #3: (No Response)

**********

2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly
Reviewer #3: Partly

**********

3.
Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #3: I Don't Know

**********

4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No
Reviewer #3: No

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #3: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Dear Authors, I still have some concerns related to your manuscript:

1. In my opinion, only a comparison of the whole control group to all members of the experimental group is proof of a successful intervention. Does Fig 1 show all members of the alpha group or only the responders?
2. There is no point in investigating the whole-brain EEG of responders if you do not compare it to that of non-responders; on its own it does not explain the difference between the two.
3. In the questions about the other bands I was asking about possible training effects, that is, about comparing them between the control and experimental groups.
4. All methodological explanations should be included in the Methods section.

Reviewer #3: The authors used a 12-session alpha neurofeedback training to induce high-alpha activity in a group of responders. These individuals underwent a whole-head EEG recording and were asked to emulate the strategy used during neurofeedback training in order to produce high alpha activity. During this time, high alpha bursts were used to apply EEG source localisation to find the sources of NF-training-induced alpha power changes. I believe that the study design involves some major confounds, that this conclusion cannot be drawn so easily, and that these limitations must be clearly addressed in the discussion.

1. Systematic differences in study design between groups, unrelated to the condition differences:

a. The alpha group received different instructions than the control group (mental imagery, etc.).

b. The alpha group received verbal feedback on successful alpha events (verbal feedback has a strong motivational effect, which can strongly influence learning processes). In case the control group received the same verbal feedback for successful events in their frequency band, this point can be ignored.

c. "Not well trained" participants were removed from the alpha group, but no such treatment was applied to the control group. How many participants of the control group would have to be excluded applying the same criteria? This makes it impossible to conclude that the differences found between groups originate from NF training.

2. There is no proof that the localization of alpha power is related to NF training.

a. There is no comparison, in terms of their dipole reconstructions, between high alpha burst events in a baseline period before any training happened and successful alpha events after NF training. It is entirely possible they would be the same, and that NF training simply increases already present alpha oscillations targeted by electrodes over the training site.

b. The authors did not present a comparison between the fixation period of the whole-head EEG and the periods where participants had to copy their NF strategies. If there is no overall difference in alpha power between the two periods, how can it be inferred that this alpha activity has anything to do with NF training? What if the same source localisation strategy were applied to high alpha bursts in the fixation period? If the result was different, that would be more convincing. How many successful alpha events would there be in the fixation period if the same criterion (1.5-fold higher amplitude than the average of all the 1-s fixation EEGs) was applied?

3. The source localisation cannot be generalised to all NF alpha trainings.

a. Alpha power reported in previous studies generally increased over the sites it was trained on. In this study, a general inference is made about where training-induced alpha changes occur, but it is evident that this is heavily dependent on the training site.

4. Insufficient electrode numbers and missing T1 scans introduce the possibility of heavy mis-localization errors (up to 7 cm).

It is necessary that each of these limitations is clearly communicated in the discussion. Some further details that need adjustment are outlined below:

1. "Previous studies have shown that training-induced alpha activity appeared diversely over the parieto-occipital [2], fronto-parietal [3], or frontal regions [4] through topographic EEG analysis … Therefore, estimating the properties of the internal localized sources of the topography was a valuable way to explore generators of training-induced alpha activity."
The first two studies only show that alpha power increased over exactly the sites that were trained on (the first study used P3, P4, Pz, O1 and O2 as training sites, and the second study used F3, Fz, F4, P3, Pz and P4 as training sites). In the third study the authors cite, alpha power increase was trained over POz in 2 participants and was successful over POz. In three additional participants, the frequency of alpha oscillations was trained to be sped up. These 3 participants showed increased frequency of alpha oscillations over frontal areas, that means a faster alpha rhythm. This did not refer to power changes. Therefore, it is difficult to argue that in this study, the authors explore the generators of training-induced alpha activity. These studies clearly show that training-induced alpha power changes depend on training site. The authors can therefore only conclude results for the specified training site. 2. Before NFT, researchers provided constructive strategies for successful alpha activity training reported by participants in our previous study, for example, pleasant or relaxing situations such as reading or wandering this is a confound which needs to be discussed. This means that the instructions between conditions was not equal. It means that these strategies could have led to alpha increase independent of Neurofeedback training or EEG. It would only be no problem, if the control group received the same instructions. 3. In the resting period, the researchers use information on the cumulative waveform, e.g., the timestamps of high EEG amplitudes to help participants recall what kind of strategy they used to achieve a high amplitude. was the same feedback given for the control group? Verbal feedback can have a tremendously motivating and positive effect on learning, which again is a confound for the NF training compared to control. 4. The 4-Hz bandwidth was randomly selected from the range of 7- 20 Hz. 
“Thus, the Ctrl group received various kinds of 4-Hz amplitudes during each session [16].” How many participants in the control group received feedback that included 10 Hz in their training range?

5. “The experimental procedure of the whole-head EEG recording involved 5 runs. Each run interleaved four fixation and four alpha blocks (50 s each). In a fixation block, the ‘responder’ was asked to keep his or her eyes open and not to engage in any training event. In an alpha block, the ‘responder’ was asked to produce training-induced alpha activity. There was a 5-min rest period between two consecutive runs.” As noted above, I find it important to report whether the fixation and the alpha-increase blocks differed from each other in overall alpha power (not just looking at successful alpha events). If not, can the authors be certain their findings refer to the dipoles of NF training-induced alpha power changes, rather than just to high alpha power events (which also occur naturally)? Is the dipole localization found here any different from what has been found for resting-state alpha oscillations?

6. “All of the successful alpha events contaminated with blinks, eye-movement artifacts (> 65 μV) or remarkable muscle activity were automatically removed and were further processed through four different averaging methods.” I assume the authors mean that the remaining events were further processed, because as written it sounds as if the removed ones were processed.

7. “Six participants in the Alpha group who were not well trained were excluded from further analyses.” Please state the exact exclusion criteria; “not well trained” is very vague. If the authors applied the same criterion to the Ctrl group, how many participants would be excluded?

8. How many trials were averaged for each method? Please state the min, mean and max in the text.

9. The electrode number is still a problem and should be discussed.
The authors refer to the paper by Rodin & Rodin, 1995, where a source localization was conducted with 19 electrodes. There is no proof in that paper that this source localization was valid. The second paper referred to, by Sohrabpour, concerns the effect of electrode number on epileptic source localisation. Importantly, epileptic activity cannot be compared to healthy brain activity, as its strength largely exceeds that of normal activity, reducing the error margin of source localization. Additionally, the patients studied there had seizures so severe that intracranial electrodes were implanted and surgery was conducted. Furthermore, in that study MRI scans of the individual participants were used (which is not the case in the study under review). Also, Sohrabpour et al. find similar results for the source localisation; however, the worst localisation is achieved using 32 electrodes. Excerpt from the paper: “In other words, the most dramatic decrease in localization error can be seen when going from 32 electrodes to 64 electrodes. The average localization error improves by 4 mm when going from 32 electrodes to 64 electrodes and improves by 1.3 mm when going from 64 electrodes to 96 electrodes, …”

In addition to the computer simulation I cited before, I also want to refer to a review paper on EEG source imaging by Michel & Brunet (https://doi.org/10.3389/fneur.2019.00325): “What is the minimal number of electrodes needed for reliable source localization? This question is often asked, particularly from the clinical community that intends to apply EEG source localization to the EEG that is routinely recorded with the standard 10-20 system, i.e., with only 19 electrodes.
Several studies have demonstrated that this low number not only leads to blurring of the solution, but also to incorrect localization.” One study (49) compared the effective spatial resolution of different electrode montages (19-129 electrodes) and concluded that “the smallest topographic feature that can be resolved accurately by a 32-channel array is 7 cm in diameter, or about the size of a lobe of the brain.” All of this refers to source localisation performed with T1 scans. Therefore, I believe it is not enough to state that the signal-to-noise ratio could be improved using T1 scans and higher-density EEG; rather, it should be discussed as a limitation of the study that the low-density EEG used introduces blurring and localization error, which in some cases can be substantial.

********** 7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy. Reviewer #1: No Reviewer #3: No [NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.] While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool.
If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step. |
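The reviewer's request above — applying the manuscript's "successful alpha event" criterion (a 1-s epoch whose amplitude exceeds 1.5× the mean of all 1-s fixation epochs) to the fixation period itself — can be illustrated with a minimal sketch. Everything here is simulated and hypothetical: the function name, the amplitude distributions, and the epoch counts are illustrative assumptions, not taken from the study.

```python
import random
import statistics

# Hypothetical sketch of the control analysis the reviewer asks for:
# count epochs exceeding 1.5x the mean fixation amplitude, both in the
# fixation period and in the alpha blocks. All data are simulated.

def count_successful_events(epoch_amplitudes, baseline_mean, factor=1.5):
    """Count epochs whose amplitude exceeds factor * baseline_mean."""
    threshold = factor * baseline_mean
    return sum(1 for a in epoch_amplitudes if a > threshold)

random.seed(0)
# Simulated per-epoch alpha amplitudes (arbitrary units) for 200 fixation
# epochs and 200 alpha-block epochs; distributions are invented.
fixation = [random.gammavariate(4.0, 2.5) for _ in range(200)]
alpha_block = [random.gammavariate(4.0, 3.5) for _ in range(200)]

baseline = statistics.mean(fixation)
n_fix = count_successful_events(fixation, baseline)
n_alpha = count_successful_events(alpha_block, baseline)

# If n_fix is non-negligible, "successful" events also occur spontaneously
# during fixation, which is the confound the reviewer raises.
print(n_fix, n_alpha)
```

A skewed amplitude distribution will generally place some fixation epochs above 1.5× its own mean, which is why the reviewer argues that comparing event counts (and dipole fits) between the two periods is needed before attributing the events to NF training.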
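The reviewer's question about how many Ctrl participants' randomly selected 4-Hz bands included 10 Hz can also be made concrete. Assuming — and this sampling scheme is an assumption, not stated in the manuscript — that the band's lower edge is drawn uniformly over 7–16 Hz so that the band stays within 7–20 Hz, a short sketch estimates the coverage probability:

```python
import random

def band_contains(lower_edge, target=10.0, width=4.0):
    """True if the band [lower_edge, lower_edge + width] covers target."""
    return lower_edge <= target <= lower_edge + width

# Monte Carlo estimate under the assumed uniform sampling of the lower edge.
random.seed(0)
draws = [random.uniform(7.0, 16.0) for _ in range(100_000)]
p = sum(band_contains(lo) for lo in draws) / len(draws)

# Analytically, lower edges in [7, 10] cover 10 Hz, i.e. 3/9 = 1/3 of draws,
# so roughly a third of Ctrl participants would be expected to receive
# feedback bands that include 10 Hz under this assumption.
print(round(p, 2))
```

This is only a plausibility estimate; the actual fraction depends on how the bands were drawn (e.g., integer-valued edges would change the arithmetic), which is exactly what the reviewer asks the authors to report.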
| Revision 2 |
|
PONE-D-20-20804R2 Equivalent current dipole sources of neurofeedback training-induced alpha activity through temporal/spectral analytic techniques. PLOS ONE Dear Dr. Shaw, Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised by Reviewer 1 during the review process. Please submit your revised manuscript by Sep 12 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter. If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols. We look forward to receiving your revised manuscript. Kind regards, Francesco Di Russo, Ph.D. Academic Editor PLOS ONE Journal Requirements: Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice. [Note: HTML markup is below. Please do not edit.] Reviewers' comments: Reviewer's Responses to Questions Comments to the Author 1. 
If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation. Reviewer #1: All comments have been addressed Reviewer #3: (No Response) ********** 2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented. Reviewer #1: Partly Reviewer #3: No ********** 3. Has the statistical analysis been performed appropriately and rigorously? Reviewer #1: Yes Reviewer #3: Yes ********** 4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified. Reviewer #1: Yes Reviewer #3: Yes ********** 5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. 
Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here. Reviewer #1: Yes Reviewer #3: Yes ********** 6. Review Comments to the Author Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters) Reviewer #1: Please specify the type of post hoc tests and their statistics used in the results section "EEG data for NFT" - do they confirm the F-test results? If so, for which data points? Please provide statistics for the full groups in Figs 1 and 2 and correct the figures accordingly. Please also cite the literature behind your statement "The benefit of the bipolar recording was to reduce the possible artifacts of motion or eye blinks" - I did not find any example. Instead, a bipolar setting is used to monitor eye blinks - please check: https://doi.org/10.1007/s10548-019-00707-x Reviewer #3: (No Response) ********** 7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy. Reviewer #1: No Reviewer #3: No [NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]
|
| Revision 3 |
|
Equivalent current dipole sources of neurofeedback training-induced alpha activity through temporal/spectral analytic techniques. PONE-D-20-20804R3 Dear Dr. Shaw, We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org. If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org. Kind regards, Daqing Guo Academic Editor PLOS ONE Additional Editor Comments (optional): Reviewers' comments: Reviewer's Responses to Questions Comments to the Author 1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation. 
Reviewer #1: All comments have been addressed Reviewer #5: All comments have been addressed ********** 2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented. Reviewer #1: Partly Reviewer #5: Yes ********** 3. Has the statistical analysis been performed appropriately and rigorously? Reviewer #1: Yes Reviewer #5: Yes ********** 4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified. Reviewer #1: No Reviewer #5: Yes ********** 5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here. Reviewer #1: Yes Reviewer #5: Yes ********** 6. Review Comments to the Author Please use the space provided to explain your answers to the questions above. 
You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters) Reviewer #1: I have no more questions assuming that the experimental group in figures 1 and 2 includes all participants including non-responders Reviewer #5: (No Response) ********** 7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy. Reviewer #1: No Reviewer #5: No |
| Formally Accepted |
|
PONE-D-20-20804R3 Equivalent current dipole sources of neurofeedback training-induced alpha activity through temporal/spectral analytic techniques. Dear Dr. Shaw: I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department. If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org. If we can help with anything else, please email us at plosone@plos.org. Thank you for submitting your work to PLOS ONE and supporting open access. Kind regards, PLOS ONE Editorial Office Staff on behalf of Dr. Daqing Guo Academic Editor PLOS ONE |
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.