Peer Review History
| Original Submission (March 20, 2020) |
|
PONE-D-20-08099 Applying time series analyses on continuous accelerometry data – a clinical example in older adults with and without cognitive impairment PLOS ONE Dear Dr. Rackoll, Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. Please very carefully address the major concerns raised by the reviewers, in particular with regard to cohort sizes, statistics, methodology, etc. Please submit your revised manuscript by Jul 24 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter. If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols We look forward to receiving your revised manuscript. Kind regards, Henrik Oster, Ph.D. Academic Editor PLOS ONE Journal Requirements: When submitting your revision, we need you to address these additional requirements. 1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf 2. Thank you for stating the following in the Acknowledgments Section of your manuscript: 'This work was supported by grants from the Deutsche Forschungsgemeinschaft (Fl 379-10/1; Fl 379-11/1, and DFG-Exc 257).' We note that you have provided funding information that is not currently declared in your Funding Statement. However, funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form. Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement. 
Currently, your Funding Statement reads as follows: 'The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.' Additional Editor Comments (if provided): Dear Dr. Rackoll, as you can see from the reviews several fundamental concerns were raised regarding the validity of your study. Please very carefully address all the major concerns (such as sample size, statistics, methodology) in your revision. [Note: HTML markup is below. Please do not edit.] Reviewers' comments: Reviewer's Responses to Questions Comments to the Author 1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented. Reviewer #1: Yes Reviewer #2: No Reviewer #3: No ********** 2. Has the statistical analysis been performed appropriately and rigorously? Reviewer #1: Yes Reviewer #2: I Don't Know Reviewer #3: No ********** 3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified. Reviewer #1: Yes Reviewer #2: No Reviewer #3: No ********** 4. 
Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here. Reviewer #1: Yes Reviewer #2: Yes Reviewer #3: Yes ********** 5. Review Comments to the Author Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters) Reviewer #1: General: The authors are applying a relatively novel analysis, function-on-scalar regression, to differentiate changes in diurnal activity between older adults with and without MCI. The authors note that this method is an improvement from aggregated approaches of determining links between activity and covariates such as demographic and health measures. Overall, this is a sound study and is utilizing an appropriate model to extend our knowledge of activity based measures in the presence or absence of disease in older adults. Major Points The type of basis function used may improve or hurt prediction accuracy. You should compare model accuracy of different basis functions through either cross-validation for each type of basis function or inspection of the residual curves produced by the model and then determine which is the better fit. On line 206 you state that data should be aggregated into at least 5 minute epochs which is what you chose moving forward. What is the rationale for not choosing a smaller epoch of less than 5 minutes? On line 243 you state that 18 basis functions seemed to be a good compromise between smoothing out noise versus preserving activity patterns. 
Since there is little previous research to support this methodology, the determination of the number of basis functions should be driven by model comparison (i.e., comparison of the residual curves produced by each model to detect bias in prediction as a result of the number of basis functions). It is difficult to know if 18 basis functions is best without any tests of cross validation or other model comparison metrics. Are the results in table 1 reported with an unadjusted p-value? Isn’t each row of this table an independent test, thus inflating your chance of a type 1 error? In reviewing the cited Goldsmith paper, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4987214/, shouldn’t the Function-on-Scalar Regression (FoSR) give you a global coefficient and p-value for each covariate used in your model? The stated purpose of this paper should be to compare results of multiple linear regression using summed activity levels to results of the FoSR in terms of global beta and p-values, just like that referenced Goldsmith paper. See table 1 from the linked study above. This would fulfill the primary purpose of this paper: to demonstrate the sensitivity of the FoSR analysis compared to more traditional regression approaches. Also, does your FoSR model only include covariates of group (MCI vs HOV) and age? In line 279 you state that you only control for age to prevent overfitting, but how do you know that you have overfit the model in this context without doing some level of cross validation, which is absent here? And again, in reference to the previous Goldsmith paper, when you are plotting relative activity in figures 1 or 2, is this similar to their plotting of the coefficient function (i.e., the change in the prediction of activity level as a function of the covariate across time)? See figures 1 and 2 in Goldsmith 2016.
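Reviewer #1's suggestion of choosing the number of basis functions by model comparison could be sketched as follows. This is a hypothetical illustration on synthetic data, not the authors' pipeline: it fits a Fourier basis to a 24-hour profile by least squares and scores candidate basis counts by k-fold cross-validation, with the lowest held-out error indicating the better bias/variance trade-off.

```python
import numpy as np

def fourier_design(t, n_basis):
    """Design matrix: intercept plus sine/cosine pairs over one 24-h cycle (t in [0, 1))."""
    cols = [np.ones_like(t)]
    for k in range(1, n_basis // 2 + 1):
        cols.append(np.sin(2 * np.pi * k * t))
        cols.append(np.cos(2 * np.pi * k * t))
    return np.column_stack(cols)[:, :n_basis]  # truncate the last pair if n_basis is even

def cv_error(t, y, n_basis, n_folds=5, seed=0):
    """Mean squared held-out error for a least-squares fit with n_basis functions."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(t))
    X = fourier_design(t, n_basis)
    errs = []
    for test in np.array_split(idx, n_folds):
        train = np.setdiff1d(idx, test)
        beta, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        errs.append(np.mean((y[test] - X[test] @ beta) ** 2))
    return float(np.mean(errs))

# Synthetic 24-hour activity profile in 5-minute epochs (288 points).
t = np.arange(288) / 288.0
truth = 2 + np.sin(2 * np.pi * t) + 0.5 * np.cos(4 * np.pi * t)
y = truth + np.random.default_rng(1).normal(0, 0.3, size=t.shape)

# Score a few candidate basis counts, including the 18 used in the manuscript.
scores = {k: cv_error(t, y, k) for k in (5, 9, 18, 36)}
```

Residual-curve inspection, as the reviewer also suggests, would compare `y - X @ beta` across candidate counts instead of (or alongside) the cross-validated error.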
I continue to bring up the Goldsmith paper because it would be in the best interest of this paper to demonstrate the strength of this method based on how it has been used previously to help identify differences and similarities in your approach and interpretation compared to how it was used previously. In figure 1 you should demonstrate how the predicted values of the FoSR model differ from a simple averaging of the data in both the absolute and relative activity counts between groups. This would provide more face validity on the conceptual basis of how FoSR captures activity patterns in this sample of data that averaging and summing methods do not. In the chronotype analysis are you still using FoSR to determine differences in activity level as a function of strata and group? It does not explicitly say if this is the case. I believe you say it on line 404 but it should be stated earlier. You may also wish to include the experiment that each individual comes from as a random variable/effect which may also help control for any variance in this study since this sample is a pooled group from two previous projects. Minor Points: Line 258 do you mean that you inputted multiple 24-hour activity records from one individual into your model? Line 276 change epoque to epoch Line 309 should OLD be HOV? Line 335 change hours to hour In figure caption for Figure 1 include what the black line in B represents. Reviewer #2: The data and question could be used to write a useful paper but this one tries to do too much with too little and isn't careful about how it is done. The motivation is not appropriate for the question, the paper contains irrelevant material, the analytic sample is tiny and extremely selected, so I place no confidence in the results. 
Reviewer #3: Rackoll and colleagues describe the results of a “… time series analyses on continuous accelerometry data – a clinical example in older adults with and without cognitive impairment” and find that patients with mild cognitive impairment show an elevation in activity levels, specifically in the morning and the afternoon. The first sentence of the MS is: “Current analysis approaches of accelerometry data use sum score measures which do not provide insight in activity patterns over 24 hours, and thus do not adequately depict circadian activity patterns”. What are the authors talking about? What is meant by “Current analysis approaches”? Do the authors refer to commercial applications that come with the accelerometers, or the read-outs of smart watches? Whatever is meant by this introductory sentence, it is far from reality and shows that the authors are not familiar with the field of circadian research. Time-series analysis of circadian data was in use decades before accelerometers were even applied in human studies, for example by Sokolove in the 1970s. But even today, and specifically concerning actimetry in humans, the search – “time series analysis” circadian human actimetry – yields more than 350 papers, 16 alone in 2020. The top hit in relevance is a paper called “Multiscale adaptive analysis of circadian rhythms and intradaily variability: application to actigraphy time series in acute insomnia subjects”, which has even been published in PLOS ONE. I am willing to bet that all of these papers go beyond “sum score measures”. The authors should look at this paper as an example of what can be done when analysing “continuous accelerometry data”. It seems that the authors use FoSR similarly to how some people use SPSS, i.e., without actually looking at the data. The figures the authors come up with are simple 24-hour averages as have been used in analyses of human actimetry for decades.
It is simply not enough to state that there are activity level differences between the groups. Actimetry can do so much more, can dig so much deeper than the analysis presented here. The first step of every detailed analysis of circadian long-term time series is making double plots, so that one can see how the activity is generally spread over every day’s 24 hours. The PLOS ONE paper mentioned above goes into “intradaily variability”, which could well be at the basis of the activity level differences. We have no idea how dispersed or consolidated the daily activity profiles are in the participants of the two groups, and we have no idea when the participants apparently sleep (which could be assessed by looking at the data). Figure 1a should also be double-plotted, since this allows a proper representation of sleep. Even a relatively short recording of a week allows one to determine a mid-trough time such as “L5” (often used in time series analysis of human actigraphy and mentioned in the present MS). These individual phases would have been much more appropriate for use in a normalisation process than the MEQ questionnaire. It is also not clear whether individual daily profiles were normalised to individual mean levels (deviations from daily means or centered moving means). If one wants to make statements about specific times of day where activity levels differ between groups, such a step would be essential. If the only result that came out of this “novel” FoSR analysis is the vague activity level difference that can be seen in Fig 1a, then the field does not need this analysis. This is a shame, because we need much more good analysis of actimetry data from every kind of patient, especially in psychiatry and for higher age-groups. I suggest the authors team up with people who have long-term experience in circadian analysis of activity recordings and milk their data again with all the insights available for this type of analysis. ********** 6. 
PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy. Reviewer #1: Yes: Andrew Hooyman Reviewer #2: No Reviewer #3: No [NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.] While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
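On Reviewer #3's point that even a week of recording permits individual phase measures, the commonly used L5 measure (the least active five consecutive hours of the averaged 24-hour profile) might be sketched as follows. The profile, epoch length, and values below are hypothetical; a real pipeline would first average across recording days and handle non-wear time.

```python
import numpy as np

def l5_onset(counts, epochs_per_hour=12):
    """Start index and mean level of the least-active 5-hour window,
    treating the averaged 24-hour profile as circular."""
    window = 5 * epochs_per_hour
    x = np.concatenate([counts, counts[:window]])  # wrap across midnight
    sums = np.convolve(x, np.ones(window), mode="valid")[: len(counts)]
    start = int(np.argmin(sums))
    return start, float(sums[start] / window)

# Hypothetical averaged profile: quiet from 01:00 to 06:00, active otherwise.
profile = np.full(24 * 12, 200.0)   # 5-minute epochs, 12 per hour
profile[1 * 12 : 6 * 12] = 10.0     # epochs covering 01:00-06:00

start, level = l5_onset(profile)
print(start // 12, level)  # hour at which the least-active 5 h begin, and its mean level
```

The returned onset is one candidate for the individual phase reference the reviewer proposes using in place of the MEQ questionnaire.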
|
| Revision 1 |
|
PONE-D-20-08099R1 Applying time series analyses on continuous accelerometry data – a clinical example in older adults with and without cognitive impairment PLOS ONE Dear Dr. Rackoll, Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. ============================== Please carefully address the concerns raised by the reviewer. ============================== Please submit your revised manuscript by Feb 18 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter. If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols We look forward to receiving your revised manuscript. Kind regards, Henrik Oster, Ph.D. Academic Editor PLOS ONE [Note: HTML markup is below. Please do not edit.] Reviewers' comments: Reviewer's Responses to Questions Comments to the Author 1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation. Reviewer #1: (No Response) ********** 2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented. Reviewer #1: Partly ********** 3. Has the statistical analysis been performed appropriately and rigorously? Reviewer #1: No ********** 4. Have the authors made all data underlying the findings in their manuscript fully available? 
The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified. Reviewer #1: Yes ********** 5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here. Reviewer #1: Yes ********** 6. Review Comments to the Author Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters) Reviewer #1: The purpose of this paper is to use Function on Scalar Regression to determine if activity levels differ between healthy older volunteers and individuals with Mild Cognitive Impairment at different times of the day. I believe the authors have done a good job of addressing my initial comments. However, further inspection of the cited research on FOSR and the code provided raises new concerns. In my second comment I asked the authors to provide a rationale for why they use a 5-minute epoch, which they have now provided. 
Looking at the original Goldsmith paper (11), their rationale for epoch length is not based on activity patterns but rather on achieving a normal distribution of count data at each time point. This comment by Goldsmith represents an important statistical consideration, and the authors should confirm the same is true for their sample given the smaller sample size used here compared to Goldsmith (11). Unintended outliers that are generated at certain timepoints due to a selected epoch length may skew results of the FOSR analysis. You state that you log transform the data to correct for skew, but that may be an unnecessary step if the right epoch is chosen. Also, you state in response to comment 1 that your focus is on longer phases of activity over the day, yet you choose a shorter epoch than previous studies; see Goldsmith (11). In response to comment 3 you state that the design of your wavelet function is based on your hypothesis that MCI patients would have higher activity at night and lower activity during the morning compared to HOV. This is not your hypothesis in the manuscript, however. On line 110 you make a more general hypothesis that your algorithm will detect differences in activity in both timing and magnitude between the two groups. Nowhere in the introduction do you state any evidence on how circadian rhythms would be disrupted in terms of specific time of day. If there is evidence, it needs to be stated. Your hypothesis would also need to be updated in the manuscript. It is wearisome that you use one hypothesis to justify a methodological decision in the comments but then state a different hypothesis in the manuscript. More to the point, your rationale for using a wavelet function is based on research that uses this type of function on time series data, but not necessarily research that has used FOSR. Goldsmith (11) provides all the code necessary to use FOSR as it was applied in their paper, which seems to be the fundamental goal of this paper, i.e. 
use FOSR to determine differences in activity patterns between MCI and HOV volunteers. However, the Goldsmith R package, refund, does not include a function to use wavelets for smoothing data, which is why I believe you go the route of using a linear mixed effects model instead. Ultimately, this smoothing step is of huge importance in the application of FOSR, where the level of smoothing may produce differences between groups that do not necessarily exist. You justify the use of the wavelet function based on previous research using it on time series data, but has it ever been used on a geriatric population? Or has the number of basis functions been validated to represent true activity versus noise? Obviously, a wavelet function with X number of basis functions may be more accurate for one group or individual than another, for example, 36 basis functions for kids versus 9 for older adults. If you have no direct evidence validating the application of wavelets with a specific number of basis functions to the activity levels of the groups investigated here, I would recommend dropping this smoothing step altogether. Alternatively, I recommend that the authors utilize the Goldsmith package, refund, to implement FOSR as it has been used in the past, thus making the methods of this paper more directly comparable to the research they claim to base the current results on. Additionally, the use of the refund package would allow for more direct model comparison between models that include or exclude the covariate of group while controlling for age and sex. Code to do all this can be found here: https://jeffgoldsmith.com/papers.html. CRAN for refund here: https://cran.r-project.org/web/packages/refund/refund.pdf. The number of basis functions used in your wavelet transformation also dictates your capability to determine statistical significance at different times, based on your use of a Bonferroni correction to account for multiple comparisons. 
Why not use a false discovery rate correction instead? Applying Bonferroni based on the number of basis functions seems to make it either too conservative or too liberal and has a direct impact on your capability to determine when the activity patterns differ between groups. Also, the refund package uses an FOSR function that is Bayesian and therefore utilizes credible intervals to make determinations of significance rather than p-values. The authors also never run a multiple linear regression on the aggregate activity data. In the introduction the authors state that this method is ineffectual, but they never perform it to actually demonstrate that this is the case. MINOR COMMENTS Change hours to hour, line 110. Be sure to cite the NHANES cohort, line 174. Please cite who suggests the use of cubic splines or wavelet transformations for actigraph data, line 206. ********** 7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy. Reviewer #1: Yes: Dr. Andrew Hooyman [NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.] While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. 
Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step. |
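Reviewer #1's suggestion above, of a false discovery rate correction in place of Bonferroni, can be illustrated with a small self-contained sketch. The p-values below are made up for illustration and do not come from the manuscript; the point is how the Benjamini-Hochberg step-up rule typically rejects more of the per-timepoint tests than Bonferroni at the same alpha.

```python
import numpy as np

def bonferroni(pvals, alpha=0.05):
    """Reject H0 where p < alpha / m (family-wise error control)."""
    p = np.asarray(pvals, dtype=float)
    return p < alpha / len(p)

def benjamini_hochberg(pvals, alpha=0.05):
    """Reject H0 for the largest k with p_(k) <= (k/m) * alpha (FDR control)."""
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    thresh = alpha * np.arange(1, m + 1) / m
    below = p[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])   # largest index meeting the step-up rule
        reject[order[: k + 1]] = True
    return reject

# Hypothetical per-timepoint p-values from 18 group contrasts.
p = np.array([0.001, 0.004, 0.008, 0.012, 0.02, 0.03] + [0.2] * 12)
print(bonferroni(p).sum(), benjamini_hochberg(p).sum())  # prints: 1 3
```

Here Bonferroni keeps only the single smallest p-value, while Benjamini-Hochberg retains three, illustrating the power difference the reviewer alludes to.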
| Revision 2 |
|
Applying time series analyses on continuous accelerometry data – a clinical example in older adults with and without cognitive impairment PONE-D-20-08099R2 Dear Dr. Rackoll, We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org. If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org. Kind regards, Henrik Oster, Ph.D. Academic Editor PLOS ONE Additional Editor Comments (optional): Reviewers' comments: Reviewer's Responses to Questions Comments to the Author 1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation. 
Reviewer #1: All comments have been addressed ********** 2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented. Reviewer #1: Yes ********** 3. Has the statistical analysis been performed appropriately and rigorously? Reviewer #1: Yes ********** 4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified. Reviewer #1: Yes ********** 5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here. Reviewer #1: Yes ********** 6. Review Comments to the Author Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. 
(Please upload your review as an attachment if it exceeds 20,000 characters) Reviewer #1: I commend the authors on their dedication to this manuscript! I wish them the best of luck on their continued research. ********** 7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy. Reviewer #1: Yes: Andrew Hooyman |
| Formally Accepted |
|
PONE-D-20-08099R2 Applying time series analyses on continuous accelerometry data – a clinical example in older adults with and without cognitive impairment Dear Dr. Rackoll: I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department. If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org. If we can help with anything else, please email us at plosone@plos.org. Thank you for submitting your work to PLOS ONE and supporting open access. Kind regards, PLOS ONE Editorial Office Staff on behalf of Prof. Henrik Oster Academic Editor PLOS ONE |
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.