Peer Review History
| Original Submission: December 17, 2021 |
|---|
|
PONE-D-21-39830
Visual tracking assessment in a soccer-specific virtual environment: a web-based study
PLOS ONE

Dear Dr. VU,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Two expert reviewers have assessed your work. Both reviewers find the topic of your study to be interesting, yet they also raise some critical issues. The reviewers ask for missing methodological details which are important to include and discuss. Both reviewers also raise important questions regarding the methodological limitations of the setup (the effect of allowing viewpoint changes) and the interpretation of your analyses (particularly the exploratory PCA and HCPC analyses). At this point it is unclear to me whether it is possible to draw solid conclusions from your work, and thus whether your paper can ultimately be published. Nevertheless, I would like to give you the chance to defend or modify your methodological choices. If you decide to revise and resubmit your potentially interesting study, I suggest you pay careful attention to replying convincingly to all reviewer comments.

Additionally, I point out that the raw data underlying your results need to be made fully available. You may include these data in the supplementary files, or better still you could upload your raw data and analysis scripts to a public data repository (e.g. Zenodo) and include the doi linking to your data in your revised manuscript.

Please submit your revised manuscript by Mar 21 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org.
When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Guido Maiello
Academic Editor
PLOS ONE

Journal Requirements: When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

2. Please amend your current ethics statement to address the following concerns: a) Did participants provide their written or verbal informed consent to participate in this study? b) If consent was verbal, please explain i) why written consent was not obtained, ii) how you documented participant consent, and iii) whether the ethics committees/IRB approved this consent procedure.

3. Thank you for stating in your Funding Statement: [This study was partially funded by the ANR within the framework of the PIA EUR DIGISPORT project (ANR-18-EURE-0022) and by the Region Bretagne.]
Please provide an amended statement that declares *all* the funding or sources of support (whether external or internal to your organization) received during this study, as detailed online in our guide for authors at http://journals.plos.org/plosone/s/submit-now. Please also include the statement “There was no additional external funding received for this study.” in your updated Funding Statement. Please include your amended Funding Statement within your cover letter. We will change the online submission form on your behalf.

4. Please ensure that you include a title page within your main document. You should list all authors and all affiliations as per our author instructions and clearly indicate the corresponding author.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: No

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available.
If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The authors studied how an observer visually tracks moving elements in a sport environment. They reduce a bias present in MOT protocols by simulating the movements of players on a field thanks to player movements extracted from a database of real football matches. The authors compare the tracking performances of football players, athletes who do not play football but another team-based sport, and non-sport-practising individuals. They also have participants track scenes where virtual players follow real game trajectories, and also when the virtual players follow non-natural movements. Using ANOVA and HCPC analyses, the authors wished to analyse what experimental factors influenced tracking in the virtual scenes. The presentation of the study, its details and findings were well presented and easy to understand.

* Method

Is it possible that the fact that virtual players are showing no discerning features (beyond their team's colour) may have made the task harder than it would be in a real-case scenario?
Does the choice of making all players identical reflect a belief of the authors that visual tracking in soccer is a process requiring very few visual features? I ask because the authors state their interest in approaching more natural conditions several times in the manuscript, but there was no discussion of realism related to visual features of the virtual players.

Was it really necessary to give participants the possibility to pan the camera view? The authors point to this as a limit in the discussion section because some participants may have had issues manipulating the camera, on top of tracking elements in the scene. This is potentially a bias that could have been easily avoided by restricting the movements of the virtual players to be within the camera viewport.

* Results

I understand that using a method such as HCPC allows one to study many variables at once and to identify and characterise varied behaviours. But the interpretations made from the resulting data appear rather post hoc and nebulous to me. It is complicated to measure and attribute the effects amongst the variables related to each group. Additionally, although fig. 5.A shows fairly separable clusters of target-players, that is not the case for fig. 4.A. I realise that the alternative would be to vary each variable independently and measure their resulting effect, therefore multiplying the number of experiments needed to run. But considering the fact that the authors present HCPC as an exploratory tool, I would at least expect to read in the discussion section about what they would do in future work to study more precisely the specific behaviours the HCPC analysis highlighted.
* Typos

p.2 "players movements" -> "player movements"
p.3 "allow participants to access to the" -> "allow participants to access the"
p.6 "not difference" -> "no difference"

Reviewer #2: This paper investigates whether playing a team sport affects tracking performance in a visual tracking task using movement trajectories taken from soccer matches (i.e., structured). The authors compared the tracking performance of soccer players, other team sport players, and non-team sports players. They also compared performance on structured and unstructured movement trajectories. The authors did not find that soccer players displayed better tracking performance when facing the structured movement trajectories taken from real soccer games compared with the other two groups. Since the results did not fit with the authors’ hypothesis, they suggest an immersive design that captures the soccer field more closely should be used in future research.

The manuscript fits nicely within the scope of PLOS ONE and addresses an interesting question. However, there are several methodological limitations (many of which the authors outline themselves in the discussion) that make me wonder about the quality of the data and make interpretation of the results somewhat difficult.

Major Comments

1. The introduction is clear and includes some relevant literature. However, I do think it could be more focussed towards the specific questions the authors are interested in addressing.

a. Expertise effects could be discussed in more detail. Quite some relevant research has been done into video game players (e.g., Dye & Bavelier, 2010; Green & Bavelier, 2006; Sekuler et al., 2008).

b. I would appreciate some discussion on why these conflicting results regarding expertise differences have been found. For example, why does expertise in some sports seem to lead to better MOT performance (10 – 13) but not in others (15)?

c.
Following on from this, I think some discussion of near- and far-transfer effects is relevant here (e.g., Harris et al., 2020). This seems relevant to the authors’ proposal that the standard MOT task is not soccer-specific enough to find expertise effects.

d. A recent review of Neurotracker (3D MOT training) also seems relevant (Vater et al., 2021), which concludes that there is limited evidence for transfer of MOT training to actual sport skills. These findings seem to support the rationale for your study suggesting sport-specific MOT tasks should be considered.

2. The research question is interesting, and the primary objective is clearly stated (lines 48 – 49). However, I find it somewhat difficult to understand the authors’ hypothesis. My understanding from the introduction is that they think that soccer expertise will only affect tracking performance on a soccer-specific task. Based on this, I would hypothesise an interaction effect: no difference in performance between the groups on the unstructured task but better performance by the soccer group on the structured task. I find the authors’ explanation of this somewhat confusing (lines 57 – 67) and think it could be written more clearly.

a. In particular, I find the use of ‘performance differential’ difficult to interpret (even though I think it is essentially describing the hypothesis I outline above). Since the authors go on to run an ANOVA, I think it makes more sense for the hypothesis to be explained in terms of an interaction.

3. I had several queries when reading the methods section of the experiments that are detailed below. The authors do recognise several of these points as limitations of their experiment in the discussion. Although this is good, I do think that they should elaborate on why, despite these limitations, they are still confident in the quality of the data and thus results presented here.

a. Were people asked whether they watched soccer/sport?
It seems likely that these people would also display enhanced tracking abilities in sport scenarios despite not necessarily playing the sport themselves.

b. I wonder why the authors picked a defender’s viewpoint. What are the implications of this for players who are not defenders? I expect a striker-specific viewpoint to be very different from a defender-specific viewpoint.

c. There is quite some variability in the standard of the players. My assumption is that the movement trajectories were taken from professional games (line 143; also please clarify whether this is the case) and I wonder what the implications of this are. Is it likely that district-level soccer players are really familiar with the movement trajectories they were exposed to? If not, why would we expect them to show an advantage?

d. I would appreciate more detail on the data collection. This study was run online, so I wonder how the researchers ensured standardization of viewing factors (e.g., screen dimensions, refresh rates)? It is well-documented that things like tracking area size and speed of targets affect performance in MOT.

e. I wonder why the authors only used 15 trials in each condition. Based on my knowledge, this seems rather low compared to other MOT experiments.

f. I wonder why there was no ball. Presumably that seems like an important anchor in these football-specific scenarios in that the ball-carrier dictates where people look and move?

4. I think it’s good that the authors made an example of the task available online. However, this raises concerns about the task because, personally, I can’t see the players in the other half with sufficient acuity to track them. I do realise that I am missing an important component of the task whereby the participants could rotate their viewpoint. Nevertheless, I do struggle to visualize this (e.g., which axis do I rotate around/can I move to a ‘bird’s eye view’) so would appreciate an example of the possible viewpoint changes as well.

5.
Following on from this, I wonder whether the authors could provide more justification for their decision to allow viewpoint changes. Research has shown that viewpoint changes have a large effect on tracking ability (e.g., Huff et al., 2009; Seiffert et al., 2005), so I think this addition to the task makes the interpretation of the data more complicated. I think the authors should consider the implications of allowing participants to change viewpoint throughout the manuscript. This additional component of the task arguably makes it less similar to soccer.

6. Since the viewpoint changes are an essential component of this study in my opinion, I wonder why the authors didn’t provide any data on this. Would the authors be happy to share this data and provide some insight into this?

7. The exploratory analysis using PCA and HCPC does not really fit with the experiment and, I find, interferes with the overall narrative of the paper. I wonder why the authors chose to include this analysis. Moreover, I find it difficult to understand how the features were calculated and the implications of this for interpretation. My understanding is that the features were expressed in terms of image coordinates (viewing angle), which would have been different across each participant and each scenario. Is it not possible that these results are therefore largely affected by viewing angles?

8. I find Figure 2 difficult to interpret and think more detail is needed. What does the colour coding on each panel represent? What are the axes of the middle panel? What are the units in each panel?

9. Throughout the manuscript, the authors often talk about ‘improvement’, which I find confusing. From my understanding, this is not a training study where participants were tested at different time-points and thus could show improvements, but rather a single testing session. I think the question being addressed is whether one of the groups is better.

10.
The discussion says ‘All participants certainly did not comply with the instruction to select only those target-players whom they were confident enough to have successfully completed the tracking’ and that this ‘may have artificially improved the success rate of participants’ (lines 313 – 320). I am not entirely sure what the authors mean by this. Did they not check how many players were selected and verify whether these were correct or not?

Minor Comments

1. The third sentence of the abstract is rather long and difficult to follow.

2. I don’t follow the reasoning in lines 35 – 38. The authors suggest there is sport-specific gaze control so, presumably, there is also MOT-specific gaze control. The task/sport determines the gaze, so I don’t find this result surprising.

3. I find ‘teammates’ more intuitive than ‘partners’ in the context of soccer.

4. Lines 284 – 285: ‘TEAM also tended to outperform NOTEAM but not significantly (p = .071)’. I think this is an over-interpretation of the results and should be removed.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/.
PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step. |
| Revision 1 |
|
PONE-D-21-39830R1
Visual tracking assessment in a soccer-specific virtual environment: a web-based study
PLOS ONE

Dear Dr. VU,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Of the two reviewers who assessed your original submission, only Reviewer 1 was available to assess your revised manuscript. I have thus assessed your responses to the comments raised by Reviewer 2 myself. Reviewer 1 is satisfied with your responses to his comments. For the most part, I also think you have appropriately addressed the comments from Reviewer 2 in your rebuttal letter. However, some of your responses have not been incorporated in the revised manuscript, and some control analyses are still missing. In my comments below I thus ask you to perform these analyses and incorporate all your responses into your manuscript.

Finally, I appreciate that you have added data files S3, S4 and S5 to the submission. I kindly ask you to also specify and describe, in the “Supporting information” section of the manuscript, what data are contained in supplementary files S3, S4, and S5.

Please submit your revised manuscript by Jun 02 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Guido Maiello
Academic Editor
PLOS ONE

Additional Editor Comments (if provided):

Reviewer 2 Point 3a: In your response to this reviewer comment, you present a table showing tracking performance as a function of soccer viewing frequency. You state that “mean visual tracking performance appeared to increase with the frequency of soccer game viewing, but we assumed that this effect was not significant due to the large standard deviation”. I have plotted the data you presented, and computed 95% confidence intervals from your reported standard deviations and sample sizes. From these, it looks like there could be a significant effect, as in a few instances the means and confidence intervals of tracking performance across soccer viewing frequency do not overlap. Whatever the case, I suggest it is always best to verify one’s assumptions when possible. I thus ask you to explicitly test whether a statistically significant effect exists in these data.
Further, I believe the reviewer’s main question was whether the observed differences between study groups could be explained by soccer viewing frequency. Thus it would be useful to test and report whether the three study groups differed significantly in soccer viewing frequency (e.g. by running a one-way ANOVA on viewing frequency, with sport practice as the between-subjects main effect). If the groups differ in viewing frequency the same way they differ in tracking performance, then this potential confound should be included and discussed in the discussion section of the manuscript.

Moreover, when a reviewer brings up a question, it is good practice to address it both in the response letter as well as in the main manuscript, since future readers may have the same question. I thus ask you to incorporate your response to this reviewer comment in the main manuscript. This could be simply a few lines in the methods or the discussion section of the manuscript, where you point out the issue and report the result of your control analyses (once you have performed them).

Reviewer 2 point 3b: As above, I ask you to report the result of these control analyses in the main manuscript.

Reviewer 2 point 3d: I suggest you should include in the manuscript these considerations and control analyses regarding frame rate and screen resolution. Perhaps you should also specify that by asking participants to position their eyes at a distance from the screen equivalent to its width, the screen should have subtended approximately 53 degrees of visual angle for all participants. For future reference, you should be aware that there exist validated methods to control viewing distance/stimulus size for online experiments, e.g.:

Li, Q., Joo, S. J., Yeatman, J. D., & Reinecke, K. (2020). Controlling for participants’ viewing distance in large-scale, psychophysical online experiments using a virtual chinrest. Scientific Reports, 10(1), 1-11.

Lago, M. A. (2021).
SimplePhy: An open-source tool for quick online perception experiments. Behavior Research Methods, 53(4), 1669-1676.

Reviewer 2 point 3e: Please add a few short sentences in the methods section of the manuscript explaining your reasoning in selecting the number of trials.

Reviewer 2 point 3f: Please include these considerations in the discussion section of the manuscript.

Line 531: “Abrupt changes … were observed”

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository.
For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: (No Response)

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Erwan David

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/.
PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step. |
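The Revision 1 letter above makes two small quantitative points: a screen viewed from a distance equal to its own width subtends roughly 53 degrees of visual angle, and 95% confidence intervals can be recovered from reported means, standard deviations, and sample sizes. A minimal sketch of both computations follows; the function names and the example numbers are illustrative assumptions, not values taken from the study data.

```python
import math

# Visual angle of a flat screen of width w viewed from distance d:
# 2 * atan(w / (2 * d)) radians. With d = w the result is ~53 degrees,
# independent of the actual screen size, as the editor notes.
def visual_angle_deg(width_cm: float, distance_cm: float) -> float:
    return math.degrees(2.0 * math.atan(width_cm / (2.0 * distance_cm)))

# Normal-approximation 95% confidence interval for a group mean,
# built from a reported standard deviation and sample size
# (mean +/- 1.96 * SD / sqrt(n)).
def ci95(mean: float, sd: float, n: int) -> tuple:
    half_width = 1.96 * sd / math.sqrt(n)
    return (mean - half_width, mean + half_width)

if __name__ == "__main__":
    print(round(visual_angle_deg(60.0, 60.0), 1))  # 53.1
    # Illustrative tracking success rate, SD, and group size:
    low, high = ci95(0.70, 0.15, 25)
    print(round(low, 3), round(high, 3))
```

Non-overlap of such intervals across viewing-frequency bins is only a heuristic screen; the formal test the editor requests is the one-way ANOVA described in the letter.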
| Revision 2 |
|
Visual tracking assessment in a soccer-specific virtual environment: a web-based study
PONE-D-21-39830R2

Dear Dr. Vu,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance.

To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Guido Maiello
Academic Editor
PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments: |
| Formally Accepted |
|
PONE-D-21-39830R2
Visual tracking assessment in a soccer-specific virtual environment: a web-based study

Dear Dr. Vu:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org. Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of Dr. Guido Maiello
Academic Editor
PLOS ONE |
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.