Peer Review History

Original Submission: June 18, 2021
Decision Letter - Caroline Sunderland, Editor

PONE-D-21-20054

Development of an expected possession value model to analyse team attacking performances in rugby league

PLOS ONE

Dear Dr. Sawczuk,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

The reviewers have some concerns about the rigour of the research which will require particular attention.

Please submit your revised manuscript by Sep 24 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Caroline Sunderland

Academic Editor

PLOS ONE

Journal requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please provide additional details regarding participant consent. In the Methods section, please ensure that you have specified (1) whether consent was informed and (2) what type you obtained (for instance, written or verbal). If your study included minors, state whether you obtained consent from parents or guardians. If the need for consent was waived by the ethics committee, please include this information.

3. In your Data Availability statement, you have not specified where the minimal data set underlying the results described in your manuscript can be found. PLOS defines a study's minimal data set as the underlying data used to reach the conclusions drawn in the manuscript and any additional data required to replicate the reported study findings in their entirety. All PLOS journals require that the minimal data set be made fully available. For more information about our data policy, please see http://journals.plos.org/plosone/s/data-availability.

Upon re-submitting your revised manuscript, please upload your study’s minimal underlying data set as either Supporting Information files or to a stable, public repository and include the relevant URLs, DOIs, or accession numbers within your revised cover letter. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories. Any potentially identifying patient information must be fully anonymized.

Important: If there are ethical or legal restrictions to sharing your data publicly, please explain these restrictions in detail. Please see our guidelines for more information on what we consider unacceptable restrictions to publicly sharing data: http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions. Note that it is not acceptable for the authors to be the sole named individuals responsible for ensuring data access.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: No

Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: No

Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

Reviewer #2: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This study investigated the validity of the evaluation method of team attacking performances in rugby league via expected possession value (EPV) models. The authors examined three EPV models (two with fixed zone sizes of ~5m x 5m and ~10m x 10m [Kempton et al. 2016, JSS], and one with aggregated zones based on the zones’ match EPV) analyzing attacking performance in rugby league. They identified the EPV model which provides the greatest reproducibility of match EPV between fixtures, and quantified individual teams’ attacking performances across a season using an EPV model.

The motivations in this paper were clear and there were some contributions (to investigate the validity of EPV with 59,233 plays in 180 Super League matches), whereas the novelty was not so high (but this seems to be no problem in this journal). My major concern is that only one spatiotemporal resolution (EPV-19) is proposed and the results cannot conclude EPV-19 is the best. At least resolutions less than 77 and less than 19 may be required to conclude it. Furthermore, the interpretation of KL divergence may be wrong and the related part should be revised (for detail, see comment #5). For these reasons (including the comments below), the manuscript may be acceptable to me if the authors can solve these problems.

Specific comments

1. In the introduction, other EPV studies (e.g., soccer and deep learning approach) should be referred to. For example:

[1] J Fernández, L Bornn, D Cervone, Decomposing the Immeasurable Sport: A deep learning expected possession value framework for soccer, MIT SSAC, 2019

[2] J Fernández, L Bornn, D Cervone, A framework for the fine-grained evaluation of the instantaneous expected value of soccer possessions, Machine Learning, 2021

2. In the introduction, other rugby studies or methods related to movements and scores should be referred to. For example, see Related work in:

Rory Bunker, Keisuke Fujii, Hiroyuki Hanada, Ichiro Takeuchi, Supervised sequential pattern mining of event sequences in sport to identify important patterns of play: an application to rugby union, 2020.

3. The above of Eq.(3): I am not familiar with Monte Carlo every visit algorithm. Please explain the summary of the algorithm and the reason why the authors selected it.

4. P7: the authors used linear mixed models to evaluate whether the columns or rows could be combined. However, there were no related statistical values. I wonder how the areas of EPV-19 were determined based on such an approach. A detailed and clear flow of the determination may be required for reproducibility.

5. P9: KL divergence is an asymmetric measure between probability distributions P and Q. In this case, what are P and Q? These should be specified with the definition of KL divergence (i.e., equation). By the usual definition, when Q(x)=0, the KL divergence will be an infinite value. This does not always mean P (or Q) is reproducible by Q (or P). In summary, I consider that the authors may misunderstand the interpretation of KL divergence about reproducibility (this is to measure the dissimilarity between two probability distributions). This point should be revised throughout the manuscript.

6. Results and Discussion: although the analysis using KL divergence may be useful in this study, the property in the similar reward distribution between previous and current matches may not be useful to extract meaningful information. For example, coarser resolution (e.g., EPV-5) may obtain more similar distributions to the previous matches, but more general and unmeaningful insight will be obtained. EPV-19 may be an appropriate resolution to obtain meaningful insight (e.g., Fig 4), but there were no related results and discussion. The analysis will enhance the validity of the authors' claims.

Reviewer #2: The authors aim to use expected possession value (EPV) models to assess attacking performance in Rugby League. I agree that the event-level and spatial data used in this paper has good value and presents interesting opportunities for AI/statistical methods to improve match analysis in Rugby. Similar datasets have led to some very successful and interesting papers at top AI venues when applied to football/soccer.

It would be nice to see a more in-depth introduction and literature review about work in this space and valuing actions in sports data. This would help make the work more accessible to a wider audience who may not be as familiar with EPV models or Rugby League. A comparison in the differences between this work and that in [5] for football would also be of interest.

I think the zone selection criteria the authors have used makes sense. However, it would have been nice to have seen a visual representation of these zones in the paper and potentially an experiment comparing the different approaches for zone selection, events that occur in those zones and how this would affect their model performances.

In the data-processing it would be good to see a full-list of events that can end a sequence and then it would be interesting to explore the impact of negative reward-weights based on loss of possession at the end of a sequence vs other 0 events such as (penalties, scrums, kicks etc).

Looking at the results to compare the 3 EPV models in Figure 2 - it seems like EPV-208 (A) may be using zones that are too granular and therefore does not have enough data to fully learn the value of some zones (e.g., a darker zone behind the team's posts). EPV-77 (B) shows some better findings with large zones and EPV-19 (C) shows the most useful results; however, I would expect most coaches were already aware of the findings. However, I think the charts shown in Figure 2 would be extremely valuable for a defensive coach when planning against an opposition as he/she would be able to understand where attacks are most dangerous.

There could be more details around each of the models. The work in [5] showing an EPV model for football goes into a lot more detail around a single EPV model and I think this paper could benefit from more detail throughout. I think it is important for the authors and reader to recognise the limitations of their work due to data etc. Finally, I think greater discussion into future work and real-world impact of the study could be beneficial.

It would be good if eventually there is a free online source of the data used for a sample number of games in Rugby League as this would help with reproducibility and improvements of the models.

Overall, I think the flow of the paper could be improved to help with readability but the work describes an interesting and sound evaluation of EPV models in Rugby League. The paper does report original research and with a few tweaks it will make a valid contribution to the base of academic knowledge in sports analytics.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Revision 1

** THIS IS THE SAME AS THE WORD DOCUMENT ATTACHED TO THE SUBMISSION **

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: No

Reviewer #2: Yes

________________________________________

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: No

Reviewer #2: Yes

________________________________________

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

Reviewer #2: No

The data that was used for this study was acquired from a third party, formerly Opta Sports, now Stats Perform. It is available from www.optaprorugby.com. The data was provided under a license agreement with Opta Sports/Stats Perform, and the data is subject to an approved research ethics application from our University. The terms of our license agreement prevent us from sharing the raw data we used for this analysis. Our ethical approval also prevents us from sharing any data in any way that could be re-identified. The metadata and (fixture/location/action) data itself would allow someone else to re-identify fixtures, teams and/or players, breaching the ethics approval given. However, it should be possible to obtain access to the data by contacting Stats Perform (www.statsperform.com/contact/). The authors had no special access privileges and all data is taken from the 2019 season of the Super League. The corresponding author is happy to liaise with any researchers who have queries about how to obtain the data.

________________________________________

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

________________________________________

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This study investigated the validity of the evaluation method of team attacking performances in rugby league via expected possession value (EPV) models. The authors examined three EPV models (two with fixed zone sizes of ~5m x 5m and ~10m x 10m [Kempton et al. 2016, JSS], and one with aggregated zones based on the zones’ match EPV) analyzing attacking performance in rugby league. They identified the EPV model which provides the greatest reproducibility of match EPV between fixtures, and quantified individual teams’ attacking performances across a season using an EPV model.

The motivations in this paper were clear and there were some contributions (to investigate the validity of EPV with 59,233 plays in 180 Super League matches), whereas the novelty was not so high (but this seems to be no problem in this journal). My major concern is that only one spatiotemporal resolution (EPV-19) is proposed and the results cannot conclude EPV-19 is the best. At least resolutions less than 77 and less than 19 may be required to conclude it. Furthermore, the interpretation of KL divergence may be wrong and the related part should be revised (for detail, see comment #5). For these reasons (including the comments below), the manuscript may be acceptable to me if the authors can solve these problems.

Specific comments

1. In the introduction, other EPV studies (e.g., soccer and deep learning approach) should be referred to. For example:

[1] J Fernández, L Bornn, D Cervone, Decomposing the Immeasurable Sport: A deep learning expected possession value framework for soccer, MIT SSAC, 2019

[2] J Fernández, L Bornn, D Cervone, A framework for the fine-grained evaluation of the instantaneous expected value of soccer possessions, Machine Learning, 2021

Thank you for this comment, we have now added references to these methods in the introduction (INTRODUCTION: PARAGRAPH 2).

2. In the introduction, other rugby studies or methods related to movements and scores should be referred to. For example, see Related work in:

Rory Bunker, Keisuke Fujii, Hiroyuki Hanada, Ichiro Takeuchi, Supervised sequential pattern mining of event sequences in sport to identify important patterns of play: an application to rugby union, 2020.

Thank you for this comment. The paper mentioned answers a slightly different question to ours, as we are looking at which zones are visited and attempting to find optimal zone sizes rather than the sequences of actions themselves. However, we do see the relevance of the paper as an alternative method and have therefore made reference to it in the discussion. Indeed, one of the limitations of our method is that although we now know the value of different zones and where an opposition team is likely to attack from, we do not know the sequence of events through which this happens. This limitation has been added to the discussion, referencing the Bunker et al. study (DISCUSSION – LIMITATIONS).

3. The above of Eq.(3): I am not familiar with Monte Carlo every visit algorithm. Please explain the summary of the algorithm and the reason why the authors selected it.

Thank you for this comment, we have added the details requested (METHODS – CALCULATION OF EPV-308 AND EPV-77 FIXED ZONE SIZE VALUES: PARAGRAPH 3).
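For readers unfamiliar with the approach the reviewer asks about: every-visit Monte Carlo estimates the value of a state (here, a pitch zone) as the average return observed over every occasion that state is visited within an episode, rather than only the first visit. The following is a generic illustrative sketch with hypothetical zones and rewards, not the authors' implementation:

```python
from collections import defaultdict

def every_visit_mc(episodes):
    """Every-visit Monte Carlo value estimation.

    episodes: list of (zones, reward) pairs, where `zones` is the sequence
    of zones visited during one possession and `reward` is the points
    scored when the possession ended (0 if none). Returns the average
    terminal reward over every visit to each zone.
    """
    totals = defaultdict(float)  # sum of returns observed from each zone
    counts = defaultdict(int)    # number of visits to each zone
    for zones, reward in episodes:
        for zone in zones:       # every visit counts, not just the first
            totals[zone] += reward
            counts[zone] += 1
    return {z: totals[z] / counts[z] for z in counts}

# Toy example with hypothetical zones A, B, C:
episodes = [(["A", "B"], 4), (["A", "C"], 0), (["B", "C"], 4)]
values = every_visit_mc(episodes)
# values["A"] == 2.0, values["B"] == 4.0, values["C"] == 2.0
```

Because rugby league possessions rarely revisit the same zone many times, the practical distinction from first-visit Monte Carlo is small, but every visit contributing to the average makes fuller use of limited event data.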

4. P7: the authors used linear mixed models to evaluate whether the columns or rows could be combined. However, there were no related statistical values. I wonder how the areas of EPV-19 were determined based on such an approach. A detailed and clear flow of the determination may be required for reproducibility.

Thank you for this comment. We have now included every single statistical value obtained in the supplementary data for this paper. This information provides significantly more detail as to how the areas were determined (SUPPLEMENTARY DATA 1).

5. P9: KL divergence is an asymmetric measure between probability distributions P and Q. In this case, what are P and Q? These should be specified with the definition of KL divergence (i.e., equation). By the usual definition, when Q(x)=0, the KL divergence will be an infinite value. This does not always mean P (or Q) is reproducible by Q (or P). In summary, I consider that the authors may misunderstand the interpretation of KL divergence about reproducibility (this is to measure the dissimilarity between two probability distributions). This point should be revised throughout the manuscript.

Thank you for this comment. In this study, we use the subsequent match as the true distribution (P) and the previous 1-10 matches as the approximating distribution (Q). We are only interested in whether Q can approximate P, as we only know Q when we are preparing for future matches. In line with your comments, we have extended our use of the KL Divergence to make greater use of its interpretation as a measure of (dis)similarity. We still use the term reproducibility as a global term (this can be changed if there is a preference for a different term), but now we consider both the percentage of non-infinity zones and the average value of the KL Divergence across all teams. The percentage of non-infinity zones is useful as a measure of how many of the subsequent match's zones were also visited in the previous matches, given that any zone not visited by P (the subsequent match) automatically obtains a value of 0. The KL Divergence value then provides an understanding of how similar the reward distributions are between matches. To this end, as you note, we are looking at how similar the distributions are in terms of their values, rather than whether the zones have simply been visited. We believe that our definition and usage now match the more accurate definition you provided in the comment above. We introduce these elements in the methodology (METHODS – EVALUATING THE REPRODUCIBILITY OF MATCH EPV BETWEEN FIXTURES), provide the percentage of non-infinity values in TABLE 2 and the KL Divergence values in FIGURE 3, and evaluate the results within the discussion (DISCUSSION – REPRODUCIBILITY OF ATTACKING PERFORMANCES).
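To make the definition under discussion concrete (a generic sketch of the standard formula, with hypothetical zone distributions, not the authors' code): with P the subsequent match and Q the previous matches, KL(P || Q) = Σ P(x) log(P(x)/Q(x)), which is finite only when Q covers every zone P visits.

```python
import math

def kl_divergence(p, q):
    """KL(P || Q) = sum over zones of P(x) * log(P(x) / Q(x)).

    p, q: dicts mapping zone -> probability. A zone with p[x] == 0
    contributes nothing; a zone with p[x] > 0 but q.get(x, 0) == 0
    makes the divergence infinite.
    """
    total = 0.0
    for zone, px in p.items():
        if px == 0:
            continue
        qx = q.get(zone, 0.0)
        if qx == 0.0:
            return math.inf  # P visits a zone Q never did
        total += px * math.log(px / qx)
    return total

# Hypothetical reward distributions over three zones:
p = {"z1": 0.5, "z2": 0.5}             # subsequent match (true, P)
q = {"z1": 0.4, "z2": 0.4, "z3": 0.2}  # previous matches (approximating, Q)
print(kl_divergence(p, q))             # finite: Q covers every P-zone

q_missing = {"z1": 1.0}                # z2 never visited previously
print(kl_divergence(p, q_missing))     # inf: z2 uncovered by Q
```

This illustrates why the two measures in the revision are complementary: the percentage of non-infinity zones captures coverage (whether Q visits the zones P does), while the finite KL values capture how closely the reward distributions agree on the zones both visit.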

6. Results and Discussion: although the analysis using KL divergence may be useful in this study, the property in the similar reward distribution between previous and current matches may not be useful to extract meaningful information. For example, coarser resolution (e.g., EPV-5) may obtain more similar distributions to the previous matches, but more general and unmeaningful insight will be obtained. EPV-19 may be an appropriate resolution to obtain meaningful insight (e.g., Fig 4), but there were no related results and discussion. The analysis will enhance the validity of the authors' claims.

Thank you for this comment. We agree that a coarser resolution (e.g. EPV-5) would provide more similar distributions to previous matches, but poor insight in practice. We also agree that more analyses would be beneficial to the study. As such, we have added three further zone analyses to the study. In the original manuscript, we used a smallest effect size of interest of 1 to split zones. We have now added three further state spaces (EPV-37, EPV-13, EPV-9), which use smallest effect sizes of interest of 0.5, 1.5 and 2.0 respectively to split the zones. We feel that this adds a significant amount to the article and thank you again for the suggestion.

Reviewer #2: The authors aim to use expected possession value (EPV) models to assess attacking performance in Rugby League. I agree that the event-level and spatial data used in this paper has good value and presents interesting opportunities for AI/statistical methods to improve match analysis in Rugby. Similar datasets have led to some very successful and interesting papers at top AI venues when applied to football/soccer.

It would be nice to see a more in-depth introduction and literature review about work in this space and valuing actions in sports data. This would help make the work more accessible to a wider audience who may not be as familiar with EPV models or Rugby League. A comparison in the differences between this work and that in [5] for football would also be of interest.

Thank you for this comment, we have now added more detail to the introduction to make reference to the probabilistic deep learning models used in football vs the stochastic Markovian models used in ice hockey. We have also explained why rugby league is better suited to stochastic analyses. (INTRODUCTION: PARAGRAPH 2).

I think the zone selection criteria the authors have used makes sense. However, it would have been nice to have seen a visual representation of these zones in the paper and potentially an experiment comparing the different approaches for zone selection, events that occur in those zones and how this would affect their model performances.

Thank you for this comment, we are unclear what is meant by a ‘visual representation of these zones’. We believe this may be provided by Figure 2, but have also provided a non-heatmapped version of each model in the supplementary data (SUPPLEMENTARY DATA 1).

We agree that the events that occur in any zone could have a large impact on the results. Unfortunately, we are unable to provide the events that occur in the specific zones due to limitations within the dataset (only the location of the first event of each play is provided, so we cannot guarantee the location accuracy of any event thereafter). However, we have added three further experiments regarding the mixed model approach for zone selection (using SESOI 0.5, 1.0, 1.5 and 2.0 now, vs only SESOI 1.0 before) to provide greater discussion around the models.

In the data-processing it would be good to see a full-list of events that can end a sequence and then it would be interesting to explore the impact of negative reward-weights based on loss of possession at the end of a sequence vs other 0 events such as (penalties, scrums, kicks etc).

Thank you for this comment, we have now added a full list of events which could end an episode (TABLE 1). We considered negative reward weights for this study but were uncertain as to how we could include them, given our episodes began when the team obtained possession and ended when the team lost possession. We believe negative events could only be included if the episode duration was extended (i.e. to the next points being scored), but this in turn creates problems as the model becomes more like a Markov Game. Such an approach is better suited to both larger datasets and action data, neither of which was suitably available for this study. We agree the use of negative rewards is of great interest though and are hoping to identify an appropriate method of using them, or accounting for negative actions, in future studies.

Looking at the results to compare the 3 EPV models in Figure 2 - it seems like EPV-208 (A) may be using zones that are too granular and therefore does not have enough data to fully learn the value of some zones (e.g., a darker zone behind the team's posts). EPV-77 (B) shows some better findings with large zones and EPV-19 (C) shows the most useful results; however, I would expect most coaches were already aware of the findings. However, I think the charts shown in Figure 2 would be extremely valuable for a defensive coach when planning against an opposition as he/she would be able to understand where attacks are most dangerous.

Thank you, we agree with all these points. We believe you are correct in the assertion that most coaches may expect to see some of the findings, but visualising them simply in such a quantitative way could provide quick insight, which may help to prepare for future matches.

There could be more details around each of the models. The work in [5] showing an EPV model for football goes into a lot more detail around a single EPV model and I think this paper could benefit from more detail throughout. I think it is important for the authors and reader to recognise the limitations of their work due to data etc. Finally, I think greater discussion into future work and real-world impact of the study could be beneficial.

Thank you for this comment. We have added significant detail to the development of the zones for each of the models in the supplementary material now, which we hope helps the reader to better understand how we have generated each model (SUPPLEMENTARY DATA 1).

We have also added further information regarding limitations (data, lack of predictive work within this study), future work (attempts to improve data collection techniques and provide more meaningful models including action/player location data) and real-world impact (use of EPV-19 to evaluate the attacking performances of opposition teams) throughout the DISCUSSION.

It would be good if eventually there is a free online source of the data used for a sample number of games in Rugby League as this would help with reproducibility and improvements of the models.

Thank you, we agree. Hopefully, given the increasing amounts of free data available in other sports (soccer in particular), sports like rugby (league and union) will follow suit and release data so that models can be improved. Unfortunately, although the data used for this study is accessible, it is behind a paywall, as described in the data availability statement.

Overall, I think the flow of the paper could be improved to help readability, but the work describes an interesting and sound evaluation of EPV models in rugby league. The paper reports original research and, with a few tweaks, will make a valid contribution to the body of academic knowledge in sports analytics.

Thank you

Attachments
Attachment
Submitted filename: Reviewer response.docx
Decision Letter - Caroline Sunderland, Editor

PONE-D-21-20054R1

Development of an expected possession value model to analyse team attacking performances in rugby league

PLOS ONE

Dear Dr. Sawczuk,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please address the minor comments and then the paper will be ready for acceptance.

Please submit your revised manuscript by Nov 27 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Caroline Sunderland

Academic Editor

PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Additional Editor Comments (if provided):


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thanks for your response and revising the manuscript.

I consider that the manuscript is ready for publication if the authors revise the following points.

1. L167: EPV-37, EPV-19, EPV-13 and EPV-9, EPV-308 zones -> EPV-37, EPV-19, EPV-13, EPV-9, and EPV-308?

2. KL divergence formula should be clarified (P and Q are not defined in this paper).

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Revision 2

Reviewer #1: Thanks for your response and revising the manuscript.

I consider that the manuscript is ready for publication if the authors revise the following points.

Thank you again for your critique during the review period. We are pleased that our changes satisfied your comments.

1. L167: EPV-37, EPV-19, EPV-13 and EPV-9, EPV-308 zones -> EPV-37, EPV-19, EPV-13, EPV-9, and EPV-308?

This sentence has now been clarified.

“To calculate the aggregated set of zones for EPV-37, EPV-19, EPV-13 and EPV-9, the zones from EPV-308 were grouped together or split based upon differences in their match EPV.”

2. KL divergence formula should be clarified (P and Q are not defined in this paper).

We have now added the KL Divergence formula to the text, with more description to ensure there is no ambiguity between P and Q, our true and approximating distributions.

“The KL Divergence is calculated according to the equation:

D_KL(P || Q) = ∑_{s ∈ S} P(s) log(P(s) / Q(s))

where P is the true reward distribution, Q is the approximating reward distribution, and S refers to the set of all states within the model. The subsequent match's reward distribution (PGmi(S)) was used as the true reward distribution, with the previous matches' reward distribution (PGM(S)) as the approximating distribution.”
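[Editor's note: the formula above can be illustrated with a short sketch. This is not the authors' code; the zone labels and probabilities below are invented purely for illustration, standing in for the true (subsequent match) and approximating (previous matches) reward distributions over the model's states.]

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) = sum over states s in S of P(s) * log(P(s) / Q(s)).

    p and q are dicts mapping each state to its probability. Both must
    cover the same states, and q(s) must be non-zero wherever p(s) > 0.
    Terms with p(s) == 0 contribute nothing, by convention.
    """
    return sum(p[s] * math.log(p[s] / q[s]) for s in p if p[s] > 0)

# Hypothetical reward distributions over three pitch zones.
p_true = {"zone_a": 0.5, "zone_b": 0.3, "zone_c": 0.2}    # subsequent match
q_approx = {"zone_a": 0.4, "zone_b": 0.4, "zone_c": 0.2}  # previous matches

print(round(kl_divergence(p_true, q_approx), 4))  # prints 0.0253
```

A divergence of zero indicates the approximating distribution matches the true distribution exactly; larger values indicate a poorer approximation, which is how the paper uses it to assess model stability between matches.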

Attachments
Attachment
Submitted filename: Reviewer response.docx
Decision Letter - Caroline Sunderland, Editor

Development of an expected possession value model to analyse team attacking performances in rugby league

PONE-D-21-20054R2

Dear Dr. Sawczuk,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Caroline Sunderland

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Formally Accepted
Acceptance Letter - Caroline Sunderland, Editor

PONE-D-21-20054R2

Development of an expected possession value model to analyse team attacking performances in rugby league

Dear Dr. Sawczuk:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Caroline Sunderland

Academic Editor

PLOS ONE

Open letter on the publication of peer review reports

PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.

We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.

Learn more at ASAPbio.