Peer Review History

Original Submission: June 30, 2022
Decision Letter - Xiyu Liu, Editor

PONE-D-22-18546

An Attention Based Recurrent Learning Model for Short-term Travel Time Prediction

PLOS ONE

Dear Dr. Chughtai,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Oct 13 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Xiyu Liu

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf  and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please note that PLOS ONE has specific guidelines on code sharing for submissions in which author-generated code underpins the findings in the manuscript. In these cases, all author-generated code must be made available without restrictions upon publication of the work. Please review our guidelines at https://journals.plos.org/plosone/s/materials-and-software-sharing#loc-sharing-code and ensure that your code is shared in a way that follows best practice and facilitates reproducibility and reuse.

3. Thank you for stating the following in the Acknowledgments Section of your manuscript:

“The funding for this research is provided by the Khalifa University of Science and Technology.”

We note that you have provided additional information within the Acknowledgements Section that is not currently declared in your Funding Statement. Please note that funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form.

Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement. Currently, your Funding Statement reads as follows:

“The author(s) received no specific funding for this work.”

Please include your amended statements within your cover letter; we will change the online submission form on your behalf.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thank you very much for the beautiful manuscript. I think the paper is in good condition!

The idea presented in the attention section is very practical and logical.

Manuscripts and forms are written in beautiful formats.

Accept

Reviewer #2: Paper summary:

This paper proposed an attention-based GRU model for short-term travel time prediction, enabling GRU to learn the relevant context in historical time slots and update the weights of hidden states accordingly. The authors evaluated the proposed model using FCD data from Beijing. To demonstrate the generalization of the proposed model, the authors performed a robustness analysis by adding noise obeying Gaussian distribution. The experimental results on test data indicated that the proposed model performed better than the existing deep learning time-series models in terms of RMSE, MAE, MAPE, and R^2.

Strengths:

1. This proposal is the first to use traffic flow as input with attention-based GRU to forecast travel time.

2. This proposal is well-structured, and the experiments are detailed and convincing.

Weaknesses:

1. I am wondering whether the overfitting of the original model will affect the results of this proposal.

2. The related work section is suggested to be divided into 2-3 subsections to make the structure of this section clearer.

Other comments:

1. It seems that attention based is missing a hyphen in “attention based”.

2. To overcome these limitations, two specialized variants of RNNs, Gated Recurrent Unit (GRU) [48] and Long Short-Term Memory (LSTM) [49] are developed. Consider inserting a comma to separate the elements.

3. RNNs, unlike MLPs and CNNs (feed-forward neural networks and take data all at once), act on data sequentially and are frequently employed in the Natural Language Processing (NLP) domain. It seems that there is a grammar mistake.

4. “This paper employed GRU with attention mechanism to process travel time sequences and forecast future TT.”. It seems that there is an article usage problem.

Reviewer #3: The authors proposed an attention-based GRU model for short-term travel time prediction to cope with this problem enabling GRU to learn the relevant context in historical time slots and update the weights of hidden states accordingly. This scheme thus solves the problem that the existing GRU does not consider the relationship between various historical travel time slots for traffic prediction. The main message, background and figures are generally clear and concise, but some comparisons with the state-of-the-art algorithms still need to be added in the experimental results section. I recommend improving the following points.

==Major concern==

1. The algorithm proposed in the article integrates the attention mechanism into the GRU to improve the prediction ability. And the comparison results with GRU and several other traditional algorithms are given in the article. In fact, in recent years, attention mechanism has been introduced into DNN to solve Travel Time Prediction, such as [1]. Therefore, it is hoped that the author can add new experiments to compare with such algorithms.

[1] Wu J, Wu Q, Shen J, Cai C. Towards attention-based convolutional long short-term memory for travel time prediction of bus journeys. Sensors. 2020;20(12):3354

==Minor concern==

1. On line 203 of the article, too many commas are printed.

2. References are poorly written. For example, the representation of page numbers, some are p.785-794, some are: 785-794. In addition, the reference is ended with a period, and some ends with a semicolon and a period. and many more.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

Reviewer #3: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Revision 1

1. Reviewer#2, Concern # 1: I am wondering whether the overfitting of the original model will affect the results of this proposal.

Author response: Thank you very much for the valuable suggestion.

Author action: As discussed in the Dataset section, we used holdout cross-validation (an 80-20% train-test split) to validate the results of our proposed approach. Furthermore, we included a regularization term in the loss function to regularize training and improve model generalization.
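The validation setup described above can be sketched as follows. This is a minimal illustration, not the authors' code: the function names (`holdout_split`, `regularized_mse`), the L2 form of the penalty, and the coefficient `lam` are assumptions made for the example.

```python
import numpy as np

def holdout_split(X, y, train_frac=0.8, seed=0):
    """Shuffle the data and split it into train/test partitions (80-20 by default)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    cut = int(len(X) * train_frac)
    train, test = idx[:cut], idx[cut:]
    return X[train], X[test], y[train], y[test]

def regularized_mse(y_true, y_pred, weights, lam=1e-3):
    """MSE loss plus an L2 penalty on the model weights to curb overfitting."""
    mse = np.mean((y_true - y_pred) ** 2)
    penalty = lam * np.sum(weights ** 2)
    return mse + penalty

# Toy data: 100 samples with 5 features each.
X = np.arange(500, dtype=float).reshape(100, 5)
y = np.arange(100, dtype=float)
X_tr, X_te, y_tr, y_te = holdout_split(X, y)
print(len(X_tr), len(X_te))  # 80 20
```

The penalty term discourages large weights during training, which is one standard way to address the overfitting concern the reviewer raised.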

2. Reviewer#2, Concern # 2: The related work section is suggested to be divided into 2-3 subsections to make the structure of this section clearer.

Author response: Thank you very much for the valuable suggestion.

Author action: We have updated the Related Work section by organizing the literature under suitable headings as suggested by the reviewer.

3. Reviewer#2, Concern # 3: It seems that attention based is missing a hyphen in “attention based”.

Author response: Thank you very much for pointing it out.

Author action: We have updated the manuscript and added a hyphen in “attention based”.

4. Reviewer#2, Concern # 4: To overcome these limitations, two specialized variants of RNNs, Gated Recurrent Unit (GRU) [48] and Long Short-Term Memory (LSTM) [49] are developed. Consider inserting a comma to separate the elements.

Author response: Thank you very much for the valuable suggestion.

Author action: We have inserted a comma to separate the elements, as suggested by the reviewer.

5. Reviewer#2, Concern # 5: RNNs, unlike MLPs and CNNs (feed-forward neural networks that take data all at once), act on data sequentially and are frequently employed in the Natural Language Processing (NLP) domain. It seems that there is a grammar mistake.

Author response: Thank you very much for pointing it out.

Author action: We have corrected the identified grammar mistake and updated the manuscript.

6. Reviewer#2, Concern # 6: “This paper employed GRU with attention mechanism to process travel time sequences and forecast future TT.”. It seems that there is an article usage problem.

Author response: Thank you very much for pointing it out.

Author action: We have corrected the highlighted issue in the manuscript.

1. Reviewer#3, Concern # 1: The algorithm proposed in the article integrates the attention mechanism into the GRU to improve the prediction ability. And the comparison results with GRU and several other traditional algorithms are given in the article. In fact, in recent years, attention mechanism has been introduced into DNN to solve Travel Time Prediction, such as [1]. Therefore, it is hoped that the author can add new experiments to compare with such algorithms.

[1] Wu J, Wu Q, Shen J, Cai C. Towards attention-based convolutional long short-term memory for travel time prediction of bus journeys. Sensors. 2020;20(12):3354

Author response: Thank you very much for the valuable suggestion.

Author action: We have cited the reference suggested by the reviewer, along with several related references, in the Related Work section. Our work differs from the predictions provided by [1] and [2]: [1] proposed an attention-based convolutional long short-term memory model for predicting the journey time of selected bus routes at the current time, while [2] proposed an attention-based LSTM for predicting freeway travel time at the current time. In contrast, we used urban network data from Beijing, which differs from freeway or selected bus-route data, and we proposed an attention-based GRU model for short-term travel time prediction. For this reason, we compared our approach with GRU and several other traditional algorithms from various families, as discussed in the Related Work section. A fair comparison between real-time (current-time) prediction approaches and short-term travel time prediction approaches is not possible. Once again, we thank the reviewer for the valuable suggestion.

[1] Wu J, Wu Q, Shen J, Cai C. Towards attention-based convolutional long short-term memory for travel time prediction of bus journeys. Sensors. 2020;20(12):3354.

[2] Ran X, Shan Z, Fang Y, Lin C. An LSTM-based method with attention mechanism for travel time prediction. Sensors. 2019;19(4):861.
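The core mechanism discussed in this exchange, attention that scores historical hidden states and re-weights them before prediction, can be sketched as follows. This is an illustrative outline only: the dot-product scoring, the choice of the most recent hidden state as the query, and the array dimensions are assumptions for the example, not the paper's exact formulation.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_context(hidden_states, query):
    """Score each historical hidden state against a query vector (dot-product
    attention), normalize the scores with softmax, and return the weighted
    context vector together with the attention weights."""
    scores = hidden_states @ query        # one score per historical time slot, shape (T,)
    weights = softmax(scores)             # attention distribution over T slots
    context = weights @ hidden_states     # weighted sum of hidden states, shape (d,)
    return context, weights

# Toy sequence: T = 6 historical time slots, hidden size d = 4.
T, d = 6, 4
H = np.random.default_rng(0).standard_normal((T, d))
q = H[-1]  # assumption: use the most recent hidden state as the query
ctx, w = attention_context(H, q)
```

The attention weights form a probability distribution over the historical time slots, so the context vector emphasizes the slots most relevant to the current prediction, which is the behavior the proposed model adds on top of a plain GRU.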

2. Reviewer#3, Concern # 2: On line 203 of the article, too many commas are printed.

Author response: Thank you very much for pointing it out.

Author action: We have corrected the identified error in the manuscript.

3. Reviewer#3, Concern # 3: References are poorly written. For example, the representation of page numbers, some are p.785-794, some are: 785-794. In addition, the reference is ended with a period, and some ends with a semicolon and a period. and many more.

Author response: Thank you very much for pointing it out.

Author action: We have revised all references in the manuscript to follow a consistent format.

Attachments

Submitted filename: Response-to-Reviewers.docx
Decision Letter - Xiyu Liu, Editor

An Attention-based Recurrent Learning Model for Short-term Travel Time Prediction

PONE-D-22-18546R1

Dear Dr. Chughtai,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Xiyu Liu

Academic Editor

PLOS ONE

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #2: (No Response)

Reviewer #3: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #2: (No Response)

Reviewer #3: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #2: (No Response)

Reviewer #3: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #2: (No Response)

Reviewer #3: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #2: (No Response)

Reviewer #3: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #2: (No Response)

Reviewer #3: The authors proposed an attention-based GRU model for short-term travel time prediction to cope with this problem enabling GRU to learn the relevant context in historical time slots and update the weights of hidden states accordingly. They have answered all my questions. I have no other comments.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #2: No

Reviewer #3: No

**********

Formally Accepted
Acceptance Letter - Xiyu Liu, Editor

PONE-D-22-18546R1

An Attention-based Recurrent Learning Model for Short-term Travel Time Prediction

Dear Dr. Chughtai:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Professor Xiyu Liu

Academic Editor

PLOS ONE

Open letter on the publication of peer review reports

PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.

We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.

Learn more at ASAPbio.