Peer Review History
Original Submission: June 24, 2022
PONE-D-22-18053
Graph-based machine learning improves just-in-time defect prediction
PLOS ONE

Dear Dr. Moriano,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE's publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Jan 27 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Orawit Thinnukool, Ph.D.
Academic Editor
PLOS ONE

Journal Requirements: When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf.

2. Thank you for stating in your Funding Statement: "This research was sponsored in part by Oak Ridge National Laboratory's (ORNL's) Laboratory Directed Research and Development program. Pablo Moriano acknowledges support from ORNL's Artificial Intelligence initiative. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript." Please provide an amended statement that declares *all* the funding or sources of support (whether external or internal to your organization) received during this study, as detailed online in our guide for authors at http://journals.plos.org/plosone/s/submit-now. Please also include the statement "There was no additional external funding received for this study." in your updated Funding Statement. Please include your amended Funding Statement within your cover letter. We will change the online submission form on your behalf.

3. Thank you for stating the following in the Acknowledgments Section of your manuscript: "This research was sponsored in part by Oak Ridge National Laboratory's (ORNL's) Laboratory Directed Research and Development program. Pablo Moriano acknowledges support from ORNL's Artificial Intelligence initiative." We note that you have provided additional information within the Acknowledgements Section that is not currently declared in your Funding Statement. Please note that funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form. Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement. Currently, your Funding Statement reads as follows: "This research was sponsored in part by Oak Ridge National Laboratory's (ORNL's) Laboratory Directed Research and Development program. Pablo Moriano acknowledges support from ORNL's Artificial Intelligence initiative.
The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript." Please include your amended statements within your cover letter; we will change the online submission form on your behalf.

Additional Editor Comments (if provided):

Dear Author,

The manuscript "Graph-based machine learning improves just-in-time defect prediction" investigates a very important and practical problem. The authors present a sufficient introduction, research motivation, and statement of contribution. The literature review is sufficient. The proposed methods are new and innovative. In my point of view, the idea of this paper is interesting and has some novelty. The paper is clearly presented and easy to follow, assuming the reader has some background knowledge. Although I found that your paper has merit, it is not acceptable for publication in its present form. Please revise the manuscript according to the reviewers' comments and upload the revised file within 30 days. Please find the reviewers' comments at the end of this message. The evaluation is also good; however, the paper can be improved in the following ways.

Major:

1. Though the authors have updated the manuscript, a few comments are still not addressed. Please consider the following comments and revise the paper. Abstract: read the complete abstract and try to write direct, simple, and straightforward sentences, e.g., "The main objective ……"
2. Open science? Although a link is provided to the data (provided by another research group/Ning Li), there is no code or details, for instance, on how the holdout samples were drawn. This means your analysis could not be reproduced. Please provide your Python code, etc.
3. There does seem to be a slight tendency to cherry-pick results to "big up" the method. This isn't helpful or necessary. It would be better to either give the range of possible results and/or the typical improvement.
4. The reviewers are concerned about the classification performance evaluation. The authors note the problems relating to imbalanced datasets and for that reason advocate AUC-PRC (fair enough), but then extensively quote F1 performance statistics, which seems odd (e.g., l438).
5. Another problem with AUC-style statistics is that they are based on entire families of classifiers (many of which are uninteresting in a practical sense), so unless classifier x strictly dominates y, knowing AUC_x > AUC_y is not informative. There are also other concerns (see [1-2]).
6. I'd consider using the Matthews correlation coefficient (phi) or Youden's J, which compares against a guessing strategy (an illustrative computation sketch appears at the end of this review round).
7. The lack of tuning may be misleading (l480-2). I appreciate that you've not tuned any method, but it seems a bit unrealistic.
8. The paper should have more case studies related to the work.
9. The literature review should include a table in which each paper's method, problem, constraints, and research are analyzed in detail in tabular form.
10. The research findings and limitations must be stated before the conclusion of the work.
11. The authors mention: "The core of our contribution is problem formulation. In particular, we leverage contribution graphs to extract graph-related features that inform classification models when classifying defect-prone changes." I do not understand whether the contribution is more on data classification modeling, proposing a new feature vector, or a new framework. Please elaborate.
12. Expand the comparison with state-of-the-art work (at least 3-4 works), not only work [10].

Minor:

1. English should be extensively revised and corrected.
2. Abstract (and l486): "by as much as 46.72%", but this isn't the representative case. I think the authors can reframe this claim in a more reasonable way without losing the impact or value of their work. We're not in the advertising business ;-) l398: lower -> fewer (being a bit pedantic here!)

References:
[1] Hand, D. (2009). Measuring classifier performance: a coherent alternative to the area under the ROC curve. Machine Learning, 77, 103-123. https://doi.org/10.1007/s10994-009-5119-5
[2] Powers, D. (2012). The problem of area under the curve. International Conference on Information Science and Technology (ICIST), Wuhan.

3. The notation should be collected in a dedicated table.
4. The time complexity of each method must be stated in the paper.
5. Reorganize the methodology section into phases, with one main diagram that represents each phase and places each step within a distinct phase.
6. Figures 1 and 2 are missing.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly
Reviewer #2: Yes
Reviewer #3: Partly

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: N/A
Reviewer #2: Yes
Reviewer #3: Yes

3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians, and variance measures should be available. If there are restrictions on publicly sharing data (e.g., participant privacy or use of data from a third party), those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #3: Yes

4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #3: Yes

5. Review Comments to the Author. Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters.)

Reviewer #1:

STRENGTHS: The idea is interesting and has some novelty. The paper is clearly presented and easy to follow, assuming the reader has some background knowledge.

WEAKNESSES: Open science? Although a link is provided to the data (provided by another research group/Ning Li), there is no code or details, for instance, on how the holdout samples were drawn.
This means your analysis could not be reproduced. Please provide your Python code, etc. There does seem to be a slight tendency to cherry-pick results to "big up" the method. This isn't helpful or necessary. It would be better to either give the range of possible results and/or the typical improvement.

OTHER COMMENTS: I'm concerned about the classification performance evaluation. The authors note the problems relating to imbalanced datasets and for that reason advocate AUC-PRC (fair enough), but then extensively quote F1 performance statistics, which seems odd (e.g., l438). Another problem with AUC-style statistics is that they are based on entire families of classifiers (many of which are uninteresting in a practical sense), so unless classifier x strictly dominates y, knowing AUC_x > AUC_y is not informative. There are also other concerns (see [1-2]). I'd consider using the Matthews correlation coefficient (phi) or Youden's J (which compares against a guessing strategy). The lack of tuning may be misleading (l480-2). I appreciate that you've not tuned any method, but it seems a bit unrealistic.

MINOR: Abstract (and l486): "by as much as 46.72%", but this isn't the representative case. I think the authors can reframe this claim in a more reasonable way without losing the impact or value of their work. We're not in the advertising business ;-) l398: lower -> fewer (being a bit pedantic here!)

REFERENCES:
[1] Hand, D. (2009). Measuring classifier performance: a coherent alternative to the area under the ROC curve. Machine Learning, 77, 103-123. https://doi.org/10.1007/s10994-009-5119-5
[2] Powers, D. (2012). The problem of area under the curve. International Conference on Information Science and Technology (ICIST), Wuhan.

Reviewer #2:

The manuscript "Graph-based machine learning improves just-in-time defect prediction" investigates a very important and practical problem. The authors present a sufficient introduction, research motivation, and statement of contribution. The literature review is sufficient. The proposed methods are new and innovative. The evaluation is also good. However, the paper can be improved in the following ways.

1. The paper should have more case studies related to the work.
2. The notation should be collected in a dedicated table.
3. The time complexity of each method must be stated in the paper.
4. The literature review should include a table in which each paper's method, problem, constraints, and research are analyzed in detail in tabular form.
5. The research findings and limitations must be stated before the conclusion of the work.

Reviewer #3:

Many drawbacks exist in the presented paper, such as:
- The authors mention: "The core of our contribution is problem formulation. In particular, we leverage contribution graphs to extract graph-related features that inform classification models when classifying defect-prone changes." I do not understand whether the contribution is more on data classification modeling, proposing a new feature vector, or a new framework. Elaborate.
- Reorganize the methodology section into phases, with one main diagram that represents each phase and places each step within a distinct phase.
- Figures 1 and 2 are missing.
- Expand the comparison with state-of-the-art work (at least 3-4 works), not only work [10].

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose "no", your identity will remain anonymous but your review may still be made public.
Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Martin Shepperd
Reviewer #2: Yes: Abdullah Lakhan
Reviewer #3: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
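Illustrative sketch for the metric suggestions above (editor's Major comment 6 and Reviewer #1's comments): the snippet below is a minimal, hypothetical example of how the Matthews correlation coefficient (phi) and Youden's J can be reported alongside AUC-PR and F1 on an imbalanced binary problem using scikit-learn. The synthetic data, random-forest classifier, and 0.5 threshold are placeholder assumptions for illustration only; they are not taken from the manuscript's actual evaluation pipeline.

```python
# Hypothetical sketch: point metrics (F1, MCC, Youden's J) alongside the
# threshold-free AUC-PR on an imbalanced toy dataset. Not the authors' code.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import (average_precision_score, f1_score,
                             matthews_corrcoef, recall_score)

# Imbalanced toy data standing in for clean vs. defect-prone changes (~10% positives).
X, y = make_classification(n_samples=5000, n_features=20,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
scores = clf.predict_proba(X_te)[:, 1]       # continuous scores for AUC-PR
preds = (scores >= 0.5).astype(int)          # fixed threshold for point metrics

sensitivity = recall_score(y_te, preds)                # true positive rate
specificity = recall_score(y_te, preds, pos_label=0)   # true negative rate

print(f"AUC-PR     : {average_precision_score(y_te, scores):.3f}")
print(f"F1         : {f1_score(y_te, preds):.3f}")
print(f"MCC (phi)  : {matthews_corrcoef(y_te, preds):.3f}")
print(f"Youden's J : {sensitivity + specificity - 1:.3f}")
```

Because MCC and Youden's J are computed at a single operating point, they complement rather than replace the threshold-free AUC-PR that the authors advocate for imbalanced data.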
Revision 1
Graph-based machine learning improves just-in-time defect prediction
PONE-D-22-18053R1

Dear Dr. Moriano,

We're pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you'll receive an e-mail detailing the required amendments. When these have been addressed, you'll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance.

To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they'll be preparing press materials, please inform our press team as soon as possible, no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Orawit Thinnukool, Ph.D.
Academic Editor
PLOS ONE

Additional Editor Comments (optional): I am confident that the paper is now ready for publication.
Formally Accepted
PONE-D-22-18053R1
Graph-based machine learning improves just-in-time defect prediction

Dear Dr. Moriano:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of
Assistant Professor Orawit Thinnukool
Academic Editor
PLOS ONE
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.