Peer Review History

Original Submission - January 14, 2021
Decision Letter - Luca Citi, Editor

PONE-D-21-01369

Mathematically aggregating experts’ predictions of possible futures

PLOS ONE

Dear Dr. Hanea,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

In particular:

  • Improve the presentation of the aggregation rules (avoid a "laundry list"), providing the rationale and intuition behind each, and evidence from previous literature that such a scheme could be reasonable.
  • When presenting results, highlight the main points of the story.
  • Try to establish links with similar concepts in related disciplines (e.g. computer science / AI / statistics).
  • Make clear what this work adds to the literature.
  • Address the many constructive comments by the reviewers.

Please submit your revised manuscript by Apr 18 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,

Luca Citi, PhD

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1) Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2)  Thank you for stating the following in the Competing Interests section:

[The authors have declared that no competing interests exist.]

We note that one or more of the authors are employed by a commercial company: DelphiCloud and Cognimotive Consulting Inc.

i. Please provide an amended Funding Statement declaring this commercial affiliation, as well as a statement regarding the Role of Funders in your study. If the funding organization did not play a role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript and only provided financial support in the form of authors' salaries and/or research materials, please review your statements relating to the author contributions, and ensure you have specifically and accurately indicated the role(s) that these authors had in your study. You can update author roles in the Author Contributions section of the online submission form.

Please also include the following statement within your amended Funding Statement.

“The funder provided support in the form of salaries for authors [insert relevant initials], but did not have any additional role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript. The specific roles of these authors are articulated in the ‘author contributions’ section.”

If your commercial affiliation did play a role in your study, please state and explain this role within your updated Funding Statement.

ii. Please also provide an updated Competing Interests Statement declaring this commercial affiliation along with any other relevant declarations relating to employment, consultancy, patents, products in development, or marketed products, etc. 

Within your Competing Interests Statement, please confirm that this commercial affiliation does not alter your adherence to all PLOS ONE policies on sharing data and materials by including the following statement: "This does not alter our adherence to PLOS ONE policies on sharing data and materials." (as detailed online in our guide for authors http://journals.plos.org/plosone/s/competing-interests). If this adherence statement is not accurate and there are restrictions on sharing of data and/or materials, please state these. Please note that we cannot proceed with consideration of your article until this information has been declared.

Please include both an updated Funding Statement and Competing Interests Statement in your cover letter. We will change the online submission form on your behalf.

Please know it is PLOS ONE policy for corresponding authors to declare, on behalf of all authors, all potential competing interests for the purposes of transparency. PLOS defines a competing interest as anything that interferes with, or could reasonably be perceived as interfering with, the full and objective presentation, peer review, editorial decision-making, or publication of research or non-research articles submitted to one of the journals. Competing interests can be financial or non-financial, professional, or personal. Competing interests can arise in relationship to an organization or another person. Please follow this link to our website for more details on competing interests: http://journals.plos.org/plosone/s/competing-interests

3) We note that you have stated that you will provide repository information for your data at acceptance. Should your manuscript be accepted for publication, we will hold it until you provide the relevant accession numbers or DOIs necessary to access your data. If you wish to make changes to your Data Availability statement, please describe these changes in your cover letter and we will update your Data Availability statement to reflect the information you provide.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

Reviewer #4: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: No

Reviewer #2: Yes

Reviewer #3: No

Reviewer #4: I Don't Know

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: No

Reviewer #4: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

Reviewer #4: No

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This is a very nice piece of work; I enjoyed reading it very much. It is written really clearly, in English far better than I am qualified to judge. I found the paper very informative as a review of the different aggregation methods that can be used to combine experts' predictions.

I have two main criticisms of this work. The first, though it may be influenced by my background, is that the manuscript would benefit from being more connected to computer science concepts. For instance, I found that the definition provided for entropy is not the standard definition given in computer science or statistics.

The second criticism is that the evaluation of the 22 methods lacks any statistical support for the obtained results. I am not asking for an extreme statistical hypothesis test, but at least a paragraph stating what kind of control experiments the authors ran to verify that the obtained results are not affected by some hidden bias. For instance, what happens if the authors run the 22 tests on an artificial dataset produced by an agent that responds at random? I do not think this per se invalidates anything in this work, but it would be beneficial to understand whether the results the authors finally obtained are affected by chance or not.
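
To make the suggested control concrete, a minimal, purely illustrative sketch follows (it is not taken from the manuscript; the Brier score, the number of questions, and the random seed are placeholder assumptions):

# Sketch of the control experiment suggested above: an artificial "expert"
# answers every binary question with a random probability and is scored with
# the Brier score. All names and sizes here are placeholders.
import random

def brier_score(forecasts, outcomes):
    """Mean squared difference between probability forecasts and 0/1 outcomes."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

random.seed(0)
n_questions = 100
outcomes = [random.randint(0, 1) for _ in range(n_questions)]      # resolved events
random_forecasts = [random.random() for _ in range(n_questions)]   # random responder

# A uniform-random forecaster scores about 1/3 in expectation; an aggregator
# fed only such forecasts should not do systematically better.
print(brier_score(random_forecasts, outcomes))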

Some minor issues that I found on the manuscript:

43-45: Are the authors suggesting that, when the relative frequency interpretation is not appropriate, eliciting bounds while making a probabilistic forecast should be encouraged because it helps the expert elucidate a best estimate B_{i,c}?

104: I suggest the authors link the idea of analyzing or considering prior performance in a similar domain to the concept of transfer learning in computer science/machine learning.

157, Equation (1): I suggest the authors use shorter variable names, or Greek symbols, in equations. Mathematical notation with very long variable names is harder to read.

132: The important concept of calibration is used here before being explained in detail in Section 2.1.2.

166: I suggest the authors provide a reference to prior work on ROC curve definitions.

177: In this sentence, I think the word "calibration" could be replaced by "accuracy" and the sentence would still be valid. The explanation of calibration provided later in this paragraph is excellent, but I found that sentence a little confusing. Is "calibration" the same thing as "accuracy"?

199: Where is the "Refinement" term?

202: Entropy is used here, very roughly defined, but it is defined in more detail later in Section 2.1.3.

Section 2.1.3: The definition of entropy in Equation (3) is a little confusing. It is a very important concept that should be defined more thoroughly (to make the manuscript more self-contained). I would suggest citing and linking to the article "A modified belief entropy in Dempster-Shafer framework" (Zhou, 2017) and providing the Shannon entropy definition used in that article, which you could then adapt to this particular problem of computing the entropy of the distribution (p_k, 1 - p_k) (a brief generic sketch of this binary entropy appears at the end of these comments).

Section 3: The idea of seed questions could be linked to transfer learning (as mentioned earlier), as well as to supervised learning.

297: I suggest the authors add B_{i,c} to "simply takes the average of the best estimates B_{i,c} of each individual" for clarity.

301: Why is "the mean" in this line (I assume the arithmetic mean) different from "the simple average" mentioned in line 299?

318: I believe the concept of "extremizes the mean estimate" could be further defined and clarified (it is used again in line 692).

358: The name "KitchSinkWAgg" is somewhat unexpected in this context and I think it could be further clarified (I was very curious about it). Additionally, and more broadly, I think the manuscript would be improved by an extra table in which all the different methods are summarized and described (which aspect of the proxies each of them is trying to emphasize).

438: I think the NVivo software could be referenced (if a reference exists), or at least described in a little more detail.

527: There is a broken reference in the manuscript.

658: I am not sure what exactly the pronoun "Its" is referring to in this sentence.

Figure 2: I think you have 22 methods, not 21.

One last word regarding the Discussion section: it was hard for me to state clearly what the findings of this work are. Perhaps stating the findings one by one, in a clear manner, would help readers identify them.
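
For reference, and to complement the entropy comments above (manuscript line 202 and Section 2.1.3), the following is a minimal generic sketch of the Shannon entropy of the binary distribution (p_k, 1 - p_k); it is standard textbook material, not the manuscript's own notation or code:

# Shannon entropy (in bits) of a binary forecast distribution (p, 1 - p).
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1 - p)*log2(1 - p); zero for a certain outcome."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

print(binary_entropy(0.5))   # 1.0 bit: maximally uncertain, least informative
print(binary_entropy(0.9))   # about 0.47 bits: more informative (lower entropy)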

Reviewer #2: This study reports a purely experimental analysis of existing information-aggregation methods. Twenty-two aggregation methods are examined on several datasets. The evaluation relies on three criteria: accuracy, calibration, and informativeness. The evaluation results are clear to me: the beta-transformed arithmetic mean is the best, and the arithmetic mean of the non-parametric distribution is the worst. The presentation is clear and complete. I enjoyed reading this paper. The presented results are convincing to me.
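
For readers unfamiliar with the aggregator named above, the following minimal sketch illustrates a beta-transformed arithmetic mean and, with it, what "extremizing" a mean estimate means (cf. Reviewer #1's comment on manuscript line 318). The Beta parameters are arbitrary placeholders rather than the values fitted in the study, and scipy is assumed to be available:

# Sketch: a beta-transformed arithmetic mean averages the experts' probabilities
# and then pushes ("extremizes") the average away from 0.5 via a Beta CDF.
# The parameters a and b are placeholders, not the authors' fitted values.
from statistics import mean
from scipy.stats import beta

def beta_transformed_mean(probs, a=2.0, b=2.0):
    """Arithmetic mean of the probabilities, passed through a Beta(a, b) CDF."""
    return beta.cdf(mean(probs), a, b)

expert_probs = [0.60, 0.70, 0.65]           # hypothetical forecasts for one event
print(mean(expert_probs))                    # 0.65: plain arithmetic mean
print(beta_transformed_mean(expert_probs))   # about 0.72: extremized towards 1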

Reviewer #3: The authors consider the problem of aggregating experts' probability predictions of a future binary outcome. They consider many different aggregators that derive from the weighted average and differ in the way the weights are calculated. They apply the aggregators to three real-world datasets and compare the performances of the aggregators in terms of several well-established criteria. Even though no single aggregator emerges as a clear winner, the beta-transformed arithmetic mean (BetaArMean) performs very well.

Overall, I enjoyed reading the paper, and I believe the literature would benefit from an extensive comparison of different weighting schemes. However, the paper needs work before being published. In the attached document, I have listed both general and specific comments. I hope they help the authors to improve this paper.

Reviewer #4: Review of PONE-D-21 Mathematically aggregating experts’ predictions of possible futures

I was hoping to gain some new insights into aggregation when I read this paper. I wish I had. This paper reads more like a report to a government agency than a journal article that tells a story about how to improve aggregation. Many of the methods considered seem way too similar to matter, and yet they are considered anyway. The writing is difficult to follow, and the graphs are too small to show the differences the authors claim to find – the best and worst aggregation rules across datasets. I don't see a good explanation for WHY these aggregation rules are the best and worst, I don't understand what it was about the datasets that may have produced the differences, and I'm just not learning enough from the paper.

Perhaps I could be convinced that a very different version of this paper could be published. The new version would have to get right to the point and focus on the key points – not a laundry list of aggregation rules. It would also have to include an explanation of why the results came out the way they did. Presentation of the arguments is not strong in the current form. Figures 1 and 2 compare too many rules, and they don’t really highlight the main points of the story. Sorry to be negative, but I think the authors should focus more on what they add to the literature that is different, new and will result in progress on these issues.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

Reviewer #3: Yes: Ville Satopää

Reviewer #4: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Attachments
Attachment
Submitted filename: Review.pdf
Revision 1

Please see the attached document Answers2Comments.docx for detailed answers to all comments.

Attachments
Attachment
Submitted filename: Answers2Comments.docx
Decision Letter - Luca Citi, Editor

PONE-D-21-01369R1

Mathematically aggregating experts’ predictions of possible futures

PLOS ONE

Dear Dr. Hanea,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, all three reviewers and the editor agree that the paper has improved noticeably through the revision and that it is practically ready for publication. However, there are a few remaining minor points raised by the reviewers that we would like the authors to consider for their final version. To speed up the process, and in consideration of the reviewers' time, your new submission will not be sent for another round of reviews but will be evaluated by the editor only.

Please submit your revised manuscript by Aug 19 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A brief rebuttal letter that quickly responds to the last few outstanding points.
  • In the interest of time, there is no need to submit the marked-up copy of your manuscript (if the system requires a file to be uploaded, feel free to submit an empty document labeled 'Revised Manuscript with Track Changes').
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Luca Citi, PhD

Academic Editor

PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

Reviewer #2: All comments have been addressed

Reviewer #3: (No Response)

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: No

Reviewer #2: Yes

Reviewer #3: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: All the issues that I previously pointed out were effectively addressed. Indeed, the manuscript changed quite a lot, and the exposition of the ideas is now much clearer.

The included sections really help to understand the basis of the different methods, and there are now several references that truly aid in supporting the statements.

Additionally, Section 3.2.1 now explains much better, step by step, all the different methods and the rationale behind them. Table 1 offers a clear overview and categorization of all of them. The take-home message (e.g., extremizing the aggregate through the BetaArMean) is now much clearer.

Moving the detailed descriptions of the different methods to Section 6 greatly improves the structure of the manuscript: it is now easier to read.

Some final suggestions: in Figure 1, the scale font is completely unreadable (compare these fonts with the ones in Figure 2, which can be read perfectly).

The manuscript is now lengthy, but I believe it better fits what it is trying to do: summarize different methods for mathematically aggregating decisions from human experts.

Reviewer #2: Reviews:

This is my second-round review of this paper. Again, I enjoyed reading it. Compared with the previous version, some parts have been further clarified.

This paper provides a comprehensive review of information-aggregation methods in a group (expert) decision-making scenario. Although the studies are purely empirical, which may mean a lack of theoretical background and depth, the concrete contribution to applications is impressive and laudable. Therefore, again, I recommend acceptance of this paper.

As a minor issue, I would suggest the authors revise the abstract. The current abstract contains too much background (motivation, research gap, etc.) and only three sentences (lines 20-24, page 1) about the substantive work of this paper. It would be better to highlight your work, your contribution, and the significance of this study in the abstract.

Reviewer #3: (Comments submitted as a separate file)

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Rodrigo Ramele

Reviewer #2: Yes: Dr. Junyi (Don) CHAI

Reviewer #3: Yes: Ville Satopää

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Attachments
Attachment
Submitted filename: Review.pdf
Revision 2

Please see the response to the reviewers' last comments attached as Review_final.docx

Attachments
Attachment
Submitted filename: Review_final.docx
Decision Letter - Luca Citi, Editor

Mathematically aggregating experts’ predictions of possible futures

PONE-D-21-01369R2

Dear Dr. Hanea,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Luca Citi, PhD

Academic Editor

PLOS ONE

Formally Accepted
Acceptance Letter - Luca Citi, Editor

PONE-D-21-01369R2

Mathematically aggregating experts’ predictions of possible futures

Dear Dr. Hanea:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Luca Citi

Academic Editor

PLOS ONE

Open letter on the publication of peer review reports

PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.

We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.

Learn more at ASAPbio.