Peer Review History
Original Submission: April 8, 2025
PONE-D-25-18078
Categorical consistency of parity and magnitude facilitates implicit learning of color-number associations
PLOS ONE

Dear Dr. Retter,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE's publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

I have received two thorough reviews of your manuscript by experts in the field and read the manuscript myself. Both reviewers acknowledge your manuscript addresses an important question and provides valuable empirical evidence. However, several limitations must be addressed before publication. I concur with their evaluation and briefly summarize the main points requiring additional work below. As Reviewer 1 emphasizes, there is a need for a stronger theoretical motivation. Also, Reviewer 2 raises critical methodological concerns about your paradigm's inability to distinguish between response-level and semantic-level learning mechanisms, which should be acknowledged. Both reviewers highlight issues with missing counterbalancing of key experimental factors, which is crucial to discuss further. Additionally, Reviewer 2 identifies redundancies and potential errors in the statistical analyses. The reviewers have provided excellent guidance for these revisions. I encourage you to submit a revised manuscript that addresses these core issues while maintaining the strengths of your empirical work.

Please submit your revised manuscript by Jul 27 2025 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.
Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Elisa Scerrati
Academic Editor
PLOS ONE

Journal requirements: When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

2. Please note that PLOS ONE has specific guidelines on code sharing for submissions in which author-generated code underpins the findings in the manuscript. In these cases, we expect all author-generated code to be made available without restrictions upon publication of the work. Please review our guidelines at https://journals.plos.org/plosone/s/materials-and-software-sharing#loc-sharing-code and ensure that your code is shared in a way that follows best practice and facilitates reproducibility and reuse.

3. We note that the grant information you provided in the 'Funding Information' and 'Financial Disclosure' sections does not match. When you resubmit, please ensure that you provide the correct grant numbers for the awards you received for your study in the 'Funding Information' section.

4. Please expand the acronym "FHSE/UL" (as indicated in your financial disclosure) so that it states the name of your funders in full. This information should be included in your cover letter; we will change the online submission form on your behalf.

5. When completing the data availability statement of the submission form, you indicated that you will make your data available on acceptance. We strongly recommend all authors decide on a data sharing plan before acceptance, as the process can be lengthy and hold up publication timelines. Please note that, though access restrictions are acceptable now, your entire data will need to be made freely accessible if your manuscript is accepted for publication. This policy applies to all data except where public deposition would breach compliance with the protocol approved by your research ethics board. If you are unable to adhere to our open data policy, please kindly revise your statement to explain your reasoning and we will seek the editor's input on an exemption. Please be assured that, once you have provided your new statement, the assessment of your exemption will not hold up the peer review process.

6. Your ethics statement should only appear in the Methods section of your manuscript. If your ethics statement is written in any section besides the Methods, please delete it from any other section.

7. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1.
Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly
Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics.
(Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1:

"Categorical consistency of parity and magnitude facilitates implicit learning of color-number associations" by Retter and Schiltz: This manuscript provides valuable insights into the relationship between color and parity and color and magnitude, in terms of whether associative learning can result in implicit associations between these dimensions. The authors conducted a single experiment, with three tasks. The parity task aimed to replicate prior findings by the researchers demonstrating that associative learning can emerge between color and parity. The magnitude task was a novel task designed to evaluate whether associative learning could also extend to mapping colors to categories of magnitude. And lastly, the parity-mix task aimed to demonstrate whether learning from tasks 1 and 2 could be merged to influence responses in a final parity task. At the end, participants were also asked to complete tasks aiming to explicitly demonstrate learned associations.

The data are appropriately analyzed and the majority of the results are clear and carefully describe key details. There are no concerns about potential competing interests on the part of the authors, data availability, or research ethics. Together, this manuscript reports an interesting experimental design investigating parity/magnitude associations. However, there are several points of concern that I think should be addressed to ensure this work is ready for publication, including clarifying the purpose/context of the work amidst prior work and addressing concerns in the methodology of part 3.

Major concerns

1. The theoretical basis/motivation for this work was limited throughout the introduction. There was a clear overview of associative learning and why investigating associative learning at the categorical level (vs. item level) is important, which was great.
However, there was little discussion on why parity and magnitude are beneficial categories to assess in the context of associative learning. Similarly, it was unclear throughout the introduction what the rationale was for replicating the authors' previous work on parity, as well as why it is important to extend these ideas to magnitude. Specifically, I think the work would benefit from additional elaboration and clarification of how this work is different from the previous studies performed by these authors (i.e., adding to the statement on page 2 of the Intro: "Here, we aimed to replicate these results for parity, and extend the paradigm to testing for another category-level association, magnitude, which is a fundamental aspect of number processing.").

2. The rationale, design, and interpretation of the results of the parity-mix task left me with a number of questions/concerns. The authors mention that all participants completed the tasks in the same order (parity, magnitude, parity-mix), which seems particularly concerning in evaluating the results of this final task. It seems quite possible that order effects are contributing to the results, such that a primacy effect could be driving why participants were most likely to report blue-yellow colors correctly, given those are the color categories first learned in the parity task. This concern could be addressed with additional clarification on the logic for why the order was not balanced across participants and/or, ideally, a set of data in which the order is reversed for tasks 1 & 2.

Further, the colors for the parity-mix task read as an essential component to the task. However, given the limited recognition by the participants that the colors were different, this suggests that the task likely appeared exactly the same as task 1 for participants.
If the colors were more distinct, such that the purplish-blue and yellowish-green were clear border colors between purple and blue and yellow and green (and participants reported such colors as being on their own color category borders), the results would more directly address the question of whether associations are overlapping across categories. Further, I wonder whether other kinds of associations could be driving the matched vs. mismatched results of this task, such as the dark-is-more bias (e.g., McGranaghan, 1989; Schloss et al., 2018; etc.), in which people infer that darker colors represent more. Blues and purples are generally/often darker than yellows and greens and were mapped to larger magnitudes in the present study. Such a bias being activated could provide an alternative explanation for the higher accuracy of matched trials than mismatched trials. The work would benefit from a discussion considering these points in the context of interpreting the results of the parity-mix task.

3. Throughout the introduction (and a bit beyond too), the use of "category-level" could be interpreted with respect to multiple different dimensions, including parity, magnitude, and color. To clarify, when using "categorical-level", for the most part (as I understand it), the authors use it in reference to the categories of parity and/or magnitude and NOT the color categories used (i.e., the color category of blue). However, in the Introduction's discussion of the parity-mix experiment, there are references to associations extending to different color categories, and in the design of that experiment itself, there is greater consideration of color categories (given the colors are shifted towards neighboring color categories) as well as the parity and magnitude categories. The manuscript would benefit from ensuring clarity throughout on which categories are being referenced when discussing the category level.
I encourage the authors to provide additional details and examples throughout the introduction to clarify the multiple dimensions being discussed.

4. In order for others to have the ability to replicate this work, it would be necessary to add information on the following:
a. how much time participants took (on average) to complete each task,
b. how much compensation was given to participants,
c. how participants were evaluated for atypical color vision (it seems quite surprising that not a single participant of 40 had atypical color vision, given prevalence rates),
d. the rationale for the sample size of 40 participants (e.g., was a power analysis performed? If not, how was this number of 20 per condition selected, and how is sufficient power ensured?)

5. In the discussion, the authors briefly discuss one practical extension of the work. Aligning with the comments on the introduction and providing additional motivation for why this work is important, it would be beneficial to expand on the implications of these results.

Minor concerns

6. Regarding the methods, it is currently unclear how many total congruent versus incongruent trials were presented for each task. This information seems particularly important to include given that the results depend on the differences between congruent & incongruent trials. A related minor comment: when reporting the results, it would help readers to see the means & variation for each level being used to determine the congruency effect when discussing this effect (versus currently, where the means/SE are discussed later in the results). This would be especially helpful given that the individual-subject data on the results graphs make it a bit challenging to clearly see the means & error bars.

7. While the Supplementary materials are available, a more detailed summary of the main points from those could be incorporated into the main text.
Currently, the sentence regarding the "congruency effects were observed to evolve across blocks for the category level experiment…" felt out of place and missing the context needed to understand its importance.

8. There is a bit of inconsistency in the language used throughout the manuscript that sometimes makes it a bit difficult to interpret the information. For instance, each of the following was used in reference to (at least, I think) the same manipulation in the experiment: category-level experiment; category-level version; experimental group.

9. The discussion had a few typos (the rest of the paper was well-written in that regard). A few examples include: "to" is missing between "compared" and "incongruent" in the first sentence; "leaning" instead of "learning"; "we predicting" instead of "we predicted"; etc.

Reviewer #2: The reviewer comments are in the attached file.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose "no", your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]
While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, log in and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
Revision 1
Categorical consistency of parity and magnitude facilitates implicit learning of color-number associations
PONE-D-25-18078R1

Dear Dr. Retter,

We're pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you'll receive an e-mail detailing the required amendments. When these have been addressed, you'll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice will be generated when your article is formally accepted. Please note, if your institution has a publishing partnership with PLOS and your article meets the relevant criteria, all or part of your publication costs will be covered. Please make sure your user information is up-to-date by logging into Editorial Manager® and clicking the 'Update My Information' link at the top of the page. For questions related to billing, please contact billing support.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they'll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Elisa Scerrati
Academic Editor
PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1.
If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the "Comments to the Author" section, enter your conflict of interest statement in the "Confidential to Editor" section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed
Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous.
Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This revised manuscript has substantially improved in clarity and rigor from the original submission. Thank you to the authors for their work to carefully and meaningfully address the questions and concerns that were originally posed. The introduction now provides a stronger motivation for why the authors are studying parity & magnitude. Throughout, the article is clearer with respect to terminology (e.g., "category level"), the rationale for design decisions (i.e., the decision not to counterbalance the task orders), and potential alternative explanations/limitations of the work. The authors also added essential details about their study, which would be needed for future replication by others (e.g., timing, compensation, trial numbers, etc.).

My one comment to note (which may reflect an intentional choice by the authors): although the authors provide a URL to their data on Zenodo, the URL (at least at this time) appears to lead to a broken page (i.e., a "DOI not found" message). Apologies if this is intentional (i.e., if the data will be publicly posted upon publication). I did not see mention of such qualifications, so I wanted to note that the URL currently does not permit access to the data, in case this is due to a typo/error in the URL. Otherwise, in my reading of the revised manuscript and the authors' response to the reviews, I believe the present work is suitable for publication in PLOS ONE.

Reviewer #2: Thank you for your detailed responses.
I am satisfied with the result and find the manuscript much improved.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose "no", your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No

**********
Formally Accepted
PONE-D-25-18078R1
PLOS ONE

Dear Dr. Retter,

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now being handed over to our production team.

At this stage, our production department will prepare your paper for publication. This includes ensuring the following:

* All references, tables, and figures are properly cited
* All relevant supporting information is included in the manuscript submission
* There are no issues that prevent the paper from being properly typeset

You will receive further instructions from the production team, including instructions on how to review your proof when it is ready. Please keep in mind that we are working through a large volume of accepted articles, so please give us a few days to review your paper and let you know the next and final steps.

Lastly, if your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

You will receive an invoice from PLOS for your publication fee after your manuscript has reached the completed accept phase. If you receive an email requesting payment before acceptance or for any other service, this may be a phishing scheme. Learn how to identify phishing emails and protect your accounts at https://explore.plos.org/phishing. If we can help with anything else, please email us at customercare@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of Dr. Elisa Scerrati
Academic Editor
PLOS ONE
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.