Peer Review History
Original Submission: October 25, 2022
PONE-D-22-29437
Methods for Improving the Quality of National Household Surveys
PLOS ONE

Dear Dr. West,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Apr 27 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Bidhubhusan Mahapatra, Ph.D.
Academic Editor
PLOS ONE

Journal requirements: When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf.

2. Please provide additional details regarding participant consent. In the ethics statement in the Methods and online submission information, please ensure that you have specified what type you obtained (for instance, written or verbal, and if verbal, how it was documented and witnessed). If your study included minors, state whether you obtained consent from parents or guardians. If the need for consent was waived by the ethics committee, please include this information.

3.
Thank you for stating the following in the Acknowledgments Section of your manuscript: “The authors wish to acknowledge the contributions of Mick Couper, Colette Keyser, Andrew Hupp, and the rest of the AFHS team in the Survey Research Operations unit of the Survey Research Center.”

We note that you have provided funding information that is not currently declared in your Funding Statement. However, funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form. Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement. Currently, your Funding Statement reads as follows: “This work was supported by the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD) of the National Institutes of Health (grant number R01HD095920; PI: B.T. West; website: https://www.nichd.nih.gov/). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.” Please include your amended statements within your cover letter; we will change the online submission form on your behalf.

4. In your Data Availability statement, you have not specified where the minimal data set underlying the results described in your manuscript can be found. PLOS defines a study's minimal data set as the underlying data used to reach the conclusions drawn in the manuscript and any additional data required to replicate the reported study findings in their entirety. All PLOS journals require that the minimal data set be made fully available. For more information about our data policy, please see http://journals.plos.org/plosone/s/data-availability.
Upon re-submitting your revised manuscript, please upload your study’s minimal underlying data set as either Supporting Information files or to a stable, public repository and include the relevant URLs, DOIs, or accession numbers within your revised cover letter. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories. Any potentially identifying patient information must be fully anonymized.

Important: If there are ethical or legal restrictions to sharing your data publicly, please explain these restrictions in detail. Please see our guidelines for more information on what we consider unacceptable restrictions to publicly sharing data: http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions. Note that it is not acceptable for the authors to be the sole named individuals responsible for ensuring data access. We will update your Data Availability statement to reflect the information you provide in your cover letter.

5. We note that you have indicated that data from this study are available upon request. PLOS only allows data to be available upon request if there are legal or ethical restrictions on sharing data publicly. For more information on unacceptable data access restrictions, please see http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions. In your revised cover letter, please address the following prompts:

a) If there are ethical or legal restrictions on sharing a de-identified data set, please explain them in detail (e.g., data contain potentially sensitive information, data are owned by a third-party organization, etc.) and who has imposed them (e.g., an ethics committee). Please also provide contact information for a data access committee, ethics committee, or other institutional body to which data requests may be sent.
b) If there are no restrictions, please upload the minimal anonymized data set necessary to replicate your study findings as either Supporting Information files or to a stable, public repository and provide us with the relevant URLs, DOIs, or accession numbers. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories. We will update your Data Availability statement on your behalf to reflect the information you provide.

Additional Editor Comments: This is a very well written paper; however, as the reviewers highlight, there are some areas that need more clarification.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #3: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #3: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available.
If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: No
Reviewer #3: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes
Reviewer #3: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The issue undertaken in this manuscript entitled “Methods for improving the quality of national household surveys” is a highly relevant, novel, and timely initiative to improve the response rate from different population segments. These modes of data collection are most useful when the information sought concerns sensitive and private matters. However, these interventions to improve the response rate may not work, or may not yield the same impact, in settings across the globe. The reasons for such apprehension are practical and resource-bound rather than theoretical. In a developing-country setting, a face-to-face interview may have a better chance of achieving a higher response rate in national household surveys, because people still do not trust outside communications (mail/email) when it comes to divulging information about their households/dwelling units or any of their members. They trust only local representatives/leaders and seek their approval (oral/written) before providing any such information.
Thus, face-to-face interaction between the interviewer/field supervisor and the local leader/representative regarding the aim and objective of the intended household survey is always helpful in reducing nonresponse in the initial phase of any household survey.

The authors, on page 28 in the discussion of the modular approach, stated: “In short, this study suggested that asking respondents to complete several shorter web/mail surveys was not as effective as asking respondents to complete a single long web/mail survey and take breaks as needed; the number of survey requests, and not the length, seemed to be the limiting factor”. However, the modular approach is sometimes also used to generate a specific indicator from respondents with specific background characteristics, not to divide a longer interview into several shorter interviews. On the other hand, a further point in support of your study could be that a typical respondent wishes to finish the whole interview in one go, as s/he may not get the time/privacy to complete the survey again and again. In face-to-face interviews, such as those conducted in many countries and varying settings under the Demographic and Health Survey (DHS), respondents often themselves requested to finish the interview at once. This is why more than 85% of DHS interviews are completed on the first visit itself. There could be various reasons for such a preference/pattern in a given setting.

Reviewer #2: I have fairly little to criticize on this paper. The paper is very nicely written, and the research design is clear and capable of answering the research question. I have only some very general remarks; some of them can be easily incorporated, while others may raise doubts about whether PLOS is the right outlet for this paper.

(1) The paper deals with a highly specific topic that primarily concerns practitioners of survey research.
Frankly, I do not really believe that a paper finding that incentives and mail options increase response rates of population surveys is of interest to many readers. Thus, I was wondering whether this paper should be addressed to one of several survey methods journals. However, this is a question the Editor needs to deal with. I look here only from the perspective of a survey methodologist.

(2) In front of your hypothesis 1, you state that there is "theoretical and empirical support for the first hypothesis" -- and I agree. However, it strikes me to ask why a hypothesis that is theoretically sound and well supported by empirical evidence requires further research. This is also somewhat similar for the other hypotheses, though I agree that the empirical evidence is not as strong in those cases. Of course, I support the idea of replicating existing studies to see whether the finding holds in different data, or whether the finding continues to be correct. It does, as you find out. But this then is again a research question that is highly specialized and not necessarily a topic for PLOS. In any case, you should try to explain why we need a further study for hypothesis 1.

(3) Given that this is a general-audience outlet, I was wondering whether you should make crystal clear that you are dealing with the online mode in a probability sample, and not with one of those self-selected online access panels that gain so much public attention.

(4) The second hypothesis is already very specific. You expect an improvement of response rates for "motivated participants" with "mail or telephone reminders", which is fine, but raises again the question of general interest. The third hypothesis then is even more specific: it concerns only "carefully designed AND communicated modular surveys" with a "lengthy questionnaire". Can you expand a bit on whether these results can be expected to transfer to other surveys, too?
(5) Hypothesis 3 mixes the hypothesis with the reasons for the hypothesis. I would separate the part behind "due to" from the hypothesis itself, especially since you are not investigating the "perceived convenience" mechanism.

(6) I was wondering if it is common for PLOS to print so many differentiations of significance levels. This is now strongly discouraged by a number of other journals, and also by the ASA. I would also not present percentage points and percentage-point differences with decimal places, but that's a matter of taste, perhaps.

(7) On pg. 20/21 you reflect on the differences in the results between the various ESRI groups. In doing so, you take the values at face value, stating that the hypothesis is true for one group but not for some other. Another way to look at these inconsistent results is to state that the results are not robust at all, so that there is huge uncertainty about whether the hypothesis is supported at all (despite the significance tests). There is also not really a theoretical reason why the hypothesis should be true for one group but not the other. In fact, your theoretical reasoning suggests that it should be generally true.

(8) Maybe not your fault, but the resolution of the graphs was very bad in the file. Hard to read for my old eyes.

Reviewer #3: The title is far too broad and does not reflect the content of the paper. Please change it to something that reflects the experiment reported in the paper.

I find that the introductory sections of the paper (e.g. p.2, l.21-22; p.4, l.18-19) make unwarranted claims. The study tests just three particular aspects of survey design, and tests only a particular protocol for each aspect (other protocols are possible), so it should not be claimed that “A robust set of recommendations for new approaches to future surveys” can be provided as a consequence.

Overall, there are two robust tests included in this paper.
One is the test of the effect of the $5+priority mailing and the other is the test of modular design. Both of these are based on carefully designed and implemented experiments. The rest of the paper consists of descriptive analysis accompanied by highly speculative interpretation. I strongly suggest that this analysis (which relates to hypothesis 1 and most of hypothesis 2) should be dropped and the paper more closely focussed on the two aspects that have robust design. Furthermore, the authors report on p.29 that a follow-up study has separated the effects of the $5 and the priority mailing. I would strongly recommend combining that analysis with the analysis in this paper of the combined protocol, to provide readers with all the results on this specific part of the survey protocol in one place.

In the ‘Guiding theoretical framework’ section, all the examples given appear to be from one country. The authors should state whether they expect the framework and the study findings to be generalizable and, if so, justify the restriction of examples to one country (or include examples from other countries). Alternatively, state clearly that this study applies to the U.S.

I find the hypotheses rather vague. For example, the first one (p.8, l.5-8) refers to the effect of several different components of design (sequential mixed-mode, mail options, cash incentives, web invitation by mailed letter). It would be much more informative to the reader if the effect of one or more of these components could be separated out. Furthermore, the counterfactual is not stated. The hypothesis is that this set of design components in combination “…will improve rates of response … along with …. quality …” But compared to what? Without knowing what we are comparing to, the hypothesis is meaningless. The second hypothesis (p.9, l.1-3) has fewer design components confounded (“mail and telephone reminders”) but again refers to improving response rates. Compared to what? No reminders?
That is not a very useful protocol to study, as there is no good survey practice that involves no reminders. It has been clear since the 1950s that reminders are a cost-effective way to improve survey response rates. The third hypothesis also fails to state a counterfactual, but this becomes apparent upon reading the design of the experiment.

Regarding the household screening, it is stated (p.13, l.19-20) that if there was only one eligible person, that person was immediately invited to complete the survey online. How was that achieved for households who completed the paper version of the screener? Was there an instruction with a web link included within the screener questionnaire? How successful was this? It seems unlikely that someone filling in a short paper questionnaire, having declined the opportunity to do it online, would then complete a long questionnaire online.

It appears (p.14, l.9-10) that postcard reminders were sent only to people who had not provided an email address. This seems odd, given the relatively low impact of emails (lots of people do not receive/see/open them). Why wouldn’t you send postcards to all, with the email additional for those with email addresses?

I don’t understand the claim (p.19, l.14) that mail packet 1 added 5.78 p.p. Fig. 3 suggests 6.77 – 5.73 = 1.04 p.p. What am I missing?

p.20, l.4: “increase the diversity”: you have not presented evidence of this. To claim that increasing response amongst females (for example) increases diversity, you need also to show that females were under-represented in the absence of this feature (i.e. if you look only at web respondents). Also, it is unclear how you have used individual characteristics to classify households (this is household response). The same comment about diversity applies to the analysis of the effect of the priority mailing (table 1): this does not tell us whether the priority mailing is improving the sample balance or making it worse.
The discussion of findings regarding hypothesis 2 (pp. 22-24) is rather unfocussed and reads a little as if data dredging is going on here. Lacking is a clear statement of which effects were tested and how many. Without this, the meaning of the differences reported cannot be judged. Including the results in table form could be helpful.

The reporting of hypothesis 3 findings suggests that invitation to module n+1 was conditional on completing module n. This was not clear in the Method section and requires clarification. Does it not make more sense to invite all screened-in people to each module?

p.28, l.4 “older Black females”: do you mean “younger Black females”?

The recommendations (p.29) are almost completely unsupported by the presented evidence. E.g. the authors recommend a push-to-web approach, despite the fact that they did not compare this to any other approach and only obtained a 15% response rate. Similarly, the use of telephone reminders was not explicitly compared to any alternative protocol. Recommendation #4 is the only one that is fully supported.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No
Reviewer #3: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]
While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
Revision 1: April 19, 2023
PONE-D-22-29437R1
Methods for improving participation rates in national self-administered web/mail surveys: Evidence from the United States
PLOS ONE

Dear Dr. West,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Aug 17 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:
If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Bidhubhusan Mahapatra, Ph.D.
Academic Editor
PLOS ONE

Journal Requirements: Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed
Reviewer #2: All comments have been addressed
Reviewer #3: (No Response)

**********

2.
Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: (No Response)
Reviewer #3: Partly

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: (No Response)
Reviewer #3: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: (No Response)
Reviewer #3: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: (No Response)
Reviewer #3: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above.
You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: I don't have any more comments on this manuscript. The revised version of the manuscript has come out very well. This would be a valuable addition to knowledge in the science of survey research methods within and beyond the USA.

Reviewer #2: (No Response)

Reviewer #3: I am glad to see that you have given careful consideration to all reviewer comments and in most cases have taken appropriate action that, overall, has greatly improved the paper in my view. However, one major concern remains, and I have a few additional comments related to material that you have now clarified or added.

My major concern is with what is now hypothesis 2. The way this hypothesis is tested does not appear to me to provide any useful information for informing future designs. This is because you test the effect of being successfully contacted. This is not a design feature that can be controlled: the survey researcher cannot choose which sample members will be successfully contacted. Rather, this is an outcome of the process. I would strongly prefer that you drop this hypothesis and analysis from the paper: I think it severely weakens the paper and does not give a good impression of the authors (unlike the analyses of the other two hypotheses, which are underpinned by robust experimentation). Alternatively, you could just include a descriptive analysis (estimate) of the overall effect of the telephone reminders for the whole sample. In practice, this is the treatment that researchers can implement: they cannot control who will provide a phone number or who can be successfully contacted, so the outcome for the total sample is the metric of interest.
Of course, you have no direct evidence of the counterfactual, so you can only present, as a maximum bound on the effect, the difference between the total observed response and the total response if you remove all who responded subsequent to a successful phone contact (the true effect is probably a little smaller, as some of those contacted by phone may have eventually participated anyway).

Other comments:

p.7, l.11-12: I don’t understand this. If the screener completion rate was 53.0% and the weighted completion rate for the main survey was 33.0%, how is the overall rate 40.7%? Surely it’s around 0.53 * 0.33 = 0.175 (if you assume the screener response rate to be independent of eligibility for the main stage). What am I missing?

p.9, l.5-6: I still don’t see any evidence that completion rates have benefitted from a web/mail approach. This doesn’t seem to have been compared with any other approach. Maybe you mean that they have benefitted from incentives in the context of a web/mail approach?

p.9, l.6: “$2 incentive was effective”: the reader could be excused for assuming this means relative to no incentive. But actually the evidence you presented was that $2 was as effective as $5.

p.17, l.17-19: thank you for attempting to clarify this. However, this information contradicts the notes to figure 1 (“If invited cases did not respond to mod 1, the later invitation would invite them to complete mods 1 and 2 together” and “Nonresponse to earlier modules resulted in continued invitation”). So this still requires clarification!

p.18, l.3-9: This text is written as if it is always the same person completing the screener and the main questionnaire, which is presumably not the case?

p.21, l.7-11: Related to my main concern above, this adjustment – upon which your estimation relies – is likely to be woefully inadequate. I doubt these variables account for more than a small fraction of the variation in propensity to respond to a survey. So, what remains is that more willing people turn out to be more willing….
**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No
Reviewer #3: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
Revision 2
Methods for improving participation rates in national self-administered web/mail surveys: Evidence from the United States

PONE-D-22-29437R2

Dear Dr. West,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance.

To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Bidhubhusan Mahapatra, Ph.D.
Academic Editor
PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.
Reviewer #3: (No Response)

**********

2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #3: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #3: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #3: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #3: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics.
(Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #3: I am really very grateful that you have taken all my comments - possibly a little demanding - in good spirit and have carefully amended the manuscript in the ways suggested or implied. I truly think the paper is much improved as a result and that the revisions were worthwhile. I hope you do too.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #3: Yes: Peter Lynn

**********
Formally Accepted
PONE-D-22-29437R2

Methods for improving participation rates in national self-administered web/mail surveys: Evidence from the United States

Dear Dr. West:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff
on behalf of
Dr. Bidhubhusan Mahapatra
Academic Editor
PLOS ONE
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.