Peer Review History
Original Submission: April 26, 2021
PONE-D-21-13856
Audio, video, chat, email, or survey: How much does online interview mode matter?
PLOS ONE

Dear Dr. Crichton,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Aug 07 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Janet E Rosenbaum, Ph.D.
Academic Editor
PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. We note that you have indicated that data from this study are available upon request. PLOS only allows data to be available upon request if there are legal or ethical restrictions on sharing data publicly. For more information on unacceptable data access restrictions, please see http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions. In your revised cover letter, please address the following prompts:

a) If there are ethical or legal restrictions on sharing a de-identified data set, please explain them in detail (e.g., data contain potentially sensitive information, data are owned by a third-party organization, etc.) and who has imposed them (e.g., an ethics committee). Please also provide contact information for a data access committee, ethics committee, or other institutional body to which data requests may be sent.

b) If there are no restrictions, please upload the minimal anonymized data set necessary to replicate your study findings as either Supporting Information files or to a stable, public repository and provide us with the relevant URLs, DOIs, or accession numbers. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories. We will update your Data Availability statement on your behalf to reflect the information you provide.

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Additional Editor Comments (if provided):
Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly
Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: I Don't Know

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The authors should be commended for implementing an experimental design to assess key differences among major methods of online interviewing. The authors also address the important features of interviews related to data quality, including rapport with the interviewer, honesty, and self-disclosure. This experiment and the results are a useful contribution to the literature for practitioners, including the relative cost data presented in Table 5. At the same time, there are a number of limitations to this research which deserve more attention from the authors:

The main questions the authors test are clearly designed to represent questions with different degrees of sensitivity. These questions do not seem like questions that would be asked by survey or market researchers conducting online interviews. Including some “standard” survey questions would have provided readers with more confidence that these results are relevant to their own research. Can the authors comment on how their selection of questions could have affected the results?

The interviewee post-interview survey was a useful design feature, but responses to some of these questions - desirable responding, self-disclosure, and honesty - could have been directly influenced by the specific questions in the initial interview.

The sample size is sufficient for most analyses, but not all, as the authors note on page 23.
For example, it does not support an analysis of interviewer effects showing interviewers' contribution to variance in participants' responses to questions. The authors note on page 16 that they examined interview completion rates, mode completion rates, technical difficulties, and rapport rating and did not see any differences. These are useful metrics to report, but do not actually identify interviewer effects. If the authors cannot estimate interviewers' contribution to variance due to sample size or study design, are there any further indirect metrics that could inform the likely potential for interviewer effects?

On page 27, the authors note that 99% of screened participants indicated having access to a computer with a keyboard. Part c. of Table 4 (on page 26) also shows participants were disproportionately white. Like most examples of online research I have seen, this sample is clearly skewed to higher average SES and White participants. Although it seems likely that the authors could not have obtained a more representative sample without significantly changing recruitment methods, it would be useful for the authors to comment on how this skew in participant demographics might have affected results across all experimental conditions.

The authors note on page 24 that the greatest challenges to recruitment and logistics involved audio and video modes. They specifically note on page 26 that unwillingness to participate in video mode was the greatest disqualifying factor in the recruited pool. This is not surprising, but good to note for practitioners who are considering different online modes. A separate question concerns the lower screening and completion rates and the higher no-show rates for audio and video modes (as reported in Table 5 on page 32). How might this differential nonresponse pattern have affected the findings and conclusions?

The authors' first conclusion, that mode is only a practical question and not a threat to validity, is not supported by relevant literature on modes and response accuracy, nor by two of their own findings. First, as the authors note, the greatest challenge to recruitment (and logistics) involved audio and video modes. Given that 99% of screened participants indicated having access to a computer with a keyboard and 70% having a webcam, it seems like unwillingness to participate in audio or video modes was influenced (at least in part) by recruits' lack of comfort with reporting in those modes. Audio and video do complicate logistics, as the authors note, but having these devices suggests these participants should be comfortable using them. Even though the authors did not find significant results across modes in their analysis of disclosure, they note on page 36 that interviewers might overestimate participants' comfort and on page 42 that some participants indicated mode affected their disclosure. So, this first conclusion seems stronger than the research demonstrates, especially considering the noted limitation in sample size that affected all quantitative analyses.

Reviewer #2: Dear Kyle Crichton and Co-authors, Many thanks for the opportunity to review your manuscript, “Audio, video, chat, email or survey: How much does interview mode matter?” You have presented a very well-written and clear manuscript. You aptly highlighted the need for the novel research and detailed the methods and findings meticulously. Although I thoroughly enjoyed reading it for its clarity and novelty, I must admit that it was challenging for me at times as a qualitative researcher.
The statement on page 11, line 281 (“we were wary of attempting to assess a qualitative method using quantitative …”), was a reassuring touch. I do have some feedback and suggestions for your consideration, some quite minor and some that may require more thought. Please note that I have reviewed the manuscript from a qualitative researcher perspective and have not thoroughly reviewed the statistical analyses.

The study title and aims suggest that the research empirically compares different modes of online interviews, and yet you have included surveys. You note that surveys are “the interview’s close cousins”; however, qualitative researchers would likely argue that they are a different data collection technique to interviews altogether (i.e., rather than a mode of online interview). The rationale for including survey techniques in the study is not overly satisfying, and it seems that more needs to be said about the apparent juxtaposition of online interview modes and survey techniques.

Differences between qualitative methodologies are not addressed. For instance, a phenomenological study would not be amenable to data collection via a survey, nor would coding of data be a consideration. Ethnographic studies also typically rely on observational data, and this is not mentioned at all. You duly acknowledge that there is no universal approach to qualitative data analysis; however, you have not stated that the approach depends on the aims of the research and the methodology (consider narrative analysis, discourse analysis, etc.). It seems that the findings of the study are applicable to mixed methods research, but not so much to purely qualitative research (although this clearly depends on the methodology). Perhaps this could be considered and discussed.

On page 48, code count is discussed as a measure of thematic analysis potential. You acknowledge that this is simplistic, a point with which I agree, but I also challenge this as a meaningful measure of the qualitative equivalence. To my mind, the number of codes (i.e., ways to break up and categorise data) does not speak to its interpretative potential. At minimum, a rationale for including this as a measure, with an appropriate reference, would be useful.

In terms of minor feedback, it would be useful to know if any qualitative data analysis software or tools were used by the coders. On page 3, lines 44-45 need revision. On page 4, line 83, there is a direct quote attributed to source 8, but no page number. As very minor feedback, please check all direct quotes - some of the quotation marks are orientated the wrong way. On page 23, line 603, it says that 145 interviews were completed; elsewhere it says 154. I was surprised to see you refer to the limitations in the method section; however, I recognise this may be a more accepted convention in quantitative research papers. It was pleasing to see that you conducted a small-scale screening study to measure the impact of the COVID-19 pandemic on willingness to participate in the various modes.

With many thanks once again, I sincerely look forward to seeing this manuscript again and hopefully eventually in print.

Very best,
Reviewer

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review?
For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
Revision 1
PONE-D-21-13856R1
Audio, video, chat, email, or survey: How much does online interview mode matter?
PLOS ONE

Dear Dr. Crichton,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Feb 02 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:
If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Janet E Rosenbaum, Ph.D.
Academic Editor
PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: (No Response)
Reviewer #2: (No Response)

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous.
Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The authors’ responses to my comments are generally sound and appreciated. I have a few follow-up points for the authors to consider.

In response to my comment about how the observed skew in participant demographics toward white and higher-SES participants might have affected results across all experimental conditions, the authors note that these factors were distributed relatively equally across modes and are therefore unlikely to have affected their comparison of results between conditions. The authors then conclude that this demographic skew should not have affected the internal validity of their experimental study, given the similar composition across conditions. To clarify my original question, a number of the authors’ conclusions include claims of external validity – that is, what other researchers should expect in terms of data quality when using different modes of online interviewing. Can the authors address the issue of how the demographic skew of their study participants could have affected the ability of their study to provide external validity?

The authors note on page 51 that “despite low power, any effects on interview data that was missed in this study are likely small enough that most interviewers can gently ignore them.” First, I assume the authors meant to say “generally” instead of “gently”. Second, this statement feels a bit strong, given that the authors have no statistical criteria to determine whether observed effects should, or should not, be considered meaningful. The authors’ conclusion might be correct (and I would guess it is, based on their results), but assuming this is the case without statistical support or other metrics seems questionable. Do the authors have other evidence or literature they can cite to support this conclusion that the effects are likely small enough to be ignorable?

Further in the same paragraph, the authors conclude that they “... expect the effect of mode to be even smaller for interviews that focus on less sensitive content.” This statement makes intuitive sense to me, but given that the authors (intentionally) did not include less sensitive questions in the experimental design or cite relevant literature here, the basis of this statement needs to be more clearly justified.

Reviewer #2: Dear Kyle Crichton and Co-authors, Many thanks for the opportunity to review your revised manuscript, “Audio, video, chat, email or survey: How much does interview mode matter?” Thank you for the changes you have made in response to reviewer feedback. You have addressed a range of critical comments thoughtfully. You have gone some way in addressing the key criticism made about methodology versus data collection method; however, I do still feel that there is some conflation of qualitative research and interviews.
Specifically, on p. 5, under the sub-heading “Honesty”, you state, “In qualitative research, the honesty of a participant’s response is a fundamental underpinning of the field.” I would argue this is also the case in quantitative research, where self-report measures are used (e.g., validated pain scales, perceived knowledge of a particular topic, satisfaction surveys, etc.). Equally, in ethnographic studies, for example, data are collected in part through researchers’ observations of participants (no honesty required on participants’ part). I feel that some further clarity (a sentence or two) would be helpful about the fact that you are not investigating qualitative research methods generally, but rather a data collection method commonly used in qualitative research, namely interviews.

There are some other minor points for your consideration. On p. 37 (lines 965-966), the terms interviewee and participant are used interchangeably, which is a bit confusing. There is a minor typographic error on p. 7 (line 157): “interviewees”.

Thank you again for your thorough revision and very best wishes.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
Revision 2
PONE-D-21-13856R2
Audio, video, chat, email, or survey: How much does online interview mode matter?

Dear Dr. Crichton,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible, no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Janet E Rosenbaum, Ph.D.
Academic Editor
PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed
Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?
PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: I have no further questions, comments, or suggestions for the author. I am satisfied that the revised version addresses my major concerns from prior reviews.

Reviewer #2: Dear Kyle Crichton and Co-authors, Many thanks for the opportunity to review your revised manuscript, “Audio, video, chat, email or survey: How much does interview mode matter?” once again. Thank you for kindly making the suggested changes to the manuscript, as well as the additional edits and alterations. I am satisfied with the changes and the manuscript in its current state. Wishing you all the very best.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No
Formally Accepted
PONE-D-21-13856R2
Audio, video, chat, email, or survey: How much does online interview mode matter?

Dear Dr. Crichton:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of
Dr. Janet E Rosenbaum
Academic Editor
PLOS ONE
Open letter on the publication of peer review reports
PLOS recognizes the benefits of transparency in the peer review process. Therefore, we enable the publication of all of the content of peer review and author responses alongside final, published articles. Reviewers remain anonymous, unless they choose to reveal their names.
We encourage other journals to join us in this initiative. We hope that our action inspires the community, including researchers, research funders, and research institutions, to recognize the benefits of published peer review reports for all parts of the research system.
Learn more at ASAPbio.