
Development, validation, and reliability testing of the College Perspectives around Food Insecurity survey

  • Jennette Kilgrow ,

    Contributed equally to this work with: Jennette Kilgrow, Elyce Gamble, Amanda Meier, Jinan Banna, Dennis L. Eggett, Stephanie Grutzmacher, Jennifer A. Jackson, Kendra OoNorasak, Nathan Stokes, Rickelle Richards

    Roles Data curation, Formal analysis, Supervision, Writing – original draft, Writing – review & editing

    Affiliation Culinary Arts Institute, Utah Valley University, Orem, Utah, United States of America

  • Elyce Gamble ,

    Roles Data curation, Formal analysis, Validation, Writing – original draft, Writing – review & editing

    Affiliation Department of Nutrition, Dietetics and Food Science, Brigham Young University, Provo, Utah, United States of America

  • Amanda Meier ,

    Roles Data curation, Formal analysis, Validation, Writing – original draft, Writing – review & editing

    Affiliation Utah Valley Hospital Endocrine and Diabetes Clinic, Provo, Utah, United States of America

  • Kyle Lyman ,

    Roles Data curation, Writing – review & editing

    ‡ KL, AB, CK, PM, KL, CM, KA and BMG also contributed equally to the work.

    Affiliation Department of Nutrition, Dietetics and Food Science, Brigham Young University, Provo, Utah, United States of America

  • Andrea Barney ,

    Roles Data curation, Writing – original draft, Writing – review & editing

    Affiliation Department of Nutrition, Dietetics and Food Science, Brigham Young University, Provo, Utah, United States of America

  • Cade Kartchner ,

    Roles Data curation, Writing – original draft, Writing – review & editing

    Affiliation Department of Nutrition, Dietetics and Food Science, Brigham Young University, Provo, Utah, United States of America

  • Paola Martinez ,

    Roles Data curation, Writing – review & editing

    Affiliation Department of Nutrition, Dietetics and Food Science, Brigham Young University, Provo, Utah, United States of America

  • Kanae Lee ,

    Roles Data curation, Formal analysis, Writing – review & editing

    Affiliation Department of Nutrition, Dietetics and Food Science, Brigham Young University, Provo, Utah, United States of America

  • Carol Mathusek ,

    Roles Data curation, Writing – review & editing

    Affiliation Department of Nutrition, Dietetics and Food Science, Brigham Young University, Provo, Utah, United States of America

  • Kelly Ang ,

    Roles Data curation, Writing – review & editing

    Affiliation Department of Nutrition, Dietetics and Food Science, Brigham Young University, Provo, Utah, United States of America

  • Brooke M. Green ,

    Roles Conceptualization, Data curation, Formal analysis, Validation, Writing – original draft, Writing – review & editing

    Affiliation Marion County Women, Infants and Children, Salem, Oregon, United States of America

  • Jinan Banna ,

    Roles Conceptualization, Data curation, Writing – review & editing

    Affiliation Department of Human Nutrition, Food and Animal Sciences, University of Hawaii at Mānoa, Honolulu, Hawaii, United States of America

  • Dennis L. Eggett ,

    Roles Formal analysis, Writing – review & editing

    Affiliation Department of Statistics, Brigham Young University, Provo, Utah, United States of America

  • Stephanie Grutzmacher ,

    Roles Conceptualization, Data curation, Formal analysis, Writing – review & editing

    Affiliation College of Public Health and Human Sciences, Oregon State University, Corvallis, Oregon, United States of America

  • Jennifer A. Jackson ,

    Roles Conceptualization, Data curation, Formal analysis, Writing – original draft, Writing – review & editing

    Affiliation College of Public Health and Human Sciences, Oregon State University, Corvallis, Oregon, United States of America

  • Kendra OoNorasak ,

    Roles Conceptualization, Data curation, Formal analysis, Writing – review & editing

    Affiliation Department of Dietetics and Human Nutrition, University of Kentucky, Lexington, Kentucky, United States of America

  • Nathan Stokes ,

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Methodology, Project administration, Supervision, Validation, Writing – original draft, Writing – review & editing

    Affiliation Department of Nutrition, Dietetics and Food Science, Brigham Young University, Provo, Utah, United States of America

  •  [ ... ],
  • Rickelle Richards

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    rickelle_richards@byu.edu

    Affiliation Department of Nutrition, Dietetics and Food Science, Brigham Young University, Provo, Utah, United States of America


Abstract

The objective of this study was to develop and test the validity and reliability of a survey designed to evaluate internal and external factors associated with college food insecurity. Researchers used a mixed methods approach to evaluate the College Perspectives around Food Insecurity survey. Survey items were constructed from interview data and assigned a social cognitive theory concept (environment, personal, or behavior). Two rounds of expert reviews established content validity (Round 1, n = 3; Round 2, n = 2). Researchers evaluated face validity through two rounds of cognitive interviews with college students 18+ years old (Round 1, n = 9; Round 2, n = 16) and tested survey reliability (n = 105). Researchers used descriptive statistics, test-retest reliability statistics, and Cronbach’s alpha scores for data analysis. The initial survey contained 143 items. After feedback from expert reviewers and cognitive interviews, the final survey contained 99 items. Test-retest reliability was 0.99, and Cronbach’s alpha scores were 0.74 for environment, 0.47 for personal, and 0.39 for behavior. The College Perspectives around Food Insecurity survey can be used to better understand internal and external factors associated with food insecurity in college students, which can inform interventions aimed at assisting this population.

Introduction

Food insecurity refers to inconsistent or limited access to the quality and quantity of food needed for a healthy, active life [1]. An estimated 32% of college students in the United States have been reported to experience food insecurity [2]. Student populations that experience higher rates of food insecurity have included students of color [3–5], first-generation college students [3,4], transgender students [4], students living off campus [4,6], and students using financial aid [5].

Adverse health consequences may arise when students use unhealthy strategies to cope with food insecurity [7]. For example, students experiencing food insecurity were more likely to turn to cheap, low-quality, highly processed foods compared to students who were food secure [8,9]. Studies have noted that students experiencing food insecurity consume fewer fruits and vegetables [10,11] and more sweets and sugar-sweetened beverages compared to students who are food secure [10–12]. Additionally, students classified as food insecure were more likely to skip breakfast and eat fast food than those classified as food secure [13]. These common but problematic eating patterns among college students experiencing food insecurity have been associated with obesity [12], increased risk of disordered eating behaviors [9], poor mental health [7,14,15], and decreased academic performance [16].

The high prevalence of food insecurity in college populations and the related risk to academic, social, and physical wellbeing make it important to clarify the factors associated with the issue. Previous research has evaluated risk factors and adverse health outcomes associated with food insecurity [3–16]; however, few studies have evaluated students’ experiences surrounding food insecurity. Through qualitative methods, our research team identified unique strategies that college students with food insecurity used to cope with financial instability [17]. For example, students with food insecurity reported altering their food supply throughout the month or between paychecks, selling plasma for cash, and reducing food intake [17].

The aims of our present study were to: 1) develop a survey framed around the Social Cognitive Theory (SCT) [18] and our previous qualitative work [17] that evaluates internal and external factors associated with food insecurity among college students and 2) test the survey for validity and reliability. We anticipated that the development of this survey may provide researchers with a tool to better reflect the context of the food insecurity experience among college students.

Materials and methods

Study design

The College Perspectives around Food Insecurity (CPFI) survey was developed by a multi-institutional research team through four revision phases using a mixed methods approach (Fig 1). In Phase 1 (2018–2019), researchers constructed survey questions using concepts and wording reported by college students from qualitative interviews conducted previously by the research team [17]. Researchers framed each survey question around a theoretical concept (personal, behavioral, or environmental) based on Bandura’s SCT [18]. Using this health behavior theory allowed researchers to capture and represent the personal perceptions, behaviors, and environmental factors that college students addressed in the interviews [17].

Fig 1. Timeline of survey development and revisions.

https://doi.org/10.1371/journal.pone.0317444.g001

During Phase 2 (2019–2020) researchers conducted two rounds of expert reviews to test content validity of the survey, while in Phase 3 (2021), researchers conducted two rounds of cognitive interviews with college students to test the face validity of the survey. Finally, during Phase 4 (2022), the survey underwent reliability testing using a test-retest approach.

Participants and recruitment

For Phase 2, the research team invited four expert reviewers from October to November 2019 to review the initial survey. Researchers invited expert reviewers who had expertise in food insecurity, college students’ eating behaviors, survey research, and/or health behavior theory. These individuals were previous research collaborators of the research team or were individuals who collaborators recommended to the research team.

Researchers recruited students for cognitive interviews (Phase 3) from December 2020 to February 2021 (round 1) and April to July 2021 (round 2) at University A, University B, and University C. Eligibility criteria included being 18 years of age or older. Researchers aimed to diversify the sample based on sociodemographic characteristics that had been associated with food insecurity among college students at the time of data collection. These included race [13,19], self-identified gender [20], age [20–22], Pell Grant recipient status [20,21,23], use of a campus meal plan [24,25], parent’s education [19,26], and living situation (e.g., with family, roommates, alone, etc.) [19]. Thus, researchers utilized various recruitment methods. Researchers posted flyers at campus dining halls and through international student offices. Flyers had a Quick Response (QR) code to direct students to an online Qualtrics (Qualtrics, Provo, UT) screening survey to determine study eligibility and to gather demographic information. Researchers randomly selected general education classes and requested faculty forward a recruitment email to students enrolled in the class. Emails contained a URL to the screening survey. Researchers also used snowball sampling techniques [27]. Researchers used similar recruitment methods for reliability testing (Phase 4) from April to August 2022 at University A, University B, University C, and University D. However, University D also asked professors of courses in human development, family studies, and public health to forward recruitment emails to students.

The Institutional Review Board for Human Subjects at University A approved (IRB X18038) the participation of the first round of expert reviewers with an alteration of informed consent [45 CFR 46.116(f)(3)(i)] (Phase 2). This altered informed consent was presented as written text on the first page of the survey, with a note at the end stating expert reviewers’ completion of the review indicated their consent to participate. For the second round of expert reviewers (Phase 2), validation study (Phase 3), and reliability study (Phase 4), University A’s IRB determined those phases to be non-human subject research (IRB2020-394), as data collected during survey development was for continued improvement of the survey only [45 CFR 46.102(e)(1)(i)(ii)].

Survey development

Translating interview data into survey items (Phase 1).

Two researchers from University A independently reviewed the previously collected interview data [17]. Then, they worked together to translate words and phrases from the college students’ interview transcripts into statements (survey items) tied to specific concepts of the SCT [18]. Research team members provided feedback on the content and the organization of the survey items through the Qualtrics notes feature. The team then met bi-weekly over approximately three months to provide additional verbal feedback on survey items and make final decisions on improving the content, flow, and organization of the survey. Researchers used Qualtrics online survey software to build the initial CPFI survey draft.

Content validity (Phase 2).

Three expert reviewers evaluated the initial CPFI survey draft for clear phrasing, the importance of survey items in addressing the research purpose, and content appropriateness, using a scale of 0 = poor to 10 = exceptional [28]. Reviewers also evaluated the accuracy with which the researchers had paired each survey item with a theoretical concept (personal, behavioral, and environmental). Reviewers were given a $25 Amazon e-gift card for their time.

The COVID-19 pandemic occurred after the initial expert review but before starting Phase 3. Thus, researchers added six survey items to the CPFI survey to evaluate changes in college students’ living situations, employment status, and food consumption during the pandemic. Two of the three original expert reviewers agreed to evaluate the new questions, using the same criteria described previously, and received a $15 Amazon e-gift card.

Face validity (Phase 3).

Cognitive interviews were conducted to help researchers determine if the target audience interpreted questions as the researchers intended and if the questions resonated with the target audience [29]. Researchers from University A conducted cognitive interviews across all participating institutions (Universities A, B, and C). Training of undergraduate (n = 6) and graduate (n = 1) student interviewers occurred through practice interviews with college students. To ensure familiarity with interview procedures, each student interviewer rotated among observer, note-taker, and interviewer roles. Two researchers who each had at least 10 years of qualitative research experience reviewed recorded interviews and notes, provided feedback to interviewers on interviewing skills, and had interviewers conduct another round of practice interviews, as needed.

Cognitive interviews were conducted through Zoom online video conferencing technology (Zoom Video Communications Inc., 2021). Trained interviewers prompted students to talk aloud while completing the survey, noting any areas of confusion and the thought process used to come to their response [29]. The interviewer asked probing questions when needed, such as “what are you thinking right now?”, “can you tell me more about that?”, and “could you describe that for me?” Another researcher took notes throughout the survey and aided the interviewer in probing the specified questions. At the end of the survey, the interviewer asked the participant the following debriefing question: “do you have any other final suggestions for making this a better survey?”

To ensure perspectives from food secure and food insecure students at each university, researchers used survey quotas in Qualtrics based on students’ responses to a 2-item food sufficiency screener on the recruitment survey [30,31]. The questions were: 1) In the last 30 days, did you ever run short of money and try to make your food or your food money go further? and 2) Which of these statements best describes the food eaten in your household? This 2-item screener was used to reduce respondent burden and has been shown to identify an individual’s risk for food insecurity [31]. Researchers classified students as “food secure” with responses of “no” to question 1 and “enough of the kinds of food we want to eat” to question 2. Researchers classified students as “food insecure” with any other combination of answers to the two screening questions.
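The classification rule described above can be sketched as a small function. This is an illustrative sketch only; the function name and response strings are assumptions, not part of the published instrument's implementation:

```python
def classify_screener(ran_short_of_money: str, food_description: str) -> str:
    """Classify food security from the 2-item screener as described:
    'food secure' only when Q1 is 'no' AND Q2 is 'enough of the kinds
    of food we want to eat'; any other combination is 'food insecure'."""
    q1 = ran_short_of_money.strip().lower()
    q2 = food_description.strip().lower()
    if q1 == "no" and q2 == "enough of the kinds of food we want to eat":
        return "food secure"
    return "food insecure"
```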

Researchers also had students complete the United States Department of Agriculture’s (USDA) 10-item Adult Food Security Survey Module (FSSM) [30]. Using the FSSM scoring, researchers classified students as “food secure” with a score of 0–2 and “food insecure” with a score of 3 or more affirmative responses [30]. If there was a discrepancy between a student’s classification on the 2-item screener and the 10-item FSSM, researchers classified the student’s food security status from the 10-item score.
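The FSSM dichotomy and the tie-breaking rule can be expressed in a few lines; function names here are hypothetical, and the sketch reflects only the scoring cutoffs stated in the text:

```python
def classify_fssm(affirmative_count: int) -> str:
    """USDA 10-item Adult FSSM dichotomy used in this study: the raw
    score is the number of affirmative responses; 0-2 -> food secure,
    3 or more -> food insecure."""
    if not 0 <= affirmative_count <= 10:
        raise ValueError("FSSM raw score must be between 0 and 10")
    return "food secure" if affirmative_count <= 2 else "food insecure"

def reconcile_status(screener_status: str, fssm_status: str) -> str:
    """When the 2-item screener and the 10-item FSSM disagree, the
    FSSM classification takes precedence, per the text."""
    return fssm_status
```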

The initial round of cognitive interviews lasted, on average, 90 minutes per participant. After collecting data from nine students, data collection was stopped to re-evaluate the survey because interviews took longer than expected and themes had begun to emerge from students’ responses. Researchers then shortened and revised the survey content for clarity. The second round of cognitive interviews, with 16 students, averaged 60 minutes each, which the research team felt aligned with the expected data collection time. For compensation, students in the cognitive interviews were given $15 Amazon e-gift cards.

Reliability testing (Phase 4).

Researchers used the 2-item food sufficiency screener for food secure and food insecure quotas, with the 10-item FSSM used to classify students’ food security status, as described earlier. To reach the recommended 80% reliability standard outlined by Bujang [32], researchers needed a sample size of 62 students. Based on previous sampling response rates among college students [33], researchers recruited 300 students across all universities to reach the minimum sample size of 62.

At time 1, students completed the CPFI survey, the 10-item FSSM [30], the 26-item National Cancer Institute Diet Screener Questionnaire (DSQ) [34], and demographic questions. Although the DSQ is a validated tool, researchers administered this survey item at time 1 to assess the overall time to complete. Students also provided their university email address for the time 2 follow-up.

Seven days after the initial survey was completed, students were sent an email requesting that they retake the CPFI survey at time 2 within three days of receiving the email. The time 2 survey was significantly shorter and contained only the theoretical items undergoing reliability testing and the FSSM items. If students had not taken the time 2 survey within 10 days (of initial survey completion), they were sent a second reminder email asking them to complete the second survey that day. Students were compensated with a $20 Amazon e-gift card for completing the survey twice.

Researchers collected timing data on each question through Qualtrics. Researchers used this feature to assess the length of time it took students to complete each section of the survey. Additionally, researchers asked students to provide feedback on the survey quality. These questions included: (1) “Overall, how easy or difficult was the online survey to complete?” and (2) “What is your opinion about the length of time it took you to complete the survey?”

Analysis

Expert reviews (Phase 2).

Mean scores from expert reviewer ratings were calculated for each criterion evaluated (clear phrasing, importance, and content appropriateness). A mean score of 8.0 or higher on each criterion indicated content validity was achieved [28]. IBM Statistical Package for the Social Sciences (SPSS, v. 24) was used for all analyses.

Cognitive interviews (Phase 3).

Students’ responses from the field notes were pasted as a bulleted list under the corresponding question. A researcher who had not initially recorded the field notes filled in any relevant missing information from the audio transcript. Three researchers independently reviewed the notes to make recommendations to modify the survey items, then met to discuss patterns and trends and create a justification statement for each proposed change. Finally, the research team reviewed proposed changes for each question and reached a consensus on survey revisions. For each round of cognitive interviews, researchers used descriptive statistics in SPSS (v. 29) to evaluate participant demographics across the entire sample and by food security classification.

Reliability testing (Phase 4).

Reliability of the survey items was evaluated using test-retest reliability statistics and Cronbach’s alpha scores, which were calculated for the theoretical grouping of items assigned to the personal, behavioral, and environmental concepts. These groupings included both Likert-scale and partially closed-ended branching questions. Based on similar research among college students, a score of at least 0.7 met the standard for reliability [33]. Statistical Analysis System (SAS) Software (version 9.2) was used for reliability analysis.
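As an illustration of the reliability computations, a minimal sketch follows. The article does not give formulas, so this uses the standard Cronbach's alpha definition and, as one common test-retest statistic, the Pearson correlation between time 1 and time 2 scores; the exact statistic the authors used is not specified:

```python
from statistics import pvariance

def cronbach_alpha(items: list[list[float]]) -> float:
    """Cronbach's alpha where items[i][j] is respondent j's score on
    item i: alpha = k/(k-1) * (1 - sum of item variances / variance
    of respondents' total scores)."""
    k = len(items)
    n = len(items[0])
    item_var_sum = sum(pvariance(item) for item in items)
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return k / (k - 1) * (1 - item_var_sum / pvariance(totals))

def pearson_r(x: list[float], y: list[float]) -> float:
    """Pearson correlation between paired scores (e.g., a respondent's
    time 1 total vs. time 2 total)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```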

Researchers used descriptive statistics for participant demographics and to calculate the average length of time it took students to complete the survey. Two students did not answer all USDA FSSM questions and thus could not be categorized as food secure or food insecure. Researchers evaluated differences between students who were food secure vs. food insecure using chi-square statistics or Fisher’s exact tests (when > 20% of expected cell counts were < 5). A p-value < 0.05 was considered significant. Researchers performed all demographics and survey completion time analyses in SPSS (v. 29).
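The rule for switching from chi-square to Fisher's exact test depends on the expected cell counts of the contingency table, which can be sketched as follows (function names are illustrative; the test itself would be run in SPSS or a statistics library):

```python
def expected_counts(table: list[list[float]]) -> list[list[float]]:
    """Expected cell counts under independence for a contingency
    table: E[i][j] = row_total[i] * col_total[j] / grand_total."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    return [[r * c / grand for c in col_totals] for r in row_totals]

def needs_fishers_exact(table: list[list[float]]) -> bool:
    """Decision rule stated in the analysis: use Fisher's exact test
    when more than 20% of expected cell counts fall below 5."""
    cells = [e for row in expected_counts(table) for e in row]
    return sum(1 for e in cells if e < 5) / len(cells) > 0.20
```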

Results

Fig 1 describes the steps of survey development and revisions made throughout the validity and reliability testing phases. More detailed results from each step are outlined below.

Translating interview data into survey items (Phase 1)

The initial survey translated from interview data contained 143 items: 53 personal, 48 behavioral, and 42 environmental. Response options included dichotomous (yes/no) and Likert scale items (1 = strongly agree to 5 = strongly disagree).

Content validity (Phase 2)

Four items paired with the environmental concept received mean scores less than 8.0 (the threshold for establishing content validity) in the clear phrasing criterion from expert reviewers. Three items were revised for clarity and one item was deleted. Revisions included adding a time frame (“while in college”) and/or making items/response options more complete sentences or phrases (e.g., “buying food at a restaurant/fast food” instead of “restaurant/fast food”). The deletion was done because the item asked students’ response to a future hypothetical situation, rather than actual experience.

Within the behavioral concept, six items scored less than 8.0. One item scored less than 8.0 in clear phrasing, importance, and content appropriateness. This item was revised by simplifying wording from “make/prepare food” to “prepare food” and by changing “at home” to “where I currently live” because reviewers felt students might not associate “home” with a current living situation. One item scored less than 8.0 in importance and content appropriateness. Researchers deleted this item because they agreed it would not yield much useful information related to the research intent. Four items scored less than 8.0 in the clear phrasing criterion. Researchers changed the verb tense in two items to indicate the question was asking about any point in time. Researchers also added the phrase “to me” to reflect students’ personal situation. Researchers added a branching question to ensure relevancy to the students who saw the items. Two items were deleted because the item either asked about a future hypothetical situation or had too many ideas presented.

Within the personal concept, three items scored less than 8.0. One item scored less than 8.0 in clear phrasing, importance, and content appropriateness. Researchers simplified the wording in the item from “...budgeting skills necessary to have money left for food each month” to “...stick to a monthly budget for food.” One item scored less than 8.0 in clear phrasing and content appropriateness. Researchers deleted this item because of confusing wording and limited relevancy to the research intent. One item scored less than 8.0 for clear phrasing. Researchers revised this item to be written as a complete sentence instead of a phrase. Researchers also added examples of cooking equipment (stove, oven, microwave, etc.) to the item.

Researchers revised a total of 86 items for greater clarity, deleted 30 items for similar reasons listed previously, and added 35 items, including branching questions and questions about credit card debt and alcohol use (as recommended by expert reviewers). For 26 survey items, one expert reviewer disagreed with the theoretical concept chosen by researchers or felt the items could fit into another theoretical concept than initially identified by researchers. After extensive discussion with the research team, researchers re-classified the theoretical concept for two items and retained the original theoretical concept for the remaining 24 items. The revised survey consisted of 144 items: 38 personal, 40 behavioral, and 66 environmental.

Six COVID-19 questions were then added to the survey and underwent expert review. Only one item did not achieve the validity threshold on content appropriateness (average rating of 6.5) and importance of item (average rating of 6.5), so the research team unanimously agreed that it should be removed from the survey because it was less relevant to the research intent. One item related to employment was moved to the demographics section because it fit better in that section. One reviewer felt that one item could be classified as environmental or behavioral; however, the research team unanimously agreed that the item paired better with the original environmental theoretical concept. The revised survey, after both rounds of expert reviews, consisted of 148 items: 38 personal, 41 behavioral, and 69 environmental.

Face validity (Phase 3)

During the screening of students for cognitive interviews, the original diversity criteria were not met completely, although we were able to sample students who were of different races, genders, ages, living situations, and meal plan statuses (Table 1). Due to long duration of cognitive interviews with nine students, researchers stopped data collection, revised 33 items, added one item, deleted 50 items that students thought were confusing or perceived as duplicative, and moved 16 items to another theoretical concept based on re-evaluation by the team. The revised survey contained 99 items: 39 personal, 28 behavioral, and 32 environmental.

Table 1. Demographic characteristics of college students from four universities across the United States who participated in phase 3 cognitive interviews.

https://doi.org/10.1371/journal.pone.0317444.t001

The revised survey was then evaluated during a second round of cognitive interviews with 16 students. After the second round of cognitive interviews, researchers revised 46 items, deleted seven items, moved one item about COVID-19 and employment to the demographics section, added eight items, and re-classified one item into a different theoretical construct. Researchers also added page divisions between sections of the survey to help respondents know the topic area of questions within each section. The revised survey yielded 99 survey items: 44 personal, 26 behavioral, and 29 environmental (S1 Table).

Reliability testing (Phase 4)

There were 154 complete responses to the time 1 survey, and 105 students took the survey twice. However, due to the branched nature of the survey, not all students received all questions. Researchers used branching questions in the survey based on feedback received in Phases 2 and 3. For example, some questions (such as campus meal plans or use of food assistance programs) did not apply to all students. Students primarily self-identified as female (68%), non-Hispanic/Latinx (88%), and white/Caucasian (75%) (Table 2).

Table 2. Demographic characteristics of college students from four universities across the United States who participated in phase 4 reliability testing.

https://doi.org/10.1371/journal.pone.0317444.t002

The test-retest reliability statistic was 0.99. For all items (Likert and partially closed-ended branching questions), Cronbach’s alpha was 0.14 for the personal, 0.24 for the behavioral, and 0.69 for the environmental concept. For Likert scale items only, Cronbach’s alpha was 0.47 for personal, 0.39 for behavioral, and 0.74 for environmental.
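For readers unfamiliar with the statistic, Cronbach’s alpha is computed from the item variances and the variance of respondents’ total scores. The following is a minimal, generic sketch of that calculation (not the authors’ analysis code; the `cronbach_alpha` helper name and the toy data are illustrative):

```python
from statistics import variance

def cronbach_alpha(responses):
    """Cronbach's alpha for a list of per-respondent item-score lists.

    alpha = (k / (k - 1)) * (1 - sum(item variances) / variance(total scores)),
    where k is the number of items.
    """
    k = len(responses[0])
    # Sample variance of each item across respondents
    item_vars = [variance(item_scores) for item_scores in zip(*responses)]
    # Sample variance of each respondent's summed score
    total_var = variance([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Two perfectly consistent items (each respondent scores both identically)
print(cronbach_alpha([[1, 1], [2, 2], [3, 3], [4, 4]]))  # 1.0
```

Alpha near 1 indicates that items within a concept vary together across respondents; values well below the conventional 0.7 threshold, as seen for the personal and behavioral concepts here, suggest the items are not measuring a single underlying construct.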

The mean time to complete the time 1 survey was 16.38 minutes (SD = 8.88). Mean completion times for each section of the time 1 survey (in minutes) were as follows: theoretical questions = 9.46 (SD = 6.98), DSQ = 4.21 (SD = 1.94), demographics = 2.33 (SD = 1.29), and survey quality questions = 0.39 (SD = 0.59).

Most respondents (74%) rated the survey as very easy, easy, or somewhat easy; 1.0% found it very difficult, 11.7% somewhat difficult, and 13.6% were neutral. The majority (51.5%) rated the length of the survey as ‘just about right,’ with 45.6% indicating it was too long and 2.9% extremely too long.

Discussion

The CPFI survey was designed to measure the context surrounding food insecurity in a college student population. Researchers employed several steps to measure the validity and reliability of the CPFI survey, thus strengthening its ability to capture internal and external factors associated with food insecurity among college students. Content validity was established through expert review of questions on clarity, importance, and appropriateness to the research. Any questions with mean scores below 8.0 were deleted or revised. Additionally, the research team clarified the assignment of theoretical concepts, strengthening the survey used in phase 3 (face validity testing).

Face validity was confirmed through two rounds of cognitive interviews in which question grouping by theoretical concepts was further refined, questions were clarified, and duplicate questions were removed. These changes further improved the quality of the survey and helped researchers determine that the target audience was interpreting the questions as intended [29].

Test-retest reliability showed excellent stability over time [35]. Cronbach’s alpha tests for internal consistency of item groups within the SCT concepts showed good reliability for questions in the environmental concept. However, the scores for the personal and behavioral concepts did not meet the internal consistency standard of 0.7 [36]. Internal consistency improved in all concepts when only Likert scale items were included in the analysis, which suggests that the branched survey items with categorical or dichotomous yes/no (rather than continuous or scaled) responses may be better assessed with a different statistical tool [37]. It therefore remains questionable whether the personal and behavioral concepts were truly being measured by the questions designed to measure them.

The timing data indicated the total survey length was slightly above the recommended 15-minute timeframe for online surveys [38]. We were concerned that the addition of the DSQ would dramatically lengthen the survey completion time because of the number of items (n = 28) it added to the survey. Although the theoretical portion of the survey had a higher number of total survey items, we anticipated students may spend less time on this section because of its branched nature where students would not need to answer every question. However, data showed that overall, students spent more time on the theoretical section than the DSQ. Most students felt the length was reasonable. However, with 48.5% feeling it was too long, an alternative to the DSQ may be considered to bring the overall survey completion time down while still maintaining the depth of information evaluated through the theoretical portion of the survey. Such an effort to reduce the overall completion time could decrease the respondent burden and increase response rates [39].

Previous surveys have aimed to identify the prevalence of food insecurity and the demographic, academic, and/or health outcomes associated with it [3–16]. The CPFI provides a novel approach that can reflect internal and external factors related to food insecurity derived from students’ own experiences. Additional strengths of this research project included participant recruitment from four large universities in four different states across the U.S. Another strength was the intentional representation of both food secure and food insecure students during multiple phases of testing, which allowed us to evaluate perspectives from both student groups. Finally, survey development included multiple methods of survey revision and improvement for validity and reliability testing.

Certain limitations should be noted, including that the intended demographic criteria were not met in cognitive interview testing and that the study populations primarily self-identified as female and white/Caucasian. Researchers used non-probability sampling strategies, which have been shown to result in biased samples [40]. Thus, the findings of the present study might not accurately reflect the experiences of students beyond those who participated.

Conclusions

The CPFI was established to understand food insecurity in a college student population. The survey has time-stable reliability and good internal consistency for the environmental concept. Although the personal and behavioral concepts did not reach the threshold for good reliability, the CPFI can still be used to better understand the internal and external factors surrounding college students’ experiences with food insecurity.

Where many previous research studies have examined the prevalence, demographics, and/or outcomes of college food insecurity [3–16], this unique survey was designed and tested to provide insight into students’ experiences with food insecurity and their perceptions of it. This can give a more nuanced understanding of which student experiences are most notable and support future use of this survey to identify appropriate interventions to address food insecurity among college students.

Supporting information

S1 Table. The College Perspectives around Food Insecurity survey questions by theoretical concept and survey flow.

https://doi.org/10.1371/journal.pone.0317444.s001

(PDF)

References

  1. United States Department of Agriculture. Definitions of Food Security. 2022 [cited 2023 Sep 8]. Available from: https://www.ers.usda.gov/topics/food-nutrition-assistance/food-security-in-the-u-s/definitions-of-food-security/.
  2. Abbey EL, Brown M, Karpinski C. Prevalence of food insecurity in the general college population and student-athletes: a review of the literature. Curr Nutr Rep. 2022;11(2):185–205. pmid:35218475
  3. Wolfson JA, Insolera N, Laska MN, Leung CW. High prevalence of food insecurity and related disparities among US College and University students from 2015–2019. J Nutr Educ Behav. 2024;56(1):27–34. pmid:37999695
  4. Tripathy K, Bhasin R, McKinzie R, Sackett A, Storrs M-E, Janda KM. Food insecurity disparities and impact on academic and social experiences among college students at a large public university. J Am Coll Health. 2024;72(9):3740–7. pmid:36996426
  5. Harville C 2nd, James DCS, Burns A. Profile of a food-insecure college student at a major Southeastern University: a randomized cross-sectional study. Nutrients. 2023;15(5):1108. pmid:36904108
  6. Mobley C, Luo Y, Fernandez M, Hossfeld L. Social determinants of health and college food insecurity. Nutrients. 2024;16(9):1391. pmid:38732637
  7. Lemp H, Lanier J, Wodika A, Schalasky G. Impact of food insecurity on the health and well-being of college students. J Am Coll Health. 2024;72(9):3671–80. pmid:36943238
  8. Huelskamp A, Waity J, Russell J. Effects of campus food insecurity on obesogenic behaviors in college students. J Am Coll Health. 2021;69(5):572–5. pmid:31702978
  9. Royer MF, Ojinnaka CO, Bruening M. Food insecurity is related to disordered eating behaviors among college students. J Nutr Educ Behav. 2021;53(11):951–6. pmid:34561153
  10. Marshall TA, Laurence B, Qian F, Robinson-Warner G, Handoo N, Anderson C. Food insecurity is associated with lower diet quality among dental students. J Dent Educ. 2023;87(11):1574–84. pmid:37537836
  11. Mei J, Fulay AP, Wolfson JA, Leung CW. Food insecurity and dietary intake among college students with unlimited meal plans at a large, Midwestern University. J Acad Nutr Diet. 2021;121(11):2267–74. pmid:33972204
  12. Cedillo YE, Kelly T, Davis E, Durham L, Smith DL Jr, Kennedy RE, et al. Evaluation of food security status, psychological well-being, and stress on BMI and diet-related behaviors among a sample of college students. Public Health. 2023;224:32–40. pmid:37708714
  13. Laska MN, Lenk K, Lust K, McGuire CM, Porta CM, Stebleton M. Sociodemographic and health disparities among students screening positive for food insecurity: findings from a large college health surveillance system. Prev Med Rep. 2020;21:101297. pmid:33643812
  14. Coffino JA, Spoor SP, Drach RD, Hormes JM. Food insecurity among graduate students: prevalence and association with depression, anxiety and stress. Public Health Nutr. 2021;24(7):1889–94. pmid:32792027
  15. Oh H, Smith L, Jacob L, Du J, Shin JI, Zhou S, et al. Food insecurity and mental health among young adult college students in the United States. J Affect Disord. 2022;303:359–63. pmid:35157947
  16. Ryan RA, Murphy B, Deierlein AL, Lal S, Parekh N, Bihuniak JD. Food insecurity, associated health behaviors, and academic performance among Urban University undergraduate students. J Nutr Educ Behav. 2022;54(3):269–75. pmid:34758921
  17. Richards R, Stokes N, Banna J, Cluskey M, Bergen M, Thomas V, et al. A comparison of experiences with factors related to food insecurity between college students who are food secure and food insecure: a qualitative study. J Acad Nutr Diet. 2023;123(3):438–453.e2. pmid:35940496
  18. Glanz K, Rimer B, Viswanath K. Health behavior and health education: theory, research, and practice. 4th ed. San Francisco (CA): Jossey-Bass Publisher; 2008.
  19. El Zein A, Shelnutt KP, Colby S, Vilaro MJ, Zhou W, Greene G, et al. Prevalence and correlates of food insecurity among U.S. college students: a multi-institutional study. BMC Public Health. 2019;19(1):660. pmid:31142305
  20. Soldavini J, Berner M, Da Silva J. Rates of and characteristics associated with food insecurity differ among undergraduate and graduate students at a large public University in the Southeast United States. Prev Med Rep. 2019;14:100836. pmid:30886818
  21. Hagedorn RL, McArthur LH, Hood LB, Berner M, Anderson Steeves ET, Connell CL, et al. Expenditure, coping, and academic behaviors among food-insecure college students at 10 higher education institutes in the Appalachian and Southeastern regions. Curr Dev Nutr. 2019;3(6):nzz058. pmid:31149651
  22. Bruening M, Argo K, Payne-Sturges D, Laska MN. The struggle is real: a systematic review of food insecurity on postsecondary education campuses. J Acad Nutr Diet. 2017;117(11):1767–91.
  23. Payne-Sturges DC, Tjaden A, Caldeira KM, Vincent KB, Arria AM. Student hunger on campus: food insecurity among college students and implications for Academic Institutions. Am J Health Promot. 2018;32(2):349–54. pmid:28699401
  24. Nikolaus CJ, An R, Ellison B, Nickols-Richardson SM. Food insecurity among College Students in the United States: a scoping review. Adv Nutr. 2020;11(2):327–48. pmid:31644787
  25. van Woerden I, Hruschka D, Vega-Lόpez S, Schaefer DR, Adams M, Bruening M. Food insecure college students and objective measurements of their unused meal plans. Nutrients. 2019;11(4):904. pmid:31018554
  26. Leung CW, Wolfson JA, Lahne J, Barry MR, Kasper N, Cohen AJ. Associations between food security status and diet-related outcomes among students at a large, Public Midwestern University. J Acad Nutr Diet. 2019;119(10):1623–31. pmid:31561811
  27. Zoellner J, Harris JE. Mixed-methods research in nutrition and dietetics. J Acad Nutr Diet. 2017;117(5):683–97. pmid:28284525
  28. Mackison D, Wrieden WL, Anderson AS. Validity and reliability testing of a short questionnaire developed to assess consumers’ use, understanding and perception of food labels. Eur J Clin Nutr. 2010;64(2):210–7. pmid:19904290
  29. Dillman DA, Smyth JD, Christian LM. Internet, phone, mail, and mixed-mode surveys: the tailored design method. 4th ed. Hoboken: Wiley; 2014.
  30. USDA ERS—Survey Tools [Internet]. [cited 2024 Jul 6]. Available from: https://www.ers.usda.gov/topics/food-nutrition-assistance/food-security-in-the-u-s/survey-tools/#adult
  31. Nikolaus CJ, Ellison B, Nickols-Richardson SM. Are estimates of food insecurity among college students accurate? Comparison of assessment protocols. PLoS One. 2019;14(4):e0215161. pmid:31017912
  32. Bujang B. A simplified guide to determination of sample size requirements for estimating the value of intraclass correlation coefficient: a review. Arch Orofac Sci. 2017;12(1):1–11.
  33. Richards R, Brown LB, Williams DP, Eggett DL. Developing a questionnaire to evaluate College Students’ knowledge, attitude, behavior, self-efficacy, and environmental factors related to canned foods. J Nutr Educ Behav. 2017;49(2):117–124.e1. pmid:27876324
  34. National Cancer Institute. Dietary Screener Questionnaires (DSQ) in the NHANES 2009-10: DSQ [Internet]. 2021 [cited 2023 Sep 8]. Available from: https://epi.grants.cancer.gov/nhanes/dietscreen/questionnaires.html
  35. Kather F, Hadzic M, Hehle T, Eichler S, Klein J, Völler H, et al. Test-retest reliability of the Mini Nutritional Assessment-Short Form (MNA-SF) in older patients undergoing cardiac rehabilitation. J Geriatr Cardiol. 2020;17(9):574–9. pmid:33117422
  36. Van Horn LT, Wright L, Arikawa AY, Sealey-Potts C. Validity and reliability of a questionnaire measuring EBDPs among registered dietitian nutritionist. J Hum Nutr Diet. 2023;36(1):323–35. pmid:35485216
  37. Devellis RF, Thorpe CT. Scale development: theory and applications. 5th ed. US: SAGE Publications; 2021.
  38. Revilla M, Höhne J. How long do respondents think online surveys should be? New evidence from two online panels in Germany. Int J Mark Res. 2020;62(5):538–45.
  39. Sammut R, Griscti O, Norman IJ. Strategies to improve response rates to web surveys: a literature review. Int J Nurs Stud. 2021;123:104058.
  40. Berndt AE. Sampling methods. J Hum Lact. 2020;36(2):224–6.