
Preliminary evidence for the validity of the Brief Post-Secondary Student Stressors Index (Brief-PSSI): A cross-sectional psychometric assessment

  • Brooke Linden ,

    Roles Conceptualization, Data curation, Formal analysis, Methodology, Project administration, Software, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    brooke.linden@queensu.ca

    Affiliation Health Services and Policy Research Institute, Queen’s University, Kingston, Ontario, Canada

  • Amy Ecclestone

    Roles Conceptualization, Data curation, Formal analysis, Project administration, Writing – original draft, Writing – review & editing

    Affiliation School of Public Health Sciences, University of Waterloo, Kitchener, Ontario, Canada

Abstract

The brief version of the Post-Secondary Student Stressors Index (Brief-PSSI) was developed in order to improve the usability of the instrument as a method for evaluating the severity and frequency of stressors faced by post-secondary students. While the original 46-item instrument has been thoroughly psychometrically validated and successfully used among student populations, the length of the instrument limits its utility. Providing a valid, shortened version of the PSSI will enable institutions to include the tool on existing online surveys currently being deployed to surveil the mental health and wellbeing of their students. This study reports preliminary evidence in support of the validity and reliability of the Brief-PSSI using a cross-sectional pilot sample of students attending an Ontario university in 2022. A total of 349 participants (average age 25 (SD = 7.7), range 19–60) completed the first survey, while 149 completed the follow-up survey (average age of 26 (SD = 7.7), range 17–60). Evidence of internal structure, relations to other variables, and of test-retest reliability was assessed according to established index validation guidelines, including the specification of multiple-indicator, multiple-cause models, and Spearman’s rho correlation coefficients. Results provide preliminary support for the validity and reliability of the tool, which demonstrated acceptable goodness-of-fit statistics, statistically significant relationships with like constructs in the hypothesized directions, and good test-retest reliability correlation coefficients. The Brief-PSSI is a useful tool for evaluating the sources of stress among post-secondary students, assessing both the severity of stress experienced and frequency with which each stressor occurred. Future research should explore the practical utility of adding the Brief-PSSI to existing survey assessments as well as pursue the continued collection of validation evidence for the tool among varied student populations.

Background

Concerns surrounding post-secondary student mental health and wellbeing have increased over the past decade, with above average stress levels and related mental health concerns reported by many in this population [1, 2]. The majority of students attending post-secondary institutions in Canada belong to the 18-to-25 year age group, referred to as “emerging adulthood” [3]. Research suggests that during this phase of the life course, increased autonomy combined with a lack of role permanence and associated responsibilities results in an increased propensity to engage in risky behaviour, often as a maladaptive method of coping with stress [3]. Importantly, substantial brain and psychosocial development takes place during emerging adulthood, increasing individuals’ susceptibility to risk factors for the development of mental illnesses [4]. In fact, the majority of mental illnesses have their first onset at or before 18 years of age [5]. Data collected from the National College Health Assessment survey (NCHA II) in 2019 (n = 55,284) support this: nearly one quarter of respondents self-reported a past-year diagnosis of anxiety (24%) or depression (20%), and 16% reported a dual diagnosis [6]. In addition, many students reported feeling hopeless (63.6%), overwhelmed (88.2%), and anxious (68.9%) [6]. Concerningly, recent research suggests that these prevalence estimates have increased significantly over the past decade [7].

In addition to diagnoses and symptoms of mental illnesses, a high prevalence of above average stress was evident in students’ responses to the NCHA II. Nearly half of respondents reported their past-year stress level to have been “more than average” (45.6%), while 15% reported “tremendous” stress [6]. Excessive stress among student populations has been linked to declining mental health, which is in turn associated with a number of negative outcomes including the development of mental illnesses [8, 9], poor academic performance [10], increased substance use [1], dropout [11], burnout [12], and in extreme cases, self-injury and/or suicidal ideation [2, 9]. Though much of the extant literature on student stress focuses on academic stressors, more contemporary research suggests that the stressors experienced by students throughout their post-secondary careers extend into additional domains, including the learning environment, campus culture, and interpersonal and personal life [13]. More recently, challenges related to the COVID-19 pandemic introduced novel stressors (and in some cases exacerbated pre-existing ones), including a mandatory move to online learning, interrupted academic activities and work placements, increased social isolation and loneliness stemming from campus closures, and economic uncertainty [14, 15].

The Post-Secondary Student Stressors Index (PSSI) was developed in collaboration with students, aiming to fill the gaps left by previous instruments that attempted to assess students’ stress [16–18]. These instruments were either too narrow or too broad in scope [19], did not involve students in the development process [20], had poor psychometric properties [21–24], or were outdated [20, 25]. Most importantly, they did not comprehensively capture the multitude of stressors faced by post-secondary students. The PSSI is a validated tool that evaluates 46 stressors pertinent to post-secondary life across five domains (academics, the learning environment, campus culture, interpersonal stressors, and personal stressors), with 6 to 12 items in each domain. It can help post-secondary institutions determine the most severe and frequently occurring sources of stress for students on their campus and tailor their upstream mental health supports accordingly. Although the PSSI is a psychometrically sound and evidence-informed tool, its length may be problematic for inclusion in pre-existing surveillance survey efforts currently conducted by institutions (e.g., the National College Health Assessment, the Canadian Campus Wellbeing Survey). Therefore, the purpose of this study was to collect and analyze preliminary evidence for the validity and reliability of the Brief-PSSI, a shorter, 14-item version of the original PSSI.

Methods

Study design

We used a cross-sectional, online survey to gather data from a random sample of students attending an Eastern Ontario university during the Spring semester of 2022. All enrolled students were eligible for inclusion with the exception of those studying off campus (i.e., on exchange abroad). Upon request, a sample of 5000 student e-mails was drawn by Queen’s University’s Office of Research and Institutional Planning and provided to the Principal Investigator. The sample size of 5000 was based on our expectation of an approximate 8–10% response rate, drawn from previous experience surveying this population. We aimed to obtain a minimum of 300 responses in order to comfortably conduct the desired psychometric analyses.

Students received an e-mail invitation on July 5, 2022, with a reminder email sent to those who had not yet completed the survey on July 12, 2022. Participants were provided with a Letter of Information as the first page of the survey and were asked to indicate their written, informed consent by selecting “I have read the Letter of Information and agree to participate in this study” before being granted access to any survey questions. To enable us to assess the test-retest reliability of the tool, a second, follow-up survey was sent to all participants who completed the first survey on August 4, 2022, with a reminder e-mail sent on August 12, 2022.

This study sought to obtain preliminary evidence for the validity and reliability of the Brief-PSSI. Validation is the process by which we determine the degree of confidence we can place in the inferences made about people based on their scores on a given instrument [26]. Reliability refers to the consistency of test scores within a particular population. As stated in the Standards for Educational and Psychological Testing (“the Standards”), the comprehensive validation of an instrument requires the accumulation of evidence from five sources: content; response processes; internal structure; relations to other variables; and consequences of testing [27]. This article reports the collection of internal structure and relations to other variables evidence for the validity of the Brief-PSSI, in addition to examining its test-retest reliability. This research received ethics clearance from the Health Sciences and Affiliated Teaching Hospitals Research Ethics Board (#HSPR-020-22).

Measures

Demographics.

Several demographic variables were assessed, including gender (male, female, non-binary), year of birth, residence during the academic year, level of study (undergraduate, graduate, or professional program), enrollment status (part-time/full-time), international student status (yes/no), first generation student status (yes/no), racialized group/visible minority status (yes/no), and self-reported grade point average (GPA).

Brief Post-Secondary Student Stressors Index.

The Brief-PSSI is composed of 14 items and was developed by strategically collapsing categories of items included in the original 46-item version of the PSSI. For each stressor on the Brief-PSSI, respondents are asked to indicate the severity of stress experienced and the frequency with which this stress occurred. Response options range on a scale from 1 (‘not stressful’ and ‘rarely’) to 4 (‘very stressful’ and ‘almost always’), with higher ratings indicating a greater severity or frequency of stress. An additional option to indicate ‘N/A’ was also available in cases where a stressor did not occur or was not applicable.
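Item-level summaries follow directly from this response format: severity and frequency are averaged per item, with ‘N/A’ responses excluded. A minimal sketch of that logic (the ratings and item are hypothetical, not study data):

```python
import statistics

# Hypothetical Brief-PSSI responses for one item ("maintaining my grades"):
# severity and frequency are rated 1-4; None marks an 'N/A' response.
severity_ratings = [4, 3, None, 2, 4, 3]
frequency_ratings = [3, 3, None, 2, 4, 2]

def summarize(ratings):
    """Mean and SD of an item's ratings, excluding N/A responses."""
    valid = [r for r in ratings if r is not None]
    return statistics.mean(valid), statistics.stdev(valid)

mean_sev, sd_sev = summarize(severity_ratings)
mean_freq, sd_freq = summarize(frequency_ratings)
print(f"severity: M = {mean_sev:.2f}, SD = {sd_sev:.2f}")
print(f"frequency: M = {mean_freq:.2f}, SD = {sd_freq:.2f}")
```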

Mental health measures.

Composed of ten items, the Perceived Stress Scale (PSS-10) is designed to evaluate overall perceived stress level. Items ask respondents how often they felt a certain way within the past month, with response options ranging from 0 (never) to 4 (very often). A composite score ranging from 0 to 40 is calculated, with higher scores being indicative of higher stress levels [28]. The Kessler Psychological Distress Scale (K10) measures levels of psychological distress experienced within the past month. Each of the ten items are scored from 1 (none of the time) to 5 (all of the time). A composite score ranging from 10 to 50 is calculated, with higher scores being indicative of higher levels of psychological distress [29]. Finally, the Connor-Davidson Resiliency Scale (CD-RISC-10) assesses an individual’s ability to cope in the face of adversity. Items are scored from 0 (not true at all) to 4 (true nearly all of the time). Responses are summed for a composite score ranging from 0–40, with higher scores indicative of a higher level of resilience [30, 31]. All three scales have demonstrated strong psychometric properties and have been used among samples of post-secondary aged youth [3236].
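The three composites are simple sums, with one wrinkle: the PSS-10 reverse-scores its four positively worded items (items 4, 5, 7, and 8) before summing. A sketch of the scoring, using hypothetical responses:

```python
def pss10_score(responses):
    """PSS-10 composite (0-40): items 4, 5, 7, and 8 are positively
    worded and reverse-scored (0 <-> 4) before summing."""
    reverse = {4, 5, 7, 8}  # 1-indexed item positions
    return sum(4 - r if i in reverse else r
               for i, r in enumerate(responses, start=1))

def k10_score(responses):
    """K10 composite (10-50): sum of ten items scored 1-5."""
    return sum(responses)

def cdrisc10_score(responses):
    """CD-RISC-10 composite (0-40): sum of ten items scored 0-4."""
    return sum(responses)

# Hypothetical respondent's PSS-10 answers:
print(pss10_score([3, 2, 3, 1, 2, 3, 1, 2, 3, 2]))  # → 26
```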

Analysis

Statistical analyses were completed using R Version 4.1.3 (R Foundation for Statistical Computing, Vienna, Austria) [37]. We first conducted a basic descriptive analysis of the demographic characteristics of the sample and each of the stressor variables in the Brief-PSSI. Following this, we began our psychometric assessment of the instrument. We used a statistical approach to assessing evidence for validity appropriate for an index, as opposed to a scale. Like the original PSSI, the brief version of the tool was designed as an index, meaning that individual stressors were conceptualized as causal indicators (i.e., “causes” of stress) rather than effect indicators (i.e., “effects” of stress) [38, 39]. Index construction places emphasis on considering multicollinearity among indicators and assessing theoretical relationships between latent constructs and predictor variables [40], rather than on homogeneity of items within an index [39, 41].

Item reduction.

An exploratory factor analysis (EFA) was performed on the original, 46-item PSSI to guide the item reduction process. The results of the EFA were examined for clear groupings of factor loadings. Groupings were then examined for a common theme and “collapsed” into a single item for use on the Brief PSSI. For example, original items “maintaining my GPA” and “receiving a bad grade” were collapsed to create a new item for the brief tool, “maintaining my grades”. Where cross-loadings occurred (i.e., factor loadings on more than one component), the higher factor loading was retained. See S2 Appendix for the complete results of the EFA and resulting item reduction.
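The cross-loading rule described above (retain the factor with the higher loading) amounts to a per-item argmax over the loading matrix. A toy sketch with made-up loadings, not the actual EFA results:

```python
import numpy as np

# Hypothetical EFA loadings (items x factors) illustrating the rule:
# a cross-loading item is assigned to the factor with its higher loading.
items = ["maintaining my GPA", "receiving a bad grade", "course instructors"]
loadings = np.array([
    [0.62, 0.18],
    [0.55, 0.31],   # cross-loads; 0.55 > 0.31, so factor 0 is retained
    [0.21, 0.70],
])
assignment = loadings.argmax(axis=1)
for item, factor in zip(items, assignment):
    print(f"{item} -> factor {factor}")
```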

Internal structure evidence.

First, we performed an EFA on the Brief-PSSI to determine the recommended structure of the instrument (S3 Appendix). Both parallel and very simple structure analyses recommended a one-factor solution as the best fit, closely followed by a two-factor solution. We elected to test both models in subsequent analyses. Next, to determine the degree to which the relationships among items within the instrument were consistent with what is expected of the construct under study, we estimated multiple-indicator, multiple-cause (MIMIC) models, a special case of structural equation modelling (SEM) and extension of Confirmatory Factor Analysis (CFA). MIMIC models allow for the estimation of relationships among latent constructs of interest (unobserved variables such as stress) [42]. We hypothesized the model would demonstrate acceptable goodness-of-fit statistics with no evidence of multicollinearity among indicators (using a variance inflation factor (VIF) cut-off threshold of 10) [43].
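The multicollinearity screen can be illustrated by computing each indicator's VIF, i.e. 1/(1 − R²) from regressing that indicator on the remaining ones. A sketch on simulated data (the study itself used R; the values here are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated ratings (n respondents x k indicators); in the study these
# would be the Brief-PSSI stressor items.
n, k = 300, 5
X = rng.normal(size=(n, k))
X[:, 1] += 0.5 * X[:, 0]          # induce mild correlation between two items

def vif(X, j):
    """VIF for column j: 1 / (1 - R^2) from regressing x_j on the others."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(y)), others])   # add intercept
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1 - resid.var() / y.var()
    return 1.0 / (1.0 - r2)

vifs = [vif(X, j) for j in range(k)]
print([round(v, 2) for v in vifs])  # values near 1 indicate no multicollinearity
```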

Relations to other variables.

To explore the relationships between the Brief-PSSI and other “like” constructs, we calculated non-parametric Spearman’s rho correlation coefficients between items on the Brief-PSSI and scores on the comparison measures. We hypothesized that a higher score on the Brief-PSSI latent variable (indicating higher student stress) would predict higher scores on both the PSS-10 and K10 (higher general stress and psychological distress) and lower scores on the CD-RISC-10 (lower resilience).
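The hypothesized pattern (positive correlations with the stress/distress measures, negative with resilience) can be checked with `scipy.stats.spearmanr`. A sketch on simulated stand-in data, not the study sample:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
# Simulated stand-ins: one stressor's 1-4 severity ratings and two
# composite scores constructed to mimic a "like" and an inverse construct.
n = 200
stress = rng.integers(1, 5, size=n).astype(float)
pss10 = 10 + 3 * stress + rng.normal(scale=2.0, size=n)    # like construct
cdrisc = 35 - 3 * stress + rng.normal(scale=2.0, size=n)   # inverse construct

rho_pss, p_pss = spearmanr(stress, pss10)
rho_cdr, p_cdr = spearmanr(stress, cdrisc)
print(f"PSS-10: rho = {rho_pss:.2f} (p = {p_pss:.3g})")
print(f"CD-RISC-10: rho = {rho_cdr:.2f} (p = {p_cdr:.3g})")
```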

Reliability.

To assess the test-retest reliability of the tool, we used the matched responses from both administered surveys to examine Spearman’s rho correlation coefficients between items on the Brief-PSSI over the two-week period. We hypothesized that student stress levels would remain fairly consistent over a two-week period (barring any major stressful event during that time frame) and considered 0.7 to be indicative of good test-retest reliability [26]. We then conducted a sensitivity analysis, removing participants who indicated they had experienced a major stressful event during the two-week period between surveys. Tests were conducted using a complete case analysis approach to missing data, wherein any respondent with complete information for the variables used in an individual test was included in the analyses.
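The complete-case, matched-responses procedure reduces to dropping any pair with a missing rating before correlating T1 with T2 and comparing against the 0.7 benchmark. A sketch on simulated data (item values illustrative only):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
# Simulated T1/T2 ratings for one Brief-PSSI item; np.nan marks respondents
# who skipped the item on one survey (complete-case analysis drops them).
n = 150
t1 = rng.integers(1, 5, size=n).astype(float)
t2 = np.clip(t1 + rng.choice([-1, 0, 0, 0, 1], size=n), 1, 4)
t1[rng.choice(n, 10, replace=False)] = np.nan   # some item non-response

complete = ~np.isnan(t1) & ~np.isnan(t2)        # complete cases only
rho, p = spearmanr(t1[complete], t2[complete])
print(f"test-retest rho = {rho:.2f}; good reliability: {rho >= 0.7}")
```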

Results

Sample and participants

We first calculated the descriptive statistics for the sample (Table 1). We initially received 506 submitted surveys, representing a 10% response rate in line with our expectations; however, only n = 349 had usable data (i.e., <90% missing data), ultimately resulting in a response rate closer to 7%. A total of 174 of these participants completed the second survey (T2 n = 174), representing an approximate 50% follow-up response rate between T1 and T2. At both time points, the median age was 23 years, and most participants were full-time, female undergraduate students between 21 and 24 years old, reporting a GPA in the A range. For the majority of our analyses, we used the T1 sample as it was larger. We used both samples to evaluate the test-retest reliability of the tool.

Distribution of stressors in sample

Descriptive statistics (means and standard deviations) were calculated for each stressor on the Brief-PSSI at each timepoint (see S1 Appendix). The overall distribution of stressors, stratified by mean severity and frequency for the T1 sample, is depicted via a quadrant graph in Fig 1. Fig 2 depicts the same data in a faceted display, where each domain of stress is shown on its own quadrant graph. The most severe stressors experienced by students in this sample were pressure to succeed (SD = 0.94), examinations (SD = 0.85), and concerns for the future (SD = 1.03); the most frequent were pressure to succeed (SD = 0.97), managing my academic workload (SD = 0.88), and concerns for the future (SD = 1.05), with mean values reported in S1 Appendix. Overall, these findings are consistent with patterns of stressors we have observed in previous samples of students who completed the original PSSI [13, 16, 17].

Fig 1. Quadrant graph displaying mean severity and frequency of Brief-PSSI stressors.

Stressors are displayed by mean severity and frequency ratings for the total sample, colour coded by domain (T1 data).

https://doi.org/10.1371/journal.pone.0297171.g001

Fig 2. Facet graph displaying mean severity and frequency of Brief-PSSI stressors.

Stressors are displayed by mean severity and frequency ratings for the total sample, faceted and colour coded by domain. Note that the same data is displayed in Figs 1 and 2 (T1 data).

https://doi.org/10.1371/journal.pone.0297171.g002

Internal structure

Using data from the first time point, we first performed two exploratory factor analyses based on the recommended results of parallel and very simple structure analyses: a one-factor solution and a two-factor solution. The one-factor model was significant (χ2 = 312.5, p < 0.001), with most factor loadings ≥0.4, 29.7% of the variance explained, and acceptable fit statistics (RMSA = 0.07, RMSEA = 0.094 [95% CI 0.083, 0.105], and TLI = 0.79). The two-factor model was also significant (χ2 = 208.19, p < 0.001), with factor loadings mostly ≥0.4 and 35.1% of the total variance explained. The fit for this model was good (RMSA = 0.05, RMSEA = 0.08 [95% CI 0.068, 0.093], and TLI = 0.84). The table in S3 Appendix displays the EFA results.

Next, we specified two MIMIC models. We first ruled out issues of multicollinearity by confirming that the variance inflation factors (VIF) for all Brief-PSSI indicators were well below the selected cut-off (<5). Descriptive statistics for indicators included in the models are outlined in Table 2. In the first model, we included all 14 stressor variables from the Brief-PSSI, specifying one overall latent variable of “student stress”. In the second model, we ran the two-factor model explored earlier through EFA. For both models, we excluded observations where a participant indicated that a stressor was “not applicable” or “did not happen”, based on prior analyses demonstrating that the removal of these responses made for a stronger overall model [18]. Results of both models are displayed in Table 3.

Table 2. Descriptive statistics for MIMIC model indicators.

https://doi.org/10.1371/journal.pone.0297171.t002

Table 3. Results of multiple-indicator, multiple-cause (MIMIC) models.

https://doi.org/10.1371/journal.pone.0297171.t003

All items in the first model demonstrated moderate-to-strong factor loadings, with only two falling below 0.4. The model was statistically significant (χ2 = 143.47, p < 0.001) and demonstrated acceptable overall fit. The root mean square error of approximation (RMSEA) and standardized root mean square residual (SRMR) were both <0.1, though not as small as typically desired (≤0.05), while the Tucker-Lewis Index (TLI) was just below the desired 0.9 cutoff. The second model was also statistically significant (χ2 = 122.66, p < 0.001) and showed slightly stronger fit statistics: RMSEA and SRMR were smaller compared to the first model, while the TLI reached the desired 0.9 cutoff.

Relations to other variables

To assess relations to other variables, the 14 items on the Brief-PSSI were correlated with the PSS-10, K10, and CD-RISC-10 scores. As expected, all correlations moved in the hypothesized direction (i.e., positively related to PSS-10 and K10, inversely related to CD-RISC-10) and were statistically significant (Fig 3).

Fig 3. Correlation matrix for Brief-PSSI stressors and related constructs.

Data show the relationships between individual stressors and the three main related constructs of interest: the Kessler Psychological Distress Scale (K10), Perceived Stress Scale (PSS-10, “PSSscore”), and Connor-Davidson Resiliency Scale (CD-RISC, “CDRscore”).

https://doi.org/10.1371/journal.pone.0297171.g003

Test-retest reliability

Next, we assessed the temporal stability of the tool by examining Spearman’s rho (rs) correlation coefficients between Brief-PSSI items and PSS-10 scores at the first and second data collection time points for all respondents with complete data (i.e., who answered the questions on both surveys). As hypothesized, all correlations were moderate-to-strong (rs = 0.50–0.73) and statistically significant (Table 4).

Table 4. Test-retest reliability correlation coefficients for the Brief-PSSI and PSS-10 at T1 and T2.

https://doi.org/10.1371/journal.pone.0297171.t004

Discussion

Mental health concerns continue to be an issue among post-secondary student populations, with recent research suggesting that above average stress, psychological distress, and symptoms consistent with mental illnesses have significantly increased over the past decade [7]. In response to the growing demand for mental health supports, and in line with emergent broad-scale frameworks for supporting post-secondary mental health (i.e., the Okanagan Charter [44] and the National Standard of Canada for Mental Health and Wellbeing for Post-secondary Students [45]), many institutions have begun to move away from the more traditional medical model of treating symptoms of mental illness, instead adopting a more holistic approach to supporting overall student mental health and fostering a culture of wellbeing [46]. Mental health promotion and mental illness prevention strategies are critical upstream supports that aim to reduce the overall prevalence of languishing mental health and mental illness by providing students with the tools required to develop and improve resiliency, intervening prior to symptom development. While many institutions do indeed provide upstream mental health promotion supports to their students, national data collected from campus mental health service providers and institutional administrators in 2016 suggested that students seldom seek out these supports [47]. Furthermore, many respondents reported feeling that there was room for improvement in the services and supports currently provided [47]. A large part of improving upstream approaches designed to support students’ mental health and wellbeing involves the careful evaluation of the sources of student stress. Doing so will allow post-secondary institutions to improve the targeting of their upstream supports, which may in turn alleviate the overwhelming burden currently placed on downstream treatment options.

The Post-Secondary Student Stressors Index (PSSI) is a useful tool for conducting a detailed evaluation of the sources of student stress, providing a rich, holistic assessment by evaluating nearly fifty potential stressors across five domains of student life. However, many institutions already deploy other lengthy survey tools designed to support the surveillance of student mental health and wellbeing, including the National College Health Assessment [6] and the Canadian Campus Wellbeing Survey [48]. While the PSSI adds a detailed assessment of student stress that is presently missing from these surveys, adding a 46-item instrument to these already lengthy questionnaires is not feasible. Doing so is likely to push respondents past their cognitive load capacity, which may impact response rates as well as the quality of the data collected. The Brief-PSSI is a short, 14-item alternative that can be more easily added to these existing surveillance surveys without unduly increasing cognitive load. In addition, having been created through a strategic grouping of the stressors assessed by the original PSSI, the Brief-PSSI continues to provide valuable insight into the most salient stressors experienced by students, assessed by both severity of stress elicited and frequency of occurrence. Though the tool evaluates fewer stressors overall, the five domains of student life assessed in the original instrument continue to be tapped, resulting in a high-level overview of general patterns of student stress that can provide institutions with guidance on how to improve the targeting of their upstream mental health supports. In this study, we provide preliminary internal structure and relations to other variables evidence for the validity of the Brief-PSSI, as well as evidence of strong test-retest reliability over a two-week period. Future research should continue to explore the validity of this novel instrument with larger, more varied samples.

Limitations

As with all research, there are some limitations. No formal content or response processes evidence was collected as part of this validation study for the Brief-PSSI. However, extensive content validation work was completed on the original, 46-item instrument, and the brief version of the tool was created by strategically collapsing those stressors into smaller categories. The sample size used here was small, particularly for the second survey used in the test-retest reliability analyses. The majority of respondents were full-time, female undergraduate students between 21 and 24 years old, reporting a GPA in the A range; overrepresentation in these categories may have impacted the overall results. Further, this sample was drawn from students attending a single university in Eastern Ontario, and during the Spring semester, when many students are not enrolled. Additionally, the response rate of about 7% was low, though only just below the range of response rates typically observed in survey-based research conducted among this population. As a result, selection bias may be present. Validation is an ongoing process. Future validation work conducted on the Brief-PSSI should examine evidence to support the validity of the tool among a larger, more varied sample composed of students attending multiple institutions, to ensure it is valid for use across different post-secondary contexts. Additionally, types of evidence for validity not presented in the current study should be examined (i.e., content, response processes). Finally, we did not have sufficient data to complete an assessment of measurement equivalence in this study. Future work should explore this method of validity testing as well.

Supporting information

S1 Appendix. Mean severity and frequency for stressors on Brief-PSSI across timepoints.

https://doi.org/10.1371/journal.pone.0297171.s001

(DOCX)

S2 Appendix. Exploratory factor analysis conducted on PSSI and item reduction for development of Brief PSSI.

https://doi.org/10.1371/journal.pone.0297171.s002

(DOCX)

S3 Appendix. Exploratory factor analysis of Brief PSSI.

https://doi.org/10.1371/journal.pone.0297171.s003

(DOCX)

References

  1. Duffy A, Keown-Stoneman C, Goodday S, Horrocks J, Lowe M, King N, et al. Predictors of mental health and academic outcomes in first-year university students: Identifying prevention and early-intervention targets. BJPsych Open. 2020;6:1–8. pmid:32381150
  2. Gollust SE, Eisenberg D, Golberstein E. Prevalence and correlates of self-injury among university students. Journal of American College Health. 2008;56:491–8. pmid:18400660
  3. Arnett JJ. Emerging Adulthood: a theory of development from the late teens through the twenties. American Psychologist. 2000;55:469–80.
  4. Chung WW, Hudziak JJ. The Transitional Age Brain: “The Best of Times and the Worst of Times.” Child Adolesc Psychiatr Clin N Am. 2017;26:157–75.
  5. Kessler RC, Amminger GP, Aguilar-Gaxiola S, Alonso J, Lee S, Ustün TB. Age of onset of mental disorders: a review of recent literature. Curr Opin Psychiatry. 2007;20:359–64. pmid:17551351
  6. American College Health Association. American College Health Association—National College Health Assessment II: Canadian Reference Group Data Report Spring 2019. Silver Spring, MD; 2019. www.acha.org
  7. Linden B, Boyes R, Stuart H. Cross-Sectional Trend Analysis of the NCHA II Survey Data on Canadian Post-Secondary Student Mental Health and Wellbeing from 2013 to 2019. BMC Public Health. 2021;21:1–13.
  8. Patel V, Flisher AJ, Hetrick S, McGorry P. Mental health of young people: a global public-health challenge. The Lancet. 2007;369:1302–13. pmid:17434406
  9. Eisenberg D, Gollust SE, Golberstein E, Hefner JL. Prevalence and correlates of depression, anxiety, and suicidality among university students. American Journal of Orthopsychiatry. 2007;77:534–42. pmid:18194033
  10. Hjorth CF, Bilgrav L, Frandsen LS, Overgaard C, Torp-Pedersen C, Nielsen B, et al. Mental health and school dropout across educational levels and genders: a 4.8-year follow-up study. BMC Public Health. 2016;16:976. pmid:27627885
  11. Keyes CLM. The Mental Health Continuum: From Languishing to Flourishing in Life. J Health Soc Behav. 2002;43:207–22. pmid:12096700
  12. Evans TM, Bira L, Gastelum JB, Weiss LT, Vanderford NL. Evidence for a mental health crisis in graduate education. Nat Biotechnol. 2018;36:282–4. pmid:29509732
  13. Linden B, Stuart H, Ecclestone A. Trends in Post-Secondary Student Stress: A Pan-Canadian Study. Canadian Journal of Psychiatry. 2023;68:521–30. pmid:35791667
  14. Sahu P. Closure of Universities Due to Coronavirus Disease 2019 (COVID-19): Impact on Education and Mental Health of Students and Academic Staff. Cureus. 2020;12. pmid:32377489
  15. Son C, Hegde S, Smith A, Wang X, Sasangohar F. Effects of COVID-19 on college students’ mental health in the United States: Interview survey study. J Med Internet Res. 2020. pmid:32805704
  16. Linden B, Stuart H. Psychometric assessment of the Post-Secondary Student Stressors Index (PSSI). BMC Public Health. 2019;19:1–12.
  17. Linden B, Boyes R, Stuart H. The Post-Secondary Student Stressors Index: proof of concept and implications for use. Journal of American College Health. 2020;1–9. pmid:32432984
  18. Linden B, Stuart H. Further evidence in support of the validity of the post-secondary student stressors index using a nationwide, cross-sectional sample of Canadian university students. PLoS One. 2022;17:e0278897. pmid:36520875
  19. Stallman HM, Hurst CP. The University Stress Scale: measuring domains and extent of stress in university students. Aust Psychol. 2016;51:128–34.
  20. Ross S, Niebling B, Heckert T. Sources of stress among college students. Coll Stud J. 1999;33:312.
  21. Feldt RC, Koch C. Reliability and construct validity of the College Student Stress Scale. Psychol Rep. 2011;108:660–6. pmid:21675579
  22. Feldt RC, Graham M, Dew D. Measuring adjustment to college: construct validity of the Student Adaptation to College Questionnaire. Measurement and Evaluation in Counseling and Development. 2011;44:92–104.
  23. Taylor MA, Pastor DA. A confirmatory factor analysis of the Student Adaptation to College Questionnaire. Educ Psychol Meas. 2007;67:1002–18.
  24. Feldt RC. Development of a brief measure of college stress: The College Student Stress Scale. Psychological Reports. 2008;102:855–60. pmid:18763455
  25. Rocha-Singh IA. Perceived stress among graduate students: development and validation of the Graduate Student Stress Inventory. Educ Psychol Meas. 1994;54:714–27.
  26. Streiner D, Norman GR, Cairney J. Health measurement scales: A practical guide to their development and use. 5th ed. New York: Oxford University Press; 2015.
  27. American Educational Research Association, American Psychological Association, National Council on Measurement in Education. Standards for Educational and Psychological Testing. 2nd ed. Washington, DC: American Educational Research Association; 2014.
  28. Cohen S, Kamarck T, Mermelstein R. A global measure of perceived stress. J Health Soc Behav. 1983;24:385–96. pmid:6668417
  29. Kessler RC, Barker PR, Epstein JF, Gfroerer JC, Hiripi E, Howes M, et al. Screening for serious mental illness in the general population. Arch Gen Psychiatry. 2003;60:184–9. pmid:12578436
  30. Campbell-Sills L, Stein MB. Psychometric analysis and refinement of the Connor–Davidson Resilience Scale (CD-RISC): Validation of a 10-item measure of resilience. J Trauma Stress. 2007;20:1019–28. pmid:18157881
  31. Connor KM, Davidson JRT. Development of a new resilience scale: the Connor-Davidson Resilience Scale (CD-RISC). Depress Anxiety. 2003;18:76–82. pmid:12964174
  32. Singh K, Yu X. Psychometric Evaluation of the Connor-Davidson Resilience Scale (CD-RISC) in a Sample of Indian Students. Journal of Psychology. 2010;1:23–30.
  33. 33. Manzar MD, Salahuddin M, Alamri M, Albougami A, Khan MYA, Nureye D, et al. Psychometric properties of the Epworth sleepiness scale in Ethiopian university students. Health Qual Life Outcomes. 2019;17:1–8.
  34. 34. Harrer M, Adam SH, Baumeister H, Cuijpers P, Karyotaki E, Auerbach RP, et al. Internet interventions for mental health in university students: A systematic review and meta-analysis. Int J Methods Psychiatr Res. 2019;28:1–18. pmid:30585363
  35. 35. Smith KJ, Rosenberg DL, Haight GT. An assessment of the psychometric properties of the Perceived Stress Scale-10 (PSS10) with business and accounting students. Accounting Perspectives. 2014;13:29–59.
  36. 36. Stallman HM. Psychological distress in university students: A comparison with general population data. Aust Psychol. 2010;45:249–57.
  37. 37. R Core Team. R: A language environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing; 2023.
  38. 38. DeVellis RF. Scale development: theory and applications. 4th ed. Brickman L, Rog D, editors. California: SAGE Publications, Inc.; 2017.
  39. 39. Streiner DL. Being inconsistent about consistancy: when coefficient alpha does and doesn’t matter. J Pers Assess. 2003;80:217–22.
  40. 40. Diamantopoulos A, Siguaw JA. Formative versus reflective indicators in organizational measure development: a comparison and empirical illustration. British Journal of Management. 2006;17:263–82.
  41. 41. Streiner DL, Streiner DL. Starting at the Beginning: An Introduction to Coefficient Alpha and Internal Consistency. J Pers Assess. 2003;80:99–103. pmid:12584072
  42. 42. Chang C, Gardiner J, Houang R, Yu YL. Comparing multiple statistical software for multiple-indicator, multiple-cause modeling: an application of gender disparity in adult cognitive functioning using MIDUS II dataset. BMC Med Res Methodol. 2020;20:1–14. pmid:33183226
  43. 43. Craney TA, Surles JG. Model-dependent variance inflation factor cutoff values. Qual Eng. 2002;14:391–403.
  44. 44. International Conference on Health Promoting Universities and Colleges. Okanagan Charter: an international charter for health promoting universities and colleges [Internet]. Kelowna, BC; 2015. https://bp-net.ca/wp-content/uploads/2019/03/Okanagan-Charter.pdf
  45. 45. National standard for the mental health and well-being for post-secondary students. Canada: Canadian Standards Association; 2020.
  46. 46. Abrams Z. Student mental health is in crisis: campuses are rethinking their approach. American Psychological Association. 2022;53.
  47. 47. Jaworska N, De Somma E, Fonseka B, Heck E, MacQueen GM. Mental health services for students at postsecondary institutions: a national survey. Canadian Journal of Psychiatry. 2016;61:766–75. pmid:27310230
  48. 48. Faulkner G, Ramanathan S, Kwan M. Developing a coordinated Canadian post-secondary surveillance system: a Delphi survey to identify measurement priorities for the Canadian Campus Wellbeing Survey (CCWS). BMC Public Health. 2019;19. pmid:31296190