
Further evidence in support of the validity of the post-secondary student stressors index using a nationwide, cross-sectional sample of Canadian university students

  • Brooke Linden ,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Software, Validation, Writing – original draft, Writing – review & editing

    brooke.linden@queensu.ca

    Affiliations Department of Public Health Sciences, Queen’s University, Kingston, Ontario, Canada, Health Services and Policy Research Institute, Queen’s University, Kingston, Ontario, Canada

  • Heather Stuart

    Roles Conceptualization, Methodology, Supervision, Validation, Writing – review & editing

    Affiliations Department of Public Health Sciences, Queen’s University, Kingston, Ontario, Canada, Health Services and Policy Research Institute, Queen’s University, Kingston, Ontario, Canada

Abstract

The Post-Secondary Student Stressors Index (PSSI) was created to facilitate improved evaluation of the sources of post-secondary student stress. This study reports evidence in support of the validity of the tool using a large, nationwide cross-sectional sample of students attending universities across Canada during the 2020–2021 academic year. We provide additional evidence for the construct validation of the PSSI, including internal structure evidence and relations to other variables, by estimating multiple-indicator, multiple-cause models and investigating Spearman’s rho correlation coefficients between the PSSI and like constructs. Based on index validation guidelines, results provide further support for the internal structure of the PSSI, demonstrating hypothesized relationships with like constructs and manifest variables, as well as acceptable goodness-of-fit statistics. Similarly, correlation coefficients were statistically significant and in line with directionality hypotheses. The results of this research provide further evidence for the validity of the PSSI among varied university student populations in Canada and address several of the limitations identified in earlier preliminary psychometric work on the instrument.

Introduction

Although the prevalence of mental illnesses among post-secondary students is not significantly different from that observed among the general Canadian population [1], the demand for mental health services on post-secondary campuses continues to grow, as does the complexity of presenting issues [2]. National cross-sectional data suggest that the prevalence of students reporting above average to extreme levels of stress is increasing, along with self-reported symptoms of common mental illnesses such as anxiety and depression [3]. Chronic stress has been linked to poor academic performance and negative mental health outcomes [4], necessitating the bolstering of upstream campus services that provide students with the tools needed to recognize and respond to changes in their mental health in the face of day-to-day stressors. In order to appropriately tailor these upstream services to the needs of students, a method of measuring the sources of student stress is required.

Until recently, there has been no validated measurement tool for assessing the frequency and severity of stressors experienced by students within post-secondary settings. Existing instruments designed to evaluate student stress are associated with a host of issues: many are conceptualized too broadly or too narrowly, seldom involved students in the development process (often focusing on a single year or program), or lack strong psychometric properties. These instruments are also often outdated, with the majority having been developed between 1990 and 2005. This raises concerns regarding the validity of the instruments with respect to accurate and comprehensive measurement of the stressors that are currently salient for students [5].

In 2019, the Post-Secondary Student Stressors Index (PSSI) was developed in response to these gaps in measurement. The tool was developed and validated with extensive collaboration with students over the course of a two-year period. While a preliminary psychometric analysis was conducted on the instrument in 2019 using pilot test data from a single school in Ontario [6], instrument validation is an ongoing, iterative process [7]. Therefore, the purpose of this study was to collect further evidence of validity (specifically, internal structure and relations to other variables) using a larger, more varied sample, representing university students from across Canada.

Methods

Study design

The data analyzed here were collected during the study, “Multi-site Release of the Post-Secondary Student Stressors Index (PSSI)”, conducted during the 2020–2021 academic year. Cross-sectional data were collected at three time points via an online survey from students (n = 12,577) attending fifteen Canadian universities [8]. Participating institutions represented all but one Canadian province. The survey, distributed through Qualtrics, included questions about demographic characteristics, sources of student stress, mental health and mental illness, and stress specific to COVID-19. Participants were provided with a letter of information and indicated their consent by selecting “I have read and understood the Letter of Information and agree to participate in this study” before being granted access to the survey questions. The sampling method and sample size used were at the discretion of each participating institution. Further detail related to study design can be found elsewhere [8].

Measures

Demographics.

Several demographic variables were assessed, including: gender (male, female, non-binary), age in years, level of study (undergraduate, graduate, or professional program), marital status, residence during the academic year, student status (full-time/part-time/other), first-generation status (yes/no), international student status (yes/no), self-reported grade point average (GPA), and region of university (Eastern/Western/Central Canada).

Post-secondary student stressors index.

The Post-Secondary Student Stressors Index (PSSI) is a 46-item instrument. For each item, respondents are asked to indicate the severity of stress experienced and the frequency with which they experienced it. Responses are provided on an adjectival scale from 1 (‘not stressful’ and ‘rarely’) to 4 (‘extremely stressful’ and ‘almost always’), with higher ratings indicating a greater level of stress resulting from an item. An additional option to indicate “not applicable” (a response of 0) was also available in the event that a stressor did not happen or was otherwise not applicable. The tool evaluates stressors across five domains: academics, learning environment, campus culture, interpersonal, and personal. For the purposes of the subsequent analyses, only severity ratings were used.

Mental health measures.

The Perceived Stress Scale (PSS10) is a brief, 10-item scale designed to measure general stress [9]. Participants are asked to indicate the frequency with which they have had certain thoughts or feelings on an adjectival scale from 0 (‘never’) to 4 (‘very often’). Positively worded items on the scale are reverse coded and ratings then summed, with higher scores indicating more perceived stress. The Kessler Psychological Distress Scale (K10) is a 10-item scale designed to measure general psychological distress [10]. Participants are asked to indicate the frequency with which they have had certain thoughts or feelings on an adjectival scale from 1 (‘none of the time’) to 5 (‘all of the time’). Scores are then summed, with higher scores reflecting greater distress and a higher likelihood of experiencing a mental disorder. Both the PSS10 and K10 have demonstrated consistent psychometric properties in a number of samples and settings and have previously been used among post-secondary populations [11–13].
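The scoring conventions described above can be sketched as follows. This is an illustrative implementation, not the authors' code; the positions of the reverse-coded PSS10 items (4, 5, 7, and 8) follow the scale's conventional published scoring key and are an assumption, as this paper does not list them.

```python
def score_pss10(responses, reverse_items=(4, 5, 7, 8)):
    """Sum ten PSS10 ratings (0-4, in questionnaire order), reverse
    coding the positively worded items first. The default item numbers
    for reverse coding are the conventional ones, assumed here."""
    return sum((4 - r) if i in reverse_items else r
               for i, r in enumerate(responses, start=1))

def score_k10(responses):
    """Sum ten K10 ratings (1-5); higher totals indicate greater
    psychological distress (possible range 10-50)."""
    return sum(responses)
```

Note that a respondent who answers 0 to every PSS10 item still scores 16, because the four positively worded items are reverse coded; the possible PSS10 range is 0–40.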

Analysis

The statistical tests we used to collect evidence for validity reflect those appropriate for an index, as opposed to a scale. While both types of multi-item instruments are used to evaluate latent constructs (i.e., variables that cannot be directly observed), items in a scale are referred to as “effect indicators”, while those in an index are referred to as “causal indicators” [7]. With the latter, changes in the items (stressors) determine changes in the value of the latent variable (student stress). Unlike a scale, the goal of an index is not unidimensionality and internal consistency: there is no expectation for items within an index to correlate with one another, although they might [14, 15]. Instead, index construction places emphasis on considering multicollinearity among indicators and assessing theoretical relationships between latent constructs and predictor variables [16]. Most importantly, while the goal in scale development is often to use as few items as possible to measure the latent construct in the interest of brevity, the opposite is true of index development. Indices often include a large number of items, as the goal is to cover the entire scope of the latent variable, which may be quite broad. In general, items in an index are retained as long as they have a distinct influence on the latent construct, as removing an indicator from an index may result in omitting a part of the construct itself [16, 17].

Therefore, to further evaluate the internal structure of the PSSI, we estimated a multiple-indicator, multiple-cause (MIMIC) model, a special case of structural equation modelling (SEM) and an extension of Confirmatory Factor Analysis (CFA) [18]. A MIMIC model utilizes both a psychometric (i.e., measurement) approach as well as a structural approach, which allows the researcher to estimate relationships among latent constructs (unobserved variables) and contextualize them using predictor variables (usually demographics assumed to have no error, such as age or gender) [18]. We hypothesized that the model would demonstrate that the internal structure of the PSSI was consistent with an index and demonstrate acceptable goodness-of-fit statistics. To assess multicollinearity among indicators, we examined the variance inflation factor (VIF) using a cut-off threshold of 10 [19].
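As a concrete illustration of the multicollinearity check, the VIF for each indicator can be obtained by regressing it on the remaining indicators. The sketch below is not the authors' code (the analyses were run in R); it shows the standard computation using NumPy.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X (n observations
    by p indicators): VIF_j = 1 / (1 - R_j^2), where R_j^2 is the R^2
    from regressing column j on the remaining columns plus an intercept."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    out = np.empty(p)
    for j in range(p):
        y = X[:, j]
        # design matrix: intercept plus all indicators except the j-th
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)
        out[j] = 1.0 / (1.0 - r2)
    return out
```

Against the cut-off of 10 cited in the text, any indicator whose VIF exceeds the threshold would be flagged as largely redundant with the remaining indicators.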

Next, we explored the relationships between the latent variables on the PSSI and other “like” constructs to assess relations to other variables [20]. This method of index validation requires that these constructs are measured using effect indicators and that a theoretical relationship can be presumed to exist between the constructs [17]. We hypothesized that higher scores on the PSSI latent variables (indicating higher student-specific stress) would predict higher scores on both the PSS10 and K10 (higher general stress and psychological distress, respectively), resulting in positive, significant estimates. To test this hypothesis, we specified a second MIMIC model including PSS10 and K10 scores as predicted variables. Next, we calculated nonparametric Spearman’s rho (rs) correlation coefficients to examine the relationships between the PSSI, PSS10, and K10 across a variety of student subgroups, expecting to see moderate, positive, statistically significant relationships across all groups.

All tests were conducted using a complete case analysis approach to missing data; any student who had complete information for the variables relevant to an individual test was included in the analysis. All analyses were completed using R, Version 3.4.1. This research received ethics clearance from Queen’s University’s Health Sciences and Affiliated Teaching Hospitals Research Ethics Board.

Results

Sample

Descriptive statistics for the demographic characteristics of the sample are reported in Table 1. Most participants were female (71.6%), single (85.6%), lived off campus with family (54.0%), self-reported their GPA to be in the A range (60.3%), and studied full-time (90.6%) at the undergraduate level (75.0%). Most participants were aged 20 years or younger (41.4%), with an overall average age of 23 (SD = 6.9). Approximately 12% of the sample were international students, and over one quarter of respondents (26%) were the first in their families to attend university.

Table 1. Demographic characteristics of sample (n = 12 577).

https://doi.org/10.1371/journal.pone.0278897.t001

Internal structure evidence

All 46 items on the PSSI were first correlated with the PSS10, in line with recommendations to explore correlations between index indicators and a separate variable that “summarizes the essence of the construct that the index purports to measure” [16, 17]. The PSS10 is a validated measure of general perceived stress, thus fulfilling this requirement. We assessed the potential for multicollinearity by examining the variance inflation factor (VIF) for each indicator. All VIF values were well below the cut-off (<5).

As previously outlined, the PSSI was conceptualized as an index composed of five domains (or latent constructs) of student stress: academics, learning environment, campus culture, interpersonal, and personal stressors. As such, we specified these five latent variables in our MIMIC model and included the manifest variables age and gender (where 1 = female and 0 = male) as predictors for our structural model. Descriptive statistics for indicators and covariates are shown in Table 2. We estimated two models. In the first model, we excluded observations where a participant indicated that a stressor was “not applicable” or did not happen. This resulted in a model with 516 total observations. In the second model, we included those observations, resulting in a model with 11,965 total observations. Results of both models are displayed in Table 3.

Table 2. Descriptive statistics for indicators and covariates.

https://doi.org/10.1371/journal.pone.0278897.t002

Table 3. Multiple cause, multiple indicator (MIMIC) model estimates.

https://doi.org/10.1371/journal.pone.0278897.t003

In the first model, gender was significantly associated with all latent constructs, while age was significantly associated with all but the “learning environment” and “personal” domains. All items demonstrated strong factor loadings (>0.5). The model was statistically significant (χ2 = 4218, df = 1061, p<0.001) and demonstrated good overall fit. The root mean square error of approximation (RMSEA) and standardized root mean square residual (SRMR) were both <0.1, while the Comparative Fit Index (CFI) and Tucker-Lewis Index (TLI) fell below the desired cut-off of ≥0.90, at 0.81 and 0.80, respectively. In contrast, the second model revealed several low (and, in one case, negative) factor loadings and weaker overall fit statistics. The lower factor loadings were observed on items where a much larger proportion of respondents indicated that the stressor was “not applicable” or did not happen to them relative to the other items on the index. In this model, gender and age were significantly associated with all latent constructs, though this is to be expected given the large sample size. Similarly, though the model itself was found to be significant, the chi-square statistic is sensitive to sample size and thus not informative here. Ultimately, the results suggest that removing responses of “not applicable” or “did not happen” produces a stronger overall model.

Relations to other variables

To assess evidence for relations to other variables, we first specified a third MIMIC model including PSS10 and K10 scores as predicted variables (Table 4). Estimation of the model resulted in good overall fit and, consistent with our hypothesis, the paths between the PSSI latent variables and the selected indicators were positive and statistically significant.

Table 4. MIMIC model estimates exploring relations to other variables (n = 516).

https://doi.org/10.1371/journal.pone.0278897.t004

Next, we assessed the correlations between the PSSI and like instruments, using a non-parametric Spearman’s rho as the data were not normally distributed. Given that it would not be appropriate to sum participants’ responses on the PSSI, we created a “count” variable. The goal of the PSSI is not to derive a comprehensive severity “score” based on the stressors experienced, but rather to identify sources of stress. Logically, we would expect that those who experienced a greater number of stressors would in turn experience a higher level of overall stress and/or psychological distress, as measured by the PSS10 and K10. Thus, to develop the PSSI count variable, we dichotomized responses such that any response indicating stress in response to a stressor was coded as 1, and responses indicating that a stressor was “not applicable”, did not happen, or was not stressful were coded as 0. These dichotomized responses were then summed to produce the count. Essentially, this applies an equal weight of 1 to each stressor, a common method employed when working with health status indices [7]. To assess these relationships across different subgroups of students, we repeated the analysis across several split samples. Results are displayed in Table 5. As hypothesized, the analyses revealed moderate, positive correlations across all groups. All correlations were significant at p<0.001.
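The count construction and rank correlation described above can be sketched as follows. This is an illustrative implementation, not the authors' code: the function names are hypothetical, and the cut point (any severity rating of 2 or higher counts as an endorsed stressor, since 0 = "not applicable"/did not happen and 1 = "not stressful") reflects the coding rule stated in the text.

```python
import numpy as np

def pssi_count(severity):
    """severity: (n_students, 46) matrix of PSSI severity ratings 0-4.
    Each rating of 2+ ('somewhat' to 'extremely' stressful) contributes
    a weight of 1 to the student's count; 0 and 1 contribute nothing."""
    return (np.asarray(severity) >= 2).sum(axis=1)

def spearman_rho(x, y):
    """Spearman's rho: the Pearson correlation of rank-transformed data,
    with average ranks assigned within tied groups."""
    def ranks(v):
        v = np.asarray(v, dtype=float)
        r = np.empty_like(v)
        r[np.argsort(v)] = np.arange(1, len(v) + 1)
        for val in np.unique(v):          # average ranks over ties
            mask = v == val
            r[mask] = r[mask].mean()
        return r
    rx, ry = ranks(x), ranks(y)
    rx -= rx.mean()
    ry -= ry.mean()
    return (rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry))
```

In practice the resulting counts would be correlated against summed PSS10 and K10 scores within each subgroup, e.g. `spearman_rho(pssi_count(severity), pss10_scores)`.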

Table 5. Relations to other variables (Spearman’s rho correlations (rs)) by group.

https://doi.org/10.1371/journal.pone.0278897.t005

Discussion

This psychometric analysis followed the guidelines laid out by Diamantopoulos and Winklhofer regarding validation processes appropriate for an instrument developed under a formative measurement perspective (i.e., an index rather than a scale) [16, 17]. Notably, no reliability analyses or dimensionality tests, such as exploratory factor analysis, were performed on the PSSI, as unidimensionality and internal consistency are not relevant when evaluating the validity of an index [14]. The results of the analyses presented here echo the preliminary analyses previously published [6] and provide further support for the validity of the Post-Secondary Student Stressors Index (PSSI). Results demonstrate its utility among a large sample of students who varied by gender, age, level of study, course load (i.e., part-time vs. full-time), and more.

Unlike other tools that purport to evaluate the sources of student stress, the PSSI was co-developed with students, and has now been validated using a large sample of students attending university across Canada. The PSSI is a valuable tool for universities that are interested in identifying the presence and distribution of student stressors on their campuses. A more in-depth understanding of the sources of student stress, as well as their associated severity and frequency, is an important step towards improved tailoring of upstream campus mental health and wellbeing services. Although downstream campus services (e.g., psychotherapy, pharmacotherapy) continue to be overburdened, few post-secondary institutions have successfully taken a comprehensive or “whole campus approach” including both upstream and downstream mental health services, as is recommended by broad frameworks like the Okanagan Charter. Post-secondary institutions are increasingly encouraged to incorporate both physical and mental health into institutional culture and policies to promote an environment that enhances health and well-being, instead of focusing on downstream treatment only [21]. By improving understanding of the sources of student stress and areas most in need of support through the use of tools like the PSSI, institutions may be able to improve the effectiveness of upstream services, thereby contributing toward alleviating the bottleneck currently observed at the downstream service level.

Limitations

There are some limitations to this research. First, while the results demonstrate additional evidence for the validity of the tool among students attending schools across a variety of regions in Canada, the sample did not include participants from the province of Quebec. Additionally, the French version of the PSSI has not yet been formally tested. As a result, the findings presented here cannot be generalized to the francophone university student population in Canada. Second, the PSSI, though intended for use across various types of post-secondary institutions, has thus far only been validated for use among university student populations. Therefore, caution should be used when applying this tool among samples of post-secondary students attending colleges, institutes, etc. Finally, the data used for this study were collected during the 2020–2021 academic year, which was the first full academic year that took place during the global COVID-19 pandemic. As a result, it is likely that stress levels (both general and student-specific) were somewhat higher than we might usually expect. However, this would only be expected to impact the descriptive analyses, not the psychometric analyses.

Conclusion

The results of this research provide further evidence for the validity of the PSSI among varied university student populations in Canada and address several of the limitations identified in earlier preliminary psychometric work on the instrument. Future research will continue to explore the utility of the PSSI in identifying distributions of stressors among unique subgroups of interest (e.g., international students, first-generation students), as well as the continued validation of the instrument in different contexts (e.g., French-speaking students, college populations). Finally, we will also explore the development of a shortened version of the PSSI to improve its accessibility and its utility as an add-on to existing survey instruments.

Acknowledgments

BL would like to acknowledge and thank all the students who participated in this study, as well as each of the co-investigators at participating institutions who offered their time and commitment to this project during a challenging academic year.

References

  1. Wiens K, Bhattarai A, Dores A, Pedram P, Williams JVA, Bulloch AGM, et al. Mental health among Canadian postsecondary students: a mental health crisis? Can J Psychiatry. 2020;65(1):30–5. pmid:31939333
  2. Coordinating Committee for Vice-Presidents Students. White paper on postsecondary student mental health (Internet). Toronto, ON; 2015. Available from: http://www.campusmentalhealth.ca.
  3. Linden B, Boyes R, Stuart H. Cross-sectional trend analysis of the NCHA II survey data on Canadian post-secondary student mental health and wellbeing from 2013 to 2019. BMC Public Health. 2021;21(590):1–13. pmid:33765965
  4. Duffy A, Keown-Stoneman C, Goodday S, Horrocks J, Lowe M, King N, et al. Predictors of mental health and academic outcomes in first-year university students: identifying prevention and early-intervention targets. BJPsych Open. 2020;6(3):1–8. pmid:32381150
  5. Linden B, Boyes R, Stuart H. The Post-Secondary Student Stressors Index: proof of concept and implications for use. J Am Coll Health. 2020;1–9. pmid:32432984
  6. Linden B, Stuart H. Psychometric assessment of the Post-Secondary Student Stressors Index (PSSI). BMC Public Health. 2019;19(1139):1–12. pmid:31426767
  7. Streiner D, Norman G, Cairney J. Health measurement scales: a practical guide to their development and use. 5th ed. New York: Oxford University Press; 2015.
  8. Linden B. Cross-Canada release of the Post-Secondary Student Stressors Index (PSSI): protocol for a cross-sectional, repeated measures study. JMIR Res Protoc. 2021;10(8). pmid:34463632
  9. Cohen S, Kamarck T, Mermelstein R. A global measure of perceived stress. J Health Soc Behav. 1983;24:386–96. pmid:6668417
  10. Kessler R, Barker P, Epstein J, Gfroerer J, Hiripi E, Howes M, et al. Screening for serious mental illness in the general population. Arch Gen Psychiatry. 2003;60(2):184–9. pmid:12578436
  11. Smith KJ, Rosenberg DL, Haight GT. An assessment of the psychometric properties of the Perceived Stress Scale-10 (PSS10) with business and accounting students. Account Perspect. 2014;13(1):29–59.
  12. Lee E-H. Review of the psychometric evidence of the Perceived Stress Scale. Asian Nurs Res (Korean Soc Nurs Sci). 2012;6(4):121–7. pmid:25031113
  13. Stallman HM. Psychological distress in university students: a comparison with general population data. Aust Psychol. 2010;45(4):249–57.
  14. Streiner DL. Being inconsistent about consistency: when coefficient alpha does and doesn’t matter. J Pers Assess. 2003;80(3):217–22.
  15. Streiner DL. Starting at the beginning: an introduction to coefficient alpha and internal consistency. J Pers Assess. 2003;80(1):99–103. pmid:12584072
  16. Diamantopoulos A, Siguaw JA. Formative versus reflective indicators in organizational measure development: a comparison and empirical illustration. Br J Manag. 2006;17(4):263–82.
  17. Diamantopoulos A, Winklhofer HM. Index construction with formative indicators: an alternative to scale development. J Mark Res. 2001;38(2):269–77.
  18. Chang C, Gardiner J, Houang R, Yu YL. Comparing multiple statistical software for multiple-indicator, multiple-cause modeling: an application of gender disparity in adult cognitive functioning using MIDUS II dataset. BMC Med Res Methodol. 2020;20(1):1–14. pmid:33183226
  19. Craney TA, Surles JG. Model-dependent variance inflation factor cutoff values. Qual Eng. 2002;14(3):391–403.
  20. American Educational Research Association, American Psychological Association, National Council on Measurement in Education. Standards for educational and psychological testing. 2nd ed. Washington, DC: American Educational Research Association; 2014.
  21. Suárez-Reyes M, Muñoz Serrano M, Van Den Broucke S. How do universities implement the Health Promoting University concept? Health Promot Int. 2019;34(5):1014–24. pmid:30052965