
STEM undergraduates’ perspectives of instructor and university responses to the COVID-19 pandemic in Spring 2020

  • Sherry Pagoto ,

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Supervision, Writing – original draft, Writing – review & editing

    sherry.pagoto@uconn.edu

    Affiliation Department of Allied Health Sciences, University of Connecticut, Storrs, Connecticut, United States of America

  • Kathrine A. Lewis,

    Roles Conceptualization, Formal analysis, Writing – review & editing

    Affiliations Department of Psychology, The Pennsylvania State University, University Park, Pennsylvania, United States of America, Department of Women’s, Gender, and Sexuality Studies, The Pennsylvania State University, University Park, Pennsylvania, United States of America

  • Laurie Groshon,

    Roles Formal analysis, Resources, Writing – review & editing

    Affiliation Department of Allied Health Sciences, University of Connecticut, Storrs, Connecticut, United States of America

  • Lindsay Palmer,

    Roles Conceptualization, Data curation, Writing – review & editing

    Affiliations Department of Psychology, The Pennsylvania State University, University Park, Pennsylvania, United States of America, Department of Women’s, Gender, and Sexuality Studies, The Pennsylvania State University, University Park, Pennsylvania, United States of America

  • Molly E. Waring,

    Roles Conceptualization, Formal analysis, Funding acquisition, Investigation, Methodology, Supervision, Writing – review & editing

    Affiliation Department of Allied Health Sciences, University of Connecticut, Storrs, Connecticut, United States of America

  • Deja Workman,

    Roles Data curation, Investigation, Project administration, Writing – review & editing

    Affiliation Department of Mathematics, The Pennsylvania State University, University Park, Pennsylvania, United States of America

  • Nina De Luna,

    Roles Data curation, Investigation, Writing – review & editing

    Affiliation Department of Mathematics, The Pennsylvania State University, University Park, Pennsylvania, United States of America

  • Nathanial P. Brown

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Supervision, Writing – original draft, Writing – review & editing

    Affiliation Department of Mathematics, The Pennsylvania State University, University Park, Pennsylvania, United States of America

Abstract

Objectives

We examined undergraduate STEM students’ experiences during Spring 2020 when universities switched to remote instruction due to the COVID-19 pandemic. Specifically, we sought to understand actions by universities and instructors that students found effective or ineffective, as well as instructor behaviors that conveyed a sense of caring or not caring about their students’ success.

Methods

In July 2020 we conducted 16 focus groups with STEM undergraduate students enrolled in US colleges and universities (N = 59). Focus groups were stratified by gender, race/ethnicity, and socioeconomic status. Content analyses were performed using a data-driven inductive approach.

Results

Participants (N = 59; 51% female) were racially/ethnically diverse (76% race/ethnicity other than non-Hispanic white) and from 32 colleges and universities. The most common effective instructor strategies mentioned included hybrid instruction (35%) and use of multiple tools for learning and student engagement (27%). The most common ineffective strategies mentioned were increasing the course workload or difficulty level (18%) and use of pre-recorded lectures (15%). The most common behaviors cited as making students feel the instructor cared about their success were exhibiting leniency and/or flexibility regarding course policies or assessments (29%) and being responsive and accessible to students (25%). The most common behaviors cited as conveying the instructors did not care included poor communication skills (28%) and increasing the difficulty of the course (15%). University actions students found helpful included flexible policies (41%) and moving key services online (e.g., tutoring, counseling; 24%). Students felt universities should have created policies for faculty and departments to increase consistency (26%) and ensured communication strategies were honest, prompt, and transparent (23%).

Conclusions

To be prepared for future emergencies, universities should devise evidence-based policies for remote operations and all instructors should be trained in best practices for remote instruction. Research is needed to identify and ameliorate negative impacts of the pandemic on STEM education.

Introduction

On March 11, 2020, when the World Health Organization declared COVID-19 a pandemic [1], colleges and universities around the US swiftly made plans to close their campuses, send students home, and move to emergency remote instruction within a 1–2 week period. This period was rife with confusion and anxiety as many students had difficulty securing housing [2] and others were forced into living situations that were not conducive to remote instruction [3]. Some universities were criticized for how they handled the abrupt move [4]. Once courses went online, the degree of disruption intensified, with some courses more affected than others. Courses in the STEM fields (science, technology, engineering, and math) are inherently difficult to move online with no preparation or instructor training, given their reliance on laboratory experiences, group projects, and “chalk talks,” all of which present unique challenges and require specialized technologies to conduct remotely. We know little about STEM undergraduate students’ perceptions of how their universities and instructors handled remote instruction under these emergency circumstances. Such insights can inform institutions’ strategies for ensuring effective teaching and learning, since the COVID-19 pandemic is likely to continue to impact higher education well into the 2021–2022 academic year.

As universities shut down, instructors had little time to pivot to remote instruction, and their success largely hinged upon their prior knowledge and abilities, the availability of training at their institutions, and the time available to be trained. Although much research establishing best practices for remote instruction exists, it has not been well disseminated outside the fields of educational technology and instructional design [5]. A recent survey study of undergraduate students revealed a great deal of variability in course modalities used by instructors in Spring 2020: 65% of students surveyed reported recorded lectures, 60% reported live lectures, 55% reported pre-recorded video, and 25% reported breakout groups during a live class [6]. One study found that course workloads also changed during the shutdown, with about one-quarter of students reporting that their instructors decreased the workload while one-third reported that their instructors increased it [7]. Much variability was also reported in grading in Spring 2020, with 60% of students reporting that they were given a choice between a letter grade and pass/fail, 34% reporting no pass/fail option, and 6% reporting mandatory pass/fail [6]. The same study found that the proportion of students rating their course as somewhat or very satisfying dropped from 87% pre-shutdown to 59% post-shutdown, and only 17% said they were satisfied with how much they were learning after the shutdown. The pandemic’s disruption to higher education in the spring appears to have been substantial: 13% of students will delay graduation as a result, and 40% lost a job, internship, or job offer [8]. These numbers are sure to rise as the pandemic continues through 2020 and 2021.

The pandemic will affect at least 2 cohorts of undergraduate students in the US. Research on STEM students’ perceptions of the abrupt shift to remote instruction is now needed given the unique challenges associated with conducting STEM courses online and the potential impact on academic performance and retention, as STEM courses are already notorious for “weeding out” students [9]. Research is also needed on STEM students’ perspectives on how their instructors handled the move to remote instruction not only in terms of the tools and technologies that were used, but also in how much instructors conveyed their investment in student success under circumstances that were stressful both for students and instructors.

The variability in course modality, technologies used, instructor preparedness, quality of instruction, and grading in Spring 2020 points to a role for universities in developing policies that create a more cohesive approach to remote instruction, guided by the large body of research on online instruction. Given the unprecedented nature of the pandemic, it is unlikely that universities had policies for closing campus or implementing campus-wide remote instruction, and they had little time to create and enforce new policies. Institutional policies are now needed to guide a more seamless transition, but these policies need to be data-driven and informed by students. Students’ perspectives on universities’ responses, including what students felt their university did that was effective and ineffective, may be useful for informing policies and developing university playbooks so that emergency responses can be more evidence-based in the future. Such knowledge could be leveraged in the context of this pandemic, future pandemics, and other emergency situations that force school closures (e.g., wildfires, snowstorms) [10].

To address these needs, in virtual focus groups, we queried a diverse sample of STEM undergraduate students about their perspectives of how their universities and instructors handled campus closures and the move to remote instruction in Spring 2020. Specifically, we asked about strategies, tools, and technologies that universities and instructors used that were effective and ineffective. Then, we asked about instructor behavior that made them feel the instructor cared or did not care about their success. Given the exploratory nature of this work examining an unprecedented event in modern history, we put forth no specific hypotheses.

Methods

Focus group methodology was selected because very little prior knowledge was available on this topic. In July 2020 we recruited undergraduate students attending US colleges and universities to participate in a study about their experiences in Spring 2020 related to the abrupt transition to remote instruction. Participants completed an online survey [11] that included questions about their demographic characteristics and participated in focus groups conducted via video conferencing software.

Recruitment

We recruited students through faculty in the Math Alliance, a national organization of faculty in mathematics, by asking them to forward a recruitment email to their students and other faculty teaching courses in the calculus sequence. Recruitment ads were also posted on Reddit in subreddits that targeted student and/or STEM interests and through course listservs. Researchers targeted students within the calculus sequence as these courses are required of most STEM majors. Interested individuals completed a brief online survey to assess eligibility. Eligible students were aged 18 years or older, enrolled full-time in a college or university in the United States in Spring 2020, STEM majors, and met criteria for inclusion in one of 16 demographic-stratified focus groups. STEM majors included students who had a declared major within the natural sciences (e.g., biology, physics), engineering, mathematics, and technology but excluded students in the social sciences (e.g., economics, psychology). To create a diverse sample, eligible students were required to fit into one of 16 strata defined by gender, race/ethnicity, and SES (see Table 1). Students whose responses to questions about gender, race/ethnicity, or SES did not place them in one of these strata were excluded from participation. Participants reported their gender as woman, man, or with an ‘other’ write-in option; students who did not identify as women or men were excluded. Participants reported whether they identified their ethnicity as Hispanic/Latinx, and reported how they describe their race(s). Based on responses, we categorized participants as non-Hispanic Caucasian, Hispanic/Latinx (any race[s]), non-Hispanic Black, non-Hispanic Asian, or other race/ethnicity or multiracial; participants identifying as another race/ethnicity or non-Hispanic multiracial were excluded from participation.
For socioeconomic status, we categorized students as low SES if they: 1) were eligible for work study or Pell grants, 2) reported that it was hard for their family to pay for basics (e.g., rent, heat, food), and/or 3) reported that parental education was less than or equal to a high-school diploma or GED and a household income of less than $75,000. Students were categorized as higher SES if their parental education was a Bachelor’s degree or higher, they were not eligible for work study or a Pell Grant, and they reported that it was not hard at all or only somewhat hard to pay for basics. Students whose responses to questions about financial aid, parental education, household income, and financial strain did not place them in either of the above groups were excluded from the study.
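The SES screening rule above is essentially a small decision procedure. A minimal sketch in Python, assuming hypothetical field names and response labels (the exact survey response options, e.g. the upper end of the financial-strain scale, are not given in the text):

```python
def classify_ses(student):
    """Classify a screener respondent as 'low', 'higher', or None (excluded),
    following the paper's stated rules. Field names and response labels
    are illustrative assumptions, not the study's actual variable names."""
    aid_eligible = student["work_study_eligible"] or student["pell_eligible"]
    strain = student["pay_basics"]  # assumed scale: "not hard at all" ... "very hard"
    low_parent_edu = student["parent_education"] in (
        "less than high school", "high school", "GED")

    # Low SES: financial aid eligibility, financial strain, or
    # low parental education combined with household income < $75,000.
    if (aid_eligible
            or strain in ("hard", "very hard")
            or (low_parent_edu and student["household_income"] < 75_000)):
        return "low"

    # Higher SES: Bachelor's+ parental education, no aid eligibility,
    # and at most somewhat hard to pay for basics.
    if (student["parent_education"] in ("bachelors", "graduate")
            and not aid_eligible
            and strain in ("not hard at all", "somewhat hard")):
        return "higher"

    return None  # ambiguous responses were excluded from the study
```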

Table 1. Characteristics of undergraduate students majoring in STEM fields at US colleges/universities who participated in focus groups in summer 2020, n (%).

https://doi.org/10.1371/journal.pone.0256213.t001

Of the 416 respondents to the initial eligibility screener, we emailed the 154 who were eligible and recruited participants into gender-SES-race/ethnicity focus groups with a cap of 6 students per focus group. A total of 59 students from 34 institutions participated in one of 16 focus groups ranging in size from 2 to 6 students. Of the 34 institutions represented in the sample, 88% were public, 12% were private, and 38% were minority serving. Institutions had a mean of 30.14% of students on Pell grants (SD = 13.10). Participants received a $50 gift card for participating. The Penn State University (00014866) and University of Connecticut (L20-0060) Institutional Review Boards approved this study. Participant characteristics are shown in Table 1.
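The gender × race/ethnicity × SES design described above yields 2 × 4 × 2 = 16 strata, which can be enumerated directly (category labels follow those reported in the paper; the label formatting itself is illustrative):

```python
from itertools import product

# The two gender, four race/ethnicity, and two SES categories that
# defined eligibility for the 16 demographic-stratified focus groups.
genders = ["women", "men"]
races = ["non-Hispanic Caucasian", "Hispanic/Latinx",
         "non-Hispanic Black", "non-Hispanic Asian"]
ses_levels = ["low SES", "moderate/high SES"]

# One focus group was run per combination, e.g.
# "non-Hispanic Black women of moderate/high SES".
strata = [f"{race} {gender} of {ses}"
          for gender, race, ses in product(genders, races, ses_levels)]
assert len(strata) == 16
```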

Focus groups

Focus groups were stratified by gender (male, female), socioeconomic status (SES; low, moderate/high), and race/ethnicity (non-Hispanic Caucasian, Hispanic/Latinx, non-Hispanic Black, non-Hispanic Asian) so that participants did not experience discomfort expressing the challenges they experienced during emergency remote instruction due to being in a group with participants of differing social class, gender, or race. We conducted 16 focus groups, one for each gender × SES × race/ethnicity combination (e.g., non-Hispanic Black women of moderate/high SES). Thematic saturation was achieved with 16 focus groups. Focus groups lasted 60 minutes and were conducted by an investigator (NB, LP, KL, ND, or DW) who was paired with a note taker. To keep the focus groups as uniform as possible, the facilitator followed a script produced by the investigator team, which reflected multiple disciplines and career stages, including 3 professors (mathematics, clinical psychology, epidemiology), 2 graduate students (gender studies, social psychology), and 2 undergraduate STEM majors. Through discussion, the team developed a list of questions about the strategies instructors used and how instructors behaved towards students during the move to remote instruction, as well as what universities did and how they could have done better. The discussion drew upon each team member’s experiences and observations during this unprecedented historical event, and the goal was to produce responses that could inform tangible policy changes. No research was available at the time to inform the focus group script (S1 File). The final focus group script posed the following questions:

  1. What are some examples of strategies, tools, or technologies that your instructors used that you found to be very effective (in that they made it easier to learn) during remote instruction?
  2. What are some examples of strategies, tools, or technologies that your instructors used that you found to be ineffective (in that they did not help you learn) during remote instruction?
  3. What are some things your instructors did during remote instruction that made you feel like they cared about their students?
  4. What are some things your instructors did during remote instruction that made you feel like they did not care about their students?
  5. What are some things that your university did to help students be successful during remote instruction?
  6. What do you wish your university did to better help students be successful during remote instruction?

During the focus group, each participant was given a turn to respond to each question, but they were not specifically asked to react to each other’s responses. When a participant’s response was unclear, the facilitator probed for clarification. Participants were given the option of providing no response if they could not think of an answer to the question (e.g., if they found no professor strategies helpful they could say “none”).

Analysis

Transcription software was used to produce transcripts that were then reviewed and edited for accuracy by research assistants (S2 File). We summarized participant characteristics using descriptive statistics. We conducted a conventional content analysis using a data-driven inductive “framework” approach to coding the content into major and minor themes for each of the six focus group questions [12]. As a first step, a pair of investigators, including one who was present during the focus groups and one who was not, read through the transcripts (familiarization) to identify emerging themes (identifying a thematic framework). They then developed a codebook that was applied to all of the data (indexing). Once each coder finished independent coding, they met in pairs to resolve discrepancies as described elsewhere [13]. A third investigator was brought in for unresolved discrepancies. This process resulted in a total of 46 themes across 6 questions. Interrater agreement ranged from 81% to 100% and Cohen’s kappa statistics ranged from 0.202 to 1.0, with only 2 of the 46 themes having a kappa < .5. This exceeds the recommended standard of 80% agreement on 95% of codes [14].

Data management and descriptive analyses of the survey data were conducted using SAS 9.4 (SAS Institute, Inc, Cary, NC) while interrater reliability statistics were calculated with SPSS 26 (IBM, Armonk, NY).
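For readers wishing to reproduce the reliability metrics reported above, percent agreement and Cohen’s kappa for a pair of coders can be computed from first principles. A minimal sketch with hypothetical binary presence/absence codes for one theme (the study itself used SPSS; this is an illustrative equivalent):

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Share of excerpts on which the two coders assigned the same code."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: observed agreement corrected for chance agreement
    expected from each coder's marginal code frequencies."""
    n = len(coder_a)
    p_o = percent_agreement(coder_a, coder_b)
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes (1 = theme present) for 20 excerpts from two coders.
a = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0]
b = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0]
# Two disagreements out of 20: agreement 0.90, kappa ≈ 0.80.
```

High percent agreement with a low kappa (as for 2 of the 46 themes here) typically occurs when a code is rare, since chance agreement on the dominant category is then large.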

Results

Focus group participants (N = 59) had a median age of 19 years, 51% were women, 47% were low SES, and the sample was racially/ethnically diverse (Table 1). In Spring 2020, participants were enrolled in 29 US colleges/universities, including private colleges/universities (n = 4, 14% of schools), public universities (n = 23, 79%), and community colleges (n = 2, 7%). When surveyed in July 2020, students were living in 20 US states/territories including Puerto Rico, with two students living outside the US. The distributions of responses to each question by the 16 race/ethnicity, gender, and SES subgroups are depicted in Table 2. The representation of the 16 subgroups for each theme of each question is depicted in the S1 Table.

Table 2. Characteristics of focus groups and distribution of responses by subgroup.

https://doi.org/10.1371/journal.pone.0256213.t002

Effective instructor strategies, tools, and technologies

The 59 participants provided 93 responses to the question: “What are some examples of strategies, tools, or technologies that your instructors used that you found to be very effective (in that they made it easier to learn) during remote instruction?” Two major themes and four minor themes emerged in responses (Table 3). The most common (35%; n = 33) response referred to hybrid instruction, meaning the instructor’s use of both synchronous and asynchronous modalities. For instance, many participants preferred live remote lectures that were also recorded and posted, allowing students who experienced disruptions during the live portion to watch later at their convenience. The next most common theme (27%; n = 25) referred to instructors’ use of multiple tools to reach and engage students, such as discussion boards, study groups (e.g., in breakout rooms), or supplementary materials like making lecture notes or slides available to students. Specific technologies mentioned included communication platforms (e.g., Piazza, Discord, Nectir, Gauchospace, Slack, GroupMe, Blackboard Collaborate Ultra) and streaming/video conferencing platforms (e.g., Zoom, Boing, Microsoft Teams, Twitch). Minor themes included instructors’ communication strategies (15%; n = 14), such as quick and clear email responses to student inquiries; recorded lectures (12%; n = 11); leniency (5%; n = 5); and live lectures at the regularly scheduled class times (3%; n = 3). Finally, two responses (2%) indicated there was nothing the student found particularly effective.

Table 3. Effective instructor strategies, tools, and technologies (N = 59 participants; N = 93 responses).

https://doi.org/10.1371/journal.pone.0256213.t003

Ineffective instructor strategies, tools, and technologies

The 59 participants made a total of 68 responses to the question: “What are some examples of strategies, tools, or technologies that your instructors used that you found to be ineffective (in that they did not help you learn) during remote instruction?” About one-fifth of students (n = 14; 21%) had no examples of ineffective strategies, tools, and technologies to share. Otherwise, five major themes and one minor theme emerged (Table 4). The most common theme (n = 12; 18%) encompassed any strategy that served to increase the difficulty level or workload of the course relative to how the course was conducted pre-shutdown. Such measures included increasing the weighting of proctored exams to reduce the impact of cheating on final grades, adding new tasks such as participation on discussion boards, or changing exam and assignment formats in ways intended to deter cheating that also increased the difficulty level for students (e.g., open-ended responses instead of multiple choice, or replacing exams with time-intensive projects). The next most common theme (n = 10; 15%) was instructors using pre-recorded lectures. Students said that pre-recorded lectures used in place of live lectures made it harder to feel motivated to attend and harder to learn from, given the lack of opportunity to ask questions in the moment. An equally prevalent theme (n = 10; 15%) was instructors’ lack of a communication strategy, tool, or technology, with examples including no formal ways to interface with the instructor and poor email responses. Another major theme (n = 9; 13%) was the use of technologies that had frequent technical difficulties, were inefficient, required bandwidth or computer storage that not all students had, or required more training than was provided. Examples included test-taking platforms whose technical difficulties interfered with test performance and/or the time available to take the test, dissemination of large files that students did not have the ability to download, use of breakout rooms with little guidance on how students should use the time, and online tests that did not allow students to skip difficult questions and return to them later. The next theme (n = 8; 12%) related to instructors’ approaches to lecturing, including the use of long-form recorded or live remote lectures, posting outdated lecture recordings, and use of poor-quality recordings. One minor theme was instructors requiring attendance at live lectures (n = 4; 6%). One response was coded as “other” because it did not fit into any of these categories.

Table 4. Ineffective instructor strategies, tools, and technologies (N = 59; N = 68 responses).

https://doi.org/10.1371/journal.pone.0256213.t004

Instructor caring behavior

The 59 participants made a total of 87 responses to the question: “What are some things your instructors did during remote instruction that made you feel like they cared about their students?” Two major themes and four minor themes emerged (Table 5). The most common response, accounting for 29% (n = 25) of responses, related to instructor leniency and/or flexibility with respect to course policies or assessments. Examples included use of pass/fail or other flexible grading policies, allowing homework to be turned in past the deadline, and allotting more time for assignments. The next most common theme (25%; n = 22) was instructor responsiveness and accessibility to students. Examples included prompt replies to student emails, flexible office hours, and frequent engagement on discussion boards. About 13% (n = 11) of responses related to instructors bonding with the class, such as through words of encouragement or simply spending time acknowledging the pandemic and asking how students were coping. Similar to this theme, which refers to instructors’ interactions with the entire class, another 13% (n = 11) of responses related to the instructor offering one-on-one opportunities to check in with students and provide emotional support. The next theme, comprising 9% (n = 8) of responses, related to any sign the instructor put in effort to ensure the class was a success, including learning and using new technologies, expressing enthusiasm, or generally seeming to put in significant effort to maintain high-quality instruction. About 7% (n = 6) of responses related to instructors seeking student feedback on how the course was going. The remaining 5% (n = 4) of responses suggested the student could not think of any examples of caring instructor behaviors.

Table 5. Actions that made students feel their instructor cares (N = 59; N = 87 responses).

https://doi.org/10.1371/journal.pone.0256213.t005

Instructor uncaring behavior

The 59 participants made a total of 72 responses to the question: “What are some things your instructors did during remote instruction that made you feel like they did not care about their students?” One-quarter of responses (25%; n = 18) indicated the student had no examples of uncaring behavior by instructors to share. Otherwise, two major themes and three minor themes emerged (Table 6). The most common response (28%; n = 20) referred to poor communication, including unanswered emails, lack of empathy conveyed in communications, and scolding the class for underperformance. The next most common response (15%; n = 11) referred to instructors increasing the difficulty of the course, including making exams harder to offset the assumed impact of cheating, assigning more work, or grading harder. Minor themes included instructors being unprepared or disorganized (13%; n = 9), such as putting minimal effort into the online format, posting assignments at the last minute, and frequently changing requirements, rules, and standards. Another minor theme (11%; n = 8) was inflexibility, which included making no accommodations for students such as international students, students with unreliable technology/internet access, students in different time zones, or students with special educational needs. The final minor theme was insufficient instruction or guidance (8%; n = 6). Examples included instructors posting assignments and lectures and leaving students to work on their own with no guidance, minimal or no opportunities to engage with the instructor, and minimal instruction provided for assignments.

Table 6. Actions that made students feel their instructor doesn’t care (N = 59 participants, N = 72 responses).

https://doi.org/10.1371/journal.pone.0256213.t006

What universities did well

The 59 participants made a total of 116 responses to the question: “What are some things that your university did to help students be successful during remote instruction?” Two major themes and five minor themes emerged (Table 7). The most common response (41%; n = 48) was classified as administrative flexibility, examples of which included university-wide pass/fail policies, waiving limits on counseling sessions, and extending administrative deadlines. The next most common response (24%; n = 28) was provision of remote services such as tutoring, counseling, and advising. One minor theme was agile response (9%; n = 10), which referred to how quickly and smoothly university services changed to meet student needs. Other minor themes included seeking student input during the process via town halls, surveys, and direct email solicitations (6%; n = 7); provision of technology such as wifi hotspots or laptops (4%; n = 5); effective communication strategies (4%; n = 5); and financial assistance in the form of fee waivers and refunds (3%; n = 4). A small percentage of responses (8%; n = 9) suggested the student could not think of any examples of what the university did well.

Table 7. University actions which helped students be successful in Spring 2020 (N = 59 participants, N = 116 responses).

https://doi.org/10.1371/journal.pone.0256213.t007

What universities could have done better

The 59 participants made a total of 69 responses to the question: “What do you wish your university did to better help students be successful during remote instruction?” Three major themes and six minor themes emerged (Table 8). The most common response (26%; n = 18) was that students wanted the university to create policies for faculty and departments to increase consistency in how courses were carried out in terms of grading and teaching modalities. The second most common response (23%; n = 16) was a university communication strategy that was honest, prompt, clear, and transparent. The next most common response (10%; n = 7) was to improve the responsiveness and support provided by university offices and services (e.g., financial aid, counseling, tutoring). Minor themes included engaging student input in the crisis response (6%; n = 4), providing technology (4%; n = 3), providing specific accommodations for international students (4%; n = 3), refunding fees for services not rendered (3%; n = 2), providing opportunities for students to interact with one another (3%; n = 2), and creating more effective policies to prevent cheating (3%; n = 2). The remaining responses (17%; n = 12) suggested the student could not think of examples of things the university could have done better.

Table 8. Things students wish their university had done better (N = 59 participants, N = 69 responses).

https://doi.org/10.1371/journal.pone.0256213.t008

Discussion

The results of the present study revealed that students generally preferred hybrid instruction and the use of multiple complementary resources and technologies, while reliance on pre-recorded lectures, poorly functioning technology, and insufficient opportunities to communicate with the instructor interfered with their ability to learn course material. The preference for live over pre-recorded lectures is consistent with a recent study in which 80% of undergraduate students said video chat and live streaming made remote instruction better and 72% said the ability to connect with the instructor and other students over live video was important [15]. Interestingly, some students valued recorded lectures for the flexibility they provide, which may be particularly important for juggling school, work, and home life. However, when recordings were offered in lieu of live class time, or were lengthy, low quality, or clearly recycled from a previous semester, they were often found to be insufficient. Even though the convenience of recorded lectures was mentioned, many students voiced that it was harder to get motivated and pay attention when classes were reduced to viewing recordings. More work is needed to examine the impact of live versus recorded lectures during the pandemic on attendance/views and student engagement, as well as on academic performance and intentions to stay in a STEM major [16, 17].

Results also revealed university actions that students perceived as beneficial, including lenient grading policies (e.g., pass/fail), a quick and smooth response to the emergency, and remote delivery of campus services. Many students reported experiencing too much variability in policies and practices across faculty and departments, which for them signaled a need for the university to step in and create campus-wide policies. They felt this would not only help them stay focused and organized but also prevent particularly egregious instructor practices, such as being largely inaccessible to students, conducting an entire course by simply posting lecture notes, or increasing the difficulty level and/or workload of courses. Research on student performance across multiple courses under conditions of varying modalities may be needed. Another area for improvement was the response of university services (e.g., financial aid) during campus closures, which students often described as slow and inefficient. Research suggests that for remote instruction to be effective it must be accompanied by an educational ecosystem that supports the remote learner [5]. Unfortunately, the abrupt move of campus employees to work-from-home appeared to at least temporarily affect their ability to meet students' needs during the pandemic. Campus employees were likely unaccustomed to working from home, and many were also likely caring for children who were schooling from home while attempting to do their jobs [18]. Provision of safe and affordable childcare options for campus employees has been called for in general [19], and the pandemic has underscored this need. Universities may also need to develop protocols for an online ecosystem that gives students efficient access to services and resources when the campus is closed in an emergency.
Further, some services that moved online, such as telehealth student counseling, should be continued post-pandemic to increase reach (e.g., to commuter and nontraditional students) and reduce disruptions in care (e.g., over winter break).

A cross-cutting theme, emerging in responses to every question we asked about both instructors and universities, was communication. Students desired communication from both instructors and universities that was responsive, transparent, timely, and empathetic. Consistent with prior research [20], when these qualities were conveyed by instructors and/or universities, students felt more aware of what was expected of them and more valued and respected. In Fall 2020, some universities were called out for harsh messaging that shamed and blamed students for COVID-19 spread on campuses [21]. Our findings suggest such an approach to communication is not likely to be well received by students. Interestingly, a survey of undergraduate students in Fall 2020 found that 37% said their opinion of their university declined during the semester [15], suggesting that some students were not satisfied with how their universities handled the pandemic.

Research has demonstrated that student-instructor and student-student interactions can promote student achievement [22], perceived learning, and student satisfaction during remote instruction [23]. Our findings on communication suggest that instructors need to create more opportunities to engage with students during remote instruction and to frequently evaluate whether their engagement plan meets students' needs. Students mentioned effective engagement strategies, many of which are supported by prior research [24], including the use of formal communication platforms, quick email response times, remote office hours, offers to meet one-on-one, and live lectures that allow students to ask questions and hear other students' questions answered. When live lectures are not offered, the onus is on instructors to provide alternative ways to engage with students, while being mindful that voluntary formats such as virtual office hours may not be sufficient to accommodate large class sizes and may be underutilized by students who are shy, have erratic schedules, or do not wish their home environment to be seen on video.

Some students mentioned appreciating the use of novel communication platforms (e.g., Piazza) that allowed them to ask questions at any time (without having to send emails), post questions anonymously, earn incentives for answering each other's questions, organize conversations into searchable threads, and access everything via a mobile app. Reliance on email to communicate with students is proving insufficient, as evidenced by a study that found that students who were provided novel tools to communicate with instructors and other students in Fall 2020 rated their motivation and engagement with learning outside of class significantly higher [15]. The importance of student engagement is underscored by emerging research showing that undergraduate students continue to feel inadequately engaged by their instructors. A survey of 3,412 undergraduate students in the middle of the Fall 2020 semester found that only 40% agreed that their remote instruction experience was engaging during class time and only 32% agreed they were being adequately engaged outside class time [15]. The same study showed that 85% of students felt that instructors should foster a sense of community among the students in online courses. This signals the need for better implementation and dissemination of best practices for engaging students during online instruction [25]. Research is needed on how novel technology-based communication platforms influence both instructor-student and student-student engagement and, ultimately, student motivation and academic performance. Effective remote communication strategies should also be continued post-pandemic to give students more flexible options.

Two themes emerged across questions that we suspect are related. Students emphasized the need for leniency in grading, assignments, and expectations, while also expressing concern about increased difficulty level and/or workload of courses. Although the stress and disruption of the pandemic likely contributed to students' pleas for leniency, instructors should consider that these pleas may also result from increases in difficulty level and course workload that occurred intentionally or unintentionally. Consistent with our findings, a survey of 148 undergraduates found that nearly one-third reported one or more of their instructors had increased the workload in Spring 2020 [26]. Some practices may have created more work for students than instructors realized. For example, students reported that, to avoid online cheating, some instructors replaced exams with class projects that ended up requiring more time than students would have spent studying for an exam. They also reported that instructors increased the difficulty level of courses to offset the impact of online cheating, by replacing multiple-choice exams with free-response exams without allocating sufficient time to complete them, or by simply making the grading curve more stringent. In addition to increased difficulty, students reported numerous other challenges associated with anti-cheating software, including having to read and follow elaborate instructions on how to take the test, anxiety and discomfort at being watched through one's computer during an exam, and being accused of cheating when internet connection issues disrupted the exam [27]. Another example students gave of increased workload was instructors posting lecture videos that exceeded the usual lecture time.
This may have occurred because, unlike in-person lectures, which have a hard stop time, nothing stops an instructor from running over when recording a video. Even when video lectures matched the length of in-person lectures, students said they took longer to digest because watching videos was more cognitively taxing than attending live lectures where interaction occurs. Class interaction breaks up the monotony of a lecture and allows students to ask the instructor to clarify concepts, get questions answered in the moment, and hear answers to other students' questions, all of which can facilitate learning [28, 29]. Some students said they needed to take frequent breaks when watching lecture videos, rewind and re-listen to parts they did not absorb, and work much harder to stay focused, all of which pushed the time spent on a lecture beyond what they would have spent in an in-person lecture. Interestingly, their experience was that this extra time resulted not in better learning than in-person classes but in worse learning. Breaking video lectures into short units, recording live lectures so that class interaction is captured in the video, and allowing students to interact while watching may be ways to enhance the experience of watching class by video. Students also said workload was negatively affected when instructors posted lecture videos at the end of the week or at erratic times rather than at class time, which made it difficult for them to manage their time. Interestingly, the problems students cited could often be remedied by changing practices and/or leveraging available technologies. Inadequate instructor practices may be driving the common sentiment that remote learning is generally worse than in-person learning. Indeed, a recent survey of undergraduate students found that 68% said remote instruction is less effective than in-person instruction [15].
This sentiment runs counter to research on remote instruction outside the context of emergencies, which shows positive outcomes on student performance [30–33] and no differences in student satisfaction relative to in-person instruction [34]; however, as discussed elsewhere, it is doubtful that best practices for remote instruction were implemented widely in Spring and Fall 2020 [5].

The present study has some limitations. Though participants were diverse in terms of gender, SES, and race/ethnicity, our sample was too small to compare results by these demographic factors and did not represent the entire range of gender and racial/ethnic diversity in the US. Further, students were asked about their experiences in Spring 2020, which may not generalize to Fall 2020, when instructors had more time to prepare. This work was an initial step conducted to inform survey questions for a larger survey study that assessed students' experiences in Fall 2020 and disproportionate impacts of the pandemic on STEM education by race/ethnicity, gender, and SES.

Our findings revealed that many students felt universities and instructors lacked a cohesive strategy for emergency remote instruction. To be sure, few anticipated the circumstances of 2020, and students, instructors, and administrators were all under enormous stress. However, a robust literature on online instruction exists [35, 36], and evidence points to the increasing likelihood that pandemics and other natural disasters will occur in the future [37, 38]. Done well, remote instruction can actually increase STEM participation and diversity, which signals the urgent need for broad adoption of best practices [39]. Going forward, universities must require that all instructors be proficient in remote instruction. This entails providing training in remote instruction that is consistent with best practices and adopting policies that incentivize faculty to gain proficiency in these skills (e.g., embedding them in promotion and tenure criteria and teaching awards). This would not only prepare faculty for emergency situations like the pandemic but, even more importantly, position universities to develop remote learning programs designed to increase diversity in STEM. Remote learning programs have been developed at some land-grant universities for precisely this purpose: bringing the STEM curriculum to diverse students rather than taking the usual approach of attempting to recruit diverse students to the often rurally located land-grant universities [39]. Studies have shown this model to be successful in increasing racial/ethnic diversity [39] and gender diversity [40]. Relatedly, remote student recruitment strategies (e.g., virtual tours) used during the pandemic to showcase the university's offerings to prospective students and their families should continue post-pandemic to attract students who may not be able to afford to travel for campus tours.
The pandemic, by accelerating the development of universities' and instructors' capacity to deliver remote education and services, provides a unique foundation on which universities can build as they reimagine their approaches to increasing the diversity of their student bodies.

Universities should also develop protocols for campus closure and remote instruction that incorporate 1) data on what worked well and what did not in Spring and Fall 2020, 2) the vast body of research on best practices in remote instruction [36], and 3) iterative input from their student bodies, being sure to include diverse voices. University-level policies should address course modalities, grading policies, student engagement strategies, and faculty training requirements. For example, universities should consider prohibiting course modalities that are proving unacceptable and/or ineffective, mandating instructor training in remote instruction tools and effective student engagement strategies, and examining the relationship between course modalities and student course evaluations to identify modalities that are not working well or are being executed poorly.

Similarly, instructors should produce remote instruction protocols for their courses that are informed by best practices identified in the remote instruction literature. Effective remote instruction requires a design process that takes into account myriad factors, including instructor-student ratio, modality, synchrony, and pedagogical style (e.g., exploratory, collaborative), among others [5]. To identify instructor training gaps, research is needed to examine how instructors approached remote instruction in Spring and Fall 2020 and to what extent it reflected best practices. Finally, given the lack of implementation and dissemination of evidence-based practices for remote instruction thus far, effective strategies should be shared in venues that reach the academic community. A great example of such a venue is the Facebook group Pandemic Pedagogy, which emerged shortly after the pandemic began and currently has 32.7K members, serving as a forum for faculty to share their experiences with remote instruction.

A plan is now needed to address the potential negative consequences of the pandemic for STEM education. Universities urgently need to devise strategies to 1) identify and assist students whose academic performance has declined, 2) follow up with students who have switched out of a STEM major since Spring 2020, and 3) re-engage students who have unenrolled temporarily or permanently as a result of the pandemic [41]. A generation of college students is at risk for long-term impacts of the pandemic on their educational and economic potential. Given the disproportionate impact of COVID-19 on the very racial/ethnic groups that are underrepresented in STEM fields [42–44], the pandemic may cause another gaping leak in the STEM pipeline. Finally, the pandemic has presented STEM education with an enormous opportunity to innovate by leveraging the new skill sets instructors and university services have developed over the past year. Every crisis brings opportunities for growth. We must now accelerate the implementation and dissemination of best practices for online STEM education with the goal of increasing diversity in STEM, while also identifying and ameliorating the negative impacts of the pandemic on STEM undergraduate students.

Supporting information

S1 Table. Representation of subgroups of STEM undergraduate students within each theme for each item.

https://doi.org/10.1371/journal.pone.0256213.s001

(DOCX)

References

  1. World Health Organization. WHO Director-General's opening remarks at the media briefing on COVID-19—11 March 2020. 2020 [cited 2020 Nov 23]. Available from: https://www.who.int/director-general/speeches/detail/who-director-general-s-opening-remarks-at-the-media-briefing-on-covid-19—11-march-2020
  2. Su A. Students Scramble to Find Last-Minute Housing After Being Displaced By Coronavirus Measures. 2020 [cited 2020 Nov 23]. Available from: https://www.thecrimson.com/article/2020/3/12/housing-scramble-coronavirus/.
  3. Levin D. No Home, No Wi-Fi: Pandemic Adds to Strain on Poor College Students. 2020 [cited 2020 Dec 11]. Available from: https://www.nytimes.com/2020/10/12/us/covid-poor-college-students.html.
  4. NPR. Colleges Face Student Lawsuits Seeking Refunds After Coronavirus Closures. 2020 [cited 2020 Nov 23]. Available from: https://www.npr.org/2020/05/29/863804342/colleges-face-student-lawsuits-seeking-refunds-after-coronavirus-closures.
  5. Hodges C, et al. The Difference Between Emergency Remote Teaching and Online Learning. 2020 [cited 2020 Dec 11]. Available from: https://er.educause.edu/articles/2020/3/the-difference-between-emergency-remote-teaching-and-online-learning#fn8.
  6. Means B, Neisler J, Langer Research Associates. Suddenly Online: A National Survey of Undergraduates During the COVID-19 Pandemic. San Mateo, CA: Digital Promise; 2020.
  7. Eduljee N, Croteau K. College Student Transition to Synchronous Virtual Classes during the COVID-19 Pandemic in Northeastern United States. Pedagogical Research. 2020;5(4).
  8. Aucejo EM, et al. The impact of COVID-19 on student experiences and expectations: Evidence from a survey. J Public Econ. 2020;191:104271. pmid:32873994
  9. Weston TJ, et al. Weed-Out Classes and Their Consequences. In: Seymour E, Hunter A-B, editors. Talking about Leaving Revisited: Persistence, Relocation, and Loss in Undergraduate STEM Education. Cham: Springer International Publishing; 2019. p. 197–243.
  10. Sheehan MC, Fox MA. Early Warnings: The Lessons of COVID-19 for Public Health Climate Preparedness. Int J Health Serv. 2020;50(3):264–270. pmid:32517569
  11. Harris PA, et al. Research electronic data capture (REDCap)—A metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42(2):377–381. pmid:18929686
  12. Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277–88. pmid:16204405
  13. O'Connor C, Joffe H. Intercoder Reliability in Qualitative Research: Debates and Practical Guidelines. International Journal of Qualitative Methods. 2020;19:1609406919899220.
  14. Miles MB, Huberman AM. Qualitative Data Analysis: An Expanded Sourcebook. 2nd ed. Thousand Oaks, CA: Sage Publications; 1994.
  15. Top Hat. Higher Ed Students Grade the Fall 2020 Semester. Top Hat Field Report, Fall 2020 Edition. 2020. Available from: https://tophat.com/teaching-resources/interactive/student-survey-report/.
  16. Chi M, Wylie R. The ICAP Framework: Linking Cognitive Engagement to Active Learning Outcomes. Educational Psychologist. 2014;49(4):219–243.
  17. Wiggins BL, et al. The ICAP Active Learning Framework Predicts the Learning Gains Observed in Intensely Active Classroom Experiences. AERA Open. 2017;3(2):2332858417708567.
  18. Krukowski RA, Jagsi R, Cardel MI. Academic Productivity Differences by Gender and Child Age in Science, Technology, Engineering, Mathematics, and Medicine Faculty During the COVID-19 Pandemic. J Womens Health (Larchmt). 2020.
  19. Carson J, Mattingly M. COVID-19 Didn't Create a Child Care Crisis, But Hastened and Inflamed It. 2020 [cited 2020 Nov 23]. Available from: https://carsey.unh.edu/publication/child-care-crisis-COVID-19.
  20. Sitzman K, Leners DW. Student Perceptions of CARING in Online Baccalaureate Education. Nursing Education Perspectives. 2006;27(5). pmid:17036683
  21. Marcus J, Baral S, and More Than 100 Other Scholars. An Open Letter to University Leadership. Washington, DC: Inside Higher Ed; 2020.
  22. Bernard RM, et al. A Meta-Analysis of Three Types of Interaction Treatments in Distance Education. Review of Educational Research. 2009;79(3):1243–1289.
  23. Alqurashi E. Predicting student satisfaction and perceived learning within online learning environments. Distance Education. 2019;40(1):133–148.
  24. Martin F, Wang C, Sadaf A. Student perception of helpfulness of facilitation strategies that enhance instructor presence, connectedness, engagement and learning in online courses. The Internet and Higher Education. 2018;37:52–65.
  25. Meyer KA. Student Engagement in Online Learning: What Works and Why. ASHE Higher Education Report. 2014;40(6):1–114.
  26. Murphy L, Eduljee N, Croteau K. College Student Transition to Synchronous Virtual Classes during the COVID-19 Pandemic in Northeastern United States. Pedagogical Research. 2020;5:em0078.
  27. The Washington Post. Cheating-detection companies made millions during the pandemic. Now students are fighting back. 2020 [cited 2020 Dec 11]. Available from: https://www.washingtonpost.com/technology/2020/11/12/test-monitoring-student-revolt/.
  28. Flaherty C. The Power of Peer Interaction. Inside Higher Ed. 2020 [cited 2020 Nov 23]. Available from: https://www.insidehighered.com/digital-learning/article/2020/11/03/power-active-learning-during-remote-instruction.
  29. Flaherty C. Zoom Boom. Inside Higher Ed. 2020 [cited 2020 Nov 23]. Available from: https://www.insidehighered.com/news/2020/04/29/synchronous-instruction-hot-right-now-it-sustainable.
  30. Dell CA, Low C, Wilker JF. Comparing student achievement in online and face-to-face class formats. Journal of Online Learning and Teaching. 2010;6(1):30–42.
  31. Weber JM, Lennon R. Multi-course comparison of traditional versus Web-based course delivery systems. Journal of Educators Online. 2007;4(2).
  32. Warren LL, Holloman HL Jr. On-line instruction: Are the outcomes the same? Journal of Instructional Psychology. 2005;32(2):148.
  33. Lack KA. Current Status of Research on Online Learning in Postsecondary Education. Ithaka S+R; 2013.
  34. Allen M, et al. Comparing Student Satisfaction With Distance Education to Traditional Classrooms in Higher Education: A Meta-Analysis. American Journal of Distance Education. 2002;16(2):83–97.
  35. Chen B, Bastedo K, Howard W. Exploring Best Practices for Online STEM Courses: Active Learning, Interaction & Assessment Design. Online Learning. 2018;22(2).
  36. Means B, Bakia M, Murphy R. Learning Online: What Research Tells Us About Whether, When and How. New York, NY: Routledge; 2014.
  37. Curseu D, et al. Potential Impact of Climate Change on Pandemic Influenza Risk. Global Warming: Engineering Solutions. 2009:643–657.
  38. Tollefson J. Why deforestation and extinctions make pandemics more likely. Nature. 2020;584(7820):175–176. pmid:32770149
  39. Drew JC, et al. Development of a Distance Education Program by a Land-Grant University Augments the 2-Year to 4-Year STEM Pipeline and Increases Diversity in STEM. PLOS ONE. 2015;10(4):e0119548. pmid:25875606
  40. Herman C, et al. Using a blended learning approach to support women returning to STEM. Open Learning: The Journal of Open, Distance and e-Learning. 2019;34(1):40–60.
  41. Johnson E. As Students Flock to Gap-Year Programs, College Enrollments Could Suffer. 2020. Available from: https://www-chronicle-com.ezproxy.lib.uconn.edu/article/as-students-flock-to-gap-year-programs-college-enrollments-could-suffer.
  42. Kirby T. Evidence mounts on the disproportionate effect of COVID-19 on ethnic minorities. The Lancet Respiratory Medicine. 2020;8(6):547–548. pmid:32401711
  43. Tai DBG, et al. The Disproportionate Impact of COVID-19 on Racial and Ethnic Minorities in the United States. Clinical Infectious Diseases. 2020.
  44. Fairlie R, Couch K, Xu H. The Impacts of COVID-19 on Minority Unemployment: First Evidence from April 2020 CPS Microdata. National Bureau of Economic Research Working Paper Series. 2020;27246.