Abstract
Research has clearly demonstrated that some study strategies (for example, self-testing and spaced studying) are effective, yet students often report studying ineffectively. Our focus with the current study is to update and extend the current literature on how college students study. We surveyed 484 introductory psychology students at a small liberal arts college—a different type of school from prior studies. Our survey built on an existing study strategies questionnaire used to assess a variety of student study behaviors and beliefs. Additionally, we asked new questions about multitasking and study scheduling. Overall, we found that the current sample reported studying in similar ways to what past research suggested; students used both effective and ineffective strategies, some of which correlated with grade point average (GPA). However, some differences emerged. For example, our students were more likely to report learning how to study from a teacher. Additionally, a majority of students believed that multitasking was ineffective, yet most reported multitasking while studying. Finally, an important, but exploratory, analysis demonstrated that study strategies were similar before and after COVID-19 forced classroom changes. We highlight the need for future research on study strategies to recruit participants from more diverse institutions.
Citation: Rinella HL, Putnam AL (2022) The study strategies of small liberal arts college students before and after COVID-19. PLoS ONE 17(12): e0278666. https://doi.org/10.1371/journal.pone.0278666
Editor: Simone Battaglia, University of Bologna, Italy
Received: July 19, 2022; Accepted: November 21, 2022; Published: December 8, 2022
Copyright: © 2022 Rinella, Putnam. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: The data are posted on the Open Science Framework: OSF.IO/JMXH2.
Funding: The authors received no specific funding for this work.
Competing interests: The authors have declared that no competing interests exist.
Introduction
Effective studying is key to success in school, but doing so may be easier said than done. Most students struggle to juggle classes, extracurricular activities, and their social lives, and this juggling can lead them to study in ineffective ways, such as putting off studying to the last minute [1,2]. Although psychology research has provided clear evidence-based recommendations about how students should study, it remains unclear how often students follow these recommendations or are even aware of them [3,4]. The goal of the current study was to examine students' self-reported study behaviors, knowledge about multitasking, and whether students study differently as a result of the COVID-19 pandemic.
Which study strategies are effective?
A wealth of empirical research in laboratories and classrooms over the last 20 years has clearly demonstrated that not all study strategies are equally effective (for reviews see [5–7]). On one hand, retrieval practice and spaced studying are two highly effective study strategies that help learners of all ages and abilities to build robust, flexible, and long-lasting knowledge in many domains [5,8–10]. On the other hand, two extremely popular study strategies—highlighting and rereading—have continually been demonstrated to have small or no positive effects on learning [5,11], and may even be negatively correlated with grade point average (GPA; [12]).
Not only have psychological scientists published this research in academic journals, but they have also broadcasted these findings to wider audiences. Tips about effective studying have been presented in books, YouTube channels, websites, and even podcasts, catered to both teachers and students [13–17]. Given both the scientific consensus identifying effective study habits and extensive outreach efforts, one might assume that students and teachers should have a good understanding of how to study, but survey research suggests otherwise.
How do students actually study?
Research exploring what students know about effective studying has relied on two methods: (1) examining metacognitive judgments or study behavior in laboratory experiments and (2) surveying students about study habits. Both approaches suggest that students frequently make poor study choices.
Laboratory studies have consistently shown that students do not recognize the benefits of spacing and self-testing. Students typically report feeling like they learned less from using these effective strategies compared to massed study and rereading, and when given a choice, they often choose easier and less effective strategies [8,18,19]. This pattern is thought to occur largely because of fluency illusions: rereading and cramming lead to increased feelings of fluency, which learners interpret as mastery over the to-be-learned materials. In contrast, strategies requiring more upfront effort often lead to higher long-term retention, a concept known as desirable difficulties [20–22]. Students apparently do not learn about the effectiveness of their study strategies from experience, likely because they interpret the effort required by desirable strategies as an indication of their ineffectiveness [23].
Similarly, survey research indicates that students study ineffectively [3,11,19,24–28]. A recent meta-analysis revealed that rereading is the most frequently reported study strategy (endorsed by 78% of students), and that many students also endorse using outlines, flashcards, highlighting, and note-taking [6]. In a comprehensive review of the literature, Dunlosky and colleagues [5] rated rereading and highlighting as having low utility, outlines and note-taking as having medium utility, and flashcards, assuming they are used for retrieval practice, as having high utility, suggesting that the most popular study strategies tend to produce low learning gains. Even when students use self-testing, such as with flashcards, they may not be doing so in an effort to enhance learning: several studies have demonstrated that students primarily use self-testing as a tool to gauge how much they have learned [3,25,27].
One past survey study was particularly revealing, as the researchers measured study behaviors and how those behaviors correlated with GPA [3]. High achieving students (with GPAs over 3.6) reported using self-testing as a study strategy, whereas lower achieving students used self-testing to determine what they know. Furthermore, students with low GPAs typically chose to study whatever was due the soonest, instead of planning or spacing their studying, and tended to study later in the day. The results of Hartwig and Dunlosky [3] suggest that most students are not studying effectively, and that using ineffective study strategies is negatively correlated with GPA (for a replication, see [29]).
How does multitasking affect learning?
One issue that has not yet been extensively explored in surveys of student study strategies is multitasking [30]. Multitasking has been extensively studied in the lab (often in the form of cognitive control or task switching), and findings generally suggest that doing more than one thing at a time leads to poorer performance on all tasks (for a review see [31]). The impact of multitasking in educational contexts has increased with the proliferation of smartphones. Between 2012 and 2021, smartphone ownership increased from 39% to 85%, with 96% of the US population between the ages of 18–29 now owning a smartphone [32].
Unsurprisingly, modern students use their phones while studying, with upwards of 90% of students reporting that they multitask with media “sometimes” or “frequently” while studying (for a review, see [33,34]). Depending on the context, multitasking can either slow down studying [30], lower comprehension [30,35–38], or both. Critically, students appear to have poor metacognitive understanding of how multitasking affects their performance [39]. Furthermore, some forms of media multitasking during class (e.g., self-reported Facebook and text messaging use) have been shown to be negatively correlated to GPA and learning [30,40–42]. Despite the clear evidence for the potential negative effects of multitasking, it is unclear whether students understand how multitasking may affect their academic performance.
Study habits before and after COVID-19
An exploratory question we addressed in the current study was whether the COVID-19 pandemic changed study behaviors. Clearly, the pandemic changed academic life in unprecedented ways [42]. Classes rapidly shifted to emergency remote delivery, and the next few months saw mixtures of fully online or hybrid formats for classes. There was variability in whether classes were synchronous or asynchronous and instructors struggled with online exam formats, trying to balance concerns about cheating with learning goals [43]. And of course, these academic changes were in addition to numerous other stressors related to the pandemic. For example, 71% of students reported an increase in stress during the pandemic [44–46]. Did the pandemic change study behaviors?
We had three competing predictions. The first is that, given the tumultuous nature of moving online, students might revert to using less effective, easier strategies, such as rereading, studying at the last minute, or not studying at all. Indeed, surveys indicate that students experienced a loss of motivation during the pandemic [44–47]. Additionally, they spent less time studying and more time procrastinating or embracing distractions [48], likely exacerbating student tendencies to multitask during online lectures [49]. Combined with a move towards open-note tests [43], it would be unsurprising if students endorsed less effective study strategies after the pandemic. A second prediction is that students may use more effective strategies. This might occur because students had fewer assignments and social obligations, and were able to devote more time to studying. Indeed, one survey showed that after the pandemic students were more likely to report finishing all expected readings for a class [48]. Finally, a third prediction is that study behaviors would largely remain the same before and after the pandemic. Indeed, some evidence suggests that learning outcomes and student perceptions of courses are similar between in-person courses and courses with online components [50].
The current study
The goal of the current project was to examine student study behaviors and beliefs about study behaviors. We built on prior surveys [3,26,29], added questions about multitasking, and surveyed students before and after the COVID-19 pandemic. We aimed to replicate Hartwig and Dunlosky [3] for two reasons. First, their survey and many others examining study behaviors [24,26,29,51,52] surveyed students at large public institutions, raising the question of whether students at other types of schools—such as smaller private schools where student/faculty ratios are typically smaller—would have similar study behaviors and beliefs. Second, Hartwig and Dunlosky [3] was published 10 years ago. Over the past decade, there has been a significant effort to communicate research findings about effective study strategies to students. As a result, today’s students may have a better understanding of study strategies.
Overall, we predicted that our sample would report study behaviors that were consistent with past research—they would use less effective study strategies, favor massing and rereading, report inaccurate assumptions about why self-testing was important, and that self-testing would show a positive relationship with GPA. Additionally, we hypothesized that students who reported cramming before a test would be more likely to use passive study strategies, and that our sample would report multitasking while studying, despite also acknowledging that multitasking was detrimental to learning. Finally, as described above we had no strong a priori predictions about how the pandemic might change study behaviors.
Method
We report how we determined our sample size, all data exclusions (if any), all manipulations, and all measures in the study. The Furman University IRB approved the study procedure.
Participants
Four hundred eighty-four introductory psychology students from a small liberal arts college in the Southeastern United States were recruited across four semesters between the Fall of 2019 and the Spring of 2021. Notably, the Fall 2019 semester was conducted fully in person. In the Spring of 2020, classes rapidly shifted to entirely online. The Fall of 2020 and Spring of 2021 courses were taught in a "hybrid-flex" format, where students had the option of attending classes in-person or online on a day-to-day basis. Multiple introductory psychology courses were offered each semester, and all the students in each section were invited to participate in our survey.
Our goal was to gather data from as many students as possible during our four semesters of data collection. Ages of students ranged from 18 to 22 years old (M = 19.13, SD = 1.09; 65% female, 34% male, <1% non-binary), and all class years were represented (45% freshman, 33% sophomores, 15% juniors, 7% seniors). The racial demographics of our sample (83% white, 7% Asian, 5% African American, 4% more than one race, and 1% other), mirrored the school’s general population. For context, the school’s acceptance rate is 56% and students had an average SAT score of 1309 and an ACT score of 30.
Participants received course credit for completing the survey. We excluded any participants from analyses who did not answer any questions on the survey. We did not exclude the data from any other participants, although some participants were missing a few responses to specific questions. After these exclusions, our final sample consisted of 478 students.
Materials and procedure
Table 1 displays our survey questions, which were adapted from Hartwig and Dunlosky [3]. First, for Question 13 ("what is your GPA?") we added new response options: "I don't know", and "this is my first semester in college, so I don't have a college GPA yet" because many participants were first-semester freshmen. Second, for Question 14 ("identify regularly used study strategies"), we removed the "other" option and added "use a mnemonic technique (acronyms, rhymes, memory palace, peg system)" as another answer choice. Finally, we added Questions 11 and 12 to assess multitasking opinions and behaviors.
It is important to note that we used self-reported GPA for our analyses. Although there are some systematic inaccuracies when it comes to self-reported grades, evidence suggests that there is still a reasonably strong relationship between self-reported GPA and actual GPA [53]. Overall, high achievers tend to report GPAs accurately, whereas low achievers tend to claim their GPAs are higher than they actually are [54].
All students enrolled in a section of introduction to psychology were invited to complete a large survey study. Our study skills inventory was one of several survey elements with the others addressing unrelated topics (e.g., Fear of Missing Out and disordered eating). Students saw the link to the survey posted to the subject pool website and could complete their survey from their own laptop any time before the semester ended. The entire survey took about 60 minutes.
Results
Our preregistered analysis plan, data, and R analysis scripts are available at OSF.IO/JMXH2. Data were collected before we submitted the preregistration, but no one on the research team had seen any of the data when the preregistration was submitted. We set alpha at .05 to determine statistical significance. Table 1 presents the main study results.
Preregistered analyses
What are the common behaviors and beliefs about studying among college students? How do the current results compare to prior research?
Two of our preregistered research questions are addressed here: examining the overall pattern of responses and comparing our results to prior work [3]. Table 1 displays the response percentages from the current survey alongside the percentages reported in Hartwig and Dunlosky [3], along with the effect size and statistical significance of each difference. Because the data were categorical, we used chi-squared tests of independence, with Cramér's V as our measure of effect size, to look for differences between the two studies. Cramér's V ranges between 0 and 1, with higher numbers indicating a larger difference between the two sets of responses. We did not compare responses to Question 11 and Question 12 (both about multitasking) because we added those questions to the survey, nor did we compare responses for Question 13 (GPA) and Question 14 (specific study behaviors) because we added additional response options to these questions.
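As a rough illustration of this effect-size calculation (a minimal sketch, not the authors' R scripts, and using hypothetical counts rather than the actual cell frequencies), Cramér's V is computed from the chi-squared statistic of a contingency table as V = sqrt(chi2 / (n * (min(rows, cols) - 1))):

```python
import math

def cramers_v(table):
    """Cramér's V for a contingency table given as a list of rows
    (rows = samples, columns = response options)."""
    n = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    # Pearson chi-squared statistic: sum of (observed - expected)^2 / expected
    chi2 = sum(
        (obs - row_tot[i] * col_tot[j] / n) ** 2 / (row_tot[i] * col_tot[j] / n)
        for i, row in enumerate(table)
        for j, obs in enumerate(row)
    )
    k = min(len(table), len(table[0]))
    return math.sqrt(chi2 / (n * (k - 1)))

# Hypothetical yes/no counts for two samples of 100 students each
# (e.g., 52% vs. 36% endorsement): identical distributions give V = 0.
print(round(cramers_v([[52, 48], [36, 64]]), 2))  # a small effect, ~0.16
```

With equal response distributions in both samples the expected and observed counts coincide, so chi-squared (and hence V) is 0, matching the interpretation that higher values indicate larger differences between the two sets of responses.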
As seen in Table 1 (and in S1 Table in S1 File), the current responses are generally similar to past surveys. Of the ten questions for which we conducted inferential tests, four yielded nearly identical response rates. Students from both samples ("CS" for the current study, "HD" for Hartwig and Dunlosky's results) reported similar tendencies (i.e., all p values were greater than .05) to: return to material after a class had ended (Question 3, CS- 23% vs. HD- 23%), study the same amount of time for essay and short-answer exams (Question 4, CS- 53% vs. HD- 58%), set aside materials they know to focus on other materials (Question 7, CS- 56% vs. HD- 54%), and intentionally space out their studying (Question 10, CS- 51% vs. HD- 47%).
In contrast, the other six questions revealed statistically significant differences between the current sample and Hartwig and Dunlosky [3], but in all cases the effect sizes were small, as seen in the last column in Table 1. The students in our survey were more likely to report learning how to study from a teacher (Question 1, CS- 52% vs. HD- 36%), and less likely to report studying what they were doing worst in (Question 2, CS- 8% vs. HD- 24%). Our students were less likely to report rereading whole articles (Question 5, CS- 11% vs. HD- 19%), but more likely to report not rereading at all (Question 5, CS- 28% vs. HD- 17%). Finally, our students differed in terms of when they studied and when they thought studying would be effective. Our students were less likely to study in the evening (Question 8, CS- 49% vs. HD- 69%) and more likely to study in the morning, afternoon, and late at night. Only 35% of our students thought studying in the evening would be most effective compared to 50% of students in prior research (Question 9).
Finally, although we did not calculate inferential statistics for Question 14—which questioned students about the use of specific study behaviors—we can still see in Table 1 that the overall pattern of responding is generally similar to past surveys. The notable exceptions are that our students are more likely to make outlines, less likely to underline or highlight while reading, and less likely to cram before a test.
Why do students use self-retrieval practice as a study strategy?
Do students know about the multiple benefits of self-testing for learning (Question 6)? Although 48% of students self-tested to keep track of what they had learned, 30% reported self-testing because they learn more than they do through rereading, suggesting that these students understand the learning benefit of self-testing. These numbers were comparable to past work.
Is the use of different study strategies related to GPA?
We conducted two analyses to evaluate the relationships between individual study strategies and GPA. First, we computed Goodman-Kruskal gamma correlations between GPA and each of the specific study strategies listed in Question 14 (see left column of Table 2). To do so, we coded student responses as a 0 or 1 and then correlated each strategy with self-reported GPA (we converted the response ranges to the midpoint). Critically, there were no significant correlations between any of the study strategies and GPA. These findings are inconsistent with Hartwig and Dunlosky [3], who found that self-testing was positively correlated with GPA.
Second, we ran a linear regression analysis to evaluate possible relationships between individual study strategies and GPA: doing so allowed us to simultaneously evaluate each study strategy while controlling for the Type I error rate. The regression model (see Table 2) revealed that underlining/highlighting while reading, using a mnemonic technique, and participating in class were positively associated with GPA, meaning that students who reported regularly using these study strategies had higher GPAs. In contrast, studying with friends was negatively associated with GPA, meaning that students who reported regularly studying with friends had lower GPAs. This latter finding was similar to Hartwig and Dunlosky [3]. However, Hartwig and Dunlosky also found that self-testing and rereading were positively associated with GPA and that making outlines was negatively associated with GPA, whereas we did not find a significant relationship for any of those variables.
Does cramming before tests correlate with the use of passive study habits?
We were interested in whether students who studied at the last minute were also likely to use less active study techniques. Despite variability in the field regarding the definition of active and passive study strategies (for a review see [55]), we defined active study strategies as those characterized by cognitive effort and deep semantic processing (self-testing and flashcard use), whereas passive study strategies (highlighting, rereading, recopying notes) do not engage such processes. We calculated Goodman-Kruskal gamma correlations between students' use of active and passive study strategies and their responses to how they typically scheduled their study sessions (Question 10, "do you space or cram?"). We found no significant correlations between the study strategies listed above and study pattern, meaning that none of the study strategies was associated with spacing or cramming.
Do students know that multitasking while studying is ineffective? Do students report multitasking while they study?
Our two new survey questions (Question 11 and Question 12) measured multitasking behaviors and attitudes about multitasking. Specifically, Question 11 asked students to check all common multitasking behaviors that they regularly used and Question 12 asked whether students thought multitasking harmed their studying. Although most students (90%) reported that they believed multitasking made their studying less effective, a majority of students (84%) also reported using at least one of the common multitasking behaviors. This pattern is nearly identical to that reported in prior research [38]. The most common multitasking behaviors were listening to music while studying (60%) and communicating via phone (42%). Although listening to music may or may not have detrimental effects on reading speed and memory (see [56] for a meta-analysis), phone communication most likely does [30].
Exploratory analyses
Did study strategies change as a result of the COVID-19 pandemic?
Overall, we had no strong predictions concerning whether study strategies would change as a result of the COVID-19 pandemic. As outlined in the introduction, students might have more time available for studying given the restriction of social activity. Conversely, given changes in assignment types (e.g., open note tests) and additional stressors (health and financial concerns), students may resort to easier, less effective strategies.
To address this question, we compared responses from the semester before the pandemic reached the United States (Fall of 2019) to the two semesters after our institution had been fully affected by social distancing requirements (the Fall and Spring semesters during the 2020–2021 school year). We omitted responses from the Spring of 2020 because it was during this semester that our institution shifted from in-person to online classes. Table 3 Column 5 presents the results of a series of chi-squared tests of independence comparing the distribution of responses for the semesters before and after COVID-19. Notably, we found no significant differences between study habits during the pre-COVID-19 and post-COVID-19 periods, meaning that the students in our sample reported similar study habits before and after the pandemic. However, self-reported student GPAs did moderately rise between the pre- and post-pandemic semesters (V = .20). One explanation for this finding is that instructors adjusted assignments as they transitioned courses to online formats (for example, students typically score higher on open-note online exams compared to traditional closed-book, in-person exams; [43]).
Does multitasking behavior predict whether students cram or space their studying?
We hypothesized that students who reported using none of the multitasking behaviors listed in Question 12 would be more likely to space their studying, because these students may have more knowledge of effective study habits. Likewise, we predicted that students who reported multitasking—for example, using the phone or talking to people while studying—would be more likely to cram. We ran a binary multiple logistic regression with study pattern (Question 10, “do students study the night before or space out studying”) as the outcome variable and multitasking activities (Question 12, check all multitasking activities you do regularly) as the predictor variables. The multitasking activities were either 0 (I do this activity) or 1 (I do not do this activity).
Table 4 shows the results of our regression model. Three predictors—listening to music, talking with people, and answering none of the above—predicted cramming tendencies, meaning that students who reported doing these behaviors were more likely to cram. Examining the odds ratios suggested that students who listened to music while studying were 1.77 times more likely to cram, students who talked to people while studying were 1.70 times more likely to cram, and students who reported not multitasking were 2.65 times more likely to cram. In contrast, surfing the web was the sole significant predictor of spacing, such that students who reported surfing the internet were 2.30 times more likely to space their studying. Thus, in contrast to our hypothesis, one of the strongest predictors of cramming was not multitasking at all. One explanation for this pattern is that when students are cramming, they are pressed for time and thus are less likely to indulge in multitasking behavior.
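The link between logistic regression coefficients and the odds ratios reported above can be illustrated briefly. In a logistic model, a coefficient b on a predictor corresponds to an odds ratio of exp(b): the predictor multiplies the odds of the outcome by exp(b). The coefficients below are hypothetical values back-solved from the reported odds ratios for illustration, not the paper's actual model estimates.

```python
import math

# Hypothetical logit coefficients (illustrative only; back-solved from the
# reported odds ratios, not the paper's actual estimates).
coefs = {"music": 0.571, "talking": 0.531, "no_multitasking": 0.975}

# Each coefficient b maps to an odds ratio exp(b) on the odds of cramming.
odds_ratios = {name: math.exp(b) for name, b in coefs.items()}
print({name: round(v, 2) for name, v in odds_ratios.items()})
```

An odds ratio above 1 means the behavior is associated with greater odds of cramming; an odds ratio below 1 (equivalently, a negative coefficient) would be associated with spacing instead.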
Discussion
The goal of the current study was to measure students' study behaviors and their beliefs about studying at a small liberal arts college. Overall, our results replicated past research (e.g., Hartwig & Dunlosky, [3]) with a few key exceptions (described below). These differences might have occurred because the current sample is different from those in past research, or because outreach efforts by cognitive psychologists (e.g., in the form of books [14], podcasts [16], and blogs [17]) have been effective in changing study habits. Furthermore, the current study demonstrated that most students recognize the problems with multitasking while studying, yet continue to do so. Finally, it also demonstrated that study habits before and after the onset of the COVID-19 pandemic did not differ.
How do the current results compare to past results?
In general, our results were consistent with past work [3,26,29]. Taken together, these studies paint a picture of students who use a mix of empirically effective and less effective strategies. Many students test themselves with practice problems and use flashcards, and about half of students from all studies report spacing their studying, rather than exclusively cramming (see S1 Table in S1 File). At the same time, students study whatever is due soonest instead of scheduling their studying, rarely return to course material after a class has ended, frequently reread course materials, and use quizzing primarily to determine what they know, instead of using it to aid learning.
There were some differences between the current study and prior work. First, 52% of our students reported learning how to study from a teacher, compared to 36% (the percentage reported in both [3] and [29]). Second, our sample was less likely to choose to study what they felt they were doing the worst in, less likely to reread class materials, more likely to find self-testing enjoyable, more likely to study throughout the day instead of just in the evening, and more likely to think that studying can be effective at all times of the day. Additionally, in the current study (and in [29]), self-testing or using flashcards was not related to GPA, whereas there was a positive correlation in [3].
What might explain these differences? One possibility is that the surge in evidence-based study advice that began in the mid-2010s could have successfully proliferated to students and teachers, such that the students in the current sample may have been exposed to more empirically supported study advice. The increased proportion of students who reported learning study techniques from a teacher supports this explanation. Alternatively, prior studies have drawn students from large public institutions (with over 20,000 students enrolled), whereas the current study drew students from a small private liberal arts college (about 2,400 students enrolled). Differences in the institutions (the student-to-faculty ratio, program offerings, proportion of classes taught by tenured professors, and student characteristics such as high school experience) may have contributed to the differences in study habits. Future research should draw participants from a wide diversity of institutions before generalizing conclusions to students from all types of institutions.
What do students understand about multitasking?
Similar to Pashler and colleagues’ [38] findings, our survey reported that 84% of students multitask while studying despite 90% of students recognizing that their studying would be more efficient without it. This discrepancy between what students believe should be effective and what they actually do suggests further education or outreach about how to study might not be helpful. Rather, students may benefit more from advice that suggests concrete ways to implement effective study strategies [4].
Furthermore, an exploratory analysis suggested that students who reported not multitasking at all were more likely to cram (likely because they were under intense time pressure) and that talking to people and listening to music were also associated with cramming. In contrast, students who surfed the web were more likely to space their studying, perhaps because they do not feel the same time pressure as students who report cramming. Given the clear laboratory evidence about the perils of multitasking (e.g., [38]) and the exploratory nature of these analyses, future research should explore more directly how different multitasking behaviors are associated with study scheduling.
COVID-19
Finally, a key finding from our paper was that student study strategies seemed to be unaffected by COVID-19: across all of our questions, student responses were similar regardless of whether data were collected before or after social distancing restrictions were implemented. This outcome is inconsistent with survey research suggesting that students experienced a lack of motivation as a result of the pandemic [48]. Why did this discrepancy occur? One possibility is that our comparisons were between semesters before the pandemic and those after the Spring 2020 semester, when courses shifted rapidly online. Thus, the later semesters were a combination of online and hybrid formats; regardless, instructors likely had more time to prepare for teaching in hybrid formats, as opposed to the emergency remote instruction that occurred in the Spring of 2020.
Limitations and future directions
There are a few limitations that constrain the conclusions we can draw from this study. By design, we used broad survey questions, which could have been misinterpreted; for example, students may have reported every study strategy they have ever used rather than the ones they use most frequently. Likewise, we asked about “multitasking” generally, but, as shown in our analyses, some types of multitasking appear to predict study scheduling behavior, whereas others do not. Future studies should focus on specific study strategies rather than broad behaviors. More specific survey questions allow for more precise inferences and ensure that researchers and students interpret terms in the same way (see, for example, [52]).
A second limitation is our sample composition. Our sample was drawn from a single small liberal arts college. Because some differences between the current work and prior work emerged, it is critical for future researchers to gather data from a larger range of institutions to examine how and why study behaviors might differ across the many kinds of educational institutions. In addition, recruiting participants from more diverse institutions will allow for a more nuanced analysis of variations in student characteristics (e.g., full-time versus part-time enrollment) and how those variations correlate with study strategy choice. Our sample also consisted only of students from introduction to psychology courses, rather than a random sample of our entire student body. Although this is a clear limitation in some ways, by our estimates only about 25% of the students in our sample went on to become psychology majors; thus, the remaining 75% were taking introduction to psychology as a general education requirement or as an elective. Regardless, future studies should recruit from a variety of college majors in addition to recruiting for various general student characteristics, like course load and personal obligations. Doing so would allow for more generalizable findings and could even begin to reveal whether different study strategies are more effective for different majors.
Conclusion
The current study supports the idea that, in general, students have room to improve their study strategies. Teachers can be a great resource and facilitator for reshaping students’ incorrect preconceived notions about what is effective, but some research suggests that teachers do not know much more about effective studying than the average student [29]. Only 52% of Furman students reported using study methods that were taught by a teacher, although this is a higher percentage than in past research (specifically, Hartwig and Dunlosky [3] reported that only 36% of their sample were taught study strategies by a teacher). Thus, both teachers and students can benefit from an improved understanding of empirically supported study strategies.
Other scholars have explored what methods can be used within an educational setting to promote student success; critically, such programs emphasize not just what strategies are effective, but provide concrete recommendations about how to use those strategies. For example, McDaniel and colleagues [4] created and tested a college course built on the knowledge, belief, commitment, and planning (KBCP) framework, which integrates metacognition and empirical research to teach college students the mechanics behind each study strategy and how to use those strategies effectively. They found that the course was successful and prompted the long-term use of effective behaviors in the students’ future courses, suggesting that students may benefit from interventions that also address and encourage the best ways to implement changes in study behavior.
In sum, our results are consistent with prior research in showing that while students do engage in some empirically supported study techniques, there are also many ways in which they fall short. As one example, many students report multitasking while studying, despite recognizing that such behavior is detrimental to their overall success. Studying in college is never easy, but the current results suggest that students do have room for improvement in terms of how and when they study.
Supporting information
S1 File. Additional results for the current study and comparison to prior studies.
S1 Table compares the current results to three prior examinations of student study strategies. S2 Table provides a detailed analysis of student study strategies before and after COVID-19.
https://doi.org/10.1371/journal.pone.0278666.s001
(DOCX)
References
- 1. Macan TH, Shahani C, Dipboye RL, Phillips AP. College students’ time management: Correlations with academic performance and stress. J Educ Psychol [Internet]. 1990 Dec;82(4):760–8. Available from: https://psycnet.apa.org/fulltext/1991-13852-001.pdf.
- 2. Wolters CA, Brady AC. College Students’ Time Management: a Self-Regulated Learning Perspective. Educ Psychol Rev [Internet]. 2021 Dec 1;33(4):1319–51. Available from: https://doi.org/10.1007/s10648-020-09519-z.
- 3. Hartwig MK, Dunlosky J. Study strategies of college students: are self-testing and scheduling related to achievement? Psychon Bull Rev [Internet]. 2012 Feb;19(1):126–34. Available from: pmid:22083626
- 4. McDaniel MA, Einstein GO, Een E. Training College Students to Use Learning Strategies: A Framework and Pilot Course. Psychology Learning & Teaching [Internet]. 2021 Feb 24;1475725721989489. Available from: https://doi.org/10.1177/1475725721989489.
- 5. Dunlosky J, Rawson KA, Marsh EJ, Nathan MJ, Willingham DT. Improving students’ learning with effective learning techniques: Promising directions from cognitive and educational psychology. Psychol Sci Public Interest [Internet]. 2013 Jan 8;14(1):4–58. Available from: http://psi.sagepub.com/lookup/doi/10.1177/1529100612453266. pmid:26173288
- 6. Miyatsu T, Nguyen K, McDaniel MA. Five Popular Study Strategies: Their Pitfalls and Optimal Implementations. Perspect Psychol Sci [Internet]. 2018 May 2;13(3):390–407. Available from: http://journals.sagepub.com/doi/10.1177/1745691617710510. pmid:29716455
- 7. Putnam AL, Roediger HL. Education and Memory: Seven Ways the Science of Memory Can Improve Classroom Learning. In: Wixted JT, editor. Stevens’ Handbook of Experimental Psychology and Cognitive Neuroscience [Internet]. Hoboken, NJ, USA: John Wiley & Sons, Inc.; 2018. p. 1–45. Available from: http://doi.wiley.com/10.1002/9781119170174.epcn106.
- 8. Cepeda NJ, Pashler H, Vul E, Wixted JT, Rohrer D. Distributed practice in verbal recall tasks: A review and quantitative synthesis. Psychol Bull [Internet]. 2006 Jan 1;132(3):354–80. Available from: http://doi.apa.org/getdoi.cfm?doi=10.1037/0033-2909.132.3.354. pmid:16719566
- 9. Putnam AL, Roediger HL. Does response mode affect amount recalled or the magnitude of the testing effect? Memory and Cognition [Internet]. 2013 Jan 1;41(1):36–48. Available from: http://www.springerlink.com/index/10.3758/s13421-012-0245-x. pmid:22898928
- 10. Roediger HL, Butler AC. The critical role of retrieval practice in long-term retention. Trends Cogn Sci [Internet]. 2011 Jan 1;15(1):20–7. Available from: http://www.sciencedirect.com/science/article/pii/S1364661310002081. pmid:20951630
- 11. Wissman KT, Rawson KA, Pyc MA. How and when do students use flashcards? Memory [Internet]. 2012 Aug 1;20(6):568–79. Available from: https://doi.org/10.1080/09658211.2012.687052. pmid:22671698
- 12. Gurung RAR, Weidert J, Jeske A. Focusing on how students study. J Scholarsh Teach Learn [Internet]. 2010 Jan [cited 2021 May 24];10(1):28–35. Available from: http://files.eric.ed.gov/fulltext/EJ882123.pdf.
- 13. Agarwal PK, Bain PM. Powerful Teaching: Unleash the Science of Learning [Internet]. John Wiley & Sons; 2019. 352 p. Available from: https://play.google.com/store/books/details?id=OpCWDwAAQBAJ.
- 14. Brown PC, Roediger HL, McDaniel MA. Make It Stick: The Science of Successful Learning [Internet]. Belknap Press: An Imprint of Harvard University Press; 2014. Available from: https://doi.org/10.1080/00220671.2015.1053373.
- 15. Putnam AL, Sungkhasettee VW, Roediger HL. Optimizing learning in college: Tips from cognitive psychology. Perspect Psychol Sci [Internet]. 2016 Sep 29;11(5):652–60. Available from: http://pps.sagepub.com/lookup/doi/10.1177/1745691616645770. pmid:27694461
- 16. Wadsworth W. Study Exam Expert: Ace your exams with the science of direct learning [Internet]. 2022 [cited 2022 Jul 2]. Available from: https://examstudyexpert.com/podcast/.
- 17. The Learning Scientists [Internet]. [place unknown, publisher unknown]. [cited 2022 Jul 2]. Available from: http://learningscientists.org.
- 18. Logan JM, Castel AD, Haber S, Viehman EJ. Metacognition and the spacing effect: the role of repetition, feedback, and instruction on judgments of learning for massed and spaced rehearsal. Metacognition and Learning [Internet]. 2012 Dec 1;7(3):175–95. Available from: https://doi.org/10.1007/s11409-012-9090-3.
- 19. McCabe JA. Metacognitive awareness of learning strategies in undergraduates. Memory and Cognition. 2011 Apr 16;39(3):462–76. pmid:21264604
- 20. Bjork EL, Bjork RA. Making things hard on yourself, but in a good way: Creating desirable difficulties to enhance learning. In: Gernsbacher MA, Pew RW, Hough LM, Pomerantz JR, editors. Psychology and the Real World [Internet]. FABBS Foundation; 2011. p. 56–64. Available from: http://bjorklab.psych.ucla.edu/pubs/EBjork_RBjork_2011.pdf.
- 21. Roediger HL, Karpicke JD. Test-enhanced learning: Taking memory tests improves long-term retention. Psychological Science. 2006 Jan 1;17(3):249–55. pmid:16507066
- 22. Bjork RA, Dunlosky J, Kornell N. Self-regulated learning: Beliefs, techniques, and illusions. Annu Rev Psychol [Internet]. 2013 Jan 3;64(1):417–44. Available from: http://www.annualreviews.org/doi/abs/10.1146/annurev-psych-113011-143823. pmid:23020639
- 23. Kirk-Johnson A, Galla BM, Fraundorf SH. Perceiving effort as poor learning: The misinterpreted-effort hypothesis of how experienced effort and perceived learning relate to study strategy choice. Cogn Psychol [Internet]. 2019 Dec;115:101237. Available from: pmid:31470194
- 24. Blasiman RN, Dunlosky J, Rawson KA. The what, how much, and when of study strategies: comparing intended versus actual study behaviour. Memory [Internet]. 2017 Jul;25(6):784–92. Available from: pmid:27561889
- 25. Karpicke JD, Butler AC, Roediger HL 3rd. Metacognitive strategies in student learning: do students practise retrieval when they study on their own? Memory [Internet]. 2009 Apr 1;17(4):471–9. Available from: pmid:19358016
- 26. Kornell N, Bjork RA. The promise and perils of self-regulated study. Psychon Bull Rev. 2007 Jan 1;14(2):219–24. pmid:17694904
- 27. Susser JA, McCabe J. From the lab to the dorm room: metacognitive awareness and use of spaced study. Instructional Science [Internet]. 2013 Mar 1;41(2):345–63. Available from: https://doi.org/10.1007/s11251-012-9231-8.
- 28. Yan VX, Thai K-P, Bjork RA. Habits and beliefs that guide self-regulated learning: Do they vary with mindset? J Appl Res Mem Cogn [Internet]. 2014 Jun 3;1–13. Available from: http://dx.doi.org/10.1016/j.jarmac.2014.04.003.
- 29. Morehead K, Rhodes MG, DeLozier S. Instructor and student knowledge of study strategies. Memory [Internet]. 2016 Jan 27;24(2):257–71. Available from: http://www.tandfonline.com/doi/full/10.1080/09658211.2014.1001992. pmid:25625188
- 30. Junco R, Cotten SR. No A 4 U: The relationship between multitasking and academic performance. Comput Educ [Internet]. 2012 Sep 1;59(2):505–14. Available from: http://dx.doi.org/10.1016/j.compedu.2011.12.023.
- 31. Pashler H, Johnston JC. Attentional Limitations in Dual-task Performance. In: Pashler H, editor. Attention [Internet]. Psychology Press; 1998 [cited 2021 Nov 15]. Available from: http://laplab.ucsd.edu/articles/Pashler_Johnston_Attention1998.pdf.
- 32. Pew Research Center. Demographics of mobile device ownership and adoption in the United States [Internet]. Pew Research Center (US); [reviewed 2021 Apr 7; cited 2022 Jul 2]. Available from: https://www.pewresearch.org/internet/fact-sheet/mobile/.
- 33. Chen Q, Yan Z. Does multitasking with mobile phones affect learning? A review. Comput Human Behav [Internet]. 2016 Jan 1;54:34–42. Available from: https://www.sciencedirect.com/science/article/pii/S0747563215300595.
- 34. May KE, Elder AD. Efficient, helpful, or distracting? A literature review of media multitasking in relation to academic performance. International Journal of Educational Technology in Higher Education [Internet]. 2018 Feb 27;15(1):13. Available from: https://doi.org/10.1186/s41239-018-0096-z.
- 35. Glass AL, Kang M. Dividing attention in the classroom reduces exam performance. Educ Psychol Rev [Internet]. 2019 Mar 16;39(3):395–408. Available from: https://doi.org/10.1080/01443410.2018.1489046.
- 36. Middlebrooks CD, Kerr T, Castel AD. Selectively distracted: Divided attention and memory for important information. Psychol Sci [Internet]. 2017 Aug;28(8):1103–15. Available from: pmid:28604267
- 37. Patterson MC. A Naturalistic Investigation of Media Multitasking While Studying and the Effects on Exam Performance. Teach Psychol [Internet]. 2017 Jan 1;44(1):51–7. Available from: https://doi.org/10.1177/0098628316677913.
- 38. Pashler H, Kang SHK, Ip RY. Does multitasking impair studying? Depends on timing. Appl Cogn Psychol [Internet]. 2013 Mar 18;27:593–9. Available from: http://doi.wiley.com/10.1002/acp.2919.
- 39. Finley JR, Benjamin AS, McCarley JS. Metacognition of multitasking: How well do we predict the costs of divided attention? J Exp Psychol Appl [Internet]. 2014 Feb 3;20(2):158–65. Available from: http://supp.apa.org/psycarticles/supplemental/xap0000010/xap0000010_supp.html. pmid:24490818
- 40. Alghamdi A, Karpinski AC, Lepp A, Barkley J. Online and face-to-face classroom multitasking and academic performance: Moderated mediation with self-efficacy for self-regulated learning and gender. Comput Human Behav [Internet]. 2020 Jan 1;102:214–22. Available from: https://www.sciencedirect.com/science/article/pii/S074756321930305X.
- 41. Wood E, Zivcakova L, Gentile P, Archer K, De Pasquale D, Nosko A. Examining the impact of off-task multi-tasking with technology on real-time classroom learning. Comput Educ [Internet]. 2012 Jan 1;58(1):365–74. Available from: https://www.sciencedirect.com/science/article/pii/S0360131511002077.
- 42. Hodges C, Moore S, Lockee B, Trust Torrey, Bond A. The difference between emergency remote teaching and online learning [Internet]. 2020. Available from: https://er.educause.edu/articles/2020/3/the-difference-between-emergency-remote-teaching-and-online-learning.
- 43. Whisenhunt BL, Cathey CL, Hudson DL, Needy LM. Maximizing learning while minimizing cheating: New evidence and advice for online multiple-choice exams. Scholarship of Teaching and Learning in Psychology [Internet]. 2022;8(2):140–53. Available from: https://psycnet.apa.org/fulltext/2022-60322-001.pdf.
- 44. Usher EL, Golding JM, Han J, Griffiths CS, McGavran MB, Brown CS, et al. Psychology students’ motivation and learning in response to the shift to remote instruction during COVID-19. Scholarship of Teaching and Learning in Psychology [Internet]. 2021 Jun 17; Available from: https://psycnet.apa.org/fulltext/2021-56701-001.pdf.
- 45. Qi T, Hu T, Ge Q-Q, Zhou X-N, Li J-M, Jiang C-L, et al. COVID-19 pandemic related long-term chronic stress on the prevalence of depression and anxiety in the general population. BMC Psychiatry [Internet]. 2021 Jul 28;21(1):380. Available from: pmid:34320924
- 46. von Keyserlingk L, Yamaguchi-Pedroza K, Arum R, Eccles JS. Stress of university students before and after campus closure in response to COVID-19. J Community Psychol [Internet]. 2022 Jan;50(1):285–301. Available from: pmid:33786864
- 47. Means B, Neisler J. Suddenly online: A national survey of undergraduates during the COVID-19 pandemic [Internet]. Langer Research Associates; 2020 [cited 2022 Jun 22]. Available from: https://digitalpromise.dspacedirect.org/bitstream/handle/20.500.12265/98/DPSuddenlyOnlineReportJuly2020.pdf?sequence=3&isAllowed=y.
- 48. Gurung RAR, Stone AM. You can’t always get what you want and it hurts: Learning during the pandemic. Scholarship of Teaching and Learning in Psychology [Internet]. 2020 Oct 19; Available from: https://psycnet.apa.org/fulltext/2020-78359-001.pdf.
- 49. Lepp A, Barkley JE, Karpinski AC, Singh S. College students’ multitasking behavior in online versus face-to-face courses. SAGE Open [Internet]. 2019 Jan 1;9(1):2158244018824505. Available from: https://doi.org/10.1177/2158244018824505.
- 50. Hoffman HJ, Elmi AF. Comparing student performance in a graduate-level introductory biostatistics course using an online versus a traditional in-person learning environment. Journal of Statistics and Data Science Education [Internet]. 2021 Jan 2;29(1):105–14. Available from: https://doi.org/10.1080/10691898.2020.1841592.
- 51. Bartoszewski BL, Gurung RAR. Comparing the relationship of learning techniques and exam score. Scholarship of Teaching and Learning in Psychology [Internet]. 2015 Sep;1(3):219–28. Available from: https://psycnet.apa.org/fulltext/2015-39664-001.pdf.
- 52. Kuhbandner C, Emmerdinger KJ. Do students really prefer repeated rereading over testing when studying textbooks? A reexamination. Memory [Internet]. 2019 Aug;27(7):952–61. Available from: pmid:31045483
- 53. Kuncel NR, Credé M, Thomas LL. The Validity of Self-Reported Grade Point Averages, Class Ranks, and Test Scores: A Meta-Analysis and Review of the Literature. Rev Educ Res [Internet]. 2005 Mar 1;75(1):63–82. Available from: https://doi.org/10.3102/00346543075001063.
- 54. Bahrick HP, Bahrick LE, Bahrick AS, Bahrick PE. Maintenance of foreign language vocabulary and the spacing effect. Psychol Sci [Internet]. 1993 Jan 1;4(5):316–21. Available from: http://pss.sagepub.com/content/4/5/316.short.
- 55. Chi MTH. Active-constructive-interactive: a conceptual framework for differentiating learning activities. Top Cogn Sci [Internet]. 2009 Jan;1(1):73–105. Available from: pmid:25164801
- 56. Kämpfe J, Sedlmeier P, Renkewitz F. The impact of background music on adult listeners: A meta-analysis. Psychology of Music [Internet]. 2011 Oct 1;39(4):424–48. Available from: https://doi.org/10.1177/0305735610376261.