
Level of consistency between students’ self-reported and observed study approaches in flipped classroom courses: How does it influence students’ academic learning outcomes?

  • Feifei Han

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Writing – original draft, Writing – review & editing

    feifei.han@acu.edu.au

Affiliations: Institute for Learning Sciences and Teacher Education, Australian Catholic University, Brisbane, Australia; Faculty of Education, Department of Pedagogy and Psychology, University of Hradec Králové, Hradec Králové, Czech Republic

Abstract

Using Student Approaches to Learning research as a theoretical framework, the present study used both self-reported and observational log data to understand students’ study approaches in a flipped classroom course amongst 143 computer science undergraduate students. Specifically, it aimed to examine: 1) to what extent students’ study approaches identified by self-reported and observational log data are consistent with each other; and 2) to what extent students’ academic learning outcomes differ between students who showed consistent and inconsistent study approaches by self-reported and observational log data. Using the Revised Study Process Questionnaire, students were clustered as reporting either a Deep or a Surface Study Approach. Using frequencies of students’ participation in five online learning activities, they were classified as adopting either an Active or a Passive Study Approach. A 2 × 2 cross-tabulation showed a positive and moderate association between the clusters of study approaches resulting from the two types of data. Amongst students who self-reported a Deep Study Approach, the proportion who adopted an Active Study Approach (80.7%) was significantly higher than the proportion who adopted a Passive Study Approach (19.3%). In contrast, of the students who self-reported a Surface Study Approach, the proportion who used a Passive Study Approach (51.2%) was significantly higher than the proportion who used an Active Study Approach (48.8%). Furthermore, students who had good study approaches by both self-report and observation did not differ on course grades from students who adopted an Active Study Approach by observation but reported a Surface Study Approach. Likewise, there was no significant difference in academic learning outcomes between those who had poor study approaches by both self-report and observation and those who adopted a Passive Study Approach by observation but reported a Deep Study Approach. Future studies may consider incorporating qualitative methods in order to uncover possible reasons behind the inconsistencies between self-reported and observed study approaches.

Introduction

Over the last few decades, learning in higher education has undergone significant transformation, such as re-designing large lecture-focused courses by adopting flipped classroom courses [1]. As a special type of blended learning design, flipped classroom courses require students to engage in “interactive content focusing on key concepts prior to class thus allowing class time for collaborative activities that clarify concepts and contextualise knowledge through application, analysis, and planning and producing solutions” [2, p. 1].

Flipped classroom courses have been widely adopted in contemporary learning design in higher education [3], as this type of course design not only promotes active learning [4], but is also able to bring various benefits. To start with, flipped classroom courses are well known for their flexibility, as the online learning materials are available at any time and place so that they can easily accommodate the diverse learning preferences of students [5]. Second, flipped classroom courses are beneficial in terms of developing a number of students’ essential graduate attributes, such as problem-solving [6], interpersonal skills [7], and collaboration and teamwork skills [8]. Moreover, compared with traditional learning design, flipped classroom courses can increase students’ attendance [9], retention rates [10], as well as their learning engagement [11]. The majority of existing research has also reported better academic learning outcomes in flipped classroom courses compared with the traditional approach [12].

Compared with traditional course designs, students’ learning experiences in flipped classroom courses are highly complex as students often move back and forth between in-class and on-line learning spaces. In these experiences, students not only interact with the teaching staff and fellow students (the human elements), but also spend a significant proportion of time navigating online learning platforms where they interact with a variety of technology-enabled learning tools, such as blogs, wikis, online discussion forums, podcasts, and video clips (non-human or material elements) [13]. The complexity of learning in flipped classroom courses requires new ways and methods to be integrated into research in order to properly understand how students go about learning (study approaches).

Traditionally, research into students’ study approaches in higher education has largely relied on employing self-reported methods (e.g., self-reported questionnaires, focus groups, interviews, and diaries) [14]. However, self-reported methods and data are often criticized for their subjectivity and inability to represent the complexity of students’ contemporary learning experiences [15,16]. Moreover, self-reports are easily affected by careless answering and item nonresponse by students [17,18].

The widespread adoption of online learning management systems (LMS) and developments in the area of learning analytics have enabled the collection and analysis of observational log data in technology-mediated learning. Observational log data may not only provide relatively objective descriptions of students’ online learning, but also have the potential to reflect the dynamic and nuanced differences in how students approach online learning [19].

However, fully relying on the observational log data to examine students’ study approaches without guidance from educational theories has also received criticism of being data-centric, which could result in erroneous interpretations due to a lack of sound theories and meaningful contexts [20]. To address the drawbacks of the self-reported and observational log data, in recent years researchers have begun to employ both self-reported and observational log data to understand students’ study approaches [21–24]. The present investigation will adopt Student Approaches to Learning research as a theoretical framework and will use both self-reported and observational log data to examine students’ study approaches in flipped classroom courses. The present study will expand on previous research by addressing two research gaps: 1) levels of consistency between students’ study approaches measured by self-reported and observational log data; and 2) how levels of consistency impact students’ academic learning outcomes. The following sections will review relevant literature.

Student approaches to learning framework and self-reported study approaches

Student Approaches to Learning (SAL) research is one of the guiding frameworks widely used in higher education to understand students’ experiences of learning [25]. SAL research identifies variations in key factors of students’ experiences of learning and studies the relations between them [26]. Within the SAL research framework, Biggs uses a 3P model to describe the relations between these factors by categorizing them into Presage, Process, and Product [27]. The Presage comprises factors that exist prior to learning and consists of both student factors (e.g., prior knowledge, personal attributes, and personality) and situational factors (e.g., institutional climate, course structure, and teaching methods). The Product concerns various measures of students’ academic performance, such as course grades and students’ post-learning conceptions of the subject matter [28].

As to the Process, one of the main factors is how students go about learning, known as study approaches [29]. Researchers have identified qualitatively different study approaches in different academic disciplines (e.g., history, science, mathematics, biology, arts, and political sciences) and in different learning tasks (e.g., reading, writing, and learning through discussions, problem-based learning, and collaborative learning). Although the specific categories of study approaches differ by discipline and learning task, two broad categories of study approaches have been consistently observed [30–32], namely Surface and Deep Study Approaches. A Surface Study Approach involves learning with minimal effort, with the intent of achieving practical goals, such as fulfilling the compulsory requirements of the course or passing the examination. Students taking a Surface Study Approach often report using rote memorization, a lack of independence in learning, and relying heavily on teachers’ instructions or their peers’ efforts. In contrast, students with a Deep Study Approach report taking initiative, being proactive in, and reflective of, their learning processes, with a meaningful intent of learning to gain understanding [33–36].

Central to the 3P model is the relational stance [26,37], in which study approaches are not considered stable psychological traits of learners but are simultaneously shaped by the learner and the learner’s experienced learning context [28]. Hence, learners can consciously choose different study approaches on the basis of their perceived contexts and situations of learning. The same learner may vary their study approaches across subject matters or across learning designs (face-to-face, online, or blended designs).

Numerous studies have been conducted to examine the relations between students’ study approaches and their academic learning outcomes since a seminal study by Säljö [38], which found that students’ learning outcomes varied by the study approaches they adopted. Since then, a large number of studies have consistently identified logical relations between study approaches and academic learning outcomes: a Deep Study Approach tends to be associated with better academic learning outcomes, whereas a Surface Study Approach tends to be related to poorer academic learning outcomes [39–42].

However, the majority of existing SAL research has used self-reports to measure study approaches, which suffer from a number of drawbacks, as mentioned above. To improve insights into contemporary university students’ learning, suggestions have been put forward to expand the current self-reporting methods by including other types of measures to research student learning [43]. For instance, Richardson suggests that observational log data provide “both researchers and practitioners with the opportunity to monitor students’ strategic decisions in online environments in minute detail and in real time” [14, p. 359].

Levels of consistency between self-reported and observed study approaches adopting SAL research framework

The emerging research field of learning analytics collects rich and detailed observational log data recorded by the LMS to study students’ online learning as they interact with a variety of online learning resources and activities. The observational log data has wide applications in higher education research, such as advising students’ career choices [44], detecting at-risk students to improve retention [45], providing personalised feedback [46], facilitating collaborative learning [47], monitoring students’ affect in learning [48], as well as identifying learning strategies and study approaches [49,50].

An increasing number of studies have examined levels of consistency between using self-reported data and observational log data to understand students’ learning experiences. However, most of them have adopted either self-regulated learning [23,51,52] or learning engagement research frameworks [53]. Only two studies have specifically focused on investigating levels of consistency between self-reported and observed study approaches using the SAL research framework [24,54]. However, the two studies not only suffered from the major limitation of relying on either self-reported data or observational log data alone to detect study approaches, but also produced inconsistent results.

Relying on self-reported data, Ellis et al. [54] used the Revised Study Process Questionnaire [55] to measure students’ study approaches. A hierarchical cluster analysis identified two groups of students who differed in their reporting on both the Deep and Surface Scales. One cluster had significantly higher ratings on the Deep Scale but significantly lower ratings on the Surface Scale and, hence, was referred to as reporting a Deep Study Approach. In contrast, the other cluster of students showed significantly lower ratings on the Deep Scale but significantly higher ratings on the Surface Scale and, hence, was referred to as reporting a Surface Study Approach. The researchers then compared the observed frequencies of various types of online events between the two clusters of students and found that students reporting a Deep Study Approach participated in all the online learning events significantly more frequently than those reporting a Surface Study Approach, suggesting a relatively high level of consistency between self-reported and observational log data.

However, another study by Gašević et al. only found partial consistency [24]. Gašević et al. used observational log data to detect students’ study approaches and identified four distinct study approaches, which varied on the number of sequences of the different online learning events students were engaged with. They further categorized the four study approaches into either a Deep or a Surface Study Approach according to the levels of engagement of the four study approaches. They found that students who were observed using Deep and Surface Study Approaches only differed significantly on their reporting on the Deep Scale measured by the R-SPQ [55] but not on the Surface Scale, demonstrating only partial consistency between self-reported and observational log data.

To address the limitations of the existing studies and their inconsistent results, the present investigation will identify students’ study approaches using both self-reported and observational log data and examine the level of consistency between the identified groups. Furthermore, the present study will address another research gap by examining the extent to which academic learning outcomes differ between students whose self-reported and observed study approaches are consistent and those whose approaches are inconsistent. Specifically, the following research questions will be addressed:

  1a) How do students’ observed frequencies of online learning activities and their academic learning outcomes differ by their self-reported study approaches in flipped classroom courses?
  1b) How do students’ self-reported study approaches and their academic learning outcomes differ by their observed study approaches in flipped classroom courses?
  1c) To what extent are students’ clusters of self-reported and observed study approaches in flipped classroom courses consistent with each other?
  2) To what extent do students’ academic learning outcomes differ by levels of consistency between their self-reported and observed study approaches?

Method

Participants and their flipped classroom course

A cohort of 143 undergraduates who were studying towards a Bachelor of Computer Science and Information Technologies degree participated in the study. All the participants were enrolled in a flipped classroom computer science course, which lasted for 13 weeks. The face-to-face learning and teaching comprised a two-hour lecture, a two-hour tutorial, and a three-hour practice session each week. Hosted in a Learning Management System (LMS), the online learning consisted of activities before and after each week’s face-to-face learning. The activities before face-to-face learning aimed to prepare students for lectures, tutorials, and practice, whereas the activities after face-to-face learning aimed to consolidate and extend face-to-face learning. The following describes these online activities in detail:

Online activities before face-to-face learning and teaching:

  • Pre-recorded lectures were in video format and covered the key concepts to be discussed in the coming week’s lecture.
  • Quizzes were embedded in video lectures and tested students’ understanding of the contents in the pre-recorded lecture.
  • Readings were in pdf format and covered essential readings for each week.

Online activities after face-to-face learning and teaching:

  • Exercises were in multiple-choice format and tested students’ understanding of the key concepts in each week.
  • Problem-solving questions were in open-ended format and assessed students’ abilities to apply theories to solve practical problems.

Instruments and data

The Revised Study Process Questionnaire to collect self-reported study approaches.

We used the R-SPQ to collect the students’ self-reported study approaches. The R-SPQ has 20 items, with 10 items forming the Deep Scale and 10 items forming the Surface Scale [55]. However, the Confirmatory Factor Analysis of the two-factor solution with 20 items did not show a satisfactory fit to the data: χ2 = 342.46, p = .00, CFI = .82, TLI = .80, RMSEA = .08. After removing two cross-loading items from each factor, the retained 16 items demonstrated appropriate fit: χ2 = 163.84, p = .00, CFI = .92, TLI = .91, RMSEA = .07. The Cronbach’s alpha reliabilities of the Deep and Surface Scales were .82 and .86 respectively.
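
For illustration, scale reliabilities of this kind can be computed directly from item-level responses. The following is a minimal sketch, assuming a hypothetical file rspq_responses.csv in which each column holds students’ ratings on one retained R-SPQ item; the file name and the deep_/surface_ column prefixes are illustrative assumptions, not part of the study’s actual materials.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of scale items (one column per item)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical item-level responses: one row per student, one column per item.
responses = pd.read_csv("rspq_responses.csv")
deep_items = [c for c in responses.columns if c.startswith("deep_")]
surface_items = [c for c in responses.columns if c.startswith("surface_")]

print(cronbach_alpha(responses[deep_items]))     # reported as .82 above
print(cronbach_alpha(responses[surface_items]))  # reported as .86 above
```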

The learning management system (LMS) to collect observed frequencies of students’ online learning activities.

We used the analytic functions in the LMS to collect the observed frequencies of five types of online learning activities, namely pre-recorded lectures, quizzes, readings, exercises, and problem-solving questions.
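
To make the structure of these observational data concrete, below is a minimal sketch of how a raw LMS event export might be aggregated into per-student frequency counts of the five activity types. The file name, column names, and activity labels are assumptions for illustration, not the actual export format of the LMS used in the course.

```python
import pandas as pd

# Hypothetical raw LMS export: one row per logged event.
# Assumed columns: student_id, activity_type, timestamp.
events = pd.read_csv("lms_events.csv")

ACTIVITY_TYPES = ["pre_recorded_lecture", "quiz", "reading",
                  "exercise", "problem_solving"]

# Count how often each student accessed each of the five activity types,
# filling in zero for students with no events of a given type.
frequencies = (
    events[events["activity_type"].isin(ACTIVITY_TYPES)]
    .groupby(["student_id", "activity_type"])
    .size()
    .unstack(fill_value=0)
    .reindex(columns=ACTIVITY_TYPES, fill_value=0)
)

print(frequencies.head())
```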

Students’ academic learning outcomes.

We used course grades as students’ academic learning outcomes. The maximum course grade was 100, consisting of: course preparation (maximum 20), the practice project (maximum 20), and the closed-book examination (maximum 60).

Ethical consideration and data collection

Before data collection, ethics approval was applied for and granted by the ethics committee of the researcher’s university. We strictly followed the ethical procedures, which required written consent from all participants, who voluntarily agreed to complete the questionnaire and to allow access to their log data in the LMS and their course grades. The questionnaire data were collected towards the end of the semester in students’ practice sessions, and the observational log data and course grades were collected upon the completion of the course.

Data analysis

To answer research questions 1a and 1b, we conducted two separate hierarchical cluster analyses using the self-reported data and the observational log data respectively. For the self-reported data, we used the mean scores of the Deep and Surface Scales to cluster students. On the basis of cluster membership, a series of one-way ANOVAs was performed to examine whether there were significant differences in the frequencies of observational log data and course grades (research question 1a). We then conducted a second hierarchical cluster analysis using the mean frequency scores of the five online learning activities to cluster students. Similarly, based on cluster membership, one-way ANOVAs were conducted to investigate whether students differed in their self-reported study approaches and course grades (research question 1b). For research question 1c, we conducted a cross-tabulation using the students’ clusters resulting from the two above-mentioned hierarchical cluster analyses to examine the level of consistency between self-reported and observed study approaches.
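
As a rough illustration of this pipeline, the sketch below performs the two hierarchical cluster analyses, the one-way ANOVAs, and the cross-tabulation. It is a minimal sketch under several assumptions: a hypothetical students.csv with one row per student, illustrative column names, Ward linkage on the unstandardized variables, and no continuity correction in the chi-square test (the article does not report these analysis settings).

```python
import pandas as pd
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.stats import chi2_contingency, f_oneway

# Hypothetical per-student table with Deep/Surface scale means, the five
# activity frequencies, and the course grade (column names are illustrative).
df = pd.read_csv("students.csv")

def two_clusters(data: pd.DataFrame) -> pd.Series:
    """Cut a hierarchical clustering (Ward linkage assumed) into two clusters."""
    tree = linkage(data.values, method="ward")
    return pd.Series(fcluster(tree, t=2, criterion="maxclust"), index=data.index)

activity_cols = ["lecture", "quiz", "reading", "exercise", "problem_solving"]
df["sr_cluster"] = two_clusters(df[["deep_mean", "surface_mean"]])  # self-reported (RQ 1a)
df["obs_cluster"] = two_clusters(df[activity_cols])                 # observed (RQ 1b)

# One-way ANOVAs comparing the two self-reported clusters on each observed
# frequency and on the course grade.
for outcome in activity_cols + ["grade"]:
    groups = [group[outcome].values for _, group in df.groupby("sr_cluster")]
    f_stat, p_value = f_oneway(*groups)
    print(f"{outcome}: F = {f_stat:.2f}, p = {p_value:.3f}")

# RQ 1c: cross-tabulate the two cluster solutions and test the association.
table = pd.crosstab(df["sr_cluster"], df["obs_cluster"])
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(table)
print(f"chi-square({dof}) = {chi2:.2f}, p = {p:.3f}")
```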

To answer the second research question, first we categorized students into four groups: 1) poor study approaches by both self-report and observation; 2) poor observed study approaches but good self-reported study approaches; 3) good observed study approaches but poor self-reported study approaches; and 4) good study approaches by both self-report and observation. We then compared students’ academic learning outcomes between groups using a one-way ANOVA. Because of the small sample size, power analyses were also conducted for all the one-way ANOVAs.
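
The second research question could be handled along similar lines; the sketch below forms the four consistency groups, runs the one-way ANOVA on course grades, and computes the achieved power with statsmodels. The boolean columns sr_deep and obs_active (flagging a self-reported Deep and an observed Active Study Approach) and the file name are hypothetical, and the effect size passed to the power calculation is Cohen’s f, which in practice would be derived from the observed eta-squared.

```python
import pandas as pd
from scipy.stats import f_oneway
from statsmodels.stats.power import FTestAnovaPower

# Hypothetical per-student table: 'sr_deep' and 'obs_active' are boolean flags
# for a self-reported Deep and an observed Active Study Approach; 'grade' is
# the course grade (column names are assumptions for illustration).
df = pd.read_csv("students_clustered.csv")

def consistency_group(row: pd.Series) -> int:
    """Map the two cluster memberships onto the four consistency groups."""
    if not row["sr_deep"] and not row["obs_active"]:
        return 1  # poor study approaches by both self-report and observation
    if row["sr_deep"] and not row["obs_active"]:
        return 2  # good self-reported but poor observed study approaches
    if not row["sr_deep"] and row["obs_active"]:
        return 3  # good observed but poor self-reported study approaches
    return 4      # good study approaches by both self-report and observation

df["group"] = df.apply(consistency_group, axis=1)

# One-way ANOVA of course grades across the four consistency groups.
grades_by_group = [group["grade"].values for _, group in df.groupby("group")]
f_stat, p_value = f_oneway(*grades_by_group)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")

# Achieved power for a four-group ANOVA with the full sample and alpha = .05.
# effect_size is Cohen's f; for example, an eta-squared of .19 corresponds to
# f = sqrt(.19 / .81), roughly 0.48.
power = FTestAnovaPower().solve_power(effect_size=0.48, nobs=143,
                                      alpha=0.05, k_groups=4)
print(f"achieved power = {power:.2f}")
```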

Results

Results of research question 1a – differences in students’ observed frequencies of online learning activities by their self-reported study approaches

The hierarchical cluster analysis using the mean scores of the Deep and Surface Scales identified a two-cluster solution, with 57 and 86 students in each cluster respectively (see Table 1). The one-way ANOVAs showed significant differences in their self-reported study approaches: Deep Study Approach (F(1, 142) = 110.03, p < .01, η2 = .44) and Surface Study Approach (F(1, 142) = 22.28, p < .01, η2 = .14). Cluster 1 students reported significantly higher ratings on the Deep Scale but lower ratings on the Surface Scale than cluster 2 students. Hence, cluster 1 students were referred to as reporting a Deep Study Approach and cluster 2 students as reporting a Surface Study Approach. In terms of the observed frequencies of online learning activities between students reporting a Deep and a Surface Study Approach, the one-way ANOVAs found that, out of the five online learning activities, they differed significantly on three: readings (F(1, 142) = 6.59, p < .05, η2 = .05); exercises (F(1, 52) = 4.05, p < .05, η2 = .03); and problem-solving questions (F(1, 142) = 7.39, p < .05, η2 = .05). So did the course grades (F(1, 142) = 4.55, p < .05, η2 = .05). Students reporting a Deep Study Approach were found to participate in these three online learning activities significantly more frequently and to obtain significantly higher course grades than students reporting a Surface Study Approach. The results of the power analyses for all the significant results were above .70, except for the frequencies of exercises (.52) and course grades (.56), which also showed small effect sizes (η2 = .03 and η2 = .05 respectively). These results suggest that a larger sample size would be required to detect differences in the frequencies of exercises and course grades between the two clusters of students with large effect sizes, 70% power, and alpha at .05.

Table 1. Observed frequencies of online learning activities by different self-reported study approaches.

https://doi.org/10.1371/journal.pone.0286549.t001

Results of research question 1b – differences in students’ self-reported study approaches by their observed study approaches

The hierarchical cluster analysis using observational log data also produced a two-cluster solution, with 88 and 55 students respectively (see Table 2). The one-way ANOVAs showed significant differences in the frequencies of all five online learning activities between the two clusters: pre-recorded lectures (F(1, 142) = 36.78, p < .01, η2 = .21); quizzes (F(1, 142) = 6.29, p < .01, η2 = .04); readings (F(1, 142) = 97.83, p < .01, η2 = .41); exercises (F(1, 142) = 22.78, p < .01, η2 = .14); and problem-solving questions (F(1, 142) = 117.17, p < .01, η2 = .45). Cluster 1 students were observed to participate in all five online learning activities significantly more frequently than cluster 2 students. Therefore, cluster 1 and cluster 2 students were considered to have adopted an Active Study Approach and a Passive Study Approach respectively.

Table 2. Self-reported study approaches by clusters of different observed study approaches.

https://doi.org/10.1371/journal.pone.0286549.t002

The results of the one-way ANOVAs further showed that students who used an Active Study Approach reported significantly higher ratings on the Deep Scale (F(1, 142) = 10.27, p < .01, η2 = .07) and obtained significantly higher grades (F(1, 142) = 10.27, p < .01, η2 = .07) than students who employed a Passive Study Approach. However, the two clusters of students did not differ in their reporting on the Surface Scale. The results of the power analyses for all the significant results were above .70.

Results of research question 1c – consistency between self-reported and observed study approaches

The results of the cross-tabulation demonstrated a significant and moderate association between the cluster memberships resulting from the self-reported and observed study approaches: χ2(1) = 14.71, p < .01, φ = .32. Table 3 shows that amongst students who self-reported a Deep Study Approach, the proportion of students who adopted an Active Study Approach (80.7%) was significantly higher than the proportion who adopted a Passive Study Approach (19.3%). In contrast, of the students who self-reported a Surface Study Approach, the proportion of students who used a Passive Study Approach (51.2%) was significantly higher than the proportion who used an Active Study Approach (48.8%).
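
For a 2 × 2 table, the phi coefficient follows directly from the chi-square statistic and the sample size (φ = √(χ²/N)); a quick check with the values reported above reproduces the association.

```python
from math import sqrt

# Phi coefficient for a 2 x 2 contingency table, using the chi-square (14.71)
# and sample size (143) reported above.
chi_square, n = 14.71, 143
phi = sqrt(chi_square / n)
print(round(phi, 2))  # 0.32, matching the reported moderate association
```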

Results of research question 2 – academic learning outcomes by levels of consistency between self-reported and observed study approaches

The results of the one-way ANOVA revealed that students in the four groups (group 1 – poor study approaches by both self-report and observation: n = 44; group 2 – poor observed study approaches but good self-reported study approaches: n = 11; group 3 – good observed study approaches but poor self-reported study approaches: n = 42; and group 4 – good study approaches by both self-report and observation: n = 44) differed significantly in their academic learning outcomes (F(3, 142) = 10.69, p < .01, η2 = .19). The power analysis showed an observed power of 1.00. The pairwise post-hoc comparisons showed that group 4 students (who had good study approaches by both self-report and observation) and group 3 students (who had good observed study approaches but poor self-reported study approaches) obtained significantly higher grades than group 2 students (who had poor observed study approaches but good self-reported study approaches) and group 1 students (who had poor study approaches by both self-report and observation). However, there were no significant differences either between students in group 4 and group 3 or between students in group 2 and group 1.

Discussion

The present study examined: 1) the consistency between students’ self-reported and observed study approaches in flipped classroom courses adopting a SAL research framework; and 2) students’ academic learning outcomes by the levels of consistency between their self-reported and observed study approaches.

Consistency between self-reported and observed study approaches

We detected students’ study approaches using both self-reported and observational log data. Whether we used self-reported or observational log data to identify study approaches, the other type of data only showed partial consistency. The two clusters of students reporting a Deep and a Surface Study Approach only differed in the frequencies of participation in three out of five online learning activities, namely readings, exercises, and problem-solving questions. One possible reason why the two clusters of students did not differ in the frequencies of pre-recorded lectures and quizzes (which tested students’ understanding of the lectures) could be that completion of the quizzes contributed to 20% of the course grades.

The two clusters of students who adopted an Active and a Passive Study Approach only differed on the Deep Scale but not the Surface Scale. In addition, the partial consistency was further supported by the moderate association between the student clusters resulting from the self-reported and observed study approaches. Our partial consistency results aligned with those of Gašević et al. [24].

However, students’ academic learning outcomes showed a consistent pattern irrespective of which type of data was used to identify students’ study approaches. We found that students who reported a Deep Study Approach scored significantly higher than their peers who reported a Surface Study Approach. Similarly, students who were observed to be more active in online learning also obtained significantly better grades than their counterparts who were observed to be less active in online activities.

Academic learning outcomes by levels of consistency between self-reported and observed study approaches

In this study, we found that less than two thirds (61.5%) of students’ study approaches were consistent between how they reported they learned and how they were observed to learn. Similarly, in a study by Ye and Pennisi on levels of consistency between students’ self-regulated learning strategies by self-reports and observation, less than two thirds of students were in the consistent groups (from 30.8% to 53.9%, depending on whether three or two types of self-regulated learning strategies were considered) [23].

Of the remaining students (38.5%) in the two inconsistent groups, the majority were in the group with good observed study approaches but poor self-reported study approaches (29.4%). Furthermore, the two consistent groups of students, who had either better or poorer study approaches as reflected by both self-reported and observational log data, also obtained higher and lower course grades respectively. Moreover, our results showed that students who had good study approaches by both self-report and observation (group 4) did not differ on course grades from students who adopted a good study approach by observation but reported a Surface Study Approach (group 3). Likewise, there was no significant difference in academic learning outcomes between those who had poor study approaches by both self-report and observation (group 1) and those who adopted less desirable online study approaches by observation but reported a Deep Study Approach (group 2). The qualitative results of Ye and Pennisi’s [23] study revealed that students with poor monitoring and self-reflection abilities (identified as poor self-regulated learners) were less accurate in reporting their learning. However, as the present study did not use qualitative methods, the reasons for the inconsistencies between self-reported and observed study approaches among the participants remain unknown.

Limitations and directions for future research

Some limitations of the study need to be pointed out so that future research designs can improve on them. To start with, the sample size of the study was small and the recruitment only focused on a single discipline, computer science. Previous research has shown disciplinary variations in terms of preferred study approaches. For instance, a recent study by Ng and Yong [56] compared study approaches used by undergraduates from two science disciplines and found that biosciences undergraduates tended to report deep approaches, whilst a higher proportion of pharmacy undergraduates reported surface approaches. Ng and Yong’s study only used self-reported data to measure study approaches across disciplines. Future research may combine both self-reported and observational log data to examine the study approaches of students from diverse disciplines so that possible disciplinary variations can be discovered. Second, although the R-SPQ is a commonly used self-reported instrument in the SAL research framework, it does not make a clear distinction between students’ study approaches in the face-to-face and online components. However, the observational log data were concerned only with the online part of the learning in the flipped classroom course. Future studies should use a self-reported questionnaire which can clearly distinguish between how students go about learning in person and online. Furthermore, the present study only used one type of observational log data, frequency, to examine students’ observed study approaches. The advancement of learning analytics functions has made it possible to record diverse types of log data, such as time spent on different types of online learning activities (duration), the exact time at which a logged event occurs (timestamp), and the sequences of online events (sequence) [22,57–59]. Therefore, future studies may employ multiple types of log data to examine students’ observed study approaches [49,50]. An additional limitation for future research to address is to examine possible reasons behind the two inconsistent groups, namely students with good observed study approaches but poor self-reported study approaches and students with poor observed study approaches but good self-reported study approaches. Future studies can use qualitative methods, such as interviews, to find out why there is a mismatch between how students reported their study approaches and what they actually did in learning.

Conclusion

The present study adopted the framework of Student Approaches to Learning research to compare the levels of consistency of study approaches measured by a self-reported questionnaire and log data extracted from the LMS among a cohort of Australian computer science undergraduates in a flipped classroom course. It also examined the academic learning outcomes of students who showed consistent and inconsistent study approaches by self-reports and observation. A positive and moderate association was found between students’ self-reported and observed study approaches. Students’ academic learning outcomes also differed by the levels of consistency between their self-reported and observed study approaches. Due to the quantitative nature of the study, the reasons for the inconsistencies between self-reported and observed study approaches remain unknown; they may be revealed by combining quantitative and qualitative methods in future research.

References

  1. Cho HJ, Zhao K, Lee CR, Runshe D, Krousgrill C. Active learning through flipped classroom in mechanical engineering: improving students’ perception of learning and performance. International Journal of STEM Education. 2021 Dec;8(1):1–3. pmid:34312588
  2. Karanicolas S, Snelling C, Winning R, Chapman J, Williams A, Douglas T, et al. Translating concept into practice: enabling first-year health sciences teachers to blueprint effective flipped learning approaches. 2019 Jan.
  3. Yough M, Merzdorf HE, Fedesco HN, Cho HJ. Flipping the classroom in teacher education: Implications for motivation and learning. Journal of Teacher Education. 2019 Nov;70(5):410–22.
  4. Thai NT, De Wever B, Valcke M. The impact of a flipped classroom design on learning performance in higher education: Looking for the best “blend” of lectures and guiding questions with feedback. Computers & Education. 2017 Apr 1;107:113–26.
  5. Mok HN. Teaching tip: The flipped classroom. Journal of Information Systems Education. 2014;25(1):7.
  6. Gilboy MB, Heinerichs S, Pazzaglia G. Enhancing student engagement using the flipped classroom. Journal of Nutrition Education and Behavior. 2015 Jan 1;47(1):109–14. pmid:25262529
  7. Yelamarthi K, Drake E. A flipped first-year digital circuits course for engineering and technology students. IEEE Transactions on Education. 2014 Sep 18;58(3):179–86.
  8. Love B, Hodge A, Grandgenett N, Swift AW. Student learning and perceptions in a flipped linear algebra course. International Journal of Mathematical Education in Science and Technology. 2014 Feb 17.
  9. Chen Y, Wang Y, Chen NS. Is FLIP enough? Or should we use the FLIPPED model instead? Computers & Education. 2014 Oct 1;79:16–27.
  10. Kim MK, Kim SM, Khera O, Getman J. The experience of three flipped classrooms in an urban university: An exploration of design principles. The Internet and Higher Education. 2014 Jul 1;22:37–50.
  11. Kanelopoulos J, Papanikolaou K, Zalimidis P. Flipping the classroom to increase students’ engagement and interaction in a mechanical engineering course on machine design. International Journal of Engineering Pedagogy. 2017 Oct;7(4):19–34.
  12. Lax N, Morris J, Kolber BJ. A partial flip classroom exercise in a large introductory general biology course increases performance at multiple levels. Journal of Biological Education. 2017 Oct 2;51(4):412–26.
  13. Fenwick T, Edwards R, Sawchuk P. Emerging approaches to educational research: Tracing the socio-material. Routledge; 2015 Apr 8.
  14. Richardson JT. Student learning in higher education: a commentary. Educational Psychology Review. 2017 Jun;29(2):353–62.
  15. Matcha W, Gašević D, Uzir NA, Jovanovic J, Pardo A, Lim L, et al. Analytics of Learning Strategies: Role of Course Design and Delivery Modality. Journal of Learning Analytics. 2020;7(2):45–71.
  16. Zhou M, Winne PH. Modeling academic achievement by self-reported versus traced goal orientation. Learning and Instruction. 2012 Dec 1;22(6):413–9.
  17. Hitt C, Trivitt J, Cheng A. When you say nothing at all: The predictive power of student effort on surveys. Economics of Education Review. 2016 Jun 1;52:105–19.
  18. Zamarro G, Cheng A, Shakeel MD, Hitt C. Comparing and validating measures of non-cognitive traits: Performance task measures and self-reports from a nationally representative internet panel. Journal of Behavioral and Experimental Economics. 2018 Feb 1;72:51–60.
  19. Baker R, Siemens G. Learning analytics and educational data mining. Cambridge Handbook of the Learning Sciences. 2014:253–72.
  20. Reimann P, Markauskaite L, Bannert M. e-Research and learning theory: What do sequence and process mining methods contribute? British Journal of Educational Technology. 2014 May;45(3):528–40.
  21. Han F, Pardo A, Ellis RA. Students’ self-report and observed learning orientations in blended university course design: How are they related to each other and to academic performance? Journal of Computer Assisted Learning. 2020 Dec;36(6):969–80.
  22. Han F, Ellis RA, Guan E. Patterns of students’ collaborations by variations in their learning orientations in blended course designs: How is it associated with academic achievement? Journal of Computer Assisted Learning. 2022 Oct 14.
  23. Ye D, Pennisi S. Using trace data to enhance students’ self-regulation: A learning analytics perspective. The Internet and Higher Education. 2022 Jun 1;54:100855.
  24. Gašević D, Jovanović J, Pardo A, Dawson S. Detecting learning strategies with analytics: Links with self-reported measures and academic performance. Journal of Learning Analytics. 2017 Jul 5;4(2):113–28.
  25. Zusho A. Toward an integrated model of student learning in the college classroom. Educational Psychology Review. 2017 Jun;29(2):301–24.
  26. Prosser M, Trigwell K. Exploring University Teaching and Learning: Experience and Context. Cham: Springer; 2020.
  27. Biggs JB. Approaches to the enhancement of tertiary teaching. Higher Education Research and Development. 1989 Jan 1;8(1):7–25.
  28. Guo JP, Yang LY, Zhang J, Gan YJ. Academic self-concept, perceptions of the learning environment, engagement, and learning outcomes of university students: relationships and causal ordering. Higher Education. 2022 Apr;83(4):809–28.
  29. Entwistle N, Ramsden P. Understanding student learning (Routledge Revivals). Routledge; 2015 Aug 20.
  30. Laurillard D. Styles and approaches in problem-solving. The Experience of Learning. 1997;2:127.
  31. Limbu L, Markauskaite L. How do learners experience joint writing: University students’ conceptions of online collaborative writing tasks and environments. Computers & Education. 2015 Mar 1;82:393–408.
  32. Asikainen H, Gijbels D. Do students develop towards more deep approaches to learning during studies? A systematic review on the development of students’ deep and surface approaches to learning in higher education. Educational Psychology Review. 2017 Jun;29(2):205–34.
  33. Edmunds R, Richardson JT. Conceptions of learning, approaches to studying and personal development in UK higher education. British Journal of Educational Psychology. 2009 Jun;79(2):295–309. pmid:19017434
  34. Lonka K, Olkinuora E, Mäkinen J. Aspects and prospects of measuring studying and learning in higher education. Educational Psychology Review. 2004 Dec;16(4):301–23.
  35. Marton F, Säljö R. On qualitative differences in learning: I—Outcome and process. British Journal of Educational Psychology. 1976 Feb;46(1):4–11.
  36. Nelson Laird TF, Seifert TA, Pascarella ET, Mayhew MJ, Blaich CF. Deeply affecting first-year students’ thinking: Deep approaches to learning and three dimensions of cognitive development. The Journal of Higher Education. 2014 May 1;85(3):402–32.
  37. Marton F, Booth S. Learning and awareness. Routledge; 2013 Feb 1.
  38. Säljö R. Learning about learning. Higher Education. 1979 Jul;8(4):443–51.
  39. Crawford K, Gordon S, Nicholas J, Prosser M. Qualitatively different experiences of learning mathematics at university. Learning and Instruction. 1998 Oct 1;8(5):455–68.
  40. Hay DB. Using concept maps to measure deep, surface and non-learning outcomes. Studies in Higher Education. 2007 Feb 1;32(1):39–57.
  41. Guo J. Building bridges to student learning: Perceptions of the learning environment, engagement, and learning outcomes among Chinese undergraduates. Studies in Educational Evaluation. 2018 Dec 1;59:195–208.
  42. Lindblom-Ylänne S, Lonka K, Leskinen E. On the predictive value of entry-level skills for successful studying in medical school. Higher Education. 1999 Apr;37(3):239–58.
  43. Vermunt JD, Donche V. A learning patterns perspective on student learning in higher education: state of the art and moving forward. Educational Psychology Review. 2017 Jun;29(2):269–99.
  44. Bettinger EP, Baker RB. The effects of student coaching: An evaluation of a randomized experiment in student advising. Educational Evaluation and Policy Analysis. 2014 Mar;36(1):3–19.
  45. Krumm AE, Waddington RJ, Teasley SD, Lonn S. A learning management system-based early warning system for academic advising in undergraduate engineering. In: Learning Analytics. New York, NY: Springer; 2014. pp. 103–119.
  46. Gibson A, Aitken A, Sándor Á, Buckingham Shum S, Tsingos-Lucas C, Knight S. Reflective writing analytics for actionable feedback. In: Proceedings of the Seventh International Learning Analytics & Knowledge Conference; 2017 Mar 13. pp. 153–162.
  47. Kaendler C, Wiedmann M, Rummel N, Spada H. Teacher competencies for the implementation of collaborative learning in the classroom: A framework and research review. Educational Psychology Review. 2015 Sep;27(3):505–36.
  48. Ocumpaugh J, Baker R, Gowda S, Heffernan N, Heffernan C. Population validity for Educational Data Mining models: A case study in affect detection. British Journal of Educational Technology. 2014 May;45(3):487–501.
  49. Fincham E, Gašević D, Jovanović J, Pardo A. From study tactics to learning strategies: An analytical method for extracting interpretable representations. IEEE Transactions on Learning Technologies. 2018 Apr 5;12(1):59–72.
  50. Jovanović J, Gašević D, Dawson S, Pardo A, Mirriahi N. Learning analytics to unveil learning strategies in a flipped classroom. The Internet and Higher Education. 2017 Apr 1;33(4):74–85.
  51. Han F, Vaculíková J, Juklová K. The relations between Czech undergraduates’ motivation and emotion in self-regulated learning, learning engagement, and academic success in blended course designs: Consistency between theory-driven and data-driven approaches. Frontiers in Psychology. 2022;13. pmid:36467195
  52. Ellis RA, Han F, Pardo A. Improving learning analytics – Combining observational and self-report data on student learning. Journal of Educational Technology & Society. 2017 Jul 1;20(3):158–69.
  53. Ober TM, Hong MR, Rebouças-Ju DA, Carter MF, Liu C, Cheng Y. Linking self-report and process data to performance as measured by different assessment types. Computers & Education. 2021 Jul 1;167:104188.
  54. Ellis RA, Han F, Pardo A. Improving learning analytics – Combining observational and self-report data on student learning. Journal of Educational Technology & Society. 2017 Jul 1;20(3):158–69.
  55. Biggs J, Kember D, Leung DY. The revised two-factor study process questionnaire: R-SPQ-2F. British Journal of Educational Psychology. 2001 Mar;71(1):133–49. pmid:11307705
  56. Ng ZX, Yong PH. The implication of multicultural education on students’ learning approaches in biosciences and pharmacy courses. Journal of Applied Research in Higher Education. 2022 Dec 6;14(4):1466–79.
  57. Hadwin AF, Nesbit JC, Jamieson-Noel D, Code J, Winne PH. Examining trace data to explore self-regulated learning. Metacognition and Learning. 2007 Dec;2(2):107–24.
  58. Winters FI, Greene JA, Costich CM. Self-regulation of learning within computer-based learning environments: A critical analysis. Educational Psychology Review. 2008 Dec;20(4):429–44.
  59. Bannert M, Molenaar I, Azevedo R, Järvelä S, Gašević D. Relevance of learning analytics to measure and support students’ learning in adaptive educational technologies. In: Proceedings of the Seventh International Learning Analytics & Knowledge Conference; 2017 Mar 13. pp. 568–569.