
Ready student one: Exploring the predictors of student learning in virtual reality

  • J. Madden ,

    Roles Conceptualization, Formal analysis, Investigation, Validation, Visualization, Writing – original draft

    jmadden@astro.cornell.edu

    Affiliation Astronomy and Space Sciences, Cornell University, Ithaca, NY, United States of America

  • S. Pandita,

    Roles Conceptualization, Formal analysis, Investigation, Methodology, Resources, Software, Validation, Writing – review & editing

    Affiliation Communication, Cornell University, Ithaca, NY, United States of America

  • J. P. Schuldt,

    Roles Conceptualization, Formal analysis, Funding acquisition, Investigation, Methodology, Resources, Supervision, Writing – review & editing

    Affiliation Communication, Cornell University, Ithaca, NY, United States of America

  • B. Kim,

    Roles Conceptualization, Formal analysis, Investigation, Methodology, Project administration, Resources, Validation, Writing – review & editing

    Affiliation Communication, Cornell University, Ithaca, NY, United States of America

  • A. S. Won,

    Roles Conceptualization, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Validation, Writing – review & editing

    Affiliation Communication, Cornell University, Ithaca, NY, United States of America

  • N. G. Holmes

    Roles Conceptualization, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Validation, Writing – review & editing

    Affiliation Laboratory of Atomic and Solid State Physics, Cornell University, Ithaca, NY, United States of America


Abstract

Immersive virtual reality (VR) has enormous potential for education, but classroom resources are limited. Thus, it is important to identify whether and when VR provides sufficient advantages over other modes of learning to justify its deployment. In a between-subjects experiment, we compared three methods of teaching Moon phases (a hands-on activity, VR, and a desktop simulation) and measured student improvement on existing learning and attitudinal measures. While a substantial majority of students preferred the VR experience, we found no significant differences in learning between conditions. However, we found differences between conditions based on gender, which was highly correlated with experience with video games. These differences may indicate certain groups have an advantage in the VR setting.

Introduction

In twenty-first century education, technology is pervasive in our classrooms [1]. Research has found many ways in which technology benefits student learning and attitudes towards science [2]. As new instructional technologies are developed, it is necessary that researchers conduct critical evaluations of their effectiveness. There are many open research questions related to identifying how to most effectively use different kinds of technology for different learning goals or for different points in the learning process. One technology that has received particular attention for its potential in science learning is virtual reality and other immersive media [3, 4].

In a typical college lecture for a science course, an instructor can choose to engage students with a concept using technologies such as specialized equipment for an interactive lecture demonstration [5], a dynamic computer simulation [6], or classroom polling with personal response systems [7]. Science courses also employ technology extensively in instructional labs, where students can use technology to obtain first-hand experiences with the phenomenon [8]. Instructional labs, however, have failed to provide measurable gains in student learning of those phenomena [9, 10], reminding us that what is important for learning is not the technology itself, but how and why it is used [2]. For example, using relatively similar equipment, interactive lecture demonstrations have consistently found measurable learning gains when students are actively engaged in making predictions and explaining the observations from the demonstrations [5, 11]. Interactive simulations have also been shown to improve or replicate learning compared to hands-on manipulatives, while reducing the associated resources [12, 13]. While there are many potential explanations for these differences [14], two relate to how learners and technology interact through issues of embodiment and real-world complexities. This study aimed to test the impact of these variables by directly comparing student learning and attitudes from three different instructional technologies (a hands-on activity, desktop simulation, and virtual reality simulation), while taking advantage of their respective affordances along the dimensions of embodiment and real-world complexity.

Embodiment

Theories of learning argue that cognition is inherently embodied: “the mind must be understood in the context of its relationship to a physical body that interacts with the world” [15, p.625]. Research has found that learning benefits from activities that explicitly attend to embodied cognition [16–18]. There are several ways in which embodiment is argued to support learning, generally tied to a hypothesis whereby activities help move cognition from abstract to concrete representations of a phenomenon [16].

Two such notions of embodied cognition focus on how learners off-load cognitive work onto the environment and how off-line cognition is body-based [15, 19]. These notions suggest physical aspects of cognition, whereby learning is supported through engaging perceptuo-motor systems [20]. Indeed, nearly all science education advocates for the use of interactive hands-on activities [21]. Through hands-on activities and demonstrations, learners connect abstract concepts to their physical environment. For example, researchers in physics education have developed embodied activities for teaching concepts of energy conservation [22]. In these activities, learners assign units of energy to physical objects (either cubes or people) [23], and then manipulate those objects to represent processes of energy transfer and dynamics. Through these concrete representations of an otherwise abstract phenomenon, learners develop their conceptual and mechanistic understandings of the phenomenon [22]. By manipulating physical objects, students can see deep features of the phenomena, allowing them to effectively integrate the features into their mental models of the phenomena [20, 24].

Two other notions of embodied cognition focus on how cognition is situated within the real world and how the learning environment is part of the cognitive system [15, 19]. Because the knowledge learned is tied to the environment in which it was learned, it is argued that there are limitations to applying the knowledge in new environments [25]. In science education, the initial learning environment can often seem too far removed or distinct from the intended environment for applying the knowledge. Context-rich activities can provide concrete, real-world scenarios for otherwise abstract learning activities. Alternatively, activities may be modified to better represent the real world, such as by providing realistic representations of the phenomena.

These activities also demonstrate that there are degrees of embodiment. These may range from metaphorical groundings of cognition (for example, being able to imagine oneself walking from point A to point B) to physically experiencing the phenomenon (for example, actually walking from point A to point B) [16]. While much research has demonstrated the benefits of physically experiencing the phenomenon through hands-on activities, the results are not universal [9, 12, 13, 26, 27]. One explanation for the lack of clear benefit for hands-on activities is that when the activities and materials are complicated or difficult to manipulate, the learner may experience extraneous cognitive load [28] or may be likely to make mistakes [5]. That is, real-world complexity gets in the way of student learning in hands-on activities.

Real-world complexity

In a real-world experiment, students’ measurements and observations are prone to variability, systematic effects, and measurement mistakes that are not relevant to the theoretical concepts being taught. In general, people struggle to evaluate uncertain events [29] and use error-prone shortcuts and heuristics in making judgments under uncertainty [30]. Learning an underlying concept, then, becomes difficult if the information being used to develop that understanding is uncertain or probabilistic.

For hands-on activities, the uncertain and probabilistic elements contribute to extraneous cognitive load during the activity. Cognitive load refers to a learner’s capacity for processing information in their short-term memory [28]. Information can either contribute to extraneous (ineffective) cognitive load or germane (effective) cognitive load [28]. Germane cognitive load refers to information that is relevant for learning, such that large amounts of germane cognitive load can improve learning [31]. Extraneous cognitive load refers to information that is irrelevant for learning, such that large amounts of extraneous cognitive load impede learning [28, 32]. In hands-on activities, issues of uncertainty, complicated equipment, and user mistakes contribute to extraneous cognitive load, which may hamper learning.

To reduce that cognitive load, hands-on activities often use heavily guided instructions that attempt to limit mistakes, tell students how to manipulate the equipment, and reduce uncertainties [8]. However, constructivist theories of learning argue that students must have the ability to explore and generate their own knowledge [33]. The key is to develop activities with high germane cognitive load, but low extraneous cognitive load [31, 34].

Simulations are one way to remove the extraneous cognitive load of real-world hands-on activities, allowing phenomena to be demonstrated in a consistent and controlled way. With that control, they can still be relatively unstructured, maintaining high germane cognitive load. Students can autonomously and easily change variables, allowing them to learn at their own pace [35, 36]. Several studies have found that computer simulations produce equal [13, 37, 38] or better [12, 13] learning than hands-on activities. Simulations also provide opportunities for students to see features of a phenomenon that they would be unable to see otherwise, for example abstract concepts such as heat [39] or microscopic objects such as cells [40] or electrons [12, 41].

Simulations, however, provide a more limited embodied cognition experience than hands-on activities, where the learner interacts directly with the phenomenon. Virtual reality is a technology that can potentially employ high levels of embodiment while maintaining controlled and simplified representations of the phenomena to be learned.

Why virtual reality?

Immersive virtual reality (VR) may provide the best of both worlds. VR allows embodied simulations and offers a number of other affordances [42, 43] that make it uniquely suited as a teaching tool for basic science. First, students can physically interact with content, providing the engagement of a hands-on activity but with the control and replicability of a simulation. Second, the simulations provide multiple forms of embodiment, such as changing perspectives to experience phenomena as they would in different circumstances in the real world [16]. Third, students can experience these perspectives in ways unavailable in the real world [35, 39, 41]. From a research perspective, the ability to track student movement allows for assessing engagement and learning [44] to better test the embodiment hypothesis. It also facilitates implementing interventions to increase learning in real time.

There is thus a need for experimental research that directly tests the effectiveness of VR on science learning, over-and-above that offered by existing hands-on and simulation approaches. Several studies have compared learning between these three modalities and found that, in terms of student attitudes, participants generally prefer learning in VR over other modalities [40, 45]. One study even found that students’ attitudes towards socio-scientific issues improved more in an augmented reality (AR) simulation over a desktop simulation, even though these aspects were secondary to the activity’s primary cognitive goal [40]. Related to embodiment, several studies have found that participants became more immersed in VR than in other environments [46–48]. Generally, this immersion seems to be independent of personal characteristics, such as gender, VR experience, and time spent gaming [46].

Findings on learning from VR are mixed. Studies have found that participants in VR learn more than [39, 48], as much as [40, 45], or less than [47] participants in hands-on or desktop conditions. When learning was improved in VR over a hands-on activity, the gains were attributed to immediate feedback available through the simulation and visualization of the abstract phenomenon that was otherwise imperceptible in the hands-on activity [39]. When learning was hindered in VR over a desktop simulation, the differences were attributed to higher cognitive load [47]. In the latter study, where participants answered multiple-choice questions and performed technical lab procedures in the two types of simulation, researchers found that students in the desktop simulation condition learned more than students in the VR condition on a conceptual test. However, learning was the same between conditions on a transfer test and participants overwhelmingly preferred the VR condition. The researchers also found that participants in the VR condition had significantly higher cognitive load as measured through an electroencephalogram (EEG). They suggested that the physical manipulation of the equipment in VR was more complicated than in the desktop condition, which may have increased students’ extraneous cognitive load, impacting learning. They recommended experiments that used more natural control systems to manipulate the environment. Furthermore, the researchers argued that the enjoyment associated with VR actually distracted the learners from learning.

In addition to cognitive load, there are also questions about gender differences and about experience with 3D rotations gained through video games. In one study, men outperformed women during the task itself, but there was no difference on a post-condition recall test [49]. In a study that found no overall differences in learning from VR compared with video and static images, the researchers found that men, and participants with video game experience, performed better in the VR condition than other participants did [45]. Because gender was also correlated with video game experience, they hypothesized that video game experience was a proxy for the gender differences emerging in their study. This result is somewhat surprising given that neither condition involved much interaction with the simulations (even the VR participants could only look around the simulation).

Research has found, however, that performance is improved when participants can more fully interact with the simulation: for example, walking around the simulation compared with remaining stationary while looking around the simulation [49]. Furthermore, one study found that students’ reported sense of presence in a simulation was correlated with their learning from the simulation [48]. This study also found that preferential learning from VR was confined to sub-topics involving “dynamic three-dimensional processes, but not processes that can be represented statically in two dimensions” [48, p.1]. This suggests that VR simulations are more effective when they take advantage of their specific affordances.

The existing research exemplifies the idea that it is not the technology, but how it is used, that promotes learning. In this study, we aimed to develop and test a VR simulation that took advantage of its various affordances, particularly related to embodiment and real-world complexity. We probed these ideas by comparing an interactive VR condition with analogous hands-on and desktop simulation activities for learning about Moon phases. As per the previous research, we evaluated students’ conceptual knowledge, long-term retention, attitudes towards the activity, and socio-scientific beliefs. We also compared differential effects on sub-populations of our participants, including evaluating effects for gender, video game experience, and experience in VR. We found that there were no overall differences in learning on short-term or long-term assessments between conditions and that the immersive VR did not impact students’ socio-scientific beliefs. We also replicated previous work in that participants overwhelmingly preferred the VR condition and that men performed better in the VR condition, which may be attributed to video game experience.

Methods and materials

This study had two hypotheses. First, we proposed that there would be a main effect of virtual reality on learning. Second, we proposed that there would be corresponding effects on environmental attitude. We also collected data on other measures in order to explore interactions that might be predicted by the literature. All measures are reported below.

In our study, we used a between-subjects pre-post design, with three conditions. The three conditions were designed to express the same content using different educational tools: an immersive simulation using a VR headset, a computer-based interactive desktop simulation, and an analog hands-on activity (Fig 1). We chose the concept of Moon phases, as it was expected to benefit from embodiment and reduction of real-world complexities. A review of over 35 years of astronomy education literature found that phases of the Moon was one of the most challenging topics in astronomy education [50]. A lesson on Moon phases requires the student to place themselves in spatial and temporal perspectives of the Sun-Earth-Moon system that are generally inaccessible, which can be challenging through static images or text [51, 52]. Previous work has found that students’ spatial reasoning correlates with their understanding of lunar concepts [53, 54]. Understanding of Moon phases also requires understanding the dynamic evolution of the phases over time and space [54], likely facilitated through interacting with and moving around a simulation. Furthermore, the traditional hands-on activity for teaching Moon phases (described below) is susceptible to various real-world inaccuracies, such as creating eclipses every month or rotating or orbiting the wrong way. Research has also found that explanatory features of traditional descriptions and images can interfere with student understanding [51, 52], motivating the need for authentic real-world visualizations over abstracted ones.

Fig 1. Experiment procedure.

Map of the process for our experiment showing the three conditions. The individual shown has given written consent to have their likeness presented here.

https://doi.org/10.1371/journal.pone.0229788.g001

Participants

Participants were recruited from the undergraduate student population of a medium-sized private university. There were 172 participants, including 138 women, 31 men, and 3 other, all between the ages of 18 and 24. Our power analysis determined that, for three conditions to detect a Cohen’s d effect size of 0.25 with 80% statistical power at the 95% significance level, a sample size of 53 participants per condition (159 total) would be sufficient. We ran a total of 172 participants to buffer against potential exclusions. The study primarily drew from students currently enrolled in an introductory astronomy class (varying in major) and students majoring in Communication. Participants’ self-reported race and ethnicity were as follows: 62 Asian/Pacific Islander, 59 Caucasian, 21 African American, 18 bi/multiracial (including Asian/Pacific Islander, Caucasian, African American, Hispanic/Latinx, and Native American), 6 Hispanic/Latinx, 4 preferred not to answer, and 2 specified ‘other’ without elaboration. Participants could select as many categories as applied. All participants signed an informed-consent form before beginning the experiment. All aspects of the experiment were approved by the Cornell Institutional Review Board (Protocol #1708007381). Participants were compensated with course credit or 15 dollars in cash for their participation. The individual pictured in Fig 1 has provided written informed consent (as outlined in the PLOS consent form) to publish their image alongside the manuscript.

Participants first took a pre-test and were then randomly assigned to one of three conditions. After the activity (which included consistent self-guided lessons on Moon phases, described below), participants took a post-test that included a demographic and attitudinal survey. Finally, the participants were shown the other two conditions and asked which was their favorite learning method and why. These activities are summarized in Fig 1. A large preference for one condition over another may lead to greater retention of knowledge learned in that condition. To explore this potential effect, we contacted all participants four months after participating and asked them to complete another learning test (delayed post-test). We received 56 responses to the delayed post-test, making a breakdown by demographics difficult.

Conditions

Each condition was designed to give participants a similar learning experience using the three technologies we employed. The overall design was to recreate a Sun-Earth-Moon system that the participant could control in time in order to observe the changes in the Moon’s phase and the positions of the Sun, Earth, and Moon during each phase. In each condition the participant could move forward and backward in time and had control over the perspective from which the system was being viewed. Each condition also contained guiding questions for the participant to assist with navigating and learning from the simulation.

VR simulation.

The VR simulation was designed to mimic the hands-on activity as closely as possible, while still taking advantage of the technology’s unique affordances. In the VR simulation, participants used a headset and controllers that tracked their motion and rendered the environment, providing an immersive and interactive experience (Fig 1, left). The simulation contained a realistic Sun-Earth-Moon system in which the participant had control over time and their viewing location (Fig 2). Students could control the Moon phases by moving forwards and backwards in time with simple button presses or by grabbing the Moon to move it in its orbit. Upon entering the simulation, participants were initially placed on top of the Earth’s north pole, but they could change their position to be far above the Earth to provide a more diagrammatic view of the system or to be near the surface of the Earth to view a realistic horizon as the Sun or Moon rises and sets. The participants were given the guiding questions throughout the experience via a virtual clipboard attached to their hand to help them interact with and learn from the environment. The names of the Moon phases were displayed in the environment as the participant moved the Moon. The simulation was created by our team using the Unity game engine for use with the Oculus Rift VR headset.

Fig 2. Simulated activity.

A screenshot taken from inside the simulated Moon phases activity showing what the VR and Desktop conditions looked like. A video of the VR experience can be found online: https://vimeo.com/310212130.

https://doi.org/10.1371/journal.pone.0229788.g002

Desktop simulation.

The desktop simulation was designed to mimic the VR activity as closely as possible. In the desktop simulation, participants were shown a realistic Sun-Earth-Moon system on a laptop with controls over the camera position and time (Fig 1, middle). The environment was created using the same system as the VR simulation (Fig 2). Participants could control the Moon phases by going forwards and backwards in time through arrow key presses. They could also navigate around the environment with the mouse and zoom using the scroll wheel. The participants were given the guiding questions using a tablet device outside of the simulated environment. The names of the Moon phases were displayed in the environment as the participant moved the Moon. This simulation was created by our team using the Unity game engine.

Hands-on activity.

The hands-on activity was based on traditional classroom activities about Moon phases [55]. In this activity, the participant’s head represented the Earth; the Moon was represented by a small ball held at arm’s length; and the Sun was represented by a stationary light (Fig 1, right). Participants were asked to rotate counter-clockwise to observe changes in the Moon’s phase, as observed in the shadowed portion of the ball’s surface. Through this action, real-world complexities arise where the phenomenon may be inaccurately presented. For example, due to the relative distances and sizes between the ball and the participant’s head, participants may see eclipses every month. Subtleties such as the inclination and precession of the orbits are uncontrollable and missed in the hands-on activity. The participants were given the guiding questions using a tablet, as well as the names of the Moon phases.

Measures

To evaluate the effects of the different conditions, we assessed student understanding of Moon phases using existing pre-post assessment instruments, surveyed students’ attitudes towards the activities and the environment, and tracked their movements in the simulations.

Moon phases assessment.

The pre- and post-tests each consisted of 14 multiple choice questions about Moon phases and the Moon’s motion relative to the Earth, sourced from existing research-validated assessments [56–59]. The selected items included ones where the activities were both likely (orbit direction and period) and unlikely (rise and set times) to impact learning. Because the time between the pre- and post-tests was so short, the questions on each test were isomorphic and matched on content, but not identical. The delayed post-test questions were similar to the ones found on the post-test. On all tests, each question had only one correct answer, and the participant’s score was the sum of the number of correct answers, with all questions weighted equally. Based on all participant scores, the item-test correlation and item difficulty were 0.37 and 0.33 for the pre-test and 0.43 and 0.59 for the post-test, respectively.
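As an illustration of this scoring scheme, the following sketch computes participants’ total scores, an item’s difficulty (proportion correct), and its point-biserial item-test correlation. The answer key and response data are hypothetical, not the study’s data:

```python
from statistics import mean, pstdev

def score(responses, key):
    """Total score per participant: number of correct answers, equally weighted."""
    return [sum(int(r == k) for r, k in zip(row, key)) for row in responses]

def item_difficulty(responses, key, item):
    """Proportion of participants answering the item correctly."""
    return mean(int(row[item] == key[item]) for row in responses)

def item_test_correlation(responses, key, item):
    """Point-biserial correlation between an item (scored 0/1) and the total score."""
    correct = [int(row[item] == key[item]) for row in responses]
    totals = score(responses, key)
    mc, mt = mean(correct), mean(totals)
    cov = mean((c - mc) * (t - mt) for c, t in zip(correct, totals))
    return cov / (pstdev(correct) * pstdev(totals))

# Hypothetical data: 4 participants, 3 multiple-choice items
key = ["A", "C", "B"]
responses = [["A", "C", "B"], ["A", "B", "B"], ["C", "C", "A"], ["A", "C", "B"]]
```

In practice these statistics would be averaged over all 14 items per test.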

We also examined learning across conditions on each sub-topic on the assessments by comparing performance on the isomorphic questions. One of the deciding factors in choosing a certain technology as an educational tool is the affordances it offers to teach different aspects of the lesson. For learning about Moon phases, for example, the orbit and rotation periods are not controlled in the hands-on condition, but are constrained to be realistic in the desktop and VR simulations. Keeping the Moon’s orbit fixed is a task the participants must perform in the hands-on condition that does not require conscious effort in the other conditions. Such differences in which aspects of the activity each technology controls may therefore lead to differences in participant performance on knowledge questions in different topic areas, even when overall exam performance is the same between conditions. Our test contained a question pair that, upon investigation, was not truly isomorphic from pre-test to post-test: though the two questions referred to the same general topic of why phases occur, they did not match, and the pair was removed from our analysis. The removal did not significantly alter the results of our analysis.

Demographic survey.

The demographic survey was provided at the end of the post-test and asked about a variety of participants’ characteristics.

Gender: Participants were asked their gender with the choices of Male, Female, or Other. For the analysis involving gender as a variable we removed the three participants who chose Other, due to low sample size.

Video game experience: Participants were asked “On average, how frequently have you played video games over the past three years?” with the choices daily, weekly, 1-2 times a month, 1-2 times a year, or never. We grouped daily, weekly, and 1-2 times a month into the category of ‘having significant video game experience’ and 1-2 times a year or never into ‘not having significant video game experience.’

VR experience: Participants were asked “How much virtual reality experience did you have before you participated in the experiment today?” with the choices none, very minimum, moderate, or a lot. Only one participant indicated having a lot of VR experience, so moderate and a lot were grouped together, leaving three categories of VR experience: none, minimal, and moderate-to-high.

Academic Major: Participants were asked to pick their academic major from a list or write their own. For our analysis we grouped participants according to whether their major was science-focused or non-science-focused. Non-science majors included arts, humanities, economics, social science, communication, and business. Science majors included physics, astronomy, engineering, biology, chemistry, computer science, information science, and math.
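The groupings above amount to simple recodes of the raw survey categories. A minimal sketch (function names and group labels are illustrative; the category strings follow the survey wording):

```python
# Recode raw survey answers into the analysis groups described above.
SIGNIFICANT_GAMING = {"daily", "weekly", "1-2 times a month"}

def gaming_group(answer):
    """Binary video game experience grouping."""
    return "significant" if answer in SIGNIFICANT_GAMING else "not significant"

def vr_group(answer):
    """Three retained VR experience categories."""
    if answer == "none":
        return "none"
    if answer == "very minimum":
        return "minimal"
    return "moderate-to-high"  # 'moderate' and 'a lot' grouped together
```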

Environmental attitudes survey.

We also added an exploratory set of questions to probe participants’ socio-scientific attitudes through measurements of their individual differences in environmental attitudes, using 15 items selected from the Environmental Attitude Inventory (EAI) [60, 61]. We suspected that participants in our VR activity might come out with different socio-scientific attitudes, based on similar outcomes found in previous work [40, 62]. In our simulation, we wanted to test the possibility that viewing the planet from the unique vantage point of space (an astronaut-like perspective of the Earth) might have an impact on environmental consciousness [63–65]. This effect is attributed to a recognition of the planet’s limited resources [66]. Specifically, we selected five questions corresponding to each of three EAI subscales of particular interest, to test whether the intervention influenced participants’ environmental movement activism, environmental threat, and human utilization of nature. Sample items were “I would not want to donate money to support an environmentalist cause,” “When humans interfere with nature it often produces disastrous consequences (R),” and “In order to protect the environment, we need economic growth” (1 = Strongly agree to 7 = Strongly disagree), and the modified scale showed sufficient reliability (Cronbach’s α = .83).
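Cronbach’s α for such a scale can be computed from a participants × items score matrix. A minimal sketch with hypothetical 7-point responses (reverse-keyed items, marked (R) above, must first be recoded as 8 − x):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a participants x items score matrix.

    Reverse-keyed items should already be recoded (8 - x on a 1-7 scale).
    """
    k = len(items[0])                                       # number of items
    item_vars = [pvariance([row[i] for row in items]) for i in range(k)]
    total_var = pvariance([sum(row) for row in items])      # variance of scale totals
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical responses: 4 participants x 3 items on a 1-7 scale
rows = [[7, 6, 7], [2, 3, 2], [5, 5, 6], [1, 2, 1]]
alpha = cronbach_alpha(rows)
```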

Activity preference.

After completing the post-test, demographic questions, and EAI, participants were invited to try the other two conditions. They then completed a short survey that asked, “Today, you have experienced three different ways of learning moon phases. Which of the three ways to simulate the moon phases is your favorite method?” Participants could select either “Demonstration in virtual reality,” “Hands-on demonstration,” or “On-screen demonstration.” They were then asked, “Please briefly explain, why did you prefer this learning method?” with an open text box answer.

Virtual reality only analyses.

Analyses of presence and movement measures will be included in a subsequent publication, as only participants in the virtual reality condition answered these questions or had their head and hand movements tracked.

Presence: Fourteen questions from two presence questionnaires [67, 68] measured participants’ sense of spatial presence in the virtual reality activity. Questions drawn from these surveys involved asking participants how much they agreed with the following statements: “I was really in outer space,” “I felt surrounded by outer space,” “I really visited outer space,” and “The outer space seemed real.” The participants assigned to the VR condition were also asked if they had experienced any simulator sickness during the activity and if it was distracting.

Movement tracking and controller use: The X, Y, and Z position and the pitch, yaw, and roll rotation of participants’ head and hands were recorded for the entire session in the VR condition. Movement from timepoint to timepoint may be calculated as the Euclidean distance (mm) between the positions of a tracker at time one and time two. Because the current paper focuses on the comparisons between the three conditions, analysis of the VR specific data, which includes movement, button presses, and presence measures, will be presented in a subsequent publication focused only on participants’ experiences in the VR simulation.
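As a rough illustration of the movement metric described above, the per-frame displacement of a tracker can be computed from consecutive position samples. This is a minimal sketch in Python rather than the Mathematica/R pipeline used in the study; `frame_displacements` and the sample path are illustrative names and values, not study data.

```python
import math

def frame_displacements(positions):
    """Per-frame movement for one tracker (head or hand).

    positions: list of (x, y, z) tuples in mm, one per recorded timepoint.
    Returns the Euclidean distance (mm) travelled between each
    consecutive pair of timepoints.
    """
    return [
        math.dist(positions[i], positions[i + 1])
        for i in range(len(positions) - 1)
    ]

# Example: a tracker that moves 3 mm along x, then 4 mm along y.
path = [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0), (3.0, 4.0, 0.0)]
print(frame_displacements(path))  # [3.0, 4.0]
```

Summing these per-frame distances over a session would give a simple total-movement measure for each tracker.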

Data analysis

After all of the data had been collected, we cleaned the data set by merging the survey results and verifying participant responses. The full data set was checked for inaccuracies and duplicates and was converted into the proper form for analysis. Mathematica and R [69] packages were used for the analyses. All data and code are available through the CISER data archive (ciser.cornell.edu/data/data-archive/).

First, performance on the pre-test, post-test, and delayed post-test was compared between conditions using Analysis of Variance. Effects of condition on environmental attitudes were compared between conditions using ANOVA. Linear regression analyses were used to evaluate effects of other variables on student performance on the post-test, controlling for pre-test score. Variables were selected based on prior literature and included condition, gender, video game experience, and VR experience. Variables were checked for correlation using the ggpairs function from the GGally package in R, recoding each variable level as numeric. The only significant correlation was between gender and video game experience, with a correlation coefficient of 0.47. Gender and video game experience were, therefore, analyzed separately in all regression analyses. Variance Inflation Factors (VIF) were also calculated in all regression analyses to measure possible collinearity. Main effects were tested first without interactions, and then regressions with interactions between condition and video game experience, gender, and VR experience were tested. For the regression analyses, the reference level was a female non-science major in the hands-on condition, with no VR experience and no significant video game experience.
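As an illustration of the correlation check described above (performed in the study with ggpairs in R), a Pearson coefficient between two numerically coded variables can be computed directly. This is a minimal Python sketch; the 0/1 codings below are made-up illustrative values, not the study data.

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length numeric lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical 0/1 codings of gender and "significant video game
# experience" for ten participants (illustrative values only).
gender = [0, 0, 0, 1, 1, 0, 1, 0, 1, 0]
video_games = [0, 0, 1, 1, 1, 0, 1, 0, 0, 0]
print(round(pearson_r(gender, video_games), 2))  # 0.58
```

A coefficient of this size, like the 0.47 observed in the study, would justify analyzing the two variables in separate regression models.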

Results

We describe our results grouped by overall differences in learning and attitudes based on condition and then by interactions between other variables.

Overall learning between conditions

The average score on the pre-test was 33.7%, with no significant differences between the three conditions: F(2,169) = 1.04, p = 0.356. Student performance significantly increased from pre- to post-test by 25.3% on average (average post-test score of 59.1%), with no significant differences in post-test scores across the three conditions: F(2,169) = 0.815, p = 0.444 (Fig 3). An ANCOVA controlling for pre-score also showed no differences in post-score between conditions: F(2,168) = 1.07, p = 0.344. The average score on the delayed post-test (completed approximately four months after participation in the activity) was 39.0%, again with no differences between conditions: F(2,51) = 0.571, p = .568. Our overall normalized gain of 0.38 is similar to the 0.54 reported for a 20-minute inquiry-based tutorial with pre-post Moon phase assessments [58].
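The overall normalized gain quoted above is consistent with the class-average form of Hake's normalized gain, g = (post − pre) / (100 − pre), applied to the reported average scores; a quick arithmetic check (Python used here purely for illustration):

```python
def normalized_gain(pre_pct, post_pct):
    """Class-average normalized gain g = (post - pre) / (100 - pre),
    with scores given as percentages."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Class averages reported above: 33.7% on the pre-test, 59.1% on the post-test.
g = normalized_gain(33.7, 59.1)
print(round(g, 2))  # 0.38
```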

Fig 3. Pre and post scores by condition.

Figure updated and adapted from [72]. An overall view of pre- to post-test performance. Violin plots show how scores were distributed across conditions and between pre- to post-test. Bins are 1 point wide. Average scores and standard error are indicated in white.

https://doi.org/10.1371/journal.pone.0229788.g003

Our pre- and post-tests were not designed to be unidimensional and therefore did not reach a high Cronbach’s alpha (α = 0.52 for the pre-test, α = 0.66 for the post-test). This is expected for multidimensional multiple-choice tests [70, 71]. Both of our tests gave scores across a sufficient range according to Ferguson’s delta (δ = 0.93 for the pre-test, δ = 0.96 for the post-test).
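For reference, Cronbach's alpha as reported above can be computed from per-item score columns as α = (k / (k − 1)) · (1 − Σ σᵢ² / σₜ²). A minimal Python sketch with toy dichotomous data follows (illustrative only, not the study's test responses):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha from per-item score columns.

    items: list of k lists, each holding one item's scores across all
    n respondents (population variances used throughout).
    """
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]
    item_var_sum = sum(pvariance(item) for item in items)
    return (k / (k - 1)) * (1 - item_var_sum / pvariance(totals))

# Toy data: three dichotomously scored items, four respondents.
items = [[1, 0, 1, 1],
         [1, 0, 1, 0],
         [1, 1, 1, 0]]
print(cronbach_alpha(items))  # 0.5625
```

Low alpha values like this toy example's are typical when items probe several distinct concepts, which is why Ferguson's delta is also reported above.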

Learning by question

Breaking down the pre- and post-test scores by question shows consistent gains between conditions, with no statistically significant differences (Fig 4). Across the conditions, some topic areas showed large gains of over 40% (orbit period, phase period, and illumination), while others showed small gains of less than 10% (scale, Moon rotation, phase diagram, rise/set time), consistent with previous research on learning about Moon phases [58, 73]. This indicates that learning did not differ between conditions even on sub-topics that might have held particular learning benefits within individual conditions.

Fig 4. Learning by condition and topic.

Figure updated and adapted from [72]. The difference in participant responses from the pre-test to the post-test, broken down by question topic. The percent of correct responses on the pre-test and post-test for each topic are connected by a colored bar. A green bar signifies improvement, with the higher number representing the post-test score. A red bar means there were fewer correct responses on the post-test, with the higher number representing the pre-test score.

https://doi.org/10.1371/journal.pone.0229788.g004

Attitudes towards the conditions

Consistent with previous work [3], participants overwhelmingly (χ2 = 152, p < 0.00001) preferred the VR condition as their favorite learning method, independent of their study condition (Fig 5).
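The reported chi-square is consistent with a goodness-of-fit test of the preference counts against a uniform (no-preference) null. A minimal Python sketch follows; the counts below are reconstructed approximately from the percentages reported later (78%/12%/10% of 172 participants) and are illustrative, not the exact study counts.

```python
def chi_square_gof(observed):
    """Chi-square goodness-of-fit statistic against a uniform
    expectation, i.e. the null that all options are equally preferred."""
    n = sum(observed)
    expected = n / len(observed)
    return sum((o - expected) ** 2 / expected for o in observed)

# Approximate preference counts: VR, desktop, hands-on.
print(round(chi_square_gof([134, 21, 17]), 1))  # about 153.9, near the reported chi-square of 152
```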

Fig 5. Participant preference.

Preferred choice after viewing each activity.

https://doi.org/10.1371/journal.pone.0229788.g005

Seventy-eight percent of participants preferred the VR condition and, when describing their reasons, used phrases such as: the VR condition was “easier to visualize”, “more realistic”, “more immersive”, “more fun”, “more interesting”, and “the most accurate”. The participants who preferred VR generally said that seeing the full picture in a realistic way helped with their learning. The main contributors to this feeling were the ease of viewing different perspectives and having easier control over the system. For example, one participant wrote:

“Having a overall space to see where everything is helps a lot. Even in class I still had a hard time understanding what they are talking about in concept. But I think I learned a lot in VR and being able to manipulate the environment on my own accord. It seems more engaging than the 2 other methods.”

The few students who did not prefer VR said that they “did not notice everything that was going on”, found it “a little too complex”, that it “made me dizzy and confused,” or generally indicated feeling uncomfortable or overwhelmed. They preferred either the hands-on or desktop conditions because they were more familiar. For example, a participant who preferred the desktop condition wrote:

“The VR was cool but since I’m very new to it I spent most of my time just trying to figure out how it worked–it was also tough to find where the Sun and Moon were at times because of how ‘large’ the environment was. The desktop game was more familiar and easy-to-use for me. Personally.”

The 12% of participants who preferred the desktop condition used phrases such as: the desktop condition was “less overwhelming”, “easier to control”, and “very easy to follow”. Participants who did not like the desktop condition said it “gave a limited field of view” and “[was] a lot harder for me to navigate”.

The 10% of participants who preferred the hands-on condition used phrases such as “easiest and fastest”, “I was able to more clearly focus”, and “I got distracted by the other methods”. A participant who preferred the hands-on condition wrote:

“I really liked the virtual reality method. And it gave me more information than the other two methods, for instance, what time of day certain Moon phases would rise and set. Nevertheless, it was almost too overwhelming and it was as if I was too excited to be in space to actually commit to learning the Moon phases. With the hands-on demonstration. there was nothing to distract me. And, obviously, controlling the demonstration felt about as natural as possible.”

Effects on environmental attitudes

Our second hypothesis, that there would be a main effect of condition on environmental attitudes, such that participants in the VR condition would evince greater environmental support, was also not supported. A one-way ANOVA found no significant differences in environmental attitudes among treatment groups (means ± standard error were VR = 5.24 ± 0.09, Desktop = 5.25 ± 0.1, Hands-on = 5.35 ± 0.08; F(2, 169) = 0.41, p = .66), suggesting that the method of learning Moon phases did not affect participants’ environmental attitudes. Although we expected that brief exposure in the VR condition to the unique vantage point of the Earth could increase participants’ concern for environmental issues (by priming the concept of limited resources), the instrument’s typical use as a trait-level measure may make it unsurprising that we did not observe such an effect.

Effects due to participant demographics

We conducted exploratory analyses to identify whether other variables were interacting with participants’ experiences in the three conditions. Based on prior literature, we focused on gender, major, video game experience, and quantity of VR experience. We first explored main effects alone and then explored interactions between the variables and condition. When examining the role of the demographic variables on performance, we found a correlation between video game experience and gender (column 1 of Fig 6), but no correlation between any of the other variables (Fig 7). Therefore, video game experience and gender were analyzed in separate regression models.

Fig 6. Relationship between gender, video game experience, and scores across conditions.

The first column shows the correlations between gender and video game experience. The second column shows the average scores and standard error at pre- and post-test for students with and without video game experience. The third column shows the average scores and standard error at pre- and post-test for students identifying as male and female. See S1 Table for line details.

https://doi.org/10.1371/journal.pone.0229788.g006

Fig 7. Interaction plots for academic major and quantity of VR experience.

Each column shows the relationship between one of these measures and pre- to post-test score by condition. Mean scores and standard errors are shown; see S2 Table for line details.

https://doi.org/10.1371/journal.pone.0229788.g007

Main effects between condition and demographic variables.

As shown in Table 1, students’ pre-score was a significant predictor of their post-score across conditions and, again, there were no significant differences in post-score between conditions. We interpret the regression coefficients as the simple effect size, β [74]. The main effects model also found a significant effect of major, with science majors slightly outperforming non-science majors, even when controlling for pre-score. None of participants’ gender, video game experience, or VR experience had a significant main effect.

Table 1. Linear regression models for gender and video game experience including main effects only.

Regression analysis models for students’ post-score with main effects for pre-score, condition, gender, video game experience, VR experience, and students’ major. Gender and video game experience are analyzed in separate models as the variables were highly correlated.

https://doi.org/10.1371/journal.pone.0229788.t001

Interactions between condition and demographic variables.

Experience in VR did not have a significant interaction with condition (Table 2). Notably, this indicates that having experience with VR did not improve students’ learning in the VR condition, even though some students claimed that not having experience in VR impeded their learning.

Table 2. Linear regression models including main effects with gender and video game interaction.

Regression analysis models for students’ post-score with main effects for pre-score, condition, gender, video game experience, VR experience, and students’ major and interactions between condition and gender, VR experience, and video game experience. Gender and video game experience are analyzed in separate models as the variables were highly correlated.

https://doi.org/10.1371/journal.pone.0229788.t002

The interactions with gender and video game experience are more complicated. As seen in Table 2, there is an interaction between gender and condition in Model 3, with men outperforming women in the VR condition. However, Model 4 indicates a main effect of video game experience (seen as marginally significant in Model 2) and a potential interaction between condition and video game experience. From Fig 6, this interaction is likely explained by the pre-test scores: only in the hands-on condition do we see differences in pre-test scores between participants with and without video game experience. We interpret these results as indicating that men and participants with video game experience learned most in the VR condition, followed by the desktop and then the hands-on conditions. Given the correlation between video game experience and gender, it is unclear which variable is responsible for the differences. This correlation may be better understood by measuring the type and quantity of video game experience in more detail in future studies.

Discussion

In this study, we performed a controlled experiment of student learning about Moon phases through three different modalities: a hands-on activity, a desktop simulation, and a VR simulation. We found no overall effect for condition on students’ learning on an immediate post-test, nor on a delayed post-test four months after the intervention. We did find that students with declared science majors outperformed non-science majors in all conditions. Student learning in the VR condition was not improved with VR experience. Despite these results, students overwhelmingly preferred learning in the VR condition.

There are several possible interpretations of these results. First, our hypothesis had been that the VR simulation would improve performance by reducing the real-world complexities of a hands-on activity and providing a realistic, embodied learning experience. The results suggest that these affordances did not impact learning about Moon phases. On the other hand, one may conclude that the VR condition provided learning equivalent to the other two modalities while dramatically improving students’ attitudes towards the learning experience.

The study also explored interactions between conditions and students’ video game experience and gender. Gender and video game experience were significantly correlated in our study, and men performed better in the VR condition. This means that either video game experience, being male, or a combination of the two provides an advantage.

There are several studies that suggest video game experience would provide this advantage. One study found that video games can have a beneficial effect on completing complex spatial tasks, visuomotor coordination, and multiple object tracking [75]. Other studies suggest that the benefits of video games on such abilities can be gained after a short time playing an action game and are not dependent on gender [76]. Furthermore, research has shown that men are more drawn to the type of video games suggested to provide these benefits [77]. Taken together, these studies paint a picture in which these advantages are not inherently male but may be caused by a particular type of video game experience that happens to be common among men.

Alternatively, since the 1970s, several studies have suggested that men have better spatial reasoning than women. However, comprehensive studies have recently shown that, while differences in spatial reasoning between men and women may be present in certain cases, we may not fully understand their causes. Newcombe and Stieff claimed, “from a practical educational standpoint, the most relevant fact is that the relevant skills can be improved in both men and women” [78, p. 962]. Research has also previously found that men outperformed women on post-test conceptual assessments of Moon phase understanding, but only on items involving spatial reasoning [79]. However, it has also been found that men’s and women’s scores improve similarly from pre- to post-test with appropriate instruction [79, 80], again suggesting that performance can be improved in both men and women.

While our study could not establish whether video game experience, rather than gender, was the true contributor to the performance increase in the VR condition, the literature suggests that video game experience, and not gender, may be the operative variable. While further experimental work should examine this proposition, it points the way to improving learning experiences in virtual reality so that they benefit all learners. If all participants did as well in the virtual reality condition as men and people with video game experience did, then virtual reality could be a more effective teaching tool overall than the other two tested modes of teaching Moon phases.

Limitations

There are several limitations to this study that should be considered when interpreting our results. This study focused on learning in an activity based on physics and astronomy concepts; thus, we should be careful when applying our findings to learning in other subjects or even other concepts within physics and astronomy [48]. Furthermore, the participant pool for this study was not perfectly representative of either a typical college classroom or a typical physics/astronomy classroom. For example, our sample comprised 80% women, while the American Physical Society reports that only around 20% of undergraduate physics degrees are awarded to women [81]. In addition, the significant interaction between gender and condition was based on a small sample of male participants, and future work should evaluate this result with more equally distributed samples.

The contribution that we believe this paper makes to the discussion on gender/video game effects on learning in virtual reality is based on exploratory analyses. Thus, future work should explicitly test these hypotheses in a pre-registered study and include more detailed questions related to the type and quantity of video game experience.

While VR technology has advanced rapidly, it is still not ideal. Control responsiveness, motion sickness, and limited resolution and field of view are all technological obstacles that can still break immersion and distract from learning using today’s equipment. In contrast, the technologies we used for the hands-on and desktop conditions have essentially plateaued compared to VR, meaning the results of this study may change as VR technology progresses.

Compared to VR, participants were very familiar with the technology used in the hands-on and desktop conditions. Many participants had never experienced VR before our study or had very limited experience using a VR headset. This suggests that many of our participants were managing a high cognitive load as they attempted to become comfortable with VR, understand the activity, and learn about Moon phases. Participants in the other two conditions did not face such a high barrier to getting comfortable with the technology handed to them. For participants who preferred the non-VR conditions, a common praise was familiarity with the equipment. This suggests that as the college population becomes more familiar and comfortable with VR, the results of this study may change. Future research should measure students’ cognitive load during the activity explicitly, such as through eye-tracking [82], self-report surveys [28, 82–85], physiological indicators [28, 85], or electroencephalography [86]. We note, however, that many of the common methods have several limitations [87], such as that different methods may or may not be able to distinguish different types of cognitive load [28, 84, 87, 88]. It is unclear which types may be at play in these activities.

Future work

Our study joins others in suggesting that virtual reality is a promising technology as an educational tool but does not, in itself, guarantee a learning advantage over traditional hands-on activities or desktop simulations. There are several areas for future work, based on this analysis.

Future work should focus on further investigating the potential relationship between video game experience and learning gains. This strategy has three components. First, researchers should attempt to sample both male and female participants with equivalent video game experience. Second, video game experience should be more precisely characterized, focusing on game types that require more visuospatial navigation skill. Third, because video game experience tends to build procedural or motor skills [89], which do not typically decay after long-term disuse, measures should include lifetime experience rather than frequency of use alone. If video game experience is confirmed to provide users with skills that allow them to learn better in VR, then researchers can use this knowledge to provide participants with tools for learning to interact with the environment, in order to level the playing field for all participants.

The VR educational experience can also be improved by enhancing the user interface to maximize VR’s full potential. This does not require waiting for the technology to improve or become more widely used; instead, designers can take cues from participants’ responses. One strategy would be to enhance users’ sense of embodiment. In this study, users were not embodied in avatars, which may have affected their feeling of presence, or ‘being there’, within the environment [90]. This in turn could have affected how willing participants were to interact with the environment, thereby reducing their learning gains. Examining existing movement data in the VR condition could provide hints for how to design such a simulation. Similarly, our study used guiding questions that had to be answered before proceeding into the environment, which may have discouraged exploration. Thus, future studies should evaluate how avatar embodiment and interactive text affect movement and learning within an environment, as well as how visuospatial ability relates to movement in VR.

Finally, participants worked alone in our study, but each of our conditions could also be used collaboratively. Two students could help each other learn concepts by answering questions together, or an instructor could emphasize key components that are easily missed [91]. Indeed, in virtual reality in particular, anonymity can make it a safer space for learning [92], encouraging students to make mistakes and learn from them without fear of judgement. If participants experience gains from social learning, such gains may be more noticeable in virtual environments.

However, future work must also remain open to the possibility that the excitement and engagement produced by virtual reality experiences may not translate into learning gains in all domains.

Conclusion

This study has several takeaway findings. First, participants’ learning gains from pre- to post-test were not significantly different, on average, between the VR, desktop, and hands-on conditions. Participants performed similarly well on each question topic across the three conditions. We found no strong evidence that participants’ retention differed between conditions after four months. Our hypothesis, that VR would improve learning by simplifying real-world complexities and providing an embodied learning experience, was not supported. Nonetheless, participants strongly favored learning in the VR activity.

Guided by the literature on virtual reality and learning, we also collected data on demographic measures (gender, video game experience, virtual reality experience, and major) to explore predicted interactions, all of which are reported here. We did find a positive effect from gender within the VR condition. However, video game experience and gender were significantly correlated in our study, and the literature suggests that video game experience may be the main reason for the performance increase.

This study has allowed us to design new experiments that will help explore the reasons we saw similar learning gains across conditions. What remains promising about VR is that, relative to a ball on a stick and 2D computer games, it is a rapidly advancing technology. Given participants’ unfamiliarity with VR and the technical roughness of the simulation, the fact that participants learned as much as those in the other conditions bodes well for VR as an educational tool once the majority of students are comfortable learning in a virtual environment.

Given that learning was the same regardless of condition, what remains is the fact that participants widely favored the VR experience. As a method of engaging students, VR was successful in our study, and this engagement was not achieved at the cost of learning gains. Novelty diminishes, however, so an advantage based on novelty alone will cease to be an advantage as exposure rises.

Altogether, there are many avenues to explore with educational VR. Future work should take a more comprehensive look at VR’s capability as an educational tool, considering a participant’s experience from multiple perspectives.

Supporting information

S1 Table. Modeling interactions between video game experience and gender with condition.

Supporting data for Fig 6.

https://doi.org/10.1371/journal.pone.0229788.s001

(PDF)

S2 Table. Modeling interactions between academic major, and VR experience quantity with condition.

Supporting data for Fig 7.

https://doi.org/10.1371/journal.pone.0229788.s002

(PDF)

Acknowledgments

This work was supported by Oculus Education with special thanks to Cindy Ball and Cindi McCurdy. We would like to thank all the graduate and undergraduate students who helped develop the simulations and run participants: Tristan Stone, Yilu Sun, Akhil Gopu, Kristi Lin, Jason Wu, Anirudh Maddula, Frank Rodriguez, Alice Nam, Phil Barrett, Dwyer Tschantz, Connor Lapresi, Giulia Reversi, Keun Youk, Albert Tsao, and Annie Hughey. We would also like to thank Kimberly Williams from the Cornell Center for the Integration of Research, Teaching, and Learning, Stephen Parry of the Cornell Statistical Consulting Unit, Daniel Alexander and Florio Arguillas from the Cornell Institute of Social and Economic Research, and the entire Cornell Physics Education Research Lab for support, comments, and assistance.

References

  1. Maddux CD. Twenty Years of Research in Information Technology in Education. Computers in the Schools. 2003;20(1-2):35–48.
  2. Singer SR, Nielsen NR, Schweingruber HA. Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering. Washington, D.C.: Committee on the Status, Contributions, and Future Directions of Discipline-Based Education Research; Board on Science Education; Division of Behavioral and Social Sciences and Education; National Research Council; 2012.
  3. Dede C. Immersive interfaces for engagement and learning. Science. 2009;323(5910):66–69. pmid:19119219
  4. Pan Z, Cheok AD, Yang H, Zhu J, Shi J. Virtual reality and mixed reality for virtual learning environments. Computers & Graphics. 2006;30(1):20–28.
  5. Sokoloff DR, Thornton RK. Using interactive lecture demonstrations to create an active learning environment. In: Redish EF, Rigden JS, editors. The Changing Role of Physics Departments in Modern Universities: Proceedings of ICUPE. vol. 399. American Institute of Physics; 1997. p. 1061–1074.
  6. Wieman CE, Perkins KK, Adams WK. Oersted Medal Lecture 2007: Interactive simulations for teaching physics: What works, what doesn’t, and why. American Journal of Physics. 2008;76(4):393.
  7. Stains M, Harshman J, Barker MK, Chasteen SV, Cole R, DeChenne-Peters SE, et al. Anatomy of STEM teaching in North American universities. Science. 2018;359(6383):1468–1470. pmid:29599232
  8. Hofstein A, Lunetta VN. The laboratory in science education: Foundations for the twenty-first century. Science Education. 2004;88(1):28–54.
  9. Holmes NG, Olsen J, Thomas JL, Wieman CE. Value added or misattributed? A multi-institution study on the educational benefit of labs for reinforcing physics content. Physical Review Physics Education Research. 2017;13(1):010129.
  10. Etkina E, Karelina A, Ruibal-Villasenor M, Rosengrant D, Jordan R, Hmelo-Silver CE. Design and Reflection Help Students Develop Scientific Abilities: Learning in Introductory Physics Laboratories. Journal of the Learning Sciences. 2010;19(1):54–98.
  11. Crouch C, Fagen AP, Callan JP, Mazur E. Classroom demonstrations: Learning tools or entertainment? American Journal of Physics. 2004;72(6):835–838.
  12. Finkelstein ND, Adams WK, Keller CJ, Kohl PB, Perkins KK, Podolefsky NS, et al. When learning about the real world is better done virtually: A study of substituting computer simulations for laboratory equipment. Phys Rev ST Phys Educ Res. 2005;1:010103.
  13. Chini JJ, Madsen A, Gire E, Rebello NS, Puntambekar S. Exploration of factors that affect the comparative effectiveness of physical and virtual manipulatives in an undergraduate laboratory. Physical Review Special Topics—Physics Education Research. 2012;8(1):010113.
  14. Smith EM, Holmes NG. Seeing the real world: Comparing learning from enhanced lecture demonstrations and verification labs. 2017.
  15. Wilson M. Six views of embodied cognition. Psychonomic Bulletin & Review. 2002;9(4):625–636.
  16. Anderson ML. Embodied Cognition: A field guide. Artificial Intelligence. 2003;149(1):91–130. https://doi.org/10.1016/S0004-3702(03)00054-7.
  17. Roth WM, Jornet A. Situated cognition. Wiley Interdisciplinary Reviews: Cognitive Science. 2013;4(5):463–478. pmid:26304240
  18. Carbonneau KJ, Marley SC, Selig JP. A meta-analysis of the efficacy of teaching mathematics with concrete manipulatives. Journal of Educational Psychology. 2013;105(2):380–400.
  19. Martin T, Schwartz DL. Physically Distributed Learning: Adapting and Reinterpreting Physical Environments in the Development of Fraction Concepts. Cognitive Science. 2005;29(4):587–625. pmid:21702786
  20. Tsang JM, Blair KP, Bofferding L, Schwartz DL. Learning to “See” Less Than Nothing: Putting Perceptual Skills to Work for Learning Numerical Structure. Cognition and Instruction. 2015;33(2):154–197.
  21. Ruby A. Hands-on Science and Student Achievement. RAND graduate school; 2001. Available from: https://apps.dtic.mil/docs/citations/ADA393033.
  22. Scherr RE, Close HG, Close EW, Flood VJ, McKagan SB, Robertson AD, et al. Negotiating energy dynamics through embodied action in a materially structured environment. Physical Review Special Topics—Physics Education Research. 2013;9(2):020105.
  23. Scherr RE, Close HG, Close EW, Vokos S. Representing energy. II. Energy tracking representations. Physical Review Special Topics—Physics Education Research. 2012;8(2):020115.
  24. Piaget J. Origin of Intelligence in the Child: Selected Works vol 3. Routledge; 2013.
  25. Brown JS, Collins A, Duguid P. Situated Cognition and the Culture of Learning. Educational Researcher. 1989;18(1):32.
  26. Cunningham HA. Lecture demonstration versus individual laboratory method in science teaching—A summary. Science Education. 1946;30(2):70–82.
  27. Klahr D, Triona LM, Williams C. Hands on what? The relative effectiveness of physical versus virtual materials in an engineering design project by middle school children. Journal of Research in Science Teaching. 2007;44(1):183–203.
  28. Paas F, Renkl A, Sweller J. Cognitive Load Theory and Instructional Design: Recent Developments. Educational Psychologist. 2003;38(1):1–4.
  29. Kahneman D, Tversky A. Subjective probability: A judgment of representativeness. Cognitive Psychology. 1972;3(3):430–454.
  30. Tversky A, Kahneman D. Judgment under Uncertainty: Heuristics and Biases. Science (New York, NY). 1974;185(4157):1124–31.
  31. Kapur M. Examining Productive Failure, Productive Success, Unproductive Failure, and Unproductive Success in Learning. Educational Psychologist. 2016;51(2):289–299.
  32. Sweller J, Chandler P. Evidence for Cognitive Load Theory. Cognition and Instruction. 1991;8(4):351–362.
  33. Bransford J, Brown A, Cocking R. How people learn: Mind, brain, experience, and school. Washington, DC: National Research Council; 1999.
  34. Schwartz DL, Chase CC, Oppezzo MA, Chin DB. Practicing versus inventing with contrasting cases: The effects of telling first on learning and transfer. Journal of Educational Psychology. 2011;103(4):759–775.
  35. 35. Price AM, Perkins KK, Holmes NG, Wieman CE. How and why do high school teachers use PhET interactive simulations? In: Traxler A, Cao Y, Wolf S, editors. Physics Education Research Conference 2018. Washington, D.C.; 2018.
  36. 36. Podolefsky NS, Perkins KK, Adams WK. Factors promoting engaged exploration with computer simulations. Phys Rev ST Phys Educ Res. 2010;6:020117.
  37. 37. Darrah M, Humbert R, Finstein J, Simon M, Hopkins J. Are Virtual Labs as Effective as Hands-on Labs for Undergraduate Physics? A Comparative Study at Two Major Universities. Journal of Science Education and Technology. 2014;23(6):803–814.
  38. 38. Evangelou F, Kotsis K. Real vs virtual physics experiments: comparison of learning outcomes among fifth grade primary school students. A case on the concept of frictional force. International Journal of Science Education. 2018; p. 1–19.
  39. 39. Strzys MP, Kapp S, Thees M, Klein P, Lukowicz P, Knierim P, et al. Physics holo.lab learning experience: using smartglasses for augmented reality labwork to foster the concepts of heat conduction. European Journal of Physics. 2018;39(3):035703.
  40. 40. Chang HY, Hsu YS, Wu HK. A comparison study of augmented reality versus interactive simulation technology to support student learning of a socio-scientific issue. Interactive Learning Environments. 2016;24(6):1148–1161.
  41. 41. Kapp S, Thees M, Strzys MP, Beil F, Kuhn J, Amiraslanov O, et al. Augmenting Kirchhoff’s laws: Using augmented reality and smartglasses to enhance conceptual electrical experiments for high school students. The Physics Teacher. 2019;57(1):52–53.
  42. 42. Bricken M. Virtual Reality Learning Environments: Potentials and Challenges. SIGGRAPH Comput Graph. 1991;25(3):178–184.
  43. 43. Perone B. Taking VR to School: Exploring immersive virtual reality as a tool for environmental science education. Stanford. 2016;.
  44. 44. Won AS, Bailenson JN, Janssen JH. Automatic Detection of Nonverbal Behavior Predicts Learning in Dyadic Interactions. IEEE Transactions on Affective Computing. 2014;5(2):112–125.
  45. 45. Smith JR, Byrum A, McCormick TM, Young N, Orban C, Porter CD. A Controlled Study of Stereoscopic Virtual Reality in Freshman Electrostatics. Physics Education Research Conference 2017. 2017; p. 376–379.
  46. 46. Lier EJ, Harder J, Oosterman JM, de Vries M, van Goor H. Modulation of tactile perception by Virtual Reality distraction: The role of individual and VR-related factors. PLOS ONE. 2018;13(12):e0208405. pmid:30507958
  47. 47. Makransky G, Terkildsen TS, Mayer RE. Adding immersive virtual reality to a science lab simulation causes more presence but less learning. Learning and Instruction. 2017.
  48. 48. Winn W, Windschitl M, Fruland R, Lee Y. When Does Immersion in a Virtual Environment Help Students Construct Understanding? In: Proceedings of the International Conference of the Learning Sciences, ICLS. No. 206; 2002. p. 497–503.
  49. 49. León I, Tascón L, Ortells-Pareja JJ, Cimadevilla JM. Virtual reality assessment of walking and non-walking space in men and women with virtual reality-based tasks. PLOS ONE. 2018;13(10):e0204995. pmid:30278083
  50. 50. Lelliott A, Rollnick M. Big Ideas: A review of astronomy education research 1974–2008. International Journal of Science Education. 2010;32(13):1771–1799.
  51. 51. Galano S, Colantonio A, Leccia S, Marzoli I, Puddu E, Testa I. Developing the use of visual representations to explain basic astronomy phenomena. Physical Review Physics Education Research. 2018;14(1):010145.
  52. 52. Türk C, Kalkan H. The Effect of Planetariums on Teaching Specific Astronomy Concepts. Journal of Science Education and Technology. 2015;24(1):1–15.
  53. 53. Wilhelm J, Jackson C, Sullivan A, Wilhelm R. Examining Differences Between Preteen Groups’ Spatial-Scientific Understandings: A Quasi-experimental Study. The Journal of Educational Research. 2013;106(5):337–351.
  54. 54. Cole M, Cohen C, Wilhelm J, Lindell R. Spatial thinking in astronomy education research. Physical Review Physics Education Research. 2018;14(1):010139.
  55. 55. Newbury P. Phases of the Moon; 2011. Available from: https://peternewbury.org/2011/09/06/phases-of-the-moon/.
  56. 56. Hufnagel B. Development of the Astronomy Diagnostic Test. Astronomy Education Review. 2002;1(1):47–51.
  57. 57. Lindell R, Olsen JP. Developing the Lunar Phases Concept Inventory. In: Physics Education Research Conference 2002. PER Conference. Boise, Idaho; 2002.
  58. 58. Lindell R. Measuring Conceptual Change in College Students Understanding of Lunar Phases. In: Physics Education Research Conference 2004. vol. 790 of PER Conference; 2004. p. 53–56.
  59. 59. Slater SJ. The Development And Validation Of The Test Of Astronomy STandards (TOAST). J Astro Earth Sci Educ. 2014;1(1):22.
  60. 60. Milfont TL, Duckitt J. The environmental attitudes inventory: A valid and reliable measure to assess the structure of environmental attitudes. Journal of environmental psychology. 2010;30(1):80–94.
  61. 61. Dunlap RE, Van Liere KD, Mertig AG, Jones RE. New Trends in Measuring Environmental Attitudes: Measuring Endorsement of the New Ecological Paradigm: A Revised NEP Scale. Journal of Social Issues. 2000;56(3):425–442.
  62. 62. Ahn SJ, Bostick J, Ogle E, Nowak KL, McGillicuddy KT, Bailenson JN. Experiencing nature: Embodying animals in immersive virtual environments increases inclusion of nature in self and involvement with nature. Journal of Computer-Mediated Communication. 2016;21(6):399–419.
  63. 63. Dunlap RE, Liere KDV. The “New Environmental Paradigm“. The Journal of Environmental Education. 2008;40(1):19–28.
  64. 64. Poole R. Earthrise How Man First Saw the Earth. Yale University Press; 2010.
  65. 65. Stepanova ER, Quesnel D, Riecke BE. Space—A Virtual Frontier: How to Design and Evaluate a Virtual Reality Experience of the Overview Effect. Front Digital Humanities. 2019;2019.
  66. 66. Dunlap RE, Van Liere KD. The “new environmental paradigm”. The journal of environmental education. 1978;9(4):10–19.
  67. 67. Aymerich-Franch L, Karutz C, Bailenson JN. Effects of facial and voice similarity on presence in a public speaking virtual environment. In: Proceedings of the International Society for Presence Research Annual Conference; 2012. p. 24–26.
  68. 68. Witmer BG, Singer MJ. Measuring presence in virtual environments: A presence questionnaire. Presence. 1998;7(3):225–240.
  69. 69. R Core Team. R: A Language and Environment for Statistical Computing; 2017. Available from: https://www.R-project.org/.
  70. 70. Wilcox BR, Pollock SJ. Coupled multiple-response versus free-response conceptual assessment: An example from upper-division physics. Phys Rev ST Phys Educ Res. 2014;10:020124.
  71. 71. Cortina JM. What is coefficient alpha? An examination of theory and applications. Journal of Applied Psychology. 1993;78:98–104.
  72. 72. Madden J, Won AS, Schuldt J, Kim B, Pandita S, Sun Y, et al. Virtual Reality as a Teaching Tool for Moon Phases and Beyond. 2018 Physics Education Research Conference Proceedings. 2018.
  73. 73. Wilhelm J, Cole M, Cohen C, Lindell R. How middle level science teachers visualize and translate motion, scale, and geometric space of the Earth-Moon-Sun system with their students. Phys Rev Phys Educ Res. 2018;14:010150.
  74. 74. Baguley T. Standardized or simple effect size: What should be reported? British Journal of Psychology. 2009;100(3):603–617. pmid:19017432
  75. 75. Spence I, Feng J. Video games and spatial cognition. Review of General Psychology. 2010;14(2):92–104.
  76. 76. Feng J, Spence I, Pratt J. Playing an Action Video Game Reduces Gender Differences in Spatial Cognition. Psychological Science. 2007;18(10):850–855. pmid:17894600
  77. 77. Phan MH, Jardina JR, Hoyle S, Chaparro BS. Examining the Role of Gender in Video Game Usage, Preference, and Behavior. Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 2012;56(1):1496–1500.
  78. 78. Newcombe NS, Stieff M. Six Myths About Spatial Thinking. International Journal of Science Education. 2012;34(6):955–971.
  79. 79. Wilhelm J. Gender Differences in Lunar-related Scientific and Mathematical Understandings. International Journal of Science Education. 2009;31(15):2105–2122.
  80. 80. Jackson C, Wilhelm JA, Lamar M, Cole M. Gender and Racial Differences: Development of Sixth Grade Students’ Geometric Spatial Visualization within an Earth/Space Unit. School Science and Mathematics. 2015;115(7):330–343.
  81. 81. Society AP. Bachelor’s Degrees in Physics and STEM Earned by Women; 2018. Available from: https://www.aps.org/programs/education/statistics/womenstem.cfm.
  82. 82. Zu T, Hutson J, Loschky LC, Rebello NS. Use of Eye-Tracking Technology to Investigate Cognitive Load Theory. In: 2017 Physics Education Research Conference Proceedings. American Association of Physics Teachers; 2018. p. 472–475. Available from: https://www.compadre.org/per/items/detail.cfm?ID=14673.
  83. 83. Paas FG. Training strategies for attaining transfer of problem-solving skill in statistics: A cognitive-load approach. Journal of Educational Psychology. 1992;84(4):429–434.
  84. 84. Leppink J, Paas F, Van der Vleuten CPM, Van Gog T, Van Merriënboer JJG. Development of an instrument for measuring different types of cognitive load. Behavior Research Methods. 2013;45(4):1058–1072. pmid:23572251
  85. 85. Paas FGWC, van Merriënboer JJG, Adam JJ. Measurement of Cognitive Load in Instructional Research. Perceptual and Motor Skills. 1994;79(1):419–430. pmid:7808878
  86. 86. Antonenko P, Paas F, Grabner R, van Gog T. Using Electroencephalography to Measure Cognitive Load. Educational Psychology Review. 2010;22(4):425–438.
  87. 87. de Jong T. Cognitive load theory, educational research, and instructional design: some food for thought. Instructional Science. 2010;38(2):105–134.
  88. 88. DeLeeuw KE, Mayer RE. A comparison of three measures of cognitive load: Evidence for separable measures of intrinsic, extraneous, and germane load. Journal of Educational Psychology. 2008;100(1):223–234.
  89. 89. Rosser JC, Lynch PJ, Cuddihy L, Gentile DA, Klonsky J, Merrell R. The impact of video games on training surgeons in the 21st century. Archives of surgery. 2007;142(2):181–186. pmid:17309970
  90. 90. Slater M. Implicit Learning Through Embodiment in Immersive Virtual Reality. In: Virtual, Augmented, and Mixed Realities in Education. Springer; 2017. p. 19–33.
  91. 91. Chi MTH. Active-Constructive-Interactive: A Conceptual Framework for Differentiating Learning Activities. Topics in Cognitive Science. 2009;1(1):73–105. pmid:25164801
  92. 92. Yu FY, Liu YH. Creating a psychologically safe online space for a student-generated questions learning activity via different identity revelation modes. British Journal of Educational Technology. 2009;40(6):1109–1123.