
Development of a self-report instrument for measuring online teaching practices and discussion facilitation

  • Whitney DeCamp ,

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Writing – original draft, Writing – review & editing

    whitney.decamp@wmich.edu

    Affiliation Department of Sociology, Western Michigan University, Kalamazoo, Michigan, United States of America

  • Brian Horvitz,

    Roles Conceptualization, Funding acquisition, Investigation, Methodology, Project administration, Supervision, Writing – original draft, Writing – review & editing

    Affiliation Department of Educational Leadership, Research and Technology, Western Michigan University, Kalamazoo, Michigan, United States of America

  • Regina L. Garza Mitchell,

    Roles Conceptualization, Funding acquisition, Investigation, Methodology, Writing – review & editing

    Affiliation Department of Educational Leadership, Research and Technology, Western Michigan University, Kalamazoo, Michigan, United States of America

  • Megan Grunert Kowalske,

    Roles Conceptualization, Funding acquisition, Investigation, Methodology, Writing – review & editing

    Affiliation Department of Chemistry and the Mallinson Institute for Science Education, Western Michigan University, Kalamazoo, Michigan, United States of America

  • Cherrelle Singleton

    Roles Conceptualization, Investigation

    Affiliation Department of Educational Leadership, Research and Technology, Western Michigan University, Kalamazoo, Michigan, United States of America

Abstract

Online learning in higher education has been increasing for many years. This growth spans all of higher education and is also occurring within STEM fields specifically. It has accelerated significantly during the COVID-19 pandemic, as colleges and universities have sought ways to continue educating students while keeping students, faculty, and staff safe. As a result, many college faculty and instructors across all fields of study, including STEM fields, have made or are making the transition to teaching online for the first time. Teaching in an online environment differs from traditional classroom teaching in many ways and presents a unique set of challenges to college instructors. This study documents the development of an instrument that instructors can use to self-report their instructional techniques and practices. Data from 251 instructors are also used to examine how this instrument can be used to better understand particular practices, with a focus in this study on discussion facilitation. The results align with the Community of Inquiry framework, indicating that teaching through discussion forums involves direct contribution and/or facilitation.

Introduction

Online education is a rapidly growing component of higher education, with enrollment in online courses accounting for an increasing proportion of the learning experience. In fall 2014, for example, 5.8 million students were enrolled in online classes in the United States [1]. By fall 2019, that number had risen to 7.3 million [2], more than a 25% increase in just five years. This was, of course, prior to the COVID-19 pandemic, which necessitated increasing reliance on remote instruction and may have a lasting impact on the proportion of higher education that is available online. Whereas instructional techniques in the physical classroom have evolved into best practices over hundreds of years, the development of instructional techniques for the online classroom has occurred over a much shorter timetable. Teaching in an online environment is different from traditional classroom teaching in many ways and presents a unique set of challenges to college instructors. Although there exists a robust effort to study online teaching, there remains much that we do not yet fully understand and much to accomplish before time-tested and validated best practices can emerge.

The importance of effective STEM (science, technology, engineering, and mathematics) teaching has been stressed in numerous reports, including the President’s Council of Advisors on Science and Technology [3], the American Association for the Advancement of Science (AAAS) [4], and the National Research Council’s Discipline-Based Education Research Report [5]. University staff and faculty are engaged in continuing initiatives to answer these calls. These initiatives work to cultivate public scientific literacy [6], enhance workforce readiness [7], and increase the competitiveness of the United States in the global economy [3]. A central component of many change initiatives has been to encourage postsecondary instructors to adopt pedagogical approaches based in research on how people learn [4, 8].

The past decade has seen tremendous growth in online learning in higher education [9]. This growth skyrocketed in 2020 with the onset of the COVID-19 pandemic, as many institutions moved instruction online. This is illustrated by data from the National Center for Education Statistics, which reports that from Fall 2019 to Fall 2020 the number of students taking any distance education courses almost doubled, from 7,254,455 (37%) to 14,056,994 (74%) [10]. Although there are no data on how much of this growth happened in STEM fields, there is other evidence of similar growth. In the report “Teaching Online: STEM Education in the Time of COVID” [11], 896 STEM teaching faculty spanning 49 states responded to a survey of teaching practices during the pandemic. Among the respondents, 43% reported having taught an online course prior to the pandemic and 73% reported having converted a course to an online modality since the pandemic began, which is a monumental shift. Continued research is needed to determine to what degree this move online continues to shape STEM instruction in higher education.

To understand the nature of teaching practices in STEM, stakeholders need valid and reliable information about current and continuing conditions in the classroom. In response to the need for shared language in measuring postsecondary teaching practices, the AAAS, with support from the NSF, hosted a workshop that explored the state of describing and measuring undergraduate STEM education. The report produced by this workshop documents numerous methods for measuring postsecondary instructional practices, including faculty surveys, student surveys, interviews, class observations, and portfolio/artifact analysis [4]. These methods are predominantly content-oriented and focus on face-to-face classrooms, yet the use of technology alters instructional realities and practices, indicating the importance of investigating online STEM teaching practices. To address this need for robust instruments that measure online instructional practices in STEM coursework, we created the instrument described in this study, which can aid researchers and which STEM instructors can use to develop and improve as online teachers.

The present study documents the development process of an instrument for measuring online teaching approaches and techniques in undergraduate STEM courses. This involved a four-phase process and 251 participants. The final product includes measures for many elements in online instruction. To provide a more detailed understanding of one such element, this study includes an analysis of 251 responses to empirically identify underlying factors that shape instruction through student-to-student interaction in asynchronous text-based discussion forums.

Theoretical framework

To help ensure that the instrument we created reflects and encourages effective online teaching practices, we sought a conceptual framework that is well supported in the educational literature. The approach we settled on is the Community of Inquiry (CoI) framework, which views meaningful learning experiences as generated through three interdependent elements: social presence, cognitive presence, and teaching presence [12]. The CoI framework was developed to acknowledge both the cognitive and social dimensions of online learning. It has been used widely since it was first introduced, including in two 2010 special issues of The Internet and Higher Education. CoI research has also included examinations of epistemic engagement in online learning [13], the development of community in blended learning [14], and the effects of instructional methods on the quality of student interaction [15].

Teaching presence, which is the focus of this study of online discussions, is defined as “the design, facilitation and direction of cognitive and social processes for the purpose of realizing personally meaningful and educationally worthwhile learning outcomes” [16, p. 163], and has been argued to be the “backbone of the community” in the context of discussions [17, p. 159]. A large body of evidence supports the importance of teaching presence for successful online teaching [e.g., 13, 18, 19]. Teaching presence is an important factor in determining students’ satisfaction, perceived learning, and sense of community [16]. Teaching presence can further be understood through its own three interrelated elements: 1) Instructional design, which is “the planning and design of the structure, process, interaction and evaluation aspects of the online course” [16, p. 163]; 2) Facilitating discourse, which is “the means by which students are engaged in interacting about and building upon the information provided in the course instructional materials” [16, p. 164]; and 3) Direct instruction, which is “the instructor’s provision of intellectual and scholarly leadership, in part through sharing their subject matter knowledge with the students” [16, p. 164].

This model of teaching presence, situated within the larger CoI framework, focuses on how the instructor and students interact with each other and with course content. This framework is similar to several observational protocols (e.g., RTOP, TDOP), instructor surveys designed for face-to-face classrooms, and the survey developed for Henderson’s NSF-WIDER project [20]. For example, facilitating discourse is similar to the student-student interaction construct described by Walter et al. [20] or the student-teacher interactions and dialogue category elicited by the TDOP [21]. Since the CoI framework is widely used within the online education research community, and aligns with the constructs developed in other work, the model fits with this project’s focus on teaching practices in online STEM environments.

Online discussions

As explained above, discourse is one of the three interrelated elements of teaching presence within the CoI framework. Discourse among students and between students and teachers in an online learning environment can be categorized into two types of communication: asynchronous and synchronous. Asynchronous online learning, commonly facilitated by online tools such as discussion boards, relies on communication amongst and between students and teachers who may or may not be online at the same time. Synchronous online learning, commonly facilitated by online tools such as video conferencing and chats, relies on real-time communication amongst and between students and teachers who are online at the same time [22].

Studies on asynchronous learning environments have reported increased student perception of learning and satisfaction when students have more interaction with their instructors [23], when a discussion board is moderated by an instructor [24], and when instructors explicitly make efforts to motivate students in their discussions [25]. One study also reported that the number of instructor responses in discussion boards correlates positively with the number of student responses [26]. Conversely, there are also studies that report negative relationships between instructor and student participation in asynchronous discussions. In their study, Mazzolini and Maddison categorized instructor participation in three ways: sage on the stage (high participation), guide on the side (moderate participation), and ghost in the wings (low participation) [27]. They examined the effects of these styles on students’ rate and length of posts and found that instructor participation style had little correlation with student participation. In a follow-up study, Mazzolini and Maddison looked at the content of instructors’ posts. They found no correlation between the type of instructor post and student posting rate or length [28]. They also found that as instructor posts increased, student posts decreased in frequency and length. An, Shin, and Lim found that when instructors participated heavily in discussion boards, students replied more to instructor comments and less to peer comments [29], and Dixson, Kuhlhorst, and Reiff reported that instructor participation led to decreased student participation [19].

There are also studies that report on how instructor participation impacts student participation. Nandi, Hamilton, and Harland examined the quality of asynchronous discussions and identified themes related to instructor participation [30]. They found that instructors do play an active role in initiating discussions and moving those discussions forward, confirming the instructors’ role as an important one. Arend reported that critical thinking among students is improved when instructors put a more consistent emphasis on discussion participation and when instructor facilitation is less frequent but more purposeful [31].

Although prior research provides some insights into the impact of discussions, we have little information on the prevalence of online discussions. Likewise, the few studies that have examined the type of instructor interaction in online discussions are quite dated given the advances and exponential growth in online education, leaving much unknown about the current nature of online discussions.

Based on extant literature and identified gaps in knowledge regarding the facilitation of discourse in asynchronous online discussions, the following research questions are used to guide this study:

  1. What proportion of instructors of online courses use asynchronous discussion forums in their courses?
  2. What is the extent and style of instructor participation in discussions?
  3. What are the correlates/predictors of discussion usage and participation?

The analyses that follow use the self-report data collected with our new instrument to provide new insights in these areas.

Instrument development and methods

To develop our instrument, we used an iterative mixed methods design, which involved collecting and analyzing rich qualitative data (observations, interviews) to explore the phenomenon prior to quantitative data collection (survey development and validation). This approach is well suited to instrument development, as it places critical content analysis of the literature and descriptive observational data ahead of instrument development so that they purposefully inform it [32, 33]. A similar method was used to develop a valid and reliable survey of postsecondary instructional practices for face-to-face classrooms [20, 34]. Mixed method approaches are generally considered superior to single method approaches because they can answer research questions that other methodologies cannot and thus provide stronger inferences [35].

The development of our instrument was split into four phases. Phase 1 focused on developing the set of constructs that describe online teaching. Phase 2 focused on designing an alpha version of the instrument. Phase 3 focused on testing and revising the alpha version of the instrument. Phase 4 focused on testing, revising and validating the beta version of the instrument. In light of the calls for better understanding of instructional approaches and effectiveness in STEM [38], all phases of this study used STEM courses and instructors as the focus. A wider scope may have resulted in an instrument more generalized but less sensitive to the instructional techniques of STEM. This study was reviewed and approved by the Western Michigan University institutional review board. All participants provided written consent.

Phase 1

The purpose of the critical content analysis was to organize the literature on online postsecondary STEM classrooms into the CoI teaching presence elements: instructional design and organization, facilitating discourse, and direct instruction [16]. This process produced an organized set of elements that describes teaching in online learning environments, including which elements can be observed, which can be self-reported by survey, and which we cannot measure.

In addition to identifying constructs from CoI literature, we also recruited three online STEM undergraduate instructors who gave us permission to observe their completed courses and whom we interviewed. Our research team conducted rich, open-ended observations and extended semi-structured interviews. Data were analyzed iteratively using a constant comparative method [36] by the project team through extended discussions about what is occurring, how we know, and why it is important. We then mapped the results of these open-ended observations and interviews to the constructs identified from the literature. At the end of this process we identified a tentative set of constructs (and their definitions) that could be used to fully describe online instruction in the context of undergraduate STEM.

We then submitted our set of constructs to a panel of experts for expert validation. The expert panel included the four members of our project’s advisory board plus ten additional experienced online instructors recruited using a snowball technique [37] beginning with colleagues and members of our advisory panel. This panel individually reviewed the set of constructs and provided detailed feedback which the research team used to revise the set.

Phase 2

The goal of this phase was to design a first version of the instrument. In the first step in this phase, the research team took the set of constructs produced in Phase 1 and generated items for inclusion in the instrument. This was a group process that involved discussion among the project team members. The items agreed upon by the team were organized into an alpha version of the instrument. This alpha version was then reviewed and validated by the same expert panel used in Phase 1. The panel used their knowledge and experience with online courses to provide critiques regarding the extent to which the items reflect the concepts upon which they are based and whether they have face validity. Their feedback was used to revise the alpha version of the instrument that was tested in Phase 3. This was an iterative process, with the panel assessing revisions until no further changes were deemed needed.

Phase 3

The focus of this third phase of our project was to test, revise, and validate the version of the instrument that came out of Phase 2. The instrument was piloted with three instructors of completed online undergraduate STEM courses, including conducting additional semi-structured interviews with these participants. We then came together as a team to discuss these results and used these data to make revisions to the instrument. The revised instrument was then subjected again to review by the project’s expert panel.

Phase 4

The final phase of the project was to collect data from instructors in multiple STEM disciplines. In order to collect data from online education instructors at multiple universities, a list of 19 public higher education institutions that regularly offered a large number of online STEM courses was compiled. This step of data collection was completed before the COVID-19 pandemic and therefore reflects institutions with a history of online teaching that predates the massive shift to online education that occurred in Spring 2020. From these 19 institutions, a sampling frame was developed by reviewing course schedules and preparing a list of instructors at each university who were teaching online STEM undergraduate courses. Schedules for Fall 2019, Winter 2019–2020 (at institutions that have Winter sessions), and Spring 2020 were each reviewed to produce as comprehensive a list as possible. In total, 1,991 unique instructors were identified across the 19 universities, and 1,500 of them were randomly selected to receive an invitation to participate in the survey. Although the sample includes Spring 2020, which was a semester partially affected by the pandemic, the schedules were reviewed in January 2020 before the shift to online occurred at any U.S. institutions and therefore reflect courses that were planned to be offered online regardless of the pandemic.

Personalized email invitations to participate in the survey were sent to each of the randomly selected instructors in January 2021. Each individual was offered a $50 gift card to Amazon, Target, or Barnes & Noble for their participation. Reminder emails were later sent to any individuals who neither completed the survey nor unsubscribed from the email list. Overall, 251 instructors representing all 19 institutions completed the survey, corresponding to a 16.7% response rate, in comparison to a target sample of 250 (the number of gift cards budgeted).

Overall, the sample was demographically diverse. For gender identity, 56.2% were male and 43.4% were female, with the remaining 0.4% (n = 1) identifying as agender. For race/ethnicity, 72.9% were non-Hispanic white; 15.5% were Asian; 6.0% were Hispanic, Latino, or Spanish origin; 3.2% were Black or African American; 1.2% were Middle Eastern or North African; 0.8% (n = 2) were American Indian / Native American; and 0.4% (n = 1) identified with multiple non-White racial/ethnic identities. For appointment status, 24.3% reported being tenured, 7.2% tenure-track, and 68.5% reported a non-tenure/track appointment.

Measuring discussions

To measure whether an asynchronous discussion was present in the course, participants were asked the following yes/no question: “Do you assign discussion forums? For example, student-to-student discussions to further understanding of course topics.” To measure whether the instructor contributes to the discussion, the following yes/no question was asked: “Do you contribute, other than an initial prompting question, in your discussion forums for [course name]?” Finally, participants were provided a prompt to “please describe how often you contribute to discussion forums for the following reasons” and presented with a series of 12 types of contributions. Logical skip patterns were used to present these questions only when they were applicable (i.e., participants who said that they don’t use discussions were not presented with the other two prompts; participants who said they do not contribute to discussions were not presented with the final prompt). Univariate and factor analyses will be presented in the results section to explore the data from these indicators. For a reproduction of the questions used, see S1 Appendix.
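The skip pattern described above can be summarized as a simple conditional flow. The sketch below is purely illustrative: the function name and abbreviated question wording are ours and are not part of the instrument itself.

```python
from typing import List, Optional

def discussion_prompts(assigns_forums: bool, contributes: Optional[bool]) -> List[str]:
    """Return the discussion-related prompts a respondent sees under the skip logic."""
    prompts = ["Do you assign discussion forums?"]
    if not assigns_forums:
        return prompts  # remaining discussion questions are skipped
    prompts.append("Do you contribute, other than an initial prompting question?")
    if contributes:
        prompts.append("How often do you contribute for each of the 12 listed reasons?")
    return prompts

# Example: an instructor who uses forums but does not contribute sees only two prompts.
print(discussion_prompts(assigns_forums=True, contributes=False))
```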

Predictors

Because the survey was designed to focus on a particular course rather than the instructor, the inclusion of variables about the instructors themselves was limited. For example, there were no questions about pedagogical training. Nevertheless, there are a number of indicators in the survey that might be informative about correlates of and influences on decisions about discussion approaches. Participants were asked for the total number of students in the course, which may influence the feasibility of having discussions and the ability of the instructor to contribute. Instructors were also asked for the number of times they have taught this course online, how many years they have been teaching, and how many years they have been teaching online. Each of these is indicative of professional experience. Demographic variables were included as control variables, including gender (reference = male), race (reference = non-Hispanic white), and appointment type (reference = tenured). The descriptive statistics for these variables are displayed in Table 1.
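For readers reconstructing this setup, reference coding of the categorical controls could be done as in the pandas sketch below; the column names and category labels are illustrative assumptions, not the variable names in the released dataset.

```python
import pandas as pd

# Hypothetical instructor-level records; names and labels are assumptions.
df = pd.DataFrame({
    "gender": ["male", "female", "female"],
    "race": ["non-Hispanic white", "Asian", "non-Hispanic white"],
    "appointment": ["tenured", "non-tenure-track", "tenure-track"],
})

# Create one indicator per category, then drop the reference levels
# (male, non-Hispanic white, tenured) so they serve as the baseline.
dummies = pd.get_dummies(df, columns=["gender", "race", "appointment"])
dummies = dummies.drop(columns=[
    "gender_male",
    "race_non-Hispanic white",
    "appointment_tenured",
])
print(dummies.columns.tolist())
```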

Analytic strategy

Analyses began in SPSS 26.0 with a univariate examination of the variables measuring the presence and nature of discussions (RQ1). From there, factor analysis was used to further explore the types of contributions by instructors to identify common themes (RQ2). Finally, a series of logistic regressions were performed in SAS 9.4 using PROC LOGISTIC in order to estimate the correlates/predictors of discussion usage and style (RQ3). In order to retain as many cases as possible for the analysis, missing data for the predictor variables were imputed using PROC MI. Data were not imputed for any dependent variables (only one case had missing data on a dependent variable), and inapplicable cases were excluded as appropriate (e.g., instructors who do not use discussions are excluded from analyses predicting discussion contribution style).
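Although the original analyses were run in SPSS and SAS, the overall workflow can be approximated with open-source tools. The sketch below is a rough analogue only: the variable names are assumptions, the data are synthetic, and single imputation stands in for the multiple imputation performed by PROC MI.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Synthetic stand-in for the Phase 4 analysis file (column names are assumptions).
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "uses_discussion": rng.integers(0, 2, size=200),
    "n_students": rng.integers(10, 200, size=200).astype(float),
    "times_taught_online": rng.integers(1, 15, size=200).astype(float),
    "years_teaching": rng.integers(1, 30, size=200).astype(float),
})
df.loc[rng.choice(200, size=10, replace=False), "n_students"] = np.nan  # some missingness

predictors = ["n_students", "times_taught_online", "years_teaching"]

# Rough analogue of PROC MI: impute missing predictor values only (not the outcome).
df[predictors] = IterativeImputer(random_state=0).fit_transform(df[predictors])

# Rough analogue of PROC LOGISTIC: model discussion usage as a function of the predictors.
model = smf.logit("uses_discussion ~ " + " + ".join(predictors), data=df).fit()
print(model.summary())
```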

Results

As described above, this instrument development process employed an iterative mixed methods design. In Phase 1, we conducted a critical content analysis of CoI teaching presence elements. We also observed a set of completed online courses and conducted extended semi-structured interviews of the courses’ instructors. These qualitative data were used to develop the set of constructs that were turned into instrument items in Phase 2. Versions of this instrument were then subjected to iterative rounds of quantitative testing and revision in Phases 3 and 4. The final version of the instrument is included as S1 Appendix.

The data from the 251 instructors in Phase 4 were analyzed to better understand how they use discussions in their courses. When asked whether they assign discussion forums, 64.0% (n = 160) indicated that they do and 36.0% (n = 90) indicated that they do not (one participant did not answer the question). Among those who do use discussion forums, 72.5% (n = 116) reported that they contribute to the discussion beyond providing the initial prompt and 27.5% (n = 44) reported that they do not.

The univariate analyses regarding the type of instructor contribution are displayed in Table 2 (in the order the items were presented to participants). The most common types of contributions were guiding the class towards understanding course topics in a way that helps students clarify their thinking (#7), providing encouragement within a discussion (#5), and diagnosing misconceptions within a discussion (#4), with the majority of contributing instructors reporting these types of contributions often or always. Between one-third and one-half of contributing instructors reported using most of the remaining types of contributions often or always. Only two types of contributions were reported as being used often or always by less than one-third of contributing instructors: summarizing a discussion (#3) and building a consensus in a discussion (#2). All twelve contribution types were reportedly used by the majority of instructors at least some of the time.

A factor analysis of the responses to the twelve types of contributions is also displayed in Table 2. The analysis indicates that two factors can be extracted from these items, suggesting that there are two themes in contribution types. In order to further explore these themes, additional factor analyses and reliability analyses were estimated based on the factor loadings of the original factor analysis. Variables were selected using two inclusion criteria: a factor loading greater than .5 for the factor in question and a factor loading at least .1 greater than for the alternative factor. This approach produced two mutually exclusive sets of contribution types.
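The two inclusion criteria can be applied mechanically to a table of loadings, as in the sketch below. The loading values are placeholders for illustration, not the values reported in Table 2.

```python
import pandas as pd

# Placeholder two-factor loadings for the 12 contribution items (not the Table 2 values).
loadings = pd.DataFrame(
    {
        "facilitation": [0.41, 0.35, 0.30, 0.48, 0.52, 0.71, 0.68, 0.74, 0.66, 0.63, 0.58, 0.62],
        "direct": [0.62, 0.66, 0.71, 0.64, 0.60, 0.40, 0.35, 0.44, 0.38, 0.45, 0.42, 0.39],
    },
    index=[f"item_{i:02d}" for i in range(1, 13)],
)

def select_items(loadings: pd.DataFrame, target: str, other: str,
                 min_loading: float = 0.5, min_gap: float = 0.1) -> list:
    """Keep items loading above .5 on the target factor and at least .1 above the other factor."""
    keep = (loadings[target] > min_loading) & (loadings[target] - loadings[other] >= min_gap)
    return loadings[keep].index.tolist()

facilitation_items = select_items(loadings, "facilitation", "direct")
direct_items = select_items(loadings, "direct", "facilitation")
print(facilitation_items, direct_items)
```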

An analysis of the first factor is displayed in Table 3. This factor included items related to: guiding students toward understanding, encouraging engagement and productive dialog, maintaining focus, student exploration of new concepts, and guiding students away from non-relevant discussion. These items indicate an emphasis on student contributions to the discussions with the instructor acting as a navigator. Based on the types of contributions included in this factor, it has been identified as “facilitation.” An analysis of the second factor is displayed in Table 4. This factor included items related to: consensus building, summarizing, diagnosing misconceptions, and providing encouragement. These items indicate an emphasis on instructor contributions to the discussions with the instructor injecting substantive contributions with a more hands-on approach. The item related to providing encouragement (which also has the lowest factor loading) is less overtly about adding substantive information than the others, and more information would be useful in better understanding its relation to the others. Based on the types of contributions included in this factor, it has been identified as “direct contribution.” Together, these represent two different approaches an instructor might take with discussions; one that focuses on classroom management in order to keep discussions productive, and another that focuses on a more hands-on approach for interaction. Although these approaches are different, they are not necessarily mutually exclusive. To the contrary, the two factors have a positive correlation (r = .646, n = 116, p < .01), suggesting that an instructor who uses one of these approaches has an increased likelihood of also using the other. Overall, 33.6% of contributing instructors are above the mean on both factors, 16.4% are above the mean for only the facilitation factor, 15.5% are above the mean for only the direct contribution factor, and the remaining 34.5% are below the mean for both factors, suggesting a wide variety of approaches that mix and match different styles.
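The four-way breakdown reported above follows from splitting each factor score at its mean. A minimal sketch of that classification is shown below, using randomly generated stand-in scores rather than the study's actual factor scores.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
# Stand-in factor scores for the 116 contributing instructors (not the study's scores).
scores = pd.DataFrame({
    "facilitation": rng.normal(size=116),
    "direct_contribution": rng.normal(size=116),
})

# Pearson correlation between the two contribution styles.
r = scores["facilitation"].corr(scores["direct_contribution"])

# Classify each instructor by whether they are above the mean on each factor.
above = scores.gt(scores.mean())
label = np.select(
    [above.facilitation & above.direct_contribution,
     above.facilitation & ~above.direct_contribution,
     ~above.facilitation & above.direct_contribution],
    ["both", "facilitation only", "direct contribution only"],
    default="neither",
)
print(round(r, 3))
print(pd.Series(label).value_counts(normalize=True).round(3))
```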

The logistic regression results predicting whether a discussion is used are presented in Table 5 (Model 1). Each increase in the number of times an instructor has taught a course is associated with a significant increase in the probability that a discussion is currently included in the course (β = .260, OR = 1.087, p < .05). No other predictors have significant effects in this model. It is important to keep in mind that whether a discussion is offered is a pedagogical decision and is likely influenced by variables not included here, so this is not designed or expected to be a comprehensive model of all relevant predictors. The logistic regression results predicting whether the instructor contributes to a discussion are also presented in Table 5 (Model 2). Just as with the previous model, the number of times they have taught a course is the only significant predictor, though the direction is reversed here. Each increase in the number of times an instructor has taught a course is associated with a significant decrease in the probability that the instructor contributes to the discussion (β = -.337, OR = .904, p < .01).

Table 5. Logistic regression predicting discussions and contributions.

https://doi.org/10.1371/journal.pone.0275880.t005

The logistic regression results predicting whether a contributing instructor is above the mean for the facilitation contribution factor are presented in Table 6 (Model 3). None of the experience-related predictors are significant, though instructors who identify as a racial/ethnic minority are significantly more likely to be above the mean for facilitation-type contributions than are non-Hispanic white instructors (β = .236, OR = 2.573, p < .05). The logistic regression results predicting whether a contributing instructor is above the mean for the direct contribution factor are presented in Table 6 (Model 4), though none of the predictors have a significant effect. Together, these suggest that the influences that lead to the type of contributions instructors make are generally not related to the number of students or the length of the instructor’s experience.

Table 6. Logistic regression predicting discussion contribution type.

https://doi.org/10.1371/journal.pone.0275880.t006

Discussion

The CoI framework has been used to explain what constitutes a meaningful learning environment in online courses. CoI predicts that teaching presence, one of the necessities for effective learning, includes three components: instructional design, facilitating discourse, and direct instruction [16]. Online asynchronous discussion is a tool used by instructors that can contribute to teaching presence, and research indicates that such discussion provides various benefits in online courses [23–25]. Despite theoretical arguments and research evidence indicating that discussions are a vital component of online learning, much is unknown about the current use of discussions in online courses.

The present study examined self-reported instructional actions for asynchronous online discussions in fully online higher education courses. The resulting analyses indicated that nearly two-thirds (64.0%) of online instructors use asynchronous discussion forums as part of their online courses. Moreover, 72.5% of those instructors (46.4% of instructors overall) participate in those discussions. This suggests that asynchronous discussions are a commonly used tool in online education and facilitate both student-student and student-instructor discourse. The CoI framework’s concept of facilitating discourse suggests that either of these types of discourse contributes to teaching presence.

The factor analysis of the data identified two underlying factors: facilitation and direct instruction. This is consistent with the CoI framework, which includes both of these elements as part of teaching presence [16]. Although the CoI framework’s third teaching presence element, instructional design, was not identified in this analysis, it is important to note that instructional design sets the stage for the other two elements and is therefore distinct from what happens during the discussion itself. Thus, the results are compatible with the CoI framework’s concept of teaching presence in asynchronous discussions.

The analyses also examined the prevalence of discussions and associated instructional techniques. Although facilitation and direct instruction were identified as part of teaching presence in the CoI framework over two decades ago [12], we could identify no existing research that has attempted to use instructor characteristics to predict how these manifest in discussions (or generally). Given the dearth of information on this subject, the models estimated in this study were exploratory in nature. The analyses suggest that the more often an instructor has taught the course, the more likely they were to include a discussion component, but the less likely they were to contribute to the discussion. Although this was a statistically significant finding in our analyses, we were unable to identify prior research or a theoretical explanation for this relationship, so replication and further exploration are necessary to confirm and support this finding. Expanding our knowledge of the correlates of instructional practices may be useful in better understanding the practices themselves, as well as in developing more effective strategies for encouraging instructor development and improvement. A more expansive dataset will be necessary to determine the nature of the relationships identified here. For example, these effects could be the result of additional experience, but they may also be the result of generational differences in instructional approach or a self-selection bias in who teaches the course more often. Likewise, they could be the result of disciplinary differences. These questions are beyond the scope of the present study, but the results here provide a potential foundation upon which to further explore this curious relationship.

One key limitation of this study is the use of self-report data. The instrument relies on instructors to accurately report how they interact with students in discussions. The confidential, no-risk design of the data collection makes it unlikely that instructors would feel pressured to intentionally present false responses, but other forms of human error are possible. Prior research has shown that self-report and observational data have reasonable reliability with each other [38], suggesting that instructor self-report data can be reasonably trusted, though replicating the present study with observational data would undoubtedly provide a useful confirmation. Another limitation is the focus on asynchronous discussions over synchronous discussions. At the time this project was first planned in 2014 and funded by the National Science Foundation in 2017, there was no option at our university for instructors and faculty to offer fully online courses with scheduled, synchronous meeting times. All online courses were assumed to be primarily or entirely asynchronous. This influenced our decision to focus our project primarily on asynchronous online learning. Since the Fall 2020 semester, our institution has created new designations for courses that include scheduled synchronous sessions, directly in response to the COVID-19 pandemic, which caused the temporary halt of all face-to-face instruction in March 2020. We acknowledge that, between the start of this project and the present, there has been substantial growth in the amount of online undergraduate education offered synchronously using tools such as Zoom, WebEx, and Google Meet, which is a limitation of this study. Future research should build on this work by incorporating synchronous teaching approaches into the instrument and analyses.

Given that nearly two-thirds of instructors report using asynchronous discussions in this study, the importance of better understanding the associated instructional techniques is clear. As we work towards identifying evidence-based best practices in online STEM education, using instruments like the one developed here will help in breaking teaching presence down into its components and better understanding what effect they might have on student outcomes. Although the present study used undergraduate STEM courses as the developmental baseline for the instrument, based on calls to better understand STEM instructional practices [38], it is plausible that this self-report survey can be used or adapted for use in non-STEM areas as well, subject to further testing of the instrument.

Future research also ought to examine how this self-report survey (or some variation of it) can be used by online instructors to help them self-evaluate and improve their teaching methods, particularly as they relate to the CoI concept of teaching presence. As McCombs put it, “Teachers need self-assessment and reflection tools to help them assess fundamental beliefs and assumptions about learning, learners and teaching…” [39, p. 1]. A search of the literature found no other self-report surveys of teachers’ online learning practices, particularly surveys drawing on the CoI framework.

Although this instrument was partially used in a survey approach in the present study for development and testing, the instrument is designed to be used on an individual basis for self-assessment and reflection. The intent is that it can be used by individual instructors seeking a better understanding of their own instructional techniques and pursuing professional growth. This might be further facilitated by discussing a self-assessment with a colleague, which our interviewed participants found helpful. The self-report instrument used in this study is now available for instructors and researchers [40] under an open license (CC BY-NC-SA 4.0). Research on how this instrument can be used to help instructors reflect on and improve their online teaching would be a welcome contribution to the areas of teacher training and teacher development.

Supporting information

S1 Appendix. Questionnaire.

A reduced questionnaire that includes only the questions used for the present study, as they were phrased in the January 25, 2021 version used for data collection. For the full, final questionnaire, visit: https://scholarworks.wmich.edu/instruments_teaching/.

https://doi.org/10.1371/journal.pone.0275880.s001

(PDF)

S1 Data. Minimal dataset.

The data required to replicate all study findings reported in the study.

https://doi.org/10.1371/journal.pone.0275880.s002

(SAV)

Acknowledgments

The authors thank the hundreds of instructors who participated in this study, as well as the advisory board who provided critical feedback and staff members who helped facilitate this project.

References

  1. Allen IE, Seaman J. Online report card: Tracking online education in the United States. 2016; Oakland, CA: Babson Survey Research Group. Available from: http://onlinelearningsurvey.com/reports/onlinereportcard.pdf
  2. National Center for Education Statistics. Integrated Postsecondary Education Data System (IPEDS), Spring 2019 and Spring 2020, Fall Enrollment component. 2021; Washington, DC: National Center for Education Statistics. Available from: https://nces.ed.gov/programs/digest/d20/tables/dt20_311.15.asp
  3. President’s Council of Advisors on Science and Technology. Engage to Excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics. 2012; Washington, DC: Author.
  4. American Association for the Advancement of Science. Vision and change in undergraduate biology education: A call to action. 2011; Washington, DC: AAAS.
  5. Singer SR, Nielsen NR, Schweingruber HA, editors. Discipline-based education research: Understanding and improving learning in undergraduate science and engineering. 2012; Washington, DC: National Academies Press. Available from: http://www.nap.edu/catalog/13362/discipline-based-education-research-understanding-and-improving-learning-in-undergraduate
  6. Rutherford FJ, Ahlgren A. Science for all Americans. 1990; New York, NY: Oxford University Press.
  7. Carnevale AP, Smith N, Melton M. STEM. 2011; Washington, DC: Georgetown University Center on Education and the Workforce. Available from: https://cew.georgetown.edu/wp-content/uploads/2014/11/stem-complete.pdf
  8. Bransford JD, Brown AL, Cocking RR. How people learn: Brain, mind, experience, and school. 2000; Washington, DC: National Academy Press.
  9. Seaman J, Seaman J. 2019; Bayview Analytics. Available from: https://www.bayviewanalytics.com/reports/almanac/national_almanac2019.pdf
  10. National Center for Education Statistics. Table 311.15. Available from: https://nces.ed.gov/fastfacts/display.asp?id=80
  11. Seaman, Allen, Ralph. Teaching online: STEM education in the time of COVID. 2021; Bayview Analytics. Available from: https://www.bayviewanalytics.com/reports/stem_education_in_the_time_of_covid.pdf
  12. Garrison DR, Anderson T, Archer W. Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education. 2000;2(2–3):87–105.
  13. Shea P, Vickers J, Hayes S. Online instructional effort measured through the lens of teaching presence in the Community of Inquiry framework: A re-examination of measures and approach. The International Review of Research in Open and Distance Learning. 2010;11(3).
  14. Akyol Z, Garrison DR, Ozden MY. Online and blended communities of inquiry: Exploring the developmental and perceptual differences. International Review of Research in Open and Distance Learning. 2009;10(6):65–83.
  15. Kanuka H. Interaction and the online distance classroom: Do instructional methods effect the quality of interaction? Journal of Computing in Higher Education. 2011;23:143–156.
  16. Garrison DR, Arbaugh JB. Researching the community of inquiry framework: Review, issues, and future directions. The Internet and Higher Education. 2007;10(3):157–172.
  17. DeNoyelles A, Mannheimer Zydney J, Chen B. Strategies for creating a community of inquiry through online asynchronous discussions. Journal of Online Learning & Teaching. 2014;10(1):153–165.
  18. Arbaugh JB, Hwang A. Does “teaching presence” exist in online MBA courses? The Internet and Higher Education. 2006;9(1):9–21.
  19. Dixson M, Kuhlhorst M, Reiff A. Creating effective online discussions: Optimal instructor and student roles. Journal of Asynchronous Learning Networks. 2006;10(3):15–28.
  20. Walter EM, Beach AL, Henderson C, Williams CT. Measuring postsecondary teaching practices and departmental climate: The development of two new surveys. Paper presented at: Transforming Institutions: 21st Century Undergraduate STEM Education Conference; 2014 October; Indianapolis, IN.
  21. Hora MT, Oleson A, Ferrare JJ. Teaching Dimensions Observation Protocol (TDOP) user’s manual. 2012; Madison, WI: Wisconsin Center for Education Research, University of Wisconsin-Madison. Available from: http://tdop.wceruw.org/Document/TDOP-Users-Guide.pdf
  22. Hrastinski S. Asynchronous and synchronous e-learning. EDUCAUSE Quarterly. 2008;31(4).
  23. Swan K. Virtual interaction: Design factors affecting student satisfaction and perceived learning in asynchronous online courses. Distance Education. 2001;22(2):306–331.
  24. Wise K, Hamman B, Thorson K. Moderation, response rate, and message interactivity: Features of online communities and their effects on intent to participate. Journal of Computer-Mediated Communication. 2006;12(1):24–41.
  25. Wu D, Hiltz SR. Predicting learning from asynchronous online discussions. Journal of Asynchronous Learning Networks. 2004;8(2):139–152.
  26. Jiang M, Ting E. A study of factors influencing students’ perceived learning in a web-based course environment. International Journal of Educational Telecommunications. 2000;6(4):317–338.
  27. Mazzolini M, Maddison S. Sage, guide, or ghost? The effect of instructor intervention on student participation in online discussion forums. Computers & Education. 2003;40(3):237–253.
  28. Mazzolini M, Maddison S. When to jump in: The role of the instructor in online discussion forums. Computers & Education. 2007;49(2):193–213.
  29. An H, Shin S, Lim K. The effects of different instructor facilitation approaches on students’ interactions during asynchronous online discussions. Computers & Education. 2009;55(3):749–760.
  30. Nandi D, Hamilton M, Harland J. Evaluating the quality of interaction in asynchronous discussion forums in fully online courses. Distance Education. 2012;33(1):5–30.
  31. Arend B. Encouraging critical thinking in online threaded discussions. The Journal of Educators Online. 2009;6(1):1–23.
  32. Creswell JW, Plano Clark VL, Gutmann ML, Hanson WE. Advanced mixed methods research designs. In: Tashakkori A, Teddlie C, editors. Handbook of mixed methods in social and behavioral research. Thousand Oaks, CA: Sage Publications; 2003.
  33. Ivankova NV, Creswell JW, Stick SL. Using mixed methods sequential explanatory design: From theory to practice. Field Methods. 2006;18:3–20.
  34. Walter EM, Henderson C, Beach AL, Williams CT. Development and preliminary validation of the Postsecondary Instructional Practices Survey (PIPS). Paper presented at: The Annual Conference of the American Educational Research Association; 2015 April; Chicago, IL.
  35. Tashakkori A, Teddlie C, editors. Handbook of mixed methods in social and behavioral research. Thousand Oaks, CA: Sage Publications; 2003.
  36. Glaser BG, Strauss AL. The discovery of grounded theory: Strategies for qualitative research. Chicago: Aldine Publishing Company; 1967.
  37. Creswell JW. Research design: Qualitative, quantitative, and mixed methods approaches. Thousand Oaks, CA: Sage Publications; 2013.
  38. Smith MK, Vinson EL, Smith JA, Lewin JD, Stetzer KR. A campus-wide study of STEM courses: New perspectives on teaching practices and perceptions. CBE–Life Sciences Education. 2014;13:624–635. pmid:25452485
  39. Shea P, Bidjerano T. Community of inquiry as a theoretical framework to foster “epistemic engagement” and “cognitive presence” in online education. Computers and Education. 2009;52:543–553.
  40. Horvitz B, DeCamp W, Kowalske MG, Garza Mitchell RL. STEM Online Course Auto-Report. Instruments for Measuring Online Teaching Practices, 2021. Available from: https://scholarworks.wmich.edu/instruments_teaching/2