
Preparing future STEM faculty through flexible teaching professional development

  • Bennett B. Goldberg ,

    Contributed equally to this work with: Bennett B. Goldberg, Derek O. Bruff

    Roles Conceptualization, Formal analysis, Funding acquisition, Investigation, Project administration, Supervision, Writing – original draft, Writing – review & editing

    bennett.goldberg@northwestern.edu

    Affiliation Department of Physics and Astronomy, Northwestern University, Evanston, IL, United States of America

  • Derek O. Bruff ,

    Contributed equally to this work with: Bennett B. Goldberg, Derek O. Bruff

    Roles Conceptualization, Formal analysis, Funding acquisition, Investigation, Project administration, Supervision, Writing – original draft, Writing – review & editing

    Current address: Center for Excellence in Teaching and Learning, University of Mississippi, Oxford, MS, United States of America

    Affiliation Center for Teaching, Vanderbilt University, Nashville, TN, United States of America

  • Robin McC. Greenler,

    Roles Conceptualization, Investigation, Methodology, Writing – original draft, Writing – review & editing

    Affiliation CIRTL Network, University of Wisconsin–Madison, Madison, WI, United States of America

  • Katherine Barnicle,

    Roles Conceptualization, Data curation, Funding acquisition, Project administration, Writing – review & editing

    Affiliation CIRTL Network, University of Wisconsin–Madison, Madison, WI, United States of America

  • Noah H. Green,

    Roles Data curation, Investigation, Project administration, Software, Visualization, Writing – review & editing

    Current address: The Science Communication Lab, Berkeley, CA, United States of America

Affiliation Green Scientific and Educational Consulting, Charlottesville, VA, United States of America

  • Lauren E. P. Campbell,

    Roles Data curation, Methodology, Resources

    Affiliation Department of Physics & Astronomy, Vanderbilt University, Nashville, TN, United States of America

  • Sandra L. Laursen,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Writing – original draft

    Affiliation Ethnography & Evaluation Research, University of Colorado Boulder, Boulder, CO, United States of America

  • Matthew J. Ford,

    Roles Data curation, Formal analysis, Software, Visualization, Writing – original draft

    Current address: School of Engineering & Technology, University of Washington Tacoma, Tacoma, WA, United States of America

    Affiliation Northwestern IT Research Computing Services, Northwestern University, Evanston, IL, United States of America

  • Amy Serafini,

    Roles Investigation, Project administration, Writing – review & editing

Affiliation Department of Educational Foundations, Leadership & Technology, Auburn University, Auburn, AL, United States of America

  • Claude Mack,

    Roles Data curation, Methodology, Project administration, Software

    Current address: Machtfit GmbH, Berlin, Germany

    Affiliation Department of Physics & Astronomy, Vanderbilt University, Nashville, TN, United States of America

  • Tamara L. Carley,

    Roles Investigation, Methodology, Project administration

    Affiliation Department of Geology and Environmental Geosciences, Lafayette College, Easton, PA, United States of America

  • Christina Maimone,

    Roles Data curation, Formal analysis, Visualization, Writing – original draft

    Affiliation Northwestern IT Research Computing Services, Northwestern University, Evanston, IL, United States of America

  • Henry (Rique) Campa III

    Roles Conceptualization, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Writing – original draft, Writing – review & editing

    Affiliation Graduate School and Department of Fisheries and Wildlife, Michigan State University, East Lansing, MI, United States of America

Abstract

We have prepared thousands of future STEM faculty around the world to adopt evidence-based instructional practices through their participation in two massive open online courses (MOOCs) and facilitated in-person learning communities. Our novel combination of asynchronous online learning and coordinated, structured face-to-face learning community experiences provides flexible options for STEM graduate students and postdoctoral fellows to pursue teaching professional development. A total of 14,977 participants enrolled in seven offerings of the introductory course held 2014–2018, and 1,725 participants (11.5% of those enrolled) completed the course. The high levels of engagement and learning we observed suggest that leveraging the affordances of educational technologies and the geographically clustered nature of this learner demographic, in combination with flexible online learning, could be a sustainable model for large-scale professional development in higher education. The preparation of future STEM faculty makes an important difference in establishing high-quality instruction that meets the diverse needs of all undergraduate students, and the initiative described here can serve as a model for increasing access to such preparation.

Introduction

There is recognition that evidence-based, student-centered instruction in science, technology, engineering, and mathematics (STEM) generally increases undergraduate student learning and success in STEM [1, 2] and reduces the disparities in outcomes between marginalized students who are historically underrepresented in STEM and majority students in STEM [3–6]. There is also evidence that current [7, 8] and future [9] faculty who engage in effective teaching professional development go on to implement evidence-based pedagogies in their classes. These findings motivate pedagogical professional development programs offered by university teaching centers, graduate schools, and postdoctoral training initiatives [10].

Future STEM faculty—that is, doctoral students and postdoctoral fellows (hereafter, postdocs) who seek academic careers—face particular challenges in learning about and adopting evidence-based teaching practices, including limited opportunities and lack of advisor support for pedagogical professional development [11–14]. Despite this, graduate students and postdocs may be more receptive than current faculty to exploring and implementing evidence-based teaching practices because they are in the process of learning the standards of academia, developing scientific and teaching practices in their discipline, and preparing for competitive academic positions [15]. Encouragingly, future STEM faculty who participate in moderate- or high-engagement pedagogical professional development (greater than 25 hours of participation) report significantly improved self-efficacy as instructors and significantly higher adoption of evidence-based teaching practices [9], and they perform as well as or better than their peers in research [16].

To provide such professional development opportunities to future STEM faculty, and thereby improve undergraduate education in the U.S. more broadly, the Center for the Integration of Research, Teaching, and Learning (CIRTL) Network, which currently consists of 43 research universities across the United States and Canada, provides structured pedagogical professional development programs for graduate students and postdocs at individual campuses and through cross-Network programming [17–19]. Many of these programs are structured as in-person or virtual, synchronous learning communities [18, 20], where participants meet to learn from and with each other as they pursue shared learning goals [21]. The Network also serves as a community of practice [22] for leaders of future STEM faculty development to share strategies and expertise and to co-develop and implement network-wide programs.

In 2013, a series of CIRTL Network conversations on emerging models for future STEM faculty development led a small group of faculty, administrators, and researchers to propose a new initiative centered on using massive open online courses, also known as MOOCs. Interest in this new form of asynchronous online education accelerated rapidly in the early 2010s [23], with educators and researchers exploring the potential for online tools such as videos, discussions, and peer assessments to support learning for thousands of concurrent students [24]. Interestingly, research shows that the more successful MOOCs have been associated with targeted rather than general audiences [25].

In this context of pedagogical experimentation and with funding from the National Science Foundation, our specific goals were to design, deliver, and evaluate MOOCs on evidence-based undergraduate STEM teaching for future faculty pedagogical professional development. This in itself was not novel; other MOOCs developed in the same time frame also had this focus [26]. Inspired by instructors who “wrapped” campus-based courses around existing MOOCs [27] and informed by the CIRTL Network’s experience with campus-based and virtual learning communities, we planned the online courses to be delivered in three different modes to meet the diverse learning needs of future faculty: (1) as stand-alone MOOCs for online participants, (2) as blended online and in-person experiences built around what we called MOOC-Centered Learning Communities, or MCLCs, and (3) as open educational resources for use by individuals or by campus-based professional development programs. By inviting colleagues around the CIRTL Network and beyond to host MCLCs of participants in the online courses and providing MCLC facilitators with learning guides to support their local in-person meetings, we designed a novel structure that has enabled us to meet the professional development needs of thousands of future STEM faculty worldwide.

To inform the development of our voluntary educational program for adult learners, we gathered quantitative and qualitative data to evaluate our program, to understand our audience, and to discover the outcomes they derived (or not) from participating. Like most questions about “what works” in education, the answers necessarily depend on who participates and how [28]. Our data and analyses offer relevance beyond our own program, however, because they shed light on fundamental questions—still very much debated by researchers—about the affordances, limitations, and challenges of MOOCs in facilitating education for learners in diverse, global contexts [29–31]. Indeed, a recent review suggests that the pedagogical approaches and educational resources of MOOCs—the aspects we emphasize here—are particularly under-studied [32]. Hence our purpose is to contribute to this body of knowledge a careful description of a particular educational design and a critique of whether and how the resulting MOOC and associated MCLCs served our intended audience. We operationalized this purpose with three guiding questions:

  1. Will our model enroll participants from our intended audience (future STEM faculty) at a scale beyond that typically reached by traditional on-campus or synchronous professional development programs?
  2. To what extent will participants increase their knowledge of and confidence in delivering effective STEM instruction?
  3. What effects will the MCLCs have on participation and participant outcomes in the broader MOOC?

These questions recognize the exploratory nature and context sensitivity of studies about real-world educational practices, in contrast to studies involving hypothesis testing in structured environments where variables can be controlled and individually manipulated [33].

Methods

The methods section is organized as follows: We first describe the course structure, curriculum design, assessments, and associated MCLCs of the two courses, an introductory and an advanced MOOC, in order to situate the guiding questions of participant engagement, participant learning, and MCLC outcomes within the project design. We next describe the data sources that we used to answer our guiding questions and inform our conclusions: participant course activity, pre- and post-course surveys, and MCLC facilitator surveys and interviews.

Course structure and logistics

In 2013–2014, we launched an eight-week introductory MOOC, Introduction to Evidence-based Undergraduate STEM Teaching, followed by a second, more advanced eight-week MOOC, Advanced Learning Through Evidence-Based STEM Teaching, in 2015–2016. Each course consists of six modules, each featuring instructional videos, discussion prompts, recommended readings, and a quiz that together take three to five hours to complete. Each course also includes three peer-graded assessments (PGAs). Each course took approximately a year to design and build, and each drew on expertise from multiple institutions and individuals within and outside the CIRTL Network.

The introductory course examines the fundamentals of learning and learning design, including learning objectives, assessment, and active learning, culminating in a final PGA in which participants develop a sample lesson plan incorporating these core elements. The advanced course delves deeper into evidence-based teaching practices, including peer instruction, cooperative learning, and inquiry-based labs. The final PGA in the second course requires participants to develop a teaching philosophy statement that demonstrates their understanding of and preferences among the teaching practices they have learned. Based on course objectives, module learning goals, and assessments, we defined a “completer” as a participant who earned an overall score of at least 50% from a combination of quizzes (the four highest quiz scores together weighted 60%) and PGAs (individually weighted 10%, 10%, and 20%). Each course has been offered once or twice a year since launch, originally on the Coursera platform and now on the edX platform.
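To make the completion rule concrete, the following minimal sketch (our illustration, not the project’s actual grading code) computes a participant’s overall score and completer status under the weighting just described.

    # Illustrative sketch of the completion rule described above: the four
    # highest quiz scores together contribute 60%, and the three peer-graded
    # assessments (PGAs) are weighted 10%, 10%, and 20%. A participant
    # "completes" with an overall score of at least 50%. Scores are assumed
    # to be fractions in [0, 1].

    def overall_score(quiz_scores, pga_scores):
        top_four = sorted(quiz_scores, reverse=True)[:4]
        quiz_part = 0.60 * (sum(top_four) / 4)  # four highest quizzes, 60% together
        pga_part = sum(w * s for w, s in zip((0.10, 0.10, 0.20), pga_scores))
        return quiz_part + pga_part

    def is_completer(quiz_scores, pga_scores, threshold=0.50):
        return overall_score(quiz_scores, pga_scores) >= threshold

    # Example: strong quizzes and middling PGAs yield an overall score of 0.70.
    print(is_completer([0.9, 0.8, 1.0, 0.7, 0.6], [0.5, 0.6, 0.4]))  # True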

To foster greater engagement and learning, we encourage participants in the online courses to join or start an MCLC. These learning communities, typically hosted on university campuses, meet weekly to share, discuss, and contextualize what participants are learning in the online course. Depending on local needs, MCLCs can be part of credit-bearing courses or non-credit seminars, or simply an informal set of meetings among peers or colleagues. Each MCLC has a facilitator who regularly convenes the community and plans discussions or other activities for the in-person meetings [34]. To support facilitators, we provide an “MCLC Facilitators’ Guide” that includes learning goals and objectives for online and in-person sessions; overviews of the online videos, discussion prompts to engage participants with course content, and assignments; and 3–7 suggested activities with facilitator notes for each module that complement and extend the online materials. Our project team promotes MCLCs as a teaching and learning professional development opportunity and recruits MCLC facilitators at CIRTL Network campuses and through our respective networks (e.g., professional disciplinary societies, campus academic units, colleagues external to our own campuses) to draw a diverse and international community.

The project website, https://www.stemteachingcourse.org/, makes freely available most of the course content, including videos, accompanying slides, discussion prompts, and instructions for each PGA. The project also has a public YouTube channel, https://www.youtube.com/user/cirtlmooc, featuring all course videos organized by course module. All materials are made available under a Creative Commons 4.0 Attribution-Noncommercial license to facilitate reuse by anyone interested in STEM teaching or pedagogical professional development.

Data sources and analysis

We invited course participants to take surveys for project evaluation. These surveys and learning activity data obtained from the course platforms were determined “not human subjects research” by the Michigan State University Institutional Review Board (IRB), hence formal consent was not required. Interviews of MCLC facilitators were determined to be Exempt by the IRB at University of Colorado Boulder (protocol 15–0658). All interview respondents provided informed consent in writing.

Data on course participation, engagement, and outcomes come from four main sources, which we analyzed to answer our first two guiding questions: whether and how we reached our target audience, and what participants learned. The first source is the MOOC platforms, Coursera and edX. For both platforms, data are available on which quizzes and PGAs participants attempted and completed, which course videos they watched (operationalized as initiation of playing), and whether they officially completed the course. The second source is a pre-course survey, which asked about participants’ demographic characteristics, intended course activity, and familiarity with concepts covered in the course.

The third source is a post-course survey, which inquired about demographics, self-reported activity in the course, familiarity with concepts covered in the course, self-reported learning gains, and evaluations of course components such as assignments and videos. The pre- and post-course surveys were first administered through Coursera and later through Qualtrics (Qualtrics, Provo, UT). Course participants were encouraged, but not required, to take the surveys. Respondents were asked to generate and enter an anonymous code to allow linkage between the two surveys. Data were extracted from the course platforms and Qualtrics, cleaned, combined in a database, and then analyzed descriptively and statistically. Descriptive analysis was conducted with Python, including the pandas library [35], and data visualizations were created with seaborn [36] and matplotlib [37]. To answer our first Guiding Question, whether we reached our target audience and had an impact at a larger scale than local projects, we analyzed the demographic and completion data and compared them with local professional development programs. To answer our second Guiding Question, what learning participants achieved, we examined the course assessments and self-reported gains.
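The survey-linkage step can be illustrated with a short pandas sketch. This is our illustration under assumed file and column names (e.g., anon_code, familiarity_backward_design), not the project’s actual pipeline.

    # Minimal sketch, with hypothetical file and column names, of linking
    # pre- and post-course survey responses via the participant-generated
    # anonymous code described above.
    import pandas as pd

    pre = pd.read_csv("pre_course_survey.csv")    # includes an 'anon_code' column
    post = pd.read_csv("post_course_survey.csv")  # includes the same 'anon_code'

    # Keep only respondents who can be paired across the two surveys.
    paired = pre.merge(post, on="anon_code", suffixes=("_pre", "_post"))

    # Example descriptive analysis: change in self-reported familiarity with a
    # course concept, coded on a numeric Likert-type scale (hypothetical columns).
    paired["familiarity_gain"] = (
        paired["familiarity_backward_design_post"]
        - paired["familiarity_backward_design_pre"]
    )
    print(paired["familiarity_gain"].describe())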

Finally, data about MCLC facilitators came from surveys of facilitators conducted in 2014–15 and interviews conducted in 2016, and were used to answer our third Guiding Question. Analysis of these sources provided rich formative feedback and also characterized MCLC implementation. The surveys gathered information about MCLC composition, structure, and participation, and about the facilitators’ preparation, their interactions with learners, and the benefits they perceived for the learners and for themselves [38]. Interviews probed these elements further and asked facilitators about the MOOC elements, MCLC facilitation strategies, and integration with other professional development programming on their campuses [39].

Results

Learner engagement and target audience

To answer our first Guiding Question, we sought to understand whether we reached our intended audience, how they engaged, what they learned, what motivated them, and their overall satisfaction. A total of 14,977 people enrolled in seven offerings of the introductory course held 2014–2018, with 1,725 participants from approximately 60 countries completing. (As noted above, course participants who completed a combination of quizzes and PGAs earned a certificate of completion.) The average introductory course completion rate was 11.5%. Enrollment and completion numbers for the advanced MOOC were lower: 5,320 people registered for the four offerings to date, with 291 completers and a course-averaged completion rate of 6.3% (S1 Table in S1 File).

Pre-course survey data from a subset of registrants (described in more detail below) offer insight into course participants’ goals in taking the course. Respondents’ most frequently reported intent was to “enhance my STEM teaching skills,” with 93% ranking this as “Important” or “Very important”; the goal to “enhance learning of my students” was close behind at 89%. These and other factors related to improved teaching and learning substantially outranked achievement-oriented goals, including “earn credentials for my CV” (49%) and “earn a statement of accomplishment” (31%). The importance of motivating factors did not differ notably by role or by the respondent’s ultimate level of engagement with the course. S6 Table in S1 File shows the rank order of all motivations.

In addition to the category of “completer,” which we defined a priori, platform data pointed us to additional categories of engagement and learning, which we define from the perspective of MOOCs and similar free, online, self-directed adult learning environments. In developing a “learner” category, we focused on those who engaged in significant ways with the course materials, as a proxy for learning and gains in knowledge. We set the threshold for “learners” based on the greater than 50% week-to-week drop-offs observed in participant activity, as a function of both video watching and assignment completion: “learners” watched course videos after Week 2 and/or completed course quizzes after Week 1 (the outlined region in Fig 1A). As other MOOC developers have also found, course engagement typically drops off after the first or second week [40, 41]. Learners represented 22% of those enrolled in the introductory course and fell into two main groups distinguished by their behaviors: completers (53% of learners, or 11.5% of all enrolled), who participated in quizzes and peer-graded assignments, and non-completers, whom we call “auditors” (47% of learners, or 10.2% of enrolled). Auditors primarily watched course videos without attempting quizzes and PGAs, and about half of the auditors watched videos in all six of the course modules [42–44]. Thus, 68% of the learners engaged continuously with the material throughout the course. Fig 2 characterizes these engagement categories in a consort diagram, with percentages for the introductory course; a code sketch of the classification follows the figures. The supporting materials contain detailed data about week-by-week engagement with course activities, as well as additional data that support the distinction between learners and non-learners, completers and auditors.

Fig 1. Learner engagement vs. video watching, assignments, and weeks of the introductory STEM MOOC.

(A) Joint histogram of participants enrolled in the introductory MOOC by the total number of modules/course weeks in which they participated by watching videos (vertical axis) or taking quizzes (horizontal axis). The outlined region approximately separates “learners” from disengaged “non-learners”; a small number (31, or 1%) of learners who completed a PGA but few quizzes may not fall within the outlined region. (Note that the course included a module “0” introducing the course.) (B) Percent of total enrolled who participated during each module/course week by watching more than one video that week, distinguished by category of participant.

https://doi.org/10.1371/journal.pone.0276349.g001

Fig 2. Consort diagram of course participants.

Consort diagram displaying the overall total of participants who enrolled, the fraction that did or did not engage with more than one video, page, or activity, and the subsequent fraction that we describe as learners, both those who completed and those who audited.

https://doi.org/10.1371/journal.pone.0276349.g002
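Under these definitions, a participant’s category can be read off from a few per-participant activity fields. The sketch below is illustrative only, with hypothetical field names; the actual analysis worked from raw platform event data.

    # Illustrative sketch of the engagement categories defined above, using
    # hypothetical per-participant activity fields. "Learners" watched videos
    # after Week 2 and/or took quizzes after Week 1; learners who met the
    # completion rule are "completers", and the rest are "auditors".

    def classify(last_video_week, last_quiz_week, completed):
        """last_video_week / last_quiz_week: final course week (module) with
        that activity, or None if the participant never did it;
        completed: bool from the scoring rule."""
        watched_late = last_video_week is not None and last_video_week > 2
        quizzed_late = last_quiz_week is not None and last_quiz_week > 1
        if not (watched_late or quizzed_late):
            return "non-learner"
        return "completer" if completed else "auditor"

    print(classify(last_video_week=6, last_quiz_week=None, completed=False))  # auditor
    print(classify(last_video_week=1, last_quiz_week=1, completed=False))     # non-learner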

Pre- and post-course respondents were largely PhD students and postdocs (50% pre-course and 58% post-course), with faculty an additional 20% pre-course and 16% post-course; the remaining “other” category (27% pre-course and 19% post-course) was a mix of staff and non-academics. Nearly all (91%) of the pre-course survey respondents indicated disciplines in STEM or Social, Behavioral, and Economic Sciences (SBES) fields. A majority identified as female (60% pre-course/56% post-course), and nearly half came from one of the 38 CIRTL universities at the time. Nearly one third of post-survey respondents participated in an MCLC. We are, therefore, confident that the course is reaching future STEM faculty and professionals, that the connection to the CIRTL Network has been instrumental in disseminating the professional development program, and that the course also reaches many learners beyond the CIRTL Network (see S1 and S2 Tables in S1 File).

Across the seven instances of the introductory course, 3,884 students (26% of enrolled) took the pre-course survey. In a subset of the data where we can link survey participation to course engagement behaviors, pre-course survey respondents included 57% of learners in the course; conversely, 55% of pre-course survey respondents engaged with the course as learners. Similarly, at the end of the course, about half (55%) of completers responded to the post-course survey; among post-survey respondents, 84% completed the course and an additional 8% engaged during all six modules. Results from the surveys are, therefore, largely reflective of the experiences and demographics of learners and course completers (see S7 Fig in S1 File for full details).

Overall, 34% of pre-course survey respondents completed the course. Among them, postdoctoral researchers completed at a higher rate, 39%, compared with 32% for other participants (Fisher’s exact test, p = 0.005). Those who indicated on the pre-course survey that they intended to pursue an academic career (74% of respondents) completed at a significantly higher rate, 37%, than those who did not, 23% (Fisher’s exact test, p << 0.001).
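For readers unfamiliar with the test, each comparison of completion rates reduces to a Fisher’s exact test on a 2x2 contingency table. The cell counts below are hypothetical, chosen only to approximate the reported rates; we report rates and p-values above, not the underlying tables.

    # Sketch of a Fisher's exact test on completion by role, with hypothetical
    # counts (~39% of postdocs completing vs. ~32% of other respondents).
    from scipy.stats import fisher_exact

    #          postdocs  others
    table = [[390,  960],    # completed
             [610, 2040]]    # did not complete
    odds_ratio, p_value = fisher_exact(table)
    print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4g}")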

Learning and confidence to apply content

To address Guiding Question 2, we asked course participants to rate their learning gains in seven areas that reflected our intended general outcomes for them. On average, respondents to the post-course survey rated four gains in the range of “good” to “great”: confidence to implement teaching and learning strategies covered in class; interest in additional classes related to teaching and learning; interest in discussing teaching and learning with colleagues and friends; and confidence that they understand the material covered (S9 Fig in S1 File). The remaining gains were rated, on average, “moderate” to “good”: enthusiasm for STEM teaching and learning; interest in an additional MOOC related to teaching and learning; and willingness to seek help from faculty or peers regarding teaching and learning. No gains were rated lower than moderate. Additionally, participants who responded to both pre- and post-course surveys reported higher post-course familiarity with several specific concepts taught in the course, including backwards design, setting learning objectives, use of formative and summative assessments, and leveraging diversity to enhance teaching and learning. Fig 3A displays these results as unpaired averages, and Fig 3B as paired differences. Supporting materials include additional analysis of which course elements respondents found helpful to their learning, self-reported gains in interest in course topics, confidence in applying skills covered in the course, and post-course familiarity with key concepts.

Fig 3. Increase in average reported familiarity with pedagogical topics.

(A) Average responses of pre- and post-course respondents, unpaired. Error bars represent one standard deviation of the response distribution in each direction. (B) Average of paired differences for the 520 respondents who took both the pre- and post-course surveys for course instances where responses can be linked. Error bars represent the 99% confidence interval on the difference.

https://doi.org/10.1371/journal.pone.0276349.g003
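The paired-difference estimate in Fig 3B follows a standard construction: the mean of the per-respondent (post minus pre) differences with a 99% t-based confidence interval. The sketch below illustrates this with synthetic data standing in for the 520 paired responses; it is not the project’s plotting code.

    # Sketch of a paired-difference mean with a 99% confidence interval, as in
    # Fig 3B. The 'gains' array is synthetic, standing in for per-respondent
    # (post - pre) familiarity differences on a Likert-type numeric scale.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    gains = rng.normal(loc=0.8, scale=1.0, size=520)  # 520 paired respondents

    mean_gain = gains.mean()
    sem = gains.std(ddof=1) / np.sqrt(gains.size)
    ci_lo, ci_hi = stats.t.interval(0.99, df=gains.size - 1, loc=mean_gain, scale=sem)
    print(f"mean gain = {mean_gain:.2f}, 99% CI [{ci_lo:.2f}, {ci_hi:.2f}]")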

The high rate of learner completion and learners’ high self-reported gains, especially among those who self-identified as future STEM faculty, indicate that the course design matched these self-directed learners’ time commitment, workload, availability, and motivation. This conclusion is corroborated by the satisfaction of post-course survey respondents: 97% agreed that the course improved their ability to teach, 93% were either “satisfied” or “extremely satisfied” with the course, and 97% would “recommend [the course] to others.” These statistics are based on post-survey respondents across all seven instances of the introductory course.

Learning community engagement

Finally, to answer our third Guiding Question, we explored participant and facilitator experiences in the MCLCs. Many learners engaged in our blended model of delivery: 134 institutions have hosted at least one MCLC, and many have hosted MCLCs multiple times, yielding 236 total MCLCs as of Spring 2018. According to MCLC facilitator feedback, MCLCs had on average 12 participants, who were largely (75%) STEM PhD students and postdocs, and who completed the course at a high rate (65%).

We do not have platform data on how many learners were in MCLCs; based on data about their intentions, our best estimate is that between 20% and 40% of our learners were. The top reasons prospective participants gave for wishing to engage in local, in-person learning were the opportunities to “interact with peers” (35%), to “discuss course materials and assignments” (32%), to “meet others interested in teaching and learning” (31%), and to “receive feedback on my teaching and learning practices” (28%). Among post-course survey respondents, 34% reported participating in an MCLC, and those who did had strong outcomes: 97% were learners and 87% completed the course, representing 19% of all completers. This suggests that MCLC participation, where available, supports course completion.

Feedback from learning community facilitators

Survey responses and interviews with MCLC facilitators provided valuable feedback on the structure and efficacy of the MOOCs broadly and the MCLCs in particular; their thoughtful feedback informed substantial revisions of the introductory and advanced MOOCs. Facilitators reported using the facilitator guide in their MCLCs and finding most Guide components useful (~75%); they reviewed the guide to get ideas and used different activities to meet the needs of their particular group. Activities involving reflection, discussion, or extension of course material were well received, while those that relied on participants’ past teaching experience, or required peer feedback, additional reading, or reflection outside the MCLC meeting, were generally harder to implement. Both facilitators with prior expertise in the MOOC content and those without reported success in leading MCLCs: experts tended to run MCLCs as mini-courses enriched with their own content and activities, while novices conducted MCLCs as peer-led study groups, drawing largely on the MOOC materials and the facilitator guide. That both novice and expert leaders can lead MCLCs with the support of the Guide makes the MCLC model sustainable and adaptable in numerous settings. Moreover, most (45 of 51) facilitator survey respondents reported they would facilitate an MCLC again, saying, for example, “I enjoyed facilitating the MOOC, learning from it, and sharing my experience with the participants in our learning community,” and “It’s one of my favorite things to do, even though I am doing it as a volunteer.”

Discussion

Our purpose in this study was to explore, through discovery-based research, (1) whether our model enrolled future STEM faculty at a scale beyond that typically reached by traditional on-campus or synchronous professional development programs; (2) to what extent participants increased their knowledge of and confidence in delivering effective STEM instruction; and (3) what effects the MCLCs had on participation and participant outcomes in the broader MOOC. Our key finding is that our model of multiple offerings of two MOOCs on evidence-based undergraduate STEM teaching, intentional support for facilitated MCLCs, and open access to course materials has successfully met needs among graduate students and postdocs for pedagogical professional development that often go unmet by traditional on-campus resources and events [45]. A number of key factors led to this result.

Our target audience of STEM graduate students and postdocs has clearly identified professional development goals and is geographically clustered at research universities. This clustering enabled the formation of local MCLCs, since potential participants studied and worked in proximity to each other. Having local MCLCs at universities also made publicity and recruitment easier for our blended delivery mode, since the opportunity to join MCLCs could be advertised by supportive university faculty and staff members, graduate schools, departments, and centers for learning and teaching. Those faculty and staff also made ready facilitators for MCLCs. Facilitators’ experience and expertise, together with the similar professional goals of participants, lent MCLCs a structure and coherence that, we suggest, distinguished them from the ad hoc student meet-ups common in many MOOCs, leading to greater course completion rates among MCLC participants.

Participants had flexible options for engaging with course materials and resources. Some learners completed the courses by submitting quizzes and peer-graded assignments, some audited the courses by consistently watching videos, while other learners viewed course videos in an ad hoc manner on YouTube and the project website. MCLCs provided a professional development option for those who also wanted an in-person experience. For graduate students and postdocs often constrained by time, advisor priorities and the need to focus on research, these options enabled motivated future faculty to seek out and obtain pedagogical professional development on their own terms.

In addition to the stand-alone MOOC and MCLC delivery modes, course materials have been made available as open educational resources (OER) on our project website and a YouTube channel to encourage adoption and adaptation by those individuals and institutions involved in future STEM faculty professional development. By making access as broad and simple as possible (all materials including the MCLC Facilitator Guide are freely available), we limited our ability to track use of the materials in any great detail. Statistics from our YouTube channel indicate that our 130 videos were viewed collectively 60,540 times outside the courses during the first three years of course offerings. Multiple colleagues and educators have expressed an interest in using our course materials for professional development programs at their institutions, and some have reported back on particular uses, including developing or supporting university-level teaching certificate programs, redesigning curricula, incorporating additional materials into existing educational workshops or for credit courses, and providing online professional development opportunities for diverse audiences.

Our MOOC initiative was launched from, developed by, and continues to be hosted by STEM faculty, educators, administrators, and educational developers through the CIRTL Network, despite the fact that long-term sustainability is often not an outcome of NSF-funded educational initiatives [46]. The CIRTL Network brought together the initial team that developed the MOOCs; it provided a range of STEM education practitioners and researchers who contributed to the course content through interviews, resource sharing, module development, and feedback; and Network institutions hosted approximately one third of the MCLCs. While numerous individuals from outside the CIRTL Network also contributed in significant ways, particularly as MCLC facilitators, the existing network functioned as a community of practice that enabled the initiative to succeed. Since 2018, the CIRTL Network has assumed all management of the MOOCs and MCLCs, which continue to be well enrolled. The CIRTL Network was instrumental in this project, highlighting that other professional networks, formal and informal, such as disciplinary societies, could serve similar design, dissemination, and sustained support functions.

Our completion rate of 11.5% is more than double the rate reported for other non-professional and non-degree MOOCs [24, 40, 41]. Recent research shows that, while general MOOC participation and completion rates have declined over the last five years, MOOCs designed for highly motivated students pursuing professional development have thrived [25]. Our findings are consistent with this trend and point to potential future uses of MOOCs and MCLCs for career and professional development needs. Asynchronous, online learning in conjunction with synchronous, in-person learning is a structure with potential to be effective in professional development domains beyond teaching, including leadership, conflict resolution, responsible conduct of research, and mentoring, as well as interdisciplinary domains such as data visualization or computational thinking.

The fact that current STEM faculty also took our MOOCs and participated in MCLCs suggests that this structure is also useful for established academics, especially at institutions without extensive faculty development programs. Indeed, other initiatives have leveraged our model. A recent example is the Inclusive STEM Teaching Project, a similarly structured, blended delivery course with asynchronous online content through edX and project-trained local learning community facilitators [47]. In addition, the NIH-funded Postdoc Academy project developed two asynchronous online courses targeting postdoc professional development, again offered with local learning communities called Postdoc Academy Learning Sessions (PALS) [48, 49].

Limitations and recommendations for future work

This study draws on data gathered for program evaluation to make claims about the utility and reach of an asynchronous online course, associated local learning communities, and materials to support teaching professional development at scale, and thus it has some limitations in comparison to other literature on professional development. First, we did not attempt to investigate variation in course activity across group demographics, career stage, or home institution type. Early on, we chose to limit collection of identifiable data, seeking to increase participation and response rates by protecting participant anonymity as much as possible. Future large-scale professional development programs that wish to explore such differences may make a different choice. A second limitation, common in professional development and MOOCs, is that we relied on participants’ self-assessment of learning gains and did not assess learning through exams or other external measures. Our goal, however, was to increase the confidence of our mostly future-faculty audience in developing and applying teaching strategies, and confidence can only be assessed by self-report. Moreover, though there is a large overlap between learners and survey respondents, they are not the same. Our report of gains in familiarity with course topics is based on the 520 people whose pre- and post-course survey responses could be paired, which is a relatively large sample but nonetheless a subset of course learners who may not be fully representative. Future projects should consider a longitudinal research effort to examine learning and application to classroom practices, as some have begun to do [8], and should consider multiple measures of such outcomes [50–52]. We acknowledge the substantial complexity, effort, and cost of such studies.

Conclusion

We demonstrated the effective delivery of pedagogical professional development to future STEM faculty, with the potential to significantly impact undergraduate STEM education. Our design combines flexible, asynchronous content with optional, supported, and facilitated in-person learning communities, all offered within the context of a network of STEM faculty and educational developers. Our model can be leveraged in many contexts to overcome barriers for learners who seek significant professional development within constrained settings, helping them meet their diverse career and professional goals.

Acknowledgments

We gratefully acknowledge our many colleagues who contributed to this effort by developing MOOC modules, content for components of modules, and our MOOC learning community facilitators’ guides. Your knowledge, insights, creativity, and delivery of this material were invaluable to the success of this project and to reaching literally thousands of future faculty. We specifically thank (in alphabetical order) C. Brame, S. Chasteen, M. DiPietro, C. Fata-Hartley, A. Little, J. Littrell, R. M. Mathieu, T. McMahon, and K. Spilios for their many critical contributions.

References

  1. President’s Council of Advisors on Science and Technology. Engage to excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics. Washington, DC; 2012.
  2. Freeman S, Eddy S, McDonough M, Smith M, Okoroafor N, Jordt H, Wenderoth M. Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences. 2014; 111(23), 8410–8415. pmid:24821756
  3. Eddy S, Hogan K. Getting under the hood: How and for whom does increasing course structure work? CBE—Life Sciences Education. 2014; 13, 453–468. pmid:25185229
  4. Ballen CJ, Wieman C, Salehi S, Searle JB, Zamudio KR. Enhancing diversity in undergraduate science: Self-efficacy drives performance gains with active learning. CBE—Life Sciences Education. 2017; 16(4), ar56. pmid:29054921
  5. Theobald EJ, Hill MJ, Tran E, Agrawal S, Arroyo EN, Behling S, et al. Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proceedings of the National Academy of Sciences. 2020; 117(12), 6476–6483. pmid:32152114
  6. Dewsbury BM, Swanson HJ, Moseman-Valtierra S, Caulkins J. Inclusive and active pedagogies reduce academic outcome gaps and improve long-term performance. PLoS ONE. 2022; 17(6): e0268620. pmid:35704639
  7. Light G, Calkins S, Luna M, Drane D. Assessing the impact of faculty development programs on faculty approaches to teaching. The International Journal of Teaching and Learning in Higher Education. 2009; 20(2), 168–181.
  8. Archie T, Hayward CN, Yoshinobu S, Laursen SL. Investigating the linkage between professional development and mathematics instructors’ use of teaching practices using the theory of planned behavior. PLOS ONE. 2022; 17(4): e0267097. pmid:35427406
  9. Connolly MR, Savoy JN, Lee YG, Hill LB. Building a better future STEM faculty: How doctoral teaching programs can improve undergraduate education. Madison, WI: Wisconsin Center for Education Research, University of Wisconsin–Madison; 2016.
  10. Wright M, Horii CV, Felten P, Sorcinelli MD, Kaplan M. Faculty development improves teaching and learning. POD Speaks. 2018; 2, 1–5.
  11. Gardner GE, Jones MG. Pedagogical preparation of science graduate teaching assistants: challenges and implications. Science Educator. 2011; 20(2), 31–41.
  12. Nyquist JD, Manning L, Wulff DH, Austin AE, Sprague J, Fraser PK, et al. On the road to becoming a professor: the graduate student experience. Change: The Magazine of Higher Learning. 1999; 31(3), 18–27.
  13. Brownell SE, Tanner KD. Barriers to faculty pedagogical change: lack of training, time, incentives, and … tensions with professional identity? CBE—Life Sciences Education. 2012; 11, 339–346.
  14. Thiry H, Laursen SL, Liston C. (De)Valuing teaching in the academy: Why are underrepresented graduate students overrepresented in teaching and outreach? Journal of Women and Minorities in Science and Engineering. 2007; 13(4), 391–419.
  15. Prevost LB, Vergara CE, Urban-Lurain M, Campa H III. Evaluation of a high-engagement teaching program for STEM graduate students: outcomes of the FAST-Future Academic Scholars in Teaching Fellowship Program. Innovative Higher Education. 2017; 42.
  16. Shortlidge EE, Eddy SL. The trade-off between graduate student research and teaching: A myth? PLOS ONE. 2018; 13(6): e0199576. pmid:29940027
  17. Austin AE, Campa H III, Pfund C, Gillian-Daniel DL, Mathieu R, Stoddart J. Preparing STEM doctoral students for future faculty careers. New Directions for Teaching and Learning. 2009; 117, 83–95.
  18. Hokanson SC, Grannan S, Greenler R, Gillian-Daniel DL, Campa H III, Goldberg BB. A study of synchronous, online professional development workshops for graduate students and postdocs reveals the value of reflection and community building. Innovative Higher Education. 2019; 44, 385–398.
  19. Mathieu RD, Austin AE, Barnicle KA, Campa H III, McLinn C. The Center for the Integration of Research, Teaching, and Learning: A national network to prepare STEM future faculty. In Saichaie K, Theisen CH, editors. Special issue: Approaches to graduate student instructor development and preparation. New Directions for Teaching and Learning. 2020; 163, 45–53. Jossey-Bass.
  20. Garrison DR, Anderson T, Archer W. Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education. 2000; 2(2–3), 87–105.
  21. McDaniels M, Pfund C, Barnicle KA. Creating dynamic learning communities in synchronous online courses. Online Learning. 2016; 20(1), 110–129.
  22. Wenger E. Communities of practice: Learning, meaning, and identity. Cambridge: Cambridge University Press; 1998.
  23. Kovanović V, Gašević D, Joksimović S, Hatala M, Adesope O. Analytics of communities of inquiry: Effects of learning technology use on cognitive presence in asynchronous online discussions. The Internet and Higher Education. 2015; 27, 74–89. https://doi.org/10.1016/j.iheduc.2015.06.002
  24. Pappano L. The year of the MOOC. New York Times. 2012; Nov 2.
  25. Reich J, Ruipérez-Valiente JA. The MOOC pivot. Science. 2019; 363(6423), 130–131. pmid:30630920
  26. Johns Hopkins University. Hopkins creates free online course to equip PhD candidates with teaching skills [Press release]. 2014, January 24. Retrieved from https://hub.jhu.edu/2014/01/24/university-teaching-101-coursera/
  27. Bruff D, Fisher D, McEwen K, Smith B. Wrapping a MOOC: Student perceptions of an experiment in blended learning. Journal of Online Learning and Teaching. 2013; 9(2).
  28. Sabelli N, Dede C. Integrating educational research and practice. Reconceptualizing goals and policies: “How to make what works work for us?” Menlo Park, CA: SRI International; 2001. Retrieved from https://www.sri.com/publication/education-learning-pubs/integrating-educational-research-practice-reconceptualizing-goals-and-policies-how-to-make-what-works-work-for-us/
  29. Al-Rahmi W, Aldraiweesh A, Yahaya N, Kamin YB, Zeki AM. Massive open online courses (MOOCs): Data on higher education. Data in Brief. 2019; 22, 118–125. pmid:30581914
  30. Ebner M, Schön S, Braun C. More than a MOOC—Seven learning and teaching scenarios to use MOOCs in higher education and beyond. In Yu S, Ally M, Tsinakos A, editors. Emerging technologies and pedagogies in the curriculum. Singapore: Springer Singapore; 2020. pp. 75–87.
  31. Badali M, Hatami J, Banihashem SK, Rahimi E, Noroozi O, Eslami Z. The role of motivation in MOOCs’ retention rates: A systematic literature review. Research and Practice in Technology Enhanced Learning. 2021; 17(1), 1–20.
  32. Despujol I, Castañeda L, Marín VI, Turró C. What do we want to know about MOOCs? Results from a machine learning approach to a systematic literature mapping review. International Journal of Educational Technology in Higher Education. 2022; 19(1), 1–22.
  33. Berliner DC. Comment: Educational research: The hardest science of all. Educational Researcher. 2002; 31(8), 18–20.
  34. Blum-Smith S, Yurkofsky MM, Brennan K. Stepping back and stepping in: Facilitating learner-centered experiences in MOOCs. Computers & Education. 2021; 160, 104042. https://doi.org/10.1016/j.compedu.2020.104042
  35. pandas: https://pandas.pydata.org/about/citing.html
  36. seaborn: https://seaborn.pydata.org/citing.html
  37. matplotlib: https://matplotlib.org/stable/users/project/citing.html
  38. Laursen S. Summary of MCLC facilitator survey data [Report to CIRTL MOOC team]. Boulder, CO: Ethnography & Evaluation Research, University of Colorado Boulder; 2016 May. https://www.colorado.edu/eer/content/cirtl-mclc-facilitator-survey-2016
  39. Laursen S. Evaluation of CIRTL MOOC-Centered Learning Communities [Report to CIRTL MOOC team]. Boulder, CO: Ethnography & Evaluation Research, University of Colorado Boulder; 2017 November. https://www.colorado.edu/eer/content/cirtl-mclc-facilitator-interviews-2017
  40. Jordan K. Initial trends in enrollment and completion of massive open online courses. The International Review of Research in Open and Distance Learning. 2014; 15(1), 133–160.
  41. Jordan K. Massive open online course completion rates revisited: assessment, length, and attrition. 2015. Retrieved from www.irrodl.org/index.php/irrodl/article/view/21123340
  42. Tseng SF, Tsao YW, Yu LC, Chan CL, Lai KR. Who will pass? Analyzing learner behaviors in MOOCs. Research and Practice in Technology Enhanced Learning. 2016; 11(1), 8. pmid:30613241
  43. Perna LW, Ruby A, Boruch RF, Wang N, Scull J, Ahmad S, et al. Moving through MOOCs: Understanding the progression of users in massive open online courses. Educational Researcher. 2014; 43(9), 421–432. https://doi.org/10.3102/0013189X14562423
  44. Kizilcec RF, Piech C, Schneider E. Deconstructing disengagement: analyzing learner subpopulations in massive open online courses. ACM Press; 2013. p. 170. https://doi.org/10.1145/2460296.2460330
  45. Golde CM, Dore TM. At cross purposes: What the experiences of doctoral students reveal about doctoral education. A report for The Pew Charitable Trusts. Philadelphia, PA; 2001. Available from: www.phd-survey.org
  46. Mervis J. Science. 2009; 323(5910), 54–58. https://doi.org/10.1126/science.323.5910.54
  47. Inclusive STEM Teaching Project. https://www.inclusivestemteaching.org/
  48. Chesniak OM, Drane D, Young C, Hokanson SC, Goldberg BB. Theory of change models deepen online learning evaluation. Evaluation and Program Planning. 2021; 88, 101945. pmid:33894476
  49. Sun T, Drane D, McGee R, Campa H III, Goldberg BB, Hokanson SC. A national professional development program fills mentoring gaps for postdoctoral researchers. bioRxiv [Preprint]. 2022. Available from: https://doi.org/10.1101/2022.09.26.509546
  50. Manduca CA, Iverson ER, Luxenberg M, Macdonald RH, McConnell DA, Mogk DW, et al. Improving undergraduate STEM education: The efficacy of discipline-based professional development. Science Advances. 2017; 3(2), e1600193. pmid:28246629
  51. Derting TL, Ebert-May D, Henkel TP, Maher JM, Arnold B, Passmore HA. Assessing faculty professional development in STEM higher education: Sustainability of outcomes. Science Advances. 2016; 2(3), e1501422. pmid:27034985
  52. Laursen SL, Archie T, Weston TJ, Hayward CN, Yoshinobu S. A measurement hat trick: Evidence from a survey and two observation protocols about instructional change after intensive professional development. 2023 Conference on Research in Undergraduate Mathematics Education, Omaha, NE, February 23–25.