
Beyond digital interfaces: The human element in online teaching and its influence on student experiences

  • Soo See Chai ,

    Contributed equally to this work with: Soo See Chai, Su-Hie Ting, Kok Luong Goh, Yee Hui Robin Chang, Bui Lin Wee, Dila Novita, J. Karthikeyan

    Roles Formal analysis, Project administration, Supervision, Writing – original draft

    sschai@unimas.my

    Affiliation Faculty of Computer Science and Information Technology, University of Malaysia Sarawak (UNIMAS), Kota Samarahan, Sarawak, Malaysia

  • Su-Hie Ting ,

    Roles Data curation, Methodology

    Affiliation Faculty of Language and Communication, University of Malaysia Sarawak (UNIMAS), Kota Samarahan, Sarawak, Malaysia

  • Kok Luong Goh ,

    Roles Conceptualization, Data curation, Writing – review & editing

    Affiliation School of Science and Technology, International University College of Advanced Technology Sarawak (iCATS University College), Kuching, Sarawak, Malaysia

  • Yee Hui Robin Chang ,

    Roles Data curation, Formal analysis

    Affiliation Faculty of Applied Sciences, Universiti Teknologi MARA Cawangan Sarawak, Jalan Meranek, Kota Samarahan, Sarawak, Malaysia

  • Bui Lin Wee ,

    Roles Data curation, Validation

    Affiliation Faculty of Computer Science and Information Technology, University of Malaysia Sarawak (UNIMAS), Kota Samarahan, Sarawak, Malaysia

  • Dila Novita ,

    Roles Writing – review & editing

    Affiliation Department of Public Administration, Universitas Islam 45, Bekasi, Indonesia

  • J. Karthikeyan

    Roles Writing – review & editing

    Affiliation National College (Autonomous), Tiruchirappalli, Tamil Nadu, India

Abstract

Amidst the digital transformation of education, the essence of the human touch in online teaching remains pivotal. Despite growing literature, there remains a significant gap in understanding how the human element in online teaching directly influences student engagement and learning outcomes, especially in diverse educational contexts. This study develops a quantifiable index capturing the essence of humanized online teaching and investigates the determinants influencing this humanization. Additionally, an index encapsulating students’ online learning experiences, as perceived by their instructors, has been constructed. Bridging these indices, the research unravels the intricate relationship between the humanization of online teaching and the resulting student experiences in the virtual realm. Sourced from a self-constructed questionnaire and encompassing responses from 152 instructors across 22 Malaysian institutions, the data revealed an average incorporation of 81.38% humanized online teaching elements. Key determinants, such as subject matter, teaching experience, Internet quality, and platform choices, emerged as significant influences. A regression model explained approximately 31.7% (R-squared = 0.317, p<0.001) of the variation in the dependent variable. A significant moderate positive correlation (r = 0.423, p<0.001) between the Humanized Online Teaching Index and the Students’ Online Learning Experiences Index highlights the intertwined nature of humanized instructional methodologies and enhanced student engagement in online settings. Though contextualised during the coronavirus disease-2019 pandemic, the study’s implications transcend the immediate circumstances, offering transformative insights for future online teaching methodologies and enhancing student experiences in the evolving digital age.

Introduction

Distance education has been around for 300 years [1], but today’s distance learning utilizes the Internet to facilitate real-time communication between teachers and students. When the COVID-19 pandemic in 2020 resulted in a widespread shift to online learning, video conferencing platforms such as WebEx, Zoom, and Google Classroom were used to help students learn despite school closures. Online learning, or e-learning, involves using electronic devices connected to the Internet to learn in either synchronous or asynchronous settings. It is a type of remote learning that bridges the physical and geographical distances between teachers and students, fostering an accessible educational environment irrespective of their locations [2]. If not utilized properly, however, technology can hinder learning by stripping out its social component and causing students to feel isolated. This sense of isolation can lead students to perceive themselves as mere users of certain software rather than members of an educational institution [3], which can negatively affect their engagement, motivation, retention, and satisfaction [4].

Thus, online education must leverage technology while preserving a human touch to reduce isolation [5]. The human touch is an integral aspect of conventional classroom teaching, whereby instructors and students can interact, laugh together, and make learning an enjoyable experience [6]. Humanizing instruction relies on instructor-student interactions, which result in a connection between students, engagement, and rigorous exchange [7]. Humanizing instruction involves creating opportunities for such interactions and fostering a caring environment [8]. In online learning, where the screen is the medium of instruction, maintaining human interaction presents a challenge. Some teachers suggest using the camera-on mode to establish a human touch in online learning [9]. With the camera off, the absence of visual cues makes it difficult for teachers to gauge whether students understand the lesson. However, mandating students to turn on their cameras may not be culturally sensitive or equitable [10].

The four interconnected elements that underpin humanized online instruction are: Trust, Presence, Awareness, and Empathy [11, 12]. These principles of humanization in online teaching are closely related to the Community of Inquiry (CoI) framework, which we use in our study to investigate students’ online learning experiences from their instructors’ perspectives. CoI is a widely recognized theoretical framework that provides a holistic view of the online learning experience, emphasizing the importance of social, cognitive, and teaching presence [13]. The CoI framework posits that effective online learning, particularly higher-order learning, requires community building [14]. Integral to our study, it offers a comprehensive model for analyzing the dynamics of online learning environments. The social presence in the CoI framework refers to the ability of participants to project their personal characteristics into the community, thereby supporting communication and relationship building [15]. Cognitive presence, the core of the CoI model, involves the exploration, construction, resolution, and confirmation of understanding through collaboration and reflection [16]. Teaching presence, the design and facilitation aspect of the framework, entails orchestrating the other two presences to support learning [17]. Research has consistently shown that these interdependent elements of the CoI framework are essential for creating a meaningful and holistic online learning experience, emphasizing the importance of integrating social, cognitive, and teaching strategies for effective online education [18]. By examining how instructors integrate humanizing approaches into their online instruction, we can better understand how to create a sense of community and promote active learning among online learners, which are essential components of the CoI framework. 
In this context, the primary challenge is to determine how instructors can integrate the human touch into the learning process through a variety of approaches, even without enforcing camera-on mode during class time.

In the evolving landscape of education, the humanization of online teaching has emerged as a pivotal factor in enhancing the quality of virtual learning experiences. This research employed the three interdependent elements of the CoI framework to determine if students are enjoying profound and meaningful learning experiences from the perspective of their instructors. Specifically, our study investigates how extensively instructors at Malaysian higher education institutes have integrated humanization principles into their online teaching, which aligns with the social presence component of the CoI framework. Although there has been considerable research on the subject of humanizing online courses using various methods, little academic attention has been paid to the extent to which the four principles of humanization in online teaching have been incorporated into online classes.

This study embarked on a two-fold mission: first, to develop a quantifiable index that captures the degree to which instructors’ online teaching is humanized; and second, to delve into the determinants that influence this humanization. Assessing the reliability and internal consistency of the survey questions examining the extent to which online teaching has been humanized, as measured by Cronbach’s alpha [19], was the first step in developing the index. Furthermore, we constructed another index that encapsulates students’ online learning experiences as perceived by their instructors. By bridging these two indices, our research sought to unravel the intricate relationship between the humanization of online teaching and the resulting student experiences in the virtual learning environment. This research sheds light on the symbiotic interplay between instructional methodologies and student engagement, offering insights that could redefine the paradigms of online education.

In the subsequent sections, this paper systematically unfolds its narrative. Starting with the Method section, we detail our research methodology, encompassing sample selection, questionnaire development, and data collection. This is complemented by the Development of the Questionnaire section, where the process of creating a bespoke tool to assess humanized online learning and the CoI framework is elucidated. The Measures section then delves into the specifics of the questionnaire, followed by Statistical Analysis, where we describe our analytical techniques. The Results section presents pivotal findings, including insights into participant demographics, technological usage in teaching, and the infusion of the human touch in online education. Subsequently, the Discussion section contextualizes these findings within the wider educational landscape, drawing connections to the broader implications of online teaching. The paper concludes by summarizing key takeaways and their broader impact, while acknowledging the study’s limitations and suggesting avenues for future research. The structure of this paper offers a comprehensive, interconnected exploration of the humanization of online teaching, its measurement, and its influence on the educational experience.

Method

Sample and procedure

A total of 152 educators were recruited using purposive sampling from 22 higher education institutes in Malaysia. The researchers invited their colleagues to participate via email and social media platforms, and enlisted the help of peers from other colleges and universities to distribute the online questionnaire. Data collection took place from January 12, 2022 to February 24, 2022.

We used a self-constructed questionnaire as there were no similar measures of human touch in online learning when this research was being conducted. The self-constructed online questionnaire was designed using Google Forms. The questionnaire opened with details about the purpose of the study, participants’ voluntary participation and anonymity, and researchers’ contact information. Before beginning the survey, participants were required to give their consent by signing a consent form.

The research was granted ethical approval (HREC(NM)/2020(1)/04) by the Research Ethics (Non-Medical) Committee at the University of Malaysia Sarawak (UNIMAS).

Development of the questionnaire

In addressing the principles of humanized online learning and the CoI framework, this study acknowledges the gap in current research regarding specific measurement tools. Though existing literature extensively discusses the theoretical aspects of humanizing online education and the CoI framework, there is a notable absence of comprehensive questionnaires designed to quantitatively assess these constructs in practical educational settings. Existing studies primarily focus on qualitative analyses or general discussions of the principles involved, such as trust, presence, awareness, empathy, and the CoI’s three presences (social, cognitive, and teaching) [11–14]. Recognizing this gap, our research contributes to the literature by developing and implementing a specific questionnaire aimed at evaluating these principles in a structured manner. This original questionnaire is derived from the theoretical underpinnings of the aforementioned frameworks, but tailored to capture the nuanced realities of online teaching and learning experiences. It advances the field by serving as a tool for empirical investigation into the practical application of these widely acknowledged educational theories.

Measures

Table 1 provides a summary of the elements of the four principles of humanized online learning, as well as the questionnaire items used to evaluate the principles of humanized online learning in this study. Table 2 provides an overview of the design elements for each of the three presences in the CoI framework, based on which the questionnaire items were constructed.

Table 1. Questionnaire items based on the principles of humanized online learning.

https://doi.org/10.1371/journal.pone.0307262.t001

Table 2. Mapping of questionnaire questions with social, cognitive, and teaching presence of the community of inquiry framework.

https://doi.org/10.1371/journal.pone.0307262.t002

The first part of the 37-item questionnaire (Section A) aimed to gather demographic information from the respondents (6 items), such as their sex, age, highest level of education, teaching experience, university, subjects taught, and their preferences toward online teaching prior to the COVID-19 pandemic. Participants’ sex, age, highest degree, and teaching experience were collected as single data points, and an ordinal scale of 1 (Do not like it at all) to 5 (Like it a great deal) was used to assess their online teaching preferences. Open-ended questions were also included in this section to inquire about the universities and courses the instructors taught.

Section B (8 items) collected data on the software and hardware technologies used before and during the pandemic, and those that the respondents anticipated they would continue using post-pandemic. The survey used single-response questions to gather data on the participants’ data plans and the type of Internet connection they used for online teaching. Multiple-response questions were used to inquire about the devices and platforms instructors used before and during the pandemic, as well as those they plan to use for online teaching post-pandemic. The questionnaire also explored the entities that assist online instructors during technological difficulties.

Section C (13 items) focused on measuring the human touch construct in online teaching with an ordinal scale ranging from 1 (Once in a semester) to 6 (Almost every class) for the first 12 items, and from 1 (Sometimes I do not respond) to 5 (Within the same hour) for the final item.

Based on the CoI framework, Section D (9 items) aimed to examine the online learning experiences of students from the perspective of their instructors, using an ordinal scale of 1 (Strongly Disagree) to 5 (Strongly Agree). There was an additional open question about how online teaching affected educators’ lives. The measurement framework for the questionnaire is tabulated in Table 3.

Table 3. Framework of the measurements in the questionnaire.

https://doi.org/10.1371/journal.pone.0307262.t003

Statistical analysis

The data obtained were processed and analyzed using the statistical package IBM SPSS Statistics (version 28.0.1.0) for Windows.

Results

Demographic profile

Table 4 presents the demographic characteristics of the participants. The majority of participants are female (65.1%), and most fall between the ages of 31 and 50 (83.6%). A considerable percentage of participants have over 10 years of teaching experience (73.7%) and hold a PhD degree (52.6%). As an open-ended question was used to inquire about the subjects respondents taught, responses were categorized into Science and Engineering (46.1%) or Art and Humanities (53.9%). Regarding online teaching preferences before the COVID-19 pandemic, 42.1% of participants were ambivalent and 10.5% indicated they did not like it at all; together, these two groups accounted for 52.6% of the total participants. Only a small percentage of participants (5.3%) reported liking online teaching a lot before the pandemic, whereas 19.1% liked it a little, and 23.0% only somewhat liked it.

Technology used for online teaching

Table 5 provides a summary of the technologies the participants used in their online classes before, during, and after the pandemic.

Table 5. Technologies used before, during, and after the COVID-19 pandemic.

https://doi.org/10.1371/journal.pone.0307262.t005

Human touch in online teaching

Table 1 displays the mean and standard deviation for the 13 items related to humanized online teaching as reported by the participants. Responding to students’ comments was the most commonly practised humanized approach (Q20: mean = 5.81). By contrast, posting or displaying a photo of themselves was the most unusual practice (Q17: mean = 3.41). Though the responses were closely clustered around the mean for Q20 (std = 0.6), there were disparities in the responses to Q17 (std = 2.13).

Students’ experiences of online learning

Table 2 presents the mean and standard deviation of the responses obtained. Instructors were effective in teaching presence by utilizing a variety of methods to explain concepts to their students in order to facilitate better comprehension (Q34: mean = 4.22, std = 0.64). However, the instructors were generally ineffective in allowing students to vent their emotions, particularly regarding the reasons for their absence from classes (Q31: mean = 2.44, std = 1.00).

Cronbach’s alpha

Cronbach’s alpha, or coefficient alpha, is the most commonly used statistical measure for assessing the internal reliability or consistency, and item interrelatedness, of a questionnaire scale [19]. There is no universally agreed threshold of acceptability, though a value of 0.70 or above is frequently cited [20–22]. However, [23] stated that a Cronbach’s alpha between 0.6 and 0.8 is deemed acceptable. The value of Cronbach’s alpha is influenced by several factors, including the number of items and the interrelatedness of those items [24–26].
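For readers unfamiliar with the computation, the standard formula — alpha = k/(k−1) × (1 − Σ item variances / variance of total scores) — can be sketched in a few lines of Python. This is a minimal illustrative implementation with toy data, not the study’s analysis pipeline (the study used IBM SPSS Statistics):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a questionnaire scale.

    items: one list of respondent scores per item (all lists the same length).
    Implements alpha = k/(k-1) * (1 - sum(item variances) / variance(totals)).
    """
    k = len(items)          # number of items
    n = len(items[0])       # number of respondents

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(sample_var(item) for item in items)
    # Total score per respondent across all items
    totals = [sum(item[j] for item in items) for j in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / sample_var(totals))
```

Perfectly redundant items yield an alpha of 1, and alpha falls as the items become less interrelated, which is why the short per-principle subscales discussed below score low.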

The alpha values obtained for each of the principles were less than 0.70 (Trust = 0.331, Presence = 0.582, Awareness = 0.517, Empathy = 0.507) (Table 6), mostly due to the small number of questions used for each principle. As alpha is affected by test length, adding items assessing the same idea would increase its value [26]. Accordingly, the reliability and consistency of all the items on humanized online learning were examined as a whole. This approach aligns with the preference for multi-item measures over single-item measures in evaluating psychological attributes or perceptions: multi-item measures reduce random measurement error, enhance precision, and provide a more comprehensive representation of complex constructs [27, 28]. Cronbach’s alpha based on standardized items was employed because Q28 used a different scale from the other 12 items. The resulting value of 0.782 suggests that all items on the test of humanized online learning contribute positively and sufficiently to the assessment of the same construct.

Table 6. Cronbach’s alpha for the four principles of humanized online learning.

https://doi.org/10.1371/journal.pone.0307262.t006

All 13 items appeared worthy of retention, as deleting any one of them would decrease the alpha (Table 7). This demonstrates that a composite index based on these 13 items could be created to quantify the humanized online learning concept. The mean of these 13 items was computed to create a composite index, termed the Humanized Online Teaching Index, for each participant. Fig 1 illustrates the distribution of the Humanized Online Teaching Index for the 152 participants in this study, revealing that the index spanned from 3.00 to 5.92.

Table 7. Item analysis of 13 humanized online teaching questions.

https://doi.org/10.1371/journal.pone.0307262.t007

The computed mean index of all 152 participants was 4.82, whereas the maximum achievable average score for the 13 items in this section was 5.92 (with Q16 to Q27 having a maximum score of 6, and Q28 having a maximum score of 5). The ratio of the mean index to this maximum average was 0.8138, or 81.38%. This suggests that, on average, the participants had implemented 81.38% of the humanized online teaching elements examined in their online instruction.
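As a quick arithmetic check, the reported ratio can be reproduced from the figures stated in this section (a sketch only; the 4.82 mean is taken from the results above):

```python
# Q16–Q27 have a maximum score of 6; Q28 has a maximum of 5,
# so the maximum achievable item average over the 13 items is:
max_average = (12 * 6 + 1 * 5) / 13   # = 77/13 ≈ 5.92

mean_index = 4.82                     # mean Humanized Online Teaching Index (reported)
ratio = mean_index / max_average      # ≈ 0.8138, i.e. 81.38%
```

The same denominator underpins the per-participant percentages discussed later (e.g., a minimum index of 1 corresponds to roughly 16.9%).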

Table 8 displays the Cronbach’s alpha values obtained to evaluate the reliability and internal consistency of the questions assessing students’ online learning experiences from the instructors’ perspectives, based on the CoI framework’s social, cognitive, and teaching presences. The low alpha values for social presence (0.345), cognitive presence (0.371), and teaching presence (0.300) suggest that the questions used to evaluate these presences independently were not reliable or internally consistent, owing to the small number of questions utilized (only three per presence). The Cronbach’s alpha for all nine questions was 0.436, which is below 0.5, indicating that the consistency and internal reliability of the questions were not acceptable. Therefore, an analysis was conducted on the inter-item correlation of these items, and Q29 to Q32 were reverse-coded.

Table 8. Cronbach’s alphas for the three presences of the CoI framework.

https://doi.org/10.1371/journal.pone.0307262.t008

After removing items 29 to 32, the Cronbach’s alpha improved to 0.553. Deleting Q36 further raised the alpha value to 0.631 (Table 9). An alpha value ranging from 0.61 to 0.65 indicates moderate internal consistency and reliability of the items [29]. To construct an index of the instructors’ perceptions of students’ online learning experiences, Q33, Q34, Q35, and Q37 were used. The distribution of the index is shown in Fig 2, with the maximum index being 5.00 and the minimum being 1.75. The mean index obtained was 3.88. Based on the maximum average score of 5.00 for these four items, the ratio between the mean index and the maximum score was 0.7753 or 77.53%. These results indicate that, on average, online students’ experiences from the perspective of their teachers are good, with a rating of 77.53%.

Fig 2. Students’ online learning experiences index distribution.

https://doi.org/10.1371/journal.pone.0307262.g002
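The item-pruning procedure used above — recomputing alpha with each item left out and dropping items whose removal raises it — can be sketched as follows. This is an illustrative reimplementation with toy data, not the study’s SPSS output:

```python
def cronbach_alpha(items):
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    k, n = len(items), len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return (k / (k - 1)) * (1 - sum(var(i) for i in items) / var(totals))

def alpha_if_deleted(items):
    # Recompute alpha with each item left out in turn; an item whose
    # removal raises alpha is a candidate for deletion (cf. Q29-Q32, Q36).
    return [cronbach_alpha(items[:i] + items[i + 1:]) for i in range(len(items))]
```

In the toy case below, the third item disagrees with the others, so leaving it out raises alpha — the same signal that led to dropping Q36 here.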

Discussion

Humanized online teaching index

The humanized online teaching index ranged from 3.00 to 5.92. For each of the 13 humanized online teaching items, the lowest-frequency response is "Once in a semester" for Q16 to Q27 and "Sometimes I do not respond" for Q28. Selecting the lowest frequency for all 13 items would result in an average of 1, whereas selecting the highest frequency, with a score of 6 ("Almost every class") for Q16 to Q27 and 5 ("Within the same hour") for Q28, would achieve the maximum average of 5.92. A score of 1 corresponds to a ratio of 0.1688 or 16.88%, meaning such a participant would have incorporated 16.88% of the tested humanized online teaching items in their online instruction. Based on the minimum observed index of 3.00, even the least humanized instructors incorporated 50.68% of the evaluated items. The derived average of 4.82 out of 5.92, which is 81.38%, shows that the instructors in Malaysia had incorporated the human touch in their online classes. The analysis revealed that Q17, Q26, and Q18 were the items most frequently rated at the lowest end of the scale, in descending order (Fig 3). In general, participants were hesitant to display a photograph of themselves (Q17) and seldom monitored students who were passive in class (Q26); yet they consistently addressed students by name during online classes (Q21) and responded to their comments in chat or discussion threads to instill a feeling of individuality in them (Q20). Similarly, instructors seldom requested students to display a photograph of themselves (Q18), mirroring their own reluctance to do so.

Fig 3. Scale distribution for Q16 to Q28 on humanized online teaching of the 152 participants.

https://doi.org/10.1371/journal.pone.0307262.g003

Factors contributing to humanized online teaching

In the digital age, understanding the nuances that contribute to effective and humanized online teaching is of paramount importance to not only enhancing students’ learning experience, but also fostering a more engaging and interactive virtual environment. Utilizing the devised humanized online teaching index, this study delved into the determinants that shape such practices among higher education instructors.

A multiple regression model was built on participants’ demographic data (Table 4) and their technological preferences for online instruction (Table 5). Eight pivotal features emerged as potential influencers of humanized online teaching (Table 10).

Table 10. Key features influencing humanized online teaching.

https://doi.org/10.1371/journal.pone.0307262.t010

Each of these features—from the subjects taught to the platforms used—offers unique insight into the diverse facets that mold online teaching methodologies. For instance, the type of Internet connection or specific platform used for online classes might influence the fluidity and interactivity of sessions, thereby impacting the humanization aspect. Similarly, the duration of teaching experience provides insights into adaptability and evolution in teaching methods over time.

Statistically, predictors with p-values below the 0.05 threshold are deemed significant. A consistent variance inflation factor (VIF) below 10 across these features allayed concerns of multicollinearity, ensuring the reliability of our model. A deeper dive into the regression model, focusing on these eight features, is summarized in Table 11. With a multiple correlation coefficient (R) of 0.563, a moderate association emerged between the observed and predicted values of the dependent variable. The R-squared value of 0.317 suggests that our model elucidates approximately 31.7% of the variation in the dependent variable.

Table 11. Summary of regression model with selected features.

https://doi.org/10.1371/journal.pone.0307262.t011
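The two diagnostics used above — R-squared as the fraction of variance explained, and the variance inflation factor (VIF) as a multicollinearity check — follow standard formulas, sketched below for illustration (toy fitted values, not the study’s model):

```python
def r_squared(y, y_hat):
    # Coefficient of determination: 1 - SS_res / SS_tot, i.e. the
    # fraction of variance in y explained by the model's fitted values.
    mean_y = sum(y) / len(y)
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))
    return 1 - ss_res / ss_tot

def vif(r2_j):
    # Variance inflation factor for predictor j, given the R-squared from
    # regressing that predictor on all the other predictors.
    # VIF below 10 is the conventional "no serious multicollinearity" cutoff.
    return 1 / (1 - r2_j)
```

A model that reproduces the observations exactly gives R-squared = 1, while fitted values no better than the mean give 0; the study’s 0.317 sits between those poles, explaining roughly a third of the variance.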

All eight predictors were found to be statistically significant, with p-values less than 0.05 (Table 10). Given that the selected predictors are statistically valid, the R-squared value of 0.317 is meaningful. As highlighted by Ozili [30], even an R-squared as low as 0.1 (or 10 percent) may be deemed acceptable provided that a significant portion of the predictors are statistically valid. In our study, the selected features met this criterion, underscoring the robustness of our findings.

Relationship between humanized online teaching and students’ online learning experience

To understand the interplay between humanized online teaching and students’ online learning experiences, we employed Pearson correlation to analyze the relationship between the constructed Humanized Online Teaching Index and the Students’ Online Learning Experiences Index. As presented in Table 12, the Pearson correlation coefficient (r) is 0.423, signifying a moderate positive correlation between the two indices: higher scores on the Humanized Index tend to accompany higher scores on the Students’ Experiences Index. With a p-value of less than 0.001, the correlation is significant at the 0.01 level (2-tailed), indicating that it is highly unlikely to have arisen by chance. In essence, a higher degree of humanization in online teaching, as gauged by the Humanized Online Teaching Index, correlates with enhanced student experiences, as captured by the Students’ Experiences Index. From a practical standpoint, this underscores the potential benefits of amplifying the human element in online teaching, as it could pave the way for enriched student experiences.

Table 12. Correlation between humanized index and students’ experiences index.

https://doi.org/10.1371/journal.pone.0307262.t012
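For completeness, the Pearson product-moment coefficient used here follows the textbook formula, sketched below with toy index values (the study computed it in SPSS):

```python
def pearson_r(x, y):
    # Pearson product-moment correlation: covariance of x and y
    # divided by the product of their standard deviations.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```

The coefficient ranges from -1 (perfect inverse relationship) through 0 (no linear relationship) to +1 (perfect direct relationship); the study’s r = 0.423 falls in the conventional "moderate positive" band.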

Conclusion

The study reveals that higher education instructors in Malaysia have effectively incorporated a significant human touch in their online instructions. They have integrated 81.38% of the evaluated humanized online teaching items. However, there are areas of hesitancy, such as the reluctance to display their own photograph and the monitoring of passive students. Conversely, there are positive practices where instructors consistently address students by name and respond to comments, fostering a sense of individuality among students.

Factors that are correlated with humanized online teaching include the subjects being taught, duration of teaching experience, quality of Internet access, type of Internet connection, and platforms or applications used for online classes. The regression model shows that these features jointly account for approximately 31.7% of the variation in the Humanized Online Teaching Index.

In terms of students’ online learning experiences, instructors with higher scores on the Humanized Online Teaching Index were associated with perceptions of better student experiences. These instructors tended to be more student-centered, creative, curiosity-driven, and interactive. This approach suggests a correlation where more humanized teaching methods are associated with perceptions of enhanced online experiences for students.

Lastly, there is a moderate, statistically significant positive correlation between the Humanized Online Teaching Index and the Students’ Online Learning Experiences Index: higher degrees of humanization in online teaching are associated with improved perceptions of student experiences. These findings underscore the importance of humanizing online teaching, not only for the teaching process itself but also for the enriched learning experiences it brings to students.

Implications, limitations, and recommendations

This research underscores the critical role of humanizing online teaching, revealing a distinct trend: instructors who integrate more human-touch elements into their teaching strategies tend to provide enhanced online learning experiences for their students. Parker et al. [31] support this finding, emphasizing how maximizing human interactions can mitigate students’ feelings of isolation and improve their engagement and retention. These insights point to a pressing need to incorporate more human elements into instructional methodologies, a strategy that can significantly improve the quality of student experiences. By pairing the Humanized Online Teaching Index with the Students’ Online Learning Experiences Index, this study offers a comprehensive framework to evaluate and refine online teaching and learning practices.

Despite its contributions, this study has several limitations:

1. Reliance on instructors’ perspectives: The study primarily uses instructors’ perspectives to assess students’ online learning experiences, which may not fully capture students’ actual perceptions and experiences.
2. Geographical and contextual constraints: The research focused on higher education instructors in Malaysia and may have limited applicability in other geographical or educational contexts.
3. Self-constructed questionnaire: The absence of established measures at the time of the study led to the development of a self-constructed questionnaire. This approach could introduce biases or omit certain aspects of the human touch in online learning. Moreover, the limited number of items per construct might not sufficiently capture the full scope of the constructs, potentially affecting the robustness of the findings.
4. Internal consistency measures: To strengthen the reliability assessment of the constructs, future studies should consider additional measures of internal consistency, such as split-half reliability or average inter-item correlation, alongside Cronbach’s alpha. These methods could offer a more nuanced picture of the questionnaire’s internal consistency, given its self-constructed nature and limited item count.
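For reference, the Cronbach’s alpha mentioned above is computed as α = k/(k−1) · (1 − Σσ²_item / σ²_total), where k is the number of items. A minimal sketch of the formula (the Likert responses below are hypothetical, purely for illustration):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of items, each a list of respondent scores."""
    k = len(items)            # number of items
    n = len(items[0])         # number of respondents

    def variance(xs):         # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Each respondent's total score across all items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var_sum = sum(variance(it) for it in items)
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

# Hypothetical 5-point Likert responses: 3 items, 4 respondents.
scores = [[4, 3, 5, 2], [4, 2, 5, 3], [5, 3, 4, 2]]
alpha = cronbach_alpha(scores)
```

Split-half reliability would instead correlate two halves of the item set and apply the Spearman-Brown correction; the sketch above covers only the alpha coefficient.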

Future research should aim to explore the following:

1. Solicit student feedback: Direct feedback from students would offer a more comprehensive view and complement the insights gained from instructors, as highlighted in [32] on the role of feedback in promoting student learning.
2. Replicate the study in diverse contexts: Replicating this study in various geographical and cultural settings would enhance the generalizability of the findings, as research such as [33] highlights the necessity of testing hypotheses across diverse contexts to validate their external applicability.
3. Incorporate technological considerations: Acknowledging the significant role of technology in online education, institutions should invest in robust technological solutions that support the human element in teaching. Jiang and Kamel Shaker Al-Shaibani [34] show that teaching support, learning platforms, and curriculum settings are crucial factors influencing students’ learning adaptability, especially in vocational education settings in China, underlining the need for comprehensive technological and curricular support systems.
4. Continued research and development: As online education evolves, ongoing research is vital to identify and integrate new elements of human touch in online teaching, ensuring its continued effectiveness and relevance, as emphasized in [35] on continuously refining online learning practices.

Educational institutions should consider providing specialized training for instructors focused on humanizing online teaching. This proactive approach can leverage the positive link between the human touch in teaching and improved student experiences, further enriching the quality of online education.

Acknowledgments

The authors would like to thank Universiti Malaysia Sarawak (UNIMAS) for the facilities provided. Special appreciation is extended to Dr Voon Mung Ling from Swinburne University of Technology Sarawak Campus, Malaysia, for reviewing the questionnaire and aiding in its dissemination.

References

  1. Gunawardena CN, McIsaac MS. Distance education. In: Handbook of research on educational communications and technology. Routledge; 2013. p. 361–401.
  2. Roberts JJ. Online learning as a form of distance education: Linking formation learning in theology to the theories of distance education. HTS: Theological Studies. 2019;75(1):1–9.
  3. Collins K, Groff S, Mathena C, Kupczynski L. Asynchronous video and the development of instructor social presence and student engagement. Turkish Online Journal of Distance Education. 2019;20(1):53–70.
  4. Martin F, Bolliger DU. Engagement matters: Student perceptions on the importance of engagement strategies in the online learning environment. Online Learning. 2018;22(1):205–22.
  5. Borup J, West RE, Graham CR. Improving online social presence through asynchronous video. The Internet and Higher Education. 2012;15(3):195–203.
  6. Acheson LL, editor. Making technology human: Real time interactions in online learning. Educational Technologies 2020 (ICEduTech 2020); 2020; São Paulo, Brazil.
  7. Pacansky-Brock M, Smedshammer M, Vincent-Layton K. Humanizing online teaching to equitize higher education. Current Issues in Education. 2020;2(21).
  8. Shin M. Confronting (de)humanizing remote teaching practices. Contemporary Issues in Early Childhood. 2021:14639491211035452.
  9. Herman PC. Online learning is not the future. 2020. Available from: https://www.insidehighered.com/digital-learning/views/2020/06/10/online-learning-not-future-higher-education-opinion.
  10. Costa K. LinkedIn. 2020 [cited 2022]. Available from: https://www.linkedin.com/pulse/cameras-damned-karen-costa/.
  11. Humanizing learning: Strategies to implement now [Internet]. Pedagogicon Poster Gallery. 2022. Available from: https://encompass.eku.edu/cgi/viewcontent.cgi?article=1011&context=pedagogicon_postergallery.
  12. Pacansky-Brock M. How to humanize your online class, version 2.0. 2020.
  13. Garrison DR, Anderson T, Archer W. Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education. 1999;2(2–3):87–105.
  14. Swan K, Garrison D, Richardson JC. A constructivist approach to online learning: The community of inquiry framework. In: Information technology and constructivism in higher education: Progressive learning frameworks. IGI Global; 2009. p. 43–57.
  15. Bektashi L. Community of Inquiry framework in online learning: Use of technology. In: Technology and the curriculum: Summer 2018. 2018.
  16. Kim GC. Effective online instruction through the Community of Inquiry framework: An exploratory study in kinesiology. 2022.
  17. Weidlich J, Bastiaens TJ. Explaining social presence and the quality of online learning with the SIPS model. Computers in Human Behavior. 2017;72:479–87.
  18. Nolan-Grant CR. The Community of Inquiry framework as learning design model: A case study in postgraduate online education. Research in Learning Technology. 2019;27:2240.
  19. Cronbach LJ. Coefficient alpha and the internal structure of tests. Psychometrika. 1951;16(3):297–334.
  20. Nunnally JC. Psychometric theory. 3rd ed. Tata McGraw-Hill Education; 1994.
  21. Bland JM, Altman DG. Statistics notes: Cronbach’s alpha. BMJ. 1997;314(7080):572.
  22. DeVellis RF, Thorpe CT. Scale development: Theory and applications. Sage Publications; 2021.
  23. Shi J, Mo X, Sun Z. Content validity index in scale development. Zhong nan da xue xue bao Yi xue ban = Journal of Central South University Medical Sciences. 2012;37(2):152–5. pmid:22561427
  24. Vaske JJ, Beaman J, Sponarski CC. Rethinking internal consistency in Cronbach’s alpha. Leisure Sciences. 2017;39(2):163–73.
  25. Sijtsma K. On the use, the misuse, and the very limited usefulness of Cronbach’s alpha. Psychometrika. 2009;74:107–20. pmid:20037639
  26. Tavakol M, Dennick R. Making sense of Cronbach’s alpha. International Journal of Medical Education. 2011;2:53. pmid:28029643
  27. Gliem JA, Gliem RR. Calculating, interpreting, and reporting Cronbach’s alpha reliability coefficient for Likert-type scales. Midwest Research-to-Practice Conference in Adult, Continuing, and Community Education; 2003.
  28. Spector PE. Summated rating scale construction: An introduction. Sage; 1992.
  29. Taber KS. The use of Cronbach’s alpha when developing and reporting research instruments in science education. Research in Science Education. 2018;48(6):1273–96.
  30. Ozili PK. The acceptable R-square in empirical modelling for social science research. In: Social research methodology and publishing results: A guide to non-native English speakers. IGI Global; 2023. p. 134–43.
  31. Parker N, Mahler BP, Edwards M. Humanizing online learning experiences. Journal of Educators Online. 2021;18(2).
  32. Burkšaitienė N. Promoting student learning through feedback in higher education. Socialinių mokslų studijos = Social Sciences Studies. 2012;4(1).
  33. Cheng Z, Dimoka A, Pavlou PA. Context may be King, but generalizability is the Emperor! Journal of Information Technology. 2016;31(3):257–64.
  34. Jiang L, Kamel Shaker Al-Shaibani G. Influencing factors of students’ small private online course-based learning adaptability in a higher vocational college in China. Interactive Learning Environments. 2022:1–22.
  35. Shattuck K. What we’re learning from Quality Matters-focused research: Research, practice, continuous improvement. Quality Matters. 2012:1–29.