Development of a Hybrid Course on Wheelchair Service Provision for clinicians in international contexts

  • Yohali Burrola-Mendez,

    Roles Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Writing – original draft, Writing – review & editing

    Affiliations Department of Rehabilitation Science and Technology, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America, International Society of Wheelchair Professionals (ISWP), University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America

  • Mary Goldberg,

    Roles Conceptualization, Funding acquisition, Methodology, Project administration, Supervision, Writing – review & editing

    mgoldberg@pitt.edu

    Affiliations Department of Rehabilitation Science and Technology, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America, International Society of Wheelchair Professionals (ISWP), University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America

  • Rachel Gartz,

    Roles Conceptualization, Writing – review & editing

    Affiliations Department of Rehabilitation Science and Technology, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America, International Society of Wheelchair Professionals (ISWP), University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America

  • Jon Pearlman

    Roles Conceptualization, Funding acquisition, Methodology, Project administration, Resources, Supervision, Writing – review & editing

    Affiliations Department of Rehabilitation Science and Technology, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America, International Society of Wheelchair Professionals (ISWP), University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America

Abstract

Introduction

Wheelchair users worldwide are at high risk of developing secondary health conditions and premature death due to inappropriate wheelchair provision by untrained providers. The International Society of Wheelchair Professionals (ISWP) has developed a Hybrid Course based on the World Health Organization’s Wheelchair Service Training Package—Basic Level. The Hybrid Course leverages online modules designed for low-bandwidth internet access, reducing the in-person training time from five days to three and a half and making the training less expensive and more convenient for both trainees and trainers.

Methods

The Hybrid Course was designed using a systematic approach guided by an international group of stakeholders. The development followed the Quality Matters Higher Education Rubric, web design guidelines for low bandwidth, experts’ opinions, and best practices for blended course design. A quasi-experimental approach was used to evaluate the effectiveness of the Hybrid Course taken by six graduate students in Rehabilitation Sciences at the University of Pittsburgh by measuring pre- and post-training knowledge with the validated ISWP Wheelchair Service Provision—Basic Test. The outcome measure was assessed using a paired sample t-test between pretest and posttest scores. The quality of the Hybrid Course was evaluated with the Quality Matters Higher Education Rubric by three external reviewers who were blind to each other’s evaluations and to the results of the training intervention.

Results

Hybrid Course participants showed significant increases in scores on the ISWP Wheelchair Service Provision—Basic Test after participating in the training, with an average increase of 10.84±5.42, p = 0.004, Cohen’s d = 1.99. In addition, the Hybrid Course met the Quality Matters Standards in two out of three evaluations, with 84% agreement between evaluators.

Conclusions

The Hybrid Course met quality standards and proved to be effective in increasing basic level wheelchair knowledge in a group of Rehabilitation Science graduate students.

Introduction

The World Health Organization (WHO) estimates that 10% of people with disabilities, approximately 112 million people, need a wheelchair for mobility and function. However, only 5%–15% of them have access to a properly fitted wheelchair, indicating that approximately 96 million people do not have a wheelchair or have one that does not meet their needs [1–3]. Wheelchair users who do not have access to appropriate wheelchair provision by trained providers are at a high risk of developing secondary health conditions and premature death [1, 4]. The lack of training of personnel delivering wheelchair services may result in poorly fitted wheelchairs that are difficult to propel, fail prematurely, and cause injuries to the user [1, 4, 5]. The health complications of inappropriate wheelchair provision include pressure injuries, falls, overuse or repetitive strain injuries, postural deformities, restricted breathing, and a limited range of motion [1, 6]. This situation suggests that countries where inappropriate wheelchair service delivery occurs are not fulfilling the promise of the United Nations Convention on the Rights of People with Disabilities (UNCRPD), which entitles all people to the right to personal mobility [7].

In 2008 the WHO, the United States Agency for International Development (USAID), the International Society for Prosthetics and Orthotics, and Disabled Peoples’ International launched the Guidelines for the Provision of Manual Wheelchairs in Less-Resourced Settings as an international effort to promote training and to assist nations in fulfilling the UNCRPD [1]. The WHO Guidelines outline eight service steps and the minimum standards that form the basis of a comprehensive wheelchair service based on international evidence-based practice and research [1, 6, 8, 9]. Following the release of the WHO Guidelines, the WHO and USAID published a series of Wheelchair Service Training Packages (WHO WSTPs) to support clinicians’ training and to increase wheelchair access worldwide. To date, there are five WHO WSTPs: the basic, intermediate, managers’, stakeholders’, and trainers’ packages [10–14]. All of the WHO WSTPs follow a learning methodology of in-person training held over consecutive days. This training format may make it difficult for busy providers to attend and to scale across multiple settings, including university training programs. As a result, inappropriate wheelchairs are still widely delivered worldwide, indicating that training uptake has been slow and capacity is insufficient.

Blended learning, or hybrid learning, is a mix of different learning environments and approaches that include online and in-person methods [15]. This type of learning is a cost-effective [16] and student-accepted [17–19] method of knowledge dissemination and could be feasible for global health education [20–22]. The literature has demonstrated that blended learning is as effective as in-person learning in medical and non-medical education [15, 17, 23–25] and can be a feasible solution to overcome knowledge dissemination barriers in less-resourced areas [20].

As blended education proliferates, so do efforts to evaluate its effectiveness by research and assessment groups worldwide such as the International Association for K-12 Online Learning (iNACOL) [26], Education Elements [27], the University of Central Florida—Blended Learning Toolkit [28], Quality Matters [29], and Educause [30], among others. One such organization, Quality Matters (QM), is an international leader in rubric development and quality assurance for online education. QM has developed rubrics intended to guide the development, evaluation, and improvement of online and blended courses, such as the QM Higher Education Rubric [29, 31]. This rubric includes 8 General Standards and 43 Specific Review Standards used to evaluate the design of online and blended courses [29, 31, 32]. To certify the quality of a course, QM requires at least 85% of the content to meet the quality expectations of the rubrics. Multiple studies have successfully used the QM Rubrics to guide the development and assess the quality of online and blended courses [33–37].

Motivated by the successful learning outcomes using blended courses as described in the literature and with the aim of offering alternative learning methodologies that increase the spread of wheelchair service delivery training, our goal was to develop and evaluate a blended learning approach for the WHO WSTP-Basic level (WHO WSTP-B).

The specific aims of this action research study were to:

  1. Determine the online design criteria and content allocation and develop online modules in English.
  2. Do a pilot test of the Hybrid Course to evaluate the learning effect.
  3. Evaluate the quality of the Hybrid Course using the Quality Matters Higher Education Rubric.

We hypothesized that trainees of the Hybrid Course would have significantly higher scores on the ISWP Wheelchair Service Provision–Basic Test after receiving training and that the Hybrid Course would receive an adequate rubric score (85%) to meet the quality standards for blended courses.

Materials and methods

Each specific aim was completed in sequence; the methods are described below.

Specific aim 1: Determine the online design criteria, the allocation of online content, and develop online modules in English

Identify the online design criteria.

To address specific aim 1, the International Society of Wheelchair Professionals (ISWP, a coordinating body for the wheelchair sector) [38] formed the Hybrid Subcommittee (HSC), a multidisciplinary and international stakeholder group that guided the development of the Hybrid Course. The HSC comprised eight members from high-, middle-, and low-income countries (Brazil, Canada, Colombia, India, Mexico, Philippines, the United Kingdom, and the United States of America) with experience in delivering wheelchair training and developing educational programs for high- and low-resource settings. Over the course of 12 months (May 2015 to April 2016), the HSC held synchronous monthly online meetings to discuss and advise on the development of the Hybrid Course (Fig 1). ISWP staff attended the meetings to help coordinate the sessions and distribute the agenda and the minutes. All meetings were recorded and made available to HSC members [39]. An ISWP core team member and HSC member (YBM) with experience in both curriculum and course development and clinical wheelchair provision was the primary developer of the Hybrid Course, with advice from the co-authors of this study and the HSC.

The HSC considered the QM Higher Education Rubric [32] to be a useful framework to guide the development of the Hybrid Course. However, the HSC identified that the Rubric lacks strategies to implement courses in international contexts where connectivity and low internet speeds may be challenging. The HSC reviewed the best practices on blended course design [40] and considered its members’ experience developing educational programs and delivering training in international settings to offer suggestions that strengthened the QM Higher Education Rubric and guided the development of online modules.

Determine the appropriate allocation of online content.

The aim of the Hybrid Course is to offer an alternative learning methodology for the WHO WSTP-B. The purpose of the WHO WSTP-B is to develop the skills and knowledge of personnel who are required to deliver basic level wheelchair services to people with mobility impairments who can sit upright without additional postural support [10]. No clinical background is required to access the training, which makes it feasible to replicate in places where there are few or no professionals in the field of seating and mobility [10]. Table 1 presents the content and time allocation of the WHO WSTP-B in its original learning methodology of in-person training.

The HSC and the primary developer of the Hybrid Course analyzed the theoretical and practical components of the WHO WSTP-B to select the most appropriate content to host online.

Online module development.

The primary developer of the Hybrid Course (YBM) utilized the QM Higher Education Rubric [32], HSC recommendations, web design guidelines for low bandwidth [41] and best practices for blended course design [40] to develop a set of specific review standards that guided the development of the Hybrid Course. The development of the online modules included two rounds of internal (co-authors) and external (HSC) revisions (Fig 1). In the first round, a module prototype was developed and distributed to the HSC members to collect feedback about the visual design of the course, the modules’ sections and the layouts. In the second round of revision, all modules and their respective content were created and distributed via the online platform. For this round, feedback was solicited on curriculum and platform access. In terms of curriculum, reviewers were asked to verify that the content (learning objectives, topics, quizzes, and activities) strictly followed the WHO WSTP course. To evaluate the platform access, HSC members were asked to test the modules in different settings with high and low internet connection speeds and using different technology devices such as smartphones, tablets, laptops, and desktop computers.

Specific aim 2: Do a pilot test of the Hybrid Course

To address specific aim 2, a quasi-experimental trial utilized a pretest-posttest design to evaluate changes in basic level wheelchair knowledge using the validated ISWP Wheelchair Service Provision–Basic Test [42]. The study was approved by the University of Pittsburgh Institutional Review Board.

Study sample.

The sample was selected using a convenience sampling method guided by the co-authors and the HSC. The research team met with the academic directors of the Physical Therapy (PT), Occupational Therapy (OT), Prosthetics and Orthotics (P&O), and Rehabilitation Science and Technology (RST) programs from the University of Pittsburgh to inform and share the scope of the project. The academic directors distributed the Hybrid Course flyer to their students inviting them to register for the Hybrid Course. The flyer included the description of the course, inclusion criteria, location, online and in-person time commitments, schedule, registration process and contact information (Fig 2).

The study’s inclusion criteria were: 1) students, staff, or professors from the PT, OT, P&O, and RST programs at the University of Pittsburgh 2) who had not taken the ISWP Wheelchair Service Provision–Basic Test. We excluded participants who were simultaneously participating in another wheelchair-related study or training.

Outcome measure: Wheelchair Service Provision knowledge.

The ISWP Wheelchair Service Provision–Basic Test is a valid method for measuring the basic competency of wheelchair professionals independent of geographic location [42]. The test consists of 19 sociodemographic questions and 75 multiple-choice questions that evaluate basic wheelchair service delivery. The multiple-choice questions evaluate seven domains of wheelchair service delivery: 1) assessment, 2) prescription, 3) fitting, 4) production, 5) user training, 6) process, and 7) follow-up and maintenance, as covered in the WHO WSTP-B. The domains have different weights based on the pre-set number of questions allocated to each domain. Each domain has a pool of questions created to reduce the likelihood of receiving the same question when taking the test multiple times. Test scores greater than or equal to 53 points (70% of the total points) are considered passing scores. The test was hosted and distributed online through the testing platform Test.com®.
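
For illustration, the minimal sketch below (Python; not part of the study materials) shows how a total score and pass/fail decision could be computed under the 70% rule described above; the per-domain question counts are hypothetical placeholders rather than the actual ISWP domain weighting.

```python
# Minimal sketch, assuming one point per correct answer and the passing
# threshold described above (53 of 75 points). Domain counts are hypothetical,
# not the real ISWP weighting.

PASSING_SCORE = 53  # 70% of 75 points, rounded up


def total_score(correct_per_domain):
    """Sum correct answers (1 point each) across the seven domains."""
    return sum(correct_per_domain.values())


def passed(correct_per_domain):
    """A participant passes with 53 or more points."""
    return total_score(correct_per_domain) >= PASSING_SCORE


# Hypothetical per-domain results for one participant
example = {
    "assessment": 12, "prescription": 10, "fitting": 9, "production": 4,
    "user training": 8, "process": 7, "follow-up and maintenance": 5,
}
print(total_score(example), passed(example))  # 55 True
```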

Intervention.

Fig 3 presents the study’s overview and timeline. The training was purposefully implemented interprofessionally. Two trainers were physical therapists and one was an occupational therapist. The trainers had been trained in the WHO WSTP-B and had participated in the WHO Wheelchair Service Training of Trainers Package–Basic level [14]. In addition, all the trainers had facilitated the WHO WSTP-B and had provided basic level wheelchair services in both high- and less-resourced settings.

Assessments were collected one week before and one week after the training intervention. The participants received an email with instructions on how to log into the testing platform, Test.com®, and the contact information of ISWP staff in case of technical problems or questions. Participants were instructed to complete the test without accessing course materials. The settings of the ISWP Wheelchair Service Provision–Basic Test included: 1) a random distribution of questions and answers from the domains’ pools of questions; 2) forced completion, which required the participants to complete the test in a single entry; and 3) feedback and scores, whereby the test provided immediate scores and the option to review correct and incorrect answers.
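
As a rough illustration of these delivery settings, the sketch below shows one way a test instance could be assembled by drawing a pre-set number of questions from each domain’s pool and randomizing the order; the pools, counts, and function names are assumptions made for illustration, not the Test.com® implementation.

```python
# Illustrative sketch only: draw a fixed number of questions per domain from
# that domain's pool (without replacement) and shuffle the presentation order,
# mirroring the "random distribution of questions" setting described above.
import random


def build_test_instance(pools, counts, seed=None):
    """pools: domain -> list of question IDs; counts: domain -> number to draw."""
    rng = random.Random(seed)
    questions = []
    for domain, n in counts.items():
        questions.extend(rng.sample(pools[domain], n))  # random draw from the pool
    rng.shuffle(questions)  # randomize order across domains
    return questions


# Hypothetical pools and counts for two of the seven domains
pools = {"assessment": [f"A{i}" for i in range(30)],
         "prescription": [f"P{i}" for i in range(25)]}
counts = {"assessment": 14, "prescription": 11}
print(len(build_test_instance(pools, counts, seed=1)))  # 25
```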

Online training: As indicated in Fig 3, the online learning was divided into three sequential phases. In each phase, the participants reviewed the content and completed the required activities asynchronously. After the completion of each phase, an online session (recitation) was held synchronously between the participants and the trainers. During the recitations, trainers reinforced the key points of the modules, answered questions, discussed topics, and promoted interaction among the participants. The recitations were recorded and made available to the participants and trainers. ISWP staff helped to coordinate the recitations and provided support when needed. Table 2 presents the training agenda of the online and in-person sessions.

In-person training: After the completion of the online modules, the participants attended three days of in-person training led by the three trainers at the University of Pittsburgh, USA (Table 2). Three experienced wheelchair users were also invited to participate as volunteers in the in-person sessions, and the trainees had the opportunity to work directly with them throughout the in-person sessions.

Data management and analysis.

All data were collected in a Test.com® database, exported to a CSV file, and then imported into SPSS® Version 24.0. Descriptive statistics were calculated. For the outcome measure, knowledge change, a paired sample t-test was calculated to compare the levels of knowledge between the baseline and post-training total scores. In addition, paired sample t-tests were calculated for each test domain to explore specific knowledge changes. The effect size was calculated using Cohen’s d [43]. All analyses were carried out using an exploratory alpha level of 0.05.
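
For readers who wish to reproduce this analysis outside SPSS®, the sketch below shows an equivalent computation in Python with SciPy (an assumption of this illustration, not the software used in the study): a Shapiro-Wilk check on the paired differences, a paired-samples t-test, and Cohen’s d computed from the differences. The score vectors are placeholders, not study data.

```python
# Minimal sketch of the analysis pipeline described above, using NumPy/SciPy in
# place of SPSS. Pre/post scores below are placeholders, not study data.
import numpy as np
from scipy import stats

pre = np.array([52, 54, 53, 55, 51, 55], dtype=float)
post = np.array([60, 66, 63, 70, 58, 68], dtype=float)

diff = post - pre
_, shapiro_p = stats.shapiro(diff)            # normality of the paired differences
t_stat, p_value = stats.ttest_rel(post, pre)  # paired-samples t-test
cohens_d = diff.mean() / diff.std(ddof=1)     # effect size from the paired differences

print(f"Shapiro p={shapiro_p:.3f}, t={t_stat:.3f}, p={p_value:.4f}, d={cohens_d:.2f}")
```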

Specific aim 3: Evaluate the quality of the Hybrid Course using the Quality Matters Higher Education Rubric

To address specific aim 3, ISWP recruited three evaluators from two middle-income countries, fluent in English, with experience implementing educational health programs in high- and less-resourced settings using in-person and blended learning methodologies. This quality evaluation was conducted after the implementation of the Hybrid Course.

The QM Higher Education Rubric was used to evaluate the quality of the Hybrid Course. This tool includes 8 General Standards: 1) Course Overview and Introduction; 2) Learning Objectives (Competencies); 3) Assessment and Measurement; 4) Instructional Materials; 5) Course Activities and Learner Interaction; 6) Course Technology; 7) Learner Support; and 8) Accessibility and Usability; and 43 Specific Standards that are dichotomously rated as either “met” or “not met” [31, 32]. When a Specific Standard is met, it receives a pre-assigned value of 1, 2, or 3 points; those with point values of 3 are considered “essential standards” and must be met for a course to “meet standards” [32, 44]. The maximum score of the Rubric is 99 points. A score of 85 points out of 99, or 85%, as well as meeting all 3-point essential standards, is required for a course to meet the QM Standards.
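
A minimal sketch of this decision rule is shown below; the standard IDs and point values are illustrative rather than the full 43-item rubric. A course meets the QM Standards only when the summed points from met Specific Standards reach 85 of the 99 available and every 3-point essential standard is met.

```python
# Minimal sketch of the QM "meets standards" rule described above. The ratings
# dictionary is illustrative; the real rubric has 43 Specific Standards worth
# 99 points in total.

POINT_THRESHOLD = 85   # points required to meet standards
ESSENTIAL_POINTS = 3   # essential standards carry 3 points and must all be met


def meets_qm_standards(ratings):
    """ratings maps a standard ID to a (point_value, met) pair."""
    total = sum(points for points, met in ratings.values() if met)
    essentials_met = all(met for points, met in ratings.values()
                         if points == ESSENTIAL_POINTS)
    return total >= POINT_THRESHOLD and essentials_met


# Illustrative partial rating set (not a full 43-standard review)
ratings = {"1.1": (3, True), "2.4": (3, False), "3.3": (3, True), "8.6": (1, False)}
print(meets_qm_standards(ratings))  # False: essential 2.4 unmet and points below 85
```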

The evaluators received an individual invitation via email from the primary developer of the Hybrid Course to voluntarily review and evaluate the course using the QM Higher Education Rubric. Upon agreement to participate, a second email was sent with the link and instructions for accessing and reviewing the Hybrid Course, with the QM Higher Education Rubric attached. Reviewers had six weeks to evaluate the Hybrid Course and return the rated QM Rubric via email. In addition, they were encouraged to submit suggestions and comments to help improve the course and to contact the author with any inquiries. Reviewers were blinded to both each other’s evaluations and the pilot results from the Hybrid Course.

The rated rubrics were transcribed to a Microsoft Excel spreadsheet database. A total score was obtained per reviewer by adding the pre-assigned values (1, 2, or 3 points) of the Specific Standards that were met. In addition, the percentage of agreement was computed by dividing the number of times raters agreed on a rating by the total number of ratings [45, 46]. This technique allows exploration of agreement among multiple evaluators and the opportunity to identify items that may be problematic [46]. An odd number of reviewers was chosen to help identify potentially problematic items. The HSC and the co-authors of the study determined that if an item was rated as “not met” by two or more reviewers, it would be considered problematic. Addressing problematic items will be part of future studies. The additional recommendations offered by the HSC were included in the rubric. Reviewers were asked to evaluate whether the items suggested by the HSC were “met” or “not met”; however, these items did not receive any points, so as not to interfere with the QM review process.
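
The sketch below illustrates these two computations under the reading described above: agreement is counted when all three reviewers give an item the same rating, and an item is flagged as problematic when two or more reviewers rate it “not met”. The ratings shown are illustrative, not the study data.

```python
# Illustrative sketch of the percentage-of-agreement and "problematic item"
# rules described above. True = "met", False = "not met"; the data are made up.

def percent_agreement(ratings_by_reviewer):
    """Share of items on which every reviewer gave the same rating."""
    items = list(zip(*ratings_by_reviewer))
    agreed = sum(1 for item in items if len(set(item)) == 1)
    return agreed / len(items)


def problematic_items(item_ids, ratings_by_reviewer):
    """Items rated 'not met' by two or more reviewers."""
    return [item_id for item_id, item in zip(item_ids, zip(*ratings_by_reviewer))
            if list(item).count(False) >= 2]


ids = ["7.4", "8.6", "8.7"]
reviews = [
    [False, False, True],   # reviewer 1
    [False, True,  True],   # reviewer 2
    [False, False, True],   # reviewer 3
]
print(percent_agreement(reviews))       # 2/3 ≈ 0.67
print(problematic_items(ids, reviews))  # ['7.4', '8.6']
```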

Results

Specific aim 1: Determine the online design criteria, the allocation of online content, and develop online modules in English

Identify the online design criteria.

The HSC offered eight suggestions to strengthen the QM Higher Education Rubric and to guide the development of the online modules. The suggestions were added to the original QM Rubric and are presented in Table 3. Items 1.10, 1.11, 3.6, 3.7, 5.5, 8.6, 8.7, and 8.8 represent the HSC’s suggestions. The category Accessibility and Usability received the most suggestions (8.6, 8.7, 8.8), reflecting the HSC’s emphasis on developing a course that could be shared and scaled in the future. To meet the accessibility and usability requirements, the authors selected Adobe Captivate 9® and CourseSites by Blackboard® as the authoring tool and the learning management system, respectively, to develop and host the online modules of the Hybrid Course. The selection was based on the availability of the Adobe Captivate 9® program and the free hosting and publishing of online courses that CourseSites® offers [47].

Table 3. Test results from the Quality Matters Higher Education Rubric and the additional Hybrid Subcommittee items.

https://doi.org/10.1371/journal.pone.0199251.t003

Determine the appropriate allocation of online content.

The HSC reached a consensus and selected the Introduction and Core Knowledge to be the online components of the Hybrid Course due to their theoretical components and the few practical activities that they included (Table 4). The Wheelchair Service Basic Steps section was selected to be the in-person component of the Hybrid Course following the methodology proposed by the WHO WSTP-B. Table 5 presents the Hybrid Course content distribution.

Table 4. WHO WSTP-B time allocation for practical sessions.

https://doi.org/10.1371/journal.pone.0199251.t004

Develop the online modules.

The specific actions implemented by the primary author to fulfill the QM Rubric and the additional HSC suggestions are grouped based on the QM’s Standards and described below.

  1. Course Overview and Introduction: An Introductory module was developed that included: 1) an overview of the Hybrid Course covering the structure and purpose of the course, target audience, prerequisites, instructional materials, technology requirements, and instructional videos explaining how to access CourseSites® and how to navigate through the Adobe Captivate® modules; and 2) tips on how to succeed in online learning. This module was hosted on ISWP’s website and distributed via an external link to the participants prior to the beginning of the course.
  2. Learning Objectives: Each online module of the Core Knowledge included learning objectives according to the WHO WSTP-B.
  3. Assessment and Measurements: Short quizzes (three to five questions) were developed to monitor the participants’ comprehension of the material. They were developed by the primary author and reviewed by the co-authors and the HSC. The quizzes included different types of questions such as multiple choice, multiple answers, matching columns, case studies, and true or false (Fig 4). The quizzes were automatically evaluated; this allowed the participants to receive feedback and to review the quiz immediately after its completion. Passing scores were not required to continue with the in-person training; however, all modules with their respective quiz needed to be completed. Individual scores were automatically reflected in the grading center allowing the participants and trainers to track the learning progress (Fig 5).
  4. Instructional Materials: Electronic versions of the two required materials, the WHO Reference Manual for Participants and the Participant’s Workbook, were available for participants to download from CourseSites® prior to the training. Moreover, if the participants preferred to read the materials online, each module included the reading or activity that needed to be reviewed. This feature centralized resources and improved navigation throughout the course. Some modules included a folder with optional reading materials.
  5. Course Activities and Learner Interaction: Course Activities: The online modules followed all the activities suggested by the WHO Trainer’s Manual–Basic Level; however, 25.8% of the time allocated to Core Knowledge is used to practice skills (Table 3). In order to comply with the WHO WSTP-B content structure, the practical activities were allocated to the first day of the in-person portion. The online modules introduced each activity with a video; after watching it, the participants were informed that they would practice those skills during the in-person portion of the training.
    Learner Interaction: With the aim of supporting active learning and promoting interaction between participants and trainers, two activities were developed: 1) Discussion Forums, in which the participants were asked to post questions for each module and were encouraged to respond to each other’s inquiries. Trainers monitored the discussion forums and added necessary information. This feature promoted two-way communication between students and between students and trainers; and 2) Recitations: three synchronous recitations were conducted to monitor the online section; the first after the completion of the first three modules, the second after the completion of the next three modules, and the third after completing the last two modules (Fig 3). During the recitations, facilitators reinforced the key elements of each module and responded to participants’ questions that were not addressed by their peers.
  6. Course Technology: The Introductory module included a section about the technology requirements needed to access CourseSites®, where the online modules are hosted. This section included a link to a browser checker that verifies whether CourseSites® supports a learner’s browser and operating system [48] and provides feedback on the steps to follow to view content within CourseSites®.
  7. Learner Support: The Introductory module included a section on how to succeed in online learning, which described effective communication skills and how to get technical support during the online components of the training. The participants could send emails to the trainers and the primary developer of the modules through CourseSites®. In addition, the platform includes a help center with information about common issues and student FAQs [49].
  8. Accessibility and Usability: Adobe Captivate 9® and CourseSites by Blackboard® were selected as the authoring tool and the learning management system to develop and host the online modules. The tools are compatible with Sharable Content Object Reference Model (SCORM) and Learning Management System (LMS) criteria suggested by the HSC to share and scale the training program [50, 51]. Also, Adobe Captivate® and CourseSites® met the low bandwidth requirements and mobile content accessibility needed to implement training programs in settings with slow internet connection speeds [47, 52].

Specific aim 2: Do a pilot test of the Hybrid Course

A total of six participants were recruited; all of them completed the pre- and post-assessments, and therefore there were no dropouts. The average age was 26±3 years. The participants’ characteristics are described in Table 6. The demographic questions that referred to the work setting, age group served, and motivation to take the training allowed multiple answers, and the participants were asked to select all applicable options. Data were tested for normality and homogeneity of variance; the assumption of normal distribution of pre- and post-scores was met (Shapiro-Wilk test p>0.05). A paired-samples t-test indicated that post-assessment scores were significantly higher (M = 64.17, SD = 5.41) than pre-assessment scores (M = 53.33, SD = 1.66), t(5) = 4.897, p = 0.004; Cohen’s d = 1.99. Knowledge changes per test domain were analyzed; all domains except “Follow up and maintenance” saw an increase in mean scores between pretest and posttest (Table 7). There was a significant increase in scores in the domains of “Prescription”, “User Training”, and “Process” (Table 7).
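
As a quick internal-consistency check on these reported values (using only the summary statistics above, not the raw data), the mean paired difference of 64.17 − 53.33 = 10.84 with a difference SD of 5.42 and n = 6 reproduces the reported statistics:

```python
# Consistency check from the reported summary statistics only (no raw data).
import math

n = 6
mean_diff = 64.17 - 53.33          # 10.84, matching the abstract
sd_diff = 5.42                     # SD of the paired differences
t = mean_diff / (sd_diff / math.sqrt(n))
d = mean_diff / sd_diff
print(round(t, 2), round(d, 2))    # ≈ 4.9 and ≈ 2.0, consistent with t(5) = 4.897, d = 1.99
```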

Specific aim 3: Evaluate the quality of the Hybrid Course using the Quality Matters Higher Education Rubric

The evaluators’ reviews are included in Table 3. The rated rubrics from evaluators 1 and 3 met the QM standards by (1) reporting a total score above the 85% threshold and (2) scoring all essential standards as “met”. Nevertheless, results from evaluator 2 indicated that the total score did not reach the 85% threshold, nor were all essential standards met; in particular, the essential standards not met were 2.4, 3.3, and 6.2. The percentage of agreement between evaluators on the QM rubric was 84% (Table 8).

The items considered problematic, i.e., those rated as “not met” by two or more reviewers, were: 7.4, “Course instructions articulate or link to an explanation of how the institution’s student services and resources can help learners succeed and how learners can obtain them”; and 8.6, “The course was developed considering low bandwidth requirements.”

Discussion

We developed a Hybrid Course based on the WHO WSTP-B using a systematic approach guided by an international committee of experts, the HSC, with experience delivering wheelchair training and developing educational programs for international contexts. The HSC was composed of members from low- to high-income countries with experience developing and facilitating wheelchair service training in different settings. Studies have argued that stakeholder engagement is a key mechanism for increasing the relevance of research, promoting knowledge translation, and enhancing positive effects in a community [53–55]. Collaborating with stakeholders results in more usable, relevant, and transferable knowledge that could help solve global health problems [54–56]. In particular, the rehabilitation field calls for greater involvement and collaboration of stakeholders in all phases of the research process [54, 57–59], and we followed this guidance by ensuring that the HSC was involved in all phases of the development and implementation processes.

In addition to our primary evaluation mechanism (international stakeholder feedback), we used the Quality Matters Standards to identify areas of opportunity for future development and to provide insight into any areas that may be problematic for the initial implementation of the course. The Hybrid Course met the Quality Matters Standards in most of its reviews [29, 44], with an acceptable percentage of agreement between evaluators of 84% (>80%) [46]. The two items considered problematic were 7.4, “Course instructions articulate or link to an explanation of how the institution’s student services and resources can help learners succeed and how learners can obtain them”; and 8.6, “The course was developed considering low bandwidth requirements.” A first approach to address these items could be (7.4) to clearly list the resources and support that the participants can access throughout the course, and (8.6) to state explicitly that the course was developed considering low bandwidth requirements. Given that the Hybrid Course was developed following web design guidelines for low bandwidth [41] and uses an authoring tool and a learning management system that meet low bandwidth requirements [50, 51], as described in the Methods and Results sections, we believe that this item was marked as “not met” due to the lack of a clear statement of this characteristic and/or the inability of the reviewers to test this feature. The objective of this first quality review was to collect preliminary results that could help identify areas of opportunity. As the Hybrid Course is implemented in other settings, future studies could continue exploring the quality of the course and include additional sources of feedback, such as the participants’ levels of satisfaction after the training intervention.

It is worth noting that multiple studies have used the QM Rubrics to guide the development of online and blended courses; however, the courses represented in those studies were developed for learners from the United States and other high-income countries [33–37]. In contrast, the Hybrid Course is intended for international contexts including low- to high-resource settings. Therefore, the HSC slightly adapted the QM Higher Education Rubric to include items that make it more contextually appropriate for less-resourced settings (e.g., 8.6, “The course was developed considering low bandwidth requirements”, and 8.8, “The course is hosted in a platform that allows mobile content access”). Future studies could continue working on this rubric and include validity evidence that supports its use.

Additionally, there were significant increases in both the total ISWP Wheelchair Service Provision–Basic Test score and several domain scores, further demonstrating the training’s potential value and the feasibility of improving the participants’ basic wheelchair knowledge. Prior to the course, the participants’ mean pre-test score was slightly above the passing cutoff of the test (53 points), which indicates that our sample had some basic level wheelchair knowledge; despite that, the Hybrid Course still positively impacted the participants’ knowledge, with a significant increase in the mean post-test scores (p = 0.004) and an effect size (d = 1.99) that exceeded Cohen’s convention for a large effect size (d = 0.80) [43]. This may suggest that the Hybrid Course has the potential to increase knowledge for participants with a range of baseline knowledge, including those who may be more advanced.

Moreover, paired t-tests were calculated per domain to explore specific knowledge change. Participants’ scores significantly increased after the training in some domains, but not all. There is some evidence to suggest that this may be due to the emphasis placed on domain content through practical sessions in the in-person training and the participation of wheelchair users. For example, the “Prescription”, “User Training”, and “Process” domains all resulted in significantly higher scores, and the participants were able to fully experience those aspects of the provisioning process with wheelchair users in the practical sessions of the training. It is worth noting that these domains are also highly represented in the test, suggesting that a significant increase would be more likely. However, this finding was not universal, as the “Assessment” and “Fitting” domains, which are also highly represented in the test and include in-person practical sessions, did not demonstrate a significant increase on the post-test. This could be because baseline knowledge in the “Assessment” domain was already high before the training (pre-score of 80.7%), rising to 90.4% afterwards. It is important to note that the alpha level was not adjusted for multiple testing because this is an exploratory study with a small sample size and we were concerned about the potential of making a Type II error [60–62]. Furthermore, we were interested in exploring the magnitude of the effect rather than statistical significance. A larger sample size would help to determine the Hybrid Course’s specific impact on each domain and to explore which learning modalities are likely to have the biggest impact on the participants’ knowledge.

The Hybrid Course, due to its proposed scalability, flexibility, and effectiveness in improving participant knowledge, may have the potential to build local and global capacity by increasing the number of people trained in basic level wheelchair delivery. Increasing the number of trained personnel delivering wheelchair services may reduce secondary complications of inappropriate wheelchair provision, such as pressure injuries, postural deformities, and restricted breathing, which may lead to better health outcomes among wheelchair users. The unique characteristics of the Hybrid Course design, such as its low bandwidth design [41], its mobile accessibility [47, 63], and its compatibility with SCORM and LMS [50, 51], allow the course to be adapted to different contexts and to be scaled to different levels [64, 65]. At the local level, governments could use the Hybrid Course to spread trainings across regions and to fulfill the promise of the United Nations Convention on the Rights of People with Disabilities. At the international level, the Hybrid Course could be an available tool for nations aligned with the WHO Global Cooperation on Assistive Technology (GATE) goals of promoting access to high-quality, affordable assistive products to lead a healthy, productive, and dignified life [66]. Moreover, ISWP’s range of partners, including NGOs, universities, and disability organizations [67], could use the Hybrid Course to implement trainings more efficiently. Reducing the number of in-person sessions and their associated costs, without compromising the effectiveness of the training, may allow organizations to allocate resources to more underserved regions.

Several limitations of this study are important to note in planning for future research and interpreting the current results. Although we have discussed the benefits of stakeholder engagement throughout all phases of the research process, the Hybrid Course may still face operational challenges that were not identified by our stakeholder group or were not successfully addressed in the online development. These challenges would be realized more clearly through the implementation of the Hybrid Course in different contexts; this is part of our ongoing work. In this study, we had a small, highly educated sample that may not reflect the population we intend to target, which limits the generalizability of our findings to wheelchair providers in different settings.

We used the QM Higher Education Rubric to evaluate the quality of the Hybrid Course. This rubric uses a unidimensional approach with a dichotomous rating that asks reviewers to indicate whether each of the Specific Standards has been “met” or “not met”. This approach may be too constricting and limit the analysis. A different scale format, such as a 5-point Likert-type scale, could help quantify evaluators’ opinions and perceptions and provide detailed feedback on how to address the course’s limitations. Moreover, we recruited only three external reviewers, who were not certified as Quality Matters Master Reviewers. Their responses were analyzed using a percentage of agreement, which does not account for the impact of chance agreement and may overestimate or underestimate the agreement between evaluators [45, 46]. Conducting an official Quality Matters review or having more reviewers from different geographic locations may have uncovered greater disagreement between the evaluators or operational challenges in accessing the online modules. The implementation of the Hybrid Course in low-, medium-, and high-income countries is a topic of ongoing work and will help us to address the limitations of this study and measure the impact of the training in other contexts.

Ongoing & future work

Ongoing and future work will be focused on implementing the Hybrid Course in different contexts to evaluate the feasibility of the training program to be used for global capacity building. Implementation of the Hybrid Course was recently completed in India and Mexico (after translation into Spanish), and results are being compared to the traditional 5-day in-person training. The results of these comparisons will be the topic of future research publications. Additional future work includes translating the Hybrid Course into different languages, disseminating the training material through global partners, and expanding the Hybrid Course training program by moving more content into online training modules, including content from other WHO WSTP programs (e.g., the intermediate, managers’, stakeholders’, and trainers’ packages).

Conclusions

A Hybrid Course on Wheelchair Service Provision for wheelchair providers in international contexts was developed using a systematic approach guided by an international group of stakeholders. The Hybrid Course met quality standards in two out of three evaluations and proved to be effective in increasing basic level wheelchair knowledge in a pilot study held at the University of Pittsburgh. Comparisons between the Hybrid and traditional in-person training are currently being performed as part of a feasibility study in international contexts.

Acknowledgments

The authors would like to acknowledge Lee Kirby, chair of the Hybrid Subcommittee, for his support and guidance throughout the development of the Hybrid Course; Nancy Augustine for her critical revision of the manuscript; and our external evaluators Norma Jimenez García, Eduardo Jacome Sampieri, and Francisco J. Bonilla-Escobar.

References

  1. World Health Organization. Guidelines on the provision of manual wheelchairs in less resourced settings. Geneva: WHO; 2008.
  2. World Health Organization. World Report on Disability. Geneva: 2011.
  3. United Nations, Department of Economic and Social Affairs, Population Division. World Population Prospects: The 2017 Revision, custom data acquired via website. Geneva: United Nations; 2017 [cited 2018 Jan 3]. https://esa.un.org/unpd/wpp/DataQuery/.
  4. Toro ML, Worobey L, Boninger ML, Cooper RA, Pearlman J. Type and Frequency of Reported Wheelchair Repairs and Related Adverse Consequences Among People With Spinal Cord Injury. Arch Phys Med Rehabil. 2016;97(10):1753–60. Epub 2016/05/08. pmid:27153763.
  5. Mhatre A, Martin D, McCambridge M, Reese N, Sullivan M, Schoendorfer D, et al. Developing product quality standards for wheelchairs used in less-resourced environments. Afr J Disabil. 2017;6:288. Epub 2017/09/25. pmid:28936410.
  6. Greer N, Brasure M, Wilt TJ. Wheeled mobility (wheelchair) service delivery: scope of the evidence. Ann Intern Med. 2012;156(2):141–6. Epub 2012/01/18. pmid:22250145.
  7. United Nations, Division for Social Policy and Development Disability. Convention on the Rights of Persons with Disabilities (CRPD). New York: United Nations; 2006 [cited 2018 Jan 3]. https://www.un.org/development/desa/disabilities/convention-on-the-rights-of-persons-with-disabilities.html.
  8. Visagie S, Scheffler E, Schneider M. Policy implementation in wheelchair service delivery in a rural South African setting. Afr J Disabil. 2013;2(1):63. Epub 2013/09/09. pmid:28729993.
  9. Toro ML, Eke C, Pearlman J. The impact of the World Health Organization 8-steps in wheelchair service provision in wheelchair users in a less resourced setting: a cohort study in Indonesia. BMC Health Serv Res. 2016;16:26. Epub 2016/01/24. pmid:26801984.
  10. World Health Organization. Wheelchair Service Training Package: Basic Level. Geneva: WHO; 2012.
  11. World Health Organization. Wheelchair Service Training Package: Intermediate Level. Geneva: WHO; 2013.
  12. World Health Organization. Wheelchair Service Training Package: Managers. Geneva: WHO; 2015.
  13. World Health Organization. Wheelchair Service Training Package: Stakeholders. Geneva: WHO; 2015.
  14. World Health Organization. Wheelchair Service Training of Trainers Package. Geneva: WHO; 2018.
  15. Frehywot S, Vovides Y, Talib Z, Mikhail N, Ross H, Wohltjen H, et al. E-learning in medical education in resource constrained low- and middle-income countries. Hum Resour Health. 2013;11:4. Epub 2013/02/06. pmid:23379467.
  16. Maloney S, Nicklen P, Rivers G, Foo J, Ooi YY, Reeves S, et al. A Cost-Effectiveness Analysis of Blended Versus Face-to-Face Delivery of Evidence-Based Medicine to Medical Students. J Med Internet Res. 2015;17(7):e182. Epub 2015/07/23. pmid:26197801.
  17. Atkins S, Yan W, Meragia E, Mahomed H, Rosales-Klintz S, Skinner D, et al. Student experiences of participating in five collaborative blended learning courses in Africa and Asia: a survey. Glob Health Action. 2016;9:28145. Epub 2016/10/12. pmid:27725077.
  18. Motschnig-Pitrik R. Participatory Action Research in a Blended Learning Course on Project Management Soft Skills. 36th ASEE/IEEE Frontiers in Education Conference; San Diego, CA: IEEE; 2006.
  19. Lewis PA, Tutticci NF, Douglas C, Gray G, Osborne Y, Evans K, et al. Flexible learning: Evaluation of an international distance education programme designed to build the learning and teaching capacity of nurse academics in a developing country. Nurse Educ Pract. 2016;21:59–65. Epub 2016/10/19. pmid:27756057.
  20. Liyanagunawardena TR, Aboshady OA. Massive open online courses: a resource for health education in developing countries. Glob Health Promot. 2017:1757975916680970. Epub 2017/01/31. pmid:28134014.
  21. Marrinan H, Firth S, Hipgrave D, Jimenez-Soto E. Let’s Take it to the Clouds: The Potential of Educational Innovations, Including Blended Learning, for Capacity Building in Developing Countries. Int J Health Policy Manag. 2015;4(9):571–3. Epub 2015/09/05. pmid:26340485.
  22. Protsiv M, Atkins S, Arcade consortium. The experiences of lecturers in African, Asian and European universities in preparing and delivering blended health research methods courses: a qualitative study. Glob Health Action. 2016;9:28149. Epub 2016/10/12. pmid:27725078.
  23. Ruiz JG, Mintzer MJ, Leipzig RM. The Impact of E-Learning in Medical Education. Academic Medicine. 2006;81(3):207–12. pmid:16501260.
  24. Dantas AM, Kemm RE. A blended approach to active learning in a physiology laboratory-based subject facilitated by an e-learning component. Adv Physiol Educ. 2008;32(1):65–75. Epub 2008/03/13. pmid:18334571.
  25. Means B, Toyama Y, Murphy R, Bakia M, Jones K, Center for Technology in Learning. Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, D.C.: U.S. Department of Education; 2010.
  26. International Association for K-12 Online Learning. About iNACOL. Vienna, VA: iNACOL; 2015 [cited 2018 Jan 3]. https://www.inacol.org/about/.
  27. Education Elements. Washington, DC: Education Elements; 2018 [cited 2018 April 3]. https://www.edelements.com/.
  28. University of Central Florida. Blended Learning Toolkit. Orlando, FL: University of Central Florida; [cited 2018 April 3]. https://blended.online.ucf.edu/.
  29. Quality Matters. About Quality Matters. Annapolis, MD: Quality Matters; 2017 [cited 2018 April 3]. https://www.qualitymatters.org/about.
  30. Educause. Educause Initiatives. 2018 [cited 2018 April 03]. https://www.educause.edu/.
  31. Quality Matters. Higher Education Course Design Rubric. Annapolis, MD: Quality Matters; 2014 [cited 2018 April 3]. https://www.qualitymatters.org/qa-resources/rubric-standards/higher-ed-rubric.
  32. Quality Matters. Non-annotated Standards from the QM Higher Education Rubric, Fifth Edition. Annapolis, MD: Quality Matters; 2014 [updated 2017 Feb 22; cited 2018 April 3]. https://www.qualitymatters.org/sites/default/files/PDFs/StandardsfromtheQMHigherEducationRubric.pdf.
  33. Lowenthal P, Hodges C. In search of quality: Using Quality Matters to analyze the quality of massive, open, online courses (MOOCs). Int Rev Res Open Dis. 2015;16(5):83–101.
  34. Parscal T, Riemer D. Assuring quality in large-scale online course development. OJDLA. 2010;13(2).
  35. Ralston-Berg P, Nath L, editors. What makes a quality online course? The student perspective. Proceedings of the 24th Annual Conference on Distance Teaching and Learning; 2008 Aug 7.
  36. Puzziferro M, Shelton K. A model for developing high-quality online courses: Integrating a systems approach with learning theory. Journal of Asynchronous Learning Networks. 2008;12:119–36.
  37. Legon R, Runyon J. Research on the impact of the quality matters course review process. 23rd Annual Conference on Distance Teaching & Learning; Aug 8–10 2007; Madison: Library Hi Tech News; 2007.
  38. International Society of Wheelchair Professionals (ISWP). About ISWP. Pittsburgh, PA: ISWP; 2015 [cited 2018 April 3]. http://wheelchairnet.org/about-us.
  39. International Society of Wheelchair Professionals (ISWP). Training Working Groups. 2018 [cited 2018 April 03]. http://www.wheelchairnet.org/training-working-groups.
  40. McGee P, Reis A. Blended course design: A synthesis of best practices. Journal of Asynchronous Learning Networks. 2012;16(4):7–22.
  41. Aptivate The Digital Agency for International Development. Web Design Guidelines for Low Bandwidth [cited 2018 April 03]. http://www.aptivate.org/webguidelines/Multimedia.html.
  42. Gartz R, Goldberg M, Miles A, Cooper R, Pearlman J, Schmeler M, et al. Development of a contextually appropriate, reliable and valid basic Wheelchair Service Provision Test. Disabil Rehabil Assist Technol. 2016;12(4):333–40. Epub 2016/04/22. pmid:27100362.
  43. Cohen J. Statistical power analysis for the behavioral sciences. 2nd ed. Hillsdale, NJ: Lawrence Erlbaum Associates; 1988.
  44. Zimmerman WA. Rater Agreement and the Measurement of Reliability in Evaluations of Online Course Design Using the Quality Matters Rubric. State College, PA: Pennsylvania State University; 2011.
  45. Bajpai S, Chaturvedi H, Bajpai R. Evaluation of Inter-Rater Agreement and Inter-Rater Reliability for Observational Data: An Overview of Concepts and Methods. Journal of the Indian Academy of Applied Psychology. 2015;41(3):20–7.
  46. McHugh ML. Interrater reliability: the kappa statistic. Biochem Med (Zagreb). 2012;22(3):276–82. Epub 2012/10/25. pmid:23092060.
  47. CourseSites by Blackboard. Get the most powerful tools for your classroom [cited 2018 May 29]. https://www.coursesites.com/webapps/Bb-sites-course-creation-BBLEARN/pages/learn.html.
  48. CourseSites by Blackboard. Browser Checker [cited 2018 May 29]. https://help.blackboard.com/Learn/Instructor/Getting_Started/Browser_Support/Browser_Checker.
  49. CourseSites by Blackboard®. Frequently Asked Questions. Blackboard®; [cited 2018 April 3]. https://www.coursesites.com/webapps/Bb-sites-course-creation-BBLEARN/pages/faq.html.
  50. Adobe. Uploading an Adobe Captivate project to a Learning Management System. 2017 [cited 2018 May 29]. https://helpx.adobe.com/in/captivate/using/learning-management-system-lms.html.
  51. Blackboard. SCORM Package [cited 2018 May 29]. https://help.blackboard.com/Moodlerooms/Teacher/Activities_and_Resources/Assignments_and_Submissions/SCORM_Package.
  52. Adobe®. How to improve the file size of Adobe Captivate output. Adobe®; 2016 [updated 2016 Oct 17; cited 2018 Jan 3]. https://helpx.adobe.com/captivate/using/captivate-output.html.
  53. Tamburrini AL, Gilhuly K, Harris-Roxas B. Enhancing benefits in health impact assessment through stakeholder consultation. Impact Assessment and Project Appraisal. 2011;29(3):195–204.
  54. Camden C, Shikako-Thomas K, Nguyen T, Graham E, Thomas A, Sprung J, et al. Engaging stakeholders in rehabilitation research: a scoping review of strategies used in partnerships and evaluation of impacts. Disabil Rehabil. 2015;37(15):1390–400. Epub 2014/09/23. pmid:25243763.
  55. Bowen SJ, Graham ID. From knowledge translation to engaged scholarship: promoting research relevance and utilization. Arch Phys Med Rehabil. 2013;94(1 Suppl):S3–8. Epub 2012/11/13. pmid:23141502.
  56. Hinchcliff R, Greenfield D, Braithwaite J. Is it worth engaging in multi-stakeholder health services research collaborations? Reflections on key benefits, challenges and enabling mechanisms. International Journal for Quality in Health Care. 2014;26(2):124–8. pmid:24519121.
  57. Morris C, Shilling V, McHugh C, Wyatt K. Why it is crucial to involve families in all stages of childhood disability research. Developmental Medicine & Child Neurology. 2011;53(8):769–71.
  58. Menon A, Korner-Bitensky N, Kastner M, McKibbon K, Straus S. Strategies for rehabilitation professionals to move evidence-based knowledge into practice: a systematic review. Journal of Rehabilitation Medicine. 2009;41(13):1024–32. pmid:19893996.
  59. Rosenbaum P. Family-centred research: what does it mean and can we do it? Developmental Medicine & Child Neurology. 2011;53(2):99–100.
  60. Jaeger R, Halliday T. On confirmatory versus exploratory research. Herpetologica. 1998:S64–S6.
  61. Moore C, Carter R, Nietert P, Stewart P. Recommendations for Planning Pilot Studies in Clinical and Translational Research. Clinical and Translational Science. 2011;4(5):332–7. pmid:22029804.
  62. Perneger TV. What’s wrong with Bonferroni adjustments. BMJ. 1998;316(7139):1236–8. Epub 1998/05/16. pmid:9553006.
  63. Pew Research Center. Communications Technology in Emerging and Developing Nations. Washington, DC: Pew Research Center; 2015 [updated 2015 Mar 19; cited 2018 April 3]. http://www.pewglobal.org/2015/03/19/1-communications-technology-in-emerging-and-developing-nations/.
  64. Hensley G. Creating a hybrid college course: Instructional design notes and recommendations for beginners. J Online Learn Teach. 2005;1(2):66–78.
  65. Henrich A, Sieber S. Blended learning and pure e-learning concepts for information retrieval: experiences and future directions. Information Retrieval. 2009;12(2):117–47.
  66. World Health Organization. Global Cooperation on Assistive Technology (GATE). Geneva: WHO; 2014 [cited 2018 April 3]. http://www.who.int/phi/implementation/assistive_technology/phi_gate/en/.
  67. International Society of Wheelchair Professionals (ISWP). Collaborators. 2018 [cited 2018 April 03]. http://www.wheelchairnet.org/collaborators.