Abstract
In the present research, we aim to confirm factors for measuring the perceived quality of Information Technology (IT) services within a higher education context. The perceived quality of IT services is complex and is often measured by multi-dimensional constructs, which requires designing a sufficiently valid scale. Drawing upon the literature and expert input, this study identified five dimensions of IT service quality. Through an empirical study, we show that the perception of IT Service Quality (ITSQ) has a dimensional structure and can be measured using a satisfactorily validated 44-item scale. A Confirmatory Factor Analysis (CFA) was applied to confirm the relationship between the items and dimensions. The findings of this study present a scale for measuring ITSQ within higher education.
Citation: Aziz R, Alluhaidan AS (2022) Are we good enough? A measurement for Information Technology Service Quality (ITSQ) in higher education institutions in Saudi Arabia. PLoS ONE 17(11): e0277265. https://doi.org/10.1371/journal.pone.0277265
Editor: Tauseef Ahmad, King Saud University, SAUDI ARABIA
Received: July 28, 2021; Accepted: October 25, 2022; Published: November 17, 2022
Copyright: © 2022 Aziz, Alluhaidan. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: All relevant data are within the manuscript and its Supporting information files.
Funding: This work was funded by the Deanship of Scientific Research, Princess Nourah Bint Abdulrahman University (grant number 43-PRFA-P-53), awarded to Dr. Ala Alluhaidan. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing interests: The authors have declared that no competing interests exist.
Introduction
In a rapidly changing technology era, information technology service quality (ITSQ) has become a major determinant of organizational success. Improving the service quality of information technology has become a primary factor in the successful execution of the strategic plan of any organization, including higher education institutions. Well-established, high-quality IT services are key to enhancing the quality of higher education institutions. Most Higher Education Institutions (HEIs) strive to make the latest advanced technology available to their staff, faculty, and students. Establishing a standard process for measuring ITSQ, however, remains a challenging endeavor [1, 2].
ITSQ may be measured against stakeholders' needs or against what is perceived as good practice for IT service. The goal of evaluating ITSQ is to obtain an indication of service performance, identify service problems so that improvements can be enacted, and guarantee optimal service for all. In the extant literature, there are three related streams of research: 1) service quality, 2) IT service quality, and 3) quality of IT-enabled services (see Fig 1). The existing literature covers many domains, such as banking, airlines, and healthcare.
In the area of higher education, much work has been done on the quality of higher education with respect to teaching and learning. Yet there is a need to develop and validate a scale to measure IT service quality in higher education [1], because IT plays a dominant role in delivering services in higher education, making it an integral component for assessing stakeholder satisfaction with the services provided. Researchers have proposed instruments for measuring service quality in areas such as healthcare, higher education, and banking. In this work, we have adapted and extended the existing knowledge to develop and validate an instrument for measuring ITSQ in higher education institutions.
Looking closer at the definition of "quality" in the education context, some have proposed it as high quality in administration, equipment, facilities, and instruction. In [3], the authors refer to quality as helping students acquire the graduate skills that employers are looking for. The paper [3] focuses on the perspectives of undergraduate students, academic staff, and employers. Questionnaire data were collected from over 300 students, around 30 staff, and 17 employers; qualitative data were also gathered from students through focus groups. Results revealed that employers highly appreciated graduates' personal qualities, while students highly valued the quality of education and learning, feedback, and staff.
The quality of education has improved with the provision of higher-quality IT services. Dissemination of information, differentiation in teaching techniques, and diversity in delivery methods have increased and improved as IT services prosper. The research paper [4] investigates the quality of IT-Enabled Services (ITES) in higher education institutes in Saudi Arabia. Specifically, a mixed research method was used to gather information from two models of higher education institutions: public and private universities. Results show that the quality of IT-enabled services is better in public institutions than in private institutions.
With improved quality of IT services in higher education, services become cheaper, faster, and more precise. This consequently helps reduce university workload while automating most processes. With enforced usage of IT within an organization, students are obliged to improve their communication abilities, computer skills, teamwork attitudes, and collaborative learning tendencies [2]. This paper helps measure ITSQ within the higher education sector. It synthesizes and presents the factors of ITSQ, addressing the following question: What are the current factors in measuring information technology service quality in higher education?
With the advancement in technology and services provided, there is a need to measure user satisfaction and standardize the process of such a measurement. We chose a higher educational system as a target domain, as the available literature of this type of measurement needs to be updated. Therefore, the present study addresses this gap by developing a reliable and valid scale for measuring ITSQ.
The paper is organized as follows: first the literature review is covered, then the methodology and results are explained. The last two sections are the discussion and conclusion.
Literature review
Standardizing services to maintain satisfaction is a goal in many fields. In the healthcare domain, for example, one article investigated critical satisfaction factors related to IT, suggesting information quality, system quality, service quality, professional maturity, and personal innovativeness as the umbrella factors [5]. That study provides valuable insights for decision-makers in healthcare. In fact, assessing the service quality of healthcare management from the patient's perspective is essential in the evaluation and implementation of corrective procedures. The paper [6] provides a systematic review of Multiple Criteria Decision-Making (MCDM) measures used for quality evaluation in healthcare services. MCDM measures are used widely in healthcare to reach decisions. The review covered 42 publications from 33 journals, spanning the 12 years from 2004 to 2016. That scale, however, is designed from a healthcare perspective, whereas our target domain is higher education.
Quality in HE
Higher education institutions face ever-increasing pressure to offer high-quality services to students, staff, and faculty; therefore, there is a need to measure how well those services are rated by consumers. Previous research focused only on ITES in higher education, which we chose as a starting point for developing and testing an instrument for IT service quality in higher education.
Considering how quality is assessed, European countries agreed that quality should be measured internally rather than by comparison, as standards vary across organizations and their goals [7]. Autonomy is nonetheless one of the aspects emphasized for quality in Europe. One of the most widely known methods for measuring service quality is SERVQUAL, a comprehensive measurement instrument. SERVQUAL started with ten dimensions, later reduced to five: reliability, responsiveness, assurance, empathy, and tangibles [8]. This scale, however, is neither tailored to IT nor customized for higher education.
A literature review of service quality (SQ), covering articles published from 1984 to 2017, examined the issues involved in service quality [1]. In general, quality was discussed from conceptualization and operationalization perspectives. Results show that different categories of services call for different quality measures. Healthcare, manufacturing, banking, information technology, and higher education are the top sectors investigated in the literature. "More than 60 models of the SQ have been identified. Service-driven capabilities may be structured along adaptation with strategic drivers and imperatives, learning and alignment, and problem structuring [1]."
In [2], the authors present the impact of information systems in enhancing the service quality of higher education institutions. In this study, they modeled the information system from the perspective of hardware, software, human, information, and knowledge components integrated into teaching, learning, administration, research, and community engagement.
To evaluate the quality of service in higher education at Marmara and Niğde Omer Halisdemir Universities, a screening model was used. The model consists of 28 items and 6 factors. The study was conducted during the 2016–2017 academic year, and data from 886 students were collected. Gender, grade, university, and academic success were used as personal variables. The results show that female students' perceptions were higher than male students' with respect to academic reputation and institutional image. Additionally, "the perceptions of students in 3rd grade were higher than students in 4th grade according to academic reputation, institutional image, offered diploma programs, and physical opportunities" [9]. This study focused on students' appreciation of the quality needed as they graduate.
Higher education service quality management and improvement were discussed in [10]. The paper applies Schneider and Bowen's model, which contains three layers of service organizations along with methods for service quality management and enhancement, to higher education institutions. Results show that the quality of service given to students by each employee who interacts with them depends greatly on top HE management and the different departments in the institution. The paper presents practical implications and recommendations on how to administer and improve service quality in higher education.
Students' satisfaction with and loyalty to their university were investigated in [11] under conditions of a highly competitive environment between higher education institutions. Data were collected through an online questionnaire from 14,870 students at the University of Pisa. The results provide valuable insight into how teaching and lectures, as well as course organization, are the key factors of students' satisfaction and loyalty.
Paper [12] studies how to measure satisfaction with an emphasis on sustainability, considering variables such as group membership (family, teachers, and pupils). After the factors of the instrument were confirmed, reliability was calculated through confirmatory factor analysis. Results show significant differences between groups. Therefore, the role performed by a quality education with sustainability equally includes teachers, students, and the surrounding communities. This role incorporates effective teaching methods, methods for encouraging members of the educational community, and the provision of knowledge, skills, attitudes, and necessary values. As the study indicates, the characteristics of individuals and their experiences with their education are all critical here.
A study [13] aims to measure academic quality improvement in higher education by examining the quality of medical courses in four medical universities and assessing them according to the nine dimensions of the Academic Quality Improvement Program (AQIP) model: a) supporting learning, b) achieving clear goals, c) understanding needs, d) appreciating people, e) leading and communicating, f) maintaining operations, g) measuring effectiveness, h) constant improvement, and i) building cooperative relationships. The study was conducted in Iran at Isfahan, Shahid Beheshti, Tehran, and Iran universities. The sample consisted of academic members, graduate students, and scientific board partners. Although the four universities ranked relatively high on the nine dimensions of the AQIP model, students and scientific board members differed in how they judged academic quality improvement.
A review of several quality constructs, their application, success, and flaws in higher education services [14] discusses Total Quality Management (TQM), Kaizen, Six Sigma, Lean, and Lean Six Sigma (LSS), comparing their value and their inadequacy in capturing quality within higher education. Specifically, the research focuses on identifying the successes and shortcomings of several quality constructs in higher education services. Findings show a demand for a more comprehensive instrument with complete constructs to measure the quality of service within the HE context.
ITES
Technology-enabled services (TESs) are perceived differently by customers, and [15] investigates passengers' perceived importance of various airline TESs. Established TESs, network access, and new TESs are the three categories of TESs, while the two dimensions of Technology Readiness (TR) are optimism and innovativeness. TR was found to be significantly coupled with the perceived importance of TESs. Passengers with higher levels of optimism rated established TESs as very important, while those rating highly on innovativeness regarded network access and new TESs as more important. The relationship between TR dimensions and the perceived importance of TESs was more apparent in passengers of low-cost airlines than in customers of full-service airlines [15].
IT-Enabled Services (ITES) are embedded in many fields, and higher education is now recognizing the importance of higher quality in ITES. The paper [16] identifies 17 factors for measuring the quality of ITES in higher education universities in Saudi Arabia: "accessibility, delivery of teaching, efficiency, information quality, inter-operability, privacy, security, response time, service reliability, service usability, site design, system integrity, user support, customization, functionality, trust, and usefulness" [16]. The authors used an analytic hierarchy process to extract the factors from publications. We used this article as a starting point for building the instrument for ITSQ.
To form the scale of IT service quality factors, we reviewed the articles that covered the recent advances and measures [1, 4, 16]. Some of these works suggest lists of factors [4, 16], while another evaluates services against the latest technology [1]. Overlapping factors and updated service requirements were the primary sources for the factors and items presented in this work.
Methodology
Our study is centered on a specific problem and is oriented to real-world practice. The main focus is to develop and validate an instrument that higher management can use to measure the quality of IT services in higher education. In accordance with these characteristics of our work, we used a mixed-methods approach based on pragmatism [17]. There were two phases in our research: a qualitative phase followed by a quantitative phase. Participants were required to sign a written informed "Consent Form" which explained the purpose of the research; the study included only adults. The two phases are depicted in Table 1 and explained in the following paragraphs.
In the first phase, an IT service quality measurement model for higher education was developed on the basis of a comprehensive literature review and expert interviews with academics and IT professionals. The authors' 2015 paper [16] had short-listed seventeen factors for IT-enabled services' quality from 102 general service quality factors. On the basis of the literature review, the seventeen factors of the Quality of ITES dimension were reviewed and updated in [7]. In our current work, in addition to IT-enabled services' quality, four more dimensions were identified for the IT service quality measurement model for higher education on the basis of the literature review. These dimensions were found to be consistent with the findings of our interviews with academics and experts.
Moreover, items for all dimensions were identified from the literature review. The content validity of the resulting dimensions and their items was ensured via subjective expert reviews and pre-survey tests. The expert panel consisted of two senior faculty members and one IT manager, recruited through email. After reviewing the initial questionnaire and giving their feedback, the experts were asked for an interview, which was conducted over Microsoft Teams for 45 minutes. The interview questions included: What is your opinion of the questionnaire in general? What are the most tangible and intangible items pertaining to ITSQ in your opinion? Which items would be good to include in the questionnaire? Interview notes were recorded by the interviewer (one of the authors) and then discussed with the coauthor.
The items were updated in light of the expert opinions and the pretest. A questionnaire was developed using a five-point Likert scale based on the five dimensions and the identified items. The survey was tested and refined before data collection; the test was done with a panel of three IT experts to ensure the clarity of the questions and their matching with the factors and dimensions.
At the end of the qualitative phase, we had identified the following dimensions and items:
- Quality of User support by the IT staff (9 items)
- Quality of Physical Environment in IT labs/classroom (7 items)
- Quality of Technical Environment in IT labs/classroom (12 items)
- Quality of ITES (28 items)
- IT facilities for distance learning (9 items)
The dimensions and items are given in Table 2.
The second phase of our research was based on quantitative methods. Once the questionnaire was developed, it was used to collect data. The purpose of the survey was to empirically test the proposed scale and to test its reliability and validity.
Data collection
An Institutional Review Board (IRB) exemption was granted by the Princess Nourah bint Abdulrahman University IRB under project number [2–0304] because there is no identifiable information in our questionnaire. Upon receiving the IRB exemption, the survey was conducted among a target random sample of students at the College of Computer and Information Sciences at Princess Nourah Bint Abdulrahman University, Riyadh, Saudi Arabia. The sample size was determined according to the requirements of factor analysis; Field [18] recommends a sample size of 300 or more for stable factor analysis. The data were collected in person by visiting classrooms and administering the survey; students were asked to click on a web link to fill in the survey form. Approximately 500 students filled in the survey, and after cleaning, 371 usable and complete questionnaires were obtained, well above the limit prescribed by Field [18].
Data analysis
The sample size of 371 is acceptable for the use of factor analysis, exceeding the 300 recommended by Field [18] for stable results. Factor analysis is supplemented by average variance extracted analysis, composite reliability, and fit index calculations.
Factor analysis
Factor analysis was done for the five sets of items belonging to the five dimensions of IT service quality to confirm the relationships. The factor loading of each item was calculated to assess the extent to which the item is related to its dimension, using the correlation coefficient for the five dimensions and their 65 items, as shown in Table 1. A factor loading of 0.7 or higher indicates that the item is suitable for the dimension; therefore, the cut-off point for factor loading was chosen as 0.7 [19], although three values very close to 0.7 were retained (0.685, 0.699, and 0.695). On the basis of this criterion, 21 items were deleted across the dimensions.
The deleted items are also indicated in Table 2. The factor loading for the remaining 44 items is given in Table 3.
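The paper reports the resulting loadings in Table 3 without giving code; the item-total correlation approach and the 0.7 cut-off described above can be sketched roughly as follows. This is an illustrative approximation only (the function names `item_loadings` and `retain` are ours, and the study's actual CFA would be fitted with dedicated SEM software):

```python
import numpy as np

def item_loadings(items: np.ndarray) -> np.ndarray:
    """Approximate each item's loading as its correlation with the
    dimension's composite score. Items are z-scored first so that
    differing item variances do not distort the composite.
    items: (n_respondents, n_items) matrix of Likert responses."""
    z = (items - items.mean(axis=0)) / items.std(axis=0)
    composite = z.mean(axis=1)
    return np.array([np.corrcoef(z[:, j], composite)[0, 1]
                     for j in range(z.shape[1])])

def retain(loadings, cutoff: float = 0.7) -> list:
    """Indices of items meeting the 0.7 cut-off used in the paper."""
    return [j for j, lam in enumerate(loadings) if lam >= cutoff]
```

Items falling below the cut-off would be dropped from their dimension, mirroring the deletion step described above.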
Average variance extracted analysis
Average Variance Extracted (AVE) analysis was used to establish construct validity. The AVE analysis is shown in Table 4. The square root of every dimension's AVE value is larger than any correlation between any pair of dimensions.
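The check described above is the standard Fornell–Larcker criterion. As a sketch under that assumption (the function names are ours, and the loadings and correlation matrix would come from the fitted CFA), AVE and the discriminant-validity comparison can be computed as:

```python
import numpy as np

def ave(loadings) -> float:
    """Average Variance Extracted: the mean of the squared
    standardized loadings of a dimension's items."""
    lam = np.asarray(loadings, dtype=float)
    return float(np.mean(lam ** 2))

def fornell_larcker_ok(aves, dim_corr) -> bool:
    """Discriminant validity holds when the square root of each
    dimension's AVE exceeds that dimension's correlation with
    every other dimension."""
    root = np.sqrt(np.asarray(aves, dtype=float))
    r = np.asarray(dim_corr, dtype=float)
    k = len(root)
    return all(root[i] > abs(r[i, j])
               for i in range(k) for j in range(k) if i != j)
```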
Composite reliability
Composite reliability, also called construct reliability, is a measure of internal consistency in scale items, similar to Cronbach's alpha [20]. Cut-off points for composite reliability suggested in the literature are 0.60 and above. Values for composite reliability are given in Table 5.
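The paper does not print the formula, but composite reliability is conventionally computed from standardized loadings as (Σλ)² / ((Σλ)² + Σ(1 − λ²)); a sketch of that standard calculation:

```python
def composite_reliability(loadings) -> float:
    """Construct reliability from standardized factor loadings,
    assuming uncorrelated measurement errors: the squared sum of
    loadings divided by itself plus the summed error variances."""
    s = sum(loadings)
    error = sum(1.0 - lam ** 2 for lam in loadings)
    return s ** 2 / (s ** 2 + error)
```

For example, three items each loading at 0.8 give a composite reliability of about 0.84, above the 0.60 threshold cited above.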
Fit indices
Fit indices measure the degree of overall fit of a model to the data; their values indicate whether the model is good enough for practical purposes. Fit indices are developed based on test statistics [21]. Four fit indices were calculated to assess the factor analysis; they are given in Table 6.
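The section above does not name the four indices, so as an illustrative sketch of two widely used chi-square-based indices (RMSEA and CFI; the choice of indices and the function names are ours, not necessarily those reported in Table 6):

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Root Mean Square Error of Approximation: misfit (chi-square
    in excess of its degrees of freedom) scaled by sample size.
    Values near 0.05 or below conventionally suggest close fit."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2: float, df: int, chi2_null: float, df_null: int) -> float:
    """Comparative Fit Index: improvement of the fitted model over
    the independence (null) model. Values above 0.95 conventionally
    suggest good fit."""
    d = max(chi2 - df, 0.0)
    d_null = max(chi2_null - df_null, d)
    return 1.0 - d / d_null if d_null > 0 else 1.0
```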
Discussion
At the end of the qualitative phase, we had identified sixty-five items spread across five dimensions. Confirmatory factor analysis was done to confirm the relationship of items and dimensions using a factor loading cut-off point of 0.7 [19]. As a result of the factor analysis, 20 items had factor loadings less than 0.7 and were dropped. The number of dropped items in each dimension is:
- Quality of User support by the IT staff (3 out of 9 items were dropped)
- Quality of Physical Environment in IT labs/classroom (1 out of 7 items was dropped)
- Quality of Technical Environment in IT labs/classroom (8 out of 12 items were dropped)
- Quality of ITES (6 out of 28 items were dropped)
- IT facilities for distance learning (2 out of 9 items were dropped)
The dropped items are indicated in Table 1. The number of remaining items in the five dimensions was:
- Quality of User support by the IT staff (6 items)
- Quality of Physical Environment in IT labs/classroom (6 items)
- Quality of Technical Environment in IT labs/classroom (4 items)
- Quality of ITES (22 items)
- IT facilities for distance learning (7 items)
The factors and their items are listed in Table 2.
For the remaining items, construct validity, construct reliability, and four fit indices were calculated as explained above. The purpose of these calculations was to establish the validity of the proposed scale.
Our study is one of the first empirical studies to propose a model to measure IT service quality in higher education. This research fills a gap between the theory and practice of IT service quality measurement in higher education. The results show that universities can use this model for IT service quality measurement.
The model was finalized based on data collected from female Saudi students only. It is recommended to include the perspective of the other stakeholders in a future study. The model can be further refined by repeating the study in different cultures.
Conclusions
This paper describes a study to develop a feasible and practical instrument for measuring the level of IT service quality at a higher education institution. The study aims to determine the elements and factors to consider when designing and implementing an IT service quality scale. Establishing such a standard measure of quality is the first step toward understanding an institution's services and people's satisfaction.
The main features for measuring ITSQ are: user support by the IT staff, the physical environment in IT labs/classrooms, the technical environment in IT labs/classrooms, ITES, and IT facilities for distance learning. We used a mixed-methods approach for creating and validating the scale. Our data point to the benefits and promising directions of improving the quality of IT services. Our results suggest that quality is more than just delivering a service, tracking, and logging; it has the potential to advance and affect people's use of technology. Our study contributes a measurement scale for IT Service Quality that can be utilized in higher education institutions, especially given the major shift to online education and digital accessibility after COVID-19.
The instrument proposed and validated in this paper can help HEIs in many ways. Institutions can use it to measure and improve their IT service quality. The instrument can also be used for guidance when an organization wants to develop Key Performance Indicators for its IT service quality. If an institution decides to outsource its IT services, the instrument can help in deciding the clauses of service level agreements.
Future work and limitations
Future studies can incorporate the number of claims and the accessibility of web services into the scale. There is also a plan to implement the scale in two different organizations to compare different sizes or different sectors, such as public and private organizations. A case study could also utilize the scale to measure the improvement of services before and after changes are made.
Another suggested direction for future work is the measurement of IT service quality from the perspective of persons with disabilities. Accessibility principles for persons with disabilities have been suggested in the literature [22], and there is a need to assess IT service quality with respect to these principles.
References
- 1. Prakash G., “Understanding service quality: insights from the literature,” Journal of Advances in Management Research, vol. 16, no. 1, pp. 64–90, Jan. 2019.
- 2. Gunawardhana N., "Improving the Service Quality of Higher Education Institutions: Special reference to Information Systems," International Journal of Advanced Studies in Computer Science and Engineering, vol. 7, pp. 13–17, Nov. 2018.
- 3. Dicker R., Garcia M., Kelly A., and Mulrooney H., “What does ‘quality’ in higher education mean? Perceptions of staff, students and employers,” Studies in Higher Education, pp. 1–13, Mar. 2018.
- 4. Aziz R. and Shahzad B., "Quality of IT Enabled Services in Higher Education Institutions in Saudi Arabia," 2016. https://www.semanticscholar.org/paper/Quality-of-IT-Enabled-Services-in-Higher-Education-Aziz-Shahzad/3ab58663591bd84f6c647768939d0f16a3c732cc (accessed Apr. 06, 2020).
- 5. Nelson R., "Best Practices When Implementing ITIL," 2021. [Online]. https://www.mountainview-itsm.com/itil-training/downloads/ca_itsm.pdf
- 6. Mutlu M., Tuzkaya G., and Sennaro B., "Multi-criteria decision making techniques for healthcare service quality evaluation: A literature review," p. 12, 2017.
- 7. Loukkola T. and Zhang T., Examining Quality Culture. Part 1. Brussels: European University Association, 2010.
- 8. Miklós P., Haddad H., Nagy J., Popp J., and Oláh J., “The Service Quality Dimensions that Affect Customer Satisfaction in the Jordanian Banking Sector,” Sustainability, vol. 11, Feb. 2019.
- 9. Ada S., Baysal Z., and Şahenk Erkan S., “An Evaluation of Service Quality in Higher Education: Marmara and Niğde Omer Halisdemir Universities’ Department of Education Students,” Universal Journal of Educational Research, vol. 5, pp. 2056–2065, Nov. 2017.
- 10. Sharabi M., “Managing and improving service quality in higher education,” International Journal of Quality and Service Sciences, vol. 5, Aug. 2013.
- 11. Masserini L., Bini M., and Pratesi M., “Do Quality of Services and Institutional Image Impact Students’ Satisfaction and Loyalty in Higher Education?,” Social Indicators Research: An International and Interdisciplinary Journal for Quality-of-Life Measurement, vol. 146, no. 1, pp. 91–115, 2019.
- 12. del C. Olmos-Gómez M., Suárez M. L., Ferrara C., and Olmedo-Moreno E. M., “Quality of Higher Education through the Pursuit of Satisfaction with a Focus on Sustainability,” Sustainability, vol. 12, no. 6, pp. 1–20, 2020.
- 13. Yarmohammadian M. H., Mozaffary M., and Esfahani S. S., “Evaluation of quality of education in higher education based on Academic Quality Improvement Program (AQIP) Model,” Procedia—Social and Behavioral Sciences, vol. 15, pp. 2917–2922, Jan. 2011.
- 14. Sunder MV., “Constructs of quality in higher education services,” International Journal of Productivity and Performance Management, vol. 65, no. 8, pp. 1091–1111, Jan. 2016.
- 15. Wang Y., So K. K. F., and Sparks B., “What Technology-Enabled Services Do Air Travelers Value? Investigating the Role of Technology Readiness,” Journal of Hospitality and Tourism Research, Jul. 2014.
- 16. Romana A. and Shahzad B., "Factors for Measurement of ITES Quality for Higher Education Institutions in Saudi Arabia." https://globaljournals.org/item/4731-factors-for-measurement-of-ites-quality-for-higher-education-institutions-in-saudi-arabia (accessed Apr. 06, 2020).
- 17. Creswell J. W. and Creswell J. D., Research Design: Qualitative, Quantitative, and Mixed Methods Approaches, 5th ed. Los Angeles: SAGE Publications, 2018.
- 18. Field A., Discovering Statistics Using IBM SPSS Statistics. SAGE Publications. https://uk.sagepub.com/en-gb/eur/discovering-statistics-using-ibm-spss-statistics/book257672 (accessed Mar. 31, 2021).
- 19. Hair J. F. Jr., Black W. C., Babin B. J., and Anderson R. E., Multivariate Data Analysis, 7th ed. Upper Saddle River, NJ: Pearson, 2009.
- 20. Netemeyer R. G., Bearden W. O., and Sharma S., Scaling Procedures: Issues and Applications. SAGE Publications, 2003.
- 21. Korstanje J., "Structural Equation Modeling," Medium, Jun. 27, 2021. https://towardsdatascience.com/structural-equation-modeling-dca298798f4d (accessed Oct. 02, 2022).
- 22. Veljanovska K., Blazheska-Tabakovska N., Ristevski B., and Savoska S., "User Interface for e-learning Platform for Users with Disability," p. 14.