
Approach in inputs & outputs selection of Data Envelopment Analysis (DEA) efficiency measurement in hospitals: A systematic review

  • M. Zulfakhar Zubir,

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Software, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliations Medical Development Division, Ministry of Health Malaysia, Putrajaya, Malaysia, Department of Public Health Medicine, Faculty of Medicine, Universiti Kebangsaan Malaysia, Kuala Lumpur, Malaysia

  • A. Azimatun Noor,

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    azimatunnoor@ppukm.ukm.edu.my

    Affiliation Department of Public Health Medicine, Faculty of Medicine, Universiti Kebangsaan Malaysia, Kuala Lumpur, Malaysia

  • A. M. Mohd Rizal,

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliation Department of Public Health Medicine, Faculty of Medicine, Universiti Kebangsaan Malaysia, Kuala Lumpur, Malaysia

  • A. Aziz Harith,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Writing – original draft, Writing – review & editing

    Affiliation Occupational and Aviation Medicine Department, University of Otago Wellington, Wellington, New Zealand

  • M. Ihsanuddin Abas,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Writing – original draft, Writing – review & editing

    Affiliation Department of Public Health, Faculty of Medicine, Universiti Sultan Zainal Abidin, Terengganu, Malaysia

  • Zuriyati Zakaria,

    Roles Writing – original draft, Writing – review & editing

    Affiliation Medical Development Division, Ministry of Health Malaysia, Putrajaya, Malaysia

  • Anwar Fazal A. Bakar

    Roles Writing – original draft, Writing – review & editing

    Affiliations Department of Public Health Medicine, Faculty of Medicine, Universiti Kebangsaan Malaysia, Kuala Lumpur, Malaysia, Medical Practice Division, Ministry of Health Malaysia, Putrajaya, Malaysia

Abstract

The efficiency and productivity evaluation process commonly employs Data Envelopment Analysis (DEA) as a performance tool in numerous fields, including the healthcare industry (hospitals). This review therefore examined hospital-based DEA articles, focusing on input and output variable selection approaches and recent DEA developments. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology was utilised to extract 89 English articles containing empirical data published between 2014 and 2022 from six databases (Web of Science, Scopus, PubMed, ScienceDirect, Springer Link, and Google Scholar). The DEA model parameters were determined using information from previous studies, while the approaches were identified narratively. This review grouped the approaches into four categories: literature review, data availability, systematic method, and expert judgement. These approaches were applied either as an independent single strategy or in combination with other methods. The focus of this review on methodologies employed in hospitals could limit its findings; alternative approaches or techniques could be utilised to determine the input and output variables for a DEA analysis in a distinct area or from a different perspective. The DEA application trend was broadly similar to that reported in previous studies. Meanwhile, insufficient evidence was found to support any single DEA model as fitting all model parameters. Therefore, several recommendations and methodological principles for DEA were proposed after analysing the existing literature.

1. Introduction

Efficiency is a well-established concept in the field of economics. Farrell proposed that efficiency measurement should consider all inputs and outputs, avoiding index number issues and providing practical calculation methods. Hence, efficiency-related articles have attracted significant attention from various fields, including statistics, economics, healthcare, and medicine [1]. Nevertheless, there is insufficient agreement regarding the optimal method for measuring efficiency. For example, various methodologies are often used for efficiency-related articles in health facilities, including data envelopment analysis (DEA), stochastic frontier analysis (SFA), Pabon Lasso, and ratio analysis [2–4]. The World Health Organization (WHO) also introduced a unique approach to evaluating the effectiveness of healthcare systems in the Global Programme on Evidence for Health Policy Discussion Paper Series. Compared to a previous study in this field [5], this approach introduced numerous objectives of the healthcare system, including responsiveness (level and distribution), fair finance, health inequality, and the more traditional goal of improving population health.

Another report by the WHO evaluated the performance of the health system by assessing how well national health systems achieved three main objectives: good health, responsiveness to the expectations of the population, and fairness of financial contribution [6]. Despite the acknowledgement of this method, disagreement and criticism have occurred over the methodology employed. Conversely, a consensus has been observed regarding the importance of accurately directing these assessments, performing a more critical analysis, adopting a more constructive approach, and facilitating a crucial dialogue among stakeholders in the healthcare system [7–9]. Recently, the DEA has been used to compute the effectiveness of healthcare systems in 180 countries. This assessment is based on six key dimensions: clinical outcomes, health-adjusted life years, access, equity, safety, and resources [10]. Stakeholders must comprehend that a universally applicable efficiency metric for all healthcare systems is impossible. Therefore, a comprehensive understanding of the institutional arrangements, data, and measurements is necessary to select suitable measures, resources, and other health system components.

A framework is required following the analysis process. Performance measurement should not be implemented merely as a minor adjustment supporting one aspect of health system outcomes; instead, it should serve as a general strategy for gauging performance across the various system components [11, 12]. Numerous indicators, such as activity and expense comparison measures, are also available to assess whether limited health resources are utilised most efficiently. The primary focus of these indicators is on quantitative metrics for evaluating hospital performance. Furthermore, the quality of hospital services can be examined using various indicators [13, 14]. Efficiency comparisons can also be assessed objectively using techniques grounded in solid economic theory. Currently, the DEA and SFA approaches are frequently applied to measure the efficiency of the healthcare industry [11]. Since the publication of Nunamaker’s study, these strategies have been widely used in healthcare settings over the past 40 years [15–19]. Although theoretical and methodological limitations have been acknowledged in DEA, this method has attracted interest from researchers who aim to address those limitations. Hence, these studies have developed multiple methods integrating DEA with other statistical techniques and methodologies to improve efficiency evaluation [20, 21].

1.1 DEA as an efficiency analysis tool in hospital

The DEA is a mathematical technique for assessing the relative efficiency of homogeneous decision-making units (DMUs) with multiple input and output variables. Initially, this method was developed within operations research and econometrics. The effectiveness of a DMU is then evaluated relative to that of the other members of the group. Nevertheless, one drawback of the DEA is its non-parametric and deterministic nature, meaning that outliers and measurement errors can strongly influence the results. Meanwhile, an efficient DMU usually involves maximum output production while utilising the same input levels as all other DMUs [17, 22]. Various DEA-related articles have denoted this formulation as the Charnes, Cooper, and Rhodes (CCR) model or the constant returns to scale (CRS) assumption. This assumption allows for examining the input–output correlation without considering any congestion effects, indicating that the outputs can present a precise linear correlation with the inputs [10, 23].

Banker, Charnes, and Cooper expanded the CCR model and the CRS assumption through the BCC model and the variable returns to scale (VRS) assumption. This assumption suggests that economies of scale shift as DMU size increases [23, 24]. The DEA approach also considers the model orientation (input- or output-oriented) alongside the model type and the returns to scale assumption. For example, under the input orientation assumption, a DMU has more control over its inputs than its outputs. Nonetheless, it can be argued that organisations can improve their outputs by utilising their inputs efficiently [23, 25]. Hence, the input and output variables should be carefully considered when using the DEA to measure the effectiveness of a DMU or an organisation. This suggestion indicates that a precise, thorough, pertinent, and appropriate selection and combination of the input and output variables is necessary to effectively portray the functionality of a hospital while meeting the stakeholders’ expectations and assessing its efficiency [18, 21]. Numerous advanced analyses have also been incorporated into DEA, such as the advanced CCR and BCC models, longitudinal or window analysis (Malmquist index), and statistical analysis (regression and bootstrapping methods) [20, 23, 25–27].
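The CCR model described above reduces to a small linear program per DMU. The following is a minimal sketch (not drawn from any of the reviewed articles) of an input-oriented CCR (CRS) model solved with SciPy; the three-hospital bed/discharge data are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, o):
    """Input-oriented CCR (CRS) efficiency score of DMU o.
    X: (n_dmus, n_inputs) input matrix, Y: (n_dmus, n_outputs) output matrix.
    Solves: min theta  s.t.  sum_j lam_j * x_j <= theta * x_o,
                             sum_j lam_j * y_j >= y_o,  lam >= 0."""
    n, m = X.shape
    _, s = Y.shape
    # Decision vector: [theta, lambda_1 .. lambda_n]; minimise theta.
    c = np.r_[1.0, np.zeros(n)]
    # Input constraints: sum_j lam_j x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(-1, 1), X.T])
    # Output constraints: -sum_j lam_j y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun  # theta: 1.0 means efficient, < 1.0 inefficient

# Hypothetical data: one input (beds), one output (discharges).
X = np.array([[2.0], [4.0], [8.0]])
Y = np.array([[2.0], [4.0], [4.0]])
scores = [dea_ccr_input(X, Y, o) for o in range(len(X))]
```

Adding the equality constraint that the lambda weights sum to one (`A_eq=[[0.0] + [1.0] * n], b_eq=[1.0]`) turns this into the BCC (VRS) model discussed above.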

1.2 Input and output selections for hospital-based DEA applications

Multiple articles have demonstrated the practicality and potential of DEA in evaluating hospital efficiency [28–31]. Although DEA score comparisons across several hospital-based articles can produce helpful hypotheses, significant drawbacks are observed as follows:

  1. The input and output metrics vary across different timeframes.
  2. The DEA score distribution is highly skewed, rendering it inaccurate to rely on standard measures of central tendency.
  3. The output metrics in the articles present significant divergence from each other.
  4. The hospital production models and types possess substantial differences.

Certain hospital-based articles have reported that innovative strategies can provide valuable insights to decision-makers [23, 32]. These articles have also included the DEA for hospital-based applications. Generally, DEA-based applications involve healthcare performance measurement [15, 16, 18], categorisation or clustering of DEA techniques [20, 33], comparison of DEA with other methods, countries, or durations [28–30, 34], and development of novel knowledge and approaches concerning DEA assessment [17, 21]. Likewise, each stage in a systematic literature review (SLR) employs organised, transparent, and reproducible techniques to identify and comprehensively integrate relevant articles on a particular topic. The reviewer’s methodology is meticulously recorded, allowing readers to track and evaluate the decisions and actions taken [35]. Although numerous hospital-based DEA articles have been published, a complete analysis of them is still lacking, leaving a research gap that warrants further investigation.

This review investigated various hospital-based DEA articles regarding the selection of the most suitable input and output variables. Notably, hospital institutions were chosen due to the significant challenges in assessing their efficiency, which are further complicated by the dynamic nature of service production and variation across providers [25, 36, 37]. To the authors’ knowledge, no reviews of hospital-based DEA articles involving optimal input and output variable selection have been reported. Thus, this review addressed this research gap by observing the current trends in hospital-based DEA analyses. The remainder of this review is structured as follows: Section 2 describes the methodology used and the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statements. Section 3 presents the literature review of the relevant articles and their corresponding discussions concerning the input and output variable selections for hospital-based DEA applications. Finally, Section 5 highlights the limitations and conclusions of this review.

2. Methodology

This section discusses the methodology used to obtain relevant hospital-based DEA articles. The PRISMA methodology was applied across the Web of Science, Scopus, PubMed, ScienceDirect, Springer Link, and Google Scholar databases. The process comprised the SLR, the eligibility and exclusion criteria, the review stages (identification, screening, and eligibility), and the data abstraction and analysis.

2.1 PRISMA

The PRISMA statement concisely collects the components required for documenting SLRs and meta-analyses based on supported evidence. Even though this approach typically focuses on reporting reviews evaluating intervention effects, it can also be used as a basis for publishing SLRs with objectives other than assessing interventions (Appendices A and B) [38]. Hence, a comprehensive manual on the SLR methodological approach is required for future researchers. This SLR began with developing and verifying the review method, publication standard, and reporting standard or guidance. These elements provide a systematic guideline outlining the factors that researchers must consider during the review process [39].

2.2 Journal databases

Various articles published from 2014 until 2022 were obtained on 5th April 2023, using six databases: Web of Science, Scopus, PubMed, ScienceDirect, Springer Link, and Google Scholar. The analysis of the search engines revealed significant performance discrepancies, indicating the absence of an optimal search approach. Therefore, searchers must be well-trained, capable of evaluating the strengths and weaknesses of a system, and able to determine where and how to search based on that information to use them effectively. The six databases were selected based on their potential to provide a meticulously curated medical database with recall-enhancing features, tools, and other alternatives to optimise precision [40, 41].

2.3 Identification

This systematic review process comprised four stages (identification, screening, quality appraisal, and analysis). Several search terms were identified during the first stage, which involved searching previous articles using various terms: “efficiency*”, “performance*”, “productivity*”, “benchmark*”, “hospital*”, “data envelopment analysis”, and “DEA”. The search string was modified according to the requirements of each database. The records were exported from the databases into a Microsoft Excel sheet for screening. The final query string is as follows:

  1. (((("hospital") AND ("efficiency")) OR (("hospital") AND ("performance")) OR (("hospital") AND ("benchmark")) OR (("hospital") AND ("productivity"))) AND (("data envelopment analysis") OR ("DEA")))
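As an illustration, the final boolean string can be assembled programmatically from the two term groups; this is a sketch of how such a query might be generated, not a script described by the review itself.

```python
# Sketch: assemble the final boolean query from the two term groups.
context_terms = ["efficiency", "performance", "benchmark", "productivity"]

# Each context term is paired with "hospital"; the pairs are OR-ed together
# and then AND-ed with the method terms.
pairs = " OR ".join(f'(("hospital") AND ("{t}"))' for t in context_terms)
methods = '("data envelopment analysis") OR ("DEA")'
query = f"(({pairs}) AND ({methods}))"
print(query)
```

Generating the string this way makes it easy to adapt the query to each database's syntax requirements while keeping the term lists in one place.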

2.4 Screening

The inclusion and exclusion criteria were established in this review. The titles and abstracts were independently screened by three reviewers. Only articles containing empirical data were initially selected; this process excluded review articles (SLR and SR), book series, books, book chapters, and conference proceedings. Non-English articles were then excluded from the search, avoiding any ambiguity or difficulty in translation. Subsequently, a nine-year duration (2014–2022) was chosen to observe significant developments in research and relevant articles. This duration also functioned as a continuation of previous studies by O’Neill et al. (1984–2004), Cantor and Poh (1994–2017), and Kohl et al. (2005–2016). Consequently, 89 articles were finalised for the quality appraisal stage. Fig 1 depicts the PRISMA diagram, which provides a detailed description of the entire search procedure.

2.5 Quality appraisal

A quality appraisal stage was conducted to ensure that the methodology and analysis of the selected articles were performed satisfactorily. This process employed two quality appraisal tools: knowledge transfer [29, 42] and economic evaluations and efficiency measurement [43, 44]. Mitton et al. developed a 15-point scale covering several topics: literature evaluation, research gap identification, question, design, validity and reliability, data collection, population, sampling, and result analysis and report. These criteria were evaluated using a score range between 0 and 3: 0 for not present or reported, 1 for present but of low quality, 2 for present and of mid-range quality, and 3 for present and of high quality [42].

Another checklist by Varabyova and Müller employed four dimensions: reporting, external validity, bias, and power. All items on the quality assessment checklist were assigned a score of either 0 (indicating no or unclear) or 1 (indicating yes). One item in the checklist also focused on conducting a second-stage analysis to investigate potential sources of bias in the study. The articles with and without second-stage analysis received maximum scores of 14 and 13, respectively. This checklist assessed the article from an economic perspective to ensure the findings could be used in policy analysis and managerial decisions. Only the items relevant to the design of the article were utilised to establish the maximum score (100%) for each study [44]. Overall, no recognised standards for assessing the planning or implementation of research on healthcare efficiency indicators were recorded. Thus, the scientific soundness of the chosen article was investigated using two tools to improve robustness and minimise bias. Two co-authors from different institutions evaluated each selected article separately using both tools to enhance reliability. A third reviewer was then requested to assess an article if a disagreement occurred.
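The “maximum score over applicable items only” rule above can be sketched as a small helper; the item names below are illustrative placeholders, not the actual checklist wording.

```python
# Sketch: normalise a checklist score over applicable items only.
# Item names are hypothetical, not the actual checklist wording.
def appraisal_percent(item_scores):
    """item_scores: dict mapping item -> 0/1 score, or None when the item
    does not apply to the study design (e.g. no second-stage analysis)."""
    applicable = [v for v in item_scores.values() if v is not None]
    return 100.0 * sum(applicable) / len(applicable)

study = {
    "reporting": 1,
    "external_validity": 1,
    "bias_second_stage": None,  # not applicable: no second-stage analysis
    "power": 0,
}
score = appraisal_percent(study)  # 2 of 3 applicable items
```

Dropping non-applicable items before normalising is what allows articles with and without a second-stage analysis (maximum raw scores of 14 and 13) to be compared on the same 100% scale.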

2.6 Data extraction and analysis

The selected articles were subjected to examination and analysis. Specific articles were also prioritised to meet the objectives directly. The data extraction process could be performed by reading the abstracts and the entire article. Meanwhile, content, quantitative, and qualitative analyses were used to determine the input and output selection approaches for the hospital-based DEA articles. Four reviewers independently extracted the data using a standardised extraction form organised in Microsoft Excel. The information in this form included publication year, country of study, studied hospital type, number of hospitals, number of observations (DMUs), model type, returns to scale, model orientation, measured efficiency type, input, output, number of models, application of second-stage analysis, and approaches used in selecting input or output variables.

2.7 Statistical analysis

In evaluating the studies, the intra-class correlation (ICC) was used to measure the agreement between the two raters (co-authors). This process examined the dependability of ratings by comparing the variability of different evaluations of the same subject with the overall variation observed across all ratings and subjects. Each evaluation was quantitative. The ICC coefficients for Mitton et al.’s 15-point scale and Varabyova and Müller’s economic evaluation and efficiency measurement checklist were 0.956 and 0.984, respectively. No articles were excluded at this stage, as the review encompassed qualitative and quantitative aspects. Nonetheless, highly rated articles were weighted more heavily during data analysis and result interpretation.
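The inter-rater agreement described above can be computed from the ratings matrix. The following is a sketch of ICC(2,1) (two-way random effects, absolute agreement, single rater); the review does not state which ICC form was used, and the data below are the classic Shrout–Fleiss illustration (6 subjects, 4 raters), not the review's actual scores.

```python
import numpy as np

def icc2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    ratings: (n_subjects, k_raters) array of quantitative scores."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means
    ss_total = ((ratings - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                 # between-subjects mean square
    msc = ss_cols / (k - 1)                 # between-raters mean square
    mse = ss_err / ((n - 1) * (k - 1))      # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Classic Shrout & Fleiss (1979) illustration data, not the review's scores.
ratings = [[9, 2, 5, 8], [6, 1, 3, 2], [8, 4, 6, 8],
           [7, 1, 2, 6], [10, 5, 6, 9], [6, 2, 4, 7]]
icc = icc2_1(ratings)  # ~0.29 for this data set
```

With only two raters, as in this review, `k` is simply 2 and the same formula applies.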

3. Results

All 89 articles included in this analysis were retrospective studies published between 2014 and 2022. Appendices C and D contain a comprehensive summary of all the selected articles.

3.1 Efficiency analysis

The efficiency analysis in DEA quantifies the performance of a set of DMUs. Given that the definition of a DMU is generic and broad, this review focused on the “hospital”. Typically, the four main efficiency concepts are technical, scale, pricing, and allocative efficiencies [25]. Certain studies also described efficiency as technical, pure, scale, allocative, cost, and congestion. The DEA could perform efficiency analysis at a single point in time or over time [26]. Thus, the data could be categorised as cross-sectional (single period) or longitudinal (panel data). Specifically, the longitudinal analysis of DEA utilised two approaches to quantify efficiency: the Malmquist Productivity Index (MPI) and Window Analysis (WA).

Out of the 89 articles, a significant portion (32.58%, 29 of 89) focused only on evaluating hospital performance using Pure Technical Efficiency (PTE) [45–73]. This metric is defined as the effectiveness of an input set producing an output on the VRS frontier [49, 68]. Hence, a hospital is deemed technically efficient when it generates the highest quantity of outputs using the fewest inputs. The overall Technical Efficiency (TE) is determined by multiplying the Scale Efficiency (SE) and PTE [74, 75]. Generally, TE refers to the efficiency measured under the CRS production frontier. In contrast, SE measures how much a unit deviates from an optimal scale, which is the region of CRS in the correlation between outputs and inputs [76, 77]. Hence, the equation for TE is expressed as follows:

TE = PTE × SE
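As a numeric sketch of the decomposition TE = PTE × SE (the scores below are hypothetical, for a single illustrative hospital):

```python
# Hypothetical scores for one hospital, illustrating TE = PTE x SE.
te_crs = 0.72    # technical efficiency under the CRS frontier
pte_vrs = 0.90   # pure technical efficiency under the VRS frontier
se = te_crs / pte_vrs  # scale efficiency recovered from the two scores
print(round(se, 2))    # 0.8
```

A hospital can thus be fully efficient in the pure technical sense (PTE = 1) yet still have TE < 1 when it operates at a suboptimal scale (SE < 1).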

Fig 2 provides a geometric representation of the concepts involved in efficiency measurement using the DEA. Of the 89 articles, 26 (29.21%) measured the overall TE [76–101]. Additionally, 24 (26.97%) computed the TE, PTE, and SE [74, 75, 102–125]. Even though the remaining articles were assessed by combining TE and PTE, certain articles did not explicitly specify the tested efficiency type (see S5 Appendix).

3.2 Model parameters

The DEA was applied using four considerations specified by the researcher: model type, technological assumption of the delivery process, model orientation, and input-output combination [112, 121]. This model could be further analysed or extended through a second stage or integrated with other statistical methods. Consequently, this process could improve efficiency measurement, understanding of the variation or difference in organisational performance, and evaluation of the productivity of the organisation over a specific period [91, 110, 119]. Considering that the performance was analysed over a certain period, the data type was also essential.

3.2.1 Model type.

The DEA has been utilised to assess the performance of various entities involved in diverse activities under different circumstances. This process leads to numerous models and extensions explaining the intricate and frequently unpredictable correlations between multiple inputs and outputs in organisational activities or production [106, 124]. Hence, these models can be described as basic DEA and extension models. Certain articles have also denoted the model as Radial, Non-radial and Oriented, Non-radial and Non-oriented, and Radial and Non-radial [23, 125]. Most articles in this review (80.90%, 72 of 89) used Radial DEA models [45, 47–51, 53–61, 64–76, 78, 79, 81–85, 88–93, 95–99, 101–115, 117–123, 126–129].

The BCC, CCR, or a mixture of both models were used to measure efficiency by examining the radial changes in input and output values. Nevertheless, only 7.87% (7 of 89) [46, 52, 62, 63, 94, 100, 130] or 4.49% (4 of 89) [80, 86, 87, 124] employed Non-radial and Oriented or Non-radial and Non-oriented models, respectively. The Non-radial model deviated from the conventional approach of proportional input or output changes and instead focused on addressing slacks directly. Only one article was observed using the Radial and Non-radial models [77], while one combined Radial, Non-radial, and Oriented models to measure efficiency [116]. The remaining four articles did not explicitly specify the model employed in the study (see S6 Appendix) [131–134].

3.2.2 Model orientation.

Orientation refers to the specific direction in which input or output is measured to determine efficiency. The primary evaluation objective is to either increase output or decrease input. Most articles in this review (55.06%, 49 of 89) applied input-orientated DEA models [45–47, 49–53, 57, 59, 63–68, 70–72, 74–76, 79, 90–93, 95, 97, 98, 102–106, 108–110, 112–115, 117, 120, 123, 126, 127, 129, 134]. These articles selected the input orientation to align with the standard practice in healthcare facilities of minimising inputs while achieving a desired output level. Thus, the organisation acquired minimal or non-existent authority over the output [45, 50, 109].

Approximately 25.84% (23 of 89) of the articles presented contradictory findings [48, 54–56, 58, 60, 61, 69, 73, 78, 81–85, 101, 107, 111, 116, 118, 121, 122, 133]. Given the fixed and non-flexible nature of the input, the organisation should strive to raise its output. This outcome implied that output-orientated DEA models were more appropriate in their respective settings [78, 82, 83]. Meanwhile, only 5.62% (5 of 89) [80, 86, 87, 99, 124] or 3.37% (3 of 89) [94, 100, 128] employed non-orientated or combined input and output-orientated DEA models, respectively. The remaining articles did not clearly state the orientation used in their measurements (see S7 Appendix) [62, 77, 88, 89, 96, 119, 130–132].

3.2.3 Returns to scale assumption.

Approximately one-third of the articles (35.96%, 32 of 89) combined the CRS and VRS assumptions in evaluating efficiency [74, 75, 78, 91, 101–124, 126–129]. These articles compared the efficiency scores to acquire a more comprehensive understanding of the organisation and provided additional knowledge on how each assumption might be utilised to enhance hospital services [101, 108, 113]. Another one-third of the articles (32.58%, 29 of 89) used the VRS assumption, implying that the outputs of the organisations (DMUs) may increase or decrease disproportionately to the inputs [45–73]. Likewise, 20.22% (18 of 89) [76, 79, 81–85, 88–90, 92, 93, 95–100] assumed that the outputs of their organisations (DMUs) varied (increase or decrease) proportionally with the inputs (see S8 Appendix).

3.2.4 Input and output selections.

Appropriate input and output selections are necessary for conducting a comprehensive efficiency evaluation. Therefore, identifying the key attributes depicting the investigated process or output is critical. This process implies that all relevant resources should be incorporated into the inputs, while the administrative objectives of the organisations (DMUs) should be outlined in the outputs [52, 104]. Nonetheless, suitable inputs and outputs can present varying features depending on the situation. Data availability also requires significant consideration alongside appropriate input and output selections. Hence, various recommendations have been presented involving locating suitable measures [76, 131, 133]. This process is further discussed as the main objective of this review.

Several articles employed input and output classifications for measuring efficiency, including capacity, labour, and expenses-related or capital investment, labour, and operating expenses. Specific articles also further delineated this classification process into sub-categories. For example, the outputs were classified as inpatient with outpatient services and effectiveness (quality). Other outputs were classified into two categories: activity (inpatient and outpatient) and quality-related (effectiveness dimension) [20, 21, 25, 32]. Table 1 lists the input and output classification and sub-classification processes in this review. Tables 2 and 3 summarise the details of each sub-classification frequency distribution and percentages.

Table 1. Summary of the used input and output categories.

https://doi.org/10.1371/journal.pone.0293694.t001

3.2.4.1 Capacity-related inputs. The size, capacity, and functioning of a hospital as a health service are determined mainly by its number of fully staffed and operating beds. Out of the 89 articles, 75 (84.27%) considered the number of beds (general, intensive care unit, and special) as inputs in their analyses [47–54, 56–61, 63–73, 75, 77–92, 94–100, 102–108, 111–115, 117, 118, 120–124, 127–130, 132–134]. Only seven (9.33%) of the 75 articles used bed-related data as their inputs (bed type, cost, or ratio) [60, 65, 70, 77, 97, 100, 128]. Another 12 (16.00%) of the 75 articles combined beds and capital assets as capacity-related inputs [69, 77, 84, 88, 89, 100, 102, 105, 108, 121, 124, 132]. Only one article employed capital assets as its input, combined with cost-related inputs, which explains why, unlike the other articles, it did not include beds among its inputs [55]. Overall, the primary capacity-related input in these articles was the number of general beds, followed by the number of facility types and the amount of medical equipment.

3.2.4.2 Cost-related inputs. Cost-related inputs were the least utilised across all the articles: of the 89 articles, only 31 (34.83%) applied them. Three of the 31 articles used cost-related inputs exclusively [62, 101, 119]. Interestingly, most of these 31 articles (90.32%) combined capacity- and staff-related inputs in their analyses [45–47, 50, 55, 61–63, 69, 74–78, 82, 90, 93, 94, 100, 101, 108, 110, 111, 113, 115, 119, 124, 128, 129, 131, 132]. Overall, the primary cost-related input utilised in these articles was total operational cost, followed by fixed costs and then service and consumable costs.

3.2.4.3 Staff-related inputs. Most articles (93.26%, 83 of 89) employed staff-related inputs [45–54, 56–61, 63–89, 91–118, 120–124, 126–130, 133, 134]. Two of the 83 articles combined the number of staff (staff-related) and labour cost (cost-related) as their inputs [74, 77]. Alternatively, cost-related inputs (labour or operating costs) substituted for staff-related inputs as proxies in six articles [55, 62, 90, 119, 131, 132]. The staff-related input values varied across the observed articles: most articles employed actual headcounts, followed by full-time equivalents and ratios of specific values. Overall, these articles demonstrated that the number of doctors was the most common input, followed by the number of nurses and clinical staff.

3.2.4.4 Production-related outputs. A significant portion (98.88%, 88 of 89) of the articles highlighted production-related outputs [45–91, 93–124, 126–134]. Only eight (8 of 89) articles combined production and quality-related outputs [60, 72, 83, 86, 94, 97, 127, 132]. The most prevalent production-related output in these articles was the number of outpatients. This output was sequentially followed by the number of inpatients (admission and discharge), the total number of operations, and the number of inpatients.

3.2.4.5 Quality-related outputs. Quality-related outputs were less prominent than production-related outputs; only nine (9 of 89) articles employed them [60, 72, 83, 86, 92, 94, 97, 127, 132]. Notably, one article focused exclusively on a quality-related output, aligning with the objectives of the study [92]. Overall, the mortality rate (infant, adult, and specific diseases) was the most applied quality-related output, followed by revisit rates (outpatient and emergency) and the number of students.

3.2.5 Extended analysis and data type.

Eighty articles (89.89%, 80 of 89) conducted extended analyses [45–55, 57–63, 65–68, 70–76, 78–93, 95–97, 100–115, 117–124, 127–134]. The data types applied were also almost equally distributed: 51 articles (57.30%, 51 of 89) [46, 48, 51, 57–59, 66, 69–72, 74–76, 78, 79, 81–84, 87, 90, 91, 93, 95, 96, 99, 100, 102, 104–108, 110, 111, 113–115, 119–121, 123, 124, 126, 128–134] used panel data, whereas 38 articles (42.70%, 38 of 89) [45, 47, 49, 50, 52–56, 60–65, 67, 68, 73, 77, 80, 81, 85, 86, 88, 89, 92, 94, 97, 98, 101, 103, 109, 112, 116–118, 122, 127] employed cross-sectional data in their investigations.

Forty types of extended analysis were identified within the included articles, with one or multiple extended analyses used in each study (some described as “stages”). Of the 80 articles with extended analyses, 46 integrated two or more extended analyses in their DEA measurements [45, 46, 50–53, 57, 59, 62, 63, 66, 67, 70, 72–74, 76, 78, 79, 82–85, 87, 90–93, 95, 100, 101, 106–108, 113–115, 118–120, 127, 129, 130, 132–134]. The maximum number of extended analyses across all 80 articles was five [66, 91, 93]. Regression analysis (29.11%, 46 of 158) was primarily used for assessing hospital efficiency, sequentially followed by production function analysis (16.46%, 26 of 158), statistical analysis (15.82%, 25 of 158), and resampling methods (15.82%, 25 of 158). Table 4 tabulates the complete list of the specific analyses for each classification (see S4 Appendix).

3.3 Input and output selection approaches

Various approaches or methods were adopted by the 89 articles in selecting inputs and outputs for hospital-based DEA. Each article relied on previous studies or literature reviews as the main or partial component of its methodology for selecting input and output variables. Only a few articles explicitly indicated using a local DEA efficiency study from their respective country as the reference for input and output selections. Certain articles also employed a combination of methodologies. Sixty-three of the 89 articles (70.79%) utilised only a literature review to determine the input and output variables [45–47, 49–54, 58, 59, 61, 63–77, 80, 81, 84, 85, 88–91, 93, 95–97, 100, 102, 104, 107, 109, 110, 112, 114–121, 123, 124, 126, 127, 129, 131–134]. The remaining articles employed a literature review in combination with other approaches, highlighting diverse combinations. The most prevalent combination identified was a literature review combined with data availability (13.48%, 12 of 89) [55, 56, 78, 82, 103, 105, 106, 111, 113, 122, 128, 130], followed by a literature review with a systematic method (5.62%, 5 of 89) [57, 60, 83, 101, 108] and a literature review with DMU limitation (5.62%, 5 of 89) [48, 79, 86, 94, 98]. A maximum combination of four approaches was observed in one article [87]. Table 5 provides the complete list of the various specific approaches.

Table 5. Classification summary of the input and output selection approaches.

https://doi.org/10.1371/journal.pone.0293694.t005

4. Discussion

The size of the DEA literature can be intimidating to unfamiliar readers. Even when restricted to healthcare applications, reading every previous study to acquire knowledge from their experiences is exceedingly challenging. Consequently, the 89 articles were subjected to meticulous examination to accomplish the objectives of this review. Nunamaker was the first to publish a health application of DEA, examining nursing services; subsequently, Sherman released a second DEA article evaluating the medical and surgical departments of seven hospitals [135, 136]. These pioneering articles have driven the evolution of DEA applications in the healthcare industry over the past four decades, and the quality of the articles has advanced with improved access to resources and information technology [23, 25, 137, 138].

4.1 Researchers’ input and output selection approaches involving DEA for hospital efficiency measurement

The relative effectiveness of various institutions, including businesses, hospitals, universities, and government agencies, is frequently assessed using DEA. These assessments, however, differ from traditional analyses. Providing healthcare services in hospital-based environments is also distinct from a manufacturing process. In a conventional factory, raw materials undergo a physical transformation to become final commodities, and participation and co-production are absent because the customer component is excluded. Identifying the appropriate variables is therefore difficult due to the involvement of patients in the process. Effectiveness (the quality component) in healthcare is also equally crucial alongside performance and efficiency [86, 92, 127]. Even though DEA studies have not converged on a standard set of inputs and outputs, several guidelines have recommended analytic procedures or principles to aid the optimal variable selection process [139–143].

4.1.1 Literature review.

A literature review remains a commonly employed method and is often regarded as one of the most effective techniques for situating a study within the body of knowledge. This method encompasses numerous review types (narrative, rapid, scoping, or systematic reviews) functioning as foundations or building blocks for knowledge advancement, theory development, and the identification of areas for improvement [144–146]. The literature review was also the most prevalent approach for input and output selection among the hospital efficiency-based DEA analyses examined here. This review revealed that all the articles utilised a literature review either as the primary method or as part of a combined approach.

None of the articles provided detailed information about their literature review approaches. Nevertheless, a few articles explicitly mentioned selecting literature from their own country to compare their findings with previous local studies [49, 85, 104, 124]. DEA is a non-parametric technique relying entirely on the observed input-output combinations of the sampled units; it does not necessitate any assumptions regarding the functional form of the relationship between inputs and outputs [76, 114]. Given that DEA can measure efficiency according to the study objectives (even if it yields a less significant value), selecting inputs and outputs based on the literature review was advantageous for these articles [139, 142]. Even though this method remains valid for academicians, other assessors (managers, economists, or policymakers) could perceive it as contradicting their practical perspectives. Hence, several factors must be considered from these individuals' perspectives, including different indicators, production objectives, and policies.

4.1.2 Data availability.

DEA relies on the homogeneity of the units under assessment: each DMU is presumed to produce comparable activities or products using similar resources and technology in the same environment, so that a common set of inputs and outputs can be established. Certain factors also require consideration when large hospital-related datasets are involved, including data quality, availability, scale, and type [139, 141]. Although this review indicated that the examined articles used literature reviews to select the input and output variables, the selection also depended on data availability, and the articles developed only a few solutions to address this limitation. For example, a few articles exclusively gathered data available within their scopes [87, 103, 130]. Certain articles also omitted the DMUs with incomplete data, focusing their analyses on DMUs with complete records [78, 105]. In such cases, DEA measured only the relative efficiency, or the production frontier, of the units included in the analysis. Other articles applied DEA over a defined period of data availability to ensure all necessary input and output variables were complete [55, 113]. Although DEA with missing or incomplete data can be addressed, none of the reviewed articles attempted to resolve this issue [147, 148].
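As a minimal sketch of the complete-case workaround described above (the DMU matrix and variable meanings are hypothetical, not taken from the reviewed articles):

```python
import numpy as np

# Hypothetical DMU matrix: columns might be beds, doctors, and outpatient
# visits. A common workaround in the reviewed articles was to drop DMUs
# with incomplete records and run DEA only on the remaining units.
dmus = np.array([[120.0, 45.0,   3000.0],
                 [ 80.0, np.nan, 2100.0],
                 [150.0, 60.0,   4100.0]])

complete = ~np.isnan(dmus).any(axis=1)   # True where every value is present
dmus_complete = dmus[complete]           # keeps only fully observed DMUs
```

The caveat noted above still applies: the resulting frontier is relative only to the units that survive the filter.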

4.1.3 Systematic method.

Many possible factors can be listed when determining the input and output variables. Nevertheless, this can produce two significant issues: a lengthy input and output list, and a negative impact on the ability of DEA to measure efficiency accurately when only a limited number of DMUs are observed [139, 142]. Hence, selecting the significant variables while accurately measuring efficiency is essential, and various articles have addressed this process by incorporating systematic procedures. This review identified four systematic approaches: Delphi, the Preference Ranking Organisation Method for Enrichment Evaluation (PROMETHEE), bibliometric analysis, and the variance filter [48, 57, 60, 101, 108]. These systematic approaches can formalise the judgemental process of stakeholder viewpoints (managers, economists, and policymakers). The Delphi method focuses on gathering the most reliable consensus of expert opinion in challenging situations. This forecasting method was initially presented in the 1950s by Olaf Helmer and Norman Dalkey of the RAND Corporation, based on the responses from several iterations of questionnaires distributed to a panel of experts [149, 150]. The Delphi method is now widely recognised in assessing DEA efficiency in various areas [151–153].

The PROMETHEE method was initially developed in 1982 and further advanced in 1985 [154–156]. It is recognised as a highly utilised and practical method for multiple criteria decision aid (MCDA), including in combination with DEA [157–160]. Compared to other MCDA methods, PROMETHEE is considered a straightforward and computationally simple ranking system. The system incorporates weights indicating the relative significance of each criterion alongside a preference function associated with each criterion. One of the critical applications of PROMETHEE is its capability to assist decision-makers in choosing the optimal options for evaluating hospital performance, which enables investigators to include PROMETHEE in DEA-based applications [161–164].
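A minimal sketch of the core PROMETHEE II computation (net outranking flows), using the simplest "usual" preference function; the alternatives, criteria scores, and weights in the usage line are hypothetical:

```python
import numpy as np

def promethee_ii(scores, weights, maximize=None):
    """Net outranking flows (PROMETHEE II) with the 'usual' preference
    function: P = 1 if one alternative strictly beats another on a
    criterion, else 0. scores: (n_alternatives, n_criteria)."""
    scores = np.asarray(scores, dtype=float)
    w = np.asarray(weights, dtype=float)
    n, k = scores.shape
    maximize = np.array([True] * k if maximize is None else maximize)
    phi = np.zeros(n)
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            d = scores[a] - scores[b]
            d = np.where(maximize, d, -d)      # flip criteria to be minimised
            pi_ab = w @ (d > 0).astype(float)  # weighted preference of a over b
            pi_ba = w @ (d < 0).astype(float)
            phi[a] += (pi_ab - pi_ba) / (n - 1)
    return phi  # higher net flow = better overall rank

# Hypothetical ranking of three candidate variables on two criteria:
flows = promethee_ii([[3, 5], [2, 4], [1, 1]], weights=[0.5, 0.5])
```

More elaborate preference functions (linear, Gaussian, with thresholds) slot into the same pairwise structure.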

Pritchard is credited with coining the term "bibliometrics" in 1969, defining it as the “application of mathematics and statistical methods to books and other media of communication”. Bibliometric analysis therefore evaluates the bibliographic information or metadata properties of a database or collection of documents to enhance understanding of the topic under investigation [165, 166]. Numerous articles have applied bibliometric analysis for various objectives, as follows:

  1. To identify new trends in journal performance, collaborative styles, and research components
  2. To lay the groundwork for the new and significant advancements in a field
  3. To systematically understand the massive amounts of unstructured data to interpret and map the cumulative scientific knowledge and evolutionary nuances of established domains [167, 168]

Hence, bibliometric analysis can determine the input and output variables in DEA studies involving the healthcare industry, such as hospitals.

The variance filter is a mechanism used in feature selection. The process determines and retains the most essential variables, helping to reduce noise, lower the computational expense of a model, and occasionally boost model performance. This review suggested that certain studies began by listing all the potentially crucial input and output variables [169, 170]. The variance filter (feature selection) then allowed these studies to eliminate variables (input or output) with minimal or negligible impact on the efficiency measurement of DMUs. This method is well accepted and commonly used in DEA-based articles [101, 171–173].
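A minimal sketch of such a variance filter, assuming strictly positive data (the candidate matrix and threshold are illustrative, not drawn from any reviewed article):

```python
import numpy as np

def variance_filter(data, threshold=1e-3):
    """Drop candidate variables whose variance, after scaling each column
    by its mean to remove unit effects, falls at or below the threshold;
    near-constant variables cannot discriminate between DMUs.
    Assumes strictly positive data, as DEA inputs/outputs typically are."""
    data = np.asarray(data, dtype=float)
    scaled = data / data.mean(axis=0)
    keep = scaled.var(axis=0) > threshold
    return data[:, keep], keep

# Hypothetical candidates: beds, a constant rating, outpatient visits.
candidates = np.array([[100.0, 5.0, 3000.0],
                       [200.0, 5.0, 1500.0],
                       [150.0, 5.0, 4500.0]])
filtered, keep = variance_filter(candidates)   # the constant column is dropped
```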

4.1.4 Expert judgement.

Expert judgement can be utilised by researchers, stakeholders, or decision-makers to refine the input and output variable selections. Value judgements are “logical constructs used in efficiency assessment research that reflect the decision makers’ preferences during the efficiency assessment procedure”. This includes the decision to exclude a variable or assign it a zero weight [142, 174], depending on the necessities and requirements of the decision-makers in the efficiency measurement. Nonetheless, various constraints can arise, such as selection bias, input and output exclusions that significantly impact efficiency measurement, or incorrect input and output weights. Therefore, researchers are encouraged to incorporate expert or value judgement to achieve their objectives. Managers, economists, government policymakers, and academicians also have different motivations: despite all being committed to improving productivity, they present different judgements involving variable selections. Considering that the DEA application possesses both benefits and drawbacks, understanding the managerial and statistical implications of employing value judgement in input and output selections is crucial [175].

4.2 Managerial and economic implications in the input and output selection processes

This review empirically documented the approaches used in input and output variable selections for hospital evaluation-based DEA methods. The healthcare sector encounters daily challenges from public policy, resulting in new organisations, laws, and technology. Hence, managers must address these concerns by implementing practical performance evaluation and decision-making strategies, with careful evaluation from economic and managerial perspectives. Generally, the input and output selections in DEA comprise two components: the selected methods and the selected variables. This selection process can significantly impact the DEA outcomes. Nevertheless, this review indicated that little consideration was devoted to the input and output variable selections in real-world scenarios. A DEA study often begins with an extensive initial list of potential variables, since, in principle, each resource that a DMU uses should be considered a candidate input variable. The assessment made by a manager or economist to justify the selection process therefore holds significant importance in practical situations.

The selection process in actual conditions is made more complex by the objectives of production economics, which can include profit, quality control, and customer satisfaction. Evaluating these several competing factors is a difficult task due to the influence of multiple decision-makers. For example, a profit-oriented DEA assessment can conflict with customer satisfaction, and the results may not represent the production objective if the manager combines these measurements simultaneously. Based on this review, the following judgements are proposed from management and economic perspectives:

  1. Establish the production objective of the analysis, ensuring that all stakeholders easily understand it
  2. Utilise the existing selection approaches to the greatest extent possible
  3. Introduce a managerial-level committee to evaluate the variables before deciding on the final model
  4. Recognise that physical units and managerial or economic measures differ according to the production objective of the analysis, such as comparing salaries in dollars against the number of employees

4.3 Common DEA model parameters in hospital efficiency evaluation

4.3.1 Model type.

Significant advancements and transformations have been observed in DEA applications over time. Hence, numerous models are available to assess efficiency, ranging from general models to more specialised uses of DEA. In this review, 80.90% (72 of 89) of the examined articles applied a radial DEA model: the BCC model, the CCR model, or a combination of the two. These findings align with other healthcare-based reviews [17, 21, 33]. The focus of a radial DEA model is typically on the proportionate change in input or output values; slacks (excess inputs or shortfall outputs still existing in the model) are therefore ignored or treated as optional. Even though the radial model demonstrates various limitations, it is still commonly employed due to its fundamental nature, simplicity, and ease of application (minimal requirements on the production criteria of the DMUs) [47, 79, 176].
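The radial models named above reduce to one small linear program per DMU. The sketch below, assuming SciPy is available, implements the input-oriented envelopment form (CCR under constant returns; BCC when the convexity constraint is added) and, like the basic radial model, ignores slacks; it is a minimal illustration, not the exact formulation of any reviewed article:

```python
import numpy as np
from scipy.optimize import linprog

def radial_input_efficiency(X, Y, vrs=False):
    """Input-oriented radial DEA: CCR under CRS, BCC when vrs=True.

    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs).
    Returns theta (0 < theta <= 1) for each DMU; theta = 1 means the
    DMU lies on the efficient frontier."""
    n, m = X.shape
    s = Y.shape[1]
    theta = np.empty(n)
    for o in range(n):
        c = np.zeros(n + 1)          # variables: [theta, lambda_1..lambda_n]
        c[0] = 1.0                   # minimise theta
        A_ub = np.zeros((m + s, n + 1))
        b_ub = np.zeros(m + s)
        A_ub[:m, 0] = -X[o]          # sum_j lambda_j x_ij <= theta * x_io
        A_ub[:m, 1:] = X.T
        A_ub[m:, 1:] = -Y.T          # sum_j lambda_j y_rj >= y_ro
        b_ub[m:] = -Y[o]
        A_eq = b_eq = None
        if vrs:                      # BCC convexity constraint: sum lambda = 1
            A_eq = np.concatenate([[0.0], np.ones(n)]).reshape(1, -1)
            b_eq = [1.0]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=[(0, None)] * (n + 1), method="highs")
        theta[o] = res.fun
    return theta
```

With a single input and output, a DMU using twice the input of a peer for the same output receives a CRS score of 0.5, matching the proportional-reduction interpretation above.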

4.3.2 Model orientation.

Several factors or arguments can influence the selection of DEA orientation, such as the decision maker’s level of control, the nature of production, and the researcher’s purpose for the model [25, 32, 177]. Healthcare organisations and hospitals generally possess limited control over their outputs. Nevertheless, this observation does not imply that a DEA efficiency evaluation in a hospital must focus solely on input orientation. Still, this review discovered that 55.06% (49 of 89) of the articles used input-oriented DEA models, and previous articles highlighted similar findings with varying proportions [17, 20, 21]. Researchers and hospital managers viewed reducing inputs while achieving a desired output level as the more appropriate measure of hospital efficiency, an outcome attributed to the limited control hospitals possess over their outputs.

4.3.3 Returns to scale assumption.

Ongoing discussions concern which of the two fundamental assumptions (CRS or VRS) is superior. Hospital managers are actively searching for the most effective evaluation methods to assess the efficiency impact of various inputs and outputs on their organisations. The choice of returns-to-scale assumption for a hospital is determined by the size of the hospital [64], organisational factors [75, 90], the input and output process flow [110, 126], and technological involvement [81, 121]. Adopting an inappropriate returns-to-scale assumption can result in an excessively constrained search region for efficient DMUs; therefore, both assumptions should be examined to comprehend the implications of using either one [178]. This review discovered that most articles applied both CRS and VRS assumptions for comparison (35.96%), followed by only VRS assumptions (32.58%). Previous articles also demonstrated a trend towards replacing the CRS with the VRS assumption in DEA-based applications [20, 21, 33]. Specifically, most hospital efficiency assessments focused on economies of scale and considered the non-proportional correlation between inputs and outputs in the healthcare production function.
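The two assumptions are linked by scale efficiency, the ratio of the CRS score to the VRS score, which is one reason articles report both (the scores below are hypothetical):

```python
# Overall (CRS) technical efficiency decomposes into pure (VRS) technical
# efficiency and scale efficiency: theta_crs = theta_vrs * scale_eff.
theta_crs, theta_vrs = 0.72, 0.90      # hypothetical scores for one DMU
scale_eff = theta_crs / theta_vrs
# scale_eff < 1 signals that the DMU is not operating at its most
# productive scale size, even if it is efficient under VRS.
```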

4.3.4 Input and output selections.

The methodologies involving input and output selections were effectively covered in this review, achieving the main objective of this study. Appropriate input and output selection is crucial for DEA analysis: many previous articles denoted that the effectiveness of a DEA efficiency analysis is heavily influenced by the quality and quantity of these indicators [15–18, 20, 21, 30, 33]. Most articles (52.76%) used staff-related input as one of the variables in efficiency measurement. Given that human resources are significant in any organisation (including hospitals), this outcome was not surprising, and the finding was similar to previous DEA-related and performance-based articles on healthcare services [2, 13].

Typically, the unit of analysis for staff-related factors is contingent upon the operational dynamics of the organisation. This review identified two staff-related units: the actual number of staff and the full-time equivalent. Various staff types were observed, from clinical to non-clinical (see S3 and S4 Appendices). Among the sub-types of inputs, the number of general beds was the highest (74.49%), as hospital beds are a fundamental capital input for a hospital. This factor is a key indicator for assessing hospital performance, capacity, and competency and for comparing healthcare services across different countries [179–181]. Meanwhile, most articles applied production-related outputs rather than quality-related outputs. This was attributed to the fact that production-based data are simpler to quantify and provide stakeholders with a clear objective to improve upon. Healthcare managers also tend not to prioritise effectiveness (quality) over efficiency [25, 182]. Overall, this review indicated that the common outputs applied were the numbers of inpatients, outpatients, and operations. Given that these outputs are the fundamental components of hospital services, their extensive utilisation was not unexpected.

4.3.5 Extended analysis.

The classic DEA model is considered insufficient on its own because of the complexity of hospital processes and the continuous efforts of researchers and practitioners to enhance healthcare efficiency assessment. Most recent research on healthcare efficiency assessment integrates DEA with various approaches and techniques to address the weaknesses of DEA and offer a comprehensive and accurate picture of healthcare efficiency.

Forty types of extended analysis were observed within the reviewed articles. Although each study had a different rationale for performing an extended analysis, consistent themes were found:

  1. To ascertain how contextual or environmental factors affect the efficiency scores [86, 108]
  2. To quantitatively compare efficiency scores [52, 63]
  3. To resolve the issues with serially linked estimates and produce bias-corrected efficiency estimates by utilising simulated distributions to compute the indices’ standard errors and confidence ranges [82, 129]
  4. To assess healthcare facilities’ long-term performance using panel data analysis [76, 133]
  5. To ascertain the relationship between the indicators that were to be included in the DEA model for input and output [45, 53]
  6. To forecast, following the consideration of exogenous elements in the efficiency assessment, whether or not a healthcare unit should be deemed efficient [67, 132]

Consequently, recognising and crediting these approaches is necessary to gain a better understanding of how they were applied and how they helped the various researchers achieve their goals.
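The first theme above, the second-stage regression of efficiency scores on contextual factors, can be sketched as follows. The efficiency scores and the teaching-status covariate are hypothetical; published studies often use Tobit or the Simar-Wilson bootstrap rather than plain OLS because the scores are bounded:

```python
import numpy as np

# Hypothetical first-stage DEA efficiency scores for six hospitals and a
# contextual dummy (teaching status) that may explain the differences.
theta = np.array([1.00, 0.92, 0.78, 0.85, 0.70, 0.95])
teaching = np.array([1.0, 1.0, 0.0, 1.0, 0.0, 1.0])

# OLS via least squares: theta = beta0 + beta1 * teaching + error.
X = np.column_stack([np.ones_like(teaching), teaching])
beta, *_ = np.linalg.lstsq(X, theta, rcond=None)
gap = beta[1]   # estimated efficiency gap associated with teaching status
```

With a single dummy regressor, the fitted coefficients reduce to the two group means and their difference.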

5. Limitations and conclusion

This novel systematic review comprehensively investigated the methods used to identify input-output variables for measuring hospital efficiency using DEA. To the authors’ knowledge, no prior studies have been conducted on this topic. The primary objective of this systematic review was to offer an overview of the existing approaches; it also provided an update on the current application of DEA models for evaluating hospital efficiency. Eighty-nine articles were reviewed and assessed thoroughly against the specified objectives, and the literature review was primarily employed as the method for selecting input and output variables in DEA. These articles utilised a literature review as a single method or combined it with other approaches to enhance the robustness and rigour of the selection process. Considering that the selection of variables in DEA can lead to varying efficiency measurement outcomes, this process is crucial [139]. Nevertheless, no definitive approach or methodology could be identified for selecting variables (input-output) in DEA, which represents both its advantage and its disadvantage [183–185].

Researchers and stakeholders should use DEA to assess the effectiveness of their organisation according to their preferences. However, these individuals should be aware of the limitations and potential constraints of DEA [139, 142]. Because this review specifically examined methodologies employed in hospital settings, the scope of the findings could be restricted; alternative procedures or methods could be utilised to select input and output variables for DEA studies in different fields or from other perspectives [186]. Given that researchers and healthcare professionals aim to improve healthcare efficiency assessment, an optimal input-output selection approach should be identified. Hence, examining past, present, and potential developments in the DEA literature is essential due to its significant impact on DEA studies. The parameters of the DEA models also did not present any evidence to support an optimal or universally fitting model, as almost all models were utilised multiple times (see S3 and S4 Appendices). Consequently, this review offers guidelines and methodological principles for conducting DEA studies based on established research, providing insights to hospital managers, healthcare workers, policy officials, and students on efficiency evaluation using DEA.

5.1 Registration and protocol

This study was registered at OSF Registries (https://osf.io/registries). All information regarding the registration and study protocol can be accessed at https://osf.io/nby9m or https://osf.io/e7mj9/?view_only=53deec8e6c6946eeaf0ea6fe2f0f212a.

Supporting information

S3 Appendix. Table 6 summary of 89 reviewed publications.

https://doi.org/10.1371/journal.pone.0293694.s003

(DOCX)

S4 Appendix. Table 7 summary of 89 reviewed publications.

https://doi.org/10.1371/journal.pone.0293694.s004

(DOCX)

S5 Appendix. Table 8 types of efficiency studied.

https://doi.org/10.1371/journal.pone.0293694.s005

(DOCX)

S6 Appendix. Table 9 model types applied in the studies.

https://doi.org/10.1371/journal.pone.0293694.s006

(DOCX)

S7 Appendix. Table 10 model orientation applied in the studies.

https://doi.org/10.1371/journal.pone.0293694.s007

(DOCX)

S8 Appendix. Table 11 return to scale assumption applied in the studies.

https://doi.org/10.1371/journal.pone.0293694.s008

(DOCX)

References

  1. 1. Farrell MJ. The Measurement of Productive Efficiency. J R Stat Soc Ser A. 1957;120: 253.
  2. 2. Bahadori M, Izadi AR, Ghardashi F, Ravangard R, Hosseini SM. The Evaluation of Hospital Performance in Iran: A Systematic Review Article. Iran J Public Health. 2016;45: 855. Available: /pmc/articles/PMC4980339/. pmid:27516991
  3. 3. Hafidz F, Ensor T, Tubeuf S. Efficiency Measurement in Health Facilities: A Systematic Review in Low- and Middle-Income Countries. Applied Health Economics and Health Policy 2018 16:4. 2018;16: 465–480. pmid:29679237
  4. 4. Hussey PS, de Vries H, Romley J, Wang MC, Chen SS, Shekelle PG, et al. A Systematic Review of Health Care Efficiency Measures. Health Serv Res. 2009;44: 784–805. pmid:19187184
  5. 5. Tandon A, Murray C, Lauer J, Evans DB. Measuring Overall Health System Performance for 191 Countries. 2000. https://www.who.int/publications
  6. 6. World Health Organisation. The World Health Report 2000 Health Systems: Improving Performance. 2000. https://www.who.int/publications/i/item/924156198X
  7. 7. McKee M. The World Health Report 2000: 10 years on. Health Policy Plan. 2010;25: 346–348. pmid:20798126
  8. 8. Navarro V. The World Health Report 2000: Can Health Care Systems Be Compared Using a Single Measure of Performance? Am J Public Health. 2002;92: 31. pmid:11772754
  9. 9. Coyne JS, Hilsenrath P. The World Health Report 2000: Can Health Care Systems Be Compared Using a Single Measure of Performance? Am J Public Health. 2002;92: 30. pmid:11772753
  10. 10. Benneyan J, Ceyhan M, Sunnetci A. Data Envelopment Analysis of National Healthcare Systems and Their Relative Efficiencies. The 37th International Conference on Computers and Industrial Engineering, pp251–261. 2007.
  11. 11. Cylus Jonathan, Papanicolas I, Smith PC. Health Systems Efficiency How to make measurement matter for policy and management. Organização Mundial da Saúde, editor. Organização Mundial da Saúde; 2016. https://www.ncbi.nlm.nih.gov/books/NBK436888/
  12. 12. Papanicolas I, Rajan D, Karanikolos M, Soucat A, Figueras J 1959-. Health system performance assessment A framework for policy analysis. 2022. https://www.who.int/publications/i/item/9789240042476
  13. 13. Rasi V, Delgoshaee B, Maleki M. Identification of common indicators of hospital performance evaluation models: A scoping review. J Educ Health Promot. 2020;9. pmid:32489998
  14. 14. Rahimi H, Khammar-nia M, Kavosi Z, Eslahi M. Indicators of Hospital Performance Evaluation: A Systematic Review. International Journal of Hospital Research. 2014;3: 199–208. Available: http://ijhr.iums.ac.ir/article_10152.html
  15. 15. Hollingsworth B, Dawson PJ, Maniadakis N. Efficiency measurement of health care: a review of non-parametric methods and applications. Health Care Management Science 1999 2:3. 1999;2: 161–172. pmid:10934540
  16. 16. Hollingsworth B. Non-Parametric and Parametric Applications Measuring Efficiency in Health Care. Health Care Management Science 2003 6:4. 2003;6: 203–218. pmid:14686627
  17. 17. Giancotti M, Pipitone V, Mauro M, Guglielmo A. 20 Years of Studies on Technical and Scale efficiency in the Hospital Sector: a Review of Methodological Approaches. 2016.
  18. 18. Hollingsworth B. The measurement of efficiency and productivity of health care delivery. Health Econ. 2008;17: 1107–1128. pmid:18702091
  19. Fazria NF, Dhamanti I. A Literature review on the Identification of Variables for Measuring Hospital Efficiency in the Data Envelopment Analysis (DEA). Unnes Journal of Public Health. 2021;10: 1–15.
  20. O’Neill L, Rauner M, Heidenberger K, Kraus M. A cross-national comparison and taxonomy of DEA-based hospital efficiency studies. Socioecon Plann Sci. 2008;42: 158–189.
  21. Cantor VJM, Poh KL. Integrated Analysis of Healthcare Efficiency: A Systematic Review. J Med Syst. 2017;42: 1–23. pmid:29167999
  22. Charnes A, Cooper WW, Rhodes E. Measuring the efficiency of decision making units. Eur J Oper Res. 1978;2: 429–444.
  23. Cooper WW, Seiford LM, Zhu J, editors. Handbook on Data Envelopment Analysis. Boston, MA: Springer US; 2011.
  24. Benneyan JC, Sunnetci A, Mehmet E. Data envelopment analysis models for identifying and benchmarking the best healthcare processes. International Journal of Six Sigma and Competitive Advantage. 2008;4: 305–331.
  25. Ozcan YA, Tone K. Health Care Benchmarking and Performance Evaluation. Boston, MA: Springer US; 2014.
  26. Bogetoft P, Otto L. Benchmarking with DEA, SFA, and R. New York: Springer; 2011.
  27. Aljunid S, Moshiri H, Ahmed Z. Measuring Hospital Efficiency: Theory and Methods. Kuala Lumpur: Casemix Solutions Sdn Bhd; 2013.
  28. Xu GC, Zheng J, Zhou ZJ, Zhou CK, Zhao Y. Comparative Study of Three Commonly Used Methods for Hospital Efficiency Analysis in Beijing Tertiary Public Hospitals, China. Chin Med J (Engl). 2015;128: 3185. pmid:26612294
  29. Ravaghi H, Afshari M, Isfahani P, Bélorgeot VD. A systematic review on hospital inefficiency in the Eastern Mediterranean Region: Sources and solutions. BMC Health Serv Res. 2019;19: 1–20.
  30. Eklom B, Callander E. A Systematic Review of Hospital Efficiency and Productivity Studies: Lessons from Australia, UK and Canada. 2020 [cited 3 Nov 2022].
  31. Fitriana, Hendrawan H. Analisis Efisiensi dengan Data Envelopment (DEA) di Rumah Sakit dan PUSKESMAS [Efficiency analysis with Data Envelopment (DEA) in hospitals and community health centres]. CV. Amerta Media; 2021.
  32. Irwandy. Efisiensi dan Produktifitas Rumah Sakit: Teori dan Aplikasi Pengukuran dengan Pendekatan Data Envelopment Analysis [Hospital efficiency and productivity: theory and measurement application using a Data Envelopment Analysis approach]. CV. Social Politic Genius (SIGn); 2019.
  33. Kohl S, Schoenfelder J, Fügener A, Brunner JO. The use of Data Envelopment Analysis (DEA) in healthcare with a focus on hospitals. Health Care Manag Sci. 2018;22: 245–286. pmid:29478088
  34. Amini S, Karami Matin B, Didehdar M, Alimohammadi A, Salimi Y, Amiresmaili M, et al. Efficiency of Iranian Hospitals Before and After Health Sector Evolution Plan: A Systematic Review and Meta-Analysis Study. Front Public Health. 2021;9. pmid:34900889
  35. Littell JH, Corcoran J, Pillai V. Systematic Reviews and Meta-Analysis. New York: Oxford University Press; 2008.
  36. A E., Juni MH, R A.M. A Systematic Review of Hospital Inputs and Outputs in Measuring Technical Efficiency Using Data Envelopment Analysis. International Journal of Public Health and Clinical Sciences. 2018;5: 17–35. Available: http://publichealthmy.org/ejournal/ojs2/index.php/ijphcs/article/view/563
  37. Cook WD, Zhu J. Classifying inputs and outputs in data envelopment analysis. Eur J Oper Res. 2007;180: 692–699.
  38. Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. The BMJ. 2021;372. pmid:33782057
  39. Mohamed Shaffril HA, Samsuddin SF, Abu Samah A. The ABC of systematic literature review: the basic methodological guidance for beginners. Qual Quant. 2020;55: 1319–1346.
  40. Gusenbauer M, Haddaway NR. Which academic search systems are suitable for systematic reviews or meta-analyses? Evaluating retrieval qualities of Google Scholar, PubMed, and 26 other resources. Res Synth Methods. 2020;11: 181–217. pmid:31614060
  41. Bramer WM, Giustini D, Kramer BM, Anderson P. The comparative recall of Google Scholar versus PubMed in identical searches for biomedical systematic reviews: a review of searches used in systematic reviews. Syst Rev. 2013;2: 115. pmid:24360284
  42. Mitton C, Adair CE, McKenzie E, Patten SB, Perry BW. Knowledge Transfer and Exchange: Review and Synthesis of the Literature. Milbank Q. 2007;85: 729. pmid:18070335
  43. Alatawi A, Ahmed S, Niessen L, Khan J. Systematic review and meta-analysis of public hospital efficiency studies in Gulf region and selected countries in similar settings. Cost Effectiveness and Resource Allocation. 2019;17: 1–12.
  44. Varabyova Y, Müller JM. The efficiency of health care production in OECD countries: A systematic review and meta-analysis of cross-country comparisons. Health Policy. 2016;120: 252–263. pmid:26819140
  45. Czypionka T, Kraus M, Mayer S, Röhrling G. Efficiency, ownership, and financing of hospitals: The case of Austria. Health Care Manag Sci. 2014;17: 331–347. pmid:24338279
  46. Berger M, Sommersguter-Reichmann M, Czypionka T. Determinants of soft budget constraints: How public debt affects hospital performance in Austria. Soc Sci Med. 2020;249: 112855. pmid:32109755
  47. Wang X, Luo H, Qin X, Feng J, Gao H, Feng Q. Evaluation of performance and impacts of maternal and child health hospital services using Data Envelopment Analysis in Guangxi Zhuang Autonomous Region, China: A comparison study among poverty and non-poverty county level hospitals. Int J Equity Health. 2016;15: 1–6.
  48. Zhu J, Song X. Changes in efficiency of tertiary public general hospitals during the reform of public hospitals in Beijing, China. Int J Health Plann Manage. 2022;37: 143–155. pmid:34494295
  49. Gao Q, Wang D. Hospital efficiency and equity in health care delivery: A study based in China. Socioecon Plann Sci. 2021;76: 100964.
  50. Peng Z, Zhu L, Wan G, Coyte PC. Can integrated care improve the efficiency of hospitals? Research based on 200 Hospitals in China. Cost Effectiveness and Resource Allocation. 2021;19: 1–12.
  51. Schneider AM, Oppel EM, Schreyögg J. Investigating the link between medical urgency and hospital efficiency—Insights from the German hospital market. Health Care Manag Sci. 2020;23: 649–660. pmid:32936387
  52. Zarrin M, Schoenfelder J, Brunner JO. Homogeneity and Best Practice Analyses in Hospital Performance Management: An Analytical Framework. Health Care Manag Sci. 2022;25: 406–425. pmid:35192085
  53. Li Y, Lei X, Morton A. Performance evaluation of nonhomogeneous hospitals: the case of Hong Kong hospitals. Health Care Manag Sci. 2019;22: 215–228. pmid:29445892
  54. Patra A, Ray PK. Operational efficiency analysis of public hospital systems of India: Application of data envelopment analysis. Advances in Intelligent Systems and Computing. 2018;590: 415–424.
  55. Irwandy, Sjaaf AC, Achadi A, Nadjib M, Ayuningtyas D, Junadi P, et al. The efficiency and productivity of Public Services Hospital in Indonesia. Enferm Clin. 2020;30: 236–239.
  56. Irwandy, Sjaaf AC. Using data envelopment analysis to improve the hospitals efficiency in Indonesia: The case of South Sulawesi Province. Indian J Public Health Res Dev. 2018;9: 214–219.
  57. Asiabar AS, Sharifi T, Rezapour A, Firouzabadi SMAK, Haghighat-Fard P, Mohammad-pour S. Technical efficiency and its affecting factors in Tehran’s public hospitals: DEA approach and Tobit regression. Med J Islam Repub Iran. 2020;34: 176. pmid:33816375
  58. Pirani N, Zahiri M, Engali KA, Torabipour A. Hospital Efficiency Measurement Before and After Health Sector Evolution Plan in Southwest of Iran: a DEA-Panel Data Study. Acta Inform Med. 2018;26: 106–110. pmid:30061781
  59. Goudarzi R, Gholamhoseini MT, Hekmat SN, YousefZadeh S, Amini S. The effect of Iran’s health transformation plan on hospital performance: Kerman province. PLoS One. 2021;16: e0247155. pmid:33596262
  60. Jahantigh FF, Ostovare M. Application of a Hybrid Method for Performance Evaluation of Teaching Hospitals in Tehran. Qual Manag Health Care. 2020;29: 210–217. pmid:32991538
  61. Almeida A, Frias R, Pedro Fique J. Evaluating Hospital Efficiency Adjusting for Quality Indicators: An Application to Portuguese NHS Hospitals. Health Econ Outcome Res. 2015;1: 1–5.
  62. Pereira MA, Ferreira DC, Figueira JR, Marques RC. Measuring the efficiency of the Portuguese public hospitals: A value modelled network data envelopment analysis with simulation. Expert Syst Appl. 2021;181: 115169.
  63. Ortega-Díaz MI, Martín JC. How to detect hospitals where quality would not be jeopardized by health cost savings? A methodological approach using DEA with SBM analysis. Health Policy. 2022;126: 1069–1074. pmid:35927090
  64. Fumbwe F, Lihawa R, Andrew F, Kinyanjui G, Mkuna E. Examination on level of scale efficiency in public hospitals in Tanzania. Cost Effectiveness and Resource Allocation. 2021;19: 1–10.
  65. Klangrahad C. Evaluation of Thailand’s regional hospital efficiency: An application of data envelopment analysis. ACM International Conference Proceeding Series. 2017;Part F131202: 104–109.
  66. Cinaroglu S. Changes in hospital efficiency and size: An integrated propensity score matching with data envelopment analysis. Socioecon Plann Sci. 2021;76: 100960.
  67. Cinaroglu S. Integrated k-means clustering with data envelopment analysis of public hospital efficiency. Health Care Manag Sci. 2019;23: 325–338. pmid:31325003
  68. Özgen Narcı H, Ozcan YA, Şahin İ, Tarcan M, Narcı M. An examination of competition and efficiency for hospital industry in Turkey. Health Care Manag Sci. 2014;18: 407–418. pmid:25515038
  69. Küçük A, Özsoy VS, Balkan D. Assessment of technical efficiency of public hospitals in Turkey. Eur J Public Health. 2020;30: 230–235. pmid:31412115
  70. İlgün G, Konca M. Assessment of efficiency levels of training and research hospitals in Turkey and the factors affecting their efficiencies. Health Policy Technol. 2019;8: 343–348.
  71. Gok MS, Altındağ E. Analysis of the cost and efficiency relationship: experience in the Turkish pay for performance system. European Journal of Health Economics. 2015;16: 459–469. pmid:24722916
  72. Şahin B, İlgün G. Assessment of the impact of public hospital associations (PHAs) on the efficiency of hospitals under the ministry of health in Turkey with data envelopment analysis. Health Care Manag Sci. 2019;22: 437–446. pmid:30465130
  73. Onder O, Cook W, Kristal M. Does quality help the financial viability of hospitals? A data envelopment analysis approach. Socioecon Plann Sci. 2022;79: 101105.
  74. Fragkiadakis G, Doumpos M, Zopounidis C, Germain C. Operational and economic efficiency analysis of public hospitals in Greece. Ann Oper Res. 2014;247: 787–806.
  75. Ortega-Díaz MI, Ocaña-Riola R, Pérez-Romero C, Martín-Martín JJ. Multilevel Analysis of the Relationship between Ownership Structure and Technical Efficiency Frontier in the Spanish National Health System Hospitals. Int J Environ Res Public Health. 2020;17: 1–19. pmid:32823922
  76. Piubello Orsini L, Leardini C, Vernizzi S, Campedelli B. Inefficiency of public hospitals: a multistage data envelopment analysis in an Italian region. BMC Health Serv Res. 2021;21: 1–15.
  77. Wu D, Wu DD. Risk-Based Robust Evaluation of Hospital Efficiency. IEEE Syst J. 2019;13: 1906–1914.
  78. Nguyen BH, Zelenyuk V. Aggregate efficiency of industry and its groups: the case of Queensland public hospitals. Empir Econ. 2021;60: 2795–2836.
  79. Karma E, Gashi S. Technical efficiency of Kosovo public hospitals. South Eastern European Journal of Public Health (SEEJPH). 2022 [cited 20 Jun 2023].
  80. Soares AB, Pereira AA, Milagre ST. A model for multidimensional efficiency analysis of public hospital management. Research on Biomedical Engineering. 2017;33: 352–361.
  81. Garmatz A, Vieira GBB, Sirena SA. Assessing the technical efficiency of Brazil’s teaching hospitals using data envelopment analysis. Cien Saude Colet. 2021;26: 3447–3457. pmid:34468641
  82. Chowdhury H, Zelenyuk V. Performance of hospital services in Ontario: DEA with truncated regression approach. Omega. 2016;63: 111–122.
  83. Yin G, Chen C, Zhuo L, He Q, Tao H. Efficiency Comparison of Public Hospitals under Different Administrative Affiliations in China: A Pilot City Case. Healthcare (Basel). 2021;9. pmid:33917844
  84. Li B, Mohiuddin M, Liu Q. Determinants and Differences of Township Hospital Efficiency among Chinese Provinces. Int J Environ Res Public Health. 2019;16. pmid:31067779
  85. Li H, Dong S. Measuring and Benchmarking Technical Efficiency of Public Hospitals in Tianjin, China: A Bootstrap—Data Envelopment Analysis Approach. Inquiry. 2015;52. pmid:26396090
  86. Zarrin M. A mixed-integer slacks-based measure data envelopment analysis for efficiency measuring of German university hospitals. Health Care Manag Sci. 2022;26: 138–160. pmid:36396892
  87. Guo H, Zhao Y, Niu T, Tsui K-L. Correction: Hong Kong Hospital Authority resource efficiency evaluation: Via a novel DEA-Malmquist model and Tobit regression model. PLoS One. 2018;13: e0193266. pmid:29447275
  88. ArulJothi KN, Irusappan S, Amarnath G, Chandrasekaran S, SAB K, Harishankar M, et al. A critical data envelopment analysis of hospital efficiency in India. IJSR - International Journal of Scientific Research. 2016;5(2): 63–65.
  89. Rezaee MJ, Karimdadi A. Do Geographical Locations Affect in Hospitals Performance? A Multi-group Data Envelopment Analysis. J Med Syst. 2015;39: 1–11. pmid:26208596
  90. Guerrini A, Romano G, Campedelli B, Moggi S, Leardini C. Public vs. Private in Hospital Efficiency: Exploring Determinants in a Competitive Environment. International Journal of Public Administration. 2018;41: 181–189.
  91. Cavalieri M, Guccio C, Lisi D, Pignataro G. Does the Extent of Per-Case Payment System Affect Hospital Efficiency? Evidence from the Italian NHS. SSRN Electronic Journal. 2014 [cited 20 Jun 2023].
  92. Campanella P, Azzolini E, Izzi A, Pelone F, De Meo C, La Milia D, et al. Hospital efficiency: how to spend less maintaining quality? Ann Ist Super Sanita. 2017;53: 46–53. pmid:28361805
  93. Dohmen P, van Ineveld M, Markus A, van der Hagen L, van de Klundert J. Does competition improve hospital performance: a DEA based evaluation from the Netherlands. Eur J Health Econ. 2022 [cited 20 Jun 2023]. pmid:36192512
  94. Li RC, Tangsoc JC, See SL, John Cantor VM, Lauren Tan ML, Joy Yu RS. A DEA-based Performance Measurement Mathematical Model and Software Application System Applied to Public Hospitals in the Philippines. DLSU Business & Economics Review. 2016;25: 166–196.
  95. Valdmanis V, Rosko M, Mancuso P, Tavakoli M, Farrar S. Measuring performance change in Scottish hospitals: a Malmquist and times-series approach. Health Serv Outcomes Res Methodol. 2017;17: 113–126.
  96. Chen KC, Chen HM, Chien LN, Yu MM. Productivity growth and quality changes of hospitals in Taiwan: does ownership matter? Health Care Manag Sci. 2019;22: 451–461. pmid:30607800
  97. Hung SY, Wu TH. Healthcare quality and efficiency in Taiwan. ACM International Conference Proceeding Series. 2018; 83–87.
  98. Ho CC, Jiang YB, Chen MS. The healthcare quality and performance evaluation of hospitals with different ownerships—demonstrated by Taiwan hospitals. Proceedings - 2017 10th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics, CISP-BMEI 2017. 2018;2018-January: 1–4.
  99. Ghahremanloo M, Hasani A, Amiri M, Hashemi-Tabatabaei M, Keshavarz-Ghorabaee M, Ustinovičius L. A novel DEA model for hospital performance evaluation based on the measurement of efficiency, effectiveness, and productivity. Engineering Management in Production and Services. 2020;12: 7–19.
  100. Khushalani J, Ozcan YA. Are hospitals producing quality care efficiently? An analysis using Dynamic Network Data Envelopment Analysis (DEA). Socioecon Plann Sci. 2017;60: 15–23.
  101. Villalobos-Cid M, Chacón M, Zitko P, Inostroza-Ponta M. A New Strategy to Evaluate Technical Efficiency in Hospitals Using Homogeneous Groups of Casemix: How to Evaluate When There is Not DRGs? J Med Syst. 2016;40: 1–12. pmid:26880102
  102. Babalola TK, Ojugbele HO, Shahwan M, Moodley I. Analysis of factors influencing technical efficiency of public district hospitals in KwaZulu-Natal province, South Africa. Hum Resour Health. 2022;20: 1–10.
  103. Ahmed S, Hasan MZ, Laokri S, Jannat Z, Ahmed MW, Dorin F, et al. Technical efficiency of public district hospitals in Bangladesh: A data envelopment analysis. Cost Effectiveness and Resource Allocation. 2019;17: 1–10.
  104. Jiang S, Min R, Fang PQ. The impact of healthcare reform on the efficiency of public county hospitals in China. BMC Health Serv Res. 2017;17. pmid:29262816
  105. Zhao L, Wang L, Li S, Zhang Y. Evaluation and analysis of hospital efficiency in China based on macro- and micro-level analyses. Journal of Public Health (Germany). 2020;28: 191–197.
  106. Cheng Z, Tao H, Cai M, Lin H, Lin X, Shu Q, et al. Technical efficiency and productivity of Chinese county hospitals: an exploratory study in Henan province, China. BMJ Open. 2015;5. pmid:26353864
  107. Jing R, Xu T, Lai X, Mahmoudi E, Fang H. Technical Efficiency of Public and Private Hospitals in Beijing, China: A Comparative Study. Int J Environ Res Public Health. 2019;17. pmid:31861922
  108. Zheng W, Sun H, Zhang P, Zhou G, Jin Q, Lu X. A four-stage DEA-based efficiency evaluation of public hospitals in China after the implementation of new medical reforms. PLoS One. 2018;13. pmid:30281620
  109. Vrabková I, Vaňková I. Efficiency of Human Resources in Public Hospitals: An Example from the Czech Republic. Int J Environ Res Public Health. 2021;18. pmid:33925160
  110. van Ineveld M, van Oostrum J, Vermeulen R, Steenhoek A, van de Klundert J. Productivity and quality of Dutch hospitals during system reform. Health Care Manag Sci. 2016;19: 279–290. pmid:25774011
  111. Ali M, Debela M, Bamud T. Technical efficiency of selected hospitals in Eastern Ethiopia. Health Econ Rev. 2017;7: 1–13.
  112. Flokou A, Aletras V, Niakas D. Decomposition of potential efficiency gains from hospital mergers in Greece. Health Care Manag Sci. 2017;20: 467–484. pmid:27068659
  113. Xenos P, Yfantopoulos J, Nektarios M, Polyzos N, Tinios P, Constantopoulos A. Efficiency and productivity assessment of public hospitals in Greece during the crisis period 2009–2012. Cost Eff Resour Alloc. 2017;15. pmid:28450811
  114. Flokou A, Aletras V, Niakas D. A window-DEA based efficiency evaluation of the public hospital sector in Greece during the 5-year economic crisis. PLoS One. 2017;12: e0177946. pmid:28542362
  115. Str S, Kalogeropoulou M. The impact of economic crisis to hospital sector and the efficiency of Greek public hospitals. European Journal of Business and Social Sciences. 2016.
  116. Tiwari SP, Shukla V. DEA: A new approach to measure hospital operations efficiency. International Journal of Technical Research and Applications. 2015;3: 368–373. Available: www.ijtra.com
  117. Yousefi Nayer M, Fazaeli AA, Hamidi Y. Hospital efficiency measurement in the west of Iran: data envelopment analysis and econometric approach. Cost Effectiveness and Resource Allocation. 2022;20: 1–6.
  118. Matranga D, Sapienza F. Congestion analysis to evaluate the efficiency and appropriateness of hospitals in Sicily. Health Policy. 2015;119: 324–332. pmid:25561357
  119. Anthun KS, Kittelsen SAC, Magnussen J. Productivity growth, case mix and optimal size of hospitals. A 16-year study of the Norwegian hospital sector. Health Policy. 2017;121: 418–425. pmid:28214046
  120. Sultan WIM, Crispim J. Measuring the efficiency of Palestinian public hospitals during 2010–2015: An application of a two-stage DEA method. BMC Health Serv Res. 2018;18: 1–17.
  121. Stefko R, Gavurova B, Kocisova K. Healthcare efficiency assessment using DEA analysis in the Slovak Republic. Health Econ Rev. 2018;8. pmid:29523981
  122. Mujasi PN, Asbu EZ, Puig-Junoy J. How efficient are referral hospitals in Uganda? A data envelopment analysis and Tobit regression approach. BMC Health Serv Res. 2016;16: 1–14.
  123. Ayiko R, Mujasi PN, Abaliwano J, Turyareeba D, Enyaku R, Anguyo R, et al. Levels, trends and determinants of technical efficiency of general hospitals in Uganda: Data envelopment analysis and Tobit regression analysis. BMC Health Serv Res. 2020;20: 1–12. pmid:33023598
  124. Zhang X, Tone K, Lu Y. Impact of the Local Public Hospital Reform on the Efficiency of Medium-Sized Hospitals in Japan: An Improved Slacks-Based Measure Data Envelopment Analysis Approach. Health Serv Res. 2018;53: 896–918. pmid:28266025
  125. DEA-Solver-Pro. DEA-Solver-Pro Newsletter No. 20. In: https://saitech.capoo.jp/en/profile-of-dea-solverpro/ [Internet]. 2022 [cited 26 Jun 2023].
  126. Sultan WIM, Crispim J. Are public hospitals reforming efficiently in West Bank? Confl Health. 2018;12: 1–14.
  127. Alatawi AD, Niessen LW, Khan JAM. Determinants of Technical Efficiency in Public Hospitals: The Case of Saudi Arabia. Health Econ Rev. 2020;10: 1–11.
  128. Lacko R, Hajduová Z, Gábor V. Data Envelopment Analysis of Selected Specialized Health Centres and Possibilities of its Application in the Terms of Slovak Republic Health Care System. J Health Manag. 2017;19: 144–158.
  129. Franco Miguel JL, Fullana Belda C, Rúa Vieites A. Analysis of the technical efficiency of the forms of hospital management based on public-private collaboration of the Madrid Health Service, as compared with traditional management. Int J Health Plann Manage. 2019;34: 414–442. pmid:30303272
  130. See KF, Ng YC. Do hospital reform and ownership matter to Shenzhen hospitals in China? A productivity analysis. Econ Anal Policy. 2021;72: 145–155.
  131. Giménez V, Prieto W, Prior D, Tortosa-Ausina E. Evaluation of efficiency in Colombian hospitals: An analysis for the post-reform period. Socioecon Plann Sci. 2019;65: 20–35.
  132. Caballer-Tarazona M, Clemente-Collado A, Vivas-Consuelo D. A cost and performance comparison of Public Private Partnership and public hospitals in Spain. Health Econ Rev. 2016;6: 1–7.
  133. Kim Y, Lee KH, Choi SW. Multifaced Evidence of Hospital Performance in Pennsylvania. Healthcare (Basel). 2021;9. pmid:34199711
  134. Hunt DJ, Link CR. Better outcomes at lower costs? The effect of public health expenditures on hospital efficiency. Appl Econ. 2020;52: 400–414.
  135. Sherman HD. Hospital efficiency measurement and evaluation. Empirical test of a new technique. Med Care. 1984;22: 922–938. pmid:6436590
  136. Nunamaker TR. Measuring routine nursing service efficiency: a comparison of cost per patient day and data envelopment analysis models. Health Serv Res. 1983;18: 183. pmid:6874357
  137. Emrouznejad A, Yang GL. A survey and analysis of the first 40 years of scholarly literature in DEA: 1978–2016. Socioecon Plann Sci. 2018;61: 4–8.
  138. Emrouznejad A, Cabanda E, editors. Managing Service Productivity: Using Frontier Efficiency Methodologies and Multicriteria Decision Making for Improving Service Performance. Berlin, Heidelberg: Springer Berlin Heidelberg; 2014. https://doi.org/10.1007/978-3-662-43437-6
  139. Dyson RG, Allen R, Camanho AS, Podinovski VV, Sarrico CS, Shale EA. Pitfalls and protocols in DEA. Eur J Oper Res. 2001;132: 245–259.
  140. Cook WD, Tone K, Zhu J. Data envelopment analysis: Prior to choosing a model. Omega. 2014;44: 1–4.
  141. Emrouznejad A, De Witte K. COOPER-framework: A unified process for non-parametric projects. Eur J Oper Res. 2010;207: 1573–1586.
  142. Golany B, Roll Y. An application procedure for DEA. Omega. 1989;17: 237–250.
  143. Chilingerian JA, Sherman HD. Health-care applications: From hospitals to physicians, from productive efficiency to quality frontiers. International Series in Operations Research and Management Science. 2011;164: 445–493.
  144. Robinson P, Lowe J. Literature reviews vs systematic reviews. Aust N Z J Public Health. 2015;39: 103. pmid:25827181
  145. Webster J, Watson RT. Analyzing the Past to Prepare for the Future: Writing a Literature Review. MIS Quarterly. 2002;26: 1–11. https://www.jstor.org/stable/4132319
  146. Snyder H. Literature review as a research methodology: An overview and guidelines. J Bus Res. 2019;104: 333–339.
  147. Zha Y, Song A, Xu C, Yang H. Dealing with missing data based on data envelopment analysis and halo effect. Appl Math Model. 2013;37: 6135–6145.
  148. Kao C, Liu ST. Data envelopment analysis with missing data: a reliable solution method. Modeling Data Irregularities and Structural Complexities in Data Envelopment Analysis. 2007; 291–304.
  149. Dalkey NC, Helmer-Hirschberg O. An Experimental Application of the Delphi Method to the Use of Experts. 1962 [cited 17 Jul 2023]. https://www.rand.org/pubs/research_memoranda/RM727z1.html
  150. Strielkowski W, Chen T-A. Business Performance Evaluation for Tourism Factory: Using DEA Approach and Delphi Method. Sustainability. 2022;14: 9209.
  151. Oikonomou N, Tountas Y, Mariolis A, Souliotis K, Athanasakis K, Kyriopoulos J. Measuring the efficiency of the Greek rural primary health care using a restricted DEA model; the case of southern and western Greece. Health Care Manag Sci. 2016;19: 313–325. pmid:25913830
  152. Thi Nong NM. An application of Delphi and DEA to performance efficiency assessment of retail stores in fashion industry. The Asian Journal of Shipping and Logistics. 2022;38: 135–142.
  153. Nong TNM. Performance efficiency assessment of Vietnamese ports: An application of Delphi with Kamet principles and DEA model. The Asian Journal of Shipping and Logistics. 2023;39: 1–12.
  154. Brans J-P. L’ingénierie de la décision: élaboration d’instruments d’aide à la décision [Decision engineering: developing decision-support instruments]. In: Proceedings of the Colloque sur l’Aide à la Décision. Faculté des Sciences de l’Administration, Université Laval, Québec, QC, Canada; 1982. https://books.google.com.my/books?hl=en&lr=&id=y4rp7dDqgZcC&oi=fnd&pg=PA183&ots=pdHrJn-_W4&sig=8nuZkvcobLpQP97dewmppGobopE&redir_esc=y#v=onepage&q&f=false
  155. Doumpos M, Zopounidis C. A multicriteria classification approach based on pairwise comparisons. Eur J Oper Res. 2004;158: 378–389.
  156. Brans JP, Vincke P. Note—A Preference Ranking Organisation Method. Manage Sci. 1985;31: 647–656.
  157. Babaee S, Bagherikahvarin M, Sarrazin R, Shen Y, Hermans E. Use of DEA and PROMETHEE II to Assess the Performance of Older Drivers. Transportation Research Procedia. 2015;10: 798–808.
  158. Bagherikahvarin M, De Smet Y. A ranking method based on DEA and PROMETHEE II (a rank based on DEA & PR.II). Measurement. 2016;89: 333–342.
  159. de Oliveira MS, Steffen V, de Francisco AC, Trojan F. Integrated data envelopment analysis, multi-criteria decision making, and cluster analysis methods: Trends and perspectives. Decision Analytics Journal. 2023;8: 100271.
  160. Behzadian M, Kazemzadeh RB, Albadvi A, Aghdasi M. PROMETHEE: A comprehensive literature review on methodologies and applications. Eur J Oper Res. 2010;200: 198–215.
  161. Alidrisi H. DEA-Based PROMETHEE II Distribution-Center Productivity Model: Evaluation and Location Strategies Formulation. Appl Sci. 2021;11: 9567.
  162. Abdullah L, Chan W, Afshari A. Application of PROMETHEE method for green supplier selection: a comparative result based on preference functions. Journal of Industrial Engineering International. 2019;15: 271–285.
  163. Brans JP, De Smet Y. PROMETHEE methods. International Series in Operations Research and Management Science. 2016;233: 187–219.
  164. Ostovare M, Shahraki MR. Evaluation of hotel websites using the multicriteria analysis of PROMETHEE and GAIA: Evidence from the five-star hotels of Mashhad. Tour Manag Perspect. 2019;30: 107–116.
  165. Ahmi A. Bibliometric Analysis for Beginners. UUM Press, Universiti Utara Malaysia; 2022. https://aidi-ahmi.com/index.php/bibliometric-analysis-for-beginners
  166. Pritchard A. Statistical bibliography or bibliometrics? Journal of Documentation. 1969.
  167. Mejia C, Wu M, Zhang Y, Kajikawa Y. Exploring Topics in Bibliometric Research Through Citation Networks and Semantic Analysis. Front Res Metr Anal. 2021;6: 742311. pmid:34632257
  168. Donthu N, Kumar S, Mukherjee D, Pandey N, Lim WM. How to conduct a bibliometric analysis: An overview and guidelines. J Bus Res. 2021;133: 285–296.
  169. Agarwal R. The 5 Feature Selection Algorithms every Data Scientist should know. In: https://towardsdatascience.com/the-5-feature-selection-algorithms-every-data-scientist-need-to-know-3a6b566efd2 [Internet]. 2019 [cited 18 Jul 2023].
  170. Rasulov Z. Feature Selection—Filter Method. In: https://medium.com/analytics-vidhya/feature-selection-filter-method-43f7369cd2a5 [Internet]. 2021 [cited 18 Jul 2023].
  171. Zhang Y, Yang A, Xiong C, Wang T, Zhang Z. Feature selection using data envelopment analysis. Knowl Based Syst. 2014;64: 70–80.
  172. Wu J, Pan Y, Zhou Z. Assessing environmental performance with big data: A DEA model with multiple data resources. Comput Ind Eng. 2023;177: 109041.
  173. Benítez-Peña S, Bogetoft P, Romero Morales D. Feature Selection in Data Envelopment Analysis: A Mathematical Optimization approach. Omega. 2020;96: 102068.
  174. Allen R, Athanassopoulos A, Dyson RG, Thanassoulis E. Weights restrictions and value judgements in Data Envelopment Analysis: Evolution, development and future directions. Ann Oper Res. 1997;73: 13–34.
  175. Thanassoulis E, Portela MC, Allen R. Incorporating Value Judgments in DEA. Handbook on Data Envelopment Analysis. 2004; 99–138.
  176. Cooper WW, Seiford LM, Tone K. Introduction to Data Envelopment Analysis and Its Uses: With DEA-Solver Software and References. New York: Springer; 2006.
  177. Coelli TJ, Prasada Rao DS, O’Donnell CJ, Battese GE. An Introduction to Efficiency and Productivity Analysis. New York: Springer; 2005.
  178. Smith P. Model misspecification in Data Envelopment Analysis. Ann Oper Res. 1997;73: 233–252.
  179. Imani A, Alibabayee R, Golestani M, Dalal K. Key Indicators Affecting Hospital Efficiency: A Systematic Review. Front Public Health. 2022;10: 830102. pmid:35359774
  180. Asbu EZ, Masri MD, Naboulsi M Al. Determinants of hospital efficiency: A literature review. Int J Healthc. 2020;6: 44.
  181. Cylus J, Papanicolas I, Smith PC. Health system efficiency: How to make measurement matter for policy and management. European Observatory Health Policy Series. 2016;46. Available: https://www.ncbi.nlm.nih.gov/books/NBK436888/
  182. Carini E, Gabutti I, Frisicale EM, Di Pilla A, Pezzullo AM, de Waure C, et al. Assessing hospital performance indicators. What dimensions? Evidence from an umbrella review. BMC Health Serv Res. 2020;20. pmid:33183304
  183. Peyrache A, Rose C, Sicilia G. Variable selection in Data Envelopment Analysis. Eur J Oper Res. 2020;282: 644–659.
  184. Nataraja NR, Johnson AL. Guidelines for using variable selection techniques in data envelopment analysis. Eur J Oper Res. 2011;215: 662–669.
  185. Ruggiero J. Impact assessment of input omission on DEA. 2011;4: 359–368.
  186. Wagner JM, Shimshak DG. Stepwise selection of variables in data envelopment analysis: Procedures and managerial perspectives. Eur J Oper Res. 2007;180: 57–67.