Abstract
As investment in science and technology research and development continues to grow, the operational efficiency of universities, a principal body of scientific research, has drawn sustained scholarly attention. This study evaluates the efficiency of technology transfer in 67 Chinese universities directly affiliated with the Ministry of Education from 2016 to 2020. By integrating meta-frontier analysis with the network Slacks-Based Measure (SBM) Data Envelopment Analysis (DEA) approach, we assess the overall efficiency, stage-specific efficiency, and sources of inefficiency across different types of universities. Results indicate that while some institutions operate at the optimal frontier, the overall efficiency remains moderate, with the Technology Transfer and Application (TTA) stage consistently underperforming compared to the R&D stage. Significant heterogeneity exists among university types: normal and medical & pharmaceutical universities demonstrate higher efficiency levels, whereas comprehensive, science and engineering, and agricultural and forestry universities exhibit notable inefficiencies, particularly in the TTA stage. Further decomposition reveals that technological gaps are the dominant source of inefficiency, especially in the later stage of the innovation process. Based on these findings, we propose targeted policy recommendations aimed at improving infrastructure, enhancing management practices, and tailoring reform strategies according to institutional type. This study contributes to the understanding of internal inefficiencies in university-led technology transfer and provides practical insights for policymakers and university administrators seeking to enhance the commercialization of academic research.
Citation: Cheng B (2025) Efficiency analysis of 67 Chinese research universities considering inter-university heterogeneity: Evidence from a meta-frontier network SBM DEA model. PLoS One 20(9): e0331923. https://doi.org/10.1371/journal.pone.0331923
Editor: Taiyi He, Southwestern University of Finance and Economics, CHINA
Received: June 10, 2025; Accepted: August 24, 2025; Published: September 18, 2025
Copyright: © 2025 Bo Cheng. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: All relevant data are within the paper and its Supporting information files.
Funding: This paper is supported by a general research project of the Chongqing Municipal Education Commission (K24YG2160345). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing interests: The authors have declared that no competing interests exist.
1. Introduction
In recent years, with the continuous advancement of the “Double First-Class” initiative in China’s higher education system, university research has become an increasingly important indicator of academic excellence and societal contribution [1]. According to the 2023 Statistical Bulletin on National Science and Technology Expenditure jointly issued by the National Bureau of Statistics, the Ministry of Science and Technology, and the Ministry of Finance, China’s total expenditure on research and development (R&D) reached 3,335.71 billion yuan in 2023, reflecting a steady growth rate of 8.4% compared to the previous year. Among this, R&D funding allocated to higher education institutions amounted to 275.33 billion yuan—an increase of 14.1%, the highest among all executing entities—highlighting the growing importance of universities in the national innovation ecosystem. Notably, basic research funding reached 225.91 billion yuan in 2023, representing a year-on-year increase of 11.6%, and accounting for 6.77% of total R&D expenditure—the highest proportion on record [2]. Higher education institutions and government-affiliated research organizations remain the primary contributors to basic research, responsible for 60.2% and 31.6% of its growth, respectively. These figures underscore the significant progress made by Chinese universities in terms of R&D investment and their pivotal role in supporting national scientific and technological innovation.
However, alongside rapid development, substantial disparities have emerged across regions, institutional types, and structural dimensions. These imbalances pose new challenges to improving the efficiency and quality of university research performance. Currently, most evaluation systems focus primarily on quantitative indicators such as the number of publications, total research funding, patents granted, and awards received. While these metrics are easy to collect and implement, they suffer from several limitations. First, overreliance on quantity-based indicators may encourage a “publish-or-perish” culture that neglects the quality and real-world impact of research outcomes. Second, existing methods may not fully capture the relationship between inputs and outputs, which could result in rankings that overstate the performance of institutions with relatively high inputs but modest output efficiency. Furthermore, current evaluation frameworks lack flexibility in index selection, weight assignment, and adaptation to regional or institutional heterogeneity, limiting their ability to reflect the diverse performance of universities under different developmental goals [3,4].
To address these issues, it is essential to develop a comprehensive evaluation framework that accounts for both input and output efficiency while being capable of handling multi-dimensional and multi-indicator data. Data Envelopment Analysis (DEA), a non-parametric method widely used for performance evaluation, offers a promising solution. DEA can assess the relative efficiency of decision-making units (DMUs) without relying on predefined weights, making it particularly suitable for analyzing complex systems with multiple inputs and outputs [5–7]. In the context of higher education, an increasing number of studies have adopted network DEA models to unpack the internal processes of efficiency generation, thereby opening the so-called “black box” [3,8–10]. For example, Yang et al. (2018) proposed a two-stage network DDF-DEA framework to evaluate inefficiencies in key research universities under the Ministry of Education, integrating Luenberger productivity indicators for decomposition analysis [11]. Ma et al. (2022) quantitatively assessed university technology transfer efficiency across three stages—research innovation, experimental development, and value creation—in 31 Chinese universities [12]. Liu et al. (2024) evaluated the research performance of 61 Double First-Class universities using a network DEA model comprising two processes and three sub-stages [13].
Some studies have considered external and internal resource factors in evaluating university performance. Ding et al. (2023), for instance, introduced a two-stage DEA model incorporating fixed-budget constraints based on government grants [14]. Chen et al. (2023) proposed an aggregated two-stage DEA approach from the perspective of resource sharing and internal-external linkages to measure efficiency scores for 52 Chinese universities in 2014 [15]. Another strand of literature focuses on identifying key determinants of university performance across institutions [3,16,17].
Although some recent studies, e.g., [13,18,19], have acknowledged the need for group-wise comparisons, few have explicitly accounted for institutional heterogeneity in frontier construction. In practice, universities differ in aspects such as funding structures, disciplinary specializations, research infrastructure, student composition, and output profiles. Without accounting for such variations, efficiency estimates may not fully reflect actual performance differences.
In addition to methodological advances in efficiency evaluation, the second-stage analysis can also be applied to examine the determinants of efficiency through regression-based approaches. Existing studies emphasize that university performance is shaped by a combination of external environmental conditions and internal institutional characteristics [16,17,20–22]. External factors typically include the regional economic environment, the extent of governmental financial support, and policy incentives, all of which influence the resources available for academic and research activities. Internal factors, on the other hand, relate to institutional capabilities and strategic orientation, such as faculty composition and the degree of international engagement. Together, these dimensions form a comprehensive framework for understanding the drivers of efficiency variations among universities, particularly within China’s “Double First-Class” initiative, where policy priorities and institutional contexts interact in shaping performance outcomes.
Building upon these works, this study employs a two-stage network SBM DEA model to evaluate the efficiency of 67 key universities directly affiliated with the Ministry of Education in China. To address institutional heterogeneity, we adopt the meta-frontier framework proposed by [23], classifying the sample into five categories: Comprehensive Universities, Normal Universities, Medical and Pharmaceutical Universities, Science and Engineering Universities, and Agricultural and Forestry Universities [24]. By decomposing inefficiency into Technological Gap Ratio Inefficiency (TGRI) and Management Inefficiency (MI), we provide deeper insights into the sources of inefficiency across groups. The main contributions of this study are as follows:
- Structured efficiency assessment of Chinese research universities–We develop a comprehensive framework that evaluates the efficiency of 67 key universities directly affiliated with the Ministry of Education through a two-stage network SBM DEA model, capturing both the R&D stage and the Technology Transfer and Application (TTA) stage.
- Classification-based meta-frontier analysis–By adopting a meta-frontier framework and grouping universities into five categories (Comprehensive, Normal, Medical and Pharmaceutical, Science and Engineering, and Agricultural and Forestry), we account for institutional heterogeneity in production technology and provide more accurate group-wise efficiency benchmarks.
- Identification of inefficiency sources–We decompose overall inefficiency into Technological Gap Ratio Inefficiency (TGRI) and Management Inefficiency (MI), revealing whether performance gaps are driven primarily by technological disparities across university types or by internal managerial shortcomings.
The remainder of this paper is structured as follows: Section 2 outlines the evolution of the research model and variable definitions; Section 3 presents statistical descriptions and preprocessing of variables; Section 4 reports the empirical findings, including overall efficiency, stage-specific efficiency, and inefficiency decomposition; and Section 5 concludes with policy implications and future research directions.
2. Method
2.1. Model development
Charnes et al. (1978) developed the CCR model, based on the efficiency measurement framework introduced by Farrell (1957), to evaluate the performance of DMUs with multiple inputs and multiple outputs [6,25]. Banker et al. (1984) later extended the CCR model by incorporating variable returns to scale (VRS), resulting in what is now known as the BCC model [5]. However, both of these foundational models are radial data envelopment analysis (DEA) approaches. In 2001, Tone proposed the Slacks-Based Measure (SBM) model, which differs from traditional radial models by focusing on input and output slacks directly. This non-radial approach provides a scalar efficiency score that avoids the overestimation often associated with radial DEA models [7]. Moreover, production processes are rarely as simple as a single-stage transformation. To address this limitation, Färe et al. (2007, 2014) introduced Network Data Envelopment Analysis (Network DEA), which explicitly accounts for intermediate products and internal structures, thereby opening up what was previously treated as a “black box” in conventional DEA models [8,9]. Building on this, Tone and Tsutsui (2009) proposed a weighted SBM Network DEA model that incorporates the linkages between sub-units within a DMU [26]. In this framework, each sub-unit is treated as a Sub-DMU, and the SBM approach is applied to derive optimal efficiency scores while considering interdependencies among departments. Subsequently, in 2014, Tone and Tsutsui further extended their work by integrating dynamic elements into the network structure. Based on the Dynamic SBM model proposed earlier [27], they introduced a weighted SBM Dynamic Network DEA model. This model not only considers the internal linkages among Sub-DMUs but also incorporates carry-over activities across time periods as temporal linkages [28].
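To make the non-radial idea concrete, Tone's (2001) single-stage SBM score can be linearized with the standard Charnes–Cooper transformation and solved as an ordinary linear program. The sketch below is an illustration under constant returns to scale, not the authors' code; the function name and the toy data are our own.

```python
import numpy as np
from scipy.optimize import linprog

def sbm_efficiency(X, Y, o):
    """Non-oriented SBM efficiency (Tone 2001, CRS) of DMU `o`.

    X: (m, n) inputs, Y: (s, n) outputs; columns index the n DMUs.
    Returns the scalar efficiency rho in (0, 1].
    """
    m, n = X.shape
    s, _ = Y.shape
    xo, yo = X[:, o], Y[:, o]
    # Variable vector after the Charnes-Cooper transform: [t, Lam(n), Sm(m), Sp(s)]
    nv = 1 + n + m + s
    c = np.zeros(nv)
    c[0] = 1.0                              # objective: t - (1/m) * sum(Sm_i / x_io)
    c[1 + n:1 + n + m] = -1.0 / (m * xo)
    A_eq, b_eq = [], []
    # Normalization: t + (1/s) * sum(Sp_r / y_ro) = 1
    row = np.zeros(nv)
    row[0] = 1.0
    row[1 + n + m:] = 1.0 / (s * yo)
    A_eq.append(row); b_eq.append(1.0)
    # Input constraints: t * x_o = X Lam + Sm
    for i in range(m):
        row = np.zeros(nv)
        row[0] = xo[i]
        row[1:1 + n] = -X[i]
        row[1 + n + i] = -1.0
        A_eq.append(row); b_eq.append(0.0)
    # Output constraints: t * y_o = Y Lam - Sp
    for r in range(s):
        row = np.zeros(nv)
        row[0] = yo[r]
        row[1:1 + n] = -Y[r]
        row[1 + n + m + r] = 1.0
        A_eq.append(row); b_eq.append(0.0)
    res = linprog(c, A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=[(1e-9, None)] + [(0, None)] * (n + m + s))
    return res.fun

# Toy example: one input, one output, two DMUs.
# DMU 0 produces the same output from half the input, so it dominates DMU 1.
X = np.array([[1.0, 2.0]])
Y = np.array([[1.0, 1.0]])
rho0 = sbm_efficiency(X, Y, 0)   # 1.0: on the frontier
rho1 = sbm_efficiency(X, Y, 1)   # 0.5: half of its input is slack
```

Because the slacks enter the score directly, DMU 1 is penalized for the full excess input, which a radial model projecting proportionally could understate.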
Furthermore, due to differences among universities in terms of funding availability, disciplinary strengths, and geographical location, their actual performance outcomes may vary significantly. Ignoring these heterogeneities could obscure the sources of inefficiency, leading to biased efficiency estimates and reducing the model’s relevance to real-world conditions. To address this issue, this study draws on the meta-frontier framework proposed by [29] and [30], which allows for a more accurate comparison across heterogeneous groups. We integrate this meta-frontier approach with a two-stage network SBM model, and further develop a chained network two-stage SBM model under the meta-frontier framework. Fig 1 shows the chained network structure of the proposed research.
2.2. Parameter and input-output variable settings
All universities, denoted as $\mathrm{DMU}_j$ $(j = 1,\ldots,N)$, are partitioned into $G$ distinct groups, where $N = \sum_{g=1}^{G} N_g$ and $N_g$ is the number of universities in group $g$. Under the meta-frontier framework, each $\mathrm{DMU}_p$ selects the weights that are most favorable to its own efficiency score, thereby maximizing its measured efficiency; the corresponding optimization program is presented in Section 2.3.1. The notation used throughout the model is defined below.
Suppose there are $n$ decision-making units (DMUs) $(j = 1,\ldots,n)$; $K$ stages $(k = 1,\ldots,K)$; and $T$ time periods $(t = 1,\ldots,T)$. Let $m_k$ and $r_k$ denote the number of inputs and outputs, respectively, at each stage $k$. The notation $(k,h)$ represents the link from stage $k$ to stage $h$, and $L_{hk}$ denotes the set of links between divisions $h$ and $k$. The definitions of inputs, outputs, and links are summarized as follows.
2.2.1. Inputs and outputs.
$x_{ijt}^{k}$ refers to input $i$ at time period $t$ for $\mathrm{DMU}_j$ at division $k$. As far as this study is concerned, in the first stage, the total number of full-time equivalent R&D personnel (in person-years) and the total R&D expenditure (in thousand yuan) are selected as the input variables. In the second stage, the number of personnel engaged in the application of R&D outcomes and science-technology services (in person-years), along with the corresponding funding allocated to these activities (in thousand yuan), are treated as additional inputs.
$y_{rjt}^{k}$ refers to output $r$ in time period $t$ for $\mathrm{DMU}_j$ at division $k$. The outputs from the first stage include the number of academic monographs, research articles published, software copyrights registered, patents granted, and consultation reports issued. The final outputs in the second stage consist of the number of highly cited papers, adopted consultation reports, commissioned research funding received from enterprises and institutions in the humanities and social sciences (in thousand yuan), and actual revenue generated from technology transfer (in thousand yuan).
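To make the two-stage variable layout concrete, the indicators listed above can be organized as a simple data structure (a sketch; the labels are our own shorthand for the variables named in the text, and the first-stage outputs act as the links feeding the TTA stage):

```python
# Variable layout of the two-stage network described in Section 2.2.1.
# Stage-1 outputs are the intermediate (link) variables entering the TTA stage,
# which additionally consumes its own fresh inputs.
stages = {
    "R&D": {
        "inputs": [
            "full-time equivalent R&D personnel (person-years)",
            "total R&D expenditure (thousand yuan)",
        ],
        "outputs": [                      # links into the TTA stage
            "academic monographs",
            "research articles published",
            "software copyrights registered",
            "patents granted",
            "consultation reports issued",
        ],
    },
    "TTA": {
        "inputs": [
            "R&D-application / S&T-service personnel (person-years)",
            "funding for these activities (thousand yuan)",
        ],
        "outputs": [
            "highly cited papers",
            "adopted consultation reports",
            "commissioned research funding (thousand yuan)",
            "technology transfer revenue (thousand yuan)",
        ],
    },
}
```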
2.3. Non-parametric two-stage meta-frontier network SBM
2.3.1. Meta-frontier network SBM DEA.
Under the meta-frontier framework, each DMUp can choose the most favorable set of weights for its final outputs to maximize its own efficiency score. Consequently, the efficiency of a specific DMUp under the meta-frontier can be obtained by solving the following linear programming formulation:
Overall efficiency:
$$\rho_o^{*}=\min \frac{\sum_{k=1}^{2} w^{k}\left[1-\frac{1}{m_k}\sum_{i=1}^{m_k}\frac{s_{io}^{k-}}{x_{io}^{k}}\right]}{\sum_{k=1}^{2} w^{k}\left[1+\frac{1}{r_k}\sum_{r=1}^{r_k}\frac{s_{ro}^{k+}}{y_{ro}^{k}}\right]}$$

Subject to:

Research and Development (R&D) Stage:
$$x_{io}^{1}=\sum_{j=1}^{n}\lambda_{j}^{1}x_{ij}^{1}+s_{io}^{1-},\qquad y_{ro}^{1}=\sum_{j=1}^{n}\lambda_{j}^{1}y_{rj}^{1}-s_{ro}^{1+},\qquad \lambda_{j}^{1}\ge 0$$

Technology Transfer and Application (TTA) Stage:
$$x_{io}^{2}=\sum_{j=1}^{n}\lambda_{j}^{2}x_{ij}^{2}+s_{io}^{2-},\qquad y_{ro}^{2}=\sum_{j=1}^{n}\lambda_{j}^{2}y_{rj}^{2}-s_{ro}^{2+},\qquad \lambda_{j}^{2}\ge 0$$

Link between the two stages (free-link case):
$$\sum_{j=1}^{n}\lambda_{j}^{1}z_{j}^{(1,2)}=\sum_{j=1}^{n}\lambda_{j}^{2}z_{j}^{(1,2)},\qquad z_{o}^{(1,2)}=\sum_{j=1}^{n}\lambda_{j}^{2}z_{j}^{(1,2)}+s_{o}^{(1,2)}$$

Here $s_{io}^{1-}$ and $s_{ro}^{1+}$ are the input/output slacks at stage 1; $s_{o}^{(1,2)}$ is the link slack (free in sign); $s_{io}^{2-}$ and $s_{ro}^{2+}$ are the input/output slacks at stage 2. $w^{k}$ indicates the weight of stage $k$ and conforms to $\sum_{k}w^{k}=1$, $w^{k}\ge 0$; in this study, both stages are treated as equally important, i.e., $w^{1}=w^{2}=0.5$. Under the meta-frontier, the intensity vectors $\lambda^{k}$ range over all $n$ universities, whereas under a group-specific frontier they are restricted to the members of the corresponding group.
2.4. Technology gap ratio (TGR) and inefficiency decomposition
Definition 1: The TGR measures the technological efficiency gap between the Meta-Frontier (MF) and the Group-Frontier (GF), and indicates whether a DMU has adopted the best available production technology. Since the meta-frontier envelops the group frontiers (it is constructed as the convex combination of all groups), the efficiency obtained under the MF is in general no greater than that under the GF; that is, the MFE is less than or equal to the GFE, which implies $\mathrm{TGR}\le 1$. The TGR is calculated by the following formula:
$$\mathrm{TGR}=\frac{\mathrm{MFE}}{\mathrm{GFE}}$$
Definition 2: To further explore efficiency under the MF and the GF, together with the policy implications behind their formation, we follow the methodology of [23,27,31]. DMU inefficiency under the MF can be decomposed into Technological Gap Ratio Inefficiency (TGRI) at the frontier of a specific group and managerial inefficiency under the specific group (Group Managerial Inefficiency, GMI). TGRI represents the inefficiency of $\mathrm{DMU}_p$ at the group-specific frontier that stems from the technological gap between the meta-frontier and the group-specific frontier:
$$\mathrm{TGRI}=\mathrm{GFE}\times(1-\mathrm{TGR})=\mathrm{GFE}-\mathrm{MFE}$$
Definition 3: Group Managerial Inefficiency (GMI) represents the inefficiency of a DMU relative to its own group frontier, attributable to managerial failure rather than to the technology gap. It is calculated as:
$$\mathrm{GMI}=1-\mathrm{GFE}$$
Meanwhile, the inefficiency based on the meta-frontier can be expressed as:
$$1-\mathrm{MFE}=\mathrm{TGRI}+\mathrm{GMI}$$
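The three definitions combine into a simple accounting identity: meta-frontier inefficiency splits exactly into a technology-gap part and a managerial part. A minimal sketch, using as illustrative inputs the sample-average scores reported in the empirical section (MFE = 0.761, GFE = 0.905):

```python
def decompose(mfe, gfe):
    """Decompose meta-frontier inefficiency per Definitions 1-3.

    mfe: efficiency under the meta-frontier (MFE <= GFE).
    gfe: efficiency under the group-specific frontier.
    Returns (TGR, TGRI, GMI).
    """
    tgr = mfe / gfe              # technology gap ratio, <= 1
    tgri = gfe * (1.0 - tgr)     # = GFE - MFE: technology-gap inefficiency
    gmi = 1.0 - gfe              # group managerial inefficiency
    return tgr, tgri, gmi

tgr, tgri, gmi = decompose(0.761, 0.905)
# The identity 1 - MFE = TGRI + GMI holds by construction:
assert abs((1 - 0.761) - (tgri + gmi)) < 1e-9
```

For these averages the total inefficiency of 0.239 splits into 0.144 from the technology gap and 0.095 from management, consistent with the paper's finding that technological gaps dominate.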
3. Statistical description, data processing and source
3.1. Statistical description of input, intermediate and output variables
Table 1 shows the statistical description of the input, output and intermediate variables at the R&D stage and TTA stage respectively.
3.2. Data processing and source
In analyzing the transformation of scientific and technological achievements into practical applications, it is crucial to account for the inherent time lags between research inputs and their resulting outputs. Because universities require time to conduct research and innovation, experimental development, and value creation activities, and consistent with methodologies used in previous studies [12,32,33], this study applies a one-year lag between successive stages. Specifically, input data such as R&D investment correspond to year $t$, while intermediate outputs (e.g., research outcomes) and final outputs related to technology transfer are taken from years $t+1$ and $t+2$, respectively. For this study, input data on technology transfer activities cover the period from 2014 to 2018; outcome indicators reflecting scientific achievements are drawn from 2015 to 2019; and data on actual technology commercialization span from 2016 to 2020.
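This lag structure amounts to pairing each year-$t$ input observation with the $t+1$ intermediate and $t+2$ final outputs before the DEA panel is built. A minimal sketch with hypothetical values (the indicator series and numbers are invented for illustration):

```python
# Hypothetical panel for one university: year -> indicator value.
rd_input     = {2014: 100, 2015: 110, 2016: 120, 2017: 130, 2018: 140}
research_out = {2015: 80,  2016: 85,  2017: 90,  2018: 95,  2019: 100}
transfer_out = {2016: 40,  2017: 42,  2018: 45,  2019: 50,  2020: 55}

def align_lagged(inputs, intermediates, finals, lag=1):
    """Pair inputs in year t with intermediates in t+lag and finals in t+2*lag."""
    rows = []
    for t in sorted(inputs):
        if t + lag in intermediates and t + 2 * lag in finals:
            rows.append((t, inputs[t], intermediates[t + lag], finals[t + 2 * lag]))
    return rows

panel = align_lagged(rd_input, research_out, transfer_out)
# Each observation spans three calendar years: (t, x_t, z_{t+1}, y_{t+2}),
# so 2014-2018 inputs map onto 2015-2019 intermediates and 2016-2020 finals.
```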
The majority of the statistical data used in this study are obtained from the Data Analysis Report and Compilation of Basic Information on Teaching and Education in Universities Directly Affiliated with the Ministry of Education (2013–2020). The dataset includes various indicators such as R&D funding and personnel, funding and staffing for the application of R&D outcomes and science-technology services, number of scientific publications and monographs, submission and adoption rates of think tank reports, number of granted patents, value of signed technology transfer contracts, and commissioned research funding from enterprises and institutions. In addition, data on software copyrights are sourced from the Tianyancha database (www.tianyancha.com). Based on the registration dates, we compiled the annual number of software copyright registrations for each university. Citation-based indicators such as total citation counts, average citations per paper, number of highly cited papers, field-normalized impact scores, proportion of highly cited papers, and H-index are derived from Clarivate Analytics’ InCites database, which is built upon authoritative citation data from Web of Science (including SCIE, SSCI, and A&HCI).
4. Empirical analysis
4.1. Overall efficiency analysis
Overall, the comprehensive efficiency of the sampled universities remained relatively stable between 2016 and 2020 (Fig 2). While the efficiency scores exhibited a fluctuating pattern—first increasing, then decreasing, and subsequently rising again—they generally stayed within the range of 0.7 to 0.8, indicating room for improvement in overall performance. The efficiency scores of the R&D stage remained consistently high, exceeding 0.9 throughout the five-year period, suggesting that these universities were operating very close to the production frontier in terms of research input utilization.
In contrast, the efficiency of the technology transfer and application (TTA) stage followed a trend of initial increase, followed by a decline, and eventual stabilization at around 0.7. This relatively low level of efficiency highlights a key challenge in the innovation process: although research inputs are being effectively converted into academic outputs, the subsequent transformation of these outputs into practical applications remains inefficient.
Taken together, these findings indicate that the sampled universities did not achieve full efficiency during the study period, primarily due to inefficiencies in the TTA stage. This further confirms the prevailing tendency in Chinese higher education institutions to emphasize research output over the practical application of scientific achievements—what can be described as a pattern of “valuing research output while neglecting technology transfer.” As a result, the effective commercialization and societal impact of technological innovations have yet to receive sufficient attention and development.
4.2. Overall, stage and TGR analysis of different universities
As shown in Table 2, the average overall efficiency scores, as well as the stage-specific efficiencies for the R&D and Technology Transfer and Application (TTA) stages, were calculated for all sampled universities over the period 2016–2020. Thirteen universities—BUAA, BUCT, BNU, BUCM, CHU, HHU, NCEPU, CCNU, NJU, SNNU, OUC, USTC, and RUC—achieved optimal efficiency scores of 1 in both the comprehensive and the two stage-specific evaluations, indicating that they operated on the best-practice frontier. Among these efficient institutions, three are comprehensive universities (NJU, OUC, and RUC), six are science and engineering universities (BUAA, BUCT, CHU, HHU, NCEPU, and USTC), three are normal universities (BNU, CCNU, and SNNU), and one is a medical university (BUCM).
In contrast, ten universities—including TongJU, CUP, DUT, HIT, CAU, NWAFU, SCU, USTB, BUPT, and BJTU—exhibited generally low levels of comprehensive and TTA stage efficiency. Among them, two are comprehensive universities (TongJU, SCU), six are science and engineering universities (CUP, DUT, HIT, USTB, BUPT, BJTU), and two are agricultural and forestry universities (CAU, NWAFU).
Further analysis of input excesses and output shortfalls among these inefficient universities reveals the following patterns: BJTU, BUPT, SCU, and CAU show output deficiencies primarily in terms of academic publications and monographs, as well as think tank reports. USTB, CUP, DUT, and NWAFU exhibit significant output shortfalls across multiple indicators, including publications, monographs, think tank reports, adopted policy reports, technology transfer revenue, and commissioned research funding from enterprises and institutions. Finally, HIT and TongJU display both input overuse and output underperformance, particularly in terms of excessive R&D expenditure and insufficient levels of commissioned research funding and adopted think tank reports.
There are significant differences in the efficiency of technology transfer among universities under the meta-frontier and group-specific frontiers, as shown in Table 3.

First, the efficiency of technology transfer under the meta-frontier is notably lower than that under the group-specific frontier. Over the period 2016–2020, the average technology transfer efficiency under the meta-frontier was 0.761, compared to 0.905 under the group frontier. A similar pattern is observed for both the output stage and the overall efficiency.

Second, across both frontiers, the efficiency of the technology transfer stage is consistently lower than that of the output stage—consistent with earlier findings. Specifically, the average output stage efficiencies under the meta-frontier and group frontier were 0.966 and 0.981, respectively, whereas the corresponding technology transfer stage efficiencies were significantly lower at 0.695 and 0.883. Despite multiple national policies and financial incentives aimed at promoting technology transfer, its efficiency remains below expectations.

Third, there are substantial variations in technology transfer performance across different types of universities. Under the meta-frontier framework, all university categories generally exhibit the following ranking: output stage efficiency > comprehensive efficiency > technology transfer stage efficiency. In terms of comprehensive and technology transfer stage efficiencies, the ranking is as follows: medical and pharmaceutical universities > normal universities > comprehensive universities > science and engineering universities > agricultural and forestry universities. In terms of output stage efficiency, the order is: medical and pharmaceutical universities = normal universities > comprehensive universities > agricultural and forestry universities > science and engineering universities.
Notably, normal and medical universities achieved average annual efficiency scores above 0.9 across all stages, indicating strong overall performance. In contrast, comprehensive, science and engineering, and agricultural and forestry universities recorded average comprehensive and technology transfer stage efficiencies below 0.85, although their output stage efficiencies remained mostly above 0.95. This suggests that while these institutions perform well in generating research outputs, they lag behind in effectively translating those outputs into practical applications.
In addition, the average technological gap ratio (TGR) across all universities from 2016 to 2020 was 0.836, close to 1, indicating a relatively small technological gap between the group frontier and the meta-frontier on the whole. However, considerable heterogeneity exists across individual universities. On one hand, TGR values vary significantly by institutional type. Among the sampled universities, 13 achieved a TGR of 1, including six science and engineering universities, three normal universities, three comprehensive universities, and one medical university. Conversely, the five lowest-performing universities in terms of TGR were primarily composed of four agricultural and forestry universities and one science and engineering university, highlighting clear inter-type disparities. On the other hand, from a temporal perspective, the TGR increased from 0.848 in 2016 to 0.867 in 2020, indicating an overall narrowing of the technological gap over time. Specifically, normal and medical universities showed a steady increase followed by stabilization, suggesting minimal technological gaps within these groups. In contrast, comprehensive, science and engineering, and agricultural and forestry universities exhibited an initial decline followed by recovery—indicating a temporary widening of the technological gap before it began to narrow again. Nonetheless, agricultural and forestry universities still face the largest technological gap compared to other types of institutions.
4.3. Meta-frontier decomposition and inefficiency analysis
4.3.1. Four quadrant analysis of inefficient sources.
In the process of technology transfer at universities, efficiency loss (EL) can be decomposed into technological gap inefficiency (TGRI) and management inefficiency (MI). Specifically, TGRI is calculated as the difference between the group-specific frontier and meta-frontier efficiency scores, while MI reflects the gap between each university's observed performance and its potential optimal performance on its own group frontier.
By decomposing the sources of inefficiency and visualizing the results in a quadrant plot (Fig 3), this study classifies universities based on whether their average TGRI and MI values are above or below the sample mean. The classification yields four categories: (1) high-TGRI and high-MI, (2) low-TGRI and high-MI, (3) low-TGRI and low-MI, and (4) high-TGRI and low-MI.
- Quadrant I (High TGRI & High MI): This group includes ten universities such as USTB, BUPT, and UESTC, accounting for 14.9% of the total sample. These are primarily comprehensive and science and engineering universities that exhibit weaknesses in both technological capability and managerial effectiveness.
- Quadrant II (Low TGRI & High MI): This group comprises eleven universities including BIT, DUT, and NEU, representing 16.4% of the sample. Most of them are science and engineering universities, characterized by relatively small technological gaps but significant management inefficiencies.
- Quadrant III (Low TGRI & Low MI): The largest group, with 31 universities (46.3% of the sample), includes institutions such as BUAA, BUCT, and BFU. These universities perform well in both dimensions and are predominantly comprehensive universities, followed by science and engineering, normal, medical, and agricultural and forestry universities.
- Quadrant IV (High TGRI & Low MI): This group contains fifteen universities such as PKU, BJTU, and NEFU, making up 22.4% of the total. These universities show relatively strong management capabilities but face notable technological limitations. The types of universities in this group are more evenly distributed, with five each from the comprehensive, science and engineering, and agricultural and forestry categories.
Overall, the quadrant analysis reveals distinct patterns of efficiency loss among universities in the context of technology transfer. These findings suggest that targeted strategies—such as addressing technological constraints or improving management practices—are necessary to enhance overall technology transfer performance across different types of institutions.
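The quadrant assignment in Fig 3 follows directly from comparing each university's average TGRI and MI with the sample means. The sketch below reproduces that logic with hypothetical (TGRI, MI) pairs; the university labels and scores are invented for illustration.

```python
from statistics import mean

def quadrant(tgri, mi, tgri_mean, mi_mean):
    """Classify a university by its inefficiency sources relative to the sample means."""
    if tgri >= tgri_mean and mi >= mi_mean:
        return "I: high TGRI & high MI"
    if tgri < tgri_mean and mi >= mi_mean:
        return "II: low TGRI & high MI"
    if tgri < tgri_mean and mi < mi_mean:
        return "III: low TGRI & low MI"
    return "IV: high TGRI & low MI"

# Hypothetical (TGRI, MI) pairs for four universities.
scores = {
    "U1": (0.30, 0.25),   # weak technology and weak management
    "U2": (0.05, 0.30),   # small technology gap, poor management
    "U3": (0.02, 0.01),   # strong on both dimensions
    "U4": (0.40, 0.02),   # good management, large technology gap
}
tgri_bar = mean(v[0] for v in scores.values())
mi_bar = mean(v[1] for v in scores.values())
groups = {u: quadrant(t, m, tgri_bar, mi_bar) for u, (t, m) in scores.items()}
```

Each university thus lands in exactly one quadrant, which maps directly onto the targeted-strategy discussion above: Quadrant I calls for action on both fronts, II on management, IV on technology.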
4.3.2. Stage analysis of inefficiency source.
Based on the decomposition of inefficiency into technological gap inefficiency and managerial inefficiency as discussed earlier, Fig 3 presents the efficiency loss of the sampled universities across the comprehensive stage, output stage, and technology transfer stage. It also illustrates the percentage contribution of each source of inefficiency—technological versus managerial—to the overall efficiency loss. Detailed results are reported in Table 4.
Overall, a total of 25 universities operate on the best-practice frontier and exhibit no efficiency loss. However, 24 universities experience significant efficiency losses due to technological gap inefficiency (TGRI), with half of them showing TGRI accounting for more than 85% of total inefficiency. Additionally, 17 universities suffer from substantial management inefficiency (MI), among which two universities have MI contributing over 85% to their overall efficiency loss. In the R&D stage, 42 universities are located on the production frontier and thus achieve full efficiency. In contrast, 15 universities face major efficiency losses due to TGRI, with most of these institutions reporting TGRI shares exceeding 85%. Nine universities show high levels of MI, and among them, three have MI contributing more than 85% to their total inefficiency. In the TTA stage, only 13 universities perform at the optimal level. A large number of universities—34 in total—experience significant efficiency losses due to technological gaps, with 24 of them having TGRI contributions above 85%. Meanwhile, 20 universities are affected by notable management inefficiencies, and in two cases, MI accounts for over 85% of total inefficiency.
Across all stages and university types, technological differences contribute more to efficiency loss than management inefficiencies. Moreover, most of both TGRI and MI stems from the TTA stage, marking it as the primary source of inefficiency in the technology transfer process. Take Beijing Jiaotong University as an example: its overall efficiency loss in the comprehensive stage is 0.686, entirely attributed to technological gaps. While the efficiency loss in the R&D stage is relatively small (0.173), the TTA stage exhibits a much higher loss (0.851), indicating that this stage is the main driver of inefficiency.
Another representative case is Southeast University (SEU). Its comprehensive efficiency loss is 0.376, divided equally between TGRI and MI. In the R&D stage, the efficiency loss is minimal (0.082) and fully attributable to technological gaps. However, in the TTA stage, the efficiency loss rises to 0.469, with 47% due to TGRI and 53% due to MI. This suggests that while SEU faces technological limitations in both the R&D and TTA stages, its management inefficiencies are concentrated primarily in the TTA stage.
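The arithmetic behind these case figures can be sketched in a few lines, following the standard meta-frontier split in which meta-frontier inefficiency equals technological gap inefficiency plus managerial inefficiency. The Southeast University scores used below are reconstructed from the reported losses and are therefore an assumption about the underlying frontier values, not figures taken from the study's tables.

```python
def decompose(meta_eff, group_eff):
    """Split meta-frontier inefficiency (1 - meta_eff) into a
    technological-gap component (group_eff - meta_eff) and a
    managerial component (1 - group_eff), and report each
    component's share of the total loss in percent."""
    total = 1.0 - meta_eff
    tgri = group_eff - meta_eff   # gap between group and meta frontiers
    mi = 1.0 - group_eff          # slack against the group's own frontier
    if total == 0.0:
        return 0.0, 0.0, 0.0, 0.0
    return tgri, mi, 100.0 * tgri / total, 100.0 * mi / total

# Southeast University, comprehensive stage: a total loss of 0.376 split
# 50/50 implies meta_eff = 0.624 and group_eff = 0.812
# (reconstructed values, an assumption).
tgri, mi, tgri_pct, mi_pct = decompose(meta_eff=0.624, group_eff=0.812)
print(round(tgri, 3), round(mi, 3), round(tgri_pct, 1), round(mi_pct, 1))
```

A university on the meta-frontier (both scores equal to 1) returns zero for every component, matching the 25 fully efficient institutions reported above.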
These findings indicate that universities should adopt targeted strategies based on their specific sources of inefficiency. Where technological gaps dominate, efforts should focus on upgrading research infrastructure, modernizing laboratory equipment, attracting high-caliber researchers, and strengthening collaborations with enterprises and research institutions to enhance innovation capabilities. Where management inefficiencies dominate, improvements should target administrative processes, including the adoption of modern management practices, stronger internal monitoring systems, and performance-based evaluation mechanisms that ensure efficient resource allocation and utilization.
5. Findings and recommendations
5.1. Key findings
This study evaluates the efficiency of technology transfer among Chinese universities directly affiliated with the Ministry of Education from 2016 to 2020, using a combined meta-frontier and network SBM DEA approach. By analyzing overall efficiency, stage-specific efficiency, and sources of inefficiency, we provide a comprehensive assessment of the performance of university-based technology transfer in China.
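As a rough intuition for how the meta-frontier and group frontiers interact across the two stages, the toy sketch below benchmarks each stage's output-input ratio against the best performer, first within each university group and then across all universities; the technology gap ratio is the meta score divided by the group score. This is a deliberately simplified Farrell-style ratio illustration with made-up data, not the network SBM model estimated in the study.

```python
# Toy data (hypothetical): stage 1 (R&D) turns funding into patents;
# stage 2 (TTA) turns patents into licensing income.
universities = {
    # name: (group, funding, patents, licensing income)
    "U1": ("comprehensive", 100.0, 50.0, 20.0),
    "U2": ("comprehensive", 80.0, 48.0, 30.0),
    "U3": ("normal", 60.0, 30.0, 24.0),
    "U4": ("normal", 90.0, 36.0, 27.0),
}

def ratios(name):
    _, funding, patents, income = universities[name]
    return patents / funding, income / patents   # per-stage productivity

def efficiencies(names):
    """Each university's stage productivity relative to the best in `names`."""
    best = [max(ratios(n)[s] for n in names) for s in (0, 1)]
    return {n: tuple(ratios(n)[s] / best[s] for s in (0, 1)) for n in names}

meta = efficiencies(list(universities))          # frontier over all groups
group = {}
for g in {u[0] for u in universities.values()}:  # frontier within each group
    group.update(efficiencies([n for n, u in universities.items() if u[0] == g]))

# technology gap ratio: meta-frontier score / group-frontier score (always <= 1)
tgr = {n: tuple(meta[n][s] / group[n][s] for s in (0, 1)) for n in universities}
for n, (t1, t2) in sorted(tgr.items()):
    print(n, round(t1, 3), round(t2, 3))
```

Replacing the single-ratio frontier with a slacks-based measure and linking the stages through the intermediate patent variable would recover the structure of the model used in the paper; the sketch only conveys why a group can be efficient against its own frontier yet lag the meta-frontier.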
The key findings are as follows:
- 1. The overall efficiency of technology transfer is at a relatively high level, with some universities achieving full efficiency by operating on the best-practice frontier. However, notable discrepancies exist between group-specific frontiers and the meta-frontier. In particular, the efficiency of the TTA stage is significantly lower than that of the R&D stage, indicating that the transformation of research outputs into practical applications remains a weak link in the innovation chain.
- 2. Significant heterogeneity exists across university types. Normal universities and medical & pharmaceutical universities consistently perform well, with average annual efficiency scores exceeding 0.9 across all stages. In contrast, comprehensive, science and engineering, and agricultural and forestry universities exhibit lower efficiency, particularly in the TTA stage, where efficiency values fall below 0.85, suggesting the need for improved resource allocation and more effective technology transfer mechanisms.
- 3. Technological gaps are the primary source of inefficiency. Technological gap inefficiency (TGRI) accounts for a larger share of total inefficiency than management inefficiency (MI), especially in the TTA stage. Decomposition under the meta-frontier framework and the quadrant analysis show that most universities fall into the “low TGRI–low MI” category, but a number of institutions remain in the “high TGRI–high MI” quadrant, indicating dual disadvantages in both technological capability and managerial effectiveness. Overall, the TTA stage exhibits more pronounced efficiency losses from both TGRI and MI than the R&D stage. The cases of Beijing Jiaotong University and Southeast University illustrate the coexistence of technological limitations and managerial shortcomings in some institutions.
5.2. Policy recommendations
Based on the empirical findings, this study proposes the following policy recommendations to enhance the efficiency of technology transfer in Chinese higher education institutions:
- 1. Addressing technological gaps
Technological gaps are identified as the dominant source of inefficiency, particularly in the Technology Transfer and Application (TTA) stage. To bridge these gaps, universities should prioritize upgrading scientific infrastructure, including modern laboratory equipment, maintenance systems, and research facilities. Recruiting high-caliber researchers and strengthening collaborations with enterprises and research institutions can further enhance innovation capabilities. Moreover, fostering industry-university-research partnerships and accelerating the commercialization of research outcomes will help close the gap between academic research and industrial application.
- 2. Improving management efficiency
Management inefficiencies, especially in the TTA stage, indicate the need for institutional reforms. Universities should streamline technology transfer processes, minimize bureaucratic procedures, and remove unnecessary intermediaries to improve operational efficiency. Introducing modern management practices, establishing robust internal monitoring systems, and implementing performance-based evaluation frameworks can ensure effective resource utilization. Strengthening incentive mechanisms, such as financial rewards and policy support for faculty participation in technology transfer activities, is also critical.
- 3. Differentiated strategies by institutional type
Given the significant performance differences among university types, tailored strategies are necessary:
- Normal and Medical & Pharmaceutical Universities: Policies should aim to consolidate existing strengths and promote closer integration with local economies and industries to enhance the practical application of research outcomes.
- Comprehensive and Science & Engineering Universities: Efforts should focus on improving TTA efficiency by fostering interdisciplinary collaboration and deepening engagement with industry partners, increasing the rate of successful technology transfers.
- Agricultural and Forestry Universities: These universities should prioritize the development and dissemination of advanced agricultural technologies. Emphasis should be placed on bridging technological gaps and accelerating the adoption of innovative solutions in the agricultural sector.
Supporting information
S1 File. Overall, stage, and period-stage efficiency of 67 research universities.
https://doi.org/10.1371/journal.pone.0331923.s001
(7Z)
S1 Appendix. All abbreviations and nomenclature used in this study are summarized below.
https://doi.org/10.1371/journal.pone.0331923.s002
(DOCX)
Acknowledgments
We gratefully acknowledge the editor and the two anonymous reviewers for their insightful comments and constructive suggestions, which have significantly improved the quality and clarity of this manuscript.
References
- 1. Ministry of Education of the People’s Republic of China (MOE). The “Double First-Class” Initiative for Higher Education Development. 2017. Retrieved from http://www.moe.gov.cn/jyb_xwfb/s5147/201709/t20170922_315007.html
- 2. National Bureau of Statistics, Ministry of Science and Technology, & Ministry of Finance. 2023 National Statistical Bulletin on R&D Expenditure. National Bureau of Statistics of China. 2024, October 2. [Accessed April 30, 2025]. Retrieved from https://www.gov.cn/lianbo/bumen/202410/content_6978191.htm
- 3. Witte KD, López-Torres L. Efficiency in education: a review of literature and a way forward. Journal of the Operational Research Society. 2017;68(4):339–63.
- 4. Sun Y, Wang D, Yang F, Ang S. Efficiency evaluation of higher education systems in China: A double frontier parallel DEA model. Computers & Industrial Engineering. 2023;176:108979.
- 5. Banker RD, Charnes A, Cooper WW. Some Models for Estimating Technical and Scale Inefficiencies in Data Envelopment Analysis. Management Science. 1984;30(9):1078–92.
- 6. Charnes A, Cooper WW, Rhodes E. Measuring the efficiency of decision making units. European Journal of Operational Research. 1978;2(6):429–44.
- 7. Tone K. A slacks-based measure of efficiency in data envelopment analysis. European Journal of Operational Research. 2001;130(3):498–509.
- 8. Färe R, Grosskopf S, Whittaker G. Network DEA. In Zhu J, Cook WD, (Eds.). Modeling Data Irregularities and Structural Complexities in Data Envelopment Analysis. Springer US; 2007. p. 209–40.
- 9. Färe R, Grosskopf S, Whittaker G. Network DEA II. In Cook WD, Zhu J, (Eds.). International Series in Operations Research & Management Science. Springer US. 2014. p. 307–27.
- 10. Liu J, Ji L, Sun Y, Chiu Y, Zhao H. Unleashing the convergence between SDG 9 and SDG 8 towards pursuing SDGs: Evidence from two urban agglomerations in China during the 13th five-year plan. Journal of Cleaner Production. 2024;434:139924.
- 11. Yang G, Fukuyama H, Song Y. Measuring the inefficiency of Chinese research universities based on a two-stage network DEA model. Journal of Informetrics. 2018;12(1):10–30.
- 12. Ma D, Cai Z, Zhu C. Technology transfer efficiency of universities in China: A three-stage framework based on the dynamic network slacks-based measurement model. Technology in Society. 2022;70:102031.
- 13. Liu X, Wu X, Zhang W. A new DEA model and its application in performance evaluation of scientific research activities in the universities of China’s double first-class initiative. Socio-Economic Planning Sciences. 2024;92:101839.
- 14. Ding T, Zhang Y, Zhang D, Li F. Performance evaluation of Chinese research universities: A parallel interactive network DEA approach with shared and fixed sum inputs. Socio-Economic Planning Sciences. 2023;87:101582.
- 15. Chen Y, Pan Y, Liu H, Wu H, Deng G. Efficiency analysis of Chinese universities with shared inputs: An aggregated two-stage network DEA approach. Socio-Economic Planning Sciences. 2023;90:101728.
- 16. Lee BL. Efficiency of Research Performance of Australian Universities: A Reappraisal using a Bootstrap Truncated Regression Approach. Economic Analysis and Policy. 2011;41(3):195–203.
- 17. Wolszczak-Derlacz J, Parteka A. Efficiency of European public higher education institutions: a two-stage multicountry approach. Scientometrics. 2011;89(3):887–917. pmid:22081733
- 18. Chen Y, Ma X, Yan P, Wang M. Operating efficiency in Chinese universities: An extended two-stage network DEA approach. Journal of Management Science and Engineering. 2021;6(4):482–98.
- 19. Ding T, Yang J, Wu H, Wen Y, Tan C, Liang L. Research performance evaluation of Chinese university: A non-homogeneous network DEA approach. Journal of Management Science and Engineering. 2021;6(4):467–81.
- 20. Chen B, Chen Y, Sun Y, Tong Y, Liu L. The measurement, level, and influence of resource allocation efficiency in universities: empirical evidence from 13 “double first class” universities in China. Humanit Soc Sci Commun. 2024;11(1).
- 21. Ginder W, Kwon W-S, Byun S-E. Effects of Internal–External Congruence-Based CSR Positioning: An Attribution Theory Approach. J Bus Ethics. 2019;169(2):355–69.
- 22. Wang C, Fu B. A study on the efficiency of allocation and its influencing factors on innovation and entrepreneurship education resources in Chinese universities under the five-in-one model. The International Journal of Management Education. 2023;21(1):100755.
- 23. Chiu C-R, Liou J-L, Wu P-I, Fang C-L. Decomposition of the environmental inefficiency of the meta-frontier with undesirable output. Energy Economics. 2012;34(5):1392–9.
- 24. MOE (Ministry of Education of the People’s Republic of China). List of Higher Education Institutions Directly Affiliated to the Ministry of Education. 2023. Retrieved from http://www.moe.gov.cn
- 25. Farrell MJ. The Measurement of Productive Efficiency. Journal of the Royal Statistical Society Series A (General). 1957;120(3):253.
- 26. Tone K, Tsutsui M. Network DEA: A slacks-based measure approach. European Journal of Operational Research. 2009;197(1):243–52.
- 27. Tone K, Tsutsui M. Dynamic DEA: A slacks-based measure approach. Omega. 2010;38(3–4):145–56.
- 28. Tone K, Tsutsui M. Dynamic DEA with network structure: A slacks-based measure approach. Omega. 2014;42(1):124–31.
- 29. Battese GE, Rao DSP, O’Donnell CJ. A Metafrontier Production Function for Estimation of Technical Efficiencies and Technology Gaps for Firms Operating Under Different Technologies. Journal of Productivity Analysis. 2004;21(1):91–103.
- 30. O’Donnell CJ, Rao DSP, Battese GE. Metafrontier frameworks for the study of firm-level efficiencies and technology ratios. Empirical Economics. 2007;34(2):231–55.
- 31. Wang Q, Su B, Sun J, Zhou P, Zhou D. Measurement and decomposition of energy-saving and emissions reduction performance in Chinese cities. Applied Energy. 2015;151:85–92.
- 32. Lim D-J, Kim M-S. Measuring dynamic efficiency with variable time lag effects. Omega. 2022;108:102578.
- 33. Yue W, Gao J, Suo W. Efficiency evaluation of S&T resource allocation using an accurate quantification of the time-lag effect and relation effect: a case study of Chinese research institutes. Research Evaluation. 2020;29(1):77–86.