
The graduation shift of German universities of applied sciences


Abstract

In research into higher education, the evaluation of completion and dropout rates has generated a steady stream of interest for decades. While most studies only calculate rates for both phenomena using student and graduate numbers, we propose to additionally consider the budget available to universities. We transfer the idea of the excellence shift indicator [1] from the research to the teaching area, in particular to the completion rate of educational entities. The graduation shift shows institutions’ ability to produce graduates as measured against their basic academic teaching efficiency. An important advantage of the graduation shift is that it avoids the well-known heterogeneity problem in efficiency measurements. Our study is based on German universities of applied sciences. Given their politically determined focus on education, this dataset is well suited for introducing and evaluating the graduation shift. Using a comprehensive dataset covering the years 2008 to 2013, we show that the graduation shift produces results that correlate closely with those of the well-known graduation rate and of standard Data Envelopment Analysis (DEA). Compared to the graduation rate, the graduation shift is preferable because it takes the budget of institutions into account. Compared to the DEA, the graduation shift is easy to compute, its results are robust, and non-economists can understand them. We therefore recommend the graduation shift as an alternative method of efficiency measurement in the teaching area.


Introduction

In times of new public management, universities are no longer solely interested in measures of research excellence, but also in the efficiency of research: can the given input (in terms of employees or expenditures) be efficiently transformed into research output (in terms of publications or patents)? Bornmann et al. [1] introduced the excellence shift to assess the efficiency of (higher) education institutions in conducting (successful) research. The method makes it possible to avoid the well-known heterogeneity problem in efficiency research, with either the data or the institutions being too varied to be fairly compared [2, 3]. Institutions are heterogeneous for many reasons, but they differ primarily in their location and focus: institutions are located in different states and thus operate under conditions that are not comparable, and some universities emphasize research whereas others are more teaching oriented. The advantage of the excellence shift is that institutions are compared based on their own basic efficiency, which avoids comparing the incomparable. For the calculation, two output variables depicting the institutional research side are used, where one is a subset of the other. The term “shift” refers to the fact that the indicator measures not only the efficiency of institutions, but also the magnitude of change when the basic efficiency level is compared to an excellence level. For the excellence shift, Bornmann et al. [1] employ the total number of papers and, as a subset, the number of highly cited papers. Based on these data sources, the shift shows the institutions’ ability to produce highly cited papers as measured against their basic academic research efficiency (using institutions’ total budget as input indicator).

This paper makes three contributions to the literature:

  1. Firstly, we transfer the idea of the excellence shift from the research area [1] to the teaching area, particularly to the completion rate of educational entities. Completion and dropout are topics of consistently high interest in research on higher education [4, 5], especially in Germany [6]. We call the transferred approach the graduation shift. It is based on two output variables: the number of students at a university, which signals how attractive the university is, and the number of graduates, which indicates how successfully the graduation process works. The number of graduates is a subset of the number of students who have enrolled at that university. The main input variable is the expenses of the institution. The graduation shift then shows an institution’s ability to produce graduates as measured against its basic academic teaching efficiency. The output and input variables used in this study are an established choice for efficiency studies and have been used by Agasisti and Dal Bianco [7], for example. To compare the results of the graduation shift with the results of a standard efficiency method, we also employ the variables in a Data Envelopment Analysis (DEA). Additionally, we contrast the results of our graduation shift with a simple measure of completion: the graduation rate, which shows how many students graduate from the university.
  2. Secondly, unlike the previous literature looking at conventional universities, we deliberately use teaching data for universities of applied sciences (so-called Fachhochschulen). These institutions complement the existing German conventional universities by having a politically predefined focus on education (rather than research or research training). They emerged in the 1960s in response to the need for skilled labor and the growing demand for student places. Their graduates receive the same formal title as graduates of conventional universities, differing only in their place of study. Most of the institutions are multidisciplinary and vocationally oriented, and align their subject range with the regional economy [8]. Hence, our teaching-oriented efficiency approach is well suited to assess their efficiency. Despite their growing status within the German higher education sector, with half of all existing institutions being universities of applied sciences, they have only rarely been the subject of efficiency studies to date.
  3. Thirdly, we outline some drawbacks of the excellence shift. For example, it does not appropriately account for the input parameters in specific situations.

This paper starts with a brief overview of the efficiency literature as well as a description of our data set. Afterwards, the graduation shift approach is explained. In the final sections, we present the graduation shift results and compare them with the well-known graduation rates and the results from the DEA.

Related literature

De Witte and López Torres [2] and Rhaiem [9] provide excellent summaries of the efficiency literature in the education sector. The term “efficiency” is defined as the success of maximizing the output from a given set of inputs (or vice versa). The efficiency of educational entities emerged as a topic of early interest, with initial studies recommending relevant input and output variables [10] and later studies discussing limitations, especially in terms of the comparability of universities [11]. While the productivity of conventional universities has frequently been analyzed in the past (see, for example, [12–14] for evaluations of the German higher education sector and [15] for a cross-country comparison), only two studies have examined the efficiency of universities of applied sciences to date [16, 17]. Both studies classified these universities as just one component of the higher education sector and therefore examined them as part of a bigger sample. Olivares and Wetzel [16] focus their analysis on the economies of institutional scale and scope. The authors applied a recent specification of the Stochastic Frontier Analysis (SFA) to an unbalanced panel covering 72 conventional universities and 80 applied institutions during the period from 2001 to 2008. Their results show that all entities work at a similarly high level of efficiency and exhibit increasing returns to scale. With a similarly mixed, but much smaller sample, Başkaya and Klumpp [17] used the DEA in a cross-section of 33 institutions. Their evaluation reveals that the universities of applied sciences exhibit heterogeneous efficiency scores at a low average level. The differences between the results of Olivares and Wetzel [16] and Başkaya and Klumpp [17] are primarily caused by differences in the respective efficiency approaches and in the variables considered, for example, for the representation of the teaching output.
Agasisti and Haelermans [18] illustrate how sensitive the efficiency values are to the variable representing the teaching output (the number of students or graduates at each institution). A further valuable addition to the literature on universities of applied sciences are the publications by the German Council of Science and Humanities (Wissenschaftsrat), which give a thorough view of the universities of applied sciences landscape (see, for example, [19]). Although the council separately evaluated the input and output variables of the universities, it missed the opportunity to evaluate their efficiency.

Within the literature on the efficiency of higher education institutions, the teaching side of universities is most commonly represented by the number of students or graduates. Most authors consider the absolute number or split the total according to different levels of education or subject groups. Variables reflecting the quality of education or representing the completion rate are rarely included. Notable exceptions are the following three studies: Agasisti [20] evaluated the efficiency of Italian institutions and distinguished between students (graduates) who finished their studies within the regular duration of the course and the overall number of students. Zoghbi, Rocha and Mattos [21] considered dropout numbers in their study, and Sav [22] took account of the fall-to-fall semester student return (retention) in his efficiency evaluations.

Data and methods

The initial sample consists of 262 German universities of applied sciences (as classified by the Federal Statistical Office of Germany), including 163 private and/or specialized institutions (the latter primarily in theology, art, and pedagogy). These private and specialized institutions have not been considered in this study, mainly due to their different funding arrangements. Due to mergers of institutions and missing data, 18 further institutions had to be dropped. The final sample thus comprises 81 of the 99 German public universities of applied sciences. To gain insights into the productivity of these institutions, we evaluated their primary activity, namely teaching, with respect to their main input, i.e. expenses. The output variable “teaching” is represented by the total number of first semester students alongside the graduates from bachelor and master courses (or equivalent). The Federal Statistical Office of Germany distinguishes between students in their first subject-related semester (in German: Fachsemester) and their first university semester (in German: Hochschulsemester). We deliberately used students in the first Fachsemester, since this group comprises students in their first Hochschulsemester and also covers students who changed their field of study. Student numbers refer to the academic years 2008/2009 through 2013/2014, and financial variables are from 2008 to 2013. The data were provided by the Federal Statistical Office of Germany. Expenditure data are deflated to the year 2013. Table 1 reports descriptive statistics for the year 2013. The values are similar to those reported by Olivares and Wetzel [16]. On average, an institution has around 1,400 students in the first semester and around 1,100 graduating in that year. Average expenditure amounts to 34 million euros per university. The largest institution among the 81 universities of applied sciences is the FH Cologne with respect to both students and expenditure.

Fig 1 shows the change in average students, graduates, and expenses over the considered timeframe. While the first two variables show a moderate and similar increase, the expenditures grew to a larger extent.

A crucial point is the definition of the point in time when a student graduates. This is surely not the same for students from different universities. For a discussion of the problems in measuring time to degree in the German higher education sector, see Theune [23]. Therefore, given our data framework, we have to make some assumptions based on the standard and actual duration of study. While bachelor (master) students in Germany have a standard period of study of 6 (4) semesters, the actual period of study is listed as 7.3 (4.2) semesters by the Federal Statistical Office of Germany [24]. Since we have a mixed sample featuring both bachelor and master students, we assume an average study duration of 6 semesters. Hence, we split the overall sample (covering the period from 2008 to 2013) according to the contained student cohorts and obtain three groups. Based on our assumption of six semesters, the first semester students from 2008 (2009 or 2010, respectively) have been related to the graduates of 2011 (2012 or 2013, respectively). For each cohort, the average expenditure over the corresponding three years has been calculated.
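Under this six-semester assumption, the cohort construction can be sketched as follows. This is a minimal sketch; the data layout (dicts of per-university lists) and the exact expenditure window (the three study years of each cohort) are our assumptions, not specifications from the paper:

```python
def build_cohorts(students, graduates, expenditure):
    """Relate first semester students to graduates six semesters later.

    students, graduates, expenditure: dicts mapping calendar year to a list
    of per-university values (same university order in every list).
    Returns {cohort start year: (students, graduates, average expenditure)}.
    """
    cohorts = {}
    for start in (2008, 2009, 2010):
        grad_year = start + 3                  # six semesters = three years
        study_years = range(start, start + 3)  # assumed expenditure window
        avg_e = [sum(expenditure[y][i] for y in study_years) / 3
                 for i in range(len(expenditure[start]))]
        cohorts[start] = (students[start], graduates[grad_year], avg_e)
    return cohorts
```
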

The graduation shift

We have one input and two output variables. On the output side, our approach is based on two indicators: (1) total number of first semester students (S) and (2) total number of graduates (G). The input is defined as the total expenditure (E).

Given our dataset, the graduation shift is formally calculated as follows [1]:

  1. The relative shares p1i = Si/∑Si, p2i = Gi/∑Gi and exi = Ei/∑Ei are calculated. These represent each university’s share of the total outputs and input. The percentages standardise the absolute numbers and make them comparable across indicators.
  2. The university efficiency scores for the two outputs, e1i = p1i/exi and e2i = p2i/exi, are calculated. These are simple productivity measures relating the outputs to the input.
  3. The difference of the two efficiency scores, e2i − e1i, defines the graduation shift. The score can be interpreted only in relative terms.

The term “shift” points to the direction and magnitude of the change in a university’s productivity compared to its basic efficiency. The framework of the graduation shift allows one input and two outputs to be considered simultaneously, where one output has to be a subset of the other.
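The three calculation steps can be condensed into a few lines of code. This is a minimal sketch with fictitious numbers; the function name and example values are ours:

```python
import numpy as np

def graduation_shift(students, graduates, expenditure):
    """Graduation shift for each university: e2 - e1 from the steps above."""
    s = np.asarray(students, dtype=float)     # first semester students S_i
    g = np.asarray(graduates, dtype=float)    # graduates G_i
    e = np.asarray(expenditure, dtype=float)  # expenditure E_i
    p1, p2, ex = s / s.sum(), g / g.sum(), e / e.sum()  # relative shares
    e1 = p1 / ex  # basic teaching efficiency
    e2 = p2 / ex  # graduation efficiency
    return e2 - e1

# Three fictitious universities: a positive score means the university
# converts students into graduates more efficiently than its basic
# teaching efficiency suggests.
shift = graduation_shift([100, 100, 100], [80, 80, 40], [30, 20, 30])
```
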

To gain an impression of the graduation shift’s robustness with respect to outliers, we experimented with extreme values in our data set. These analyses show that the index changes only marginally and the resulting rankings remain almost unchanged.


To relate the results from the graduation shift to those of an established method, we additionally performed an efficiency analysis as a benchmark. Two main methods for estimating efficiency coexist in the educational sector: the parametric Stochastic Frontier Analysis (SFA) and the non-parametric Data Envelopment Analysis (DEA). In both cases, inefficiency is measured by the distance of each institution to a calculated efficiency frontier. Since the frontier is determined by the sample, efficiency is a relative measure: the efficiency of a particular institution is calculated relative to the performance of the other institutions in the sample.

We choose the non-parametric DEA introduced by Charnes, Cooper and Rhodes [25] as the benchmark for this study, because it is the most frequently used method and can be implemented in a straightforward manner. Using linear programming, the frontier and the position of each entity are calculated from the ratio of (weighted) outputs over (weighted) inputs. Detailed overviews of the advantages and variations of the DEA can be found in Bogetoft and Otto [26] as well as Wilson [27]. To achieve the best possible benchmark, we performed the DEA with the same dataset as used for the graduation shift, considering first semester students and expenditure as inputs and graduates as output. We allow for Variable Returns to Scale (VRS) and choose the output-oriented approach, assuming that universities maximise their output with the given input.

The DEA has the advantage that it can handle various numbers of inputs and outputs simultaneously. In principle, one could consider the number of employees or physical capital (such as the number of computers or laboratories) as inputs, alongside the expenditures. Since this is not possible with the graduation shift, we included in the DEA the same restricted number of inputs and outputs as in the graduation shift calculation. To test the robustness of the results, we additionally vary the input, given by the expenditures in our baseline model, and consider the number of employees instead. One potential drawback of the DEA is that its results can be sensitive to outliers (see Gnewuch and Wohlrabe [28]). Since the graduation shift is not sensitive to outliers, the new approach has this advantage over the DEA.
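An output-oriented VRS DEA of this kind amounts to solving one linear program per university. The sketch below uses the standard textbook formulation with scipy.optimize.linprog; it is not the authors' exact implementation, and the function name is ours:

```python
import numpy as np
from scipy.optimize import linprog

def dea_vrs_output(X, Y):
    """Output-oriented DEA under variable returns to scale (VRS).

    X: (n_units, n_inputs) array of inputs, e.g. first semester students
    and expenditure; Y: (n_units, n_outputs) array, e.g. graduates.
    Returns scores in (0, 1]; a score of 1 marks a frontier university.
    """
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    n = X.shape[0]
    scores = np.empty(n)
    for o in range(n):
        # Decision variables: [phi, lambda_1, ..., lambda_n]; maximise phi
        # (linprog minimises, so the objective is negated).
        c = np.r_[-1.0, np.zeros(n)]
        # Input constraints: sum_j lambda_j x_j <= x_o
        A_in = np.hstack([np.zeros((X.shape[1], 1)), X.T])
        # Output constraints: phi * y_o - sum_j lambda_j y_j <= 0
        A_out = np.hstack([Y[o][:, None], -Y.T])
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[X[o], np.zeros(Y.shape[1])]
        # VRS convexity constraint: the lambdas sum to one.
        A_eq = np.r_[0.0, np.ones(n)][None, :]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (n + 1))
        scores[o] = 1.0 / res.x[0]  # report 1/phi so that 1 = efficient
    return scores
```

Reporting 1/phi keeps the scores on the familiar (0, 1] scale, consistent with the average efficiency level of 0.739 quoted below for 2010.
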


Results

In the first step, we calculated the graduation shift for each cohort year from 2008 to 2010. Fig 2 plots the corresponding kernel estimate of all 81 scores for every cohort year. It shows that the distribution is largely constant across time. Both mean and median are negative; there are more negative than positive scores on average over the three years. However, a visual inspection of the results reveals that the relative positions of the universities are volatile with respect to both the level and the ranking positions. This impression is confirmed by the corresponding correlation coefficients (see the Spearman rank and Pearson coefficients in Table 2), which are all below 0.7. The results indicate that interpretations may differ slightly depending on the year selected.

Fig 2. Kernel estimates of the graduation shift (2008–2010).

Table 2. Spearman rank and Pearson correlations across time for the graduation shift.
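Coefficients of the kind reported in Table 2 can be computed with scipy.stats; the values below are illustrative, not the published scores:

```python
from scipy.stats import pearsonr, spearmanr

# Hypothetical graduation shift scores of the same five universities
# in two cohort years (illustrative values only).
shift_2008 = [0.12, -0.05, 0.30, -0.22, 0.01]
shift_2009 = [0.10, -0.15, 0.25, -0.10, 0.05]

rho, _ = spearmanr(shift_2008, shift_2009)  # agreement of ranking positions
r, _ = pearsonr(shift_2008, shift_2009)     # agreement of the score levels
```

The Spearman coefficient depends only on rank orders and is therefore robust to outlying score levels, which is why both coefficients are reported.
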

Fig 3 shows the scatterplot of the ranking positions resulting from the different approaches for 2010. The scatterplot reveals that the ranking positions of the universities are fairly homogeneous when we compare the graduation rate with the graduation shift. This is confirmed by the results in Table 3, which provides the Spearman rank and Pearson correlations for the different comparisons across all years. The coefficients for the correlation between graduation rate and shift always exceed 0.96. Thus, both approaches lead to quite similar conclusions for most of the universities.

Fig 3. Ranking comparison of different approaches for measuring efficiency (2010).

Table 3. Correlations between different efficiency measures.

Tables 4 to 6 document the results of the analyses for the individual universities of applied sciences for the years 2008–2010. The tables are sorted in alphabetical order and show the expenditure and the number of first semester students and graduates, including their relative shares. The graduation shift is the difference between the two relative efficiency measures %S/%E and %G/%E. In the last column, the DEA scores are listed. With an average of 0.739 for the year 2010 (see Table 6), the institutions exhibit a fairly high efficiency level. The University of Applied Sciences in Neu-Ulm has the highest graduation shift compared to the other universities. In other words, this university exhibits the best relative graduation process of students based on the given expenses. While it is only ranked 60th with respect to the relative student efficiency (%S/%E), it reaches the seventh position when it comes to graduation efficiency (%G/%E). This results in a very good relative performance with respect to the graduation shift. A DEA score of 1.00 and the first rank with respect to the graduation rate confirm the high graduation efficiency. At the lower end of the ranking we find the FH Brandenburg, which performs quite well with respect to student efficiency (rank 10), but drops to the 63rd position in the graduation ranking. In addition, this university features a very small DEA score and graduation rate.

Table 4. Input and output indicators for 81 universities of applied sciences and the resulting graduation shift (and DEA) for the year 2008.

Table 5. Input and output indicators for 81 universities of applied sciences and the resulting graduation shift (and DEA) for the year 2009.

Table 6. Input and output indicators for 81 universities of applied sciences and the resulting graduation shift (and DEA) for the year 2010.

To demonstrate the differences between graduation rate and shift, we plot histograms depicting the ranking differences for all three years (see Fig 4). The histograms show the difference in ranking positions between the graduation shift and the graduation rate and DEA, respectively, for each year. The left panel of Fig 4 reports the differences between the graduation shift and rate. It shows that the ranking position differs for more universities than the high Spearman correlation coefficient might suggest. The average ranking change over all three years is about 2.25 ranking positions. The maximum difference between the graduation rate and shift rankings is 13 positions in 2008: the University of Applied Science in Landshut is ranked 2nd with respect to the graduation shift but drops to 15th place in the graduation rate ranking. Evaluating the positions of the universities in more detail, it can be noted that the first two positions in both rankings remain the same over all three years. However, there is some variation for universities listed in the top 10 of the respective ranking. Some universities enter the top 10 only when we account for the expenditures. This sensitivity of the ranking should not be underrated, since rankings have implications for the reputation of a university.

Fig 4. Histogram of ranking differences between efficiency approaches across years.

We have two possible explanations for our result that graduation shift and rate lead to similar rankings, but exhibit some differences for selected universities especially at the upper and lower part of the ranking positions (see Fig 3): firstly, graduation rates are good proxies for teaching efficiency, even if they are adjusted to relative expenditure figures. Secondly, we have a quite homogeneous sample including similar universities of applied sciences.

The correlations between the results of the graduation shift and the DEA are fairly high at around 0.85, but more dispersed. However, since the correlation is not perfect, a high DEA score is not necessarily associated with a high graduation shift. This is supported by the ranking differences shown in the right panel of Fig 4, which are larger than those between the graduation shift and rate. The average positional change across the years is about nine. The largest gain is 48 positions and the largest drop is 23. The Technical University of Applied Sciences in Cologne, for example, is ranked 15th in the DEA ranking in 2010, but only 61st in the graduation shift ranking. Hence, the figure shows that the assessment of teaching efficiency differs between the standard DEA approach and the newly introduced graduation shift.

Sensitivity analysis

In order to test the robustness of our results, we use the number of employees as an alternative input measure (instead of expenditures). The number of employees is a frequently used input indicator at the institutional level [2]. We refrain from reporting results on the comparison of this graduation shift variant with the graduation rate and DEA, and concentrate on the comparison between the two graduation shift variants (using the number of employees and expenditures as inputs). Table 7 shows the correlations between the two variants. The correlations of the ranks (according to the Spearman rank correlation) and the values (according to the Pearson correlation) are nearly perfect, always exceeding r = 0.9 across the years. The comparison therefore shows that the conclusions about the universities remain nearly the same regardless of the input measure used. However, since personnel expenditures account for the majority of overall institutional costs, this result is not surprising.

Table 7. Spearman rank and Pearson correlations for the expenditures and employee variation.


Limitations

No indicator is without limitations. We stated above that the graduation shift can handle only one input and two outputs. If this is not sufficient for the statistical analysis, the DEA might be the better alternative.

Although the graduation shift yields plausible results in our study and correlates highly with other efficiency measures, the shift has some drawbacks, which are illustrated in Table 8. The table features examples with three artificial universities and their corresponding inputs and outputs. Panel A is the starting point, where all indicators are identical; the graduation shifts of the universities are zero. In Panel B, we increase ceteris paribus the expenditure of university A, which leaves the graduation shift unchanged, although one would expect a decrease. In Panel C, with identical expenditures and numbers of students as in Panel A, we lower the graduation rate of university C, which results in a negative graduation shift for C and positive graduation shifts for A and B. In Panel D, we additionally lower the expenditures of university B, which leads to the highest graduation shift score among the universities. The results in Panels C and D are unsurprising. In Panels E and F, however, we see the opposite effect, which defies our expectations. In Panel E, we have two universities (B and C) with a negative graduation shift due to smaller graduation rates compared to university A. If we decrease the expenditure of university B, as shown in Panel F, this institution is penalized compared to university C, although both exhibit the same graduation rates.
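The Panel B effect can be verified directly: because both outputs enter the shift only as shares, a ceteris paribus change in one university's expenditure cancels out whenever its student and graduate shares coincide. A minimal sketch with illustrative numbers (not the published Table 8 values):

```python
def shift(students, graduates, expenditure):
    # Relative shares and the difference of the two efficiency scores.
    p1 = [s / sum(students) for s in students]
    p2 = [g / sum(graduates) for g in graduates]
    ex = [e / sum(expenditure) for e in expenditure]
    return [p2[i] / ex[i] - p1[i] / ex[i] for i in range(len(students))]

# Panel A analogue: universities A, B, C with identical indicators.
panel_a = shift([100, 100, 100], [80, 80, 80], [10, 10, 10])
# Panel B analogue: university A spends twice as much, ceteris paribus.
panel_b = shift([100, 100, 100], [80, 80, 80], [20, 10, 10])
# In both cases every shift is zero: equal student and graduate shares
# cancel the input term entirely, so A's higher spending goes unpunished.
```
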

The examples with three artificial universities in Table 8 illustrate that the idea of Bornmann et al. [1] does not appropriately account for the effect of the input variable under ceteris paribus conditions and when the shift is negative. However, in most practical applications of the shift, the limitations described in this section will not affect the efficiency results.


Discussion

In research into higher education, the comparison of the numbers of first semester students and graduates has attracted a steady stream of interest for decades. The negative side of graduation is dropout, which should be as low as possible for universities. To minimize dropout rates at universities, governments and university administrations are interested in the causes of student dropout and the subsequent career developments of these students. For example, the German Centre for Higher Education Research and Science Studies (DZHW GmbH; formerly HIS GmbH) has published several studies on the causes of and motives for dropout, in addition to attrition and dropout rates at universities. In most of the studies on completion and dropout, only rates of both phenomena have been calculated (an overview of the literature can be found in European Commission [29]). Based on the results of our study, we propose to also consider the available budget of the universities as an input variable and to calculate the ability of universities to graduate their students in view of the available budget.

Using a comprehensive sample of 81 institutions within the period of 2008 to 2013, we show that some German universities are better able to guide students to graduation than others–given their budget constraints. We introduce the graduation shift in this study, which can be used to assess the efficiency of students’ completion success for a set of universities.

We find that the graduation shift is closely related to graduation rates. However, the graduation shift is certainly preferable, because it takes the budget of institutions into account. Although the correlation between graduation shift and rate is high, we demonstrate that there are various universities with larger differences between both indicators. The maximum difference between both indicators is 13 positions for the University of Applied Science in Landshut if the universities are ranked with respect to both indicators.

The graduation shift leads to similar ranking positions of the universities as the DEA. Since the DEA is an established instrument in efficiency measurement, the relatively high correlations can be interpreted as a validation of our new approach. However, the correlation coefficients are not perfect, which can be interpreted as follows: (1) the graduation shift does not measure efficiency in the same way as the DEA does; (2) the graduation shift can be seen as an alternative method of efficiency measurement to the DEA; (3) some examples with three artificial universities reveal that the graduation shift is not without issues, as is the case with nearly all other indicators.

Taken as a whole, it is an advantage of the graduation shift that institutions are compared against their own basic efficiency, which controls for institutional heterogeneity. Such control is not part of the DEA approach. It is a further advantage of the graduation shift that the computation is easy and the results are understandable to non-economists (which is not always the case with the DEA). However, when applying the graduation shift, it is worth bearing in mind that the shift has its limitations.

In this study, we used a dataset with universities of applied sciences to exemplify the calculation of the graduation shift. Future studies could elaborate this idea by computing the shift not only for universities in Germany, but also for universities in other countries. The topics of study completion and student dropout rates concern all nations with higher education systems. The results of these studies are of interest to a wide audience including students, university administrations, and policy makers.


  1. Bornmann L, Wohlrabe K, de Moya Anegon F. Calculating the excellence shift: How efficiently do institutions produce highly cited papers? Scientometrics. 2017;112(3):1859–1864.
  2. De Witte K, López-Torres L. Efficiency in education: a review of literature and a way forward. J Oper Res Soc. 2017;68(4):339–363.
  3. Johnes G, Johnes J. Higher education institutions’ costs and efficiency: Taking the decomposition a further step. Econ Educ Rev. 2009;28(1):107–113.
  4. Tinto V. Dropout from Higher Education: A Theoretical Synthesis of Recent Research. Rev Educ Res. 1975;45(1):89–125.
  5. Aulck L, Velagapudi N, Blumenstock J, West J. Predicting Student Dropout in Higher Education. arXiv preprint arXiv:1606.06364. 2016;16–20.
  6. Heublein U. Student Drop-out from German Higher Education Institutions. Eur J Educ. 2014;49(4):497–513.
  7. Agasisti T, Dal Bianco A. Reforming the university sector: effects on teaching efficiency–evidence from Italy. High Educ. 2009;57(4):477–498.
  8. Kyvik S. Structural changes in higher education systems in Western Europe. High Educ Eur. 2004;29(3):393–409.
  9. Rhaiem M. Measurement and determinants of academic research efficiency: a systematic review of the evidence. Scientometrics. 2017;110(2):581–615.
  10. Carter CF. The efficiency of universities. High Educ. 1972;1(1):77–90.
  11. Sadlak J. Efficiency in higher education–concepts and problems. High Educ. 1978;7(2):213–220.
  12. Kempkes G, Pohl C. The efficiency of German universities–some evidence from nonparametric and parametric methods. Appl Econ. 2010;42(16):2063–2079.
  13. Johnes G, Schwarzenberger A. Differences in cost structure and the evaluation of efficiency: the case of German universities. Educ Econ. 2011;19(5):487–499.
  14. Gralka S. Persistent inefficiency in the higher education sector: evidence from Germany. Educ Econ. 2018;26(4):373–392.
  15. Agasisti T, Gralka S. The transient and persistent efficiency of Italian and German universities: A stochastic frontier analysis. CEPIE Working Paper. 2017;14/17.
  16. Olivares M, Wetzel H. Competing in the higher education market: empirical evidence for economies of scale and scope in German higher education institutions. CESifo Econ Stud. 2014;60(4):653–680.
  17. Başkaya S, Klumpp M. International data envelopment analysis in higher education: how do institutional factors influence university efficiency? J Bus Econ. 2014;5(11):2085–2090.
  18. Agasisti T, Haelermans C. Comparing efficiency of public universities among European countries: Different incentives lead to different performances. High Educ Q. 2016;70(1):81–104.
  19. Wissenschaftsrat. Empfehlungen zur Rolle der Fachhochschulen im Hochschulsystem [Recommendations on the role of universities of applied sciences in the higher education system]. Wissenschaftsrat. 2010;1–171.
  20. Agasisti T. Cost structure, productivity and efficiency of the Italian public higher education industry 2001–2011. Int Rev Appl Econ. 2016;30(1):48–68.
  21. Zoghbi AC, Rocha F, Mattos E. Education production efficiency: Evidence from Brazilian universities. Econ Model. 2013;31:94–103.
  22. Sav GT. Female faculty, tenure, and student graduation success: Efficiency implications for university funding. Int J Bus Manag Econ Res. 2012;3(6):633–640.
  23. Theune K. The working status of students and time to degree at German universities. High Educ. 2015;70(4):725–752.
  24. Destatis. Hochschulen auf einen Blick 2016 [Higher education institutions at a glance 2016]. 2016;1–50.
  25. Charnes A, Cooper WW, Rhodes E. Measuring the efficiency of decision making units. Eur J Oper Res. 1978;2(6):429–444.
  26. Bogetoft P, Otto L. Benchmarking with DEA, SFA, and R. Vol. 157. Springer Science & Business Media; 2010.
  27. Wilson PW. FEAR: frontier efficiency analysis with R. Department of Economics, Clemson University; 2013.
  28. Gnewuch M, Wohlrabe K. Super-efficiency of education institutions: an application to economics departments. Educ Econ. 2018;1–14.
  29. European Commission. Dropout and Completion in Higher Education in Europe, Annex 1: Literature Review. Educ Cult. 2015;1–60.