

An Investigation of U.S. Undergraduate Business School Rankings Using Data Envelopment Analysis With Value-Added Performance Indicators

Susan W. Palocsay and William C. Wood

James Madison University, Harrisonburg, Virginia, USA

Bloomberg Businessweek ranks U.S. undergraduate business programs annually. These rankings provide a convenient overall measure of quality, which is important in today’s environment of concern about higher education costs and employment after graduation. Data envelopment analysis (DEA) has advantages over previous regression approaches in characterizing value added. The authors use a DEA approach to estimate relative efficiencies based on starting salaries and recruiter surveys, identifying some schools as overachievers relative to their Bloomberg Businessweek rankings. In particular, DEA-based reranking highlights the ability of some public institutions facing high student–faculty ratios to turn out well-regarded graduates with high starting salaries.

Keywords: data envelopment analysis, DEA, efficiency, performance, rankings, undergraduate business schools

Business Week first began ranking undergraduate schools and colleges of business in 2006 (Lavelle, 2006). Since that time, the magazine's successor, Bloomberg Businessweek, has refined its methodology, expanded the number of schools ranked, and provided additional detail on ranking procedures (Gloeckler, 2013). These rankings, however, do not account for the efficiency of resource use, so that lower-cost schools achieving the same results as higher-cost competitors see no boost in their rankings. Since 2006, college tuition and fees have increased more than 30% (38.7% for public institutions and 32.1% for private ones) even as the national economy has endured its most striking downturn since the Great Depression (National Center for Education Statistics, 2013). Meanwhile, student loan debt has expanded sharply, growing 58% between 2004 and 2012 (Gross, 2013).

Amid public concern about costs and quality, inefficiency in the use of funds by higher education could, over time, fundamentally erode support for colleges and universities. A national survey revealed "a chipping away of public support for higher education and a growing suspicion about how well colleges and universities use the money they have" (Immerwahr & Johnson, 2009, p. 5). The survey authors suggested that colleges and universities address public concerns over high costs proactively, since governments would likely act with greater regulation and supervision if the growth of costs remained unchecked. The combined concerns of students, parents, and government provide good reasons to investigate the efficiency of higher education, including undergraduate business education. In this environment it is important to note that efficiency-adjusted rankings can differ significantly from Bloomberg Businessweek's reported rankings (Kreutzer & Wood, 2007).

In this study, we used data envelopment analysis (DEA) to examine the relative efficiency of U.S. undergraduate business schools included in the most recent Bloomberg Businessweek study (Gloeckler, 2013) from an economic, value-added perspective. DEA was first proposed by Charnes, Cooper, and Rhodes (1978) to empirically measure how effectively organizational units convert multiple inputs into multiple outputs in comparison with each other. DEA calculates a weighted output-over-input ratio for each unit, which is defined as a relative efficiency score, with 1 indicating efficient status. A piecewise linear surface is constructed through the efficient units to estimate a frontier, and efficiency computations are made relative to this frontier (Seiford & Thrall, 1990). Thus, units that lie below the frontier are assigned scores of less than 1, indicating that there is a linear combination of efficient units that could produce as much output using smaller input amounts.

Correspondence should be addressed to Susan W. Palocsay, James Madison University, Department of Computer Information Systems and Business Analytics, MSC 0202, 800 S. Main Street, Harrisonburg, VA 22807, USA. E-mail: [email protected]

Color versions of one or more of the figures in the article can be found online at www.tandfonline.com/vjeb.

ISSN: 0883-2323 print / 1940-3356 online. DOI: 10.1080/08832323.2013.876379

Since its introduction more than 30 years ago, DEA has gained widespread acceptance as a standard tool for evaluation of economic productivity, with successful applications in numerous nonprofit and for-profit domains (Emrouznejad, Parker, & Tavares, 2008). A brief overview of the DEA approach in education for business is provided in the next section. Then we describe the data and DEA model for estimating relative efficiencies and use these scores to generate a full reranking for comparative purposes. Results are described in the following section, and we conclude with suggestions for future research.

DEA APPLICATION TO BUSINESS SCHOOLS

In a 2008 literature review of DEA research, Emrouznejad et al. identified education as one of the most prevalent areas of DEA application, finding 44 publications using it as a keyword. Generally, studies in the higher education sector have applied DEA at either the academic department level within one institution or for universities in a particular country (Sav, 2012). The majority of the latter group focus on managerial performance in responding to financial reforms in public university funding (Sav, 2013) and challenges to revenue generation for private institutions (Liu & Liu, 2010). Commonly used productivity measures were enrollments, graduation and/or retention rates, and credit hour production (e.g., Avkiran, 2001; Sexton & Comunale, 2010).

Other authors have used different measures of DEA efficiency for U.S. master of business administration (MBA) programs (Colbert, Levary, & Shaner, 2000; Hirao, 2012) and for business schools outside the United States in Taiwan (Kong & Fu, 2012) and India (Sreekumar & Mahapatra, 2011). In these studies, the focus was primarily on postgraduation approval and employment-related outcomes instead of more traditional administrative standards. As a group, they demonstrate various DEA modeling approaches to facilitate comparisons of business schools based on the potential economic benefits being offered.

In Colbert et al. (2000), multiple efficiency scores were calculated for each of the 24 best Business Week–ranked MBA programs based on their achievement of student satisfaction and/or recruiter satisfaction. Input resources were represented by faculty-to-student ratios, average GMAT scores, and number of elective offerings. In their experimental trials, the number of efficient programs varied from 8 to 16, with all of the efficiency scores exceeding 0.9. In the trial using average salary and average recruiter score as outputs, the minimum efficiency was 0.9420 and the DEA model generated an efficiency score of one for 13 of the 24 schools. This lack of differentiation was attributed to the similarity of programs in the sample, all prominent top-ranked MBA programs.

More recently, Hirao (2012) estimated the efficiency of U.S. graduate business schools in terms of translating peer assessment and average GMAT scores into average starting salaries and employment rates. The reference data came from U.S. News & World Report rankings of the top 50 business schools in 2006, of which 27 were private and 23 were public. Peer assessment score (from appraisals by other department heads) had a stronger positive correlation with both starting salaries and employment rates than average GMAT scores (0.908 vs. 0.735 for salaries and 0.514 vs. 0.383 for employment). Five of the 50 schools achieved (relative) efficiency, and scores ranged from 0.8087 to 1, with a mean of 0.9356. On average, public institutions had lower overall efficiencies (0.9155) than private ones (0.9527). School names were not identified in this study and no comparative analysis of rankings by DEA was provided.

Kong and Fu (2012) ranked a small group of 21 Taiwanese business colleges using survey data from recent graduates to construct indicators for job market performance and student satisfaction. With restrictions from recruiters imposed on indicator weights, only 3 of the colleges had a DEA efficiency score of 1, but all scores were higher than 0.9. They did, however, find that the average performance of public schools (0.991) exceeded that of private ones (0.981).

For the evaluation of 49 business schools in India, Sreekumar and Mahapatra (2011) chose three inputs: faculty/student ratio, infrastructure, and tuition fees. They developed a broad group of eight outputs that encompassed starting salary and the satisfaction of faculty, students, and recruiters, as well as other measures such as international exchange programs and student awards. DEA showed its ability to discriminate within this sample by evaluating only 4 schools as relatively efficient. The minimum score was 0.356 and mean efficiency was 0.625 with a standard deviation of 0.175. Further analysis of DEA results was done to identify peer groups for benchmarking of inefficient schools to improve performance, although individual school names were not revealed.

In this study, we extended the application of DEA into U.S. undergraduate business education with a value-added purpose. We generally followed the approach in Colbert et al. (2000) as described previously, augmented by guidance from Kreutzer and Wood (2007) to consider the impact of tuition costs. Our study benefits from having a larger sample containing more diverse schools when determining the effect of ranking via DEA scores. We also compared efficiencies of public and private business schools and investigated the composition of peer groups for additional insights aimed at performance management.


DATA AND MEASUREMENT OF EFFICIENCY SCORES

In the present analysis, data were taken from the Bloomberg Businessweek 2013 rankings of 124 undergraduate business schools. The ranking method includes nine variables, all related in some way to student satisfaction, postgraduation outcomes, or academic quality (Lavelle, 2013). Student satisfaction is measured through a survey sent to graduating seniors at the ranked schools. Postgraduation outcomes include employer opinion (from a survey), school-reported median starting salaries, and admissions to top MBA programs. Academic quality is measured through average SAT scores, student–faculty ratios, average class size, the percentage of business majors with internships, and weekly student hours reported to be spent on schoolwork. In the final summation, student assessment accounts for 30% of the ranking, recruiter survey scores account for 20%, starting salaries and MBA admissions contribute 10% each, and the academic quality items together account for the remaining 30%.
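For concreteness, the stated weights amount to a single weighted sum. The sketch below (Python) assumes each component has already been normalized to a common 0–1 scale, which the magazine's published description, as summarized here, does not spell out; the function name and example inputs are hypothetical.

```python
# Illustrative only: Bloomberg Businessweek's stated component weights
# combined into one overall score. Component scores are assumed to be
# pre-normalized to 0-1; the magazine's actual normalization is not given.
def businessweek_overall(student_survey: float, recruiter_survey: float,
                         salary: float, mba_admissions: float,
                         academic_quality: float) -> float:
    return (0.30 * student_survey + 0.20 * recruiter_survey
            + 0.10 * salary + 0.10 * mba_admissions
            + 0.30 * academic_quality)

print(businessweek_overall(0.9, 0.8, 0.7, 0.6, 0.85))  # hypothetical school
```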

Selection of inputs and outputs for DEA should be guided by the environment being examined and by organizational objectives (Avkiran, 2001). For our analysis, we sought a parsimonious set of variables to compare business programs on the basis of their performance in producing highly marketable graduates, conditional on resource availability and incoming-student quality. With this perspective in mind, we chose student/faculty ratio to characterize school quality and average SAT scores to account for student ability. Annual tuition was included as a third input measure, consistent with the approach in Kreutzer and Wood (2007). As DEA permits more than one output, we were able to incorporate the employer rankings (based on recruiter opinions) with median starting salaries to reflect the overall value of graduates from each program.

A statistical summary is given in Table 1. Note that one institution, Massachusetts-Amherst, was removed from the data set due to concern about its artificially low reported tuition. That reported tuition was $1,714, but mandatory fees were $11,518 (Massachusetts Department of Higher Education, 2013), making the institution an outlier with no easy means of correction. The resulting final sample size was 123. The distribution of private versus public schools was nearly even, with 59 private and 64 public institutions.

The correlation coefficients for input and output variables are listed in Table 2, with employer rank inverted for ease of interpretation. The table shows that students' average SAT scores had a stronger (positive) correlation with median starting salaries and employer ranking than student–faculty ratio and tuition costs did. SAT scores were also negatively associated with student–faculty ratios, indicating that schools with lower SAT scores tend to have more students per faculty member. There was only moderate correlation between salaries and employer rankings.

DEA is a nonparametric method for deriving an efficient frontier from empirical data. This frontier is defined by those schools with the best performance in converting inputs to outputs, compared to the other schools in the data set. For an individual school t, efficiency is computed as the ratio of the weighted sum of outputs to the weighted sum of inputs, with x_t = (x_{1t}, x_{2t}, x_{3t}) and y_t = (y_{1t}, y_{2t}) representing the amounts of inputs and outputs, respectively. Values of the weights for school t were determined by an optimization model (Ray, 2004) formulated as:

\[
\max_{u,v}\; h_t = \frac{u_1 y_{1t} + u_2 y_{2t}}{v_1 x_{1t} + v_2 x_{2t} + v_3 x_{3t}}
\quad \text{subject to} \quad
\frac{u_1 y_{1j} + u_2 y_{2j}}{v_1 x_{1j} + v_2 x_{2j} + v_3 x_{3j}} \le 1 \;\; \text{for every school } j, \qquad u_r, v_i \ge 0,
\]

where the outputs are indexed 1 (starting salary) and 2 (employer ranking) and the inputs are indexed 1 (student–faculty ratio), 2 (SAT), and 3 (tuition). Employer rankings and SAT scores were inverted to meet the model's requirements. The constraints prevent selection of weights that would give any school (including school t) an efficiency that is greater than 1 or 100%.

For implementation purposes, the fractional programming model was mathematically transformed into an equivalent linear program by requiring the sum of the weighted inputs for school t (the denominator in the objective function) to equal 1.

TABLE 1
Summary Statistics

Measure    Student–faculty ratio    SAT         Tuition ($)    Salary ($)
M                22.460             1219.423    22,712.46      50,626.55
Median           20.4               1207        14,985.00      50,000.00
SD                9.537              103.725    14,666.72       6,823.04
Minimum           0.9               1020         4,710.00      32,500.00
Maximum          65.0               1492        45,735.00      70,000.00

TABLE 2
Correlation Coefficients

Item                    Student–faculty ratio    SAT       Tuition    Salary
Student–faculty ratio
SAT                           −.526**
Tuition                       −.410**           .341**
Salary                        −.423**           .738**     .321**
Employer rank                 −.154*            .522**    −.087       .505**

*p = .10. **p = .01.

By constructing and solving a separate linear program for each of the 123 schools in the sample, a set of weights was found to maximize that particular school's efficiency without allowing this ratio to exceed 1 for any school, including itself (Ragsdale, 2012).
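As a rough illustration of this per-school linear program, the following Python sketch solves the multiplier-form LP with scipy.optimize.linprog. The three-school data set is invented for illustration (it is not the paper's sample), and the specific inversions of employer rank and SAT are assumptions: the paper says both were inverted but does not give the transformations.

```python
# A minimal sketch of the per-school CCR linear program described above.
import numpy as np
from scipy.optimize import linprog

# Outputs y_j = (median salary, inverted employer rank, e.g. 125 - rank).
Y = np.array([[60000.0, 110.0],
              [51000.0, 122.0],
              [47000.0,  30.0]])
# Inputs x_j = (student-faculty ratio, inverted SAT, e.g. 1/SAT, tuition).
X = np.array([[ 9.9, 1 / 1390, 9622.0],
              [15.0, 1 / 1219, 4710.0],
              [26.0, 1 / 1218, 28633.0]])

def ccr_efficiency(t: int, Y: np.ndarray, X: np.ndarray) -> float:
    """Maximize u.y_t subject to v.x_t = 1 and u.y_j - v.x_j <= 0 for all j."""
    s, m = Y.shape[1], X.shape[1]                 # numbers of outputs, inputs
    c = np.concatenate([-Y[t], np.zeros(m)])      # linprog minimizes, so negate
    A_ub = np.hstack([Y, -X])                     # one ratio constraint per school
    b_ub = np.zeros(len(Y))
    A_eq = np.concatenate([np.zeros(s), X[t]]).reshape(1, -1)  # v.x_t = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun                               # efficiency score in (0, 1]

print([round(ccr_efficiency(t, Y, X), 4) for t in range(len(Y))])
```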

RESULTS OF EVALUATING EFFICIENCY SCORES

The relative DEA efficiencies of the 123 business schools, with model input and output data, are reported in the Appendix. These scores ranged from 0.281 to 1, with a mean of 0.579 and a standard deviation of 0.208, as shown in Figure 1. Thirteen percent of schools in the sample had a score above 0.9. Efficiency scores for schools that appeared in the top 20 Bloomberg Businessweek rankings ranged from 0.51 to 1, with a mean score of 0.86.

The detailed results show that seven of the programs were efficient, with DEA scores of 1. The average starting salary at the efficient schools was $60,571, with an average employer ranking of 28. Four of the efficient programs were originally ranked in the top 10 by Bloomberg Businessweek: University of Virginia (#2), Washington University at St. Louis (#4), University of Pennsylvania (#5), and University of North Carolina at Chapel Hill (#10). However, there was more variation in Businessweek placement for the remainder: Brigham Young University (#12), Massachusetts Institute of Technology (#19), and University of Florida (#37). Massachusetts Institute of Technology (MIT) was one of three schools with the maximum starting salary of $70,000 (the other two were Pennsylvania and Carnegie Mellon). And while Brigham Young and Florida only had average salaries of $51,000 (see Table 1), they were both very highly ranked by employers.

DEA scores that were less than 1 indicate the presence of inefficiencies relative to schools in this efficient set. These schools were ranked from highest to lowest efficiency score in Appendix A. When compared to Businessweek rankings, there were considerable differences, as shown in Figure 2. Positive deviations on the right side of the graph indicate schools that had a higher ranking based on DEA efficiency, whereas negative deviations were associated with schools that were ranked higher by Businessweek. Approximately 9% of schools improved their rank position by 50 or more places under the relative efficiency criterion. A closer inspection revealed two business programs that were particularly noteworthy in this regard: Binghamton, which moved up from 57th to eighth with a DEA score of 0.984, and California Polytechnic State, which went from 64th to 13th with a DEA score of 0.945. Both were public institutions with fairly high student–faculty ratios (31.6 and 30, respectively), yet they were still able to place their graduates with better than average starting salaries.

Using the DEA model solutions, it is possible to identify a peer group and peer weights for each inefficient school. Then a composite institution can be constructed that can produce at least as much output (or more) using the same or less input. The hypothetical composite school has a bundle of input and output values which is mathematically computed as a linear combination of the input–output sets for efficient schools. For example, a reference school for Binghamton, which had an efficiency score of 0.984, can be created by combining the inputs and outputs of the University of Virginia and Brigham Young University with weights of 76.4% and 21.9%, respectively. The composite values for this hypothetical institution are $57,000 (salary), 24.2 (employer rank), 1325.5 (SAT), 11.6 (student–faculty ratio), and $5,479 (tuition). An inefficient school could use the composite values as a discussion starter on ways of improving.
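The composite construction is simply a weighted combination of the peers' input–output bundles. A minimal sketch using the two peers' appendix rows follows; because the published weights are rounded and the envelopment solution may carry slack adjustments, this approximates rather than exactly reproduces the reported composite bundle.

```python
# Sketch of Binghamton's hypothetical composite reference school from the
# peer weights quoted above: 76.4% Virginia + 21.9% Brigham Young.
import numpy as np

# Order: (student-faculty ratio, SAT, tuition, salary, employer rank),
# values taken from the appendix rows for the two peers.
virginia = np.array([9.9, 1390.0, 9622.0, 60000.0, 15.0])
byu = np.array([15.0, 1219.0, 4710.0, 51000.0, 3.0])

composite = 0.764 * virginia + 0.219 * byu
print(composite)  # composite salary ~ 57,009, close to the reported $57,000
```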

In Appendix A, all of the efficient programs were assigned a rank of 1. However, further examination of DEA model results did show substantial differences in the number of times each appeared in a peer group for an inefficient school (Sreekumar & Mahapatra, 2011). The most frequently referenced program was Virginia (79 times), followed by North Carolina at Chapel Hill (56 times). Both of these were public schools with high rankings in the original Businessweek study. Among efficient private schools, Pennsylvania, Brigham Young, and MIT were peers for 39, 31, and 37 inefficient schools, respectively. In contrast, Florida and Washington-St. Louis showed up far less often as peers (12 and 11 times, respectively).

FIGURE 1. DEA efficiency scores (histogram of the percentage of schools by DEA efficiency score, in bins from 0.2–0.3 to 0.9–1).

FIGURE 2. Comparison of DEA efficiency and Businessweek rankings.

Efficiency scores for public and private institutions, shown in Figure 3, were also reviewed (Hirao, 2012). The mean efficiency measure for public schools was 0.641, in comparison to 0.511 for private schools. Statistical testing showed a significant difference between the means of these two groups (p < .001). However, the mix of schools with efficiencies of 0.9 or higher was relatively even, with nine being public and seven private.

FIGURE 3. Comparison of efficiency scores for public and private schools (number of schools by DEA efficiency score bin, public vs. private).
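The text does not name the statistical test behind the reported p < .001; a two-sample t test on the public and private score vectors is one plausible reading, sketched below with placeholder values rather than the paper's data.

```python
# Hypothetical illustration of the public-vs.-private mean comparison; the
# score vectors are made up, and the choice of Welch's t test is an
# assumption, since the paper does not specify its test.
from scipy import stats

public_scores = [0.98, 0.97, 0.66, 0.64, 0.52, 0.44]
private_scores = [0.92, 0.58, 0.51, 0.42, 0.35, 0.31]
t_stat, p_val = stats.ttest_ind(public_scores, private_scores, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_val:.4f}")
```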

A frequent concern with DEA benchmarking is that a high proportion of units may be rated as efficient, even with a large sample size in relation to the total number of inputs and outputs (Seiford & Thrall, 1990). In this study, our empirical analysis identified only 5.7% of the 123 business programs as points on the efficient frontier, showing reasonable discrimination across schools. This frontier is a boundary of schools with efficient input–output levels, and DEA scores correspond to estimates of relative efficiency rather than estimates of average effects as found in regression analysis. As a result, there is no single functional form for use in directly determining the effects of changes in inputs or outputs or for hypothesis testing (Thanassoulis, 1993). And while DEA provided a rating of inefficient programs, it did not allow for ranking of the efficient ones (Andersen & Petersen, 1993). However, by treating the frontier schools as potential best practice institutions, DEA methodology objectively identified a peer group subset for each inefficient school, with weights to target productivity improvements.

CONCLUDING REMARKS

Our objective was to examine the ability of undergraduate business schools to achieve strong performance in generating valued employees for entry-level positions, using published data. Applying DEA, we constructed a multioutput model that incorporated a measure of recruiter satisfaction in addition to starting salaries for evaluation of overall relative efficiency. This model gave consideration to institutional variations in student–faculty ratios, average SAT scores, and annual tuition costs. Results show that the DEA model is able to identify differences between schools that are not readily apparent in the Bloomberg Businessweek rankings, providing an alternative view for prospective students.

Given the increasing public scrutiny of educational costs versus benefits, mathematical techniques for analyzing efficiency and performance such as DEA can provide additional insight into how undergraduate business schools compare with each other. As a recommendation for future research, DEA can be applied in a longitudinal study to observe which schools are consistently efficient over time. Investigations using DEA models with alternative efficiency measures, returns to scale, and/or weight distribution restrictions that address limitations of the basic technique may also be useful (Ray, 2004). For more advanced analyses, researchers should explore hybrid methodologies that combine DEA with regression-based efficiency assessment (Park, Lee, Park, & Kim, 2009; Tofallis, 2001) or multistage DEA models with adjustments for external environmental factors (Sav, 2013).

REFERENCES

Andersen, P., & Petersen, N. C. (1993). A procedure for ranking efficient units in data envelopment analysis. Management Science, 39, 1261–1264.

Avkiran, N. K. (2001). Investigating technical and scale efficiencies of Australian universities through data envelopment analysis. Socio-Economic Planning Sciences, 35, 57–80.

Charnes, A., Cooper, W. W., & Rhodes, E. (1978). Measuring the efficiency of decision making units. European Journal of Operational Research, 2, 429–444.

Colbert, A., Levary, R. R., & Shaner, M. C. (2000). Determining the relative efficiency of MBA programs using DEA. European Journal of Operational Research, 125, 656–669.

Emrouznejad, A., Parker, B. R., & Tavares, G. (2008). Evaluation of research in efficiency and productivity: A survey and analysis of the first 30 years of scholarly literature in DEA. Socio-Economic Planning Sciences, 42, 151–157.

Gloeckler, G. (2013, March 20). Notre Dame's Mendoza takes no. 1 ranking, again. Bloomberg Businessweek. Retrieved from http://www.businessweek.com/articles/2013-03-20/notre-dames-mendoza-takes-no-dot-1-ranking-again

Gross, J. (2013). How bad is it? The student loan default crisis in perspective. Coalition of State University Aid Administrators. Retrieved from http://www.cosuaa.org/conference/2013_Resources/Presentations/How%20Bad%20Is%20It-Loan%20Default-COSUAA%202013.pdf

Hirao, Y. (2012). Efficiency of the top 50 business schools in the United States. Applied Economics Letters, 19, 73–78.

Immerwahr, J., & Johnson, J. (2009). Squeeze play 2009: The public's views on college costs today. San Jose, CA: National Center for Public Policy and Higher Education.

Kong, W., & Fu, T. (2012). Assessing the performance of business colleges in Taiwan using data envelopment analysis and student based value-added performance indicators. Omega, 40, 541–549.

Kreutzer, D., & Wood, W. (2007). Value-added adjustment in undergraduate business school ranking. Journal of Education for Business, 82, 357–362.

Lavelle, L. (2006, May 8). The best undergraduate B-schools. Business Week, 76–93.

Lavelle, L. (2013, March 20). FAQ: How we ranked the schools. Bloomberg Businessweek. Retrieved from http://www.businessweek.com/articles/2013-03-20/faq-how-we-ranked-the-schools#r=lr-sr

Liu, C. A., & Liu, W. (2010). Performance evaluation on private higher education using data envelopment analysis. Proceedings of the IIE Annual Conference, 1–6.

Massachusetts Department of Higher Education. (2013). Resident mandatory fee rates. Retrieved from http://www.mass.edu/campuses/res_fees.asp

National Center for Education Statistics. (2013). Digest of education statistics 2012, table 381. Retrieved from http://nces.ed.gov/programs/digest/d12/tables/dt12_381.asp

Park, K. S., Lee, K. W., Park, M. S., & Kim, D. (2009). Joint use of DEA and constrained correlation analysis for efficiency valuations involving categorical variables. The Journal of the Operational Research Society, 60, 1775–1785.

Ragsdale, C. (2012). Spreadsheet modeling & decision analysis (6th ed.). Mason, OH: South-Western Cengage Learning.

Ray, S. C. (2004). Data envelopment analysis: Theory and techniques for economics and operations research. New York, NY: Cambridge University Press.

Sav, G. T. (2012). Productivity, efficiency, and managerial performance regress and gains in United States universities: A data envelopment analysis. Advances in Management & Applied Economics, 2(3), 13–32.

Sav, G. T. (2013). Four-stage DEA efficiency evaluations: Financial reforms in public university funding. International Journal of Economics and Finance, 5, 24–33.

Seiford, L. M., & Thrall, R. M. (1990). Recent developments in DEA: The mathematical approach to frontier analysis. Journal of Econometrics, 46(1–2), 7–38.

Sexton, T. R., & Comunale, C. L. (2010). An efficiency analysis of U.S. business schools. Journal of Case Studies in Accreditation and Assessment, 1. Retrieved from http://www.aabri.com/manuscripts/09256.pdf

Sreekumar, S., & Mahapatra, S. S. (2011). Performance modeling of Indian business schools: A DEA-neural network approach. Benchmarking: An International Journal, 18, 221–239.

Thanassoulis, E. (1993). A comparison of regression analysis and data envelopment analysis as alternative methods for performance assessments. The Journal of the Operational Research Society, 44, 1129–1144.

Tofallis, C. (2001). Combining two approaches to efficiency assessment. The Journal of the Operational Research Society, 52, 1225–1231.

APPENDIX
DATA AND SCHOOL RANKINGS BY DEA EFFICIENCY

Efficiency rank  College/university  Student–faculty ratio  SAT score  Tuition ($)  Starting salary ($)  Employer rank  BW rank  Efficiency score

1 Virginia (McIntire) 9.9 1390 9,622 60,000 15 2 1

1 Washington U.-St. Louis (Olin) 10 1492 44,100 62,000 34 4 1

1 Pennsylvania (Wharton) 9.7 1461 39,088 70,000 26 5 1

1 North Carolina-Chapel Hill (Kenan Flagler) 10.9 1350 5,823 60,000 28 10 1

1 Brigham Young (Marriott) 15 1219 4,710 51,000 3 12 1

1 MIT (Sloan) 0.9 1457 41,770 70,000 80 19 1

1 Florida (Warrington) 18.9 1286 6,170 51,000 8 37 1

8 Binghamton 31.6 1321 5,570 57,000 36 57 0.9837

9 Michigan-Ann Arbor (Ross) 12.5 1377 13,040 65,000 14 8 0.9800

10 Texas-Austin (McCombs) 24 1362 10,738 55,000 4 9 0.9782

11 UC-Berkeley (Haas) 17.3 1396 14,985 60,000 17 11 0.9659

12 Indiana (Kelley) 20.4 1299 8,750 55,000 1 13 0.9647

13 California Polytechnic State (Orfalea) 30 1234 5,472 55,000 74 64 0.9451

14 Notre Dame (Mendoza) 15 1414 42,464 57,000 5 1 0.9428

15 Emory (Goizueta) 9.7 1377 42,400 60,000 16 7 0.9240

16 Cornell (Dyson) 15.8 1416 27,045 58,922 21 3 0.9224

17 NYU (Stern) 12.2 1444 41,358 63,250 25 14 0.8945

18 Ohio State (Fisher) 19 1305 9,168 50,000 13 34 0.8655

19 Illinois-Urbana-Champaign 17.6 1345 16,556 55,000 6 21 0.8632

20 William & Mary (Mason) 12.1 1343 8,677 55,000 98 27 0.8192

21 Carnegie Mellon (Tepper) 11 1415 44,880 70,000 69 24 0.8131

22 USC (Marshall) 24.4 1384 43,722 54,000 11 31 0.7978

23 Texas A & M (Mays) 25 1230 9,330 52,000 7 33 0.7879

24 Louisiana State (Ourso) 28 1152 5,193 43,000 91 118 0.7647

25 Boston College (Carroll) 22.5 1355 43,140 60,000 9 6 0.7417

26 Florida International (Landon) 35 1052 5,217 41,748 95 113 0.7390

27 North Carolina State (Poole) 32 1185 5,748 44,472 57 95 0.7225


28 Northeastern (D’Amore-McKim) 21.1 1367 39,320 55,000 19 25 0.7199

29 West Virginia 22 1059 5,794 45,000 111 111 0.7173

30 Arizona (Eller) 14 1135 9,114 50,000 30 50 0.7066

31 Houston (Bauer) 18 1158 6,796 50,771 75 103 0.7042

32 Georgetown (McDonough) 22 1372 42,360 63,000 22 16 0.6983

33 Rutgers-New Brunswick 29 1290 10,688 55,000 27 81 0.6932

34 Arkansas-Fayetteville (Walton) 62 1111 6,141 46,000 81 105 0.6929

35 Wisconsin-Madison 23 1270 10,273 52,000 29 32 0.6656

36 Georgia Tech (Scheller) 15 1291 7,718 50,000 48 41 0.6647

37 Buffalo 20 1125 5,570 40,000 62 112 0.6632

38 Connecticut 15 1226 8,712 55,000 44 54 0.6573

39 Georgia (Terry) 17.6 1255 7,646 47,750 39 44 0.6500

40 Purdue (Krannert) 33.4 1179 9,900 52,000 18 58 0.6452

41 James Madison 26.8 1156 8,808 58,000 45 29 0.6327

42 South Florida 49 1226 6,330 42,000 124 121 0.6303

43 Virginia Tech (Pamplin) 27 1180 9,187 50,000 24 52 0.6265

44 Penn State-University Park (Smeal) 31 1226 17,824 56,000 2 26 0.6259

45 Texas-Dallas (Jindal) 12 1267 10,566 45,000 110 75 0.6198

46 Minnesota (Carlson) 27 1305 12,560 51,000 38 39 0.6193

47 U. of Washington (Foster) 17.7 1290 12,383 52,000 49 48 0.6127

48 Case Western (Weatherhead) 8.5 1267 40,120 47,500 51 69 0.6080

49 Alabama-Tuscaloosa (Culverhouse) 21 1150 9,200 58,000 82 73 0.6070

50 Boston U. 16 1296 42,400 52,000 20 23 0.5990

51 Colorado State 24 1163 6,874 43,390 87 89 0.5961

52 Kansas State 41.3 1105 6,829 43,000 53 114 0.5878

53 Oklahoma (Price) 30 1170 8,700 53,000 33 88 0.5861

54 Delaware (Lerner) 19.5 1218 10,150 55,000 65 76 0.5837

55 New Jersey 21 1231 10,102 53,400 92 59 0.5829

56 Villanova 14.8 1322 42,150 55,000 31 15 0.5802

57 Miami U. (Farmer) 18.6 1240 13,067 55,000 32 22 0.5735

58 Wake Forest 13.3 1333 42,700 57,000 40 18 0.5656

59 Michigan State (Broad) 26.2 1127 13,800 52,000 12 43 0.5587

60 Bentley 19 1207 38,130 50,000 10 20 0.5405

61 Tulane (Freeman) 15.5 1327 41,500 55,000 78 49 0.5249

62 Tennessee-Chattanooga 39.4 1065 5,722 32,500 118 120 0.5245

63 Iowa (Tippie) 36 1140 7,678 42,000 42 100 0.5198

64 Kansas-Lawrence 27 1155 8,790 47,250 59 110 0.5164

65 Arizona State (Carey) 27 1182 9,208 49,000 46 77 0.5147

66 Richmond (Robins) 13.4 1301 44,210 56,764 106 17 0.5054

67 South Carolina (Moore) 19 1217 10,088 47,499 63 87 0.5038

68 Missouri-Columbia (Trulaske) 20.1 1185 9,272 47,500 77 78 0.4960

69 Bowling Green State 17.7 1068 10,393 49,000 99 90 0.4954

70 Lehigh 16.4 1279 41,920 57,000 55 35 0.4897

71 Pittsburgh (Katz) 32 1262 17,568 48,000 41 82 0.4873

72 Utah (Eccles) 27.7 1120 8,921 45,000 121 117 0.4825

73 Oregon (Lundquist) 15 1143 9,310 40,000 79 107 0.4727

74 Southern Methodist (Cox) 24 1283 37,050 52,000 70 30 0.4667

75 Rutgers-Newark 44 1060 10,356 50,000 105 119 0.4629

76 Ohio 29 1100 10,216 49,000 89 86 0.4618

77 Worcester Polytechnic Institute 31 1190 40,790 63,000 86 51 0.4581

78 Clemson 28.7 1220 11,870 44,195 58 94 0.4541

79 Cincinnati (Lindner) 18.9 1142 9,124 43,000 66 84 0.4539

80 Santa Clara (Leavey) 16 1275 40,572 52,500 90 38 0.4528

81 Akron 23 1103 9,553 45,000 72 108 0.4517

82 Fordham (Gabelli) 18.1 1230 41,000 57,500 68 40 0.4515

83 Northern Illinois 44.7 1040 9,488 45,000 43 106 0.4506

84 U. of Miami 16.7 1284 39,980 50,000 113 70 0.4400


85 Toledo 35 1020 9,774 45,000 123 116 0.4372

86 Colorado-Boulder (Leeds) 33.3 1169 12,646 46,700 54 101 0.4295

87 Tulsa (Collins) 19 1207 32,410 53,500 67 55 0.4271

88 John Carroll (Boler) 10 1083 32,130 41,500 115 79 0.4268

89 Illinois State 32 1105 10,050 43,500 47 99 0.4256

90 Bradley (Foster) 13.5 1224 27,920 45,000 120 98 0.4208

91 Elon (Love) 26 1218 28,633 47,000 37 42 0.4193

92 Babson 21 1267 41,888 50,000 60 36 0.4174

93 Denver (Daniels) 18 1300 38,232 45,000 56 68 0.4152

94 Baylor (Hankamer) 24.4 1211 30,586 51,000 97 66 0.4148

95 DePaul (Driehaus) 18 1129 31,650 55,000 52 60 0.4121

96 George Washington 20 1270 45,735 50,000 88 71 0.4078

97 Vermont 29.4 1130 13,344 47,321 108 123 0.4022

98 Texas Christian (Neeley) 24 1196 36,500 53,000 84 28 0.4013

99 Seton Hall (Stillman) 19 1175 32,700 52,500 94 85 0.3953

100 American (Kogod) 18.9 1220 37,554 50,000 114 56 0.3941

101 San Diego 15.7 1216 39,486 50,000 102 46 0.3897

102 Kentucky (Gatton) 28.1 1090 9,676 38,250 96 122 0.3787

103 Syracuse (Whitman) 25 1182 37,610 51,500 50 72 0.3773

104 Marquette 24.3 1184 32,810 48,000 71 74 0.3648

105 Duquesne (Palumbo) 19.5 1112 27,668 48,000 76 96 0.3581

106 Fairfield (Dolan) 25 1150 41,090 52,522 107 83 0.3560

107 Bryant 25.8 1125 35,591 52,000 35 63 0.3501

108 Ohio Northern University (Dicke) 15 1105 35,678 44,000 101 62 0.3488

109 Seattle (Albers) 18 1149 34,200 47,500 100 67 0.3486

110 Butler 18.3 1143 32,280 47,000 104 47 0.3484

111 Loyola-Chicago (Quinlan) 20 1165 33,810 47,500 83 104 0.3481

112 St. Louis (Cook) 21 1185 34,740 46,000 109 91 0.3456

113 Loyola-Maryland (Sellinger) 18 1180 41,026 47,000 103 53 0.3389

114 Quinnipiac 22.5 1089 38,000 53,000 61 61 0.3346

115 Loyola Marymount 22 1192 38,012 44,000 117 65 0.3273

116 California-Riverside 65 1070 12,192 41,000 112 124 0.3261

117 Hofstra (Zarb) 22 1147 34,900 45,500 116 115 0.3199

118 Providence College 25 1152 41,350 47,000 93 109 0.3191

119 Rochester Institute of Technology (Saunders) 18 1131 32,784 41,917 73 93 0.3104

120 St. Joseph’s (Haub) 23.5 1115 37,670 47,300 64 92 0.3101

121 St. Thomas (Opus) 19 1180 33,040 39,644 122 80 0.3004

122 Xavier (Williams) 25 1107 32,140 44,000 85 102 0.2939

123 Belmont 18.9 1147 24,900 35,500 119 97 0.2811
