Full Terms & Conditions of access and use can be found at
http://www.tandfonline.com/action/journalInformation?journalCode=vjeb20
Journal of Education for Business
ISSN: 0883-2323 (Print) 1940-3356 (Online) Journal homepage: http://www.tandfonline.com/loi/vjeb20
A Multiattributes Approach for Ranking PhD Programs

Frank R. Urbancic
University of South Alabama, Mobile, Alabama
To cite this article: Frank R. Urbancic (2008) A Multiattributes Approach for Ranking PhD Programs, Journal of Education for Business, 83:6, 339-346, DOI: 10.3200/JOEB.83.6.339-346 To link to this article: http://dx.doi.org/10.3200/JOEB.83.6.339-346
Published online: 07 Aug 2010.
The PhD supply shortage for business education has been well documented in recent years. Initial attention was drawn to the problem by the Association to Advance Collegiate Schools of Business International (AACSB) in the landmark report Management Education at Risk in 2002. AACSB responded by creating the Doctoral Faculty Commission. The AACSB (2003) commission presented a comprehensive assessment of the crisis 1 year later and provided recommended actions for addressing the problem in its report Sustaining Scholarship in Business Schools. One of AACSB's key recommendations calls for the development of PhD program rankings. Unlike other business school programs, such as the master of business administration (MBA) program, there are few financial or reputational incentives for academic institutions to invest in PhD programs. According to AACSB (2003), the development of PhD program rankings should provide reputational incentives to stimulate added investments in the programs by business schools, thereby counterbalancing the disproportionate influence of MBA rankings on business schools.
Evidence suggests that rankings matter to prospective PhD students, especially during the early stages of their process of identifying a set of potential programs. In a survey of MBA students who indicated that they might enter a PhD program at some point in the future, Davis and McCarthy (2005) asked the students to rate the importance of factors in selecting programs to which they would apply. According to this survey, one of the most important factors is college ranking. Although readily available, the aforementioned college rankings focus primarily on the MBA programs of business colleges, and therefore a ranking of PhD programs for each of the major disciplines of business would be much more relevant to prospective students. There are previously published studies that rank PhD programs, but those studies were based on only a single attribute (e.g., a count of either the number of articles published or the number of citations to the published research of a program's graduates). The purpose of the present study is to propose a multiattributes approach for ranking PhD programs. The advantage inherent to this approach is a broader consideration of the indicators of quality and reputation of a program as measured by the accomplishments of its graduates. Because the AACSB (2003) explicitly emphasizes the role of research as a contributor to PhD program quality, the multiattributes approach presented in this study includes a ranking metric that recognizes the importance of research.
In this study, we demonstrate the multiattributes approach for ranking PhD programs by an application to
ABSTRACT. In its plan to combat the PhD shortage crisis, the Association to Advance Collegiate Schools of Business International (AACSB; 2003) has called for the development of PhD program rankings to serve as incentives for academic institutions to invest more in PhD programs, thereby counterbalancing the disproportionate influence of master of business administration (MBA) rankings on business schools. The author reports on the development of a unique multiattributes approach for objectively ranking PhD programs. The advantage of this approach is an inherently broader consideration of the indicators of quality and reputation of a program as measured by the accomplishments of its graduates. By combining multiple attributes into a ranking metric, this approach emphasizes research quality in line with the recommendation stated by the AACSB (2003). Also, because the multiattributes approach incorporates data that are readily available, PhD program rankings can be more efficiently updated annually.
Keywords: doctoral programs, PhD shortage, ranking metric, research reputation
Copyright © 2008 Heldref Publications
accounting. However, the same method can be applied to rank the PhD programs for each of the other primary disciplines of finance, management, and marketing. The remainder of this study is organized as follows: First, previous research related to ranking PhD programs is reviewed, and the attributes used in these studies are critically evaluated for suitability. In the second section, relevant attributes are identified and support for their inclusion is discussed. The third section presents the findings from application of the multiattributes approach. Finally, concluding comments on the significance of the findings are discussed.
Related Research
Previously published studies that rank PhD programs in accounting are based on only a single attribute. Where the studies vary is in the specific attribute chosen as the basis for ranking programs. To date, published rankings have been based on the application of an attribute chosen from one of the following: perceptions of program quality, number of published journal articles by graduates, number of published citations to the research of graduates, initial placement record of graduates, graduates' representation on editorial boards of academic journals, and the number of endowed positions held by graduates. A discussion of these attributes follows and includes consideration of their suitability to the purpose of ranking PhD programs.
The earliest PhD-ranking studies for accounting are the surveys of Carpenter, Crumbley, and Strawser (1974) and Estes (1970), which focused on perceptions of doctoral program quality. Both studies relied on a survey questionnaire but differed in their approach. Carpenter et al. provided a list of doctoral programs to 1,190 faculty members with a request for an assessment of perceived quality for each program based on a 4-point scale. Estes also provided a list of doctoral programs, but participants were asked to rank only the top programs from 1 to 10. These survey studies were soundly criticized by Morton (1975), Zeff and Rhode (1975), and Rhode and Zeff (1970), primarily for the lack of a consistent standard or defined criteria on which to evaluate quality, but also for inherent problems of bias that significantly limit the usefulness of the survey approach as a suitable basis for ranking doctoral programs.
Another technique for ranking PhD programs is based on a count of the number of published journal articles by graduates. This approach serves as the basis for ranking in studies by Bazley and Nikolai (1975), Bublitz and Kee (1984), Hasselback and Reinstein (1995), Jacobs, Hartgraves, and Beard (1986), and Stevens and Stevens (1996). Differences among the rankings provided by the studies are the result of differences in the choice of journals and the time periods examined. For example, Bublitz and Kee counted articles from the largest number of journals (69) but for the shortest period of time (5 years). Compared with the study by Bublitz and Kee, the studies by Bazley and Nikolai and by Jacobs et al. used longer time frames (7 and 13 years, respectively) but counted articles published in a very small group of journals (4 and 8 journals, respectively). The journal article count studies based on the longest time periods are by Hasselback and Reinstein (1995), who examined 41 journals over a period of 15 years, and by Stevens and Stevens, who examined 40 journals over a period of 19 years. A key difference between the latter studies is that Hasselback and Reinstein adjusted their counts for coauthorships, whereas Stevens and Stevens counted a coauthored article as a whole article for each author, regardless of the number of authors on the article.
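The coauthorship issue can be made concrete with a small sketch (illustrative code, not drawn from any of the cited studies). Whole counting credits each author with a full article; a proportional adjustment divides the credit among coauthors. The 1/k split used below is an assumed convention for illustration, since the text does not specify exactly how Hasselback and Reinstein adjusted their counts:

```python
# Two conventions for counting a graduate's articles; each article is
# represented here by its number of authors. Whole counting credits a full
# article regardless of coauthors; the adjusted count assumes a 1/k
# convention (an article with k authors contributes 1/k of an article).
def whole_count(author_counts):
    return len(author_counts)

def adjusted_count(author_counts):
    return sum(1 / k for k in author_counts)

papers = [1, 2, 3]  # a solo article, a 2-author article, a 3-author article
print(whole_count(papers))               # 3 full credits
print(round(adjusted_count(papers), 2))  # 1.83 credits under 1/k adjustment
```

Under whole counting the hypothetical graduate above receives credit for 3 articles; under the adjusted convention, roughly 1.83, which is why the two conventions can reorder programs.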
The wide differences of opinion regarding how to identify an appropriate set of journals, how to choose the right time frame, and whether it is fitting to adjust for coauthorships all combine to limit the usefulness of article counts as a basis for ranking PhD programs. For example, researchers might argue that a count should be based only on articles that are published in top-tier journals. However, a study by Smith (2004) provided empirical evidence that not all the articles in the top journals are top articles. Another weakness of article count as an attribute for ranking program quality is that the approach ignores important research contributions that are published as books or monographs rather than journal articles. On the basis of a questionnaire survey of 2,135 accounting academicians, Heck and Huang (1986) identified the top 15 research monographs that have made the most significant contributions to the accounting literature. Yet these types of publications are not considered in PhD rankings that are based on article counts.
A third approach that researchers have used to rank PhD programs is based on the number of citations to the published research of a program's graduates. Frequency of citation is a measure that is considered by some to be as revealing of reputation for quality as any other approach. PhD programs in accounting are ranked on the basis of citation analysis by Brown and Gardner (1985), Gamble and O'Doherty (1985), and Sriram and Gopalakrishnan (1994). These studies yield different rankings primarily because of differences in the journals chosen for analysis. Gamble and O'Doherty analyzed the Accounting Review (AR) and Journal of Accounting Research (JAR); Brown and Gardner assessed the Journal of Accounting and Economics (JAE) and Accounting, Organizations and Society (AOS) in addition to AR and JAR; and Sriram and Gopalakrishnan analyzed six journals: AR, JAR, JAE, AOS, Auditing: A Journal of Practice and Theory, and Journal of Accounting, Auditing and Finance. Therefore, as is the case with article counts, the lack of agreement about the correct set of journals and the focus on journal articles to the exclusion of books and monographs raise questions about the suitability of the citation analysis approach for ranking PhD programs. Additional weaknesses of citation analysis include the treatment of positive and negative citations as equals and the inability to differentiate citations that are biased in favor of popular authors, topics, or methodologies.
The initial placement of graduates represents a fourth approach to ranking PhD programs. According to Fogarty and Saftner (1993), the premise of this approach suggests that because candidates are hired on the basis of how they will appear to outside observers, the prestige of their doctoral program is central, whereas the real credentials and underlying facts of the candidate go unexamined. Fogarty and Saftner used this approach to rank 68 programs by their placements from 1980–1989, and Stammerjohan and Hall (2002) studied placements to rank the graduates of 80 programs on the basis of initial placements from 1986–1990. In Fogarty and Saftner's study, prestige was measured on the basis of the percentage of a program's graduates who were initially placed in positions with doctoral-granting departments rather than non–doctoral-granting departments. In contrast, Stammerjohan and Hall recognized that the prestige of some non–doctoral-granting departments may actually exceed that of some less prestigious doctoral-granting departments, and for this reason they used a different basis for ranking PhD programs. In the study by Stammerjohan and Hall, the measures of graduate placement were along two lines: The first scale used results from a ranking of universities and colleges published in U.S. News and World Report: America's Best Colleges, and the second scale used previously published information (from Hasselback and Reinstein, 1995) on the research productivity of accounting departments. According to Stammerjohan and Hall, the prestige of a PhD program can be measured by graduates' placements with top-tier universities and by their placements with accounting departments that are recognized for above-average publication productivity. Although the latter study improved on the method used by Fogarty and Saftner, questions remain concerning the suitability of initial placements as a basis for ranking PhD programs. For example, prestige may be offset by other factors that are excluded from these studies, such as a candidate's geographic location preference in the job search.
Also, supply and demand characteristics can partially mitigate prestige structures, so that initial placement characteristics are not stable over time. Indeed, the current severe shortage of faculty could cause changes in the hiring choices of higher quality departments, and such shifts are a function of the labor market rather than the quality of PhD programs.
The fifth approach used to rank PhD programs is based on a count of the number of journal editorial board memberships held by the graduates of a program. Editorial board representation, as discussed by Urbancic (2006), is often used to rank faculties in the areas of accounting, economics, finance, marketing, real estate, statistics, and transportation. Mittermaier (1991) extended the editorial board approach to develop a ranking of PhD programs based on the doctoral origins of editorial board members for accounting journals. A multidisciplinary study by Trieschmann, Dennis, Northcraft, and Niemi (2000) added validity and relevance to the use of editorial board memberships as a basis for an assessment of academic quality by demonstrating a positive correlation between the number of memberships held and business school rankings. Because it is imperative that journal editors endeavor to sustain and enhance journal reputation, Rynes (2006) stated that scholars with strong publication and citation records are the most obvious candidates to receive board invitations to leading journals. In effect, the editorial board approach encompasses both the article count and citation analysis methods for ranking PhD programs. The latest year for which Mittermaier (1991) obtained editorial board data was 1990, but since that time an additional seven PhD programs in accounting have been established, and therefore more recent information on memberships is necessary to provide a more current ranking of PhD programs.
The sixth approach used to rank PhD programs is a count of the number of named positions (endowed chairs, funded professorships, and fellowships) held by the graduates of a program. According to a study by Worthington, Waters, and Fields (1989), a doctoral program's ability to develop highly productive graduates can be measured by the number of doctoral graduates holding named positions. This approach has served as the basis for a ranking of accounting PhD programs in studies by Meier and Kamath (2005), Tang and Griffith (1997), and Worthington et al. (1989). But, except for the study by Worthington et al. (1989), the reported results are not sufficiently comprehensive in their program coverage. For example, there are more than 80 PhD programs in accounting in the United States, but Tang and Griffith presented a ranking for only 28 programs, although they indicated that there are at least 100 graduates of other PhD programs who also hold named positions. The study by Meier and Kamath improved on the work of Tang and Griffith by reporting rankings for 37 programs, but numerous programs represented by graduates holding 89 named positions remained unreported in their ranking. The study by Worthington et al. offered the most complete look at all the PhD programs in accounting with respect to named positions held by graduates, but the ranking is based on data collected in 1988, and since then several more PhD programs have been initiated and an even greater number of named positions have been established. The number of named position holders is a relevant basis on which to rank PhD programs, but a more comprehensive and current compilation of information is called for.
METHOD
The review of previously published approaches used to rank PhD programs suggests that a rank based on only a single attribute does not sufficiently distinguish differences in quality. Therefore, an improvement in the ranking of PhD programs could be achieved by developing a multiattributes approach. An essential consideration in the development of this approach is explicit recognition of the emphasis placed by the AACSB (2003) on research as the primary determinant of PhD program quality and rankings. For this approach, three attributes are chosen to compose a ranking metric based on the doctoral origins of the following: research award winners, editorial board members for top journals, and named position holders.
The first component of the multiattributes ranking approach is the doctoral origins of research award winners. Although not used by prior researchers to rank PhD programs, the power of national awards as a signal of leadership in research has been documented by Lee (1995). Using the history of the American Accounting Association (AAA) as an empirical foundation for analyzing the development of academic accounting research, Lee (1995) found that research awards exist on the same level as editorial board appointments in terms of their capacity to signify research elites among doctoral programs. Currently, the AAA provides national recognition in the form of seven awards, of which five are based on research: the Wildman Medal Award, Seminal Contributions to Accounting Literature Award, Notable Contributions to Accounting Literature Award, Outstanding Accounting Educator Award, and the Competitive Manuscript Award. The doctoral origins were identified for all winners of these awards and incorporated as part of the rankings for PhD programs. Research awards as a basis for ranking PhD programs have three advantages compared with counts of the number of articles published or research citations. First, both the Wildman Medal Award and the Notable Contributions to Accounting Literature Award more broadly consider significant books and monographs, as well as journal articles, in the recognition of research. Second, as previously discussed, a study by Smith (2004) provided empirical evidence that not all the articles in the top journals are top articles. By comparison, only research judged by the AAA as top is bestowed with national recognition. Third, the disadvantages of citation analysis as a basis for rankings are avoided, because research that has garnered an award is most likely to be heavily cited research anyway.
The number of editorial board memberships held by the graduates of a PhD program constitutes a valid indicator of quality (Mittermaier, 1991). Because a strong record of publication is a prerequisite for selection to a board, it is reasonable that the number of memberships held implicitly includes "number of articles published" as a ranking metric, but without a need to confront the problem of whether to adjust for coauthorship credit. In relying on the number of editorial board memberships held as an indicator of quality, it is necessary to first identify an appropriate core set of journals. Studies that identify the most influential journals in academic accounting have been conducted by Bonner, Hesford, Van der Stede, and Young (2006) and by Ballas and Theoharakis (2003). Both studies concluded that the top five journals in accounting are AOS, AR, JAE, JAR, and Contemporary Accounting Research (CAR). Therefore, in the present study the multiattributes ranking includes the doctoral origins of the editorial board members of these five journals, based on the degree information published in Hasselback's (2006) Accounting Faculty Directory 2006–2007.
We also used the data provided by Hasselback's (2006) Accounting Faculty Directory 2006–2007 to identify named faculty position holders and their doctoral origins. In a manner similar to that of Meier and Kamath (2005), we interpreted named positions broadly to include endowed chairs, named professorships, and fellowships, without regard to faculty rank. Validation for using the doctoral origins of named position holders as the third component for ranking PhD programs was provided by survey studies of named positions by Rezaee, Elmore, and Spiceland (2004) and by Tang, Forrest, and Leach (1990), because the results from both studies indicated that the most important criterion in the decision for an appointment to a named position is the record of published research productivity established by an individual. Respectively, these studies reported that universities seek scholars with outstanding or excellent publication records to fill named positions.
RESULTS
Information on the doctoral origins of research award winners, editorial board members for top journals, and named position holders for the graduates of 80 PhD programs is in Table 1. We excluded from Table 1 all PhD programs with fewer than 5 graduates (Duke, Florida International, Georgia Institute of Technology, Lehigh, Rensselaer, Rice, SUNY–Binghamton, and Vanderbilt) and doctoral programs in accounting that had been discontinued at three universities (American, St. Louis, and Santa Clara). Collectively, graduates of the 80 PhD programs in Table 1 have received 226 awards for outstanding research, hold 236 editorial board memberships for top journals, and hold 462 named faculty positions.
Comparisons among the programs presented in Table 1 reveal that it is a rare accomplishment for a program's graduates to excel in more than one of the three categories. Programs whose graduates have received 15 or more AAA awards for research include Berkeley, Chicago, Cornell, Illinois, Michigan, and Stanford, while 15 or more memberships on editorial boards are held by graduates of the programs of Chicago, Iowa, Michigan, Rochester, and Stanford. And the graduates of the programs of Illinois, Indiana, Michigan, Ohio State, Pennsylvania State, and Texas at Austin hold 15 or more named faculty positions. At the other extreme are graduates from six programs who have not received a research award, who do not currently serve as members of a prominent editorial board, and who do not hold appointments to a named position. These programs are Cleveland State, Drexel, Memphis, Rutgers, Virginia Commonwealth, and Washington State. Awards for research have gone to the graduates of only 32 programs, whereas the editorial board appointments extend to the graduates of only 37 programs, and the named positions are held by graduates of 71 programs. This wide disparity in the achievements attained by graduates of PhD programs in accounting further underscores the importance of using a multiattributes approach to rank the programs.
The process for assigning the relative ranks to PhD programs in a multiattribute format is based on computing a combined score to represent research awards, editorial board memberships, and named positions. For example, information in Table 1 indicates that one research award winner, no board members, and 12 named position holders received their PhDs from Alabama. Therefore, a computed score for Alabama's PhD program would equal .0304 (or the sum of 1/226 + 0/236 + 12/462).
Because there are approximately twice as many named position holders (462) as there are either awards (226) or editorial board members (236), the portion of the score that is weighted for named positions is in effect reduced by half. However, this reduction is a justifiable outcome of differences inherent to the three attributes. In other words, a graduate from a given PhD program has either received a national AAA award for research or has not, and the graduate is either a member of a top journal editorial board or is not. By comparison, appointment to a named position on the faculty does not in every instance carry the same distinctive significance as either an award or selection to an editorial board, because named positions are known to range widely from a relatively small annual stipend to a far more lucrative salary package. Findings from survey studies of endowed position holders by Bloom, Fuglister, and Meier (1996) and by Rezaee, Elmore, and Spiceland (2004) indicated extensive differences in the financial amounts provided to fund support for the positions, with corresponding differences in compensation for the position holders. Therefore, compared with AAA research awards and selection to a top journal editorial board, named position appointments signify relevant achievement but are not uniformly as strong an indication of the research emphasis in the PhD program of the appointee. And bear in mind that the emphasis on the role of "research as an important contributor to PhD program quality" is central to the AACSB (2003) call for ranking PhD programs (p. 34).
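As an illustrative sketch (not code from the study), the combined-score computation can be expressed as follows; the category totals of 226 awards, 236 board memberships, and 462 named positions come from Table 1:

```python
# Sketch of the multiattributes score: a program's count in each category
# is divided by the category total across all 80 programs (Table 1), and
# the three resulting proportions are summed.
AWARD_TOTAL, BOARD_TOTAL, POSITION_TOTAL = 226, 236, 462

def program_score(awards, boards, positions):
    """Combined score = awards/226 + boards/236 + positions/462."""
    return (awards / AWARD_TOTAL
            + boards / BOARD_TOTAL
            + positions / POSITION_TOTAL)

# Counts from Table 1:
print(round(program_score(1, 0, 12), 4))    # Alabama: .0304, as in the text
print(round(program_score(32, 24, 13), 4))  # Chicago: .2714, ranked 1st
```

Because each category's contribution is normalized by its own total, the named-positions component carries roughly half the per-holder weight of the other two components, which is the effect the weighting discussion justifies.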
Table 2 presents the ranks for 80 PhD programs in accounting on the basis of the process for computing weighted scores as described in the preceding paragraph. According to these results, the top 10 programs are Chicago, Michigan, Illinois, Stanford, Berkeley, Cornell, Ohio State, Washington, Texas–Austin, and Rochester. Most of these top programs are also highly ranked in previous studies, but by incorporating a ranking metric based on multiple attributes, the rankings can be extended to a greater number of programs than would normally be the case
TABLE 1. Doctoral Origins of Research Award Winners, Editorial Board Members, and Named Position Holders

PhD program                     Research          Editorial        Named
                                award winners     board members    position holders
Alabama                               1                 0               12
Arizona                               1                 8               12
Arizona State                         0                 2               14
Arkansas                              0                 0               13
Boston                                0                 1                0
California–Berkeley                  22                 8                9
California–Los Angeles                0                 1                4
Carnegie Mellon                      10                 6               13
Case Western Reserve                  0                 0                3
Central Florida                       0                 0                1
Chicago                              32                24               13
Cincinnati                            0                 0                2
CUNY–Baruch                           0                 0                2
Cleveland State                       0                 0                0
Colorado                              0                 0                2
Columbia                              0                 3                4
Connecticut                           0                 1                0
Cornell                              17                10                6
Drexel                                0                 0                0
Florida                               3                 2                8
Florida State                         0                 0                6
George Washington                     0                 0                1
Georgia                               1                 0                8
Georgia State                         0                 0                8
Harvard                               1                 6                5
Houston                               0                 1                4
Illinois                             23                 7               22
Indiana                               1                 1               15
Iowa                                  4                16                8
Kansas                                0                 3                1
Kent State                            0                 0                3
Kentucky                              0                 0               10
Louisiana State                       2                 0               11
Louisiana Tech                        0                 0                4
Maryland                              0                 0                4
Massachusetts                         0                 1                1
Massachusetts Inst. of Tech.          2                 3                1
Memphis                               0                 0                0
Michigan                             15                30               19
Michigan State                       10                 3               14
Minnesota                             5                10               14
Mississippi                           0                 0                9
Mississippi State                     0                 0                7
Missouri                              2                 0               10
Nebraska                              0                 0               10
New York                              0                 2                4
North Carolina                        0                 3               10
North Texas                           0                 0                9
Northwestern                          1                 5                8
Ohio State                           14                 7               18
Oklahoma                              0                 0                3
Oklahoma State                        1                 0                8
Oregon                                3                 2                3
Pennsylvania                          1                 8                5
Pennsylvania State                    1                 6               16
Pittsburgh                            0                 5                2
Purdue                                0                 0                1
Rochester                             6                15                8

(table continues)
if based only on a single attribute. However, the expansion of ranks to include more programs results in several tied scores. For determining the ranks assigned in Table 2, we resolved any tied scores between programs in favor of the program with fewer graduates through the year 2005, according to data provided by Hasselback's (2006) Accounting Faculty Directory 2006–2007. In all, there were 11 tied scores resolved on this basis, with only the programs of CUNY–Baruch and Temple remaining tied in the 66th position, because both programs had an identical number of graduates (38). We emphasize that number of graduates is nothing more than an expedient condition for breaking ties and is not necessarily coincident with either a higher or lower rank. Some findings indicate that the larger programs do not automatically have an advantage in terms of rank. For example, Chicago and Stanford, with 74 and 72 graduates, respectively (according to Hasselback's 2006 Accounting Faculty Directory 2006–2007), rank significantly higher (1st and 4th) than Missouri (26th) and Arkansas (30th), with 183 and 168 graduates, respectively. Conversely, Oregon (28th) and Pittsburgh (31st), with 51 and 37 graduates, respectively, rank significantly lower than do Michigan (2nd) and Ohio State (7th), with 119 and 133 graduates, respectively. Although size of program does not coincide with rank, there are indications that the age of a program, as measured by the first year in which a degree was conferred per data in Hasselback's (2006) Accounting Faculty Directory 2006–2007, tends to align with rank, in that established programs rank higher than newer programs. For example, 17 of the 20 programs composing the top quartile conferred first degrees prior to 1968; the only exceptions are Arizona, Pennsylvania, and Rochester. On the other hand, all of the programs composing the fourth quartile conferred a first degree after 1968, except for Colorado, Utah, and Washington–St. Louis.
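The rank-assignment and tie-breaking rule described above can be sketched as follows. This is an illustration under assumptions: Arkansas's 168 graduates and the 38 shared by CUNY–Baruch and Temple follow the text, while Tennessee's graduate count and the miniature four-program field are hypothetical stand-ins, not a reproduction of Table 2:

```python
# Sketch of rank assignment: order programs by score (descending) and break
# tied scores in favor of the program with fewer graduates through 2005.
# Programs identical on both score and graduate count share a rank, as
# CUNY–Baruch and Temple do in the study. Tennessee's graduate count (150)
# is a hypothetical stand-in.
def assign_ranks(programs):
    """programs: list of (name, score, graduates) tuples."""
    ordered = sorted(programs, key=lambda p: (-p[1], p[2]))
    ranks, prev = {}, None
    for i, (name, score, grads) in enumerate(ordered, start=1):
        if prev and (score, grads) == (prev[1], prev[2]):
            ranks[name] = ranks[prev[0]]  # unresolved tie: share the rank
        else:
            ranks[name] = i
        prev = (name, score, grads)
    return ranks

ranks = assign_ranks([("Tennessee", .0281, 150), ("Arkansas", .0281, 168),
                      ("CUNY-Baruch", .0043, 38), ("Temple", .0043, 38)])
print(ranks)  # Tennessee before Arkansas; CUNY-Baruch and Temple share a rank
```

Sorting on the pair (negative score, graduate count) implements both rules in one pass: the score ordering dominates, and the graduate count only matters when scores tie.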
DISCUSSION
When comparing PhD programs, the role and extent of emphasis on quality research is an essential characteristic that sets the programs apart from
TABLE 1. (cont.)

PhD program                     Research          Editorial        Named
                                award winners     board members    position holders
Rutgers                               0                 0                0
South Carolina                        0                 0                5
South Florida                         0                 0                2
Southern California                   1                 1                6
Southern Illinois                     0                 0                2
Stanford                             18                16               14
SUNY–Buffalo                          1                 0                1
Syracuse                              0                 0                3
Temple                                0                 0                2
Tennessee                             0                 0               13
Texas–Arlington                       0                 0                2
Texas–Austin                         12                 3               24
Texas A&M                             1                 0                7
Texas Tech                            0                 0                9
Tulane                                0                 2                0
Utah                                  0                 0                2
Virginia Commonwealth                 0                 0                0
Virginia Poly. Inst.                  0                 0                4
Washington                           11                10               14
Washington–St. Louis                  0                 0                2
Washington State                      0                 0                0
Wisconsin                             3                 4               14

Total                               226               236              462
TABLE 2. PhD Program Rankings

Rank   Program                         Score
  1    Chicago                         .2714
  2    Michigan                        .2346
  3    Illinois                        .1790
  4    Stanford                        .1777
  5    California–Berkeley             .1507
  6    Cornell                         .1306
  7    Ohio State                      .1306
  8    Washington                      .1213
  9    Texas–Austin                    .1178
 10    Rochester                       .1074
 11    Iowa                            .1028
 12    Carnegie Mellon                 .0978
 13    Minnesota                       .0948
 14    Michigan State                  .0873
 15    Pennsylvania State              .0645
 16    Arizona                         .0643
 17    Wisconsin                       .0605
 18    Pennsylvania                    .0491
 19    Northwestern                    .0429
 20    Indiana                         .0411
 21    Harvard                         .0407
 22    Florida                         .0391
 23    Arizona State                   .0388
 24    North Carolina                  .0344
 25    Louisiana State                 .0327
 26    Missouri                        .0305
 27    Alabama                         .0304
 28    Oregon                          .0282
 29    Tennessee                       .0281
 30    Arkansas                        .0281
 31    Pittsburgh                      .0255
 32    Massachusetts Inst. of Tech.    .0237
 33    Oklahoma State                  .0217
 34    Georgia                         .0217
 35    Southern California             .0216
 36    Kentucky                        .0216
 37    Nebraska                        .0216
 38    Columbia                        .0214
 39    Texas A&M                       .0196
 40    Texas Tech                      .0195
 41    Mississippi                     .0195
 42    North Texas                     .0195
 43    Georgia State                   .0173
 44    New York                        .0171
 45    Mississippi State               .0152
 46    Kansas                          .0149
 47    Florida State                   .0130
 48    California–Los Angeles          .0129
 49    Houston                         .0129
 50    South Carolina                  .0108
 51    Maryland                        .0087
 52    Louisiana Tech                  .0087
 53    Virginia Poly. Inst.            .0087
 54    Tulane                          .0085
 55    SUNY–Buffalo                    .0066
 56    Case Western Reserve            .0065
 57    Syracuse                        .0065
(table continues)
each other. Such comparisons inevitably invite development of a suitable approach to ranking PhD programs. Compared with the various approaches used as a basis in prior studies that ranked PhD programs, the method that we applied in this study is unique by virtue of its simultaneous combination of three attributes as a ranking metric of research quality. These attributes are research awards, editorial board memberships for top journals, and holders of named positions. In terms of their capacity to signal the research quality of a PhD program, the attributes encompass the more traditional measures of productivity accomplishment, such as counts of either the number of articles published or the number of citations to the published research of a program's graduates. Also, because the multiattributes approach incorporates data that are readily available, PhD program rankings can be more efficiently updated annually.
The development of a reliable and efficient means for ranking PhD programs is a worthwhile goal. Survey results from a study by Davis and McCarthy (2005) provided evidence that college rankings matter greatly to prospective PhD students, especially during the early stages of their process of identifying a set of potential PhD programs. However, although readily available, the aforementioned college rankings focus primarily on the MBA programs of business colleges, and therefore a ranking of PhD programs for each of the major disciplines of business would offer far more relevant information to prospective students. Also, AACSB (2003) emphasized that the development of PhD program rankings should provide reputational incentives to stimulate added investments in the programs by business schools, thereby counterbalancing the disproportionate influence of MBA rankings on business schools. The results from the present study show that a multiattributes approach to ranking PhD programs has the potential to successfully achieve the objective set forth by AACSB.
A decision to pursue a PhD is a significant one, as is the choice of programs to apply to. Rankings are an important information source for comparing programs during early stages of the search process, but there are additional factors that prospective students should consider prior to finalizing their decisions. Some of these considerations are admission requirements, curriculum, amounts of financial support offered, geographic preferences, and preferences between a small-town location and a larger city location.
NOTES
Dr. Frank R. Urbancic's research interests are financial reporting standards and business education.
Correspondence concerning this article should be addressed to Frank R. Urbancic, Department of Accounting, Mitchell College of Business, University of South Alabama, Mobile, AL 36688, USA.
E-mail: furbanci@usouthal.edu
REFERENCES
Association to Advance Collegiate Schools of Business. (2002). Management education at risk. St. Louis, MO: Author.
Association to Advance Collegiate Schools of Business. (2003). Sustaining scholarship in business schools. St. Louis, MO: Author.
Ballas, A., & Theoharakis, V. (2003). Exploring diversity in accounting through faculty journal perceptions. Contemporary Accounting Research, 20, 619–644.
Bazley, J. D., & Nikolai, L. A. (1975). A comparison of published accounting research and qualities of accounting faculty and doctoral programs. Accounting Review, 50, 605–610.
Bloom, R., Fuglister, J., & Meier, H. H. (1996). Trends in named professorships in accounting. Accounting Educators' Journal, 8(1), 80–90.
Bonner, S. E., Hesford, J. W., Van der Stede, W. A., & Young, S. M. (2006). The most influential journals in academic accounting. Accounting, Organizations and Society, 31, 663–685.
Brown, L. D., & Gardner, J. C. (1985). Applying citation analysis to evaluate the research contributions of accounting faculty and doctoral programs. Accounting Review, 60, 262–277.
Bublitz, B., & Kee, R. (1984). Measures of research productivity. Issues in Accounting Education, 2(1), 39–60.
Carpenter, C. G., Crumbley, D. L., & Strawser, R. H. (1974). A new ranking of accounting faculties and doctoral programs. Journal of Accountancy, 137(6), 90–94.
Davis, D. F., & McCarthy, T. M. (2005). The future of marketing scholarship: Recruiting for
Hasselback, J. R. (2006). Accounting faculty directory 2006–2007. Upper Saddle River, NJ: Prentice-Hall.
Hasselback, J. R., & Reinstein, A. (1995). Assessing accounting doctoral programs by their graduates' research productivity. Advances in Accounting, 13(1), 61–86.
Heck, J. L., & Huang, J. C. (1986). Contributions to accounting literature: A peer assessment of monographs. Journal of Accounting Education,
Jacobs, F. A., Hartgraves, A. A., & Beard, L. H. (1986). Publication productivity of doctoral alumni: A time-adjusted model. Accounting Review, 61, 179–187.
Lee, T. (1995). Shaping the U.S. academic accounting research profession: The American Accounting Association and the social construction of a professional elite. Critical Perspectives on Accounting, 6, 241–261.
Meier, H. H., & Kamath, R. (2005). A multidimensional investigation of named professorships in accounting: 2002–2003. Journal of Education for Business, 80, 295–301.
Mittermaier, L. J. (1991). Representation on the editorial boards of academic accounting journals: An analysis of accounting faculties and doctoral programs. Issues in Accounting Education, 6, 221–238.
Morton, J. R. (1975). Comments on "A new ranking of accounting faculties and doctoral programs." Journal of Accountancy, 139(2), 103–105.
Rezaee, Z., Elmore, R. C., & Spiceland, D. (2004). Endowed chairs in accounting worldwide. Accounting Education, 13(1), 29–50.
Rhode, J. G., & Zeff, S. A. (1970). Comments on "A ranking of accounting programs." Journal of Accountancy, 130(6), 83–85.
TABLE 2. (cont.)

Rank  Program                Score
58    Oklahoma               .0065
59    Kent State             .0065
60    Massachusetts          .0064
61    South Florida          .0043
62    Southern Illinois      .0043
63    Texas–Arlington        .0043
64    Utah                   .0043
65    Washington–St. Louis   .0043
66    CUNY–Baruch            .0043
66    Temple                 .0043
68    Cincinnati             .0043
69    Colorado               .0043
70    Connecticut            .0042
71    Boston                 .0042
72    Central Florida        .0022
73    Purdue                 .0022
74    George Washington      .0022
75    Cleveland State        .0000
76    Washington State       .0000
77    Drexel                 .0000
78    Rutgers                .0000
79    Virginia Commonwealth  .0000
80    Memphis                .0000
Rynes, S. L. (2006). "Getting on board" with AMJ: Balancing quality and innovation in the review process. Academy of Management Journal, 49, 1097–1102.
Smith, S. D. (2004). Is an article in a top journal a top article? Financial Management, 33(4), 133–149.
Sriram, R. S., & Gopalakrishan, V. (1994). Ranking of doctoral programs in accounting: Productivity and citational analyses. Accounting Educators' Journal, 6(1), 32–53.
Stammerjohan, W. W., & Hall, S. C. (2002). Evaluation of doctoral programs in accounting: An examination of placement. Journal of Accounting Education, 20(1), 1–27.
Stevens, K. T., & Stevens, W. P. (1996). Ranking accounting doctoral programs by the research productivity of graduates: 1974–1992. Accounting Educators' Journal, 8(1), 51–79.
Tang, R. Y., Forrest, J. P., & Leach, D. (1990). Findings from a survey on accounting chair professorships. Journal of Accounting Education, 8, 241–251.
Tang, R. Y., & Griffith, D. (1997). Accounting chair professorships in 1997. Journal of Applied Business Research, 14(1), 137–147.
Trieschmann, J. S., Dennis, A. R., Northcraft, G. B., & Niemi, A. W. (2000). Serving multiple constituencies in business schools: MBA program versus research performance. Academy of Management Journal, 43, 1130–1141.
Urbancic, F. R. (2006). Institutional gatekeepers of supply chain research. International Journal of Integrated Supply Management, 2, 330–337.
Worthington, J. S., Waters, G. L., & Fields, K. T. (1989). A profile of chairholders based on chair efficiency ratios. Accounting Educators' Journal, 2(1), 87–103.
Zeff, S. A., & Rhode, J. G. (1975). Comments on "A new ranking of accounting faculties and doctoral programs." Journal of Accountancy, 139(2), 105–106.