The Performance Based Research Fund in NZ: Taking Stock and Looking Forward


This article draws on previous research by the authors to review changes in research quality in New Zealand universities since the introduction of the Performance Based Research Fund (PBRF) in 2003. It is based on a series of detailed analyses by the authors, which make use of the extensive data generated by the PBRF. The stocktaking involves analyzing the nature of the changes that have taken place in NZ universities in relation to the originally stated objectives of the PBRF.

After considering the changes since the introduction of the PBRF, Chapter 8 looks ahead with a number of proposed changes. The Tertiary Education Advisory Commission (TEAC) also recommended the creation of the Tertiary Education Commission (TEC), which administers the PBRF. There have been differing statements about the purpose of introducing a PBRF to the New Zealand tertiary education sector.

The limitations faced by young academics are recognized only in the designation of the supplementary letters, NE, for new and emerging researchers. These are shown, for all universities and subject groups combined, in Figure 1 for each of the PBRF rounds.

The effects on the research quality of New Zealand universities

The publication of average quality score (AQS) measures, for universities and subject areas, is nevertheless related to objective (iv) mentioned above: to provide public information to interested parties about research performance within and between TEOs. The fact that performance measures corresponding to the metrics did not exist prior to the introduction of the PBRF raises a challenge in attributing any changes during different 'rounds' to the PBRF itself. However, by carefully examining the precise nature of the new incentives introduced by the PBRF, and examining the detailed data collected over its 15 years of operation in NZ, it is possible to make a strong argument that the changes within universities and disciplinary groups have closely reflected those incentives.

However, the AQS of entrants may be above the AQS at the beginning of the period. First, the introduction of the PBRF generated large-scale changes in NZ universities that are not sustainable over the long term. Moreover, the extent of the replenishment process, while essential, has been somewhat restrained in recent years by the difficulty of sustaining the improvements mentioned above.

Therefore, in terms of their AQSs, NZ universities have become more similar as a result of the PBRF. The extent of convergence over the PBRF period, 2003 to 2018, is illustrated in Figure 3, which plots the annual average AQS growth rate against the logarithm of the original (2003) AQS, for universities and subject groups.
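As an illustrative sketch of the convergence relationship described above (the AQS values and variable names below are hypothetical, not taken from the paper), the annual average growth rate of each unit can be computed and related to the logarithm of its initial score; a negative slope indicates convergence:

```python
import numpy as np

# Hypothetical AQS values for five units in 2003 and 2018 (illustrative only,
# not data from the paper).
aqs_2003 = np.array([2.0, 2.5, 3.0, 3.5, 4.0])
aqs_2018 = np.array([3.4, 3.7, 3.9, 4.1, 4.3])
years = 2018 - 2003

# Annual average growth rate of the AQS for each unit.
growth = np.log(aqs_2018 / aqs_2003) / years

# Regress the growth rate on the log of the initial (2003) AQS;
# a negative slope indicates convergence.
slope, intercept = np.polyfit(np.log(aqs_2003), growth, 1)
print(f"estimated slope: {slope:.4f}")
```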

In summary, it is possible to trace a significant overall increase in the AQS of NZ universities, together with a strong convergence process and associated 'diminishing returns', and to understand, via the social accounting framework, the precise dynamics – in terms of the nature of the staff turnover and quality transformations – which have contributed to this growth. Furthermore, the stimulus for these changes can be understood in terms of the incentives created by the PBRF and the various constraints on change. Nevertheless, it is not possible to provide a measure of the extent to which this can be judged a policy success or failure.

Given the unique nature of the PBRF, no similar international comparisons can be made at all.

The global ranking of New Zealand Universities

This inability to evaluate such a major change in the conditions facing universities is, of course, shared by many policy initiatives. Policymakers rarely, if ever, provide a clear indication of desired outcomes, other than general expressions that cannot be used as clear tests of performance.

Notes: universities are ranked by their PBRF AQS.

For QS's top-ranked university, AU, the score deteriorated slightly from 2009 to 2015, but has remained fairly stable since then. The second highest-ranked QS university, OU, has seen a steady decline in its ranking over a 14-year period. The other five universities saw a moderate improvement in QS scores overall.

The QS rankings are determined from scores on six metrics: academic reputation (40%), citations per faculty (20%), faculty-to-student ratio (20%), employer reputation (10%), international faculty ratio (5%) and international student ratio (5%). It also shows results for each of the two components of most interest here that contribute to the QS ranking: Academic Reputation (AR) and Citations per Faculty (C/f). In both cases, the QS results appear to be more influenced by the experience and age of the researcher (other things being equal) than is the RO component of an individual researcher's score in the PBRF, although they may correspond more closely to the other components (CRE and PE).

The AR score is based on a survey of over 130,000 individuals in higher education regarding the quality of teaching and research. The C/f score is based on the total number of citations received, over a five-year period, by all papers produced by the institution's faculty members. Given the relatively heavy weight attached to research indicators, it is interesting to explore the relationship between NZ universities' PBRF performance and changes in their global QS rankings, despite the limited comparability of the two measures.
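As a rough sketch of how these weights combine (the component scores below are invented placeholders, and this reflects only the headline weighting scheme quoted above, not QS's full methodology), the overall QS score is a weighted sum of the six normalised component scores:

```python
# Headline QS weights quoted above; component scores (0-100) are placeholders.
weights = {
    "academic_reputation": 0.40,
    "citations_per_faculty": 0.20,
    "faculty_student_ratio": 0.20,
    "employer_reputation": 0.10,
    "international_faculty_ratio": 0.05,
    "international_student_ratio": 0.05,
}
scores = {
    "academic_reputation": 72.0,
    "citations_per_faculty": 55.0,
    "faculty_student_ratio": 60.0,
    "employer_reputation": 68.0,
    "international_faculty_ratio": 90.0,
    "international_student_ratio": 85.0,
}

# Overall score is the weighted sum of the component scores.
overall = sum(weights[k] * scores[k] for k in weights)
print(f"Illustrative overall QS score: {overall:.1f}")
```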

This therefore provides some prima facie evidence that the PBRF has contributed to a general improvement in global rankings.

Teaching and learning at postgraduate levels

Evidence on the overall effect of research on teaching is mixed, with some studies finding a positive relationship, others a negative one, and some suggesting that the two scholarly activities are largely independent. Evidence also suggests that the relationship between teaching and research varies by discipline, level of study, workload distribution, faculty-to-student ratio, and institutional settings.

Looking forward: suggested modifications to the PBRF

In considering the future of PBRF evaluations and funding formulas, it is necessary to recognize, as emphasized above, that incentives play a critical role. A detailed analysis of the resulting demographic transitions has shown that the changes taking place in New Zealand's universities are consistent with the incentives created by the PBRF. Indeed, it is important to maintain clear incentives for individuals, heads of departments and senior managers of universities.

Hazledine and Kurniawan, and WEB Research (2004), report detailed estimates of the total costs incurred by the implementing agencies, and of university compliance costs, for the first PBRF round in 2003. These costs largely reflect the process of increasing the average research productivity of university researchers and universities. It is not clear how the changes proposed below would affect these costs, other than by providing more accurate information on research performance.

Importantly, for evaluation purposes, information on the individual flows within the social accounting framework has been shown to be key to identifying the precise sources of variation in university research quality. However, the current peer review process is extremely cumbersome and time-consuming and, as shown above, much of the process is unnecessary. Something like the current categories could continue to be used: these are R, C, B and A, along with C(NE), if it is decided to provide different funding 'awards' for C and C(NE).

It is important to provide clear guidelines on what is considered a research output, and then to use peer review to assign individuals directly to the various categories (essentially corresponding to 'no substantive output', and low, medium and high quality). When setting out such examples, it is also necessary to avoid confusing outputs with inputs. Research quality comparisons between disciplines involve major problems, and it is important that the process does not need to assume that such comparisons can be made.

Given the distribution of researchers across the various categories, it is of course possible for anyone to impose their own subjective weights and to calculate some sort of average quality measure for disciplines and universities.
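A minimal sketch of such a calculation, assuming purely illustrative weights and headcounts (neither the weights nor the numbers below are the official PBRF values):

```python
# Subjective quality weights for the PBRF categories (illustrative only).
weights = {"A": 5.0, "B": 3.0, "C": 1.0, "C(NE)": 1.0, "R": 0.0}

# Hypothetical number of researchers in each category for one discipline.
counts = {"A": 40, "B": 120, "C": 80, "C(NE)": 30, "R": 30}

# Average quality measure: weighted sum of counts divided by total staff.
total_staff = sum(counts.values())
average_quality = sum(weights[k] * counts[k] for k in weights) / total_staff
print(f"Illustrative average quality measure: {average_quality:.2f}")
```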

Conclusions

But for the purposes of the PBRF, there is no reason to pretend that the weights have anything to do with objective quality measurement rather than with funding allocation. The process must also ensure that the assessment is carried out by 'peers' within the discipline of the researcher being evaluated, not by a panel consisting of a number of disciplinary groups with very different research cultures. Much has been learned about New Zealand's universities, their significant responses to the introduction of explicit research incentives and the application of a unique 'yardstick'.

It would be remarkable if a policy evaluation did not find fault with the process or failed to recognize the nature of the significant staff turnover it stimulated. After the establishment of the PBRF system, and following the transition to the full PBRF revenue allocation process, total revenue from the PBRF reached 8.1 percent of total university revenue and has subsequently remained at just over 7 percent. Therefore, the growth in the quality of university researchers in NZ, combined with the incentives created by the external research income (ERI) component of the PBRF, may have induced growth in income from external research.

Source: derived from TEC, Financial Performance | Tertiary Education Commission (tec.govt.nz), and Statistics New Zealand data.

The pattern of change for the 'research component', which makes up 30 percent of the total score for a university, is very different from that for the global ranking scores.

The impact on research quality of performance-based funding: the case of the New Zealand PBRF scheme.
(2019a) The evolution of research quality in New Zealand universities as measured by the performance-based research funding process.
(2019b) An evaluation of metrics used by the performance-based research funding process in New Zealand.
(2021) Sources of convergence and divergence in university research quality: evidence from the performance-based research funding system in New Zealand.

New Zealand Ministry of Education (2018b) Reference Document 1: A Review of the PBRF Objectives.
WEB Research (2004) PBRF: Phase 1 evaluation - July 2004: Phase 1 evaluation of the implementation of the PBRF and the conduct of the 2003 quality evaluation.

Buckle is Emeritus Professor at the Wellington School of Business and Government, Victoria University of Wellington, New Zealand.

Working Papers in Public Finance

