AUDITING PERFORMANCE DATA IN LOCAL GOVERNMENT
WILLIAM C. RIVENBARK
CARLA M. PIZZARELLA
University of North Carolina at Chapel Hill
Performance measurement for local government represents the implementation of performance measurement systems, the collection of performance data at regular intervals, and the calculation of performance measures to address governmental accountability. One goal of performance measurement is to create usable information for improving the operational efficiency and effectiveness of service delivery. This is normally accomplished by tracking the performance of particular aspects of business functions and activities with measures of output (workload), outcome (effectiveness), and efficiency. More specific uses of performance measurement include strategic planning, performance budgeting, program evaluation, process improvement, and contract monitoring (Ammons, 2001).

There is another dimension of performance measurement that has gone unnoticed in the public administration literature: auditing the accuracy of the performance data used to create performance measures. Barrett and Greene (2000) proposed that verifying the accuracy of statistics generated by performance data is actually the last step in performance measurement. Auditing performance data, however, involves much more than verifying their accuracy. It is an activity that allows governmental officials to evaluate whether performance measures are providing meaningful and usable information, to gain a greater understanding of processes and programs, and to test for data accuracy, reliability, and comparability.¹ This article discusses why performance data must be audited and presents an approach to auditing their accuracy. The audit process is based on actual audits conducted in North Carolina local government, providing factors that affect success for each step in the methodology.²
Why Audit Performance Data?

Implementing and managing performance measurement systems produce organizational costs in the form of time, effort, and training. The assumption, however, is that the benefits of engaging in performance measurement outweigh these costs, provided that the information generated is used to support sound decision making. Auditing performance data allows an organization to test this assumption by determining the actual use of performance data by management, suggesting that the value of certain measures is often less than the cost of producing them. This point was illustrated when public works officials changed the measure from cost per centerline mile to cost per lane mile after an audit revealed that centerline data were not producing meaningful information.
Another reason for auditing performance data is to ensure that governmental officials understand the dimensions of programs and processes that produce the performance data. A performance measure that is routinely reported for social services is average time for adoptive placement. The measure must be based on data that are collected with a clearly defined beginning time (available for adoption) and ending time (adoptive placement). Otherwise, the measure is not producing information within an appropriate context for continuous process improvement.
Performance measures are routinely reported in budget documents to provide operational accountability and to track the progress toward department objectives. However, are the measures as accurate as the financial information contained within the documents? Performance data, like financial data, are likely to become less reliable over time if a periodic review process is not established. Performance measurement is an intricate management tool for measuring complex functions in evolving organizations, requiring data verification to ensure accuracy, reliability, and comparability. The following areas were found to affect performance data from 14 local government audits conducted in North Carolina (Rivenbark & Pizzarella, 2002):
• complexity of process
• organizational changes
• interpretation of measures
• reporting capabilities
• functional boundaries
These areas are applicable to internal performance measurement systems that collect and report performance measures over time and to interjurisdictional comparisons for the advantages of benchmarking. They are not, however, mutually exclusive. A change in one area often affects the others, causing a ripple effect on the accuracy of performance measures.
COMPLEXITY OF PROCESS
Performance measures generally take the form of business metrics that provide information on particular aspects of service delivery. One performance measure for the local government service of emergency communications, for example, is average time from receipt of call to dispatch. The simplicity of the performance measure is not indicative of the process it is measuring. Complexity of process relates to the intricacies of detail inherent to local government services. Table 1 provides six steps that occur in emergency communications when a call is received.
Table 1. Common Steps From Receipt of Call to Dispatch
1. Call rings into emergency communication center.
2. Call is answered by telecommunicator (beginning of talk time).
3. Call is entered into the computer aided dispatch (CAD) system and routed, if necessary, to appropriate dispatcher.
4. Call is ready for dispatch (and unit begins to respond if available).
5. Call is held until unit is available to respond (if response is not immediate).
6. Unit is assigned and responds to call.

An understanding of the process of service delivery is an important element in achieving accurate data. In the case of the measure average time from receipt of call to dispatch, local officials should recognize if the start time begins when the phone rings, when the telecommunicator answers the phone, or when the first entry is made into the computer aided dispatch (CAD) system. Additionally, the end time may represent when the call is ready for dispatch or when the call is actually dispatched. This scenario would even require further explanatory information for centers that provide emergency medical dispatch.
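The difference these definitions make can be seen in a minimal sketch (Python) using hypothetical call records and field names, not data from the audits described in this article. The same set of calls yields different values for average time from receipt of call to dispatch depending on which start and end events are used.

# Minimal sketch with hypothetical call records: the same calls produce different
# averages depending on which events define the start and end of the measure.
from datetime import datetime

calls = [
    {"ring": "12:00:00", "answer": "12:00:08", "cad_entry": "12:00:35",
     "ready_for_dispatch": "12:01:10", "dispatched": "12:02:40"},
    {"ring": "13:15:00", "answer": "13:15:04", "cad_entry": "13:15:20",
     "ready_for_dispatch": "13:15:55", "dispatched": "13:16:30"},
]

def average_seconds(records, start_field, end_field):
    """Average elapsed seconds between two timestamped events in each record."""
    fmt = "%H:%M:%S"
    elapsed = [
        (datetime.strptime(r[end_field], fmt)
         - datetime.strptime(r[start_field], fmt)).total_seconds()
        for r in records
    ]
    return sum(elapsed) / len(elapsed)

# Definition A: phone rings -> call ready for dispatch
print(average_seconds(calls, "ring", "ready_for_dispatch"))  # 62.5 seconds
# Definition B: telecommunicator answers -> call actually dispatched
print(average_seconds(calls, "answer", "dispatched"))        # 119.0 seconds

Unless the definition in use is documented and verified, both figures could be reported under the same measure name.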
The types of services provided by local government are relatively consistent. The variation exists in the methods, processes, and activities of service delivery. Although localities often provide emergency communications, each has its own process for receiving and dispatching calls. The steps described in Table 1 are not universal among emergency communication providers and can even change internally from one reporting period to another. Therefore, localities must verify the accuracy of performance data on a regular basis to ensure that performance measures are reporting on the intended service dimension.
ORGANIZATIONAL CHANGES
Localities are evolving entities. Organizational changes include turnover of personnel, alteration of service delivery, and resource realignment. Turnover of personnel, particularly in positions responsible for collecting performance data, creates difficulty with service definitions, data interpretation, and data accuracy. A learning curve exists with performance measurement, creating the need for ongoing training and technical assistance on data collection and data cleaning. Realignment of resources can alter the processes of service delivery. Such changes often cross departmental and divisional lines. Although organizational changes cause difficulties with data accuracy, they are natural occurrences within organizations and should be accounted for when collecting performance data.
INTERPRETATION OF MEASURES
Interpretation of measures relates to how a jurisdiction defines a performance measure and its components. The goal is to achieve consistency in interpretation from one period to the next so that performance data are comparable. In one year, a police department may use Type I and Type II calls for calculating average response time to high-priority calls. The next year, the police department may base the average response time to high-priority calls only on Type I calls. This variation creates an inconsistency in the data collected, causing problems with making management decisions based on trend analysis and with tracking progress toward performance targets. Auditing performance data ensures that performance measures are consistent over time.
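A small sketch of this inconsistency, using hypothetical call types and response times (Python): the measure carries the same label in both years, yet its value depends entirely on which call types are counted as high priority.

# Minimal sketch with hypothetical data: one labeled measure, two interpretations.
calls = [
    {"type": "I", "response_minutes": 4.0},
    {"type": "I", "response_minutes": 6.5},
    {"type": "II", "response_minutes": 9.0},
    {"type": "II", "response_minutes": 11.5},
]

def average_response(records, high_priority_types):
    """Average response time over the call types counted as high priority."""
    times = [r["response_minutes"] for r in records if r["type"] in high_priority_types]
    return sum(times) / len(times)

# Year 1 interpretation: Type I and Type II calls are high priority.
print(average_response(calls, {"I", "II"}))  # 7.75 minutes
# Year 2 interpretation: only Type I calls are high priority.
print(average_response(calls, {"I"}))        # 5.25 minutes

Without a record of the definition in use, a year-to-year comparison of the two printed values would be meaningless.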
REPORTING CAPABILITIES
A jurisdiction's reporting capability has a direct bearing on the accuracy of performance data. Reporting capabilities refer to the mechanisms that produce and track the data. Ideally, all applicable data are included and captured in the same format. The reality of performance measurement is that some data are overlooked, whereas others are included erroneously. Accuracy also is affected by the completeness of the data collected. Estimates based on conversions of partial-year data, subsets, or samples are not as accurate as a complete data set. Limitations in the reporting capabilities of tracking systems are the primary reason for using estimates. Auditing data enables jurisdictions to identify weaknesses in reporting processes, providing managers with an understanding of the integrity of performance measures and the performance measurement systems that support them.
FUNCTIONAL BOUNDARIES
Functional boundaries designate the resources that are included in services or functions being measured. They determine the cost of the function as well as the personnel who deliver the service. Functional boundaries are particularly important for measures of input and output because they provide the foundation for creating higher order measures of efficiency and effectiveness. For the purpose of performance measurement, ideally, a service is separate and distinct from other functions and has clearly identifiable products, personnel, and costs. Defining services and their inputs is difficult, particularly for departments and divisions with multiple and overlapping functions or with positions that are assigned to more than one function. This is a common occurrence in public works, where crews typically work on multiple processes during a reporting period, including sidewalk repair, asphalt maintenance, and street cleaning.
Moreover, organizational changes can obscure the functional boundaries of service delivery over time, causing problems with data accuracy and comparability. This area alone demands the review of performance data at regular intervals for determining the reliability of performance measurement systems.
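As a rough illustration of why these boundaries matter, the sketch below (Python, with hypothetical crew hours, costs, and output counts) allocates the cost of a single public works crew across the three functions mentioned above in proportion to hours worked. Redrawing the boundary, such as reassigning hours between functions, changes each function's unit cost even though total spending is unchanged.

# Minimal sketch with hypothetical figures: shared crew costs allocated to
# functions by hours worked, then converted to a unit cost for each function.
crew_cost_for_period = 90_000.0  # total crew salaries, equipment, and materials

hours_by_function = {
    "sidewalk repair":     400,
    "asphalt maintenance": 800,
    "street cleaning":     300,
}
output_by_function = {
    "sidewalk repair":     120,  # sidewalk sections replaced
    "asphalt maintenance":  95,  # lane miles patched or resurfaced
    "street cleaning":     260,  # curb miles swept
}

total_hours = sum(hours_by_function.values())
for function, hours in hours_by_function.items():
    allocated_cost = crew_cost_for_period * hours / total_hours
    unit_cost = allocated_cost / output_by_function[function]
    print(f"{function}: ${allocated_cost:,.0f} allocated, "
          f"${unit_cost:,.2f} per unit of output")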
An Audit Approach
The framework for auditing performance data is similar to the standard audit process. It contains an entrance conference, an agreed-on scope, the methodology for review, the audit findings, the recommendations for improvement, the implementation guidelines, and an exit conference. It also contains a follow-up review to determine if the recommendations were implemented, a component of the standard audit process often underutilized in government (Brooks & Pariser, 1995). A budget analyst, for example, must be prepared to face challenges beyond those of conducting an internal audit.
Auditing data linked to operational accountability, unlike auditing financial data, requires questioning organizational structure, business practices, and managerial philosophy.
Table 2 contains the framework for auditing performance data, including factors for success. It was constructed based on the standard audit process and 14 local government audits conducted in North Carolina (General Accounting Office, 1994; Rivenbark & Pizzarella, 2002). The first step is to construct an audit schedule, including the assignment of audit responsibility. This requires an inventory of organizational capacity.

Table 2. Framework for Auditing Performance Data (step: factor for success)
1. Audit schedule: Considers the capacity for conducting meaningful audits, including time and personnel
2. Entrance conference: Includes managers and line employees
3. Scope of audit: Encompasses policy, process, definition, and data
4. Methodology: Includes data review, process analysis, and employee interviews
5. Findings: Based on scope and methodology
6. Recommendations: Based on informational needs of managers and line employees
7. Implementation guidelines: Based on service dimensions and definitions for data accuracy
8. Exit conference: Includes managers and line employees
9. Audit report: Uses standard format similar to internal audit reports
10. Follow-up review: Requires a process for review to ensure that recommendations have been implemented
First, the business processes that produce performance data must be identified. Second, the audit schedule is constructed, including the audit cycle (e.g., biannual, annual, or biennial). Third, the responsibility of conducting the audit is assigned. Budget, financial, and management analysts are ideal candidates for conducting the review.
This provides the necessary independence because line functions produce the majority of performance data. Auditors (internal or external) may be required for reviewing performance data of financial functions given that analysts often work in these areas.³
Once the audit schedule is complete and the responsibility for conducting the audit is assigned, an entrance conference, or planning meeting, is held with employees involved in the service or function under audit. For example, a budget analyst assigned to an audit of recycling data schedules an entrance conference with key staff members within the public works department. The meeting is used to discuss the audit process, identify roles and responsibilities, and address any questions or concerns. This is a critical step because these are the individuals whom the analyst must rely on for describing the processes, defining the performance data, and producing the necessary documentation for review.
The entrance conference also is used to review and refine the audit scope. The scope of a typical audit includes the function, process, or activity producing the performance data; the ordinances, resolutions, policies, and business practices that affect the functional area under review; the mechanisms used to track, record, and report performance data; the performance measures within a selected reporting period; and the specific definitions used for measurement interpretation. In the case of the recycling example, the audit scope is defined as the collection of household recycling materials only. Based on this scope, the budget analyst gathers information on the recycling program and seeks general information on recycling standards and practices to gain an understanding of the service dimensions and processes under review.
The methodology for auditing performance data normally contains two parts. The first is to review all documentation involved in the process, including the information previously described within the audit scope as well as financial and budgetary documentation, organizational structure, position control, and external information. The second requires employee interviews, process analysis, data reconciliation, and definition review. A critical aspect of the employee interview process is to determine how the performance measures are being used, focusing primarily on their utility for making decisions. The budget analyst finds that the recycling coordinator is using cost per ton and participation rate to expand the recycling program.
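The data reconciliation part of the methodology can be as simple as recomputing a reported figure from source records and comparing the two. The sketch below (Python) shows a hypothetical reconciliation of reported recycling tonnage against scale-house tickets; the figures and the 1 percent tolerance are assumptions for illustration, not procedures taken from the North Carolina audits.

# Minimal sketch with hypothetical figures: reconcile a reported total
# against the source records that should support it.
scale_house_tickets = [12.4, 10.9, 11.7, 13.2, 12.0, 11.1]  # tons per load
reported_tons = 74.5  # figure published in the budget document

recomputed_tons = sum(scale_house_tickets)
difference = reported_tons - recomputed_tons
tolerance = 0.01 * recomputed_tons  # flag differences greater than 1 percent

print(f"Recomputed from source records: {recomputed_tons:.1f} tons")
print(f"Reported: {reported_tons:.1f} tons (difference {difference:+.1f} tons)")
if abs(difference) > tolerance:
    print("Finding: reported tonnage does not reconcile to scale-house records.")
else:
    print("Reported tonnage reconciles to source records within tolerance.")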
The audit process now involves the reporting of findings, recommendations, and implementation guidelines. The findings emanate from scope and methodology.
Returning to the example, the budget analyst determines that the cost per ton does not include the costs associated with the material recovery facility, and the participation rate is based on a route estimate. The recommendations are driven by the informational needs of management, suggesting that existing measures are deleted or revised and new performance measures are created. The recommendations are accompanied by explanatory information to provide definitions of performance measures and to place them into context. The construction of implementation guidelines is the primary reason for the budget analyst to gain an understanding of the service dimensions during the audit scope and methodology. The likelihood of producing accurate, reliable, and comparable performance data increases as the guidelines become more specific. The budget analyst must understand, however, that the recommendations and implementation guidelines are only in draft form and will not be finalized until after the exit conference.
The final steps of the framework are the exit conference, the audit report, and the follow-up review. The exit conference, like the entrance conference, is held with employees involved in the function or process under study. The conference provides a forum to discuss findings, recommendations, and implementation guidelines. In the case of the recycling data audit, the budget analyst proposes new methodologies for calculating cost per ton and participation rate. The analyst also recommends that the measure of recyclables collected as a percentage of solid waste collected be added as an additional source of information on the effectiveness of the service, including guidelines on data sources and calculation. The group agrees that the new and revised measures are more accurate and useful. The audit report is then finalized based on the results of the exit conference and includes the following sections: introduction, scope, methodology, findings, recommendations, implementation guidelines, and conclusion. Finally, the budget analyst schedules a follow-up review to ensure that the agreed-on recommendations are implemented.
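How the revised and added recycling measures might be computed is sketched below (Python). The cost components, tonnages, and household counts are hypothetical, and the counted set-out basis for the participation rate is an assumption standing in for whatever methodology the exit conference settles on.

# Minimal sketch with hypothetical figures: the revised recycling measures.
collection_cost = 310_000.0                  # curbside collection costs
material_recovery_facility_cost = 140_000.0  # processing costs previously omitted
recyclables_tons = 3_600.0
solid_waste_tons = 21_000.0

households_served = 18_500
households_setting_out = 11_200              # counted set-outs, not a route estimate

# Revised cost per ton: includes material recovery facility costs.
cost_per_ton = (collection_cost + material_recovery_facility_cost) / recyclables_tons

# Revised participation rate: counted set-outs rather than a route estimate.
participation_rate = households_setting_out / households_served

# Added measure: recyclables collected as a percentage of solid waste collected.
recyclables_to_solid_waste = recyclables_tons / solid_waste_tons

print(f"Cost per ton: ${cost_per_ton:,.2f}")
print(f"Participation rate: {participation_rate:.1%}")
print(f"Recyclables as a percentage of solid waste collected: {recyclables_to_solid_waste:.1%}")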
The framework presented in Table 2 was constructed for auditing performance data derived from any program or process in local government. However, an analyst must use caution when conducting the individual steps. It often takes leadership and interpersonal skills to conduct an effective audit. Analysts will find themselves questioning senior employees who have invested time and energy in the processes and activities of service delivery. One of the most effective ways to communicate with line employees is to become knowledgeable about the process under study, which establishes credibility and allows analysts to conduct audits more efficiently.
Conclusion
Research has clearly demonstrated that local government is engaged in performance measurement (Berman & Wang, 2000; Poister & Streib, 1999). This can be explained in part by the enormous amount of emphasis placed on it, propelled by the professionalism of administrators (Streib & Poister, 1998), the research of academics (Ammons, 2001; Hatry, 1980), and the sponsorship of professional organizations (Fischer, 1994). However, there is a current need for additional research on auditing the accuracy of the performance data that are used to create performance measures for operational accountability. In other words, although performance measures are being reported, their accuracy, reliability, and comparability are still in question.
The advantages of auditing performance data go beyond ensuring the accuracy of performance measures. It is a dimension of performance measurement that includes reviewing the business functions and activities that produce performance data, auditing the mechanisms used to record and report performance data, and understanding the specific definitions of service delivery. This sets the stage for expanding audit findings and recommendations to include the handling of performance data and for continuous process improvement. Auditing performance data is an excellent way to build annual audit work plans from the standpoint of program evaluation and performance auditing.
There is a broader implication of auditing performance data that must be addressed by local government. The Governmental Accounting Standards Board (GASB) is conducting research to determine the feasibility of requiring service efforts and accomplishments (SEA) reporting (Harris, 1995; Rivenbark, 2001). Part of the discussion on whether to require SEA reporting in annual financial statements or in separate reports involves the role of auditing the performance data for materiality.⁴ Some form of independent review is necessary if the GASB adopts a reporting requirement for performance measures. This will be a major organizational capacity question for local government regarding time and resources.
Notes
1. Auditing performance data is not the same as performance auditing, which is a standard audit procedure for analyzing the efficiency and effectiveness of a selected program. For more information on performance auditing, see Nobles, Brown, Ferris, and Fountain (1993).
2. Staff members of the North Carolina benchmarking project conducted audits of performance data in each of its 14 participating municipalities. Audits were conducted for the service areas of police patrol, police investigations, and emergency communications during Spring 2001. The purpose of the audits was to ensure that accurate, reliable, and comparable data were being submitted for project participation. The audit approach reflected the standard audit process. Entrance and exit conferences were held, and on-site interviews were conducted with employees of each service area. The areas that affect accuracy and comparability of performance data and the audit approach presented in this article were derived from the procedures and findings of the individual local government audits. For more information on the North Carolina benchmarking project, see Few and Vogt (1997).
3. Internal auditors are ideal candidates for auditing performance data in larger local governments. For more information on the expanding role of auditors, see Grifel, Morgan, and Epstein (2001).
4. The Governmental Accounting Standards Board (GASB) issued Concepts Statement No. 2 in April 1994, addressing service efforts and accomplishments (SEA) reporting. The GASB is now conducting research on the feasibility of requiring performance measures as part of general purpose external financial reports. The research includes the extent to which governments are verifying the accuracy of performance measures with audits.
References
Ammons, D. N. (2001). Municipal benchmarks. Thousand Oaks, CA: Sage.
Barrett, K., & Greene, R. (2000). Truth in measurement. Governing, 13(10), 86.
Berman, E., & Wang, X. (2000). Performance measurement in U.S. counties: Capacity and reform. Public Administration Review, 60(5), 409-420.
Brooks, R. C., & Pariser, D. B. (1995). Audit recommendation follow up systems: A survey of states. Public Budgeting & Finance, 15(1), 72-83.
Few, P. K., & Vogt, A. J. (1997). Measuring the performance of local governments in North Carolina. Government Finance Review, 13(4), 29-34.
Fischer, R. J. (1994). An overview of performance measurement. Public Management, 76(9), 2-8.
General Accounting Office. (1994). Government auditing standards. Washington, DC: Author.
Governmental Accounting Standards Board. (1994). Concepts statement no. 2: Service efforts and accomplishments reporting. Norwalk, CT: Author.
Grifel, S. S., Morgan, S. L., & Epstein, P. D. (2001). Evolving roles for auditors in government performance measurement. PA Times, 24(7), 10.
Harris, J. (1995). Service efforts and accomplishments standards: Fundamental questions for an emerging concept. Public Budgeting & Finance, 15(4), 18-37.
Hatry, H. P. (1980). Performance measurement principles and techniques: An overview for local government. Public Productivity Review, 4(2), 312-339.
Nobles, J. R., Brown, J. R., Ferris, J. A., & Fountain, J. R. (1993). AGA Task Force report on performance auditing. Government Accountants Journal, 42(2), 11-25.
Poister, T. H., & Streib, G. (1999). Performance measurement in municipal government: Assessing the state of the practice. Public Administration Review, 59(4), 325-335.
Rivenbark, W. C. (2001). The GASB's initiative to require SEA reporting. Public Administration Quarterly.
Rivenbark, W. C., & Pizzarella, C. M. (2002). Ensuring the integrity of crucial data. Popular Government, 67(2), 28-34.
Streib, G., & Poister, T. H. (1998). Performance measurement in municipal governments. Municipal Year Book, ICMA, 65, 9-15.
William C. Rivenbark is an assistant professor with the School of Government at the University of North Carolina at Chapel Hill. He specializes in local government administration, focusing primarily on performance and financial management. His articles appear in numerous academic and professional journals. Contact: [email protected].
Carla M. Pizzarella is a research associate with the School of Government at the University of North Carolina at Chapel Hill. She earned a Master of Public Administration and a Master of Urban Planning at the University of Kansas. She specializes in local government budgeting and performance management. Contact: [email protected].