CITY OF NEWPORT BEACH
CITY COUNCIL STAFF REPORT
Agenda Item No. 23
September 8, 2009
TO: HONORABLE MAYOR AND MEMBERS OF THE CITY COUNCIL
FROM: City Manager's Office
Homer Bludau, City Manager
949/644-3000 or hbludau@newportbeachca.gov
SUBJECT: City Council Acceptance of the Final Report from ICMA Consultants
Regarding the Establishment of Performance Measures and
Benchmarks, In Order To Continuously Improve As an Organization
ISSUE:
Does the City Council desire to accept the final written report from ICMA consultants Craig Rapp and Brooke Myhre on the benchmarking program effort conducted over the past 18 months? Is there any direction to staff the City Council wants to provide regarding this ongoing process?
RECOMMENDATION:
Accept the report and provide direction to staff as desired.
DISCUSSION:
Background:
In both 2008 and 2009 the City Council selected the following as one of its annual priorities:
"Implement an organizational performance improvement effort through the data gathering of
performance measurements, resulting in comparing the City's organizational effectiveness
with other agencies and also internally."
The initial step of this effort was to conduct a statistically valid customer satisfaction survey,
which would establish the basis for future measurements of customer satisfaction levels for
a wide variety of City services. A professional survey firm obtained telephone input from
residents on a series of service related questions and determined that high satisfaction
levels existed for most City services.
With the customer satisfaction survey information as background, the next step was the
identification of certain services we desired to benchmark and the performance measures
which would assist in quantifying the effectiveness of providing these services. In order to
assist us with these processes, a Request for Proposals was issued, and the City Council
selected the International City Management Association (ICMA) team of Craig Rapp and
Brooke Myhre to assist in this effort. Mr. Rapp and Mr. Myhre worked with City staff and the
Council Finance Committee to develop the 30 City services to be benchmarked and the
performance measures to be utilized for each service. The City Council approved the final
list of both benchmarked services and performance measures. In January 2009, staff began
collecting the internal performance measurement data; that effort continues today.
A key step in the benchmarking process was the selection of the City services to be
benchmarked. The City Council and staff selected services which they believed offered
opportunities for improvement or that were key indicators of service reliability and
effectiveness to our residents. That is, the City selected for benchmarking those services
which had importance to our particular circumstances and customers, as opposed to
selecting benchmark services for which there was a ready supply of available data to use in
comparing ourselves with other city service providers.
The City Council selected the cities it desired to compare our benchmark data with;
however, most services included by the City Council for benchmarking were unique in some
way to Newport Beach or the selected cities did not benchmark the same services, so there
was limited comparative data to utilize for this purpose. In some cases, City staff will seek
out regional service providers in order to attempt to develop groups of cities to benchmark
against, but that effort will take time. In the meantime, City data is being collected in order
to form a base year against which to measure our future services.
While the customer satisfaction survey established that Newport Beach residents have high
satisfaction levels with most City services, the survey results did not answer the question of
how efficient we are in providing those services. That is the purpose of benchmarking and
tracking performance measures. We want to do both: that is, provide high customer satisfaction levels and deliver services in a very cost-effective manner.
For the past two years, the benchmarking effort has resulted in a large expenditure of time
on the part of several staff members in every department, as much training took place in the
process of the selection and development of benchmarked services and performance
measures. In particular, the core benchmark team of George Murdoch, Susan Warren, Mike
Wojciechowski and John Kappeler has provided exemplary leadership in serving as the
communication and training links between the ICMA consultants and the organization. They
deserve much of the credit for getting us to the point where a final report is available for
review by the City Council.
With this phase of the benchmarking effort concluded, we have come to a good place to
reflect on where we are, the goals of the City Council in initiating this effort almost two years
ago, and where staff's continued time expended on this effort can be most productive in the
future.
Craig Rapp will review the final report results with the City Council in an oral presentation.
Environmental Review: The City Council's approval of this Agenda Item does not require
environmental review.
Public Notice: This agenda item has been noticed according to the Brown Act (72 hours in
advance of the meeting at which the Council considers the item).
Submitted by:
HOMER L. BLUDAU
City Manager
Attachment: Performance Measurement and Benchmarks Final Report
Performance Measurement and
Benchmarking Project
September 2009
Newport Beach, California
Submitted by and reply to:
Management Services
ICMA Consulting Services
International City/County Management Association
777 North Capitol Street NE, Suite 500
Washington, DC 20002
Craig.Rapp@icma.org
202-962-3583
ICMA
Leaders at the Core of Better Communities
Table of Contents
Introduction .......... 3
I. Performance Measurement .......... 4
Uses and Benefits of Performance Measurement .......... 4
II. Project Purpose and Overview .......... 5
Goal Alignment and Validation .......... 7
Identification and Adoption of Services for Measurement .......... 8
Benchmarking - Peer City Comparisons .......... 10
Staff Training - Measures and Data Collection .......... 11
III. Project Results .......... 12
Understanding the System and the Data .......... 12
Data Gathering Issues .......... 12
Benchmark Measures - Summary .......... 14
Benchmark Comparison Data .......... 15
IV. Observations/Recommendations .......... 37
Observations .......... 37
Recommendations .......... 38
Introduction
This report summarizes ICMA Consulting Services' (ICMA) effort to assist the City of Newport Beach, California in creating a performance measurement system and to initiate benchmark comparisons with other communities.
The goal of the project was to establish comparative data on organizational effectiveness and to support
continuous improvement.
The contract with the City of Newport Beach required ICMA to provide local government benchmarking data,
deliver performance measurement best practices, train city staff, and facilitate development of a system that
would enable the city to analyze and improve its performance.
Over the past year, ICMA has worked with the staff and City Council to replace a system of service load indicators with a system of balanced performance measures and benchmark comparisons. To accomplish this, benchmark comparison cities were identified, a core team of employees was trained, and a set of benchmark and general performance measures tied to Council priorities was developed.
Initial data for twenty-eight benchmark measures is provided comparing Newport Beach to comparable
jurisdictions both locally and across the country. Additionally, data collection is underway for more than 100
other performance measures that will be used to analyze and improve operations once collection cycles conclude
and data becomes available.
The project represents the first step in a long-term process of analyzing and defining key drivers of organizational
performance and service delivery. Creation of the system and collection of initial data provides the foundation
for additional inquiry. Data collection activities reflect results, but do not provide definitive answers about
performance.
Finding those answers and managing the process of continuous improvement will be the ongoing challenge facing the city. This may lead to new ways of doing business, refining processes, and potentially rethinking strategic direction. Performance measurement and reporting is an evolutionary process, tied both to the strategic planning and performance improvement activities of the organization.
The system that has been created will provide the opportunity for improved data-driven decision-making. To implement it successfully, the City Council and staff will need to ensure that roles and accountabilities are clear, and that performance measures continue to reflect community priorities.
I. Performance Measurement
Performance measurement is a process of data collection and analysis used to assess the results and
effectiveness of business activities over a specific time period. Progressive cities throughout the United States use
performance measurement systems to both guide and manage operations.
It is important to keep in mind, however, that performance measurement does not represent an end point, but in
fact begins the conversation about organizational effectiveness.
The logic of performance measurement is simple and compelling:
❖ Measurement focuses attention on priorities and results
o A performance measurement system links Council strategic objectives to the daily work of employees
❖ Measurement answers the question "How are we doing?"
o Performance measurement provides managers and employees with the information necessary to critically examine operations and compare this to other organizations or benchmark criteria
❖ Measurement identifies successful strategies
o Performance measurement reveals processes and approaches that are working and enables sharing with others in the same field
❖ Measurement enhances accountability
o Performance measurement provides the basis for developing reporting systems that explain the cost, quality and effectiveness of service delivery to stakeholders
The benefits of performance measurement are:
❖ Measures reveal values
o If customer satisfaction, low cost, or speedy delivery is valued, it will be measured
❖ Measures motivate behavior
o If something is valued and becomes a priority, it will be measured and systems of work will be focused
❖ Measures help you learn
o Most importantly, measures help answer critical questions (Is it working? Are we having an impact? What happened?) in order to facilitate continuous learning and improvement
II. Project Purpose and Overview
The performance measurement and benchmarking project began in February 2008. A project schedule was developed to align with the deliverables expected by the City Council. The key components and ultimate timing of each element are listed below:
❖ Validate Alignment with Goals: March 2008
❖ Review/Recommend Services: April 2008
❖ Recommend Measures: June 2008
❖ Recommend Peer Cities: July 2008
❖ Train staff/refine/collect: Throughout
❖ Implementation/Process: Aug-Oct 2008
❖ Catch-up/suspend activities: Oct-Dec 2008
❖ Refine/finalize/collect data: Jan-June 2009
❖ Report to Finance Committee: June 2009
❖ Department Wrap-up Meetings: August 2009
❖ Managing with Data Workshop: August 2009
❖ Report to City Council: September 2009
A "central team" of four key staff was chosen to oversee the project and interface with the consultant team. In
addition, each department appointed a set of facilitators who received training and worked with the central team
and consultants to develop the performance measures and benchmark data.
The central team members are:
1. Susan Warren - Library Services Manager
2. George Murdoch - Director of Utilities
3. John Kappeler - Division Manager of Code and Water Quality Enforcement
4. Michael Wojciechowski - MIS Operations Coordinator
The department facilitators are listed on the following page.
Figure 1: Department Facilitators Trained in Performance Measurement
Administrative Services: Lois Thompson, Dennis Danner, Sandra Wilcox
Building: Jay Elbettar, Faysal Jurdi
City Attorney: David Hunt, Kristy Parker
City Clerk: Leilani Brown, Shana Stanley
City Manager: Homer Bludau, Tammie Frederickson, Sharon Wood, Tara Finnigan, Dave Kiff
Fire: David Mais, Steve Lewis, Ralph Restadius, Steve Bunting, Maurice Turner
General Services: Mark Harmon, Mike Pisani
Human Resources: Terri Cassidy, Gwen Bouffard
Library Services: Cynthia Cowell, Melissa Kelly
Planning: David Lepo, Jay Garcia
Police: Robert Luman, Jim Kaminsky
Public Works: Jamie Pollard, Steve Badum, David Webb
Recreation & Senior Services: Laura Detweiler, Sean Levin
Utilities: George Murdoch, Cindy Asher
Goal Alignment and Validation
In order to ensure that performance measures were aligned with Council and community goals, the project team reviewed a variety of documents, including Council priorities, business and resident surveys, and various planning documents. The following table summarizes the most important goals and priorities. The priorities expressed, particularly the overlapping priorities of the two lists, provided the context for developing performance measures:
Figure 2: Priorities Identified by Council - Annual Session & Resident Survey
Council Priorities:
- City Hall
- Facilities financing plan
- Implement group home ordinance
- Water Quality
- Benchmarking
- Banning Ranch Appraisal/Acquisition
- Traffic Management Implementation
- Make city more energy efficient
Resident Survey Priorities:
- Management of traffic flow
- Maintenance of streets/infrastructure
- Maintenance of beaches
- Enforcement of codes/ordinances
- Quality of water supply
- Effectiveness of City Communications
- Parks & Recreation programs and facilities
- Quality of city customer service
- Quality of library system
Identification and Adoption of Services for Measurement
Once priorities were identified, staff training was conducted on performance measurement concepts. Initial
training focused on creating a basic understanding of performance measurement and how measurement and
benchmarking would be used in the organization to both manage operations and address critical Council
priorities.
Following the training sessions, the team developed a list of thirty-six "service areas" for possible measurement. The list was presented to the Finance Committee and reduced to thirty, and then adopted by the City Council.
Developing performance measures for thirty service areas is an ambitious undertaking. A more typical approach
would be to establish measures in a few departments to test the effort, and then use that experience to expand
the system across the organization. In this circumstance, the City Council and senior leadership determined that a broad cross-section of the organization should be represented; therefore, the larger effort went forward.
Once final service areas were approved, the consultant, central team and facilitators embarked upon a series of
facilitated sessions leading to the development of performance measures for each service. Prior to this time,
Newport Beach used a system of "service load indicators" to report the output of each department.
The performance measures that were developed address the results, or "outcomes", of service delivery, and as such provide a link to Council's priorities and an advance beyond the previous service load indicators. A balanced set of measures (Quality, Cost, Cycle Time, and Customer Satisfaction) was established for each service area.
Using a Balanced Set of Measures
A comprehensive measurement system demands that the data collected reflect, to the greatest extent possible, a
balance of four key dimensions of performance. The four categories that comprise a balanced set of measures are
described below:
Quality is the umbrella term designed to capture the effectiveness of the service and indicate whether the organization is doing the right thing.
Sample Quality Measure: Percentage of streets in very good or excellent pavement condition
Cost is the amount of resources required to produce a definable per-unit result. Cost usually includes labor, materials, overhead, facilities and utilities. This measure may also reflect other financial aspects such as revenue, savings, or investments.
Sample Cost Measure: Operating and maintenance expenditures per system mile for drinking water distribution
Cycle Time is the time from when a customer requests the service to when the service is delivered. This measure should include waiting time and look at the whole service. Cycle time is usually measured against some standard of response time.
Sample Cycle Time Measure: Percentage of requested maintenance completed within time standards
Customer Satisfaction is how customers feel about a service they received and the results of that service. It can include a broad range of measures examining customers' feelings about timeliness, quality, professionalism of service delivery, and courtesy. This category differs from the others because it is entirely based upon perception, whereas the others typically measure an objective condition. It is important to remember that both are needed to provide a balanced picture.
Sample Customer Satisfaction Measure: Percentage of customers rating library customer service as "good" or better (e.g. 4 or 5 on a 5-point scale)
Each service area generally includes one or more measures covering the four measurement types (Quality, Cost, Cycle Time, and Customer Satisfaction) to demonstrate "how well", "how fast", "how efficient", or "how satisfactory" the results of the service are. For benchmarking purposes, the one or two dimensions deemed most important were used.
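The balanced structure described above can be sketched as a simple lookup. This is purely an illustration: the dimension names and sample measures come from the text, while the `is_balanced` helper is a hypothetical addition, not part of the City's actual system.

```python
# The four dimensions of a balanced measure set, paired with the
# report's own sample measures. The is_balanced() helper is a
# hypothetical illustration, not part of the City's system.
BALANCED_DIMENSIONS = {
    "Quality": "Percentage of streets in very good or excellent pavement condition",
    "Cost": "Operating and maintenance expenditures per system mile for drinking water distribution",
    "Cycle Time": "Percentage of requested maintenance completed within time standards",
    "Customer Satisfaction": 'Percentage of customers rating library customer service as "good" or better',
}

def is_balanced(measures):
    """Return True if a service area's measures cover all four dimensions."""
    covered = {m["dimension"] for m in measures}
    return set(BALANCED_DIMENSIONS) <= covered
```

In this sketch, a service area that defines at least one measure for each of the four dimensions counts as "balanced", mirroring the report's requirement that every service area carry a balanced set.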
Finally, each measure was subjected to an additional evaluation to determine whether it was meaningful, useful and sustainable. Meeting these criteria was important in order to ensure that the measures have value and will be used by Council and staff. A description of these terms follows:
Meaningful - Does the measure provide information people want or need to know regarding the effectiveness, cost, responsiveness, or overall satisfaction with the service delivered? Will measuring these results help us improve?
Useful - Can this information be used to communicate success or support decision-making? What specific communications or decisions are envisioned?
Sustainable - Is the information available? What additional work or resources are needed to implement the measurement and collection of the data? Is the information worth the investment of time and resources necessary to collect, analyze and report it?
Benchmarking - Peer City Comparisons
One of the City Council's primary objectives for the project was to establish data on comparative performance. To
put benchmarking into context, and to ensure common understanding, terminology was approved for use in the
system. The following are key benchmarking definitions:
Benchmark - a performance measure that permits valid comparison of service delivery results with other jurisdictions or organizations, recognized "best practices", or professional standards for a service. Benchmarks are selected and analyzed to ensure a meaningful "apples to apples" relationship exists between compared operations and reported data.
Benchmarking Sources - these are other jurisdictions, professional organizations, or readily available sources that provide comparisons of performance, best practices, innovations or problem solving.
Concurrent with the development of benchmark measures was the identification of "peer" cities with which to benchmark performance. In order to establish a reasonable comparison group, a set of recommended criteria or "filters" was developed and reviewed by the Finance Committee in August 2008, and approved by the City Council in September. The initial recommendation to the City Council contained nine criteria and a total of twenty-three benchmark cities. The Council ultimately selected seventeen cities for benchmarking purposes.
Figure 3: Benchmark Cities Selected by City Council
Carlsbad, CA
Bellevue, WA
Long Beach, CA
Coral Springs, FL
Santa Barbara, CA
Virginia Beach, VA
Huntington Beach, CA
Westminster, CO
Manhattan Beach, CA
Scottsdale, AZ
Santa Monica, CA
Evanston, IL
Ventura, CA
Fishers, IN
Santa Cruz, CA
Coronado, CA
Palo Alto, CA
The following are the nine criteria or "filters" that were used to determine peer cities:
1. Population
2. Participate in ICMA Center for Performance Measurement (CPM)
3. Beach/Resort and/or CA Community
4. Median Income
5. Total Operating Budget
6. Total Operating Budget/Number of City Employees Ratio
7. Median Housing Price
8. Number of Jobs in the Community
9. Services Provided
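As a purely hypothetical sketch of how screening filters like those above can be applied: the criteria names follow the report's list, but the thresholds and candidate-city data below are invented for illustration and are not the values the City actually used.

```python
# Hypothetical sketch of screening candidate peer cities. Criteria
# names follow the report's list; thresholds and city data are invented.
candidates = [
    {"name": "City A", "population": 85_000, "cpm_member": True, "beach_or_ca": True},
    {"name": "City B", "population": 900_000, "cpm_member": True, "beach_or_ca": False},
    {"name": "City C", "population": 70_000, "cpm_member": False, "beach_or_ca": True},
]

def passes_filters(city):
    # Filter 1: population within a comparable band (invented band)
    if not 40_000 <= city["population"] <= 200_000:
        return False
    # Filter 2: participates in ICMA's CPM, so comparable data exists
    if not city["cpm_member"]:
        return False
    # Filter 3: beach/resort and/or California community
    return city["beach_or_ca"]

peers = [c["name"] for c in candidates if passes_filters(c)]
print(peers)  # ['City A']
```

Each filter narrows the candidate pool, which matches how the initial list of twenty-three cities was ultimately reduced to the seventeen the Council selected.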
Through benchmarking with other cities, Newport Beach hoped to answer the following key questions:
1. "How does Newport Beach compare with the benchmark local governments in terms of its performance results?"
2. "What can Newport Beach learn about service delivery from a city that performs a similar service but has different levels of performance?"
Staff Training - Measures and Data Collection
Upon establishment of a benchmarking and measurement structure, additional staff training and development was
initiated to move from basic understanding of concepts to full competency developing measures, collecting and
recording data, and analyzing results.
In addition, ICMA trained the central team to a higher level in an effort to "train the trainer" and to provide an on-site support system for the consultants. This also served to establish an in-house capacity for sustaining the system going forward.
Ultimately, the project size and scope proved to be overwhelming for staff, resulting in training and data collection activities being suspended for three months in the fall of 2008. Reestablishing momentum after suspending activities proved to be difficult; however, additional training and guidance was provided during the first quarter of 2009. Measurement activity is ongoing, with support from ICMA.
To ensure sustainability of the project, a final facilitator training curriculum has been provided.
III. Project Results - Performance Data
Understanding the System and the Data
The data presented in this report represents an integrated system. In every performance measurement system there
are different types of data and different data needs depending upon the scope of authority and responsibilities of the
person using the information.
For example, it is typical for a City Council to focus on the broadest types of outcome measures, such as "overall customer satisfaction with Library Services". This differs from a measure monitored by a Library Manager, such as "amount of time it takes for a new item to go from initial receipt to the shelf". Each represents valid data, but how it is used and who needs to know vary depending upon the responsibilities of each party.
The performance measurement system developed for Newport Beach was constructed for two primary purposes: (1)
to provide "benchmark" performance comparisons with other cities on a range of services; and (2) to provide a
balanced set of measures that can be used for monitoring and evaluating operations. The first purpose primarily serves the needs of the City Council and senior management; the second serves the needs of senior management and front-line supervisors.
Benchmark measures, along with initial comparison data, are provided in the following sections. The benchmark measures included were chosen because they represent broad indicators of performance deemed to be useful for City Council deliberations. The results are clearly depicted in the graphs; therefore, limited interpretation is provided except where necessary to explain data collection anomalies or to provide a recommendation for future data development.
General performance measures which address operations are listed in Appendix A. These measures were developed by staff to support Council priorities and connect those priorities to the daily work of employees. Data collection has begun on many of them, with initial results to be reported during the 2009-2010 fiscal year. Some of the measures are still under evaluation for appropriateness, related primarily to issues associated with the time and expense of data collection or the need to change IT or HR systems in order to collect the data.
Benchmark Comparisons - Data Collection
The city chose to address thirty different service areas for measurement, in an attempt to cover a comprehensive
range of city operations. While this choice will ultimately serve the city well for general measurement of
performance, it provided a significant challenge for benchmarking.
For example, although ICMA's Center for Performance Measurement (CPM) has the largest database of local government performance measures in the country, with over 200 jurisdictions reporting, there are only fifteen service areas in which data is collected. Multiple measures are contained within those fifteen services, however; therefore, CPM data is provided for twenty-one of the thirty-four benchmark measures recommended.
Even within CPM's comprehensive database, not all cities involved collect or report data on all measures. In some cases, peer cities that are part of the CPM database do not report on desired measures, either because they do not provide the exact service or they choose not to measure it. To compensate for these situations, other CPM cities similar to the peer cities have been included for comparison, along with the median and mean of all reporting cities.
To fill the remaining void, third party data was also collected. While useful, third party data is often limited in its comparability and depth of information. One example of this is the data for Utilities, which came from the American Water Works Association (AWWA). While AWWA does collect nationwide information on utility operations, it did not provide the same level of comparability as the CPM database.
Benchmarking also requires that comparable, "clean" data be available in order to achieve legitimacy. Clean data is defined as data that has been "scrubbed", or verified for accuracy and comparability. For example, CPM has a
large staff devoted to data cleaning. This means they engage in ongoing verification, along with regular meetings
with groups of cities who discuss in detail the intricacies of their operations so that any differences can be noted
and reflected in the presentation of data. Independent data collection by city staff can become a tremendous
burden when data cleaning is involved.
The information presented in this report was drawn from Newport Beach financial and performance data, and
was analyzed to a level that provides reasonable assurance of its accuracy and reliability; however, the
information was not audited or subjected to the same level of testing, or data cleaning that CPM data receives.
In the future, as the organization integrates more thoroughly into the CPM group of cities, Newport Beach data
will be cleaned and its validation will be assured to that level.
Finally, there are some general cautions that should be applied to any benchmark data. The Governmental
Accounting Standards Board (GASB) issues the following advice, excerpted from GASB Publication: Reporting
Performance Information: Suggested Criteria for Effective Communication:
Criterion Eleven - Comparisons for Assessing Performance
"The two staples of comparative performance reporting are time series and comparison against targets. Users are
also interested in comparing their government with other, similar governments or industry standards.
Comparisons with other organizations can be very informative as long as any significant differences in measures
or circumstances that have been identified are noted.
Care should be taken to ensure that reported information is comparable (the same measures are measured in the same way and use reliable data). If the information is not truly comparable, either no comparisons should be reported or sufficient explanation should be provided to describe how the information is not comparable.
Care should also be taken when selecting the pool of organizations with which to make interorganizational
comparisons. Efforts should be made to select organizations that are similar or comparable to the organization
making the comparison. Care should also be taken to ensure that comparative information about other
organizations was collected in a reliable manner.
The ICMA Comparative Performance Measurement Project is one effort that addresses the comparative problem. For this project, over 200 cities and counties have agreed on measurement methods for a selected number of performance measures. The organizations were trained on how to develop and report the selected performance measures and testing is done to check whether there is a reasonable degree of comparability among the data reported."
This advisory is offered not to minimize the usefulness of benchmark data, but to indicate that caution should always
be exercised when comparing performance between organizations.
Benchmark Measures
The following is a summary of 34 benchmark measures covering 22 different service areas within 12 different
departments. They are grouped by department and service area. Originally, 30 service areas were selected for
measurement. The eight service areas not included in benchmarking are being measured internally, but not
benchmarked externally at this point in time. The decision not to benchmark those service areas was based upon a
number of factors, but primarily lack of readily accessible data.
Detailed comparison data is provided for 28 of the 34 measures listed. Six measures are presented with no data; they have been included because of the importance of those measures to the city. In those cases, a range of options is suggested, from creation of a single-issue benchmark group to the use of third party data sources not aligned with CPM or peer city reporting.
Specific benchmark data is depicted in a series of graphs which follow the summary. Unless otherwise noted, all data is from fiscal year 2005-2006, the most recent year for which comprehensive and "clean" data was readily available. The data largely speaks for itself; therefore, no interpretation or evaluation of the results is included beyond footnotes.
The graphs are color coded to aid in reviewing the comparison data. As mentioned previously, peer cities have been
supplemented by other jurisdictions within the CPM database to provide a reasonable comparison group. Cities
chosen for inclusion were jurisdictions of similar size, and have community characteristics deemed important by the
City Council.
Summary of Benchmark Measures - by Department and Service Area
The following is a summary of the 34 benchmark measures presented for consideration, listed by department and service
area. Benchmark data is provided on 28 of the 34 measures in twenty-two different service areas across twelve
departments.
Department / Service Area: Benchmark Measures

I. Administrative Services
- IT Services: I-a.) IT Operations and Maintenance (O&M) expenditures as percentage of jurisdiction O&M
- Purchasing: I-b.) Purchasing administrative cost as percentage of total purchases

II. City Attorney's Office
- II-a.) Cost of representation by in-house OCA compared with benchmark cities and outside firms

III. City Manager's Office
- Code Compliance: III-a.) Code Compliance expenditures per capita; III-b.) Average days from complaint to first non-inspection response
- Water Quality/Environmental Protection: III-c.) Limited comparison data; recommend using external beach ratings

IV. Development Services
- Plan Check: IV-a.) Percentage of first plan reviews completed in 30 days (recommend development of California benchmarking group)

V. Fire
- Fire Emergency Response: V-a.) Percentage of responses within 5 minutes; V-b.) Percentage of fires confined to room/structure of origin; V-c.) Fire & EMS operating expenditures per 1,000 population; V-d.) Fire & EMS staffing, FTEs per 1,000 population

VI. General Services
- Maintenance of Beaches & Piers: VI-a.) No comparison data; recommend starting a benchmarking group
- Park Maintenance: VI-b.) Park maintenance expenditures per developed acre
- Refuse Collection: VI-c.) Resident rating of refuse service; VI-d.) Refuse collection costs per account (regional comparison)
- Street Sweeping: VI-e.) Street sweeping expenditures per residential curb-mile swept

VII. Human Resources
- Employment Services: VII-a.) Cost of central HR per City FTE; VII-b.) Average days to complete internal recruitment

VIII. Library
- Library Customer Service: VIII-a.) Reference transactions per FTE reference staff; VIII-b.) Materials expenditures per capita

IX. Police
- Police Crime Prevention: IX-a.) Feeling of safety in neighborhood
- Police Emergency Response: IX-b.) Average response time from dispatch to arrival; IX-c.) Total Police O&M expenditures per 1,000 population; IX-d.) Total Police staffing, FTEs per 1,000 population
- Investigation: IX-e.) Percentage of UCR Part I violent crimes cleared

X. Public Works
- Capital Project Delivery: X-a.) Limited comparisons; recommend focusing on internal benchmarking until a reliable comparison group is established
- Traffic and Transportation Services: X-b.) Pavement Condition Index compared to Orange County cities; X-c.) Satisfaction with road condition

XI. Recreation and Senior Services
- Active Kidz Program: XI-a.) No comparison data; recommend starting a benchmarking group
- Senior Transportation Services: XI-b.) Cost per trip (local benchmarking group initiated)

XII. Utilities
- Drinking Water Supply: XII-a. & b.) Percentage of days in full compliance with regulations (AWWA); XII-c.) Number of leaks/breaks per 100 miles of piping (AWWA)
- Wastewater Collection: XII-d. & e.) Overflows per 100 miles of collection system piping (AWWA)
I.a) Administrative Services/IT Services - O&M Expenditures as % of Total City O&M
[Bar chart: IT Operating Expenditures as % of Total Jurisdiction Operating Expenditures (0% to 4%). Shown in order: Bellevue WA, Westminster CO, Santa Monica CA, Scottsdale AZ, Fishers IN, Coral Springs FL, ICMA Mean, Newport Beach CA, ICMA Median, Santa Barbara CA, Manhattan Beach CA]
Readily available data for numerous peer cities - from CPM database
Legend for All Graphs (Unless noted otherwise, all data is for FY 2005 -2006)
Newport Beach
Peer Cities
Other CPM Cities
CPM Median and Mean
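The comparisons in these graphs all follow the same pattern: a city's component spending expressed as a percentage of its total O&M budget, placed against the comparison group's median and mean. A minimal sketch of that computation, using hypothetical city figures rather than actual CPM data:

```python
# Sketch of the CPM-style comparison: each city's IT O&M spending as a
# percentage of its total jurisdiction O&M, compared with the group's
# median and mean. City names and dollar figures are hypothetical.
from statistics import mean, median

# (it_om, total_om) in dollars -- illustrative values, not CPM data
cities = {
    "City A": (2_100_000, 95_000_000),
    "City B": (1_400_000, 80_000_000),
    "City C": (3_000_000, 110_000_000),
}

def pct_of_total(part, total):
    """Express a component budget as a percentage of the total budget."""
    return 100.0 * part / total

shares = {name: pct_of_total(it, total) for name, (it, total) in cities.items()}
group_median = median(shares.values())
group_mean = mean(shares.values())

# Rank cities as the bar charts do, lowest share first
for name, share in sorted(shares.items(), key=lambda kv: kv[1]):
    flag = "above" if share > group_median else "at or below"
    print(f"{name}: {share:.2f}% ({flag} group median)")
print(f"Group median: {group_median:.2f}%  Group mean: {group_mean:.2f}%")
```

The same pattern applies to any of the per-capita or per-1,000-population measures in this report; only the numerator and denominator change.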
I.b) Administrative Services/Purchasing - O&M Costs as a Percentage of Total Purchases
[Bar chart: Purchasing O&M Costs as a Percentage of Total Purchases (0.00% to 7.00%). Shown in order: Newport Beach CA, Palm Coast FL, Rockford IL, Sandy Springs GA, Schaumburg IL, North Richland Hills TX, ICMA Median, McAllen TX, Las Cruces NM, Davenport IA, ICMA Mean, Sterling Heights MI, Rowlett TX, Peoria AZ, Winter Haven FL]
Readily available comparison data from CPM database - peer cities not reporting
II.a) City Attorney's Office - Cost of Representation by In-House OCA Compared to Benchmark Cities and Firms
No benchmark data is available for this measure from either CPM or other readily available source. The City Attorney
requested that this measure be established, and has indicated that he will initiate the development of a benchmark
group for the purposes of providing meaningful data on this measure.
III.a) City Manager's Office/Code Compliance - Expenditures per Capita
[Bar chart: All Code Enforcement Categories - Expenditures per Capita ($0.00 to $25.00). Shown in order: Newport Beach CA, Bellevue WA, Sterling Heights MI, Virginia Beach VA, Evanston IL, ICMA Median, Skokie IL, Long Beach CA, Coral Springs FL, ICMA Mean, Westminster CO, Scottsdale AZ, University Park TX, Santa Monica CA, Palm Coast FL]
Readily available data for numerous peer cities - from CPM database
III.b) City Manager's Office/Code Compliance - Avg. Days from Complaint to Non-Inspection Response
[Bar chart: Zoning Code Violation - Average Days from Complaint to First Non-Inspection Response (0 to 5). Shown in order: Scottsdale AZ, Highland Park IL, Newport Beach CA, Coral Springs FL, Johnson City TN, Lombard IL, Skokie IL, Suwanee GA, ICMA Median, Des Moines IA, Queen Creek AZ, Fort Collins CO, Schaumburg IL, Winter Haven FL, ICMA Mean, Long Beach CA]
Readily available comparison data from CPM database - limited peer cities reporting
III.c) City Manager's Office/Water Quality/Environmental Protection
No ICMA comparison data is available for this service area, and data on this specific service area is generally
limited. Due to its importance to the City Council and other stakeholders, however, it is recommended that the City
develop external data sources to enable objective comparison of selected measures of ocean and bay water safety.
Two sources of comparison data are recommended for consideration.
One source is the "Beach Report Card" produced for the last ten years by Heal the Bay, a private, non-profit
organization which compiles and reports results of water quality monitoring performed by California health agencies.
Regular monitoring of the levels of three potentially hazardous water-borne organisms is required under California law
(AB 411).
Heal the Bay converts this data into an equivalent "letter grade" and publishes the results for nearly 500 monitoring
locations throughout California. Reports are produced annually each May, and end-of-summer reports are produced in
September. The 2008 Annual Report published results for approximately 35 beach and bay areas within Newport
Beach. All of the Newport Beach and Bay areas with data received an "A+" or "A" grade for dry weather readings.
Heal the Bay describes the significance of its ratings as follows: "The better the grade a beach receives, the lower the
risk of illness to ocean users. The report is not designed to measure the amount of trash or toxins found at beaches."
A second source of beach water quality benchmarks is the Natural Resources Defense Council (NRDC) annual survey of
water quality and public notification at 200 popular U.S. beaches. This survey provides ratings of two beaches
maintained by the City of Newport Beach. The table below is excerpted from the NRDC website for illustration
purposes:
NRDC Ratings for a Sample of 200 Popular Swimming Beaches in the United States

Beach name                    | % failing water quality standards (2008 / 2007 / 2006) | Fails less than 5%/year for 3 years | Testing frequency | Always issues advisories promptly | Posts closings/advisories online and at beach
Newport Beach at Balboa Pier  | 1 / 1 / 1                                              | YES                                 | 5/week            | YES                               | YES
Newport Beach at Newport Pier | 0 / 0 / 0                                              | YES                                 | 5/week            | YES                               | YES

Key: Water quality, 2008 | Water quality, last 3 years | Water quality testing frequency | Always issues advisories promptly |
Posts closings/advisories online and at beach. Each star indicates that this beach met a specific standard in 2008.
Each benchmark data source has strengths and limitations, but both offer a readily available source of comparison
information on observed and reported water quality as well as the reliability of advisory postings or beach
closings. A combination of the two sources should be considered to provide a comprehensive picture of water quality
conditions as well as the effective communication and management of beach hazards as they may occur.
The recommended future benchmarks are: 1) Percentage of Newport Beach beaches rated A-minus or above by the
annual Heal the Bay Beach Report Card, and 2) Average star rating (5-star maximum) of City-maintained beaches in
the annual NRDC Beach Report.
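Both recommended benchmarks are simple aggregations of the published ratings. A sketch of how each could be computed, using made-up grades and star counts rather than actual Heal the Bay or NRDC results:

```python
# Benchmark 1: share of monitored locations graded A- or better (Heal the Bay).
# Benchmark 2: mean NRDC star rating (5-star maximum).
# All input values below are hypothetical illustrations.

GRADE_ORDER = ["A+", "A", "A-", "B+", "B", "B-", "C+", "C", "C-", "D", "F"]

def pct_a_minus_or_above(grades):
    """Percentage of locations graded A- or above (lower index = better)."""
    cutoff = GRADE_ORDER.index("A-")
    good = sum(1 for g in grades if GRADE_ORDER.index(g) <= cutoff)
    return 100.0 * good / len(grades)

def average_stars(star_counts):
    """Mean star rating across rated beaches."""
    return sum(star_counts) / len(star_counts)

dry_weather_grades = ["A+", "A+", "A", "A", "A-", "B+"]  # hypothetical
nrdc_stars = [5, 4]                                       # hypothetical

print(f"Beaches rated A- or above: {pct_a_minus_or_above(dry_weather_grades):.1f}%")
print(f"Average NRDC star rating: {average_stars(nrdc_stars):.1f} of 5")
```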
IV.a) Development Services/Plan Review - % of First Plan Reviews Completed Within 30 Days
Limited comparable data is readily available for this service area. The measure stated above is a standard used by cities
within the CPM database; however, these measurement parameters are not used by the City of Newport Beach, so no
meaningful comparison can be presented.
Due to its importance to the City Council and other stakeholders, however, it is recommended that Newport Beach
explore the development of a comparison group in California, focusing on cities with similar profiles and development
characteristics. Plan review responsiveness, an issue identified by the City Council, is measured by many jurisdictions,
but the manner in which it is measured varies greatly. Therefore, it is recommended that Newport Beach continue to
seek comparison jurisdictions that currently, or in the future, may collect data on the completion of initial plan
reviews within 30 days. At present, the cities of Palo Alto and Santa Barbara publish data that appears comparable.
Efforts should be made to determine whether a complete benchmarking group can be identified.
V.a) Fire/Emergency Response - Response Time, Call Entry to Arrival
[Bar chart: Percentage of Emergency Fire Calls with a Response Time of 5 Minutes and Under - Call Entry to Arrival (0% to 100%). Shown in order: Johnson City TN, McAllen TX, Sterling Heights MI, Tyler TX, Newport Beach CA, Westminster CO, Eugene OR, ICMA Median, ICMA Mean, Farmers Branch TX, De Kalb IL, Bellevue WA, Corvallis OR, Virginia Beach VA]
Readily available comparison data from CPM database - limited peer cities reporting
Note: Newport Beach data includes time from local station notification to arrival; future reports will capture Newport Beach data from call entry to
arrival.
V.b) Fire/Emergency Response - Percentage of Fires Confined to Room/Structure of Origin
[Bar chart: All Structure Fire Incidents - Percentage of Fires Confined to Room/Structure of Origin (0% to 100%). Shown in order: Westminster CO, Bellevue WA, Newport Beach CA, Virginia Beach VA, Alexandria VA, Long Beach CA, Tifton GA, Woodstock GA, Matanuska-Susitna AK, Savannah GA, Kirkland WA, ICMA Median, ICMA Mean, East Providence RI, Peoria IL]
Readily available comparison data from CPM database - limited peer cities reporting
Fire Department footnote: The primary function of the public fire service has always been to limit fires to the "building of origin." Excellence in fire
protection can be measured in a department's ability to limit a fire to the "room of origin." Factors that directly contribute to this measurement
include response time, staffing adequacy and effective equipment.
V.c) Fire/Emergency Response - Operating Expenditures per 1,000 Population
[Bar chart: Fire & EMS Operating Expenditures per 1,000 Population ($0 to $350,000). Shown in order: ICMA Median, ICMA Mean, Evanston IL, Huntington Beach CA, Santa Cruz CA, Ventura CA, Long Beach CA, Bellevue WA, Santa Barbara CA, Manhattan Beach CA, Santa Monica CA, Newport Beach CA, Palo Alto CA]
Readily available data for numerous peer cities - from CPM database
Note: Newport Beach data excludes all Lifeguard expenses with the exception of some Training Division expenses for Lifeguard/Jr. Lifeguard
training which could not be separated for this report.
V.d) Fire/Emergency Response - FTEs per 1,000 Population
[Bar chart: Fire & EMS FTEs per 1,000 Population (0.0 to 2.5). Shown in order: Bellevue WA, Palo Alto CA, Fishers IN, Evanston IL, ICMA Mean, ICMA Median, Westminster CO, Newport Beach CA, Virginia Beach VA, Santa Monica CA, Coral Springs FL, Scottsdale AZ, Santa Barbara CA, Manhattan Beach CA, Huntington Beach CA]
Readily available data for numerous peer cities - from CPM database
Note: Newport Beach data excludes Lifeguard, Jr. Lifeguard and related training staff FTEs.
Fire Department footnote: Several factors determine the number of full-time employees per 1,000 population, including housing
density, geography and staffing policies. Newport Beach has a mixture of housing types and densities. Newport Beach fire stations are located to
minimize response time irrespective of the density of their response districts. Geography also influences fire department staffing. Without the
division of the City by the harbor and Back Bay, it would be possible to meet our response time objectives with two fewer stations, which would
lower the FTE/1,000 ratio from 1.61 to 1.33.
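The staffing ratio discussed in the footnote is plain arithmetic: FTEs divided by population, scaled to 1,000 residents. A sketch using hypothetical figures chosen only to reproduce ratios of the magnitude cited (1.61 and 1.33), not actual Fire Department staffing or census data:

```python
# Fire & EMS staffing ratio: FTEs per 1,000 residents.
# Population and FTE counts below are hypothetical illustrations.

def fte_per_1000(fte_count, population):
    """Staffing level scaled to 1,000 residents."""
    return fte_count / population * 1000

population = 84_000        # hypothetical resident count
current_fte = 135          # hypothetical current staffing level
reduced_fte = 112          # hypothetical staffing with two fewer stations

print(f"Current ratio: {fte_per_1000(current_fte, population):.2f} per 1,000")
print(f"Reduced ratio: {fte_per_1000(reduced_fte, population):.2f} per 1,000")
```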
VI.a) General Services/Beach and Pier Maintenance
No comparison data is available for this service area. Due to its importance to the City Council and other stakeholders,
however, it is recommended that measures be developed for objectively assessing the condition of beaches and piers,
and further, that Newport Beach initiate the development of a comparison group in California.
VI.b) General Services/Park Maintenance - Expenditures per Developed Acre
Readily available comparison data from CPM database - mix of peer cities and others reporting
VI.c) General Services/Refuse - Resident Rating of Service
VI.d) General Services/Refuse Collection - Costs per Account
There is currently no ability to present meaningful comparisons on this important benchmark measure. In order to benchmark the
per-account cost of refuse collection/disposal, the General Services Department will need to utilize an annual rate study of regional
municipalities conducted by an independent third party. This approach will allow for a meaningful cost comparison taking into
account the operating constraints faced by all Los Angeles and Orange County cities, such as AQMD restrictions, minimum waste
diversion requirements, and County landfill fees and requirements. The following list of recommended benchmark cities includes
programs that utilize either in-house or contract staff and offer either source-separated or mixed waste recycling options in cities of
similar population.
Benchmark Cities: Buena Park, Claremont, Costa Mesa, Culver City, Huntington Beach, Irvine, La Palma, Orange, Pomona,
Santa Monica, Stanton, Tustin
VI.e) General Services/Street Sweeping - Expenditures per Residential Curb-Mile Swept
[Bar chart: O&M Expenditures for Street Sweeping per Linear Mile Swept ($0 to $40). Shown in order: Newport Beach CA, Long Beach CA, Scottsdale AZ, Benchmark Group Median, Coral Springs FL, Benchmark Group Mean, Westminster CO, Bellevue WA]
Readily available data for numerous peer cities - from CPM database
VII.a) Human Resources/Employment Services - Central HR Expenditures per City FTE
[Bar chart: Human Resources Expenditures per FTE ($0 to $3,500). Shown in order: Fishers IN, ICMA Median, Long Beach CA, ICMA Mean, Westminster CO, Coral Springs FL, Santa Monica CA, Santa Barbara CA, Bellevue WA, Scottsdale AZ, Newport Beach CA, Manhattan Beach CA, Palo Alto CA, Ventura CA]
Readily available data for numerous peer cities - from CPM database
VII.b) Human Resources/Employment Services - Average Days to Complete Internal Recruitment
[Bar chart: Average Number of Days to Complete Internal Recruitment (No Test) (0 to 60). Shown in order: Sandy Springs GA, Coral Springs FL, Woodstock GA, Bellevue WA, Schaumburg IL, McHenry IL, ICMA Median, Highland Park IL, Newport Beach CA, St. Charles IL, ICMA Mean, Johnson City TN, Virginia Beach VA, Westminster CO, Des Moines IA]
Readily available comparison data from CPM database - mix of peer cities and others reporting
VIII.a) Library - Reference Transactions per FTE Reference Staff
[Bar chart: Reference Questions Answered per Full-Time Employee (0 to 5,000). Shown in order: Johnson City TN, ICMA Mean, Sterling Heights MI, Westminster CO, Newport Beach CA, Evanston IL, Scottsdale AZ, North Las Vegas NV, Farmers Branch TX, Virginia Beach VA, ICMA Median, Davenport IA, Salem OR, Newport News VA, Keller TX, Rowlett TX]
Readily available comparison data from CPM database - mix of peer cities and others reporting
VIII.b) Library - Materials Expenditures per Capita
[Bar chart: Library Services Material Expenditures per Capita ($0 to $10). Shown in order: Johnson City TN, Winter Haven FL, North Las Vegas NV, Smyrna GA, Sterling Heights MI, Salem OR, Long Beach CA, Westminster CO, ICMA Median, North Richland Hills TX, ICMA Mean, Davenport IA, Chesapeake VA, Virginia Beach VA, Scottsdale AZ, Newport Beach CA, Evanston IL]
Readily available comparison data from CPM database - mix of peer cities and others reporting
IX.a) Police/Crime Prevention - Feeling of Safety in Neighborhood
[Bar chart: Citizen Survey - Walking Alone in Neighborhood After Dark & During Day (Very Safe + Somewhat Safe) (0% to 100%); separate bars for During Day and After Dark. Comparison cities include Highland Park IL, Fishers IN, Fort Collins CO, Bellevue WA, Coral Springs FL, Skokie IL, ICMA Median, Santa Monica CA, Johnson City TN, ICMA Mean and Palm Coast FL]
Readily available comparison data from CPM database - mix of peer cities and others reporting
IX.b) Police/Emergency Response - Average Response Time from Dispatch to Arrival
[Bar chart: Average Time from Receipt of Telephone Call to Arrival on Scene (in seconds) (0 to 700). Shown in order: Peoria IL, Newport Beach CA, Coral Springs FL, North Richland Hills TX, Sioux City IA, Bellevue WA, Westminster CO, ICMA Median, ICMA Mean, Salem OR, Fishers IN, Eugene OR, Rowlett TX, Gainesville FL, Schaumburg IL]
Readily available data for numerous peer cities - from CPM database
Police Department footnote: An emergency response is defined as an in-process violent crime or any call for service which is in progress and
is potentially life-threatening.
IX.c) Police/Emergency Response - Total Police O&M Expenditures per 1,000 Population
[Bar chart: Total Police O&M Expenditures per 1,000 Population ($0 to $700,000). Shown in order: Fishers IN, Virginia Beach VA, ICMA Median, ICMA Mean, Evanston IL, Huntington Beach CA, Ventura CA, Bellevue WA, Santa Cruz CA, Santa Barbara CA, Palo Alto CA, Manhattan Beach CA, Newport Beach CA, Santa Monica CA]
IX.d) Police/Emergency Response - Total Police Staffing, FTEs per 1,000 Population
[Bar chart: Police Services FTEs per 1,000 Population (0.0 to 6.0). Shown in order: Santa Monica CA, Scottsdale AZ, Evanston IL, Newport Beach CA, Palo Alto CA, Manhattan Beach CA, Westminster CO, Santa Barbara CA, ICMA Mean, Coral Springs FL, Bellevue WA, ICMA Median, Virginia Beach VA, Huntington Beach CA, Fishers IN]
Readily available data for numerous peer cities - from CPM database
IX.e) Police/Investigations - Percentage of UCR Part I Violent Crimes Cleared
[Bar chart: Percentage of UCR Part I Violent Crimes Cleared (0% to 80%). Shown in order: Newport Beach CA, Coral Springs FL, Bellevue WA, Westminster CO, Gainesville FL, ICMA Mean, Sterling Heights MI, ICMA Median, Virginia Beach VA, Evanston IL, North Richland Hills TX, Schaumburg IL, North Las Vegas NV, Fishers IN, Fort Collins CO, Palm Coast FL]
Readily available comparison data from CPM database - mix of peer cities and others reporting
Police Department footnote: The FBI Uniform Crime Reporting (UCR) definition of violent crime includes murder and nonnegligent
manslaughter, forcible rape, robbery, and aggravated assault.
X.a) Public Works/Capital Project Delivery
[Bar chart: Percentage of Capital Projects Completed On-Time and On-Budget, and Total Number of Projects Completed, FY 2007-08 - Carlsbad, Newport Beach, San Jose]
Preliminary data from Newport Beach's CIP Monitoring System shows that of 23 projects completed in 2007-08, 78%
were on schedule and 83% were completed within Council-authorized funding. Limited benchmarking sources exist for
capital project completion performance. Two cities with several years of performance measurement experience
are provided for reference. It is recommended that ongoing efforts focus on year-to-year comparisons of internal
data until a reliable benchmarking group can be established.
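The on-time and on-budget percentages can be derived directly from per-project completion records. A sketch assuming a hypothetical record layout; the field names and sample counts below are illustrative, not actual CIP Monitoring System data:

```python
# Derive on-time / on-budget completion rates from per-project records.
# The record structure and sample counts are hypothetical; the arithmetic
# matches the reported style (e.g., 18 of 23 on schedule rounds to 78%).

def completion_rates(projects):
    """Return (pct_on_time, pct_on_budget, total) for completed projects."""
    total = len(projects)
    on_time = sum(1 for p in projects if p["on_time"])
    on_budget = sum(1 for p in projects if p["on_budget"])
    return (round(100 * on_time / total), round(100 * on_budget / total), total)

# 23 hypothetical projects: 18 on schedule, 19 within authorized funding
projects = ([{"on_time": True, "on_budget": True}] * 18
            + [{"on_time": False, "on_budget": True}] * 1
            + [{"on_time": False, "on_budget": False}] * 4)

pct_time, pct_budget, total = completion_rates(projects)
print(f"{total} projects: {pct_time}% on time, {pct_budget}% on budget")
```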
X.b) Public Works/Traffic & Transportation - Pavement Condition Index Compared to Orange County Cities
[Bar chart: Pavement Condition Index Comparison with Orange County Cities (0% to 100%). Shown in order: Aliso Viejo CA, Irvine CA, Orange County CA, Dana Point CA, Anaheim CA, Newport Beach CA, Tustin CA, Lake Forest CA, Buena Park CA, Mission Viejo CA, Yorba Linda CA, San Juan Capistrano CA, Laguna Woods CA, Laguna Beach CA, Laguna Niguel CA, Costa Mesa CA, Rancho Santa Margarita CA, Seal Beach CA, Villa Park CA, Orange County CA Median, Orange County CA Mean, Placentia CA, Laguna Hills CA, Cypress CA, Fountain Valley CA, Brea CA, Fullerton CA, Orange CA, San Clemente CA, La Habra CA, Huntington Beach CA, Santa Ana CA, Westminster CA, Los Alamitos CA, Stanton CA]
Information gathered from Orange County cities - all use the same pavement condition evaluation methodology
Public Works footnote: All PCI are projected to 2065. Assumptions are that agencies have updated pavement maintenance and
rehabilitation history since the last field survey.
X.c) Public Works/Traffic & Transportation - Satisfaction with Road Condition
[Bar chart: Citizen Satisfaction with Road Conditions (Good + Mostly Good) (0% to 100%). Shown in order: Plano TX, Bellevue WA, Coral Springs FL, University Place WA, Scottsdale AZ, North Richland Hills TX, Sterling Heights MI, Lynnwood WA, ICMA Median, Westminster CO, ICMA Mean, Cartersville GA, Newport Beach CA, Davenport IA]
Readily available comparison data from CPM database - mix of peer cities and others reporting
Note: Newport Beach data is from the initial Resident Satisfaction Survey (2007)
XI.a) Recreation & Senior Services/Active Kidz After-School Program - % of Program Filled
The Newport Beach service area related to recreation programs is the "Active Kidz" after-school program. ICMA does not
collect performance data specifically for after-school recreation programming. Should Newport Beach desire to establish a
benchmark for Active Kidz, alternative sources of data will need to be developed, such as the establishment of a local
benchmarking group. Recreation staff has contacted several local jurisdictions but, to date, no comparable data has been
successfully developed.
XI.b) Recreation and Senior Services/Senior Transportation - Cost per Trip
ICMA does not collect performance data specifically for senior transportation programs. Newport Beach Recreation
staff has initiated a local benchmarking group of four other senior transportation programs in Orange County, and has
obtained comparable data on the programs' cost per trip. The initial data is presented below for Fiscal Year 2007-2008.
[Bar chart: Senior and Disabled Transportation Programs - Per-Trip Cost ($0 to $35). Shown in order: Huntington Beach CA, Newport Beach CA, Irvine CA, OCTA ACCESS, South County Senior Services]
Data provided is for Fiscal Year 2007-2008
Note: All of the above senior transportation programs operate with paid staff except Huntington Beach, which utilizes
volunteer drivers.
XII.a) Utilities/Drinking Water Supply
Water supply quality is one of the services critical to a community's physical and economic health.
The benchmarking source for water utilities is the American Water Works Association's "QualServe" program, whose
member agencies contribute data on up to 22 indicators and attributes of water service and wastewater collection.
Initial comparisons with the AWWA's 19-city California region aggregate data for 2006-2007 show Newport Beach
performing relatively well on two principal measures of drinking water supply service quality. Days in full compliance with
water quality standards is at 100%. Water distribution system integrity is within the top 25% of agencies reporting.
Cost data is currently not comparable with that of the AWWA jurisdictions because Newport Beach water supply and
wastewater collection functions do not include operation of treatment facilities. To provide useful cost comparisons,
Newport Beach will need to work with the AWWA and with other jurisdictions to develop a group of utilities with a similar
scope of operations.
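Placing one agency's result within a benchmark group, as the AWWA quartile comparisons below do, reduces to computing the group's quartiles and locating the city's value among them. A sketch with hypothetical leak counts, not AWWA data:

```python
# Locate a utility's result within a benchmark group's quartiles, in the
# style of the AWWA 19-agency California comparisons. All values below
# are hypothetical illustrations.
from statistics import quantiles

# Leaks/breaks per 100 miles of piping for a hypothetical 19-agency group
group = [3, 5, 6, 8, 9, 10, 12, 13, 15, 17, 18, 20, 22, 25, 28, 30, 33, 38, 42]
city_value = 7  # hypothetical result for the city being benchmarked

# quantiles(n=4) yields the 25th percentile, median, and 75th percentile
q1, q2, q3 = quantiles(group, n=4)
in_top_quartile = city_value <= q1  # fewer leaks than 75% of the group

print(f"25th pct: {q1}, median: {q2}, 75th pct: {q3}")
print("Top-quartile performer" if in_top_quartile else "Not in top quartile")
```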
XII.b) Utilities/Drinking Water Supply - Percentage of Days in Full Compliance
Comparison to AWWA Benchmarking Group of 19 California Agencies, FY 2006-2007 Data
[Bar chart: Drinking Water - Percentage of Days in Full Compliance (0% to 100%). Shown in order: Newport Beach CA, 25th Percentile, Median, 75th Percentile]
From AWWA - only source of comparison data available for utilities measures
XII.c) Utilities/Drinking Water Supply - System Integrity: Number of Leaks per 100 Miles of Piping
Comparison to AWWA Benchmarking Group of 19 California Agencies, FY 2006-2007 Data
[Bar chart: Water Distribution System Integrity - Number of Leaks/Breaks per 100 Miles of Piping (0 to 45). Shown in order: Newport Beach CA, 25th Percentile, Median]
From AWWA - only source of comparison data available for utilities measures
XII.d) Utilities/Wastewater Collection
Wastewater collection is a service critical to a community's physical and economic health. As with the drinking water
supply service, a benchmarking source for wastewater collection is the American Water Works Association's
"QualServe" program.
At present, the AWWA can provide benchmark data for one aspect of wastewater collection system quality: sewer
overflow rate per 100 miles of piping. Initial comparison with the AWWA's 19-city California region aggregate data
shows the 2006-2007 Newport Beach overflow rate is well below the mean for the benchmark group. This benchmark
does not include 81 miles of sewer lateral piping for which Newport Beach also provides maintenance.
System integrity (system leaks and breaks per 100 miles of collection piping) is another measure of system quality;
however, no AWWA data was yet available for this measure. Cost data is currently not comparable to most AWWA
jurisdictions because Newport Beach does not operate sewage treatment facilities. To provide accurate cost
comparisons, Newport Beach would need to work with the AWWA and other jurisdictions to develop a benchmark
group of similar wastewater collection utilities.
XII.e) Utilities/Wastewater Collection - Overflows per 100 Miles of Piping
Comparison to AWWA Benchmarking Group of 19 California Agencies, FY 2006-2007 Data
From AWWA - only source of comparison data available for utilities measures
Benchmark Comparisons- Summary
In general, the City of Newport Beach appears to compare favorably on most measures surveyed in terms of quality, cost,
cycle time and customer satisfaction. While general conclusions may be drawn from the results reported, it is
recommended that the results be used as the first step in the review of operations and as a way of determining priorities
for new investment or process improvement.
IV. Observations/Recommendations
Observations
Over the course of the project ICMA consultants gained insights into staff capabilities and the challenges associated
with implementing and maintaining a performance measurement and benchmarking system. We offer the following
observations for your consideration:
1. The City has the necessary systems to support performance measurement. While working with IT, HR and
other support functions, it was apparent that systems were in place to provide necessary assistance. IT in
particular was very adaptable and flexible during the project.
2. Managing with data will be a significant change for the organization. How the City Council and managers use
data, the way it's reported to the community, as well as the various ways the city secures customer feedback
will all be new activities. This will require discussion and deliberation between Council and staff regarding
importance, approach and expectations.
3. The organization has been stressed by the development of a performance measurement system and the data
collection requirements. This is due in part to the scope of the project and the lack of familiarity with
performance measurement. However, it also appears to relate to workload and staff capacity in some
areas. This should be reviewed in greater detail to determine ongoing needs and to create a plan for
sustaining the effort.
4. It appears that the organization has a very "lean" analytical capacity. While this is commendable in terms of
keeping costs low, it may have an impact on the organization's ongoing assessment, collection and
maintenance of performance data. Newport Beach appears to have a more limited capacity than similarly
situated organizations. A Council-staff discussion regarding the distribution of responsibilities and analytical
capacity would be useful.
5. The staff is very committed and engaged. Over the course of a very difficult project, with many stops and
starts, the staff remained engaged and committed to getting the job done. In particular, the central team went
the extra mile to assist the consultants and work with department facilitators to finalize measures and explain
concepts.
Recommendations
Based upon the foregoing information and observations, the following recommendations are offered for your
consideration:
1. Continue refining measures and benchmarks. A great deal of effort has been committed to establishing a solid
base of measures for which data is currently being collected. Some measures will need to be refined as a better
understanding of the work emerges, and some benchmarks will be abandoned or changed as discussion of what's
important evolves. A measurement team should continue, and a person in each department should be
designated as the "measurement coordinator".
2. Conduct a debriefing session with the core team and facilitators. The purpose of the session would be for the City
Manager and Department Heads to uncover issues and determine staff capacity to fully implement a
performance management system. This should be followed by a Council study session regarding the costs and
benefits of managing with data.
3. Decide what measures and outcomes the Council will track on an ongoing basis, in order to establish a clear
linkage to strategic planning/goal-setting sessions.
4. Conduct regular strategic planning. The City has committed to performance excellence, and to benchmarking
against "best in class" organizations. Developing a multi-year strategic plan with actionable objectives, linked to a
business plan and annual budget, will bring Newport Beach in line with best practice in this area. This will also
help to define the strategic outcomes for measurement, and improve alignment with department operations.
5. Consider leading an effort with similarly situated cities to conduct benchmarking focused on key Newport
Beach measures. As noted, some of the benchmarking measures lack comparables. Although additional effort by
staff will expand the comparable data, many of the measures important to Newport Beach are not collected by
the peer cities. One remedy would be for Newport Beach to lead a benchmark group. ICMA has numerous
benchmark consortia across the country. Newport Beach could set the standards for a southern California
group.
6. Continue regular community and business surveys. Surveys of customer requirements and satisfaction are crucial
to the measurement of various services. In addition, future surveys should be designed to elicit specific
information concerning customer requirements so that systems can be designed properly and data can be
collected to measure results related to meeting customer needs.
7. Conduct regular employee surveys. Much like the community and business surveys, an employee survey will
enable management to understand employee requirements — particularly as they relate to internal service
delivery and strategic alignment.
8. Adopt a five-year Capital Improvement Program (CIP). Because capital project management is a high priority for
the City Council, and the staff delivers a high volume of projects, the establishment of a multi-year Capital
Improvement Program is a prudent course of action. This should become part of the business planning/budgeting
cycle.
The opportunity to assist the City of Newport Beach with this project is sincerely appreciated. The organization's
openness to suggestions and willingness to work to implement this system has been gratifying.
Performance Measurement and
Benchmarking Project
September, 2009
Newport Beach, California
Submitted by and reply to:
Management Services
ICMA Consulting Services
International City/County Management Association
777 North Capitol Street NE, Suite 500
Washington, DC 20002
Craig.Rapp@icma.org
202.862.3583
ICMA
Leaders at the Core of Better Communities
Table of Contents

Introduction
I. Performance Measurement
   Uses and Benefits of Performance Measurement
II. Project Purpose and Overview
   Goal Alignment and Validation
   Identification and Adoption of Services for Measurement
   Benchmarking - Peer City Comparisons
   Staff Training - Measures and Data Collection
III. Project Results
   Understanding the System and the Data
   Data Gathering Issues
   Benchmark Measures - Summary
   Benchmark Comparison Data
IV. Observations / Recommendations
   Observations
   Recommendations
Introduction
This report summarizes ICMA Consulting Services' (ICMA) effort to assist the City of Newport Beach, California in
creating a performance measurement system and to initiate benchmark comparisons with other communities.
The goal of the project was to establish comparative data on organizational effectiveness and to support
continuous improvement.
The contract with the City of Newport Beach required ICMA to provide local government benchmarking data,
deliver performance measurement best practices, train city staff, and facilitate development of a system that
would enable the city to analyze and improve its performance.
Over the past year, ICMA has worked with the staff and City Council to replace a system of service load indicators
with a system of balanced performance measures and benchmark comparisons. To accomplish this, benchmark
comparison cities were identified, a core team of employees was trained, and a set of benchmark and general
performance measures, tied to Council priorities, was developed.
Initial data for twenty-eight benchmark measures is provided comparing Newport Beach to comparable
jurisdictions both locally and across the country. Additionally, data collection is underway for more than 100
other performance measures that will be used to analyze and improve operations once collection cycles conclude
and data becomes available.
The project represents the first step in a long-term process of analyzing and defining key drivers of organizational
performance and service delivery. Creation of the system and collection of initial data provides the foundation
for additional inquiry. Data collection activities reflect results, but do not provide definitive answers about
performance.
Finding those answers and managing the process of continuous improvement will be the ongoing challenge
facing the city. This may lead to new ways of doing business, refining processes, and potentially rethinking
strategic direction. Performance measurement and reporting is an evolutionary process, tied both to the
strategic planning and performance improvement activities of the organization.
The system that has been created will provide the opportunity for improved data-driven decision-making.
To successfully implement this, the City Council and staff will need to ensure that roles and accountabilities are
clear, and that performance measures continue to reflect community priorities.
I. Performance Measurement
Performance measurement is a process of data collection and analysis used to assess the results and
effectiveness of business activities over a specific time period. Progressive cities throughout the United States use
performance measurement systems to both guide and manage operations.
It is important to keep in mind, however, that performance measurement does not represent an end point, but in
fact begins the conversation about organizational effectiveness.
The logic of performance measurement is simple and compelling:
• Measurement focuses attention on priorities and results
  o A performance measurement system links Council strategic objectives to the daily work of employees
• Measurement answers the question "How are we doing?"
  o Performance measurement provides managers and employees with the information necessary to critically examine operations and compare this to other organizations or benchmark criteria
• Measurement identifies successful strategies
  o Performance measurement reveals processes and approaches that are working and enables sharing with others in the same field
• Measurement enhances accountability
  o Performance measurement provides the basis for developing reporting systems that explain the cost, quality and effectiveness of service delivery to stakeholders
The benefits of Performance Measurement are:
• Measures reveal values
  o If customer satisfaction, low cost, or speedy delivery is valued, it will be measured
• Measures motivate behavior
  o If something is valued and becomes a priority, it will be measured and systems of work will be focused
• Measures help you learn
  o Most importantly, measures help answer critical questions (Is it working? Are we having an impact? What happened?) in order to facilitate continuous learning and improvement
II. Project Purpose and Overview
The performance measurement and benchmarking project began in February 2008. A project schedule was
developed to align with the deliverables expected by the City Council. The key components and ultimate timing
of each element are listed below:
• Validate Alignment with Goals - March 2008
• Review/Recommend Services - April 2008
• Recommend Measures - June 2008
• Recommend Peer Cities - July 2008
• Train staff/refine/collect - Throughout
• Implementation/Process - Aug-Oct 2008
• Catch-up/suspend activities - Oct-Dec 2008
• Refine/finalize/collect data - Jan-June 2009
• Report to Finance Committee - June 2009
• Department Wrap-up Meetings - August 2009
• Managing with Data Workshop - August 2009
• Report to City Council - September 2009
A "central team" of four key staff was chosen to oversee the project and interface with the consultant team. In
addition, each department appointed a set of facilitators who received training and worked with the central team
and consultants to develop the performance measures and benchmark data.
The central team members are:
1. Susan Warren - Library Services Manager
2. George Murdoch - Director of Utilities
3. John Kappeler - Division Manager of Code and Water Quality Enforcement
4. Michael Wojciechowski - MIS Operations Coordinator
The department facilitators are listed on the following page.
Figure 1. Department Facilitators Trained in Performance Measurement

Administrative Services - Dennis Danner, Lois Thompson, Sandra Wilcox
Building - Jay Elbettar, Faysal Jurdi
City Attorney - David Hunt, Kristy Parker
City Clerk - Leilani Brown, Shana Stanley
City Manager - Homer Bludau, Tammie Frederickson, Sharon Wood, Tara Finnigan, Dave Kiff
Fire - Steve Lewis, David Mais, Ralph Restadius, Steve Bunting
General Services - Mark Harmon, Maurice Turner, Mike Pisani
Human Resources - Terri Cassidy, Gwen Bouffard
Library Services - Cynthia Cowell, Melissa Kelly
Planning - David Lepo, Jay Garcia
Police - Robert Luman, Jim Kaminsky
Public Works - Steve Badum, Jamie Pollard, David Webb
Recreation & Senior Services - Laura Detweiler, Sean Levin
Utilities - George Murdoch, Cindy Asher
Goal Alignment and Validation
In order to ensure that performance measures were aligned with Council and community goals, the project team
reviewed a variety of documents including Council priorities, business and resident surveys and various planning
documents. The following table summarizes the most important goals and priorities. The priorities expressed,
particularly the overlapping priorities of the two lists, provided the context for developing performance
measures:
Figure 2: Priorities Identified by Council — Annual Session & Resident Survey

Council Priorities:
• City Hall
• Facilities financing plan
• Implement group home ordinance
• Water Quality
• Benchmarking
• Banning Ranch Appraisal/Acquisition
• Traffic Management Implementation
• Make city more energy efficient

Resident Survey Priorities:
• Management of traffic flow
• Maintenance of streets/infrastructure
• Maintenance of beaches
• Enforcement of codes/ordinances
• Quality of water supply
• Effectiveness of City communications
• Parks & Recreation programs and facilities
• Quality of city customer service
• Quality of library system
Identification and Adoption of Services for Measurement
Once priorities were identified, staff training was conducted on performance measurement concepts. Initial
training focused on creating a basic understanding of performance measurement and how measurement and
benchmarking would be used in the organization to both manage operations and address critical Council
priorities.
Following the training sessions, the team developed a list of thirty-six "service areas" for possible measurement.
The list was presented to the Finance Committee and reduced to thirty, and then adopted by the City Council.
Developing performance measures for thirty service areas is an ambitious undertaking. A more typical approach
would be to establish measures in a few departments to test the effort, and then use that experience to expand
the system across the organization. In this circumstance the City Council and senior leadership determined that a
broad cross-section of the organization should be represented; therefore, the larger effort went forward.
Once final service areas were approved, the consultant, central team and facilitators embarked upon a series of
facilitated sessions leading to the development of performance measures for each service. Prior to this time,
Newport Beach used a system of "service load indicators" to report the output of each department.
The performance measures that were developed address the results, or "outcomes," of service delivery, and as such
provide a link both to Council's priorities and back to the previous service load indicators. A balanced set of measures
(Quality, Cost, Cycle Time, and Customer Satisfaction) was established for each service area.
Using a Balanced Set of Measures
A comprehensive measurement system demands that the data collected reflect, to the greatest extent possible, a
balance of four key dimensions of performance. The four categories that comprise a balanced set of measures are
described below:
Quality - is the umbrella term designed to capture the effectiveness of the service and indicate whether the
organization is doing the right thing.
Sample Quality Measure: Percentage of streets in very good or excellent pavement condition
Cost - is the amount of resources required to produce a definable per unit result. Cost usually includes labor,
materials, overhead, facilities and utilities. This measure may also reflect other financial aspects such as revenue,
savings, or investments.
Sample Cost Measure: Operating and maintenance expenditures per system mile for drinking water distribution
Cycle Time - is the time from when a customer requests the service to when the service is delivered. This measure
should include waiting time, and look at the whole service. Cycle time is usually measured against some standard of
response time.
Sample Cycle Time Measure: Percentage of requested maintenance completed within time standards
Customer Satisfaction - is how customers feel about a service they received and the results of that service. It
can include a broad range of measures examining customers' feelings about timeliness, quality, professionalism
of service delivery, and courtesy. This category differs from the others because it is entirely based upon
perception, where the others typically measure an objective condition. It is important to remember that both
are needed to provide a balanced picture.
Sample Customer Satisfaction Measure: Percentage of customers rating library customer service as "good" or
better. (e.g. 4 or 5, on a 5 point scale)
Each service area generally includes one or more measures covering the four measurement types (Quality, Cost, Cycle Time,
and Customer Satisfaction) to demonstrate "how well," "how fast," "how efficient," or "how satisfactory" the results of
the service are. For benchmarking purposes, the one or two dimensions deemed most important were used.
Finally, each measure was subjected to an additional evaluation to determine whether it was meaningful, useful and
sustainable. Meeting these criteria was important in order to ensure that the measures have value and will be used by
Council and staff. A description of these terms follows:
Meaningful - Does the measure provide information people want or need to know regarding the effectiveness, cost,
responsiveness, or overall satisfaction with the service delivered? Will measuring these results help us improve?
Useful - Can this information be used to communicate success or support decision- making? What specific
communications or decisions are envisioned?
Sustainable - Is the information available? What additional work or resources are needed to implement the
measurement and collection of the data? Is the information worth the investment of time and resources necessary to
collect, analyze and report it?
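To make the screening concrete, a measure and its three screens can be sketched as a simple data record. This is a hypothetical illustration only; the class and field names are assumptions for explanation, not part of the city's actual measurement system:

```python
from dataclasses import dataclass

@dataclass
class Measure:
    # Hypothetical record for one performance measure
    service_area: str
    dimension: str        # Quality, Cost, Cycle Time, or Customer Satisfaction
    description: str
    meaningful: bool      # informs effectiveness, cost, responsiveness, or satisfaction
    useful: bool          # supports communication or decision-making
    sustainable: bool     # data is available at a reasonable cost of collection

    def keep(self) -> bool:
        """A measure is retained only if it passes all three screens."""
        return self.meaningful and self.useful and self.sustainable

pavement = Measure(
    service_area="Street Maintenance",
    dimension="Quality",
    description="Percentage of streets in very good or excellent pavement condition",
    meaningful=True, useful=True, sustainable=True,
)
print(pavement.keep())  # True: passes all three screens
```

Under this scheme, a measure failing any one screen (for example, one too expensive to collect) would be set aside or redesigned rather than reported.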
Benchmarking - Peer City Comparisons
One of the City Council's primary objectives for the project was to establish data on comparative performance. To
put benchmarking into context, and to ensure common understanding, terminology was approved for use in the
system. The following are key benchmarking definitions:
Benchmark - a performance measure that permits valid comparison of service delivery results with other
jurisdictions or organizations, recognized "best practices", or professional standards for a service. Benchmarks
are selected and analyzed to ensure a meaningful "apples to apples" relationship exists between compared
operations and reported data.
Benchmarking Sources — these are other jurisdictions, professional organizations, or readily available sources that
provide comparisons of performance, best practices, innovations or problem solving.
Concurrent with the development of benchmark measures was the identification of "peer" cities with which to
benchmark performance. In order to establish a reasonable comparison group, a set of recommended criteria or
"filters" was developed and reviewed by the Finance Committee in August 2008, and approved by the City
Council in September. The initial recommendation to the City Council contained nine criteria and a total of
twenty-three benchmark cities. The Council ultimately selected seventeen cities for benchmarking purposes.
Figure 3: Benchmark Cities Selected by City Council
Carlsbad, CA
Bellevue, WA
Long Beach, CA
Coral Springs, FL
Santa Barbara, CA
Virginia Beach, VA
Huntington Beach, CA
Westminster, CO
Manhattan Beach, CA
Scottsdale, AZ
Santa Monica, CA
Evanston, IL
Ventura, CA
Fishers, IN
Santa Cruz, CA
Coronado, CA
Palo Alto, CA
The following are the nine criteria or "filters" that were used to determine peer cities:
1. Population
2. Participate in ICMA Center for Performance Measurement (CPM)
3. Beach /Resort and /or CA Community
4. Median Income
5. Total Operating Budget
6. Total Operating Budget /Number of City Employees Ratio
7. Median Housing Price
8. Number of Jobs in the Community
9. Services Provided
Through benchmarking with other cities, Newport Beach hoped to answer the following key questions:
1. "How does Newport Beach compare with the benchmark local governments in terms of its
performance results?"
2. "What can Newport Beach learn about service delivery from a city that performs a similar service but
has different levels of performance?"
Staff Training - Measures and Data Collection
Upon establishment of a benchmarking and measurement structure, additional staff training and development was
initiated to move from basic understanding of concepts to full competency developing measures, collecting and
recording data, and analyzing results.
In addition, ICMA trained the central team to a higher level in an effort to "train the trainer" and to provide an on-site
support system for the consultants. This also served to establish an in-house capacity for sustaining the system going
forward.
Ultimately, the project size and scope proved to be overwhelming for staff, resulting in training and data collection
activities being suspended for three months in the fall of 2008. Reestablishing momentum after suspending activities
proved to be difficult; however, additional training and guidance were provided during the first quarter of 2009.
Measurement activity is ongoing, with support from ICMA.
To ensure sustainability of the project, a final facilitator training curriculum has been provided.
III. Project Results - Performance Data
Understanding the System and the Data
The data presented in this report represents an integrated system. In every performance measurement system there
are different types of data and different data needs depending upon the scope of authority and responsibilities of the
person using the information.
For example, it is typical for a City Council to focus on the broadest types of outcome measures such as "overall
customer satisfaction with Library Services". This differs from a measure monitored by a Library Manager such as
"amount of time it takes for a new item to go from initial receipt to the shelf". Each represents valid data, but how it
is used, and the need to know, vary depending upon the responsibilities of each party.
The performance measurement system developed for Newport Beach was constructed for two primary purposes: (1)
to provide "benchmark" performance comparisons with other cities on a range of services; and (2) to provide a
balanced set of measures that can be used for monitoring and evaluating operations. The first purpose primarily
serves the needs of the City Council and senior management, the second serves the needs of senior management and
front line supervisors.
Benchmark measures, along with initial comparison data, are provided in the following sections. The benchmark
measures included were chosen because they represent broad indicators of performance deemed to be useful for
City Council deliberations. The results are clearly depicted in the graphs; therefore, limited interpretation is provided
except where necessary to explain data collection anomalies or to provide a recommendation for future data
development.
General performance measures which address operations are listed in Appendix A. These measures were developed
by staff to support Council priorities and connect those priorities to the daily work of employees. Data collection has
begun on many of them, with initial results to be reported during the 2009-2010 fiscal year. Some of the measures
are still under evaluation for appropriateness, related primarily to issues associated with the time and expense of
data collection or the need to change IT or HR systems in order to collect the data.
Benchmark Comparisons - Data Collection
The city chose to address thirty different service areas for measurement, in an attempt to cover a comprehensive
range of city operations. While this choice will ultimately serve the city well for general measurement of
performance, it provided a significant challenge for benchmarking.
For example, although ICMA's Center for Performance Measurement (CPM) has the largest database of local
government performance measures in the country with over 200 jurisdictions reporting, there are only fifteen service
areas in which data is collected. Multiple measures are contained within the fifteen services, however; therefore, CPM
data is provided for twenty-one of the thirty-four benchmark measures recommended.
Even within CPM's comprehensive database, not all cities involved collect or report data on all measures. In some
cases, peer cities that are part of the CPM database don't report on desired measures, either because they don't
provide the exact service, or they choose not to measure it. To compensate for these situations, other CPM cities
similar to the peer cities have been included for comparison, along with the median and mean of all reporting cities.
To fill the remaining void, third party data was also collected. While useful, third party data is often limited in its
comparability and depth of information. One example of this is the data for Utilities, which came from the American
Waterworks Association (AWWA). While AWWA does collect nationwide information on utility operations, it did not
provide the same level of comparability as the CPM database.
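The peer comparisons shown in the graphs that follow reduce to a simple computation: the Newport Beach value is set against the mean and median of the reporting group. The sketch below illustrates the arithmetic with invented placeholder figures; these are not actual CPM data:

```python
from statistics import mean, median

# Invented placeholder figures for illustration; real comparisons
# use cleaned CPM data, not these values.
peer_values = {                      # e.g., O&M expenditures per capita
    "Bellevue WA": 6.10,
    "Long Beach CA": 12.40,
    "Coral Springs FL": 13.80,
    "Westminster CO": 16.20,
    "Scottsdale AZ": 17.50,
}
newport_beach = 5.20

values = list(peer_values.values())
print(f"Group mean:   {mean(values):.2f}")    # 13.20
print(f"Group median: {median(values):.2f}")  # 13.80

# For a cost measure, count how many peers Newport Beach undercuts
lower_than = sum(1 for v in values if newport_beach < v)
print(f"Newport Beach is lower than {lower_than} of {len(values)} peers")
```

The median is less sensitive than the mean to one unusually high or low reporting city, which is one reason both are shown in the graphs.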
Benchmarking also requires that comparable, "clean" data be available in order to achieve legitimacy. Clean data
is defined as data that has been "scrubbed," or verified for accuracy and comparability. For example, CPM has a
large staff devoted to data cleaning. This means they engage in ongoing verification, along with regular meetings
with groups of cities who discuss in detail the intricacies of their operations so that any differences can be noted
and reflected in the presentation of data. Independent data collection by city staff can become a tremendous
burden when data cleaning is involved.
The information presented in this report was drawn from Newport Beach financial and performance data, and
was analyzed to a level that provides reasonable assurance of its accuracy and reliability; however, the
information was not audited or subjected to the same level of testing, or data cleaning that CPM data receives.
In the future, as the organization integrates more thoroughly into the CPM group of cities, Newport Beach data
will be cleaned and its validation will be assured to that level.
Finally, there are some general cautions that should be applied to any benchmark data. The Governmental
Accounting Standards Board (GASB) issues the following advice, excerpted from the GASB publication Reporting
Performance Information: Suggested Criteria for Effective Communication:
Criterion Eleven - Comparisons for Assessing Performance
"The two staples of comparative performance reporting are time series and comparison against targets. Users are
also interested in comparing their government with other, similar governments or industry standards.
Comparisons with other organizations can be very informative as long as any significant differences in measures
or circumstances that have been identified are noted.
Care should be taken to ensure that reported information is comparable. (The same measures are measured in
the same way and use reliable data.) If the information is not truly comparable, either no comparisons should be
reported or sufficient explanation should be provided to describe how the information is not comparable.
Care should also be taken when selecting the pool of organizations with which to make interorganizational
comparisons. Efforts should be made to select organizations that are similar or comparable to the organization
making the comparison. Care should also be taken to ensure that comparative information about other
organizations was collected in a reliable manner.
The ICMA Comparative Performance Measurement Project is one effort that addresses the comparative problem.
For this project over 200 cities and counties have agreed on measurement methods for a selected number of
performance measures. The organizations were trained on how to develop and report the selected performance
measures and testing is done to check whether there is a reasonable degree of comparability among the data
reported."
This advisory is offered not to minimize the usefulness of benchmark data, but to indicate that caution should always
be exercised when comparing performance between organizations.
Benchmark Measures
The following is a summary of 34 benchmark measures covering 22 different service areas within 12 different
departments. They are grouped by department and service area. Originally, 30 service areas were selected for
measurement. The eight service areas not included in benchmarking are being measured internally, but not
benchmarked externally at this point in time. The decision not to benchmark those service areas was based upon a
number of factors, but primarily lack of readily accessible data.
Detailed comparison data is provided for 28 of the 34 measures listed. Six measures are presented with no data;
they have been included, despite the lack of data, because of the importance of the measure to the city. In those
cases, a range of options is suggested, from creation of a single issue benchmark group to the use of third party
data sources not aligned with CPM or peer city reporting.
Specific benchmark data is depicted in a series of graphs which follow the summary. Unless otherwise noted, all data
is from fiscal year 2005-2006, the most recent year for which comprehensive and "clean" data was readily available.
The data largely speaks for itself, therefore no interpretation or evaluation of the results is included beyond
footnotes.
The graphs are color coded to aid in reviewing the comparison data. As mentioned previously, peer cities have been
supplemented by other jurisdictions within the CPM database to provide a reasonable comparison group. Cities
chosen for inclusion were jurisdictions of similar size, and have community characteristics deemed important by the
City Council.
Summary of Benchmark Measures - by Department and Service Area
The following is a summary of the 34 benchmark measures presented for consideration, listed by department and service
area. Benchmark data is provided on 28 of the 34 measures in twenty-two different service areas across twelve
departments.
I. Administrative Services
- IT Services: I-a.) IT Operations and Maintenance (O&M) expenditures as percentage of jurisdiction O&M
- Purchasing: I-b.) Purchasing administrative cost as percentage of total purchases

II. City Attorney's Office
- II-a.) Cost of representation by in-house OCA compared with benchmark cities and outside firms

III. City Manager's Office
- Code Compliance: III-a.) Code Compliance expenditures per capita; III-b.) Average days from complaint to first non-inspection response
- Water Quality/Environmental Protection: III-c.) Limited comparison data; recommend using external beach ratings

IV. Development Services
- Plan Check: IV-a.) Percentage of first plan reviews completed in 30 days (recommend development of California benchmarking group)

V. Fire
- Fire Emergency Response: V-a.) Percentage of responses within 5 minutes; V-b.) Percentage of fires confined to room/structure of origin; V-c.) Fire & EMS operating expenditures per 1,000 population; V-d.) Fire & EMS staffing - FTEs per 1,000 population

VI. General Services
- Maintenance of Beaches & Piers: VI-a.) No comparison data; recommend starting a benchmarking group
- Park Maintenance: VI-b.) Park maintenance expenditures per developed acre
- Refuse Collection: VI-c.) Resident rating of refuse service; VI-d.) Refuse collection costs per account (regional comparison)
- Street Sweeping: VI-e.) Street sweeping expenditures per residential curb-mile swept

VII. Human Resources
- Employment Services: VII-a.) Cost of central HR per City FTE; VII-b.) Average days to complete internal recruitment

VIII. Library
- Library Customer Service: VIII-a.) Reference transactions per FTE reference staff; VIII-b.) Materials expenditures per capita

IX. Police
- Police Crime Prevention, Police Emergency Response, and Investigation: IX-a.) Feeling of safety in neighborhood; IX-b.) Average response time from dispatch to arrival; IX-c.) Total Police O&M expenditures per 1,000 population; IX-d.) Total Police staffing - FTEs per 1,000 population; IX-e.) Percentage of UCR Part I violent crimes cleared

X. Public Works
- Capital Project Delivery: X-a.) Limited comparisons; recommend focusing on internal benchmarking until a reliable comparison group is established
- Traffic and Transportation Services: X-b.) Pavement Condition Index compared to Orange County cities; X-c.) Satisfaction with road condition

XI. Recreation and Senior Services
- Active Kids Program: XI-a.) No comparison data; recommend starting a benchmarking group
- Senior Transportation Services: XI-b.) Cost per trip (local benchmarking group initiated)

XII. Utilities
- Drinking Water Supply: XII-a. & b.) Percentage of days in full compliance with regulations (AWWA); XII-c.) Number of leaks/breaks per 100 miles of piping (AWWA)
- Wastewater Collection: XII-d. & e.) Overflows per 100 miles of collection system piping (AWWA)
I-a. IT Operating Expenditures as % of Total Jurisdiction Operating Expenditures
[Bar graph comparing Bellevue WA, Westminster CO, Santa Monica CA, Scottsdale AZ, Fishers IN, Coral Springs FL, ICMA Mean, Newport Beach CA, ICMA Median, Santa Barbara CA, and Manhattan Beach CA]
Readily available data for numerous peer cities from CPM database
Legend for All Graphs (unless noted otherwise, all data is for FY 2005-2006):
Newport Beach | Peer Cities | Other CPM Cities | CPM Median and Mean
[Chart: Purchasing O&M Cost as a Percentage of Total Purchases. Agencies shown, in order: Newport Beach CA; Palm Coast FL; Rockford IL; Sandy Springs GA; Schaumburg IL; North Richland Hills TX; ICMA Median; McAllen TX; Las Cruces NM; Davenport IA; ICMA Mean; Sterling Heights MI; Rowlett TX; Peoria AZ; Winter Haven FL]
Available comparison data from CPM database; peer cities not reporting
II.a) City Attorney's Office - Cost of representation by in-house OCA compared to benchmark cities and firms
No benchmark data is available for this measure from either CPM or any other readily available source. The City Attorney
requested that this measure be established, and has indicated that he will initiate the development of a benchmark
group for the purpose of providing meaningful data on this measure.
III.a) City Manager's Office/Code Compliance - Expenditures per Capita
[Chart: All Code Enforcement Categories - Expenditures per Capita. Agencies shown, in order: Newport Beach CA; Bellevue WA; Sterling Heights MI; Virginia Beach VA; Evanston IL; ICMA Median; Skokie IL; Long Beach CA; Coral Springs FL; ICMA Mean; Westminster CO; Scottsdale AZ; University Park TX; Santa Monica CA; Palm Coast FL]
Readily available data for numerous peer cities from CPM database
[Chart: Zoning Code Violation - Average Days from Complaint to First Non-Inspection Response. Agencies shown, in order: Scottsdale AZ; Met. Park IL; Newport Beach CA; Coral Springs FL; Johnson City TN; Lombard IL; Skokie IL; Suwanee GA; ICMA Median; Des Moines IA; Queen Creek AZ; Fort Collins CO; Schaumburg IL; Winter Haven FL; ICMA Mean; Long Beach CA]
Readily available comparison data from CPM database; limited peer cities reporting
III.b) City Manager's Office/Water Quality/Environmental Protection
No ICMA comparison data is available for this service area, and there is limited data generally on this specific service
area. Due to its importance to the City Council and other stakeholders, however, it is recommended that the City
develop external data sources to enable objective comparison of selected measures of ocean and bay water safety.
There are two sources of comparison data recommended for consideration.
One source is the "Beach Report Card" produced for the last ten years by Heal the Bay, a private, non-profit
organization which compiles and reports results of water quality monitoring performed by California health agencies.
Regular monitoring of the levels of three potentially hazardous water-borne organisms is required under California law
(AB 411).
Heal the Bay converts this data into an equivalent "letter grade" and publishes the results for nearly 500 monitoring
locations throughout California. Reports are produced annually each May, and end of summer reports are produced in
September. The 2008 Annual Report published results for approximately 35 beach and bay areas within Newport
Beach. All of the Newport Beach and Bay areas with data received an "A +" or "A" grade for dry weather readings.
Heal the Bay describes the significance of their ratings as follows: "The better the grade a beach receives, the lower the
risk of illness to ocean users. The report is not designed to measure the amount of trash or toxins found at beaches."
A second source of beach water quality benchmarks is the Natural Resources Defense Council (NRDC) annual survey of
water quality and public notification at 200 popular U.S. beaches. This survey provides ratings of two beaches
maintained by the City of Newport Beach. The table below is excerpted from the NRDC website for illustration
purposes:
NRDC Ratings for a Sample of 200 Popular Swimming Beaches in the United States
Each of the benchmark data sources has strengths and limitations, but both offer a readily available source of comparison
information on observed and reported water quality information as well as the reliability of advisory postings or beach
closings. A combination of the two sources should be considered to provide a comprehensive picture of water quality
conditions as well as the effective communication and management of beach hazards as they may occur.
The recommended future benchmarks are: 1) Percentage of Newport Beach beaches rated A-minus or above by the
annual Heal the Bay Beach Report Card, and 2) Average Star Rating (5-star maximum) of City-maintained beaches by
the annual NRDC Annual Beach Report.
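For illustration, the two recommended benchmarks reduce to simple aggregations over published grades and star counts. The Python sketch below uses made-up grades and star ratings, not actual Heal the Bay or NRDC results.

```python
# Sketch of the two recommended water-quality benchmarks.
# Grades and star ratings below are illustrative, not actual survey results.

def pct_rated_a_minus_or_above(grades):
    """Benchmark 1: percentage of monitored locations graded A- or better."""
    passing = {"A+", "A", "A-"}
    return 100.0 * sum(g in passing for g in grades) / len(grades)

def average_star_rating(star_counts):
    """Benchmark 2: mean NRDC star rating (5-star maximum) of City beaches."""
    return sum(star_counts) / len(star_counts)

grades = ["A+", "A", "A", "A-", "B"]   # illustrative dry-weather letter grades
stars = [5, 4]                         # illustrative NRDC star counts

print(round(pct_rated_a_minus_or_above(grades), 1))  # 80.0
print(average_star_rating(stars))                    # 4.5
```

Either calculation can be re-run each year against the new Heal the Bay and NRDC reports to track the benchmarks over time.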
Beach | Rating | % fail to meet water quality health standards (2008 / 2007 / 2006) | Fails less than 5%/year for 3 years | Water quality testing frequency | Always issues advisories promptly | Posts closings/advisories online and at beach
Newport Beach at Balboa Pier | * | 1 / 1 / 1 | YES | 5/week | YES | YES
Newport Beach at Newport Pier | * | 0 / 0 / 0 | YES | 5/week | YES | YES
Key: * Water quality, 2008 | * Water quality, last 3 years | * Water quality testing frequency | * Always issues advisories promptly | * Posts closings/advisories online and at beach. Each star indicates that this beach met a specific standard in 2008.
IV.a) Development Services/Plan Review - % First Plan Reviews Completed Within 30 Days
Limited comparable data is readily available for this service area. The measure stated above is a standard used by cities
within the CPM database; however, these measurement parameters are not used by the City of Newport Beach, so no
meaningful comparison can be presented.
Due to its importance to the City Council and other stakeholders, however, it is recommended that Newport Beach
explore the development of a comparison group in California, focusing on cities with similar profiles and development
characteristics. Plan review responsiveness, an issue identified by the City Council, is measured by many jurisdictions;
however, the manner in which it is measured varies greatly. Therefore, it is recommended that Newport Beach
continue to seek comparison jurisdictions that currently, or in the future, may collect data on the completion of initial
plan reviews within 30 days. At present, the Cities of Palo Alto and Santa Barbara publish data that is apparently
comparable. Efforts should be made to determine whether a complete benchmarking group can be identified.
V.a) Fire/Emergency Response - Response Time - Call Entry to Arrival
[Chart: Percentage of Emergency Fire Calls with a Response Time of 5 Minutes and Under - Call Entry to Arrival. Agencies shown, in order: Johnson City TN; McAllen TX; Sterling Heights MI; Tyler TX; Newport Beach CA; Westminster CO; Eugene OR; ICMA Median; ICMA Mean; Farmers Branch TX; DeKalb IL; Bellevue WA; Corvallis OR; Virginia Beach VA]
Readily available comparison data from CPM database; limited peer cities reporting
Note: Newport Beach data includes time from local station notification to arrival; future reports will capture Newport Beach data from call entry to arrival.
V.b) Fire/Emergency Response - Fire Containment
[Chart: All Structure Fire Incidents - Percentage of Fires Confined to Room/Structure of Origin. Agencies shown, in order: Westminster CO; Bellevue WA; Newport Beach CA; Virginia Beach VA; Alexandria VA; Long Beach CA; Tifton GA; Woodstock GA; Matanuska-Susitna AK; Savannah GA; Kirkland WA; ICMA Median; ICMA Mean; East Providence RI; Peoria IL]
Readily available comparison data from CPM database; limited peer cities reporting
Fire Department footnote: The primary function of the public fire service has always been to limit fires to the "Building of Origin." Excellence in fire protection can be measured in a department's ability to limit a fire to the "Room of Origin." Factors that directly contribute to this measurement include response time, staffing adequacy and effective equipment.
[Chart: Fire & EMS Operating Expenditures per 1,000 Population. Agencies shown, in order: ICMA Median; ICMA Mean; Evanston IL; Huntington Beach CA; Santa Cruz CA; Ventura CA; Long Beach CA; Bellevue WA; Santa Barbara CA; Manhattan Beach CA; Santa Monica CA; Newport Beach CA; Palo Alto CA]
Readily available data for numerous peer cities from CPM database
Note: Newport Beach data excludes all Lifeguard expenses with the exception of some Training Division expenses for Lifeguard/Jr. Lifeguard training which could not be separated for this report.
[Chart: Fire & EMS FTEs per 1,000 Population. Agencies shown, in order: Bellevue WA; Palo Alto CA; Fishers IN; Evanston IL; ICMA Mean; ICMA Median; Westminster CO; Newport Beach CA; Virginia Beach VA; Santa Monica CA; Coral Springs FL; Scottsdale AZ; Santa Barbara CA; Manhattan Beach CA; Huntington Beach CA]
Readily available data for numerous peer cities from CPM database
Note: Newport Beach data excludes Lifeguard, Jr. Lifeguard and related training staff FTEs.
Fire Department footnote: Several factors determine the number of full-time employees per a percentage of the population. These include housing density, geography and staffing policies. Newport Beach has a mixture of housing types and densities. Newport Beach fire stations are located to minimize response time irrespective of the density of their response districts. Geography also influences fire department staffing. Without the division of the City by the harbor and Back Bay, it would be possible to meet our response time objectives with two fewer stations, which would lower the FTE/1,000 ratio from 1.61 to 1.33.
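The staffing arithmetic behind the FTE-per-1,000 ratio can be sketched briefly. The population figure below is an assumption for illustration only and is not taken from the report.

```python
# Illustrative check of the Fire Department footnote's FTE/1,000 arithmetic.
# The population figure is an assumption for this example, not from the report.

def ftes_per_1000(total_ftes, population):
    """Staffing ratio: full-time-equivalent employees per 1,000 residents."""
    return total_ftes / (population / 1000.0)

population = 84_000                       # assumed resident population
current_ftes = 1.61 * population / 1000   # implied staffing at the 1.61 ratio
reduced_ftes = 1.33 * population / 1000   # implied staffing at the 1.33 ratio

# FTEs attributable to the two additional stations needed because the harbor
# and Back Bay divide the City's response districts:
print(round(current_ftes - reduced_ftes, 1))              # 23.5
print(round(ftes_per_1000(reduced_ftes, population), 2))  # 1.33
```

Under this assumed population, moving from a 1.61 to a 1.33 ratio corresponds to roughly two dozen FTEs, i.e., the crews of two stations.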
VI.a) General Services/Beach and Pier Maintenance
No comparison data is available for this service area. Due to its importance to the City Council and other stakeholders,
however, it is recommended that measures be developed for objectively assessing the condition of beaches and piers,
and further, that Newport Beach initiate the development of a comparison group in California.
VI.b) General Services/Park Maintenance - Expenditures per Developed Acre
[Chart: Park Maintenance Expenditures per Developed Acre. Agencies shown, in order: Palm Coast FL; Rowlett TX; Newport Beach CA; Bellevue WA; Fishers IN; Peachtree City GA; Santa Barbara County CA; ICMA Median; North Richland Hills TX; Suwanee GA; Savannah GA; ICMA Mean; Evanston IL; Queen Creek AZ; Coral Springs FL]
Readily available comparison data from CPM database; mix of peer cities and others reporting
VI.c) General Services/Refuse Collection - Resident Rating of Refuse Service
[Chart: Citizen Rating of Residential Refuse Collection Service (Excellent + Good); Newport Beach data from 2007 Resident Survey. Agencies shown, in order: Urbandale IA; Newport News VA; Newport Beach CA; Longmont CO; Savannah GA; Coral Springs FL; Kirkland WA; Tamarac FL; Scottsdale AZ; University Place WA; ICMA Median; ICMA Mean; Sterling Heights MI; Davenport IA; Palm Coast FL; Marietta GA]
Readily available comparison data from CPM database; limited peer cities reporting
VI.d) General Services/Refuse Collection - Costs per Account (Regional Comparison)
There is currently no ability to present meaningful comparisons on this important benchmark measure. In order to benchmark the
per-account cost of refuse collection/disposal, the General Services Department will need to utilize an annual rate study of regional
municipalities conducted by an independent third party. This approach will allow for a meaningful cost comparison taking into
account the operating constraints faced by all Los Angeles and Orange County cities, such as AQMD restrictions, minimum waste
diversion requirements, and County landfill fees and requirements. The following list of recommended benchmark cities includes
programs that utilize either in-house or contract staff and offer either source-separated or mixed waste recycling options in cities of
similar population.
Benchmark Cities: Buena Park, Claremont, Costa Mesa, Culver City, Huntington Beach, Irvine, La Palma, Orange, Pomona, Santa Monica, Stanton, Tustin
VI.e) General Services/Street Sweeping - Expenditures/Residential Curb-Miles Swept
[Chart: O&M Expenditures for Street Sweeping per Linear Mile Swept. Agencies shown, in order: Newport Beach CA; Long Beach CA; Scottsdale AZ; Benchmark Group Median; Coral Springs FL; Benchmark Group Mean; Westminster CO; Bellevue WA]
Readily available data for numerous peer cities from CPM database
VII.a) Human Resources/Employment Services - Central HR Expenditures per City FTE
[Chart: Human Resources Expenditures per FTE. Agencies shown, in order: Fishers IN; ICMA Median; Long Beach CA; ICMA Mean; Westminster CO; Coral Springs FL; Santa Monica CA; Santa Barbara CA; Bellevue WA; Scottsdale AZ; Newport Beach CA; Manhattan Beach CA; Palo Alto CA; Ventura CA]
Readily available data for numerous peer cities from CPM database
VII.b) Human Resources/Employment Services - Average Days - Internal Recruitment
[Chart: Average Number of Days to Complete Internal Recruitment (No Test). Agencies shown, in order: Sandy Springs GA; Coral Springs FL; Woodstock GA; Bellevue WA; Schaumburg IL; McHenry IL; ICMA Median; Highland Park IL; Newport Beach CA; St. Charles IL; ICMA Mean; Johnson City TN; Virginia Beach VA; Westminster CO; Des Moines IA]
Readily available comparison data from CPM database; mix of peer cities and others reporting
VIII.a) Library/Customer Service - Reference Transactions per FTE Reference Staff
[Chart: Reference Questions Answered per Full-Time Employee. Agencies shown, in order: Johnson City TN; ICMA Mean; Sterling Heights MI; Westminster CO; Newport Beach CA; Evanston IL; Scottsdale AZ; North Las Vegas NV; Farmers Branch TX; Virginia Beach VA; ICMA Median; Davenport IA; Salem OR; Newport News VA; Keller TX; Rowlett TX]
Readily available comparison data from CPM database; mix of peer cities and others reporting
Legend for All Graphs (Unless noted otherwise, all data is for FY 2005 -2006)
- Newport Beach
Peer Cities
Other CPM Cities
CPM Median and Mean
VIII.b) Library/Customer Service - Materials Expenditures per Capita
[Chart: Library Services Material Expenditures per Capita. Agencies shown, in order: Johnson City TN; Winter Haven FL; North Las Vegas NV; Smyrna GA; Sterling Heights MI; Salem OR; Long Beach CA; Westminster CO; ICMA Median; North Richland Hills TX; ICMA Mean; Davenport IA; Chesapeake VA; Virginia Beach VA; Scottsdale AZ; Newport Beach CA; Evanston IL]
Readily available comparison data from CPM database; mix of peer cities and others reporting
IX.b) Police/Emergency Response - Average Response Time
[Chart: Average Time from Receipt of Telephone Call to Arrival on Scene (in seconds). Agencies shown, in order: Peoria IL; Newport Beach CA; Coral Springs FL; North Richland Hills TX; Sioux City IA; Bellevue WA; Westminster CO; ICMA Median; ICMA Mean; Salem OR; Fishers IN; Eugene OR; Rowlett TX; Gainesville FL; Schaumburg IL]
Readily available data for numerous peer cities from CPM database
Police Department footnote: An emergency response is defined as an in-progress violent crime or any call for service which is in progress and is potentially life-threatening.
IX.c) Police/Emergency Response - Total Police O&M Expenditures per 1,000 Population
[Chart: Total Police O&M Expenditures per 1,000 Population. Agencies shown, in order: Fishers IN; Virginia Beach VA; ICMA Median; ICMA Mean; Evanston IL; Huntington Beach CA; Ventura CA; Bellevue WA; Santa Cruz CA; Santa Barbara CA; Palo Alto CA; Manhattan Beach CA; Newport Beach CA; Santa Monica CA]
IX.d) Police/Emergency Response - Total Police Staffing FTE per 1,000 Population
[Chart: Police Services FTEs per 1,000 Population. Agencies shown, in order: Santa Monica CA; Scottsdale AZ; Evanston IL; Newport Beach CA; Palo Alto CA; Manhattan Beach CA; Westminster CO; Santa Barbara CA; ICMA Mean; Coral Springs FL; Bellevue WA; ICMA Median; Virginia Beach VA; Huntington Beach CA; Fishers IN]
Readily available data for numerous peer cities from CPM database
IX.e) Police/Investigations - Part I Violent Crimes Cleared
[Chart: Percentage of UCR Part I Violent Crimes Cleared. Agencies shown, in order: Newport Beach CA; Coral Springs FL; Bellevue WA; Westminster CO; Gainesville FL; ICMA Mean; Sterling Heights MI; ICMA Median; Virginia Beach VA; Evanston IL; North Richland Hills TX; Schaumburg IL; North Las Vegas NV; Fishers IN; Fort Collins CO; Palm Coast FL]
Readily available comparison data from CPM database; mix of peer cities and others reporting
Police Department footnote: The FBI Uniform Crime Reporting (UCR) definition of violent crimes includes murder, negligent manslaughter, robbery, forcible rape, and aggravated assault.
X.a) Public Works/Capital Project Delivery
[Chart: Percentage of Capital Projects Completed On-Time and On-Budget, and Total Number of Projects Completed, FY 2007-08; Newport Beach and two comparison cities.]
Preliminary data from Newport Beach's CIP Monitoring System shows that of 23 projects completed in FY 2007-08, 78%
were on schedule and 83% were completed within Council-authorized funding. Limited benchmarking sources exist for
capital project completion performance. Two cities that have several years of performance measurement experience
are provided for reference. It is recommended that ongoing efforts be focused on year-to-year comparisons of internal
data until a reliable benchmarking group can be established.
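The on-schedule and on-budget percentages are straightforward aggregations over completed-project records. The sketch below uses hypothetical project records, not the actual FY 2007-08 project list from the CIP Monitoring System.

```python
# Sketch of the capital project delivery measures described above.
# Project records are illustrative, not the actual FY 2007-08 project list.

def completion_rates(projects):
    """Return (% on schedule, % within authorized funding, total completed)."""
    done = [p for p in projects if p["completed"]]
    on_time = 100.0 * sum(p["on_schedule"] for p in done) / len(done)
    on_budget = 100.0 * sum(p["within_budget"] for p in done) / len(done)
    return round(on_time), round(on_budget), len(done)

projects = [
    {"completed": True,  "on_schedule": True,  "within_budget": True},
    {"completed": True,  "on_schedule": False, "within_budget": True},
    {"completed": True,  "on_schedule": True,  "within_budget": False},
    {"completed": True,  "on_schedule": True,  "within_budget": True},
    {"completed": False, "on_schedule": False, "within_budget": False},  # still in progress
]
print(completion_rates(projects))  # (75, 75, 4)
```

Only completed projects enter the percentages, which matches the narrative's framing ("of 23 projects completed").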
X.b) Public Works/Traffic & Transportation - Pavement Condition Index Comparison to Orange County Cities
[Chart: Pavement Condition Index Comparison with Orange County Cities. Agencies shown, in order: Aliso Viejo CA; Irvine CA; Orange County CA; Dana Point CA; Anaheim CA; Newport Beach CA; Tustin CA; Lake Forest CA; Buena Park CA; Mission Viejo CA; Yorba Linda CA; San Juan Capistrano CA; Laguna Woods CA; Laguna Beach CA; Laguna Niguel CA; Costa Mesa CA; Rancho Santa Margarita CA; Seal Beach CA; Villa Park CA; Orange County CA Median; Orange County CA Mean; Placentia CA; Laguna Hills CA; Cypress CA; Fountain Valley CA; Brea CA; Fullerton CA; Orange CA; San Clemente CA; La Habra CA; Huntington Beach CA; Santa Ana CA; Westminster CA; Los Alamitos CA; Stanton CA]
Information gathered from Orange County cities; all use the same pavement condition evaluation methodology
Public Works footnote: All PCI are projected to 2005. Assumptions are that agencies have updated pavement maintenance and rehabilitation history since the last field survey.
X.c) Public Works/Traffic & Transportation - Satisfaction with Road Condition
[Chart: Citizen Satisfaction with Road Conditions (Good + Mostly Good). Agencies shown, in order: Plano TX; Bellevue WA; Coral Springs FL; University Place WA; Scottsdale AZ; North Richland Hills TX; Sterling Heights MI; Lynnwood WA; ICMA Median; Westminster CO; ICMA Mean; Cartersville GA; Newport Beach CA; Davenport IA]
Readily available comparison data from CPM database; mix of peer cities and others reporting
Note: Newport Beach data is from the initial Resident Satisfaction Survey (2007)
XI.a) Recreation & Senior Services/Active Kidz After School Program - % of Program Filled
The Newport Beach service area related to recreation programs is the "Active Kidz" after-school program. ICMA does not
collect performance data specifically for after-school recreation programming. Should Newport Beach desire to establish a
benchmark for Active Kidz, alternative sources of data will need to be developed, such as the establishment of a local
benchmarking group. Recreation staff has contacted several local jurisdictions but, to date, no comparable data has been
successfully developed.
XI.b) Recreation and Senior Services/Senior Transportation - Cost per Trip
ICMA does not collect performance data specifically for senior transportation programs. Newport Beach Recreation
staff has initiated a local benchmarking group of four other senior transportation programs in Orange County, and has
obtained comparable data on the programs' cost per trip. The initial data is presented below for Fiscal Year 2007-2008.
[Chart: Senior and Disabled Transportation Program Per-Trip Cost. Programs shown, in order: Huntington Beach CA; Newport Beach CA; Irvine CA; OCTA ACCESS; South County Senior Services]
Data provided is for Fiscal Year 2007-2008.
Note: All of the above Senior Transportation programs operate with paid staff except Huntington Beach, which utilizes
volunteer drivers.
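The per-trip cost measure itself is a simple ratio of operating cost to trips provided, which makes it easy for each program in the benchmarking group to report consistently. The sketch below uses invented program figures, not the group's actual data.

```python
# Sketch of the per-trip cost measure used by the local benchmarking group.
# Program names, costs, and trip counts are hypothetical examples.

def cost_per_trip(total_operating_cost, trips_provided):
    """Annual operating cost divided by annual one-way trips provided."""
    return total_operating_cost / trips_provided

programs = {
    "Program A": (180_000, 12_000),  # (annual operating cost $, annual trips)
    "Program B": (95_000, 3_800),
}

# Rank programs from lowest to highest cost per trip, as in the chart above.
for name, (cost, trips) in sorted(programs.items(),
                                  key=lambda kv: cost_per_trip(*kv[1])):
    print(f"{name}: ${cost_per_trip(cost, trips):.2f} per trip")
# Program A: $15.00 per trip
# Program B: $25.00 per trip
```

A program using volunteer drivers (as Huntington Beach does) will show a lower numerator, which is why the staffing note matters when comparing results.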
XII.a) Utilities/Drinking Water Supply
Water supply quality is one of the services critical to a community's physical and economic health.
The benchmarking source for water utilities is the American Water Works Association's "QualServe" program, whose
member agencies contribute data on up to 22 indicators and attributes of water service and wastewater collection.
Initial comparison with the AWWA's 19-city California region aggregate data for 2006-2007 shows Newport Beach
performing relatively well on two principal measures of drinking water supply service quality. Days in full compliance with
water quality standards is at 100%. Water distribution system integrity is within the top 25% of agencies reporting.
Cost data is currently not comparable with that of the AWWA jurisdictions because Newport Beach water supply and
wastewater collection functions do not include operation of treatment facilities. To provide useful cost comparisons,
Newport Beach will need to work with the AWWA and with other jurisdictions to develop a group of utilities with a similar
scope of operations.
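Placing a single agency within a benchmarking group's distribution (for example, "within the top 25% of agencies reporting") can be sketched as a percentile-rank calculation. The leak rates below are invented for illustration and are not AWWA data.

```python
# Sketch of ranking one agency against a benchmarking group's distribution.
# All leak/break rates below are invented, not actual AWWA QualServe data.

def percentile_rank(value, group, lower_is_better=True):
    """Percent of reporting agencies this value performs at least as well as."""
    if lower_is_better:
        at_or_better = sum(value <= g for g in group)
    else:
        at_or_better = sum(value >= g for g in group)
    return 100.0 * at_or_better / len(group)

# Hypothetical leaks/breaks per 100 miles of piping for a benchmark group:
group_leak_rates = [12.0, 18.5, 22.0, 25.4, 31.0, 40.2, 55.7, 60.1]
newport = 14.3  # hypothetical rate for the agency being compared

rank = percentile_rank(newport, group_leak_rates)
print(rank >= 75.0)  # True: within the top 25% of agencies reporting
```

For measures where higher values are better (such as days in full compliance), the `lower_is_better=False` branch applies instead.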
XII.b) Utilities/Drinking Water Supply - Percentage of Days in Full Compliance
Comparison to AWWA Benchmarking Group of 19 California Agencies, FY 2006-2007 Data
[Chart: Drinking Water - Percentage of days in full compliance. Shown: Newport Beach CA; 25th percentile; Median; 75th percentile]
From AWWA; only source of comparison data available for utilities measures
XII.c) Utilities/Drinking Water Supply - System Integrity: Number of Leaks per 100 Miles of Piping
Comparison to AWWA Benchmarking Group of 19 California Agencies, FY 2006-2007 Data
[Chart: Water Distribution System Integrity - Number of leaks/breaks per 100 miles of piping. Shown: Newport Beach CA; 25th percentile; Median; 75th percentile]
From AWWA; only source of comparison data available for utilities measures
XII.d) Utilities/Wastewater Collection
Wastewater collection is a service critical to a community's physical and economic health. As with the drinking water
supply service, a benchmarking source for wastewater collection is the American Water Works Association's
"QualServe" program.
At present the AWWA can provide benchmark data for one aspect of wastewater collection system quality: Sewer
Overflow Rate per 100 Miles of Piping. Initial comparison with the AWWA's 19-city California region aggregate data
shows the 2006-2007 Newport Beach overflow rate is well below the mean for the benchmark group. This benchmark
does not include 81 miles of sewer lateral piping for which Newport Beach also provides maintenance.
System integrity (system leaks and breaks per 100 miles of collection piping) is another measure of system quality;
however, no AWWA data was yet available for this measure. Cost data is currently not comparable to most AWWA
jurisdictions because Newport Beach does not operate sewage treatment facilities. To provide accurate cost
comparisons, Newport Beach would need to work with the AWWA and other jurisdictions to develop a benchmark
group of similar wastewater collection utilities.
XII.e) Utilities/Wastewater Collection - Overflows per 100 Miles of Piping
Comparison to AWWA Benchmarking Group of 19 California Agencies, FY 2006-2007 Data
[Chart: Wastewater Collection - Overflows per 100 miles of collection system piping. Shown: 75th percentile; Mean; Newport Beach CA]
From AWWA; only source of comparison data available for utilities measures
Benchmark Comparisons- Summary
In general, the City of Newport Beach appears to compare favorably on most measures surveyed in terms of quality, cost,
cycle time and customer satisfaction. While general conclusions may be drawn from the results reported, it is
recommended that the results be used as the first step in the review of operations and as a way of determining priorities
for new investment or process improvement.
IV. Observations /Recommendations
Observations
Over the course of the project ICMA consultants gained insights into staff capabilities and the challenges associated
with implementing and maintaining a performance measurement and benchmarking system. We offer the following
observations for your consideration:
1. The City has the necessary systems to support performance measurement. While working with IT, HR and
other support functions, it was apparent that systems were in place to provide necessary assistance. IT in
particular was very adaptable and flexible during the project.
2. Managing with data will be a significant change for the organization. How the City Council and managers use
data, the way it's reported to the community, as well as the various ways the city secures customer feedback
will all be new activities. This will require discussion and deliberation between Council and staff regarding
importance, approach and expectations.
3. The organization has been stressed by the development of a performance measurement system and the data
collection requirements. This is due in part to the scope of the project, and the lack of familiarity with
performance measurement. However, it also appears to relate to the workload and staff capacity in some
areas. This should be reviewed in greater detail to determine on -going needs and to create a plan for
sustaining the effort.
4. It appears that the organization has a very "lean" analytical capacity. While this is commendable in terms of
keeping costs low, it may have an impact on the organization's on -going assessment, collection and
maintenance of performance data. Newport Beach appears to have a more limited capacity than similarly
situated organizations. A Council -staff discussion regarding the distribution of responsibilities and analytical
capacity would be useful.
5. The staff is very committed and engaged. Over the course of a very difficult project, with many stops and
starts, the staff remained engaged and committed to getting the job done. In particular, the central team went
the extra mile to assist the consultants and work with department facilitators to finalize measures and explain
concepts.
Recommendations
Based upon the foregoing information and observations, the following recommendations are offered for your
consideration:
1. Continue refining measures and benchmarks. A great deal of effort has been committed to establishing a solid
base of measures for which data is currently being collected. Some measures will need to be refined as a better
understanding of the work emerges, and some benchmarks will be abandoned or changed as discussion of what's
important evolves. A measurement team should continue, and a person in each department should be
designated as the "measurement coordinator ".
2. Conduct a debriefing session with the core team and facilitators. The purpose of the session would be for the City
Manager and Department Heads to uncover issues and determine staff capacity to fully implement a
performance management system. This should be followed by a Council study session regarding the costs and
benefits of managing with data.
3. Decisions should be made regarding what measures and outcomes the Council will track on an on -going basis in
order to establish a clear linkage to strategic planning/goal setting sessions.
4. Conduct regular strategic planning. The City has committed to performance excellence, and to benchmarking
against "best in class" organizations. Developing a multi -year strategic plan with actionable objectives, linked to a
business plan and annual budget will bring Newport Beach in line with best practice in this area. This will also
help to define the strategic outcomes for measurement, and improve alignment with department operations.
5. Consider leading an effort with similarly situated cities to conduct benchmarking focused on key Newport
Beach measures. As noted, some of the benchmarking measures lack comparables. Although additional effort by
staff will expand the comparable data, many of the measures important to Newport Beach are not collected by
the peer cities. One remedy would be for Newport Beach to lead a benchmark group. ICMA has numerous
benchmark consortiums across the country. Newport Beach could set the standards for a southern California
group.
6. Continue regular community and business surveys. Surveys of customer requirements and satisfaction are crucial
to the measurement of various services. In addition, future surveys should be designed to elicit specific
information concerning customer requirements so that systems can be designed properly and data can be
collected to measure results related to meeting customer needs.
7. Conduct regular employee surveys. Much like the community and business surveys, an employee survey will
enable management to understand employee requirements - particularly as they relate to internal service
delivery and strategic alignment.
8. Adopt a five year Capital Improvement Program (CIP). Because capital project management is a high priority for
the City Council, and the staff delivers a high volume of projects, the establishment of a multi -year Capital
Improvement Program is a prudent course of action. This should become part of the business planning/budgeting
cycle.
The opportunity to assist the City of Newport Beach with this project is sincerely appreciated. The organization's
openness to suggestions and willingness to work to implement this system has been gratifying.
Appendix A
Performance measures for thirty service areas are contained in this Appendix. The vast majority of the measures will be used to track organizational performance and will provide valuable information to staff for continuous improvement efforts. As time passes and experience with performance measurement increases, a number of measures are likely to change to better reflect operational needs and management requirements. This is a natural evolution and should be expected as the system becomes integrated into day-to-day operations.
The City Council previously approved a preliminary list of these measures in September 2008. Over the intervening months some of the measures have been modified to better fit internal needs and/or reporting requirements.
City of Newport Beach Performance Measurement and Benchmarking Project
Table of Contents

Department / Service (Tab)

Administrative Services
  Budgeting (3)
  Information Technology Services (4)
  Purchase Requisitioning / Purchase Ordering (5)
City Attorney's Office
  Contract Approval (6)
  Litigation Matters (7)
City Clerk
  Records Management (8)
City Manager's Office
  Community Code Compliance (9)
  Public Information / Communication (10)
  Water Quality / Environmental Protection (11)
Development Services
  Plan Check (12)
  Building Inspection (13)
Fire
  Fire Emergency Response (14)
  Fire and Life Safety Complaint Resolution (15)
General Services
  Beach and Pier Maintenance (16)
  Park Maintenance (17)
  Refuse Collection (18)
  Street Sweeping (19)
Human Resources
  Employment Services (20)
  Employee Relations (21)
Library
  Library Customer Service (22)
Police
  Crime Prevention (23)
  Police Emergency Response (24)
  Police Investigations (25)
Public Works
  Capital Improvement Plan Delivery (26)
  Traffic and Transportation Management (27)
Recreation and Senior Services
  After School Programming - Active Kidz Program (28)
  Senior Transportation Services (29)
Utilities
  Drinking Water Supply (30)
  Electrical Services (31)
  Wastewater Collection (32)
• Development Services measures will be aligned into the two major phases of development review, plan check and building inspection, and include the services delivered by the Planning, Building, Public Works, and Fire Departments in the process. The individual services are detailed below by partner department.
Department / Service
Building
  Plan Check
  Building Inspection
Planning
  Discretionary Land Use and Development Approvals
  Plan Check Processing
Fire
  Fire Department Plan Review
Public Works
  Development Review Process
ADMINISTRATIVE SERVICES
BUDGETING
To develop and administer a financial operating plan that balances short-term priorities with long-term stability, flexibility and sustainability while addressing the operating needs of our departments.
Proposed Measures
Quality
• Contingency Reserves and other relevant Council reserve policies are fully funded.
• Contributions toward unfunded liabilities meet or exceed the actuarial and/or City Council policy requirement for each program.
• The City has received the GFOA Award for CAFR Excellence.
• The minimum contributions to ensure the sustainability of the facilities financing plan are being met (as defined by the last approved facilities financing plan).
Cost
• None proposed.
Cycle Time
• Budget adopted per internal budget calendar deadlines.
Customer Satisfaction
• Survey of internal customer satisfaction as part of the proposed biannual employee survey.
Evaluative Criteria
Meaningful
These measures will provide information that can be used to monitor and communicate the extent of compliance with budget and financial policies and goals.
Useful
These measures will monitor the quality of service we provide in addition to internal customer awareness and satisfaction.
Sustainable
These measures can be tracked easily by staff. The proposed biannual employee survey will be used to elicit input from our internal
customers regarding their satisfaction with our services.
ADMINISTRATIVE SERVICES
INFORMATION TECHNOLOGY SERVICES
Provide enterprise access to City information and services in support of our customers.
Proposed Measures
Quality
• Percentage of time the City's network and major subsystems such as email, voicemail, the Pentamation financial package, etc. are available for use
Cost
• Cost of central network support per hour the network is available (for each major sub-system)
• Cost of network support per workstation
• IT Operations and Maintenance expenditures as % of Total City O&M $ B
Cycle Time
• Timely resolution of IT issues (by priority):
  • Priority 1: A problem impacting a significant group of users, or any mission-critical IT service affecting a single user or group of users. (30 minutes)
  • Priority 2: Non-critical but significant issue, or an issue that is degrading performance and reliability while services are still operational; support issues that could escalate if not addressed quickly. (60 minutes)
  • Priority 3: Routine support requests that impact a single user, or a non-critical software or hardware error. (8 hours)
Customer Satisfaction
• Biannual survey of internal customers (City Staff) rating their satisfaction with quality and timeliness of IT services provided.
Evaluative Criteria
Meaningful
Information and technology services are critical to the successful delivery of most City services. Data on the availability of major IT systems and the timely resolution of any issues are increasingly important to evaluate not only the IT function, but the service delivery of an entire organization.
Useful
These measures will monitor the quality, cost, and timeliness of our technology services and obtain user request trends to identify areas
of improvement needed. Benchmarking Sources: ICMA CPM.
Sustainable
Other than city-wide surveys, these measures can be tracked easily by in-house methods. Reporting Frequency: each Fiscal Year. It is recommended that a biannual city-wide employee survey be developed to elicit input from our internal customers regarding their satisfaction with our services.
ADMINISTRATIVE SERVICES
PURCHASE REQUISITIONING / PURCHASE ORDERING
Purchasing enables City Departments to meet their needs with quality goods and other resources at competitive prices while complying with State law, Council policy, City Charter, and Municipal Code requirements.
Proposed Measures
Quality
• Percent of supplier bids awarded without delay due to re-bid or protest.
Cost
• Dollars saved on purchase orders compared to total cost submitted on requisitions
• Administrative cost of purchasing/warehouse vs. total purchases made ($) B
Cycle Time
• Average number of calendar days from receipt of requisition to Purchase Order issuance for bid amounts of $10,000 to $25,000
Customer Satisfaction
• Biannual survey of internal customers (City Staff) rating their satisfaction with centralized purchase order processing
Evaluative Criteria
Meaningful
These measures will monitor the quality, timeliness, and efficiency of services that we provide in addition to the information on quantity
outputs now reported.
Useful
Measurement results will be used to refine our purchasing methods as needed. These measures will provide information that is easily compared with other organizations and provides a management tool to monitor our annual workloads and production. Benchmarking Sources: ICMA CPM, California Association of Public Purchasing Officers (CAPPO)
Sustainable
These measures are already tracked by staff. Recommend a biannual employee satisfaction survey be developed to elicit input from our internal customers. Reporting Frequency: each Fiscal Year
CITY ATTORNEY'S OFFICE
CONTRACT APPROVAL
Ensure legally enforceable and sound contracts, in accordance with Council Policy F-14, between the City and contractors for outsourced projects or services, in conjunction with timely review, comment and/or approval.
Proposed Measures
Quality
• None proposed
Cost
• Cost of representation by in-house Office of City Attorney compared with benchmark cities and outside firms B
Cycle Time
• None proposed
Customer Satisfaction
• Satisfaction of customers (City staff, Council & Commissions) with legal advice/services:
  • Responsiveness by OCA to form contract review and approval as to form
  • Responsiveness by OCA to complex contract and Ordinance review and approval as to form
Evaluative Criteria
Meaningful
Useful
Supports decision making on staffing levels for the City. Reporting Frequency: As needed, at least annually, or as requested by Council.
Sustainable
Cannot collect data on turnaround times with current systems; instead, OCA is developing a City-wide Contract Standard Operating Procedure (SOP) for staff and "Contract Gatekeeper" training to improve department understanding of the contract process and requirements, to ultimately decrease the number of delays. Will measure customer satisfaction with OCA's responsiveness on contract review and approval using an annual staff and management survey.
CITY ATTORNEY'S OFFICE
LITIGATION MATTERS
Represent the City's interests before all courts and administrative bodies and act as the City's chief prosecutor in all code enforcement matters.
Proposed Measures
Quality
• % of cases resolved in favor of the City and/or within Council-approved settlement range
• % of Code Enforcement cases referred to OCA which result in substantial compliance
Cost
• Cost per case litigated
Cycle Time
• None proposed
Customer Satisfaction
• Customer satisfaction with OCA efforts to prosecute municipal code offenders
• Customer satisfaction with OCA efforts to protect the City's interests in Court
• Customer satisfaction with being timely informed of significant developments in a case
Evaluative Criteria
Meaningful
Annual survey will provide increased communication with customers (staff, council, commissions).
Useful
Increased awareness of enforcement process by the customers, increased compliance by citizens. Reporting Frequency: Annually or as
required by City Council
Sustainable
CITY CLERK
RECORDS MANAGEMENT
To manage City records so they are accessible and maintained efficiently and cost - effectively, and so the
City retains records it is required to keep, identifies and preserves records with permanent value, and
disposes of those with no further value in a secure and timely manner.
Proposed Measures
Quality
Note: Availability and timely access to required records are the principal measures of quality for
this service. The proposed Cycle Time measures (see below) will cover this aspect of service deliv-
ery performance.
Cost
• Overall expenditures to manage records
• Annual revenue from fees/charges.
Cycle Time
• Internal Customers: % of permanent documents (ordinances, resolutions, minutes, agenda packets, etc.) entered into Alchemy record system within six business days.
• All Customers: Provide requested paper or electronic documents within time targets (TBD based
on type, accessibility and complexity of request)
Customer Satisfaction
• Biannual survey of internal customers (City Staff):
Customer satisfaction relative to record availability, timeliness, courtesy as well as convenience
using online document search engine
Evaluative Criteria
Meaningful
Maintaining public records is an essential job function of the City Clerk's office. The records provide information relative to Newport
Beach's history, regulations, and policies. These measures provide information that the City Clerk's office can use in order to monitor
efficiency in staff time and resources.
Useful
Customers may become more aware of the information that the City Clerk's office provides. City Clerk staff will become more aware of customer satisfaction and expectations. City Clerk staff will also be able to determine improvements that can be made to the document scanning system or web-based access. Cost of records management and annual revenues from fees and charges may be benchmarked against local jurisdictions (TBD).
Sustainable
The information is already available and accessible to the public and City staff. However, the City Clerk's office will work with IT to implement a tracking system for online documents, a tracking system for Alchemy documents, and online feedback to coincide with the benchmarking process. The City Clerk's office will develop a database to determine what type of information is being requested by staff or the public, and the turnaround time to produce the information. Reporting Frequency: each Fiscal Year.
CITY MANAGER
COMMUNITY CODE COMPLIANCE
Preserve community character, protect the environment, protect property rights and promote compliance with
land use regulations.
Proposed Measures
Quality
• Rates of complaint resolution/compliance obtained through:
  • Phone call, meeting, letter
  • Notice of Violation (NOV)
  • Administrative Citation
Cost
• Code Enforcement expenditures per capita.
Cycle Time
• Average number of calendar days from the time a complaint is first reported to the first non-inspection response. (A non-inspection response is defined as a response to a complainant that does not include an on-site inspection, i.e. phone calls, letters, emails, drive-by/visual inspection and other similar type responses) B
• Average number of calendar days from first inspection to resolution through:
  • Phone call, meeting, letter
  • Notice of Violation (NOV)
  • Administrative Citation
Customer Satisfaction
• Percent of residents satisfied with code enforcement/compliance services (using biennial resident survey results).
Evaluative Criteria
Meaningful
These measures provide information relative to how effectively we do our job given the tools and resources available to us.
Useful
These measures can monitor the efficiency of our time and resources. In addition, the cost measure will be a useful tool providing City
management with comparable cost data for this division. Benchmarking Sources: ICMA CPM database. Reporting Frequency: Annually
Sustainable
Staff is working to develop a system to better track information needed to document these measures. Additional staff time will be
needed to track and report on these measures.
CITY MANAGER
PUBLIC INFORMATION / COMMUNICATION
The purpose is to ensure that residents, businesses and City staff are well-informed about City issues, projects and policies.
Proposed Measures
Quality
• Percentage of customers that were able to find the information desired on the City Web site
• City web site usage —% of total services transacted on line.
Cost
None proposed
Cycle Time
• Frequency of postings/edits to City Web site
Customer Satisfaction
• Percent of customer satisfaction with City efforts to keep residents informed about local issues
• Percent of customer satisfaction with the quality of NBTV programming
• Percent of customer satisfaction with the quality of the City Web site
Evaluative Criteria
Meaningful
Effective communication is a City Council priority and essential to serving the Newport Beach community.
Useful
The customer satisfaction survey, web surveys and focus groups will enable us to evaluate the effectiveness of our communication tools and processes and focus funding and resources on those most valued by our community. Reporting Frequency: Annually
Sustainable
The customer satisfaction measures can be tracked with future community surveys or through focus groups. The majority of the communication measures can be evaluated using data currently collected by the department. Web-based surveys could also be created to measure customer satisfaction with finding information on the City website.
CITY MANAGER
WATER QUALITY / ENVIRONMENTAL PROTECTION
Preserve the safety and quality of Newport Beach's bay and ocean waters through compliance with applicable regulations and required monitoring.
Proposed Measures
Quality
• # of Beach Mile Days when beach water quality warnings are posted during the summer (AB411) period
• % of 35 City beaches rated A-minus or better by annual Heal the Bay Report Card B
• Average Star Rating (5 Star maximum) of City Beaches by the Natural Resources Defense Council Annual Beach Report (Newport Beach at Balboa Pier and Newport Beach at Newport Pier) B
Cost
• Costs for compliance with NPDES permit regulations
Cycle Time
• None recommended
Customer Satisfaction
• Percent of residents satisfied with beach and bay water quality (per biennial resident survey results)
Evaluative Criteria
Meaningful
Safety of and access to beaches and waterways directly impacts human and aquatic health, and the local economy. Recommend use of the annual Natural Resources Defense Council (NRDC) beach rating as a benchmarking source. NRDC provides a 5-star rating guide for 200 of the nation's most popular beaches, based on indicators of beach water quality, monitoring frequency, and public notification of contamination. Both of the City's ocean beaches rated in 2008 received 5 stars. An additional source is the Heal the Bay Beach Report Card, which awards letter grades to 35 City beaches and comparison to nearly 500 California beaches on monitored water quality.
Useful
Compliance with water quality standards is required by federal, state and local laws and regulations. Comparison data will allow benchmarking of Newport Beach water quality and management practices to local and other beaches. In addition, the cost measure will be a useful tool providing City management with comparable cost data for water quality. Reporting Frequency: Annually
Sustainable
Recommend continuation of City Resident Survey to sustain customer satisfaction information. Water quality monitoring data is cur-
rently collected and available and is routinely reported to regulatory agencies. Benchmarking data is readily available from exter-
nally produced annual reports.
DEVELOPMENT SERVICES
PLAN CHECK
To implement State and City policies and regulations, protect the value of real property and improvements, and ensure the safety of building occupants.
Proposed Measures
Quality
Quality
• % of projects approved over the counter
• % of projects approved after first review, after being taken in for review (by dept)
• % of corrections (by dept) required due to:
  • Incomplete submittal
  • Lack of code compliance
  • Staff error
  • Project changes
Cost
• Average number of review cycles, by project type*
• Application fees
• Budgeted expenditures per $1,000 valuation of permits issued
Cycle Time
• % of plan reviews (by dept) completed within target dates:
  - 90% of 1st reviews within 30 days B
  - 90% of rechecks within 14 days
• Number of days with City; number of days with applicant, by project type*
Customer Satisfaction
• Customer comment cards
• Biannual, statistically valid survey of Development Services customers.
* Project types: New commercial; Commercial additions & tenant improvements; Residential valued $200K+; Residential valued up to $200K
Evaluative Criteria
Quality
Meaningful: Shows the number of projects approved quickly and the reasons for corrections; correction reasons will not add to the total, because projects are not approved for more than one reason; measures quality of applicant submittals as well as City staff work; does not measure staff flexibility in administering regulations; cannot be compared with other cities. Useful: Informs applicants and Council why projects take longer than expected; highlights staff training needs. Sustainable: Tracking reasons for corrections will require customized programming, report formatting/preparation, and staff training/time to make proper entries in Permits Plus for every plan review.
Cost
Meaningful: Cost of service can be compared to other cities as a measure of efficiency, but fee structures, valuation tables, and complexity of projects may not be comparable. Useful: Can help with annual fee adjustments. Sustainable: Readily available information.
Cycle Time
Meaningful: Shows whether City is meeting targets, is comparable to other jurisdictions, and whether City or applicant is taking more time for projects with long duration. Useful: Help balance staff resources and workload; highlight staff training needs. Sustainable: Currently implementing customized programming, report formatting and preparation, and staff training and time to make proper entries in Permits Plus for every action. Limited comparable data is readily available; staff will evaluate the development of a benchmarking group within California.
Customer Satisfaction
Meaningful: Comment cards provide point-of-service reactions; and survey provides reliable data. Useful: Comment card responses should be tabulated to increase usefulness. Survey is most useful if it includes a statistically significant sample of everyone who has used plan check services during the year. Sustainable: Tabulating comment card responses requires staff time; survey requires appropriation for specialized consultant.
DEVELOPMENT SERVICES
BUILDING INSPECTION
The purpose of inspections is to implement State and City policies and regulations, protect the value of real property and improvements, and ensure the safety of building occupants.
Proposed Measures
Quality
• None proposed
Cost
• Application fees
• Budgeted expenditures per $1,000 valuation of permits issued
Cycle Time
• % of inspections provided on the day requested
• % of construction related complaints investigated on the next business day after the complaint is received
Customer Satisfaction
• Customer comment cards
• Biannual, statistically valid survey of Development Services customers.
Evaluative Criteria
Quality
None Proposed
Cost
Meaningful: Cost of service can be compared to other cities as a measure of efficiency, but fee structures, valuation tables, and com-
plexity of projects may not be comparable. Useful: Can help with annual fee adjustments. Sustainable: Readily available information.
Cycle Time
Meaningful: Shows Department's responsiveness to customer needs.
Useful: Help determine if staffing level is adequate.
Sustainable: Implementing significant programming enhancements and staff training to make entries and compile reports.
Customer Satisfaction
Meaningful: Meetings provide ongoing feedback; comment cards provide point of service reactions; and survey provides reliable
data. Useful: Comment card responses should be tabulated to increase usefulness. Survey is most useful, if it includes a statistically
significant sample of everyone who has used plan check services during the year. Sustainable: Tabulating comment card responses
requires staff time; survey requires appropriation for specialized consultant.
FIRE
FIRE EMERGENCY RESPONSE
To respond as quickly as possible, with appropriate resources, to stop and minimize the loss of life and property.
Proposed Measures
Quality
• % of structure fires confined to the room/structure of origin.
Cost
• Total Fire and EMS operating and vehicle expenditures (less lifeguard costs) per 1,000 pop. B
• Fire and EMS staffing (FTE) per 1,000 population. B
Cycle Time
• % of emergency calls in which the first unit arrives within 5 minutes of call entry by Fire Dispatch. B
• % of emergency calls in which the first paramedic unit arrives within 8 minutes of call entry by Fire Dispatch
• % of structural fire calls in which the first Truck Company arrives within 8 minutes of call entry by Fire Dispatch
Customer Satisfaction
• Biannual resident survey of satisfaction among those having contact with Fire Department
Evaluative Criteria
Meaningful
Will show city customers that the fire department responds to emergencies in an appropriate time frame in accordance with accepted
national standards.
Useful
Provides a tool to ascertain whether we are maintaining the proper time standards required for providing appropriate medical care
to patients and providing appropriate firefighting intervention in regards to structure fires.
Benchmarking Sources: ICMA; National Fire Protection Association (NFPA) Standard 1710; International Association of Fire Chiefs; California Fire Chiefs; Orange County Fire Chiefs Association; other fire departments in our area.
Sustainable
Response time data available from CAD records. Reporting Frequency: Annually
FIRE
FIRE AND LIFE SAFETY COMPLAINT RESOLUTION
To respond as quickly as possible, with appropriate resources, to stop and minimize the loss of life and property.
Proposed Measures
Quality
• None proposed
Cost
• Cost per complaint investigated.
Cycle Time
• % of non-life safety complaints responded to within 2 business days
• % of complaints resolved within 10 business days
• % of complaints resolved within 10-20 business days
• % of complaints unresolved
Customer Satisfaction
• Biannual resident survey of satisfaction among those having contact with Fire Department
Evaluative Criteria
Meaningful
Complaints from our customers are an important method by which the Fire Department learns of hazards in our community. Investigation of these complaints is a priority for the Fire Department's Fire Prevention Division.
Useful
The performance measures will allow the Fire Prevention Division to track the amount of time it takes to investigate complaints and
reduce any hazards identified. The measures will also allow the Fire Prevention Division to maintain a record of the number and type
of hazards identified through the complaint process and make changes to the Fire Code if necessary.
Sustainable
The Fire Prevention staff has developed a system by which complaints can be fully tracked. Reporting Frequency: Annually
GENERAL SERVICES
BEACH AND PIER MAINTENANCE
Maintain status as a world-class destination by keeping ocean & bay beaches and piers clean for patrons.
Proposed Measures
Quality
• # of service requests by type & area
  - Type: Infrastructure (restrooms, fire rings), Appearance (natural debris on sand, foreign debris on sand)
  - Area: Beach Areas 1-6, Little Corona, Balboa Island, Bayside/Mist
Cost
• Total annual maintenance cost
  - per beach-mile
  - per acre
• Staff-hours/month
Cycle Time
• None proposed
Customer Satisfaction
• Biannual resident survey of satisfaction with maintenance of beaches and piers.
Evaluative Criteria
Meaningful
Measures help to quantify some of the unique aspects of beach and bay maintenance.
Useful
Provide information to assist with self-evaluation and comparison considering unique conditions and services.
Sustainable
Most data is currently collected daily. Reporting Frequency: Annually
GENERAL SERVICES
PARK MAINTENANCE
Ensure active and passive enjoyment of open spaces and stewardship of valued City resources.
Proposed Measures
Quality
• # of service requests by type & facility:
  - Type: Infrastructure (restrooms, irrigation)
  - Appearance (natural debris in landscaping, foreign debris in landscaping, natural debris on turf, foreign debris on turf, foreign debris in native/sensitive areas)
Cost
• Maintenance cost/contract acre, Maintenance cost/in-house acre
• Maintenance cost/developed acre B
• Maintenance cost/undeveloped acre
Cycle Time
• None proposed
Customer Satisfaction
• Biannual resident survey of satisfaction with park maintenance.
Evaluative Criteria
Meaningful
Measures can be used to evaluate service delivery based on customer satisfaction and cost per acre.
Useful
Allows for maintenance cost of new park facilities to be estimated and existing park facilities service levels to be evaluated. Benchmarking Sources: ICMA CPM. Reporting Frequency: Annually
Sustainable
Most data is currently collected daily and reported monthly and annually.
GENERAL SERVICES
REFUSE COLLECTION
Mandated collection and disposal of residential refuse and the diversion of recyclable materials.
Proposed Measures
Quality
• Recyclables as % of all tons of residential refuse collected
• Number of non- routine issues /requests for service
Cost
• Operating and maintenance expenditures for refuse collection per refuse collection account
• Comparison of costs/expenditures to annual HF+F study selected cities in region B
Cycle Time
• % of non-routine issues/requests for service resolved by the next business day (city operated routes only)
Customer Satisfaction
• Biannual resident survey of satisfaction with refuse collection.
Evaluative Criteria
Meaningful
Measures provide information about how much service delivery costs and how the customers perceive the quality of the service.
Useful
The measures can be used to determine if the service delivery mandates are being met. Benchmarking sources: ICMA CPM and annual
HF +F cost /rate study of regional cities.
Sustainable
Most data is currently collected on an ongoing basis. Non - routine issue /request data can be collected without additional resources.
Recommend continuation of biannual Resident Survey.
GENERAL SERVICES
STREET SWEEPING
Keep debris out of roadways, alleys, beaches, and bays.
Proposed Measures
Quality
• Percent of streets swept on schedule
Cost
• Operating and maintenance expenditures per linear mile swept
Cycle Time
• Time between sweepings
Customer Satisfaction
• Biannual resident survey of satisfaction with street sweeping.
• Number of service requests.
Evaluative Criteria
Meaningful
Measures can be used to evaluate potential service delivery cycle changes.
Useful
Provides information on efficiency of service delivery at a given cost. Benchmarking source: ICMA
Sustainable
The data is currently collected daily and reported monthly and annually. Recommend continuation of biannual Resident Survey.
HUMAN RESOURCES
EMPLOYMENT SERVICES
The purpose of Employment Services is to deliver capable, quality employees and to minimize departmental workflow disruption due to turnover.
Proposed Measures
Quality
• Citywide Turnover Rate
• Citywide Vacancy Rate
• % of Employees passing probation.
Cost
• Cost of Central Human Resources staff per 100 City employees B
• Cost per Hire for:
  - Executive Recruitments
  - Police/Fire
  - Non-Police/Fire
Cycle Time
• % of time Hiring Timeline goals are met (Outside Recruitments and Internal Promotions)
• Average days to hire (Outside Recruitments and Internal Promotions) B
Customer Satisfaction
• Biannual Survey of hiring managers using the hiring process
• Biannual City Employee Survey on HR services (recruitment, training, open enrollment process, etc.)
Evaluative Criteria
Meaningful
Recruitments are an essential function and often a number one priority of Human Resources. It's important for departments to know what is entailed and how long the recruitment process typically lasts so that arrangements can be made for a smooth transition.
Useful
These measures can monitor efficiency of time and resources. Customers may become more aware of the volume and complexity of projects being produced each year. Staff can become more aware of customer expectations. Benchmarking Sources: ICMA CPM.
Sustainable
Reporting Frequency: Annually except for the two biannual surveys. Would require recommended implementation of a biannual Employee Survey to measure satisfaction with internal services delivered to Newport Beach employees.
HUMAN RESOURCES
EMPLOYEE RELATIONS
The purpose of Human Resources is to guide and counsel City employees and supervisors in adhering to City policies and procedures, fostering a fair and consistent work environment.
Proposed Measures
Quality
• Average length of Employment
• % of Grievances resolved before passing from management control
Cost
None recommended
Cycle Time
• % of timely responses to grievances filed
• % of Employee Performance Evaluations completed on time.
Customer satisfaction
None recommended
Evaluative Criteria
Meaningful
Performance measures for Employee Relations focus on the processes of counseling, formal discipline, and grievance resolution as well
as overall indicators of workforce tenure. However, because of the relatively few cases of discipline occurring each year, the percent-
age changes may appear to overstate the variances. Therefore it is recommended that the Department and the City track occur-
rences only as service load indicators.
Useful
Sustainable
LIBRARY
LIBRARY CUSTOMER SERVICE
The purpose is to provide the community with information and materials through personal service by staff and self-service through accessible technology.
Proposed Measures
Quality
• Number of reference transactions per FTE reference staff B
• High ratings from Secret Shoppers indicating that accurate answers were received quickly.
Cost
• Costs per checkout item in relation to the operating budget for public services
• Materials expenditures per capita B
• Cost per Reference question in relation to the reference staffing costs
Cycle Time
• Amount of time it takes for a new item to go from initial receipt to the shelf
Customer satisfaction
• Biannual resident survey of satisfaction with Library Services
Evaluative Criteria
Meaningful
Customer service is the main component of library service. It is important to know that we are filling our customers' needs and balancing our personal service with our technology and self-service.
Useful
The customer surveys and secret shopper reports will give us insight into how we are fulfilling the needs of the community. The ongoing look at statistics will help us to ensure that we are efficient and allocating our resources to the right services. Benchmarking Sources: California State Library Annual Survey, ICMA CPM, HAPLR Index. Reporting Frequency: Annually
Sustainable
The evaluation of service and resources will always be necessary. The methods we use to count our successes may change as library
service evolves.
22
To educate key customers in crime prevention methods that will enable them to protect themselves from loss of
life and property and to reduce the likelihood of becoming victimized.
Proposed Measures
Quality
• Percentage of residents who participate in some form of neighborhood watch as sponsored by
the Police Department.
Cost
• Crime prevention expenditures per 1,000 population.
Cycle Time
Customer Satisfaction
As measured by Resident Survey, % of residents who feel safe: B
• in their neighborhood during the day/at night
• in commercial/business areas after dark
Evaluative Criteria
Meaningful
Shows extent of the police department's effort in crime prevention relative to those of other cities. Customers will also realize the police department's focus and commitment to crime prevention.
Useful
Justifies investment value. Allows the police department to ascertain how their crime prevention efforts affect public perception of
safety.
Sustainable
Recommend continuation of biannual resident and business surveys. Reporting Frequency: Annually
23
The purpose is to respond to emergency calls as quickly as possible with the goal of reducing the loss of lives
and property
Proposed Measures
Quality
As measured by Resident Survey, % of residents who feel safe:
• in their neighborhood during the day/at night
• in commercial/business areas after dark
Cost
• Total Police O&M expenditures per 1,000 population. B
• Total Police staffing (FTE) per 1,000 population. B
Cycle Time
• Average response time to top priority calls (from receipt of call to arrival on scene). B
• % of emergency calls (present and imminent threat to life or property) answered by dispatchers within 5 seconds.
• % of emergency calls (police response with lights and sirens) responded to by field officers
within 5 minutes.
Customer Satisfaction
• % of reporting parties satisfied based on effective response, timeliness, courtesy, ability to control emergency.
• Biannual resident and business surveys
Evaluative Criteria
Meaningful
Will show relative speed of police department dispatching and response to emergency calls, and that response is according to accepted protocols and policies.
Useful
Perception of safety benchmark can be a good marketing tool for the city and assist in justifying investment/showing value. Cycle time data can support analysis of resource deployment. Benchmarking Sources: ICMA, FBI, International Association of Chiefs of Police, Cal. Police Officers Assoc., Police Executive Research Forum.
Sustainable
Most information is available from Computer-Aided Dispatch (CAD) system records. Recommend continuation of biannual resident survey to capture perception of safety and satisfaction data. Reporting Frequency: Annually.
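The cycle-time measures above reduce to simple threshold percentages over CAD records. A minimal sketch of the arithmetic, assuming a simplified, hypothetical record layout with invented sample values (not the actual CAD schema or Newport Beach data):

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical CAD record layout -- field names are illustrative only.
@dataclass
class EmergencyCall:
    answer_delay_s: float      # seconds until a dispatcher answered
    response_time_min: float   # minutes from receipt of call to arrival on scene

def pct_within(values, limit):
    """Percentage of values at or under a threshold."""
    return 100.0 * sum(v <= limit for v in values) / len(values)

# Invented sample calls for illustration.
calls = [
    EmergencyCall(3.2, 4.1),
    EmergencyCall(6.0, 4.8),
    EmergencyCall(2.5, 6.3),
    EmergencyCall(4.9, 3.7),
]

answered_in_5s = pct_within([c.answer_delay_s for c in calls], 5.0)
on_scene_in_5min = pct_within([c.response_time_min for c in calls], 5.0)
avg_response = mean(c.response_time_min for c in calls)

print(f"Answered within 5 s: {answered_in_5s:.0f}%")
print(f"On scene within 5 min: {on_scene_in_5min:.0f}%")
print(f"Average response time: {avg_response:.1f} min")
```

Reporting the same thresholds each period keeps the annual figures comparable with the ICMA and FBI benchmarking sources cited above.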
24
The purpose is to investigate crimes in order to bring the case to resolution. To restore property in some
cases, and to make customers feel safer in that they will not be victimized again.
Proposed Measures
Quality
• FBI UCR Part I Crime rates
• % of reported UCR Part I crimes (including Police-initiated cases) that received criminal investigations.
• % of investigated UCR Part I crimes cleared (includes crimes cleared by Patrol Officers).
• % of UCR Part I Crimes Cleared B
Cost
• Expenditures per UCR Part I crime cleared.
Cycle Time
• Percentage of crimes investigated within time targets (TBD)
Customer Satisfaction
• As measured by Resident Survey, % of residents who feel safe:
• in their neighborhood during the day/at night
• in commercial/business areas after dark
Evaluative Criteria
Meaningful
Crime rates are primarily an indicator of community conditions, while percentage of crimes investigated and cleared provide measures of the Police Department's performance in addressing crime in the community.
This data can show that the Police Department investigates/clears a higher number of crimes than other cities. Will tell customers how quickly they can expect service and illustrate a department that is focused on more personal service.
Useful
Justifies level of investment value. Enables the Police Department to understand better how it allocates its time and resources. The
results of this benchmark will be a good marketing tool for the city and address customers' need to feel safe. Benchmarking Sources:
ICMA, FBI.
Sustainable
Most case information is currently collected. Initial and exit interviews with victims are currently performed and results can be tracked.
Recommend continuation of a biannual resident survey to receive feedback on overall perception of safety in the community.
Reporting Frequency: Monthly (except Resident Survey).
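The investigation and clearance measures above are straightforward ratios over UCR Part I case counts. A sketch of the computation, using invented counts purely for illustration (not actual Newport Beach statistics):

```python
# Illustrative counts only -- not actual Newport Beach crime statistics.
part1_reported = 1200      # UCR Part I crimes reported (incl. police-initiated cases)
part1_investigated = 1100  # of those, cases that received a criminal investigation
part1_cleared = 350        # crimes cleared (incl. crimes cleared by Patrol Officers)

# The three quality measures, expressed as percentages.
pct_investigated = 100.0 * part1_investigated / part1_reported
pct_cleared_of_investigated = 100.0 * part1_cleared / part1_investigated
pct_cleared_of_reported = 100.0 * part1_cleared / part1_reported

print(f"% of reported Part I crimes investigated: {pct_investigated:.1f}%")
print(f"% of investigated Part I crimes cleared: {pct_cleared_of_investigated:.1f}%")
print(f"% of reported Part I crimes cleared: {pct_cleared_of_reported:.1f}%")
```

Because the denominators differ, the clearance rate against reported crimes will always be at or below the rate against investigated crimes; publishing both avoids overstating performance.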
25
The purpose of the CIP Delivery program is to protect and provide public infrastructure and assets so that
programs and services can be delivered to the people of Newport Beach.
Proposed Measures
• Maintain a citywide minimum average Pavement Condition Index of 80 and minimum individual street ratings of 60
• Slurry Seal all local and residential streets as well as city parking lots at a minimum of every 8 years
• Deliver 95 percent of CIP projects (by category) scheduled for completion within a Fiscal Year within the original construction contract cost plus 10%. B
• Substantially complete 85% of all CIP projects (by category) scheduled for completion within a Fiscal Year within two months of the approved baseline completion date B
• % of Operating Department customers rating new or rehabilitated capital facilities as being functional and sustainable after first year of use
• % of customer satisfaction with CIP projects
Evaluative Criteria
Meaningful
These measures provide information people want to know. The customer satisfaction measures can provide staff with feedback useful
for future projects.
Useful
These measures can monitor efficiency of time and resources. Customers may become more aware of the volume and complexity of
projects being produced each year. Staff can become more aware of customer expectations. Resource allocation and processes can
be questioned and modified with the information provided by these measures.
Sustainable
Staff is continuing to develop a database system to better track information needed to document these measures.
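The pavement and delivery targets above can be checked with simple tabulations once the database system is in place. A minimal sketch of that arithmetic, with invented street ratings and project figures purely for illustration:

```python
# Illustrative figures only -- street names, costs, and schedules are invented.
pci_by_street = {"Street A": 84, "Street B": 81, "Street C": 68, "Street D": 88}

# Pavement targets: citywide average PCI >= 80, every street >= 60.
citywide_avg_ok = sum(pci_by_street.values()) / len(pci_by_street) >= 80
individual_min_ok = min(pci_by_street.values()) >= 60

# (original contract cost, final cost, months past baseline completion date)
projects = [
    (1_000_000, 1_050_000, 0),
    (500_000, 540_000, 1),
    (750_000, 900_000, 3),
    (200_000, 205_000, 2),
]

# Delivery targets: 95% within contract cost +10%; 85% within two months of baseline.
pct_within_cost = 100.0 * sum(final <= orig * 1.10 for orig, final, _ in projects) / len(projects)
pct_on_schedule = 100.0 * sum(late <= 2 for _, _, late in projects) / len(projects)

print(f"PCI targets met: {citywide_avg_ok and individual_min_ok}")
print(f"Delivered within contract cost +10%: {pct_within_cost:.0f}% (target: 95%)")
print(f"Substantially complete within 2 months: {pct_on_schedule:.0f}% (target: 85%)")
```

Grouping the same tabulation by project category, as the measures specify, only requires adding a category field to each record.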
26
The purpose is to provide for the safe and efficient movement of people and goods through Newport Beach
through the installation and maintenance of engineered controls and measures.
Proposed Measures
Quality
• Monitor corridor travel times (travel time/volume) for percent change.
• Pavement Condition Index compared to Orange County cities (TBD) B
Cost
• Deliver 95 percent of CIP projects (by category) scheduled for completion within a Fiscal Year within the original construction contract cost plus 10%.
Cycle Time
• Respond to and correct non-emergency signal maintenance requests within 5 days, 90% of the time
Customer Satisfaction
• Percent of customer satisfaction on improved controls or measures
• Resident satisfaction with road condition (biannual survey) B
Evaluative Criteria
Meaningful
These measures provide information people want to know. The customer satisfaction measures can provide staff with feedback useful
for future projects.
Useful
These measures can monitor efficiency of time and resources. Customers may become more aware of the volume and complexity of
projects being produced each year. Staff can become more aware of customer expectations. Resource allocation and processes can
be questioned and modified with the information provided by these measures. Benchmarking Sources: APWA, City of San Jose, City
of Carlsbad
Sustainable
Reporting frequency: Annually
27
Kids in a safe, enriching and educational environment having fun
Proposed Measures
Quality
• % of total capacity filled with registered participants
Cost
• Program cost recovery rate
Cycle Time
Not recommended. Determined not meaningful through methodology process.
Customer Satisfaction
• Annual Survey of Parents and Kids. Elements include participant learning, fun, enrichment; safe
and clean facilities; parents who "would refer the program to others"
Evaluative Criteria
Meaningful
These measures provide information to help staff with feedback to ensure safe, fun, enriching, and cost-effective operation of the Active Kidz program. No benchmarking information is readily available for this service. Recreation staff have initiated efforts to establish a local benchmarking group.
Useful
Information can be used to evaluate the program, make changes, improve quality to meet expectations, and monitor and improve enrollment.
Reporting Frequency: Annually
Sustainable
Data is sustainable; attendance is kept on a daily basis. Survey can be administered annually.
28
The purpose of this Senior Services Activity is to provide supportive social services to older adults so that they
can maintain an active, independent lifestyle.
Proposed Measures
Quality
• None recommended. Primary elements of service quality can be measured by cycle time and customer satisfaction.
Cost
• Cost per client trip
Cycle Time
• % of Transportation requests scheduled within one business day
• % of Trips client gets to destination/appointment within 5 minutes of scheduled time
Customer Satisfaction
• Customer survey mailed out to transportation clients annually. Elements include: vehicle safety
and comfort, driver safety and courtesy, timeliness, other needs.
• Focus groups with Relatives, Caregivers, Health Care Providers on service delivery needs, satisfaction
Evaluative Criteria
Meaningful
The customer satisfaction measures can provide staff with feedback for the future operation of the transportation program and increase customer satisfaction.
Useful
These measures can monitor efficiency of time and resources in the delivery of transportation services and compare cost per trip with
similar services provided by other agencies within Orange County. Information could be used to improve the transportation services.
Reporting Frequency: Annually
Sustainable
Most of the data is collected on an ongoing basis and is sustainable. The survey data will be collected and collated annually.
29
Provide residents, businesses and visitors with safe, dependable and good tasting potable water by supplying water from
ground water and import sources.
Proposed Measures
Quality
• Water Distribution System Integrity — total annual number of leaks and pipeline breaks per 100 miles of distribution piping. B
• Drinking Water Compliance Rate (All Regulatory Agencies) (% Days in full compliance) B
• Unplanned disruptions per 1,000 customers
Cost
Operations & Maintenance Cost Ratios:
• O&M cost per account
• O&M cost per MG distributed
Cycle Time
• Percentage of customer service requests responded to within 24 hours.
• Unplanned disruptions of water service
Customer Satisfaction
• Bi-annual customer satisfaction survey
Evaluative Criteria
Meaningful
City currently provides annually a "Water Quality Report" to the community. Information collected will give opportunity to measure cost effectiveness of service delivery. Enhancing the service request system will log the number of service inquiries and workload.
Useful
Communicating the cost effectiveness of water service compared to other agencies. Improving response time; monitoring compliance with State and Federal standards. Benchmarking source: American Water Works Association.
Sustainable
Continuation of City Resident Survey. New bi-annual customer service satisfaction survey would be developed and implemented. New service request system will track service requests/complaints.
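The water-utility ratios above are simple normalizations of annual totals, in the style of the AWWA indicators cited. A sketch of the arithmetic, with invented utility figures purely for illustration (not actual Newport Beach data):

```python
# Illustrative figures only -- not actual Newport Beach utility data.
leaks_and_breaks = 18          # leaks + pipeline breaks this year
distribution_miles = 300.0     # miles of distribution piping
om_cost = 9_000_000.0          # annual O&M cost, dollars
accounts = 26_000              # active water accounts
mg_distributed = 5_500.0       # million gallons (MG) distributed
days_in_full_compliance = 365  # days in full regulatory compliance

# Distribution system integrity, normalized per 100 miles of piping.
system_integrity = leaks_and_breaks / (distribution_miles / 100.0)
# O&M cost ratios.
om_per_account = om_cost / accounts
om_per_mg = om_cost / mg_distributed
# Compliance rate as a percentage of days.
compliance_rate = 100.0 * days_in_full_compliance / 365

print(f"Leaks and breaks per 100 miles: {system_integrity:.1f}")
print(f"O&M cost per account: ${om_per_account:,.2f}")
print(f"O&M cost per MG distributed: ${om_per_mg:,.2f}")
print(f"Drinking water compliance rate: {compliance_rate:.1f}% of days")
```

Normalizing per 100 miles, per account, and per MG is what makes the figures comparable with other agencies of different sizes.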
30
Operate and maintain the City's street lights.
Proposed Measures
Quality
• Percentage of Planned vs. Unplanned Streetlight Maintenance (hours)
Cost
• Operations & Maintenance Cost Ratios:
- O&M cost per streetlight
Cycle Time
• Percentage of streetlights repaired within 7 days of notice
• Percentage of service requests responded to within 24 hours
Customer Satisfaction
• Bi-annual customer service satisfaction survey
Evaluative Criteria
Meaningful
Measuring tools to track improvements. Enhanced service request system will log the number of service inquiries and workload.
Useful
Measures can identify opportunities to be more energy efficient and develop capital improvement projects to ensure system integrity
and reliability.
Sustainable
New bi-annual customer service satisfaction survey would be part of the city-wide resident survey. Enhancing existing service request system to capture data.
31
Operate and maintain the City's wastewater collection system that transports wastewater from the residents and
businesses in order to minimize wastewater overflows and safely deliver wastewater to the Orange County Sanitation
District.
Proposed Measures
Quality
• Collection System Integrity — number of collection system failures each year per 100
miles of collection system piping.
• Number of sewer overflows per 100 miles of collection piping. B
• Beach mile days of beach closures.
Cost
Operations & Maintenance Cost Ratios:
- O&M cost per account
- O&M cost per mile
Cycle Time
• Percentage of customer service requests responded to within 24 hours.
Customer satisfaction
• Bi-annual customer satisfaction survey
Evaluative Criteria
Meaningful
Compliance with all state, local and federal laws. Comparing to other agencies. Measuring tools to track improvements for better customer service and a healthier environment.
Useful
Measure of collection system piping condition and the cost effectiveness of routine maintenance to ensure a sound delivery system. Customer service satisfaction survey to continue high level of customer service. Benchmarking source: American Water Works Association.
Sustainable
New bi-annual customer service satisfaction survey would be part of the city-wide resident survey. Enhancing existing service request system to capture data. Developing capital improvement projects to ensure system integrity & reliability.
32