Executive Summary
Today, more than ever, law enforcement needs to improve and adapt to societal changes. Agencies have been under the microscope, and citizens are taking a more active role in policing oversight.
Law enforcement agencies shouldn’t wait until public opinion and local legislation impose changes to policing. Agencies should be proactive and begin to look for ways to improve and evolve. One of the best ways to effect real change and improve processes in any organization is to implement a system of metrics in which information on the current state is gathered and analyzed. Then, recommendations based on the findings can be implemented to improve on existing practices.
This paper explores why metrics are important and how they can be used to effect advances in law enforcement, although any organization can benefit from these methods. Our recommendations provide a step-by-step process that anyone can implement within their organization.
Red Carrot’s Research on Metrics and Analysis of Law Enforcement Agency Information
Government, private industry, corporations, and many other entities use some form of metrics collection and analysis to evaluate and assess their programs and document operational effectiveness. Information sharing is crucial to the mission of this work, and it therefore requires the proper tools, a database system, and trained employees to conduct research and evaluations of programs.
For example, offices within the Department of Homeland Security (DHS) can measure workload by tracking how many cases are processed as well as the time it takes to complete each case. Tracking can be managed using a simple spreadsheet that documents the date when the case was received, the entity that sent it, the team member assigned to complete the work, the type of case, and the date completed.
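As a minimal illustration, the tracking entry described above could be captured as a structured record and written to a spreadsheet-compatible CSV file. The field names below are assumptions chosen for illustration, not a prescribed DHS format.

```python
import csv
from dataclasses import dataclass, asdict, fields

@dataclass
class CaseRecord:
    """One row in the workload-tracking spreadsheet."""
    date_received: str   # date the case was received (YYYY-MM-DD)
    sender: str          # entity that sent the case
    assigned_to: str     # team member assigned to complete the work
    case_type: str       # type of case
    date_completed: str  # date completed; empty while the case is open

records = [
    CaseRecord("2023-03-01", "Field Office A", "J. Smith", "Records request", "2023-03-04"),
    CaseRecord("2023-03-02", "Field Office B", "L. Jones", "Background check", ""),
]

# Write the tracking sheet; any spreadsheet tool can open the resulting CSV.
with open("case_tracking.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(CaseRecord)])
    writer.writeheader()
    writer.writerows(asdict(r) for r in records)
```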
Two Important Components of Metrics
- Reliable Tracking System: A reliable tracking system is an essential tool for keeping information together in an organized fashion. It makes it easier for employees to find data by filtering information by date, sender, case number, and other fields (a simple filtering sketch follows this list). Using a database makes it simple to measure and track performance.
- Proper Training: To use a tracking system effectively, employees need training on what the metrics mean and how to apply them.
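For illustration only, a filter like the one described in the first bullet might look like the following. The layout and field names carry over from the earlier sketch and are assumptions, not a prescribed format.

```python
from datetime import date

# Sample rows matching the tracking-sheet fields from the earlier sketch.
rows = [
    {"date_received": "2023-03-01", "sender": "Field Office A", "assigned_to": "J. Smith",
     "case_type": "Records request", "date_completed": "2023-03-04"},
    {"date_received": "2023-03-02", "sender": "Field Office B", "assigned_to": "L. Jones",
     "case_type": "Background check", "date_completed": ""},
]

# Filter: open cases from one sender received during March 2023.
start, end = date(2023, 3, 1), date(2023, 3, 31)
matches = [
    r for r in rows
    if r["sender"] == "Field Office B"
    and start <= date.fromisoformat(r["date_received"]) <= end
    and not r["date_completed"]
]
for r in matches:
    print(r["date_received"], r["case_type"], "assigned to", r["assigned_to"])
```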
One Final Component of a Complete Metrics System

To determine whether processes are meeting stakeholder expectations, many agencies provide a survey to gather feedback and confirm that operations are completed in a timely manner and with appropriate quality.
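As a rough sketch of how such feedback can feed the metrics system, survey responses could be aggregated into simple satisfaction scores. The questions and the 1-to-5 scale below are assumptions for illustration.

```python
# Hypothetical survey responses on a 1-5 scale for two questions:
# timeliness (was the work completed on time?) and quality.
responses = [
    {"case_id": "2023-0142", "timeliness": 5, "quality": 4},
    {"case_id": "2023-0143", "timeliness": 3, "quality": 5},
    {"case_id": "2023-0144", "timeliness": 4, "quality": 4},
]

def average(values):
    return sum(values) / len(values)

print("Average timeliness score:", round(average([r["timeliness"] for r in responses]), 2))
print("Average quality score:", round(average([r["quality"] for r in responses]), 2))
```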
Collecting Metrics
There are many ways to collect metrics. The simplest, quickest, and easiest-to-update collection method uses some form of a database with entry automation and a flexible data retrieval tool. Other forms of data collection include interviews, information-gathering sessions with a group of stakeholders, and direct observation. These methods are slower and require more manual updating of the database, but they are certainly viable alternatives to a fully automated system. A graphic illustrating a simple approach to information collection is presented in Figure 2.
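As one possible sketch of automated entry into a lightweight database with flexible retrieval, the snippet below uses SQLite; the table layout, column names, and the record_case entry point are assumptions for illustration.

```python
import sqlite3
from datetime import date

conn = sqlite3.connect("metrics.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS cases (
           case_id TEXT PRIMARY KEY,
           date_received TEXT,
           sender TEXT,
           case_type TEXT,
           date_completed TEXT
       )"""
)

def record_case(case_id, sender, case_type):
    """Automated entry point, called by the intake workflow when a case arrives."""
    conn.execute(
        "INSERT OR IGNORE INTO cases (case_id, date_received, sender, case_type, date_completed) "
        "VALUES (?, ?, ?, ?, NULL)",
        (case_id, date.today().isoformat(), sender, case_type),
    )
    conn.commit()

record_case("2023-0145", "Field Office A", "Records request")

# Flexible retrieval: any slice of the data is one query away.
for case_type, count in conn.execute("SELECT case_type, COUNT(*) FROM cases GROUP BY case_type"):
    print(case_type, count)
```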
Analyzing the Data
Data analysis is critical to reporting meaningful and actionable metrics. A common misconception is that if you throw numbers into a table or chart, you have communicated useful information to your audience. The real insight comes from looking closely at the data to identify trends or patterns and to uncover the possible causes for changes in the data. For example, you may report that the number of help desk cases spiked in a specific period. By itself, this brings attention to the period in question but does not explain why the spike occurred. You need to know why it is happening so that you can use the metrics to help improve the system. When you identify the cause of the spike – for example, the cases spiked the day after a major software upgrade – you are now reporting information that stakeholders can discuss, act on, and perhaps investigate more deeply to find the underlying cause. Did the software upgrade contain many significant and complicated changes? This may tell you that you should break future updates into smaller pieces to ensure better quality. Or did the project run short on time, forcing testing to be compressed to meet the announced release date? This may tell you that better scheduling estimates and resource management could improve the next update.
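A rough sketch of this kind of analysis is shown below: daily case counts are compared against a typical day, and spikes are correlated with known release dates. The sample data, the spike rule (more than double the median), and the date values are all assumptions for illustration.

```python
from collections import Counter
from statistics import median

# Hypothetical help desk tickets: one date string per case opened.
case_dates = (
    ["2023-05-01"] * 12 + ["2023-05-02"] * 15 + ["2023-05-03"] * 11 +
    ["2023-05-04"] * 14 + ["2023-05-05"] * 58   # the day after a software upgrade
)
upgrade_dates = {"2023-05-04"}  # known release dates, used to look for causes

daily_counts = Counter(case_dates)
baseline = median(daily_counts.values())

for day, count in sorted(daily_counts.items()):
    if count > 2 * baseline:  # crude spike rule: more than double the typical day
        follows_upgrade = any(u <= day for u in upgrade_dates)
        note = " (follows a software upgrade)" if follows_upgrade else ""
        print(f"Spike on {day}: {count} cases vs. a typical {baseline}{note}")
```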
Reporting Metrics
Metrics can be reported using numerous methods. One extremely effective method is to build a dashboard that reports on Key Performance Indicators (KPIs). KPIs are individual metrics that map to program objectives. They are determined in the design phase of the metrics program and can be continually updated as metrics are refined. A fairly simple way to display KPIs is to build a dashboard that provides the numbers in real time. A SharePoint page or an organization’s web page is a viable candidate for hosting the dashboard. This also helps ensure that metrics are continually updated, because it’s easy to see when they become stale. Dashboards can display metrics tables, graphs, and charts.
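As a simplified sketch, a KPI such as average days to complete could be computed from the tracking data and written out as a small table for a dashboard page to embed. The field names, the specific KPIs, and the static-HTML approach are assumptions; a SharePoint web part or reporting tool would normally handle this step.

```python
from datetime import date, datetime

# Hypothetical completed cases pulled from the tracking system.
cases = [
    {"received": "2023-03-01", "completed": "2023-03-04"},
    {"received": "2023-03-02", "completed": "2023-03-09"},
    {"received": "2023-03-03", "completed": "2023-03-05"},
]

def to_date(value):
    return date.fromisoformat(value)

turnaround = [(to_date(c["completed"]) - to_date(c["received"])).days for c in cases]
kpis = {
    "Cases completed": len(cases),
    "Average days to complete": round(sum(turnaround) / len(turnaround), 1),
    "Last refreshed": datetime.now().strftime("%Y-%m-%d %H:%M"),  # makes stale data obvious
}

# Emit a small HTML table that a dashboard page can embed.
rows = "".join(f"<tr><td>{k}</td><td>{v}</td></tr>" for k, v in kpis.items())
with open("kpi_dashboard.html", "w") as f:
    f.write(f"<table><tr><th>KPI</th><th>Value</th></tr>{rows}</table>")
```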
Other reporting methods include presentations, paper reports, and scoresheets that can be distributed.
Many agencies publish annual reports to share statistical data on how they performed throughout the year. Annual reports can show which agencies and departments performed due diligence in their reporting, which have implemented a tracking system or training, and how quickly each organization updated its processes. Other measurements can include how many resources are required to complete the work.
Process for Developing Effective Metrics Collection and Analysis in Law Enforcement
Collecting and analyzing law enforcement data effectively is challenging because the data may not produce quantifiable results. Many data elements can be qualitative and anecdotal in nature. Additional difficulties arise when attempting to measure results that may be attributed, in part, to factors external to the system or program being evaluated. To investigate these issues and provide recommendations for refining data collection to produce more precise results, a study was performed as part of the Comprehensive Regional Information Sharing Project (CRISP). The study was performed by Noblis’ Center for Criminal Justice Technology, in partnership with the National Institute of Justice (NIJ). The study examined the use of metrics as a tool to assess the effectiveness of law enforcement Information-Sharing Systems (ISS) and their subsequent impact on operations. Special challenges exist when assessing an ISS, particularly regarding the elapsed time between metrics collection, analysis, the resulting modifications, and any noticeable impact on operations.1 Additionally, an ISS may be one of many resources that have an impact on operations, so its role may not be measurable or directly attributable to process improvements that were implemented based on data findings.
The study began by conducting research on the state of metrics collection in law enforcement, with an emphasis on metrics related to ISS programs. This provided some insight into lessons learned on the use of metrics and identified basic elements needed for an ISS metrics program.
Next, metrics evaluation lessons learned were gathered from information-sharing programs and interviews with law enforcement agencies as part of the larger CRISP effort; programs contacted included the Comprehensive Regional Information Management Exchange System (CRIMES), Florida Department of Law Enforcement (FDLE) InSite, the Factual Analysis Criminal Threat Solution (FACTS), Citizen Law Enforcement Analysis and Reporting (CLEAR)/Illinois CLEAR (I-CLEAR), the Florida Integrated Network for Data Exchange and Retrieval (FINDER), and the Automated Regional Justice Information System (ARJIS). The primary effort, and the focus of the study, was to devise a detailed, automated approach for developing a metrics collection and analysis program that produces more precise and impactful data.
Finally, issues and impacts associated with the devised approach were examined to guide its appropriate application. It’s important to have a formal plan in place for metrics collection so that appropriate metrics are collected without burdening users with the collection process.2
To overcome some of the collection and analysis challenges, the study used a phased approach. This approach should be followed to properly design and implement a successful data collection system.
Conclusions and Key Recommendations for Introducing Metrics
Metrics provide an important tool that helps agencies track and report performance and identify candidate areas for improvement.
Recommendation 1.
Institute a formal plan for metrics collection so that useful and appropriate metrics are collected without burdening users with the collection process.
Recommendation 2.
Consider the benefits of a preliminary behavioral study on how best to obtain quality input.
Recommendation 3.
Provide stakeholders with a survey to complete after a task is finished or a report is issued.
About Red Carrot
Red Carrot, an 8(a) and woman-owned business, is distinguished by our proven federal experience and performance-driven processes. Our team is fueled by passion, backed by intelligence, and built on expertise.
Red Carrot believes that there is always a better way. We solve our clients’ biggest Strategic Communications, Customer Experience, Management Consulting, and Human Capital challenges.
Red Carrot approaches challenges through our vetted processes, based on industry best practices and proprietary data. We continuously explore innovative and often untapped perspectives. This constantly enhances the quality of our work. From our inception, we have stayed research-centric, data-informed, and customer-oriented while expanding our range of highly skilled capabilities. The Red Carrot team supports projects across multiple industries and government agencies. Our accolades include the On the Rise Government Contractor of the Year, Telly Awards, and Hermes Creative Awards.