An inspector calls
What makes a police force truly effective? How should we measure police performance? And how do we know when a force is failing? These questions are not just academic - they matter greatly to the people who work within policing, from police and crime commissioners (PCCs), who are elected to hold the police to account, to chief constables, the people charged with actually raising the standards of officers and staff.
With the Home Office having recently taken a back seat in performance management (the infamous ‘Police Standards Unit’ having long since been wound up), it is now left to Her Majesty’s Inspectorate of Constabulary (HMIC) to be the arbiter of police effectiveness. Every year, HMIC undertakes a series of ‘PEEL’ (police effectiveness, efficiency and legitimacy) assessments, which set out to measure a force’s ability to cut crime and keep people safe.
Though widely reported by local and national media, these judgements matter more than just reputationally - they can make or break careers. Indeed, a bad judgement can damage policing effectiveness in its own right, with a significant impact on officer morale and on the confidence of the local community. Given the weight such judgements carry, it is unsurprising that a growing number of people within policing are beginning to question the methodology behind HMIC’s judgements - to ask: ‘who inspects the inspectors?’.
HMIC core questions to test effectiveness:
There are certainly legitimate questions to be asked about the HMIC process. The 2016 PEEL questions (listed above), which form the basis of HMIC’s judgements, are primarily an attempt to quality-assure inputs and processes, without any empirical assessment of how those inputs relate to outcomes - arguably a prerequisite for any genuine assessment of effectiveness. As a result, HMIC’s judgements often read as somewhat subjective documents rather than as data-led assessments of a force’s health.
Another common criticism of HMIC inspections is that they do not always appear to take into account the question of feasibility. The really difficult question for anybody trying to measure police effectiveness is how to understand the scope for delivering transformational change within the boundaries of current resources. Put another way, to what extent are variations in police performance a function of:
- Funding against demand (e.g. forces which are smaller/have more challenging demand profiles arguably do worst out of the funding formulae)
- Current and previous management/leadership (including the opportunity costs of previous reorganisations)
- Workforce capability (reflecting the pool of available talent)
Some have gone as far as arguing that the inspection process has turned the police into scapegoats: forced to dig an austerity-shaped hole, stand in it and then be measured and told they aren’t tall enough. Clearly, there will always be some who have a vested interest in seeking to reduce every discussion about police performance (see the fierce debate about counter-terrorism) to one of resources. Nonetheless, the public interest would arguably be better served if feasibility issues were discussed and debated openly, rather than brushed under the carpet. This approach has served other public services well. For example, school league tables have moved on from a crude comparison of exam scores to a more rounded assessment of ‘value added’ performance.
There is at least scope to ask deeper questions. For example, how does a force’s workforce composition relate to its outcomes? Is there any correlation between force spend on neighbourhood policing and local confidence? Do forces with greater investment in investigation/intelligence teams tend to do better on vulnerability than their peers? What is the connection (if any) between operating models and effectiveness/efficiency? Are there particular models that have historically been more or less successful than others?
Some of the most interesting work that we do at Crest is to help the police and other criminal justice agencies understand the impact of what they do and then work out how best to communicate it to the right audiences. That work can take a variety of forms - analytics, strategy, stakeholder engagement - but it is usually driven by a simple and powerful motivation: the desire to keep people safe and build public confidence. Crest has begun to work with forces and PCCs who are seeking an additional, independent assessment of how they are performing - one which addresses some of the questions above.
In a world of rising demand and shrinking resources, inspectorates perform a vital democratic function. They give the government (and, by proxy, us) the evidence to challenge public services and hold them to account - whether that be a police force, rehabilitation company, hospital or school. Yet it would be wrong to rely on inspectors, whose reports raise questions as well as provide answers, to fulfil this role on their own. To truly understand effectiveness, we need to ask more demanding questions of forces - questions which reflect local conditions, trends and priorities.
Crest Advisory helps criminal justice agencies understand and communicate their effectiveness. If you would like to find out more about our specialist services, please get in touch.
Harvey Redgrave is Director of Strategy and Delivery at Crest Advisory. Previously, Harvey worked as a senior policy advisor at the Labour Party and was a deputy director at the Prime Minister's Strategy Unit.
More from Crest on this topic
Community sentences: where did it all go wrong? Read about our report.
Examining the case for justice devolution. Read more here.
Crest has designed a dashboard that brings together crime, court and prosecution data in a single place. Find out more here.