Hitting the target on police performance

Harvey Redgrave, Chief Executive Officer

Wednesday 5 February 2020


Ministers have made bold promises of new investment in policing and improved outcomes for the public. Inevitably there will be ‘strings attached’, in the form of the return of some performance management measures for police forces.

Here, Crest CEO Harvey Redgrave considers how a new regime might be constructed which avoids the mistakes of previous government efforts to impose policing ‘targets’.


Boris Johnson’s decision to prioritise police funding as one of his earliest and most substantial domestic policy pledges has done much (at least in the short term) to reduce the Conservatives’ vulnerability on crime and repair their relationship with the police. Yet this was only ever going to provide temporary respite. With an increase in spending comes an expectation of better outcomes. And, as last week’s crime statistics showed, violence is continuing to rise while the proportion of crimes that are charged and prosecuted has fallen to a record low.


Government ministers are mindful that promises of additional spending will only reassure people for so long if those statistics keep heading in the wrong direction. So it is no surprise to hear the Policing Minister talk about wanting to see the police make clear ‘measurable improvements’ in return for the additional spending. There has even been a suggestion that the Prime Minister might seek to reprise the model that served him as Mayor of London: the so-called ‘20:20:20 challenge’ (reducing crime by 20 per cent, boosting public confidence by 20 per cent and cutting costs by 20 per cent).


The response from policing leaders has betrayed a sense of collective nervousness. In particular, there are fears that a more interventionist Home Office might represent a return to the days of Public Service Agreements (PSAs) and targets, driven from Whitehall.


On one level, that nervousness is justified. Targets can lead to significant ‘gaming’ of the system and unintended consequences. An obvious example is the ‘offences brought to justice’ target (OBJT), introduced in 2002 to narrow the ‘justice gap’ between the number of detected crimes and the number which resulted in a positive outcome. The target did not discriminate based on the severity or complexity of the crime. It has been argued that this had an unintended ‘net-widening’ effect, whereby the police were incentivised to give formal responses to low-level crimes (such as cannabis warnings) which would previously have been dealt with informally, in order to meet the target. As a result, there was a significant rise in the number of children entering the youth justice system (YJS) for the first time from 2003 onwards. The target was removed in 2010, after which the number of ‘first time entrants’ fell.


Another problem with centrally prescribed targets is that they can be too crude for complex systems. The central argument here is that public services like the police are complex ‘human activity’ systems, which cannot be captured in a simplistic numerical snapshot. As a result, the police’s need to meet simplistic hard targets can end up competing with the needs of the public. At various times over the last decade, the police have been subject to numeric ‘confidence’ targets (including under former Home Secretary Jacqui Smith and during Boris Johnson’s stint as London Mayor). Public confidence is a nebulous thing, affected by a whole host of factors, many of which do not lie within the police’s gift. Confidence targets rarely achieve their stated goal.


In 2010, the new Home Secretary Theresa May swept away all remaining national policing targets - collateral in a broader shift away from New Labour’s perceived ‘micro-managerialism’ of public services towards a more devolved model of policing policy - something that was accelerated by the introduction of Police and Crime Commissioners in 2012.


Yet while there are clearly downsides to these New Public Management-style regimes, even policing leaders would acknowledge that, in recent years, the balance has swung too far in the other direction.


"It sometimes feels as if there is not much central push; it is, “Get on with it and good luck” and at the same time it has sometimes felt a bit parent-child. We are in this together for our public and we would love to work evermore closely with the Home Office so that they feel even more confidence in us and they can project that to others."

Cressida Dick, Home Affairs Committee, 5 June 2018



The reality is that while previous Conservative administrations successfully ended the era of PSAs, they have failed to replace it with a credible alternative.


Meanwhile, the police service has struggled to explain clearly the public value it delivers in a post-targets world, or how that value should be measured. For example, while it is recognised that a growing part of the police’s role is to proactively manage ‘vulnerability, risk and harm’, this is poorly defined, often subjective and far from easy to measure. As Gavin Hales has commented, the ‘public value baby seems to have been thrown out with the targets culture bathwater, and nothing has been put in its place.’


A lot of weight is currently placed on HM Inspectorate of Constabulary’s ‘PEEL’ inspections, but these are essentially narrative judgements and are often subjective: they don’t translate into clear measures of effectiveness and efficiency. As a result, a vacuum has opened up, with confusion over who is responsible for cutting crime and driving up police performance.


And here we come back, full circle, to the question - what should the government reasonably expect in return for the promise of more resources? How should policymakers go about devising a performance regime that genuinely holds the police to account and facilitates an improvement in standards?


It is not our intention here to provide definitive answers to these questions. However, drawing on our work helping police forces to model their demand and helping PCCs to devise local justice performance frameworks, we can outline some of the factors policymakers should take into account.


Firstly, is it simple? Complex performance frameworks tend to perform worse than ones that are clear and easy to understand. The NHS’s four-hour A&E waiting time target is a good example of a simple numeric target which has signalled ambition and galvanised effort around a priority policy area.


Secondly, is it transparent? There is little point in devising a performance management system that nobody can see. The Crown Prosecution Service, for example, publishes only a small number of high-level metrics, which do not constitute official statistics and do not enable local scrutiny of performance.


Thirdly, does it measure the relationship of inputs to outcomes? A target which is excessively input-focused risks gaming and/or perverse behaviours. (This is why it would be a bad idea to set an arrests target for the police: it would incentivise officers to focus on those who are easiest to arrest, rather than on the most dangerous, sophisticated or elusive criminals.) Equally, a target that is purely focused on outcomes can have the opposite problem - that it is too removed from the core activity in question. Targets based purely on cutting crime and/or improving public confidence are examples of this: whether they are hit or missed will be heavily affected by factors outside the police’s control.


Fourthly, does it measure added value? Performance targets need to be contextualised and show ‘distance travelled’. School league tables provide an example. Simply publishing raw attainment data is now understood to be misleading, given the different socio-economic characteristics of different areas and hence the different intakes of different schools. Hence the DfE devised ‘intelligent’ accountability measures which take into account the cohort being worked with and judge the value added by schools (so-called ‘Progress 8’ scores). There is a potential crossover into the world of policing here, given some of the demographic drivers of crime. HMICFRS has developed a methodology for creating groups of ‘most similar forces’, which enables forces to compare their performance with others facing similar challenges. However, whilst this accounts for deprivation to an extent (using the Acorn category of ‘hard pressed’, derived from census data), the Index of Multiple Deprivation offers a more comprehensive way of benchmarking and contextualising performance.
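To make the ‘value added’ idea concrete, the sketch below shows, in Python with entirely illustrative numbers, one way a contextualised benchmark might work: regress a force-level outcome rate on average deprivation and treat the gap between actual and expected performance as the ‘distance travelled’. The force names, figures and the simple linear fit are assumptions for illustration only; this is not HMICFRS’s or Crest’s methodology.

```python
# Illustrative sketch: contextualising a force-level performance metric
# against deprivation, in the spirit of 'value added' measures.
# All figures below are hypothetical.

import numpy as np

# Hypothetical forces: (name, average IMD score, charge rate %)
forces = [
    ("Force A", 12.0, 9.5),
    ("Force B", 28.0, 7.0),
    ("Force C", 35.0, 6.8),
    ("Force D", 20.0, 7.9),
    ("Force E", 42.0, 5.5),
]

imd = np.array([f[1] for f in forces])
charge_rate = np.array([f[2] for f in forces])

# Fit a simple linear model: expected charge rate given deprivation.
slope, intercept = np.polyfit(imd, charge_rate, 1)
expected = slope * imd + intercept

# 'Value added' = actual minus expected, i.e. performance relative to
# what similarly deprived areas would lead us to predict.
value_added = charge_rate - expected

for (name, _, _), exp, va in zip(forces, expected, value_added):
    print(f"{name}: expected {exp:.1f}%, value added {va:+.1f} pts")
```

A real framework would need a richer model (multiple demographic variables, several years of data, uncertainty around the estimates), but the principle is the same one Progress 8 applies to schools: judge performance against a contextual expectation rather than a raw number.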


Fifthly, what is the likely impact on behaviours? Good performance management must consider any possible gaming effects and the likely impact on front-line behaviours.


Across nearly every performance metric, from response times to charge rates, from victim satisfaction to the time it takes to bring offenders to justice, performance is worsening rather than improving. So the government is right to signal its intention to ask for more from the police in return for greater public spending. But a good deal of hard thinking will be required to ensure the performance regime is fit for purpose.


We are Crest Advisory - the UK's only consultancy dedicated to crime and justice.
