Report: Britain's Cops Have Big Data But Not Big Analysis
“We’re sitting on absolutely monumental amounts of information collected from different sources,” a detective inspector in the United Kingdom told Alexander Babuta. “What we lack is the technological capability to effectively analyze it.”

Babuta is a research analyst at the Royal United Services Institute for Defence and Security Studies, or RUSI, and his report for RUSI on “Big Data and Policing: An Assessment of Law Enforcement Requirements, Expectations and Priorities” was made public today. As the detective inspector’s remark suggests, the think tank finds Britain’s police drowning in data even as the tools to analyze that data are lacking.

“Few organizations in the UK collect data on the same scale as the police, and fewer still have such wide-reaching powers to acquire data from other sources. Yet the police make use of only a very small proportion of this data. At present, the analysis of police data is a laborious task, as forces do not have access to sophisticated data-mining tools and infrastructure. If the police were able to effectively apply such technology to the data they collected, they would greatly enhance their operational efficiency and crime-fighting capabilities.”

Two big issues loom large in Babuta’s 41-page report: the lack of a central clearinghouse for the data that Britain’s many agencies have collected and will keep collecting (think only of the CCTV cameras blanketing Britain), and the absence from police hands, though not from the marketplace, of technology to make sense of it. “[I]n the majority of cases,” Babuta writes, “the analysis of digital data is almost entirely manual, despite software being available to automate much of this process. In addition, police forces do not have access to advanced analytical tools to trawl and analyse unstructured data, such as images and video, and for this reason are unable to take full advantage of the UK’s wide-reaching surveillance capabilities.”

The report’s findings echo the observations of other big data thinkers such as Harvard University’s Gary King, whose famous mantra is that “The value is not the data. It’s not the big. It’s the analytics.”

Ironically, Babuta notes, while staff hours are increasingly scarce in police forces, what information technology time is available often goes toward keeping legacy data systems operating, not toward exploiting the new capabilities that analyzing those mounds of data would offer.

And as might be expected in any collection of bureaucracies, those legacy systems often can’t talk to each other. “[T]his paper finds that the fragmentation of databases and software applications is a significant impediment to the efficiency of police forces, as police data is managed across multiple separate systems that are not mutually compatible.”

The critiques themselves are relatively straightforward, but since the policing agencies might disagree on the next steps, RUSI offers a package of 14 recommendations for the police, the Home Office and software developers.

Some of the proposals are expected: consult with the people in the field before making decisions, create a national strategy for buying technology, figure out how to share data across jurisdictional lines, develop a standard glossary of common terminology, and make sure people are trained on the new gear they do get.
The report also makes bets on specific methodologies and procedures for the future, along with calling for national policies and strategies “to create coherence between forces seeking to implement new technologies.” Some of the recommendations that seem of particular import to social and behavioral scientists include:
- Prioritize exploring the potential of predictive mapping software
Predictive hotspot mapping has been shown time and again to be significantly more effective in predicting the location of future crimes than intelligence-led techniques. However, few forces have integrated the practice into current patrol strategies.
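For readers unfamiliar with the technique, the sketch below shows roughly what grid-based hotspot scoring can look like in Python. It is an illustration only, not the report’s method, and the incident coordinates, kernel bandwidth and hotspot threshold are all invented.

```python
# A minimal, illustrative hotspot-mapping sketch (not the report's method):
# score each grid cell by a Gaussian-kernel density of past incident
# locations, then flag the top-scoring cells as candidate patrol hotspots.
# The incident data here are synthetic; a real force would use recorded
# crime locations and a properly calibrated bandwidth.
import numpy as np

rng = np.random.default_rng(0)
# Synthetic past incidents clustered around two points in a 10 km x 10 km area.
incidents = np.vstack([
    rng.normal(loc=(2.0, 3.0), scale=0.4, size=(60, 2)),
    rng.normal(loc=(7.0, 6.0), scale=0.6, size=(40, 2)),
])

bandwidth = 0.5          # kernel width in km (an assumption, not a tuned value)
cell_size = 0.5          # 0.5 km grid cells
xs = np.arange(0, 10, cell_size) + cell_size / 2
ys = np.arange(0, 10, cell_size) + cell_size / 2
grid_x, grid_y = np.meshgrid(xs, ys)
cells = np.column_stack([grid_x.ravel(), grid_y.ravel()])

# Gaussian kernel density: sum of kernels centred on past incidents.
d2 = ((cells[:, None, :] - incidents[None, :, :]) ** 2).sum(axis=2)
density = np.exp(-d2 / (2 * bandwidth ** 2)).sum(axis=1)

# The top 5 percent of cells become this period's predicted hotspots.
threshold = np.quantile(density, 0.95)
hotspots = cells[density >= threshold]
print(f"{len(hotspots)} hotspot cells flagged for patrol")
```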
- Use national, rather than local, data sets when using tools to predict the risks associated with individuals
Predictive analytics makes it possible for police forces to use past offending history to identify individuals who are at increased risk of re-offending, as well as using partner agency data to identify individuals who are particularly vulnerable and in need of safeguarding. Analysis of this kind is currently carried out using local police datasets, but the use of national datasets is necessary to gain a full understanding of these risks.
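Purely as an illustration of the kind of analysis being described, and with no claim that any force’s model looks like this, the sketch below scores synthetic individuals with an off-the-shelf classifier. The features, data and coefficients are invented, and a real system would raise exactly the data-protection and proportionality questions the report flags.

```python
# A toy illustration (not the report's model) of individual risk scoring:
# train a simple classifier on synthetic offending-history features and
# score a new individual. Everything here is made up for the example.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 500
# Invented features: prior offence count, months since last offence, age at first offence.
X = np.column_stack([
    rng.poisson(2, n),
    rng.integers(1, 120, n),
    rng.integers(12, 40, n),
])
# Synthetic labels standing in for "re-offended within two years".
logit = 0.5 * X[:, 0] - 0.02 * X[:, 1] - 0.05 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
# Score a hypothetical individual: 4 priors, 6 months since last offence, first offence at 16.
risk = model.predict_proba([[4, 6, 16]])[0, 1]
print(f"estimated re-offending risk: {risk:.2f}")
```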
- All data applications should include an event log feature that is always on, documenting any changes made to a data set. (A minimal sketch of this idea appears after this list.)
- Developers of predictive policing software should conduct further research into the use of network-based models for generating street segment-based crime predictions.
- Explore the potential uses of risk terrain modelling (RTM) for identifying areas most at risk of experiencing crime.
Current predictive mapping methods rely on past criminal events alone to predict future crimes, and are indifferent to the underlying geographical and environmental factors that make certain locations more vulnerable to crime. RTM takes account of these underlying environmental factors to provide a comprehensive analysis of spatial risk, and some studies have shown that RTM has better predictive power than retrospective hotspot mapping.
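The core RTM idea, layering several environmental risk factors over the same map and combining them, can be sketched in a few lines. The features, decay distances and weights below are made up for the example; real RTM work derives and validates them statistically.

```python
# An illustrative risk-terrain-modelling-style calculation, not the report's
# specification: each environmental feature contributes a risk layer over the
# same grid, and the layers are combined into a composite risk score.
import numpy as np

cell_size = 0.25                      # km
xs = np.arange(0, 5, cell_size)
ys = np.arange(0, 5, cell_size)
gx, gy = np.meshgrid(xs, ys)

def proximity_layer(points, decay_km):
    """Risk layer that decays exponentially with distance to the nearest feature."""
    d = np.full(gx.shape, np.inf)
    for px, py in points:
        d = np.minimum(d, np.hypot(gx - px, gy - py))
    return np.exp(-d / decay_km)

# Hypothetical environmental features (coordinates in km) with assumed weights.
layers = {
    "late_night_venues": (proximity_layer([(1.0, 1.2), (3.5, 4.0)], 0.3), 2.0),
    "transport_hubs":    (proximity_layer([(2.5, 2.5)], 0.5), 1.5),
    "vacant_properties": (proximity_layer([(4.0, 1.0), (0.5, 4.5)], 0.4), 1.0),
}

# Composite risk: weighted sum of the individual layers.
risk = sum(weight * layer for layer, weight in layers.values())
top = np.unravel_index(np.argmax(risk), risk.shape)
print(f"highest-risk cell centred near ({xs[top[1]]:.2f} km, {ys[top[0]]:.2f} km)")
```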
- Explore the use of harm matrices to assess the harms caused by different types of crime.
At present, in most forces, police resources flow to address the total volume of crime in a given area, as opposed to the harms caused by different types of crime. Tools such as the MoRiLE Matrix demonstrate that it is possible to use data to understand harm in a much deeper way, by taking into account factors such as the harms caused to individuals, communities and the economy. (A toy harm-weighting sketch appears after this list.)

Lastly, the report calls for a national-level framework to ensure big data is collected and analyzed ethically. “This must be addressed as a matter of urgency,” Babuta writes, “to ensure that organisations such as the police are able to make effective use of these new capabilities without fear of violating citizens’ right to privacy.”
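The harm-weighting idea behind the harm matrix recommendation is easy to see in miniature. In the toy example below the weights are invented, not MoRiLE values, but they show how an area with fewer offences can still carry the greater harm.

```python
# A toy harm-weighting example: ranking areas by a harm-weighted score can
# differ sharply from ranking them by raw crime volume, which is the point
# the report makes about harm matrices. The weights and counts are invented.
harm_weights = {"criminal_damage": 1, "burglary": 5, "robbery": 10, "serious_assault": 25}

area_counts = {
    "Area A": {"criminal_damage": 120, "burglary": 10, "robbery": 2, "serious_assault": 1},
    "Area B": {"criminal_damage": 20, "burglary": 15, "robbery": 8, "serious_assault": 6},
}

for area, counts in area_counts.items():
    volume = sum(counts.values())
    harm = sum(harm_weights[offence] * n for offence, n in counts.items())
    print(f"{area}: {volume} offences, harm score {harm}")
# Area A records more offences, but Area B carries the higher harm score.
```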
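The report’s event-logging recommendation, flagged earlier in the list, is similarly concrete: every change made through a data application should leave an audit trail. The sketch below shows one minimal way to do that; the class and field names are illustrative and not drawn from any police system.

```python
# A minimal sketch of an "always-on event log": every change made through the
# application is recorded as an append-only audit entry before the data is
# touched. Names and fields are hypothetical.
import datetime
import json

class AuditedDataset:
    def __init__(self, records, log_path="audit.log"):
        self.records = dict(records)
        self.log_path = log_path

    def _log(self, action, key, old, new, user):
        entry = {
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "user": user,
            "action": action,
            "key": key,
            "old_value": old,
            "new_value": new,
        }
        # Append-only: the log is never rewritten, only extended.
        with open(self.log_path, "a") as f:
            f.write(json.dumps(entry) + "\n")

    def update(self, key, value, user):
        old = self.records.get(key)
        self.records[key] = value
        self._log("update", key, old, value, user)

    def delete(self, key, user):
        old = self.records.pop(key, None)
        self._log("delete", key, old, None, user)

# Example: every change is captured automatically, with no opt-out.
ds = AuditedDataset({"case_42": "open"})
ds.update("case_42", "closed", user="analyst_1")
```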
Report Methods

Research for this RUSI report involved three phases. The first reviewed existing academic literature, government policy documents, law enforcement strategies and private sector reports on the police’s use of data. In the second, Babuta interviewed 25 serving police officers and staff from four forces, as well as five experts from the technology sector and academia. The third phase was a half-day workshop in London, which gathered representatives from five police forces, as well as the Home Office, the College of Policing and academia.