THE meanest streets of Kent are to be found in little pink boxes. Or at least they are if you look at them through the crime-prediction software produced by an American company called PredPol. Places in the county east of London where a crime is likely on a given day show up on PredPol’s maps highlighted by pink squares 150 metres on a side. The predictions can be eerily good, according to Mark Johnson, a police analyst: “In the first box I visited we found a carving knife just lying in the road.”
PredPol is one of a range of tools that use better data, more finely crunched, to predict crime. They seem to promise better law enforcement. But they also bring worries about privacy, and about justice systems run by machines, not people.

Criminal offences, like infectious disease, form patterns in time and space. A burglary in a placid neighbourhood represents a heightened risk to surrounding properties; the threat shrinks swiftly if no further offences take place. These patterns have spawned a handful of predictive products which seem to offer real insight. During a four-month trial in Kent, 8.5% of all street crime occurred within PredPol’s pink boxes, with plenty more next door to them; predictions from police analysts scored only 5%. An earlier trial in Los Angeles saw the machine score 6% compared with human analysts’ 3%.
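
How such a map might be built is easy to sketch. The snippet below is a minimal illustration of the near-repeat idea: each recorded offence raises the risk score of its own 150-metre grid square and, more weakly, of its neighbours, and the boost fades with time. The grid, decay rate and neighbour weighting are invented for the example; they are not PredPol's actual parameters.

```python
import math
from collections import defaultdict

# Illustrative parameters only -- not PredPol's actual model.
DECAY_PER_DAY = 0.3      # how quickly an offence's elevated risk fades
NEIGHBOUR_WEIGHT = 0.5   # neighbouring squares get half the boost

def risk_scores(crimes, today):
    """crimes: iterable of (cell_x, cell_y, day); returns risk per cell."""
    risk = defaultdict(float)
    for x, y, day in crimes:
        age = today - day
        if age < 0:
            continue  # ignore offences recorded after the forecast date
        boost = math.exp(-DECAY_PER_DAY * age)
        risk[(x, y)] += boost
        for dx in (-1, 0, 1):          # spread a weaker boost to the
            for dy in (-1, 0, 1):      # eight surrounding squares
                if (dx, dy) != (0, 0):
                    risk[(x + dx, y + dy)] += NEIGHBOUR_WEIGHT * boost
    return risk

# Flag the highest-scoring squares as the day's "pink boxes".
recent = [(10, 4, 0), (10, 5, 1), (30, 2, 5)]   # (x, y, day) of offences
scores = risk_scores(recent, today=6)
pink_boxes = sorted(scores, key=scores.get, reverse=True)[:2]
```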

Intelligent policing can convert these modest gains into significant reductions in crime. Cops working with predictive systems respond to call-outs as usual, but when they are free they return to the spots which the computer suggests. Officers may talk to locals or report problems, like broken lights or unsecured properties, that could encourage crime. Within six months of introducing predictive techniques in the Foothill area of Los Angeles, in late 2011, property crimes had fallen 12% compared with the previous year; in neighbouring districts they rose 0.5%. Police in Trafford, a suburb of Manchester in north-west England, say relatively simple and sometimes cost-free techniques, including routing police driving instructors through high-risk areas, helped them cut burglaries by 26.6% in the year to May 2011, compared with a decline of 9.8% in the rest of the city.
For now, the predictive approach works best against burglary and thefts of vehicles or their contents. These common crimes provide plenty of historical data to chew on. But adding extra types of information, such as details of road networks, can fine-tune forecasts further. Offenders like places where vulnerable targets are easy to spot, access is simple and getaways speedy, says Shane Johnson, a criminologist at University College London. Systems devised by IBM, a technology firm, watch how big local events, proximity to payday and the weather affect the frequency and location of lawbreaking. “Muggers don’t like getting wet,” says Ron Fellows, IBM’s expert. Jeff Brantingham of PredPol thinks that finding speedy ways to ingest crime reports is more important than adding data sets: timelier updates would allow PredPol to whirr out crime predictions constantly, rather than once per shift. Mr Fellows enthuses about sensors that detect gunshots (already installed in several American cities) and smart CCTV cameras that recognise when those in their gaze are acting suspiciously. He promises squad cars directed by computers rather than control centres, with the machines continually calculating the most useful patrol routes.
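
One hedged sketch of how such extra signals might be folded into a forecast: a toy logistic model scoring one area's robbery risk for a day. The features and coefficients below are invented for illustration; they are not drawn from IBM's or PredPol's systems.

```python
import math

# Invented features and coefficients, for illustration only.
WEIGHTS = {
    "bias": -3.0,
    "recent_crimes_nearby": 0.8,   # the near-repeat signal
    "days_until_payday": -0.05,    # more cash about just after payday
    "raining": -0.7,               # "muggers don't like getting wet"
    "big_local_event": 0.6,        # crowds mean targets
}

def robbery_risk(features):
    """Logistic model: combine weighted features into a probability."""
    z = WEIGHTS["bias"]
    for name, value in features.items():
        z += WEIGHTS[name] * value
    return 1.0 / (1.0 + math.exp(-z))

p = robbery_risk({
    "recent_crimes_nearby": 3,
    "days_until_payday": 1,
    "raining": 0,
    "big_local_event": 1,
})
```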

Minority report
Predicting and forestalling crime does not solve its root causes. Positioning police in hotspots discourages opportunistic wrongdoing, but may encourage other criminals to move to less likely areas. And while data-crunching may make it easier to identify high-risk offenders—about half of American states use some form of statistical analysis to decide when to parole prisoners—there is little that it can do to change their motivation.

Misuse and overuse of data can amplify biases. It matters, for example, whether software crunches reports of crimes or records of arrests. If the latter, police activity risks creating a vicious circle: patrols sent to predicted hotspots make more arrests there, and those arrests in turn mark the same spots as hotspots. Report-based systems have biases of their own: they may favour rich neighbourhoods, whose residents turn to the police more readily, over poor ones where crime is rife. And crimes such as burglary and car theft are more consistently reported than drug dealing or gang-related violence.
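
A toy simulation makes the worry concrete. In the sketch below (all numbers invented), two areas suffer identical true crime rates, but one starts with more recorded arrests; if patrols chase arrest counts, and arrests happen only where police are present, the initial imbalance never corrects itself.

```python
import random

# Two areas with identical true crime rates; area 0 merely starts with
# more recorded arrests. All numbers are invented.
random.seed(1)
TRUE_CRIME_RATE = [0.3, 0.3]   # the underlying crime is the same
arrests = [20, 10]             # a historical imbalance in the records

for day in range(1000):
    # Send the patrol wherever the records show the most arrests.
    patrolled = 0 if arrests[0] >= arrests[1] else 1
    # Arrests are only made where police are present to observe crime.
    if random.random() < TRUE_CRIME_RATE[patrolled]:
        arrests[patrolled] += 1

print(arrests)  # every new arrest lands in area 0; the gap only widens
```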

But mathematical models might make policing more equitable by curbing prejudice. A suspicious individual’s presence in a “high-crime area” is among the criteria American police may use to determine whether a search is acceptable: a more rigorous definition of those locations will stop that justification being abused. Detailed analysis of a convict’s personal history may be a fairer reason to refuse parole than similarity to a stereotype.
Technology may also sharpen debates about what people want from their justice systems, and what costs they are willing to accept. For example, software developed by Richard Berk, an American statistician, which is credited with helping to cut recidivism among paroled prisoners in Philadelphia, requires the authorities to define in advance their willingness to risk being overly tough on low-risk offenders or to under-supervise nasty ones.
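
The trade-off can be stated precisely. In the sketch below, which uses an invented 10:1 cost ratio rather than anything from Mr Berk's actual software, declaring the two costs in advance fixes the risk threshold above which a parolee receives intensive supervision.

```python
# Invented 10:1 cost ratio, for illustration only.
COST_OF_MISSED_REOFFENDER = 10.0   # under-supervising a dangerous parolee
COST_OF_FALSE_ALARM = 1.0          # being overly tough on a safe one

# Supervise intensively when the expected cost of leniency exceeds the
# expected cost of strictness: p * C_miss > (1 - p) * C_false_alarm,
# which pins the threshold at C_false_alarm / (C_false_alarm + C_miss).
THRESHOLD = COST_OF_FALSE_ALARM / (COST_OF_FALSE_ALARM + COST_OF_MISSED_REOFFENDER)

def supervision_level(p_reoffend):
    return "intensive" if p_reoffend > THRESHOLD else "standard"

# With a 10:1 ratio the threshold is about 9%: even fairly low predicted
# risks trigger intensive supervision.
print(supervision_level(0.12))   # -> intensive
```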
This sort of transparency about what goes on in predictive systems, and what their assumptions are, may also be a partial solution to worries voiced by Andrew Ferguson, a law professor in Washington, DC. Mr Ferguson fears that judges and juries could come to place too much credence in the accuracy of crime prediction tools, jeopardising justice. If transparency is a good counter to this, it will be important to preserve it as prediction becomes a bigger business and gets further from its academic roots.

It is as prediction moves from places to people that it becomes most vexed. Police attending domestic disturbances in Los Angeles have tried out a checklist, derived from much data-crunching, to determine whether the incident presages violence. Mr Berk is working with authorities in Maryland to predict which of the families known to social services are likely to inflict the worst abuses on their children. Federal officials aim to forecast potential health and safety infringements. America’s Department of Homeland Security is seeking to perfect software which scans crowds or airport queues to detect nervous behaviour such as fidgeting, shallow breathing and signs of a swift heartbeat.

So far, predictions have mostly been made about people who have already had contact with the justice system—such as convicted criminals. The growth of social media provides a lot of crunchable data on everyone else. Firms that once specialised in helping executives measure how web users feel about their brands now supply products that warn police when civil unrest approaches, and help them closely follow crises. Cops in California admit to trawling social networks for early warnings of wild parties. ECM Universe, an American firm, offers software that crawls sites “rife with extremism” to identify people who deserve closer attention.
The legal limits on using social media to fish out likely wrongdoers, or create files on them, are contested. Most laws governing police investigations pre-date social networking, and some forces assert that all information posted to public forums is fair game. But Jamie Bartlett of Demos, a British think-tank, says citizens and police forces need clearer guidance about how to map physical-world privacy rights onto online spaces. He thinks gathering information about how someone behaves on social sites ought to require the same clearance needed to monitor them doggedly in public places. Officers who register anonymously or pseudonymously to read content, or who send web crawlers to trawl sites against their owners’ wishes, would require yet more supervision.

Identifying true villains among the oddballs and loudmouths found by social-media searches is tricky, and most police efforts are embryonic. Evgeny Morozov, an academic and technology writer, thinks the privacy-conscious have more to fear from crime-detection algorithms cooked up by social networks themselves. Some of those firms already alert investigators when they suspect users of soliciting minors. Unlike the cops, they employ clever coders who can process private messages and other data that police may access only with a court order.

These projects make life difficult for many criminals. But smart ones use the internet to make predictions of their own. Nearly 80% of previously arrested burglars surveyed in 2011 by Friedland, a security firm, said information drawn from social media helps thieves plan coups. Status updates and photographs generate handy lists of tempting properties with absent owners. It does not take a crystal ball to work out what comes next.