The precogs were right
February 3, 2016 10:06 PM

strangely, the NYSE is not those maps at all....or any local corporate headquarters...

yes i rtfa...shitty resource allocation targeting minorities - top to bottom, left to right.
posted by j_curiouser at 10:20 PM on February 3, 2016 [2 favorites]

 Kade Crockford, the director of the Technology for Liberty program at the Massachusetts ACLU, says that predictive policing is based on "data from a society that has not reckoned with its past," adding "a veneer of technological authority" to policing practices that still disproportionately target young black men.

Funny how feigning blindness to race leads to racist outcomes.
posted by justsomebodythatyouusedtoknow at 11:19 PM on February 3, 2016 [31 favorites]

But Dolly was attracted by Azavea’s ability to analyze the impact of businesses, churches, and weather patterns on criminal activity. It was also cheaper: Azavea quoted around $50,000 for a year of HunchLab, where PredPol was asking for roughly $200,000.

I didn't need predictive software to see that bilking of taxpayers coming.
posted by a lungful of dragon at 12:18 AM on February 4, 2016 [17 favorites]

How is this not inherently subject to feedback loops? The computer predicts - correctly, and without deliberate racism - that more crime will occur in poor neighborhoods (which correlate with black neighborhoods), which leads to more cops patrolling those areas, which leads to more opportunistic arrests and confrontations due to distrust between the community and the police, which leads to even stronger predictions to prioritize patrolling those same neighborhoods?

Even though the algorithms will likely be free of per se racial bias, this seems almost guaranteed to effectively create additional class warfare and racial discrimination.
posted by Ryvar at 1:07 AM on February 4, 2016 [15 favorites]
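[The feedback loop Ryvar describes can be made concrete with a toy simulation. This is an editorial sketch with made-up numbers, not the logic of any real predictive-policing product: two neighborhoods have the identical underlying incident rate, but recorded crime depends on how many patrols are there to record it, and next year's patrols follow past records.]

```python
import random

random.seed(42)

# Two neighborhoods with the SAME underlying incident rate.
TRUE_RATE = 0.1                 # chance a patrol-visit records an incident
patrols = {"A": 50, "B": 50}    # initial even split of 100 patrols
observed = {"A": 0, "B": 0}     # cumulative recorded incidents

for year in range(10):
    # Recorded crime is what patrols happen to see: more patrols -> more records.
    for hood in patrols:
        observed[hood] += sum(random.random() < TRUE_RATE
                              for _ in range(patrols[hood]))
    # "Predictive" reallocation: next year's patrols track past records.
    total = observed["A"] + observed["B"] or 1
    patrols["A"] = round(100 * observed["A"] / total)
    patrols["B"] = 100 - patrols["A"]

print(patrols)
```

[Because the data the model trains on is itself a product of where patrols were sent, an early random fluctuation gets reinforced: allocation can drift well away from 50/50 even though both neighborhoods are, by construction, identical.]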

I'm going to call this "racism laundering". It's like money laundering, but instead of cash, it's social policies that come out of the process looking clean.
posted by cotterpin at 2:33 AM on February 4, 2016 [73 favorites]

Wow, it was just a year ago I was making fun of the ACLU for predicting this kind of thing.

The future is coming too fast.
posted by mmoncur at 2:42 AM on February 4, 2016 [1 favorite]

"No!" the responsible adult in charge of things said, "You can't have any new toys until you clean your room, St. Louis police."
posted by Wolfdog at 3:26 AM on February 4, 2016 [11 favorites]

Strangely enough, there's a new Numberphile video out: The Mathematics of Crime and Terrorism. Coincidence??

That video explains a sort of prediction similar to earthquake prediction. There's a long-term average rate for the primary occurrence (earthquake), which then increases the probability of a related occurrence (aftershocks). If a house is robbed, for a time afterward there's an increased probability of another nearby house being robbed. Maybe the robbers cased the neighborhood and now know all the good escape routes, or they saw some other interesting target.
posted by zengargoyle at 3:33 AM on February 4, 2016 [1 favorite]
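[The earthquake-style model zengargoyle describes is what the Numberphile video calls a self-exciting point process: a constant background rate plus a boost that decays after each recent event. A minimal editorial sketch of the rate function follows; the parameter values are illustrative, not taken from HunchLab or PredPol.]

```python
import math

def intensity(t, past_events, mu=0.2, alpha=0.5, decay=1.0):
    """Self-exciting rate: background mu, plus an exponentially
    decaying bump for every past event (e.g. a recent burglary)."""
    return mu + sum(alpha * math.exp(-decay * (t - s))
                    for s in past_events if s < t)

# A burglary at t=10 temporarily raises the predicted rate nearby:
before = intensity(9.9, [10])    # just the background rate mu
spike = intensity(10.1, [10])    # shortly after: elevated
later = intensity(20.0, [10])    # the boost has decayed back toward mu
```

[The "aftershock" term is why a fresh burglary makes the model flag nearby houses for a while, then lets the prediction relax back to the long-term average.]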

The future is coming too fast.

Not only that, but it ain't what it used to be ...
posted by oheso at 3:46 AM on February 4, 2016 [2 favorites]

As one who lived and worked (for 30+ years) in a high crime predominantly black inner city neighborhood I can assure you I know a number of residents who wished police computers disproportionately allocated resources based on a failure to reckon with past history.
posted by rmhsinc at 3:48 AM on February 4, 2016 [3 favorites]

Linked in TFA is the real thoughtcrime program.

Imagine being placed on a watchlist because you're at the mean between known violent criminals ...
posted by oheso at 3:52 AM on February 4, 2016

rmhsinc, I am not sure what you mean there. Could you explain a bit? I am not trying to be a dick. I think you were saying that you know from experience that some people in the neighborhood where you worked would have liked to see more police on the street, which would be something worth acknowledging. Neighborhoods, like all groups of people, are not monolithic, and this needs to be borne in mind when setting local policies. I'm not sure how that should play out with regard to this topic, but it's a reasonable point as far as it goes if that is indeed what you mean.
posted by Anticipation Of A New Lover's Arrival, The at 4:31 AM on February 4, 2016 [2 favorites]

Anticipation...yes--that is what I meant. The increased presence of police in the high crime area--whether by bicycle, car, or on foot--was welcomed and sought. I was responding to the statement by an ACLU spokesperson and the statement in the lede. Having lived most of my adult life in two predominantly black neighborhoods (working poor and middle class) I am confident most black residents preferred a strong police presence while waiting for social change and the day(s) of reckoning.
posted by rmhsinc at 5:06 AM on February 4, 2016 [2 favorites]

It's worth noting that poorly trained and socialized police officers with bad attitudes towards their communities, and the departmental cultures that perpetuate these negative characteristics, are an independent phenomenon from the process by which patrols are assigned and resources are allocated by a given department.

Which is to say, if your department and its cops have a shitty relationship with poor and brown people, a fancy computer won't help at all. Yeah, in that situation, bolting on a data-driven prioritization process could very well lead to increasing the burden of bad policing on poor/minority neighborhoods.

However, poorly run and poorly staffed police departments are still the root cause of the problem, not the computer system. Other, better-positioned PDs with stronger community relationships use this kind of technology very well.
posted by BigLankyBastard at 5:11 AM on February 4, 2016 [5 favorites]

If you put the wrong people in the right place, you're still not going to get a great outcome.
posted by aramaic at 6:30 AM on February 4, 2016 [5 favorites]

Huh, the computer tells me what I wanted to hear after I trained it!
posted by benzenedream at 8:59 AM on February 4, 2016 [3 favorites]

That HunchLab had sent him to a location where he may or may not have averted illegal activity was, for the moment, tangential; it would not be clear for months whether crime rates in Jennings might be affected by the program.

This "may or may not" is far from being tangential, it really seems to be the core of pevention: the "may or may not" allows for an extremely wide definition of a factual event. It doesn't really matter if anything happens or if anybody commits a crime. If a crime is committed in a highlighted area, the program was right; if a crime is not committed, the program was still right in that the crime was prevented. Either way, an event was predicted. Whether it took place (as in, was grounded in a specific time and space) or not, it has already happened in the simulacrum.

to put it "in the abstract"
posted by sapagan at 9:12 AM on February 4, 2016 [1 favorite]

Big Data on the Beat - Predictive policing has arrived.
Currently, about a dozen American cities—including Los Angeles as well as Santa Cruz, Atlanta, Georgia, and Tacoma, Washington—are using PredPol, a leading predictive-policing software and analytics program. Many of these cities rolled out the system over the past few years, and they are seeing positive, and sometimes dramatic, results. In Los Angeles, a nearly two-year study by UCLA crime scholars and law-enforcement officials, released this past fall, found that PredPol successfully predicted—and prevented—twice as much crime as human crime analysts did. The LAPD is now using PredPol in 14 of its 21 divisions.
posted by the man of twists and turns at 9:44 AM on February 4, 2016 [1 favorite]

gillian tett in the silo effect has a chapter on brett goldstein, who formerly ran operations for opentable, left to join the chicago police department after 9/11, and eventually ended up building the US' first (i think?) predictive analytics unit to fight (pre)crime:
In early 2007 Goldstein was assigned to the Harrison District, one of the toughest neighborhoods on Chicago’s West Side. (Goldstein, raised in Boston, now lives in Chicago’s Pilsen neighborhood.) After 13 months, he moved to headquarters. While working on the West Side, according to an August Chicago Sun-Times article, Goldstein had “started thinking about how he could design a computer model that could replicate” an officer’s intuition. He had this in mind when he transferred off the street.

In 2009, with the help of a $200,000 National Institute of Justice grant, Goldstein launched his predictive-analytics project. The group would analyze crime data to focus manpower where trouble was most likely to occur.

The germ of the idea was born earlier, back when he was getting his master’s at Chicago. Computer-science professor Leo Irakliotis had Goldstein in his 2005 intensive data-mining course, where students learned to extract patterns from data.

The two worked together on a project to analyze call records from the Oak Park (Illinois) Police, to see who frequently called 911 and hung up. They then predicted the corners most likely to have hang-up offenders.

It’s finding these patterns in seemingly random events that drives Goldstein’s work at the CPD, where he makes forecasts using the entire data system. Data analytics uses time and location patterns to calculate where crime might happen. Goldstein’s group, for example, might show “a wave of burglaries in one neighborhood,” says Irakliotis. “They would use the data to see how it might expand to another neighborhood.”
the platform was shut down under rahm emanuel for political reasons -- 'racist computers' and an intransigent police force, essentially -- and goldstein subsequently moved on to other things... but it looks like it's since been resurrected and metastasizing?

anyway, this is all to say that cathy o'neil -- who's been critical of tett's reporting on predictive policing -- is writing a book on unfair algorithms and data ethics which should be out soon! :P
posted by kliuless at 10:04 AM on February 4, 2016 [2 favorites]

It's not normal for a fortysomething to think "Well, at least I'll be dead before it gets *really* bad" every time he/she reads the news, is it?
posted by entropicamericana at 11:00 AM on February 4, 2016 [1 favorite]

"Throughout his shift, Officer Keener witnessed hints of simmering distrust. At one point, several children danced in the street, which he said was locally understood to be an insult to police."

You have got to be kidding me.
posted by seiryuu at 11:02 AM on February 4, 2016 [6 favorites]

I guess it's possible, but that needs a lot more detail. Maybe if it's a specific thing that's being done, it could be. But really, as a police officer today, you're often already insulting people with your presence because of the history of disrespect your organization has shown the community. So why wouldn't you already know that many members of the community aren't fond of you until you prove you're worthy of respect and trust?
posted by cashman at 11:14 AM on February 4, 2016

Cops have always known where the crime is. I mean, what the hell-- they know where the crime happens, they know the right blocks and even the right houses. That's why everybody on the block gets harassed! That's why everybody in the building gets treated like criminals! That's why the cops don't come when your TV gets stolen-- they don't care, just another petty crime in a shit neighborhood to them.

If you live in one of those places, you don't get regular policing, and regular patrols, and officers who care if something bad happens to you, because you deserve it-- you live in one of those places! All they're doing is outsourcing the same entrenched ideas they already had, so they don't have to change anything! The program told them it was a bad neighborhood; it wasn't their call!
posted by headspace at 11:17 AM on February 4, 2016


This thread has been archived and is closed to new comments