The Failure of Judges and the Rise of Regulators
February 23, 2012 6:54 AM   Subscribe

The Control Revolution And Its Discontents - "the long process of algorithmisation over the last 150 years has also, wherever possible, replaced implicit rules/contracts and principal-agent relationships with explicit processes and rules."
posted by kliuless (25 comments total) 12 users marked this as a favorite
The first link is a very hand-wavy essay. The author makes a lot of assumptions and doesn't back up most of the points with any sort of proof.
posted by burnmp3s at 7:24 AM on February 23, 2012 [2 favorites]

Maybe I'm just dense, but I couldn't make heads or tails of the essay, despite the author's use of bold text, as though shouting the important bits would make them easier to understand.

The first link doesn't define "algorithmisation," nor did a couple of links on the page that I checked. Would you mind pointing us in the direction of what the author means by that, and also explaining what any of this has to do with linear programming?

The article also vaguely mentions contracts and regulations, but I'm also not sure what this has to do with judges and regulations, per the post title.

Here's my stab at understanding the article: Most of the economy is not actually a market, much less a perfect one, so there's a lot of uncertainty. Top-down control* becomes increasingly fragile in the face of uncertainty. (I guess I would agree with both of those points.) Therefore we should..."tackle disturbances and generate novelty/innovation with an emergent systemic response that reconfigures the system."

And that's where he lost me. He might as well have said "therefore we should have a better system."

* It's not clear whether the author means factory-level automation, management-level control, industry-level regulation, or economy-level regulation. Maybe he means all of the above?
posted by jedicus at 7:27 AM on February 23, 2012 [1 favorite]

I think it would be helpful to distinguish between presence/absence of "control" on the one hand and tightness of coupling on the other. Tightly coupled systems are prone to catastrophic failure. Loosely coupled ones aren't as much so. Uncontrolled systems are prone to anything at all.
posted by DU at 7:32 AM on February 23, 2012

Wow. If jargon is the stain of having been over-steeped in theory, that first link is stained.

I have a suspicion that this post would be improved by an explanation of Taylorism, but do not have time to further comment right now.
posted by gauche at 7:37 AM on February 23, 2012 [1 favorite]

It seems to me that, ultimately, he's arguing that "disruption" is what truly moves things forward, and that the "invisible foot" (i.e., failure) should be allowed to do its job.

The problem I see is that government and business are, by definition, at least partially enemies. Business has to be concerned with business, while government has to be concerned with the larger society. Sometimes those aims coincide, but often they are at odds. Some "control" is necessary and unavoidable.

It gets sticky fast. For instance, if the government imposes no control, then a business might grow to the size that they affect an unhealthy portion of the economy. Then what happens when the "invisible foot" comes along? (We are living that scenario, are we not?)
posted by Benny Andajetz at 7:38 AM on February 23, 2012 [2 favorites]

The first link doesn't define "algorithmisation," nor did a couple of links on the page that I checked. Would you mind pointing us in the direction of what the author means by that, and also explaining what any of this has to do with linear programming?

As far as I can tell he uses the term algorithmisation to mean reducing a system or process involving people to a set algorithm by removing human judgment. So for example taking a group that does cold call sales in a call center and giving them a very detailed script to use rather than letting them use their own judgment and sales skills.
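To make that concrete, here's a toy sketch of the idea (my own illustration, not from the article): the cold-call "script" becomes a lookup table in which every possible customer response maps to exactly one prescribed reply, so the caller exercises no judgment at all.

```python
# A fixed cold-call script as a lookup table: each state holds the
# line the caller must say, plus the next state for each yes/no
# answer. The caller's behaviour is fully determined by the table.
SCRIPT = {
    "start": ("Hi, do you have a minute to talk about our product?",
              {"yes": "pitch", "no": "close"}),
    "pitch": ("Great! Our product cuts your costs by 20%. Interested?",
              {"yes": "close_sale", "no": "close"}),
    "close_sale": ("Wonderful, I'll send the paperwork over.", {}),
    "close": ("Thanks for your time. Goodbye!", {}),
}

def run_call(answers):
    """Walk the script; return the transcript of the caller's lines.

    `answers` is the sequence of customer yes/no responses. Note that
    nothing the caller says depends on their own judgment -- only on
    the current state and the customer's answer.
    """
    state, transcript = "start", []
    answers = iter(answers)
    while True:
        line, transitions = SCRIPT[state]
        transcript.append(line)
        if not transitions:  # terminal state: call ends
            return transcript
        state = transitions[next(answers)]
```

Replacing the salesperson's discretion with this table is the "algorithmisation" move: the process becomes auditable and uniform, at the cost of any response the table's author didn't anticipate.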
posted by burnmp3s at 7:38 AM on February 23, 2012 [1 favorite]

This is a fascinating perspective, even if it's so abstract that it's hard to find a falsifiable conclusion. From my IT background I would have focused more on telecoms and car manufacturers than banks. I'm intrigued that he never uses the phrases socialization of risk and regulatory capture.

I'd argue that the uncanny valley analogy is a little too pat, and it's not an issue of increasing 'complexity' leading to fragility, unless complexity in the author's mind is any system that reduces the need for human decision making. I'd actually term those systems more 'elegant', but the risk is that with less room for human input, the psychological human elements guarantee failure on the macroeconomic scale in the form of either efficiency destruction through rent seeking (utilities, insurance, financial transaction oligopolies, usury), fraud (CDS), or usurpation of control (rogue traders, industry co-option of government regulation and control).

The author also seems to imply in at least one spot that algorithmic improvements and the control revolution will always continue to improve efficiency, but that only applies in a perfectly competitive market. Any real market will eventually diverge into using control to drive rent seeking even where such behavior is absolutely counter to efficiency in terms of cost control (health insurance, telecoms, utilities).

I do wish the author wouldn't bother with the links, most of which go to a paywall or don't obviously support the point at hand without an insider's knowledge. I still can't figure out what Zara was about.
posted by BrotherCaine at 7:43 AM on February 23, 2012

Once again, I should have finished before commenting.
posted by BrotherCaine at 7:48 AM on February 23, 2012

The second link was actually a really nice read in the history of science written by one who was on the front lines of the research. A++.
posted by kaibutsu at 8:37 AM on February 23, 2012

What is the term for an article or paper that does not clearly state its thesis in the first paragraph (or, sometimes for dramatic build-up, the first few)?

Oh, yeah: gibberish.
posted by IAmBroom at 9:06 AM on February 23, 2012

I really like the first link for what it is—one person's take on the fundamental processes of our modern socioeconomic system. Yes it's casually written and personal and somewhat far-ranging, but what I got out of it was upturning the very ingrained idea that the Computer Revolution is a positive leap forward (e.g., he recontextualizes it as the Algorithmic or Control revolution, and I agree with this choice of diction).

The second article is useful as a piece of intellectual/technical interest, but I see its politics as being subverted by the things being said in the first article. The first article is much more directly related to today's problems.
posted by polymodus at 9:09 AM on February 23, 2012

What is the term for an article or paper that does not clearly state its thesis in the first paragraph

I feel that given it's a blog post it has to be read with that in mind. I agree that the opening is weak, but I would point out that the first two paragraphs are really about him coming to terms with an opinion he didn't hold in prior postings. I can cut him intellectual slack for this.
posted by polymodus at 9:11 AM on February 23, 2012

take for example debt relations [1,2,3,4,5,6 so far...] in the organisation of society and, say, norbert wiener and cybernetics and whether they are (can be) isomorphic... then look at today; what are 'price' signals actually telling you (or not?).

now to make this relevant consider the role of finance (and financialisation) in context: there was a brief period during the transition from the traditional economy to the control economy during the early part of the 19th century... the displacement of traditional controls (familial ties) with the invisible hand of the market... Much of the "innovation" of the control revolution was not technological but institutional: limited liability, macroeconomic stabilisation via central banks etc.

to further summarise Bruce Wilder whom Parameswaran quotes at length:
The elaborate theory of market price gives us an abstract ideal of allocative efficiency, in the absence of any firm or household behaving strategically (aka perfect competition). In real life, allocative efficiency is far less important than achieving technical efficiency...

Economic rents are pervasive, but potentially beneficial, in that they provide a means of stable structure, around which investments can be made and production processes managed to achieve technical efficiency...

In the actual, uncertain world, with limited information and knowledge, only constrained maximization is possible. All firms, instead of being profit-maximizers (not possible in a world of uncertainty), are rent-seekers, responding to instituted constraints: the institutional rules of the game, so to speak. Economic rents are what they have to lose in this game, and protecting those rents, orients their behavior within the institutional constraints...
SRW@interfluidity suggests "we can work around compromised banking systems and gradually render them obsolete with a combination of 'crowdfunding', social insurance, and a shift of government support away from opaque debt guarantees and towards undiversified equity," while alternative currency systems and public finance have also been proposed with an eye on the moral dimension.

cf. modern monetary theory [1,2,3]
posted by kliuless at 10:49 AM on February 23, 2012 [2 favorites]

I found the first link much more tolerable if I mentally replaced every instance of "algorithm" with "science" (or "mathematics") and decreased all the dates by 200 years.

(Full disclosure: I design, analyze, study, publish, and teach algorithms for a living. No, I will not get off this guy's lawn.)
posted by erniepan at 10:54 AM on February 23, 2012

I found the first link much more tolerable if I mentally replaced every instance of "algorithm" with "science" (or "mathematics") and decreased all the dates by 200 years.

Well, I think it is too simplistic to label this guy a type of Luddite or technophobe. There is supporting academic literature for his opinion even if he himself isn't aware of it.
posted by polymodus at 11:16 AM on February 23, 2012

There is supporting academic literature for his opinion even if he himself isn't aware of it.

Do tell.
posted by regicide is good for you at 11:37 AM on February 23, 2012

Hi - I'm the author of the first article. Thanks for the link and the comments.

On the "uncanny valley" thesis, the article may make more sense if you read this earlier post called 'People Make Poor Monitors for Computers' - I am not trying to make a point that algorithms are worse than human judgement. I am simply saying that with increasing complexity, human beings simply cannot comprehend the systems they are supposed to monitor. In this earlier post, I compare some of my personal experiences in finance with the Air France crash a few years ago. If you want a more academic work on this subject, James Reason's book 'Human Error', from which I quote liberally, is excellent.

On the arguments regarding capitalism and the control revolution, most of what I'm saying can be extracted out of James Beniger's book 'Control Revolution' which is excellent and highlights just how much of a continuum the last 150 years have been in this regard.

On the absence of academic rigor, mea culpa. But in my defense, it isn't an academic article. I have written posts in the past full of detailed academic arguments (like this one) but no one reads them! I am happy to provide more justification/detail if anyone wants it.

And I would stress that this is not a technophobic argument - I have spent most of my career in an incredibly quantitative, tech-heavy field and the "sweet spot" which I refer to in the post is mostly algorithmised. This is an argument drawn from experience that the road to "perfection" is not as easy as it's made out to be.
posted by ashwinp at 12:19 PM on February 23, 2012 [7 favorites]

BrotherCaine - On Zara: thanks for pointing that out. I have changed the link to a freely available link that hopefully explains it better.

The gist of it is that Zara is a popular case study for how to implement an incredibly quick-turnaround supply chain where production reacts almost instantaneously to actual sales in stores. It also enables them to carry a much greater variety of clothing per season and cut those units that don't sell, increase production of those that do, etc. The essence here is feedback control, which has been the essence of the control revolution since its inception and is essentially an equilibrating, stabilising technology, as all control technologies are.
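A minimal sketch of that feedback idea (my own illustration; Zara's actual system is of course far more elaborate): each week, production is adjusted in proportion to the gap between target stock and actual stock, so output tracks realised sales rather than an up-front forecast.

```python
# Proportional feedback control of stock: production chases the
# error between target stock and actual stock, plus observed sales.
def simulate(demand, target_stock=100.0, gain=0.5):
    """Return the weekly stock level under simple feedback control.

    `demand` is the sequence of weekly sales. `gain` sets how
    aggressively production corrects the stock error; 0 < gain < 1
    gives a stable, non-oscillating response here.
    """
    stock, production, history = target_stock, 0.0, []
    for sold in demand:
        stock += production - sold        # last week's output arrives, sales deplete
        error = target_stock - stock      # the feedback signal
        production = max(0.0, gain * error + sold)  # correct toward target
        history.append(round(stock, 1))
    return history
```

Under steady demand the stock deviation halves each week (with gain=0.5) and settles back at the target: that self-correcting, equilibrating behaviour is what the feedback-control framing is pointing at.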

For the history of feedback control, Otto Mayr's 'History of Feedback Control' and Stuart Bennett's 'History of Control Engineering 1800-1930' and 'History of Control Engineering 1930-1955' have all the gory details.
posted by ashwinp at 2:10 PM on February 23, 2012 [1 favorite]

Wouldn't Walmart be a better-known example of the supply chain renaissance?
posted by Chekhovian at 3:56 PM on February 23, 2012

Thanks Ashwinp, I clearly lacked the framing to understand your definitions and assumptions; also, I shouldn't have commented without reading more. I look forward to your future musings.
posted by BrotherCaine at 4:15 PM on February 23, 2012

Enough of this sophistry. Where are the formulas, the truth tables and state diagrams? I demand infographics.
posted by humanfont at 4:18 PM on February 23, 2012

Best of the web, kliuless, post and comment. Thank you.
posted by carping demon at 11:17 PM on February 23, 2012

Chekhovian - I chose Zara simply because I'm more familiar with them (I'm from London - they're quite popular here. Walmart - not so much!).

BrotherCaine - not your fault for not understanding all the things I left undefined. Thanks for reading.
posted by ashwinp at 1:37 AM on February 24, 2012

Hi Ashwin,

Thanks for stopping by, and great post - I have some questions for you :-) Coming from an IT background, the business cycle oscillation & gradual degradation seem analogous to the accretion of cruft & addition of new features that happens in many large software projects. So, by analogy, might not this be a problem of architecture? Wouldn't a "small pieces loosely joined" philosophy as embodied in the UNIX toolset or the web serve us better than an essentially monolithic architecture?

Secondly, the whole post seems to posit an inevitable centralisation, but doesn't the democratisation of IT itself allow people opposing the accretion of state-corporate control to turn these weapons against them? e.g. OWS Android applications for reporting arrests, or the NYCGA website/forums using de-militarised computer network technology.

Your post touched on lots of things I've been thinking about recently and has given me a lot of new reading material, so thank you :-)
posted by ianso at 2:12 AM on February 24, 2012

Ian - the connection with large software projects is a valid one. I did not delve into it because I'm still not sure exactly what to say about this connection.

Let me take a personal example which may explain better where I'm coming from - when I started out in markets working on modelling/pricing/trading financial derivatives etc, we used imperfect but incredibly intuitive and comprehensible models. You really had to use a lot of discretion and care to get good results but the system never failed in incomprehensible ways i.e. it failed often but you knew why pretty easily.
Over the years, the systems have become more all-encompassing i.e. even novices can use them and they try to take care of every eventuality without human discretion. But now some of them tend to fail in ways that would baffle the user or modeller. You may even understand each component of the system but the combined emergent behaviour would be too complex for you to understand fully. Even if you wanted to go back to the earlier ways, the newer employees simply don't have the expertise to use the old system any more. Essentially, moving from a tool-like system to a black box has made things easier to implement/use but harder to fully understand. And every time things go wrong, another piece of code gets added on to take care of the problem. And we understand the system even less.

So unless we're sure that the algorithmic system needs no human monitoring whatsoever and can deal with any eventuality, we're maybe better off with a simpler, more comprehensible system.

Also, stability makes this problem of increasing complexity worse. Part of the reason why most large banks have such insanely complex systems for tasks such as sending money from point A to point B is that all the various players in this space have been there for so long - occasionally we need an outsider to start from scratch again. In most of the tech world, this regeneration happens anyway but in most of the rest of the economy, not so much.
"Small pieces loosely joined" is certainly more tool-like but I think it requires that users not be novices. A lot of the evolution from small pieces to large monolithic in my part of banking happened so that we could roll out the system to be used by even novices or people from other departments etc.

In the absence of disruption, centralisation is inevitable but so too is the eventual collapse. The longer it goes without a collapse, the harder the fall. But collapse in this context is not all bad, it simplifies and regenerates as well.

Thanks for reading!
posted by ashwinp at 6:21 AM on February 24, 2012
