Preventing Computerpocalypse
October 4, 2017 10:58 AM

More and more, critical systems that were once controlled mechanically, or by people, are coming to depend on code. The attempts now underway to change how we make software all seem to start with the same premise: Code is too hard to think about. Before trying to understand the attempts themselves, then, it’s worth understanding why this might be: what it is about code that makes it so foreign to the mind, and so unlike anything that came before it. Although code has increasingly become the tool of choice for creating dynamic behavior, it remains one of the worst tools for understanding it. The point of “Inventing on Principle” is to show that you can mitigate that problem by making the connection between a system’s behavior and its code immediate.
posted by blue shadows (27 comments total) 31 users marked this as a favorite
 
Bret's lab is mucking about for grants now, because they only have a few months' runway.
posted by hleehowon at 11:27 AM on October 4, 2017 [2 favorites]


The article talks about Inventing on Principle but doesn't actually link to it. Here's the video, here's a transcript. This is one of the few cases where it's actually worth watching the video instead of reading the transcript, though (the demos are very visual).

Another link worth reading, his very detailed critique of Resig's Khan Academy programming platform (the one discussed at length in the article), which doubles as a fascinating look into what programming tools designed for humans could look like.

There's a throwaway sentence in the article about Therac-25, a machine that killed several patients because of a software error. As part of my CS degree, I took a required operating systems course that devoted two whole classes (which were two hours each) to discussing the incident. The errors involved were related to the concepts we were learning, but the broader point of the lecture was to drive home that as future software engineers, our mistakes and carelessness could have real cost and impact.

There's a comment the professor made at the end of the lectures that really stuck with me. The most "prestigious" companies, the ones that everyone in our degree aspires to get into, are companies like Google, Twitter, Airbnb. You're looked down upon if you don't end up there; you're considered an inferior programmer. But, she said, if all the "inferior" programmers are the ones programming our medical equipment, our cars, our 911 systems...

It's sort of depressing when you think of how the most highly trained minds of our generation are all working on advertising, and someone like Bret Victor who decides not to can barely get research funding.
posted by perplexion at 11:41 AM on October 4, 2017 [40 favorites]


the problem of objects. a beginning/mid programmer can be productive (profitable) making code feature-correct out of the gate. otoh, it takes A Lot of mileage to begin thinking in objects. that timeline is expensive.

but, at the end of the trail, you get code that is
securely written
feature-correct
maintainable
testable
tested
resilient to change (loosely coupled)

today's front-end coders don't like objects or design patterns or SOLID. they like functions. lots of them.

this doesn't dismiss the object disasters like j2ee, where object decomposition is perpetrated to a comic level of granularity. nor does it excuse the misapplication of patterns.

imho, the design-by-principle is intrinsic to an ood approach.

obvs, most of the industry at this time differs.
posted by j_curiouser at 12:02 PM on October 4, 2017 [4 favorites]


Objects don't fix this problem any more than Christianity solves violence.
posted by idiopath at 12:31 PM on October 4, 2017 [13 favorites]


Lisp is the answer to all of this

this is a humor joke, intended facetiously
posted by sandettie light vessel automatic at 12:34 PM on October 4, 2017 [10 favorites]


It's sort of depressing when you think of how the most highly trained minds of our generation are all working on advertising
Capitalism. It's the most efficient and effective system for running a society ever found. As long as the only thing you care about is money.
posted by Bee'sWing at 12:34 PM on October 4, 2017 [3 favorites]


I'm going to follow up because in retrospect that might look like lazy snark.

Object orientation, depending on whose specific idiosyncratic definition you apply, has rules that, if followed carefully, will absolutely improve the situation.

Practice and history tell us that people can be ignorant with Objects as successfully as they are with any other paradigm. If they were using functions properly they could also avoid many of these problems (that is, actual FP, with immutable data and segregated side effects etc.). Functions aren't the solution, Objects aren't the solution, some kind of skill and craft help, but there is no simple shortcut.
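For what it's worth, the mutable-versus-immutable distinction is easy to see in miniature. A toy sketch (the names are illustrative, not from any real codebase):

```python
# The same operation written with a hidden side effect versus as a
# pure function over immutable data.

def add_item_mutating(cart: list, item: str) -> None:
    cart.append(item)       # mutates shared state; distant callers can be surprised

def add_item_pure(cart: tuple, item: str) -> tuple:
    return cart + (item,)   # returns a new value; the input is untouched

cart = ("apple",)
new_cart = add_item_pure(cart, "bread")
print(cart)      # ('apple',) -- unchanged
print(new_cart)  # ('apple', 'bread')
```

Neither style is a shortcut on its own; the point is that the pure version makes the data flow visible at the call site.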
posted by idiopath at 12:36 PM on October 4, 2017 [5 favorites]


It's really, really strange that an article about model-based design mentioned Margaret Hamilton as a founding figure in computer science, but did not mention her invention of Universal Systems Language.
posted by muddgirl at 12:38 PM on October 4, 2017 [2 favorites]


It's sort of depressing when you think of how the most highly trained minds of our generation are all working on advertising

a tiny percentage of google and facebook developers actually work on ads. most of the developers are just conjuring up ways to spend all the money ads make.
posted by GuyZero at 1:13 PM on October 4, 2017 [2 favorites]


There is nothing I can see about any of the tools they described later in the article that would have accounted for the fact that the developers of the 911 system did not anticipate that this system would still be in use after millions of calls had been placed. They talk about Squarespace replacing designers, but it hasn't. Squarespace has designers. They let small businesses share good designers for cheap. That's great. The design still has to happen, though, and you still have to actually keep an eye on your website to make sure that it's continuing to function as well years later. If you set up a Squarespace template and then leave it untouched for a decade, there's absolutely no guarantee that it will continue to look decent or even function in 2027. If you never plan for software to scale beyond a certain point but you also don't ever go back and review any of it ever again, it shouldn't actually be any kind of a shock that the system fails.

The problem of the "inferior" programmers working on the 911 systems is that, well, the public? The public sees fit to throw a lot of money at Amazon and a lot of eyeballs (and therefore money) at Google and Facebook. If you suddenly told every state and local government that the cost of their 911 software was going to triple, they'd have much better software with much better developers, but even though it would still be one tiny line item in their budgets, people would scream. The public has to be convinced that these systems need resources devoted to them. The companies that have resources don't seem to find that code is too complicated to be maintained. It isn't that complicated. It just costs.
posted by Sequence at 2:13 PM on October 4, 2017 [8 favorites]


my speculative interpretation of the 911 counter problem is that they just used an int as a counter, not thinking about what would happen when it rolled over or errored (depending on the language)
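To make that speculation concrete, here's a minimal sketch of the failure mode: a call counter stored in a fixed-width signed integer silently wrapping around. (The 32-bit width and the behavior are assumptions for illustration, not details of the actual 911 system.)

```python
# Simulate incrementing a counter the way a fixed-width signed
# integer would behave, using masking to emulate the wraparound.

def next_call_id(counter: int, bits: int = 32) -> int:
    """Increment `counter` as a `bits`-wide signed integer would."""
    counter = (counter + 1) & ((1 << bits) - 1)   # wrap at 2**bits
    if counter >= (1 << (bits - 1)):              # reinterpret as signed
        counter -= (1 << bits)
    return counter

# A signed 32-bit counter at its maximum rolls over to a large negative:
print(next_call_id(2**31 - 1))  # -2147483648
```

Whether the real system would crash, reject calls, or corrupt data at that point depends entirely on what downstream code assumed about the counter always increasing.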
posted by idiopath at 2:16 PM on October 4, 2017


The root of all of this is unwanted side effects of code. Structured programming, abstraction, information hiding, encapsulation, etc... are all mechanisms to try to stop unwanted side effects in code written by humans.

There is a similar problem with operating systems: there's no (simple/useful/foolproof) way to prevent applications from having unwanted side effects. Sure, there are systems somewhere off in the future that can handle this, like GNU Hurd or Genode, but that is another similar battle that needs to be fought.
posted by MikeWarot at 2:55 PM on October 4, 2017 [2 favorites]


There is nothing I can see about any of the tools they described later in the article that would have accounted for the fact that the developers of the 911 system did not anticipate that this system would still be in use after millions of calls had been placed.

This always gets brought up when people talk about formal methods, and I don't really understand it as a counterpoint, true as it is. If we lived in a world where software only ever failed to perform as desired because of incorrect specifications, we'd be ecstatic.
posted by invitapriore at 3:05 PM on October 4, 2017 [2 favorites]


Who needs formal guarantees, just write more unit tests.
(This is also a humor joke)
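The joke has teeth because unit tests sample behavior, while formal guarantees quantify over all inputs. A toy illustration (the function and its bug are invented):

```python
# A function with a bug in a narrow input range.
def buggy_abs(x: int) -> int:
    return x if x > -10 else -x   # wrong for -9 through -1

# A couple of plausible spot checks pass...
assert buggy_abs(5) == 5
assert buggy_abs(-50) == 50

# ...but checking the property over a whole range finds the bug:
failures = [x for x in range(-100, 100) if buggy_abs(x) != abs(x)]
print(failures[:3])  # [-9, -8, -7]
```

Exhaustive checking only works on tiny domains, which is roughly where model checkers and proof assistants come in.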
posted by Pyry at 3:21 PM on October 4, 2017 [4 favorites]


What the world really needs is a tool for summarizing sprawling Atlantic and New Yorker articles and pointing out the interesting bits.

Victor's presentations were beautiful, but if you scratch the surface you just find more code. Building "visual REPL" tools for domain-specific software development is not a bad idea when it provides more value than it costs, but it is an old one, and it does have real costs: the value is tight feedback loops and visual feedback; the costs are that you have to design in greater generality than the current project requires, and you have to validate that the tools are showing you what you think they are and generating code with the expected behavior. The idea that you could go from his examples to a generally usable visual scientific programming environment with contemporary technology seemed like a pipe dream to me in 2012... Don't know where his work is today and I would love to have been proven wrong... even incremental improvements in software development methods can bring huge benefits.

The TLA section led me to Newcombe's talk. That looks like it could be worth learning.

The automotive software section seems to get to the heart of the matter, to me: People will always, for excellent reasons, be pushing at the boundaries of what can be automated, at least until all human drudgery and limitation can be automated away. That means our systems will always be reaching at least a little further than they can grasp.
posted by Coventry at 4:21 PM on October 4, 2017 [3 favorites]


I thought it was very good, as an article about computer programming targeted at a mainstream audience.
Most such articles are crap; this one had enough meat on its bones to keep me distracted for an entire root canal.

One problem I had with it was that it left out a broad shift that's going on toward formal methods. It presents TLA+ as a little-known thing (and I suppose it is; I've barely heard of it before), but what about haskell/coq/agda/rust/etc., all pushing formal methods into mainstream programming in various ways?

I'm also sceptical of the picture the article paints of programming by editing text going away. (Here I feel its mainstream target audience influenced it somewhat.) Perhaps I'm just clinging onto my 1's and 0's and stone knives, but being able to reason about and specify software at higher levels doesn't mean that we won't be doing that reasoning by way of editing text. More visual and interactive ways of working on software are less likely to be natural when the software is not a video game or GUI, but is itself the embodiment of new abstract concepts.
posted by joeyh at 4:24 PM on October 4, 2017 [4 favorites]


So I actually did get to shove over to the guy's lab and play with the updated demos (I have a brochure they had for some reason) and code on them.

The current iteration he's got is backed by a disgustingly small engine codebase written in an idiosyncratic smalltalk-cum-lua-with-DSL-features dialect. These link up to projectors which sort of enliven literal pieces of paper via a computer vision feedback loop (the projector is next to a camera; do cheapo dot detection on the camera input, project onto the sections of paper that do stuff). Friend did Internet shit with it and had a not incredible coding experience, but it worked. They have their twitter here. You could link up the current system to ipython stuff pretty quickly and easily if you had six months and a decent buncha programmers. Buggy piece of research code to program on, but it exists. I don't know about the helpfulness towards research.
posted by hleehowon at 4:41 PM on October 4, 2017 [1 favorite]


Editing the code is still on a (web) ide, but you literally take pieces of paper and point them to other pieces of paper to index files on the ide
posted by hleehowon at 4:45 PM on October 4, 2017 [1 favorite]


what about haskell/coq/agda/rust/etc., all pushing formal methods into mainstream programming in various ways?

That surprised me, too, but I think the idea of elaborate type systems would be very hard to get across.
posted by Coventry at 5:18 PM on October 4, 2017


Last week on the No Such Thing As A Fish podcast (featuring some of the QI elves), one of the hosts mentioned that Donald Trump recently cut off the funding to a workgroup devoted to ensuring Y2K compliance in the US government.
posted by ricochet biscuit at 5:33 PM on October 4, 2017


They removed some Y2K-related paperwork requirements which are mostly ignored in practice. It's a picayune step in the right direction, I guess.
posted by Coventry at 5:42 PM on October 4, 2017 [1 favorite]


Perhaps I'm just clinging onto my 1's and 0's and stone knives, but being able to reason about and specify software at higher levels doesn't mean that we won't be doing that reasoning by way of editing text.

I thought that subtext of abstraction past a certain point being impossible with text was dumb as hell. It is a beloved straw man of people who want to make programming more accessible, which is a totally noble goal, but its inaccessibility is not primarily a function of its input modality. Bret Victor seems like a very intelligent person with a lot of good ideas, but whether he's doing it purposefully or not I always find him incorrigibly disingenuous in this regard. The Mario-clone demo in particular is a great example, where the interactivity relies in large part on the object in development being complete enough to sustain things like visualizing changes in the coefficient of gravity. I don't know why there seems to be a mostly binary choice of camps here between people who think we should just update HyperCard for the 21st century and people who think Emacs and VT-100 emulation ought to be enough for anybody. It feels like neither of them are generating any truly novel approaches to the problem.
posted by invitapriore at 6:40 PM on October 4, 2017 [4 favorites]


It's a picayune step in the right direction, I guess.

That makes more sense.
posted by ricochet biscuit at 8:11 PM on October 4, 2017


I just took a job where I'll be back to writing C and probably asm. I'm going to miss ARC and GC. But my shit is gonna be faaaaaast...
posted by jeffamaphone at 10:06 PM on October 4, 2017


Metafilter: VT-100 emulation ought to be enough for anybody
posted by Coventry at 6:06 AM on October 5, 2017 [5 favorites]


At last! A topic where I'm eponysterical!
posted by SPrintF at 6:09 AM on October 5, 2017 [5 favorites]


Coventry: "What the world really needs is a tool for summarizing sprawling Atlantic and New Yorker articles and pointing out the interesting bits."

It's called "Metafilter."
posted by Chrysostom at 11:39 AM on October 5, 2017 [6 favorites]



