"They didn't know what they were doing, so they tried everything"
July 31, 2013 11:45 AM

 
You know, the older I get the less I think the reflexive laughing at "lack of vision" stories is funny. Instead of mocking von Neumann for preferring machine code...why not try it? On an IBM 650 especially, it might actually have some advantages.

And in general his point seems to be about how lame humans are in not keeping up with the pace of technology, as though change were an absolute good and the faster the better. This is not necessarily the case.
posted by DU at 12:23 PM on July 31, 2013 [4 favorites]


Well, whatever camera they were using to film that in 1973 was certainly ahead of its time.
posted by grog at 12:28 PM on July 31, 2013


DU, I don't think you watched the whole thing. That wasn't his point at all.
posted by gwint at 12:36 PM on July 31, 2013 [4 favorites]


Prolog! There's a technology of past futures.
posted by boo_radley at 12:44 PM on July 31, 2013


As gwint mentions, that wasn't his point. His point was that if you approach programming without the decades of dogma, maybe you'll come up with something new.

Microsoft still sucks, tho.
posted by slater at 12:46 PM on July 31, 2013


That SOAP sure is a different SOAP than what I'm used to...

(Which reminds me of when I first heard about my current job and the listing mentioned something about RPG experience, which really confused me: why does an insurance company care about roleplaying games? Of course it's actually a midrange IBM language. Which has plenty of its own baggage.)
posted by kmz at 12:55 PM on July 31, 2013


why does an insurance company care about roleplaying games?

Or rocket-propelled grenades?

...although that would be an interesting claim to read, for sure.
posted by Rangeboy at 12:58 PM on July 31, 2013 [1 favorite]


My old advisor always said "research is what you do when you don't know what you're doing"

At first, I thought it was a clever re-hash of "those who can't, teach" (equating professors to teachers, which itself is wrong).

I think some of the talk amplifies a more correct interpretation, about creativity and exploration and trying new things.
posted by k5.user at 12:58 PM on July 31, 2013 [1 favorite]


Yup. As an industry, software development tends toward a narrow set of currently-fashionable tools and "best practices" rather than a broad base of diverse tools and problem-solving techniques.

I don't think it's alone in this respect, and there are probably some good reasons for it. Chief among them is probably that at some point, rather than giving all your time to the eternal search for a better tool, there's a need to settle on something and just get the job done. Possibly as influential, or more so, is the fact that fashionable stuff is an easier sell to whoever you're accountable to (and a harder target if/when something goes wrong).

I share the speaker's wish that we were less apt to converge on a small set of solutions, though, and one of the things I love most about the explosion of the web is that it's sort of served as a pushback against a monoculture.
posted by weston at 1:12 PM on July 31, 2013 [2 favorites]


I think he is ignoring two things: practicality and the desire for backwards compatibility. Sure, inventing entirely new bodies of knowledge daily is cool, but most of us have software to maintain. It isn't just pigheadedness, it is a desire to get paid.

Also, I'm not so sure any of this stuff has been forgotten, or is even surprising. I'm not even sure that is what he is saying, since most of this stuff is actually in daily use. The things he talked about are so common nobody thinks twice.

  • In 1973 SNOBOL and regexes were compiled into finite automata. Not only is this technology still in use, we have improved on it. Nowadays we use backtracking engines that compile the pattern into a tree, which is faster in common cases. This type of technology was generalized and led to "model-oriented architecture" and elaborate ways to model data. (A rough sketch of the automaton approach is just after this comment.)

  • We have programs where you can build a bridge and simulate it without the program knowing anything about what a bridge is. Here is a site devoted to them.

  • With the actor model (and I'm no expert in Erlang) you still need a way to pass messages. Using shared memory and locks is a primitive, a way to implement message passing, not message passing itself. Using locks is not some sign you are a bad engineer. (There's a small sketch of this just after this comment.)

    We've stuck with speeding up single-processor architectures because it gave immediate speed boosts to existing programs. Starting 10-25 years ago we also moved to multiprocessor architectures in a very practical way: moving mathematical operations that benefit from specific hardware onto specialized processors. Who here has a GPU?

    In more recent years, parallel and concurrent programming has become a hot topic as we started pushing the limits of a single core.

  • With regards to programming in text files: he is confusing data with its representation. It is entirely immaterial to me how my code is stored. It is entirely coincidental when one class per file maps well. Using Visual Studio, I can view my classes and methods in exactly the way he describes. Why don't we store code in some completely different way? Backwards compatibility with compilers. It turns out that is more important than getting rid of text files. (See the last sketch just after this comment.)

posted by Ad hominem at 1:17 PM on July 31, 2013 [5 favorites]
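
A rough sketch of the automaton idea from the first bullet, with the toy pattern a(b|c)*d hand-compiled into a transition table instead of being parsed from text (a real engine would build the table for you, but the matching loop is the same shape):

    # states: 0 = start, 1 = seen 'a' (loops on b/c), 2 = accept (seen the final 'd')
    TRANSITIONS = {
        (0, 'a'): 1,
        (1, 'b'): 1,
        (1, 'c'): 1,
        (1, 'd'): 2,
    }
    ACCEPTING = {2}

    def matches(text):
        state = 0
        for ch in text:
            state = TRANSITIONS.get((state, ch))
            if state is None:          # no transition: reject immediately
                return False
        return state in ACCEPTING

    assert matches("abbcd")
    assert not matches("abbc")

One table lookup per input character and no backtracking, which is roughly the style of engine the bullet is talking about.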
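
A minimal sketch of the actor-model bullet's point: a "mailbox" for message passing built out of exactly the primitives it supposedly replaces, shared memory plus a lock. (Python's queue.Queue does the same job more carefully under the hood.)

    import threading
    from collections import deque

    class Mailbox:
        def __init__(self):
            self._lock = threading.Lock()
            self._messages = deque()

        def send(self, msg):
            with self._lock:            # the lock is the primitive...
                self._messages.append(msg)

        def receive(self):
            while True:                 # ...message passing is layered on top
                with self._lock:        # (busy-waits; a real mailbox would use
                    if self._messages:  #  a condition variable instead)
                        return self._messages.popleft()

    inbox = Mailbox()
    threading.Thread(target=lambda: inbox.send("done")).start()
    print(inbox.receive())              # prints "done"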
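
And a small sketch of the text-files bullet: once the text is parsed, the code is a tree, and a tool can present it per class or per method no matter how it happens to be stored on disk. Python's ast module stands in here for whatever Visual Studio does internally.

    import ast, textwrap

    source = textwrap.dedent("""
        class Greeter:
            def hello(self): ...
            def goodbye(self): ...
    """)

    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef):
            methods = [n.name for n in node.body if isinstance(n, ast.FunctionDef)]
            print(node.name, methods)   # Greeter ['hello', 'goodbye']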


    DU: Instead of mocking von Neumann for preferring machine code...why not try it?

    There is the feeling of "being in the zone" with the code that you write. And then there's the feeling of "being fucked" with the code that someone else has written, or that you've written but have forgotten how it works. Higher level languages help reduce the latter feeling, and I can't imagine enjoying parsing machine code as part of my job, if it wasn't for making a better, higher level tool. That is a flavor of self-flagellation I don't have time for.

    That said, I'm reading Dyson's book "Turing's Cathedral" right now, and Von Neumann comes off as a bad-ass.
    posted by hanoixan at 1:39 PM on July 31, 2013 [1 favorite]


    I think one of the big things that made that era of programming languages so esoteric is that there were a lot of walled gardens around different groups of programmers and types of programming that had their own arbitrary standards. So scientific computing had its own completely different ways of doing things, which were completely different from the way business report generation was done, etc. And most of those differences were for no real reason other than that one group chose one possible preference and another group chose another one to solve the same problem.

    These days there's a lot more convergence around the most popular standard conventions that work the best, while still allowing room to specialize outwards in one direction or another when there's actually a purpose in doing so. For example, the standard convention for all languages these days is to use a small number of non-letter symbols along with english words for the core built-in tokens, and there's no real benefit in making a new programming language that goes the APL route of using a weird set of unintuitive symbols that you need a non-standard keyboard to work with. Whereas a project like Twisted that changes some of the core conventions of the built-in Python library can still exist within Python instead of having to be a completely different and incompatible programming language. People are still coming up with novel ways of programming, they just tend to be on the library level these days rather than the programming language level.
    posted by burnmp3s at 1:59 PM on July 31, 2013 [3 favorites]


    gwint: “DU, I don't think you watched the whole thing. That wasn't his point at all.”

    That certainly seemed to be his point near the beginning. So there's some big reveal later on where he says he was wrong at the beginning? I'm looking forward to that, and I'll be disappointed if it doesn't happen...
    posted by koeselitz at 2:10 PM on July 31, 2013


    There is the feeling of "being in the zone" with the code that you write. And then there's the feeling of "being fucked" with the code that someone else has written, or that you've written but have forgotten how it works. Higher level languages help reduce the latter feeling, and I can't imagine enjoying parsing machine code as part of my job

    There is no more perfect trap of being fucked by someone else's code than when that code is implementing the high-level environment you depend on and it does something surprising and inexplicable.

    I've done several big projects entirely in assembly because there were no other development tools, and while it's time consuming it's not hard, and you never have an excuse to feel fucked with your code because CPU instructions tend to be very definite in their effects.

    The trick is to program from the bottom up, implementing and testing functions of ever greater abstraction (and of course documenting them as you go) until you're mostly making calls comparable to the functions of a higher level language. But unlike the HLL you will have written all those functions yourself, you will know their limitations, and you will be able to refactor them if you find them crimping your style at a higher level.

    Frankly the only HLL's I have ever been fully comfortable with are C (NOT ++), which is not that much better than machine code; VB4-6, which is about the only pretty IDE language that has never stolen days of my time with a serious bug or undocumented behavior; and various tools I've built for myself over the years whose limits I know intimately.
    posted by localroger at 3:15 PM on July 31, 2013
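
    A rough sketch, in Python rather than assembly purely for brevity, of the bottom-up style described above: each layer is written and tested only in terms of layers you built yourself, until the top layer reads like a high-level library call. The "memory" and routine names are hypothetical, just to show the shape of the approach.

        RAM = bytearray(256)          # stand-in for the machine's memory

        # layer 0: the primitives
        def poke(addr, value): RAM[addr] = value & 0xFF
        def peek(addr): return RAM[addr]

        # layer 1: built only from layer 0
        def write_block(addr, data):
            for i, b in enumerate(data):
                poke(addr + i, b)

        # layer 2: starting to look like a high-level language call
        def write_cstring(addr, text):
            write_block(addr, text.encode("ascii") + b"\x00")

        write_cstring(0x10, "OK")
        assert peek(0x10) == ord("O") and peek(0x12) == 0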


    localroger, I guess it all depends on what you depend on, and the quality and turnaround time you expect. I don't know what domain you program in, but it sounds like you do low-level embedded work. If that works for you, and you can put out fast, high quality code on your intimate island, that's awesome and valid.

    I depend on many libraries, languages, and public protocols to get things done, within an acceptable level of speed and quality for me and those who depend on it. And I work with others. I'm happy that I can look at someone else's work and know what they did wrong without having to convert to machine code in my head. And mistakes are made no matter what the language.

    Frankly the only HLL's I have ever been fully comfortable with are C (NOT ++) which is not that much better than machine code

    You can't possibly believe the last half of that statement.
    posted by hanoixan at 3:39 PM on July 31, 2013


    There is no more perfect trap of being fucked by someone else's code than when that code is implementing the high-level environment you depend on and it does something surprising and inexplicable.

    Well the question Von Neumann was vehement about was not "asm or Python", it was "machine code or asm". The only real thing I can think of that using machine language gains you is that you don't need to run the assembler, but assuming you're using a keyboard and a monitor, you still need something to format the instructions as numbers in ASCII or EBCDIC. The world in which the cost of running an assembler was significant compared to other software development expenses was a very different world from the one we live in today.
    posted by aubilenon at 3:50 PM on July 31, 2013 [2 favorites]
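
    A minimal sketch of how little an assembler has to do for a simple machine: at its core it is a lookup from mnemonics to opcode bytes. The three-instruction instruction set below is made up purely for illustration, not any real CPU.

        OPCODES = {          # hypothetical opcodes, invented for this sketch
            "NOP": 0x00,
            "INC": 0x10,     # takes a one-byte operand
            "HLT": 0xFF,
        }

        def assemble(lines):
            program = bytearray()
            for line in lines:
                mnemonic, *operands = line.split()
                program.append(OPCODES[mnemonic])
                program.extend(int(op, 0) & 0xFF for op in operands)
            return bytes(program)

        print(assemble(["NOP", "INC 0x02", "HLT"]).hex())   # prints "001002ff"

    Real assemblers also handle labels, addressing modes, and so on, but the translation step itself was never the expensive part; the expense in von Neumann's day was the machine time and I/O it took to run any extra program at all.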


    I wrote an entire, really crappy, game in x86 assembly in college. I tip my hat to anyone who can do that all day every day. There are so many gotchas to keep track of: distance limits on JMP operations, pushing and popping shit in the right order for anything like function calls, handling interrupts.

    Why would you use machine language over assembly?

    He probably objected to register indirect addressing or offset addressing, which I believe is impossible in machine language? Anyone know?
    posted by Ad hominem at 4:04 PM on July 31, 2013 [1 favorite]


    in assembly... you never have an excuse to feel fucked with your code because CPU instructions tend to be very definite in their effects.

    Why would adding a useless MOV instruction speed up my program?
    posted by jacalata at 4:09 PM on July 31, 2013 [5 favorites]


    Yeah we talked about this once before.

    A programmer has to know both how the processor's branch prediction will work at any given moment and how operations get broken down into microops to be sure of anything.

    I think the answer was that the peeps doing asm or machine language were working on embedded projects with specialized processors not x86.

    We already knew as far back as the P5 or P6 that Abrash was finding even his IDIV operations didn't take a predictable number of cycles.
    posted by Ad hominem at 4:15 PM on July 31, 2013


    It's worth remembering that von Neumann was programming machines that were designed to be programmed in machine code by humans; if you look at their architecture (the original MIX from Knuth's Art of Computer Programming is a good example) the structure of the program word is meant to be human-readable in ways that more modern CPU's never were.

    which is not that much better than machine code

    You can't possibly believe the last half of that statement


    Well I'm a very experienced ASM programmer and I know how to build my tools, and for most purposes I either know which legacy project to go to to steal what I need or I've done it so many times that doing it for the new CPU is a cakewalk. (I wholeheartedly concur with Chuck Moore as to the value of learning how to do these things, as even when you are in an environment where you don't need them you will have an idea what is going on when things go wrong.)

    C is a great time saver compared to writing raw ASM but it lets you trip and accidentally shiv yourself in all the same ways asm does, yet is structured to invite n00bs who are not qualified to puzzle out such difficulties to use it. It is absolutely the worst language imaginable (well, other than for entirely different reasons Javascript) for a beginning programmer. Asm is better for learning because you at least understand why you can get in so much trouble, and when you graduate to more convenient environments you can appreciate what they're doing for you and what their unspoken limitations might be.

    It is probably worth remembering that when I talk of asm that's the thing von Neumann was railing against, but that's because CPUs ranging from the 6502/Z80 through the 80386, which is where I mainly live, are not designed like 60's mainframes for word codes that can be interpreted by direct inspection. And on the really modern stuff the arcane tricks to eke out a little more performance do require the attention of another machine to fully finesse them. And if you aren't writing a device driver or doing stupid shit like using double precision floats for everything, modern machines are so fast that most code doesn't need to be optimized that way. Since moving to Windows (3.1) in 1994 I've never felt the need to code asm on the PC, although I had to do it all the time with my DOS-based system in the 1980's.
    posted by localroger at 4:38 PM on July 31, 2013


    It took me way too long to figure out that there was a camera in the head of the overhead projector. Clever!
    posted by zsazsa at 6:39 PM on July 31, 2013


    [C] is absolutely the worst language imaginable (well, other than for entirely different reasons Javascript) for a beginning programmer.

    Having dealt with a generation of MIT grads who were taught Scheme as their first language, I can definitively imagine something worse.
    posted by Tell Me No Lies at 10:24 PM on July 31, 2013 [3 favorites]


    Well I'm a very experienced ASM programmer and I know how to build my tools

    Then I think I was replying to a statement which you didn't mean to say (that C wasn't much better than machine code), when you actually meant ASM. That's a far more believable statement :)
    posted by hanoixan at 6:47 AM on August 1, 2013


    Yeah when von Neumann was making his epic rant there was a major difference between "machine code," which was sometimes entered directly into the front panel of the CPU by hand, and "assembly" which was the result of an involved process by comparison.

    By the time I came along the division had blurred; a hex editor, assembling hex editor, and assembler were all programs, and there were no more CPU control panels and CPU words were no longer laid out in human-sensible divisions. (Actually the 8080 opcodes were, but only if you used octal, which nobody did because 3 does not divide into 8 evenly.) At that point it's more about mass storage options; my first computer had no disk drives, and running an assembler was a PITA compared to using a hex editor that understood op-code mnemonics and keeping a handwritten list of address locations.

    In fairness, a similar gap is probably what motivated von Neumann; you could be interacting directly with the machine, sharpening your understanding and getting stuff done, not dicking around with rolls of paper or magnetic tape to get a result you wouldn't be able to monitor through the panel if you wanted to. Similarly I could type code into the assembling editor and just run it and directly inspect the result, or go through an elaborate process of swapping tapes to assemble and run each change. Once you have floppy drives none of that applies any more, and ASM effectively is what you mean by "machine code."
    posted by localroger at 10:01 AM on August 1, 2013 [3 favorites]
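
    For what the parenthetical about octal means: if I'm reading the 8080 encoding right, a MOV instruction is the bit pattern 01DDDSSS, so written in octal the second and third digits are literally the destination and source register numbers. A small sketch, to be checked against a real opcode table rather than trusted:

        REGS = ["B", "C", "D", "E", "H", "L", "M", "A"]   # register codes 0-7

        def decode_mov(opcode):
            if opcode >> 6 != 0b01:
                return None                               # not in the MOV block
            dst, src = (opcode >> 3) & 0o7, opcode & 0o7
            return "MOV {},{}".format(REGS[dst], REGS[src])

        print(oct(0x53), decode_mov(0x53))   # 0o123 -> MOV D,E
        print(oct(0x78), decode_mov(0x78))   # 0o170 -> MOV A,B

    (The one exception, as far as I know, is 0o166 / 0x76, which would be MOV M,M but is actually HLT.)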


    I love Bret Victor. I'd like to get him and Edward Tufte in a room together and see what happens.
    posted by madred at 1:58 PM on August 1, 2013


    That von Neumann's resistance to an assembler vs friendly machine code is arguably based on sound thinking for time/tools/problem domain is neither too surprising nor particularly antithetical to what I see as Victor's point.

    In fact, it reinforces it. First in the specifics of that judgment: the common wisdom is that machine code is completely unsuitable for humans to work in. If I hadn't had an early educational experience with a microcontroller where the only interface at hand was a hex-alphanumeric keypad and a small eight-segment display, and I'd started programming 20 years later with Python or Ruby, I'd probably believe it wholesale. There are problem domains where that's all you need, where writing an assembler could even be a waste of time and using it would be unnecessary overhead.

    But it reinforces it when you zoom out and look at history, too: JvN may have been "right" in the sense that he had some very good/smart reasons for his judgment, but that may not have prevented him from being stuck on a local maximum. And the problem with local maxima, of course, is that the way to something higher starts with a trip downhill and a slog across the valley.

    If that's true, what it means is that it's not just stupid, visionless, or lazy people who end up converging on a limited-capacity / narrowly adapted set of tools. It means maybe even some of the smartest human beings to ever walk the planet can have that problem, while most of us mortals probably get easily stuck at a local "adequate" instead of a maximum. And that means that I'm not immune, and I'd better be doing some work to either mitigate the problem or live with that fact.
    posted by weston at 1:54 PM on August 2, 2013 [2 favorites]


    Now that I've finally had time to watch the whole OP I see that it's quite brilliant. Past the "wouldn't it be tragic if in 40 years..." snark are some really sharp insights.

    The sad thing isn't that any of these other principles would be sure to work, it's that we've stopped trying anything different and are trapped on our own local maximum. Commercial success is a kind of curse in that regard because it becomes really, really hard to imagine setting up economies of scale for a different architecture, or different semiconductor chemistries or anything else that doesn't play well with the other existing infrastructure, whether that be compilers that tweak micro-op latencies or lithographic manufacturing techniques.

    For the last few years I've been doing quite a bit of work -- professionally, even -- with the Parallax Propeller, a CPU so removed from ordinary CPU metrics that it literally made me gasp when I realized how it worked. It's a very small step away from the paradigm the OP is poking, but it is a step, one that was thinkable because Parallax is a small company whose educational market doesn't really need the cutting edge, and they can target a niche that doesn't promise billions of sales. They wrote a true teaching language that really doesn't have GOTO and has an absolutely wicked expression evaluator, and built absolutely democratic interchangeable CPU cores with identical functionality and all capable of running interpreted high level code from the large but slow shared RAM or generating video in software. Tasks handled on other CPU's by the interrupt system are handled on the Propeller by using another core.

    The market being the market, though, Parallax is currently busy dressing up the Propeller to run sequential C code using all but one of its eight parallel cores as mere helpers to implement I/O. So I guess we all get sucked in.
    posted by localroger at 6:06 PM on August 2, 2013 [1 favorite]
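
    A very rough analogy, using Python threads as stand-ins for Propeller cogs, of the "no interrupts, just use another core" idea described above. read_pin() and handle_pulse() are hypothetical placeholders for real hardware access, not anything from Parallax's actual libraries.

        import threading, time

        def read_pin():             # hypothetical: sample some input line
            return 0

        def handle_pulse():         # hypothetical: respond to a rising edge
            print("pulse!")

        def watcher():              # a dedicated "core" that only watches the pin
            last = read_pin()
            while True:
                now = read_pin()
                if last == 0 and now == 1:
                    handle_pulse()
                last = now
                time.sleep(0.0001)  # a Propeller cog would simply spin

        threading.Thread(target=watcher, daemon=True).start()
        # ...meanwhile the "main core" gets on with its own work, no interrupt handlers anywhere.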


    For a more large-scale commercial example, the PS3 was less successful for being more different. And it's not really that different.
    posted by aubilenon at 1:05 AM on August 3, 2013




    This thread has been archived and is closed to new comments