What is Code? said jesting ftrain
June 11, 2015 6:59 AM

Paul Ford (yes, yes, MeFi's Own) has created a juggernaut of an article / lived experience / beautiful time-sink about coding. At this point I'll shut up so you can pack a lunch and go immerse yourself now.
posted by maudlin (92 comments total) 121 users marked this as a favorite
 
Code-phobic, liberal arts friends? Don't be scared. Let Paul be your Virgil.
I began to program nearly 20 years ago, learning via oraperl, a special version of the Perl language modified to work with the Oracle database. A month into the work, I damaged the accounts of 30,000 fantasy basketball players. They sent some angry e-mails. After that, I decided to get better.

Which is to say I’m not a natural. I love computers, but they never made any sense to me. And yet, after two decades of jamming information into my code-resistant brain, I’ve amassed enough knowledge that the computer has revealed itself. Its magic has been stripped away. I can talk to someone who used to work at Amazon.com or Microsoft about his or her work without feeling a burning shame. I’d happily talk to people from Google and Apple, too, but they so rarely reenter the general population.
posted by maudlin at 7:31 AM on June 11, 2015 [8 favorites]


Why isn't this stuff the responsibility of the new CTO rather than me?
posted by biffa at 7:35 AM on June 11, 2015


Once upon a time I complained that most explanations of how our brains work are comparable to an "explanation" of computers that starts with transistors and logic gates and then says "and if you hook enough of them together you get a computer."

This is that article about computers. With snazzy animations.
posted by Bringer Tom at 7:40 AM on June 11, 2015 [3 favorites]


I am so curious about this, but I find his style unreadable. It actually makes me a little sad that so many people get to enjoy something I find somewhere between soporific & suicide-inducing. So, please everyone, read it and tell me more.

And then you can get one of these.
posted by dame at 7:47 AM on June 11, 2015 [2 favorites]


A computer is a clock with benefits
I really love this article.
posted by Lord_Pall at 7:55 AM on June 11, 2015 [4 favorites]


His pedantry about algorithms is silly. Any program is simultaneously an interpreter of a language defined by the capabilities of its UI and a (usually very specialized) algorithm with that input (usually a stream of events) as its argument.
posted by idiopath at 8:05 AM on June 11, 2015 [4 favorites]


If you'd like a book that describes how computers work, that starts at the absolute beginning (how electricity works) and walks you through each step in a clear way that leads to the next (all the way to spreadsheets), Charles Petzold's book Code is really very good, and easily approachable without a background in technology. Even if you have a technology background it's a good read.
posted by fatbird at 8:14 AM on June 11, 2015 [22 favorites]


I have problems reading this article, mostly because of the dot-com era graphics. Hello, 2001.
posted by ZeusHumms at 8:17 AM on June 11, 2015


If anyone would like to fix the article, here's the github repo. No really. It has a github repo.
posted by zabuni at 8:29 AM on June 11, 2015 [23 favorites]


Any piece of work that says:

"in code as in life ideas grow up inside of languages and spread with them"

and

"For not only are computers as dumb as a billion marbles, they’re also positively Stradivarian in their delicacy."

and

"Upon accessing the Web page the user if logged in will be identified by name and welcomed and if not logged in will be encouraged to log in or create an account. (See user registration workflow.)”

is fine with me. I loved it too.

p.s. just me or a whiff of McLuhan?
posted by unless I'm very much mistaken at 8:31 AM on June 11, 2015 [3 favorites]


+1 on the Petzold "Code" recommendation. It's a very fine book.
posted by adrianhon at 8:48 AM on June 11, 2015 [1 favorite]


Does this benefit highly from interactivity / graphics on the web? I was planning on waiting until I got my paper copy so I could curl up with this on the beach this weekend, but I'm willing to jump on it now if it's likely better online.

the fact I consider this beach reading is probably saying way too much about myself in a public forum...
posted by thecaddy at 8:51 AM on June 11, 2015


It's my age, but I was really rooting for the non-tech executive in his attempts to manage the project...
posted by alasdair at 9:11 AM on June 11, 2015


I'm not a programmer but I enjoyed the interactivity. There's a paragraph about Javascript and node.js and right after that my mouse pointer started changing color and showing the coordinates of where it was. Coupled with the ("you read x words in a minute so you should be done by this time") thing in the menu, it kind of helped crystallize how much stuff can be tracked on the page and why all those Firefox block plugins are kind of necessary.
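
(As far as I can tell, the whole trick is a couple of lines of Javascript; something like this, though the exact code is my guess:)

document.addEventListener('mousemove', function (e) {
  console.log('you are at', e.clientX, e.clientY); // the page sees every move you make
});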

Plus, the minigames were kinda cute. There's one that's Grindr but for code where you pick which code is good. Or picking what code was bugged (how is === correct over ==?) .

I liked most of the 38k words I read but found some of the comments about code creating more code and how software becomes the middleman odd. Cynical? Snarky? Like there was this undercurrent of "code is great but wow do we cause a lot of our own problems" or something.
posted by zix at 9:12 AM on June 11, 2015


And then I went back to the article to find a particular thing and the animation's all "Hey, Barbecue, where's the fire?"

how can i feel so judged
posted by zix at 9:13 AM on June 11, 2015 [1 favorite]


zix: programmers hate code the same way architects hate buildings. Nothing is ever good enough, and some piece of shit someone threw together ages ago somehow becomes a permanent fixture you have to accommodate in all future projects... Even worse if you are the one who made said piece of shit (but didn't intend it to last - there are no temporary solutions in software).
posted by idiopath at 9:15 AM on June 11, 2015 [27 favorites]


> Cynical? Snarky? Like there was this undercurrent of "code is great but wow do we cause a lot of our own problems" or something.

I'm a programmer, and I've never met anyone in my industry who didn't feel exactly this way.
posted by a mirror and an encyclopedia at 9:25 AM on June 11, 2015 [15 favorites]


Really made my morning train ride, but then again I'm a Python dev, so the endless flattery helped.
posted by TypographicalError at 9:28 AM on June 11, 2015 [9 favorites]


It doesn't really require a book or even an article the length of the OP to span the gap between handwaving and actual understanding of what computers do. The author's basic problem is that despite his job description he really doesn't know how the computer works, which is why this article is a bunch of empty verbal flourishes spiced with buzzwords.

The main guts of a computer are registers, busses, the sequencer, and the ALU. Registers store data; they're made of flip-flops, and you can get lots of info on how those are made from gates. All computer registers have an input that tells them to read from a bus, and another that tells them to put what they've stored out on a bus.

Busses are groups of wires linking the registers together. Some, like the memory data and address busses, are visible to the outside world; others shunting data between CPU registers and the ALU generally aren't.

The ALU does math and logic operations. It will have inputs telling it what to do (add, OR, AND, etc.) and it will generally take data from one or two busses and put the result out on another, where a register can be told to grab it.

The sequencer is what flips the bits telling registers to read and write to busses, as well as a few other things like telling the ALU what operation to perform. The sequencer takes as inputs counters driven by the CPU clock and some register bits (such as the instruction word fetched from memory) and uses those to generate an output flipping bus control bits. In older CPUs like the 6502 the sequencer might have been a crazy tangle of gates; in more modern designs it's often a memory holding "microcode."

There are a few other bits, but they're simple. You need at least one counter to drive the sequencer. (This isn't the "program counter," which is a register. It's a lower level entity that drives the sequencer to fetch a program instruction as well as to execute it.) You need decoders, which take all the possible combinations of states of a number of binary bits and break them out to individual outputs (00→0001, 01→0010, 10→0100, 11→1000). You can find the internal logic diagram of a 74LS138 online easily to see how this works. The sequencer uses lots of decoders, as do memory chips, which must take an address and use it to find one of thousands of memory registers.

And really, with those hints and a little quality time with the 7400 series data book, most people would be able to figure out how to build a working computer. Maybe not a very good one, but the rest of what makes them so complicated is just optimizations and details.
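
If it helps to see the whole dance at once, here's a toy sketch in Javascript (the opcodes are invented and everything is collapsed into one loop, but it's the same fetch/decode/execute cycle the sequencer drives):

// registers: program counter, instruction register, accumulator
var reg = { pc: 0, ir: 0, acc: 0 };
// memory on the "bus": LOAD 10, ADD 20, HALT (opcodes 1, 2, 3 are made up)
var mem = [1, 10, 2, 20, 3, 0];

function alu(op, a, b) { // the ALU: does math, nothing else
  return op === 'add' ? a + b : a - b;
}

while (true) { // the sequencer's beat: fetch, decode, execute, repeat
  reg.ir = mem[reg.pc];          // fetch the instruction word into a register
  var operand = mem[reg.pc + 1]; // fetch its operand
  reg.pc += 2;
  if (reg.ir === 1) reg.acc = operand;                           // LOAD
  else if (reg.ir === 2) reg.acc = alu('add', reg.acc, operand); // ADD
  else break;                                                    // HALT
}
// reg.acc is now 30: numbers moved between registers, math done on the way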

And for giggles, here's a guy who figured out how to build one with relays. If you scroll down that page you'll find a neat diagram of his computer's registers and busses.
posted by Bringer Tom at 9:36 AM on June 11, 2015 [8 favorites]


Yeah, maybe you have to be a programmer to love this. It's like Ray Bradbury was a burned-out dev who put all their chips into IronPython back in the day and is now facing having to learn Go, and instead wants to assemble the weird fever dream that has been his life.
posted by lumpenprole at 9:37 AM on June 11, 2015 [3 favorites]


I have problems reading this article, mostly because of the dot-com era graphics. Hello, 2001.

If anyone would like to disable the flashing background graphics (does not affect the animations in the foreground, but does remove the title page):

Chrome
View / Developer / Javascript Console
In the console window, type:
$('#background-canvas').hide()

Firefox
Tools / Web Developer / Web Console
In the console window, type:
$('#background-canvas').hide()
It may ask you to type 'allow pasting' before allowing you to paste into the console, as a measure against running malicious javascript.

Safari
Develop / Show Error Console
In the console window, type:
$('#background-canvas').hide()
If you do not have a Develop menu, you'll first need to go into Safari / Preferences, under the tab 'Advanced', and check the box for 'Show Develop menu in the menu bar' at the bottom of the window.

Any of these will be reset with a refresh of the page, in case you want the graphics back.
posted by frimble at 9:38 AM on June 11, 2015 [1 favorite]


I'm having problems reading the article, primarily because of the horrible typography.

If you know how to override styles in your browser or have something like Stylish installed, I highly recommend applying this to Bloomberg's site before reading all 38,000 words:
p {
    line-height: 1.75em;
}
posted by schmod at 9:47 AM on June 11, 2015


how is === correct over ==?

I haven't gotten there in the article yet, but in Javascript, the double-equals operator does type coercion, while triple-equals doesn't (so '0' == 0 evaluates to true, while '0' === 0 evaluates to false). This can lead to unintended results.
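
A few of the classics, if you want to paste them into a browser console:

'0' == 0            // true: the string is coerced to a number first
'' == 0             // true: the empty string coerces to 0
0 == false          // true: the boolean coerces to a number
'0' === 0           // false: different types, so no coercion, not equal
null == undefined   // true, but null === undefined is false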
posted by uncleozzy at 9:54 AM on June 11, 2015 [5 favorites]


Bringer Tom: "It doesn't really require a book or even an article the length of the OP to span the gap between handwaving and actual understanding of what computers do. The author's basic problem is that despite his job description he really doesn't know how the computer works, which is why this article is a bunch of empty verbal flourishes spiced with buzzwords."

"A computer is a clock with benefits." is a pretty elegant way to think of it. But I'm not sure if you can get there if you're not already there, if you see what I mean.
posted by boo_radley at 9:56 AM on June 11, 2015 [4 favorites]


git add . 
Ugh, no, you want "git add .; git add -u" (but never just one or the other) or "git add --all" or "git add -A" (but not "git add -a") and absolutely never "git add *". And that's why I use Mercurial.
posted by WCWedin at 9:57 AM on June 11, 2015 [4 favorites]


If anyone would like to fix the article, here's the github repo. No really. It has a github repo.

Including commits from rusty (kuro5hin)!
posted by schmod at 10:00 AM on June 11, 2015 [2 favorites]


Imagine all of L.A. programming. East Hollywood would be for Mac programmers, West L.A. for mobile, Beverly Hills for finance programmers, and all of Orange County for Windows.

ORANGE COUNTY ISN'T PART OF LA, PAUL.

Or maybe he means Windows isn't part of programming?
posted by kenko at 10:05 AM on June 11, 2015 [3 favorites]




uncleozzy: "This can lead to unintended results."

I like this understatement. It reminds me of Hirohito's "the war situation has developed not necessarily to Japan's advantage" in the message of unconditional surrender.
posted by Chrysostom at 10:12 AM on June 11, 2015 [1 favorite]


This can lead to unintended results

My competency test for PHP programmers is to ask them what the value of
0 == "null"
is, and why.
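
(Answer, since this isn't actually an interview. This is PHP 5-era behavior, the era of this thread; later PHP versions tightened string-to-number comparison:)

var_dump(0 == "null");  // bool(true): "null" is a non-numeric string, so it coerces to the number 0
var_dump(0 === "null"); // bool(false): strict comparison, no type coercion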
posted by fatbird at 10:23 AM on June 11, 2015 [1 favorite]


WCWedin: And that's why I use Mercurial.

Hint: try actually running it. Paul's example is correct as described.

Hint 2: next time you want to promote Mercurial, try explaining why it's great rather than a reflexive slam, particularly if you aren't willing to actually back up your attack with teaching. Nobody needs yet another flame war about insignificant preferences.
posted by adamsc at 10:34 AM on June 11, 2015 [4 favorites]


Go over to this thread for that sort of thing.
posted by Chrysostom at 10:39 AM on June 11, 2015


Hint: try actually running it. Paul's example is correct as described.

Hint 2: next time you want to promote Mercurial, try explaining why it's great rather than a reflexive slam, particularly if you aren't willing to actually back up your attack with teaching. Nobody needs yet another flame war about insignificant preferences.


Um, whoa, chill dude. The example is bad because it's a very, very bad practice for general use. This specific thing is a super common mistake in git tutorials that sends beginners down a documentation rabbit hole just so that they can figure out which flags trigger the sensible behavior that should be the default. It's worth pushing back against.
posted by WCWedin at 10:47 AM on June 11, 2015 [2 favorites]


> Yeah, maybe you have to be a programmer to love this.

No, you can also be someone who loves good writing. Cases in point:
Code is inert. How do you make it ert?

C is a language you use for building systems; it has the same role in computing that Latin did among Renaissance academics. You won’t often meet a serious practitioner of the digital arts who doesn’t have at least a passing familiarity. The more serious scholars are pretty fluent.

Being an advocate for Smalltalk is a little like being very into Slovenian cinema or free jazz. Some of its advocates are particularly brilliant people. I’m not one of them.

Languages have agendas. People glom onto them. Blunt talk is seen as a good quality in a developer, a sign of an “engineering mindset”—spit out every opinion as quickly as possible, the sooner to reach a technical consensus. Expect to be told you’re wrong; expect to tell other people they’re wrong. (Masculine anger, bluntly expressed, is part of the industry.)
I spent too much of the day (I have work to do, dammit) reading it, though I skimmed a little because he was getting a bit too far into the weeds for me, but basically I'll read anything Paul Ford writes.

Then I read the thread expecting lots of snarky "Well, actually..." And was not disappointed. But there's a lot of appreciation too, which gratifies me. I'll never learn to code, but I'm glad to have seen this post. Thanks, maudlin.
posted by languagehat at 10:51 AM on June 11, 2015 [31 favorites]


languagehat: in my defense, my "well-actually" was a meta-well-actually, taking down unwarranted pedantry in the original
posted by idiopath at 10:54 AM on June 11, 2015


"A computer is a clock with benefits." is a pretty elegant way to think of it.

Only if you have some idea what the benefits are. If you have time to make a Javascript animation of random gates processing random data but you don't have the time to construct one sentence like "what the computer mainly does is move numbers around between registers and sometimes do math on them on the way," then I think it's safe to say you don't really know how it works.

Which I suppose would lead to some uncertainty when your job depends on making decisions about how to make the computer do things. Most of the concerns mentioned in the article aren't even really about programming, they're about fighting more abstract things like language capabilities and APIs to get them to do what you need or expect. But those are not fundamental to the generic craft of coding.
posted by Bringer Tom at 10:57 AM on June 11, 2015


It doesn't really require a book or even an article the length of the OP to span the gap between handwaving and actual understanding of what computers do.

It also doesn't require a book to explain how the big bang works. First the universe was in a hot dense state, then it expanded and galaxies formed. The end.

Sometimes just because an explanation isn't the one *you* want, it doesn't mean it's a bad explanation.
posted by kiltedtaco at 11:09 AM on June 11, 2015 [6 favorites]


It is not the point of the article to teach Computer Science. Its target is people who don't know computers and don't really want to. It aims to give an intuitive sense of the field with enough details to ground the reader's understanding.

There's an Ask MeFi thread from a couple days ago in which someone was asking for good nonfiction books. I brought up a distinction between Isaac-Asimov-type and John-McPhee-type science writing. The former is for readers who want to know the subject, but without the comprehensiveness & pedagogy of a real college textbook. The latter is for people who want a good writer to entertain them with a sketch of what professionals in the subject do, what kinds of things it can explain. You come away from the former having learned like you would at a college class lecture, from the latter saying "gosh, I never knew Geology was so interesting."

Charles Petzold's Code is true Asimov. Hofstadter's Gödel, Escher, Bach is in the middle (perhaps a third type, the opinionated argument with just enough background that you can follow the metaphors, like Stephenson's In The Beginning...), and this article is full-on McPhee. I love it.
posted by Harvey Kilobit at 11:13 AM on June 11, 2015 [22 favorites]


I really super appreciate this.

I'm swimming in newbie coder world and this was a perfect mix of lay-person and tech language explanation. It put a whole lot of information I've been working to absorb into a context that I found really easy to understand.

Thanks for posting.
posted by Jalliah at 11:18 AM on June 11, 2015 [1 favorite]


Most of the concerns mentioned in the article aren't even really about programming, they're about fighting more abstract things like language capabilities and APIs to get them to do what you need or expect. But those are not fundamental to the generic craft of coding.

I don't know anybody that codes generically,* though, so I'm not sure what this sort of Platonism buys us. I liked that the article focuses on the more mundane concerns of programming in the wild, since it works to dispel the air of arcane wizardry that tends to hang around the craft in the public discourse.

Then I read the thread expecting lots of snarky "Well, actually..." And was not disappointed. But there's a lot of appreciation too, which gratifies me. I'll never learn to code, but I'm glad to have seen this post. Thanks, maudlin.

It's funny, too, because I noticed myself at points starting to balk and say to myself, "that's not really accurate, in fact I'd say ...," and then stop, because that's not the point here. I think programmers self-mythologize too much when they mark certain dispositions as essential to the notion of being a programmer, but I wonder to what extent the field selects for nitpickers as opposed to breeding them.

* especially not if they write in Go hurrrrrrrrrr
posted by invitapriore at 11:22 AM on June 11, 2015 [2 favorites]


Once upon a time I complained that most explanations of how our brains work are comparable to an "explanation" of computers that starts with transistors and logic gates and then says "and if you hook enough of them together you get a computer."

Bringer Tom, can you elaborate on what you mean here? I admittedly don't know much about computers but I don't understand what's wrong with a starting point of transistors and logic gates. Your second comment mentions many times how logic gates factor in at the lowest levels. What is the role of transistors then if they're not worthy of being included in an explanation of "how computers work?"

Maybe I am reading your original comment wrong?
posted by laptolain at 11:23 AM on June 11, 2015


First the universe was in a hot dense state, then it expanded and galaxies formed. The end.

It's a matter of scale. I didn't set the starting point of the OP; the writer did, and he set it by discussing logic gates. You do not get from logic gates to computers without passing through a couple of other way stations which he could have covered in two sentences, but skipped.

Had your explanation of the Big Bang been preceded by a paragraph about hadrons and bosons, then yes it would be a crappy and inadequate continuance. A much better and similarly short one is presented here:
Once the explosion happened: Rapid inflation occurred and the universe cooled. Quarks combined into hadrons. The forces separated. Matter (atoms) formed. Matter condensed into galaxies, stars, etc.
Contrariwise, the OP could have started with something like "computers basically move numbers around and sometimes do math on them," without pretending that he understands how that happens, and the simplification would not have been jarring.

I think it's relevant because the whole 38,000 words is about the angst of not really understanding things which are important to his job description. But there are, in fact, people who understand those things and not being one of them might be part of his problem.
posted by Bringer Tom at 11:27 AM on June 11, 2015


Your competitor has an animated shopping cart that drives across the top of the screen at checkout.

Please tell me this is fiction.
posted by bukvich at 11:31 AM on June 11, 2015 [2 favorites]


I think what Bringer Tom meant is that the explanation sort of goes 1. Transistors, 2. ???, 3. Computers!, but I read it as less of an explanation of how computers work than just some adornment around the fundamental point that computers perform complex actions by running a small set of very basic operations extremely quickly.
posted by invitapriore at 11:34 AM on June 11, 2015 [5 favorites]


After our fellow MeFite moonmilk planted the seed in my head, years ago, I can't see Ford's site "ftrain" without pronouncing it, in my head, in the French fashion, which, as moonmilk put it, is something like "fftrr[honk]"
posted by Sunburnt at 11:35 AM on June 11, 2015 [2 favorites]


ok though one bit of pedantry I can't hold back on is it scares me that anyone anywhere is recommending that you sudo pip install anything
posted by invitapriore at 11:38 AM on June 11, 2015 [2 favorites]


Oh god the Language Question. I have been procrastinating a project that I should probably write as a small internal web site, probably backed by C#, but that I would rather deploy as a command-line utility (written in almost anything else) that slurps up files from a file share. I will probably continue to procrastinate this project as long as possible.
posted by uncleozzy at 11:50 AM on June 11, 2015


Please tell me this is fiction.

I have seen things.
posted by brennen at 11:56 AM on June 11, 2015


But there are, in fact, people who understand those things and not being one of them might be part of his problem.

Whatever.

Please note that the goal of the article is not to (a) show people how to build a computer from scratch or (b) show how smart the author is because he is well-read on basic processor architecture.

This complaint is especially dumb because the sequence of primitive ops executing on clock cycles is the high-level mental model even low-level programmers use nowadays. On a modern processor, you might have to consider ALU ports, delay slots, cache lines, etc., but the high-level model is still oriented around the clock cycle and how many of them it takes to execute an instruction in aggregate as it puts things in and out of boxes.

If I say sporks are made of plastic, and they can be used as forks or spoons, it just doesn't make coherent sense pedagogically or otherwise to complain that I skipped the manufacturing and molding process.
posted by smidgen at 12:03 PM on June 11, 2015 [14 favorites]


I've been really enjoying the article as a meditation on programming, but I don't think it's a particularly good explanation of literally how a computer executes code. How does doing an unfathomable number of math problems per second ever execute even the simplest program? That's OK in itself, though I think it sells itself a little strongly as such an explanation and doesn't deliver on it.

Fundamental to the explanation is the concept of the many layers of abstraction that computers use to go from a line of JavaScript or Ruby to a gazillion microinstructions to an uncountable number of electrons bouncing around in the CPU. In terms of simple machines, modern programming is the ultimate lever: exert a tiny amount of effort at the highest level, and more things than you can possibly comprehend happen at the bottom. To a programmer, reading a file from disk into a variable in one line of code should be like a Pharaoh muttering "meh, maybe build a tomb or something?" and finding the largest pyramid in the world out his window a microsecond later. Add the internet and APIs, and a programmer can type one line of code into his machine, already a specimen of one of the most powerful machines humans have ever created, and have an untold number of instructions execute on hundreds if not thousands of such machines all over the world. With the exception of the power of the atom, I can't think of anything that beats this kind of leverage.
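
(Concretely, in Node, say; the file name here is hypothetical:)

var plans = require('fs').readFileSync('pyramid-plans.txt', 'utf8');
// one line up here; drivers, file systems, caches, and disk controllers underneath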

It's these layers of abstraction, provided by hardware and software layers down the Tower of Hanoi, if you will, that is modern computing, that allow me to type these words into a text box without having to think about font metrics or USB keyboard standards or where the insertion point comes from or how the pixels get to the screen or how the computer manages to both make the insertion point blink and listen for my next keypress at the same time, among the myriad of things someone had to spend a lot of time thinking about at some point. That crucial concept (the way modern programming has built itself on the shoulders of decades' worth of effort building these layers of abstraction, complete with little chutes and ladders up and down the stack as optimizations dip down into lower-level environments and programmer shortcuts swoop up into higher-level ones) is picked up a little, but never fully explained.

And it's fine to just be a 38,000-word meditation on coding. But to me, the beauty Ford wants to convey about code comes from understanding these layers and the incredible power that comes from having these layers of abstraction at your fingertips. I like the article, but wish it explored this a little more.
posted by zachlipton at 12:07 PM on June 11, 2015 [15 favorites]


So were any of you video gamers in the 1980s?

Remember the cheat code for Konami games?

Try typing
↑↑↓↓←→←→BA
while you're reading this article.
posted by Harvey Kilobit at 12:41 PM on June 11, 2015 [9 favorites]


The headline is “What is Code?” — and the capitalization is important there, because it ends up being more of a “what it’s like to be a software engineer in 2015” snapshot than an actual literal explanation of coding. I was not expecting him to talk about tech conferences, or the gender imbalance, or even that we argue about dumb shit.

I did feel like Ford would mention a thing, half-explain it, then move on to the next thing before a beginner would fully understand the first thing; but then I’m not a beginner and it’s hard to pretend. Yes, he does talk about logic gates and then jump straight to systems programming, skipping a few things in between, and I’m not sure how to fix that except by making the article longer than it already is.

But the people who stick through this article to the end will perhaps have a better understanding of computers — or, if not that, at least a better understanding of our self-deprecating in-jokes.
posted by savetheclocktower at 12:51 PM on June 11, 2015 [2 favorites]


the way modern programming has built itself on the shoulders of decades' worth of effort building these layers of abstraction,

This would be a fun article to read, and the analogies are easy to find in any complex infrastructure: from the music we play, to the roads we travel, to the way we obtain and prepare food.
posted by smidgen at 12:52 PM on June 11, 2015


Not a code monkey, but I have poked much code with a stick. Liked the article and stopped by to say that and mention I have already shared links with both an admin I know, and my DPhil-in-Romantic-Lit-from-Oxford Dad, who is frequently confused as to why GIGO no longer works.
posted by Samizdata at 12:59 PM on June 11, 2015


Phew. Definitely worth the journey, but do take your hiking boots.

I think I see what he's trying to do, and I think he almost gets there. But this is either a severely overlong article or a severely foreshortened book. There are so many dimensions to 'how a computer works' that trying to cover them all in one article, even a novella-length one, is going to be a hard task, if you're going for consistency and pace. You're always going to be over-covering the stuff you know well and skating across the bits you don't. Hanging the thing on an enterprise dev project means you can illustrate a lot of how this stuff actually happens, but - for example - I didn't spot one mention of UI or UX (I guess that's accurate enough for enterprise projects, snark snark).

There's a lot of really good info in there (it's in my 'steal from this' URL list already), and a great deal of obviously hard-won truth. The way he switched from overly-discursive to very breathless styles crashed my mental gears from time to time, as if some things came easily and others were placeholders where time ran out (or the beast was growing too big) and they never got properly fleshed out.

But I think the format and the length do it few favours; it's trying to be too many things at once, and I wonder how many editing cycles it went through between him and the magazine after the first draft - or even whether it started out as something rather different.

All this sounds terribly critical, and I don't mean it to be. I very much like that Bloomberg is happy to get this down and dirty on programming for a general audience; there seems to be increasing interest in this these days, and the author clearly knows his onions and can sling words together. I suspect it should - perhaps will - be the core of a very respectable book.
posted by Devonian at 1:03 PM on June 11, 2015


The example is bad because it's a very, very bad practice for general use. This specific thing is a super common mistake in git tutorials that sends beginners down a documentation rabbit hole just so that they can figure out which flags trigger the sensible behavior that should be the default. It's worth pushing back against.
You're still asserting your personal aesthetic preference as a global fact. Just because something doesn't work the same way as the tool you're more familiar with doesn't automatically mean that it's “a very, very bad practice” and that doesn't change by restating your preference more stridently.

If a 3 word example in a 38,000 word article is worth jumping on, it's also worth explaining what it does and why you believe it's so dangerous. That encourages people to understand what the tools are actually doing rather than cargo-culting advice of unknown provenance.
posted by adamsc at 1:07 PM on June 11, 2015


I've been gradually working my way through this article over the afternoon. As a developer myself, I think it's an interesting look at things. But it seems rather too long and discursive to hold the interest of, or ultimately impart much of substance to, its putative suit-clad audience.
posted by escape from the potato planet at 1:08 PM on June 11, 2015


Imagine all of L.A. programming. East Hollywood would be for Mac programmers, West L.A. for mobile, Beverly Hills for finance programmers, and all of Orange County for Windows.

ORANGE COUNTY ISN'T PART OF LA, PAUL.

Hah, I came here to say that's an apt description of Los Angeles, even down to the fact that Orange County is really just the southernmost part of LA. Hell, they even renamed their baseball team to reflect that point.
posted by sideshow at 1:15 PM on June 11, 2015


This article is not a tutorial on git, so I don't think it matters.

However, even for those who prefer git, "git add ." *is* bad because it ends up adding "everything" new to the files to commit. Unless you were very careful with the specifications in your .gitignore file (which new users probably don't even know exists), you'll end up adding all sorts of misc garbage in the directory (and all subdirectories) that you didn't really want to. Better to do it selectively, or use "add -u" to pick up only changes to existing files.
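
For the record, the variants at issue (behavior as of git 2.x; before 2.0, "git add ." did not stage deletions):

git add .    # stage new and modified files under the current directory (since 2.0, deletions too)
git add -u   # stage modifications and deletions to already-tracked files; no new files
git add -A   # stage everything: new, modified, and deleted, across the whole tree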
posted by smidgen at 1:39 PM on June 11, 2015 [1 favorite]


Wait what, so little about X windows, where's Forth? Did he cover Whitespace, missed that... I tabbed down a few pages immediately and there appeared a cute little css popup snarking "You're the fastest reader ever!" What about assembly, machine language and microcode? The coverage of Microsoft was sufficient. I'd love to see a similarly paced article about genetics.
posted by sammyo at 1:41 PM on June 11, 2015


Great post! I've just gotten to the first embedded video and it was fantastic. Planning on going through the rest later ...
posted by freecellwizard at 1:48 PM on June 11, 2015


ORANGE COUNTY ISN'T PART OF LA, PAUL.
I'm not inviting you to any Angels Games, dude. Still, most of my 40+ years in L.A. were spent in the San Fernando Valley. I always thought that The Valley was totally Windows, and Pasadena was old school DOS.
posted by oneswellfoop at 1:50 PM on June 11, 2015


smidgen: I agree that this is a largely irrelevant tangent for such a broad essay. My point was simply that if it was that important, it'd also be worth providing the supporting education needed to make it more meaningful than “don't do x or evil spirits will eat your computer”.
posted by adamsc at 1:50 PM on June 11, 2015


Everything I know about programming I learned from The Thrilling Adventures of Lovelace and Babbage.

But seriously, most of my "programming education" came from some non-credit UCLA Extension courses in the 1980s that led to one of my meaningless Certificates. Because the UC owned a version of Pascal, most of my training there was in that now-mostly-extinct language. Which was too bad, because its ultra-flexible namings of processes, constants and variables made it totally self-documenting in the right hands... and at the time, I had the right hands. One of my two big class assignments I wrote - in working Pascal code, earning me As - was in the form of an Operating Manual. The other was a science fiction short story. (I was so much smarter then) Never got a job programming after that, but wrote a lot of Documentation.
posted by oneswellfoop at 2:01 PM on June 11, 2015


Fun game: try to get the logic gate animation to all red/green. It's super buggy and real logic gates work nothing like that, but it's possible to get them all green. (I know because I tried)
posted by ikalliom at 2:04 PM on June 11, 2015


Bringer Tom: "Only if you have some idea what the benefits are. If you have time to make a Javascript animation of random gates processing random data but you don't have the time to construct one sentence like "what the computer mainly does is move numbers around between registers and sometimes do math on them on the way," then I think it's safe to say you don't really know how it works."

Right, which was the point of my second sentence.
posted by boo_radley at 3:11 PM on June 11, 2015


Here's my score:

"Congratulations! You read 31302 words in 769 minutes, which is 41 words per minute. That’s insanely slow. How many times have you read it? So thorough! So proud. Or did you just leave the tab open?"
posted by isthmus at 3:58 PM on June 11, 2015 [1 favorite]


It doesn't really require a book or even an article the length of the OP to span the gap between handwaving and actual understanding of what computers do. The author's basic problem is that despite his job description he really doesn't know how the computer works, which is why this article is a bunch of empty verbal flourishes spiced with buzzwords.

There are a whole bunch of different arguments I'm going to touch on in a couple lines here but

a.) I don't think this exactly purports to explain how a computer works, though one could argue in that light that "what is code" is then not quite the right question.
b.) It does a good job answering "what is it like to make software?"
c.) But as far as the question "what is code?" goes there are two equally valid ways to answer it. Bottom up, as you do, or top down, from the abstract theory of computation. I feel like it's a little more in the latter tradition - in a very layman way - but it's a long, long piece of writing so that's debatable. If there's a major deficiency it's that it's throwing out incomplete explanations from the top and the bottom at once.

Well, really if there's a major deficiency it's that the whole thing is just generally all over the place. Sexism in tech is a really important subject but why does that whole section about it just come out of nowhere?
posted by atoxyl at 4:25 PM on June 11, 2015


I mean the idea that code is an expression of an algorithm, that you don't need a computer to compute, is at least as interesting and important as how a CPU works. But how a CPU works will blow your damn mind if you haven't really seen the specifics laid out so you can't go wrong with either starting point - it's just that the territory in between is so huge that I think it is best to pick one.

I've heard that this book is a really good bottom-up explanation of computers but I haven't read it.
posted by atoxyl at 4:48 PM on June 11, 2015


bukvich: “Your competitor has an animated shopping cart that drives across the top of the screen at checkout.

Please tell me this is fiction.”
Negatronic, Ghostrider. That's why almost 20 years later I still say, "You can have a spinning logo or a flaming logo."

And on it goes, whiteboard after whiteboard, punctuated by the sound of a mobile phone’s fake camera shutter. “Do we need to keep track of how many times they’ve been e-mailed?” “How do we remove e-mails once they’re in the system?” “What if someone enters the same e-mail twice?”

Programmer A, the leader, seems very professional. She’s at the whiteboard, scribbling, erasing, scribbling, erasing. Lists, arrows, boxes, lines. She wrote RUSSIANS? on the board. But after an hour you realize: This is just e-mail. One field. One little bit of data. You haven’t even hit names yet.
Stuff like this is why, like I was saying in the other thread, the two most beautiful words in any spec are "By Others."
posted by ob1quixote at 4:51 PM on June 11, 2015 [1 favorite]


And on it goes, whiteboard after whiteboard, punctuated by the sound of a mobile phone’s fake camera shutter.

If you've ever wondered what programming is like, this about sums it up.
posted by zachlipton at 4:57 PM on June 11, 2015 [5 favorites]


Non-computer people - did you actually read this whole thing and enjoy it? Or is it just programmers staring at our navels and hoping to find things to argue about?
posted by atoxyl at 5:26 PM on June 11, 2015 [1 favorite]


I really didn't mean to derail; the article has good spots, mostly where it talks (as atoxyl suggests) about the higher levels of abstraction.

Two things really punched me in the face as I started the article though. One is that if you're going to spend a novella talking about high levels of abstraction it's kind of a waste of words to mention gate level logic at all. It's like including an entire chapter about spark plug design in an article about the experience of being a NASCAR driver.

And the second is that the explanation of how gates make up a computer is just plain wrong. It's not just an oversimplification; it's downright wrong. Computers are not organized, even approximately, like that Javascript gate animation. And it's not just a simplification; it's a basic failure to understand what is going on. An accurate description would have been no longer or more complicated than what is in the article. But that gate animation is obviously the imaginative fancy of someone who really does think of it as magic -- hey, if you get the right combination of gates by clicking on them, you'll get a computer! And it absolutely is much more organized and sensible than that.

That animation basically perpetuates the idea that computers work by magic.

By contrast, you can read Harry Porter's PDF about the development of his relay computer (which is considerably shorter than the OP) and come away with a very thorough understanding of how simple things like relays can be made to do computering. That's not just because Porter is working from a low level, it's because he knows how the damn things work, that it isn't magic, and that it's not even very hard to understand if you know the tricks.

I realize the OP wasn't trying to explain how computers work at that level, but it would have been a lot better (and less downright misleading) if he had simply skipped the whole gate logic thing completely and left that to someone who knew what they were talking about.
posted by Bringer Tom at 5:36 PM on June 11, 2015 [5 favorites]


Interesting that he gave props to the discontinued IKEA Jerker desk. I have one, and it's probably on its last, or at most penultimate, move; some of the bolts seem to be wearing out from being disassembled and reassembled every time I moved flat, and IKEA don't seem to have spares. It's a pretty good desk, and the lack of equivalents is a bit frustrating.
posted by acb at 6:52 PM on June 11, 2015


Bro, do you even code?
posted by blue_beetle at 9:16 PM on June 11, 2015 [1 favorite]


If you want an explanation of why git add prior to 2.0 violates expectations, well, that's more empirical than logical, but I think it's fair to say that when a user says "get ready to commit my changes", the default assumption shouldn't be "okay, just some of them, right?" I'm not an evangelist or an educator, but if you're insisting that I try, the rule of thumb that doesn't lose any work and does the same thing everywhere is "git add -A". Am I off the hook yet?
posted by WCWedin at 10:39 PM on June 11, 2015


Metafilter: Sometimes just because an explanation isn't the one *you* want, it doesn't mean it's a bad explanation.
posted by Sebmojo at 10:39 PM on June 11, 2015 [1 favorite]


And really, with those hints and a little quality time with the 7400 series data book, most people would be able to figure out how to build a working computer. Maybe not a very good one, but the rest of what makes them so complicated is just optimizations and details.

At my college, a few years before my time, a project group in the freshman-year digital logic class went a bit overboard and put together a RISC CPU with about 25 breadboards' worth of 74xx DIPs. That made the display case.
posted by save alive nothing that breatheth at 1:04 AM on June 12, 2015 [3 favorites]


I want a job where I hit you people with croquet mallets all day now, instead of working with computers
posted by thelonius at 5:29 AM on June 12, 2015 [2 favorites]


thelonius: ¿por qué no los dos?
posted by idiopath at 10:08 AM on June 12, 2015 [1 favorite]


The article is not for people that already know these things. Of course that doesn't stop the pedantry here, this is MeFi.

The point is to explain why modern software development, critical to almost all businesses, is an inexact science, unlike many other areas of business. It goes into detail as to why this is the case.

This concept is a difficult one for most business leaders to comprehend, as most other investments can have precise deadlines and budgets that can be predicted with high accuracy before a decision to proceed is made.
posted by Argyle at 10:22 AM on June 12, 2015 [7 favorites]


Non-computer people - did you actually read this whole thing and enjoy it?

Yes! (Even despite a lot of the bells & whistles not working on the mobile clock-with-benefits that I read it on.)
posted by progosk at 10:51 AM on June 12, 2015 [2 favorites]


This thread is basically a demonstration of why the article is necessary.
posted by no regrets, coyote at 3:52 PM on June 12, 2015 [5 favorites]


This thread is basically a demonstration of why the article is necessary but not sufficient.
posted by speicus at 5:59 PM on June 12, 2015 [1 favorite]


I am pleased to see that they are taking the GitHub repo for the article seriously and are accepting pull requests. I had a small one merged in (just adding a link to cite some stats) and it's live on the Bloomberg site. Given that an article like this is going to attract a lot of "no that's not how it really works" pedantry, letting everyone submit edits through GitHub is a nice approach. I'd be curious to know how this was worked out with the Bloomberg editors and how they feel about it.

Since I'm a nitpicky jerk though, I do have to ask about the license. The article text is apparently CC BY-NC-ND 4.0. If I fork the repo and change the text, haven't I just created a derivative work, which GitHub is now distributing for me in violation of the license? That seems like, well, not really the idea they were going for.
posted by zachlipton at 10:34 PM on June 12, 2015 [1 favorite]


OK I finally read the whole thing. Here is what I do not get. The overlaid Clippy-style avatar who kept interfering with me trying to read the article. I thought it was proven that Clippy tech was a non-starter and nobody uses it any more or ever will again. The only time I ever see shit like that now seems to be in ads.

On the positive side the photography was excellent. I saved the one with the guy and laptop in a loft.
posted by bukvich at 6:10 AM on June 13, 2015 [1 favorite]


As someone trained in the humanities and who now codes for a living, Ford's "What is Code?" was delicious in its every word.

I especially enjoyed the dramatized ethnographic detail of software developers working in Agile and Scrum. But the broader characterizations (and some of the finer points) of PHP, JavaScript, Clojure, Ruby, their relationship to each other, the requirements of continuous integration, Test-Driven Development, the effects of re-architecting an e-commerce platform (really I could go on and on here) were simultaneously satisfying and provocative.

Unfortunately, many of the comments above reveal that some readers are so literal-minded they can't understand the delight and humor of, for one example, downloading the entire Node.js repo, making some changes, committing those changes with git add ..

Did you even understand the larger context in which that "sample code" was provided?! Hint, the next "suggested" line is git push origin master.

So, yeah, a few (not all) MeFites sorely disappointed me in their failure to comprehend even the basic gist of the article which (as a few have already acknowledged) is nowhere remotely close to a specification for producing a functional processor.

To my mind, the community over at Hacker News has a more accurate assessment of what Ford has achieved with "What is Code?" To quote the first comment (by dankohn1) in its entirety:
I hate to sound hyperbolic, but I can't overstate how impressive this work is. For me, it evokes nothing so much as Tracy Kidder's The Soul of A New Machine [0] for opening up an obscure world (the one many HN posters live in, but obscure to most people). I am amazed both by the technical fidelity and by the quality of the story telling.

[0] http://www.amazon.com/Soul-New-Machine-Tracy-Kidder/dp/0316491977/
I too am amazed by Ford's technical fidelity and the quality of his story telling.
posted by mistersquid at 8:52 PM on June 13, 2015 [12 favorites]


My workplace is a digital services agency, but one of the owners has a communications degree rather than computer science. He was the one slacking me all day with choice snippets like "why are coders angry?", "different eras of computing smell differently", and "OMG, he made hot or not for code!". Ford really is amazing at bridging the two worlds.
posted by fatbird at 9:42 AM on June 14, 2015 [1 favorite]


Also: in lieu of him appearing here, Ford writes (in "footnote" 16):

Writing this article was a nightmare because I know that no matter how many people review it, I’ll have missed something, some thread that a reader can pull and say, “He missed the essence of the subject.” I know you’re out there, ashamed to have me as an advocate. I accept that. I know that what I describe as “compilation” is but a tiny strand of that subject; I know that the ways I characterize programming languages are reductive. This was supposed to be a brief article, and it became a brief book. So my apologies for anything that absolutely should have been here, but isn’t. They gave me only one magazine. If you want to rant, I’m ford@ftrain.com.
posted by progosk at 10:59 AM on June 14, 2015 [4 favorites]


“‘What is Code?’”Charlie Rose, 10 June 2015
On “Charlie Rose,” a conversation about this week's Bloomberg Businessweek magazine which asks "What is Code?" We are joined by Josh Tyrangiel, the editor of Bloomberg Businessweek, and Paul Ford, a journalist and programmer, and the author of this piece.
posted by ob1quixote at 3:05 AM on June 15, 2015 [2 favorites]


Did you even understand the larger context in which that "sample code" was provided?! Hint, the next "suggested" line is git push origin master.

If you're suggesting that it was a joke and I'm thick, I dunno. I went back and read it and it still isn't funny. Anyway, I don't appreciate the dogpile over an offhand comment about a pet peeve of mine. I'm sorry if I've somehow come to represent some kind of assailant against the joys of the humanities with all my "literal minded"-ness, but well, I'm sorry, I'm not the villain you're looking for.
posted by WCWedin at 12:17 PM on June 17, 2015 [1 favorite]


Some of the comment critique here seems a little harsh, and critique-for-the-sake-of-critique. Anyway, perhaps for balance, this ex-coder from back in the day seems to like the article.
posted by Wordshore at 10:10 AM on June 24, 2015 [1 favorite]



