Learnable Programming
September 27, 2012 7:17 AM

Bret Victor: We often think of a programming environment or language in terms of its features -- this one "has code folding", that one "has type inference". This is like thinking about a book in terms of its words -- this book has a "fortuitous", that one has a "munificent". What matters is not individual words, but how the words together convey a message. Likewise, a well-designed programming system is not simply a bag of features. A good system is designed to encourage particular ways of thinking, with all features carefully and cohesively designed around that purpose.
posted by AceRock (69 comments total) 35 users marked this as a favorite
 
I read this earlier today. He makes a lot of good points. And he's definitely right that in order to do something, you need to see it. But I think he's being too literal with the word "see". Not every program is something that you can draw a literal picture of. Blind people, for example, have no particular cognitive difficulty that I'm aware of, but they would find his examples completely useless.

He uses the word "context" a lot which is good, but illustrates all of those examples with pictures, which is misleading. A lot of context is exactly that: text.
posted by DU at 7:29 AM on September 27, 2012 [3 favorites]


True; although in some cases Victor suggests using text for explanatory purposes (e.g. for labeling parameters, in the "Make meaning transparent" section). I think the use of pictures is in large part because the environment he's criticizing -- Javascript/Processing, as used by Khan Academy -- is very visually-oriented. So while the suggestions for improvement wouldn't be much good to a non-sighted person, neither would the Khan tutorials to begin with.

Honestly I think he's starting out with a bit of an easy target, because I've never thought that JS/Processing or the Khan tutorials were really all that good. (Of course, this could just be my terrible loathing of Javascript and web programming in general getting in the way.)

The tutorials at Learn Python suffer from some of the same issues the article finds fault with Khan's tutorials for (they show you input and output, but don't step through loops), but I don't know if the same specific improvement suggestions would apply there. Though I do like the idea of a scrubber bar that lets you step through execution, line by line.

And the suggestions about either showing or eliminating state are really good, IMO. But I think that should be generalized to either explaining or eliminating everything that's being shown to the user. I've always despised programming tutorials that start out by immediately telling you to blindly type in a bunch of stuff that you're supposed to ignore, or will only be explained later on. (Old "Learn to Program in C" tutorials were especially bad at this, since every program requires a certain amount of boilerplate to even generate output, and most tutorials don't explain any of that stuff until several lessons in.) It's not really confidence-inspiring to have a bunch of stuff obviously going on behind the curtain that you're aware of but told not to pay attention to.
posted by Kadin2048 at 7:49 AM on September 27, 2012 [2 favorites]


the cooking/kitchen analogies are really, really strained. Experimenting in a programming environment, especially a live coding IDE, is a very low-cost activity, while doing so in the kitchen is much more time-consuming, expensive, and potentially dangerous.
posted by blue t-shirt at 7:51 AM on September 27, 2012 [1 favorite]


Apparently, math is hard.
posted by erniepan at 7:55 AM on September 27, 2012 [1 favorite]


The majority of this article is really more about making an easy-to-use IDE than about actual programming languages. We talk about programming languages in terms of features because a programming language is much more analogous to a toolbox than it is to a book. Programming in the Twisted framework with Python is completely different from programming in Django with Python because they are designed for different types of projects and require different structures and ways of thinking. The fact that both of those frameworks can be written in Python is because it's a flexible language that can be used in a lot of different ways, rather than one that has a particular way of thinking hardcoded into it. There are a lot of ways of doing things when it comes to programming, and programming languages are more collections of ways of doing things than anything else.

Also the main reason why his canned examples look impressive versus the current state of the art for IDEs is that it takes a lot of very specific code to make a system like that work. Even in cases like designing a UI form, it's extremely difficult to get even a basic WYSIWYG form editor to work well with random changes made in the code by the user.

Learning to program is a lot like his unlabeled microwave example, but so is programming once you learn the language. A lot of the actual work of programming is figuring out things that you don't understand or that don't work properly. If you need a lot of hand-holding to figure out how a for loop works, you're also going to be pretty lost when you have to track down an obscure bug or figure out how some badly written legacy code works. Part of the reason I started programming in the first place is that if I'm presented with something like the unlabeled microwave problem, reverse engineering it myself is actually fun, and the inevitable trial and error involved is a key part of problem solving in general.

He talks about how learning programming language structures to learn how to program is like learning about pencils to learn how to draw, but his solution for how to learn how to program is not much like learning how to draw either. In both cases the main way to learn is to just draw or program a lot (possibly using some tips from people who know how to do it better than you do), and along the way you'll pick up things that work and figure out things that don't work.
posted by burnmp3s at 7:55 AM on September 27, 2012 [8 favorites]


the cooking/kitchen analogies are really, really strained.

This. Also I hope I'm not the only person to think that a microwave with unlabelled buttons that you had to work out for yourself sounds actually pretty rad.
posted by Jofus at 7:55 AM on September 27, 2012 [1 favorite]


Yeah, I started skimming through this yesterday after a hall conversation at work, and a couple of things struck me:

First, for all of his pronouncements about the right way to teach things, the fact is that the Khan Academy folks are the only people who've actually tried this. You can rationalize your mental model all you want, but until you actually test it I'm gonna take long screeds on "the right way" more as manifesto than paper.

Second, a good portion of the process of learning to program was building those mental models for myself. Yes, I think modern languages could be a lot closer to BASIC, with a dynamic environment that's editable and changeable, and with a mixed immediate and deferred execution mode that was smarter than re-editing functions, but functions and objects are also nicer for a longer-term environment.

And I think that kind of brings me to this: these ideas lend themselves well to functional programming paradigms, but even functional languages have monads, because at some point you have to alter state. A couple of jobs ago I was working on a very nice system with threaded, parallelized evaluation of complex dependency graphs, and the reality was that for all of its theoretical clarity we still ended up pushing local state through shortcuts for performance reasons.

Programming isn't about understanding those clear theoretical models; it's about what happens when you actually have to apply those models to reality. Which goes to burnmp3s's point about how his canned demos are nice, but in the real world IDEs are just annoying as f#$%^ and people with the ability to better abstract into their own environment are far more productive in real editors.
posted by straw at 7:58 AM on September 27, 2012 [1 favorite]


I'm so far away from learning to program that probably none of this is relevant to me.

The bit quoted above (comparing language features to words in a book) seems particularly weak. I really don't understand how languages are like books. Even if they are, I have trouble understanding the correspondence between language features and the words in the book.

However, the ideas around time visualization of program behavior are interesting. Earlier this year I had an 'aha' moment about why I so frequently choose to debug by adding print statements: this is frequently the easiest way to get a view of program behavior that cuts across time, something that is made particularly difficult by the graphical debuggers I've used (though it's doable via commandline debuggers of course)
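For what it's worth, here's a minimal Python sketch of the kind of view I mean (the simulate function and its numbers are invented purely for illustration): a print inside the loop lays the whole run out across time, which is exactly what a single breakpoint never shows you.

def simulate(steps):
    position, velocity = 0.0, 1.0
    for step in range(steps):
        velocity = velocity * 0.9
        position = position + velocity
        # one print per iteration gives the whole history at a glance
        print("step=%d position=%.3f velocity=%.3f" % (step, position, velocity))

simulate(5)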
posted by jepler at 7:58 AM on September 27, 2012


The majority of this article is really more about making an easy-to-use IDE than about actual programming languages.

I'll draw an even finer distinction than that. It's about making an easy-for-a-n00b-to-use IDE. Which, to his credit, he definitely mentions from time to time.

However, easy-for-n00bs is not at all the same thing as easy-for-experts. The analogy of the learning "ramp" is useful here. The hard part for a n00b is getting from the ground state up the ramp to the plateau. It's nice if that ramp has a very gentle slope.

However, an expert is already on the plateau. The hard part for her is reaching up to a problem from there. If the plateau is not high enough, that's going to be difficult. The ramp is long in the past at this point; what we need is a high plateau, which, translated back to reality, means powerful tools and a rich language.

GUI IDEs are neither. There are only a few graphical elements in use and they only combine in ways that the manufacturer decided someone might want to combine them. A lot of combinations exist now, but it doesn't come close to the combinatorially explosive power of plain text. A picture is worth 1000 words, but only in that one order. Whereas given 1000 words you can describe millions of pictures.
posted by DU at 8:02 AM on September 27, 2012


He makes a lot of decent points and there is a lot of value in making the result of writing code engaging. If it's not engaging, there is little incentive and perhaps only negative incentives (do this or you will fail) for a kid to touch it.

Ultimately, the appeal of coding is to make the monkey dance, but what he shows is such a teeny piece of actual coding, and he falls way short of addressing what's next.

For his cooking analogy: if you grew up on the kids' show Zoom, there were frequently spots that showed some simple recipe. For example, there was one I remember where they took a scoop of ice cream, stuck a couple of Twizzlers in the side for arms, a couple of pretzel sticks for antennae, and a couple of pieces of candy for eyes.

What he shows is as much programming as this is cooking.

Maybe it's enough to get started, to get hooked, to attract the personalities that latch onto it (control freak? Me? How did you know? Anal-retentive most certainly has a hyphen, thanks). To my professional eye, that environment is fluff and visual chaff. Training wheels - so what do you do when you take them off the bike? What's next?

And it's the what's-next that gets interesting. When I was a senior in high school, taking Calculus, I loved LOVED Simpson's rule for approximating definite integrals. Loved it. Wrote a paper on it, including extensions to improve the accuracy (at the cost of more calculation). As part of that, I wrote a program that could visualize how it worked: it plotted a function (defined in BASIC), then called a routine to flood-fill the area under the curve out of sight and slowly display it on top of the curve from left to right. In order to do this, I independently invented flood fill. Or more precisely, I invented the concept of flood fill and then did a non-recursive implementation in 6502 assembly language that ran like a bat out of hell.

My challenge is to find out how to get from the training wheels to this point. I did it on my own without formal training wheels (just my own imagination), and I can tell you right now that I had figured out all the stuff that he presents in the first few weeks. Now what? How do I get from the very basics of control structure and abstraction to writing a video game? I did that on my own in a year. This should go way faster now since it isn't a requirement for the kid to (1) learn 6502 assembly language, (2) learn to write bit-blitters, (3) learn to write I/O, (4) learn to write cooperative multitasking code, (5) learn to write sound-generators for a platform that could only pop the speaker, etc.
posted by plinth at 8:05 AM on September 27, 2012 [9 favorites]


Forgot:
kid.GetOff(my.Lawn);
posted by plinth at 8:09 AM on September 27, 2012 [5 favorites]


In the time he spent pontificating about IDEs or whatever he is talking about, he could have shat out some actually useful code.
posted by H. Roark at 8:12 AM on September 27, 2012 [1 favorite]


This seems based on a lot of the principles behind a video for a visual IDE, which I think became a Kickstarter -- Lighttable? Something like that.
posted by fightorflight at 8:16 AM on September 27, 2012


Here we are: Light Table. Which - ah! - was based on an earlier video by the Bret of the original link.
posted by fightorflight at 8:17 AM on September 27, 2012 [1 favorite]


He uses the word "context" a lot which is good, but illustrates all of those examples with pictures, which is misleading.

Take the tying-a-tie example: how does the text lack context? If you already know that you're tying a tie and it's set up in more or less the right initial conditions, what's the problem? On the other side, how does the picture give you more context as opposed to a (quite possibly) easier to follow set of instructions?
posted by kenko at 8:20 AM on September 27, 2012


He is reacting against his ideas being labelled "live coding", but the live coding community has already implemented much of what he talks about, and done a lot more besides:
http://toplap.org/?p=212

He has some nice ideas, and he's right to draw attention to some history, but he comes across as out of touch with contemporary work.
posted by yaxu at 8:21 AM on September 27, 2012 [1 favorite]


Well, I wouldn't dismiss this information too much. Just be careful where you integrate it in. Part of the learning process: absolutely. Part of a final tool for experts: Certainly not in the form shown.

Think of it as the elementary school worksheet version of programming. Information presented in multiple modalities until you can recognize and understand it in a canonical form.

(Also, the example of Processing being identical to assembly was hilarious. It's like the worst of both worlds. Graphical for the combinatorial part, textual for the "only a few different options" part.)
posted by DU at 8:23 AM on September 27, 2012


Yes, I think modern languages could be a lot closer to BASIC, with a dynamic environment that's editable and changeable, and with a mixed immediate and deferred execution mode that was smarter than re-editing functions, but functions and objects are also nicer for a longer-term environment.

So modern languages could be more like Smalltalk and Lisp.
posted by kenko at 8:24 AM on September 27, 2012 [3 favorites]


Kadin2048: "It's not really confidence-inspiring to have a bunch of stuff obviously going on behind the curtain that you're aware of but told not to pay attention to."

But this is how all natural learning happens - we make sense of various parts of the whole in isolation. In fact, in the programming world, it is a small and exceptional set of people who have a comprehensive top to bottom understanding of a given project (design -> code -> language -> language implementation -> OS -> hardware -> electrons). The steps other than the one you are assigned are abstracted, and rightfully so. You don't need to understand all the boilerplate as a beginning C programmer any more than you need to understand all the gates in your processor.
posted by idiopath at 8:31 AM on September 27, 2012


This may be the single most overwrought thing I've ever read about programming.

Also, other than explaining a for loop by turning it into a while loop I find it hard to believe that these examples would actually help anyone learn how to write code. The dot timeline is particularly baffling - he seems to be struggling to find a way to express program logic graphically in the same way as he can express drawing commands graphically (which is not so hard, for obvious reasons) and I don't think he's coming up with the right answers, if there are any right answers.
posted by A Thousand Baited Hooks at 8:37 AM on September 27, 2012 [4 favorites]


The majority of this article is really more about making an easy-to-use IDE than about actual programming languages.

Only kind of. When you think about implementing an IDE like this for most mainstream languages, you (rightly) think "that would never work..." He explicitly places the blame for that on the languages. The languages need to have certain characteristics to make an IDE like what is described work.

I had a much more positive reaction to this post than anyone else here. Of course there are things to be worked out, but as a general direction of thinking it's really interesting.
posted by a snickering nuthatch at 8:48 AM on September 27, 2012 [2 favorites]


I had a positive reaction to the post. The only part I'm down on is the "graphics are the best modality for all problems" tone, which he's inherited from the surrounding culture (see GUIs, touchscreens, etc). It's simply not true, but you have to live outside the "graphic designer" reality-distortion field for a while to see it.
posted by DU at 8:54 AM on September 27, 2012


What matters is not individual words, but how the words together convey a message

I agree with at least this idea. When looking at a programming language, I tend to overlook the technicalities and specifications and look at what people have actually made using it. For this reason, in the world of user applications, I have steered clear of Java and have embraced Objective C. While that may be a contentious conclusion, I think the reasoning is sound. If, in a given language, other people have struggled to make good versions of the kind of thing I want to make, I'm going to assume it's a struggle to do - even if it may be possible or in some way "theoretically" better.
posted by iotic at 8:55 AM on September 27, 2012


I didn't RTFA, but there's a big difference between ease of use and ease of learning a language.

And between ease of learning the language basics, and ease of getting to proficiency.

I think all I'm really saying is (a) I prefer C++ (with the STL) to C# and (b) I haven't had any caffeine this morning and am feeling kind of sick and out of it.
posted by Foosnark at 9:12 AM on September 27, 2012


I'm more worried that the kinds of environments in which most of us who reached the professional ranks in the early to mid '90s learned how to program aren't available to kids today.

The 8-bit computers we had facilitated a great deal of experimentation, allowing a curious owner to PEEK and POKE their memory directly and gain a better understanding of the outcomes of their programs. They also were relatively safe to experiment on because it was difficult, if not impossible, to "ruin" the computer simply by executing code. The worst that could happen is you had to power cycle and you lost anything you hadn't saved to disk or tape. I believe that this fostered the necessary lack of fear of the computer, as well as the desirable "fortune favors the bold" attitude, that current systems are unlikely to engender in today's future programmers.

Contrast this to the Windows computers in most homes now where a bad program can easily corrupt the OS to the point that a wipe and re-install is required. Yes there are sandbox type learn-to-program applications that are safe, but they don't give one the interaction with the hardware that's required for real understanding of how computers work.

There are the laptops for children, but they're worse than a Windows box. Just as easy to corrupt and much less easy to subsequently fix. There are also the hobbyist computers such as Arduino, but they're far too complicated and many don't have built-in video output.

I think that having easy-to-understand computers that you could switch on and moments later receive a "READY" prompt made those of us of a certain age much better and more confident programmers. I worry that, despite the millions of young "coders" out there on the Internet dinking around with node.js and what have you, there won't be enough actual programmers to replace us when we retire.
posted by ob1quixote at 9:16 AM on September 27, 2012 [4 favorites]


Only kind of. When you think about implementing an IDE like this for most mainstream languages, you (rightly) think "that would never work..." He explicitly places the blame for that on the languages. The languages need to have certain characteristics to make an IDE like what is described work.

Such as? I had the opposite reaction to the examples: they would work for pretty much any language as long as you only do the specific canned examples he is showing, but they would not work for any possible language in a more generalized sense.

For example, in one of the animations someone starts typing "fill" and the IDE automatically autocompletes it (which is a standard part of a lot of IDEs), but the cool part is that it also autocompletes some canned parameters and instantly displays a visual representation of the result of calling "fill" with those parameters. In actual code you are much more likely to be typing something like "submitRecord(recordDatabase, record, overwriteDuplicateRecords=True)". Where do the autocomplete parameters come from, and what insight are they going to give you beyond what a standard IDE already shows you in an autocomplete popup? And more importantly, how exactly are you going to instantly display a visual representation of that call that gives you the sort of insight into what it's doing that the canned graphical example does?

Really what his examples show is that IDEs for visual work (drawing things, writing HTML) don't do a very good job of some basic visualization tasks for those specific types of visual work; they don't really say much about novel programming language concepts that aren't currently implemented anywhere.
posted by burnmp3s at 9:26 AM on September 27, 2012


ob1quixote, I'd agree with you if marvellous things like javascript, Scratch, fluxus, gibber, python, etoys and Linux didn't exist.

It's easy to get wistful for the past, but actually the computers, devices, operating systems and language environments that children have access to now are massively superior to the expensive crap we had to deal with back then.

Just try to watch this without getting jealous of the possibilities children are being confronted with now.

Programming is becoming part of the social and cultural lives of children, and that changes the game completely. From live coding (which, despite Bret Victor's complaints, is exactly what he's doing) to life coding.
posted by yaxu at 9:30 AM on September 27, 2012 [2 favorites]


To me this is just a whole lot of effort for minimal benefit. The premise here is that it's hard to learn to program, but that this can be solved by having the perfect IDE to tell you how to do everything you want - when really the problem, if there is one, is that not everybody really truly wants to learn to be a programmer.

It's sort of like saying "learning to play guitar is hard, so we need a better teaching method to encourage people who otherwise would be turned off by the process." In my experience, that's false. Sure it's great to make things easier for beginners, but when push comes to shove either you want to learn, and spend as much time as you can to make it happen, or you don't, and you move on to other things.

Learning to program is already amazingly accessible for those who really want it. I built my career on Google searches, experimentation, and a very small handful of books. The idea that a better IDE is going to be what makes programming accessible is a solution in want of a problem.
posted by rouftop at 9:31 AM on September 27, 2012 [6 favorites]


Also: TURTLE LOGO 4EVER!!!11
posted by rouftop at 9:32 AM on September 27, 2012 [1 favorite]


You want a time scrubber? Check out the reverse command in OCaml's debugger.
posted by bdc34 at 9:37 AM on September 27, 2012 [1 favorite]


I see my friends' kids spend a lot of time learning to play Wii baseball. They can play really well, and wipe the floor with me anytime I play. But I don't see the techniques they've learned having any application to actually playing baseball. That's what this reminds me of.

It also didn't help that about one-third of that essay made me think "Just use named parameters, and instead of a for loop use foreach".
posted by benito.strauss at 9:47 AM on September 27, 2012 [1 favorite]


yaxu: Just try to watch this without getting jealous of the possibilities children are being confronted with now.
That's all great, but none of it teaches kids, or more to the point makes room for them to teach themselves, how computers actually work.
posted by ob1quixote at 9:48 AM on September 27, 2012


There are also the hobbyist computers such as Arduino, but they're far too complicated and many don't have built-in video output.

You might be wanting a Parallax Propeller.
posted by localroger at 9:49 AM on September 27, 2012 [1 favorite]


Am I the only one who is excited about emphasizing discoverability and dynamic, meaningful documentation? Because those things are shit now in almost every corner of the programming world. Even "good" documentation is seriously lacking as a learning tool, in my experience, and nothing is less discoverable than an empty text document.

APIs, programming languages, and everything else are constantly evolving. Lowering the amount of time needed to pick up a new tool is AWESOME and important. This seems like a great and achievable goal, and not at all a waste of time or something to scoff at because your emacs macros basically write the code for you!!eleven.
posted by jsturgill at 9:52 AM on September 27, 2012 [4 favorites]


He has some nice ideas, and he's right to draw attention to some history, but he comes across as out of touch with contemporary work.

This is something the Khan Academy suffers from, too. As well-meaning as they are, they employ very few (and until recently, no) actual professional educators. That is, they've fallen into that nerd fallacy of thinking that because they're smart and good at doing something difficult, everything else must be easy.

"Engineer's Disease", it's often called, and it's why so many really smart builders make such terrible managers, leaders and teachers; because they don't think those things are real skills. But managers, amirite? So let's try to revolutionize something we have open contempt for and don't really understand anyway, that'll work for sure.
posted by mhoye at 9:56 AM on September 27, 2012 [3 favorites]


where's my diphthongs?
posted by quonsar II: smock fishpants and the temple of foon at 10:08 AM on September 27, 2012


I am certainly not on the cutting edge of what the newest thought is on programming, but some of the stuff that seems odd in the article, like translating for loops to a while, has more to do with current programming fashions than anything else.

People sort of think that a for loop is a code smell. There is at least one comprehensive article about it but I don't have the link handy. The current idiom, the one that shows you are hip to what the kids are doing, uses an iterator or a while. It even says in the Ruby style guide "never use for".

Ruby peeps may have some legit reason never to use for, but what applies to Ruby does not apply to every other language. People see rubyers (rubyists?) eschewing for and it just becomes de rigueur to do it everywhere. If you wanna be a ninja pirate rockstar coder, you can't use for.

There are fashions in code just as everything else, and for loops are just out of fashion right now. I am guilty of it myself. I sometimes find myself rewriting for loops for no good reason.

Zed Shaw tackles the for versus each debate (not much of a debate, for is out, each is in)
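For a rough sense of the same idiom shift outside Ruby, here's a small Python sketch (the list is invented for illustration) - the index-keeping loop reads as old-fashioned next to iterating over the collection directly:

langs = ["ruby", "python", "perl"]

# index-based, C-flavoured
for i in range(len(langs)):
    print(langs[i])

# iterating directly, the currently fashionable idiom
for lang in langs:
    print(lang)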
posted by Ad hominem at 10:14 AM on September 27, 2012 [2 favorites]


We often think of a programming environment or language in terms of its features -- this one "has code folding", that one "has type inference". This is like thinking about a book in terms of its words.

I disagree. I think that thinking about a program in terms of the features it uses is like thinking about a book in terms of its words. A programming language/environment is a framework; it's a set of tools that are used to make other things. It is reasonable to think about languages in terms of their features.

For a lot of tasks it doesn't matter, because most good languages are pretty decent at most things. However, every language worth anything also has a sweet spot - something that it really rocks at - and other things that it's not so hot at. What those things are tends to be determined by the language features and the availability of libraries.
posted by It's Never Lurgi at 10:39 AM on September 27, 2012


The first example seems like a pretty good argument for keyword parameters:
setFillColor(red=0.63, green=0.85, blue=0.447)
drawEllipse(x=35, y=20, width=60, height=60)
drawRect(x=105, y=20, width=60, height=60)
posted by and for no one at 11:19 AM on September 27, 2012 [2 favorites]


ctrl-f pointer, no results found.
posted by k5.user at 11:36 AM on September 27, 2012 [1 favorite]


"never use for"
Use GOTO instead.
posted by MtDewd at 11:41 AM on September 27, 2012


> "never use for"
Use GOTO instead.
posted by MtDewd


You must first learn all the rules, so that you know when to break them.
posted by benito.strauss at 12:04 PM on September 27, 2012


The first example seems like a pretty good argument for keyword parameter

It's somewhat misleading because numeric literals being used directly as arguments is relatively uncommon, or is at least nearly universally considered to be a bad programming practice (normally referred to as a magic number). In most cases you would already have those values in constants or variables called width and height and whatnot, especially considering that in most types of programming you're not going to be dealing with very many hardcoded literals to begin with.
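A tiny Python sketch of the difference (draw_rect here is just an invented stand-in for a real drawing call):

def draw_rect(x, y, width, height):
    print("rect at", x, y, "size", width, "x", height)

# magic numbers: fine in a throwaway tutorial line, opaque in real code
draw_rect(105, 20, 60, 60)

# in real code the values usually already carry names,
# which gives you much of what the hover-over labels would add
margin = 20
cell_size = 60
gap = 25
draw_rect(margin + cell_size + gap, margin, cell_size, cell_size)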
posted by burnmp3s at 12:06 PM on September 27, 2012 [3 favorites]


Just looking at the example, I hadn't thought about that aspect of the problem; almost any numeric literal is a bad idea (except sometimes 0, 1, or -1). Similarly, boolean parameters are almost always a mistake.

Keyword parameters can still be handy, although implementations aren't that common. The C++ idioms for doing similar things are pretty whacky - lots of tiny methods returning *this for chaining...
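Roughly, that chaining idiom looks like this Python analogue (the Pen class is invented; in C++ each setter would return *this rather than self):

class Pen:
    def __init__(self):
        self.color = (0.0, 0.0, 0.0)
        self.width = 1

    def set_color(self, r, g, b):
        self.color = (r, g, b)
        return self  # returning the object is what makes the calls chainable

    def set_width(self, width):
        self.width = width
        return self

pen = Pen().set_color(0.63, 0.85, 0.447).set_width(3)
print(pen.color, pen.width)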
posted by and for no one at 12:45 PM on September 27, 2012


the fact is that the Khan Academy folks are the only people who've actually tried this

Huh? There are lots of interactive programming tutorials. I linked to one in my earlier comment, Learn Python. It's interactive and somewhat similar to the Khan Academy courses, which are actually a recent addition to decades of online and offline (book-based) teach-yourself-programming guides.

Just as examples, there's Ruby Monk, the W3Schools JS Tutorials (probably responsible for teaching a fair number of people I know basic JS), and if you want to go back in time, even an interactive bash tutorial, written in bash from 1996. Those are just the online ones; I have a whole bookshelf at home filled with various programming texts that I've acquired over the years. The "teach yourself xyz" concept dates back at least to the early hobbyist-PC era.

Earlier this year I had an 'aha' moment about why I so frequently choose to debug by adding print statements: this is frequently the easiest way to get a view of program behavior that cuts across time, something that is made particularly difficult by the graphical debuggers I've used (though it's doable via commandline debuggers of course)

IMO, this is because most debuggers are designed for use by people who are already proficient programmers, and have a mental model of how the program executes in their head. They're not very well designed as teaching tools for gaining that initial understanding. (Hell, they mostly seem like black magic.) Which is too bad, and maybe that's really where the author should have concentrated his attention. The IDE that he's slowly building up over the course of the article could just as easily take the form of an easy-to-use debugger that stepped you through the code's execution, maybe with some pretty graphics if that's really important.
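Python's standard trace hook gives a crude flavour of what that stepping-and-showing-state view could look like (a rough sketch only; the demo function is invented for illustration):

import sys

def trace(frame, event, arg):
    # print each line as it's about to execute, along with the local variables
    if event == "line":
        print("line %d in %s: locals=%r" % (frame.f_lineno, frame.f_code.co_name, frame.f_locals))
    return trace

def demo():
    total = 0
    for i in range(3):
        total += i
    return total

sys.settrace(trace)
demo()
sys.settrace(None)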

But this is how all natural learning happens - we make sense of various parts of the whole in isolation.

Sure, but there's a difference between breaking something complex up into manageable chunks and teaching them in parts, and doing what a lot of tutorials do, which is tell you to type in some stuff -- which is obviously programming, the thing you want to learn -- but don't explain it until Chapter 3 or 4, even though you're going to type it in (or copy/paste it, if you're lazy) a few dozen times before then. I think that's pretty poor practice and leads students directly into terrible habits, like copying and pasting code snippets that they don't understand into their own projects.

It's not as much of an issue in Python, Ruby, BASIC, or other interpreted languages, because you can actually make them do things without a bunch of boilerplate. Sure, there's a lot of stuff going on under the hood when you type `print "Hello, world!"` and press return, but at least you're not asked to type in stuff you don't understand. Having one black box in the equation is enough, without feeding unknowns into it as well.

The first example seems like a pretty good argument for keyword parameters

Agreed. The author's solution (using hover-over tips) is a band-aid; the better solution is to avoid the poor design choice in the first place. I'd go further and say that arguments or parameters that are parsed positionally, rather than as key=value pairs, are virtually always bad at least where beginners are concerned, and maybe just in general. It definitely smacks of laziness at the expense of readability.
posted by Kadin2048 at 12:49 PM on September 27, 2012


From the essay:

This essay was an immune response, triggered by hearing too many times that Inventing on Principle was "about live coding", and seeing too many attempts to "teach programming" by adorning a JavaScript editor with badges and mascots.

His video Inventing on Principle was about how you can let your principles guide you when you code or design. But everyone saw the neat demos at the beginning and ran with that. John Resig even said the talk was about "responsive programming environments". So I can understand why he is frustrated that everyone missed what he was trying to say.

That being said, his essay has a lot of "Perfect is the enemy of the Good" going on. Processing is not perfect by a long shot, but it does a great job of removing all the boilerplate code and crap you need to get graphics like that running on a web page. They want to teach programming, and want to start at a better place than setting up JavaScript canvas contexts. So being upset that they didn't design a new language and paradigm to change the way we think about programming is an overreaction.
posted by Gary at 1:11 PM on September 27, 2012


Gary, Bret Victor's principle is "creators need an immediate connection to what they create". This is the basic principle underlying all live coding environments. Live coding is much more than just automatic re-interpretation; it is generally much broader in usage, in terms of live exploration, social interaction and all that.
posted by yaxu at 1:26 PM on September 27, 2012 [1 favorite]


When he referred to live coding as "almost worthless", I assumed he only meant live coding as it is on the Khan website. Although, even on that point I think he's wrong because even that little step towards live coding is so much better for what they are trying to do than the usual "reload and run" loops.

By the way, yaxu, thanks for your link to TopLap. That site/organization looks really interesting.
posted by Gary at 1:47 PM on September 27, 2012


It's not really confidence-inspiring to have a bunch of stuff obviously going on behind the curtain that you're aware of but told not to pay attention to.

Oh man, reminds me of my attempt a few months ago to figure out Objective-C. I went through a step-by-step "make your first app" tutorial on the Apple website. Two pages in and I had no idea what I was doing. I got to about here:
You need to specify a delegate object for the text field. This is because the text field sends a message to its delegate when the user taps the Done button in the keyboard (recall that a delegate is an object that acts on the behalf of another object). In a later step, you’ll use the method associated with this message to dismiss the keyboard.
In no way did this explain what delegates were, why I needed them, or where I would need to use them again. They were just something going on "behind the curtain", with no explanation as to their use. Why do I need an object that acts on behalf of another object? Why can't the object act for itself? I guess there's a reason, but in the 22-odd years I've been able to program a computer I've never come across the concept before, and the tutorial brushed over it without teaching me anything. I gave up on the 8-page-long "Hello World" tutorial and decided iOS app development is black magic.

There are fashions in code just as everything else, and for loops are just out of fashion right now.

Yeah that Ruby style guide is a bit ridiculous. Who cares if loops don't give me a new scope for each iteration? They never have. Ever. Loops don't do that. If I wanted that I'd wrap my shit up in a function.
posted by Jimbob at 2:22 PM on September 27, 2012


Zed Shaw tackles the for versus each debate (not much of a debate, for is out, each is in)

Unfortunately, that essay is colossally stupid, largely because Zed Shaw doesn't know shit about what an idiom is, and has a bizarre picture of linguistics generally.
posted by kenko at 2:27 PM on September 27, 2012


Jimbob: In no way did this explain what delegates were, why I needed them, or where I would need to use them again. They were just something going on "behind the curtain", with no explanation as to their use. Why do I need an object that acts on behalf of another object? Why can't the object act for itself? I guess there's a reason, but in the 22-odd years I've been able to program a computer I've never come across the concept before, and the tutorial brushed over it without teaching me anything. I gave up on the 8-page-long "Hello World" tutorial and decided iOS app development is black magic.

It's my impression that "delegate" is used in certain circles (cough.. C#.. cough) to refer to an event handler and/or function parameter (but since functions cannot be parameters in these broken, broken languages, you need to supply an object implementing some kind of Callable interface).
posted by qxntpqbbbqxl at 2:46 PM on September 27, 2012 [1 favorite]


It's my impression that "delegate" is used in certain circles (cough.. C#.. cough) to refer to an event handler and/or function parameter (but since functions cannot be parameters in these broken, broken languages, you need to supply an object implementing some kind of Callable interface).

What? You very much can supply functions (both named and anonymous) as arguments in C# nowadays. Here's a snippet from a project I'm working on right now:
foreach (var f in basePaths.SelectMany(System.IO.Directory.EnumerateFiles)) {
    CheckFile(f);
}
Where EnumerateFiles is a static method on System.IO.Directory.

The concept of delegation in Objective C is also not what you're thinking. I've only taken a couple minutes to skim this, but it appears that a single delegate object can provide implementations for multiple methods on another object (kinda like Scala traits maybe?).
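To make the shape of the pattern concrete, here's a rough Python sketch of delegation (the class and method names are invented, not Cocoa's actual API): the text field doesn't decide what "done" means in your app; it asks whatever delegate it was handed.

class TextField:
    def __init__(self, delegate):
        self.delegate = delegate  # any object that answers the callbacks we need
        self.text = ""

    def user_pressed_done(self):
        # the field hands the decision to its delegate
        self.delegate.text_field_did_end_editing(self)

class SignupForm:
    def text_field_did_end_editing(self, field):
        print("saving value:", field.text)

field = TextField(delegate=SignupForm())
field.text = "hello"
field.user_pressed_done()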
posted by a snickering nuthatch at 3:17 PM on September 27, 2012 [1 favorite]


I've only taken a couple minutes to skim this

was supposed to link to this.
posted by a snickering nuthatch at 3:18 PM on September 27, 2012


Deep down everything is just a JMP.

In C#, a delegate is a function pointer.

Func<string, string> is a delegate that takes a string and returns a string; Action returns void.

this:

public LeetHelloWorld()
{
    Func<string, string> leetDat = (msg) => msg.ToLowerInvariant()
        .Replace('e', '3')
        .Replace('o', '0');

    outputDat("hello world", leetDat);
}

void outputDat(string msg, Func<string, string> stringMunger)
{
    Console.WriteLine(stringMunger(msg));
}

is the same as this

delegate string mungeHandler(string Msg);

public LeetHelloWorld()
{
    mungeHandler munger = leetDat;
    outputDat("hello world", munger);
}

string leetDat(string Msg)
{
    return Msg.ToLowerInvariant()
        .Replace('e', '3')
        .Replace('o', '0');
}

void outputDat(string msg, mungeHandler stringMunger)
{
    Console.WriteLine(stringMunger(msg));
}

All that is different is how you organize your code.

posted by Ad hominem at 4:20 PM on September 27, 2012


What? You very much can supply functions (both named and anonymous) as arguments in C# nowadays.

They must have added that since I looked at the language back in its early days. I retract my insult! How does the type system handle function pointers? IIRC typing problems were a big reason this proposal was not added to Java 7.
posted by qxntpqbbbqxl at 5:26 PM on September 27, 2012


It has had function pointers, delegates, since day 1. You could always pass a delegate as an argument.

Let's just say it is uncommon to see them used that way since it was a little gross looking.

A delegate is a pointer to a method with a specific signature. So they had to add generics.

After they added generics we were able to do generic delegates like

public delegate TResult Func<in T, out TResult>(T arg);

Which let us do stuff like

Func<string,string> munger;

Then they added lambdas.

C# really is pretty awesome.
posted by Ad hominem at 5:57 PM on September 27, 2012


Not as awesome as R.
posted by Jimbob at 6:41 PM on September 27, 2012


or F# 3.0
posted by Ad hominem at 6:44 PM on September 27, 2012


Ok, ok, I get it already. You hate emacs. Leave me alone.
posted by wobh at 7:54 PM on September 27, 2012 [1 favorite]


I loved the essay. Firstly, the web page itself is beautiful; it's obvious that he's a disciple of Edward Tufte (and not just because he mentions him a couple of times). There are videos, there are border notes, there are coloured indicators for the different elements and sections. It is always clear what he is talking about.

I am not "a programmer" but I do program. My most recent language / IDE has been a big chunk of MATLAB. Having read this essay, I realise that I really could have done with sparklines in MATLAB's workspace window (showing variables and their values or ranges). While MATLAB's matrix manipulations means that it can mostly do away with for loops, I had to use a ton of them so some of the ideas for visualising for loops over time probably would have been handy as I certainly got lost in a fair few nested loops (especially if I was reading someone else's code).

In an essay of this type you're never going to get agreement on everything. There are the beginners who have no idea of whether this is good or not. The seasoned experts obviously coped very well with whatever paradigm they learned with, so can't see the need for this methodology. I personally can't say if his ideas are realistic (as others have pointed out, I can't see how it can be generalised for all of it to be useful for my particular coding needs), but I love that someone is thinking about these things and has presented them so well.
posted by milkb0at at 1:52 AM on September 28, 2012 [2 favorites]


There's a discussion about this at LTU as well.
posted by destrius at 3:22 AM on September 28, 2012


Just going back to that "loops are bad" example - is the complaint just because of the C-style loop syntax?
for(var x=0; x <= 100; x++){
 print(x);
}
I know how to do loops like this, but they have always seemed a bit...strange. Outliers. The "for" statement sort of looks like a function that you're giving parameters, but they're messed-up, weird kinds of parameters, with semicolons between them like they're just a sequence of general commands. It makes one wonder if you could write it like this and have it work:
for(
 var x=0;
 x <= 100;
 x++;
){
 print(x)
}
Which now looks almost like you're writing some kind of sub-routine or function. So what would happen if you were to put something else in there? Some more general code? Instead of just incrementing x, could you increment multiple things at the same time? And what's the initial value of x supposed to be? Because it looks like we're setting it to 0 then incrementing it immediately after. It is actually weird syntax, compared to the much easier to follow while-loop.

However, other languages aren't like that. Observe a for loop in R:
for(x in seq(0,100)){
 print(x)
}
Or alternatively, to use a shortcut rather than the seq function (which just generates a vector of numbers in a sequence):
for(x in 0:100){
 print(x)
}
Python does a similar thing. I find this much easier to read, and if you play with it just a little, you find that it's giving the variable on the left of "in" the values, sequentially, of the vector to the right of "in". And it's obvious how it can be generalised to iterate over things other than numbers.
names=c("Fred","George","Marcie")
for(x in names){
 print(x)
}
It's still not perfect - "in" is still a weird little statement that's used nowhere else in the language; maybe that's why while-loops are preferred. The problem is, while-loops don't work nearly as well for iterating over something that doesn't depend on a condition being met.
names=c("Fred","George","Marcie")
idx=1
while(idx <= length(names)){
 print(names[idx])
 idx=idx+1
}
Suddenly we're back to having to keep an index of position. For-loops get dirty if you have to unexpectedly break out of them, but I'd argue that their syntax can be, in well-designed languages, easier to comprehend and easier to apply generally than while-loops.

...of course, all that's irrelevant because in R you'd just skip the loop altogether and do things functionally.
names=c("Fred","George","Marcie")
print(names)
The examples in the article are beautiful, though, both in their concept and visualisation. But yeah, very specific to drawing shapes in a graphics window in Processing. How would dragging a slider help me understand the internals of a function to fit a generalised least squares regression? Dragging a slider would be a little bit...unresponsive, if my loop involves downloading 200 CSV files, fitting an IDW interpolation over the data in them to create a surface, then saving the results as geographic raster files, to give an example of an actual for-loop I was responsible for today.

But I'm probably being deliberately stupid. The essay is about teaching programming, not actual programming. The overall, most important point, which I agree with completely, is that there needs to be interactivity. I learnt BASIC on a computer where you turned it on, and you were faced with a prompt. You could type:
plot 1,100
draw 100,100
To make it draw a line on the screen, instantly. Or you could type:
10 plot 1,100
20 draw 100,100
To make a program you could run. I still find this the most efficient, easiest way to program. You have a little flashing cursor there, and you use it to tell this computer to do what you ask. I have a combination of awe, jealousy and pity for those who can handle languages where you have to set things up to compile before you can see what the small change to the code you just made did.
posted by Jimbob at 5:14 AM on September 28, 2012 [1 favorite]


Jimbob: In idiomatic Ruby, you pass the loop body (as a closure) to a method on the collection, which then invokes the body for each element. There are syntactic features in the language which make this a natural way to write. It has some advantages, like being able to use loop syntax in more complex "do X for every Y" cases (traversing a tree, reading lines one-by-one from a file, etc)— if you've used higher-order languages you'll recognize this form of control-flow abstraction. In Ruby, since this style of iteration is idiomatic, using an explicit for is kind of an indication that you're trying to write C in Ruby (“you can write FORTRAN in any language”), rather than making use of Ruby. I don't think the criticism of for really extends to other languages, though; it's just a matter of using the language the way that works well for it. In R, for example, it's presumably bad to explicitly iterate over a list's elements instead of using operations that distribute over lists. In Scheme or Erlang, by contrast, the criticism would be that you're not using tail-recursion. And so on.
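In Python terms the shape of it is something like the following (a loose analogue only, since Python has no real equivalent of Ruby's block syntax):

def each(collection, body):
    # the loop lives in one place; callers just supply the body
    for item in collection:
        body(item)

each([1, 2, 3], lambda x: print(x * x))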
posted by hattifattener at 11:48 AM on September 28, 2012 [1 favorite]


It looks like most of you are coming at this article as programmers. I'm not a programmer, but I do help create online learning environments and am always looking for better ways for laypeople (your average university professor) to create classes online that are more than just traditional text-books posted online with a few videos thrown in for good measure (see: just about every MOOC I've ever come across). The Khan Academy example and courses like the ones offered by codeschool offer in-browser programming as a learning activity, but the real focus of Victor's essay in my eyes isn't on which language is better, but about how to best create a learning environment that reduces the cognitive load of the material. In that regard, I think his suggestions for a better programming learning environment are dead on.
posted by madred at 12:37 PM on September 28, 2012 [1 favorite]


In Scheme or Erlang, by contrast, the criticism would be that you're not using tail-recursion. And so on.

I guess my criticism of the criticism would be - what problem does doing things the wrong way cause? Does it make things slower or use more memory? Those are legitimate concerns. Or does it just theoretically create "nice" code? Nice code is dependent on the viewer - for-loops and while-loops are pretty basic and obvious. It took me a while to get used to functional syntax, and even now I'm really only smooth at it in R. And I don't have a clue what's going on in Obj-C's crazy syntax.

I guess I think it's a positive when there are multiple ways to solve problems in software - it's a sign of maturity.
posted by Jimbob at 3:00 PM on September 28, 2012


I guess my criticism of the criticism would be - what problem does doing things the wrong way cause? Does it make things slower or use more memory? Those are legitimate concerns. Or does it just theoretically create "nice" code?

Well, not using tail-recursion in a language that supports TCO is likely to create slower or more memory-intensive code, but I think the idea with using .inject, .each, etc. in ruby isn't that it theoretically creates nicer code, but that it creates theoretically nicer code—it's supposed to be more composable in that you can extend a chain of calls to do more, or do different things, more easily than refactoring a for loop.
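A small Python illustration of that composability point (the data is invented): extending the pipeline means adding a clause, while the loop version has to be re-threaded by hand.

xs = [3, -1, 4, -1, 5, 9]

# loop version: each new requirement means editing the body and the accumulator together
total = 0
for x in xs:
    if x > 0:
        total += x * x

# pipeline version: the same thing as one extendable expression
total2 = sum(x * x for x in xs if x > 0)

print(total, total2)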
posted by kenko at 4:57 PM on September 28, 2012


A little late to the party here, but one other relevant project that recently launched (more broadly, anyway) is Online Python Tutor. It lets you "execute" Python code in your browser, visualizing the state of the program as it runs, with a time scrubber to move back and forth through the code.

I've been thinking about interactive tools for teaching/learning programming recently, so I was glad to find it a few days ago. This essay is also a great find. I agree with those who say that these sorts of tools are very helpful for learning to program, not for programming in general. But the get-off-my-lawn types are off the mark, I think; there is no question that learning to program is now vastly more accessible than it was even fifteen years ago. Wax nostalgic about poking memory addresses all you like, but the languages, IDEs, tools, references, tutorials, and other resources freely and easily available today are worlds ahead of anything we had back then.

The development of tools like Online Python Tutor, the Khan Academy Tutorials, and the ideas expressed in this essay are great steps forward in that respect. They're not perfect, and we certainly should critique them to improve them and guide future work, but I'm heartened by the progress.
posted by whatnotever at 8:39 PM on September 28, 2012 [1 favorite]


Even later to the party as I was finally able to read the essay, and I thought it captured **exactly** what has bothered me about programming for years:

Expose the workings of code like you would in a cutaway model of an engine.

Bret's explanation was spot on in explaining a timeline (and I do understand the difference between synchronous and asynchronous); it makes perfect sense to me, a visual learner who is not very good at abstraction.

To me it seems that the joy of taking apart a clock (the mechanical ones, back when I was young) was that you got to see the inner workings and understand what made the hands move. It's a literal environment to say the least, but that's where I'm comfortable.

As someone who's played with Proce55ing since its early versions, Bret's suggestions are for those who are more literal in their creativity. We like to see the pistons going up and down.
posted by grefo at 9:16 AM on October 6, 2012




This thread has been archived and is closed to new comments