J programming language
March 26, 2011 10:10 AM   Subscribe

The J programming language is kind of like a super calculator (it’s been described as executable mathematical notation). It was developed by Ken Iverson and Roger Hui and is a successor to APL (and there’s no need to buy a new keyboard). The language is free and open source, and works on Windows, Mac, and Linux. A series of books and articles on using J are also available to download. To whet your appetite, here’s an article on using J to find the eighth ten-digit prime number that appears among the digits of pi.
posted by Jasper Friendly Bear (79 comments total) 39 users marked this as a favorite
 
Sweet. I've been looking for a powerful but terse programming language/calculator to use on a handheld. Is there a debian package yet? (I can't find one, but this name is ungoogleable.)
posted by DU at 10:18 AM on March 26, 2011


Nice informative FPP. Thanks.
posted by three blind mice at 10:20 AM on March 26, 2011


DU I've been trying to find one for the last 15 minutes. Google keeps throwing me Java related stuff.
posted by azarbayejani at 10:23 AM on March 26, 2011


I've been doing apt-cache searches and I don't find it. Must not be in there (yet?).
posted by DU at 10:28 AM on March 26, 2011


The best way to google for it is "j programming language" (in quotes). However, that doesn't mean I can actually find a Debian package.

The language is... something else. It's massively unlike anything else out there. I've been unable to come up with a good reason to learn it, however, because I can't really figure out a problem where it would shine (I know that it can technically be used for anything, but it's not the best choice for a lot of things).
posted by It's Never Lurgi at 10:32 AM on March 26, 2011


Check out Mind if I do a J? as well as J for C programmers if you're looking for something a bit more gentle. For more APL-like madness see Conway's Game of Life in APL (seriously watch it). J isn't strictly APL but I'm assured that they're comparably powerful.
posted by Skorgu at 10:41 AM on March 26, 2011 [6 favorites]


Strange.
posted by delmoi at 10:50 AM on March 26, 2011


Somehow or another I wound up following this guy on Twitter (probably via Steve Dekorte, who is responsible for io). He's constantly going on about J, which has made me kind of curious. Maybe this post will push me over the edge into actually playing with it, despite my considerable discomfort with the programming-as-mathematical-notation view of things...

I haven't found a Debian or Ubuntu package yet either. I thought there might be a licensing issue, but it's GPL3, so that doesn't seem to be it. Just grabbed the source, let's see how easily it builds...
posted by brennen at 10:55 AM on March 26, 2011 [1 favorite]


Because the only thing better than mathematical notation is mathematical notation in ASCII.

(Someday, cognitive science will save us all. It's going to take a while though.)
posted by effugas at 10:57 AM on March 26, 2011


(Someday, cognitive science will save us all. It's going to take a while though.)

I'm not exactly holding my breath, but would you care to elaborate on that thought?
posted by brennen at 11:04 AM on March 26, 2011


In the college I went to they taught APL to all engineering students. It was a way to teach array math.

APL, and, it appears, J, really excel at array math. I got spoiled and have been disappointed in just about every other language's handling of array math ever since. I'll have to take a look at J.
posted by eye of newt at 11:06 AM on March 26, 2011


Here's my experience: Want to generate the numbers from 1 to n? (my memory is probably fuzzy)

Python: range( 1, n)

bash: seq 1 n

Perl: 1 .. n

R: i n

For me, bash and Perl hit the mark: a common action has a short notation that looks similar enough to what I've seen before. Python? Well, it's nice to be explicit, but I get tired of typing it over and over. R? Very short, and it wouldn't be bad if it were the only one, but every command is that short, and I just can't keep them straight.
posted by benito.strauss at 11:07 AM on March 26, 2011
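For what it's worth, the fuzzy memory above is right to be hedged: Python's range excludes its upper bound, so getting 1 to n inclusive takes a little extra. A quick sketch:

```python
n = 5

# range(1, n) stops one short of n...
print(list(range(1, n)))       # [1, 2, 3, 4]

# ...so generating 1 to n inclusive needs n + 1 as the stop value.
print(list(range(1, n + 1)))   # [1, 2, 3, 4, 5]
```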


I've already devoted enough neurons to learning vi, dc and R. There must be a limit to the number of one character commands the mind can keep track of without significant confusion.
posted by autopilot at 11:08 AM on March 26, 2011 [2 favorites]


autopilot, I'm right there with you. vi gets all the one letter commands, because it got there first. Bash aliases get all the two letter combinations. Haskell has a chance, because all its 1 and 2 character combinations use the swear characters: #(*$)@^(!*)+_=+-_<><?. But any other programming language needs to use longer words.
posted by benito.strauss at 11:15 AM on March 26, 2011 [2 favorites]


These very terse languages are interesting, conceptually, but they are completely unmaintainable. That's their drawback: most coding isn't mapping your thoughts to a programming language, but rather trying to find where another programmer's thoughts went awry. It's a lot easier to debug Python (or Java) than it is to debug Perl... and APL-like languages take it to an entirely different level.
posted by sonic meat machine at 12:07 PM on March 26, 2011 [4 favorites]


brennen--

Look at how people are replying. They are speaking in terms of cognitive load -- what they can wrap their brains around.
posted by effugas at 12:12 PM on March 26, 2011


big question: what's J special function support like?
posted by oonh at 12:15 PM on March 26, 2011


effugas - yeah, sure, that I get, more or less. I guess I was curious how you expect cognitive science to inform the design of languages.

For whatever it's worth, I don't think terseness of expression is diametrically opposed to maintainability, readability, etc. The ability to render an idea or process concisely helps a good deal, where the programmer is concerned with communication and isn't mistaking shorter for better.

I share the feeling that the syntax of languages like J doesn't map very well to general-purpose programming tasks. Then again, neither do, for example, regular expressions - but there turns out to be room within the field of general-purpose programming for domain-specific tools that trade on a vocabulary of concise, powerful abstractions.

These kinds of things are also useful in the domain of interfaces. A text editor like vi contains a bunch of programmatic, composable abstractions for manipulating text, bound to single keystrokes and terse command-line operations. I wouldn't want to construct any large software project on this alphabet soup of commands, but they're extraordinarily useful in the context of working on code.
posted by brennen at 12:40 PM on March 26, 2011


The ability to render an idea or process concisely helps a good deal...

True, but I think conciseness in code is different from conciseness in characters. To me, conciseness in a programming language is a question of how much groundwork you must lay before you can do actual work. Java uses very long class, object, and method names, but compared to C I'd still call it "concise" because C requires so much memory management and other awkward code.

I think that J, Perl, and indeed Regular Expressions take "concise" too far and end up in the land of "terse," where you can't tell at a glance what the thing is even attempting to do. That's not to say they're not powerful – I use regular expressions very often and consider them an incredibly powerful tool – but they're definitely harder to maintain than other languages.
posted by sonic meat machine at 12:49 PM on March 26, 2011
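One concrete illustration of the concise-vs-terse distinction (a sketch, not from the thread): Python's re.VERBOSE flag lets the exact same regular expression carry its own documentation.

```python
import re

date = "2011-03-26"

# Terse: correct, but you have to decode it at a glance.
terse = re.compile(r"(\d{4})-(\d{2})-(\d{2})")

# The same pattern under re.VERBOSE: whitespace and comments are
# ignored inside the pattern, so it can explain itself.
readable = re.compile(
    r"""
    (\d{4})   # year
    -
    (\d{2})   # month
    -
    (\d{2})   # day
    """,
    re.VERBOSE,
)

assert terse.match(date).groups() == readable.match(date).groups()
print(readable.match(date).groups())  # ('2011', '03', '26')
```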


most coding isn't mapping your thoughts to a programming language, but rather trying to find where another programmer's thoughts went awry.

I completely agree. In fact, I've been saying for the last year or two that programming languages are not methods for humans to talk to computer for methods for humans to talk to other humans.

That said, J would be great as a programmable calculator language. You don't want to have to write out a bunch of semicolons, parenthesis and header files when doing an integral or transforming a matrix. I was thinking of using Scheme as a handheld calculator but J might be better.
posted by DU at 12:51 PM on March 26, 2011 [1 favorite]


methods for humans to talk to computer for methods for humans to talk to other humans.

Speaking of being unable to talk to other humans...
posted by DU at 12:53 PM on March 26, 2011


String DU.getSentence throws LanguageError() {
   ...
}
posted by sonic meat machine at 12:55 PM on March 26, 2011


could that have had any more errors in it
class DU extends MeFite {
   private String getComment(MeFiPost post) throws GrammarException {
      ...
   }
}
Kids, don't make programming jokes if you're going to screw it up.
posted by sonic meat machine at 1:24 PM on March 26, 2011 [2 favorites]


True, but I think conciseness in code is different from conciseness in characters. To me, conciseness in a programming language is a question of how much groundwork you must lay before you can do actual work. Java uses very long class, object, and method names, but compared to C I'd still call it "concise" because C requires so much memory management and other awkward code.

You raise a really good point here. There are a number of cross-cutting domains for all of this stuff. I wouldn't have thought to describe Java as "concise", but it's surely true that for a lot of operations it requires a great deal less expression from the programmer than a language like C. (At least until you hit one of those problem spaces where you're able to express things much more simply in C.)

I'll note that while I think Perl is highly relevant to this discussion, that's partly because it encompasses an unusually broad range of styles (both in capability and in practice). You can sometimes treat it as a sandbox for exploring the optimal (or, if you prefer, the most amusingly pathological) degree of brevity and symbolic abstraction in a given problem. But then I like Perl, and find a lot of its better features lacking in other environments, which puts me somewhere outside the norms of discussions which take its flaws as some kind of axiomatic baseline for awfulness in language design.
posted by brennen at 1:29 PM on March 26, 2011


Seems to me to be a combinator language over (mostly) numeric types? Then it's no wonder it can look that impenetrable (the old joke of "write-only programming"). While hugely expressive and very good for reasoning about programs in a formal setting (see: Algebra of Programming, Bird/de Moor's wonderful work), major programming projects in this style might be just painful unless you learn to rewire your thought processes in the combinator style... Remember, if you're reaaaaally masochistic, you can program everything under the sun with just 3 combinators (ok, actually 2), but good luck with debugging and sharing that code. It's good to recall, also, that things like Haskell get compiled into a (smallish) set of combinators, so I'd agree with anyone complaining that languages (almost) purely built on combinators should be thought of as assemblers or virtual machine control languages, not your actual programming level.
posted by Iosephus at 1:37 PM on March 26, 2011


The interesting thing to me about this source release is the style of the C code. It seems the authors don't get any more verbose when they have to write C. From a file called "w.c", which seems to have something to do with lexing:
static A jtconstr(J jt,I n,C*s){A z;C b,c,p,*t,*x;I m=0;
 p=0; t=s; DO(n-2, c=*++t; b=c==CQUOTE; if(!b||p)m++;    p=b&&!p;);
 if(0==m)R aqq; else if(1==m&&(z=chr[(UC)s[1]]))R z;
 GA(z,LIT,m,1!=m,0); x=CAV(z);
 p=0; t=s; DO(n-2, c=*++t; b=c==CQUOTE; if(!b||p)*x++=c; p=b&&!p;);
 R z;
}
The whole codebase is like this.
posted by skymt at 1:47 PM on March 26, 2011 [5 favorites]
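A hedged Python reading of what that C function appears to do: it looks like it parses a J string literal, stripping the outer quotes and collapsing doubled quotes into one (the two DO loops count, then copy). The function name and details below are guesses from the C, not from J documentation, and the special cases (empty string, single-character atoms) are omitted.

```python
QUOTE = "'"  # J string literals are single-quoted; '' inside means one literal quote

def parse_j_literal(s):
    """Rough sketch of jtconstr: 'it''s' -> it's."""
    body = s[1:-1]        # the DO(n-2, ...) loop walks the chars between the quotes
    out = []
    pending = False       # the 'p' flag: previous char was a quote not yet consumed
    for c in body:
        is_quote = c == QUOTE
        if not is_quote or pending:
            out.append(c)  # ordinary char, or the second quote of a doubled pair
        pending = is_quote and not pending
    return "".join(out)

print(parse_j_literal("'it''s'"))  # it's
print(parse_j_literal("'abc'"))   # abc
```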


But then I like Perl, and find a lot of its better features lacking in other environments, which puts me somewhere outside the norms of discussions which take its flaws as some kind of axiomatic baseline for awfulness in language design.

I actually like Perl, when I use it for text processing. I have yet to find a language which so powerfully and comprehensibly deals with text manipulation, particularly regular expressions. That said, the lack of a coherent design means that there are always "gotchas," and I have found myself staring at my own code wondering what the hell I had written. This "learn a feature, then forget it" problem is not something I've run into in any other language, but it's one reason I'm leery of terse languages.
posted by sonic meat machine at 1:48 PM on March 26, 2011


class DU extends MeFite {
   private String getComment(MeFiPost post) throws GrammarException {
    SnarkFactory sf = new SnarkFactory();
    sf.setLines(1);
    return sf.getSnark().snark();
  }
}


FTFY :)
posted by delmoi at 1:56 PM on March 26, 2011 [4 favorites]


Interesting idea, an ASCII APL. But poorly imagined, with weird unintuitive operators that change function depending on whether they're "monadic" or "dyadic"... I'm looking at you, '<.'. Obvious, right?

And then there's the replacement of division '/' with '%'. Because '/' has another use, don't you see. I don't think so.
posted by sea at 2:02 PM on March 26, 2011


Remember, if you're reaaaaally masochistic, you can program everything under the sun with just 3 combinators (ok, actually 2), but good luck with debugging and sharing that code.

Yes, if you don't care about such mundane matters as interacting with your environment.
posted by kenko at 2:03 PM on March 26, 2011


skymt, that's horrifying.
posted by kenko at 2:04 PM on March 26, 2011


Yes, if you don't care about such mundane matters as interacting with your environment.

Can I trade that third combinator for a monad?
posted by benito.strauss at 2:06 PM on March 26, 2011


If you do want to interact with your environment, you could do worse than Unlambda.

Here's an interpreter.
posted by kenko at 2:06 PM on March 26, 2011 [1 favorite]


Can I trade that third combinator for a monad?

Sure, you can choose from the Maybe monad, the Reader monad, and the Cont monad.
posted by kenko at 2:26 PM on March 26, 2011 [1 favorite]


The necessary quality metrics for programming languages are nowhere to be found in computer science. They may someday be found at the intersection between computer science and cognitive science.
posted by effugas at 2:31 PM on March 26, 2011


Writing programs is pretty easy, even in assembly language.
Understanding (modifying, fixing) an existing program is where the strengths of programming languages vary:

One- or two-letter names? _$? Mystery-meat prefixes? These kill comprehensibility.

Very_Long_Names_That_Mean_Something? Usually good. Except for Counter & Index as loop counters. For those k & i do quite well (provided they are not referenced outside the loop)

Special embedded languages (regular expressions and the printf syntax)? Very good, but hard to get into the culture.
A string special language would be preferable to the C++ STL garbage and the host of functions in other environs:
(S.substring(S.lastIndexOf('/'), S.lastIndexOf('.'))? You kiddin' me?)
Collections, too.

goto. Unfairly demonized. goto can be used comprehensibly, if commented with one of two comments: //forward or //back (for C/Java/etc):
L1:
for(int n= 0; n<4; n++){
L2:
   for(int i= 0; i<5; i++){
      // (do something)
      if(bigwhoops(i))goto L1; //back
      if(littlewhoops(i))goto L2; //back
      if(alldone(i))goto L4; //forward
      if(done(i))goto L3; //forward
   }
L3:
}
L4:
(even better with comprehensible labels, and no goofy after-tests like with break or continue)
posted by hexatron at 3:45 PM on March 26, 2011


effugas, could you expand on that a bit more? I think that cognitive load, and the need to reduce it as much as possible, is already well recognized in computer science. For instance, Code Complete talks about this aspect of programming at length, and that book has been a mainstay of programmers for the last decade or so.
posted by Balna Watya at 3:51 PM on March 26, 2011


The whole codebase is like this.

In our Computer Architecture class, the prof gave out some asm to hand-optimize. None of us made any progress because WTF. So he gave us the source C code so we could see what it was supposed to be doing.

It was an N line block of 80 character wide C code without a single whitespace character. Double WTF.
posted by DU at 4:21 PM on March 26, 2011


One- or two-letter names? _$? Mystery-meat prefixes? These kill comprehensibility.

I dunno, I think that once you learn to think with them, weird symbols can become just as natural and expressive as symbols like "*", "++", etc. I mean there's nothing, to a C programmer, opaque about an assignment like "*p++ = *s++". And if you've got enough Haskell under your belt, using <> and <> is both more compact and clearer than the functionally equivalent code not written with the functions exported in Control.Applicative.

They definitely kill comprehensibility until the point at which you've retrained your eyes, though. But then, using higher-order functions really extensively can become pretty confusing too.
posted by kenko at 4:27 PM on March 26, 2011


Oops. <$> and <*>, respectively, there.
posted by kenko at 4:32 PM on March 26, 2011


I've written some Perl on and off for about 20 years and yet I can still never do it without a manual perched on my lap, and I still make mistakes when indexing arrays. This is why I don't use Perl by preference any more.

As for J, it's even worse - and I still have a soft spot in my heart for APL.

These days, it's all Python. Typing range(0, n) isn't that verbose, and the language is overall the most compact language that's still extremely readable.

Since a piece of code is generally written once and read 10 or more times, readability trumps terseness every time!
posted by lupus_yonderboy at 4:39 PM on March 26, 2011


hexatron: I do not love your goto example at all. There is a place for gotos only when you cannot raise an exception, but even then, more than one label in real-world code (i.e. code that's over a page long) results in extreme confusion.

Just use an exception and be done with it! It expresses your intent clearly, and it also allows your calling code to make the final decision as to how to deal with the exceptional condition.
posted by lupus_yonderboy at 4:42 PM on March 26, 2011
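A sketch of the exception-based alternative described here, covering the forward-jump (goto L4) case from hexatron's example; the predicate is a stand-in for whatever condition ends the scan:

```python
class AllDone(Exception):
    """Signals that the doubly nested scan can stop entirely."""
    pass

def scan(alldone):
    # Break out of both loops the exception way: raising replaces
    # the forward goto, and the handler is the landing label.
    try:
        for n in range(4):
            for i in range(5):
                if alldone(n, i):
                    raise AllDone((n, i))
    except AllDone as e:
        return e.args[0]
    return None

print(scan(lambda n, i: n == 2 and i == 3))  # (2, 3)
print(scan(lambda n, i: False))              # None
```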


readability trumps terseness every time!

Define "readability" though. That's the point of this: once you learn to think with them, weird symbols can become just as natural and expressive as symbols like "*", "++"

Optimizing your language(/software/OS/hardware/whatever) for n00bs usually just makes it unfriendly to power users. And power users are the *real* users of anything non-pedagogical.
posted by DU at 4:59 PM on March 26, 2011


I learned APL on an IBM 5100, which was almost like a 'personal' computer. (It was at the office; I was working nights; it was available...) I pretty much have forgotten everything, but I really did like the non-ascii operators. Somehow they make more sense to me.
posted by MtDewd at 5:03 PM on March 26, 2011 [1 favorite]


Optimizing your language(/software/OS/hardware/whatever) for n00bs usually just makes it unfriendly to power users. And power users are the *real* users of anything non-pedagogical.

However, the most popular programming languages these days are Java and C#. Neither are exactly champions of "terseness." Python and Ruby aren't, either. Even if we look over at the functional side of things, Haskell is terse – but is barely used outside of academia.
posted by sonic meat machine at 5:06 PM on March 26, 2011


The general readability of Python code is absolutely my favorite feature of the language but despite that you can do some horrific things if you torture list comprehensions sufficiently.

I bet the C code for J is somewhere between clear and obvious to people with thousands of lines (?) of J under their belts.
posted by Skorgu at 5:13 PM on March 26, 2011


However, the most popular programming languages these days are Java and C#.

Yes, they are...among programmers who ~~suck~~ are inexperienced. I've seen it at work with a pretty large sample. The set of ~~terrible~~ beginner programmers and the set of Java programmers overlap almost 100%. The ones who aren't terrible but still use Java wish they weren't, because they are invariably forced to make terrible programs against their will.
posted by DU at 5:19 PM on March 26, 2011


> once you learn to think with them, weird symbols can become just as natural and expressive as symbols like "*", "++"

Sure, I agree with that - to a certain point. Heck, my first programming language was APL. But I found that a few weeks after I'd written an APL program, it'd be very hard to figure out WTF I meant by it.

Java and C++ are too verbose. Java is particularly bad considering they had the bad example of C++ to look at, and somehow they managed to make it more verbose (I'm not knocking C++, mind you, my current project is again in that language and it's fun...!)

APL and, I believe, J, are too terse. Unless you split everything up into tiny lines with intermediate variables, even if you're an expert, it's simply hard to decipher.

And Perl is just too weird. :-D I make too many mistakes.
posted by lupus_yonderboy at 5:20 PM on March 26, 2011


My parents used to program in APL when I was a kid in the 70s. On our home terminal, where by terminal I mean a thing with no screen where you typed your code on a paper feed. My memory of this -- can it be right? -- is that to get certain commands, they had to type a character, then backspace, then overstrike another character on top of it. I was awed.
posted by escabeche at 5:21 PM on March 26, 2011


> The set of terrible/beginner programmers and the set of Java programmers overlap almost 100%

Now, now - Google's ad system is written almost entirely in Java, for example, and these guys are serious programmers. I'd say that a lot of terrible programmers do gravitate to Java because you can just put together components and not really understand programming, but there are a lot of talented programmers out there too. I personally have fallen out of love with the language but there's a lot of good there...
posted by lupus_yonderboy at 5:22 PM on March 26, 2011


Maybe I should put that less "your favorite language sucks"-fully.

Of course Java and C# are popular. They are optimized for beginners. That's exactly my point. They are not at all meant for power users and once a programmer becomes a power user these languages are usually abandoned.
posted by DU at 5:23 PM on March 26, 2011


> is that to get certain commands, they had to type a character, then backspace, then overstrike another character on top of it

Absolutely right. Thorn and thistle, that weird character with the box and the quote in it (quote-quad! now I remember) - ah, the bad old days of the 70s!
posted by lupus_yonderboy at 5:25 PM on March 26, 2011


> once a programmer becomes a power user these languages are usually abandoned.

Again, I disagree, and point to Google as an example (for Java at least).

Java has numerous advantages for "enterprise" level systems. I don't know C# but I heard very good things about the language from programmers I respect.
posted by lupus_yonderboy at 5:26 PM on March 26, 2011


I don't know that I agree. While it is nice to use other languages, and my favorites are Python and Haskell (at least in theory – my understanding of Haskell is pitiful), there are a lot of Java jobs out there... and I don't think that every "power user" is going to find (or even want) a job programming games in C++ or embedded systems in C or [whatever-the-heck-you-use-erlang-for] in Erlang.
posted by sonic meat machine at 5:27 PM on March 26, 2011


there are a lot of Java jobs out there.

I don't understand all these arguments from numericity. Isn't it well-known that popular/numerous != good? Maybe we should just argue that Sun makes a lot of money from Java. It's about as relevant to whether good programmers use Java to make good programs.
posted by DU at 5:30 PM on March 26, 2011


I'm just saying that there are enough Java jobs that I would guess many good programmers are "Java developers." Someone has to write the libraries that bad Java programmers glue together, after all.
posted by sonic meat machine at 5:34 PM on March 26, 2011 [1 favorite]


As for ASCII-ed math derived symbols making it a mess and hard to follow, I'm aware that a few experimental languages have happily moved onto Unicode land, where you can pick from lots of math symbology to improve readability. The bad part about this is that entering the symbols can be a pain in the ass, but I suppose a good IDE could provide contextual menus with nice drop-down tables and all that. I had that in an experimental logical framework formalization tool I used for my work many years ago, and it worked pretty handily. Does anyone know if Unicode symbology is fully available in any usable languages today?
posted by Iosephus at 5:37 PM on March 26, 2011


Iosephus: there are huge advantages to having a programming language that you can touch-type...
posted by lupus_yonderboy at 5:49 PM on March 26, 2011


I did J in a university course, many years ago, and am still not entirely sure what a gerund is.
posted by acb at 6:22 PM on March 26, 2011


I did J in a university course, many years ago, and am still not entirely sure what a gerund is.

Along with the language itself having a steep learning curve, the documentation also has a learning curve. From the J Companion for Statistical Calculations (pdf)
The terminology of English grammar is used rather than that of programming languages. Functions are referred to as verbs whose arguments are often called nouns and pronouns instead of constants and variables. Verbs may be modified by adverbs and, for example, the verb +/ which gives the sum over a list is derived from the verb + plus by means of the adverb / insert. Also conjunctions allow the composition of verbs, and @ is the conjunction atop (or after) which, for example in the defined verb pos applies the verb on the left after the verb on the right so that the verb might be read "increment (after generating) the non-negative integers".
posted by Jasper Friendly Bear at 7:07 PM on March 26, 2011
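Translated into more mainstream terms (a rough analogy, not official J semantics): the adverb / corresponds to a fold over a list, and the conjunction @ to function composition.

```python
from functools import reduce
from operator import add

# J's adverb / ("insert") turns the verb + into +/, which folds +
# between the items of a list. The nearest Python idiom is reduce:
xs = [1, 2, 3, 4]
print(reduce(add, xs))  # 10, like the J expression +/ 1 2 3 4

# The conjunction @ ("atop") composes verbs, roughly:
atop = lambda f, g: lambda x: f(g(x))
increment_after_abs = atop(lambda n: n + 1, abs)  # "increment after abs"
print(increment_after_abs(-3))  # 4
```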


But I found that a few weeks after I'd written an APL program, it'd be very hard to figure out WTF I meant by it.

Yeah, I know this feeling, for sure.

You wouldn't believe how long it took me to decipher this line of Python:

D = lambda ((__,r)): ((D, (__[0], lambda p: (r(lambda r, _: (D, (__[1], lambda __: __(r, _)))) if p == c else (D, (__[1], lambda _: p(_, r)))))) if isinstance(__, tuple) else (r, __))
posted by kenko at 7:46 PM on March 26, 2011


No list comps, even.
posted by kenko at 7:49 PM on March 26, 2011


My write-only language was Forth (not counting assembler, which is barely writable and even less readable [I, and the other progs, were being paid. But seriously, a 60-line assembler subroutine with one commented line ("rts ;return") is just a thumb-in-the-eye. I have seen this. (I deny ever doing it)]). I could never make much sense out of Forth programs, even my own, though I have been assured that this is not true of everyone.

I have also been bit in the butt by Java, and not rarely. I recommend Concurrent Java ... for anyone writing Java thread stuff who hopes to create non-sometimes-mysteriously-crashing programs. Or do what everyone else does--catch Exceptions and start over again.

And lupus_yonderboy: Re exceptions: Except in 'slow' code (ok, that is 95% of all code) exceptions are slow... usually the stack gets rearranged and lots of weird crap goes on. If you can avoid it by a simple if(obj != null)obj.doit(), it's a big win and also tells the code-reader 'hey, obj may be null at this point, but I don't much care'.
The try{ obj.doit(); } catch(Exception e){} variant has lots of other possible meanings.

(and woe to the objective-c advocate who sayeth: what's wrong with obj.doit() when obj is null? May you operate a shop that accepts US$ and Monopoly money. :)
posted by hexatron at 8:21 PM on March 26, 2011


D = lambda ((__,r)): ((D, (__[0], lambda p: (r(lambda r, _: (D, (__[1], lambda __: __(r, _)))) if p == c else (D, (__[1], lambda _: p(_, r)))))) if isinstance(__, tuple) else (r, __))

That's not Python, that's poor man's lisp. lambda has its place, but it's horribly abused much of the time.

(Is this at least from something like the Python koans?)
posted by sonic meat machine at 8:42 PM on March 26, 2011


escabeche: "My parents used to program in APL when I was a kid in the 70s. On our home terminal, where by terminal I mean a thing with no screen where you typed your code on a paper feed."

That's what our University still had, in 1983. I've forgotten everything about APL, other than its bizarre character/command set, but I still remember one thing about the terminals: They accelerated as they finished a line. To me that was an incredible feat of engineering; a dot-matrix device that went zzzzzzzZZZZIP! as it printed, and yet didn't stretch any characters. And the speed at the end of the line was at least three times as fast as at the beginning. The timing on those things was rock-solid. It was fun just getting our listings.
posted by Hardcore Poser at 10:03 PM on March 26, 2011


That's not Python, that's poor man's lisp. lambda has its place, but it's horribly abused much of the time.

In fairness, it was deliberately obfuscated. This is the non-obfuscated (at least, the not intentionally obfuscated) version (still not exactly easy to decipher, especially out of context):
def descend(tc):
  tree, cont = tc
  def dcheck(lval):
    if lval == d:
      return cont(lambda rv, k2: (descend, (tree[1], lambda lv: lv(rv,k2))))
    else:
      return (descend, (tree[1], lambda rv: lval(rv, cont)))
  if isinstance(tree, tuple): ### application of tree[0] to tree[1]
    return (descend, (tree[0], dcheck))
  else:
    return (cont, tree)
It's a non-recursive "eval" loop (actually returning the next function to call, and its argument, to a trampoline) for an interpreter for an obfuscated functional programming language. "d" is a special form that delays evaluation of its argument. (I now recall that I linked it above, actually.)

Since it's an interpreter for an obfuscated language, I thought it should be obfuscated as well.
posted by kenko at 10:28 PM on March 26, 2011
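The trampoline driving that interpreter can be sketched in a few lines. The (function, argument) tuple protocol matches the description above, but the details here are illustrative:

```python
def trampoline(fn, arg):
    """Drive functions that return (next_function, next_arg) pairs
    instead of recursing, so the call stack never grows."""
    result = fn(arg)
    while isinstance(result, tuple) and callable(result[0]):
        fn, arg = result
        result = fn(arg)
    return result

def countdown(n):
    # Deep "recursion" in constant stack: each step returns the next
    # (function, argument) pair rather than calling itself directly.
    if n == 0:
        return "done"
    return (countdown, n - 1)

# 100,000 levels would blow Python's default recursion limit if
# countdown called itself; via the trampoline it's just a loop.
print(trampoline(countdown, 100_000))  # done
```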


Yes, they are...among programmers who ~~suck~~ are inexperienced. I've seen it at work with a pretty large sample. The set of ~~terrible~~ beginner programmers and the set of Java programmers overlap almost 100%. The ones who aren't terrible but still use Java wish they weren't, because they are invariably forced to make terrible programs against their will.
Ugh, this is such B.S. I realize Python is all trendy now, but honestly, for writing very large programs the static type checking you can do with Java is helpful. Java is what's taught to kids when they're in school, so obviously beginners are going to know it. But if you write real programs instead of toy stuff, people do use Java. I wouldn't call the author of Minecraft, for example, a noob. I don't think the people who developed Android picked Java as their main programming language because they were noobs, and so on.
posted by delmoi at 5:08 AM on March 27, 2011


As for ASCII-ed math derived symbols making it a mess and hard to follow, I'm aware that a few experimental languages have happily moved onto Unicode land, where you can pick from lots of math symbology to improve readability.
What I would like to see is a language that lets you type something different than what you see. For example, if you typed 'lambda' you'd get 'λ'. That would require some integration between the code and the editor, but with all the auto-complete editors out there it shouldn't be too hard.
posted by delmoi at 5:10 AM on March 27, 2011


You can pretty much get it wherever programming languages are discussed, but this "language x is for stupid people and children" stuff is seldom very helpful.
posted by brennen at 10:18 AM on March 27, 2011


but honestly for writing very large programs

Which of course Java ensures you will be writing.

the static type checking you can do with java is helpful.

Sometimes. Myself, I'm finding that I make simple type errors rarely. I can see that in theory, a carefully built type/class hierarchy could help catch some more complex entity-behavior related errors. But there's a lot of cognitive overhead in doing that, the chances of doing it well on the first pass or two are slim, and this is ground that's covered by good unit tests, which seem to me to provide a better overall ROI.

To bring this back to the discussion, though.... that cognitive overhead I'm talking about often also applies when you're reading the program. This isn't really a Java problem per se. Someone's supposed to have said that "in Smalltalk, everything happens somewhere else." It's what I think people are trying to get at when they complain about too much "magic" in frameworks -- so much abstraction and indirection across a hierarchy of classes that reading a program to find out what it does becomes unusually difficult until you have fully grokked the entire set of class relationships.

Good design can ameliorate that problem within the use parameters the design is meant for. And OO programming remains a great way to package certain libraries and behaviors. But when the philosophy overflows into the idea that all programming is modeling class relationships/behaviors I think it begins to get arguably more obtuse than a set of terse APL-ish operators.

But if you write real programs instead of toy stuff, people do use Java. I wouldn't call the author of Minecraft, for example, a noob. I don't think the people who developed Android picked Java as their main programming language because they were noobs, and so on.

This is true. Lots of people use Java because the platform and even the language have advantages, whatever its weaknesses may be. Some of the people who use it wield it very effectively. I imagine a lot of those people feel the same way I do about PHP, though: there are limitations to the language and practices common among large portions of its developer base that drive me crazy. But if you want to write something like WordPress which is going to run on any old shared hosting account, or you've got a quickie web form to do, or just a little bit of dynamic stuff to add to an otherwise static website.... PHP fits the problem domain so well that the weaknesses pale.
posted by weston at 12:15 PM on March 27, 2011


that cognitive overhead I'm talking about often also applies when you're reading the program.
I can see how it could be difficult for unintelligent people to understand.
posted by delmoi at 12:38 PM on March 27, 2011


I can see how it could be difficult for unintelligent people to understand.

Code is hard for intelligent people to understand. Although I sometimes think this wouldn't be quite as true if we programmers, as a culture, devoted less of our considerable intellectual energy to sniping at one another about tools and practices.
posted by brennen at 1:06 PM on March 27, 2011 [1 favorite]


I can see how it could be difficult for unintelligent people to understand.

If you'd like to close the circle on the "only dumb people don't like language feature/practice x" argument, you've successfully done so. If you found something incorrect, unfair, or otherwise frustrating about my comment, you haven't been particularly successful at articulating it.

Are you saying you really don't ever find that taking in an entire class hierarchy puts any load on your thought process? Or perhaps that if there is such a load, it's always lighter than grokking a terse operator?
posted by weston at 1:20 PM on March 27, 2011


I can see how it could be difficult for unintelligent people to understand.

It can be difficult for intelligent people to understand. More to the point, it can be difficult for intelligent people to maintain. Particularly so if the maintainer is not intimately familiar with all of the code, as is often the case. As Joe Newcomer observed, maintainability is everything in this business.
posted by Crabby Appleton at 5:41 PM on March 27, 2011


K. Iverson personal anecdote...
As a young lad in the mid-'70s I happened to live in Swarthmore, PA, which also happened to be the residence of Kenneth Iverson. The local high school participated in his research efforts and had several teletype APL terminals and eventually even IBM 5100s.

I was his newspaper delivery boy during that era, and Dr. Iverson generously offered and spent many an hour with me personally, working with APL and my budding interests in mathematics and computers.
posted by Consult The Oracle at 8:07 PM on March 27, 2011 [2 favorites]


This is a great post, thank you.

I think it is important to clearly point out that J is not just any old successor to APL. It is a successor to APL developed by Kenneth Iverson in collaboration with others, following on from his previous work developing APL.
posted by motty at 8:43 PM on March 27, 2011


More to the point, it can be difficult for intelligent people to maintain. Particularly so if the maintainer is not intimately familiar with all of the code, as is often the case.
How is that any different than with any other programming language? You can build complicated type hierarchies in any language worth using (including C, using function pointers), and because the type system isn't specified or defined by the language, each program can work differently, which can actually make it more difficult to maintain. For example, in JavaScript you have functions which are also objects, and you can change the 'class' of an object at any time by swapping out its functions and replacing them with different ones. The fact that Java is strongly typed makes it easy to see, for example, exactly what kind of object a function takes. In JavaScript everything is just a variable, and you have to hope it has the right properties. The only way to see what a function expects is to see where it's been called and what the parameters were. (Or hope for good comments.)
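The runtime rebinding described here for JavaScript works the same way in Python, another dynamically typed language; the `Greeter` class below is invented purely for illustration:

```python
# Dynamic rebinding: nothing in the language stops us from replacing a
# method on a class at runtime, and existing instances pick up the change.

class Greeter:
    def greet(self):
        return "hello"

g = Greeter()
print(g.greet())      # prints "hello"

def rude_greet(self):
    return "go away"

# Swap the method out from under every Greeter, past and future.
Greeter.greet = rude_greet
print(g.greet())      # prints "go away"
```

This is exactly the maintenance hazard being described: a reader of `g.greet()` can't know from the call site which behavior runs, whereas Java's static types pin the method down at compile time.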

Anyway, one of the points of Java is that the enforced type system is actually supposed to make it easier to maintain. It would be harder to maintain Java code for someone who doesn't understand Java very well, obviously. But honestly, if you have trouble with it then you're probably not a very good programmer, sorry.
posted by delmoi at 2:19 AM on March 28, 2011


Fancy seeing a post about J on the blue!

I'm the author of Mind If I Do A J? (mentioned in one of the earlier comments) and involved with OpenJ (a project to maintain the GPL'd source code). I'm pretty new to the language, but it strokes a pleasure center of my programming brain like few other languages do.

Interesting idea, an ASCII APL. But poorly imagined, with weird unintuitive operators that change function depending on whether they're "monadic" or "dyadic"... I'm looking at you, '<.'. Obvious, right?

Wow, I totally disagree! While at first glance the operators don't have a clear meaning, you get used to them extremely quickly (since it's basically the only thing to remember). The overloaded-valence verbs are a clever way to group functionality into a pared-down number of operators. Plus, they're often grouped logically; for example, <. is Floor/Min -- both monad and dyad are operations related to 'downward' comparisons.
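For readers who don't know J, the '<.' pairing looks like this in a session (indented lines are input, as in the J console; outputs written from memory, so verify against an actual interpreter):

```j
   <. 3.7        NB. monad: floor
3
   2 <. 5        NB. dyad: minimum
2
   >. 3.2        NB. the same grouping 'upward': monad is ceiling
4
   2 >. 5        NB. dyad: maximum
5
```

The monad/dyad pairing runs through the whole vocabulary, which is why learning the table once goes a long way.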
posted by wrok at 5:27 AM on March 28, 2011


delmoi, I agree with your last comment, mostly. All I intended to say was that, in my experience, the code I've encountered that was written in a more procedural language (mostly C) is more "local" than the code I've encountered that was written in a more object-oriented language (mostly C++). Typically, if one is able to narrow down a problem to a short section of code, then fixing it is usually relatively straightforward in C, whereas in C++ you might still have to track down code in a dozen different source files to understand what's going on. A good IDE can make that easier, I admit. And I'm assuming that knowing the language well is a prerequisite for this.
posted by Crabby Appleton at 2:17 PM on March 30, 2011


« Older Legocraft   |   Let the chips fall where they may. Newer »


This thread has been archived and is closed to new comments