print "And now for something completely %s" % 'different'
December 12, 2011 6:09 AM   Subscribe

This month, Python won "Best Programming Language" in the Linux Journal's Reader's Choice Awards 2011. If you're not convinced, Python Facts explains little simple things that make Python great.

Python is relatively easy to learn, although if you wish you can learn it the hard way (videos).

If you already have experience in programming, basically all you need to know is that Python is essentially pseudocode as a programming language. Of course, Python is way more than that. You can dive right into the documentation and browse through the thousands of packages written in Python.

Python is great for things like shell scripts, text processing, or data structures and algorithms.

However, it's great for other things, too:
Web Programming
The most popular framework for web programming in Python is probably Django, but there's also Pylons, web2py, web.py, Tornado, and many, many others.
Game Programming
Previously on MetaFilter, but also check out the PyGame library. Python is also the scripting language for Blender and can be (and has been) used as the scripting language for many other game engines.
Science!
There's actually an annual conference dedicated to using Python for science, but if you're not convinced, see its applications in biology/life science, earth science, chemistry [PDF], physics, quantum physics, astronomy, and more. If your brand of science is very math-y, you'll probably also run into NumPy, a package for scientific computing.


Also, although Python is often run as a script from the command line, it can also be used to build GUI applications.

Although Python ships with its own IDE, there are many IDEs available for use with Python. If you're already using Eclipse for coding in other languages, you can add the excellent PyDev plugin.

Once you've started working in Python, if you're the social type you might want to check out one of the local user groups, also known as PIGgies, or attend a Python Meetup.

The initial developer of Python and its Benevolent Dictator for Life is Guido van Rossum (previously). Insights on the current state and future development of Python can be gleaned from the list of PEPs (Python Enhancement Proposals). The two most important PEPs are the Style Guide for Python Code and The Zen of Python (available as an Easter Egg directly in Python, just enter import this into the command line).

And, in case you were wondering, the name was in fact based on Monty Python and so instead of the traditional foo or bar for metasyntactic variables (or "placeholder names"), you're often likely to see spam, eggs, and ham used (PEP 8 listed above uses both foo and spam as variable names).

Oh! Oh! And then there's the huge Python 2.x vs Python 3.0 thing. Basically, although it was not a complete rewrite from scratch (as per PEP 3099), it broke compatibility with Python 2.x, meaning many programs written for Python 2.x will not work if run in Python 3. Quite a few Python libraries still run in 2.x only, and so most people have a version of 2.6 or 2.7 running on their computers in addition to the 3.0 version (if they have a 3.0 version at all). Major changes include new string formatting conventions and the print statement becoming a print() function. String formatting and printing are used all the time in Python, so this is a big deal.
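The two changes side by side, in a sketch that runs under Python 3 (the Python 2 spelling is shown in the comments):

```python
# Python 2: print "And now for something completely %s" % 'different'
# Python 3: print is a function, and str.format() is the newer formatting style
old_style = "And now for something completely %s" % 'different'
new_style = "And now for something completely {0}".format('different')

print(old_style)   # %-formatting still works in Python 3
print(new_style)   # prints the same string
```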
posted by Deathalicious (147 comments total) 127 users marked this as a favorite
 
There's a script for converting Py2 to Py3, and version 2.7 included a number of features backported from 3.0. So I guess there are a lot of people writing in version 2.7 but running 2to3 every now and again just to keep it compatible-ish.

This is a case where software holy wars make the problem much worse than it has to be.
posted by LogicalDash at 6:13 AM on December 12, 2011 [2 favorites]


HOLY WAR!

I actually prefer Lua.
posted by clvrmnky at 6:16 AM on December 12, 2011 [2 favorites]


A couple of great python things I've seen lately:

Simulated Knitting
Boston Python Workshop: We empower women of all backgrounds to learn practical programming in a beginner-friendly environment.
Python on the Web
posted by zamboni at 6:19 AM on December 12, 2011


I'm working my way through Learn Python the Hard Way right now, with no real programming experience other than dicking around with BASIC 20 years ago, and it's... hard. I think I'm getting over the hump at the halfway mark of the book, but everything is so alien to me that it's incredibly difficult to wrap my head around even the simplest things.

I love the philosophy of it, and the idea of having a "base" language in my repertoire, but... wow. I thought that being a writer with a firm grasp of "language" would help, but it's really a different order of thinking.

(and I posted this a while ago to Jobs -- it's still open, so if you're a Python person looking to make easy cash over a couple of hours a week in the evenings, please let me know!)
posted by Shepherd at 6:20 AM on December 12, 2011


Shepherd, I recommend How to Think like a Computer Scientist for an easier intro.
posted by LogicalDash at 6:22 AM on December 12, 2011 [3 favorites]


So I click through to the "little simple things", and the first offering is this:
How can addition be more easier than this?

print eval('+'.join(map(str, [1, 1])))
.... right. Silly me for thinking 1 + 1 is simpler.

Python... I'm happy for y'all. Use it and be happy. For me - I got Perl, PHP, Java, C#, and C in the quiver. I didn't learn Ruby, yet the world didn't end... I probably won't jump on this bandwagon either.

Besides, real languages don't depend on whitespace. You can have my braces, brackets and semicolons when you pry them from my cold dead hard drive.

< /getoffmylawn>
posted by Artful Codger at 6:25 AM on December 12, 2011 [8 favorites]


From the first link: Python continues to dominate. Close on its heels this year, however, is C++.

That’s kind of like saying that the vehicle of the year is the Trek Belleville, closely followed by the Siemens ES64U4.
posted by wachhundfisch at 6:28 AM on December 12, 2011 [10 favorites]


Google App Engine is built around python and Django as well.

The tutorials there are a good exercise for getting your feet wet with python, I think.
posted by empath at 6:30 AM on December 12, 2011


I've been using and loving Python for over 10 years now. And as much of a fan as I still am, I am concerned with the Python 3.0 split: case in point, the title of this very post is not legal Python 3.0 :-)

At any rate, one more reason to learn Python is that you can run it on the JVM (via the excellent Jython implementation) and .Net (IronPython).
posted by costas at 6:33 AM on December 12, 2011 [2 favorites]


I love Python. It's the first programming language I really had fun in.

Java and C/C++ are great, and I get the speed benefits of a compiled or even more efficient scripted language. But Python is fun. And for 95% of what I do, it's fast and efficient enough.
posted by mccarty.tim at 6:33 AM on December 12, 2011


One more bit on Jython/IronPython: one great productivity enhancer is to use Jython to give a nice, easy-to-program API to some massive Java program (a command line prompt into the process memory space? no biggie) and not have to deal with the Java "baggage" while using great Java libraries. I haven't used it, but I understand that the same is true for IronPython and C#.
posted by costas at 6:37 AM on December 12, 2011


Python is my favorite of the couple dozen languages that I've used in one way or another over the years, although I don't know Ruby. I love how compact it is and how powerful the libraries are. I took a PSP class a few years ago and we all had to do the same programming exercises but we could use any language we wanted. Most of the rest of the class picked either C++ or Java while I used Python. My programs were a fraction of the length of most of the other students'.

Unfortunately, my current job has me doing most of my work in Tcl.
posted by octothorpe at 6:37 AM on December 12, 2011


How can addition be more easier than this?

print eval('+'.join(map(str, [1, 1])))
Yeah, that one's a joke, and not a very good one. Python can do addition just fine, with operators or functions.
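For the record, the boring versions, using nothing but builtins and the standard library's operator module:

```python
import operator

# Addition without eval() gymnastics: an operator, a builtin, a function
print(1 + 1)
print(sum([1, 1]))
print(operator.add(1, 1))
```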
posted by zamboni at 6:38 AM on December 12, 2011 [7 favorites]


Just to be clear: I tried Ruby, didn't click for some reason.
posted by mccarty.tim at 6:39 AM on December 12, 2011


Does anyone else have difficulty reading python source due to the lack of brackets? Despite its purported easy learning curve, this small distinction has made it difficult for me to use easily. I think brackets help me "see" the logic of a program far more than indentation.
posted by yorick at 6:40 AM on December 12, 2011 [3 favorites]


yorick, test that theory with a regex! s/^ *//
posted by LogicalDash at 6:42 AM on December 12, 2011 [1 favorite]


yorick: Like semicolons, Python lets you use brackets if you like. You can actually write code that looks like it'd be at home in Java or C++ (save for some different method names) and it runs just fine.
posted by mccarty.tim at 6:42 AM on December 12, 2011 [2 favorites]


But they're not required.
posted by mccarty.tim at 6:43 AM on December 12, 2011


I don't mind the indentation thing, it's the _self this and _self that that annoy me.
posted by GallonOfAlan at 6:44 AM on December 12, 2011 [2 favorites]


Yorick, I hated this too initially, but just like riding a bike you eventually get used to it and actually can scan and recognize blocks right away. It's the people who use two space indentation who should burn in hell.
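For anyone still on the fence, a made-up example of what those indentation-delimited blocks look like (classify is invented purely for illustration):

```python
# Indentation alone marks where each block begins and ends
def classify(n):
    if n % 2 == 0:
        return "even"
    return "odd"

print(classify(4))
print(classify(7))
```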
posted by Deathalicious at 6:47 AM on December 12, 2011 [5 favorites]


I do find that Python is easier to scan than bracketed language but I don't usually get too worked up by syntax quirks.
posted by octothorpe at 6:50 AM on December 12, 2011


I've never seen "_self" in Python code, and indeed it doesn't really make sense given what I know about the use of underscores to prefix names. From what I understand, they're used to indicate "private" attributes (Python doesn't have actual private attributes; the phrase I've seen describing this is "we're all adults here") and so there's no reason to use one on "self."

Nitpicking: the syntax in the post title is deprecated. It should be
print "And now for something completely {0}".format('different')
posted by valrus at 6:50 AM on December 12, 2011 [1 favorite]


It's the people who use two space indentation who should burn in hell.

Better to burn than spend any time in a heaven with hard tabs!

set tabstop=2

I like Python quite a bit, but it never quite clicked with me the way Ruby did. I can tell the people who came from Python to Ruby by the number of times they use self inside class definitions (inside of instance methods, self is implied in Ruby).
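The contrast in a made-up class (Greeter is invented for illustration): in Python the receiver is always spelled out.

```python
class Greeter:
    def __init__(self, name):
        self.name = name                # explicit self, always required

    def greet(self):
        return "Hello, %s" % self.name  # self again; Ruby would just say name

print(Greeter("Brian").greet())
```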
posted by ndfine at 6:56 AM on December 12, 2011


I can get an amazing amount done with awk/sed/wget/grep and other unix tools wrapped in a sh script. I prefer it since it's so easy to debug and port and modify, and primal, like drinking Pabst Blue Ribbon. You kids and your "fancy" languages! I know C well enough for more complex applications. Never learned Perl. Python is top of the list, which I know would give me more options. Thanks for the post.
posted by stbalbach at 6:56 AM on December 12, 2011 [1 favorite]


Python is a great programming language: it has a big community behind it, which is essential these days, and the Python way of doing things tends to be very elegant and intuitive compared to other popular languages. The incompatibilities between 2.6/2.7/3.0 are probably the most annoying aspect to me, especially when it comes to getting different libraries to play nice together.
posted by burnmp3s at 7:00 AM on December 12, 2011 [1 favorite]


I'm sliding from Perl to Python mainly because of Python's fundamental support for utf-8 and other encodings.
posted by gregoreo at 7:07 AM on December 12, 2011


Your favorite programming language sucks.

Just kidding. I love Python. After I picked it up, I never wrote another program in the horrifying mess that is called perl.

Two space indenters will be the first against the wall when the revolution comes. Try to think for once of the poor bastard who has to clean up after your mess!
posted by double block and bleed at 7:09 AM on December 12, 2011 [2 favorites]


Not using the new string formatting convention in the title? Tsk.
posted by urschrei at 7:13 AM on December 12, 2011


Backwards incompatibility is a rather significant issue for hosting providers. If you want to have a lot of support for the various versions you end up having to use virtualenv or rvm (ruby), which is just another layer of complexity to manage. More stable languages are generally preferred. Python has a lot going for it, but that's a pretty big flaw, and ultimately it's not vastly better than other languages where it matters. (Just my humble opinion.)
posted by Godspeed.You!Black.Emperor.Penguin at 7:18 AM on December 12, 2011 [1 favorite]


set tabstop=2

*high fives ndfine* (Though don't forget set expandtab!)

I keep meaning to mess with Python since it looks really cool and working so much with PHP (for my job) has withered my soul. Seeing some functional goodness above, even if it was for a joke, has gotten me excited again. Might try to work through some Project Euler stuff this weekend. Mmm functional programming.
posted by kmz at 7:31 AM on December 12, 2011


One of the reasons I like Python, regardless of any programming language features, is the spit and polish they have put on their Windows and Mac versions. They come in standard packages, like msi files for Windows, making deployment easy. Its libraries have support for everything from web apps to churning through csv files. It's a rather infectious language in that respect, as you can install it anywhere, and use it to write nearly anything.
posted by zabuni at 7:35 AM on December 12, 2011


Lists are being replaced in many cases by iterators, which is pretty great. And I see that they kept lambda (whose fate has long been in question), which is also pretty great. At this rate, Python will evolve into Haskell by about 2050.
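For instance (Python 3 behavior shown; in Python 2, map() eagerly returned a list):

```python
# map() returns a lazy iterator in Python 3; nothing is computed until consumed
squares = map(lambda x: x * x, range(5))
print(list(squares))   # materialize only when you actually need the values
```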

There's a script for converting Py2 to Py3

No way is that sound, unless it includes a Python 2 interpreter emulator. So it will work for simple code, but when 'eval' starts to happen, everything will go to hell. Anyone who runs this on important code and expects it to work is in trouble.
posted by qxntpqbbbqxl at 7:37 AM on December 12, 2011


Also, re:
tabstop=2
I think PEP8 has something to say about that.

*glares
posted by urschrei at 7:37 AM on December 12, 2011 [2 favorites]


Does anyone else have difficulty reading python source due to the lack of brackets? Despite its purported easy learning curve, this small distinction has made it difficult for me to use easily. I think brackets help me "see" the logic of a program far more than indentation.

No. In fact, I find Python easier to read than curly-brace languages.

Python is executable pseudocode. I love it.

If you aren’t a formally-educated-CS-type who thinks in pseudocode, this may not be a selling point of Python for you.
posted by spitefulcrow at 7:38 AM on December 12, 2011 [1 favorite]


No way is that sound, unless it includes a Python 2 interpreter emulator. So it will work for simple code, but when 'eval' starts to happen, everything will go to hell. Anyone who runs this on important code and expects it to work is in trouble.

Anyone who uses eval statements in important code is in trouble.
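For the narrow case where you only need to parse literal values, ast.literal_eval is the usual safer substitute (a sketch, not an endorsement of designs that need it):

```python
import ast

# literal_eval accepts only Python literals; arbitrary expressions are rejected
print(ast.literal_eval("[1, 2, 3]"))
try:
    ast.literal_eval("__import__('os')")
except ValueError:
    print("rejected")
```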
posted by spitefulcrow at 7:42 AM on December 12, 2011 [10 favorites]


I love Python. It's absolutely fun to write code in it. I also take some pleasure in Javascript now, but Python has so many wonderful and powerful language constructs. zip, iterators, and generators together make for a really good time working with list data. Lua's neat too if you need a tiny / embedded runtime.

Shout out to PyPy, a new Python runtime that is itself implemented in Python. They are doing some very aggressive JIT optimization work and a lot of real world code is now running faster in PyPy than CPython. Not sure it's quite ready for production use, there's some controversy, but I'm excited about it.

I feel bad for the Perl community. Their language used to be the fun, powerful scripting language. But it's experienced a slow waning. The extended development of Perl 6 and Parrot seems to have harmed the community in some way I don't really understand. Python's got its growing pains too (Python 3, PyPy) but it doesn't seem to have hurt the language.
posted by Nelson at 7:43 AM on December 12, 2011


Isn't the biggest issue with PyPy that it won't run C modules? Which is a problem for a lot of things which simply can't be done fast enough or low enough in regular Python?
posted by mccarty.tim at 7:45 AM on December 12, 2011


I'm not positive, but I think the issue is that PyPy doesn't work with CPython's C extension modules. The PyPy folks have several options for invoking C libraries; ctypes is the furthest along. I think the theory is that, in general, folks won't need to write C extensions for speed because PyPy will be faster. If you need to use some complex existing C library (say, geos), then you use ctypes. Whether this will work in practice remains to be seen. PyPy is definitely a work in progress, I just find it exciting.
posted by Nelson at 7:54 AM on December 12, 2011 [1 favorite]


Isn't the biggest issue with PyPy that it won't run C modules? Which is a problem for a lot of things which simply can't be done fast enough or low enough in regular Python?

You can interface C code with PyPy (and CPython) via the excellent ctypes module. As somebody who has used Python's C APIs to bind C libraries to Python interfaces, I have to say it's unusually good for a language's FFI, but ctypes really seems like a technological leap beyond it.
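A minimal ctypes sketch (POSIX-only; passing None to CDLL loads the symbols of the running process, which includes libc on Linux):

```python
import ctypes

# Bind libc's strlen and call it directly from Python
libc = ctypes.CDLL(None)                  # POSIX-only shortcut to libc
libc.strlen.restype = ctypes.c_size_t
libc.strlen.argtypes = [ctypes.c_char_p]
print(libc.strlen(b"spam"))
```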

Somewhere back in the dusty Usenet archives on Google is a "yes" vote from me on the creation of the comp.lang.python newsgroup. I haven't been a total fan of all of the directions Python went in since 1.5.2 or 1.6 or so (to my mind still close to the platonic ideal of the language), but it's still a great, fun language, with tons of interesting options for web development.

Also, maybe this is a good place to drop in this old NTKnow from 2004. "Perlites are chaotic/good trickster archetypes who love such events, Pythonistas are peaceful, have-their-glasses-on-a-little-string types, like hobbits or the Dutch." The crack at the end about "feel the punctuation rising in you" is a reference to the recent introduction of Java-inspired decorator syntax to Python.
posted by whir at 8:06 AM on December 12, 2011


The first one I got was "Integer division can yield floating point results, with an import (or on Python 3, by default)". Uh, great, I love having to guess what behavior a built-in operator will have. Or using a language that changes its mind on how math works based on version number. Good sell.
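The behavior in question, runnable on either major version thanks to the __future__ import:

```python
from __future__ import division   # the default on Python 3, opt-in on Python 2

print(7 / 2)    # true division: 3.5
print(7 // 2)   # floor division: 3 on either version
```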
posted by 0xFCAF at 8:30 AM on December 12, 2011


One more shout out to SciPy and NumPy. If you're doing scientific computing of any sort, they are astonishingly comprehensive libraries. In astronomy, they've replaced a lot of what people used to do with Matlab and IDL.

Unfortunately, I've always found the installation of these packages annoying - eggs, ez_install, easy_install, build your own - and on my system there were at least four different site-packages areas in use at some point, all subtly incompatible with each other. I've been using Enthought of late, and been very happy with it.

(But my first choice for a quick job that doesn't fit into one line of grep|awk|sed is still Perl...)
posted by RedOrGreen at 8:33 AM on December 12, 2011


Today, I posted my subjective experience with a programming language online, and all the responses were polite. Nobody believed me when I told them.
posted by yorick at 8:36 AM on December 12, 2011 [3 favorites]


Python Ecosystem - An Introduction is another useful guide for people who are new to the Python world and want to learn about interpreters, libraries, etc.

RedOrGreen: you're right, the install systems in Python are kind of a mess. FWIW the community seems to have standardized on pip, along with virtualenv to manage deployment environments. Here's an introduction to pip and virtualenv.
posted by Nelson at 8:41 AM on December 12, 2011 [1 favorite]


This is a great post, but "Best Language" arguments drive me nuts. They're the kind of things that developers do instead of getting work done. The best language is the one that accomplishes the task you sat down at the machine to do so you can get back up and go back to life.

I came to Python about a year before RoR got released, forced into learning it to customize some of the HTML that was rendered by Inktomi. Given, however, how that Inktomi intern wrote his code (try learning the concept of significant whitespace when it's all clumped inside HTML comments that wrap across lines, etc.), it's a wonder I stuck with it, but I'm glad I did. I've looked at Ruby a few times and it seems just as nice a language, but I haven't had a need for it yet.

One nitpick: "I can tell the people who came from Python to Ruby by the number of times they use self inside class definitions (inside of instance methods, self is implied in Ruby)."

Hmm. It could be they came from Python (well, I suppose it has to be since they're writing "self" instead of "this"), but it could just be they think "implied" is a concept best left for polite conversations and not explicit computer code. I can't think of someone worse at inferring what you meant than a computer.
posted by yerfatma at 8:57 AM on December 12, 2011


Python rocks, and I wish I had an excuse for using it more often than I get to. One thing I've found it immensely useful for is data migration and/or automating code generation... I worked on a project that necessitated taking about 120,000 records out of a FileMaker database and into a Drupal web site. I wound up using Python to chew through a massive CSV export and spit out PHP command-line scripts to re-create them as native Drupal nodes. I don't think there were more than 25 lines of code (not counting the PHP script template).
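The core of that kind of job is a few lines with the standard csv module; the column names and output format here are invented for illustration:

```python
import csv
import io

# Chew through a CSV export and emit one generated line per record
export = io.StringIO("id,title\n1,spam\n2,eggs\n")
for row in csv.DictReader(export):
    print("node %s: %s" % (row["id"], row["title"]))
```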

Dive into Python is another nice intro to the language. Stay away from the O'Reilly Learning Python book, though. It simultaneously over-explains some things while bringing up others and then saying "But don't worry about that right now."
posted by usonian at 9:10 AM on December 12, 2011 [2 favorites]


I think the reason we have this best language argument is that there are so many object-oriented scripting languages. Ruby and Python share a lot of functionality.

I don't think any rational person argues that C and Python are equally suited to all tasks, though. Python would freeze a common embedded processor, and C's speed and low-level nature are not needed for running, say, a web forum or social network (it'd work, but it'd be overkill). So there is definitely no single best programming language, just as there's no best tool on a woodworker's bench.

I've heard it argued in the MIT Computer Science open courseware videos that the important thing is learning any programming language and learning to think algorithmically. After that, it's mostly about learning syntax to pick up a new language. Anyone with experience agree with that? I mean, I guess it has limits, in that you don't learn new paradigms that easily (e.g. Haskell).
posted by mccarty.tim at 9:13 AM on December 12, 2011 [1 favorite]


I'm sliding from Perl to Python mainly because of Python's fundamental support for utf-8 and other encodings.

Perl actually has fairly wonderful support for UTF-8.
posted by alex_skazat at 9:14 AM on December 12, 2011 [2 favorites]


This is the most poignant Python program I've ever written.
posted by tykky at 9:24 AM on December 12, 2011 [2 favorites]


I've said it before, and I'll say it again:

from __future__ import braces
posted by 3.2.3 at 9:28 AM on December 12, 2011


Indentation conveys functionality.

Invisible things giving meaning to code feels profoundly wrong to me.

Also there are no semicolons on the end. I hate that.

And finally, it's another damned duck-typed language ("if it looks like a duck and talks like a duck, it's a duck"), so

foo = 3;
foo = "hello"

defines what a variable is.

It's fine and good if you have a big picture, but if you're jumping into the middle of a big project, you need a lot more context to figure out what's going on.

So:

foo = GetWorldType(worldobject)

means you need to find all the variables involved and look at the API to figure out what's going on. You don't know what the function returns by default.

Whereas

int foo = getWorldType(worldObject)

Gives a bit more context. And you can follow the logical train backwards more easily.

Admittedly, I'm coming at this from a game development background.

For example, I hate non-semicolon-terminated languages because I had to do a bunch of localization work on Ultima Online. We had around 30,000 hardcoded strings in both the C and Wombat (a Java-style language) scripts.

I wrote a tool to automatically pull the static strings from the files and rewrite the source accordingly. It was a lot simpler because I had a semicolon terminator to work with for each of the lines.

The duck-typing complaint comes from the 4am cold call where you're woken up and asked "Why does this NPC have 1.5 million objects in their inventory?".

And you need to pore through thousands of lines of code that you haven't seen, that have been written and touched by countless people for the past N years (upwards of 13 for some of my older games).

So deriving context quickly and removing gotchas from code you haven't seen before are huge priorities.

Python is a great example of a language that's too freeform. Too open-ended, and clever. It encourages excessive cleverness, which, while nifty in the abstract, is the bane of everyone's fucking existence at 4:00am when you're trying to find out why sewing kits are deleting other players when you click on them.

Case in point, from the Python Facts website:

# True and False are just variables

True,False = False,True

print False
print True


That's just programming jackassery. Don't ever do that or someone who has to fix your stuff will find you and punch you in the neck.
posted by Lord_Pall at 9:35 AM on December 12, 2011 [4 favorites]


Imho, the Sage project's Cython has been Python's biggest contribution to programming language design; it definitely raised the bar for foreign function interfaces.

Python was the first language to make 'everything is an object' casual enough for everyday scripting use too, right?
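In the everyday sense that even literals, builtins, and classes carry attributes:

```python
# "Everything is an object": numbers, functions, and classes all have attributes
print((255).bit_length())        # int literals have methods
print(len.__name__)              # builtins are objects with attributes
print(isinstance(int, object))   # classes are themselves objects
```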

Perl has a better golf game, of course. Merry Christmas! And C++ wins other fun games. Haskell is obviously the "best" language though.
posted by jeffburdges at 9:38 AM on December 12, 2011


You can reassign True and False's values? Why would you do that?
posted by mccarty.tim at 9:39 AM on December 12, 2011


I've heard it argued in the MIT Computer Science open courseware videos that the important thing is learning any programming language and learning to think algorithmically. After that, it's mostly about learning syntax to pick up a new language. Anyone with experience agree with that? I mean, I guess it has limits, in that you don't learn new paradigms that easily (e.g. Haskell).

Well that's the thing. Algorithms give you the fundamentals how to do shit. Then there's the different categories of languages aligned along several mostly orthogonal axes that capture different ways of using algorithms to do shit. OOP vs non-OOP, functional vs imperative vs declarative, etc. (Many languages, Python among them, support multiple categories.) Moving between these categories is usually non-trivial. I loved functional programming in college because I already loved recursion and math. Others though hated it and the failure rate for that class was something like 60%. (There were other factors involved there like everybody and their cousin wanting to do CS because this was the height of the late 90s tech bubble.)

Moving between languages within a specific category combination can be somewhat straightforward.

And then you get into the implementation details. These are details that can end up mattering a lot. Security, version incompatibilities, etc. For me that's where the boundary between computer science and everyday programming sits.
posted by kmz at 9:42 AM on December 12, 2011


You can reassign True and False's values? Why would you do that?

Nobody would actually do that short of some kind of diabolical programming contest context. (I did UIL Comp Sci in high school. The shit they pulled in there was ridiculous.)

But hell, you can pull off dick moves in C++ with operator overloading too. That something is possible does not mean it actually gets used.
posted by kmz at 9:49 AM on December 12, 2011 [1 favorite]


You can reassign True and False's values? Why would you do that?

You can make whatever identifiers you want in your own namespace. That doesn't rebind identifiers in other namespaces:

>>> dir()
['__builtins__', '__doc__', '__name__', '__package__']
>>> True,False = False,True
>>> dir()
['False', 'True', '__builtins__', '__doc__', '__name__', '__package__']
>>> __builtins__.True
True
>>> True
False
>>>
posted by 3.2.3 at 9:59 AM on December 12, 2011 [1 favorite]


I've heard it argued in the MIT Computer Science open courseware videos that the important thing is learning any programming language and learning to think algorithmically. After that, it's mostly about learning syntax to pick up a new language. Anyone with experience agree with that? I mean, I guess it has limits, in that you don't learn new paradigms that easily (e.g. Haskell).

It depends on a lot of variables. I've seen web developers move from C#/Java to a dynamic language like Python effortlessly. They're doing the same thing, just in a different language.

I would not expect the same uptake if you were moving a web developer to create a messaging middleware application in Erlang.

Similarly, you simply can't beat experience in a large library like .Net. A good C++ programmer working on, I don't know, graphics programming would certainly pick up C# quickly, but it would still take time to understand and really know .Net.

Of course the MIT courses are set to teach you how to think algorithmically and like a computer scientist. I think that's very important, and something any professional programmer should have, but I've seen plenty of good programmers who have no CS background do work on real applications and would totally blank out at the mention of a hashtable.
posted by geoff. at 10:10 AM on December 12, 2011


There's been a fair amount of hand-wringing in the Python community recently about why Python 3, now three years old, still hasn't caught on. Armin Ronacher wrote an essay on this subject.

Personally, I've started writing software with from __future__ import print_function but I probably won't switch over to Python 3 until it has a killer app.
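That import makes Python 2 accept the Python 3 spelling, including keyword arguments the old statement form can't express:

```python
from __future__ import print_function   # no-op on Python 3, enables print() on 2

print("spam", "eggs", sep=" and ")      # sep/end/file need the function form
```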
posted by grouse at 10:12 AM on December 12, 2011 [1 favorite]


Don't forget ScraperWiki — you can also use PHP or Ruby but for people jumping in Python is probably the easiest way to go.
posted by OverlappingElvis at 10:14 AM on December 12, 2011


I feel bad for the Perl community. Their language used to be the fun, powerful scripting language. But it's experienced a slow waning.

This is true in general mindshare terms, and yeah, personally, I don't write a lot of stuff in Perl myself much anymore -- in fact, a year or two ago I went to do a quick one-off script and realized that while I still knew Perl semantics, I didn't remember many of the exact invocations. So much of both my recent work and my personal experiments has been in other languages.

But I don't feel bad for the Perl community much, because it still is a fun, powerful scripting language. When I check in, I find that there's been some development both in terms of core features (yes, even in 5) and in how people are learning to use the incredibly powerful language itself, that the people who use it are still getting things done, that many of 'em are as happy as any Python or Ruby programmer, and I often find myself gaining insights into CS and software development to boot.

The lost mindshare does mean some problems -- it means people who know how to use the language in a domain it may even be better for than the languages with better mindshare will face obstacles in getting other people to sign on. That's about the biggest downside as far as I can see, though.

Python's got its growing pains too (Python 3, PyPy) but it doesn't seem to have hurt the language.

"As it stands, Python 3 is the XHTML of the programming language world... incompatible to what it tries to replace but does not offer much besides being more 'correct'."

I think Python has some things going for it that mean it won't face quite the same challenges with 2 -> 3 that Perl has with 5 -> 6, but I wouldn't assume that the transition isn't hurting developers and potentially making them think about investing elsewhere. As one data point, I'm wary about investing too much more in Python until the move to 3 seems really bridged, much as I see to like about the language.
posted by weston at 10:16 AM on December 12, 2011


After that, it's mostly about learning syntax to pick up a new language.

I'd disagree, for a number of reasons; one is that the standard libraries in languages, and your familiarity with them, make a huge difference to productivity. In practice, if you're working in any moderately popular language, a huge amount of time is spent gluing together bits from huge libraries of well-written code. The bane of the pragmatic programmer's existence is someone who re-creates functionality in their own special-snowflake fashion.

Secondly, understanding idiom makes a huge difference. Just as with learning a natural language, there's a huge difference between how one expresses oneself as a beginner versus as a fluent speaker, and, more importantly, in whether one understands others. Natural languages like English and Dutch are notorious for this; it may not matter in a technical sense whether you use map/reduce, or store run-time config options in a properties file, or what have you, but doing it differently to everyone else working in a given language will make a difference.

Finally, languages can have very different paradigms. Many people calling themselves programmers will go to ridiculous lengths to avoid learning how to use SQL effectively, because the paradigm of a declarative language is so alien to them; they do dumb shit like write simple queries, pull huge data sets into a language they do know, and then re-implement joins, GROUP BY, and the like very badly indeed.
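To make the anti-pattern concrete, here's a sketch (table and column names invented for illustration) of the same aggregation done both ways, using the sqlite3 module from the standard library:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("alice", 10.0), ("bob", 5.0), ("alice", 2.5)])

# The dumb way: drag every row into Python and re-implement GROUP BY.
totals = {}
for customer, amount in conn.execute("SELECT customer, amount FROM orders"):
    totals[customer] = totals.get(customer, 0.0) + amount

# The sane way: let the database do the aggregation.
rows = conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer")
assert dict(rows) == totals  # same answer; the DB just does it far better
```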
posted by rodgerd at 10:23 AM on December 12, 2011


I would actually love to read a thread where people argue about which woodworking tool is the best. That would be awesome. Because clearly it's the framing hammer.

But failing that, as one more data point, I think compact white-spaced languages are highly readable. After a little while it becomes just as easy to see the structure as it is with brackets. But as a bonus:

(1) They (mostly) make sure that blocks are indented correctly -- whereas if you read bad Javascript, for example, you often find places where the indentation is messed up and the bracket is on the wrong line and you could easily miss what's going on. So it's easier rather than harder to scan overall structure.

(2) They take less vertical space, so you can see more of the structure on one page, and that makes things easier and faster in some cases.

So I'm a fan of Python, Ruby, CoffeeScript -- that whole modern range of object-oriented C-like scripting languages that make a conscious attempt to be pretty. But mostly I think CoffeeScript deserves the attention, because that is worlds better than writing Javascript, and it's a drop-in replacement, and Javascript is everywhere. The people should know.

Or we could just talk about hammers.

...

Python is a great example of a language that's too freeform. Too open ended, and clever. It encourages excessive cleverness, which, while nifty in the abstract, is the bane of everyone's fucking existence at 4:00am when you're trying to find out why sewing kits are deleting other players when you click on them.

PEP 20: There should be one-- and preferably only one --obvious way to do it.

You would make a ton of sense if you were talking about Perl (which I like sort of the way I like the English language, for its comprehensive quirkiness). But in the realm of duck-typed, dynamically-executed languages, Python is basically the one designed for clarity and simplicity. You might not be a fan of scripting languages, but they have their place. If you ever end up in that place, this one might be a good fit for you.

(Incidentally, pity the lawyers, who have to write their codes in English. I've tried arguing for Latin, which would be more technically correct, but it doesn't support modern libraries and the end user support just isn't there ...)
posted by Honorable John at 10:25 AM on December 12, 2011 [6 favorites]


Getting to code in Python is always a treat, wish I could use it more. Recently I've had to delve deeply into JavaScript, it's so warty by contrast. Choice of language matters, because it affects the way you think about a problem. This is why people get heavily invested in Haskell or Lisp.

I don't get the syntactic whitespace hate; it's such an insightful choice. In well-formatted code, braces are simply noise. But I guess it's one of the many programmer religions: do it the C way or it's wrong. All I can say is: try it, you'll like it.

While we're on the subject of religious wars:
* indents are 4 spaces
* tabs are evil and should be rejected at checkin
* K&R brace style wins (because of JavaScript and Go, oddly enough)
* Exceptions are *not* the spawn of the devil
* C++ has grown too complicated, it's way too hard to implement a C++ compiler correctly.
posted by and for no one at 10:25 AM on December 12, 2011 [2 favorites]


(Indeed, last time someone offered the "I just need to know the maths, the implementation is irrelevant" rationale in a workplace setting I ended up yelling at them; the floors above and below heard the chewing out, because I'd just spent a couple of days getting an order-of-magnitude speedup in some code by pulling apart his buggy, unreliable, half-arsed mud ball of SQL and Tcl and re-writing it to work properly. Studying relational algebra and dinking with MySQL 3-era DBs does not make you competent.)
posted by rodgerd at 10:27 AM on December 12, 2011


Many people calling themselves programmers will go to ridiculous lengths to avoid learning how to use SQL effectively because the paradigm of a declaritive language is so alien to them they do dumb shit like write simple queries, pull huge data sets into a language they do know, and then re-implement joins, GROUP BY, and the like very badly indeed.

Oh JFC, really? I guess I shouldn't be surprised since I still occasionally read TheDailyWTF, but that's a fucking nightmare.

And I thought ORM over-reliance was bad. Hey guys, indexes might just be important when you start having non-trivial amounts of data!
posted by kmz at 10:36 AM on December 12, 2011


Back in 2007, I made a couple shirt designs.

Your Favorite Programming Language Sucks (black t-shirt, white text on front).

Your Favorite... (White shirt, black text on front, and on back in a variety of languages, a short program made to print this phrase, a la "Hello World")

Umm, I apologize if this isn't kosher, though I think self-linking in comments is ok, just not up front.

Anyways, I do like Python, even if there are some things that irk me about it. Looking at it is like poetry. I have the book Core Python (though my version is a bit old by now), and it's great to get more than just the syntax -- a bit of the "behind the scenes" of the language, like how references work and how things get deleted from memory.
posted by symbioid at 10:39 AM on December 12, 2011


Dammit, there goes my idea for writing a post on my favorite programming language for the December contest.
posted by mysterpigg at 10:40 AM on December 12, 2011 [1 favorite]


On the Game Programming side, folks should definitely check out Panda3D -- it was used to build Pirates of the Caribbean Online and is amazingly well featured for something with such a seemingly low profile.
posted by AbsoluteDestiny at 10:45 AM on December 12, 2011 [2 favorites]


I meant to learn Python at one point, but never got around to it. Instead I learned whatever was being used at the job I was at (tcsh, awk, sed, then bash, C) or in the class I was taking (C, C++). Now I find myself wondering if I should take the time to go learn Python, Lua, or Ruby. I wonder which would be easiest coming from a mild C background? Which one does the least strange stuff?

Next question: Why are all 'modern' languages interpreted?

Also, and for no one, by 'K&R' you mean 'Allman' style, and by '4' you mean '2', right? ^^
posted by Canageek at 10:54 AM on December 12, 2011


Stackoverflow : What is the hardest language to learn?   I lol'd at Malbolge and Whitespace.
posted by jeffburdges at 10:56 AM on December 12, 2011


By the way, Pygame Subset For Android gives me hope of one day being able to program for Android completely in Python instead of Java. It actually includes a pretty comprehensive subset of the Python library and a few Android hooks (specifically, for things that are typically needed for games). I haven't looked into the library code yet, but I wouldn't be completely surprised if the rest of the major hooks could be added pretty easily, so that you could make a non-game app using the native Android widgets with the current look and feel. Even better would be the ability to select manifest items like resolution level. I haven't gotten to the "compile for Android" stage yet in my little game I'm making, but it looks about as easy as running py2exe (which is to say, not completely trivial, but not incredibly difficult once you get past the wonky edge cases).
posted by mysterpigg at 10:57 AM on December 12, 2011


and for no one: "Getting to code in Python is always a treat, wish I could use it more. Recently I've had to delve deeply into JavaScript, it's so warty by contrast."

JavaScript is maybe my second-favorite language after Python. It's actually incredibly robust and elegant; the trick is you have to recognize its (considerable) strengths and write to them. JS isn't strictly OO and it isn't strictly functional but in some ways it can do things purely functional or purely OO languages can't. I heartily recommend any of Crockford's talks on Javascript. Start with JavaScript: The Good Parts.
posted by Deathalicious at 11:04 AM on December 12, 2011 [1 favorite]


Which one does the least strange stuff?

Lua is the smallest and the simplest, and it's quick. Ruby is cool, but it does import a lot of the Perl weirdness, magic globals, and other oddities. Python is very coherent, well worth the time to learn and use. It has grown larger over time, but they've made a real effort not to throw a bunch of stuff into the language just because.

by 'K&R' you mean 'Allman' style, and by '4' you mean '2', right? ^^

I spent more years than I'd care to admit coding with Banner style and 2 character indents. When I use Allman/ANSI style, I end up drifting into Horstmann formatting.
Choice of style simply isn't that important; it's a good idea to keep a module consistent. It's good for your brain to work on code that uses different styles.

But K&R is still the winner overall.
posted by and for no one at 11:11 AM on December 12, 2011


JavaScript is maybe my second-favorite language after Python

I'm currently working on a JavaScript to PostScript compiler, based on Crockford's TDOP parser (Why? Don't ask). JavaScript has a lot of coolness and some ugly warts.

CoffeeScript sure looks like fun, though. It's almost Python in many ways.
posted by and for no one at 11:15 AM on December 12, 2011


and for no one: "
* indents are 4 spaces
* tabs are evil and should be rejected at checkin
"

Tabs are at worst neutral in a world in which everyone indents 4 spaces. In the real world, where some people indent 3 spaces, some indent 4, and some indent 2, tabs are way better, because you can choose the visual indentation you want and people can combine your code with theirs even if you indent 2 "spaces" and they indent 8 "spaces".

In Python, my goodness, if you indent 2 spaces and I indent 4 spaces, our code is literally uncombinable without serious conversion. In my code, this line:
        print "I don't like %s" % food
is a mere 3 levels in:
class Diner(object):
    def dislike(food):
        print "I don't like %s" % food
Whereas yours is, what, 5 levels in?
class Diner(object):
  def dislike(*args, **kwargs):
    def who_indents(*args, **kwargs):
      def like_this_seriously(food):
        print "I don't like %s" % food
      return like_this_seriously
    return who_indents
I use spaces in Python because the style guide says to and because my IDE is set up thusly. But every time I encounter someone who uses 2 spaces or 3 spaces for their indentation, I pine for tabs.

I still, to this day, have never, never, never heard an argument against tabs that did not involve, in some way, "Well, what if someone is using a combination of tabs and spaces! What then?" Which is kind of like saying, "drinking beer is waaay better than driving cars, because what if someone drinks beer and then drives a car?"
posted by Deathalicious at 11:16 AM on December 12, 2011 [1 favorite]


Yeah, that one's a joke, and not a very good one.

There's a lot of not-very-good stuff on that site, but there is one thing that beautifully illustrates how much Python just gets the hell out of your way and lets you get on with what you're doing:
import datetime

today = datetime.datetime.now()
last_week = today - datetime.timedelta(days=7)
next_year = today + datetime.timedelta(weeks=52)
print today
print last_week
print next_year
After some of the juggling I had to do with date stuff in Java, this made me weep with joy.
posted by Mr. Bad Example at 11:16 AM on December 12, 2011


and for no one: "I'm currently working on a JavaScript to PostScript compiler, based on Crockford's TDOP parser (Why? Don't ask)."

Just know that when the time comes, I will kill your print job.
posted by Deathalicious at 11:23 AM on December 12, 2011 [3 favorites]


I feel bad for the Perl community. Their language used to be the fun, powerful scripting language. But it's experienced a slow waning.

It's interesting... I've been doing Perl work recently (coming from a Python background), and while this was certainly my perception when I started, I've come to realize it's actually not true at all. There's a thriving and vital Perl community, and they've been doing some awesome work to rectify a lot of the failings of the core language. (Moose being the most obvious example.) The syntax still sucks, though, and one major advantage Python has over Perl is the simple size of the standard library. (You can get packages to do most everything you need from CPAN, but that's nowhere near as good as just having them there already.)
posted by asterix at 11:28 AM on December 12, 2011


I think CoffeeScript deserves the attention, because that is worlds better than writing Javascript

Maybe you can be the first person I challenge who can actually defend this statement.

Because so far, I've been unable to detect any significant advantage to CoffeeScript. I think the aesthetic angle is perfectly acceptable as rationale for preferring/choosing it yourself -- by all means, work in a language in which you're happier looking at the screen. But to me it's in the class of choosing the syntax highlighting you prefer. I also see some nice shortcut expressions, and I like typing less as much as anybody, but as advantages go, this isn't enough for me to justify putting language-level abstraction over what essentially remains the semantics of JS.

If that's enough for you, that's fine with me. But please distinguish between "I prefer this" and "this is worlds better."

(I might add that while I like Ruby and Python, I sometimes feel that the communities around both languages have more than their share of opinionated programmers who could improve at this distinction. I heartily recommend both languages, but don't drink the kool-aid, kids.)
posted by weston at 11:29 AM on December 12, 2011


I pine for tabs.

It's good that you have a strong opinion on the use of tabs. I disagree, for reasons I consider valid and that carry more weight to me than your argument. But it's not important.

Really, I wasn't trying to start religious wars on multiple fronts; with the whitespace issue I was trying to point out that any discussion about programming can devolve into unproductive arguments about personal preferences. In the end, these issues are trivial. It's *because* they are trivial that the annoyances they cause become somehow so tremendously significant to us.

Why should such a trivial issue cause me so many headaches? What causes my headache is different from what causes yours.
posted by and for no one at 11:39 AM on December 12, 2011


Why are all 'modern' languages interpreted?

All of these languages are compiled. They compile to a virtual machine instead of the native processor, for a number of reasons, including portability, safety, ease of implementation, and more.

Almost without exception these languages are implemented in C.
posted by and for no one at 11:42 AM on December 12, 2011


I've been unable to detect any significant advantage to CoffeeScript.

* as mentioned above, choice of language matters. It affects how you think about a problem, what tools you have at hand, and how concisely you can express a solution.

* the single most significant factor affecting a program's quality, in terms of correctness, robustness, maintainability, coherence, etc., is size. Smaller is better. Any language that helps you do more work with less code is objectively better in at least one significant way.

In both of these areas, CoffeeScript compares favorably to JavaScript.

... opinionated programmers ...

Ahh, such a rare beast, almost never seen in the wild. HAMBURGER
posted by and for no one at 11:50 AM on December 12, 2011


Canageek: "Next question: Why are all 'modern' languages interpreted?"

It is a conspiracy of processor and hardware manufacturers, to ensure that you need the latest and greatest 500 gigahertz machine to do the same task you would do on an Apple II.

Also, the people who are smart enough to write a good compiler can get better money doing other things, and interpreters are much easier to write.
posted by idiopath at 11:56 AM on December 12, 2011


idiopath: "Canageek: "Next question: Why are all 'modern' languages interpreted?"

It is a conspiracy of processor and hardware manufacturers, to ensure that you need the latest and greatest 500 gigahertz machine to do the same task you would do on an Apple II.

Also, the people who are smart enough to write a good compiler can get better money doing other things, and interpreters are much easier to write.
"

I disagree with both of your points, if they were meant in seriousness. First of all, it's true that interpreted languages require more processing power, but that might be a chicken/egg thing: interpreted languages may have emerged simply because the processing power is there. Realistically, interpreted languages are generally faster to program in than compiled ones, because you can run your code right away without waiting for it to compile.

I'm also not convinced that writing a compiler is more difficult than writing an interpreter. For one thing, compilation tools have progressed to the point that it's often possible to compile a language without writing a dedicated compiler from scratch; for another, writing a good, efficient interpreter can be hard work too, particularly if your goal is performance comparable to a compiled language.
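To put some numbers on "easier", though: a complete interpreter for a tiny toy language really does fit in a dozen lines of Python; the hard part is making one fast. A minimal postfix (RPN) calculator, for instance:

```python
import operator

OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}

def rpn(source):
    """Interpret a postfix expression like '3 4 + 2 *'."""
    stack = []
    for token in source.split():
        if token in OPS:
            b, a = stack.pop(), stack.pop()  # pop operands in reverse order
            stack.append(OPS[token](a, b))
        else:
            stack.append(float(token))
    return stack.pop()

print(rpn("3 4 + 2 *"))  # → 14.0
```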
posted by Deathalicious at 12:05 PM on December 12, 2011


Can't tell if you're trolling, idiopath. Anyway, interpreted code, when executed by a good just-in-time compiler, can often be faster than statically compiled code. Not just in theory, but in practice. Java has been showing that for a while now, and PyPy has a few examples; some artificial, some more practical.
posted by Nelson at 12:11 PM on December 12, 2011


In both of these areas, CoffeeScript compares favorably to JavaScript.

You know the part of my comment where I said "maybe you can be the first to defend this statement"?

It's because I've noticed a pattern. Every time I challenge somebody on this, they assert, but they don't demonstrate, and so far, you're fitting right in.

Smaller is better.

I agree with this. There are some languages where you can see order-of-magnitude differences, and that's significant.

But this difference rarely manifests when the primary distinction is the choice of particular tokens. And since CS & JS have essentially the same semantics, that appears to be pretty much the difference we're looking at with CoffeeScript.

Ahh, such a rare beast, almost never seen in the wild. HAMBURGER

Sure, all over the place. And maybe it's magnified by the fact that these languages seem to be popular at a time when opinionated discussion is an accepted part of participating in the online attention economy. But I've never seen programmers wear "opinionated" as a badge the way Ruby developers do, and to a lesser extent Python devs.
posted by weston at 12:12 PM on December 12, 2011


Actually, the whole hard compiled/interpreted language split is left over from decades ago. Many interpreted languages are actually compiled (to some form) and some compiled languages are technically interpreted in some cases.

It is all a matter of how "directly" the source is transformed into some machine representation. Sometimes it is native machine code, and sometimes not.

The line is getting blurrier all the time, and there are a few non-traditional indirectly interpreted languages that can outperform compiled languages in some tests.

I can't say it any better than Wikipedia:

"Theoretically, any language may be compiled or interpreted, so this designation is applied purely because of common implementation practice and not some essential property of a language. Indeed, for some programming languages, there is little performance difference between an interpretive- or compiled-based approach to their implementation."
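Python itself illustrates the blur nicely: the "interpreted" language compiles your source to bytecode before its virtual machine ever runs a thing, and the standard library will happily show you that bytecode:

```python
import dis

# compile() produces the same kind of code object the interpreter
# builds (and caches in .pyc files) for every module it loads.
code = compile("x * 2 + 1", "<example>", "eval")
print(type(code).__name__)  # → code
dis.dis(code)               # the bytecode the VM actually executes
```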
posted by clvrmnky at 12:14 PM on December 12, 2011 [1 favorite]


Smaller is better.
I agree with this.


I should say: I agree with this to an extent. Smaller in terms of the number of "steps" (semantically significant statements) you have to use to produce a useful abstraction is good. Smaller is not always the same as a shorter character count. It's worth noting that short of J/K/APL, there's very little out there that can beat Perl for brevity when brevity is the measure of the hour.
posted by weston at 12:18 PM on December 12, 2011


so far, you're fitting right in.

My assertions come from reading the documentation at http://jashkenas.github.com/coffee-script/. If you've read this and don't agree, then my citing examples from the documentation won't convince you.
posted by and for no one at 12:26 PM on December 12, 2011


The difference between an interpreted and a compiled language has not meant very much for quite a long time. Lookie, here's an interpreter for C and C++. It's really an implementation detail, and I think it always has been, but implementations were once a lot harder to come by.
posted by LogicalDash at 12:36 PM on December 12, 2011


Smaller is not always the same as a shorter character count

This is a good point - there must be a point of diminishing returns, or we would all be using APL.

Languages have something like a "cognitive load". For languages with similar cognitive load, the brevity argument wins. If the load becomes too great, no amount of brevity will help.

There is also a factor of matching mental models to the toolset - not everybody thinks the same way, so people have an affinity for certain styles. Learning a new style can expand your ways of thinking.

Regardless, Python is sweet, and PostScript sucks.
posted by and for no one at 12:37 PM on December 12, 2011


Python is my go-to "good enough" language for doing things, thanks to its portability and good set of standard libraries, but Lua is my favorite language as a language.
posted by Pyry at 1:03 PM on December 12, 2011


Python is still an interpreted language. There are certain powerful advantages that compiled languages get that interpreted languages don't. For instance, addition on strings is a different operation than addition on integers. An interpreted language needs to check the type of the variable every time (those boolean checks are a large performance constraint), whereas a compiled language can determine the type of the variable once and then determine which operation applies. Chrome ran JS programs five times faster than other browsers because it added compile-time type checking to the downloaded JavaScript, so it wasn't just an interpreted language.

My issue with Python is that it's half-assed functional. We get lambdas, but no temporary-variable side effects in the lambdas. We get map, reduce, and filter, but no tail recursion. I like to push Scala (I've mentioned it in previous posts) because it has compile-time error checking. In large projects, compile-time analysis is helpful in catching a host of errors. Google noticed that their web applications weren't scaling well using just JavaScript, so they're now pushing Dart, in an effort to create more reliable and stable web applications.

Martin Odersky wants to adopt as much of the functional programming paradigm as possible in the language, while not removing some of the more useful features of procedural/object-oriented languages (monads versus objects/structs to hold state; objects are easier to read than monads). Guido seems to stop at a certain point. Tail call optimization can't get implemented, because a stack trace is "too important". But without tail call optimization, for loops and while loops are the only alternative, and they don't create a stack trace either.
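The tail-call point is easy to demonstrate in a few lines: CPython allocates a new frame for every call, so even a function that is tail-recursive in form blows the stack and has to be rewritten as a loop (a toy example):

```python
import sys

def count_down(n):          # tail-recursive in form, but CPython
    if n == 0:              # still allocates a frame for every call
        return "done"
    return count_down(n - 1)

def count_down_loop(n):     # the mandatory rewrite
    while n:
        n -= 1
    return "done"

try:
    count_down(sys.getrecursionlimit() + 100)
except RecursionError:
    print("recursion limit hit")

print(count_down_loop(10**6))  # → done
```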
posted by DetriusXii at 2:05 PM on December 12, 2011


You're conflating static typing with compiling.
posted by 0xFCAF at 2:26 PM on December 12, 2011 [7 favorites]


The best super-short programming-from-scratch tutorial I have found, in Python or out of it, is this:
http://hetland.org/writing/instant-hacking.html

If you know a bit of programming already, then there is this:
http://hetland.org/writing/instant-python.html
posted by Elysum at 2:49 PM on December 12, 2011 [1 favorite]


More experiences from the scientific computing side of things: in neuroscience, at least, MATLAB is still the most popular language for data analysis and computational modeling.

Even though the language is so stubbornly ugly, MATLAB at least makes the easy things easy, most of the time. In particular, the command-line and GUI tools for working with figures and plots are quite nice. (Did you know you can edit a figure by clicking around in the Plot Tools -- zooming around, adding labels and annotations -- and then have it generate the equivalent lines of code to recreate that figure noninteractively?) Furthermore, very convenient libraries have been written for experiment control, signal processing, and other kinds of general data analysis. The network effect is pretty strong.

A few years ago, on a whim, I started trying to use Python (specifically, NumPy and SciPy) as a replacement for MATLAB. So far, I've been fairly happy using this stack:

- python
- numpy (the basic N-dimensional array object, and the core functions that operate on it)
- scipy (extra modules for a wide variety of common science/engineering/math tasks)
- matplotlib (to plot stuff)
- ipython (a very nice shell for interactive/exploratory programming, well-integrated with numpy/scipy/matplotlib)
- h5py (opens hdf5 files, including .mat files saved in the latest '-v7.3' version)
- pyreadline (a small package that enables things like tab completion and colored prompts in ipython).

On Windows, I used the MKL binaries from http://www.lfd.uci.edu/~gohlke/pythonlibs/. The MKL implements BLAS and LAPACK routines for doing linear algebra and vector math (the same Fortran routines that allow MATLAB to perform matrix math so efficiently). On Ubuntu, I think I just installed ATLAS through synaptic.

Some of my thoughts on the transition so far:

- It's such a relief to avoid a bunch of the minor and major arbitrary restrictions and annoyances in MATLAB. I'm sure any programmer who learned MATLAB after learning a "real" language can probably list a dozen cringeworthy language design decisions. One function per file. Array indexing with parentheses instead of square brackets. No clean equivalent (as far as I know) for things like dictionaries or list comprehensions. No keyword arguments. And so on.

- One thing I am missing in Python is a solid IDE. For all its ugliness, MATLAB does give you a nice environment to code in, with debugging, code completion/suggestion, interactive plotting, and the ability to jump to function definitions. Right now, I just use gedit or vim to write Python code, with ipython in a separate window. I have yet to try Spyder or Pydev + Eclipse, though. (Does anyone have experience with either of those environments?)

- The NumPy array object is more general than MATLAB's matrices, and in my experience NumPy/SciPy functions usually tend to work consistently on multidimensional arrays, while many MATLAB functions expect only 2-d matrices. Of course, having both list comprehensions and broadcasted ("vectorized") functions at your disposal is staggeringly useful/powerful/beautiful.

- For almost any function I've needed from a MATLAB toolbox, I've been able to find an equivalent in a SciPy module. There are rare exceptions (like some obscure functions for circular statistics).

- matplotlib can generate very pretty plots, but I sometimes miss a few features from MATLAB (most recent example for me: logarithmic axes on spectrograms). Also, matplotlib doesn't have GUI tools to inspect and edit figures that are nearly as nice as MATLAB's Plot Tools.

- With h5py, moving back and forth between MATLAB and Python isn't too difficult. I do wish there was a better way in ipython to save variables to disk from my workspace. I've used pickle in the past; I have no idea whether there's a better equivalent to MATLAB's SAVE command.
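(For what it's worth, here's a tiny sketch of the broadcasting mentioned above, plus numpy.savez/numpy.load, which is the closest thing I've found to MATLAB's SAVE/LOAD for plain arrays; the filename is just an example:)

```python
import numpy as np

# Broadcasting: a (3, 1) column times a (4,) row gives a (3, 4)
# outer product, with no explicit loops and no tiling.
col = np.arange(3).reshape(3, 1)   # shape (3, 1)
row = np.arange(4)                 # shape (4,)
table = col * row                  # shape (3, 4)
print(table.shape)                 # → (3, 4)

# Something like MATLAB's SAVE/LOAD for a few named arrays:
np.savez("workspace.npz", col=col, row=row, table=table)
restored = np.load("workspace.npz")
print((restored["table"] == table).all())  # → True
```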
posted by omnomnOMINOUS at 3:32 PM on December 12, 2011 [3 favorites]


My assertions come from reading the documentation at http://jashkenas.github.com/coffee-script/. If you've read this and don't agree, then my citing examples from the documentation won't convince you.

Here's why what's on that page doesn't support your claims:

* Like the page itself says: same rough semantics. And as I said earlier, that means the same order of code size. "World of difference" shifts in size, and the corresponding benefits, come from deeper changes in semantics.

* As Jeremy would tell you, the examples on that page are the output of the CoffeeScript compiler, not necessarily what you would write (or need to write) in JavaScript to achieve the same effect. To his credit (and the others working on CS), I think the compiler does a very good job of making the output readable and not unreasonable, but that doesn't mean it's the idiom a working JavaScript dev is most likely to use.
posted by weston at 3:34 PM on December 12, 2011


One thing I am missing in Python is a solid IDE

I've had good experiences with WingIDE, specifically its nicely integrated debugger. Free trial, but after 30 days it's $245 ($95 non-commercial) for the Pro license that unlocks all the features. It suffers a bit from having a Python GUI, particularly on MacOS, but it's pretty solid and sophisticated.

Pydev + Eclipse seems to be the most popular free IDE. It felt too big to me.
posted by Nelson at 3:42 PM on December 12, 2011


One thing I am missing in Python is a solid IDE. For all its ugliness, MATLAB does give you a nice environment to code in, with debugging, code completion/suggestion, interactive plotting, and the ability to jump to function definitions. Right now, I just use gedit or vim to write Python code

You can make vim do most of that. Personally I develop with one window open to my code in vim and another running an interactive interpreter (with the debugger going if need be); I've tried pydev + eclipse (as Nelson suggests) but it feels like overkill.
posted by asterix at 3:54 PM on December 12, 2011


Here's an example of an idiom not available in JavaScript, list comprehensions. (Hopefully the format won't get chewed up by the post. It previews OK)

Python:
    foods = ['broccoli', 'spinach', 'chocolate']

    [eat(food) for food in foods if food != 'chocolate']
CoffeeScript:
    foods = ['broccoli', 'spinach', 'chocolate']

    eat food for food in foods when food isnt 'chocolate'
PostScript:
    /foods [(broccoli) (spinach) (chocolate)] def
    foods { dup (chocolate) ne {eat} {pop} ifelse } forall
JavaScript:
    var i, n, food, foods;

    foods = ['broccoli', 'spinach', 'chocolate'];

    for (i = 0, n = foods.length; i < n; i++) {
        food = foods[i];
        if (food !== 'chocolate')
            eat(food);
    }
and just for laughs, PostScript compiled from the JavaScript:
    {
        [ (broccoli) (spinach) (chocolate) ] js:array
        /foods js:=
        foods (length) js:. /n js:=
        0 /i js:=
        {
            i dup 1 add /i js:=
            pop
        }
        {
            foods i js:index /food js:=
            food (chocolate) js:ne js:test
            {
                [ food ] eat js:call
                pop
            } if
        }
        {
            i n js:lt js:test
        } js:for
    } [ /i /n /food /foods ] js:exec
The js2ps compiler does exactly what the JavaScript does, but completely misses what it means, because it doesn't recognize the JavaScript idiom. Beyond that, there are several other low order optimizations missing. Compilers are hard.
posted by and for no one at 4:00 PM on December 12, 2011


JavaScript:
    var i, n, food, foods;

    foods = ['broccoli', 'spinach', 'chocolate'];

    for (i = 0, n = foods.length; i < n; i++) {
        food = foods[i];
        if (food !== 'chocolate')
            eat(food);
    }


That's hardly optimal:
var foods = ['broccoli', 'spinach', 'chocolate'];
for (i in foods) {
  if (foods[i] != 'chocolate') 
    eat(food);
}
posted by wildcrdj at 4:09 PM on December 12, 2011 [1 favorite]


argh... of course that's
eat(foods[i]);
But otherwise fine. :P
posted by wildcrdj at 4:09 PM on December 12, 2011


Using for in with JavaScript arrays is not recommended.

An interesting approach would be to use Array.filter and forEach, from JS 1.6 and later:
    foods = ['broccoli', 'spinach', 'chocolate'];

    foods.filter(function (item) { return item != 'chocolate'; }).forEach(eat);
posted by and for no one at 4:26 PM on December 12, 2011 [1 favorite]


Here's an example of an idiom not available in JavaScript, list comprehensions.

`map` and `reduce` methods are actually standard since JS 1.6/ECMA-262. Or you can roll your own in a line of code, or use libraries going back to 2006.

These days a JS dev can almost always write something like:

['broccoli', 'spinach', 'chocolate'].map(function (food) {
    if(food !== 'chocolate') eat(food);
});

posted by weston at 4:29 PM on December 12, 2011


oo! JS 1.8 adds lambda syntax:
    foods = ['broccoli', 'spinach', 'chocolate'];    
    foods.filter(function (item) item != 'chocolate').forEach(eat);
posted by and for no one at 4:34 PM on December 12, 2011 [1 favorite]


Ah, interesting about for/in.

The map/filter stuff is cool too and I coincidentally just encountered it at work like minutes after I posted to this thread... weird.
posted by wildcrdj at 4:34 PM on December 12, 2011


One more: JS 1.7 added array (list) comprehensions:
    foods = ['broccoli', 'spinach', 'chocolate'];
    [eat(food) for each (food in foods) if (food != 'chocolate')];
posted by and for no one at 5:40 PM on December 12, 2011 [2 favorites]


I like Scala more, these days... but Python is my "practical language of choice," and it's just beginning to hit the point where Python is viable. (I've just landed a job writing Python, which seemed a distant possibility even five years ago.) I think, overall, the Python stack is more robust than Ruby, and Python is pleasant in its simplicity. Even its hairy edge cases aren't particularly hairy.
posted by sonic meat machine at 6:10 PM on December 12, 2011


The only Best Python is Monte.
posted by Twang at 7:31 PM on December 12, 2011


The only Best Python is Monte.

Of course, the programming language is named after Monty Python.
posted by grouse at 7:54 PM on December 12, 2011


So it will work for simple code, but when 'eval' starts to happen, everything will go to hell. Anyone who runs this on important code and expects it to work is in trouble.

Anyone using "eval" in important code is asking for trouble.

Also, no one claims that 2to3 changes everything that needs changing to get the equivalent behavior in python3. It just doesn't. It does take care of a lot of things that can be done programmatically, though, like changing print statements to calls to the print function (and much more).
posted by kenko at 8:17 PM on December 12, 2011


pull huge data sets into a language they do know, and then re-implement joins, GROUP BY, and the like very badly indeed.

Oh JFC, really? I guess I shouldn't be surprised since I still occasionally read TheDailyWTF, but that's a fucking nightmare.


It's horribly common. The time I found a rats' nest of TCL where the 'programmer' had implemented a multi-table join by doing a SELECT, using a foreach to iterate over the results and issue SELECTs within the foreach loop, each of which then triggered another foreach loop... you get the picture.

Java weenies seem to be the worst for it, although the OO crowd generally seem to suffer from the idea that learning anything other than the one tool/paradigm they're comfortable with is tantamount to wiping their mother's arse with their tongue.

Grah.
posted by rodgerd at 11:01 PM on December 12, 2011


the 'programmer' had implemented a multi-table join by doing a SELECT, using a foreach to iterate over the results and issue SELECTs within the foreach loop, each of which then triggered another foreach loop

I just threw up in my mouth a little bit, and I haven't even eaten yet today.
posted by Mr. Bad Example at 1:21 AM on December 13, 2011


It's horribly common. The time I found a rats' nest of TCL where the 'programmer' had implemented a multi-table join by doing a SELECT, using a foreach to iterate over the results and issue SELECTs within the foreach loop, each of which then triggered another foreach loop... you get the picture.

On the bright side, whenever you see this you get to be the bad-ass who improves performance by 3,000% before lunch time.
posted by atrazine at 2:32 AM on December 13, 2011


It's horribly common. The time I found a rats' nest of TCL where the 'programmer' had implemented a multi-table join by doing a SELECT, using a foreach to iterate over the results and issue SELECTs within the foreach loop, each of which then triggered another foreach loop... you get the picture.

To be fair, he/she probably had considerable brain damage caused by years of being forced to code in TCL.
posted by octothorpe at 4:54 AM on December 13, 2011 [2 favorites]


It was a longass time ago, but I remember Tcl being not that bad. Actually I remember Tcl/Tk was pretty nice as an intro to GUI programming.
posted by kmz at 8:20 AM on December 13, 2011


I see that you guys dug into CoffeeScript yesterday. Just in case anyone's still reading ... I agree of course that CoffeeScript vs. Javascript is a purely aesthetic choice, since they're almost totally identical functionally. (The one exception I recall is that CoffeeScript can only create myfunc = function(){} instead of function myfunc(){}, which affects the order you have to define functions.)

But the aesthetic difference is huge. We've already discussed that CoffeeScript generally uses much less space, partly by using shorter syntax and whitespace instead of brackets. Personally I find that easier to read and faster to write; I agree that not everyone necessarily would.

But the other huge thing it does is to turn common-but-complex Javascript tasks, especially ones that would be difficult to handle via libraries, into language features. This is important even if you don't inherently love whitespace, because it takes things that are time-consuming or easy to screw up in Javascript and makes them easy. A good example from the docs is existence testing:

Javascript:

if (typeof elvis !== "undefined" && elvis !== null) alert("I knew it!");

CoffeeScript:

alert "I knew it!" if elvis?

(or if you prefer your if the other way around)

if elvis?
  alert "I knew it!"


Are you sure you wouldn't screw up the first line in Javascript, potentially generating a weird edge-case bug? In some ways this is a stylistic question, but it doesn't seem like a hard decision. And it gets even easier with something like if elvis?.likes_peanut_butter?.on_his_sandwich, which will not be fun times to do correctly in Javascript.

Another random example is splats. Idiomatic Javascript for passing in a named variable and an arbitrary number of additional variables might be something like:

race = function (winner){
  var runners = arguments.length > 1 ? [].slice.call(arguments, 1) : [];
  // do stuff
}


vs:

race = (winner, runners...) ->
  # do stuff


Again, it's totally up to you which of those you aesthetically prefer, but if you prefer the first one we have a really different idea of a fun way to spend our time. (You could work around this by passing in an anonymous array, but you shouldn't necessarily have to.)
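(For comparison, since this is a Python thread: Python has had the same feature as *args from early on. The function and argument names below are invented for illustration.)

```python
def race(winner, *runners):
    # winner takes the first positional argument; runners collects the rest as a tuple
    return winner, list(runners)

print(race("ada", "grace", "barbara"))  # ('ada', ['grace', 'barbara'])
```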

I don't have to paste the entire rest of the CoffeeScript docs, but you'll see the same thing again and again. Take a look at default function arguments, the escaping of reserved words as properties, (built-in, cross-browser) list comprehensions, closures within loops, array ranges, array presence testing, (one opinionated version of) class inheritance, string interpolation, multiline strings, destructured assignment, function binding, switch statements, chained comparison ...

CoffeeScript just takes a ton of fiddly, annoying things about Javascript and smooths them out, so I end up with what I was trying to do in less time and with fewer errors -- often in a better way than I would have done it, because it's written by people who are Javascript ninjas and have hammered at the edge cases. It noticeably speeds up the time it takes me to write and debug complicated code, and it does it by taking out what I find to be the most annoying parts. That's all I mean by "worlds better."

You could probably master your own libraries, idioms and workarounds to solve the same problems, and maybe you already have and you don't mind that your methods are more verbose and that's great. But I think CoffeeScript is a shallower learning curve to arrive at the same result, and coding shouldn't be an unnecessary test of our machismo. It's annoying enough already.
posted by Honorable John at 9:37 AM on December 13, 2011


(Actually a great metaphor would be whether to use a library like jQuery or not. I think frontend dev with jQuery is way nicer than without, because it makes lots of things that are long and error-prone become short and easy. It's just another aesthetic preference, but in most cases it's a pretty compelling one that justifies the overhead.)
posted by Honorable John at 9:42 AM on December 13, 2011


(You could work around this by passing in an anonymous array, but you shouldn't necessarily have to.)

Dude. Seriously?

Despite presenting this:
race = function (winner){
  var runners = arguments.length > 1 ? [].slice.call(arguments, 1) : [];
  // do stuff
}
as a likely JavaScript idiom for dealing with multiple trailing arguments, you know a JS dev could have written this:
function race(winner,runners) {
   //do stuff
}
and invoked it using race(winner,[runner1, runner2, ..., runnerN]) to the same effect as your CoffeeScript example (at a cost of one whole additional character)?

Heck, I could have written the more verbose example that you used as a false setup for your line about "different ideas of a fun way to spend time" in CoffeeScript, too.

Most of the CS boosters that I've encountered who don't just prefer it but are *sure* it's "worlds better" just don't seem to know how to wield JavaScript as effectively yet. You're the first I've encountered who appears to be interested in presenting intentionally misleading examples.
posted by weston at 4:52 PM on December 13, 2011 [1 favorite]


This book is free and even my English major brain can understand it: Inventing Games With Python.
posted by mecran01 at 6:50 AM on December 14, 2011


Canageek: Why are all 'modern' languages interpreted?

and for no one: All of these languages are compiled. They compile to a virtual machine instead of the native processor, for a number of reasons, including portability, safety, ease of implementation, and more.

Almost without exception these languages are implemented in C.


But you still need a bytecode interpreter installed to run them, correct?

Portability: So they will run on any platform the bytecode interpreter has been written for; that is nice. But if you stick to very pure C, can't you just recompile it for each platform? I thought that was why nethack (for a random example) is available for damn near every OS you can imagine: I thought portability only became a problem when you started using OS-specific features like GUIs and things?

Safety: So Python has things that prevent a rogue Python program from deleting your drive? What if I wanted to write a Python program to sanitize a hard disk for a lawyer friend? (Actually, to do that I just typically take the hard disk apart, use the platters as Frisbees, then stick them to my fridge with a speaker magnet. Now that I have access to a chem lab I might consider taking...further steps.)

Ease of implementation on my part writing the code, or the part of the person writing the language?
posted by Canageek at 10:01 AM on December 14, 2011


and for no one: "Almost without exception these languages are implemented in C."

This is meaningless. It is very easy to write a slow ass interpreter in a fast language.

Good performing interpreted languages can be done, for example ocaml and tcl (though ocaml is an order of magnitude faster if you compile rather than interpreting).
posted by idiopath at 12:38 PM on December 14, 2011


> Almost without exception these languages are implemented in C.

But you still need a bytecode interpreter installed to run them, correct?

Yes, the point about C isn't really related to the interpreter question; I wanted to point out that C has been the go-to systems language for decades.

These languages could be implemented in anything. Rhino and Jython are JavaScript and Python written in Java; IronPython is written primarily in C#; PyPy is (sort of) Python written in Python; and Narcissus is JavaScript written in JavaScript.

For the scripting languages, the compiler & virtual machine are (usually) part of the same package.

Java is a special case, since the language and VM are statically typed, and programs are typically distributed as "object" code for the VM. You can install a Java runtime without installing the development tools. (Java == Just another virtual architecture.) Java VMs run code pretty quickly, because the VM is statically typed; it doesn't have to expend as much effort on run-time type checking and validation.

But if you still to very pure C, can't you just recompile it for each platform?

It's not quite that simple. For a systems language, C is very portable, but it's all the other stuff that gets you. The scripting languages' runtime libraries include a lot of functionality that isn't "built in" to C, so building any interesting C program usually involves locating/building/linking other stuff, which can quickly get out of hand, even if it is available on all of your target platforms. It's much easier to port a script program: you just copy the file(s) involved and "run" it. Installing larger script programs may involve using a package manager.

So Python has things that prevent a rogue python program from deleting your drive?

No, normal Python isn't restricted (although there is a restricted Python subset). JavaScript in a web browser is locked down to prevent rogue activity, but that's not what I meant. It's possible to write malicious script programs, although it's harder to hide the fact that they are malicious, since they are distributed as source.

The safety has to do with the program not blowing away the VM. If you make a programming mistake, the VM gives you a coherent error and doesn't choke. It's very easy to make a mistake in a C program that causes a CPU fault and blows the program away. These kinds of bugs are one way malicious code can attack a system. In the old days, these kinds of bugs didn't just blow away the program; they could take down the whole machine, which could cause data corruption and related unpleasantness. This is where the term "snow crash" comes from.
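For instance, the same out-of-bounds access that is undefined behavior in C just raises a clean, catchable exception on the Python VM (a trivial sketch):

```python
data = [1, 2, 3]
try:
    data[10]  # in C this could scribble on memory or crash the process
except IndexError as e:
    print("caught:", e)  # the VM stays coherent and the program continues
```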

There's a saying that C gives you enough rope to shoot yourself in the foot; C++ makes it harder to shoot yourself in the foot, but when you do, it takes your whole leg off.

Ease of implementation on my part writing the code, or the part of the person writing the language?

Both, although industrial strength implementations of JavaScript with JIT like SpiderMonkey and V8 are big, cutting-edge programs.

From an implementation standpoint, it's easier to implement a byte code compiler and interpreter than interfacing with an actual CPU. The VM can be written to closely mirror the tree that comes out of a language parser. This may be the original inspiration for Lisp.

In the old days, I worked with some guys who had a large Fortran program running on an HP3000. It was easier for them to port the program by porting the HP3000 architecture to the PC by writing a VM, than it was to get their program to compile and link and run on a 640K machine. They copied the binary HP3000 program to the PC, tweaked it a bit, and ran it on their VM.

However, scripting languages are really about making life easier for the coder. They are sometimes referred to as Very High Level languages. The built-in facilities in the language and runtime libraries make it easy to write powerful programs.

One measurement of C versus Python showed that Python code is worth roughly ten times its weight in C code.
posted by and for no one at 12:42 PM on December 14, 2011


@and for no one: So why can't you just make a Java compiler that takes it all the way to machine code? Wouldn't that give you (almost) the same performance as a C or C++ program, given an equally good compiler? One would come out ahead due to garbage collection (if the programmers are both perfect then the C would win, since manual memory management is faster; if they are both horrible then Java would win, as it would fix the programmer's mistakes). Why does everyone put up with the non-native performance penalty?

The argument I've heard is that modern CPUs are so fast it doesn't matter:
Sure, modern CPUs are fast, but when I'm trying to run a simulation with several million gamma rays, each of which has to be ray-traced individually (GEANT4 by CERN, C++), or run the same simulation a thousand times, the difference is really going to show. I'd love a nice, high-level language to replace C and Fortran with, one that I could also use for simulations and such. Python is far easier to write, but I've run simulations (a commercial one at that, though one known to be a bit slow) that took over 40 hours for the mainframe to get through, and I know someone who ran a nuclear reactor simulation that took 6 months on the best hardware he could buy (it couldn't be parallelized to run on the supercomputer he could purchase time on), stuff like that. I would love to learn something lightweight like Lua or Python, but man, if it is 10x slower or whatever it is, or even half... there is a BIG difference between a 40-hour simulation and an 80-hour one. I mean, sure, you are never going to get a high-level language as fast as C, and you'll never get C as fast as assembly, but can we get it as close as possible?

Alternative way of phrasing the question: I have Fortran and C as my options for running fast, but they're hard to write in. I have Python and Lua on the fast-to-write side, but they're slow as mud to run (even PyPy and Psyco don't claim to be as fast as native code). Is there an option somewhere in the middle? I like static typing, I can deal with that. I can remember to free my variables when I'm done with them. Can I have something a bit easier than C now?
posted by Canageek at 1:54 PM on December 14, 2011


Is there an option somewhere in the middle?

Write your code in Python, make sure it works the way you want to, and then profile it. Take the sections of code where you spend the most time and then rewrite them in C, using Cython as glue. I've done this before and it's not that much more difficult to do raw number-crunching in C than it is in Python, but I spare myself the headache of doing input/output, user interface, memory allocation, API calls, error-checking stuff in C. It's that boilerplate stuff that really drives me nuts when I'm writing C.
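A minimal sketch of the "profile first" step, using the stdlib profiler (slow_part and whole_program are invented stand-ins for your real code):

```python
import cProfile
import pstats

def slow_part(n):
    # stand-in for the numeric hot loop you'd later rewrite in C
    return sum(i * i for i in range(n))

def whole_program():
    return [slow_part(10000) for _ in range(50)]

pr = cProfile.Profile()
result = pr.runcall(whole_program)

# Show the five entries where the most cumulative time went
pstats.Stats(pr).sort_stats("cumulative").print_stats(5)
```

Whatever shows up at the top of that table is the candidate for a C rewrite; everything else stays in Python.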
posted by grouse at 2:04 PM on December 14, 2011


@grouse: I was hoping that I could just learn one language and get pretty good with it. How hard is it getting multiple languages to play nicely? I've heard python is really good at working with others, is that true?

Also I generally like to keep the complexity down by not including a GUI. I mean, if it is just a calculation, why would you need one anyway?

Back on topic: Does python have pointers? If not, what do I replace them with? Say if I'm writing a linked list text editor (because we just did one in class) how do I point each item of the list to the next one?
posted by Canageek at 2:11 PM on December 14, 2011


I was hoping that I could just learn one language and get pretty good with it.

Once you learn programming to a certain level the extra effort to pick up a language within a paradigm you're already familiar with is not very much. Also, I don't think you can truly understand how Python works at an expert level (way beyond "pretty good") without knowing a little C, although it's not necessary for most people.

How hard is it getting multiple languages to play nicely? I've heard python is really good at working with others, is that true?

Cython makes it really easy. Cython is a language that looks like Python but is translated to C and compiled. So from Cython you can easily call Python or C functions. I usually write a Cython function as glue to interface with Python, and then have it call an inner C function to do all the real work.
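Not Cython itself, but the stdlib ctypes module gives a quick taste of the same glue idea: Python code calling straight into an already-compiled C library (this sketch assumes a Unix-ish system where the C math library can be located):

```python
import ctypes
import ctypes.util

# Load libm; with Cython you'd compile and link your own C code instead
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

print(libm.sqrt(2.0))  # the arithmetic happens in compiled C code
```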

Also I generally like to keep the complexity down by not including a GUI. I mean, if it is just a calculation, why would you need one anyway?

You're probably going to have some form of user interface, even if it is based on command-line arguments or configuration files. I find the Python facilities for doing this sort of thing much better and easier than what's available in C.

Does python have pointers?

In Python, when you assign an object to a name, what you're really doing is putting a pointer to that object in the name. You could implement a linked list simply by using a series of nested Python lists of length 2. Or by defining a class with car and cdr attributes. (In reality, you might find it easier to use the doubly-linked list in the standard library, collections.deque().)
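Both options above in miniature (the values are arbitrary):

```python
from collections import deque

# Nested two-element lists as cons cells: [value, rest]
lst = None
for item in ["carrot", "spinach", "broccoli"]:
    lst = [item, lst]   # push onto the front

walked = []
node = lst
while node is not None:
    walked.append(node[0])
    node = node[1]
print(walked)  # ['broccoli', 'spinach', 'carrot']

# Or just use the stdlib's linked structure
d = deque(["broccoli", "spinach", "carrot"])
d.appendleft("kale")  # O(1) insertion at either end
```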
posted by grouse at 2:26 PM on December 14, 2011


Another question about python: how does it deal with spacial locality in memory?

Alright, I'm asking this as I'm reading a textbook section on how memory works, and it implies that in a normal language if you assign a bunch of variables at once they wind up near each other in memory. Then when you move one of those up to the cache the rest will hopefully go with it, saving you having to go back to the memory.

But python does a bunch of weird stuff with dynamic allocation and typing and stuff. How does it keep everything close in memory? Or is this one of the reasons it is slower?
posted by Canageek at 2:31 PM on December 14, 2011


If you create a NumPy array, its elements will probably be close to each other in memory (another key point for fast Python: use NumPy when possible). If you're just assigning local variables, they might, but cache misses are the least of your performance problems there, considering all the other stuff going on. It's not even worth thinking about.
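Relatedly, the stdlib array module shows the contrast in miniature: it packs values into one contiguous C buffer, while a plain list holds pointers to separately allocated objects (the per-object size varies by build, so it's printed rather than promised):

```python
from array import array
import sys

boxed = list(range(1000))           # list of pointers to separate int objects
packed = array("d", range(1000))    # one contiguous buffer of C doubles

print(packed.itemsize * len(packed))  # raw payload: 8000 bytes
print(sys.getsizeof(boxed[5]))        # one boxed int alone is already ~28 bytes
```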
posted by grouse at 2:35 PM on December 14, 2011


Canageek:

The Java VM has JIT compilation, and GCJ is the GNU Java compiler, which can generate native machine code.

I've seen examples that argue automatic garbage collection can be faster in some cases. This depends on the heap implementation as well as what the program actually does.

Most programs don't need the kind of horsepower you are talking about; for the most part, modern programs sit around waiting for something to happen. In a server environment, something may happen frequently, but each individual request may not amount to that much.

As grouse says, use AlternateHardAndSoftLayers.

A typical program has bottlenecks where it spends most of its time; don't assume you know where your program is spending its time, because you're usually wrong. Measure what's actually happening, and spend your efforts in the right place.

There's a story about some guys who optimized a processor instruction in hardware, because their measurements said this is where the system was spending most of its time. There was no improvement in overall performance, because they'd ended up optimizing the system idle loop.

Cython looks like a pretty easy way to integrate C and Python. There's also Swig and Boost.Python.

The kind of program you're talking about is a special case. One modern approach to mass number-crunching is to offload work to the video card, which can do lots of math in parallel. Unfortunately, the current mechanisms for doing this aren't super friendly. NVIDIA supports OpenCL for their CUDA-compatible GPUs, and Microsoft has something called DirectCompute. NVIDIA also makes cards that aren't video cards at all; they're just for number crunching.

Ideally you'd want something like NumPy that can run parallel calculations; it looks like there are some Python libraries to get at this: PyCuda and PyOpenCL. The native CUDA stuff is pretty obscure; anything they can do to make it easier would help.
posted by and for no one at 2:40 PM on December 14, 2011


Here's a small singly-linked-list example in Python:
class Stuff:
    def __init__(self, what):
        self.stuff = what

a = Stuff("apple")
b = Stuff("banana")
c = Stuff("carrot")

listHead = a
a.next = b
b.next = c
c.next = None

t = listHead
while t:
    print t.stuff
    t = t.next
posted by and for no one at 2:52 PM on December 14, 2011


and for no one: Yeah, a lot of this was happening before those standards were out, or at least common. I've read that example about the system idle loop. A lot of it, however, is uploaded to the local supercomputer (in our case SharkNet, a group of supercomputers), and you let them worry about the hardware. However, you still are not going to enjoy running Python on it for your number crunching. What I want is something similar to Python, in that it is clean and easy to write, but that compiles down to machine code. Would it really be that hard? Something like C or C++ but with less pointer and function weirdness, and strictly defined variable sizes (so I don't have to use a library that gives me INT_32 types and stuff like that). Perhaps built-in GPU support? I'd love a 'this is math, go run it on the GPU for me' button.

Isn't Cython that weird subset of python that was only meant to write the one compiler and needs static typing?
posted by Canageek at 4:06 PM on December 14, 2011


Isn't Cython that weird subset of python that was only meant to write the one compiler and needs static typing?

You're probably thinking of RPython, which is sort of an internal language used by PyPy.
posted by whir at 4:31 PM on December 14, 2011


whir: Yes, yes I was. Thanks, I'll have a look at that if I ever try and learn a new language. Right now I have computer architecture and C++ exams to pass. (I'm still waiting for a thread to give me an excuse to ask about why we can't just ignore the OS and compile straight to x86 instructions, as my textbook has all sorts of examples of how C and Java convert to MIPS instructions. Someone go make a FPP on assembly now!)
posted by Canageek at 4:38 PM on December 14, 2011


What I want is something similar to python, in that it is clean and easy to write, but that compiles down to machine code...

I'm not aware of anything offhand that matches your requirements, but there is a ton of stuff out there. For large scale number crunching, the future might be in functional languages; they're more susceptible to parallelizing. They are a brain twister when coming from more mainstream languages, though. Some examples would be Haskell, Caml/OCaml, and F#. Scala and Clojure may also be interesting; they both run on the Java VM, and GCJ can be used to compile Java VM code to machine code.

For general use, I'm looking at the D language, which has modern features but without the cruft & complexity of C++.
posted by and for no one at 4:47 PM on December 14, 2011


Next question: Why do we have a fistful of scripting languages, a bucket load of high-level OO languages (ie Java, Objective-C, C#) and a wackload of low level languages (C, C++, Fortran) and nothing in the middle?

Oh right, CS people, the most prone to violent differences over small things outside of religion.
How do you commit murder at a CS convention? Shove the person into a room, yell 'vim/emacs sucks', and wait for everyone in the room to kill each other.
posted by Canageek at 8:06 PM on December 14, 2011


Why do we have a fistful of scripting languages, a bucket load of high-level OO languages (ie Java, Objective-C, C#) and a wackload of low level languages (C, C++, Fortran) and nothing in the middle?

Can you explain what you mean by in the middle? Or maybe give an example of middle-language code?
posted by weston at 8:28 PM on December 14, 2011


Weston: Really lightweight like Python, no weird things with pointers or odd ways of passing variables from one function to another, easy to allocate new memory, that sort of thing. Except you compile it down to machine code, so it runs like a native application, not through a VM or bytecode interpreter.
posted by Canageek at 9:50 PM on December 14, 2011


Canageek: you just described (among other languages) ocaml and sbcl. Both are relatively terse and garbage collected, and typically perform a bit slower than C and a bit faster than Java. Compiling to machine code and interpreting are both available in both languages (usually you use the interpreted version to debug so you get nice readable interactive debug info and interactive fastforward / rewind through execution, then compile your "final" version).
posted by idiopath at 3:16 AM on December 15, 2011


But, sadly for numerical code, neither language does unboxed floats. Anyone know of a high level, compilable language that can crunch unboxed floating point variables? And no, C++ is not high level.
posted by idiopath at 3:21 AM on December 15, 2011
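The boxed-vs-unboxed distinction is easy to see from Python itself: a plain list of floats stores pointers to heap-allocated float objects, while the stdlib array module (like NumPy, mentioned in the post) stores raw machine doubles. A minimal sketch:

```python
import array
import sys

boxed = [1.0, 2.0, 3.0]            # list of pointers to float objects
unboxed = array.array("d", boxed)  # contiguous raw C doubles

# itemsize is the raw machine width of one element: 8 bytes for a C double.
print(unboxed.itemsize)            # 8
# Each boxed float carries object overhead on top of the 8-byte value.
print(sys.getsizeof(1.0) > unboxed.itemsize)  # True
```

The same gap is why numeric code in a language with only boxed floats spends much of its time chasing pointers rather than crunching numbers.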


Really lightweight like Python, no weird things with pointers or odd ways of passing variables from one function to another, easy to allocate new memory, that sort of thing.

Maybe the D programming language? I haven't done a deep look, but my shallow look seems to indicate that here's a language where somebody who knew what languages like Python had to offer thought deeply about the painful things about C and C++ and came up with something significantly better. It's compiled, but fairly low ceremony. It clearly borrows a lot of everyday semantics from the scripting world (dynamic arrays and dictionaries and foreach loops and...). It does garbage collection. There *are* pointers and explicit memory management facilities, but as far as I can tell, you can safely ignore them until you need them, and the language provides facilities enough that for many everyday things, you won't need them.

Anyone know of a high level, compilable language that can crunch unboxed floating point variables? And no, C++ is not high level.

I pause at the threshold of assuming I understand what anybody means by "high level." But D seems to support a lot of high level concepts (both FP and template/generic), is compiled, and floats aren't objects.

I don't think there's a REPL, tho'. :(
posted by weston at 8:22 AM on December 15, 2011


you just described (among other languages) ocaml and sbcl. Both are relatively terse and garbage collected, and typically perform a bit slower than C and a bit faster than Java. Compiling to machine code and interpreting are both available in both languages (usually you use the interpreted version to debug so you get nice readable interactive debug info and interactive fastforward / rewind through execution, then compile your "final" version).

Although nobody really uses it, Forth is an interesting "in-between" language. It's very low level compared to modern languages, but compared to C it has a lot of the same advantages (compiled and fast, good for embedded systems) along with some additional benefits (less worry about memory allocation, more ability to add complex language features that don't already exist). It's kind of like a functional language in procedural language form, if that makes any sense at all. It makes me wonder what cool languages would exist if people took Forth as a starting point for a modern language in the same sort of ways that languages like Haskell build modern constructs on top of Lisp.
posted by burnmp3s at 8:36 AM on December 15, 2011


By "less worry about memory allocation", do you happen to mean that you never wrote a forth program that needed to allocate memory on the heap? Because a standard forth doesn't even have malloc or free, you need to construct them yourself with brk system calls (or the system appropriate call if you are not using a *nix).
posted by idiopath at 12:07 PM on December 15, 2011


Yeah, that is part of the reason why it's not really a reasonable language to use in modern programs. The everything-is-a-stack approach doesn't really work if you need to dynamically allocate memory. But then again, a lot of C and C++ programs that dynamically allocate memory could probably be redesigned in a way that would work without it. I assume if someone wrote a modern stack-based language it would probably include garbage collection or other ways to support dynamic memory allocation without causing the sorts of problems that make memory management a headache in C++.
posted by burnmp3s at 12:24 PM on December 15, 2011


a modern stack-based language it would probably include garbage collection

Sounds like PostScript.

I find stack based languages hard to read & write. You can't just read it; you have to execute it in your head (or on paper) to know what it does.
posted by and for no one at 12:32 PM on December 15, 2011 [1 favorite]
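That "execute it in your head" property can be made concrete with a toy postfix (RPN) evaluator; a minimal sketch in Python (eval_rpn is an invented name for illustration):

```python
# Minimal postfix (RPN) evaluator: each token either pushes a value
# or pops two operands, applies an operator, and pushes the result.
def eval_rpn(tokens):
    stack = []
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
    }
    for tok in tokens:
        if tok in ops:
            b = stack.pop()   # note the order: second operand pops first
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()

# "2 3 + 4 *" means (2 + 3) * 4 -- you have to trace the stack to see it.
print(eval_rpn("2 3 + 4 *".split()))  # 20.0
```

Nothing in the source text "2 3 + 4 *" looks like the expression it denotes; the meaning only emerges from simulating the stack, which is exactly the readability complaint.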


Factor is a modern, stack-based language, which adds in all kinds of interesting and slightly crazy stuff from FP and OOP (for instance, type inference). I haven't taken a lot of time to learn the language, though I've done some Forth stuff in the past, but I find the Factor blogosphere to be generally fascinating.

There are various OO dialects of Forth, which generally have garbage collection and memory allocation words built in, at least for the few I've looked at. I can't recall the name of it now, but one of the earliest development tools for the Mac that could actually produce shippable binaries and that you didn't need to pay any money for was a Forth variant (maybe Mops?).
posted by whir at 4:57 PM on December 15, 2011


OK, in commenting about Forth memory management I forgot about allot (I never built anything in Forth big enough to need to use it).

I was technically wrong but still on the right track - allot and create are quite low level compared to malloc (and there is nothing like free available).
posted by idiopath at 1:38 AM on December 17, 2011


Most of the CS boosters that I've encountered who don't just prefer it but are *sure* it's "worlds better" just don't seem to know how to wield JavaScript as effectively yet. You're the first I've encountered who appears to be interested in presenting intentionally misleading examples.

I know I'm a week late, but this is nagging at me, so here goes.

The splats example I gave has its place in Javascript. You see it in functions where the number of arguments will almost always be one, but might sometimes be more. A good example is console.log(), where you'll almost always call console.log("hey there"), but might occasionally call console.log("hey there", "here", "is", "some", "stuff"). It would be annoying and error-prone to always have to write it like console.log(["hey there"]), so it's written to handle as many arguments as you want.

Now let's take a practical example. Suppose we want a wrapper for console.log() that only prints the message if the log level is above some global setting, like log(9, "hey there, this is pretty serious").

CoffeeScript:

log = (log_level, stuff_to_log...)->
  if log_level > GLOBAL_LOG_LEVEL
    console.log stuff_to_log...


Javascript:

function log(log_level){
  var stuff_to_log = Array.prototype.slice.call(arguments, 1);
  if (log_level > GLOBAL_LOG_LEVEL)
    console.log.apply(console, stuff_to_log);
}


These do exactly the same thing in the same way, but the CoffeeScript is simpler and clearer: it includes all of the arguments in the function definition, and the call to console.log is easy to write and to read. It basically documents itself, whereas the Javascript is complex enough that I would probably have to look up the syntax, and then I'd have to document it before anyone else could figure out what I was up to.

Now splats might only come up occasionally ... which is why I gave literally 13 other examples of the same point. They all add up to what I find to be a faster and more fun coding experience. I'm cool with it if you don't.
posted by Honorable John at 5:17 PM on December 19, 2011

