

"Please cut off a nanosecond and send it over to me."
February 28, 2012 12:58 PM   Subscribe

Adm. Grace Hopper, inventor of the first compiler, explains how big a nanosecond is
posted by Blazecock Pileon (40 comments total) 65 users marked this as a favorite

She even explained it to David Letterman once. (Previously.)
posted by kmz at 1:03 PM on February 28, 2012 [2 favorites]


Never heard of Grace Hopper? Allow Zach Wiener to explain.
posted by Holy Zarquon's Singing Fish at 1:06 PM on February 28, 2012 [9 favorites]


I think I just watched the best of the web.
posted by Elmore at 1:09 PM on February 28, 2012 [8 favorites]


Grace Hopper was created by a wise and subtle God as the ultimate weapon to be deployed against idiotic statements that women aren't suited to work in software development.
posted by Tomorrowful at 1:11 PM on February 28, 2012 [17 favorites]


I'm guessing I'm not the only one doing the Udacity Comp Sci classes?
posted by JauntyFedora at 1:13 PM on February 28, 2012


Someone once explained a nanometer to me thusly:

"You know when you wipe your ass, and the paper tears and you get some shit on your hands, and you wash the hell out of them, but still smell a little shit. That's a nanometer of shit on your hand."
posted by three blind mice at 1:19 PM on February 28, 2012


I had the honor of seeing her talk once, way back when the dinosaurs still roamed the earth (aka 1978). Ever since, she's who I wanna be, if I ever grow up.
posted by easily confused at 1:20 PM on February 28, 2012 [3 favorites]


Great explanation. Software engineers don't realize how important it is to be able to explain things like that.
posted by b1tr0t at 1:31 PM on February 28, 2012 [3 favorites]


Grace Hopper was created by a wise and subtle God as the ultimate weapon to be deployed against idiotic statements that women aren't suited to work in software development.

You have to wonder what code she and Ada Lovelace have been writing since they got together.
posted by ROU_Xenophobe at 1:42 PM on February 28, 2012 [5 favorites]


I love this.
posted by odinsdream at 1:58 PM on February 28, 2012


After a few minutes of this, Erik curled up on the floor and began sobbing the word "why" in between huge gulps of air.

"Your monkey had leukemia. That's what made it explode." I explained.

"It didn't have leukemia!" he screamed at me.

"I meant Lou Gehrig's disease," I said.

"Little Admiral Grace Hopper Inventor Of COBOL Banana didn't have Lou Gh-" he started.

"Well, it was definitely lew-something," I said. "The point is, you should have taken better care of it. Instead of giving it a funny pun name - like Nim Chimpsky, for instance - you took 'Admiral Grace Hopper', appended a short biography of Admiral Grace Hopper to it, and then just added the word 'Banana' to the whole thing. That's indicative of how lazy you were about everything with that monkey. It wasn't even female, for chrissake."

"But-" he said.

"Shut up," I said.

"But-" he added.

"Shut up," I said. "Go make me some Beefaroni," I said.


From.
posted by Sebmojo at 2:32 PM on February 28, 2012 [2 favorites]


Well, that didn't help me at all.
posted by mrnutty at 3:09 PM on February 28, 2012


Oh my goodness the interview with Letterman really shows her incredible charisma. Gah. Love.
posted by lazaruslong at 3:19 PM on February 28, 2012 [1 favorite]


To have written the first compiler ever is so huge!
posted by Meatafoecure at 3:21 PM on February 28, 2012 [2 favorites]


Can someone explain the question she asks Letterman at the end of that interview though? I feel like I missed a joke.
posted by lazaruslong at 3:26 PM on February 28, 2012


now i want a nanosecond of my very own.
posted by radiosilents at 4:01 PM on February 28, 2012 [1 favorite]


I'm oddly amused by the fact that the distance light travels in a billionth of a second is almost exactly one foot. Take that, metric system!
posted by Holy Zarquon's Singing Fish at 4:17 PM on February 28, 2012 [5 favorites]
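[The "almost exactly one foot" claim above checks out; here's a quick back-of-the-envelope sketch in plain Python, using nothing but the defined SI speed of light:]

```python
# Distance light travels in one nanosecond.
C_M_PER_S = 299_792_458          # speed of light in m/s (exact, by definition)
NANOSECOND = 1e-9                # one billionth of a second

distance_m = C_M_PER_S * NANOSECOND   # metres covered in 1 ns
distance_in = distance_m / 0.0254     # convert to inches (1 in = 2.54 cm)

print(f"{distance_in:.2f} inches")    # about 11.80 inches, just shy of a foot
```

[Which matches the wires Hopper handed out: they were cut to roughly 11.8 inches.]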


I do so love Admiral Hopper. So much so in fact, that we named our first daughter after Amazing Grace.
posted by BigHeartedGuy at 4:44 PM on February 28, 2012 [1 favorite]


She seems cool and all, but... have you ever used COBOL?
posted by phrontist at 4:50 PM on February 28, 2012 [6 favorites]


Can someone explain the question she asks Letterman at the end of that interview though? I feel like I missed a joke.

She asked Letterman whether his ancestry was Welsh, Scottish or Irish. Letterman is actually none of those, really; he is probably closer to German. At any rate he did not give her an answer, which probably threw her off. I think the idea was that she had some sort of joke based on whatever his answer would be out of those three, but since he didn't answer she couldn't tell the joke.
posted by burnmp3s at 5:08 PM on February 28, 2012


now i want a nanosecond of my very own.

When I heard her speak, she handed them out to everyone in the audience. The only other thing I can remember from her talk, so long ago, is her anecdote of being in Japan but not speaking any Japanese, so she had to communicate using COBOL verbs.
posted by Obscure Reference at 6:52 PM on February 28, 2012


She seems cool and all, but... have you ever used COBOL?
It took her a very long time to get any professional recognition for her work at all. I think of COBOL as her retribution for all the sexist crap she had to put up with.
posted by b1tr0t at 7:03 PM on February 28, 2012 [2 favorites]


She seems cool and all, but... have you ever used COBOL?

Don't be absurd. Nobody uses COBOL.

COBOL uses you.
posted by Tomorrowful at 7:18 PM on February 28, 2012 [1 favorite]


Some context for the non-programmer:

As a programmer in 2012, it's really hard to grapple with the idea that compilers did not exist until 50-60 years ago. This is like thinking about biology before microscopes. Or thinking about physics before Newton. It hurts my brain to think about programming without even the cognitive toolkit that compiled languages offer. The sheer number of man-hours, and the billions, perhaps trillions, of dollars in commercial productivity and research time that compiled languages have enabled, is really astounding.

This kind of "holy ****, our most basic concepts were invented within recent memory" effect is unusually common in computer science, which is just taking its baby steps into the world, as far as academic disciplines go. It's really strange and thrilling to have the founding fathers (and mothers) of your field still hanging around doing stuff.
posted by deathpanels at 8:53 PM on February 28, 2012 [1 favorite]



She seems cool and all, but... have you ever used COBOL?

You know, it's popular to take a swing at COBOL, but in a lot of ways, modern languages could learn something from it. The "plain English" approach is maligned by most career programmers, but it makes more sense in a world in which ordinary people write and maintain software, not one in which a class of professional programmers, who all understand and passively accept weird syntax and language constructs handed down from C, do all the programming.
posted by deathpanels at 9:00 PM on February 28, 2012


You scared me. I was afraid that a post beginning "Adm. Grace Hopper, inventor of the first compiler," was going to require a .

I saw that Letterman show recently. She's absolutely badass.
posted by theora55 at 10:45 PM on February 28, 2012


theora55: you do realize that she died in 1992...?
posted by davidmsc at 12:35 AM on February 29, 2012 [3 favorites]


phrontist: "She seems cool and all, but... have you ever used COBOL?"

Yeah, it was the worst language I ever learned, not least because of the weird column-sensitive formatting. That alone would be relatively excusable - even the idea of context-free grammars was new at the time so it's not like she had a body of best practices to learn from - but then she went ahead and invented bugs. INVENTED BUGS.
posted by vanar sena at 2:11 AM on February 29, 2012 [1 favorite]


she went ahead and invented bugs. INVENTED BUGS.

Then she taped them to her reports.

the ultimate weapon to be deployed against idiotic statements that women aren't suited to work in software development.

Kind of a shame that so few have apparently followed in her footsteps. For decades I'm sure she was probably touted by the status quo as "the exception that proves the rule."
posted by ShutterBun at 3:07 AM on February 29, 2012 [1 favorite]


I had never heard of her till this post. And now I am totally in love. I wish more teachers knew how to explain things like that. Totally amazing.
posted by bardophile at 4:18 AM on February 29, 2012


Software engineers → Most people, some of whom are software engineers, don't realize how important it is to be able to explain things like that.
posted by DU at 4:55 AM on February 29, 2012 [1 favorite]


It hurts my brain to think about programming without even the cognitive toolkit that compiled languages offer.

This why I fear The Art of Computer Programming is doomed. There's so much good information in that series, but it's all locked up in assembly code. Assembly code for a fictional computer, no less. But nobody talks in terms of atomic operations anymore.

I'd really love to see TAoCP re-written in Scheme. That could be a book for the ages.
posted by DU at 4:58 AM on February 29, 2012 [3 favorites]


[...] compilers did not exist until 50-60 years ago.

To put it into some context for the non-programmers among you, writing even a moderate-sized program without a higher-level language and compiler is something like having to start off building a house by first constructing a forge to smelt iron ore to form the parts for the kiln you'll need to bake your own bricks.
posted by Mr. Bad Example at 5:07 AM on February 29, 2012 [5 favorites]
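[To make that layering concrete, here's a small sketch using Python's standard dis module (a modern tool, obviously not anything from Hopper's era) to show the stream of low-level instructions a compiler produces from one readable line of high-level code:]

```python
import dis

def fahrenheit_to_celsius(f):
    # One readable line of high-level code...
    return (f - 32) * 5 / 9

# ...which the compiler turns into a column of low-level instructions.
# Before compilers, programmers hand-wrote programs at this level or below.
dis.dis(fahrenheit_to_celsius)
```

[Run it and you get opcodes like LOAD_FAST and a constant pool; writing a whole nontrivial program at that level by hand is the forge-and-kiln part of the analogy.]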


deathpanels: "You know, it's popular to take a swing at COBOL, but in a lot of ways, modern languages could learn something from it. The "plain English" approach is maligned by most career programmers, but it makes more sense in a world in which ordinary people write and maintain software"

I have plenty of sympathy for this idea, but the reality of COBOL doesn't match up to this ideal. The superficial similarity to natural language has the downside of lulling newbies into a false sense of competence, when the language is far less forgiving than it seems on the surface - more so even than more hacker-y languages.

I've personally seen this play out more recently with Inform 7, which is "guided by contemporary work in semantics." The grammar is extremely natural for the most part and the parser is really clever, so you'll be coding away on your game and suddenly come up against something that looks like it should work but just doesn't. Some of it is down to fixable parser shortcomings, and some to the fact that we're pretending we can achieve the precision required of a programming language using a less precise natural language. And that's when you think "yes yes very clever, but I need this to work" and drop into Inform 6 for a while.

Of programming languages in common use, IMHO the one that most closely approaches this ideal is probably Perl, which has another problem you'd expect from natural language programming: you could program Perl for years and still find someone else's code incomprehensible, because Perl likes to let you do things your way. Lots of Perl programmers really like that though, and I'm not smart enough to argue with them.

(Note that I'm not trying to belittle Hopper's monumental achievements in any way - her place in the CS canon is beyond question.)
posted by vanar sena at 5:26 AM on February 29, 2012 [4 favorites]


To put it into some context for the non-programmers among you, writing even a moderate-sized program without a higher-level language and compiler is something like having to start off building a house by first constructing a forge to smelt iron ore to form the parts for the kiln you'll need to bake your own bricks.
A better analogy is that building large projects out of machine code (they didn't even have assemblers back then) is like attempting to carve the Taj Mahal out of a single block of stone, by hand. Individual sun-baked bricks? Luxury, akin to having an actual assembler with a macro system! Kiln-fired bricks with teams of workers who know how to lay and set them? Now you are getting closer to the most primitive procedural programming languages.

Conceiving of and implementing the first compiler was such a massive practical and intellectual achievement that it is difficult to fully comprehend in retrospect.

To quote Wikipedia:

In 1952 she had an operational compiler. "Nobody believed that," she said. "I had a running compiler and nobody would touch it. They told me computers could only do arithmetic."

Note that I'm not trying to belittle Hopper's monumental achievements in any way - her place in the CS canon is beyond question.
She was a human, and like any human, her work was not without flaws. COBOL is full of regrettable features, but even having COBOL around as a signpost to demonstrate what not to do is useful. A lot of apparently brilliant software ideas turn out to be dead-ends like that. A lot of dead-end projects spawn teams of engineers who end up creating really awesome stuff with the lessons learned. Good engineers embrace failure - you can learn a lot more from it than from success.
posted by b1tr0t at 8:31 AM on February 29, 2012 [3 favorites]


The "plain English" approach is maligned by most career programmers, but it makes more sense in a world in which ordinary people write and maintain software, not one in which a class of professional programmers, who all understand and passively accept weird syntax and language constructs handed down from C, do all the programming.

I'm all for making syntax less weird (I personally think Python's super simple pseudocode-like syntax is a big improvement over C-like syntax with brackets and semi-colons), but "plain English" does not necessarily make a programming language easier to use for non-programmers. There are two main positives of using natural language as an input to computers: allowing less precise input, and re-using existing language constructs rather than inventing new words or symbols.

The main benefit from allowing less precise input is that the user can accidentally get their intended result even if they don't know anything about the system or what the options are. For example, someone can type in "what is 45 degres farenheight in celsious?" and Google is smart enough to spit out 7.22222222 as an answer even though none of the key words are spelled correctly and the person typing in the search terms probably didn't know that Google had a built-in temperature conversion calculator. Requiring the user to type in "FahrenheitToCelsius(45)" or some other specific syntax would be much less useful. This is the basis of Siri and pretty much any other major natural language based app: there is a relatively simple set of basic commands underneath, with a big complicated parsing engine that runs a request through its AI to figure out what command the user is talking about and what data should be used for that command.

Unfortunately, for programming this doesn't really work at all. A given program will be made up of many, many commands that are all interrelated, so there is no way for a programmer to enter a series of vague, imprecise statements and expect that to get put together into something that works. A programmer has to know what statements are available and how they need to be structured in order to write non-trivial code; there is no way around that.

On the other hand, re-using existing language constructs is something that every programming language does to a certain extent. For example, "for each [item] in [container]" is pretty much the same as how you would describe it in English, whereas "for (i = 0; i < 10; i++)" includes a lot more specialized syntax and structure but still at least includes the concept of "for". You have to get to the Perl Golf level of completely abandoning everything but symbols to really miss out on English concepts, which the APL language probably came the closest to actually embracing as a design concept. It's not really natural language in the same way as the imprecise example from Google: a statement in a programming language always means one precise thing that the programmer should be aware of, so in a sense any English words in a program are more like jargon standing in for specific abstract concepts than normal written English.

And it's not obvious that taking the step from English jargon with symbols to straight English jargon is better. The C for loop example above is more complicated than "for each" because the underlying loop mechanics are more complicated; restating it as "Set 'i' to zero, and in a loop increment 'i' by one each time, as long as 'i' is less than ten" does not make it any less complicated to use. The English language version may make the statement make more sense to someone who has no idea how the for loop works, but as explained above a programmer has to learn how a for loop works one way or another before they can actually start writing real code, so once the person has learned the language enough to use it, this aspect is useless. Similarly, it might help when you are learning math to think of "(3 + 5) * 12 = x" as "Take the sum of three and five, then multiply that result by twelve. The previous value is equal to a variable called x," but actual mathematicians writing mathematical proofs don't get much out of using natural language rather than more concise and opaque symbols.

Inform 7 is a good example of not getting the imprecision benefits and only getting marginal benefits from English jargon. It helps that Inform 7 is almost entirely a declarative language, so the code is more similar to HTML or a configuration file than most of the code you would create in a general purpose programming language. It also helps that most of what you do when you write Interactive Fiction is define words and write descriptions of things in English. But really I do not think there are a lot of benefits to using "Living Room is a room. A chair is in Living Room. It is fixed in place." rather than "Living Room: type=room, chair: location=Living Room moveable=False" or whatever kind of markup language syntax would make sense. Inform 7 is a well-designed language and the natural language aspect might be helpful for learning the complicated set of interactions that it allows, but after using it for a while I kind of wished that I could just write everything as a concise YAML file and have it converted to their verbose natural language style before compilation.
posted by burnmp3s at 10:08 AM on February 29, 2012 [2 favorites]
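[The loop comparison above, side by side, as a sketch in Python; its for-each reads almost like the English description, and the C-style counted loop is spelled out with a while so the hidden bookkeeping is visible:]

```python
items = ["a", "b", "c"]

# "for each item in items" -- reads almost like the English description.
seen_foreach = []
for item in items:
    seen_foreach.append(item)

# The C-style "for (i = 0; i < n; i++)" spelled out: the index bookkeeping
# is explicit, and a wordier English-like surface syntax wouldn't remove it.
seen_indexed = []
i = 0
while i < len(items):
    seen_indexed.append(items[i])
    i += 1

print(seen_foreach == seen_indexed)   # both loops visit the same elements
```

[Either way the programmer has to understand the loop mechanics; the wordier spelling doesn't simplify the semantics, which is the point being made about "plain English" syntax.]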


To have written the first compiler ever is so huge!

it's really hard to grapple with the idea that compilers did not exist until 50-60

Well, I'm not sure we should consider her A-0 project the "first compiler" in the sense that is implied here - as the article says, it was really more of a loader and linker, with maybe some macros as well. Did her COBOL compiler come before the FORTRAN one?
posted by phrontist at 1:27 PM on February 29, 2012


DU: People make too big a deal of that - TAoCP never describes an algorithm as MMIX only. There is frequently pseudocode and often an extensive English and mathematical description.

That said, it would be cool if someone started translating some code examples to Scheme.
posted by phrontist at 2:31 PM on February 29, 2012


Well, I'm not sure we should consider her A-0 project the "first compiler" in the sense that is implied here - as the article says, it was really more of a loader and linker, with maybe some macros as well.
To stretch the analogy a bit, it sounds like A-0 was more like the invention of mortar or Portland Cement in terms of construction.

Understanding the full stack of tools that make up a compiler system, to the point where you can implement them yourself, is hard enough today with sixty years of experience and literature. Of course, Hopper was a smart woman who probably didn't realize how difficult her project actually was until she completed it.
posted by b1tr0t at 2:58 PM on February 29, 2012


b1tr0t: Yeah, I don't mean to diminish her contribution, I just wanted to make clear that The Compiler wasn't invented in one fell swoop by an individual.
posted by phrontist at 1:45 PM on March 1, 2012




This thread has been archived and is closed to new comments