Dennis Ritchie has died
October 12, 2011 6:49 PM   Subscribe

According to breaking news, Dennis Ritchie, inventor of the C programming language, co-author with Brian Kernighan of the famous book on it, and creator with Ken Thompson of the Unix operating system, has died.

Interview with Ritchie from 1999 and 2000.

Ritchie and Thompson were awarded the Association for Computing Machinery's Turing Award in 1983. Ritchie was awarded the IEEE Hamming Medal "for the origination of the UNIX operating system and the C programming language".
posted by grimjeer (238 comments total) 27 users marked this as a favorite
 
|
posted by Pruitt-Igoe at 6:50 PM on October 12, 2011 [1 favorite]


> /dev/null
posted by Slothrup at 6:51 PM on October 12, 2011 [7 favorites]


.
posted by FlyingMonkey at 6:51 PM on October 12, 2011


;
posted by e.e. coli at 6:52 PM on October 12, 2011 [13 favorites]


.
posted by genehack at 6:52 PM on October 12, 2011


.
posted by oonh at 6:53 PM on October 12, 2011


}
posted by Llama-Lime at 6:53 PM on October 12, 2011 [3 favorites]


;
posted by The Lurkers Support Me in Email at 6:54 PM on October 12, 2011


delete ptr_Dennis_Ritchie;
ptr_Dennis_Ritchie = null;
posted by Old'n'Busted at 6:55 PM on October 12, 2011 [3 favorites]


return 0;

More seriously: computer science is such a young field that almost all of its pioneers are still alive. It's strange to think that someday that won't be the case.
posted by jedicus at 6:55 PM on October 12, 2011 [9 favorites]


^D
posted by kcds at 6:56 PM on October 12, 2011 [3 favorites]


Maybe ; is more fitting. It's the end of a statement in C, and a command separator in the Bourne shell.

I still can't believe I have a phone that runs a variant of UNIX. RIP.

;
posted by Pruitt-Igoe at 6:57 PM on October 12, 2011 [4 favorites]


>exit
posted by octothorpe at 6:57 PM on October 12, 2011


*
posted by scose at 6:58 PM on October 12, 2011


free(*richie);
posted by Ad hominem at 6:58 PM on October 12, 2011 [1 favorite]


so sad I spelled the guy's name wrong, I do that with all variables

free(*ritchie);
posted by Ad hominem at 7:00 PM on October 12, 2011 [7 favorites]


.
posted by /\/\/\/ at 7:01 PM on October 12, 2011


*pours out a 40 bit register*
posted by DU at 7:01 PM on October 12, 2011 [25 favorites]


int dmr_days[ 25580 ];
/* buffer overflow error??? */
for (int i = 0; i <= 25580; ++i) dmr_days[ i ] = 1;
posted by sbutler at 7:01 PM on October 12, 2011


It's just impossible to calculate those three guys' influence, it's just everywhere.
posted by octothorpe at 7:01 PM on October 12, 2011


Goodbye, true love
posted by 200burritos at 7:01 PM on October 12, 2011


.

I guess I should finally pick up that K&R book I keep meaning to get.
posted by ChurchHatesTucker at 7:03 PM on October 12, 2011


return 0;
posted by unSane at 7:03 PM on October 12, 2011


I remember how proud I was when I finally understood how pointers worked.

My first programming job, I used some horrible in-house scripting language. I went back to school, took a few classes in C to brush up on what I'd lost, and then I was off to the races. I owe the first seven years of post-college paychecks to Dennis Ritchie's language.

;
posted by RakDaddy at 7:04 PM on October 12, 2011


I came here looking for the Unixy puns, and I'd like to think that he would take them as the tribute they are.

Y'know, when Jobs died I thought about all the things surrounding us that he was involved in creating. But if you count all the electronic devices around you that are either running a variety of Unix or used the C programming language as part of their creation ... it's jaw dropping.

Also
.
posted by benito.strauss at 7:05 PM on October 12, 2011 [10 favorites]


Yeah, I can't even imagine a (computer) world without UNIX and C. I run Linux on all the computers I touch. What would I be using instead? It wouldn't even be Windows and OSX as we know them now without the influence (to say the least) of Ritchie.
posted by DU at 7:05 PM on October 12, 2011


My copy of "The C Programming Language (2nd Edition)" got heavy use for years.

#define ME sad
posted by cosmac at 7:06 PM on October 12, 2011 [1 favorite]


;
posted by Vicarious at 7:06 PM on October 12, 2011


.
posted by Ivan Fyodorovich at 7:07 PM on October 12, 2011


;
posted by pwicks at 7:08 PM on October 12, 2011


/* TODO: nothing more to do. */
posted by SPrintF at 7:08 PM on October 12, 2011 [5 favorites]


int main(void)
{
printf("goodbye, world\n");
return 0;
}

posted by crunchland at 7:08 PM on October 12, 2011 [23 favorites]


char* post = ".";
posted by jferg at 7:08 PM on October 12, 2011


# shutdown -h now
posted by axiom at 7:09 PM on October 12, 2011 [2 favorites]


;
posted by ancillary at 7:12 PM on October 12, 2011


Yeah, I can't even imagine a (computer) world without UNIX and C. I run Linux on all the computers I touch. What would I be using instead?

One shudders to think of a world in which virtually all operating systems were derived from or inspired by something like OS/360.
posted by jedicus at 7:13 PM on October 12, 2011 [4 favorites]


;
posted by quoz at 7:15 PM on October 12, 2011


;
posted by drworm at 7:19 PM on October 12, 2011


;
posted by underflow at 7:19 PM on October 12, 2011


Looks like Steve Jobs needed some help running his new iCloud after all...

;
posted by the painkiller at 7:20 PM on October 12, 2011 [10 favorites]


;
posted by combinatorial explosion at 7:20 PM on October 12, 2011


;
posted by wanderingmind at 7:20 PM on October 12, 2011


;
posted by grimmelm at 7:23 PM on October 12, 2011


world--;
posted by Obscure Reference at 7:25 PM on October 12, 2011 [3 favorites]


return 0;
posted by gsteff at 7:26 PM on October 12, 2011


.

UNIX.

Its philosophy and design are everywhere now. From the largest of supercomputers running climate change simulations, to being the backbone of internet web servers, to many a hand held smart phone.

Dennis Ritchie and Ken Thompson started something so very much bigger than themselves. UNIX and its derivatives have shaped our modern world, in profound ways.

Simple, elegant tools. Brought together. To do very powerful things.

My world would be very different were it not for the advent of UNIX. Its philosophy. Its tools. Compilers. Shells.

So many have built so much on this... a most solid foundation.

I run Linux, a BSD or two, Solaris, even a very aged AIX box.... on PowerPC...

It's on all, but one machine I have.

Graceful tools. Powerful command.

UNIX is for the admin.

UNIX is for the user.

#
posted by PROD_TPSL at 7:26 PM on October 12, 2011 [5 favorites]


/* Tonight, I switch to 1TBS for one night. */

;
posted by ignignokt at 7:30 PM on October 12, 2011


What's amazing isn't that Unix isn't still being used today, it's that it will still be used 50 years from now.
posted by gsteff at 7:32 PM on October 12, 2011 [18 favorites]


.
posted by lilkeith07 at 7:33 PM on October 12, 2011


Arrgh, still is being used today.
posted by gsteff at 7:33 PM on October 12, 2011


   exit(EXIT_SUCCESS);
}
posted by 1970s Antihero at 7:34 PM on October 12, 2011 [8 favorites]


Not to denigrate Ritchie and Thompson at all, but Unix was based on an older system called Multics, and some of the things that were discarded in the simplification of Multics into Unix have returned.
posted by CheeseDigestsAll at 7:35 PM on October 12, 2011 [1 favorite]


What, no "end of an epoch" yet?
Come on, people.
posted by uosuaq at 7:35 PM on October 12, 2011 [2 favorites]


;
posted by middleclasstool at 7:40 PM on October 12, 2011


Not to denigrate Ritchie and Thompson at all, but Unix was based on an older system called Multics, and some of the things that were discarded in the simplification of Multics into Unix have returned.

One of the points that Stephen Jay Gould liked to make about evolution is that, contrary to the popular idea, evolution didn't always lead to more complexity. In some niches the less complex organism is fitter.
posted by benito.strauss at 7:41 PM on October 12, 2011 [1 favorite]


So, I'm confused. Is Linux considered a form of Unix still?
posted by mccarty.tim at 7:41 PM on October 12, 2011


I've got to find my copy of K&R...

;
posted by en forme de poire at 7:42 PM on October 12, 2011


;
posted by chunking express at 7:44 PM on October 12, 2011 [1 favorite]


;
posted by dmd at 7:45 PM on October 12, 2011


mccarty.tim - that depends whether you mean from a standards compliance, trademark licensing, "derived work", or general philosophy perspective.
posted by russm at 7:47 PM on October 12, 2011


I'd urge anyone with an interest in C to seek out the slim first edition of the K&R book, which was truly a revelation to me after I'd already been programming in (more modern, ANSI) C for a year or so. One of the exercises is an implementation of malloc() using Unix system calls. To this day I think of it as one of the all-time classic computer books, and I'll always be grateful to Ritchie for his role in writing it, and in the development of C and Unix.
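
For anyone curious what that exercise boils down to, here's a toy sketch (mine, not the book's -- the real K&R allocator keeps a free list and supports free(); this just grabs aligned chunks from the OS and never gives them back):

#include <stddef.h>
#include <unistd.h>

/* Toy allocator in the spirit of the K&R exercise, not the book's code:
   ask the OS for more heap with the Unix sbrk() call and hand out an
   aligned chunk.  No free(); the real K&R version threads freed blocks
   onto a circular free list. */
void *toy_alloc(size_t nbytes)
{
    /* round up to a multiple of sizeof(long) so callers get aligned storage */
    size_t rounded = (nbytes + sizeof(long) - 1) & ~(sizeof(long) - 1);
    void *p = sbrk(rounded);
    return p == (void *)-1 ? NULL : p;
}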
posted by whir at 7:51 PM on October 12, 2011 [5 favorites]


.
posted by Increase at 7:54 PM on October 12, 2011


.
posted by a snickering nuthatch at 7:56 PM on October 12, 2011


;
posted by chemoboy at 7:57 PM on October 12, 2011


;
posted by jeisme at 7:58 PM on October 12, 2011


;
posted by randomname25 at 7:58 PM on October 12, 2011


;
posted by lapolla at 8:05 PM on October 12, 2011


&
posted by These Premises Are Alarmed at 8:06 PM on October 12, 2011


Ah, this is sad. I interviewed him once. It was excruciating. Not through any fault of his own - he was quite gracious - but it was a weird, weird day.

I'd just started work as a writer for a large publisher of trade newsletters (one of the very first sectors of journalism to be plowed under by the Internet). This would have been early to mid 90s. I was new, had no idea what I was doing, and was assigned to write half of a biweekly newsletter covering OSI, a suite of European networking standards that I had never heard of before. I had no training in anything remotely like network protocols. (And no, it was par for the course at that company for newsletters about highly technical subjects to be written by liberal arts graduates with absolutely no knowledge of those subjects whatsoever. It was an interesting place.)

About this time, they were starting to figure out that nobody really gave a damn about OSI. The U.S. had basically decided to bail and charge ahead with Internet Protocol, and the rest is history. So no sooner do I start trying to at least figure out what OSI stands for when suddenly it's now a newsletter about UNIX. About which I also knew nothing. I had no idea what I was expected to do, but then the editor tells me, "No problem, you've got an interview with Dennis Ritchie in half an hour. Just talk to him and write up two pages." Amazingly, I knew who he was. I had taken just enough intro to C to have a memory ping on Kernighan and Ritchie. This was literally the first thing I'd recognized since getting there, so I was actually kind of psyched.

But when it came right down to it, I had no idea what to do once I actually got on the phone with him. Again, he was very gracious, but it had to be bizarre for him talking to someone who clearly didn't know the first thing about him or his work and was reduced to asking things like, "So. What was it like inventing UNIX?"

But we both stumbled our way through it and I wrote a completely useless, forgettable two pages about Dennis Ritchie and his thoughts on UNIX and where it was going, and that was that. I eventually got to sit in the big chair and edit a newsletter of my very own, about the SS7 telephone signaling platform for Advanced Intelligent Network services. About which I knew absolutely nothing.
posted by Naberius at 8:08 PM on October 12, 2011 [3 favorites]


I'm teaching a little intro to C class at the local hackerspace in a little over a week. I use C, either write some, or debug some, every day as part of my job. Now I have to do it without the R of K+R.

Danny Hillis famously said:
I went to my first computer conference at the New York Hilton about 20 years ago. When somebody there predicted the market for microprocessors would eventually be in the millions, someone else said, "Where are they all going to go? It's not like you need a computer in every doorknob!"
Years later, I went back to the same hotel. I noticed the room keys had been replaced by electronic cards you slide into slots in the doors.
There was a computer in every doorknob.
And what those doorknobs are almost certainly all microprocessing is the output of a C compiler.

.
posted by smcameron at 8:09 PM on October 12, 2011 [37 favorites]


Thanks Dennis. You were one of the geniuses that played an important part in helping strangers from different sides of the earth talk to each other, instantly, for the first time in human history. If we mess it up, it will be our own damn fault now.

;
posted by notion at 8:13 PM on October 12, 2011


;

delete?! free()ing a dereferenced pointer?! FOR SHAME!
posted by mkb at 8:14 PM on October 12, 2011 [4 favorites]


#!/bin/sh

while true; do
sleep 1d
done

posted by double block and bleed at 8:15 PM on October 12, 2011


I have used C almost my entire life after learning it from Kernighan & Ritchie in early high school.

I've claimed before that C is the most important programming language ever invented.

It welded all the rigor and portability of the '60s most advanced imperative languages with the power and speed of assembler by introducing critical tools for systems programming like unions, bitfields, pointers, and volatile variables onto a structured, strongly typed programming language with the most advanced features known, like const. Never since has any language so unified computer science. Assembly language was pure non-portable barbarism in comparison.

Gutenberg is a more important inventor than Dennis Ritchie. I'll reserve judgement about other comparisons.

;
posted by jeffburdges at 8:15 PM on October 12, 2011 [10 favorites]


;
posted by BigHeartedGuy at 8:19 PM on October 12, 2011


C is almost a portable assembly language. It's a mid-level language: not high-level, not assembly. You can operate on memory directly, and there's no runtime. Well, there's a runtime library you can use if you want, but there doesn't need to be anything running behind the scenes that you don't control.
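
A concrete illustration of that "operate on memory directly" point -- the register address here is made up, but this is the everyday idiom on embedded targets:

#include <stdint.h>

/* Alias a (hypothetical) memory-mapped GPIO register by casting a raw
   address to a pointer; writes go straight to the hardware, with no
   runtime sitting in between. */
#define FAKE_GPIO_OUT ((volatile uint32_t *)0x40021014u)

void led_on(void)
{
    *FAKE_GPIO_OUT |= 1u << 5;   /* set bit 5: direct memory-mapped I/O */
}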
posted by Pruitt-Igoe at 8:21 PM on October 12, 2011


So, I'm confused. Is Linux considered a form of Unix still?

At this point, linux is pretty much the canonical unix. It occupies the vacuum left by SunOS. Or at least it seems so to me. (Notice I said SunOS, not Solaris... the distinction being that at one time, SunOS was sort of the most "popular" unix on workstations, and all the free unixey programs would compile easily on SunOS, as it was sort of a de facto "default" unix, but maybe require tweaks on HPUX, or Ultrix, or what have you. Now all the free unixey programs tend to compile easily on linux, and linux seems to be the de facto "default" unix, but might require tweaks for (Mac) OS X, or whatever other flavor of unixey OS you might have.) The fly in this ointment is of course that there's not just one linux, but many, with varying packaging systems and default libraries and so on.
posted by smcameron at 8:22 PM on October 12, 2011


;
I'm reading this when I should be studying for a C++ midterm tomorrow; I took C a few years ago, and used it all summer at work, on a Linux machine.

So much stuff I've used is based on what he did, it's crazy....
posted by Canageek at 8:23 PM on October 12, 2011


I spent all day today, like every workday, writing C on a Linux box. I can say with absolute honesty that I don't know where I'd be without this man.

Guess we all have to hit the if (err = 1) exit(1); lurking in our codebase someday. Thanks, Dennis.

;
posted by vorfeed at 8:24 PM on October 12, 2011 [2 favorites]


I love C.

But, if I could change something about C it would be these two things:

Default storage class of functions should be static, not extern.

Order of bits in bitfields should be strictly defined. As it is now, the same bitfield-using code behaves differently on a big-endian machine than on a little-endian one. So for code which uses bitfields to define the various bits of a hardware register, you end up either using #ifdefs and two variants of the bitfield (one for big endian, one for little endian), or you skip bitfields and write endian-clean code that does the same thing using constants, bit shifts, and bitwise operators. This means that one of the main potential applications of bitfields is pretty fundamentally broken, and what's left (packing things into bits merely for space optimization) is of questionable value considering the price of DRAM these days.
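
To make that concrete, here's roughly what the two options look like for a made-up 8-bit status register. (The usual convention is first-declared field in the high-order bits on big-endian ABIs and in the low-order bits on little-endian ones, but strictly speaking the layout is implementation-defined; BITFIELDS_MSB_FIRST is a hypothetical project macro.)

#include <stdint.h>

/* Option 1: two bitfield layouts selected by an #ifdef, so that "ready"
   lands on bit 7 either way.  Purely conceptual: a real register overlay
   would also need to pin the struct to exactly one byte. */
#ifdef BITFIELDS_MSB_FIRST
struct status { unsigned ready:1, error:1, mode:3, reserved:3; };
#else
struct status { unsigned reserved:3, mode:3, error:1, ready:1; };
#endif

/* Option 2: endian-clean masks and shifts that mean the same thing on
   every machine. */
#define STATUS_READY(reg) (((reg) >> 7) & 0x1u)
#define STATUS_ERROR(reg) (((reg) >> 6) & 0x1u)
#define STATUS_MODE(reg)  (((reg) >> 3) & 0x7u)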
posted by smcameron at 8:30 PM on October 12, 2011 [1 favorite]


. RIP, Sir.
posted by bz at 8:30 PM on October 12, 2011


I went to pour out a drink for dmr but my bottle was unterminated and the glass buffer overflowed.

;
posted by spitefulcrow at 8:30 PM on October 12, 2011 [2 favorites]


printf("thank you");
posted by WalkingAround at 8:32 PM on October 12, 2011


;
posted by hattifattener at 8:33 PM on October 12, 2011


;

K&R is the basis for my deep love of C; I'm a Python guy now, by preference, but C just touches something deep inside me -- the urge to build and the realization that with simple tools we can create architectural masterpieces.

Ritchie's work was brilliant. Kernighan too; let's hope he's got a few more years left in him.
posted by ChrisR at 8:34 PM on October 12, 2011


;
posted by parki at 8:36 PM on October 12, 2011


Sad news. Rest in peace.
posted by kernel_sander at 8:37 PM on October 12, 2011


++ -- () [] . -> ++ -- + - ! ~ * & * / % + - << >> < <= > >= == != & ^ | && || ?: = += -= *= /= %= <<= >>= &= ^= |= ,
posted by grouse at 8:39 PM on October 12, 2011 [1 favorite]


I first ran across C in my second year of college, with Dennis Ritchie's book The C Programming Language. (In the first year we learned Pascal, and C was covered as part of the "computer architecture" sophomore class along with assembly language.) The first edition of that book was amazing: a tutorial on a programming language which assumed you already knew how to program! Every computer book I'd ever seen previously had started you out from square one, which was frustrating to those of us who'd been mega-geeks in high school and did not need another introduction to flowcharts and bytes. All throughout K&R was the subtext that you, the reader, already knew how to work with memory buffers and structured procedures; what C brought you was the ability to do it cleanly and easily. Dennis Ritchie made me feel, for the first time, like I was joining the world of grown-ups.
posted by Harvey Kilobit at 8:41 PM on October 12, 2011 [2 favorites]


;
posted by ZeusHumms at 8:42 PM on October 12, 2011


;
posted by jefbla at 8:46 PM on October 12, 2011


OK, Duff's device: the unholy intersection of C and Lucasfilm. There was a time that programmers worked "close to the metal" and C was their weapon of choice.

Nowadays, I feel so far away. I write scripts that talk to APIs that, in turn, chat with distant systems speaking who knows what. You know that scene in Independence Day, when Jeff Goldblum interfaces with the alien? That's me, now, every damn day.
posted by SPrintF at 8:46 PM on October 12, 2011 [5 favorites]


Every single one of my college books was a massive tome, except one - The C Programming Language 2nd Edition; 272 pages, including the index. "C is not a big language, and it is not well served by a big book". I'm looking at the battered copy that I've had for 20 years now. The price on the back is £26.95; while not exactly cheap, it was the cheapest book I bought for college, the only one I got my money's worth from, easily the most used and the only one I still own.

;
posted by IanMorr at 9:00 PM on October 12, 2011 [1 favorite]


C wasn't the first programming language I learned, but it was the one that really stuck, maybe because it was self-taught through the first edition of K&R. What programming I do these days is with Python. But I suspect if somebody were to wake me up in the middle of the night and tell me I had to write a program right now I'd be writing it in C.

free(dmr);
dmr = NULL;
posted by needled at 9:01 PM on October 12, 2011 [1 favorite]


rm -rf ~dmr
echo ":("
posted by freebird at 9:14 PM on October 12, 2011 [1 favorite]


K&R was how I got started with programming.

.
posted by Anything at 9:15 PM on October 12, 2011


This has been a hard month on people who've changed the world.
posted by ardgedee at 9:16 PM on October 12, 2011 [8 favorites]


.

Quick, load the core dump, we can fix this.
posted by qxntpqbbbqxl at 9:24 PM on October 12, 2011 [4 favorites]


$ init 0
posted by jimfl at 9:28 PM on October 12, 2011


Anyone who uses a computer is using technology influenced by Dennis Ritchie. If you ripped out of a computer everything that was developed using C or Unix, or languages with C-like syntax such as C++, Java, C#, and JavaScript, there would pretty much be nothing left above the hardware. Even the hardware was probably designed using programs written in C. His influence on software and computers in the last 40 years is immeasurable.
posted by Xoc at 9:35 PM on October 12, 2011 [2 favorites]


;

Oh damn. I still have my 1st edition of K&R, back from before it was known as K&R. I still remember sitting in my cubicle at the USGS, taking the interactive C tutorial, typing on an ADM-3 terminal to some mysterious Bell Labs system over ARPANET. Now I feel very old.
posted by charlie don't surf at 9:41 PM on October 12, 2011 [2 favorites]


;
posted by furtive at 9:43 PM on October 12, 2011


C is the only language I truly enjoy programming in (hence my username). C++ comes close, but there's something about the simplicity of C that keeps pulling me back. RIP, dmr, your creations gave me both a job and a hobby.
posted by cmonkey at 10:01 PM on October 12, 2011


A language that doesn't have everything is actually easier to program
in than some that do.
   -- Dennis M. Ritchie

As in certain cults it is possible to kill a process if you know its true name.
   -- Ken Thompson and Dennis M. Ritchie

Via  fortune -m Ritchie | less  although Google found more.
posted by jeffburdges at 10:02 PM on October 12, 2011 [4 favorites]


One shudders to think of a world in which virtually all operating systems were derived from or inspired by something like OS/360.

I was chatting with my girlfriend about dmr's death and explaining how everything in use today is either UNIX or heavily influenced by it.
her: what did people use before?
me: horrible things like OS/360
her: i don't know what that is
me: good
Thanks again, Dr. Ritchie.
posted by grouse at 10:16 PM on October 12, 2011 [1 favorite]


I finally bought that book -- it came in the mail last week.

;
posted by victory_laser at 10:29 PM on October 12, 2011


Good grief. In one week we lose basically both the guy who popularized the idea of the home computer and the guy who made it possible in the first place.

More or less. Don't yell at me for oversimplification.
posted by DoctorFedora at 10:31 PM on October 12, 2011 [4 favorites]


Fuck. I owe him so much. We all do.

;
posted by kmz at 10:33 PM on October 12, 2011


;
posted by equalpants at 10:35 PM on October 12, 2011


;
posted by PueExMachina at 11:00 PM on October 12, 2011


I reread the K&R Chapter "Pointers and Arrays" so many times. I always tried to live up to this statement from the opening section: "Pointers have been lumped with the goto statement as a marvelous way to create impossible-to-understand programs. This is certainly true when they are used carelessly, and it is easy to create pointers that point somewhere unexpected. With discipline, however, pointers can also be used to achieve clarity and simplicity."
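
The example that passage always brings to mind is K&R's own pointer version of string copy from that chapter -- terse, but once pointers click it reads as exactly what it does (the function and parameter names here are mine, not the book's):

/* Copy the string at src to dst, K&R pointer-style: the loop body is the
   whole algorithm -- copy a character, advance both pointers, stop after
   copying the terminating '\0'. */
void string_copy(char *dst, const char *src)
{
    while ((*dst++ = *src++) != '\0')
        ;
}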
posted by girlhacker at 11:08 PM on October 12, 2011 [1 favorite]


One of the guys who actually changed the world we live in.
posted by rodgerd at 11:12 PM on October 12, 2011 [1 favorite]


return;
posted by dragstroke at 11:28 PM on October 12, 2011 [1 favorite]


.
posted by amorphatist at 11:33 PM on October 12, 2011


0["."]
posted by fleacircus at 11:38 PM on October 12, 2011 [4 favorites]


return ENOENT;
posted by monotreme at 11:44 PM on October 12, 2011


void main()
{
   sleep(1); // moment of silence
}

posted by secret about box at 11:50 PM on October 12, 2011 [1 favorite]


#include <stdio.h>

int main ()
{
  FILE *pMetaFilter=fopen("http://www.metafilter.com/108340/Dennis-Ritchie-has-died","wt");
  putc ('.' , pMetaFilter);
  fclose(pMetaFilter);
  return 0;
}

posted by DreamerFi at 11:56 PM on October 12, 2011


> computer science is such a young field that almost all of its pioneers are still alive

Lovelace & Babbage? Hilbert? Turing? Eckert, Mauchly, Atanasoff, Berry & Hopper?

If you count the mathematical basis, it's not so young, but the number of people involved wasn't more than a trickle until recently, so "almost all" is probably right.

> One shudders to think of a world in which virtually all operating systems were derived from or inspired by something like OS/360.

Or VMS (WNT)?

Where I went to school, most people used terminals connected to VAXen running VMS. The Math department got some Sun 3s which the VAX operators and programmers laughed at as insecure toys. The ascendance of Unix was far from certain even in the late 1990s.

.
posted by morganw at 11:59 PM on October 12, 2011


.
posted by tykky at 12:09 AM on October 13, 2011


;
posted by brennen at 12:12 AM on October 13, 2011


Unix was pretty pervasive way before the 90s. You're right though, for some reason VMS shell accounts were all the rage at colleges back then, at least for general student use. My dad used Unix systems for his real work as a grad student at UT, but the university also gave him a general shell account on the central VAX system. At math camp (SWT), the lab had Solaris, AIX, and A/UX(!). But SWT gave us VMS accounts for email, etc. By the time I actually got to UT I think you could choose either a Unix or VMS shell.
posted by kmz at 12:15 AM on October 13, 2011


;
posted by quazichimp at 12:20 AM on October 13, 2011


I heard this news from a reshare of Rob Pike's G+ a few minutes after he posted it and my first reaction was to go to the New York Times and see if they'd posted one of those "breaking news" rectangles at the top of the page. I reloaded a couple of times and then got to thinking: is it possible the world doesn't regard Dennis Ritchie's death as being approximately as big a deal as, say, Sarah Palin announcing she won't run for president?

I guess what this really tells you is that I live in a very, very weird version of the world. And that weird version of the world owes dmr a HUGE debt of gratitude.

; indeed.
posted by troublesome at 12:24 AM on October 13, 2011 [2 favorites]


Hey, now, ain't nothing wrong with VMS. It's not real fun to use, but it's hard to speak too much ill of an OS you can get ten years of uptime on.
posted by vorfeed at 12:25 AM on October 13, 2011 [1 favorite]


I think part of the reason there hasn't been that much coverage elsewhere yet (and I've been looking too) is that there hasn't been an official statement other than a G+ post from a friend.

And I actually enjoyed my time with VMS. Before I knew about ytalk, phone was fantastic for multiparty chats with math camp buddies. And whatever obtuse newsreader was installed was my first exposure to Usenet. And then I figured out how to extract binaries from certain groups too. Yes, definitely some fond memories.
posted by kmz at 12:38 AM on October 13, 2011


Yeah, I had a VMS account in college, had a unix shell at PANIX though.

I just finished an epic 20-hour, right-before-the-deadline refactor; I was going to pull out Lions' Commentary on Unix with Ritchie's foreword, but I am mad at computers right now.
posted by Ad hominem at 12:46 AM on October 13, 2011


Remember PrimeOS?
posted by Ad hominem at 12:49 AM on October 13, 2011


I remember Primus.
posted by dirigibleman at 1:11 AM on October 13, 2011


Steve Jobs dies: meh. Dennis Ritchie dies: the nation should be in mourning.
posted by Ted Walther at 1:18 AM on October 13, 2011 [7 favorites]


}
posted by memebake at 1:20 AM on October 13, 2011


;
posted by supercoiled at 1:22 AM on October 13, 2011


I wish more programming books could get to the point as quickly and well as K&R does.

As Xoc points out, C will be influencing our programming languages for a long time to come. Not just for the syntax either. Even if you use a different programming language, there's a very good chance your compiler/interpreter is written in C. Thankfully it is a simple, beautiful language.

Thanks, Dr. Ritchie.
posted by Gary at 1:23 AM on October 13, 2011 [1 favorite]


.
posted by Smart Dalek at 1:26 AM on October 13, 2011


The Wikipedia page on him is disappointingly brief and subpar. I hope someone updates it. E.g.:

The C language is still widely used today [...] Unix has also been influential ...

Influential? Well, Linux, a flavour of UNIX, is used overwhelmingly in supercomputers, servers, mobile devices and embedded systems. There's almost certainly more devices running a UNIX variant than any other type of operating system. MacOS and iOS are built on BSD, another flavour of Linux. Incidentally all these OSes are compiled in C. I'd say that's rather more than influential - It's about as current as it gets.

??>
posted by iotic at 1:44 AM on October 13, 2011


wq
posted by flabdablet at 1:58 AM on October 13, 2011


C is the most important programming language of all time. Whilst newer languages like Java and the .net group claim platform independence, this is nothing compared to the platform independence of C which forms the basis of pretty much every operating system on every device on the planet.

C is the foundation of all that's important in computing - if you want to write an operating system or other programming language, the tool of choice (and in fact, pretty much the only choice you have) is C. The reason why I think that most languages have a c-like syntax is less to do with the greatness of C syntax, and more to do with the fact that the people writing the language are actually writing it in C behind the scenes anyway - it is less of a mind-shift for the programmers to jump from one language to another. Whilst it's certainly not the nicest language to work with, it is fundamentally the most important and most influential of all time. What's slightly scary is that C is likely to be around for a lot longer than any of the languages it has spawned. Fifty years from now C will still exist, but Java, .Net will probably have been long since consigned to the dustbin of history.

So my hat goes off to Dennis Ritchie for creating the foundations for modern computing - he will be sadly missed.


Dennis Ritchie: "So fsck was originally called something else"
Question: "What was it called?"
Dennis Ritchie: Well, the second letter was different.


Q&A at Usenix

posted by BigCalm at 2:36 AM on October 13, 2011 [6 favorites]


if you want to write an operating system or other programming language, the tool of choice (and in fact, pretty much the only choice you have) is C.

because worse really is better.
posted by flabdablet at 3:07 AM on October 13, 2011 [1 favorite]


People who mourn Jobs are the people that think the hardware is what matters. I mourn Ritchie's passing because I think it's the software. I mean, fucking hell. UNIX and C. Those five letters are basically the whole world of computing.
posted by Civil_Disobedient at 3:14 AM on October 13, 2011


I worked at Bell Labs in the 1980's when I was in high school for Max Mathews (who passed away earlier this year). The Unix room was down the hall from my office and I spent countless hours after work hacking on machinery in there. It was where I learned C, played rogue, played with Blits. It was also where I learned the effects of "stty 0 > /dev/ttyfoo" (where foo was a number of the tty attached to a terminal session). I used that to log out Dennis one night and feared the wrath of God from then on. All of that was tied together with C. All of it.

I have a number of entertaining stories about the Unix room, but oddly enough, few of them about Dennis. He was the quiet one. The one you had to keep an eye on.
posted by plinth at 3:23 AM on October 13, 2011 [4 favorites]


C_D: people who think it's the hardware don't understand why Apple became what it was.

As to dmr, there are damn few who deserve the title "wizard." He's one of them. He quietly changed the face of computing, his work still resounds today, and then he went back to work to do more things.

.
posted by eriko at 3:49 AM on October 13, 2011


.
posted by humanfont at 4:00 AM on October 13, 2011


One of the first books I read on programming that really made a difference in my thinking was The C Programming Language by K&R.

I'm typing this on a computer running a UNIX-descended OS, and in a web browser written in C++.

It isn't exaggerating to say that without him the world of computing would look very different.

;
posted by sotonohito at 4:06 AM on October 13, 2011


sleep
posted by CautionToTheWind at 4:37 AM on October 13, 2011


BigCalm
Dennis Ritchie: "So fsck was originally called something else"
Question: "What was it called?"
Dennis Ritchie: Well, the second letter was different.
Wow, so not only did the man look exactly like Red Green; he also had Red Green's jokes!
posted by mkb at 4:47 AM on October 13, 2011 [3 favorites]


My copy of K&R hasn't been opened in years, but I can still see it from where I sit every day. It's soothing just to have it available, you know, just in case.
posted by jacquilynne at 4:53 AM on October 13, 2011 [1 favorite]


;
posted by whuppy at 4:58 AM on October 13, 2011


;
posted by purephase at 5:01 AM on October 13, 2011


MacOS and iOS are built on BSD, another flavour of Linux.

Is everybody in the BSD community OK? I don't see a raging flamewar and I'm a little worried.
posted by Dr Dracator at 5:01 AM on October 13, 2011 [14 favorites]


/*
* If the new process paused because it was
* swapped out, set the stack level to the last call
* to savu(u_ssav). This means that the return
* actually returns from the last routine which did
* the savu.
*
* You are not expected to understand this.
*/
posted by Obscure Reference at 5:24 AM on October 13, 2011 [6 favorites]


;
posted by motty at 5:30 AM on October 13, 2011


I hated C when I was first learning it. Then I used other languages (awk, C++) and am longing to return to it. As annoying as it is sometimes, it is always annoying in the same *ways*. I mean, I hated stuff like having to declare all my variables at the top, but then when I started writing my own programs I realized I always knew where to look to find out what a variable did, and I don't mind it so much.

I still wish they'd picked another symbol for pointers. @ or # or ~ or something that isn't already used for multiplication.
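
The overloading in one line, for anyone who hasn't hit it yet (a trivial made-up function):

/* '*' doing double duty: multiplication and pointer dereference
   in the same expression. */
int scale_by(int a, const int *p)
{
    return a * *p;
}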
posted by Canageek at 6:40 AM on October 13, 2011


.
posted by doteatop at 6:50 AM on October 13, 2011


#include <stdio.h>

int main(void)
{
printf(".\n");
return 0;
}

posted by fings at 6:57 AM on October 13, 2011


{}
posted by Zed at 7:16 AM on October 13, 2011


People who mourn Jobs are the people that think the hardware is what matters. I mourn Ritchie's passing because I think it's the software.

It's both.

Incidentally, Jobs spent the later part of his first stint with Apple trying to move the Mac to a UNIX base (as part of the BigMac project.) He then went on to found NeXT (which used UNIX) and finally ended up folding the Mac GUI into that when Apple acquired NeXT.

Mac programming these days is all about C (the Objective flavor.)
posted by ChurchHatesTucker at 7:16 AM on October 13, 2011 [4 favorites]


.
posted by heatvision at 7:22 AM on October 13, 2011


rmuser dmr
posted by hal incandenza at 7:29 AM on October 13, 2011


.
posted by Scoo at 7:33 AM on October 13, 2011


;

I really hope that in another week or so we're not mourning one of the other early giants of the computing industry, things happening in threes and all...
posted by togdon at 7:34 AM on October 13, 2011


Or VMS?

Nah, VMS came years after Unix was first released and was likely influenced by it (Ken Thompson consulted on its development). Further, as best I can tell a decent chunk of VMS was written in C. I think OS/360 is the only OS that predates Unix that still has any traction, which goes to show how fundamental Ritchie's (and Kernighan's and Thompson's) work is.
posted by jedicus at 7:36 AM on October 13, 2011


Were there any justice in the world, this would receive at least as much press coverage as any other recent death. C is, by far, the most widely-used programming language in the entire world. Your operating system is written in C. Your monitor's firmware was written in C. Your car's firmware was written in C (well, mostly). Your phone's firmware was written in C.

If I were being completely honest, I'd have to say that Ritchie has indirectly influenced my life more than any other non-family person. We all generally want to point to the great humanitarian or philosopher, but the fact is that I wouldn't think how I do, I wouldn't work how I do, and I wouldn't strive how I do were it not for my early exposure to C. The engineering art of its concepts and the manner of its implementation leads one down the path of understanding how physical hardware really works. It started me on a journey that, 23 years later, I'm still walking.




return 0; /* success */
posted by introp at 7:48 AM on October 13, 2011 [8 favorites]


I wasn't moved by Steve Jobs' death: I was in tears for Dennis Ritchie. By writing C, and then moving Unix into it, he liberated software from the tyranny of hardware, and hardware from the tyranny of software. A more fundamental step in freeing technology, from which so many good things happened, is hard to imagine.

None of this stuff around, behind and underneath these words, would have happened without him.

Also, he seemed to have led a happy and fulfilling life, and gained the respect and admiration of all who knew him (and a few million others).

Well played, sir. Well played.
posted by Devonian at 8:27 AM on October 13, 2011 [5 favorites]


;
posted by benign at 8:48 AM on October 13, 2011


@togdon As others have observed we're likely to see more in the next few years. Computing is still new enough that many/most of the pioneers are still alive, but they are reaching the end of their life expectancy.

Donald Knuth is 73, for example, and we may not see Volume 4b finished, much less any future volumes.

It's unfortunate, but I think we'll be seeing more of these announcements as time passes.
posted by sotonohito at 8:54 AM on October 13, 2011


*
posted by pretzel at 9:26 AM on October 13, 2011


.

C is still easily in my top three favorite programming languages. I don't know about anyone else, but once I finally got pointers, I felt so incredibly smart and yet humbled, all at the same time. My other two favorite languages are memory-managed, which is all fine and good for 99% of my needs writing programs for the desktop or web in this day and age. But there is something about the speed and efficiency of C, and for microcontroller programming, I don't think there is anything that even compares.
posted by mysterpigg at 9:29 AM on October 13, 2011 [1 favorite]


@sotonohito Don't tell me that, I just discovered LaTeX this summer and have been blown away by it, and was highly amused by Knuth's sense of humour when I looked him up.

Sadly, those of us in the tabletop RPG community are reaching this point, with the passing of Gary Gygax, Dave Arneson and a number of others in the last couple of years, as tabletop RPGs' early days happened just a bit before the computer revolution. It...isn't a happy time, realizing that the men and women who created things you've poured untold hours of your life into are passing.
posted by Canageek at 9:52 AM on October 13, 2011 [3 favorites]


.
posted by paulg at 9:52 AM on October 13, 2011


For those who really want to appreciate UNIX, find a copy of "The UNIX Hater's Handbook".

At any rate, RIP. He was a genius who was in the right place at the right time, and his legacy will no doubt still be felt when I pass away.
posted by GuyZero at 9:59 AM on October 13, 2011


}
posted by Artw at 10:00 AM on October 13, 2011


I think we should put him on the $16 dollar bill.

Er, I mean the $0x10 dollar bill.
posted by benito.strauss at 10:05 AM on October 13, 2011 [2 favorites]


I think OS/360 is the only OS that predates Unix that still has any traction

Do you want to come to my TOPS-10 user group?
posted by GuyZero at 10:09 AM on October 13, 2011 [2 favorites]




;
posted by fremen at 10:22 AM on October 13, 2011


for microcontroller programming, I don't think there is anything that even compares.

I was once called in as a contractor to fix a problem that the lead system engineer, who apparently believed what you just said, was finding completely intractable. The CPU in question was a 68000, and the task at hand was making it talk to a Z8530 serial communications controller at 230 kbits/second. That's a byte every 35 microseconds, and the hardware provided no DMA support.

The hardware engineer knew this would work, because the Z8530 was the same serial controller that Apple used in the Macintosh, the Mac ran AppleTalk at 230 kbits/s without DMA, and his 68000 was running at 12MHz compared to the Mac's 8MHz.

But the rest of the system was designed in such a way that the interrupt service latency caused by masking interrupts for the duration of a network packet was unacceptable; network traffic absolutely had to be done on an interrupt-per-byte basis, with only the Z8530's 3-byte FIFO to help. There was no way on God's green earth that the code produced by the C compilers of the day was ever going to be up to that job.

In fact the carefully handcrafted 68000 assembly language ISR I wrote for them ran to completion in a quarter of the time taken by the standard ISR preamble code generated by the compiler. Even so, it ended up consuming about half the available CPU time during network packet transfers. But it did meet the latency requirements for the rest of the software without causing network packet losses.

Compilers are better these days, and CPU architectures complex enough that compiler output now often does provide better overall throughput than handcrafted assembly code. But I think there will always be a role for the assembly language programmer around the edges of the place where ambitious system design meets slightly under-specified hardware.

For those who really want to appreciate UNIX, find a copy of "The UNIX Hater's Handbook"

It's online, including the anti-foreword by dmr.
posted by flabdablet at 10:35 AM on October 13, 2011 [7 favorites]


.
posted by BrotherCaine at 10:35 AM on October 13, 2011


;
posted by wayland at 11:30 AM on October 13, 2011


Oh man, I feel such gut-wrenching irrational anxiety right now. It's a cross between Dennis Ritchie having just died and this whole page being covered in eels and kraken.
posted by wayland at 11:32 AM on October 13, 2011 [1 favorite]


gurus--
posted by azlondon at 11:57 AM on October 13, 2011


flabdablet: still, I'm guessing that 90%+ of the project was coded in C and not assembler (even if 50% of the time was spent in those dozen or two lines of assembler :-P)—that's just effective (micro)optimization right there.
posted by jepler at 12:01 PM on October 13, 2011


flabdablet, Oh, don't get me wrong - I agree that assembly certainly has its place, even today. I have found the occasional need to optimize further or step through compiler-produced assembly to get where I needed to. But just as memory-managed solutions are fine for me 99% of the time, I have found C to be useful in 99% of the remaining cases. And of the remaining remaining cases, Assembly. Then zeros and ones. Then, turtles. All the way down...
posted by mysterpigg at 12:03 PM on October 13, 2011 [1 favorite]


0
posted by vogon_poet at 12:34 PM on October 13, 2011


He developed Unix AND C?! Amazing person.

More seriously: computer science is such a young field that almost all of its pioneers are still alive. It's strange to think that someday that won't be the case.

I was just thinking about this with the passing of Jobs. In terms of legacy, does computer science/engineering have any big awards like the Nobel Prizes in other fields or like the Fields Medal or Wolf Prize in mathematics? I don't hear so much about such accolades in CS (at least they aren't used as bragging rights as much as the others are).
posted by bluefly at 12:38 PM on October 13, 2011 [1 favorite]


Another demigod gone. But his work lives on. I'm typing this on a Linux system that carries the genes of that Spacewar-playing PDP from so long ago...

;
posted by bitmage at 12:40 PM on October 13, 2011


.
posted by NordyneDefenceDynamics at 12:51 PM on October 13, 2011


gsteff: "What's amazing isn't that Unix isn't still being used today, it's that it will still be used 50 years from now."

Well, I can tell you for certain that Ritchie and Thompson weren't expecting it to be used much past 2038.
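
(For anyone who hasn't met the 2038 reference: Unix time is seconds since 1970-01-01, and a signed 32-bit time_t tops out at 03:14:07 UTC on 19 January 2038. A quick check -- this just prints that last representable second, and assumes nothing beyond the standard library:)

#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t t = 0x7fffffff;   /* largest value a signed 32-bit time_t can hold */
    printf("%s", ctime(&t)); /* prints that moment in local time; it is
                                03:14:07 UTC on Tuesday 19 January 2038 */
    return 0;
}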
posted by schmod at 12:57 PM on October 13, 2011 [2 favorites]


.
posted by Axle at 1:07 PM on October 13, 2011


I can tell you for certain that Ritchie and Thompson weren't expecting it to be used much past 2038.

That was another amazing thing about Ritchie and crew. I never got the impression they set out to change the world. They just wanted to create an awesome development environment for themselves and have a place to play computer games. Now a descendant of this system runs on millions of computers and mobile phones in people's pockets, 42 years later, and there's no sign of it going away anytime soon.
posted by grouse at 1:14 PM on October 13, 2011




.
posted by Crabby Appleton at 1:20 PM on October 13, 2011


In terms of legacy, does computer science/engineering have any big awards like the Nobel Prizes in other fields or like the Field's medal or Wolf prize in mathematics?

Yes, The Turing Award. Ritchie won it in 1983.
posted by alopez at 1:36 PM on October 13, 2011 [2 favorites]


That means he could imitate a human.
posted by Artw at 1:37 PM on October 13, 2011 [2 favorites]


He was the first UNIX developer to successfully imitate a human and one of the few ever. ESR and RMS still can't do it.
posted by GuyZero at 1:44 PM on October 13, 2011 [5 favorites]


He was the first UNIX developer to successfully imitate a human and one of the few ever. ESR and RMS still can't do it.

I'm a huge Stallman fan, but even I'll admit that Eliza is often more convincing.

*types M-x doctor in another window, sighs with contentment*
posted by vorfeed at 1:51 PM on October 13, 2011 [2 favorites]


He was the first UNIX shell script to successfully emulate a human.
posted by benzenedream at 1:53 PM on October 13, 2011


Only the true grognards are known more by a three-initial email alias than by their names.
posted by GuyZero at 1:56 PM on October 13, 2011


bluefly:

There's the Turing Award. Ritchie shared it with Ken Thompson in '83.
posted by Sand Reckoner at 2:08 PM on October 13, 2011


C is still easily in my top three favorite programming languages.

YES. I always get weird looks when I tell people this. To me there's no other language that offers you the joy of building something brick by brick while still getting to see and touch each one of them and turn it around and understand it (unlike VM languages or interpreted languages or C++) alongside the ability to keep the bigger picture in your head with ease while you do it (unlike ASM). Even as I've begun the ascent up the long hard hill of functional programming geekery, I still hold C dear and I think I always will. Godspeed, Dennis Ritchie.
posted by invitapriore at 2:08 PM on October 13, 2011 [2 favorites]


return(muchRespect);
posted by samsara at 2:23 PM on October 13, 2011 [2 favorites]


.
posted by jpziller at 2:30 PM on October 13, 2011


There are many versions of this floating around, but this one seems closest to what I remember from my days of learning C:

By: Kriston J. Rehberg
Original: "Let It Be" (Beatles)

Write in C

When I find my code in tons of trouble,
Friends and colleagues come to me,
Speaking words of wisdom:
"Write in C."

As the deadline fast approaches,
And bugs are all that I can see,
Somewhere, someone whispers:
"Write in C."

    Write in C, write in C,
    Write in C, oh, write in C.
    LISP is dead and buried,
    Write in C.

I used to write a lot of FORTRAN,
For science it worked flawlessly.
Try using it for graphics!
Write in C.

If you've just spent nearly 30 hours
Debugging some assembly,
Soon you will be glad to
Write in C.

    Write in C, write in C,
    Write in C, yeah, write in C.
    Only wimps use BASIC.
    Write in C.

    Write in C, write in C,
    Write in C, oh, write in C.
    Pascal won't quite cut it.
    Write in C.

{
  guitar solo
}

    Write in C, write in C,
    Write in C, yeah, write in C.
    Don't even mention COBOL.
    Write in C.

And when the screen is fuzzy,
And the editor is bugging me.
I'm sick of ones and zeros,
Write in C.

A thousand people swear that T.P.
Seven is the one for me.
I hate the word PROCEDURE,
Write in C.

    Write in C, write in C,
    Write in C, yeah, write in C.
    PL1 is '80s,
    Write in C.

    Write in C, write in C,
    Write in C, yeah, write in C.
    The government loves Ada,
    Write in C.

posted by hattifattener at 2:37 PM on October 13, 2011 [5 favorites]


.
posted by brundlefly at 2:48 PM on October 13, 2011


.
posted by UseyurBrain at 3:36 PM on October 13, 2011


;
posted by klausness at 4:12 PM on October 13, 2011


;
posted by kenko at 5:11 PM on October 13, 2011


I'm guessing that 90%+ of the project was coded in C and not assembler

More like 99.99%. The lead engineer had a zealot's belief in the inherent evil of assembler and had issued a blanket edict that the entire project must be coded in C; if I recall correctly, he had been attempting to make the network transport stuff work for a couple months before letting the hardware guy call in a mate to make it go.

It's pretty self-evident that C is indeed the dominant compiled language for microcontroller applications. It's usually the first and sometimes the only compiler available from the hardware vendors.

Even so, there are lots of low-end microcontroller families still in common use for which C is a very poor fit, simply because their architectures were never designed with it in mind. The idea of using C code on a 6502 or 6805 or 8051 or PIC makes me itch, though people frequently do. For wedging large amounts of code onto one of those, it's quite hard to do better than FORTH.
posted by flabdablet at 5:40 PM on October 13, 2011 [1 favorite]


#include <stdio.h>

void die(void) {printf("So long and thanks for all the fish!\n");exit();}

for(i=lifetime;i>0;i--) { printf("And you run and you run to catch up with the sun, but it's sinking\nRacing around to come up behind you again\nThe sun is the same in a relative way, but you're older\nShorter of breath and one day closer to death\n); } die()

(I think that should compile, unless I've made a typo; it's the first C I've written in some years, though)
posted by wierdo at 5:57 PM on October 13, 2011 [1 favorite]


Not to derail, but maybe Mr. Ritchie wouldn't mind: yeah, anything Harvard architecture is clunky to program in C unless they've added hardware with it in mind. All semi-recent PICs (18F+) have a host of indirect addressing registers, which lets compilers generate some very tightly-optimized C code for them.
posted by introp at 6:03 PM on October 13, 2011


And I realized as I was adding RAM to my laptop just after my previous post that I completely forgot to wrap the for loop in a main function. Clearly I've been doing too much Python and Perl lately.

Version 2:

#include

void die(void) {printf("So long and thanks for all the fish!\n");exit();}

void main(void){for(i=lifetime;i>0;i--) { printf("And you run and you run to catch up with the sun, but it's sinking\nRacing around to come up behind you again\nThe sun is the same in a relative way, but you're older\nShorter of breath and one day closer to death\n); } die();}

posted by wierdo at 6:27 PM on October 13, 2011


hades wrote: "void main"? You _are_ a weirdo.

Return values aren't always necessary. ;)
posted by wierdo at 6:57 PM on October 13, 2011


;
posted by jquinby at 7:19 PM on October 13, 2011


As for "void main", K&R left out the return type entirely.

You didn't declare i. You didn't declare lifetime or give it an initial value. You are also missing closing quotes. The exit() isn't exactly wrong, but is completely unnecessary. (We'll ignore what's going on with #include because Metafilter probably ate it).
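
For the record, a version with those fixes folded in that should actually compile and run -- the value of lifetime is arbitrary (dmr's roughly 25,580 days, borrowed from sbutler's post above):

#include <stdio.h>
#include <stdlib.h>

static void die(void)
{
    printf("So long and thanks for all the fish!\n");
    exit(0);
}

int main(void)
{
    int lifetime = 25580;   /* arbitrary stand-in value, in days */
    for (int i = lifetime; i > 0; i--) {
        printf("And you run and you run to catch up with the sun, but it's sinking\n"
               "Racing around to come up behind you again\n"
               "The sun is the same in a relative way, but you're older\n"
               "Shorter of breath and one day closer to death\n");
    }
    die();
}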
posted by Gary at 7:37 PM on October 13, 2011 [3 favorites]


Of course, I get the link wrong. Stupid Skitt's Law.
posted by Gary at 7:47 PM on October 13, 2011


It seems Brian Raiter has won the touching C programs contest. <sniffle>
posted by jeffburdges at 8:53 PM on October 13, 2011 [2 favorites]


There are some insecure design and implementation choices in the original C compiler and its descendants that will continue to cause trouble for decades if not centuries. One of those is the decision to make array base addresses work the same way as pointers, abandoning any attempt at bounds checking. This wouldn't have been so bad had the first C compiler not also maintained the tradition of combining data and flow control information on a single stack: a pattern not unique to C by any means, but any new language (particularly one built as part of an entirely new OS) has the opportunity to discard that particular piece of traditional baggage, and C didn't.

By the time C was designed, it had become traditional for processors that provided hardware-managed stack pointer registers to put an empty stack at the highest memory address, and grow the stack "downward" in memory (perhaps because if you're looking at a core dump, the stack can be seen to grow upward on the page). The C compiler, like Algol and Pascal and B and BCPL and all the other recursion-supporting languages before it, allocates space for local variables inside a stack frame on the same stack that the processor uses to push subroutine return addresses. These two facts conspire to produce the outcome that if writes to a locally allocated variable can be persuaded to overflow it (which C's complete lack of bounds checking makes particularly easy) they will overwrite one or more subroutine return addresses, and cause the subroutine to "return" to somewhere quite arbitrary. This is the notorious "buffer overflow" problem, and it's become the single most-exploited method for malware to get a toehold.

What could easily have been done instead is to keep a second stack for local data, much as FORTH does; it could even be in a completely separate memory segment from the one the processor uses for subroutine return addresses. Such a local data stack should be empty at the beginning of its memory segment and grow toward high memory. That way, a buffer overflow would at worst trash only other local variables within its own stack frame before extending into completely unused space, and it would not be possible to use such an overflow to pervert a subroutine's return address. And if it were normal practice for the compiler to allocate variables of function pointer type before other variables in any stack frame and char[] variables last, it would not be possible for a typical buffer overflow to pervert control flow in any way whatsoever.

Only on a smallish class of CISC processors featuring inbuilt instructions to build and tear down stack frames at procedure entry and exit would this dual-stack scheme be less code- and time-efficient than the single-stack scheme used by (to my knowledge) every C compiler since the original, and then only by an insignificant amount. It could easily have been done on the PDP-11 where the first C compiler ran, and should have been done then, and should still be done now on modern processors IMHO; as far as I can tell, it's purely a matter of ABI tradition that will keep us all vulnerable to buffer overflows for the foreseeable future.
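
The canonical shape of the bug, for anyone who hasn't seen it spelled out (deliberately unsafe toy code, not from any real program):

#include <string.h>

/* buf lives in this function's stack frame, directly "below" the saved
   return address on a conventional downward-growing stack.  strcpy()
   does no bounds checking, so input longer than 15 characters runs off
   the end of buf and overwrites whatever sits above it -- eventually
   including the saved return address. */
void greet(const char *name)
{
    char buf[16];
    strcpy(buf, name);
}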
posted by flabdablet at 10:13 PM on October 13, 2011 [1 favorite]


You'll sacrifice one register doing that, flabdablet, which sounds reasonable, but represents a significant cost. You could instead simply grow the stack upwards, which costs nothing. In fact, overflows cannot even corrupt a calling routine's data that way!

You need some special handling for function pointers in either case, of course: allocate them either on the other stack or before all arrays in each stack frame. I dunno if it's reasonable for kernel threads' stacks to grow upwards while application stacks grow downwards.
posted by jeffburdges at 10:31 PM on October 13, 2011 [1 favorite]


Were the solution that easy, you'd see it implemented in new processor families and gcc/etc. modified to support it. Maybe it would take off, maybe it wouldn't and people would just use SP2 as another GP address register. But no one has done it. And that's because it would be terribly expensive.

(Also, doing it on the PDP-11 or any other processor from that era would've been painful. It only had a single register with dedicated SP mode hardware. You'd have needed to add a "negative index" addressing mode, etc. Processor time cost big money on a PDP-11, too, so you'd sell your grandmother to the devil to save a dozen clocks per function call.)
posted by introp at 10:44 PM on October 13, 2011


You'll sacrifice one register doing that, flabdablet, which sounds reasonable, but represents a significant cost.

Most compilers of the time used separate frame pointer and stack pointer registers anyway; keeping data frames and control stack separate would have cost literally nothing. Modern compilers offer a no-frame-register mode as an optimization, but it's really only on the register-poor x86 architecture that the extra register is costly enough to notice - it would make bugger-all difference on amd64 or ARM.

You could instead simply grow the stack upwards, which costs nothing.

No, on many architectures it costs you the ability to use CPU-provided call and return instructions for call and return operations. On RISC machines that save the return address in a register rather than on a CPU-managed stack, you could indeed use a single upward-growing stack at no cost.
posted by flabdablet at 11:34 PM on October 13, 2011


Gary wrote: As for "void main", K&R left out the return type entirely.

You didn't declare i. You didn't declare lifetime or give it an initial value. You are also missing closing quotes. The exit() isn't exactly wrong, but is completely unnecessary. (We'll ignore what's going on with #include because Metafilter probably ate it).


Mefi apparently munged more than I initially noticed (or I fail at cut and paste). I actually compiled it after posting. ;)
posted by wierdo at 11:56 PM on October 13, 2011


PDP-11 ... only had a single register with dedicated SP mode hardware

Not sure what you're referring to here. The PDP-11 instruction set does include JSR and RTS instructions that make implicit use of -(R6) and (R6)+ address modes and force the use of a downward-growing stack for calls and returns, but all registers have the same set of addressing modes available. In particular, there are no dedicated push and pop instructions for the control stack, just MOV source,-(R6) and MOV (R6)+,destn respectively. You can do the same thing with any other register to implement a downward-growing stack, or use MOV source,(Rn)+ and MOV -(Rn),destn for an upward-growing one.

Anyway, as the technical report makes clear, they simply missed the fact that with a downward-growing stack that also carries control-flow information, buffer overflows can do terrible, terrible things.
posted by flabdablet at 1:08 AM on October 14, 2011


There isn't any improvement achieved by growing upwards because called routines regularly access buffers created by calling routines.
posted by jeffburdges at 1:27 AM on October 14, 2011


Exactly right. Which is why the automatic-variables/parameters stack that contains things that could potentially overflow needs to be separate from the one that contains the return addresses.

Passing the address of an auto variable to a function that may itself create auto variables containing function pointers or ints ultimately used as indexes to arrays of function pointers, or that calls other functions that do either of those things, would still be a potentially unsafe thing to do. But this is a considerable improvement over present-day standard practice, which involves potential disaster following any overflow of any auto variable in any function.
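
A sketch of that residual hazard, reusing the hand-rolled data-stack idea from the earlier sketch (all names, sizes, and the push/pop helpers are invented): a callee that overflows a caller-supplied buffer can still walk upward into later data-stack allocations, such as a function pointer in its own frame.

#include <stddef.h>
#include <stdio.h>
#include <string.h>

static _Alignas(max_align_t) char data_stack[4096];  /* separate, upward-growing */
static size_t data_sp = 0;

static void *push(size_t n) { void *p = &data_stack[data_sp]; data_sp += n; return p; }
static void pop(size_t n) { data_sp -= n; }

static void hello(void) { puts("hello"); }

static void callee(char *caller_buf, const char *input)
{
    void (**handler)(void) = push(sizeof *handler);  /* callee's frame holds a
                                                        function pointer */
    *handler = hello;
    strcpy(caller_buf, input);   /* input longer than the caller's 16-byte buffer
                                    walks upward into *handler... */
    (*handler)();                /* ...so control flow can still be perverted here */
    pop(sizeof *handler);
}

static void caller(const char *input)
{
    char *buf = push(16);        /* caller's frame: a small buffer */
    callee(buf, input);
    pop(16);
}

int main(void)
{
    caller("short and safe");    /* fits in 16 bytes, so this run is well-behaved */
    return 0;
}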
posted by flabdablet at 2:22 AM on October 14, 2011


Rob Pike has posted a followup to his first G+ announcement.
posted by pharm at 3:41 AM on October 14, 2011


Sorry, that should link here.
posted by pharm at 3:43 AM on October 14, 2011


So people write shitty code and shitty things happen. That's tolerated throughout the industry, and has been for decades.

And the solution is to idiot-proof the tools???
posted by Crabby Appleton at 7:09 AM on October 14, 2011


Great quote from the Rob Pike follow-up that pharm posted:
Unix was the great equalizer, the driving force of the Nerd Spring that liberated programming from the grip of hardware manufacturers.
posted by benito.strauss at 7:22 AM on October 14, 2011


;
posted by Mitheral at 7:39 AM on October 14, 2011


Mefi apparently munged more than I initially noticed (or I fail at cut and paste). I actually compiled it after posting. ;)

codepad is better for that sort of thing.
posted by Gary at 9:47 AM on October 14, 2011 [3 favorites]


;
posted by Wemmick at 10:44 AM on October 14, 2011


Gary wrote: codepad is better for that sort of thing

Now that is a tool of the future.
posted by wierdo at 2:06 PM on October 14, 2011


And the solution is to idiot-proof the tools???

If a stamping machine is known to remove the operator's hands on a regular basis, designing a guard into the next model is a fairly sane response.
posted by flabdablet at 11:07 PM on October 14, 2011 [4 favorites]


;
posted by Skorgu at 8:33 AM on October 15, 2011


Not enough good things can be said of the man. He was a quiet visionary, always approachable, and (I think) slightly bemused by all the hoopla surrounding his creations.

I remember receiving a 9-track tape of 5th Edition unix from him in the 70's, with a handwritten note warning of "a bug in the rk05 driver." He said if you halted the cpu after the initial load started, toggled in this value in that memory location and resumed, you could get it to boot. Fortunately, we had an rm03 disk.

I wish I kept that note.

Years later, working at Bell Labs in his very organization, I meant to tease him about it. Good thing I forgot.

So long, dmr, and thanks for all the code.
posted by skippyhacker at 10:03 AM on October 15, 2011 [7 favorites]


.
posted by jwhite1979 at 6:49 AM on October 17, 2011


;
posted by Hig Hurtenflurst at 9:13 AM on October 17, 2011


Today is Dennis Ritchie Day
posted by jeffburdges at 7:19 AM on October 30, 2011 [1 favorite]


#include <Economist.h>
printf("goodbye, Dennis");
posted by benzenedream at 9:41 AM on October 30, 2011




This thread has been archived and is closed to new comments