A light flickers out
April 2, 2010 10:54 PM   Subscribe

"Apple co-founder Steve Wozniak told CNET he was saddened to learn of Roberts' death. 'He took a critically important step that led to everything we have today,' Wozniak said." Ed Roberts, creator of the Altair 8800, the first personal computer, died Thursday.

Microsoft founders Bill Gates and Paul Allen considered him a mentor, and remarked on his passing.
posted by longsleeves (36 comments total) 8 users marked this as a favorite
 
As a native of Silicon Valley,

.
posted by queensissy at 11:38 PM on April 2, 2010


.
posted by Blue Jello Elf at 11:39 PM on April 2, 2010


.
posted by XMLicious at 11:55 PM on April 2, 2010


.
posted by threetoed at 11:59 PM on April 2, 2010


.
posted by stringbean at 12:10 AM on April 3, 2010


for (;;){puts(".");}
posted by Netzapper at 12:18 AM on April 3, 2010 [2 favorites]


for (;;){puts(".");}

More like

10 PRINT "."
20 GOTO 10

posted by delmoi at 12:38 AM on April 3, 2010 [5 favorites]


You can buy a Z80 kit computer that will boot CP/M and even run S-100 peripherals (the same basic architecture as the Altair, although the Z80 and CP/M were hallmarks of later clones based on the Altair's S-100 design). I keep meaning to buy a kit and build one, but I never get around to it because I know it'll require a large dedication of time. Here's the link

As for the Altair, it wasn't so much ahead of its time as brought to market a few years too early. If the Altair had come out a few years later, after the price of RAM had fallen dramatically (which is what enabled Apple's early success), and if Steve Wozniak had happily kept designing calculators at HP instead of coming up with a computer design of his own, the Altair could easily have been around for much longer. I might even be typing this on some next-gen MITS computer instead of an iMac.
posted by DecemberBoy at 1:06 AM on April 3, 2010 [1 favorite]


However, even in that drastically different alternate future, Microsoft would STILL be the dominant software company for home users, assuming their relationship would continue after Altair BASIC. That sucks.
posted by DecemberBoy at 1:09 AM on April 3, 2010


as someone who learnt BASIC on a Radio Shack TRS-80 in my senior year in the fall of 1982

.
posted by infini at 1:10 AM on April 3, 2010 [1 favorite]


Ed Roberts is profiled in the fascinating documentary, "Triumph of the Nerds," available on YT (parts 2 and 3). More about the Altair 8800 and its counterpart, the Altair 680.

AP reported:
He sold his company in 1977 and retired to a life of vegetable farming in rural Georgia before going to medical school and getting a medical degree from Mercer University in 1986.

Roberts worked as an internist, seeing as many as 30 patients a day, his son said. But he never lost his interest in modern technology, even asking about Apple's highly anticipated iPad from his sick bed.

"He was interested to see one," said Roberts, who called his father "a true renaissance man."

posted by prinado at 1:11 AM on April 3, 2010 [1 favorite]


Also, the reason the S-100 bus was designed the way it was is that Ed Roberts happened to have some 100-pin edge connectors lying around, and that was enough pins to bring out all the signals on the 8080 plus provide power and ground to peripherals. The tech industry was so much more innocent then. The Altair, in fact, got its name because Ed asked his daughter, who was a big Star Trek fan, what planet the Enterprise happened to be visiting in the episode she was watching.
posted by DecemberBoy at 1:22 AM on April 3, 2010 [2 favorites]


for (;;){puts(".");}

You wish! More like on on off on off off on on = D3 = 'OUT'; off off on off on on on off = 2E = '.'; on on off off off off on on = C3 = 'JMP'; off off off off off off off off = 0.

I remember my first experience with a computer, probably in 1976 or 1977, my mom's friend Milton's son had an Altair, and after enough twiddling of the dip switches it could play ADVENT. Good times...
posted by nicwolff at 3:00 AM on April 3, 2010 [6 favorites]
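[The five bytes nicwolff describes toggling in - D3 2E C3 00 00, i.e. OUT 2Eh followed by JMP 0000h - can be sketched as a toy fetch-decode loop. This is a playful simplification, not a faithful Intel 8080 emulation: a real 8080 would send the accumulator to port 2Eh, whereas this sketch just prints the OUT operand byte so the joke (2E is ASCII for '.') is visible.]

```python
# Toy interpreter for nicwolff's toggled-in program:
#   D3 2E     OUT 2Eh    (simplified here: print the operand as ASCII, '.')
#   C3 00 00  JMP 0000h  (little-endian address; loop back to the start)
program = [0xD3, 0x2E, 0xC3, 0x00, 0x00]

def run(program, steps=8):
    """Fetch and decode the tiny program for a fixed number of instructions."""
    out = []
    pc = 0
    for _ in range(steps):
        op = program[pc]
        if op == 0xD3:                              # OUT port
            out.append(chr(program[pc + 1]))
            pc += 2
        elif op == 0xC3:                            # JMP addr, low byte first
            pc = program[pc + 1] | (program[pc + 2] << 8)
        else:                                       # anything else: halt
            break
    return "".join(out)

print(run(program))  # eight instructions alternate OUT/JMP, printing "...."
```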


This guy is the main reason we enjoy cheap, abundant hardware today; the idea that you could have an open bus standard and mix and match components as you see fit from an open market of competing vendors may or may not have been the intention, but it's why we got what we have.
posted by rodgerd at 3:17 AM on April 3, 2010 [2 favorites]


.
posted by DreamerFi at 3:47 AM on April 3, 2010


.
posted by Skorgu at 4:05 AM on April 3, 2010


Thank you!

.
posted by two or three cars parked under the stars at 4:24 AM on April 3, 2010 [1 favorite]


.
posted by hal9k at 4:26 AM on April 3, 2010




.
posted by iviken at 5:04 AM on April 3, 2010


When I was young, I wanted some kind of computer, so badly, after seeing Star Wars for the first time. (In its original release in 1977.)

I didn't have the manual dexterity (yet) to build an Altair. We did get an Apple II. I sometimes wonder if I'd been a bit better at soldering, what would have happened next.

.
posted by mephron at 5:12 AM on April 3, 2010


.
posted by Brandon Blatcher at 5:55 AM on April 3, 2010


It's interesting to have watched the trajectories of the people from this era. Ed Roberts more or less invented the personal computer, and then quit the industry entirely. Steve Wozniak went from hardware hacker supreme to teacher and gadfly. Others quit inventing and became businessmen, or faded into the corporate world as anonymous engineers, or disappeared altogether.

It's as if, having changed the world with what they made, some of them couldn't find a satisfactory place for themselves any more.

I never got to play with an Altair, but it's because of what the Altair started that my world is as good as it is. Thanks, Ed.
posted by ardgedee at 7:28 AM on April 3, 2010


.
posted by rahnefan at 7:56 AM on April 3, 2010


ardgedee, that's not so mysterious; a similar trend seems to play out in the opening of any frontier. Some people don't come to the gold rush just to get rich; they come for the sense that the limits are not known, that you could end up doing anything and there is nobody to stop you.

In 1974, just having access to a computer -- any computer -- was a privilege, and the idea of owning one for yourself was just wicked cool. They weren't really useful for anything, except bragging that you had a computer, but that was more than most people could even fathom. The idea that you could set a machine in motion doing some complex task, and leave it to its own devices for days and it would continue reacting and carrying out your will in your absence, was totally alien. Making such a device answer to your will was something few humans had ever been able to do, and implied an almost godlike power even if it was just flashing an LED.

I remember in those days that "computer" toys would be plastic shams with levers and lights and rolls of printed paper inside them, and "computer" kits from Radio Shack would let you build gates, a flip-flop, or maybe a simple digital counter. I remember reading a few years later that a musical doorbell contained an actual computer, an 1802, and thinking that was just crazy. And it cost something like US$300, in 1978.

Today, of course, computers are actually useful for something, and just as the law eventually moves into the gold-rush town, the new immigrants come looking for standards, miniaturization, all kinds of things you can't really do in your garage and which don't make you elite any more anyway. Yes, you have email and you can post messages to be seen all over the world, but so do all the people who use this power to post pictures of their cats. There's nothing special about it any more. Even if there's still gold to be found, the saloon has stopped selling drinks to minors, the whorehouse has closed, and they're building a police station alongside the new church. Little wonder the people who started it all move on to something else.

Oh, and for Ed: .
posted by localroger at 8:08 AM on April 3, 2010 [8 favorites]


Apparently Bill Gates showed up at his bedside.
posted by Ironmouth at 8:20 AM on April 3, 2010


.
posted by papafrita at 9:14 AM on April 3, 2010


.
posted by brundlefly at 9:26 AM on April 3, 2010


.

For the historical impact of the personal computer and the personal impact of being a country doctor.
posted by jadepearl at 9:40 AM on April 3, 2010


I will always be grateful to our forefathers in computer science, who were interested in computers before they could really do anything interesting.

By the time I arrived on the scene with my shiny new Apple IIc, computers could do all kinds of neat shit -- play video games, make color printouts, call BBSes. To a lonely, brainy 8-year-old, the appeal was obvious. But the original computers were really just advanced calculators; the first PCs were basically just chips and toggle switches. However, a few visionaries saw the potential of these new gadgets and then WORKED THEIR ASSES off to make things like disk drives and color monitors and modern programming languages -- all of the things we take for granted as basic building blocks of the computing experience. Without these men, their hopeless dreams, and their thousands of hours of late night debugging sessions, I would never have had that Apple IIc. I would never have known any of you, either.

Thank you, Ed. You will be missed.
posted by Afroblanco at 10:05 AM on April 3, 2010 [2 favorites]


.
posted by Aquaman at 10:42 AM on April 3, 2010


My first real interaction with a computer was with the Commodore PETs in my school's computer room (I had seen the TRS-80s a few grades earlier, but there were only 4 of them, so anyone who wasn't seen as particularly gifted got close to no time on them). This led to me devouring any magazines that had anything to do with home computing (I remember reading about all sorts of things I couldn't understand and my folks definitely couldn't afford), then bugging my parents until they got me a VIC-20. I was real proud of that crude Missile Command clone I squeezed into the VIC's 5K of memory, and later the terminal emulator I whipped up when I got a modem - those countless magazine articles I read about S-100 kits, Heathkit Z's, and building your own memory expansion boards (actually all this stuff was a few years in the past - by the early 80s, low-end stuff that hooked up to your TV and was ready out of the box was very common) inspired me to learn what I could. After drifting away in high school, I came back to learn my trade on a VAX in college, with memories of those magazine articles helping me choose my major.

I have no doubt that I'm not the only journeyman software developer with a similar story to tell. Ed, you never knew who I was, and never used anything I worked on, but I sure knew what an 8800 was.

Good night, good luck, and rest peacefully,

.
posted by Calloused_Foot at 11:00 AM on April 3, 2010


still have a closet full of late 70s Byte Magazines
posted by Fupped Duck at 12:44 PM on April 3, 2010


Although I think the earliest, most seminal PC design of the 1970s was the 1972 French Micral, built on the Intel 8008 by Gernelle and Thi, Roberts' design was the one that led to the most dramatic commercialisation of their shared concept: an affordable, low-end, interruptible CPU and RAM with serial I/O and standardised backplane expansion. And few people seem to have any bad things to say about him. His later-life decision to work in medicine was an inspiration to me. The world was better for having had him in it.
posted by meehawl at 1:59 PM on April 3, 2010


still got a couple of Apple II's- and Apple ///'s, too- and at least one Apple II CP/M card, in da box. Friend of mine bought an Altair, then replaced it with a Northstar machine. Then the whole thing turned into a free-for-all, and all of a sudden there are these here Kindlings and eyepad things.
My lawn off get.
and-
.
posted by drhydro at 8:39 PM on April 3, 2010


I will always be grateful to our forefathers in computer science, who were interested in computers before they could really do anything interesting.

Digital computers could always do something interesting because the first ones were funded by the military to do interesting stuff. Early toy computers couldn't do anything interesting. The history of digital computing didn't begin with the Intel 4004.

On second thought, the 4004 was first used in a calculator, so maybe it could always do something interesting.

Also,
.
posted by Crabby Appleton at 8:48 PM on April 3, 2010 [1 favorite]




This thread has been archived and is closed to new comments