Expandable to 16k!
October 4, 2009 9:53 PM

50 years ago today, IBM announced the 1401 Data Processing System. Originally designed as a spooling system for the larger machines, the 1401 became very popular as a mainframe in its own right, eventually being called 'The Model T of Computers'. By the end of 1961, the number of 1401s installed in the United States alone had reached 2,000 - representing about one fourth of all computers installed by all manufacturers at that time. 15,000 to 20,000 were eventually built. The Computer History Museum in Mountain View is having a 50th anniversary celebration on November 10th. Here's what $125,600 (or $2,500/month rent) would get you:

1401 - processor, with 1,400 characters (6-bit BCD) of core storage; 87 kHz clock speed
1402 - combination card reader (800 cards per minute) & punch (250 cards per minute)
1403 - printer, 600 lines per minute

If you're willing to spend more, you could get up to 16k of memory by adding a 1406.
Then you might want some 729s (featured here) for card-to-tape or tape-to-print operations, at $30k-$60k apiece.
If you need some disk space, you could try the 1405, for up to 20 MB.

The processor was approximately 30"x58"x58", and used a 30A, 208V, 3-phase power connector. The entire system needed 23,000 BTU of cooling per hour.

The Computer History Museum has been restoring two systems for the last 5 years; it acquired the second from a father and son who ran a billing service business with it out of their home in Darien, Connecticut until 1995.

Those without access to their own 1401 can download an emulator.
(You might need reference material, or at least the reference card).
Here's a Hello World to get you started.
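
What made the 1401 odd by modern standards: core was character-addressable, and every location carried an extra "word mark" bit flagging the high-order end of a variable-length field; decimal arithmetic ran right to left until it hit one. A toy Python sketch of that idea - loosely modeled on the 'A' (add) instruction, and emphatically not a faithful emulator (real fields also carried sign and zone bits):

    # Toy model of 1401-style core: each address holds one decimal digit
    # plus a word-mark flag on the field's high-order (leftmost) digit.
    core = {}  # address -> (digit, word_mark)

    def store_field(addr, digits):
        # Fields are addressed by their low-order (rightmost) position.
        for i, d in enumerate(reversed(digits)):
            core[addr - i] = (int(d), i == len(digits) - 1)

    def add_field(a_addr, b_addr):
        # Loosely like the 1401 'A' op: add the A field into the B field,
        # digit by digit, right to left, until the B field's word mark.
        carry, i = 0, 0
        while True:
            a_digit, _ = core.get(a_addr - i, (0, True))
            b_digit, b_mark = core[b_addr - i]
            total = a_digit + b_digit + carry
            core[b_addr - i] = (total % 10, b_mark)
            carry, i = total // 10, i + 1
            if b_mark:  # word mark on the B field ends the operation
                break

    store_field(100, "0042")
    store_field(200, "1359")
    add_field(100, 200)
    print("".join(str(core[200 - i][0]) for i in reversed(range(4))))  # 1401

Fittingly, 1359 + 42 = 1401.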

Previously: Music, History
posted by MtDewd (50 comments total) 27 users marked this as a favorite
 
OMG! i have that computer! when i want to google something i have to punch my query onto a card with a hole punch, put it in the slot, and turn the crank. (it plays 'pop goes the weasel' while i do this)

srsly, i neeeeed a new computer...
posted by sexyrobot at 10:01 PM on October 4, 2009


My name is 1401, king of computers:
Look on my computations, ye Mighty, and--

Wait, gigawhatnow?
posted by kmz at 10:01 PM on October 4, 2009 [2 favorites]


Seriously though... I'm trying to imagine how it could have been economical to be running that system in 1995. The power bill for a single month could have bought something many, many orders of magnitude more powerful. Hell, I woulda sold him my good ol' Mac Classic. 8 MHz! 2MB of RAM, which we later upgraded to 4 for megaexcitement. Even in its day, it was considered slow. Apple had t-shirts for the Classic reading 'Patience is a virtue.'
posted by kmz at 10:08 PM on October 4, 2009


Does anyone have a plot of punch card sales by year? I remember reading once, in Wired I think, that punch card sales were greater today (when the article was written) than at any point in history before. But the author could have been smoking crack; this is Wired we're talking about.

Hmm, this is probably the article I remember, and it actually says punch card sales peaked in 1967.
posted by delmoi at 10:22 PM on October 4, 2009


15,000 to 20,000 were eventually built.

I have a hard time believing that IBM didn't count how many they made. After all, they had their own gigantic electronic brains to help them track the immense sum. So were the records lost? Or is there somehow some debate over exactly what a 1401 was?
posted by pracowity at 10:34 PM on October 4, 2009


kmz: "I'm trying to imagine how it could have been economical to be running that system in 1995."

Upgrade costs, when you have your whole business process built around a particular system, can be phenomenal. It's not just a matter of buying a new machine, but of deconstructing the old system to find out how it's programmed right now, finding a new system that will work, writing the software for it, training all the employees how to maintain it (or hiring new employees who understand it, if training current staff isn't practical), actually implementing the new system, and then tweaking it to fix the inevitable issues that come up.

I've seen mainframe-to-mini migrations go so horribly wrong, that the cost of the botched migration would have powered and staffed the old machine for decades—if not centuries. (Admittedly, in most of these real horror stories, a big part of the problem is people who want the migration to fail for various reasons, but this is just another factor to keep in mind.)

I make my living, at least in part, from people doing upgrades like that, so I'm certainly not saying that they're always a mistake. Done right, they can realize huge cost savings. However, it's been my experience that estimates of the cost of migrating away from legacy systems are pretty frequently hopelessly optimistic.

At any rate, in the case of the company in Darien that was actually using it, I suspect that while the ongoing cost of the 1401 was manageable, the upgrade costs (as an up-front expense) weren't, so they just never upgraded. Makes sense to me.
posted by Kadin2048 at 10:38 PM on October 4, 2009 [5 favorites]


Sometimes I like to think about how, Bill Gates Urban Myth-style, I explained to my mom that she wouldn't need more than a 6 gig hard drive, pretty much ever, for her needs, back in 199-something. Back when it was a several-hundred-dollar decision.

Then I look at my phone, which has a 16 gig hard drive. And I laugh and laugh.

Living in the future is awesome. If my iPhone had an Omni app, I'd Voyage my ass back to the 60s and I'd show off so hard.
posted by padraigin at 10:48 PM on October 4, 2009 [1 favorite]


Kadin2048: That's true for large infrastructures, but this particular setup seemed very mom and pop. But now that I think about it, if they were really mom and pop they probably couldn't have afforded the 1401 in the first place.
posted by kmz at 11:10 PM on October 4, 2009


I have a hard time believing that IBM didn't count how many they made.

Counting is actually pretty difficult. What counts as a unit? These things were rented, repaired, refurbished. Is a refurbished machine a different unit? What about serial numbers that were never used? Does a DOA machine count?

Anyway, if you're an in-house historian at IBM and you're on the phone with some journalist at Wired, you're probably not going to go all Miss Marple on the issue. You're going to look at some figure that came out of the sales or marketing department and say "yeah, about 15 to 20K sounds right."
posted by phooky at 11:19 PM on October 4, 2009 [2 favorites]


In my senior year of college, when our advanced inorganic chemistry prof announced that we would be repairing to the campus cafe for the day's "lecture," we knew it would be a fun day, one where we would quickly set aside the edifying but often dull world of the periodic table and hear war stories of graduate school at mid-century. I recall him describing the palpable excitement when his alma mater elected to go large and order 16K of memory for the campus mainframe - the thrilling prospect of crunching really big numbers... and how it was delivered in essentially a couple of refrigerator boxes. I've got a reasonable chance of making it another 50 years, and I look forward to boring my child with the fifth retelling of the difficult decision to spend a couple thousand bucks on a computer with a paltry half terabyte of hard drive space (what's a hard drive, Dad?). Living in the future is awesome, but living in the past is liable to be even more amusing.

My P-Chem prof had us work the data on some of our experiments running FORTRAN programs he wrote himself on VAX terminals lurking in the library (on which I also basically first discovered this crazy nonsense we now call the internet. "Finger" mean anything special to you damn kids? Fuck, I'm old).
posted by nanojath at 11:28 PM on October 4, 2009 [4 favorites]


My P-Chem prof had us work the data on some of our experiments running FORTRAN programs he wrote himself on VAX terminals lurking in the library (on which I also basically first discovered this crazy nonsense we now call the internet. "Finger" mean anything special to you damn kids? Fuck, I'm old).

My first online experience was also on VAX/VMS. Personal accounts at UT used to be all VMS by default and I used my dad's all the time once we got the 2400 baud modem hooked up to our Classic. finger, .plan files, gopher, the godawful text editor. Oh, and an awful Usenet client. Though I loved 'phone' which was way better than Unix's 'talk' until 'ytalk' supported multichat.

Eventually we got a PC that had an actual color display (25MHz processor!!). And that's when I discovered the world of Internet porn. But we had to work for it back then, dagnabbit. Save UUEncoded files from crappy VMS Usenet client. Download UUEncoded files to Mac with ZModem. Transfer to PC with floppy. Decode. Really hope parents don't sneak up.

Ah, the good ol' days.
posted by kmz at 11:55 PM on October 4, 2009 [9 favorites]


Er, sorry if that was perhaps oversharing.
posted by kmz at 12:01 AM on October 5, 2009 [2 favorites]


Oh boy does this bring back some memories!

I took my undergrad in Math & Computer Science in the late 70's, getting my first job in 1980. We had an IBM 1401, and getting it to boot required keying in a small bootstrap loader via hex toggle switches on the front panel. This was seriously low-level computer use: toggle in a memory address, toggle in an instruction, cycle a write to commit the instruction to core, then move on to the next address. While I could probably still toggle the sequence into core if I had a 1401 in front of me, I can't recall how many instructions the boot loader took up, but I seem to recall it was roughly 40 to 50.

If you'd entered the boot loader correctly, setting the instruction pointer back to the start and then cycling Run would boot the 1401 off disk. It made a lot of noise when it booted, as the old hard drives - washing machines in size - rattled when the heads would seek. In fact, sometimes the entire hard drive cabinet would rattle pens or other light objects left on the top surface.
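
In modern pseudocode, the ritual was something like this (a loose sketch with invented opcodes and sizes; not real 1401 panel behavior):

    # Loose sketch of toggling in a bootstrap from the front panel.
    core = [0] * 4096                  # word-addressed core; size illustrative

    BOOT_LOADER = [
        0o7600, 0o0100,                # hypothetical: "read one disk sector to 0100"
        0o5400, 0o0100,                # hypothetical: "jump to 0100"
    ]

    def toggle_in(start, words):
        # Per word: set the address switches, set the data switches,
        # cycle WRITE, then move on to the next address.
        for offset, word in enumerate(words):
            core[start + offset] = word

    toggle_in(0, BOOT_LOADER)
    # Then set the instruction pointer back to the start and cycle RUN.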

Punch cards were indeed the way to get programs onto the hard drive, and you'd invoke various programs via a relatively crude, card-based JCL.

For its time the 1401 was a fairly interesting machine, if for no other reason than its place in computer history. It even came with a complete set of schematics, which the CE would use when it refused to boot (which seemed to happen very often).

Of course, at University we'd been using CDC mainframes with time sharing and PDP 11s for department-level research (I had done a lot of research with one of my profs, who was on the design committee for X3J3, i.e. Fortran 77. As his name pulled in the NSF grants, we got access to a PDP 11), so I knew this machine was on its way out.

After about a year we got a PDP 11/34, which was followed by a PDP 11/70. IBM took the 1401 back to rent out to another shop, as the machines were apparently still in demand as late as 1982.

Meanwhile a small group of us who'd gone to University together had been building our own machines at home. We were assembling Z80 class computers, based on something called a "Ferguson Big Board".

Of course, back then building a computer meant you'd get a bare board which you then populated. We spent three months or so scrounging parts and soldering them in to populate the board, and then another three months or so trying to figure out why it wouldn't boot. Most of these sessions took place Friday evenings, spilling over into Saturday AM, driven by copious amounts of beer.

Compare that to the present, where 99.9% of the people building a computer are simply integrating boards and cards together to make a machine.

Just before we got the PDP 11 at work, I managed to get my Big Board to boot. The contrast between single-user CP/M and IBM's 1401 punched-card-based OS was striking, especially as it was pretty easy to get a (very, very slow) modem connected to the CP/M machine and dial in to remote BBSs.

By contrast, the 1401 was isolated, capable of talking only to a printer, a plotter, and hard and tape drives. It was fascinating to work in the 1401 environment, then return home at night and compare the nascent capabilities of the CP/M world on a cost-adjusted basis. All told, I paid about $1,500 for my CP/M machine, and never bought software.

Interesting links - brings back many memories - thanks!
posted by Mutant at 12:55 AM on October 5, 2009 [10 favorites]


Compare that to the present, where 99.9% of the people building a computer are simply integrating boards and cards together to make a machine.

Having read a forum or two where people bemoan the process of soldering surface mount chips, I dare say there is a reason that you don't find many people really making their own computer any more.

Nobody makes an eight-bit processor out of discrete components anymore, either. Well, almost nobody.
posted by Kid Charlemagne at 3:30 AM on October 5, 2009


Cool, my dad used to fix these things for a living.
posted by marxchivist at 3:39 AM on October 5, 2009


I have a hard time believing that IBM didn't count how many they made.
I'm sure they did, but they haven't given the information to Google. I tried to track it down, but I didn't find any official numbers. I saw estimates of 15,600, 18,000, and 'over 20,000'. I know the serial numbers went over 20,000, but there may have been gaps.
posted by MtDewd at 4:04 AM on October 5, 2009


Many years back, the mom & pop computer shop I was working in had a guy come in looking to buy a faster machine than his roommate's.

His roomie had a 30 MHz machine. He shelled out the bucks (about $2500, as I recall) for a 33 MHz machine. About four or five years before, I'd shelled out $2000 for a 2 MHz Osborne 1 w/CP/M, SuperCalc, WordStar, and other stuff. Very useful little word processor, I must admit - not so much for games, though. And we won't even talk about the magnificent graphics capabilities of the Osborne - because there weren't any.

All things considered (speed and price and such...), it's amazing the early microcomputers lasted long enough in the marketplace to evolve into what we've got today. If it hadn't been for graphic interfaces (Windows) and games, there wouldn't have been a real push for anything much faster than maybe a 50MHz clock speed...
posted by JB71 at 4:31 AM on October 5, 2009


I first learned to program on the IBM 1620, which came out about the same time (October 1959). It had a "program" (which had to be toggled in from the console if erased) to read a card from the card reader and follow the instructions on it. That's how one would boot it up. The instructions were to read 3 additional cards and then follow their instructions - i.e., a bootstrap loader. Also on these 3 cards were the addition tables. Addition was done (in decimal) by table lookup, and these were the tables it used. Division was done by software (repeated subtraction).
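
In modern terms, the tricks look something like this (a sketch; the real machine kept its tables at fixed core addresses, which this doesn't try to reproduce):

    # Addition via table lookup, one digit pair at a time:
    ADD_TABLE = {(a, b): a + b for a in range(10) for b in range(10)}

    def add_digits(x, y, carry=0):
        total = ADD_TABLE[(x, y)] + carry   # lookup, not an adder circuit
        return total % 10, total // 10      # (digit, carry out)

    # Division in software, by repeated subtraction:
    def divide(dividend, divisor):
        quotient = 0
        while dividend >= divisor:
            dividend -= divisor
            quotient += 1
        return quotient, dividend           # (quotient, remainder)

    print(add_digits(7, 8))  # (5, 1)
    print(divide(17, 5))     # (3, 2)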

Eventually, we got a Fortran compiler for it (on punch cards, of course). One bug we had to deal with was when a program overwrote the constants that the compiler used. For example, if you changed the value of 2, your arithmetic would come out funny, and it was hard to tell why from reading the code.
posted by Obscure Reference at 5:05 AM on October 5, 2009


Kid Charlemagne -- Having read a forum or two where people bemoan the process of soldering surface mount chips, I dare say there is a reason that you don't find many people really making their own computer any more.

Absolutely, and I certainly didn't mean to sound like I was belittling anyone's efforts.

And the fact wasn't lost on me that the Z80 was a major step forward in terms of reducing discrete component count - the 8080 was apparently a real bitch in that regard, requiring a much, much larger number of external integrated circuits.

A bunch of guys from the year before my class all built 8080/8080a-based systems, and they generally gave us a hard time (in a good-natured way, of course) about how easy we had it!

To this day I regret not taking the plunge earlier and building one of those very, very simple 1802-based systems. A buddy built one while I was in high school; it was so simple he breadboarded the entire system, which brought great clarity to the architecture - it was all there in front of you!
posted by Mutant at 5:10 AM on October 5, 2009


When I was a teenager, in the very early 80s, I was a ward of the state (being an incorrigible juvenile delinquent). They had this neat program for kids where you'd work two weeks each at a variety of jobs. One of those jobs was working on these IBM monoliths, feeding them punchcards, cleaning them (they got filthy!), that kind of thing, at an Air National Guard installation in Missouri. I remember the guy in charge there holding up a punchcard and, in response to my blank look, saying, "This is SOFTWARE." My mind just could not accept that a piece of cardboard with holes in it was software.

On the plus side, that was a pretty cool two-week stint (aside from the medieval computers) because on my off time I got to go sit in an F-4 Phantom and eat lunch, and I also got to view an F-18 engine test (which was just spectacularly impressive).
posted by jamstigator at 5:32 AM on October 5, 2009


Don't laugh. When I was in high school in the early 70s, only the smart kids were allowed to learn about punch cards. Do not bend, fold, staple or mutilate. Threatening to do so was enough to reduce them to tears.
posted by nax at 6:11 AM on October 5, 2009


Within 100 years, computers will be twice as powerful, ten thousand times larger, and so expensive that only the five richest kings of Europe will own them.
posted by dances_with_sneetches at 6:28 AM on October 5, 2009


"If it hadn't been for graphic interfaces (Windows) porn and games , there wouldn't have been a real push for anything much faster than maybe a 50MHz clock speed..."
posted by bonobothegreat at 6:57 AM on October 5, 2009


They were still teaching Fortran 77 on punch cards at Penn State when I started there in '82. It was a very different way of programming than what you typically do now. You had to stand in line to get your cards compiled and have your program run, and then go wait for a printout, so you couldn't just type something in and let the compiler find your syntax errors. You had to sit down and debug the whole program with pen and paper before you even committed it to punch cards. It forced you to really understand the language, because you basically had to compile the thing in your head before you used the real compiler.
posted by octothorpe at 7:13 AM on October 5, 2009


As a child of the late 80s, for whom a computer is "that unfathomable magical box which makes my entire life possible," I am totally fascinated by this discussion. I always wish I knew more about what computers actually ARE and how they evolved.
posted by showbiz_liz at 7:23 AM on October 5, 2009


My first computer class, in 1983, was all punch cards. You had to go through that class before they'd let you touch the TRS-80s. And at UNC in 1984, my chemistry lab experiment results had to be punch carded.
posted by MrMoonPie at 7:55 AM on October 5, 2009


Back in those days, spam was real spam. Some fucktard would come by and put canned meat into the card reader.
posted by storybored at 8:08 AM on October 5, 2009


I'll have to raise my hand as someone who programmed on one of these old IBM mainframes. It was assembly language for a legacy network system. We had the benefit of typing programs into a modern-day terminal (CTS timeshare), but debugging was done old-school... the dump would come out on a dinosaur line printer and you would inspect the register contents.

This was in 1998.
posted by crapmatic at 8:15 AM on October 5, 2009


Jóhann Jóhannsson's IBM 1401, a User's Manual (YouTube: Part I - IBM 1401 Processing Unit), referenced in the OP under "Previously", makes a good soundtrack for this thread. You can almost taste the nostalgia.
posted by The Lurkers Support Me in Email at 8:37 AM on October 5, 2009


(Also, while I doubt the spinning whirly gizmo in the first & last 30 seconds of that video were present on the actual 1401, I can't help but think its aesthetics — or for that matter, those of any modern computer — would be vastly improved by the addition of one.)
posted by The Lurkers Support Me in Email at 8:49 AM on October 5, 2009


You know what always amazes me? It amazes me what mechanical engineers were able to do (are able to do) before (without) software. That thing read 800 cards a minute and punched 250.

50 years later, my $200 scanner can't scan 6 pages a minute to .tif.
posted by TomMelee at 9:26 AM on October 5, 2009


Nah Burhanistan, we're not. Forget computational power. My point is that these systems used to require mechanical engineers. 800 cards a minute in 1959 is a pretty impressive feat.
posted by TomMelee at 10:03 AM on October 5, 2009


That thing read 800 cards a minute and punched 250.

As long as they were pristine and not BENDED, FOLDED, SPINDLED OR MUTILATED. And they certainly had their share of problems - that number is the "everything's perfect" number. Factor in the jams and resets and it's averaging only a few dozen a minute.
posted by GuyZero at 10:22 AM on October 5, 2009


The punch unit is even more impressive. The cards could just zip by the read brushes, but each card being punched had to stop 12 times, once for each row. How do you run 250 cards a minute while stopping the cards 50 times a second? (Answer: use a Geneva mechanism.) There were also cams and eccentrics doing their things.
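
(The arithmetic behind the 50-a-second figure, with Python as a calculator:)

    CARDS_PER_MINUTE = 250
    ROWS_PER_CARD = 12                     # a standard punch card has 12 rows

    stops_per_second = CARDS_PER_MINUTE * ROWS_PER_CARD / 60
    ms_per_stop = 1000 / stops_per_second
    print(stops_per_second, ms_per_stop)   # 50.0 stops/s, 20.0 ms per stop

So the mechanism had about 20 milliseconds to stop the card, punch a row, and get it moving again.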

GuyZero: They didn't have to be pristine- there was a bit of tolerance there. I've definitely seen them run boxes and boxes through without jamming. On the other hand, some jams were pretty severe. I used to carry a card saw. (Still have one)
posted by MtDewd at 10:32 AM on October 5, 2009


I remember being annoyed that my undergrad CS curriculum still used an IBM mainframe to teach assembly language programming, rather than x86 or 68K (this was in the 80s). It turned out to be a blessing I didn't even know was being bestowed at the time - my first job out of school was in an AS/400 shop, and the RPG programming language is so closely related to 370/Assembler that I didn't need to make all that much of an effort to be productive. The AS/400 was running virtual machines, and had a nice HAL eons before Java and Windows came around. The old stuff often lives on in unexpected ways in the present.
posted by Calloused_Foot at 10:36 AM on October 5, 2009


I used to carry a card saw. (Still have one)

Oh snap! I wondered how operators used to cope with big card jams. I can hardly deal with my printer jamming up with two pieces of paper. That there exists a specific punch-card-jam-fixing-tool is very awesome.
posted by GuyZero at 10:40 AM on October 5, 2009


So will it run MaME?
posted by wcfields at 11:47 AM on October 5, 2009


Our good friend Me Turning indicates that the IBM 1401 will indeed run MaME but you'll need to input all your moves on punchcards and see the screen updates on the lineprinter.
posted by GuyZero at 11:52 AM on October 5, 2009


Me Turning -> Mr. Turing
posted by GuyZero at 11:52 AM on October 5, 2009


I used to carry a card saw.

This is one of the reasons I love MetaFilter: I'm now aware of an entire class of objects of which I previously had no knowledge.

And now I want a card saw, even though I have no earthly use for one.
posted by Mr. Bad Example at 12:43 PM on October 5, 2009


I also learned on the 1620, and the Fortran compiler (called FORGO) was a two-pass compiler. You loaded the compiler deck, then your source deck, and the compiler would punch an intermediate "file". Then you loaded the second pass of the compiler and your intermediate deck, and the compiled binary would be punched out and could be run.

It was definitely worth your time to verify your code before you started the process.
posted by CheeseDigestsAll at 2:06 PM on October 5, 2009


"...who were using it to operate a billing service business until 1995 out of their home.."

There was some sort of Windows OS released that year. I bet their overall throughput close to doubled.
posted by Hardcore Poser at 3:52 PM on October 5, 2009


I dunno. I doubt he got the same performance from his printer. Those old-school printers cranked it out...
posted by GuyZero at 3:57 PM on October 5, 2009


The Computer History Museum estimated the PC to 1401 print ratio at 1:1, but the 1403 could do 6-part forms.
posted by MtDewd at 4:41 PM on October 5, 2009


A few of the figures cited by the reporter in that YouTube clip are a bit suspect. A Blackberry is 10 million times faster than a 1401? I want one of these 870 GHz Blackberries!
posted by L.P. Hatecraft at 6:36 PM on October 5, 2009


Modern processors do a lot more operations per cycle than they used to. You can't just compare Hz to Hz.
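
A rough sanity check, with assumed modern-CPU numbers (the only sourced figure here is the 87 kHz clock from the post):

    HZ_1401 = 87_000                   # the 1401's clock, from the post
    CYCLES_PER_OP_1401 = 10            # assumption: ~10 cycles per instruction
    HZ_MODERN = 3_000_000_000          # assumption: a ~3 GHz core
    OPS_PER_CYCLE_MODERN = 4           # assumption: ~4 instructions per cycle

    ops_1401 = HZ_1401 / CYCLES_PER_OP_1401        # ~8,700 ops/s
    ops_modern = HZ_MODERN * OPS_PER_CYCLE_MODERN  # ~12,000,000,000 ops/s
    print(f"{ops_modern / ops_1401:,.0f}x")        # ~1,379,310x

Millions-fold either way - no 870 GHz required.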
posted by octothorpe at 6:52 PM on October 5, 2009


I built my home-brew computer with an 8085 because, unlike the 8080, it would run on a single 5V supply. It only had 256 bytes of RAM because that meant I only needed 8 address lines. It actually had more chips involved in the front panel (which allowed you to enter programs directly into RAM, like a minicomputer) than in the rest of the computer. My first 3 years of college were all punch cards on a CDC 6500 (which had the weirdest assembly language I ever saw).
posted by rfs at 8:04 PM on October 5, 2009


GuyZero -- I used to carry a card saw. (Still have one)

Oh snap! I wondered how operators used to cope with big card jams. I can hardly deal with my printer jamming up with two pieces of paper. That there exists a specific punch-card-jam-fixing-tool is very awesome.


The Cult of the 5081 Hollerith Unit Record gave rise to a wide array of ancillary tools including the aforementioned card saw.

Over time, as programs grew in complexity, the number of cards also increased. Almost all languages of the time specified one program statement per card, with the order of the cards clearly critical. Data was also entered on cards, and while there was a little more freedom there, the order (depending upon how you wrote the programs reading the data cards) was almost always important as well.

Dropping your deck of cards (more like a stack) was disastrous! So there were special machines you'd run a deck of cards through that would (and I'm sure you know what's coming) punch special holes in reserved areas, effectively numbering your cards.

That way if the worst happened you could drop the messy deck into a sorter machine that would reorder your cards.
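
(What the sorter was doing, in modern terms, was a radix sort: one pass per column, least significant column first. A sketch, assuming sequence numbers punched in columns 73-80, the classic fixed-form FORTRAN convention:)

    import random

    def sorter_pass(deck, col):
        # One pass: cards fall into pockets 0-9 by the digit in `col`,
        # then the pockets are restacked in order (a stable pass).
        pockets = [[] for _ in range(10)]
        for card in deck:
            pockets[int(card[col])].append(card)
        return [card for pocket in pockets for card in pocket]

    deck = [f"{'SOURCE STATEMENT':<72}{seq:08d}" for seq in range(1, 201)]
    random.shuffle(deck)                 # the dreaded dropped deck

    for col in range(79, 71, -1):        # columns 80 back to 73 (0-indexed)
        deck = sorter_pass(deck, col)

    assert [int(card[72:80]) for card in deck] == list(range(1, 201))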

There was also a machine that would duplicate either a single card or an entire deck, for all intents and purposes a machine built solely to execute
cp stdin stdout
This machine would save your sanity should your deck become folded, spindled or mutilated. Well, maybe not spindled, but it would work wonders on lots of other problems, chewing through your old deck and producing a nearly identical deck of brand-new cards (nearly identical simply because these were electromechanical machines, and duplicating errors, while rare, did occur).

The typical computer room back then not only contained a card punch station, but also several of those ancillary machines, all needed just to get programs and data into the computer.

Wow now that I'm thinking about all of this, that all took place a long time ago!

Not that I'm old or anything *cough*. Surely it's how young at heart you are that matters, isn't it?
posted by Mutant at 4:30 AM on October 6, 2009 [3 favorites]


During a brief period in the mid-80's, I worked on a system in HQ SAC that was used to communicate with all the missile stations in the United States. I apologize in advance; my Google skills have failed me - I can't seem to find much on the "old" SACCS/DTS (if you google it, all you'll get is references to the "new" one linked below).

Up until 1987, that system was based on the 1401; it had been built in the late 50's / early 60's, had core memory (a luxurious 64k), and instead of disk drives, it used drums: magnetic media on a cylinder, with the heads traveling over the surface of the "drum". We also used punch cards for convenience when inputting several commands at once.

The machine occupied a very large room (think high-school gymnasium, doubled). The consoles were 12-feet long, and had three rows of PBIs (Push Button Indicators). We used the PBIs to enter "go" codes into the machine, in octal, then pressed the "GO" button to execute the command. No keyboards, no terminals.

In 1987, the new system, which is still in use, came on-line. This system was based on an IBM Series/1 computer, and had been commissioned by President Richard Nixon in the early 1970's.
posted by dwbrant at 11:08 AM on October 7, 2009 [2 favorites]


dwbrant- Was it SAGE?
The guy that has the 1401 restoration site has a section on it.
posted by MtDewd at 12:50 PM on October 8, 2009



