October 24, 2019 1:17 PM   Subscribe

“Because the paper beds of banknote presses in 1860 were 14.5 inches by 16.5 inches, a movie industry cartel set a standard for theater projectors based on silent film, and two kilobytes is two kilobytes” is as far back as I have been able to push this, but let’s get started. MeFi's own mhoye dives 2500 words deep into the history of terminal aspect ratios.
posted by cgc373 (62 comments total) 65 users marked this as a favorite
I very much enjoyed this when I first stumbled across it yesterday. I'm now ashamed that I didn't note the author!

It really is a nice bit of research that also illustrates the difficulty of finding the real history of a thing that is almost inherently ephemeral and largely ad hoc. I suspect future historians will have similarly large gaps in the contemporary record, despite the feeling we have today that everything electronic is going on a permanent record in some vault somewhere.
posted by wierdo at 1:37 PM on October 24, 2019

fascinating! I wonder if the currency stock size descends from the quarto and octavo paper stock system.
posted by mwhybark at 1:59 PM on October 24, 2019

oh that is an interesting evolutionary trail!
posted by a halcyon day at 2:00 PM on October 24, 2019

Wow. That was like listening to James Burke after he'd drunk ~3 too many cups of coffee.
posted by hearthpig at 2:16 PM on October 24, 2019 [7 favorites]

More like 80x24: the VT100 link Digital’s Video Terminals – VT100.net sets them at 80x24 (with the VT300 having a 25th-row status line sort of thing). And XTerm defaults are 80x24. Though I admit you could maybe get 25 lines with the 2k of memory, I'm guessing the extra line was used as a buffer of sorts for the configuration / information display (things like setting font, baud rate, etc.), based on my hazy memories of using VT100s back in the day.

Last I knew from a couple of years ago, old $WORK still has a Heathkit H19 - Terminals that someone built in the early 80's still in daily use.
posted by zengargoyle at 2:27 PM on October 24, 2019 [2 favorites]

The 128 words left over was scratchpad space for the CPU.
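The arithmetic behind that works out neatly (a quick sketch, treating the terminal's words as bytes, one per character cell):

```python
# Back-of-the-envelope check: an 80x24 character grid fits in a 2 KB
# screen buffer with 128 bytes to spare, while 80x25 would leave only 48.
COLS, KB = 80, 2048

for rows in (24, 25):
    cells = COLS * rows          # one byte per character cell
    spare = KB - cells           # what's left of the 2 KB buffer
    print(f"80x{rows}: {cells} bytes used, {spare} left over")
```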
posted by Your Childhood Pet Rock at 2:36 PM on October 24, 2019 [1 favorite]

Not sure which bit I found more fascinating: the punch photograph, or the fact that movies started out 4:3, then switched to a wider format to differentiate themselves from television, which inspired television to switch to a wider format in order to show movies.
posted by ejs at 2:53 PM on October 24, 2019

to think, that the reason why monitors are often set to 1366x768 dates back to Roman chariots
posted by DoctorFedora at 3:00 PM on October 24, 2019 [22 favorites]

Lear-Siegler were doing 80×25 before DEC: like the 7700A from 1973. People still put up with weird vi cursor keys because Bill Joy's place of work had cheaped out and didn't buy the add-on keypad for the Lear-Siegler ADM-3a.

“Because IBM” is a good start and end. There were lots and lots of different sizes of punch cards (see Punched Cards: Their Applications to Science and Industry) and someone choosing a box that was already available for other paper stock is as good as any.
posted by scruss at 3:02 PM on October 24, 2019

I had a nit
with ANSI.SYS being the way DOS programs talked to the display from DOS 2.0 through to beginning of Windows.
No. Only a few oddball things like games ported from UNIX used ANSI.SYS. Basically every single DOS application programmer worked directly with video memory as a 4000-byte buffer of alternating character and attribute bytes, either directly from a hardcoded base address or using a library that did that.

And really now that I'm thinking of it, the stock ANSI.SYS wasn't good for much, and nethack for example needed a third-party piece called FANSI.SYS (the fancy ANSI).
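That interleaved layout can be sketched with a stand-in buffer (a minimal Python simulation; the real buffer lived in video memory at B800:0000, and the `put_text` helper here is purely illustrative):

```python
# Sketch of the DOS color text-mode layout: 80x25 cells, two bytes each
# (character, then attribute), 4000 bytes total. Real programs wrote
# these bytes straight to video memory; a plain bytearray stands in
# for it here.
COLS, ROWS = 80, 25
vram = bytearray(COLS * ROWS * 2)       # 4000 bytes

def put_text(row, col, text, attr=0x07):  # 0x07 = grey on black
    offset = (row * COLS + col) * 2
    for ch in text:
        vram[offset] = ord(ch)          # character byte
        vram[offset + 1] = attr         # attribute byte (fg/bg colors)
        offset += 2

put_text(0, 0, "Hello", attr=0x1F)      # white on blue, top-left corner
print(len(vram), bytes(vram[0:4]))
```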
posted by Aardvark Cheeselog at 3:05 PM on October 24, 2019 [8 favorites]

He drops a reference to punchcards as a microfilm medium -- my first tasks with my current employer were feeding microfilm aperture cards into a machine to digitize the images (these were rarely 'punched' so we never used the Hollerith reader feature), and I have some 'test image' ones around here someplace; I'm tickled that there are conceptual ties back to 19th century money.
posted by AzraelBrown at 3:06 PM on October 24, 2019

Fun punched card facts:
  • IBM's printing punches used an incredibly intricate mechanical dot-matrix memory to define the character shapes: The IBM card punch code plate
  • Nobody makes punch cards any more.
They were great for notes, shopping lists, etc. I miss 'em.
posted by scruss at 3:24 PM on October 24, 2019 [6 favorites]

4:3 predates 1929 by more than a generation. Edison's engineer William K Dickson set the standard 35mm movie frame to be four sprockets high in 1894 or 1895 and it was officially made a standard in 1909. What happened in 1929 was that it was modified slightly to accommodate the sound strip, changing it from exactly 4:3 (1.33:1) to 1.37:1.
posted by octothorpe at 3:25 PM on October 24, 2019 [8 favorites]

to think, that the reason why monitors are often set to 1366x768 dates back to Roman chariots

Well yes, but just in that the 16:9 aspect ratio allows you to watch Ben Hur.
posted by condour75 at 3:50 PM on October 24, 2019 [1 favorite]

Did punchcards overlap with USB to the point of USB punchcard peripherals ever having been a thing?
posted by acb at 3:50 PM on October 24, 2019 [1 favorite]

lol mhoye rocking the snarkfactor 1000 sentence generator:

Metafilter: It’s not entirely clear if this is a deliberate exercise in coordinated crank-wank or just years of accumulated flotsam from the usual debate-club dead-enders hanging off the starboard side of the Overton window
posted by lalochezia at 4:09 PM on October 24, 2019 [13 favorites]

I still politely ask my fellow programmers to keep lines under 80 characters if they can. They don't.

The Samba project uses 80 characters as a way to limit other coding practises they don't like:
Maximum Line Width is 80 Characters: The reason is not for people with low-res screens but rather sticking to 80 columns prevents you from easily nesting more than one level of if statements or other code blocks.
posted by clawsoon at 4:19 PM on October 24, 2019 [4 favorites]

Scruss, that IBM punch card plate article is fascinating, thanks! As a kid I used to play with my mother’s office punch card machines, and as an adult I worked as a digital type designer, but I never knew about this cool technology.
posted by ejs at 5:26 PM on October 24, 2019

There are a couple of illustrations of 19C railway 'punch photo' tickets here [pdf] ( I kind of fell down this rabbit hole a while ago and spent some time there).
posted by carter at 5:52 PM on October 24, 2019

Also, Baird did invent mechanical television, but the superior electronic television was invented in the US by Philo Farnsworth and stolen developed by Vladimir Zworykin at RCA. Doc on YT here, it's a fascinating story. Mechanical and electronic tv were entirely different beasts.
posted by carter at 6:01 PM on October 24, 2019 [1 favorite]

Yeah, but Baird demonstrated first. And where I'm from, he's the inventor. When Baird was in poor health at the very end of his life, my great-grandfather put together a fund for Baird's family from his colleagues at the RTC in Glasgow (where Baird had studied and briefly researched). Family lore has it that Baird was a very odd man, difficult to get along with.
posted by scruss at 6:17 PM on October 24, 2019

 Did punchcards overlap with USB to the point of USB punchcard peripherals ever having been a thing?

No. Punched cards were really done by about 1980. They were also the hallmark of a big iron DP facility, so anything small (that might've had a serial port as we know it) would have run paper tape. I think the latest computer peripheral I know of that punched paper was the Heathkit H10 tape punch from 1978. I suppose you could put an FTDI on that and call it USB, but good luck finding the tape ...
posted by scruss at 6:29 PM on October 24, 2019 [1 favorite]

Did punchcards overlap with USB to the point of USB punchcard peripherals ever having been a thing?

No, and anybody who suggests they might be is definitely someone you shouldn’t believe.
posted by Huffy Puffy at 7:04 PM on October 24, 2019 [8 favorites]

I really enjoyed this but I read it on the front page as “terminal aspect ratios” to mean asymptotic aspect ratios or some other math-proofy thing so I was quite puzzled for a while.
posted by janell at 7:45 PM on October 24, 2019 [1 favorite]

I wonder if the currency stock size descends from the quarto and octavo paper stock system.

Yes, but in a complicated way.

Quarto and octavo refer to folded sheets of paper. A full sheet is folded once to make a folio, twice for quarto (i.e., four leaves), three times for octavo, etc. The origins of "what is the size of a full sheet" are kind of murky and the dimensions varied by time and place and manufacturer, but there seems to be a sweet spot around 19" x 25" or 48 cm x 64 cm.

The ideal ratio for the sides of a sheet of paper is actually 1:√2, because paper with that ratio can be folded in half to make two sheets with the same ratio, which can be folded to make two sheets with the same ratio, etc. ISO standard 216 dictates A-, B-, and C-sizes of paper based on this ratio, which is followed in many countries (including Australia). Old-timey papermakers sort of knew this, which is why the ratios of old-timey paper are sort of in that proportion, but there was a certain amount of variation.
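The folding property can be checked numerically (a small sketch using the ISO 216 A-series, where A0 is 841 x 1189 mm, roughly one square metre):

```python
import math

# A 1:sqrt(2) sheet folded in half gives two sheets with the same
# aspect ratio -- the property described above. Each fold halves the
# long side (and the area).
def fold(short_mm, long_mm):
    """Fold across the long side; the halves keep the ~1:sqrt(2) ratio."""
    return long_mm // 2, short_mm

size = (841, 1189)                       # A0
for n in range(1, 5):
    size = fold(*size)
    print(f"A{n}: {size[0]} x {size[1]} mm (ratio {size[1] / size[0]:.4f})")

print(f"target ratio sqrt(2) = {math.sqrt(2):.4f}")
```

Each printed ratio hovers around 1.414, and the chain lands exactly on the familiar A4 size of 210 x 297 mm.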

At one point in time everything printed was confined to the dimensions of a full sheet folded to make 2, 4, 8, 12 (duodecimo) or whatever pages. Hardly anyone would use different sizes, because that would mean throwing away paper. But that was because the techniques at the time made individual large sheets; when machines capable of making continuous rolls came around, it was practical to make sheets of any length, only restricted by the width of the roll. So that's where currency sizes appear to have come from: they're either an integer fraction of the height and width of an individual large sheet, or an integer fraction of the width and length of a roll, which length and width themselves were integer multiples of a full sheet.

Incidentally, you know why old-timey large sheets of paper tend to be around 19 x 25 inches? It was frequently the subject of regulation, but the regulations were enacted because the biggest parchment books (codices) were based on folios of that size, which is still the same as the largest sheets of parchment you can conveniently buy today. The king - or his council - or, more likely, his librarian - would have wanted to make sure that books printed on new-fangled paper fitted on the shelves built for ancient books and scrolls in the Royal Archives. The oldest regulations (made in Bologna IIRC) were promulgated around 1380, so the ancient scrolls were probably a few hundred years older, composed perhaps around the year 1000 CE.

That leaves a question: why is 19 x 25 inches a standard size for parchment? I think it must be that someone observed that although animals are different sizes, 19" x 25" tended to be a good average for the largest sheet you could make from a sheepskin, or half the size of the largest sheet you could make from a calfskin. Many hides would have been greater or smaller than that, but the extra size wouldn't help you make a book if all the rest of your sheets were smaller. As evidence for this, I note that scrolls of the Torah are written on parchment, and the standard size is apparently 19" high. So when it comes down to it, currency sizes are ultimately based on the average hide of domestic animals that lived about 1000 years ago.
posted by Joe in Australia at 7:46 PM on October 24, 2019 [27 favorites]

But does the 19-inch rack also derive ultimately from the size of mediaeval cattle hides, or is it a coincidence?
posted by acb at 3:00 AM on October 25, 2019 [5 favorites]

That post has been hanging over me for months, and I'm glad I finally got it over the line; it's really great to see this on the blue.

Hi, everyone!
posted by mhoye at 3:37 AM on October 25, 2019 [18 favorites]

They were great for notes, shopping lists, etc.
They still are. I use them every day. I've only got about 12 boxes left (~24,000), but I think I have a lifetime supply. I would share, but shipping a box is expensive.
I saw cards being punched as late as 1992, but yeah, their time has passed.
posted by MtDewd at 6:28 AM on October 25, 2019 [1 favorite]

I used punch cards on an IBM 370 at Penn State in the early 80s. Even then it seemed antiquated but the CS professor at the tiny branch campus didn't believe in those new-fangled terminals.
posted by octothorpe at 6:34 AM on October 25, 2019

Holy crap! Just checked- I only have 6 1/2 boxes of cards left!
I acquired 5 cases (25 boxes) about 30 years ago, so I've been using over 1000 cards a year. Maybe I need to slow down.
posted by MtDewd at 6:53 AM on October 25, 2019 [5 favorites]

in 1926 he did just that, replaying that mechanically encoded signal through a CRT [...] Baird’s original “Televisor” showed its images on a 7:3 aspect ratio vertically oriented cathode ray tube

Picky comment: the Televisor didn't use a CRT - it produced its image using a neon bulb shining through a spinning Nipkow disc. CRTs for television only became commercially available in the early 1930s, and mechanical scanning had a surprisingly long lifespan in specialist applications like telecine and large-screen projection.
posted by offog at 9:40 AM on October 25, 2019 [1 favorite]

Only a few oddball things like games ported from UNIX used ANSI.SYS

Okay, so we're going back like 25+ years here, but if I'm not mistaken I think you needed ANSI.SYS loaded to see the color ASCII "graphics" on BBSes back in the day. I think this might have eventually changed when terminal programs started doing their own ANSI translation? [citation needed]

Or maybe you had to have it loaded it if you were running a BBS? Or if you wanted to view ANSI files from the command line? Not terribly clear on this, but I'm pretty sure I had to have ANSI.SYS loaded for something BBS-related back in the day.
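For reference, the sequences ANSI.SYS interpreted are standard ANSI/ECMA-48 escape codes, still understood by modern terminals; a small sketch (the helper names here are illustrative):

```python
# ESC [ ... m selects colors/attributes (SGR); ESC [ row;col H moves
# the cursor. A BBS "ANSI art" file was just a stream of these mixed
# in with ordinary text.
ESC = "\x1b"

def sgr(*params):
    """Select Graphic Rendition: e.g. 1=bold, 33=yellow fg, 44=blue bg."""
    return f"{ESC}[{';'.join(str(p) for p in params)}m"

def goto(row, col):
    """Move the cursor (1-based), as ANSI art files did constantly."""
    return f"{ESC}[{row};{col}H"

banner = goto(1, 1) + sgr(1, 33, 44) + " WELCOME " + sgr(0)
print(repr(banner))
```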
posted by panama joe at 10:44 AM on October 25, 2019

a big iron DP facility

Sorry, a what?
posted by Wolfdog at 10:49 AM on October 25, 2019 [3 favorites]

DP = data processing. Payroll, stock control, invoicing, taxes: the sort of thing you'd send your paper records to a computer bureau to be keyed by a room full of "punch girls". The clever men in the machine room would run these data through the mainframe and produce your weekly reports. Everything but taxes is now handled by desktops/in the cloud.

(yes, use of sexist language deliberate. My dad ran a bureau in the 1970s and despite the young women being the best and brightest from the local schools and colleges, there might as well have been a blood/brain barrier preventing them from getting jobs in the machine room. This was just a few years after getting married automatically meant resignation for women in the UK. More in Mar Hicks' amazing/depressing Programmed Inequality.)
posted by scruss at 11:05 AM on October 25, 2019 [3 favorites]

And 'big iron' refers to mainframes, not 'pots and pans'- i.e. typewriters and little stuff.
posted by MtDewd at 11:16 AM on October 25, 2019

In the photos I've seen of the era, it's not the complete lack of women operating computers that surprises me, it's that there is never more than one, even into the late 70s and early 80s.

By then, there'd often be more than one woman pictured, especially in promotional shots, but usually something like a woman pushing a mail cart while four men stand around a woman seated at a teleprinter or male supervisor closely supervises female operator doing all the actual work changing tapes and releasing jobs with a receptionist/secretary off to one side working a phone or taking notes or some such.

In some ways, it seems more effective at conveying the boys club mentality than excluding women entirely.
posted by wierdo at 11:29 AM on October 25, 2019

I think Wolfdog was just making a sex joke, but I appreciate the explications anyway.
posted by cortex at 11:32 AM on October 25, 2019 [1 favorite]

I was not; I just could not figure out what that particular string of words and letters meant.
posted by Wolfdog at 11:57 AM on October 25, 2019

well now I feel like I tricked myself into making one indirectly
posted by cortex at 12:32 PM on October 25, 2019 [8 favorites]

The history of computing is weird and convoluted and in some ways we're still hobbled by it today. These retrospectives give us a nice view into how we ended up where we are now and a good insight into path-dependency.

Modern programmers take two's complement arithmetic and 8-bit bytes for granted, but I've programmed in C on a computer with a 36 bit word, 9 bit bytes (also 6 bit bytes) and one's complement arithmetic (Univac System 80). A lot of the oddities in C and C++ are there to ensure that it's still possible to have a compliant implementation on outlying hardware. What happens when you take the remainder of a negative number*? C++20 will finally mandate two's complement arithmetic, but AFAIK has not thrown down the gauntlet on 8-bit bytes. Meanwhile, how long is an int or long? Who knows? Little-endian (sigh...) has seemingly won the day but the truth is it doesn't really affect me unless I'm looking at a hex or octal dump.

* Fortran defined the result of the remainder of a negative number wrong (it should have been a proper modulus) and people built hardware wrong to accommodate it. I fear we'll never be free of that particular idiocy, but at least the RS/6000 tried. Someday our robot overlords will mock us for this, as our intergalactic friends will someday snicker behind our backs for defining pi half too small. :)
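The difference the footnote is complaining about can be shown concretely (a small sketch; Python's % happens to be the floored flavor, while C and Fortran truncate):

```python
import math

# C and Fortran truncate the quotient toward zero, so the remainder
# takes the sign of the dividend; a floored ("proper") modulus always
# matches the divisor's sign instead.
a, b = -7, 3

trunc_rem = a - b * math.trunc(a / b)   # C-style: -7 % 3 == -1
floor_mod = a - b * math.floor(a / b)   # floored:  -7 mod 3 == 2

print(trunc_rem, floor_mod)             # -1 2
assert floor_mod == a % b               # Python's % is the floored one
```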
posted by sjswitzer at 12:43 PM on October 25, 2019 [3 favorites]

The CDC 6000/7000 series "supercomputers" had a 60 bit word and used ones' complement arithmetic. Ones' complement has two kinds of zero, plus zero and minus zero. It also uses an end around carry/borrow for add/subtract, which means it cannot detect integer over/under flow in hardware. This is not a good thing. One's complement hardware uses half the gates of two's complement, which used to be a big cost savings.
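Both quirks can be sketched in a few lines (an illustrative 8-bit model, not any particular machine's implementation):

```python
# Ones'-complement arithmetic on 8-bit words, showing the end-around
# carry and the two representations of zero mentioned above.
BITS = 8
MASK = (1 << BITS) - 1        # 0xFF

def oc_neg(x):
    """Negate by flipping every bit (no +1, unlike two's complement)."""
    return x ^ MASK

def oc_add(x, y):
    s = x + y
    if s > MASK:              # carry out of the top bit...
        s = (s & MASK) + 1    # ...wraps around to the bottom (end-around carry)
    return s

plus_zero, minus_zero = 0x00, oc_neg(0x00)  # 0x00 and 0xFF both mean zero
print(hex(minus_zero))                       # 0xff

print(oc_add(5, oc_neg(3)))                  # 5 + (-3) = 2, via end-around carry
```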
posted by Metacircular at 1:45 PM on October 25, 2019 [1 favorite]

One's complement hardware uses half the gates of two's complement, which used to be a big cost savings.

You may be right but IIRC, in 2's complement you do addition by just doing the unsigned operation and squinting at the result through a 2's complement lens. Subtraction can be done by flipping bits and jamming a carry bit into the adder you already have. Negation is just subtraction from zero. I'd have to think a bit more about multiplication but it can't be double the gates. Division is hard no matter what.

So anyway, yeah, certain operations (negation and ?) have more gate delays but you can use the gates you already have.

The real reason 2's complement took so long to take off (and also that remainder is wrong!) is that the computer engineers didn't yet know modular arithmetic.
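That reuse of the adder can be sketched directly (an illustrative 8-bit model; the helper names are made up):

```python
# Two's-complement subtraction reuses the unsigned adder: invert the
# subtrahend's bits and jam a 1 into the carry-in, exactly as described
# above. Negation is just 0 - x, i.e. ~x + 1.
BITS = 8
MASK = (1 << BITS) - 1

def add(x, y, carry_in=0):
    """The one adder the hardware already has (unsigned, mod 2^BITS)."""
    return (x + y + carry_in) & MASK

def sub(x, y):
    return add(x, y ^ MASK, carry_in=1)   # x + ~y + 1

def to_signed(x):
    """Squint at an unsigned result through a two's-complement lens."""
    return x - (1 << BITS) if x & (1 << (BITS - 1)) else x

print(to_signed(sub(5, 9)))   # -4
```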
posted by sjswitzer at 2:51 PM on October 25, 2019

Oh god, it's been so long since I took computer hardware classes and had to remember what 1's and 2's complements were. Do CS students even learn that stuff now?
posted by octothorpe at 2:58 PM on October 25, 2019

Do CS students even learn that stuff now?

I would say it varies. Some do and some don't. I'd also say that you can do good work without knowing it and that's a good thing! A surgeon doesn't need to be up on the latest microbiology and a microbiologist doesn't need to be up on, say, the quantum mechanical basis of chemistry. Abstraction levels are a good thing and "leaky abstractions" are the bane of computer science.

There will always be room for--and a need for--abstraction busters. Today's computers don't work in any straightforward way that can be deduced from their architecture manuals (hence all of the side-channel attacks based on caches and speculative execution). So there's a threat surface because of leaky abstractions that needs to be investigated and protected. More dear to my heart is optimizing crucial algorithms like associative arrays to be cache-friendly (we still suck at this).

But that's just my jam. When we make it possible for, say, a JavaScript programmer to focus on user experience and not worry about data representation, we've made progress. A lot of UX developers can code circles around me and stay up on the latest toolchains and frameworks and they have no idea what 2's complement is. The same is likely true of backend developers as well, so this is just an example.

So anyway: sometimes, not always, and it's fine.
posted by sjswitzer at 3:31 PM on October 25, 2019 [2 favorites]

This thread is so much fun. BTW, I have ~18" of MOHAWK LOW VOLTAGE COMPUTER CABLE (about 1" in diameter) that by received history was used to connect Univac computer bits over distances... Only held up by trusting the stories of my grandfather and the common knowledge of the olds that a certain road in my home-town used to be called Univac Road (there's a High School that I attended there now...). My grandfather kept it on the dashboard of his truck for ages as a slapstick / bludgeoning device just in case you needed to whack someone (he was an old time-y truck driver). It suits that purpose. It would be really painful to get whacked with it. Old stories also go back to hiring women to break out all of those little wires and solder them into place on interface boards because... needlework and better eyesight and meticulous detail. I'm like 99% sure this is a true story and I have a bit of Univac interconnect cable.

I constantly have to worry about the CS majors of now that basically learn Javascript or Python and are more vocational than the plethora of things that my bit of CS was back in the day. I somehow doubt that most places make CS students build CPUs on breadboards and program them with DIP switches anymore.

Get off my lawn!
posted by zengargoyle at 5:38 PM on October 25, 2019 [3 favorites]

lol, I am old. I started off with paper tape back in the early 70s. I moved our Job Control Language from punched cards to disk at a bank in Canada back in the mid 70s. And got rid of paper tape for printers while at State Farm in LA circa 1980. Last time I saw punched cards in use was around 2001 while at Bristol Myers. But everything on an IBM mainframe seems to be based on 80 columns (well, minus the last 8 columns for numbering in JCL) and 133 columns for output (gotta have that 1 column for FCBs)
posted by baegucb at 7:18 PM on October 25, 2019 [2 favorites]

In the photos I've seen of the era, it's not the complete lack of women operating computers that surprises me, it's that there is never more than one, even into the late 70s and early 80s.
I'm getting confused by following this thread and the Honeywell 800 thread at the same time. In that main link, there are two photos, and between them there are 4 women and one man operating the systems.
I've been in well over 100 computer rooms, and I'd say the men outnumbered the women, but not by much. Some shops seemed to be all men, and some seemed to be mostly women, but usually there was a reasonable mix. There might have been the old question about whether it was 'important work' (so, men) or just 'drudge work' (so, women), but I did not see this happening in my world.

~18" of MOHAWK LOW VOLTAGE COMPUTER CABLE (about 1" in diameter)
My first day at IBM, I was taken to Hershey Foods, where a brand new 370-135 was being installed. They had the covers off, and I saw these power cables.
They were only 1/2" thick, but I was impressed enough to ask "How much voltage is that?" (Thinking hundreds of volts)
The answer was 3V. (But you could weld with it)
posted by MtDewd at 12:49 PM on October 26, 2019 [3 favorites]

The result of that work is something you’ll certainly recognize, the standard IBM punchcard

I would certainly recognize a punch card, I punched a bunch of them myself. But I'm just wondering, MeFites of a certain (young) age, be honest, how many of you never heard of a punch card until reading this?
posted by beagle at 2:02 PM on October 26, 2019 [2 favorites]

But does the 19-inch rack also derive ultimately from the size of mediaeval cattle hides, or is it a coincidence?

It's possible that the racks are descended from drawers that were lined with 19" wide newspaper pages, but I think the alternative standard of 23" racks provides a clue.

Carpentry is much cheaper and quicker when every dimension is a simple fraction of the length of a piece of timber. Add an inch to either rack measurement and you have a round number, in Imperial units. If the original racks had an internal width of 2' then they'd accommodate 23" units with 1/2" clearance on either side. If the racks had an external dimension of 2' and were built with 2" thick supports the internal width would accommodate a 19" unit with the same clearance.
posted by Joe in Australia at 2:42 PM on October 26, 2019 [2 favorites]

My dad started out as a student doing keypunch part-time work for the university mainframe, then became an operator as an erstwhile student in the early 70s. I don't recall much, if anything, of the short period he did keypunch, but it was mostly students and probably gender neutral. Prior to that, he spent some years (also as an erstwhile student -- he didn't get his degree until I was a teen) doing bookkeeping. My mom was a banker. Both of them could operate adding machines, typewriters, and 10-keys very rapidly. For keypunch work, these skills were critical; and so I think it was most likely, as asserted above, clerical (post-war style) and leaned greatly female.

Much has been written about how data entry directly led to programming and therefore for many years there were many women programmers -- I think the inflection point for the change to a bias against women was probably the 70s, for a variety of reasons. Back in those days, remember (and this is related to the other discussion above), computing was the emergent conjunction of electrical engineering, computer science, business, and academia. So there was a complex interplay between prestige, education, and gender with regard to where some particular kind of work involved those factors and as it evolved over time. The EE side of computing was dominated by men, as was the analysis and managerial parts of business computing. Mainframe operators were fairly low-skilled but performed critical work; but it was also a path into programming.

By the late 70s and early 80s, there was a pretty big chasm between academic CS and business computing. You'd see a fair number of women as career programmers on the mainframe business side, which probably reflects the clerical + bookkeeping genesis of business computing and the pink-collar gender segregation of the 50s-60s -- women made their way into business programming via this route. Academic computing and CS had a much closer relationship to electrical engineering, which was (and is) mostly a boy's club. My experience of CS departments in the early 80s were that they were almost exclusively men. But from the mid 80s and through the 90s, when my dad worked his way up from a business mainframe programmer to executive, there was a significant minority of women programmers in that world. For example, while my dad's older brother was a Fortran/ADA programmer for DOD/DOE contractors, my uncle's wife was a COBOL programmer for the county/city government. My dad's second wife was also a COBOL programmer. And my mom's older sister, a Deaf women with an MBA, was a mainframe programmer -- although it was never clear exactly what she did because she worked at a highly classified level at Sandia National Laboratories for 40 years. (Sandia has a family visit one day of the year and the year I went, her group seemed to be the "Test Analysis Division" and there was a map of the Nevada nuclear test sites on the walls. Make of that what you will. My dad could never figure out what my mom's sister did. I still don't know. But then, most of the people I've known who've worked at LANL or SNL don't talk at all about their work.) In general, for that generation, women did a lot of programming, but where things veered more into EE, it was men. Programmers come from EE or CS or MIS. These were and are different but overlapping cultures. I'm twenty years out of date, but I think programming has always been a pretty large, diverse tent -- although certain areas can be very insular.

One weird thing is that even business mainframe COBOL programmers like my dad were, counterintuitively, often working closer to the metal than CS-trained coders today. Mostly, this had to do with resource limitations. My dad was noted for writing very memory efficient code.

It seems kind of inevitable that I'd end up in computing, with all these influences -- except that I came of age at the beginning of the microcomputer era and, wow, there was a huge cultural gap between that older generation and mine. As things switched to workstations and PCs and then networks in the 80s - 90s, my dad (being a narcissist) was very threatened by my growing expertise, but by the early 00s he came to openly respect me (which was far more gratifying to me than I like to admit).

For those of us with strong cultural roots dating back to that era, much of this computing flotsam is extremely familiar. Punch cards were everywhere. Defective drive platters. Boxes of 1403 line printer paper. (Anyone remember music played via the line printer using the control tape loop? Loudly.) Many of the rest of my relatives, including my mother, were in banking so add to this bankers boxes (convenient for storing paperbacks).

In my very early teens, my first taste of programming was with a timesharing minicomputer (not a VAX, but some weird Heathkit-looking thing) down the hall from my dad's office, running a BASIC interpreter. I really wanted to do something with that big, old IBM 360, so dad gave me an intro COBOL text and between that and learning that I'd be writing something that I would have to keypunch and then run as a batch with, at best, some minimal print output, I was very unimpressed. Not fun at all.
posted by Ivan Fyodorovich at 11:10 PM on October 26, 2019 [6 favorites]

The real reason 2's complement took so long to take off (and also that remainder is wrong!) is that the computer engineers didn't yet know modular arithmetic.

Ahem; the Manchester computers were all two's complement, starting from the SSEM in 1948. I suspect our old friends hubris and NIH have more to do with the American computers reverting to one's complement.
posted by scruss at 6:24 AM on October 27, 2019

Also, negative number bases were used for encoding positive and negative integers at one point, at least in the Soviet bloc. (There was a Polish base -2 computer in the early 1960s, and IIRC, the USSR may have had similar machines.)
posted by acb at 6:37 AM on October 27, 2019 [1 favorite]

I used to have a bunch of PDP paper tape, IIRC one chunk was a debugger. Sadly I used it to decorate my ceiling (kicks old self). I'm not sure how true it is, but it seems reasonable... the 7-bit ASCII DEL character is 0x7F (0b1111111) so that if you made a mistake while punching, you could backspace and punch a DEL which was the one character that could overpunch everything else and remain itself. Then the reader could just ignore the DEL characters.
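The trick works because punching is a one-way operation (a small sketch of the idea; the tape here is just a list of byte values):

```python
# Why DEL (0x7F, all seven holes punched) worked as paper-tape
# "backspace": holes can only be added, never filled in, so
# overpunching is a bitwise OR -- and OR-ing anything with all-ones
# gives all-ones. The reader then just skips DEL bytes.
DEL = 0x7F

def overpunch(existing, new):
    return existing | new               # punching more holes can only set bits

tape = [ord(c) for c in "HELLP"]        # oops, meant "HELLO"
tape[4] = overpunch(tape[4], DEL)       # rub out the bad character
tape.append(ord("O"))                   # punch the correction

decoded = "".join(chr(b) for b in tape if b != DEL)
print(decoded)                          # HELLO
```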
posted by zengargoyle at 6:02 PM on October 27, 2019 [3 favorites]

That was like listening to James Burke after he'd drunk ~3 too many cups of coffee.

On that note, I'm pretty sure it was on this show of his that I learned about Jacquard looms back in the 1700s using punch cards as instructions for weaving.
posted by exogenous at 7:55 PM on October 28, 2019 [2 favorites]

Jacquard loom programmers were the first frontend devs.
posted by acb at 1:55 PM on October 31, 2019 [2 favorites]

In some alternative universe a printer thought of using Jacquard loom cards to typeset legal documents rather than paying clerks to write hundreds of copies with slight variations. Ada Lovelace seized on the idea and designed the first true output device. Presented with Lovelace and Babbage's combination data store, processor, and printer, the British Home Office quickly began digitising all government records. This was followed by other Departments, such as the Treasury, which introduced currency dispensers activated by punched-card authorisations. These authorisations could be (and were) used as an alternative to physical currency. The British economy, freed from its reliance on gold, was vastly more flexible than its European counterparts, and the new Jacquard credit systems began to be adopted abroad. Meanwhile, a young German émigré scholar in the British Library was drawing some very interesting conclusions about the relationship of labour and capital ...
posted by Joe in Australia at 7:20 PM on October 31, 2019 [4 favorites]

Ken Shirriff with a riposte: “IBM, sonic delay lines, and the history of the 80×24 display.”
posted by channaher at 4:54 PM on November 7, 2019 [3 favorites]

Cool stuff, channaher. BTW (prompted by "replacing teleprinters with CRT terminals was a large and profitable market"), some of you may be interested to know that, even to this day, amateur radio nerds have contests to see how many other amateur radio nerds they can talk to around the world using radioteletype over the air rather than this newfangled internet.
posted by exogenous at 5:47 PM on November 7, 2019

Excellent link channaher. Thanks!
posted by zengargoyle at 2:36 PM on November 8, 2019

I agree. I was thinking of 3270s when I read the FPP, because that's where my work and terminals first intersected.
Also, people complaining about keyboards should be glad we didn't keep the 2260 method of using the keyboard from a keypunch.

Story about the 2260 from a long ago co-worker: This guy was the IBM account CE at the White House Communications Agency computer room when it was still in the Old Executive Office Building.
There was another guy from [NSA??] doing RF testing on Tempest shielding working out on a balcony with antennae and some sophisticated gear. On the scope was the image from a (leaky) 2260 inside.
He showed the IBM guy the screen image and told him to fix the Tempest shielding on that device. He also asked why the image on his screen was rotated 90 degrees. (Not familiar with the vertical raster.)
posted by MtDewd at 12:21 PM on November 9, 2019

channaher's link is indeed interesting, but I can't get over the mislabeling of the Enter key's function on the 3270 (and the still-current 5250). The spot where you'd expect to find the enter key today was actually the Field + button. Modern emulators map the tab key to Field + and use Enter to submit the form, but the keys were rather different on the physical terminal.

Also, 80x24 became 80x25 because of the added status/help bar at the bottom of the screen on the 3270 and 5250, at least in IBM-land. I can't say when/why it happened on DEC and compatible VT series terminals, though I know that it predated the PC.
posted by wierdo at 6:20 PM on November 9, 2019
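The "two kilobytes is two kilobytes" arithmetic behind the 24-vs-25-line question is easy to check: 80 columns times 25 rows just squeaks under a 2 KB character buffer, while a 26th row would not fit (my own back-of-the-envelope, assuming one byte per character cell):

```python
BUFFER = 2 * 1024          # 2 KB of display memory, one byte per cell

assert 80 * 24 == 1920     # the 24 visible rows of an 80x24 screen
assert 80 * 25 == 2000     # adding a 25th (status) row
assert 80 * 25 <= BUFFER   # still fits in 2 KB...
assert 80 * 26 > BUFFER    # ...but a 26th row would overflow it
```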

It wasn't Field+, it was the New Line key, at least on the 3277.
But yeah, the Enter key was down by the spacebar. And the 3277 had 24 lines plus the status line. (I've got the number 1920 stuck in my brain still)
posted by MtDewd at 4:48 PM on November 11, 2019

This thread has been archived and is closed to new comments