Unix at 50: How the OS that powered smartphones sprouted from failure
September 23, 2019 7:37 AM

Unix, the operating system that in one derivative or another powers nearly all smartphones sold worldwide, was born 50 years ago from the failure of an ambitious project that involved titans like Bell Labs, GE, and MIT. Largely the brainchild of a few programmers at Bell Labs, the unlikely story of Unix begins with a meeting on the top floor of an otherwise unremarkable annex at the sprawling Bell Labs complex in Murray Hill, New Jersey. Unix at 50, Richard Jensen (Ars Technica) recaps the era from Multics (MIT; Multicians) to GNU and Linus Torvalds, pulling heavily from science historian Michael S. Mahoney's interviews (Princeton).
posted by filthy light thief (65 comments total) 44 users marked this as a favorite
 
Previously: Compatible Time-Sharing System (CTSS), Computer Timesharing Systems explained in two videos (September 17, 2012)

Also: Multics, Requiescat in Pace. (November 12, 2000)
posted by filthy light thief at 7:40 AM on September 23, 2019 [1 favorite]


I always enjoy pulling up the terminal emulator on my phone to show nostalgic coworkers.
posted by aspersioncast at 7:45 AM on September 23, 2019 [9 favorites]


aspersioncast, that's really bringing it full circle.
posted by filthy light thief at 7:55 AM on September 23, 2019 [4 favorites]


The first Unix-like system I ever used, ~1984-85, was in a computer lab at my high school: a room full of terminals connected to an Altos 86 running Microsoft Xenix, which it booted from 8" floppies. The fact that today I can still run Unix-like stuff on my desktop (or laptop) has made me very happy over the years. Poking around has never gotten old, and I still learn new bits and pieces.
posted by jquinby at 8:16 AM on September 23, 2019 [8 favorites]


The UNIX philosophy is the correct philosophy.

I lived near Bell Labs in Murray Hill for a while, and had a few jobs there painting offices and moving stuff around. It was wildly cool, and I was constantly asking people if they knew where Ritchie and Thompson had worked, but no one had a clue.

My cousin was also convinced that he knew where Shockley's house in Summit NJ was but I never believed him.
posted by pilot pirx at 8:19 AM on September 23, 2019 [5 favorites]


It's time to update Peter Salus' classic book "A Quarter Century of Unix" to "A Half Century of Unix and More Than A Quarter Century of Linux". My first SLS install was in 1992, which seems like a few lifetimes ago in computering years...
posted by autopilot at 8:38 AM on September 23, 2019 [7 favorites]


At the height of the '80s dial-up BBS craze, someone here in Austin, unannounced and unidentified, gave me access to a restricted shell account. I was totally lost and clueless but also smitten. And remain so today in all regards...
posted by jim in austin at 8:46 AM on September 23, 2019 [11 favorites]


It's a bit disorienting to realize that Torvalds, working continuously since '91 (28 years now), accounts for the majority of Unix/Linux history. I remember when he was distributing binaries on the newsgroups, very much a newbie, and when the BSD port was being documented month by month in Dr. Dobb's magazine.

In other news, I have a lot of gray hair.
posted by bonehead at 8:54 AM on September 23, 2019 [22 favorites]


It turned out to be very fortunate for me that a BBS acquaintance lent me a Slackware CD in 1993 when he heard me complain about how crappy Trumpet Winsock was on Windows for Workgroups 3.11, which itself I had cadged off a guy from school so I could run the damn thing at all because 3.1 couldn't run the Win32 library.

After dealing with all that bullshit, all the config file hacking needed to get a dial-up Internet connection going on Linux seemed downright simple, so I stuck with that. That all the early TCP/IP software was primarily developed on a Unix variant (and still often actual AT&T Unix, though it was quickly dying at the time) and had sloppy Windows ports, if they ran on Windows at all, made it very easy to keep using it.

It certainly didn't help that compilers for Windows were still expensive and proprietary.
posted by wierdo at 8:56 AM on September 23, 2019 [4 favorites]


Metafilter: I was totally lost and clueless but also smitten. And remain so today in all regards...
posted by Multicellular Exothermic at 9:08 AM on September 23, 2019 [6 favorites]


I'd like to contribute my memories but I'm still trying to find a way to exit vi.
posted by JoeZydeco at 9:25 AM on September 23, 2019 [16 favorites]


One of the things that stuck with me in Stephenson's "In the Beginning Was the Command Line", or maybe it was from an interview I read with him after its publication in the late 90s, was his despair that in the decades since the creation of Unix nobody's been able to come along with something fundamentally better and different; at best there are interpretations of it (Linux) or complex systems built on top of it (Mac OS X). Attempts to create a true next-generation operating system (Plan 9, BeOS, Copland) that could succeed Unix have all died, not necessarily because of technical inadequacies but from an inability to achieve self-sustainability. Granted, Unix had the advantage of very few competing OSes (and userbases mostly numbering in the tens, not the billions) when it shipped, but at the time and in the decades since the continued existence of Unix and Linux seems to have suppressed popular interest in any fundamentally different systems.
posted by ardgedee at 9:27 AM on September 23, 2019 [8 favorites]


There was an excellent talk by Warner Losh of the FreeBSD project on the first 10 years of UNIX at EuroBSDCon this weekend. It was recorded, and the video should appear some time in the next few days.

https://2019.eurobsdcon.org/talk-speakers/#unix
posted by adventureloop at 9:36 AM on September 23, 2019 [2 favorites]


Meanwhile, one of the things I love about Vinge's A Deepness In The Sky is the strong implication that underneath 30,000 years of history and interstellar colonization, the Qeng Ho still runs on top of a Unix system.
posted by thecaddy at 9:54 AM on September 23, 2019 [24 favorites]


merits of beos aside, it was strangled
posted by lescour at 9:55 AM on September 23, 2019 [9 favorites]


Yay, UNIX stories. I went from the early Apple ][ with only integer BASIC to the Amiga 1000 in high school. That last summer camp at a university... the undergrads would let us into the dorm's computer room to play around on the guest accounts. The next year was going to university and plopping down in front of Sun workstations (and the occasional VT or PDP-11) working for Computer Services. Networks and big servers and tape drives and console printers and modems. Getting chastised for using a closed lab full of workstations to do ray-tracing. I found out that most big Amiga people were really UNIX people who wanted a workstation at home. Same processor, same executable format, cross-compiling was a breeze. I would later leave two jobs because they wouldn't buy me a UNIX workstation. Fuck this Windows shit, it sucks. I eventually had some $$ and Linux was there now. I started with a Yggdrasil CD and built my workstation all balls out. Fastest CPU, many memory, SCSI controller; had to go and buy a second computer with IDE to compile a kernel that would work on my workstation (then returned it, thanks Best Buy). I moved to Slackware soon enough. Set up a serial console for my roommate. (Had to dual-boot to DOS to play DOOM!, multiplayer over the modem and IPX drivers, natch.) Got my hands on an unlabeled DX4-100 probably snagged off the line. Eventually ended up back at university doing UNIX stuff for decades. Sitting now in front of a Debian laptop running the same old basic openbox and xterms.

I can manage some Windows/Mac things, but not really. They're a total mystery. UNIX and I are about the same age, and it's like a 3rd love that has lasted forever.
posted by zengargoyle at 9:57 AM on September 23, 2019 [10 favorites]


[...] despair that in the decades since the creation of Unix nobody's been able to come along with something fundamentally better and different [...]

Would a new OS really solve the most pressing problems in computing, though? Right now it seems like security is one of our biggest problems, but I wouldn't blame the OS for that -- that seems like more of an issue with the structure of the internet, as well as our natural susceptibility to social engineering attacks. And I suppose politics now, too.
posted by panama joe at 9:57 AM on September 23, 2019 [1 favorite]


My intro to UN*X was installing NeXTStep on a NeXT Cube in 1991. After that, I was hired to admin systems including DEC OSF/1 in 1993. Since then it's been lots of Linux variants. Pretty much everything I work on today has a UNIX back end, including OS X and CentOS.

I think my first exposure to any kind of multiuser OS was ISCABBS in 1990 in college. They didn't have any networking throughout the campus, so you had to modem in at 1200/2400 baud to a terminal server that would allow you to telnet. ISCABBS was the only system that any of us knew about at the time, but the concept of hundreds of people throughout the world chatting at the same time just blew my mind.

@ardgedee I too am really disappointed that nothing significant has come out to replace UNIX, Windows or OSX in the past 30 years. Since everything is moving to the cloud and the browser is becoming the OS, we may not see any new operating systems going forward. Which is a shame. I always loved messing with something really interesting and new (like BeOS).

On an enterprise level, the pendulum is swinging back to command line. There's enough OS there to run whatever cloud / web application is needed and that's it. No need for NextStep or BeOS or any shiny new GUI anymore.
posted by ensign_ricky at 10:00 AM on September 23, 2019


ardgedee: the continued existence of Unix and Linux seems to have suppressed popular interest in any fundamentally different systems

panama joe: Would a new OS really solve the most pressing problems in computing, though? Right now it seems like security is one of our biggest problems, but I wouldn't blame the OS for that -- that seems like more of an issue with the structure of the internet, as well as our natural susceptibility to social engineering attacks. And I suppose politics now, too.

That's a good question.

But the bigger issue is momentum and legacy systems. Moving from "interesting experiment" to "full-fledged platform" seems like a pretty big hurdle, though there might be more room for something new in the mobile market, given that most people don't peek under the hood and just want something that works well and reliably.
posted by filthy light thief at 10:11 AM on September 23, 2019


Attempts to create a true next-generation operating system (Plan 9, BeOS, Copland) that could succeed Unix have all died, not necessarily because of technical inadequacies but from an inability to achieve self-sustainability.

Any conversation about why Unix has had such longevity has to start with the incredibly influential 1991 article on why "Worse is Better".

Every time I've bet on a sophisticated and rigorous technical solution winning in the marketplace, I lost my bet. The simple, extensible, Unix-philosophy technologies win every time over the long term, no matter how ugly and hacked-together they look.
posted by fuzz at 10:11 AM on September 23, 2019 [7 favorites]


Multics didn't really fail in 1969; it kept going for many years afterwards, just without MIT and Bell Labs. Unix has its upsides, but it also led to the propagation of the C language, which is a major cause of security woes.

I'd say one issue that led to the failure of most OSes that came after is how hard it is to find a good organization to develop something as complex as an operating system; it's difficult in the private sector, where companies can die for unrelated reasons and good systems can succumb to commercial pressure; it's not really feasible in academia, where the funding and publishing model isn't suited for that type of long-term development; volunteer development can work, but it's difficult to build a team for a brand-new project, and funding is hard.
posted by Monday, stony Monday at 10:22 AM on September 23, 2019 [5 favorites]


I managed to see Jobs give a NeXT presentation in the campus auditorium. We had a couple for testing, but it was basically a Sun shop and everybody thought the optical R/W storage was just too slow. I used a NeXT cardboard box as my laundry basket for years.

Most of the security sort of things are Microsoft/Apple and their initial not-internet-connected hacks to make things plug-and-play easy. Don't start with Users/Groups, everybody is Admin. Don't start with permissions, there are none. Then realizing that they have to bolt them on as an afterthought. It's not that big of a problem when at best you have a small home/office network and pretty much trust everyone and it's all easypeasy to plug things in. They also suffered from Not Invented Here syndrome and wrote their own things with a bit less finesse.

I'd really rant, but I think they are getting a bit better. Microsoft has gotten a lot better from the '98 days until now. Apple has sorta switched over under the hood.
posted by zengargoyle at 10:36 AM on September 23, 2019 [1 favorite]


despair that in the decades since the creation of Unix nobody's been able to come along with something fundamentally better and different

This kind of ignores the fact that much of Unix and friends is so replaceable that what we call Unix now bears little resemblance, in code, to the Unix of Bell Labs. What it does preserve is a detailed set of protocols and interfaces that are standard, but the underlying code, even its architecture, has been developed and innovated on for decades now. Unix is a grandfather's axe; both handle and head have been replaced many times, but it's still the best axe we have. Besides, Linux is itself a re-implementation and has itself been re-implemented several times. Things aren't standing still at all.

Secondly, this also underestimates how important it is to have a system that has very well-understood characteristics, including faults and weaknesses, that can still run old software (either directly or by recompilation). Robustness is a test we value highly in experimental sciences, and I see a lot of parallels in using a known piece of code like Unix.

Not to say that new things shouldn't be looked at closely when they come along, but I think the value in renovating and rebuilding an existing working system that is highly fault-tolerant is often undervalued in the tech world.
posted by bonehead at 10:36 AM on September 23, 2019 [21 favorites]


I was an early adopter of Andy Tanenbaum's Minix. I forget the exact details, but I watched it play out on the mailing list roughly as follows. Andy was adamant that Minix should have teaching value and so should be simple and stripped down, and he was resistant to lots of change requests that were made. A noisy youth wanted more drivers and for it to be more useful as a desktop OS; eventually they fell out, and that youth went on to create his own OS. He called it Linux.
posted by epo at 10:43 AM on September 23, 2019 [8 favorites]


As I recall, Minix was part of a course, and Tanenbaum didn't want it to get too big or complex because then it wouldn't be teachable in the semester or two he had to cover the material. He also didn't want multiple Minixes and derivatives out there causing confusion. And so the Finn decided to bang out his own kernel (in a very short period of time) out of frustration.
posted by bonehead at 10:47 AM on September 23, 2019 [3 favorites]


despair that in the decades since the creation of Unix nobody's been able to come along with something fundamentally better and different

This short-changes Windows NT IMO. It has a lot of VMS in its DNA and does a lot of things fundamentally better than Linux, like user account security.

An article from 1998 is a fun read, Windows NT and VMS: The Rest of the Story
posted by Space Coyote at 10:48 AM on September 23, 2019 [9 favorites]


Every time I've bet on a sophisticated and rigorous technical solution winning in the marketplace, I lost my bet. The simple, extensible, Unix-philosophy technologies win every time over the long term, no matter how ugly and hacked-together they look.

It's no coincidence that every honest invocation of "worse is better" frames it in terms of marketplace adaptability, because as a sentiment it becomes a lot harder to justify when you actually take externalities into account. The sentiment may be accurate in terms of its enabling the spread of computing in the first few decades of microcomputers, but now with computers being as ubiquitous and essential as they are, "worse is better" is a humongous liability. Implicit in the Unix philosophy, IMO, is that programmers should favor foisting complexity onto the user rather than taking it on themselves, and that's part of why I think there's value in seriously considering alternative models.
posted by invitapriore at 10:58 AM on September 23, 2019 [3 favorites]


I'd like to contribute my memories but I'm still trying to find a way to exit vi.

So in my previous life in academia, one of the things I did was help teach a course on Operating Systems and Programming for first-year Mechanical Engineering students (don't ask). I once told a student who was flailing around in vi to just exit, and they got up and left the room.
posted by each day we work at 11:02 AM on September 23, 2019 [30 favorites]


I was an early adopter of Andy Tanenbaum's Minix

Tanenbaum's "Operating Systems" was one of the most important books I ever read.
posted by mikelieman at 11:13 AM on September 23, 2019 [3 favorites]


This short-changes Windows NT IMO. It has a lot of VMS in its DNA and does a lot of things fundamentally better than linux, like user account security.

So much hay was made about how NT 3.51 was C2 (I think?) compliant. The government says it's secure enough to hold classified data! as long as you don't plug it into a network

In reality, gaining Administrator on an early NT box from the console was no less trivial than getting root on a Linux box, which was limited only by how fast the machine could POST.
posted by wierdo at 11:17 AM on September 23, 2019 [1 favorite]


It's interesting that while Unix arose from the ashes of Multics, it eventually became, for all practical purposes, Multics:

- Hierarchical file system
- Implemented in a high-level language
- Virtual memory
- Memory-mapped I/O
- Devices represented in the file system
- All system resources (aside from devices) represented in file system
- Privilege rings (as of KVM, VirtualBox)
- Arpanet Networking

The mistake of Multics was probably to attempt too many things at once. And the GE-645 was a pretty limited and expensive machine by today's standards. Minicomputer capabilities were creeping up on it, and minis were a lot cheaper.

The crucial Unix innovation, aside from its necessarily parsimonious set of system primitives, was the pipe (and the shell). That enabled high-level scripting using a small(ish) number of composable special-purpose software tools. (I once implemented a halfway credible ELIZA program in Bourne shell.) Unix may have been the first OS to realize that a text file could be as simple as a sequence of bytes separated into lines by a "newline" character; gone were record control words and now text files were (nearly) compatible with terminal I/O.
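
For flavor, here's the sort of thing that composability buys you; a toy example off the top of my head (not from the article), counting the five most common words in a hypothetical notes.txt:

# split the text into one word per line, lowercase it, then count and rank the duplicates
tr -cs '[:alpha:]' '\n' < notes.txt \
  | tr '[:upper:]' '[:lower:]' \
  | sort | uniq -c | sort -rn | head -5

Six small special-purpose tools, no programming beyond the shell, and every stage reusable elsewhere.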

I remember Tracy Kidder's Soul of a New Machine, which described the heroic efforts of the team behind the Data General Eclipse. They developed a new machine architecture, operating system, compilers, utilities, etc... all from scratch. They would be among the last to do so (along with Apollo). It was getting pretty easy to develop minicomputers that were VAX-equivalent, and for a while in the mid-eighties companies were leapfrogging each other to do so. They realized that they could save a lot of time and effort by just building machines to run UNIX. One even basically cloned the VAX, making minor modifications and leaving out instructions the C compiler didn't use. Kidder apparently never noticed that he was chronicling the end of an era. As for myself, I briefly programmed an Eclipse; it was running UNIX.

Soon enough the 68000, then the 68020 in particular, crept up on the capabilities of minicomputers, ushering in the era of workstations, which was tragically but temporarily eclipsed by the PC era; I spent a decade wandering in the Windows wilderness. Eventually, the x86 architecture became capable enough to compete with workstations and that era came to an end, but the seeds had been planted for Unix to rise again.

Finally the SoC era (and ARM, etc.) brought powerful computers to the palms of our hands. Unix was the only logical choice; the road to do so had already been paved.

Still, Unix would almost certainly not be what it is today if not for its slow liberation from proprietary vendors. For that, we have to thank a lot of highly contingent history that could easily have gone very differently. But the fact that Unix is, at its core, a simple set of primitives with services layered on top enabled its evolution to be recapitulated several times over.

I think we'll be "stuck" with Unix for the foreseeable future. We've reached the point where Unix is the DNA of computing and while other foundations might be possible or even superior, it's evolutionarily locked-in. There's nothing wrong with Unix that can't be fixed (real-time remains a challenge and it's a shame that fault-tolerant systems architectures seem to be left by the wayside). You could write a whole new kernel if you wanted to but the userland interface would still be UNIX because it would be insane to walk away from all the software and community that's built on it. Mac and now even Windows have a Unix userland interface.
posted by sjswitzer at 11:18 AM on September 23, 2019 [11 favorites]


In my decade in the Windows wilderness I drank a lot of the kool-aid and came to appreciate a lot of its design. DLLs, for instance, are a sound modularity principle and COM wasn't as bad as people seemed to agree at the time. I have fond memories of Visual Studio 4.2. But even in those benighted days before Longhorn crashed and burned I realized that proprietary operating systems could not prevail in the end and Unix was the only serious contender.

I wondered what universities would teach CS students in a world of proprietary operating systems and platforms. CS415: Thread Safety in Windows DLLs? No way. Not happening. It was a generational issue that would soon become absurd if nothing changed. Nobody can predict the future but impossible things never happen. You can bank on it.
posted by sjswitzer at 11:37 AM on September 23, 2019 [2 favorites]


Kind of a fun thought experiment: what if the early timeshare systems had been widely available to home users?

No CP/M. No DOS. No Windows. No Macs. No BBSes. The Internet boom taking place in the early 80s as opposed to the mid-90s. No "cloud computing," since computing would have never left the "cloud" to begin with. Some time in the 80s or 90s, you'd essentially see the first Chromebooks, as graphics hardware would have become cheap enough to reach home users.

I would argue the PC age set computing back by 10 to 20 years.
posted by panama joe at 11:59 AM on September 23, 2019 [5 favorites]


Kind of a fun thought experiment: what if the early timeshare systems had been widely available to home users?

No CP/M. No DOS. No Windows. No Macs. No BBSes.

I don't think so. Even with timeshares available, there would still be hobbyists who would want to build and customize their own computers at home. We would still have Moore's Law, which would mean that those home computers would be getting exponentially more powerful each year. And we would still have video game consoles, which could run games far more sophisticated than what could be delivered through a telephone line to a home terminal.
posted by 1970s Antihero at 12:15 PM on September 23, 2019 [9 favorites]


Happy Birthday UNIX!

I was lucky enough to grow up near Bell Labs Murray Hill and I had a summer job there when I was in high school, in the early 80s. I learned C and Unix and even got to (briefly) meet Ken and Dennis. Somebody set me up with a login and password and sat me down in front of a VT-100 terminal connected to the central UNIX server, and told me to type "man man", then left me alone for the rest of the day. That turned out to be the foundation for a career that has lasted me over 30 years and which I'm still (mostly) enjoying. I'm glad I learned UNIX, which has shown itself to have real staying power, and not some other now-obsolete system! I find it interesting that so many of the tools I use and rely on every day (UNIX, Fast Fourier Transform, etc) were born in the mid-60s, just like me.
posted by crazy_yeti at 12:20 PM on September 23, 2019 [8 favorites]


DLLs, for instance, are a sound modularity principle...

Any more so than SOs, ignoring for now the notorious lack of versioning in DLLs? My understanding is that the only practical difference is when they’re linked in the lifetime of a process.
posted by invitapriore at 12:40 PM on September 23, 2019 [2 favorites]


At the height of the '80s dial-up BBS craze, someone here in Austin, unannounced and unidentified, gave me access to a restricted shell account. I was totally lost and clueless but also smitten. And remain so today in all regards...

My first introduction to a shell and telnet was similar, but instead of a BBS it was a local university that had a free dial-up modem pool used to access their bespoke campus-wide library catalog, which could be easily exited with a ctrl-k break to drop you to a lightly restricted shell, including access to stuff like Telnet, Archie, Veronica and many other information services and search tools.

Meaning you could also telnet into any open telnet host, which rapidly started including gateways to private BBSes or things like text based MUDs on many different campuses or businesses.

So you had to do a lot of exploring and poking around to find anything. Reading finger pages might yield the address of a MUD or BBS. Someone's public directory might have a text file containing useful addresses. An Archie or Veronica index might yield a bunch of Telnet addresses of various kinds.

And you generally didn't know what a given host might have to offer before you logged in successfully with a guest account. It wasn't like the web-based internet at all. You could string together a lot of telnet links between different public guest shells while exploring and run into dead ends before having to retreat to your original shell and pick up again somewhere in that chain of hosts.

I remember keeping notes about cool stuff like MUDs or gateways to BBSes and other interesting things.

I ended up working at the same university many years later as an adult in more than one IT department on that campus and at one point I ended up talking to one of the old gurus and high wizards that had been there for years - particularly in telco-to-computer realms in infrastructure and networking. He was of that particularly gnomish sort that you only seem to find where stuff like Unix and computer networking meets the telco infrastructure on the physical layer - someone who was just as familiar with terminating an entire trunk line of twisted pair wiring and programming a PBX as they were with programming big network iron and servers.

So I asked him about that modem pool, and whether he knew about that really easy access to the shell, because I just had to ask, and if anyone knew the answer it was this guy.

And I specifically asked him about it because at the time I was on a facilities tour and we were standing in a vault full of frames of telephone and telco networking gear, right next to a large bank of modems still in use as a dial-up pool. And it was very likely the exact vault and facility I was dialing into as a kid albeit likely with upgraded modems, because the facility we were in was the central telco plant and vaults for the entire campus. Effectively all telephone and networking in and out of the campus went through this building and vaults.

He lit up and smiled when I asked about this dial-up pool and his answer was something like - and I am heavily paraphrasing and glossing it up - "Oh yeah, we knew. We did that on purpose to give people and kids like you some access to Unix and the internet. We figured if anyone was smart enough to try or know some basic terminal commands and recognize a command prompt we could let them play with a guest shell. We also used it ourselves as an access point because if you had an account and login credentials you could use that shell to login and access the campus network. It was also a really handy source of stress testing traffic. It worked, didn't it? Because, hey, here you are working for us now, right?"

I suspect there's a lot of stories like this out there about Unix and the early internet that specifically reflect the Unix philosophy. Confusing and inscrutable at first, with what is effectively unassuming silence, i.e. the no-news-is-good-news ideology and simplicity. A blank canvas.

Effectively "Ok, here's direct access to your entire machine and computer as pure bytes and not much else. What are you going to do with it?"
posted by loquacious at 12:40 PM on September 23, 2019 [30 favorites]


I agree with Antihero that a lot of other things would have to have happened, but, at the same time, the PC era, which did set computing back at least a decade, didn't have to happen and almost didn't. Before the x86 PC, other more promising architectures were popular: the 6502, 6800, 68000, etc. People could build and customize computers before the PC. Microsoft could have stuck with XENIX. If the PC, DOS and Windows had never happened, we might (except for IP issues!) have skipped that lost decade and gone straight to Unix. Or at least have had a protected kernel, process isolation and preemptive multitasking from the get-go. That almost happened! But thanks to the PC (to say nothing of the Mac), it took another decade or so.

You can also make an argument (tenuous, but still...) that the real platform of the PC era was the AT bus. There was a thriving market for third-party peripherals and the de-facto standardization of the PC-AT made that possible. Eventually, the AT bus ran out of steam and other system architectures followed it... and established peripheral vendors came along. But now bus standards, PCIe and USB, primarily, have mooted that point. Bus architectures are no longer platform-dependent (I'll gloss over driver issues :) ), so peripheral vendors are no longer so tied to a particular system architecture. That, I would argue, loosened the iron grip of the PC on the ecosystem.

The one thing that proprietary operating systems offered at the time--and still offer--is a stable platform for third party binary software. Unix never got the "if you build it they will come" idea quite right. They went with "if they can build it they will come." But there never were enough people willing and able to build and install software from source. Dependencies on Unix are a problem that makes DLL-hell seem like a walk in the park. Package managers have gone a long way to help, but third-party software vendors have to deal with a lot of distros. Writing desktop GUI apps for Unix is also a real pain for similar reasons.

These days, most software innovation is happening on the web so client diversity can be papered over by the browser. But if you need (or are developing) desktop software, Windows and macOS remain the go-to targets.

There's a new dynamic at play now. Full-stack developers need to be able to build and test software on their client machines. Since back ends are almost uniformly Unix now, that means being able to run Unix software on desktops. Mac OS X was ahead of the game here, having a Unix kernel under the covers. WSL is designed to fill that gap for Windows (I've not used it, so I'll reserve judgement). But one way or the other, Unix will eventually conquer the desktop as well.
posted by sjswitzer at 1:10 PM on September 23, 2019 [4 favorites]


DLLs, for instance, are a sound modularity principle...

Any more so than SOs...?

I'd say it's a matter of taste. In DLLs, dependencies are explicit (imports and exports), so they conform a bit better to my idea of what a "component" should be like. Both models are valid within their worldview. DLLs seemed pretty well suited for "plugin" type composition, since you want your points of contact to be circumscribed. For libraries, the SO model is better. I'm less sympathetic to the DLL worldview now since, for security reasons at least, extension and composition have moved to process and protocol boundaries. SOs are a fine way to do shared libraries (hence the name). DLLs are a good way to do loadable components, which you probably shouldn't be doing anymore anyway.
posted by sjswitzer at 1:25 PM on September 23, 2019 [1 favorite]


Windows NT was decent, but not really in a desktop way. Too incompatible with non-NT, and it arrived in the era of the crappy MS TCP/IP stack. It was still meant to only really work well in a Microsoft-only, non-internet environment. (Must admit there's a long story where I actually got my MCSE in the NT era.) For a brief moment in time, Commodore had a UNIX; it was either the Amiga 2000 or 3000. It was OK, but ancient in the sense that 'ls' wouldn't even do columns. Only the barest and no bells and whistles.

And Heh, yeah, lost decade. Things would have been much better if Amiga hadn't died. Just from the 68000 and OS angles. It was so close to UNIX of the days that I used to dial up into work before going to work and still play around on my computer back home. There was a SLIP stack and remote terminal and file transfer. I totally could not understand the love of the Mac Classic and PC and PS/2 and all the hoops they had to jump through. Sadly x86 and IBM and Microsoft and Apple won the popularity battle.
posted by zengargoyle at 1:33 PM on September 23, 2019


So in my previous life in academia, one of the things I did was help teach a course on Operating Systems and Programming for first-year Mechanical Engineering students (don't ask). I once told a student who was flailing around in vi to just exit, and they got up and left the room.

I HAVE A SMART PHONE NOW. THIS DOESN'T WORK ANYMORE.
posted by srboisvert at 2:26 PM on September 23, 2019 [3 favorites]


Remember when NT 3.51 was too stable, so they moved the video drivers to kernel space in 4.0?

Fun times.
posted by ryanrs at 2:28 PM on September 23, 2019 [2 favorites]


I also wandered into a free shell account as a middle and high schooler in the mid-90s, on a BBS in southeastern Michigan. The grownups on the system very clearly believed that having a shell account on a UNIX system was A Big Deal, but they had no patience at all for questions about what it was for or why. Still, I was a persistent kid, and an urban exploration nerd with a try-every-doorknob approach, so I spent ages poking around on my own, and...

…found absolutely nothing of interest that I could figure out how to do. So I gave up and spent more time writing music instead.

I work in tech now, love UNIX, program recreationally, and have a Ph.D. I could do those things partly because of my own skill and gumption, but mostly because I found experienced people who had patience for me. "Smart, curious people will learn to use it on their own" is bullshit — some will, lots won't, and mentorship matters.
posted by nebulawindphone at 3:35 PM on September 23, 2019 [12 favorites]


> It turned out to be very fortunate for me that a BBS acquaintance lent me a Slackware CD in 1993 when he heard me complain about how crappy Trumpet Winsock was on Windows for Workgroups 3.11,

Somewhere around '94 I downloaded the complete Linux install on floppy disks.

If I recall, it was 20-something diskettes and I would set it up to download one diskette image overnight while I was sleeping (yes, using crappy Trumpet Winsock).

So, it took quite a while to grab the whole thing. If I recall, it came from one of those software repositories in Finland, so I was downloading an OS not only over the Internet, but also from a completely different continent, which was a novelty.

Then I installed it on an old PC I had laying around the house.

I was pretty astonished when every install disk finally worked without any failures (though I might have had to re-download one or two, slowing down the project by a couple of days).

I don't know what in the world I had in mind to do with it, but still it was an adventure to have a whole new and FREE operating system actually running and working. And I learned a lot by becoming a "sysadmin".

I seem to recall spending a lot of time trying to get X Windows set up and connected over the wires to our Unix systems at the university. I believe I was successful in getting all that to work a few times, though--as with the whole setup--I'm pretty sure I never actually accomplished anything useful with it beyond, "Hey, it works!"

And as you say, I can still ls and grep, awk, gcc, build, etc my way into a lot of trouble, though I usually don't really know exactly what I'm doing except following a list of instructions for installing whatever on some web page or readme file. (Does ANYONE actually know what they're doing?)

7.5 hours and a lot of tweaks later, it often works, too.

I guess that's why we have things like apt nowadays, though it seems to lack about 99% of the drama (and fun?) of the old days.

I think that hard drive with that first Linux operating system is still in a box around here somewhere. My stack of install floppies probably are, too, come to think of it . . .
posted by flug at 4:57 PM on September 23, 2019 [3 favorites]



Kind of a fun thought experiment: what if the early timeshare systems had been widely available to home users?

No CP/M. No DOS. No Windows. No Macs. No BBSes. The Internet boom taking place in the early 80s as opposed to the mid-90s. No "cloud computing," since computing would have never left the "cloud" to begin with. Some time in the 80s or 90s, you'd essentially see the first Chromebooks, as graphics hardware would have become cheap enough to reach home users.


To me, this shows a tremendous ignorance of just how expensive non-home computing was back then, as well as the fact that, for most people in the 80s, the IBM PC was a business computer and not a home computer. The world proposed here would absolutely have been one that both I and most of my friends would have been excluded from. The real democratizing effect of 80s home computing came from the really cheap Z80/6502 machines (think Commodore 64 and ZX Spectrum in Europe/Australia, Apple II/TRS-80 in North America - I guess also the MSX-type machines in Japan, though I'm not as familiar with the 80s market there and only know it through retold history) and later from early-90s cheap hand-off PCs - I'm talking a 286/386 in 1994, the super trailing edge of _that_ set - rather than even the expensive IBM PC-type machines, let alone the expensive timesharing machines: lower-middle-class people, and later poor people, could get access to them, and learn computing on them.

(Context: my rural family saved up for a year to afford a $150 Commodore 64 in 1988, which was our only computer until 1993, when we were given a hand-off CGA XT destined for the trash. It was completely life-changing. There is no possible way my family could have found the money for even the phone calls to dial an expensive timesharing system, let alone the fees for accessing one.)

If "setting things back years" means people with less ongoing means get to hop on board as well, if they are so inclined, then I'm not convinced that I agree that's actually backwards progress.
posted by jaymzjulian at 5:11 PM on September 23, 2019 [10 favorites]


To me, this shows a tremendous ignorance of just how expensive non-home computing was back then

I am genuinely curious -- what exactly do you think I meant by "widely available?"

Because timeshare systems were available to people -- but only if you were in the military, academia, or if you worked for a company that owned one. They were even available to consumers via services like Compuserve, which were way too expensive for the average consumer. Thus, not widely available.

Widely available means widely available.

You seem to have misunderstood my point so badly, I can't even imagine what you must have thought I meant.
posted by panama joe at 5:19 PM on September 23, 2019 [2 favorites]


> Kind of a fun thought experiment: what if the early timeshare systems had been widely available to home users?

Hmm, you mean like Minitel?

It rolled out in 1980. It was wildly successful, by all accounts. Available only in France, unfortunately.
From its early days, users could make online purchases, make train reservations, check stock prices, search the telephone directory, have a mail box, and chat in a similar way to what is now made possible by the Internet.

In February 2009, France Télécom indicated the Minitel network still had 10 million monthly connections.
posted by flug at 5:34 PM on September 23, 2019 [7 favorites]


This idea that if we all somehow bought dumb terminals in the '80s/'90s and had mainframe access, things would've been better... seems divorced from what people actually did with computers. This apart from the presumed subscription model, which would definitely fly now, but back then a second phone line was a luxury.

Word processing? Maybe. Kinda. I guess you could (and did) run WordStar on a terminal. Printing? At your house?

Spreadsheets? This barely works in the cloud now.

Games? With graphics? And audio? On the modems we had back then? I played BBS games, probably the closest analog, and it was a whole different experience.

Would it have been any cheaper? You still need a monitor, keyboard, modem, and enough processor and graphics to render the stuff you want to do. You might as well just have the whole computer.

Maybe it set computing back, but a lot of people did a lot of computing in that decade.

I suspect that I am wholly misreading what the comment is; in that case, uh, never mind.
posted by Huffy Puffy at 5:50 PM on September 23, 2019 [3 favorites]


> Widely available means widely available.

By what means would somebody in 1982 connect to a timeshare from home? The cheapest dumb terminals of that era cost several times as much as a Commodore 64 and most of them were built into their own desks.

It was cheaper to buy a C-64, floppy drive and a modem. At that point you have your own computer plus optional terminal. Best of both worlds.
posted by ardgedee at 6:12 PM on September 23, 2019 [2 favorites]


So basically the opportunity was already available, and in a more accessible way than through dedicated terminals, for institutional timeshares to revolutionize computing. They didn't.
posted by ardgedee at 6:15 PM on September 23, 2019 [2 favorites]


My first experience with *nix was in college, though I didn't know it. Our scheduling system (SOLAR at YSU) ran on dumb terminals. Actually email did too. I dialed up from my dorm room to email people. I remember being in a computer lab talking to people in the same room using the dumb terminals. I thought there was a command like 'tell' but a quick google is showing me my memory is probably faulty.

I started using Linux out of pure curiosity with Fedora 7. I tried Ubuntu, but my mentor was a Fedora guy, so that's what I stuck with. I actually used it for quite a few years before moving home and getting my brother's Mac. When that died, I went back to Windows. A couple years ago, my brother (same one, who does Linux stuff as a career) got me a Raspberry Pi. I went back to that and only use my Windows laptop when forced to. In the past 3 months, I've had to reinstall Windows because an update borked it beyond belief. Now it refuses to update and I've spent about 3 hours troubleshooting it. I'm thinking about putting a different Linux distro on it, perhaps going back to Fedora. If not, then reinstalling Windows yet again.

I have my eye on the new Raspberry Pi and might ask for one for Christmas. The Pi I'm on is good for most things, but it's low in the RAM department as well as storage space.

I actually freaked out the first time I got into a terminal on Fedora. I was terrified I was going to break something. Little did I know I was playing in a terminal all those years before in college (94-97).
posted by kathrynm at 6:21 PM on September 23, 2019


Maybe it set computing back, but a lot of people did a lot of computing in that decade.

I suspect that I am wholly misreading what the comment is; in that case, uh, never mind.


You're probably not, and as the originator of the "lost decade" meme, I apologize. A lot of people got a lot of value out of that era. It would not have been the juggernaut it was had that not been the case.

Someone else pointed to the era of low-end consumer/hobbyist computers as liberating. I relate to that. I could never afford even those computers, though I could afford a subscription to BYTE magazine and write assembly language on notepads that I was never able to run. Nevertheless, the spirit of that age was inspiring. And, as also pointed out, the x86 PCs came in at a price point that was prohibitive to a lot of people who wanted to participate but could not afford to. It kinda wiped out evolution from above and below for a while.

What I was meaning to say is that had things gone even slightly differently (butterfly wings!), we (all of us) would have had better computers sooner. It's a counterfactual, so, yeah, it didn't happen (and there are sufficient reasons that it didn't).
posted by sjswitzer at 6:25 PM on September 23, 2019


Ugh. Well, first off, apologies to everyone in this thread -- I really did not mean to derail. There are interesting things to say about Unix that have nothing to do with my comment. Such are the perils of non-threaded discussion boards.

Anyways, my thought experiment was just that -- a thought experiment. If one were a novelist, I think one could write a fairly plausible alternate history where we skipped the PC age and went straight to widespread internet -- although obviously that isn't what happened in reality.

I guess the meat of my argument is this: in the 80s and 90s, there was a parallel evolution between, on the one hand, the internet and timeshare systems, and on the other, PCs and BBSes. I would make the case that PCs and BBSes arose to fill a need that the timeshare & internet world did not fill. Why weren't internet and timeshare systems properly harnessed for the consumer market prior to the mid-90s? Was it market forces, or merely a failure of imagination? Or possibly both? I think you could make the case that this bifurcation was ultimately wasteful, although nobody knew that at the time. You can't exactly fault the people of the past for not knowing the future.

All of that aside, I think I've probably angered some people in this thread by pooping on (or appearing to poop on) PCs and BBSes, which wasn't my intention. I ran a BBS back in the day -- several in fact! My first real exercise in programming was modifying WWIV source code. Lots of fond memories there. So, apologies for the aforementioned pooping and derailing. I think it can be fun to imagine the "what could have been," although clearly I could have done a much better job of phrasing my thoughts.

... and now back to your regularly scheduled discussion (i hope) ...
posted by panama joe at 6:59 PM on September 23, 2019 [1 favorite]


To me, in my early computing career, "Unix" and "Internet" were synonymous. I started out in the 80s active in the Commodore 64 BBS scene, with a little Compuserve and Q-Link (the forerunner of AOL). I got my first Internet exposure in the early 90s when I managed to get UUCP running on a Kaypro 10 CP/M machine. I downloaded Slackware diskettes in the mid-90s via my employer's modem pool. I went back and forth between Linux and Windows for years (with a few years in Mac-land), but Linux was always at a disadvantage because my personal computing was always on laptops, and not until recently have laptops and Linux been compatible enough for me.

I went all-in last week. Installed Mint 19 on my new Dell Inspiron, and set up Windows in a VirtualBox VM (because I can't attach to my employer's VPN on a Linux machine). I would have dual-booted, but keeping Dropbox in sync on a shared NTFS drive is a nightmare nowadays. Linux is actually the path of least resistance. As much as I love Windows, I don't feel secure with it anymore.

I have so enjoyed learning Linux, though I feel like a total beginner most of the time. I like Vim (and used it a lot on Windows). Having a terminal window open makes sense on Linux, as it never seems to on Windows. It got me interested in exploring the history of Unix, and emulating old machines that run old versions of Unix. This is a fine FPP!
posted by lhauser at 7:45 PM on September 23, 2019 [2 favorites]


The UNIX philosophy is the correct philosophy.

The UNIX philosophy was the correct philosophy in the 1970s when it arose. Nowadays it's showing its age a bit. The small-pieces-loosely-joined approach is good, with the pieces being easily composable by a user (on the command line). The idea of everything being a file descriptor, and a file being an untyped stream of bytes, not so much.

The problem is, there's no easy post-UNIX. There was Plan 9, which didn't get beyond the research phase, and beyond that, systems with new layers of infrastructure built on top of UNIX, like NeXTStep/OSX/macOS and BeOS.

One idea that may be relevant could be borrowed from another standard interface for composable services, HTTP. Let file descriptors carry headers which give a MIME type, encoding and other metadata for what they return, and some means of content negotiation. Then you can have command tools that stream not bytes of text but streams of records (in JSON or ASN.1 or whatever wire format you want), other tools that consume/process these, and a whole set of low-level tools for dealing with everything being text (like cut, for example) becoming all but obsolete.
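
To sketch what I mean with today's tools (jq does the record-handling here, assuming you have it installed; the JSON-emitting ps is improvised with awk, since nothing standard speaks records):

# improvise a record stream: one JSON object per process
# (naive quoting; a real record protocol would handle escaping)
ps -eo pid=,comm=,pcpu= \
  | awk '{printf "{\"pid\":%s,\"comm\":\"%s\",\"cpu\":%s}\n", $1, $2, $3}' \
  | jq -s 'sort_by(-.cpu) | .[:5]'   # consumers sort on named fields, not column positions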
posted by acb at 1:37 AM on September 24, 2019 [4 favorites]


But that could be, would be, built on top of a stream of bytes. Perhaps stream of bytes is too low level an abstraction for some purposes but it is a pretty good base level abstraction from which to build others. The rest is parsers.
posted by epo at 2:31 AM on September 24, 2019 [7 favorites]


It was cheaper to buy a C-64, floppy drive and a modem. At that point you have your own computer plus optional terminal. Best of both worlds.

I had this kit. I still have the paperwork from my DELPHI account, and I can't tell you how many Compuserve kits at the Radio Shack I worked at were missing the login credentials envelopes.
posted by mikelieman at 4:18 AM on September 24, 2019 [4 favorites]


But that could be, would be, built on top of a stream of bytes. Perhaps stream of bytes is too low level an abstraction for some purposes but it is a pretty good base level abstraction from which to build others. The rest is parsers.

Though if everybody is responsible for their own parsers, you get a thicket of edge cases that don't quite line up, and a huge amount of productivity lost to making the pieces fit.

You can still have byte streams if that's the best model that describes what you're doing. Though in most cases, there are existing patterns being applied, like lists of records (which would be shown as tables of columns; think ls, ps, and such). Having a standard format for those would reduce a lot of cruft, and would also allow better tooling. (Imagine a terminal emulator where, if you typed ls -l, you got a collapsible/scrollable table with resizable columns, for example.)

Everything-is-bytes is the libertarian anarchocapitalism of computer science, and works about as well.
posted by acb at 5:00 AM on September 24, 2019 [2 favorites]


Ironically, the most interesting improvement on "everything is a byte stream" came from the Windows world, specifically PowerShell, which basically asked the question "What if the stuff we sent over the pipes was structured?" and came up with an answer that looks almost exactly like what acb is describing. It's the one example in the world of OOP successfully making things simpler instead of more complex.
posted by tobascodagama at 5:43 AM on September 24, 2019 [3 favorites]


I haven't looked at PowerShell, but if the output involves embedding some kind of OLE-style live view in your terminal window, that sounds like overengineering and too un-UNIXy. A happy medium would be just having a header saying "this is going to be a stream of (CSV lines/JSON dictionaries/AMQP records/&c.)", and the standard system libraries having some kind of standardised, well-tested stream-iterator-like facility for traversing those in the language of the implementor's choice.

This still keeps the philosophy of UNIX (data being passed along free to handle, rather than something more baroque), but just updates it to modern learnings (such as that one should draw a separation between lossless-encodings-of-data (the model) and human-friendly-presentations-of-data (the view), in precisely the way that UNIX command-line tools don't).
posted by acb at 6:10 AM on September 24, 2019


Remember when NT 3.51 was too stable, so they moved the video drivers to kernel space in 4.0?

The alternative would be two context switches and two TLB flushes every time you made a GDI call. They made the right choice.
posted by Your Childhood Pet Rock at 6:36 AM on September 24, 2019


Ironically, the most interesting improvement on "everything is a byte stream" came from the Windows world, specifically PowerShell...

I dunno. When PowerShell was a new thing, I was super-excited... a real command interpreter for Windows! Alas, it proved not to be what I was hoping for, namely a native replacement for bash under cygwin.

You can do a lot of cool stuff with PowerShell, but it's not an environment where you can easily just type stuff out of your head and make the computer do what you want. That higher-than-a-byte-stream abstraction comes at the cost of having to know the interfaces of all of the COM and .NET objects you're instantiating, and they're not regular enough to be able to do anything useful without poring over a manual.
posted by Aardvark Cheeselog at 6:46 AM on September 24, 2019 [3 favorites]


For PowerShell, the items being passed in the pipeline are .NET objects, complete with methods and properties. There are PowerShell-native types like PSObject and PSCustomObject that allow for the dynamic addition of properties and values, but any old .NET object will do. The objects can have default columns to be displayed if the object is to be displayed on screen, and the PowerShell host in question makes a judgement call on how to best display the data.

Get-Process |
Where-Object { $_.ProcessName -eq 'chrome' } |
Sort-Object CPU -Descending |
Select-Object Id, CPU


Literally: return a list of system processes, filter down to only the chrome ones, sort them descending based on the CPU property, and then display process Id and CPU.
posted by mmascolino at 6:49 AM on September 24, 2019 [2 favorites]


top -o cpu -stats pid,command,cpu -pid `pgrep "chrome"`
posted by bradf at 11:05 AM on September 24, 2019 [2 favorites]


In the OS I'm developing this is actually a single command:
fc
("find chrome"). I feel like this maximally represents the Unix philosophy.
posted by Pyry at 10:48 PM on September 24, 2019 [3 favorites]




This thread has been archived and is closed to new comments