The Secret Origin of Windows
March 10, 2010 6:26 PM

The Secret Origin of Windows, recollections of the development and release of Windows 1.0 and 2.0 by its project manager Tandy Trower (via)
posted by Blazecock Pileon (75 comments total) 24 users marked this as a favorite
 
Is this the one about how Windows was alive during the civil war and it got some sort of unbreakable metal implanted on its bones like one hundred years later? Because that story sucked.
posted by Joey Michaels at 6:35 PM on March 10, 2010 [7 favorites]


No, Bill stole it from a Harvard classmate; an Olympic rower who grew up in Connecticut and later went to get his MBA at Oxford. That's how it always starts.
posted by leotrotsky at 6:40 PM on March 10, 2010 [7 favorites]


Fascinating read. Thanks, BP!
posted by Pope Guilty at 6:46 PM on March 10, 2010


Bill Gates and IBM were standing on a tree branch and both were about to jump, but Bill shook the branch and IBM fell. Later, when they were arguing about it, IBM fell again and broke for good.
posted by Cyrano at 6:55 PM on March 10, 2010 [6 favorites]


Windows 1.0 had a hell of an ad campaign to make up for its delayed release.
posted by invitapriore at 6:55 PM on March 10, 2010 [5 favorites]


I worked peripherally with Trandy Trower when I was on the Access 1.0 team at Microsoft. At the time he was known as the "Dialog Police". One of his jobs was to standardize the interface across the Microsoft Office line.

Trower mentions the unsung hero of Windows: David Weise. Weise didn't just figure out "a clever trick to use extended memory on PCs." Weise, and another guy whose name I don't remember, were the ones who changed Windows to use the protected memory of the 386 as a skunkworks project. Protected memory is what keeps applications from accidentally writing into each other's memory. This feature turned Windows into a real operating system, which eventually led to the end of OS/2.
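
To make that concrete: a toy C fragment like the one below (mine, nothing to do with Weise's actual code) dies with a clean fault on any protected-memory OS, confined to the one guilty process, where real-mode Windows would have let it silently scribble over whatever program happened to own that address.

    #include <stdio.h>

    int main(void) {
        /* 0xB8000 was the real-mode text video buffer; under DOS-era
           Windows any program could scribble there, or anywhere else.
           With protected memory this address isn't mapped into the
           process, so the MMU raises a fault that kills us alone. */
        volatile char *wild = (volatile char *)0xB8000;
        puts("writing through a wild pointer...");
        *wild = 'X'; /* segfault / access violation here */
        puts("never reached");
        return 0;
    }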

The story I heard is that they took this feature to Steve Ballmer, who then said that they needed to take it to Bill. Bill told them to run with it. This was a direct threat to the OS/2 that IBM and Microsoft were co-developing. Ballmer asked Bill, "What are we going to tell IBM?" Bill said, "No, Steve, the question is what are YOU going to tell IBM?"
posted by Xoc at 7:02 PM on March 10, 2010 [21 favorites]


One of my program managers, Gabe Newell–now the head of the successful game company Valve–would pound on [prerelease Windows 1.0] until late into the night and morning hours and then sleep in his office.
posted by BeerFilter at 7:03 PM on March 10, 2010 [2 favorites]


Protected memory is what keeps applications from accidentally writing into each other's memory. This feature turned Windows into a real operating system...

*cough*
posted by DU at 7:28 PM on March 10, 2010 [4 favorites]


Blue Screen of Death Real Operating System.
posted by ZenMasterThis at 7:31 PM on March 10, 2010 [1 favorite]


It could have ended there.

I watched, as the gaunt form of Trower teetered on the precipice, the moment a microcosm of human fate.

"Destroy it! Destroy it now before it consumes the world!"

But Trower was already lost in the OS, deaf to my pleas.

"For all that is good and free in this world, end it, that it may never trouble the eyes of men again!" My voice cracked with desperation, for I knew even then that I was too late.

"No." Trower said, as he turned from the fires that would have consumed the OS, the same fires that would henceforth burn in his sunken eyes until it had consumed him wholly.

"No, I will ship this OS. We will use it, we can harness it for the good of all."

I have ever since cursed the weakness of man, the weakness that let loose Windows onto this darkened, once fertile land.
posted by Salvor Hardin at 7:34 PM on March 10, 2010 [10 favorites]


I remember getting Windows 1.01 running on my first PC-compatible computer (a Tandy 1400LT) sometime around 1998.
It ran in glorious monochrome 640x200 CGA graphics and I had to use it without a mouse. Windows v1 didn't even support overlapping windows- that was introduced in Windows 2, which I quickly upgraded to.

You know what's awesome? Most of the apps bundled with Windows 1.01 still run on my copy of Windows 7. Just for fun, I fired up MS-DOS Executive. It's weird seeing that 1985 copyright date on a Windows application.
posted by dunkadunc at 7:37 PM on March 10, 2010


1998, huh?
posted by nevercalm at 7:44 PM on March 10, 2010 [2 favorites]


Yeah, most evil things originate in secret, don't they?
posted by DecemberBoy at 8:01 PM on March 10, 2010


Even 1988 was kinda late for Windows 1.01.
posted by iconjack at 8:04 PM on March 10, 2010


However, holding down the command key while you drag a window allows you to move the window without disturbing its relative position in the pile. This feature is an example of an “advanced” desktop management skill that you soon learn after a few work sessions with the Mac.

holy fuck i just learned something from AN ARTICLE FROM 1984

linked in the windows article
posted by nathancaswell at 8:09 PM on March 10, 2010 [26 favorites]


For context, this is the same article that contains things like:

A more efficient way to open an icon is to double-click the mouse button (quickly press and release it twice)

My mind is fucking blown.
posted by nathancaswell at 8:12 PM on March 10, 2010 [3 favorites]


You know what's awesome? Most of the apps bundled with Windows 1.01 still run on my copy of Windows 7. Just for fun, I fired up MS-DOS Executive. It's weird seeing that 1985 copyright date on a Windows application.
I steadfastly believe that this level of backwards compatibility is one of the biggest things that keep MS from releasing a really nice, modern OS. Unfortunately, if they did away with backwards compatibility and pulled even a remotely Apple-ish "apps must be written for Windows Vista or later" proclamation, their corporate customers would have their heads on pikes.
posted by c0nsumer at 8:23 PM on March 10, 2010 [1 favorite]


Yeah, but what are they going to do? Switch to Ubuntu? Not likely.
posted by Pope Guilty at 8:36 PM on March 10, 2010


c0nsumer: Do you think that the Windows XP Mode virtual machine will allow them to gradually break backward compatibility?

MS will still keep their lock-in with Word, Excel and Outlook being de facto standards in most offices and with apps written for Win 2K and above running on their OS.
posted by sien at 8:44 PM on March 10, 2010


I steadfastly believe that this level of backwards compatibility is one of the biggest things that keep MS from releasing a really nice, modern OS.

You are completely correct, of course, but at the same time, backwards binary compatibility can be a brilliant and noble aim. The way things tend to break on a typical Linux system if you try to add the new and latest software yourself is incredibly frustrating. Recompile the kernel just to get some new hardware working? Pur-lease. But, I guess, that's negated somewhat by the ease and lack-of-money required to just give up and install the latest version of Ubuntu over the top of what you already have.
posted by Jimbob at 8:48 PM on March 10, 2010


Windows 1.0 had a hell of an ad campaign to make up for its delayed release.

Holy.

Jesus.

Fuck.
posted by clarknova at 8:57 PM on March 10, 2010 [3 favorites]


I'm suspicious of the protected memory anecdote, at least with regard to early versions of Windows. Windows 1.0 ran on ordinary 8086-class x86 at its release in 1985; no version required a 286 until 1988, and 386 support came in 1989.

There were several forms of extended memory available even for 8088 and 8086 which used bank-switching techniques. These were supported by Windows and the other early multi-tasking systems (QEMM, TopView). Protected mode came with the 80286, but never caught on because it wasn't compatible with memory mapping techniques, though OS/2 was designed to use this mode. Intel pushed it heavily, and it was no secret.

The protected memory I think Xoc is talking about came about with the 80386 and its virtual 8086 mode. Windows/386 supported that, but it wasn't required for Windows until much later. Even Windows 3.0 supported older 8088/8086 processors.

And yes, I am old.
posted by CheeseDigestsAll at 9:02 PM on March 10, 2010 [1 favorite]


c0nsumer: I steadfastly believe that this level of backwards compatibility is one of the biggest things that keep MS from releasing a really nice, modern OS

Oh, totally agreed. And it's most definitely their corporate customers that are keeping them from doing that. In my opinion, anyone who insists on making employees stick with IE6 should be the one with their head on a pike.

Linux, on the other hand, seems to be the opposite of this- there are constant updates that are changing the UI and application compatibility. While this means that everything is super up-to-date (for whatever that's worth), it's also a major weakness: You can't take a binary package and expect it to install and run on your computer, because there isn't that much backwards (or cross-distro) compatibility- leaving you compiling from source at best. This is incredibly user-unfriendly and will continue to chase off users.

What's important, I think, is a balance between Microsoft's refusal to throw out crud and Linux's cacophony of compatibility-breaking updates and variant distros.
posted by dunkadunc at 9:14 PM on March 10, 2010 [1 favorite]


Windows 1.0 had a hell of an ad campaign to make up for its delayed release.

I'm using windows windows windows 386 wuh wuh windows..
posted by ROU_Xenophobe at 9:16 PM on March 10, 2010


holy fuck i just learned something from AN ARTICLE FROM 1984

Holy crap it still works ... I BEEN MAC ROL'D
posted by RobotVoodooPower at 9:25 PM on March 10, 2010


In 1988, Apple decided to sue Microsoft over Windows 2.0’s “look and feel”, claiming it infringed on Apple’s visual copyrights. Having been a principal manager in charge during development of Windows 2.0, I was now caught up in the maelstrom and over the next year I got a thorough education on the US legal process as I briefed the Microsoft legal team, created exhibits for them, and was grilled by deposition by the other side. To me the allegation clearly had no merit as I had never intended to copy the Macintosh interface, was never given any directive to do that, and never directed my team to do that.
The similarities between the products were largely due to the fact that both Windows and Macintosh had common ancestors, that being many of the earlier windowing systems such as the Alto and Star that were created at Xerox PARC. History shows that Jobs in fact visited PARC and hired people from there to join Apple. But Apple’s first graphical-interface computer, the Lisa, failed, and there was a time even in the first year of its launch that it was unclear whether the Macintosh would make it. From my perspective, Microsoft’s support of the Macintosh helped it survive through its most critical time, and it continues to be a platform the company supports. To me, the allegation was almost insulting. If I wanted to copy the Macintosh, I could have done a much better job.

posted by KokuRyu at 9:44 PM on March 10, 2010 [3 favorites]


This made me think that perhaps the offer to me was a ploy by Gates and Ballmer to fire me because of their disappointment in dealing with Turbo Pascal and my suggestion that perhaps my assignment to managing programming languages was a poor choice on their part. It seemed clever: give me a task that no one else had succeeded with, let me fail as well, and they would have not only a scapegoat, but easy grounds to terminate me. So, I confronted Gates and Ballmer about my theory. After their somewhat raucous laughter they regained their composure and assured me that the offer was sincere and that they had confidence in my potential success.
Heh.
Windows 1.0 had a hell of an ad campaign to make up for its delayed release.
Wow, was that for real? It actually seems kind of contemporary, though I realize 1984 wasn't that long ago.
posted by delmoi at 9:46 PM on March 10, 2010


I’ve read elsewhere—sorry, no idea what the source was—that the Ballmer ad was a joke to encourage everybody to power through and finish the product.
posted by stilist at 10:00 PM on March 10, 2010


I steadfastly believe that this level of backwards compatibility is one of the biggest things that keep MS from releasing a really nice, modern OS.
What are you talking about? How are Vista and Windows 7 not "Modern"?

Microsoft did break backwards compatibility somewhat in new OSes in order to enhance security. As far as backwards compatibility goes, you can run programs from any OS or system in an emulator or virtual machine. So it's possible, these days, to keep backwards compatibility without any real sacrifice.

But what are you even talking about? It seems like OSes are pretty much a solved problem, frankly. I can't even think of any real innovation in PC operating systems for quite a while (as opposed to UI innovations, like multi-touch).

I don't even know what "Modern" is supposed to mean here. I can imagine "Cloud" OSes of the future that actually run on clusters of machines, or something like that. But more likely, they'll all be running Linux kernels on top of Xen hypervisors and the "Cloud" stuff will be a layer on top of that.
posted by delmoi at 10:09 PM on March 10, 2010 [2 favorites]


What are you talking about? How are Vista and Windows 7 not "Modern"?

I've thought about this - I started off agreeing with the statement about Windows needing to get "modern", but then I realized that what I was really wishing for was for Windows to be based on Unix like, well, everything else we have. That's got nothing to do with modernity, though, it's just a preference.
posted by Jimbob at 10:27 PM on March 10, 2010 [1 favorite]


It's ironic, for example, that my desire for Windows to throw out the backwards compatibility and get "modern" was driven by the desire for "head" and "tail" commands at the Command Prompt.
posted by Jimbob at 10:28 PM on March 10, 2010


Wow, was that for real? It actually seems kind of contemporary, I realize 1984 wasn't that long ago.

The thing I notice about it is that Ballmer's insanity is quite clearly of a younger vintage. It's still bright and fruity, lacking the somewhat darker notes that are evident today, like chair throwing. It's always had a hell of a nose, though.
posted by invitapriore at 10:41 PM on March 10, 2010 [3 favorites]


"There were several forms of extended memory available even for 8088 and 8086 which used bank-switching techniques. These were supported by Windows and the other early multi-tasking systems (QEMM, TopView)"

You're thinking not of extended memory, but expanded memory. EMS was memory switched into a 64K window near the top of the real mode address space. This was done in hardware on the (usually ISA) memory cards that implemented it. XMS existed only on the 286 and above, with the larger address spaces, and was directly addressed by the memory bus.

Also, QEMM was not a multi-tasking system, it was a software implementation of EMS. It took normally unused XMS memory and made it available to DOS programs as EMS, by running as a 386 protected mode memory manager. You're probably thinking of Quarterdeck's other blockbuster product, DESQview; however, that ran in V86 mode on a 386 or better. TopView was IBM's predecessor that ran (sometimes) in real mode, but as far as I know it never made use of either EMS or XMS. It just shoved as many programs as would fit into 640K.
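
The bookkeeping behind that kind of bank switching fits in a toy C model (all names mine, nothing from the actual LIM EMS spec): DOS saw a fixed 64K page frame, and 16K pages of a much larger store were banked in and out of it. Real EMS boards remapped address lines in hardware rather than copying, but the effect on the program was the same.

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    #define PAGE_SIZE   (16 * 1024)   /* EMS pages were 16K           */
    #define FRAME_PAGES 4             /* 4 x 16K = the 64K page frame */
    #define EMS_PAGES   64            /* 1 MiB of expanded memory     */

    static uint8_t backing[EMS_PAGES][PAGE_SIZE];  /* the expanded memory */
    static uint8_t frame[FRAME_PAGES * PAGE_SIZE]; /* what DOS could see  */
    static int mapped[FRAME_PAGES] = {-1, -1, -1, -1};

    /* Write the old page back to the store, then pull the new one in. */
    static void map_page(int slot, int ems_page) {
        if (mapped[slot] >= 0)
            memcpy(backing[mapped[slot]], &frame[slot * PAGE_SIZE], PAGE_SIZE);
        memcpy(&frame[slot * PAGE_SIZE], backing[ems_page], PAGE_SIZE);
        mapped[slot] = ems_page;
    }

    int main(void) {
        map_page(0, 42);   /* bank EMS page 42 into frame slot 0 */
        frame[0] = 0x55;   /* a program writes through the frame */
        map_page(0, 7);    /* switch banks...                    */
        map_page(0, 42);   /* ...and back                        */
        printf("page 42, byte 0: 0x%02X\n", frame[0]); /* still 0x55 */
        return 0;
    }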

I used to have original installation media for Windows 1.01 and 2. And Windows/286. W/286 actually had its own EMS emulation for DOS programs running under it.

Yeah, memory management issues like this utterly dominated the face of early microcomputing, especially so on the PC platform which had such an awful proliferation of different ways to divvy up memory for program use.

Microsoft actually got things like memory management and multitasking right in one of their earliest OS products, in 1980: Xenix. Unfortunately -- in part because of how screwed up UNIX licensing was in those days -- they made it really hard to buy, so despite being pretty decent it never really had much market penetration beyond running a veritable crapload of point-of-sale systems in the world.
posted by majick at 10:49 PM on March 10, 2010 [4 favorites]


" I can't even think of any real innovation in PC operating systems for quite a while (as opposed to UI innovations, like multi-touch)"

BeOS' fast, pervasively multithreaded microkernel architecture with major chunks of code getting pushed into userspace comes to mind, but now that I mention it I realize BeOS is actually ancient. BeOS was the first and last time that stuff escaped from the lab, and it was arguably far, far ahead of its time. In its day, CPUs were expensive and rarely seen in N>1 outside the high end. In the modern age where we're trying for widespread parallel systems in order to work around the Shannon limit of silicon (or whatever we're saying is behind clock constraints now), it's actually a good idea.
posted by majick at 10:57 PM on March 10, 2010 [1 favorite]


"holy fuck i just learned something from AN ARTICLE FROM 1984"

Blimey, me too. I can't say it's something I'll be using often, but it's fascinating that it still works.
posted by malevolent at 11:26 PM on March 10, 2010


I've thought about this - I started off agreeing with the statement about Windows needing to get "modern", but then I realized that what I was really wishing for was for Windows to be based on Unix like, well, everything else we have. That's got nothing to do with modernity, though, it's just a preference.
...
It's ironic, for example, that my desire for Windows to throw out the backwards compatibility and get "modern" was driven by the desire for "head" and "tail" commands at the Command Prompt.
Yeah, and the thing is, pretty much any system-level Unix feature you can name, Windows has. And Cygwin takes like five minutes to install. I have a little one-liner shell command to count lines of code, dump it to a file and then print out the end of said file (using tail even). That's all I use it for, though.
BeOS' fast, pervasively multithreaded microkernel architecture with major chunks of code getting pushed into userspace comes to mind, but now that I mention it I realize BeOS is actually ancient
Yeah, BeOS came out in '95. I don't know how quick Windows task switching is, but it's kind of irrelevant on 8 core machines. And remember, BeOS was actually pretty backwards looking. It was a single-user OS, unlike OSX, Linux and Windows 2000+. Even if only one person is using a machine at once, having a multi-user system makes it much more secure. Applications and services can run as their own user, with limited access to the system. It's a vast improvement in safety and was a huge step forward. It's no surprise that Apple chose NextSTEP as a basis for OSX over Be.
posted by delmoi at 11:27 PM on March 10, 2010


I live about five blocks from Tandy, FWIW (from his Redmond home, not his Hood Canal home).
posted by bz at 11:35 PM on March 10, 2010


I am no fan of BeOS at all, nor do I really disagree with you, delmoi, but I think it's interesting that the multiuser paradigm is really sweating as a security model for single-user machines.

If the threat is other users on a multiuser machine then sure user-based access control is great. But if the machine is single-user, as most machines are, then all that user shit boils down to a glorified single bit of access control which doesn't really go that far in the internet era.
posted by Wood at 12:04 AM on March 11, 2010


This is really interesting, BP. Thanks.

The more I read about the early history of MS, the more I'm amazed at Bill Gates. He's really pulled off one of the biggest PR coups in modern history: playing the part of the nerd while actually being more of a business-minded twat than anybody in a suit and tie. Seriously - guy didn't even write DOS, did he? And yet, because he's skinny and gangly and long-necked, and because he likes wearing sweaters, the media seized on him as some kind of uber-nerd. Sometimes I wonder if he ever even wrote a single line of code for Microsoft.
posted by koeselitz at 12:24 AM on March 11, 2010 [1 favorite]


I, of course, meant Tandy, not Trandy.

That version of Windows in my story first became Windows/386, but soon after became Windows 3.0. That was the first really successful version of Windows that finally put DOS programs out of business.

Bill actually wrote the version of BASIC that Micro-Soft (later renamed Microsoft) first shipped himself. Wrote it in assembly language on yellow legal pads, then punched it into the Altair 8800 (the first microcomputer). It fit in 4K of memory with room to spare for a program and worked pretty much on the first try. Don't disparage his technical abilities...they are better than most.

Bill tended to review the current development on each product at Microsoft about every 3 months, and he asked hard technical questions, and you'd better have good explanations for why things were implemented the way they were. His breadth and depth of technical knowledge are rather amazing.
posted by Xoc at 12:56 AM on March 11, 2010 [2 favorites]


Wood: “I am no fan of BeOS at all, nor do I really disagree with you, delmoi, but I think it's interesting that the multiuser paradigm is really sweating as a security model for single-user machines. ¶ If the threat is other users on a multiuser machine then sure user-based access control is great. But if the machine is single-user, as most machines are, then all that user shit boils down to a glorified single bit of access control which doesn't really go that far in the internet era.”

Er – well, you may feel as though "the multiuser paradigm is really sweating," but every major OS maker seems to disagree with you. And I have to be honest, the biggest advance I saw in Vista (which was sort of okay as far as driver updates go, but seemed to my mind mostly a pointlessly flashy show of graphics) was its attempt at enforcing a Linux-style user/superuser model, User Account Control. Yes, it's implemented in a hamfisted way, but it's really an essential part of modern security.

I'm not really in a mood to give you the "always use sudo" lecture, but you should know that it's absolutely, completely necessary for a modern computer system to separate users in this way. Even in what you call a "single-user" system, this is important; on a computer with any sort of power and flexibility, the most expert of users can accidentally click the wrong thing or run the wrong command and do something disastrous. So it's imperative that command structure be divided, at the very least, into "non-destructive" and "destructive" commands. The most rational way to do this is to divide system processes by users, and assign at least two users, one of whom is an administrator and another of whom is a normal user.
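
In Unix terms that whole division fits in a few lines of C. A toy sketch, obviously, not a real safeguard, but it's the entire shape of the idea: the destructive operation checks who is asking before it proceeds.

    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        /* geteuid() == 0 means we are running as the superuser. */
        if (geteuid() != 0) {
            fprintf(stderr, "refusing: this command is destructive; "
                            "run it as the administrator (root)\n");
            return 1;
        }
        puts("running as root -- the dangerous operation would go here");
        return 0;
    }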

So it can seem odd and sort of artificial that computers have retained the old server model of "multiple users," even 25 years after they were supposed to start belonging to one person at a time – but believe me, it's for a reason. And this is even more true now than it ever was before; there might have been a time when it was feasible to take every important function on a computer and label it "IMPORTANT!", adjuring all novices never to touch it - but now systems are so complex that things can't be easily labeled like that, and there are forms of malware that merely prompt the user to click a button and then take over from there. I know that MS was sort of shooting themselves in the foot with UAC simply because it asks for permission so goddamned much, but I know why they did it - because modern systems need it.
posted by koeselitz at 12:58 AM on March 11, 2010 [3 favorites]


Xoc: “Bill actually wrote the version of BASIC that Micro-Soft (later renamed Microsoft) first shipped himself. Wrote it in assembly language on yellow legal pads, then punched it into the Altair 8800 (the first microcomputer). It fit in 4K of memory with room to spare for a program and worked pretty much on the first try. Don't disparage his technical abilities...they are better than most. ¶ Bill tended to review the current development on each product at Microsoft about every 3 months, and he asked hard technical questions, and you'd better have good explanations for why things were implemented the way they were. His breadth and depth of technical knowledge are rather amazing.”

Yeah; I imagined that was true on some level. You just never hear about Bill doing anything technical at MS, but I'm sure he had a strong hand early on.

To be honest, Bill Gates' achievement is even more impressive given that he was technically-minded; he's always displayed what I think is probably the least technologically sentimental attitude of any significant figure in the history of computing. From the start, Microsoft was meant to be a successful company, not a company that made "pretty software" or "beautiful programs." And Bill has been unswerving in his dedication to that ideal. It's really the pinnacle of success, and its finest encapsulation is that infamous motto, "embrace/extend/extinguish." Lesser designers, people more concerned with making something beautiful or something great or something that fit their idea of how software should be – Steve Jobs is of course chief amongst them, but there are many others – have always failed. But Bill Gates succeeds because those things don't matter to him as much as keeping the company on top. That's admirable, in its way.
posted by koeselitz at 1:18 AM on March 11, 2010 [2 favorites]


Thing is, koeselitz, if he hadn't been handed a quasi-monopoly right at the start, that determined focus on success combined with indifference to quality would have got him precisely nowhere, and we'd have had a market where people who cared about good software that was a pleasure to use actually succeeded - the way the market system is supposed to work, as I understand it.
posted by Phanx at 1:59 AM on March 11, 2010 [1 favorite]


Jimbob: Windows without Cygwin is like a toolbox without a screwdriver. Well, like that toolbox without the #2 Phillips, when you need to replace the batteries in your kid's Leapster.
posted by Ella Fynoe at 5:11 AM on March 11, 2010


Lesser designers, people more concerned with making something beautiful or something great or something that fit their idea of how software should be – Steve Jobs is of course chief amongst them, but there are many others – have always failed.

Uh, what? I think you (or Bill Gates) have a messed-up definition of "success." I'd say massive profits = business success, and if you don't think Steve Jobs is sitting on massive profits, you just aren't paying attention.
posted by grubi at 5:43 AM on March 11, 2010 [1 favorite]


In a way, UAC is just a means of making novice users use a user/superuser system; in XP, you could simply create an unprivileged account and use it most of the time, then use "run as Administrator" (sudo, basically) when you needed to install stuff or change settings. I ran my main system like that from 2002-2009.
posted by Monday, stony Monday at 5:52 AM on March 11, 2010


And remember, BeOS was actually pretty backwards looking. It was a single-user OS, unlike OSX, Linux and Windows 2000+. Even if only one person is using a machine at once, having a multi-user system makes it much more secure. Applications and services can run as their own user, with limited access to the system. It's a vast improvement in safety and was a huge step forward. It's no surprise that Apple chose NextSTEP as a basis for OSX over Be.

Well, they chose NextSTEP after JLG countered their offering price for BeOS with a much higher one. Who knows what would have happened if he had taken the money and run instead.

In any case, lack of multi-user didn't really enter into the equation. Multi-user had been figured out and implemented, but for non-technical reasons was never put into the public releases.
posted by mikepop at 6:38 AM on March 11, 2010


"It's no surprise that Apple chose NextSTEP as a basis for OSX over Be."

A credible argument could be made that the company most people refer to as Apple today is Next, Inc. with a new name slathered on it. In the long run, it was the right choice anyway -- it was easier and smarter to graft GCD onto NextStep and then pull the system apart into threads than it would have been to bolt all the missing bits of UNIX onto BeOS.

I won't, however, back down from the assertion that parallelizing the holy hell out of the OS and applications and then giving the computer a bunch of cores to run all those threads is exactly what we wound up trying to do. And BeOS got there too early for their own good, and as mentioned, with a security model that did nothing to exceed its market contemporaries.

"I don't know how quick windows task switching is, but its kind of irrelevant on 8 core machines."

Context switches are still pretty horrible on x86, which is exactly why we want 8 core machines running an OS with really granular threads (something Be-ish) rather than one big honking kernel process and some big honking application processes (everything else, until somewhat recently).
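
A userland-only illustration, and nothing Be-specific about it: carve one job into small chunks and the scheduler can spread them across however many cores you have, instead of one big honking process grinding through it alone.

    /* build with: cc -pthread sum.c */
    #include <pthread.h>
    #include <stdio.h>

    #define NTHREADS 8

    static long long partial[NTHREADS]; /* one slot per worker, no locking needed */

    static void *worker(void *arg) {
        long id = (long)arg;
        long long sum = 0;
        /* each thread takes every NTHREADS-th element: small, even chunks */
        for (long i = id; i < 80000000L; i += NTHREADS)
            sum += i;
        partial[id] = sum;
        return NULL;
    }

    int main(void) {
        pthread_t t[NTHREADS];
        long long total = 0;
        for (long i = 0; i < NTHREADS; i++)
            pthread_create(&t[i], NULL, worker, (void *)i);
        for (long i = 0; i < NTHREADS; i++) {
            pthread_join(t[i], NULL);
            total += partial[i];
        }
        printf("sum = %lld\n", total); /* sum of 0..79999999 */
        return 0;
    }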

Anyway, yeah, historical perspective on the golden era of personal computing -- say, the period from 1976 through about 1996, before the Windows monopoly totally stamped out everything -- is something I always get a kick out of. The hardware we have today is better, the software we have today is better, but I'll be damned if there weren't tons of good and interesting ideas. Of course, the cleverest ideas were usually ways of working around hardware limitations that simply don't exist today, so cleverness is now rarely a virtue in personal computing products.
posted by majick at 6:54 AM on March 11, 2010




A credible argument could be made that the company most people refer to as Apple today is Next, Inc. with a new name slathered on it.

Seriously. Anyone who thinks Apple took over NeXT clearly has no idea what they're talking about. A simple look at the org charts pre-acquisition and a few months after proves that.
posted by John Kenneth Fisher at 9:30 AM on March 11, 2010 [1 favorite]


xoc: Bill actually wrote the version of BASIC that Micro-Soft (later renamed Microsoft) first shipped himself. Wrote it in assembly language on yellow legal pads, then punched it into the Altair 8800 (the first microcomputer).

Actually he co-wrote it with Paul Allen. TBH I can't be bothered to research the details right now :) but the story is widely known and on the webs. Wikipedia mentions it, come to think.

AFAICT Allen was always considered the better programmer. Apparently the last operating system (per se) that Gates had a direct hand in was the BASIC used in the Tandy M100/M102. (I have a 102 and I love it!)
posted by blue funk at 9:59 AM on March 11, 2010


The golden era of personal computing actually ended in 1994 when Commodore went bankrupt.
posted by rfs at 9:59 AM on March 11, 2010


Seriously. Anyone who thinks Apple took over NeXT clearly has no idea what they're talking about. A simple look at the org charts pre-acquisition and within a few months later proves that.

I believe the running joke at the time was "NeXT acquires Apple for negative (whatever) million dollars."

I heard the same joke made when Disney acquired Pixar.
posted by mikepop at 10:03 AM on March 11, 2010 [1 favorite]


You know what's awesome? Most of the apps bundled with Windows 1.01 still run on my copy of Windows 7.

This seems strange to me, because I remember trying to run a Windows 2.0 program on a lark under XP and was completely unable to do so, getting unsupported version errors. And I've heard that 16-bit Windows programs won't run under 64-bit Windows, which is how Windows 7 is primarily being offered on new machines.

From the start, Microsoft was meant to be a successful company, not a company that made "pretty software" or "beautiful programs." And Bill has been unswerving in his dedication to that ideal. It's really the pinnacle of success, and its finest encapsulation is that infamous motto, "embrace/extend/extinguish." Lesser designers [...]

That is not the attitude of a software designer. It is the attitude of a businessman, which is what Gates has primarily been at Microsoft for many years now. And it is an attitude that may have been good for Microsoft, but has been absolute hell for many other people, like Netscape. The U.S. Justice Department looked at that with some measure of concern, I seem to remember.

These days it is really beginning to seem like the end of Microsoft's monopoly, while still a ways off, may be in sight, but I'm not ready to start romanticizing him yet.

That's admirable, in its way.

One can admire the shark for its sharkly ways. There are a lot of things that aren't admirable about it, too.
posted by JHarris at 10:26 AM on March 11, 2010 [1 favorite]


If the threat is other users on a multiuser machine then sure user-based access control is great. But if the machine is single-user, as most machines are, then all that user shit boils down to a glorified single bit of access control which doesn't really go that far in the internet era.

I agree. It ought to be user/application based. The combination of user U plus application X gets to do Y to Z files. Or something.
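
Something like this toy table, say. Every name in it is made up, and real systems in this neighborhood (capabilities, per-app sandbox profiles) are far subtler, but the tuple idea in C is just:

    #include <stdio.h>
    #include <string.h>

    /* (user, app, path prefix, may-write): all names hypothetical */
    struct rule { const char *user, *app, *prefix; int may_write; };

    static const struct rule rules[] = {
        {"u", "itunes",  "/home/u/Music/",     1},
        {"u", "browser", "/home/u/Downloads/", 1},
        {"u", "browser", "/home/u/",           0}, /* read-only elsewhere */
    };

    static int allowed(const char *user, const char *app,
                       const char *path, int writing) {
        for (size_t i = 0; i < sizeof rules / sizeof rules[0]; i++)
            if (strcmp(rules[i].user, user) == 0 &&
                strcmp(rules[i].app, app) == 0 &&
                strncmp(rules[i].prefix, path, strlen(rules[i].prefix)) == 0 &&
                (!writing || rules[i].may_write))
                return 1;
        return 0;
    }

    int main(void) {
        printf("%d\n", allowed("u", "itunes", "/home/u/Music/a.mp3", 1)); /* 1 */
        printf("%d\n", allowed("u", "itunes", "/home/u/Documents/x", 0)); /* 0 */
        return 0;
    }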
posted by delmoi at 10:37 AM on March 11, 2010


The combination of user U plus application X gets to do Y to Z files. Or something.

I'm so tired of the overuse of jargon around here!
posted by grubi at 12:21 PM on March 11, 2010


>: I remember trying to run a Windows 2.0 program on a lark under XP and was completely unable to do so, getting unsupported version errors.

I'm getting a warning on application launch that I should "obtain an updated version that is compatible with Windows version 3.0 or higher" lest I have "compatibility problems that could cause the application or Windows to close unexpectedly".

Working from the Windows 2.03 install files I've got on hand, Cardfile, Clipboard, Clock, Control Panel (one screen of sliders!), MS-DOS Executive, Notepad, and Terminal (only compatible with modems and serial cables) will run out of the box. Write (early WordPad) and Paint both exit with "Not Enough Memory" errors, all others just don't run at all.
posted by dunkadunc at 12:54 PM on March 11, 2010


Er – well, you may feel as though "the multiuser paradigm is really sweating," but every major OS maker seems to disagree with you.
I think they disagree from inertia, not because they think the point is wrong. The multiuser paradigm is great if you want to protect the system, but 99% of the time, I don't give two shits about the system -- I've got the install disks right here. What I care about is my data, and that's all under my user account.

One rogue application or bit of malware that gets authority to run as me can then destroy everything I care about on my machine. Hey, it might leave me the system intact and bootable, but if ~/Documents is gone, I don't care.

I think that's Wood's point about the internet era: my browser usually runs with my access privileges. There's various bits of sandboxing going on, but IE especially has been known to let the odd grains of sand dribble out.

When you're talking about a single user machine, then, yes, I think the multi-user paradigm as we know it in current implementations really is huffing and puffing trying to contort into the job it's now being asked to do.
posted by bonaldi at 1:01 PM on March 11, 2010


Oh man, reading the article I remembered playing Reversi as a 7 year old. So it must have been Windows 2.0 (it was on a 286 - it emulated 8086 and 8088 modes which you needed for some games... with the Hercules graphics card hooked up to a monochrome monitor...) Sweet, cheers for the post!
posted by yoHighness at 1:25 PM on March 11, 2010


Actually the other way round: it being Windows 2.0 helped me guess that I was around 7+ years old. Sweet
posted by yoHighness at 1:26 PM on March 11, 2010


reversi
posted by yoHighness at 1:29 PM on March 11, 2010


rfs: “The golden era of personal computing actually ended in 1994 when Commodore went bankrupt.”

Actually, I think you'll find that the golden era of personal computing actually started in 1994 with the Debian Manifesto.

bonaldi: “The multiuser paradigm is great if you want to protect the system, but 99% of the time, I don't give two shits about the system -- I've got the install disks right here. What I care about is my data, and that's all under my user account. ¶ One rogue application or bit of malware that gets authority to run as me can then destroy everything I care about on my machine. Hey, it might leave me the system intact and bootable, but if ~/Documents is gone, I don't care. ¶ I think that's Wood's point about the internet era: my browser usually runs with my access privileges. There's various bits of sandboxing going on, but IE especially has been known to let the odd grains of sand dribble out. ¶ When you're talking about a single user machine, then, yes, I think the multi-user paradigm as we know it in current implementations really is huffing and puffing trying to contort into the job it's now being asked to do.”

But that's not how computers work; and I think you have a weird idea of the multiuser paradigm. The system isn't to be protected simply because "I don't want my precious system to have to be restored!" Yeah, that's a hassle, but it's hardly the biggest reason for security. The system is to be protected because it's the common denominator, because it's the thing with access to the details of all users. If someone takes control of the system, it doesn't matter where your files are; it can get to them.

If you run your browser with your user access, and that's a security risk, the solution isn't "forget about the security risk and walk away." It's "separate users MORE." In this case, put your home directory on a separate encrypted partition; then your browser won't be able to have any access whatsoever to that kind of thing. That way, you have to have root superuser access to even mount your documents folder. This is already an option in Linux, and (in a move that I think makes a lot of sense) it's offered explicitly to the user at setup time, encouraging people to think about security.

It seems like a good thing to remember that "the multiuser paradigm" is one that hasn't made literal sense since 1985 or so – and yet it's survived all these years, in almost every operating system. It's something that IT folks have sometimes had to artificially create (cf WinXP) in order to preserve security, even on their own single-user machines. It is something that makes sense because without it the words you're using aren't even meaningful; you say that a bit of malware can wrest control of the system and "run as you," but that wouldn't mean anything at all if we did away with the multiuser paradigm. delmoi was right up above when he pointed out that ownership of processes is pretty much essential to an OS's security scheme; it's essential that a system of authority be laid down if there's going to be any sort of malware protection or even real efficiency of processes. That system implies the multiuser paradigm, even on "single-user" systems – which is really what we're talking about, I think.

I'd like to know what systems you're thinking of when you say the current paradigm is "huffing and puffing." Is this just annoyance at User Account Control that's talking, or what?
posted by koeselitz at 2:22 PM on March 11, 2010


Thanks for the Cygwin tips, people. It's been years since I've actually used it, and it appears to have improved immensely.
posted by Jimbob at 2:34 PM on March 11, 2010


meanwhile: 119 Anecdotes about the development of Apple's original Macintosh computer, and the people who created it.
posted by ovvl at 3:59 PM on March 11, 2010


Neat article.

The Windows® 95 User Interface: A Case Study in Usability Engineering is pretty interesting if you want to be taken back to the days of Win 95/Cairo.
posted by Artw at 4:21 PM on March 11, 2010 [1 favorite]


The system is to be protected because it's the common denominator, because it's the thing with access to the details of all users
Yeah, if you're talking about a timesharing hunk o' iron from 1978. Not so much if you're talking about my personal laptop.

If you run your browser with your user access, and that's a security risk, the solution isn't "forget about the security risk and walk away." It's "separate users MORE." In this case, put your home directory on a separate encrypted partition ...

I didn't say the solution was "walk away", but I really don't think "jump through hoops" is a good answer, either. Which is exactly what the "lock your home directory away" thing is: you're doing technical things needlessly, to fit into a security model that breaks things down only into "users".

That system implies the multiuser paradigm, even on "single-user" systems – which is really what we're talking about, I think.
No, it doesn't, and it only seems implied because we've lazily allowed multi-user to become so pervasive that it became our only way of looking at the world. It's just like there was a time there when every new language used C syntax, because otherwise programmers wouldn't even countenance it.

What is actually essential to security is accountability and trust. There are metaphors that fit these requirements much better than pretending that there are lots of little users in the computer, when there's really only one user: me, and the rest of them are daemons, jumping through hoops.

I'd like to know what systems you're thinking of when you say the current paradigm is "huffing and puffing." Is this just annoyance at User Account Control that's talking, or what?
UAC is more of a symptom. But I was thinking about the Mac, which is actually in a bad state but content to remain so because there's virtually no malware levelled at it. There's a robust multiuser implementation here, as you'd expect, but if a dressed-up AppleScript moved all my documents to the trash and emptied it, there'd be no security barriers in the way, even if I had a standard user account.

I don't know what's more broken: the filesystem metaphor or the multiuser security model. But I do know that when they combine, they create a Hard Problem, and it's going to take some OS thinking of the kind that isn't really happening much anymore to solve it. Perhaps the iPhone OS's a-sandbox-for-every-app is the way.
posted by bonaldi at 4:24 PM on March 11, 2010


bonaldi: “I didn't say the solution was "walk away", but I really don't think "jump through hoops" is a good answer, either. Which is exactly what the "lock your home directory away" thing is: you're doing technical things needlessly, to fit into a security model that breaks things down only into "users".”

I know you might feel as though it's technical jumping-through-hoops, but seriously, this is what putting the home directory on its own encrypted partition entails on an Ubuntu Linux system nowadays:

1. Check "put my files on a separate encrypted partition" box during setup.
2. Type in root whenever you want to access those files.

It's honestly that simple. It's very, very easy. Moreover it's the only way to really protect document files from browser access, so I don't really know what else we can do.

“I don't know what's more broken: the filesystem metaphor or the multiuser security model. But I do know that when they combine, they create a Hard Problem, and it's going to take some OS thinking of the kind that isn't really happening much anymore to solve it. Perhaps the iPhone OS's a-sandbox-for-every-app is the way.”

The iPhone doesn't even really have an OS in many senses, at least insofar as the developer is flatly prohibited access to most of the machine's core processes. You can call this "a-sandbox-for-every-app," but really it just means "don't give people any tools that might allow them to fuck something up." And as long as people actually want to use the core functions of their computers - overclocking their CPU for graphics-intense design work, switching out RAM for heavy mathematical computations, etc - such 'OSes' won't really fly.

Moreover, I disagree firmly about both the "multi-user" and "filesystem" metaphors (we've had this argument here, I know) but I think you're probably right to bring them up. Both are in about the same boat; people keep talking about how 'limited' they are, but the fact is that there is no way to construct a computer without them. In the end, all of this simply indicates that people would like a computer that floats in the air and communicates with them telepathically, which is really the only way we'll get rid of filesystems or multi-user systems. Do you really want a computer where you don't need an admin password to do something dangerous? Are you seriously going to argue that a computer with no such security measures makes any sense at all?
posted by koeselitz at 4:39 PM on March 11, 2010


Type in root password whenever...
posted by koeselitz at 4:40 PM on March 11, 2010


bonaldi: “... it's going to take some OS thinking of the kind that isn't really happening much anymore to solve it...”

Actually, rereading your comment, this is an interesting thing to say. What do you mean? Are you thinking of BeOS, or what?
posted by koeselitz at 4:41 PM on March 11, 2010


2. Type in root whenever you want to access those files.
That's really not "very, very easy", it's a pain in the ass. I manipulate hundreds of files a day. I want my apps to be able to manipulate those files, too. If I have to unlock them every single time, that's a heavy overhead. If they're unlocked for limited periods of time, then during those periods, the app can go to town.

It's the old you-can't-secure-your-computer-against-physical-access problem: in this little world of pretendy users we've made, we have to give those users virtually-physical access to do the things we want them to do. So then we can't secure our files against them.

but the fact is that there is no way to construct a computer without them.
It depends whether or not you beg the question in your definition of "computer". The Newton, for instance, had no files. It stored data in soups. (Those soups were stored at a low level in something you'd recognise as a filesystem, but I think that's a semantic point: it would be possible to implement the system without the discrete files, for sure.)

It's the same with multi-user, but I think you're confusing the arguments here because you're so locked into multi-user-is-the-only-game-in-town. The alternative to a multi-user system is not "no such security measures", it's a system where security is implemented without it being bolted onto the descendants of /etc/passwd.

Actually, rereading your comment, this is an interesting thing to say. What do you mean? Are you thinking of BeOS, or what?
What I mean is that currently we're in Flatland. We need someone to break the paradigm just as Engelbart/PARC/Apple shattered the CLI. I don't know what that would look like: I imagine sandboxed apps, with malleable permissions. iTunes doesn't need access to anything outside of ~/Music: why should it, and any app that happens along, have access to anything in ~/ by default?

Multi-user makes some apps pretend they are students in the next room of a college campus sharing time on my computer on a sunny day in 1978. Why can't we build a system that acknowledges they're actually all my apps, sharing access to my data in a more controlled way?

I don't know the answer, but I do know it's not "because everybody does it this way and has done for ages, therefore it's the best and only way to do it".
posted by bonaldi at 5:03 PM on March 11, 2010 [1 favorite]


iTunes doesn't need access to anything outside of ~/Music: why should it

Perhaps people don't keep their movies, TV shows and other media in that folder.
posted by Blazecock Pileon at 6:57 PM on March 11, 2010


No one is saying the multiuser paradigm is worthless. It's great. There are plenty of multiuser systems out there; I was working on honest-to-goodness multiuser workstations until well after 1985, and if you include LANs I still am. The admin/single-user separation has value as well.

It's honestly that simple.

Don't forget to always use a secure attention sequence before typing any passwords.
posted by Wood at 9:46 PM on March 11, 2010


My first job was writing programs under TopView.

Extra points if you know the name of its more successful clone.
posted by sfts2 at 5:16 AM on March 12, 2010


" I imagine sandboxed apps, with malleable permissions."

Oh, so capabilities? Solaris' implementation is good, and the Apple version, while less granular, is pretty sweet as well. The problem with using caps is that constraining by identity is way easier than constraining by instance, so the awesomely improved security you're looking for is more of a pain in the ass for the end user to administer rather than less.

Mostly I don't think you're looking for new security paradigms. You're looking for new security user interfaces, and improved implementations. I'd agree with that.
posted by majick at 6:02 AM on March 12, 2010


Perhaps people don't keep their movies, TV shows and other media in that folder.
I don't think allowing people to keep their music files in non-standard locations is an argument for permissions wide enough to fit a tank through. A better security user interface would allow the type of person who moves their media to move it, but without subjecting ordinary users to the drawbacks of permissions that are too loose.

Don't forget it was an iTunes upgrader that wiped out entire partitions.

No one is saying the multiuser paradigm is worthless. It's great.
I'm not saying it's worthless, but I wouldn't say it's great, either. And again with the "there are plenty". There are plenty of Windows installs too. So?

majick: The reason I'm looking for new paradigms and not better implementations of the same-old is precisely because the "awesomely improved" security of caps is a total pain in the ass for the end users, and hence is actually much worse. We can't get there from here.
posted by bonaldi at 11:15 AM on March 14, 2010




This thread has been archived and is closed to new comments