What Really Happened with Vista
June 5, 2017 10:06 AM

I think there is a different story to tell — one that is better rooted in the actual facts of the projects and the real motivations of key parties. This is not an effort at alternative history — I have no idea what would have happened if these mistakes were not made but they certainly did not help Microsoft navigate this critical inflection point in the computing industry. -- How Microsoft Vista Failed
posted by Chrysostom (49 comments total) 24 users marked this as a favorite
 
What happened with Vista? They fucked up WDDM on the two then most popular chipsets on earth (GMA 910 and 915) and it ran like dog shit on every <$1000 laptop.
posted by Talez at 10:16 AM on June 5, 2017 [5 favorites]


It wasn't just Microsoft. Intel made some epically bad decisions on some of their chipsets.

I had a netbook that used one of their chipsets that was completely unable to use Linux, as there were no drivers, nor any support.

After a long and painful correspondence with Intel, I finally found out what had happened. They had outsourced the GPU design to a third party and allowed them to keep full IP rights. So, when that company decided to only support Windows, Intel was basically unable to create a Linux driver.
posted by Samizdata at 11:23 AM on June 5, 2017 [6 favorites]


As with Windows 8, service-packed-up Vista was actually alright, and nowhere near as bad as the internet men would have you believe.
posted by GallonOfAlan at 11:41 AM on June 5, 2017 [3 favorites]


we got along quite well.
posted by shockingbluamp at 11:43 AM on June 5, 2017


The general excellence of Windows 7 also no doubt made Vista look a hundred times worse by comparison.
posted by Pope Guilty at 11:44 AM on June 5, 2017 [8 favorites]


Microsoft lost control of public opinion on Vista so badly that my mother, who knows nothing about computers and requires a detailed set of written instructions to access email or operate her cellphone, insisted that the new laptop she and my dad were getting shouldn't have it installed.
posted by sevenyearlurk at 11:45 AM on June 5, 2017 [8 favorites]


I can't help it, I'm going to be that person: RTFA! This essay is fascinating, especially in the context of Alan Kay's talk, How to Invent the Future (read down the post for working links).
posted by Chuckles at 11:58 AM on June 5, 2017 [17 favorites]


(older readers will remember the scandal of questionably labeled “Vista Capable” computers)

Oh God I'm an older reader
posted by selfnoise at 12:25 PM on June 5, 2017 [29 favorites]


It is a fascinating essay, but it has significant trouble getting to the point with anything even remotely resembling efficiency. Entire sections, sometimes several pages in length, could each be reduced to a one- or two-sentence description of a problem, likely followed by his over-used "This had long-running consequences." And only a few of those would need a perhaps two-sentence supplement of extra detail, presumably also followed by his nearly as over-used "I do not claim to have had unique insight during this period."
posted by mystyk at 12:28 PM on June 5, 2017 [4 favorites]


This is a fascinating essay and I deeply appreciate the author's access and point of view. However, I found it a bit difficult to read in spots (and aggravating in others) because of syntax, spelling, and style mistakes. I would have enjoyed the piece a lot more if it had been edited by someone with an eye for writing well.
posted by Drowsy Philosopher at 12:30 PM on June 5, 2017 [5 favorites]


only 00s adults remember Windows ME
posted by AFABulous at 12:44 PM on June 5, 2017 [19 favorites]


The thing about the 'Vista Capable' sticker is true. I inherited a laptop from that era recently with Vista and that sticker. Dog slow. Tried various minimal Linux distros on it to no avail. In the end I put XP on it and it's great. Then I put it on the shelf where it has stayed, because I had beaten it.
posted by GallonOfAlan at 12:48 PM on June 5, 2017 [11 favorites]


Cheers for posting Chrysostom, this was (just about) the perfect level of detail and explanation for me. I learned a lot about why when my wife decided to do away with her W95 machine and buy a brand-new Vista laptop all those years ago it was obsolete straight out of the box. (And why, as a consequence, she would interrupt my work to borrow my MacBook Pro so frequently.)
posted by stanf at 12:48 PM on June 5, 2017


My wife's computer ran Vista and it ran fine performance-wise, but it was the last Windows release (for me) that did that thing where it just slowly degrades into shitty bugginess over time without any user changes. The machine now runs Linux Mint and is effectively a Chromebook (only really used for Chromium), which is about the most direct refutation of the era's "rich client" model that I can think of.
posted by selfnoise at 12:51 PM on June 5, 2017 [1 favorite]


Heh. That short story linked at the end was a nice touch.
posted by radicalawyer at 12:56 PM on June 5, 2017 [4 favorites]


I would have made every one of those mistakes, had I been in a position to make those decisions.
posted by Ivan Fyodorovich at 1:17 PM on June 5, 2017 [7 favorites]


It was so bad it forced me to install Ubuntu for the first time. I'd call that a success of sorts.
posted by jpe at 1:19 PM on June 5, 2017 [3 favorites]


I would have made every one of those mistakes, had I been in a position to make those decisions.

Yeah, they mostly seemed pretty reasonable in isolation. It seems like they just tried to do too much at once.
posted by Chrysostom at 1:25 PM on June 5, 2017 [2 favorites]


.... a Chromebook (only really used for Chromium) which is about the most direct refutation of the era's "rich client" model that I can think of.

If Chrome isn't a rich client what is?
posted by WaterAndPixels at 1:35 PM on June 5, 2017 [1 favorite]


Chrome with WebAssembly?
posted by JoeZydeco at 2:40 PM on June 5, 2017 [2 favorites]


I worked with a developer once who vociferously defended Vista, dismissed all complaints and concerns as simply not "getting" Vista, and ridiculed anyone not using it as simply lacking basic computing knowledge. When cornered at last to explain how, then, Vista was supposed to be used, she went on a lengthy explanation that, when distilled, was essentially "turn off everything that makes Vista Vista and it will run perfectly fine, what is wrong with you."
posted by Aya Hirano on the Astral Plane at 2:40 PM on June 5, 2017 [6 favorites]


only 00s adults remember Windows ME

for serious Windows ME would lock up if you looked at it funny
posted by juv3nal at 2:47 PM on June 5, 2017 [4 favorites]


I would have made every one of those mistakes

You are probably not a project manager. They seemed to have gone out of their way to make every project management mistake possible. I guess it was the lack of competition that let them wander off on this doomed project.
posted by bhnyc at 2:48 PM on June 5, 2017 [1 favorite]


If I'm reading the article correctly, it sounds like the fundamental problem was that Bill Gates was trying to create monopolistic vendor lock-in again, just like he did the first time, using the same strategies that he did the first time.

I'm becoming increasingly convinced that most people who succeed in a big way once won't succeed again. We expect them to succeed again, though, which is why we're so often surprised and disappointed when a novelist's second novel isn't great, or when a genius hedge fund manager turns out to be as dumb as the rest of us. (Everybody knew that Eddie Lampert was just the guy to save Sears...) Massive success doesn't work that way most of the time, though.
posted by clawsoon at 2:56 PM on June 5, 2017 [5 favorites]


Interesting article - and it's in line with much of what I'd heard at the time as Longhorn was being developed.

The idea that most of the WinXP bluescreens were due to crappy third-party drivers doesn't sound accurate. I recall reading a report that it was around 50/50 WinXP itself/other drivers, based on the data sent back to Microsoft.
posted by Jessica Savitch's Coke Spoon at 3:07 PM on June 5, 2017


I read this a few days ago, and while I'm sure there is a bunch of truth in it, I also can't help but feel it is pretty colored by the fact the writer wasn't involved in Vista at all, and was part of the Office team, who by all accounts were completely against all the major changes in Longhorn from day one and fought it tooth and nail. To then turn around and say "see, it failed, we were right" while not mentioning that his organisation played a hand in that failure doesn't seem great.

I also liked Paul Betts' comment from Hacker News here https://news.ycombinator.com/item?id=14474433 which probably gets to the heart of some of the problems people have mentioned here (i.e. perf and drivers).
posted by markr at 3:18 PM on June 5, 2017 [1 favorite]


What always gets me about complaints regarding Vista is, the thing I've seen cited frequently as the Worst Feature Ever (particularly by techies) was IMHO not only a reasonable thing to add, but what we now view as an obviously correct model for security. I'm talking, of course, about the much-maligned User Account Control (UAC, sometimes referred to as User Access Control).

I mean, holy shit: Microsoft finally came around on the notion that desktop users should ideally not run every goddamn thing they do as a root user with maximum access! Instead, even a privileged user would have two sets of credentials -- their standard credentials, and their elevated credentials. By default, if you ran something it would be using standard credentials, with no access to do things like install programs, modify the OS, touch sensitive system settings, etc. If you wanted to run something that had to do those things, you could either do it forcefully ("Run As Administrator") or the program could ask you, and you would be prompted (via a set of screens specially walled-off to thwart imitation or attempts to trick you), and you could grant privileges.
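A rough sketch of that split-token model in Python, for flavor — the names and structure here are purely illustrative, not real Windows APIs: by default a process gets the standard token, and elevation requires either an explicit "Run as administrator" or an approved prompt.

```python
# Toy model of the UAC split-token idea (illustrative only; not real Windows APIs).
# A privileged user effectively holds two sets of credentials; new processes get
# the standard one unless elevation is explicitly requested AND approved.

class Token:
    def __init__(self, user, elevated):
        self.user = user
        self.elevated = elevated

def launch(program, user_consents, run_as_admin=False):
    """Return the token a new process would run with."""
    standard = Token("alice", elevated=False)
    if not run_as_admin:
        return standard                     # default: least privilege
    # "Run as administrator": the walled-off prompt must approve it
    if user_consents:
        return Token("alice", elevated=True)
    return standard

# Nothing is elevated unless the user explicitly says yes.
assert launch("setup.exe", user_consents=False).elevated is False
assert launch("setup.exe", user_consents=True, run_as_admin=True).elevated is True
```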

It was literally graphical sudo. You know, like, literally exactly how every competent fucking unix-like system is run nowadays. It solved a major problem.

Techies fucking hated it.

"It's so annoying, it keeps asking me for permission!"

Security: damned if you do, damned if you don't.
posted by tocts at 3:21 PM on June 5, 2017 [19 favorites]


It was literally graphical sudo. You know, like, literally exactly how every competent fucking unix-like system is run nowadays. It solved a major problem.

Your comparison would only work if practically every file on the filesystem were owned by root.

"Techies" fucking hated it because, instead of solving the actual problem of always requiring elevated access to perform almost any action, MS decided to just ask your permission to perform almost any action.

So, the only difference between before and after was an extremely annoying notification. If anything, it was less secure, since it conditioned users to not even read popups.

Although, there were some benefits to UAC. It was so terrible, Apple made this I'm a Mac ad that put some money into John Hodgman's pocket.
posted by sideshow at 3:50 PM on June 5, 2017 [14 favorites]


You can argue about some of the specifics of the implementation, and I would agree there were improvements that could be made, but the idea was not a bad one. Instead, I had a ton of programmers / IT people / etc basically balk at the very notion of Vista ever asking for permission, pushing hard and fast for "how do we turn this off forever", which is total bullshit.

I'm far from a Microsoft fanboy, but they put a lot of thought into the model, and even went as far as flying developers out to their labs to work with Vista compatibility (specifically focused on UAC) way ahead of release. They did a lot of legwork to try to make it a smooth transition, and they got shat on completely for it.
posted by tocts at 3:54 PM on June 5, 2017 [2 favorites]


The UAC notification process should be considered a textbook example of how harmful a bad UI can be to good technology.
posted by ardgedee at 4:35 PM on June 5, 2017 [4 favorites]


I love this, because I specifically went out and bought a Mac to avoid the awfulness I knew would be Vista. My efforts were successful. A few years later, just as Windows 7 was being released (and I was moving back to Windows), I got a free copy of Vista Ultimate for attending a MS focus group. The only thing I ever used it for was as a basis for installing an upgrade version of Windows 7.

So glad I missed it.
posted by lhauser at 4:52 PM on June 5, 2017


I found the bit about WinFS as some kind of new innovative relational data storage system to be interesting. The current way we organize discrete units of data on computers generally follows the paradigm of virtual filing cabinets, with drawers, with folders, and in those folders are bits of paper with data on them. And we can take out those bits of paper, those "files", and move them around, or send them to someone else, or write on them, or throw them in the trash. Computer file systems are constructed as a reflection of that real-world arrangement, whether we are talking about a floppy disk in an MSDOS PC in 1983 or a Linux server or a Windows desktop today. Every now and again, however, someone gets the bright idea that that might not be the smartest way to organize data on computers. There's no reason computers should be following that filing cabinet-folder-file structure, right? There are lots of other potential ways to organize, and maybe they would be better? Maybe it comes from that hatred of skeuomorphism that some people have; they just can't stand that sort of physical-world metaphor being applied to data storage.

So they come up with something radical and new and different and amazing and smart and efficient. And it inevitably fails. Because, you know what? Individual files as objects that you can send to people and create and delete and move around is actually not that bad a paradigm.

Mobile operating systems have started to make a dent in this; from the user's point of view there is no hierarchical file system like this on an iOS device - bits of data belong to individual apps, and they can sometimes be sent between apps, and they can sometimes be turned into a traditional file for emailing or putting on Dropbox. The flaws in this system are apparent to anyone who's tried to do complicated work using multiple data sources on an iOS device, however. And, of course, internally that standard hierarchical file system still exists.
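The relational idea is easy to demo in miniature. Here is a toy sketch in Python — purely illustrative, since WinFS itself was built on SQL Server technology and was far more elaborate — where items carry queryable properties, so "finding" a file becomes a query rather than a walk through folders:

```python
# Toy sketch of the WinFS-style idea: store items with queryable properties
# instead of (only) paths, so retrieval is a query, not a tree walk.
# (Illustrative only; the schema and names here are made up.)
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE items (name TEXT, kind TEXT, author TEXT, year INT)")
db.executemany("INSERT INTO items VALUES (?, ?, ?, ?)", [
    ("trip.jpg",   "photo",    None,    2005),
    ("budget.xls", "document", "alice", 2006),
    ("notes.doc",  "document", "alice", 2005),
])

# "Everything alice wrote in 2005" -- no folder hierarchy involved.
rows = db.execute(
    "SELECT name FROM items WHERE author = ? AND year = ?", ("alice", 2005)
).fetchall()
print(rows)  # [('notes.doc',)]
```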
posted by Jimbob at 4:54 PM on June 5, 2017 [4 favorites]


I never understood the bile against UAC because I was using separate Admin and User accounts back in the XP days, and it wasn't all that painful, really. I did skip vista because I had a Pentium 3 with 256 MB of RAM until I built my first computer in the Windows 7 days.
posted by Monday, stony Monday at 5:10 PM on June 5, 2017


This is exactly like Intel's bold and visionary transition to Itanium, which ended up pretty much the same way as Vista did.
posted by monotreme at 5:25 PM on June 5, 2017


"you are coming to a sad realization, cancel or allow?"

OMG I'm LOL'ing so hard at that.
posted by Annika Cicada at 6:23 PM on June 5, 2017 [4 favorites]


Drivers causing "bluescreens" - finally found the article (I have posted a few times on the blue, stating that the reason things have gotten much better is that Microsoft listened to their "phone-home" crash/debug telemetry data and did a lot of work on delivering better driver code samples):

http://www.zdnet.com/article/why-the-blue-screen-of-death-no-longer-plagues-windows-users/
posted by jkaczor at 7:18 PM on June 5, 2017 [1 favorite]


At least the icons were pretty.
posted by 4ster at 7:56 PM on June 5, 2017


for serious Windows ME would lock up if you looked at it funny

I hear that a lot, but I ran an OEM version of ME (with Plus! no less) that I purchased for like ten bucks with a mouse from a dodgy little computer store in Melbourne, on a Pentium III that I built myself out of parts that were literally out of dumpsters, and it ran fine. I Baldur's Gated for like a month straight on that thing, it was shit hot.

Edit: Apparently Plus! was not available for ME, so I must have been conflating it with Windows 98.
posted by turbid dahlia at 8:51 PM on June 5, 2017 [2 favorites]


"Older readers"?

I remember Windows 2 was a Vista-like horrorshow until Windows/386 V3.
posted by meehawl at 10:14 PM on June 5, 2017 [1 favorite]


UAC is kind of a classic example of the boy-who-cried-wolf phenomenon — requesting user confirmation for certain actions related to system security is one of those things that has to be infrequent enough to make an impact, and my memory of dealing with Vista is that "VOLUME ADJUSTMENT DETECTED, ALLOW OR CANCEL" was only a mild exaggeration. I could be misremembering, but I do recall my frustration with just how often those dialog boxes appeared in everyday use.

Granted, yeah, a lot of the problems with Windows security in general can be resolved by simply not using an admin/root account for everyday tasks, but that's a lesson that eluded Microsoft for a frankly distressing length of time.
posted by DoctorFedora at 12:41 AM on June 6, 2017


The interesting thing about UAC is it's "not a security boundary". They wanted it to be, belatedly realised it was impossible to retrofit, and gave up. (I think it's possible in principle, but whether there's an OS that has really done any better is another matter). IIRC the official blurb on that is it's only a "security feature".

The function of UAC is not to protect, exactly, because it can't. Not even on Windows 10.

The function of UAC is literally to inconvenience. It is to make inconvenient any software that assumed users would run it with admin access. Software authors are forced to see their software as bad and inconvenient, rather than being able to say "limited accounts are not supported, you should run everything as admin". Once they fix their software, running separate limited and admin accounts is actually possible.

Of course this is still not useful for the average home user. Having a second password for an admin account and clicking through the "fast user switcher" is not something normal people are actually going to do. (Linux happens to have a convenient bypass: you can quickly switch between logged in users with ctrl+alt+Fx, you just have to keep track of what user is on what Fx).

What the UAC project benefits is the functioning of corporate managed systems - where your normal people are not supposed to be installing software or fixing the system clock.

Possibly if they'd been able to see this clearly, they'd have known to weaken UAC as much as they could. I'm ignorant of what people complain about specifically, but I imagine the builtin Windows configuration GUIs could have been exempted for most things short of installing new system-wide applications or third-party drivers.

I expect there are incidental benefits to UAC on home systems too, just as enabling `sudo` has been such a useful tool for popularizing "Linux on the desktop". Running with `sudo` does not really prevent some malicious process from elevating to root access... literally all it has to do is wait for you to run sudo, and then any process with your user ID is also allowed to run sudo. But it's real nice to have this additional speed-bump before you inadvertently run the equivalent of `FORMAT C:`.
posted by sourcejedi at 2:25 AM on June 6, 2017 [1 favorite]


As someone who is drafted to do front line support for family and friends, I got great value out of the UAC. I could tell people, never hit yes. If what you are trying to do still doesn't work, call me. It saved me lots of uninstalling malware.

I always felt Vista just had one problem; it was slow. There was no way Microsoft could argue around slow.
posted by lowtide at 5:26 AM on June 6, 2017 [3 favorites]


I Baldur's Gated for like a month straight on that thing, it was shit hot.

Was it overclocked, or were you a victim of Vista's rampant CPU chipset driver issues?

*rimshot*
posted by Mayor West at 5:31 AM on June 6, 2017 [1 favorite]


Running with `sudo` does not really prevent some malicious process from elevating to root access... Literally all it has to do is wait for you to run sudo, then any process with your user ID is also allowed to run sudo.

!!!!!

Is this actually the implementation of UAC, and/or superuser privileges on *nix systems? Because it seems like that in itself is a huge, glaring security hole that is unnecessarily baked into the OS design. If I'm sudo rm -rf .-ing, what possible utility is there in elevating every process under my user ID to sudo? Elevated privileges should apply to the active process, and any child processes it spawns and explicitly owns.
posted by Mayor West at 5:44 AM on June 6, 2017


Of course this is still not useful for the average home user. Having a second password for an admin account and clicking through the "fast user switcher" is not something normal people are actually going to do.

This isn't how UAC works, unless the technical jargon for what you actually do is very misleading? If a UAC event pops up when you're logged in as a normal user, you don't have to click through anything; all you need to do is directly enter the password for the/an admin account.
posted by ROU_Xenophobe at 7:26 AM on June 6, 2017


If a UAC event pops up when you're logged in as a normal user, you don't have to click through anything; all you need to do is directly enter the password for the/an admin account.

That's right. But UAC isn't a security boundary. The UAC code is explicitly not maintained to be secure against malware. So when your goal is to have a security boundary, you can't use anything more convenient than fast user switching.

If I'm sudo rm -rf .-ing, what possible utility is there in elevating every process under my user ID to sudo?

For convenience, sudo only prompts for a password the first time. When you use sudo again within five minutes, you don't have to re-enter it.

There's a second part that I forgot: this applies within a single terminal. Apparently the credentials cache is per "terminal session ID". So the attacker also needs code to notice when you run sudo (scan the process list) and inject itself into a process with the same terminal session ID. I imagine you could start a debugger on the target process and ask it to evaluate something like `system("/bin/evil &")`.
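A toy model of that caching behavior, sketched in Python — illustrative only; real sudo keeps per-terminal timestamp files on disk, with a default timeout of five minutes:

```python
# Toy model of sudo's per-terminal credential cache (illustrative only).
# After a successful password entry, further sudo calls from the SAME terminal
# within the timeout window skip the prompt; a different terminal prompts again.

TIMEOUT = 5 * 60                 # sudo's default timeout, in seconds
cache = {}                       # (user, tty) -> time of last authentication

def sudo(user, tty, password_ok, now):
    """Return True if the command may run as root at time `now`."""
    last = cache.get((user, tty))
    if last is not None and now - last < TIMEOUT:
        cache[(user, tty)] = now      # still cached: no password prompt
        return True
    if password_ok:                   # cache miss: prompt for the password
        cache[(user, tty)] = now
        return True
    return False

assert sudo("alice", "pts/0", password_ok=True, now=0)        # first use: prompt
assert sudo("alice", "pts/0", password_ok=False, now=60)      # within 5 min: cached
assert not sudo("alice", "pts/1", password_ok=False, now=60)  # other terminal: prompt again
```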
posted by sourcejedi at 7:59 AM on June 6, 2017 [1 favorite]


I'm with Chuckles - RTFA.

I think part of the reason people are talking about stuff other than what's actually in TFA is that the writer really does a very poor job of letting non-nerds into the narrative. If you can fight past some of the extraneous detail (though much of that detail is fascinating in and of itself), it's really a very good cautionary tale.

The short version is there was (as always) a moment where technology companies realize they need to build something to enable more. But more what? Enable whom? Microsoft chose to try and enable more sophisticated capabilities and users via several ambitious technical initiatives. Apple essentially chose to lower the bar to becoming a technology user by tightly scoping all sorts of things in iOS (and also, several ambitious initiatives).

This is an age old problem. Do you service your existing customers? Or the people that are not yet your customers? If the latter, how do you know what you're doing will work to turn them into customers?
posted by NoRelationToLea at 1:01 PM on June 6, 2017 [2 favorites]


writer really does a very poor job of letting non-nerds into the narrative

I'm only nerd-light; I enjoyed the article, and I feel like he did a decent job of explaining stuff. Honestly, I think an editor could've better organized the article around 1-3 key points, because even among people who have RTFA there's a lot of disagreement about what the main point is. W/r/t enabling more sophisticated capabilities versus lowering the bar... I don't know if that's a fair comparison, because iOS and Windows/OSX are just completely different animals. And OSX has only recently begun implementing some of the "tight scoping" lessons learned from iOS. But during the Vista time period I don't think the contemporary OSX 10.5 was really actively seeking to restrict the scope of concern of end users... it was mostly just focused on smaller-scale features like Spotlight search and Time Machine in the wake of the switch to Intel and native 64-bit support.

I think the point is more accurately that MS tried to enable several capabilities at once, without a set of clearly ordered priorities and with inadequate revision of the plan once the engineering challenge of enabling several new capabilities became apparent. Apple has been interested in a new file system for years (Siracusa has been alluding to APFS since 10.6 or 10.7 at least) but delayed making it the default until now. Apple also took its sweet time getting to a memory-managed language with Swift in... 2014? I mean, I can't say Microsoft made a mistake in choosing where to guide its OS, but trying to accomplish all of the above by 2007 was possibly overly ambitious and, I guess (per TFA), why Vista failed.

That being said... now that OSX is finally catching up to Windows when it comes to memory management and modern file systems, I wish Windows would catch up in terms of ease of use, because I'd love to switch back. My Hackintosh is becoming less stable with each new version of OSX, and while I've gotten used to PowerMac performance, in no way is PowerMac pricing anything but highway robbery right now.
posted by midmarch snowman at 1:05 PM on June 7, 2017 [4 favorites]


I thought he did a great job of keeping the technical discussion understandable without dumbing things down. Noting the Office vs. Windows turf war within Microsoft, and how that influences his perspective, feels important. Beyond that, I find the criticisms in this thread rather banal.

I'm interested, because I believe Computer Engineering has a problem. I come at that as a relatively informed outsider, so I was very interested, first in the Alan Kay talk, but then David Heinemeier Hansson's bit on exponential growth and perverse tax incentives, and finally this piece by Terry Crowley. I think together these three posts paint a fascinating picture of a deeply flawed industry where none of the major players escape blame.

The best MetaFilter I've seen in ages!
posted by Chuckles at 5:42 PM on June 7, 2017 [4 favorites]




This thread has been archived and is closed to new comments