

Model Train Oriented Design
May 17, 2010 4:28 PM   Subscribe

"People who work with computers-especially those being exposed to a machine for the first time-can become quite entranced with these qualities, finding the computer a kind of alter ego. "Sometimes programmers just won't go home, take a bath or anything," reports a computer man who has got over it himself. "They're like a kid falling in love with a hot rod. They'll sit there working with their newfound 'friend' 20 hours a day, just watching the lights and drinking coffee. After a while they get to looking pale and unhealthy. They sit there fascinated and just forget to eat." Life, October 27, 1967 on "How the Computer gets the answer."
posted by geoff. (49 comments total) 24 users marked this as a favorite

 
I'll admit to dreaming in HTML when I first got started writing stuff.




Those are some incredible illustrations in the mag - and the other stories -

"The best college half back - just call him, O.J"


Oh, perspective!
posted by alex_skazat at 4:34 PM on May 17, 2010


They sit there fascinated and just forget to eat.

Fortunately computery types seem to have gotten over THIS particular hurdle...

*blows popcorn bits off of keyboard*
posted by hermitosis at 4:37 PM on May 17, 2010 [7 favorites]


That is really a very clear explanation of how computers work. You wouldn't find such an explanation today because the writer would be too busy trying to show how it supports Javascript and webpages to show the hidden simplicity -- which, in truth, isn't so simple in PC class machines any more which is why they have developed the annoying habit of crashing once in a while. When I was a kid, if the computer in my Dad's physics lab crashed, Hewlett-Packard sent a technician out to fix it.

In a world where a lot of peoples' impressions of computers were formed by movies like Colossus: The Forbin Project and The Billion Dollar Brain, Dad told me one of the most important things anyone has ever told me about anything: "Son," he said, "A computer is a blazing moron. It is just smart enough to do the wrong thing a million times a second." That perspective removed any intimidation I might have felt about those still expensive machines and created my career.
posted by localroger at 4:38 PM on May 17, 2010 [15 favorites]


At an interview for the first neuroimaging job I ever applied for, the PI told me "If the computer is not your friend, fMRI is maybe not for you."

I spent a lot of time with the glowbox learning how to do fMRI, but it still was several years before I realized how right he was. I never thought I'd be this big a computer geek, and I still don't have ice-hot hackin' chops or anything, but man, it's hard to get away from the pattern of blinking lights (and today, the pattern is all wrong).
posted by solipsophistocracy at 4:41 PM on May 17, 2010 [1 favorite]


A couple weeks ago I would have been surprised by this phenomenon but I've since used an iPad and been similarly ensorceled. I know I've never before touched anything so full of possibilities and while I am not yet a parent I have held small children none of whom have yet amounted to anything.
posted by I Foody at 4:49 PM on May 17, 2010 [3 favorites]


IBM System/360: yes, it was made up of thousands of individual wires. A bit more history on the line of computers:
The IBM-360 family of computers ranged from the model 20 minicomputer (which typically had 24 KB of memory) to the model 91 supercomputer which was built for the North American missile defense system. Despite their differences, all these machines had the same user instruction set; on the smaller machines many of the more complex instructions were done in microcode rather than in hardware. For example, machines in the lower midrange did not have multiplier hardware, but the microcode implemented multiplications by repeated addition. It was rumored that the smallest machines did addition by repeated increments!
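The microcode tricks described in that passage are easy to sketch. Here's a toy illustration in Python (not actual System/360 microcode, just the rumored algorithms, for non-negative operands):

```python
def add(a, b):
    """Addition as repeated incrementing, as the smallest
    System/360 models were rumored to do it."""
    for _ in range(b):
        a += 1
    return a

def multiply(a, b):
    """Multiplication as repeated addition, as the microcode on
    lower-midrange models without multiplier hardware
    reportedly did it."""
    result = 0
    for _ in range(b):
        result = add(result, a)
    return result

print(multiply(7, 6))  # prints 42, built from 6 additions of 7
```

Slow, but it keeps the hardware simple; the same instruction set works whether the machine has a real multiplier or not.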
And I thought breadboards were only for hobby builders.
posted by filthy light thief at 4:51 PM on May 17, 2010


Ah. I was so much more like that in my early career. I miss those days.

Having a kid will severely mess with your ability to pull that shit.
posted by Artw at 4:54 PM on May 17, 2010


A Turing-complete train set. I like it.

Modern processors really aren't so different from this, when you get right down to it; few people would bother going into this much detail though, because there are so many additional layers built on top of them.

Actually I think it's probably been a good 40 years since a single person could have really understood the complete inner workings of any typical processor — they're just too complex for anyone to actually understand, to hold the entire schematic down to the actual gates, in their head. So even the people in charge of the teams who design the processors don't really understand them at the lowest level; everybody is abstracting away at least some of the complexity.

In '67 there would have still been a lot of people around who cut their teeth on systems that were simple enough so that a person might, with enough study and dedication and interest, actually hope to understand every part of it. And I think that's reflected, indirectly, in the bottom-up approach of the article. You don't see that anymore, because anyone attempting a true bottom-up understanding of a modern PC would quickly be driven insane.

So instead journalists and the public approach things from the top down; they start off with something complex but approachable (like Word) and then explain how it works as you go down through various abstraction levels, getting more hand-wavey all along.
posted by Kadin2048 at 4:55 PM on May 17, 2010


The computer is a series of railway tracks.
posted by unliteral at 4:56 PM on May 17, 2010 [3 favorites]


We sure showed them! Now a majority of Americans (and a lot of the rest of the world) sit and watch blinking lights all day! And we have like 25 different kinds of coffee!
posted by DU at 4:58 PM on May 17, 2010 [2 favorites]


Crikey. I know there's a lot computers do for us now, often without us realizing a bit of it. The $299.99 "big-screen" TV ad on page 3 of this LIFE magazine made me snicker (though that $299.99 price tag from 1964 inflating to a somewhat shocking $2,052.90 today is another matter), and somehow watching a random video game promo trailer on YouTube kind of blew my mind just now, considering the context of the article. Checking computer price aggregators, it looks like there is more RAM sold by the gb than mb, and I don't know if I ever remember being able to buy RAM by the kb.
posted by filthy light thief at 5:13 PM on May 17, 2010


Dad told me one of the most important things anyone has ever told me about anything: "Son," he said, "A computer is a blazing moron. It is just smart enough to do the wrong thing a million times a second."

Truly. Still, that is at least two orders of magnitude less moronic than the average company executive.
posted by Ritchie at 5:28 PM on May 17, 2010 [3 favorites]


Meanwhile, this hippie is going to steer clear of that Methadrine stuff. Sounds dangerous.
posted by not_on_display at 5:42 PM on May 17, 2010


Smack in the middle of the OJ article is a full page advertisement entitled "Is your present group life insurance worth dying for?"

Well, LIFE magazine, is it?
posted by swift at 5:54 PM on May 17, 2010 [1 favorite]


I don't know if I ever remember being able to buy RAM by the kb.

Actually, you still can. For an extra bit of weirdness: '30-pin SIMM, 256kB capacity' has a Facebook page, with 4 friends no less.
posted by jedicus at 6:03 PM on May 17, 2010 [1 favorite]


Man, don't miss the O.J. article.

"The boy laughed, embarrassed now, and it was clear on his idolizing face that he would have done anything -- even gone to school -- if O.J. Simpson told him to."
posted by decagon at 6:16 PM on May 17, 2010


"The Federal Crusade Against Smoking Has Gone Too Far" is also lulzworthy in a Mad Men sort of way.
posted by jokeefe at 6:23 PM on May 17, 2010


booze-tobacco-booze-tobacco-booze-booze-booze-tobacco-FUR!-booze-tobacco-booze-sprite.
posted by contessa at 6:54 PM on May 17, 2010 [1 favorite]


I remember my stepdad coming home all excited one day. Not only had he upgraded his processor to a 386, he'd splurged and gone to 16MB RAM. We were overjoyed! Gaming bliss was to come!

Some things simply don't change.
posted by Pope Guilty at 7:14 PM on May 17, 2010 [1 favorite]


Nice, thanks for posting this.
posted by carter at 7:21 PM on May 17, 2010


filthy light thief wrote: "Checking computer price aggregators, it looks like there is more RAM sold by the gb than mb, and I don't know if I ever remember being able to buy RAM by the kb."

My first computer (a PC clone) had DIP sockets for the memory. It was weird. Rather than buy a PCB with some chips mounted on it and a nice connector that fits in a slot, you got to place the chips individually.
posted by wierdo at 7:36 PM on May 17, 2010


"Sometimes programmers just won't go home, take a bath or anything," reports a computer man who has got over it himself. "They're like a kid falling in love with a hot rod. They'll sit there working with their newfound 'friend' 20 hours a day, just watching the lights and drinking coffee. After a while they get to looking pale and unhealthy. They sit there fascinated and just forget to eat."

You know, despite the Tee-Hee-I-Resemble-That-Remark comments, it actually makes me kind of sad that the programming industry has always had this approach, and still does to a substantial extent. My first job out of college was programming, but I've since left the industry.

Being obsessive about one's hobby is fine & dandy, whether it's computers or motorcycles or Star Wars action figures. And having one's hobby turn into a career is even nicer, but there are inevitable mismatches between exactly what turned you onto it in the first place and what you end up doing for, you know, a paycheck. Programming text-based RPGs for my little brother presented a high level of nerdy bliss back in the day, and I could do it for hours on end. While implementing secure transactions at job X or debugging UNIX port communications at job Y sort of relies on the same skills and can provide a measure of satisfaction, it really is not the same. I imagine installing & maintaining windows on a dozen workstations would be even farther off the path.

And the thing about this industry is that you're supposed to approach it the same way: you're supposed to take the job you're assigned and crank it out long into the night because... I don't know. Because you're supposed to be a nerd who is hardcore about these things, when really, at that point it's just a job. It's an interesting job, and a good-paying job all things considered, but other careers don't demand the same level of obsessive interest and dedication just to be considered on par in the industry.

Once you come out of college with a degree (even an EECS degree), one would hope that you'd also have become more well-rounded and developed other interests. But then you get tossed back into an industry that is not particularly open to that idea, because you are regarded as a computer nerd and you are supposed to be having fun at this job. It's sad and disappointing, and I think it eventually chases a lot of people away from programming.
posted by rkent at 7:48 PM on May 17, 2010 [16 favorites]


wierdo: And the special feature was that those chips tended to walk out of their sockets via thermal cycling. SRSLY. There was a time when you could get your geek cred and a nice side income by pushing them back in.
posted by localroger at 7:53 PM on May 17, 2010


That's awesome.

If you're interested in a more in-depth (but still very approachable) look at how computers work, the book The Elements of Computing Systems is a great read. Using a hardware simulator, it takes you from building basic logic gates, through the workings of RAM and the CPU, all the way to implementing a compiler and writing programs for the machine. You learn how the whole system works together by building a complete computer from first principles.
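The book's first chapters build every logic gate out of a single NAND primitive. A minimal Python sketch of that idea (the book itself uses its own HDL and a hardware simulator, not Python):

```python
def nand(a, b):
    # The sole primitive; everything below is composed from it.
    return 0 if (a and b) else 1

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    # De Morgan: a OR b == NOT(NOT a AND NOT b)
    return nand(not_(a), not_(b))

def xor(a, b):
    return and_(or_(a, b), nand(a, b))

# Truth table for XOR, built entirely from NAND:
for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor(a, b))
```

From gates like these the book works up through adders, RAM, a CPU, and eventually a compiler; the same compositional move, repeated at every layer.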
posted by findango at 8:05 PM on May 17, 2010 [5 favorites]


I liked the car ads the best.
posted by Devils Rancher at 8:24 PM on May 17, 2010


booze-tobacco-booze-tobacco-booze-booze-booze-tobacco-FUR!-booze-tobacco-booze-sprite.


I'm imagining that as the inner monologue of millions of Americans on a slightly chilly day in October, 1967.

Damn. Born too late. I'd trade my iPhone, internet, and my 21st century lifestyle for a computer engineer job in '67, punch card machine, a 20MB hard drive that only weighs 4000lbs, several cartons of Lucky Strikes, and a case of fine, fine booze in a heartbeat.
posted by chambers at 11:15 PM on May 17, 2010 [5 favorites]


Y'all should read Sherry Turkle's The Second Self. Even though it was written in the '80s, it's still more or less the best study of the "computer person" personality ever made, I think.
posted by nasreddin at 11:36 PM on May 17, 2010 [2 favorites]


Once you come out of college with a degree (even an EECS degree), one would hope that you'd also have become more well-rounded and developed other interests. But then you get tossed back into an industry that is not particularly open to that idea, because you are regarded as a computer nerd and you are supposed to be having fun at this job. It's sad and disappointing, and I think it eventually chases a lot of people away from programming.

This is exactly my problem.

Thing is, yes, sometimes they'll give me a problem that I want to solve. And I'll start working on it, and then look up and realize that everybody else is gone, the cleaning staff already split, and it's dark outside.

The problem comes when they expect this as my default behavior.

I haven't worked a regular job in pushing on two years now. And I'm planning on going back to the industry. And while I want the work (because I'm bored), and I want the money, I'm pretty scared. I have a zillion other interests and hobbies, and I want time to actually enjoy my life. And, yet, I'm pretty convinced that anybody who hires me is going to expect 60 hour work weeks for 40 hours' pay just cause "that's how it is".
posted by Netzapper at 12:03 AM on May 18, 2010 [1 favorite]


I don't know if I ever remember being able to buy RAM by the kb.

I did, I did! I once upgraded my Sinclair ZX Spectrum from 16K to 48K!

Damn, I suppose that makes me an old git now, doesn't it?
posted by Skeptic at 12:05 AM on May 18, 2010 [1 favorite]


Some beautiful images in that article.
posted by molecicco at 12:11 AM on May 18, 2010


My dad was like that. I think the critical moment was when he put one in the bedroom and mom said "it's either that thing or me"... they're still married, there's still a laptop in the bedroom, but now she's got her own machine to play games on...
posted by infini at 1:23 AM on May 18, 2010


Meanwhile, this hippie is going to steer clear of that Methadrine stuff. Sounds dangerous.

Love the title of that article: "Drugs That Even Scare Hippies"
posted by fairmettle at 2:21 AM on May 18, 2010


Y'all should read Sherry Turkle's The Second Self. Even though it was written in the '80s, it's still more or less the best study of the "computer person" personality ever made, I think.

Computer Power and Human Reason, by Joseph Weizenbaum (of Eliza fame), has some interesting observations on obsessive personalities. He draws analogies between hackers and gamblers - both build grandiose systems with little grounding in reality.
posted by Leon at 2:47 AM on May 18, 2010


I'd trade my iPhone, internet, and my 21st century lifestyle for a computer engineer job in '67, punch card machine, a 20MB hard drive that only weighs 4000lbs, several cartons of Lucky Strikes, and a case of fine, fine booze in a heartbeat.

I guess the plot for the upcoming season of Mad Men is officially out of the bag.
posted by crapmatic at 3:11 AM on May 18, 2010 [1 favorite]


You don't see that anymore, because anyone attempting a true bottom-up understanding of a modern PC would quickly be driven insane.

Part of the problem is that it's getting very hard to get the really low-level data anymore. I've spent some time, for instance, trying to get details on how Intel's VT-x and AMD's Pacifica technologies work, and how they differ. (these are hardware support for virtualization; they make programs like VMWare run faster.) All I've been able to determine is that they both exist, and that they're somehow different, but nobody much seems to know how or why, or which one is the better approach. Everything is all hand-wavy, even the technical whitepapers and the like.

It may be that I'm not digging hard enough, because obviously the Linux kernel devs, for instance, were able to get the hard details on how the two systems work, but it's been fairly frustrating starting from scratch and trying to find detailed information, much less educated opinion about what that information means.

I miss the days of Byte Magazine. Frequently, when I finished an article, my brain hurt. I remember chewing on some issues for weeks, and still being quite sure there was a lot I wasn't getting. Byte, more than any other source I've ever encountered, showed me just how ignorant I was on a regular basis, and I miss it very much.

I haven't found anything equivalent in the Web age. The technical world seems to be splitting into subdisciplines that barely talk to each other anymore, and maintaining a good grasp of the various approaches in the various fields is becoming very difficult indeed.
posted by Malor at 4:45 AM on May 18, 2010 [2 favorites]


My first computer (a PC clone) had DIP sockets for the memory. It was weird. Rather than buy a PCB with some chips mounted on it and a nice connector that fits in a slot, you got to place the chips individually.

Our first computer (I was a kid) was a TI 99/4... we later upgraded to the 4A, which added a marvelous new, high-tech feature: lower case. Seriously. And when TI started to run into trouble, they had firesales on their Peripheral Expansion Box, and we jumped on it. They had some package of the box (which was one of the most incredibly over-engineered pieces of hardware I've ever seen; you could have dropped a safe on it and it would probably have survived), a disk drive, and a 48K memory expansion, for something like $500... this was way down from the $1500 it would have been previously, and we were very excited. That memory card was about the size of a big hardback book, but about half the thickness, and it plugged into a giant slot. Man, what a difference that made. You could write giant programs, and store and load them by name, almost instantly. No more keeping your own catalog of tape positions... pure bliss. Those floppies held 93 entire K. Now, this sucked compared to Apple IIs, which did 140K, or C64s, which did 160, but it was a very fast drive, and was quite reliable. It was pretty transformative. It changed the computer from a toy into something you could actually work with.

In the next generation of computers, I got into the Amiga. In that 16-bit generation, you bought memory expansion boards separately from memory, if you wanted a good price. The memory would come as a bunch of individual chips, the DIP packaging you're referring to... small chips with one row of eight pins down each side. You had to orient the chips the right way, aligning the notch on the chip with the notch on the socket (there was nothing else stopping you from putting the RAM in backwards and toasting it), and then carefully, carefully push it down in the slot without bending any pins. They'd come in these little plastic boxes filled with foam, with all the chips stuck in the foam. And man, those early chips were super static-sensitive, so you had to be really, really careful.

I remember the first big RAM crunch... it must have been about, oh, 1989 or 1990. Prices soared on RAM due to a shortage of production, and for a while, a typical-for-the-time 2 meg expansion was over a thousand dollars just for the RAM, and then another couple hundred for the mini expansion box to put it in. Not surprisingly, we didn't sell very many of them for about six months. (I was working in an ST/Amiga shop; I was the Amiga guy. :) )

The Atari ST 1040 was heavily marketed as the first 1 meg computer you could buy for under a thousand dollars.... "wow, that's less than a dollar per K!". It really was very exciting, because that was getting into an actually useful amount of RAM. A 64K machine is very limited in what it can do. It's very difficult to work with data of any size... you have to swap in and out from external storage, and that's very painful. But when you get up to a meg or two, you can do some really, really cool stuff. A meg is really a lot of space, when you're dealing with primarily text information. You could fit a whole book in the computer, all at once.

Things just marched along at a dizzying pace until the 486-33, computers getting faster and more powerful by leaps and bounds. It slowed down then a bunch, but kept improving steadily up to about the P3-700. After that, in terms of actual usefulness for real people doing real work, the rate of change dropped a great deal. And modern CPUs are so fast that, for most people, they hardly matter at all. The difference between a 2GHz Core 2 and an overclocked i7 at 4GHz just isn't that major for most people. Both are perfectly capable of doing almost any normal task you want. About the only difference is that decoding 1080P video usually needs about a 2.6GHz processor to do in software, but with the advent of decoding in hardware, that's now irrelevant. New CPUs have minor impact on your life, and there are hardly any 64-bit programs to take advantage of the huge amounts of RAM you can now stuff in a workstation.

It's kind of sad, actually. I'm not aware of any other time in history when any field advanced so quickly, and with so much benefit to the actual people using it. It was a burst of creativity and enthusiasm that changed a lot of lives. You see some of that creativity and energy rebounding still, in the new Net applications that are coming online, and new ways people are finding to use bandwidth. Maybe I'm just too old to fully appreciate the energy there, but it doesn't seem the same as it was from about 1985 to about 1995. It's still good, but it's like comparing a four-cup caffeine high to methamphetamine.
posted by Malor at 5:22 AM on May 18, 2010 [8 favorites]


I'd trade my iPhone, internet, and my 21st century lifestyle for a computer engineer job in '67, punch card machine, a 20MB hard drive that only weighs 4000lbs, several cartons of Lucky Strikes, and a case of fine, fine booze in a heartbeat.

I wouldn't. That's nuts.
posted by grubi at 6:50 AM on May 18, 2010 [1 favorite]


rkent, you are describing what I call the "disposageek" system. We're all the same, interchangeable. You get a geek, work them 100+ hours a week until they burn out, fire them and get another.

I don't think other jobs expect that kind of insanity.
posted by QIbHom at 6:56 AM on May 18, 2010 [2 favorites]


I miss the days of Byte Magazine.
Oh my! Yes. I remember an issue that had Spock on the cover and contained an article that talked about a waiter entering a restaurant that was on optical disc and the different outcomes that could ensue. I kept it for years and read it over and over (it never made sense to me) and I loved it.
posted by unliteral at 7:22 AM on May 18, 2010


You get a geek, work them 100+ hours a week until they burn out, fire them and get another.

I don't think other jobs expect that kind of insanity.


Journalism.
posted by mrgrimm at 9:41 AM on May 18, 2010 [1 favorite]


You get a geek, work them 100+ hours a week until they burn out, fire them and get another.

I don't think other jobs expect that kind of insanity.


Law.
posted by mrgrimm at 9:41 AM on May 18, 2010 [1 favorite]


You get a geek, work them 100+ hours a week until they burn out, fire them and get another.

I don't think other jobs expect that kind of insanity.


Medicine.
posted by mrgrimm at 9:42 AM on May 18, 2010 [1 favorite]


I think journalism is a good comparison, career-wise. Careers in medicine and law have a licensing/cost component that's more of a factor in the 100-hour/week schedule (i.e. doctors and lawyers work those hours because they have educational debt and can't afford to lose their jobs). But reporters, especially those on a beat, are always considered to be working, 24/7.

Some non-profits/charities have the same problem. When you have someone who "loves what they do," it's easy to exploit them into working overtime.

Also, it's all relative (obviously). I work with engineers who show up at 10 and leave at 4-5. We use Rally to track tasks/assignments, and the general workload is 5 hours/day of tracked work. So, yeah, no way does that add up to 100 hours a week, even with meetings, etc.
posted by mrgrimm at 9:59 AM on May 18, 2010


Old enough to remember "Do not fold, bend, spindle or mutilate"?
posted by etaoin at 11:17 AM on May 18, 2010 [2 favorites]


Malor wrote: "It slowed down then a bunch, but kept improving steadily up to about the P3-700. After that, in terms of actual usefulness for real people doing real work, the rate of change dropped a great deal. And modern CPUs are so fast that, for most people, they hardly matter at all. The difference between a 2Ghz Core 2 and an overclocked i7 at 4Ghz just isn't that major for most people."

Yeah, up to the Pentium II, things were still progressing along pretty quickly, then slowed. The Pentium III wasn't much of an upgrade. Then we got the Pentium 4, by which time almost all progress in the CPU area was being made by AMD. Core 2 was a pretty good jump, though. I bought my Core2Duo laptop four years ago and I have absolutely no interest in upgrading it. It would be nice if it had a newer GPU (only for H.264 decoding, really), but the CPU is perfectly fine.

Part of the issue is that those of us who have been around a long time expect performance improvements. It seems like Intel is focusing more on efficiency at the moment. (I have no idea what AMD is doing at the moment, since I'm not in the market for an upgrade)

Don't get me wrong, I'm enjoying the ever increasing size of hard disks (I still can't believe we have 2TB drives now), but it seems like the fundamental improvements are slower in coming.
posted by wierdo at 11:54 AM on May 18, 2010


Yeah, Core2 was a pretty substantial improvement. The P4 was designed around the idea of scaling clockrate to the moon. It had very long pipelines. A CPU is a lot like an assembly line, in that it has to fetch instructions from memory, decode them, execute them, and then store the results. These steps can be subdivided, and the P4 went for a super-subdivided approach of many, many tiny pieces working in concert. The advantage of doing this is that you can clock simpler circuits up faster, much faster.

The disadvantage is that the latency increases; it takes longer from the time the instruction hits the front of the chip to the time that it exits the other end. Further, programs are branchy... they test values and then jump to different lines of execution based on the result. The problem with the long pipeline was that the fetch-and-decode part of the chip wouldn't know which branch to take until the instruction had been executed, and that didn't happen until a lot later. So the chip would make a guess, and start fetching and decoding one of the two branches. If it guessed right, then everything was great, and the chip chugged along full speed. But if it guessed wrong, as soon as the execute stage realized a bad prediction had been made, all the work that had been done down the 'wrong' branch had to be thrown away, and the chip had to start over from scratch down the right execution path. And, because it took quite a while for the instructions to percolate through all those pipeline steps, this caused a big bubble where no useful work was being done, a 'pipeline stall'.
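The cost of those stalls falls out of a simple back-of-the-envelope model. All the numbers below (pipeline depths, branch frequency, predictor miss rate) are illustrative assumptions, not measured figures for any real chip:

```python
def avg_cycles_per_instruction(pipeline_depth, branch_fraction, miss_rate):
    # An ideal pipeline retires ~1 instruction per cycle; each
    # mispredicted branch flushes roughly pipeline_depth cycles of work.
    return 1 + branch_fraction * miss_rate * pipeline_depth

# A short P3-style pipeline vs. a long P4-style one, assuming
# 20% of instructions are branches and a 5% misprediction rate:
short = avg_cycles_per_instruction(10, 0.20, 0.05)  # ~1.10 cycles/instr
long = avg_cycles_per_instruction(31, 0.20, 0.05)   # ~1.31 cycles/instr

# The long pipeline needs roughly this much more clock speed just to
# break even on ordinary branchy code:
print(long / short)  # ~1.19
```

Which is exactly why the design only pays off if the clock actually scales; miss the clock target and you eat the stall penalty with no compensating speed.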

The original idea was to just scale the clockrate to such insane frequencies that it effectively didn't matter. The long pipeline stalls were the price to pay, in theory, for the vastly increased execution speed when the branch predictor got it right. And, had the chip indeed scaled to 10GHz, as they were originally planning, they would have been correct.

Problem was, they slammed into two walls simultaneously; the speed of CPU transistors topped out around 3GHz, and got unreliable after that. And the speed of memory, which was the other critical component in filling a long pipeline quickly, hit a rock-solid wall at 200MHz. (Consumer memory hasn't actually increased in speed in something like ten years now; all the "advanced" DDR and DDR2 and DDR3 technologies are just putting more slow memory chips side by side, so that total bandwidth increases, but latency doesn't change at all.) They can constantly make stuff smaller, but not very much faster. And this crippled the P4; that architecture doesn't even start to get good until 4GHz, and wouldn't really sing until 8, but they simply couldn't make it go that fast.

The Core2 was a fundamental rework. They canned the entire P4 architecture, and started over with some amazing work that their Israeli lab had done on the Pentium-M. The Pentium-M was a short-pipeline, high power efficiency design for laptops, based largely on the P3. It worked extremely well in that environment. It was such a good chip, in fact, that some hobbyists found desktop boards that would run a P-M, because it would run most stuff faster than a much higher-clocked P4. Intel has been working from that architecture ever since.

I'm not sure what the main differences are in the Core and Core2; I know they improved the branch prediction, and added 64-bit support and more on-chip cache, but beyond that, I think it's mostly still a Pentium-M. Whatever the actual features are, and chip companies tend to be much quieter about what they're doing internally these days, they really got it together with the Core2. It was substantially better than anything AMD was offering for quite a while. (AMD had just stomped Intel all through the P4 era, because their Athlon design had a much shorter pipeline.) i5 and i7 aren't huge improvements; they have more bandwidth and do virtualization and 64-bit support better, but they're nothing like the night-and-day difference from the P4 to the Core and Core2.

The reason CPU manufacturers are going 'wide', giving you more cores, instead of 'fast', giving you one core that's really smoking quick, is because they have no other choice. They're doing power efficiency and multicore because that's all they can make. Unfortunately, it's not what we really need. Most software doesn't parallelize well, except in a few specific cases, and multicore programming is at least an order of magnitude harder than single-threaded applications. (there's a LOT of weird bugs you can get when you have several execution units all twiddling the same bits in RAM... and they tend to be subtle and hard to reproduce.) And that's why, mostly, not all that much has been happening for the last six or eight years. They can't go faster, they can only go wider, but our actual use for that is pretty limited.
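Those "subtle and hard to reproduce" bugs mostly come down to unsynchronized read-modify-write on shared state. A minimal Python sketch of the classic lost-update hazard and its standard fix:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    # "counter += 1" is not atomic: it is a read, an add, and a
    # write-back. Without the lock, two threads can read the same
    # old value and one of the updates is silently lost; rarely,
    # and unreproducibly, which is exactly what makes it nasty.
    global counter
    for _ in range(n):
        with lock:  # remove this line to (sometimes) see lost updates
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 400000 with the lock; unpredictable without it
```

The lock serializes the critical section, which is also why naive locking erodes the very parallelism the extra cores were supposed to provide.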

This is much of why the excitement these days is in handhelds, because they're taking the old speed boosts down to micro-size. They can't raise clockspeeds much, but they can keep making the same stuff smaller every year, so you're seeing the old performance levels move down into tiny devices. Eventually, you should be able to do 3Ghz-class computing in a handheld, but if existing experience holds, it'll top out there.

What they'll do after that is anyone's guess.
posted by Malor at 1:41 PM on May 18, 2010 [4 favorites]


They are extremely simple underneath. Which was perfectly obvious when they all came with BASIC installed. Which is why they quit doing that.

Now we're RIGHT back in the same position we were in when IBM ruled and you needed a 12-foot shelf of manuals to do anything.

Come to think of it, that's very similar to what happened to the U.S., innit? Because freedom means you have to do things yourself.
posted by Twang at 2:06 PM on May 18, 2010


I wouldn't. That's nuts.

I just missed out on the days when incredibly expensive military computer equipment, such as this SAGE radar console (about 2:20 in), had built-in ashtrays and cigarette lighters.
posted by chambers at 2:10 PM on May 18, 2010 [1 favorite]


Malor wrote: "but they're nothing like the night-and-day difference from the P4 to the Core and Core2. "

I just want to point out that the i3/i5/i7 are much better than Core2 on certain workloads (like video encoding), but otherwise it's just more of the same, as everything has been for quite a while now from Intel. I think it has more to do with the increased on die cache sizes than anything else, though.

Intel is kept afloat mainly by their excellence in reducing process size. They're pretty much always the ones in the lead on that count. In overall design, not so much, excluding the power efficiency compared to AMD.

Also, P4 could be clocked higher than 3GHz, but it required going to ridiculous lengths to keep it cool. In addition to the long pipeline the thing ran as hot or hotter than an early Athlon. (I had a 2GHz Thunderbird chip that I could not keep below 65C no matter what kind of awesome cooler I stuck on it, so it's my standard of "very hot CPU")

One thing they have done very well on is thermal throttling. Four or five years back Intel started putting a thermistor on-die so the CPU can monitor its own temperature and throttle itself down rather than breaking completely if it gets too hot. (very important on P4, given its ridiculous power dissipation) They had the bright idea that if you tell the CPU how hot it can get, it can automagically overclock one or more cores if it has the thermal headroom.

It takes a lot of the pain, and the fun, out of overclocking one's CPU, but is much safer and easier since it's all done in the CPU itself.

It still shocks me that parts like the Cortex A8 can run at 1-1.2GHz fairly reliably with the smaller process sizes. Heck, it shocks me that a Core2 can run at 3GHz. Those things are significantly more complex than the stripped down P4.

Intel may have had the right idea, just too early. The problem is that after the fiasco that was the P4, and now that people are somewhat more aware of the impact computers have on their electric bill, they wouldn't be likely to jump at an 8 or 10GHz CPU that draws 2-3 times the power of an iwhatever even if it was faster for a lot of workloads.
posted by wierdo at 2:37 PM on May 21, 2010




This thread has been archived and is closed to new comments