The glorious history and inevitable decline of one of technology’s great laws
April 14, 2015 1:17 PM

IEEE Spectrum has published a "Special Report: 50 Years of Moore's Law," with a selection of a dozen short articles looking back at Moore's original formulation of the law, how it has developed over time, and prospects for the law continuing. Here are some highlights.
posted by infini (33 comments total) 20 users marked this as a favorite
 
See also eroom's law for pharmaceutical science.....
lol/weeps
posted by lalochezia at 1:34 PM on April 14, 2015 [4 favorites]


I just came from a talk by Ray Kurzweil today about the decline of Moore's law by the end of the decade. Crazy to think about
posted by arousingappetites at 1:44 PM on April 14, 2015


People don't realise quite how anomalous the last 50 years are going to seem, or how strange it will be when Moore's Law runs... sorry, now that it has run out of steam.
posted by Devonian at 1:49 PM on April 14, 2015 [1 favorite]


OMG statistic: "More transistors were made in 2014 than in all the years prior to 2011."
posted by storybored at 1:51 PM on April 14, 2015 [6 favorites]
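
That statistic is less surprising than it sounds once you run the numbers. Here's a minimal back-of-the-envelope sketch, assuming transistor output itself doubles every two years (roughly what the quoted statistic implies; the growth factor is an assumption, not a figure from the article):

```python
# If production doubles every two years, the annual growth factor is
# r = sqrt(2). Normalize 2010's output to 1 unit and compare 2014's
# output against the cumulative total of every year through 2010.
r = 2 ** 0.5                           # annual growth factor

production_2014 = r ** 4               # four more years of growth: 4.0
# Cumulative output through 2010 is a geometric series:
# 1 + 1/r + 1/r**2 + ... = r / (r - 1) ~ 3.41
cumulative_through_2010 = r / (r - 1)

print(production_2014, cumulative_through_2010)
# 4.0 vs ~3.41: a single recent year really can out-produce all
# prior years combined, purely as a property of exponential growth.
```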


I enjoyed Andrew "bunnie" Huang's new twist (in an otherwise pretty reasonable and interesting piece about the conjectured rise of "open" hardware) on the old delusion that technology will fix capitalism for us:
Another welcome change I see coming is a rise in repair culture as technology becomes less disposable and more permanent. Replacing worn-out computer parts five years from their purchase date won’t seem so silly when the replacement part has virtually the same specifications and price as the old one. This change in the keep-or-throw-away calculus will create a demand for schematics and spare parts […] Personally, I’m looking forward to the changes—including the return of artisan engineering, where elegance, optimization, and balance are valued over raw speed and feature creep.
"Artisan engineering" sounds kind of silly on one level (and obviously is silly, as an actual social forecast) but it's still pretty perfect in a symptomatic way, a neat miniature fantasy-encapsulation of the whole hacker-romantic value system. Eventually, somehow, he seems to think, if we just wish for it hard enough, technological change will bring about a change in the social relationships between people…
posted by RogerB at 2:01 PM on April 14, 2015 [11 favorites]


> Another welcome change I see coming is a rise in repair culture as technology becomes less disposable and more permanent.

Yes, my 15 year old Honda Civic is perfectly serviceable and runs fine with only routine maintenance. But it doesn't fly, or get a thousand miles to the gallon, or navigate on its own, or even talk to nearby cars for collision avoidance.

Given that trade-off, I'd love to live on this rapid growth curve as long as possible, where each year's iPhone is superior enough to the previous generations to make it worth upgrading. (Computers are already not worth upgrading for 4 or 5 years, certainly.)
posted by RedOrGreen at 2:08 PM on April 14, 2015 [4 favorites]


Moore's Law:

- not a law
- not Muslim
- not from North Africa
posted by GuyZero at 2:10 PM on April 14, 2015 [4 favorites]


Can someone explain something to me?

FTA: In turn, the driving force behind information and communications technology has been Moore's law, which can be understood as the proposition that the number of components packed on to a computer chip would double every two years, implying a sharp fall in the capabilities of information technology.

Why would doubling the power of computer chips reduce the capabilities of information technology?
posted by Ratio at 2:10 PM on April 14, 2015


now that it has run out of steam

You haven't been watching the SSD market. A trillion transistors for under $400 on newegg right now.

Heck, we still have a whole 'nother dimension we haven't been using.
posted by ryanrs at 2:18 PM on April 14, 2015 [5 favorites]


Isn't Moore's Law slowing down for a certain type of transistor design? I'm kind of thinking of the time before the transistor, and whether characteristics of vacuum tubes, like heat dissipation and lifespan, might have run into a theoretical wall that could have kept computing from progressing much further, had the modern transistor not been invented. Who can really say that material science won't find a way to break through current limits, allowing Moore's Law to progress more or less unimpaired?
posted by a lungful of dragon at 2:21 PM on April 14, 2015 [2 favorites]


Whoops did my math wrong for multilevel flash. One trillion transistors is more like $150.
posted by ryanrs at 2:35 PM on April 14, 2015 [3 favorites]
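
The arithmetic behind that correction is easy to reconstruct. A minimal sketch, assuming an illustrative ~500 GB triple-level-cell (3-bits-per-cell) SSD at about $150; the capacity and price here are 2015-ish guesses, not figures from the comment:

```python
# Each flash cell is a floating-gate transistor; TLC stores 3 bits
# per cell, so a drive's transistor count is roughly bits / 3.
drive_bytes = 500e9                    # assumed ~500 GB drive
bits = drive_bytes * 8                 # 4.0e12 bits of storage
cells = bits / 3                       # ~1.33e12 cells (transistors)

price_usd = 150.0                      # assumed street price
print(f"{cells:.2e} transistors, {cells / price_usd:.1e} per dollar")
```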


Can someone explain something to me?

FTA: In turn, the driving force behind information and communications technology has been Moore's law, which can be understood as the proposition that the number of components packed on to a computer chip would double every two years, implying a sharp fall in the capabilities of information technology.


Maybe they just fixed a typo, but when I looked it read: "implying a sharp fall in the costs and rise in the capabilities of information technology."
posted by yoink at 2:56 PM on April 14, 2015 [1 favorite]


Who can really say that material science won't find a way to break through current limits, allowing Moore's Law to progress more or less unimpaired?

I'm with you. Spectrum reports pretty regularly on folks working on new materials (graphene being the big one) to potentially replace silicon for the next generations of processors. Merely forecasting the end of Moore's law without taking into account the considerable effort being expended to keep this crazy tech progress train going is a little silly, IMHO.
posted by wemayfreeze at 2:58 PM on April 14, 2015 [3 favorites]


I thought it was like "I've finished my chopped cabbage salad, may I have some More's Law?"
posted by oneswellfoop at 3:04 PM on April 14, 2015 [11 favorites]


Who can really say that material science won't find a way to break through current limits, allowing Moore's Law to progress more or less unimpaired?

Physicists study this stuff pretty well. The major issue is the inability to use any sort of lithographic technique to produce components smaller than 14nm. At some point you get to layers that are a single atom thick and then you encounter quantum tunneling effects with electrons and stuff like that.

We may yet move to 7 nm but most people who are actual material scientists think that 5 nm is the end of the road. The most likely thing to keep Moore going past that is real 3D circuitry where there are multiple layers of devices produced on a single chip as opposed to just stacking multi-chip modules.

Even if we get to 5 nm, current projections are that that's only maybe 15-ish years away.

But the basic configuration of the universe is pretty fixed and the issues with sub-5nm have to do with the behaviour of electrons and not materials as normal people know them. Sadly we are unlikely to find a good replacement for electrons anytime soon.
posted by GuyZero at 3:06 PM on April 14, 2015 [4 favorites]
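
To put those numbers in atomic terms, here's a quick sketch counting silicon unit cells across a feature. Note that marketing node names don't map exactly onto physical feature sizes, so treat the output as order-of-magnitude only:

```python
# Silicon's lattice constant (the edge of one crystal unit cell) is
# about 0.543 nm, so a "5 nm" feature is single-digit unit cells wide.
SI_LATTICE_NM = 0.543

for node_nm in (14, 7, 5):
    cells_across = node_nm / SI_LATTICE_NM
    print(f"{node_nm} nm ~ {cells_across:.1f} unit cells across")
# At ~9 unit cells, a one-atom variation is a meaningful fraction of
# the device, and electron tunneling through thin barriers takes over.
```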


Maybe they just fixed a typo

Yup, looks like it. Thanks.
posted by Ratio at 3:31 PM on April 14, 2015


What a crazy ride. This HOPE presentation "Indistinguishable From Magic" was pretty interesting. It goes into detail about some of the processes needed to manufacture modern chips with 22nm features, and some of the challenges scaling down further.

It's kind of amazing that the industry is able to make a $xx billion bet that they can further tame the laws of physics and stack up new types of transistors on a mass production line every few years.
posted by RobotVoodooPower at 5:21 PM on April 14, 2015 [2 favorites]


There are lots of problems - and they're already happening. You might notice that clock frequencies haven't changed, basically, for a decade whereas they went up a thousand-fold between 1981 and 2001. We've wrung about all the shrink magic we can out of current lithography and the EUV stuff (which is 11 billion dollars of research already and well overdue) isn't here. And that's before the full stop of not having anywhere smaller to go, because we've run out of atoms, though I doubt we'll get to that point. Even if we do, that's about eight years away.

Yes, you can happily stack but that won't help very much. It's good for storage, which has traditionally led on feature size anyway, but you don't get twice as many transistors for the same wafer throughput next year, and twice again eighteen months later.

As for Magic New Physics: there may indeed be something at some point that continues the sleigh ride. But we don't have that yet, and even when we do it won't have the 50 years of experience and production nous that we have in silicon lithography. Graphene? Carbon nanotubes? Exotic materials? Go and research them: none will be remotely as good as good old-fashioned silicon will be when the game ends in five (or ten; pick a number you like) years' time. And if they're not as good, where's the money going to come from to invest in getting them that good?

Moore's Law is as much about money as anything else. Each generation costs a lot to make happen, but it's been worth it because the performance/price increase means the sales will be there. When Moore's Law stops, so does that tap. It is very difficult to supplant an old, established technology, even if you're superior, if you're not cheaper.

In another thread, people pointed out that, basically, gaming rigs last a lot longer these days than they used to. Same effect: things have stopped getting better for less money, in significant ways. Storage and some CPU aspects will continue to get better for a while, but not indefinitely and not past the final shrink.

We are so conditioned to think that Moore's Law is God's Law, when it's actually an effect of a particular combination of circumstances, that we really can't believe it's going away. Hence the "I'm sure something will turn up...".

Really, it's just about over. Lots of fun to come, for sure, but not the same fun we've been having until now.
posted by Devonian at 5:39 PM on April 14, 2015 [5 favorites]


metafilter: because we've run out of atoms.
posted by el io at 7:46 PM on April 14, 2015 [4 favorites]


Really, it's just about over

Certainly at some point this will have to stop, but people have been saying "we're at that point now" since I've been aware of Moore's law, and that's been quite a while. This graph stops at 2011, but the newer entries in this table still seem to fit the curve pretty darn well. So until they actually stop putting more crap on a chip each year, I'm going to assume that they're going to, somehow, continue to put more crap on a chip each year.
posted by aubilenon at 7:57 PM on April 14, 2015 [4 favorites]
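
The curve is easy to check from two data points. A minimal sketch using the 1971 Intel 4004 and a ballpark count for a 2014-era server chip; both transistor counts are approximate and vary by source:

```python
from math import log2

t0, n0 = 1971, 2.3e3    # Intel 4004: ~2,300 transistors
t1, n1 = 2014, 2.6e9    # ballpark 2014-era Xeon: ~2.6 billion

doublings = log2(n1 / n0)             # ~20 doublings over 43 years
print((t1 - t0) / doublings)          # ~2.1 years per doubling,
                                      # still close to Moore's two
```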


In another thread, people pointed out that, basically, gaming rigs last a lot longer these days than they used to. Same effect: things have stopped getting better for less money, in significant ways.

I think there's a very different thing going on there. Gaming rigs last longer because games are built around the old hardware of whatever the current console generation is, so up-to-date PC hardware can turn on even more eye candy and run at higher-than-hd resolutions and still not be remotely taxed. It's not that gaming rigs haven't kept on getting better and cheaper, it's that games mostly have stopped keeping up with PC hardware.

Anyway, we know with complete and absolute certainty that it's possible to fit enough computational machinery to run an entire human mentality into one liter and 60-100 watts.
posted by ROU_Xenophobe at 8:15 PM on April 14, 2015 [5 favorites]


We have a lot of ground left to cover once the 5nm limit is reached.

Memristors and ternary computing and layered circuits come immediately to mind to increase computational density.

And once those avenues are exhausted, we can spend a few happy decades optimizing the hell out of our software. Then quantum hits the desktop, and we're off to the races again!
posted by Slap*Happy at 8:17 PM on April 14, 2015 [6 favorites]


It may be that materials science manages to overcome some of the limits of 2D photo-etched silicon wafer transistors, which are the dominant design right now. However, there are what still appear to be some fundamental limitations that we will run into at some point.

Not long before he disappeared, Jim Gray of Microsoft Research did a really interesting presentation talking about some of those fundamental limits and how they would drive processor design. Unfortunately the video seems to have gone offline (it was hosted by the "Research Channel", which is apparently defunct). The key phrase that I remember him saying was "smoking, hairy, golf balls".

That is to say, there are some fundamental limits which place a lower bound on the amount of work required for computation, such that there will always be power-dissipation challenges; the best way to dissipate heat is to maximize the difference in temperatures between an object and its surroundings, so it's advantageous to design processors that can tolerate high temperatures and let them run hot—hence "smoking".

Further, there are fundamental limits on the amount of information you can shove through a wire, meaning that more I/O requires more interconnects (in addition to power), and at some point it'll make sense to stop trying to put all the connectors on the bottom face of the CPU and start putting them on the top and sides of the package as well—hence "hairy".

His third point was the one I thought was the most interesting, and it referred to the physical shape of the processor: he suggested that as processor clock speeds increased, eventually you start to approach a limit where you can no longer transmit information from one side of the processor die to the other via electrical signals within a single clock cycle. IIRC, he made an interesting analogy comparing a single transistor to a planet, and just as our observable universe is bound by the speed of light and the age of the universe, the transistor's "observable universe" is bound by the electron propagation velocity and the time between clock cycles. To maintain coherency, transistors need to be in the same "universe", which nudges designers towards spherical shapes.

This was from back in the early 2000s, and I don't know whether or not Gray anticipated the shift towards parallelism and away from very high-speed monolithic architectures, or whether or not he'd change his predictions in light of them. It seems to me that parallelism and high-speed processing are complementary, and while pursuing parallelism might have sucked the air out of the room for a few years in terms of pushing clock rates and FLOPS, eventually that upward pressure is going to reestablish itself. If something else doesn't stop us first, I think eventually we'll end up at Gray's maximum as a steady state of sorts.
posted by Kadin2048 at 9:20 PM on April 14, 2015 [5 favorites]
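
Gray's third point is straightforward to quantify. A minimal sketch of per-cycle signal reach, assuming on-chip propagation at roughly half the vacuum speed of light (a common rule of thumb, not a figure from Gray's talk):

```python
# How far can a signal travel in one clock cycle? This is the
# "observable universe" bound on a synchronous chip.
C = 3.0e8                       # speed of light, m/s
signal_speed = 0.5 * C          # assumed on-chip propagation speed

for clock_ghz in (1, 3, 10, 30):
    reach_mm = signal_speed / (clock_ghz * 1e9) * 1e3
    print(f"{clock_ghz:>2} GHz: {reach_mm:6.1f} mm per cycle")
# At 3 GHz a signal can reach ~50 mm, comfortably spanning a ~20 mm
# die; at 30 GHz the reach is ~5 mm and the die no longer fits inside
# its own clock cycle, which is the pressure toward compact shapes.
```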


If and when chip design cycles start to slow dramatically, we'll actually have a nice continued round of price cheapening, as chips (and the things they go into) no longer need to pay for the extraordinarily fast depreciation which semiconductor IP and factories now suffer.
posted by MattD at 9:27 PM on April 14, 2015 [3 favorites]


I'm just kind of bemused by the idea that in my lifetime someone going to college may be given their parents' hand-me-down tablet, the way that teens now get their parents' hand-me-down cars to drive.
posted by happyroach at 11:53 PM on April 14, 2015 [2 favorites]


Does this mean we're going to have to get good at programming now?
posted by whuppy at 5:56 AM on April 15, 2015 [6 favorites]


Yeah, software is not even close to making the most of hardware.

A lot of focus in the tech industry overall for the last several years has been on mobile platforms -- lower power and cheaper rather than smaller and faster. Which is good, because the trend in high-performance graphics cards for PC was to run hotter and consume more power.
posted by Foosnark at 6:21 AM on April 15, 2015 [2 favorites]


So until they actually stop putting more crap on a chip each year, I'm going to assume that they're going to, somehow, continue to put more crap on a chip each year.

Hopefully you've read enough in this thread to help shape that assumption. They're reaching the theoretical limit of the physical size of the transistor (can't go smaller than an atom). So they put more crap on a chip each year by making the die larger, or improving the architecture (or software). But there's a hard limit that is very real.

as an electrical engineer, my interest in hearing about moore's law (and its subsequent chicken little panic) falls by half each time I hear about it. I *especially* hate it when keynote speakers start their address with moore's law.

yes people like Intel are pushing the state of the art but who is really designing and reliably manufacturing at that node right now? I would argue most companies are floating around 130nm to 45nm, maybe 20nm and will likely sit there for a loooong time. It will take a long time for the process to mature enough such that 7nm-5nm is feasible for serious mass production. And "boutique" places won't be able to afford the ticket price for that wafer foundry anyways, so they'll just sit with the larger nodes.

it will take several decades for other technologies to really become feasible (yes, that's you, quantum computing), and we'll spend some time oohing and aaahing at wearable electronics or nano-machines / nano-medicine, and finally start using magnetic fields for interesting things. And we're in the early days of commercialized space exploration. Plus all the meta-data that we'll gather on ourselves will shape social behavior. There's lots to keep us occupied until we can start increasing computing power again. The gravy train isn't over; it's just going to shift.

now, if I were in university right now, I would say a career in digital / analog IC design has a shelf life. 20-30 years maybe? Depends how long you want to work as an engineer I guess, and how soon they come up with alternatives.
posted by St. Peepsburg at 6:22 AM on April 15, 2015


Tha "hairy smoking golf-ball" idea has been around since... well, I first heard the phrase in 1980, when a Z80A was pretty cool 'cos it ran at 4 MHz. It's been extremely impressive how well it's been avoided, especially in terms of engineering lower power devices. Your Macbook Air is something like 10-16k Cray 1s in raw processing power, and runs off a battery. Which of those two aspects is most impressive? (To come down from that, consider what it spends its time doing.)

The light cone of information processing devices is a fascinating concept (it knocks the Singularity on its head, for a start) and it clearly sets a theoretical limit on how fast and how much information can be processed - although I've never seen a remotely rigorous discussion of this, I would very much like to.

The industry is going to change. If you look at commercial aviation as an interesting metaphor, then certain aspects of that have already been through this: the physics of burning jet fuel to push a hundred-odd people through the air have set the standard form factor for aircraft to something we arrived at around fifty years ago. Fifty years before that was the Wright Brothers. Stuff like Concorde didn't work, but when it was being designed, speeds were still increasing and of course we'd all be hypersonic by 2010.

We've gone from lots of large aircraft manufacturers to, basically, two, which are producing practically identical devices that work in practically identical ways.

But aviation as an industry continues to grow apace, in terms of passenger miles, despite there being nothing but (compared to 1940-1970) minor increments in design of late.

Moore's Law is dead. It will not continue in another form. Don't worry about it. (Unless you're Intel, in which case you've known this was going to come for twenty years and you'll have been desperately trying to find another equivalent revenue stream for all that time.)
posted by Devonian at 8:30 AM on April 15, 2015 [2 favorites]


also btw, fwiw, there's swanson's law, which i think has the potential to be just as profound, if not more so...
  • Silicon: After the chip, another revolution?
  • Just as Moore's law forecast an exponential increase in the number of transistors on a chip, Swanson's Law predicts an exponential decrease in the price of solar. What he forecast is that every time the number of solar cells in the world doubles, the cost of making one would fall 20%.

    And this law too has proved remarkably accurate. Prices have fallen like a stone since - from $100 per watt in the 70s, down to less than $1 per watt now.

    That exponential fall in price is why the solar industry is beginning to really take off now. To use the jargon, solar has reached "grid parity".

    What that means is that in sunny places with relatively high electricity costs - like Hawaii, California, Japan or Italy - the cost of supplying a watt of electricity from a solar cell to the electricity grid is now very similar to the cost of generating power from coal, gas or nuclear energy.

    And according to Dick Swanson there's room to cut prices still further. He thinks the price per watt could fall to 35 cents.

    That means solar could pretty soon deliver cheap, plentiful power without - it almost goes without saying - all the consequences of the pollution of other forms of electricity generation.
  • While Coal Loses Nearly 50,000 Jobs, Wind and Solar Add 79,000: "As coal plants are shuttered due to increasing regulation and competition from cheap natural gas and renewables, coal industry jobs are being shed, losing nearly 50,000. During the same period, wind and solar added about 79,000 jobs and natural gas tacked on more than 94,000 jobs."
  • Fossil Fuels Just Lost the Race Against Renewables: "The world is now adding more capacity for renewable power each year than coal, natural gas, and oil combined. And there's no going back. The shift occurred in 2013, when the world added 143 gigawatts of renewable electricity capacity, compared with 141 gigawatts in new plants that burn fossil fuels."
  • Clean Energy Revolution Is Ahead of Schedule: "Each of these trends -- cheaper batteries and cheaper solar electricity -- is good on its own, and on the margin will help to reduce our dependence on fossil fuels, with all the geopolitical drawbacks and climate harm they entail. But together, the two cost trends will add up to nothing less than a revolution in the way humankind interacts with the planet and powers civilization."
as jeremy rifkin writes towards an empathic civilization:[*]
The human race is in a twilight zone between a dying civilisation on life support and an emerging one trying to find its legs. Old identities are fracturing while new identities are too fragile to grasp. To understand our situation, we need to step back and ask: what constitutes a fundamental change in the nature of civilisation? The great turning points occur when new, more complex energy regimes converge with communications revolutions, fundamentally altering human consciousness in the process.
and returning power to the people:[*]
In the past, revolutions in energy have often come hand-in-hand with revolutions in communication: the First and Second Industrial revolutions of the 1800s and 1900s, for example, were matched by revolutions in steam-powered printing and the advent of radio and television. We already have our revolution in communication: the Internet, which broke away from old, top-down models and instead emphasizes collaboration and lateral, peer-to-peer power. Now, Rifkin says, we need an energy regime that matches it...
posted by kliuless at 10:21 AM on April 15, 2015 [5 favorites]
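
Swanson's law, like any learning curve, turns into simple arithmetic. A minimal sketch of how many production doublings the price declines quoted above imply:

```python
from math import log

LEARNING_RATE = 0.20        # cost falls ~20% per doubling of output

# Doublings implied by the fall from $100/W (1970s) to $1/W (2015):
to_one_dollar = log(1 / 100) / log(1 - LEARNING_RATE)
# Further doublings needed to hit Swanson's projected 35 cents/W:
to_35_cents = log(0.35 / 1) / log(1 - LEARNING_RATE)

print(round(to_one_dollar, 1))   # ~20.6 doublings so far
print(round(to_35_cents, 1))     # ~4.7 more to reach the projection
```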


Ever more from Moore: A microchip pioneer’s prediction has a bit more life left in it - "Fortunately, one of the corollaries of Moore’s law is that the energy efficiency of transistors follows the same exponential law, doubling around every two years."

After 50 Years, Moore's Law Has Only Started To Disrupt Everything We Do - "At an event Friday at the Computer History Museum in Mountain View, Calif... two semiconductor industry luminaries offered a fascinating look not only at the impact of Moore’s Law so far but how it’s poised to disrupt the institutions of society in an even bigger way in the decades to come. Onetime Intel senior executive Bill Davidow, now an adviser to his venture capital firm Mohr Davidow, and semiconductor pioneer Carver Mead, emeritus professor of engineering and applied science at Caltech, talked with David C. Brock, co-author of a book to be released May 5, Moore’s Law: The Life of Gordon Moore, Silicon Valley’s Quiet Revolutionary."
Davidow: I attended a lecture in 1960 by Richard Hamming at Stanford. He wanted to talk about order of magnitude changes in technology. Every time that happens, it has tremendous social implications. Like going from horses at 3-4 MPH to trains at 30 MPH—created industrial city. I’ve often wondered what Hamming would think about an eight order of magnitude advance in 50 years, which is what Moore’s Law is about.

There are huge social and economic transformations. It was very expensive and slow to move information more than 50 years ago. So we moved people closer to the information: Wal-Mart. Now there’s Amazon, moving information to the people where they are. We will rebuild all of physical infrastructure as a result of Moore’s Law.

Things tend to get driven to extremes. There was $600 trillion in over the counter derivatives by 2006, which helped cause the economic meltdown. Couldn’t happen without Moore’s Law. We’re going to see a lot more winner-take-all situations.

Q: How can societies and companies contend with this?

Davidow: They need to find ways to restructure themselves and move to more efficient infrastructures. All industry around the world is going to be restructured in this way.

Q: Even greater changes in future?

Davidow: Not sure. Change is happening so quickly… We’re going to end up with different physical, economic and social infrastructure. The challenge now is to figure out how to adapt our institutions to this new environment—labor force, GM, etc. For example, millennials are not buying cars, instead electronic devices.

[...]

Q: What’s the impact on the human spirit of Moore’s Law?

Davidow: Colors do not exist. All that we sense is energy levels and our mind creates colors. Music doesn’t exist. All we hear are vibrations and our mind creates music. It’s going to be fascinating to imagine the mind in the virtual environment.

Mead: There’s a lot of noise out there about how we’ve got much computing that we’re going to be able to build brains and they’ll be intelligent. In late Middle Ages, minds were clockwork. Then it became a telephone switching network because that was the fanciest technology. Now computers. People look at the cell and say it’s a switch. Then neurobiologists got sharper tools. It wasn’t the neurons, it was the synapse. Now inside the synapse, whole chemical variations, so many state variables. We still have a lot to learn.

Q: Gordon Moore said Moore’s Law was all about economics as about the extensibility of silicon technology. Moore’s Law has been a huge deflationary force in the world. Also a big boost in productivity. How do you see its place in macroeconomics and where we’re heading?

Davidow: I’ve thought a lot about deflation because one of my problems is economists measure it in 20th century terms.

Mead: Or 18th or 17th.

Davidow: All the rules we’ve lived by change significantly. We keep applying old metrics to new environments. They don’t reflect accurately what’s going on. We’re going to make a lot of bad economic decisions.

Mead: That’s an understatement. Most of our economic discussions don’t account for innovation. It isn’t one of the things that goes on in most models. That’s nuts. It needs to get front and center in all discussions of economics.

Q: What excites and concerns you looking ahead?

Davidow: What will we use to define our identities in the future? If the tools become so good that we really don’t have to work 40 hours a week, how are we going to define our personal identity.

Mead: This reminds me of discussions in the ‘50s about what are we going to do with our time (with all the labor-saving devices). The absolutely essential change that’s happened recently is we’ve gone from a broadcast mentality to a point-to-point mentality—but people are trying to turn it (the Internet) back into a broadcast mentality.

Davidow: Businesses are trying to preserve the algorithms of the past. They need to figure out what’s next.

Mead: People are finding new and innovative ways to use the platform. Look for the things that are the pinch points preventing us from going forward in certain areas.
posted by kliuless at 2:22 PM on April 20, 2015


Just so you know, I'm going to build a workstation around a 4k panel in portrait mode. It's going to look like an Easter Island idol mated with the entire '70s. A glossy black plane, and beneath, an LED-capped keyboard displaying every APL symbol ever. You have to escape the ascii! Or, like, hit shift twice.

There will be an APL-to-Shell RePL, because I am a cruel god.
posted by Slap*Happy at 8:48 PM on April 20, 2015

