2012 Nobel Prize in Physics
October 15, 2012 12:59 AM

The 2012 Nobel Prize in Physics has been awarded to Serge Haroche (France) and David Wineland (US) for discovering ways to measure and manipulate quantum particles, a discovery which many are suggesting may soon allow us to build computers with virtually limitless capabilities. The Nobel press release provides a layman-friendly PDF summary of the research and its potential applications, as well as a less layman-friendly PDF with additional scientific background information. The press release cites two older Scientific American articles for further reading, and the magazine has made these articles available to read free online for the next 30 days:
Monroe, C. R. and Wineland, D. J. (2008) Quantum Computing with Ions, Scientific American, August.

Yam, P. (1997) Bringing Schrödinger’s Cat to Life, Scientific American, June.

posted by dgaicun (51 comments total) 21 users marked this as a favorite
 
Serge Haroche - Collège de France and Ecole Normale Supérieure, Paris, France

The ENS is a public institution with a scientific, cultural and professional character, under the direct authority of the Minister responsible for Higher Education.

David J. Wineland - National Institute of Standards and Technology (NIST) and University of Colorado Boulder, CO, USA

Founded in 1901 and now part of the U.S. Department of Commerce, NIST is one of the nation's oldest physical science laboratories. Congress established the agency to remove a major handicap to U.S. industrial competitiveness at the time—a second-rate measurement infrastructure that lagged behind the capabilities of England, Germany, and other economic rivals.

To hell with all of this big government science. I really wish these guys would get off my back and permit me the freedom and liberty to figure out the secrets of quantum mechanics on my own.
posted by three blind mice at 1:13 AM on October 15, 2012 [9 favorites]


NIST is a national treasure.
posted by Pre-Taped Call In Show at 1:24 AM on October 15, 2012 [2 favorites]


...a discovery which many are suggesting may soon allow us to build computers with virtually limitless capabilities.

Yes, yes, the Maps App will get better over time, we know.
posted by Brandon Blatcher at 2:05 AM on October 15, 2012 [9 favorites]


One of the interesting things about quantum computers is that it's quite possible to think of reality itself as a computation engine. It's supremely elegant to directly use the bits of matter themselves as computational devices, rather than building up large agglomerations of atoms to create transistors. It's sort of commandeering the substrate of the universe for our own purposes. Very cool stuff.

The thought also occurs that it's sort of a shift backward -- the big thing about the late twentieth and early twenty-first centuries is the transition from the analog world to a digital one, and the rise of networks; moving to quantum computing is sort of a transition from digital back to analog again. It's kind of amusing, especially when you think of the enormous changes in society that are presently underway from the first transition. (stuff like the endless arguments over copyright; what does copyright even mean, when just about everyone in the whole world can make copies of things for free? A featurephone and a solar panel is enough to copy music, and that can work in the most remote of remote areas.)

So if we transition again to quantum computing, I wonder what implications it will have?

These computers will not be unlimited, though. At first, they won't be anywhere near as powerful as digital computers. Even when we work out the kinks, and get our qubits down to be very small and cheap devices, it's not unlimited computational power, it's just a lot of it.

Given the physical limits on how densely matter can store information, fully populating a 128-bit storage pool would require more energy than it takes to boil the oceans. And when I looked that up, to be sure I had it right, I found this article, 128-bit storage: are you high? It has this interesting cite:
In particular, it has been shown that 1 kilogram of matter confined to 1 liter of space can perform at most 10^51 operations per second on at most 10^31 bits of information [see Seth Lloyd, "Ultimate physical limits to computation." Nature 406, 1047-1054 (2000)].
So it's definitely not unlimited. Just powerful.
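If you want to check that arithmetic, here's a rough back-of-the-envelope sketch of the linked article's estimate (round constants, order of magnitude only; as I recall, the article takes a fully populated 128-bit pool to be 2^128 blocks, roughly 2^137 bytes, and gets its energy figure from Lloyd's density bound plus E = mc^2):

```python
# Back-of-the-envelope reconstruction of the "boil the oceans" estimate
# from the linked article. Round constants; order-of-magnitude only.

BITS_PER_KG = 1e31           # Lloyd's bound: bits storable per kg confined to 1 liter
C = 3.0e8                    # speed of light, m/s
OCEAN_MASS_KG = 1.4e21       # approximate mass of Earth's oceans
J_PER_KG_TO_BOIL = 2.4e6     # heat water from ~0 C to 100 C, plus latent heat of vaporization

bits_in_pool = 2**137 * 8    # fully populated 128-bit pool, taken as ~2^137 bytes

min_mass_kg = bits_in_pool / BITS_PER_KG          # least mass that can hold the bits
rest_energy_j = min_mass_kg * C**2                # E = m c^2: energy-equivalent of that mass
boil_oceans_j = OCEAN_MASS_KG * J_PER_KG_TO_BOIL  # energy needed to boil the oceans

print(f"minimum mass to hold the pool: {min_mass_kg:.1e} kg")
print(f"rest energy of that mass:      {rest_energy_j:.1e} J")
print(f"energy to boil the oceans:     {boil_oceans_j:.1e} J")
```

The rest energy of that minimum mass comes out a few times larger than the ocean-boiling figure, which is the article's point.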
posted by Malor at 2:23 AM on October 15, 2012 [3 favorites]


...a discovery which many are suggesting may soon allow us to build computers with virtually limitless capabilities.

Quantum computers are cool and I expect them to be very useful for quite a number of applications, but this statement is an extreme exaggeration -- even if you disregard the practical considerations that Malor mentioned on the one side, or fundamental problems like uncomputability on the other, and focus only on efficient computations.

As far as I know the current state of research, quantum computers are not expected to be able to solve PSPACE-complete problems efficiently (e.g., minimizing regular expressions will probably be so horribly slow on a quantum computer that you won't care about the speedup you might have in comparison to a classical computer).
posted by erdferkel at 2:36 AM on October 15, 2012




So I hear a quantum computer has just managed to factor the number 21. It's 3 and 7±1.

Try the veal!
posted by Pre-Taped Call In Show at 3:57 AM on October 15, 2012 [1 favorite]


I for one.
posted by DU at 4:10 AM on October 15, 2012 [1 favorite]


If you ask a quantum computer the old math problem about how long it takes to fly from New York, it'll tell you either how long it'll take to get there or the destination, but not both.
posted by JHarris at 4:15 AM on October 15, 2012 [2 favorites]


What about the old math problem of how long is a string?

However, I'm pretty sure the problem of whether I walk to school or carry my lunch will remain unsolved until the end of time.
posted by DU at 4:25 AM on October 15, 2012


It's worth noting that people are selling and using quantum computers today. Lockheed has purchased a 128-qubit quantum computer called the D-Wave One for 10 million dollars. They are using it as an application-specific processor for things involving discrete optimization. Discrete optimization, according to wiki, has to do with things like graphs. My takeaway is that either now or within the next 5 years someone will be directly using these systems to monitor terrorist groups using graph mining. We don't have flying cars but dammit we live in the future.
posted by Rubbstone at 4:35 AM on October 15, 2012 [1 favorite]


I suspect you're right, Rubbstone. However, I also suspect that if you follow the (private industry) money, the first widespread practical application will be weather modeling and forecasting.
posted by digitalprimate at 4:40 AM on October 15, 2012


So how many fps will I get?
posted by nathancaswell at 4:43 AM on October 15, 2012 [1 favorite]


If money is your object, the best use of quantum computers is cryptography. For white hats, making secrets; for black hats, breaking them.
posted by DU at 4:55 AM on October 15, 2012 [3 favorites]


Hogwash. They might've stood a chance getting this thing off the ground quietly, but now that it's been observed...
posted by hypersloth at 6:13 AM on October 15, 2012


One of the interesting things about quantum computers is that it's quite possible to think of reality itself as a computation engine.

Puts me in the mind of the science fiction novels Permutation City, by Greg Egan, and Moving Mars, by Greg Bear.
posted by aught at 6:21 AM on October 15, 2012


We don't have flying cars but dammit we live in the future.

So do none of you people deal with the idiots in regular street traffic? I don't freakin *want* these flying cars people keep complaining they don't have, particularly not with all the texters-n-drivers at the wheel, err, joystick.
posted by aught at 6:25 AM on October 15, 2012 [3 favorites]


To hell with all of this big government science...

Redistributing Nobel Prizes away from the job-creating private sector since, well, the beginning.
posted by fatllama at 6:27 AM on October 15, 2012


This essay by MIT professor Scott Aaronson in the New York Times provides a quite level-headed view of the potential and challenges of practical, scalable quantum computing. In particular, the notion that shows up in almost every newspaper article, that a quantum computer would just be an exponentially faster version of a traditional computer because it simply calculates all the answers simultaneously in parallel, is far too simplistic. Actually, you have to try to get the wrong answers to cancel out so that you get the right answer with a high enough probability. Depending on your particular problem, this can get quite tricky.
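To make "the wrong answers cancel out" concrete, here is the simplest textbook case, Deutsch's problem, sketched with plain numpy (my own toy illustration, not from the articles above): decide with a single oracle query whether a one-bit function f is constant or balanced.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

def deutsch(f):
    """Decide whether f: {0,1} -> {0,1} is constant or balanced, with one oracle call."""
    oracle = np.diag([(-1) ** f(0), (-1) ** f(1)])   # phase oracle: |x> -> (-1)^f(x) |x>
    state = np.array([1, 0])          # start in |0>
    state = H @ state                 # superposition of both inputs
    state = oracle @ state            # one "query" to f, acting on both inputs at once
    state = H @ state                 # interfere: the wrong answer cancels out
    prob_one = abs(state[1]) ** 2     # probability of measuring |1>
    return "balanced" if prob_one > 0.5 else "constant"

print(deutsch(lambda x: 0))      # constant
print(deutsch(lambda x: x))      # balanced
print(deutsch(lambda x: 1 - x))  # balanced
```

If f is constant, the two paths interfere so that all the amplitude ends up on |0>; if it's balanced, it all ends up on |1>. That's the cancellation doing the work, not trying every answer and reading them all out.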

At the minimum energy density required to store a bit of information, simply storing 2^128 bits would require enough energy to boil the oceans.

2^128 (~10^38) bits is an absolutely huge amount of information to store, nothing simple about that. It is always important to distinguish between the number of bits required to store a particular value and the number of possible values for a given amount of bits. The potential of quantum computers is that you could operate on all 2^128 different values that can be stored in 128 bits by manipulating the superposition of the 128 qubits.


It's worth noting that people are selling and using quantum computers today. Lockheed has purchased a 128-qubit quantum computer called the D-Wave One for 10 million dollars.

It's not that clear that the D-Wave actually is a quantum computer that successfully exploits entanglement in practice... Quoting Scott Aaronson:
The quantum algorithm on which D-Wave's business model is based - namely, the quantum adiabatic algorithm - has the property that it "degrades gracefully" to classical simulated annealing when the decoherence rate goes up. This, fundamentally, is the thing that makes it difficult to know what role, if any, quantum coherence is playing in the performance of their device. If they were trying to use Shor’s algorithm to factor numbers, the situation would be much more clear-cut: a decoherent version of Shor's algorithm just gives you random garbage. But a decoherent version of the adiabatic algorithm still gives you a pretty good (but now essentially "classical") algorithm, and that’s what makes it hard to understand what’s going on here.
As far as I know, no one has actually demonstrated a quantum computation even approaching what can today be solved using classical computers. There are probably still a lot of breakthroughs necessary in implementation and a few more Nobel Prizes before at least some of the potential of quantum computers is realized in practice.
posted by ltl at 6:36 AM on October 15, 2012 [4 favorites]


Okay, quantum computers aside for the moment, I'd love some help explaining what's going on here. Paging physicsmatt!

They were talking about this on the most recent episode of the SGU. So the idea usually in quantum mechanics is that quantum particles can exist in a couple different states simultaneously (superposition, etc.?) and that the act of observing them using science collapses them into one state, and these folks have figured out a way of observing the particles "out of the corner of the eye," so to speak, so that the collapse doesn't happen.

This seems HUGE, right?!?
posted by lazaruslong at 6:38 AM on October 15, 2012


This seems HUGE, right?!?

It's not as earth shattering as you might hope. Yes, we've learned that a measurement can be continuously varied from a "little peek" (called a "weak measurement") to a fully wavefunction-collapsing observation, but the amount of usable information one gains with such a coy scheme decreases while the likelihood of decoherence creeps up steadily. In other words, there's still an uncertainty principle at work, and squeezing more precision out of one variable leads to higher uncertainty in another.
posted by fatllama at 6:44 AM on October 15, 2012 [2 favorites]


you rang?

You'll have to narrow down exactly what you'd like to know more about. This stuff is way outside my usual wheelhouse, so I can't really put the whole field in too much context. But if you have specific questions about the quantum mechanics, I can probably help.
posted by physicsmatt at 6:48 AM on October 15, 2012 [1 favorite]


I guess I'm specifically curious about the innovations and methods used to avoid wavefunction-collapsing measurements, and what that means for quantum mechanics moving forward. For example, in the two-slit experiment, does this mean that we could now watch the particles acting as both waves and particles at the same time?

fatllama kinda hints at what I'm wondering about, the efficacy of this new..."weak measurement" and what that means for particle physics. What is the "weak measurement"? Why doesn't it collapse the wavefunction? What changes between a weak measurement and a stronger one that makes it collapse?

If these questions are non-starters because I'm functionally ignorant of this area of physics, no worries =)
posted by lazaruslong at 6:51 AM on October 15, 2012


does this mean that we could now watch the particles acting as both waves and particles at the same time?

The short answer to this is no, if for no other reason than the "click" at a photo-multiplier tube (indicating "particle-like" behavior) is in no way a "weak" measurement. Instead, a practical example might be measuring the polarization state of a photon without directly detecting it with a photo-multiplier tube... To do this, one might need a pair of entangled photons and a scheme to measure photon B to learn about photon A without somehow affecting photon A. Such a scheme is a weak measurement and will involve post-selection of weak measurement values based on final "strong" measurements of both A and B.

There's no free lunch with this technique, just a way to interpret correlated data you obtain before making final collapsing observations.
posted by fatllama at 7:06 AM on October 15, 2012 [2 favorites]


But if you have specific questions about the quantum mechanics, I can probably help.

Yeah: When can I have it and do you take Discover?
posted by LordSludge at 7:12 AM on October 15, 2012 [1 favorite]


Adam Frank of the University of Rochester had an Op-Ed in the Times yesterday about the Nobel that contained this nugget:

A quantum machine using no more than 300 qubits would be a million, trillion, trillion, trillion times faster than the most modern supercomputer.

Can anyone put into context how hard it would be to make a quantum computer with 300 qubits?
posted by ultraviolet catastrophe at 7:21 AM on October 15, 2012


One of the more interesting things about quantum computation is how it doesn't let us efficiently handle all the problems in NP. It does give some special cases where you get a real speedup, with Shor's algorithm being probably the most popular example, although it's not known that factoring is actually hard (it's not known to be NP-hard, so it could be in P without implying P=NP). The more surprising stuff is the less well-publicised Deutsch-Jozsa algorithm and (its derivative) Grover's algorithm. Grover's algorithm really highlights the "if you think you understand quantum theory, etc." issue, letting you search an unordered list of N elements in O(sqrt(N)) time - think about this: given an unknown function on some set of 16 elements, you can find the element which produces a given output with only about 4 queries to the function. As far as I'm concerned that's some serious voodoo right there.
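If you want to see the 16-element claim numerically, here's a rough statevector sketch of Grover's algorithm in plain numpy (no quantum library, just the textbook formulation; the marked index is an arbitrary choice of mine):

```python
import numpy as np

N = 16            # size of the unstructured "list"
marked = 11       # index of the item we are searching for (arbitrary choice)

state = np.full(N, 1 / np.sqrt(N))        # uniform superposition over all indices

for k in range(1, 5):
    state[marked] *= -1                   # oracle query: flip the sign of the marked item
    state = 2 * state.mean() - state      # diffusion: reflect every amplitude about the mean
    print(f"after {k} queries, P(find marked item) = {state[marked]**2:.3f}")
```

The success probability here is already about 96% after floor((pi/4)*sqrt(N)) = 3 queries for N = 16, i.e. the sqrt(N) scaling described above.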

Anyhow, the quadratic speedup doesn't actually knock much off NP-hard problems, because the search space grows exponentially, so no fixed polynomial benefit can help you out much. For the purposes of cracking encryption, your quantum computer can help you with RSA (and the same period-finding machinery also reaches discrete logarithms and elliptic curves), but so far as we know problems without that kind of special structure are still fine. I suspect this is (as mentioned) because factoring may not actually be hard. This makes me doubt that the NSA have a big warehouse of quantum computers for cracking codes (unless they have an unexpected proof that BQP = NP).

What really niggles at me with all of this is how many apparently completely different angles of attack (in terms of physical techniques for computing) seem to be tweaked just right to prevent exponential computing power that appears to be below the surface from leaking out. For example, in an analog computer, you can get exponential power out by encoding exponential volumes of information into the least significant digits of your analog number, and perform many operations "in one", but then you have to put exponential amounts of work in to get enough precision in your read/write devices to get the error rate down. In a quantum computer, there's an exponentially large superposition sitting there doing lots of work for you in each operation, but the measurement operation chucks most of that work away, again in a manner that is just right to stop you getting too much computation out. These two approaches to building a computer couldn't be more different, but they are both nerfed from unexpected directions!
posted by larkery at 7:32 AM on October 15, 2012 [7 favorites]


Can anyone put into context how hard it would be to make a quantum computer with 300 qubits?

There's an app for that. If it's going to be with trapped ions, pretty hard. We can see in principle how to scale-up to these numbers, but it is a technical challenge. We're at the point where several groups can do interesting things with two ions, and one group can do interesting things with (up to) eight (have they done more now?), but these latter demonstrations are not at a tolerably low "noise level" for scaling yet. 300 qubits will eventually mean several thousand trapped ions, because one will necessarily need to employ error correcting schemes which use more than one ion to represent a logical qubit.

To do it, some materials issues as basic as dielectric dirt on metal will have to be solved, and the whole thing might end up being a micro-fabricated jewel with integrated vacuum hardware, metal electrodes, light collection, fiber coupling, and CMOS logic. It's probably a billion dollar project, but that isn't without some precedent, is it?
posted by fatllama at 7:36 AM on October 15, 2012 [2 favorites]


fatllama: " the whole thing might end up being a micro-fabricated jewel with integrated vacuum hardware, metal electrodes, light collection, fiber coupling, and CMOS logic."

I just want you to know how much you just sounded like The Doctor and how hot that makes me.
posted by lazaruslong at 7:39 AM on October 15, 2012 [1 favorite]


What's a qubit?
{with apologies to Bill Cosby}
posted by mfu at 7:48 AM on October 15, 2012


Q*Bert's less dramatic brother.
posted by fatllama at 8:02 AM on October 15, 2012 [1 favorite]


From the YouTube channel Minute Physics, a simple illustrated explanation of the work that won the Nobel. But I have neither a Physics degree, nor am I the esteemed physicsmatt, so I can't exactly vouch for the 100% accuracy of this. Although Minute Physics has helped me in the past, and this one concurs with what I *do* know.
posted by undue influence at 8:10 AM on October 15, 2012


OK, so you're never going to see something as both a particle and a wave; assuming what you mean by a particle is "a wavefunction that collapsed" and a wave is "a wavefunction that didn't collapse." This is a reasonable definition, since what people refer to as the particle-ness tends to be things that require an object to be point-like (i.e. "I found this electron here"), and the wave-ness is when a particle "knows" about things far away (i.e. "there are two slits through which to pass"). So it's sort of tautological. Completely collapsing down to a point is what you might call a complete or strong measurement.

Partial measurements are not new in principle (though I don't want to undersell what people are doing experimentally). You can even think of the classic double-slit experiment as an example of a weak measurement. With the double-slit, the way it's usually presented is "wave goes through A or B, and interferes with itself." But if you cut 3 slits in the barrier, you'd get waves through A, B, and C interfering, or through four slits, five slits, etc...

Richard Feynman then asked "what if there was no barrier in the way at all?" Well, then, you can think of the particle moving in a straight line, but if you think of this as a particle passing through an infinite number of slits (so many that there is, in fact, no wall there at all), then the wave goes through slit A, B, C, D....AA, BB, CC, .... you get the idea. Afterwards, the wave interferes with itself, and we see that interference as "a particle moving in a straight line," because all the waviness cancels out.

The point is that, by putting a barrier in the way, we've actually reduced the possibilities for the wave to travel from infinite to two, and in doing so, massaged the wavefunction into something very different from the initial configuration. And you can SEE that new configuration in the bizarre (from a particle view) interference pattern after the particles pass through the barrier. But if I check to see which slit each particle went through, that pattern disappears; I've made "too strong" of a measurement, and the quantum effect I cared about isn't manifest anymore. The trick for quantum computing is to massage the wavefunction of the various physical systems that make up the qubits into the right configurations, allowing them to talk to each other sufficiently without collapsing the system. Clearly, a bit more complex than the double slit experiment, but then again, it's hard to do computing with that.

As for how they do that practically? Well, I'm a theorist, so as far as I can tell, you take a bunch of grad students, bury them in a basement for 6 years and give them coffee. Science will result. Occasionally Nobel Prizes as well.
posted by physicsmatt at 8:22 AM on October 15, 2012 [13 favorites]


"Can anyone put into context how hard it would be to make a quantum computer with 300 qubits?"

A lot of it is having to simultaneously achieve three goals that pull against one another: speed, robustness, and quietness.

In a quantum computer, it is the simultaneous collective state of all the qubits that determines the answer. These states need to be written and read from something that can work quantum mechanically. Typically, we're talking individual atoms or small collections of atoms that are condensed into Bose-Einstein Condensates, or ions trapped in electrostatic potential wells.

We need to be able to write the states of each of the 300 qubits in an unambiguous fashion and then elucidate their collective state with a certain level of precision- without affecting the state of the answer by our observation. Entanglement- the property that allows qubits to work in lock-step with each other- must also be established with high fidelity so that expectations of the outcome have high confidence values.

One must also be quick to set up the qubits lest they decohere from vibration (thermal noise). The longer the system is exposed- that is, the longer the system has to run in order to settle into a final state- the more susceptible it is to decoherence. The stability of a quantum computer goes down as the number of qubits goes up, so you need to take exponentially more time and care (= money) to keep things quiet, fast, and robust. To my knowledge, it is this exponential growth of noise and uncertainty that form the biggest practical hurdles to large entangled systems.

Or on preview, what several others have alluded to.
posted by Phyllis Harmonic at 8:41 AM on October 15, 2012 [1 favorite]


Regarding quantum computers and the difficulty of making one with 300 qubits, and also to emphasise the apparent contrast between D-wave and the rest of the world... look what just turned up in my inbox:

E. Lucero et al.
Computing prime factors with a Josephson phase qubit quantum processor
Nature Physics 8, 719 (2012)

(...) here we demonstrate a nine-quantum-element solid-state quantum processor and show three experiments to highlight its capabilities.
(...) In the final experiment, we run a three-qubit compiled version of Shor’s algorithm to factor the number 15, and successfully find the prime factors 48% of the time.


I swear I didn't see this 'til after making that wisecrack earlier.
posted by Pre-Taped Call In Show at 8:52 AM on October 15, 2012 [1 favorite]


One can draw an analogy with Moore's law for transistors to look at the trend in the number of qubits that can be manipulated. When one does so, one finds a doubling in the number of qubits every six years; the current record stands at 14.

According to that website, you need about 50 qubits to do better than classical simulations, and by 2048 your 2048-bit RSA key will not be secure if the trend continues.
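Just to make the extrapolation explicit (toy numbers only, taking the 14-qubit record and the doubling-every-six-years trend above at face value):

```python
# Toy extrapolation: assumes 14 qubits in 2012 and a doubling every six years,
# per the trend described above. Nothing more sophisticated than that.
for year in range(2012, 2061, 6):
    qubits = 14 * 2 ** ((year - 2012) / 6)
    print(f"{year}: ~{qubits:.0f} qubits")
```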
posted by euphorb at 8:53 AM on October 15, 2012 [1 favorite]


A qubit is a quantum bit, which is to say it is a state that can have two possible values: 1 or 0, just like a regular bit. The quantum part is that, as this is a quantum state, the qubit can be a linear combination of the two values. You'll see this written as 1/sqrt(2)(|0>+|1>) or the like, which looks very strange I imagine. Let's use a specific example to see what's going on.

Electrons can have two spin orientations in a magnetic field: spin-up (aligned with the field) or spin-down (anti-aligned). You could name spin up "1" and spin down "0", and you'd be able to do binary operations, which is good, because that's how we do math with computers, but let me just stick with up and down.

If I measure the spin of an electron, I'll get one of the two possible results: up or down. However, if I just have an electron in a magnetic field, and don't check in some way which way it's spinning, it will be in a superposition of the two states, meaning that there is some probability of finding it in |up> or |down> when I do get around to measuring it. One possible superposition of "equal" spin would be 1/sqrt(2)(|up> + |down>). Collapsing the wavefunction would be just finding it in |up> or |down>.

The |...> notation is a called a "ket" to indicate we're talking about a possible wavefunction value. There are also <...| called "bras", and then you'll see the combination written like <...|...> which are called, of course, brackets. Aren't we physicists clever?

Another "equal" combination of up and down would be 1/sqrt(2)(|up>-|down>), which is physically different even though it looks like it's "50%" up and "50%" down. (-|up>+|down> and -|up>-|down> are also distinct). For example, the |up>+|down> has zero overlap with the |up>-|down> state (meaning if I start the system in the first, I'll never find it in the second). This is one of the really weird pieces of quantum mechanics: things can have a "phase," which doesn't have a really easy classical (non-quantum) analog. More generally, the two values |up> and |down> can have arbitrary phases (written as e^(i a), if you know your complex numbers). The point is, when you have non-collapsed wavefunctions, you can superimpose all possible measurements, and those superpositions can have unexpected ways of interacting.

So far, this is just a description of quantum systems. Where's the computing? Well, normal computing works by taking a whole bunch of bits (0 or 1, or up and down in my notation), putting them in some order, and then building a physical system that implements certain rules. For example, you might have a rule saying "if this bit is in the down state, and the bit next to it is in the up state, and you put in this electrical voltage, switch both bits to up. But if this bit is down, and that bit is down, switch the first bit to up and the second stays down." And, if you're clever, you make this into a physical representation of "addition." With qubits, you can now get really clever: you can say "switch the |up> to |down> in this qubit when this stimulus comes in, but correlated with the |up> part of the 2nd qubit's state" and so on. So you can have the system run through all sorts of possible mathematical operations all at once. If I'm clever, then I can get the system to read out the answer I'm interested in at the end by making some measurement (though I gather this tends to be probabilistic, as in sometimes you'll get the wrong answer. However, for many questions quantum computing cares about, checking whether a proposed answer is right is trivial but finding the right answer is VERY hard, so this isn't a huge downside).
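In matrix form, a conditional rule like that is just a permutation acting on the joint two-qubit state. Here's a minimal sketch (using up = 0 and down = 1, nothing hardware-specific) showing how a rule of the "flip qubit 2 only where qubit 1 is down" sort turns a superposition into an entangled pair:

```python
import numpy as np

# Basis order for the joint state: |up,up>, |up,down>, |down,up>, |down,down>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard on a single qubit
I = np.eye(2)

# "Flip qubit 2, but only on the part of the state where qubit 1 is down"
# (a CNOT gate, in the standard jargon): it swaps the last two basis states.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.kron(np.array([1, 0]), np.array([1, 0]))   # start in |up,up>
state = np.kron(H, I) @ state    # put qubit 1 into (|up> + |down>)/sqrt(2)
state = CNOT @ state             # conditionally flip qubit 2

print(state)   # ~[0.707, 0, 0, 0.707]: (|up,up> + |down,down>)/sqrt(2), an entangled pair
```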

But you see the complications: I need a whole bunch of qubits that can talk to each other (entangled, that is), because if the |up> of one qubit can't see the state of anything else, that's not useful. I also need ways to set up the system, and I need ways to read out the system. All of these things start affecting the wavefunction of the physical system making up the qubit, and that can be a measurement. And if I measure a qubit, I get |up> or |down>, and then I have a (not particularly efficient) classical computer, which again, not useful. Then, as Phyllis Harmonic said, there are noise effects and a litany of other problems that can pull my physical system out of the delicate superposition that it needs to be in to work.

As a result, the theory of how to do quantum computing (as in, give me some qubits, and I can break all these codes) is way ahead of the experimental issues. This is often true in theoretical physics, and speaking as a theorist, is generally not a great place to be for a long time, as the theorists need someone around to tell them that they're wrong.

Also, as an aside, the word qubit has a special place in my heart, as it was invented by one of my excellent undergrad professors.
posted by physicsmatt at 9:36 AM on October 15, 2012 [5 favorites]


ultraviolet catastrophe: "Can anyone put into context how hard it would be to make a quantum computer with 300 qubits?"

This is a decent question of reckoning. I'm not involved in quantum computing in any way, but I think that the technology is approaching a point analogous to punch cards: people are still having to rig things onesie-twosie and the first steps towards standardization are happening.

So, 300 qubits. It's my understanding that you can take a quantum computer of n qubits and simulate it on a classical computer that has 2^n bits of storage. This type of estimate doesn't take into consideration the computational cost of whatever your states are (e.g. how fast you get an answer), but it represents how complex the answers can be, and what amount of work a quantum computer can do.

So a 2-qubit QC would be 2^2, or 4 bits, and 8 qubits would be 2^8, or 256 bits. Small stuff so far.

Let's work backwards for a more relevant analogy. Let's say I have a terabyte hard drive (or a terabyte of memory). What kind of quantum computer could I model? A terabyte is 2^43 bits, so a QC with 43 qubits. That's a tiny fraction of a 300-qubit QC.

A large company's data center might have an exabyte of storage (2^63 bits). This requires large infrastructure: power backups, dedicated cooling, maintenance personnel and equipment. We're still not halfway there. Even if you get to the most common "outlandish" and oversized storage/memory unit, the yottabyte (2^83 bits), you're still not even close to modelling your QC supercomputer.

So I can sort of see how people are willing to say how these discoveries can point us towards "virtually limitless capabilities" in computing. Trying to put things in context, I asked Wolfram Alpha about what 2^300 corresponds to, and it suggests that this number far, far exceeds all the atoms in the observable universe.
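Putting rough numbers on that (sticking with the 2^n-bits rule of thumb above; a full amplitude-level simulation actually needs 2^n complex numbers rather than 2^n bits, which only makes it worse):

```python
# How much classical storage the 2^n rule of thumb implies for n qubits.
ATOMS_IN_OBSERVABLE_UNIVERSE = 1e80   # commonly quoted order of magnitude

for n in (8, 43, 63, 83, 300):
    bits = 2 ** n
    print(f"{n:3d} qubits -> 2^{n} = {float(bits):.2e} bits")

print(f"2^300 / atoms in observable universe ~ {float(2**300) / ATOMS_IN_OBSERVABLE_UNIVERSE:.1e}")
```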
posted by boo_radley at 9:45 AM on October 15, 2012


a really good book on this subject (from around 2000) is "Minds, Machines, and the Multiverse: The Quest for the Quantum Computer" by Julian Brown...
It's been a while since I've read it, but one point really stuck with me and that was the bit (qubit? haha) about "reversible computations" and (I'm skimming it now) the work of Rolf Landauer (and Charles Bennett) from IBM from the early 60s (to mid 80s), and particularly the answer he came up with for the question "What is the absolute minimum energy required for computation?"
Think of it this way: you're doing all your maths on an abacus. Once you get to the end of your calculations (in your ideal, frictionless scenario), you tip the abacus so all the beads slide back to their starting position, and you re-capture all the energy you put into moving the beads in the first place. The energy price of computation is this: ZERO.
This is what they are referring to when they talk about 'limitless computing power'... there are caveats, to be sure... Irreversible computing processes (those where information is discarded or erased... e.g. moving the beads on your abacus back to the start 'by hand' as part of your calculation, or any time you discard unnecessary intermediate results) use energy (and make your current chips get hot enough to fry eggs on), as does reading off the final results. However, it is possible to 'undo' your 'irreversible' intermediate bits... basically, by doing the same calculation in reverse, after reading off the results, returning your system to its original state. This is what happens (and i am waving my hands so hard right now) somehow (with science of some sort and ...programming, i'm going to assume) inside of your quantums so that you only need to expend energy to read off your results, whether you're working with 128 bits of information, or a million... doesn't really matter.
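here's a little toy version of that "put the beads back" trick, with classical reversible gates (the function names are just mine, and this is only a sketch of the uncompute idea, not how any real chip does it):

```python
def cnot(control, target):
    """Reversible XOR: applying it twice restores the original bits."""
    return control, target ^ control

def toffoli(a, b, target):
    """Reversible AND: flips the target only when both controls are 1."""
    return a, b, target ^ (a & b)

# Compute AND(a, b) into a zeroed scratch bit, copy the answer out,
# then run the same gate again ("uncompute") to reset the scratch bit.
a, b, scratch = 1, 1, 0
a, b, scratch = toffoli(a, b, scratch)   # scratch now holds AND(a, b)
_, result = cnot(scratch, 0)             # copy the answer to an output bit
a, b, scratch = toffoli(a, b, scratch)   # uncompute: scratch is back to 0

print(result, scratch)   # 1 0 -- answer kept, scratch restored to its starting state
```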
posted by sexyrobot at 10:11 AM on October 15, 2012


People who are looking for a super dumbed-down summary of quantum mechanics (basically my own level of understanding) should watch this recent hour-long NOVA special by physicist Brian Greene. It even has a segment on quantum computing.
posted by dgaicun at 12:55 PM on October 15, 2012


RE: "super dumbed down" Even trying to comprehend the short summary header of the quantum mechanics article on Wikipedia I feel a lot like Homer Simpson in his ill-fated attempt to understand marketing.
posted by dgaicun at 1:04 PM on October 15, 2012


The only real question is how many qubits are needed for replicators and teleporters.
posted by Shit Parade at 1:38 PM on October 15, 2012


DOE Office of Science budget: $4.9 billion
NSF budget: $7.3 billion
NASA budget: $18.1 billion
NIH budget: $30.9 billion
NIST budget: $0.75 billion

Defense research budget: $71.2 billion
posted by miyabo at 4:07 PM on October 15, 2012 [2 favorites]


Just to add, the Minute Physics video is an explanation of Haroche's work.
posted by undue influence at 5:12 PM on October 15, 2012


So there's almost certainly a limit to how many qubits the universe can support. There is a limit to how fast you can run a CPU with classical bits on silicon -- two gigahertz, give or take.

Imagine it being the late '50s, and trying to guess then where the classical limit was. That's kind of where we are with qubits, except that unlike with classical systems, low-qubit math is effectively worthless. Will we get enough qubits to make a difference for any imaginable problem?

Personally, my money is on SAT/SMT solver breakthroughs making classical systems even better at the class of problems that qubits might be nice at. But my money's *always* on SAT/SMT. It's magic.
posted by effugas at 6:30 PM on October 15, 2012


effugas, is the 2 GHz clock speed limit really a fundamental parameter? I thought it was the product of the speed of light, a (mostly) two-dimensional geometry, how many transistors are required for a general-purpose machine, and the size of those transistors. One of those is fundamental, two are design decisions, and one is an engineering limit having to do with the chemistry of etched silicon. But if there were an application that really required a 100 GHz processor, would that really be physically impossible?
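For a sense of scale on the speed-of-light piece, the arithmetic is simple:

```python
C = 3.0e8  # speed of light in vacuum, m/s

for ghz in (1, 2, 5, 10, 100):
    cycle_s = 1 / (ghz * 1e9)          # duration of one clock cycle
    print(f"{ghz:3d} GHz: light travels {C * cycle_s * 100:.2f} cm per clock cycle")
```

So even at the vacuum speed of light, a 100 GHz part gives a signal only about 3 mm of travel per cycle, and on-chip signals move slower than that.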
posted by fantabulous timewaster at 8:05 PM on October 15, 2012


One more point on the board for practical applications of philosophy.
posted by cthuljew at 10:16 PM on October 15, 2012


fanta--

There are certainly 10 GHz transistors out there, but the constant increase in clock speed that defined chip design for decades is over. We're at a wall defined by the speed of light, quantum tunneling of electrons, and heat transfer. Graphene may move this wall, but we have lots of applications that would love a 100 GHz processor. Instead they're getting a hundred 1 GHz processors on a die.
posted by effugas at 11:11 PM on October 15, 2012


So there's almost certainly a limit to how many qubits the universe can support. There is a limit to how fast you can run a CPU with classical bits on silicon -- two gigahertz, give or take.

Intel could ship some 5GHz processors today. Not very many, but their existing parts would bin out that high for at least some pieces. Still in the same order of magnitude, and there don't appear to be any huge further increases happening in the near future, but it's a fair bit faster than 2GHz. I believe they don't sell chips this fast because it's not in their long-term interest to do so. Their main competitor is now themselves, and thus it's to their largest benefit to move as slowly as they possibly can, while still selling all their chips. By moving at minimum speed, they make themselves a weaker competitor for the Intel of tomorrow. This also frees up engineering and fiscal resources to compete against ARM in the mobile space.

But, even with 5 GHz processors, our computers wouldn't speed up as much as we'd like. The real problem is memory, which hit a hard wall at around 200 MHz, and all the DDR/DDR2/DDR3 stuff has been about putting more and more chips on a die. This increases total bandwidth, by reading from more chips, but hardly changes latency at all. The transistors making up memory chips have shrunk a ton, but they've barely sped up in the last twelve years. If we could get back to memory with single-cycle latency, like we had back in the early days of computing, we'd see an enormous increase in the speed of computations.

This is part of why Intel chips tend to be faster than AMD chips; they have more cache. The architecture is also better, but their denser process node lets them put more cache on the dies, which helps to hide the dreadful latency problems in modern memory.
posted by Malor at 2:35 AM on October 16, 2012 [2 favorites]


IBM does ship 5 GHz chips today -- and they're several process revisions behind Intel. The chips are super expensive but apparently some people think they're worth it. But yeah, increasing clock speed much beyond that is a dead end.
posted by miyabo at 10:39 AM on October 16, 2012 [1 favorite]





