128 pair-wise coupled superconducting flux qubits
December 28, 2012 1:04 AM   Subscribe

The D-Wave One™ is the world's first commercially available quantum computer. "Our superconducting 128-qubit processor chip is housed inside a cryogenics system within a 10 square meter shielded room." (images) In other words, it's a programmable superconducting integrated circuit with up to 128 pair-wise coupled superconducting flux qubits (video). The first D-Wave was sold in 2011 for a rumored $10 million. At first there was a lot of skepticism about it, but an August Nature study proved it worked by successfully solving "13 times out of 10,000 for four-amino-acid and six-amino-acid sequences under the Miyazawa-Jernigan model of lattice protein folding." Investors Jeff Bezos and the CIA are happy. A 2048-qubit system, about 1 million times faster, is in the works.
posted by stbalbach (58 comments total) 28 users marked this as a favorite
 
Every time I read how quantum computers work, I am reminded how my brain can both understand and not understand at the same time.
posted by jiawen at 1:15 AM on December 28, 2012 [45 favorites]


Neato, but it's not quantum, that's marketing.
posted by Mblue at 1:20 AM on December 28, 2012


At first there was a lot of skepticism about it, but an August Nature study proved it worked by successfully solving "13 times out of 10,000 for four-amino-acid and six-amino-acid sequences under the Miyazawa-Jernigan model of lattice protein folding."

It didn't really do anything that a classical computer can't do, though. Read the "it worked" article and you'll see that the claims here are overblown.
posted by vacapinta at 1:25 AM on December 28, 2012


Qubit? What's a qubit? What's a pair-wise coupled semiconducting flux qubit? Can I run it through the Heisenberg compensators to create an inverted tachyon pulse? What a zany future you all live in, where people keep straight faces while saying things like "flux qubit" and "blog."
posted by bicyclefish at 1:28 AM on December 28, 2012 [20 favorites]


13 times out of 10,000 for four-amino-acid and six-amino-acid sequences
13 out of 10,000? How long did it take to run those 10k simulations? And how do they know which are correct (or is it an NP-like problem where the solution is easy to verify)?

Qubit? What's a qubit? What's a pair-wise coupled semiconducting flux qubit?
A qubit is a quantum bit. It can, IIRC, exist in a superposition of one and zero. If you have coupled qubits, that means that the quantum bits are in a state of quantum entanglement. Figuring out exactly what that means requires knowing at least a little bit about quantum physics, I guess.
posted by delmoi at 1:47 AM on December 28, 2012 [3 favorites]
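delmoi's description of superposition and entanglement can be made concrete with a toy state-vector sketch (plain Python, not tied to any particular hardware; the state names here are just illustrative):

```python
import math

# A qubit is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measuring it gives 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
plus = [1 / math.sqrt(2), 1 / math.sqrt(2)]   # equal superposition of 0 and 1
zero = [1, 0]                                  # definitely 0

def kron(a, b):
    """Joint state of two independent qubits (tensor product)."""
    return [x * y for x in a for y in b]

# Two uncoupled qubits: the joint state factors into its parts.
product_state = kron(plus, zero)

# A Bell state is entangled: all amplitude sits on |00> and |11>, and
# no pair of single-qubit states tensors together to produce it. The
# measurement outcomes of the two qubits are perfectly correlated.
bell = [1 / math.sqrt(2), 0, 0, 1 / math.sqrt(2)]

def measure_probs(state):
    """Probabilities of the basis outcomes 00, 01, 10, 11."""
    return [abs(amp) ** 2 for amp in state]

print(measure_probs(bell))  # ~50/50 on 00 and 11, never 01 or 10
```

The "coupled" part is exactly the Bell-state behavior: measuring one qubit tells you the other's outcome, which a product state can't do.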


The thing I love most about this is that in 10 years' time my phone will have more processing power than this computer.
posted by MuffinMan at 2:10 AM on December 28, 2012 [4 favorites]


Man, I can't wait to check me out some porn on this baby.
posted by the noob at 2:10 AM on December 28, 2012 [5 favorites]


Rigghht.

What's a qubit?

posted by Meatbomb at 2:28 AM on December 28, 2012 [13 favorites]


Every time I read how quantum computers work, I am reminded how my brain can both understand and not understand at the same time.

Oh, I don't have that problem. I just don't understand.
posted by zardoz at 2:51 AM on December 28, 2012 [2 favorites]


Geordie Rose gave a talk (shitty silverlight site warning; there's a crappy screen cap on youtube) to the faculty of the USC Information Sciences Institute who had just bought the thing. It does a pretty good job of explaining some of the details of how it works and how it's programmed.
posted by Rhomboid at 2:52 AM on December 28, 2012 [1 favorite]


The look of the D-Wave One reminds me of the appearance of some of the computers made by Silicon Graphics in the 1990s (like this). In both cases the marketing challenge is to seduce the nerds who might be using the device to such an extent that they will pester their senior managers into signing a very large purchase order. Yet, because the device is so cutting edge, everybody will have to take it on faith that the machine will fulfil their current and future computing dreams - without being able to rely too much on details. In this context having a box which looks super-bad-ass is essential.
posted by rongorongo at 2:54 AM on December 28, 2012 [4 favorites]


I'm about as far from an expert as you could possibly find in this area, but I was under the very specific impression that they couldn't correctly link more than three or four qubits at a time.

I am, therefore, strongly suspicious that this 128-qubit model does not actually work correctly, or at least it's not a classic quantum computer as described in the literature, where all the bits are synced up with one another.

The 'pair-wise coupled' wording makes me think that this is actually 64 separate 2-bit quantum computers, which is not even vaguely the same thing. If that's the case, I'm very unclear on how you'd actually do any useful computation with it that you couldn't do much faster the old way.
posted by Malor at 3:25 AM on December 28, 2012


The whole point of a quantum computer is that you can do functions on a string of quantum bits ("qubits") that transform the whole string in a way that reveals some information. This is the premise behind Shor's algorithm, which shows that a certain type of transformation performed on a string of bits would either produce nonsense or show the factors of a very large number, such as those used in RSA encryption. Done repeatedly this would be enough to break almost any encryption. But performing Shor's algorithm means having a computer (a series of linked qubits) that is long enough to house the entire number to be factored.

However, it doesn't sound like that is what we have here; even if it's quantum, it sounds like what they did was to take a bunch of 2-qubit computers and use them in parallel. This has none of the interesting potential of quantum computing and has been done before, although not on this scale. There is literally no point other than to try and sell a big box which may be useful for very particular features.

The other thing about quantum computing is that it tends to be very specialized, as in Shor's algorithm. It's not like standard computers where you'd really want as many as possible to do day-to-day tasks, but rather you'd want to use finely tuned quantum computers to do extremely specialized tasks like factoring through Shor's algorithm. So even if it's real, it's still hype.
posted by graymouser at 3:47 AM on December 28, 2012 [1 favorite]


In this context having a box which looks super-bad-ass is essential.

Super-bad-ass? In my opinion, Silicon Graphics produced boxes that were simply super bad. Might win a yacht race in record time, though.
posted by Jimbob at 3:53 AM on December 28, 2012


"The other thing about quantum computing is that it tends to be very specialized ... It's not like standard computers where you'd really want as many as possible to do day-to-day tasks, but rather you'd want to use finely tuned quantum computers to do extremely specialized tasks"

I agree. I think there is a world market for maybe five quantum computers.
posted by EnterTheStory at 4:05 AM on December 28, 2012 [23 favorites]


I read that as QuiBids, not qubits, the first time, which was confusing.
posted by unSane at 4:07 AM on December 28, 2012


Let's hook it up to a DeLorean and blast some Huey Lewis.
posted by Brocktoon at 4:13 AM on December 28, 2012 [2 favorites]


Jeff Bezos and the CIA? At some point the 21st Century is going to stop feeling like William Gibson's brain coming down off a methamphetamine jag.

But alas, not yet.
posted by R. Schlock at 4:30 AM on December 28, 2012 [18 favorites]


I think there is a world market for maybe five quantum computers.

If this is meant to predict a ubiquity of quantum computers in the future, it's probably wrong. Quantum computers are not Turing machines that you feed data through with a clock attached, like a modern binary computer but with qubits instead of bits. They are strings of atoms that are manipulated in highly specialized ways, as in Shor's algorithm (which remains the main problem in quantum computing). Performing a quantum algorithm relies on manipulating bits that are in superposition and affect each other in a way that provides a meaningful result (well, sometimes). Actually running a quantum computer involves the ability to manipulate and read the state of single atoms. Talking about ubiquitous quantum computers is like predicting that you'll drive a spaceship to work; it may revolutionize other things but quantum computing itself is not likely to be an everyday thing in our lifetimes.
posted by graymouser at 4:34 AM on December 28, 2012 [7 favorites]


So does this mean that publicly available forms of encryption are in trouble?

Awesome, more scary stuff to contend with!
posted by This, of course, alludes to you at 4:39 AM on December 28, 2012


So does this mean that publicly available forms of encryption are in trouble?

No.
posted by clarknova at 4:47 AM on December 28, 2012 [2 favorites]


Jimbob, I supported a cluster of those evil things back in the day, and even the presence of that link in this post is making my skin crawl. Blaargh.
posted by 1adam12 at 4:59 AM on December 28, 2012


"Their claimed speedup over classical algorithms appears to be based on a misunderstanding of a paper my colleagues van Dam, Mosca and I wrote on "The power of adiabatic quantum computing." That speed up unfortunately does not hold in the setting at hand, and therefore D-Wave's "quantum computer" even if it turns out to be a true quantum computer, and even if it can be scaled to thousands of qubits, would likely not be more powerful than a cell phone." - Source
posted by ACair at 5:02 AM on December 28, 2012 [1 favorite]


ACair, your Dr Aaronson has since backed off his criticisms.
posted by humanfont at 5:21 AM on December 28, 2012


> The thing I love most about this is that in 10 years' time my phone will have more processing power than this computer.
> posted by MuffinMan at 5:10 AM on December 28

Can't wait to put mine on my display shelf between the Apple II and the C64.
posted by jfuller at 5:43 AM on December 28, 2012 [1 favorite]


Your phone already has more processing power than this computer. It just happens to be less, um, "parallel". Most of the computing problems that normal people care about are not particularly easier with quantum computers. You only need quantum if normal computers have difficulty even listing all the possible solutions.
posted by LogicalDash at 6:14 AM on December 28, 2012 [1 favorite]


Done repeatedly this would be enough to break almost any encryption

No. It breaks public key cryptosystems where the strength of the key is the difficulty of factoring very large numbers, or computing discrete logarithms. In classical computing, integer factorization is an NP-intermediate problem, but it's a BQP problem in quantum computing, thanks to Shor's algorithm. This makes integer factorization vastly faster in quantum computing.

Symmetric cryptosystems like AES aren't built on the factorization problem for security; they instead demand that the key be kept secret. Public key cryptography is based on the fact that if you know the public key, encrypting or verifying a signature is easy; if you know the private key, decrypting or signing data is very easy; but if you only have one of the keys, it is very hard indeed to derive the other key, since you'd need to solve an NP-intermediate problem to find it.

With quantum computing, and Shor's algorithm, that changes. Shor's algorithm can factor integers in O(b³) time and O(b) space, where b is the number of bits in the number.

The reason that public key cryptography hasn't been destroyed is b -- the number of bits. If you want to factor a 768-bit number, you need O(b) qubits, or 768 of them. Since the largest quantum computer we've seen has used 7 qubits, this limits it. The largest number proven to have been factored by Shor's algorithm is 21 (7x3), which was accomplished by IBM this year.

Even if this is a 128 qubit quantum computer, and I don't think it is, it's not large enough to break PGP as we know it today. However, if we can develop very large qubit computers, then any cryptosystem based on integer factorization or discrete logarithms will fall.

If a quantum algorithm can be found that will test a large number of keys in a symmetric key cryptosystem at once, it may be that quantum computing will render things like AES infeasible. But I don't know of anything resembling that.

So, right now, if you could create a large qubit quantum computer, it would render PGP and such useless, but not AES and the like.
posted by eriko at 6:25 AM on December 28, 2012 [11 favorites]
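The division of labor eriko describes in Shor's algorithm can be sketched classically: only the period-finding step needs a quantum computer, and everything before and after it is ordinary arithmetic. A hedged sketch in Python, brute-forcing the period (the part the quantum Fourier transform would speed up) for the N=21 case mentioned above:

```python
from math import gcd

def find_period(a, N):
    """Smallest r > 0 with a^r = 1 (mod N). This brute-force loop is
    the one step a quantum computer speeds up exponentially; it only
    terminates when gcd(a, N) == 1, which the caller checks first."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(a, N):
    """Classical post-processing of Shor's algorithm: turn the period
    of a mod N into a factorization of N (or None for an unlucky a)."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g          # stumbled on a factor directly
    r = find_period(a, N)
    if r % 2 == 1:
        return None               # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None               # trivial square root: retry
    return gcd(y - 1, N), gcd(y + 1, N)

print(shor_factor(2, 21))  # period of 2 mod 21 is 6, giving (7, 3)
```

The O(b) qubit requirement comes from `find_period`: the quantum register has to hold the whole number being factored, which is why a handful of qubits can only reach numbers like 15 or 21.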


In both cases the marketing challenge is to seduce the nerds who might be using the device to such an extent that they will pester their senior managers into signing a very large purchase order.

Are the marketers' goals to seduce the nerds, or the managers?
posted by inigo2 at 6:30 AM on December 28, 2012


Is Scott Aaronson just an outlier * or is there genuine debate in the industry about whether the device is working as advertised? It is the sort of argument I would expect to see over a product in Skymall rather than something with papers written up in Nature.
posted by rongorongo at 6:42 AM on December 28, 2012 [1 favorite]


A 2048 qubit system is in the works...

Need it sooner than that.
posted by hal9k at 7:02 AM on December 28, 2012 [1 favorite]


Scanning the FPP I thought that amino acids were actually used in the computer and I was all whaaaaa?
posted by sourwookie at 7:06 AM on December 28, 2012


Since it unveiled its first working computer in 2007, the 72-employee Canadian company has faced skepticism from purists, who say the D-Wave system is a pale imitation whose circuitry doesn’t obey the laws of quantum physics. In its defence, D-Wave cites a 2011 paper in the reputable scientific journal Nature as proof that quantum properties are in play.

D-Wave Systems Inc. uses the relatively new adiabatic model, also known as quantum annealing. This architecture allows its quantum bits, or qubits, to shift from superposition to a traditional computer state.
posted by stbalbach at 7:12 AM on December 28, 2012


What Other Items Do Customers Buy After Viewing This Item?
posted by weapons-grade pandemonium at 7:35 AM on December 28, 2012 [4 favorites]


humanfont: ACair your Dr Aaronson has since backed off his criticisms

If that's backing off, his original criticisms must have been blistering. He's basically saying in your link that there's no evidence that this is even a quantum computer; it may just be a classical device that happens to be a little faster than some others.
posted by Malor at 7:42 AM on December 28, 2012 [4 favorites]


Is Scott Aaronson just an outlier * or is there genuine debate in the industry about whether the device is working as advertised?

There's always some skeptic out there making a name for themselves - climate change, tobacco, you name it. The interesting thing is to watch their reaction when proven wrong. Aaronson just says "ok I was wrong about that, but *this*.." so consider how reliable he is as a source. Maybe he really is the lone wolf who holds the Truth, maybe not.

It didn't really do anything that a classical computer can't do, though. Read the "it worked" article and you'll see that the claims here are overblown.

This is the first generation of an entirely new computer, like 1950 or something. The mere fact that it worked is hugely significant and not at all overblown - they are working to improve it at a pace a million times faster in the near future, then a million times faster than that, and so on; at some point it will far surpass traditional computers, and once it does, there is no looking back; they can never catch up. The skeptics in this thread are like "look at how crappy this working first generation computer is, my cell phone does more, it's only good for one thing, what's this good for blah blah", whatever.
posted by stbalbach at 8:08 AM on December 28, 2012 [1 favorite]


This is the first generation of an entirely new computer, like 1950 or something. The mere fact that it worked is hugely significant and not at all overblown - they are working to improve it at a pace a million times faster in the near future, then a million times faster than that, and so on; at some point it will far surpass traditional computers, and once it does, there is no looking back; they can never catch up.

In other words -- this will, in 50 years or so, be more along the lines of this.
posted by CosmicRayCharles at 8:30 AM on December 28, 2012


The skeptics in this thread are like "look at how crappy this working first generation computer is, my cell phone does more, it's only good for one thing, what's this good for blah blah", whatever.

No, the skepticism here is that D-Wave is advertising a 128-qubit quantum computer that doesn't seem to work like a 128-qubit quantum computer should, according to the current theories.

It is clear that D-Wave has demonstrated quantum annealing, and has successfully solved a limited-state protein folding problem using quantum-annealing-enabled gates. However, in terms of quantum computing, this is very limited. Adiabatic quantum computing is good at a class of problems that we can simply describe as "find the minimum." Useful stuff, but very different from a quantum gate array, which is what most people think of as quantum computing, and what truly operates well on BQP* problems, like integer factoring. Instead, adiabatic quantum computing operates on QMA** problems. QMA is to BQP as NP is to P.

It also appears that the actual gates are configured as a series of limited-bit gates (some say 64 two-qubit, some say 32 four-qubit) operating in parallel. This is *very* different from a 128-qubit quantum computer. However, D-Wave is unclear on this -- which doesn't help appease the skeptics one bit.

D-Wave came under a great deal of criticism for making computational claims, but providing no peer reviewed support. This changed in 2011 when D-Wave published a peer reviewed work in Nature showing that they did have a small, functional adiabatic quantum computer.

As a research work, the machine is very impressive. As a commercial product? It doesn't seem so, but D-Wave is somewhat closed-mouth and thus, it's hard to truly know.

* BQP "Bounded-error Quantum Polynomial Time" -- decision problems that are solvable by a quantum computer in polynomial time with an error probability of at most 1/3.

** QMA "Quantum Merlin Arthur", named after the Arthur-Merlin protocol. It is the set of decision problems for which:
: if the answer is YES, there is a polynomial-sized quantum proof that can be verified in polynomial time, accepting with high probability (2/3), and...
: if the answer is NO, every polynomial-sized state is rejected with high probability in polynomial time.
posted by eriko at 9:04 AM on December 28, 2012 [6 favorites]
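The "find the minimum" problem class eriko describes is the same one classical simulated annealing attacks; a quantum annealer targets the same kind of Ising-style energy function, but escapes local minima by tunneling rather than thermal hops. A hedged classical sketch (a toy random instance, not D-Wave's actual programming model):

```python
import math
import random

# Toy Ising-style objective: spins s_i in {-1, +1}, random pairwise
# couplings J and local fields h. The instance is illustrative only.
n = 8
random.seed(1)
J = {(i, j): random.choice([-1, 1]) for i in range(n) for j in range(i + 1, n)}
h = [random.choice([-1, 1]) for _ in range(n)]

def energy(s):
    """Ising energy: field terms plus pairwise coupling terms."""
    return (sum(h[i] * s[i] for i in range(n))
            + sum(J[i, j] * s[i] * s[j] for (i, j) in J))

def anneal(steps=20000, t0=5.0):
    """Classical simulated annealing: propose single spin flips,
    always accept downhill moves, accept uphill moves with a
    probability that shrinks as the temperature cools."""
    s = [random.choice([-1, 1]) for _ in range(n)]
    cur = energy(s)
    best, best_e = s[:], cur
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-3      # cooling schedule
        i = random.randrange(n)
        s[i] = -s[i]                         # propose one spin flip
        new = energy(s)
        if new <= cur or random.random() < math.exp((cur - new) / t):
            cur = new                        # accept the move
            if cur < best_e:
                best, best_e = s[:], cur
        else:
            s[i] = -s[i]                     # reject: flip back
    return best, best_e

state, e = anneal()
print("best energy found:", e)
```

Swapping the thermal acceptance rule for quantum tunneling is, roughly, what "adiabatic" hardware buys you -- the objective being minimized stays the same shape.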


Qubit? What's a qubit?

One three-hundredth of an ark.
posted by JackFlash at 9:23 AM on December 28, 2012 [6 favorites]


Jeff Bezos and the CIA? At some point the 21st Century is going to stop feeling like William Gibson's brain coming down off a methamphetamine jag.

Ya just ain't looking in the right place is all.

http://longnow.org/clock/ - the clock on Jeff's land. But now let's get you on a majik karpit ride.

http://kk.org/ct2/2008/08/neal-stephenson-and-the-10000y.php - see the purple robe? Now start tripping on why the purple robe is needed for what rite or ceremony.
posted by rough ashlar at 9:54 AM on December 28, 2012 [2 favorites]


stbalbach: I read Scott Aaronson's revised opinion of D-Wave as very reasonable. Where they showed him evidence, he backed off. Where they didn't show evidence, he didn't. Also, your insinuation that he's just some crank on the Internet is pretty far off the mark. He's a professor at MIT, he works on quantum computational complexity, and he's hardly been alone in his skepticism.

I don't think Scott Aaronson or any of the people in that IEEE Spectrum article are saying that what D-Wave claims is impossible. They are saying that D-Wave hasn't shown evidence for its most important claims yet.

eriko: I don't understand your argument. BQP is contained in QMA, so wouldn't an adiabatic quantum computer be more useful? Have I missed something?
posted by Serf at 10:24 AM on December 28, 2012 [3 favorites]




It's a little weird that there is a for-profit company selling a product that seems to be far ahead of what's claimed to be achieved in laboratories.
posted by benito.strauss at 11:53 AM on December 28, 2012 [2 favorites]


I am a sucker for high-tech startups. Wish I was 30 years younger. Having worked on the bleeding edge most of my life, I know you have to start somewhere. I just looked at their openings and they are looking for field service engineers/technicians -- they are installing machines; the romance of quantum computing is too strong for some people to resist. There is enough money involved here to keep a hundred or so engineers busy for some time. They may ultimately fail, but to be there at the start of an industry is fun as hell. Solving problems in the real world that no one has ever faced before is an outstanding way to spend your time. I know I would join them in a heartbeat if they would have me. Damn few chances for folks to do stuff for the first time; if you have the chops, what is there to lose? A few years working in the best physics and computer labs in the world? Shit yeah, I might send in my resume and see if they can use a 60-year-old. Sometimes you know too much and it holds you back. If you know it can't be done, chances are you will never do it.
posted by pdxpogo at 12:11 PM on December 28, 2012 [1 favorite]


The way that D-Wave seems to cultivate uncertainty and misinformation about their product makes me very wary about them. The best comparison I can make is to Noka chocolate [previously]: they were a self-proclaimed luxury chocolate company that was always very unclear on whether they were a chocolate maker (actually taking the cocoa beans and from them producing chocolate) or a chocolatier (taking base chocolate made by other companies and transforming it into molded pieces of chocolate and the like). They wanted to give the impression of being a chocolate maker when really they were a chocolatier, and based on this consumer confusion they charged exorbitant prices as if they were making high quality artisanal chocolate and not just melting down stock chocolate into bars and putting them into fancy boxes.

D-Wave sends off the same kind of vibe of a company that seeks to benefit from confusion. They want to be seen as providing a machine that can break modern public key encryption when in fact (?) they are actually making something else (??). That very few people are qualified to evaluate D-Wave's claims should be additional cause for caution.
posted by Pyry at 12:44 PM on December 28, 2012 [7 favorites]


pdxpogo:

It's not a question of whether quantum computing is possible; D-Wave's critics do not doubt that it is a possibility. The problem is, what D-Wave seems to have done is put together something that is marginal in terms of the mostly experimental field of quantum computing, if it actually falls under that umbrella at all, and is marketing it as if it were the real thing. A functional 128-bit quantum computer would mean that some significant problems have been solved, and as soon as D-Wave's hype is given the merest scratch, it becomes clear that they haven't solved them at all. This is an attempt to cash in on a buzz word, not a serious advance in computing.
posted by graymouser at 12:51 PM on December 28, 2012


Interesting comparison to NoKa, because after reading that article (which, BTW, is one of the most interesting foodie articles I've ever read) I decided to try "the most expensive chocolate in the world". I located some Bonnat chocolate, the "stock chocolate" they were using. And it's delicious. Just amazing. And about the same price as any other high-end chocolate.

The lesson that would teach for this situation (though I don't know if it actually is valid in this case): don't buy the technology from D-Wave, but there may still be something useful underneath their marketing buzz.
posted by benito.strauss at 12:56 PM on December 28, 2012 [2 favorites]


The sense I get from D-Wave is that they're trying to use the hardware startup model of research (which is a model, as opposed to Big Corporate and Big Academia) against the messy problem of Quantum Computing. Nobody actually knows if you can extract a useful number of qubits from the universe, but who knew the complexity limits of Silicon IC's back in the 50's or 60's? We'd probably have never found out if we had some other substrate that was already kicking ass.

They're way out there, but OK. Let them be. Sometimes you have to just keep trying stuff to see if you hit a wall.
posted by effugas at 2:11 PM on December 28, 2012


As far as I can see, it's not like they're selling a perpetual motion machine, or something like that. There is a lot of marketing behind it, and they could be more rigorous, but their product does interesting things, and people want to play with it and figure out what, exactly, it does. I doubt anyone who buys a D-Wave doesn't know what they're getting themselves into.
posted by Jimbob at 4:14 PM on December 28, 2012


No. It breaks public key cryptosystems where the strength of the key is the difficulty of factoring very large numbers, or computing discrete logarithms. In classical computing, integer factorization is an NP-intermediate problem, but it's a BQP problem in quantum computing, thanks to Shor's algorithm. This makes integer factorization vastly faster in quantum computing.

Indeed. Quantum computers are not magic.
There are classes of encryption schemes that are asymmetric and resistant to being broken by quantum computers. There are some lattice-based problems that could serve as a building block for a cryptosystem that is not vulnerable even to a full quantum computer.
posted by atrazine at 5:14 PM on December 28, 2012 [2 favorites]
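A toy sketch of the lattice-based idea atrazine mentions, in the style of Regev's LWE (learning with errors) encryption scheme -- the parameters here are illustrative and wildly insecure, chosen only so the arithmetic fits on a screen:

```python
import random

# Toy Regev-style LWE encryption. Security (in real parameter
# regimes, not these!) rests on a lattice problem, not on factoring,
# so Shor's algorithm doesn't apply.
q = 97   # modulus
n = 8    # secret dimension
m = 20   # number of LWE samples (kept < q/4 so decryption is exact)

s = [random.randrange(q) for _ in range(n)]                      # secret key
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]  # public matrix
e = [random.choice([-1, 0, 1]) for _ in range(m)]                # small noise
# public key: (A, b) where b = A*s + e mod q
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]

def encrypt(bit):
    """Encrypt one bit by summing a random subset of the LWE samples
    and shifting by q/2 when the bit is 1."""
    subset = [i for i in range(m) if random.random() < 0.5]
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    """v - <u, s> mod q is (small noise) or (small noise + q/2);
    decide which half of the ring it landed in."""
    d = (v - sum(u[j] * s[j] for j in range(n))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0

u, v = encrypt(1)
print(decrypt(u, v))  # recovers 1
```

Because the accumulated noise (at most m = 20 here) stays below q/4, decryption is exact; recovering s from (A, b) without the key is the hard lattice problem.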


"The CIA" is via In-Q-Tel as the article elaborates, and their portfolio is public (as well as alumni) and it's quite extensive/interesting. Keyhole (Google Earth), Palantir, 10gen, and so on.
posted by tmcw at 9:16 PM on December 28, 2012 [1 favorite]


Look, none of us knows or is qualified to judge, but we can make some observations: one would think the institutions spending $10 million each to buy this machine would not be so easily snowed by marketing brochures, or would leak some discontent once they took the wrapping off only to discover, no, this is not the quantum machine they thought it was. D-Wave's secrecy is easily and logically explained by competitive concerns; no conspiracy is needed, by Occam's view of things. The critics' complaints boil down to this: we don't know how the thing works at a level where we could replicate it ourselves because D-Wave won't tell us trade secrets! Duh. We also know that as new information has become public in peer reviewed papers, it has proven the critics wrong on some points, and that the critic mentioned above has backed down from being D-Wave's "official" critic. Finally, the purists' view of things seems to me like an academic argument: does it really matter if it's a quantum computer of X type vs one of Y type if in fact it's solving problems (again, this is first gen and they are scaling it up quickly)?
posted by stbalbach at 9:27 PM on December 28, 2012


does it really matter if it's a quantum computer of X type vs one of Y type if in fact it's solving problems

In terms of the viability of quantum computing for the immediate future, yes, it absolutely matters. If they don't have quantum computers with registers longer than 2 or 4 bits, then they haven't solved the problems of creating and manipulating the long registers of entangled qubits that are required for using Shor's algorithm to factor long integers, which is pretty much the benchmark for quantum computing today. (Currently it has achieved factorization of 15, which of course is 1111 in binary, so 4 qubits have already been done.) In other words - "X type" is not a step towards the potential that makes researchers excited about quantum computing, because it doesn't move us towards "Y type" quantum computers in any significant way, and "X type" quantum computers are not in themselves an improvement over classical computers.

As far as the critics backing down: as I read it, the question was over whether this was a quantum computer at all (that is, whether it displayed entanglement properties between the qubits in its registers). The critics who've backed down say that it is possible that D-Wave's computer does actually have small quantum registers of the type that have already been created in other laboratory settings, but not that it has created the kind of registers that would eventually allow for Shor's algorithm and similar processes to be run.
posted by graymouser at 11:21 PM on December 28, 2012 [6 favorites]


tmcw: ""The CIA" is via In-Q-Tel as the article elaborates, and their portfolio is public (as well as alumni) and it's quite extensive/interesting. Keyhole (Google Earth), Palantir, 10gen, and so on."

Gosh. Thanks for making me somewhat paranoid of Granola, as it is created by MiserWare WHICH is in that portfolio. Just what I needed.
posted by Samizdata at 2:40 AM on December 29, 2012


"The CIA" is via In-Q-Tel as the article elaborates, and their portfolio is public (as well as alumni) and it's quite extensive/interesting. Keyhole (Google Earth), Palantir, 10gen, and so on.

Ah, In-Q-Tel -- the spooky defense/intel VC firm whose ascent was overseen by the same genius as Falcon. Looking back on the changes in defense and intelligence over the last decade, Chopstick was a very appropriate and emblematic choice. (He was also reportedly a very tough and technically plugged in manager, by accounts of still somewhat disgruntled Falcon developers I've encountered on other forums.) He's also served on the boards of companies in various areas of geeky interest--like WOTC (and FASA). He's an impressive node for degrees-of-separation exercises.
posted by snuffleupagus at 6:43 AM on December 29, 2012


now at Alsop-Louie
posted by snuffleupagus at 7:50 AM on December 29, 2012


"X type" quantum computers are not in themselves an improvement over classical computers.

That's not true, from what they are saying. They are saying they can scale "X type" up to be faster than traditional computers. Very specialized applications of course, not a GP computer.
posted by stbalbach at 10:34 AM on December 29, 2012


Talking about ubiquitous quantum computers is like predicting that you'll drive a spaceship to work; it may revolutionize other things but quantum computing itself is not likely to be an everyday thing in our lifetimes.
I don't think that's necessarily a given. All that really matters is how cheap they are - even if you're manipulating individual atoms, if the hardware to do that can be made cheaply, then it won't cost a lot. There's also the possibility of "cloud hosted" quantum computing where one organization maintains them but everyone can use them. Done that way, quantum computing might be a more efficient way to do certain things. I'm not really sure what. There is some work being done on hypothetical quantum computer machine learning algorithms. Whatever it is will probably be a situation where you have general purpose computers that break problems into chunks that a quantum computer can solve, then recombines those for an answer.

It also depends on whether we can ever do room temperature quantum computing. It might be possible to have a cryogenic desktop, but a cellphone where some of the components are kept super-chilled seems unlikely (unless they are really small, I guess).
---
Also, as far as crypto goes, I've heard there are some crypto algorithms that can't be broken in BQP, which means they can't actually be broken by quantum computers. I'm not sure if they're public-key or not, though.
If that's backing off, his original criticisms must have been blistering. He's basically saying in your link that there's no evidence that this is even a quantum computer; it may just be a classical device that happens to be a little faster than some others.
Hahaha. Scott Aaronson actually criticized me once. In my case, he was ridiculously over the top. Seriously, the guy seems to take math very personally.

Not that the D-Wave isn't bullshit, or a useless device. Or it could be something with some super-niche applications, but not what people mean when they typically say "quantum computing".
The sense I get from D-Wave is that they're trying to use the hardware startup model of research (which is a model, as opposed to Big Corporate and Big Academia) against the messy problem of quantum computing. Nobody actually knows if you can extract a useful number of qubits from the universe, but who knew the complexity limits of silicon ICs back in the '50s or '60s? We'd probably have never found out if we had some other substrate that was already kicking ass.
The problem, though, is that by the time Intel and companies like that came around, transistors and "integrated electronics" were already things you could do and sell and actually have work. The transistor was invented at Bell Labs, I think. If there had already been a few quantum computers in production, these guys' claims wouldn't be as outlandish.

And by the way, I'm sure there were tons and tons of bogus IC startups over the years.
posted by delmoi at 10:05 PM on December 29, 2012 [1 favorite]


A former colleague used to work for D-Wave and was very critical of their methods after he left them. Some demos they've done in the past have been accused of being borderline fraudulent but, as Prof. van Dam says in the Wired piece, the criticism seems to be less fierce after they published some interesting experimental results. It's still highly questionable whether their machines work as they claim, though.

I do some work on superconducting qubits, and actually have one cold that I do measurements on right now. I'm no expert on the computational aspect, but maybe I can still contribute something about the qubit they use and the difference between D-Wave's approach and that of most researchers.

Since there was a question about it upthread: A flux qubit is a closed loop of superconductor where current can flow without any resistance. It works as an electromagnet, producing a little bit of magnetic flux which depends on the current flowing around the loop. If you have the loop flat on a chip, the magnetic field points down if the current runs clockwise and up if it runs counterclockwise. These are the two base states of the qubit: Magnetic field up means "0" and down means "1". With a small coil next to the loop, you can change the preferred direction of the field gradually, and the important quantum feature is that each qubit can be a little bit each of 0 and 1, in what's called a quantum superposition. Next to each qubit loop is also a SQUID, a magnetometer capable of detecting which direction the qubit's magnetic field is pointing.
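As an illustration only (this is the generic textbook two-level toy in plain Python, not anything specific to flux qubits or D-Wave's hardware), that "a little bit each of 0 and 1" state can be written as two amplitudes whose squares give the measurement odds:

```python
import math

# Toy model of one qubit (illustration only): the state is a pair of
# amplitudes (amp0, amp1); amp0**2 is the probability of measuring the
# field "up" (0) and amp1**2 of measuring it "down" (1).
def superposition(theta):
    """Bias-coil knob: theta=0 gives a pure 0, theta=pi a pure 1,
    theta=pi/2 an equal 50/50 superposition."""
    return (math.cos(theta / 2), math.sin(theta / 2))

def measure_probabilities(state):
    amp0, amp1 = state
    return (amp0 ** 2, amp1 ** 2)

p0, p1 = measure_probabilities(superposition(math.pi / 2))
print(round(p0, 3), round(p1, 3))   # 0.5 0.5
```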

In a "traditional" quantum computer architecture, you'd have a few of these qubits coupling to each other so that changes in one of them may produce a change in another one. You would start with all qubits in a known initial state and then do some exact operations on them in a specific sequence. For example, you could bring one of them to a 50/50 superposition, then couple it to one of its neighbours, then change the superposition in that neighbour a certain amount, and so on. After a few of these operations you may use the SQUIDs to do projective measurements on one or more of the qubits, which means you force them to be either 0s or 1s in the classical world and record their decisions. You can keep doing operations on the remaining un-measured qubits, but in the end you have a lot of measured zeros and ones that form the solution to some specific computational problem.

In D-Wave's architecture, you have qubits of the same type as before, but they spend most of their time either in the 0 or 1 state, not as much in superpositions. You start out with all of them as 0s and use the biasing coils to adjust the "preferred" state for each one of them. Many of them will be prone to switching from 0 to 1, but this is complicated by the magnetic field each of them senses from its neighbours: If the bias coil on one qubit makes it weakly inclined to be a 1, it may still be better for it to be a 0 if the qubit next to it is 1, whereas a neighbouring value of 0 will make it strongly inclined to be a 1. Looking at all the qubits together, there will be some overall combination which makes the most sense (a global optimum which minimizes the total potential energy).
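That biases-plus-neighbour-couplings picture is essentially an Ising-style energy minimization. A brute-force classical sketch (made-up numbers, four "qubits", nothing to do with D-Wave's actual chip) shows what "the combination that makes the most sense" means:

```python
from itertools import product

# Each "qubit" takes value 0 or 1. Bias coils set a local field h[i];
# couplings J[(i, j)] make neighbouring qubits favour agreeing
# (negative J) or disagreeing (positive J).
h = [0.1, -0.5, 0.3, 0.2]                      # made-up per-qubit biases
J = {(0, 1): -1.0, (1, 2): 0.8, (2, 3): -0.6}  # made-up couplings

def energy(bits):
    # map 0/1 to -1/+1 so biases and couplings act symmetrically
    s = [2 * b - 1 for b in bits]
    e = sum(h[i] * s[i] for i in range(len(s)))
    e += sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
    return e

# Brute force is fine for 4 qubits; the point of building hardware is
# that this search space doubles with every qubit you add.
best = min(product([0, 1], repeat=len(h)), key=energy)
print(best, round(energy(best), 2))            # (1, 1, 0, 0) -3.3
```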

The task of the machine is to find this optimum given some combination of bias from all the coils, and quantum mechanics gets involved by letting the qubits work together in deciding how to flip.

I think a rough analogy can be drawn to the audience members in a theater, who all want to position themselves to get a good view of the stage. In the classical scenario, each person adjusts individually, according to what they presently see in front of them. In doing so, they may end up blocking the view of someone else, who then has to adjust accordingly. The quantum version is when people talk with each other to optimize their positions together. That is a more efficient way to make everyone's view as good as possible.

The main controversy around D-Wave is that you can eventually reach an optimum also without any quantum-ness, it's just a slower process. This is what van Dam's comment in the Wired article refers to:

“If I were to predict the future, their systems would [turn out to] be classical systems. But it could be that they’re using a kind of technology to implement what I think is classical computation that is very different from the silicon technology we use nowadays.”
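That "slower, non-quantum process" can be sketched as plain simulated annealing on a made-up four-spin energy function (illustrative only, not a model of any real hardware): spins flip one at a time based on local energy changes and a falling temperature, and the system grinds its way toward a low-energy configuration with no quantum effects at all.

```python
import math
import random

random.seed(0)
h = [0.1, -0.5, 0.3, 0.2]                      # made-up biases
J = {(0, 1): -1.0, (1, 2): 0.8, (2, 3): -0.6}  # made-up couplings

def energy(s):                                 # s[i] in {-1, +1}
    e = sum(h[i] * s[i] for i in range(len(s)))
    return e + sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

s = [random.choice([-1, 1]) for _ in h]
for step in range(2000):
    T = max(0.01, 2.0 * (1 - step / 2000))     # slowly cooling temperature
    i = random.randrange(len(s))
    old = energy(s)
    s[i] = -s[i]                               # propose a single spin flip
    new = energy(s)
    if new > old and random.random() > math.exp((old - new) / T):
        s[i] = -s[i]                           # reject the uphill move

# Finish with a greedy descent so the result is at least a local minimum.
improved = True
while improved:
    improved = False
    for i in range(len(s)):
        old = energy(s)
        s[i] = -s[i]
        if energy(s) < old:
            improved = True
        else:
            s[i] = -s[i]

print(s, round(energy(s), 2))
```

No entanglement, no superpositions: just thermal noise shaking the system out of bad configurations. The quantum-annealing claim is that tunnelling lets the hardware do this search faster, and the controversy is over whether D-Wave's machine actually does.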

As others have said, there is no equivalence between their type of adiabatic computer and a traditional quantum computer with the same number of bits. They don't solve the same problems, but that isn't necessarily a disadvantage to the annealing variant. One of the main concerns with traditional quantum computing is the lack of algorithms - apart from factorizing integers, you can't really use it for a lot of interesting computations. For a lot of people in the field, numerical computation doesn't seem to be the main goal anymore. They want to use quantum computers to simulate other quantum systems instead, for example chemical and nuclear processes. That's also a worthwhile goal, and one that is much more realistic to achieve.
posted by springload at 6:20 AM on December 30, 2012 [2 favorites]




This thread has been archived and is closed to new comments