As http://www.scottaaronson.com/blog/?p=1400 makes clear, the current incarnation of D-Wave looks cool and costs a fortune, but is slower than a desktop PC running properly-optimized code. And that optimized code only needed to be written once because it solves the problem that D-Wave asks us to reduce all problems to before we can use D-Wave.
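For what it's worth, the problem in question is minimizing an Ising energy function (equivalently, a QUBO). The optimized classical code Aaronson cites is far more sophisticated, but a toy single-spin-flip simulated annealer on a small hypothetical instance (the instance and parameters below are made up for illustration) gives the flavor:

```python
import math
import random

def anneal_ising(J, h, n, steps=20000, t_hot=2.0, t_cold=0.01, seed=0):
    """Minimize E(s) = sum_ij J[i,j]*s_i*s_j + sum_i h[i]*s_i over s_i in {-1,+1}
    using single-spin-flip simulated annealing with a geometric cooling schedule."""
    rng = random.Random(seed)
    s = [rng.choice([-1, 1]) for _ in range(n)]

    def energy(cfg):
        return (sum(h[i] * cfg[i] for i in range(n))
                + sum(c * cfg[i] * cfg[j] for (i, j), c in J.items()))

    e = energy(s)
    best_s, best_e = s[:], e
    for k in range(steps):
        t = t_hot * (t_cold / t_hot) ** (k / steps)  # cool from t_hot down to t_cold
        i = rng.randrange(n)
        s[i] = -s[i]                                 # propose flipping one spin
        e_new = energy(s)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new                                # accept the flip
            if e < best_e:
                best_s, best_e = s[:], e
        else:
            s[i] = -s[i]                             # reject: undo the flip
    return best_s, best_e

# Toy 3-spin antiferromagnetic chain: positive J favors opposite neighboring spins.
J = {(0, 1): 1.0, (1, 2): 1.0}
h = [0.0, 0.0, 0.0]
spins, e = anneal_ising(J, h, n=3)
```

The point is that once you have a good solver for this one energy-minimization problem, every problem you would feed to the D-Wave machine is already in its input format.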
As http://www.scottaaronson.com/blog/?p=1400 admits, large-scale entanglement most likely does take place in D-Wave's computer. So now he just argues that their proof of concept prototype is not as efficient as classical devices that have been perfected over fifty years.
If we lived in a world where nuclear energy was still just a theoretical possibility, and you had a company D-Uranium trying to build the first reactor ever, and someone like Scott Aaronson kept saying for years and years "You guys suck, you haven't even demonstrated there are any atoms being split inside your device, for all we know it's all heat from chemical reactions", and then, eventually, in the face of evidence he was forced to admit that atoms indeed were being split in D-Uranium's pile, so he retreated to saying "But your reactor generates less heat than a coal-burning power station, so you guys still suck!"--what would that make you think?
You're attacking me for what I didn't say and ignoring what I did. I'm not attacking D-Wave for their ambitions. I'm attacking the article for being a puff piece.
The article, in the second paragraph, says "...the computer can solve a certain type of problem much faster than conventional computers...", which is a claim that even cursory research into the subject would show to be false. (Though a PR company could easily deliver a packet of materials and people to talk to which would give a journalist quotes on both sides, and leave them thinking that it was true.) That theme of "this is insanely fast" is repeated throughout the piece in various ways with no challenges. The journalist did not do their homework.
That right there is the evidence upon which I base my assertion that this is a puff piece. This looks and feels in every way like what a journalist might write based on the information given by a PR firm.
Full stop.
As for the rest, you're unfairly misrepresenting what Scott Aaronson has been saying. His statement all along is that D-Wave's marketing exceeds their delivery, and he's afraid that a legitimate field of study - which he's involved with - could suffer backlash as a result. He has repeatedly said, and I believe him on this, that if they can actually deliver then he'll be a huge fan. But as long as they say they have delivered when they haven't, he's going to be a critic.
This seems to me to be a reasonable response to an overhyped implementation of cool technology. There always will be salesmen overselling their products. But for building long-term trust, remember the mantra "under promise and over deliver". D-Wave is doing the opposite.
First, Scott Aaronson never said D-Wave sucked, to my knowledge. I downvoted you for that exhibition of rhetoric; it was pretty immature even by my standards.
Second, compare quantum computing with fusion, which is of about the same level of difficulty. Some physicists thought they had cold fusion, and the response from the community was (rightfully!) to be skeptical. But even if it had worked, no one would whine about how Evil Physicists were Keeping Cold Fusion Down. Instead, it would be understood as the appropriate reaction to very wild claims, and large-scale entanglement is pretty out there.
It would make me think that nuclear energy isn't yet a practical alternative for electrical generation, and anyone trying to sell me a nuclear generating station must be up to something.
There were two issues:
1. Quantum entanglement was not previously shown
2. D-Wave did not give evidence of the ability to perform better than classical computers.
Now D-Wave has shown 1 but 2 remains a valid criticism. I don't understand why anyone should just assume that the D-Wave architecture can be faster --- for a restricted set of tasks --- than a classical computer.
To be honest I never expected that we'd live to see a quantum computer in this decade. Regardless of what D-Wave's machine actually is, the fact remains that we're pretty close to achieving this milestone. How this would change the computer industry, though, is something I can't even start to fathom.
What really interests me, though, as a programmer is whether we'd have to change our tools too when quantum computers become the norm. Would we have to adopt new programming techniques, or would we make do by refining the ones we already use? How would techniques like multithreaded programming evolve on a machine like this?
I don't actually think this will change things as much as you might think. New problems will be efficiently computable, sure; new programming paradigms will be needed, yes; but people have been producing abstractions for a very long time. If you want to know what it will be like to program a QC, look at what it is like to program in CUDA, OpenCL, Prolog, SQL, or any high-level language with a compiler. All of these things do substantial work that the vast majority of users are not aware of. 99.9% of programmers will go out and buy a book called "OpenQC for Dummies" and build up a set of heuristics for proper programming techniques. Meanwhile, a tiny fraction of programmers will actually understand QC, and make these abstractions simpler for the rest of the world.
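To make the abstraction point concrete, here is a toy sketch of the gate-model view most textbooks use: plain Python simulating a two-qubit state vector (no real quantum hardware or QC library assumed; the function names are mine), applying a Hadamard and a CNOT to produce a Bell state:

```python
from math import sqrt

# Two-qubit state as 4 amplitudes, basis order |00>, |01>, |10>, |11>.
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

def apply_hadamard_q0(s):
    """Hadamard on the first qubit: mixes amplitude pairs that differ in that bit."""
    h = 1 / sqrt(2)
    return [h * (s[0] + s[2]), h * (s[1] + s[3]),
            h * (s[0] - s[2]), h * (s[1] - s[3])]

def apply_cnot(s):
    """CNOT, qubit 0 controlling qubit 1: swaps the |10> and |11> amplitudes."""
    return [s[0], s[1], s[3], s[2]]

state = apply_cnot(apply_hadamard_q0(state))
probs = [a * a for a in state]  # measurement probabilities for each basis state
# state is now the Bell state (|00> + |11>)/sqrt(2)
```

A higher-level quantum language would hide all of this linear algebra the same way SQL hides query planning, which is the point above.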
Any pointers to a technical presentation on how D-Wave's (or any other) quantum computing is physically implemented? For example, classical computers use silicon doped with impurities to form gates, and connected with metal traces. Are quantum chips created in a similar manner (using different materials), but operate at extreme cold temperatures, for example? Or is it done using a liquid or gas structure with lasers? So far I've only seen the high-level logical descriptions, but nothing about the physical side.
Forgive me for butting into stuff I do not understand, but you guys are quoting Scott Aaronson the same way most people quote holy books. He could be wrong, you know.
I give him that much credit because if he were wrong, it would be easy to demonstrate: give this computer an instance of a problem, and show it performs much better than the best conventional computer. Either way, whether the bar is met or not, there's only so much obfuscation that can be thrown up. So far it seems the bar has not been met.
Scott Aaronson has agreed that they've reached a new high, which also helps his credibility in my mind. But they have not reached what they claim they've reached. And being able to solve only one highly restricted problem is a long way away from having a quantum computer. Basically, if you look at what D-Wave actually has, it's a great deal less impressive than their marketing would like you to believe.
I used to have links to lectures explaining this, but they seem to have 404'ed.
The general approach is to view quantum computers as a series of quantum gates. Specifically, the CC-Not (Toffoli) gate takes 3 inputs and, if the first 2 are true, toggles the third. This is sufficient to build any classical reversible circuit, including a binary adder.
I would try explaining how these gates might be constructed, but its been so long since I've studied it that I would probably butcher the explanation.
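As a purely classical, truth-table-level sketch (this is ordinary Python, not a quantum simulation, and the helper names are mine) of what the CC-Not gate does, and of how it yields the carry bit of an adder:

```python
def ccnot(a, b, c):
    """CC-Not (Toffoli): toggle the third bit iff the first two bits are both 1."""
    return a, b, c ^ (a & b)

def half_adder(a, b):
    """Half adder from reversible pieces: Toffoli gives the carry (AND),
    and a CNOT-style XOR gives the sum bit."""
    _, _, carry = ccnot(a, b, 0)  # carry = a AND b
    s = a ^ b                     # sum   = a XOR b
    return s, carry
```

Applying ccnot twice returns the original inputs, which is the reversibility that quantum gates require.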