As someone who has read PRL extensively, as well as having published articles in it, I can tell you that an idea published in Physical Review Letters is at least 20 years from practical application. So while I'm excited to see the field advance, I'm not going to make any investment decisions today based on what has been recently published in that august journal. YMMV.

There is a huge difference between hardware and software/algorithms. Hardware does take about twenty years to go from state of the art to state of the practice; in other words, from demonstrated in a lab to dominating manufacture in its category. (Measuring that is tricky. For example, the best way to measure market penetration for the McCormick reaper is in the tons of grain reaped by hand versus by machine.)

Anyway, there are very few algorithms that 1) require quantum computers, and 2) can deliver an exponential speedup over current computers. Shor's algorithm for factoring numbers is one; this is another. However, Shor's algorithm will be best known for what it kills, such as the RSA algorithm for public-key cryptosystems. This algorithm will produce a large demand for quantum computers.

How soon will quantum computers reach the general server market? Good question. D-Wave (http://www.dwavesys.com/en/dw_homepage.html) is selling a quantum computer today, but it is pretty limited.* (They have shipped two so far.) My guess is that practical QCs for running the Clader, Jacobs, and Sprouse algorithm will start to hit the market sometime in the next few years.

There has been a lot more than twenty years of research on quantum computers. In fact, Shor's algorithm is about that old. However, Shor's algorithm, at least for real applications, requires a machine with thousands of qubits, which may not happen for a decade or more. (On the order of six qubits for each bit in the number you want factored. The first demonstration of the algorithm factored 15 into 3 and 5.)

We may never get beyond special-purpose quantum computers. But who cares? Any problem worth throwing a QC at is already consuming hundreds to millions of CPU cores today. Tailoring a QC to match one problem beyond today's non-QC state of the art won't be all that painful. Current digital design tools can do that job just as easily as programming the resulting computer.

* The current D-Wave machine requires cooling to almost absolute zero, and is a box the size of a small room. The first version has 128 qubits, but they are not "general purpose" qubits: they have to be related in specific ways. That's a programming problem, right? ;-) The D-Wave 2 will have 512 qubits, but the same footprint, and the same limitation to entangling two separate sets of qubits.
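For the curious: the "factor 15 into 3 and 5" demonstration has a small classical core that is easy to sketch. The quantum hardware in Shor's algorithm is only needed for the period-finding step; everything around it is ordinary number theory. Here is a minimal illustration in Python (the function name and the brute-force period search are mine, purely for illustration; a real quantum computer replaces that loop with the exponentially faster quantum subroutine):

```python
from math import gcd

def shor_classical_core(N, a):
    """Factor N given a coprime base a, using the classical skeleton of
    Shor's algorithm. The period search below is brute force; on a quantum
    computer this one step is done exponentially faster."""
    # Find the period r: the smallest r > 0 with a^r = 1 (mod N).
    r = 1
    while pow(a, r, N) != 1:
        r += 1
    if r % 2 != 0:
        return None  # odd period: retry with a different base a
    # gcd(a^(r/2) - 1, N) and gcd(a^(r/2) + 1, N) give the factors.
    return gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N)

print(shor_classical_core(15, 7))  # the textbook demo: (3, 5)

# The rule of thumb above (~6 qubits per bit of the number to factor)
# puts a 2048-bit RSA modulus at roughly this many qubits:
print(6 * 2048)  # 12288
```

That 12,288 figure is why "thousands of qubits, a decade or more away" is the right framing for breaking real RSA keys, even though factoring 15 fits on today's hardware.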