Right now there are petaFLOPS constantly being spent solving systems of linear equations. Actually, millions of CPU cores is a better measure--the limit on performance is access to large memories. Modern CPU cores spend a lot of time waiting for memory access when doing linear algebra (LA).

An improved variant of an earlier quantum computing algorithm for solving LA systems was just published: http://prl.aps.org/abstract/PRL/v110/i25/e250504 When practical quantum computers come along, this means there will be a large demand for such systems. (I'm assuming that "practical" here means able to solve LA problems much faster and cheaper than current servers and supercomputers.)

What is the answer? I have no idea. Just remember this when someone comes along with a practical quantum computer. Note that history says it is often not the first company to market with a product that hauls in the big bucks. But history also says that if you buy a few hundred shares of each of the first dozen QC vendors, your biggest regret will be selling some too soon.

Yes, I know that Bernard Baruch famously said: "I made my money by selling too soon." What he really did was reduce his investments in a stock or commodity to take profits--what today is called re-balancing his portfolio.* Translation: if you buy a stock for $10/share and it goes up to $80, selling 25% locks in a sure profit. Will you have some regret if the stock keeps going up, or drops to the basement? Sure. But either is a nice position to be in, compared to having sold all or held all, respectively.

Or look at your whole portfolio. If you buy ten QC stocks and half go bankrupt, a few double, and one goes up 10,000%, you can forget the losers.

* He also got rid of mistakes fast, rather than holding on waiting for a recovery.
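For scale, the classical workhorse being displaced is a dense linear solve: O(n^3) arithmetic and O(n^2) memory traffic, which is exactly why those CPU cores sit waiting on memory. A minimal pure-Python sketch of the classical baseline (illustrative only--real workloads use tuned BLAS/LAPACK libraries):

```python
# Classical Gaussian elimination for A x = b: O(n^3) work, O(n^2) memory.
# This cubic cost on large systems is what quantum LA algorithms promise
# (with significant caveats) to beat.

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    # Work on copies so the caller's data is untouched.
    A = [row[:] for row in A]
    b = b[:]
    for col in range(n):
        # Pivot: pick the row with the largest entry in this column.
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        # Eliminate the entries below the pivot.
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back-substitution.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(A[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (b[r] - s) / A[r][r]
    return x

x = solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0])  # x is approximately [0.8, 1.4]
```

The quantum algorithms attack the n-dependence of this loop nest; the catch (glossed over in most coverage) is that reading out the full solution vector still takes linear time.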
Um, as someone who has read PRL extensively, as well as having published articles in it, I can tell you that an idea published in Physical Review Letters is at least 20 years from practical application. So while I'm excited to see the field advance, I'm not going to make any investment decisions today based on what has been recently published in that august journal. YMMV.

Rgds,
HH/Sean
As someone who has read PRL extensively as well as having published articles in it, I can tell you that an idea published in Physical Review Letters is at least 20 years from practical application. So while I'm excited to see the field advance I'm not going to make any investment decisions today based on what has been recently published in that august journal. YMMV.

There is a huge difference between hardware and software/algorithms. Hardware does take about twenty years to go from state of the art to state of the practice--in other words, from demonstrated in a lab to dominating manufacture in its category. (Measuring that is tricky. For example, the best way to measure market penetration for the McCormick reaper is in the tons of grain reaped by hand versus by machine.)

Anyway, there are very few algorithms that 1) require quantum computers, and 2) can deliver an exponential speedup over current computers. Shor's algorithm for factoring numbers is one; this is another. However, Shor's algorithm will be best known for what it kills, such as the RSA public-key cryptosystem. This algorithm, by contrast, will produce a large demand for quantum computers.

How soon will quantum computers reach the general server market? Good question. D-Wave (http://www.dwavesys.com/en/dw_homepage.html) is selling a quantum computer today, but it is pretty limited.* (They have shipped two so far.) My guess is that practical QCs for running the Clader, Jacobs, and Sprouse algorithm will start to hit the market sometime in the next few years. There has been a lot more than twenty years of research on quantum computers; in fact, Shor's algorithm is about that old. However, Shor's algorithm, at least for real applications, requires a machine with thousands of qubits, which may not happen for a decade or more. (On the order of six qubits for each bit in the number you want factored.
The first demonstration of the algorithm factored 15 into 3 and 5.)

We may never get beyond special-purpose quantum computers. But who cares? Any problem worth throwing a QC at is constantly using hundreds to millions of CPU cores today. Tailoring a QC to match one problem beyond today's non-QC state of the art won't be all that painful. Current digital design tools can do that job just as easily as programming the resulting computer.

* The current D-Wave machine requires cooling to almost absolute zero, and is a box the size of a small room. The first version has 128 qubits, but they are not "general purpose" qubits--they have to be related in specific ways. That's a programming problem, right? ;-) The D-Wave 2 will have 512 qubits, but the same footprint, and the same limitation to entangling two separate sets of qubits.
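For the curious: in Shor's algorithm, only the period-finding step is quantum; the reduction from factoring to period finding is classical number theory. A brute-force classical sketch of that reduction for the textbook N = 15 demo (the period search here is exponential--that is the part a quantum computer does in polynomial time):

```python
from math import gcd

def classical_shor(N, a):
    """Factor N via the period of a^x mod N.

    The period search below is brute force; on a quantum computer it is
    the step done in polynomial time. Assumes gcd(a, N) == 1.
    """
    # Find the multiplicative order r of a modulo N:
    # the smallest r >= 1 with a^r == 1 (mod N).
    r, y = 1, a % N
    while y != 1:
        y = (y * a) % N
        r += 1
    if r % 2:
        return None          # odd period: try a different a
    half = pow(a, r // 2, N)
    if half == N - 1:
        return None          # trivial square root: try a different a
    # a^(r/2) is a nontrivial square root of 1 mod N,
    # so gcd(a^(r/2) +/- 1, N) yields the factors.
    return gcd(half - 1, N), gcd(half + 1, N)

print(classical_shor(15, 7))  # -> (3, 5)
```

With a = 7, the powers mod 15 cycle 7, 4, 13, 1, so r = 4 and 7^2 mod 15 = 4 gives gcd(3, 15) = 3 and gcd(5, 15) = 5. The rough six-qubits-per-bit figure quoted above is why a 2048-bit RSA key needs a machine with thousands of good qubits.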
This is all way beyond me. However, I do recall the days when the first generation of CDC's 3600 (building-sized) was imagined, and the thought was that only one would ever be used and needed. And then when the first "supercomputer" was imagined, only one was ever expected to be needed... well, you get the picture. Yes, I am officially an old guy.
Yes, I am an old guy too. I find I can imagine enough of what nanotechnology will do to know that it will change more than all the technology developed since I was born, put together. How much more? I don't know.