QUANTA

Wednesday, September 28, 2011

Is Quantum Computing real?
 
Researchers have been working on quantum systems for more than a decade, in the hopes of developing super-tiny, super-powerful computers. And while there is still plenty of excitement surrounding quantum computing, significant roadblocks are causing some to question whether quantum computing will ever make it out of the lab.


First, what is quantum computing? One simple definition is that quantum computers use qubits (or quantum bits) to encode information. However, unlike silicon-based computers that use bits which are zeroes or ones, qubits can exist in multiple states simultaneously. In other words, a qubit is a bit of information that has not yet decided whether it wants to be a zero or a one.

In theory, that means that quantum systems can produce simultaneous processing of calculations; in essence, true parallel systems.
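
To make the idea concrete, here is a minimal sketch (in Python, using the standard state-vector picture rather than anything from the article) of a single qubit in an equal superposition; the particular amplitudes and the use of NumPy are illustrative assumptions only.

import numpy as np

# A qubit state is a length-2 complex vector (a, b) meaning a|0> + b|1>, with |a|^2 + |b|^2 = 1.
# Here: an equal superposition, the "hasn't decided yet" state described above.
state = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Measurement collapses the qubit to 0 or 1; the Born rule gives the probabilities.
probabilities = np.abs(state) ** 2
outcome = np.random.choice([0, 1], p=probabilities)

print(probabilities)  # [0.5 0.5]
print(outcome)        # 0 or 1, chosen at random at measurement time

A classical bit would be plainly 0 or 1 at every step; the superposition above is what makes the parallelism claim possible, at least in principle.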

Olivier Pfister, professor of experimental atomic, molecular and optical physics at the University of Virginia, says quantum algorithms could deliver exponential advances in compute speed, which would be useful for database searching, pattern recognition, solving complex mathematical problems and cracking encryption protocols.

"But the roadblocks to complete success are numerous," Pfister adds. The first is scalability - how do you build systems with large numbers of qubits. The second is even more vexing - how do you overcome "decoherence," the random changes in quantum states that occur when qubits interact with the environment.

The first roadblock is an obvious one: quantum systems are microscopic, so the challenge is to gain exquisite levels of control at the atomic scale, over thousands of atoms. To date, that level of control has been achieved only for systems on the order of 10 atoms.

"My work with optical fields has demonstrated good preliminary control over 60 qubit equivalents, which we call 'Qmodes' and has the potential to scale to thousands of Qmodes," Pfister says. "Each Qmode is a distinctly specified color of the electromagnetic field, but to develop a quantum computer, nearly hundreds to thousands of Qmodes are required."

Decoherence is an even more vexing problem. "All the algorithms or patents in the world are not going to produce a quantum computer until we learn how to control decoherence," says Professor Philip Stamp, director of the Pacific Institute of Theoretical Physics in the Department of Physics and Astronomy at the University of British Columbia.

In the early days of quantum research, computer scientists used classical error correction methods to try to mitigate the effects of decoherence, but Stamp says those methods are turning out to be not applicable to the quantum world. "The strong claims for error correction as a panacea to deal with decoherence need to be re-evaluated."
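
For contrast, the classical error correction methods Stamp refers to are, in their simplest form, schemes like the repetition code sketched below (a hedged Python illustration, not any of the quantum schemes under debate): copy a bit several times and take a majority vote. Qubits cannot simply be copied this way, which is part of why the classical toolbox does not transfer directly.

import random

def encode(bit):
    # Classical repetition code: send three copies of the bit.
    return [bit, bit, bit]

def noisy_channel(bits, flip_prob=0.1):
    # Flip each copy independently with some probability (toy noise model).
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits):
    # Majority vote recovers the bit unless two or more copies were flipped.
    return 1 if sum(bits) >= 2 else 0

sent = 1
print(decode(noisy_channel(encode(sent))))  # usually prints 1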

According to Stamp, there are many experiments going on around the world in which researchers are claiming that they have built quantum information processing devices, but many of these claims dissolve when the hard questions about decoherence for multi-qubit systems are asked.

So far, the most sophisticated quantum computations have been performed in 'ion trap' systems, with up to eight entangled qubits. But physicists believe that the long-term future of this field lies with solid-state computations; that is, in processors made from solid-state electronics (all-electronic devices that look and feel more like regular microprocessors), as opposed to atomic particles. Until recently this was not practical because solid-state qubits lasted only about a nanosecond; now they can last a microsecond (a thousand times longer), which is enough to run simple algorithms.

Quantum controversy
The most recent results showing very low decoherence for magnetic molecule qubits were published in Nature by a team of researchers from the Vancouver-based company D-Wave Systems. D-Wave has demonstrated a technique called quantum annealing, which could provide the computational model for a quantum processor.

Suzanne Gildert, an experimental physicist and quantum computer programmer with a PhD from the University of Birmingham who now works at D-Wave Systems, says that with quantum annealing, decoherence is not a problem.

According to Gildert, D-Wave uses Natural Quantum Computing (NQC) to build its quantum computers, which is very different from the traditionally proposed schemes. "Some quantum computing schemes try to take ideas from regular computing — such as logic operations — and make 'quantum' versions of them, which is extremely difficult. Making 'quantum' versions of computing operations is a very delicate process. It's like trying to keep a pencil standing on its end by placing it on a block of wood, and then moving the block around to try to balance the whole thing. It's almost impossible. You have to constantly work hard to keep the pencil (i.e., the qubits) in the upright state. Decoherence is what happens when the pencil falls over," Gildert says.

"In our NQC approach, which is more scalable and robust, we let the pencil lie flat on the wood instead, and then move it around. We're computing by allowing the pencil to roll however it wants to, rather than asking it to stay in an unusual state. So we don't have this same problem of bits of information 'decohering' because the state we are trying to put the system into is what nature wants it to be in (that's why we call it Natural QC)."

But Jim Tully, vice president and chief of research, semiconductors and electronics at Gartner Research, says that what D-Wave is doing is not really quantum computing.

Tully says, "A sub-class of quantum computing has been demonstrated by D-Wave Systems that is referred to as quantum annealing, which involves superposition, but does not involve entanglement and is not; therefore, 'true' quantum computing. Quantum annealing is potentially useful for optimization purposes, specifically for the purposes of finding a mathematical minimum in a dataset very quickly."

There may be some dispute over whether D-Wave's approach is pure quantum computing, but Lockheed Martin is a believer. Lockheed Martin owns a quantum computing system called the D-Wave One, a 128-qubit processor and its surrounding system (cooling apparatus, shielded rooms, etc.). Lockheed is working on a problem known as verification and validation, developing tools that can help predict how a complex system will behave; for example, detecting bugs that might cause equipment to behave in a faulty way.

Keith Mordoff, director of communications for Information Systems & Global Solutions at Lockheed Martin, says, "Yes, we have a fully functioning quantum computer with 56 qubits, which is different from the classical methods. D-Wave uses an adiabatic or quantum annealing approach, which defines a complex system whose ground state (lowest energy state) represents the solution to the problem posed. It constructs a simple system and initializes it in its ground state (relatively straightforward for simple systems), then changes the simple system slowly until it becomes the complex system. As the system evolves, it remains in the ground state; the state of the final system is then measured, and this is the answer to the problem posed. The change from simple system to complex system is induced by turning on a background magnetic field."
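
In textbook notation, the procedure Mordoff outlines is usually written as a slow interpolation between two Hamiltonians (the symbols below are standard conventions, not taken from the article):

H(s) = (1 - s)\, H_{\text{simple}} + s\, H_{\text{problem}}, \qquad s: 0 \rightarrow 1 \text{ slowly},

where the machine is prepared in the easily found ground state of H_simple and, provided the sweep is slow enough (the adiabatic theorem), it ends in the ground state of H_problem, which encodes the answer to the posed problem.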

Future shock
Some scientists are extremely skeptical about quantum computing and doubt that it will ever amount to anything tangible.

Artur Ekert, professor of Quantum Physics, Mathematical Institute at the University of Oxford, says physicists today can only control a handful of quantum bits, which is adequate for quantum communication and quantum cryptography, but nothing more. He notes that it will take a few more domesticated qubits to produce quantum repeaters and quantum memories, and even more to protect and correct quantum data.

"Add still a few more qubits, and we should be able to run quantum simulations of some quantum phenomena and so forth. But when this process arrives to 'a practical quantum computer' is very much a question of defining what 'a practical quantum computer' really is. The best outcome of our research in this field would be to discover that we cannot build a quantum computer for some very fundamental reason, then maybe we would learn something new and something profound about the laws of nature," Ekert says.

Gildert adds that the key area for quantum computing will be machine learning, which is strongly linked to the field of artificial intelligence (AI). This discipline is about constructing software programs that can learn from experience, as opposed to current software, which is static.

"This is radically different from how we use computing for most tasks today," Gildert says. "The reason that learning software is not ubiquitous is that there are some very difficult and core mathematical problems known as optimization problems under the hood when you look closely at machine learning software. D-Wave is building a hardware engine that is designed to tackle those hard problems, opening the door to an entirely new way of programming and creating useful pieces of code."

According to Gildert, one very important real-world application is in the field of medical diagnosis. It's possible to write a program that applies hand-coded rules to X-ray or MRI images to try to detect whether there is a tumor in the image, but such software can only perform as well as the expert doctors' knowledge about what to look for in those images. With learning software, the program is shown examples of X-rays or MRI scans with and without tumors, and it learns the differences itself without having to be told. The computer can even detect anomalies that a doctor might not see or notice, and the more examples you show it, the better it gets at the task.
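
A minimal sketch of the "learning from examples" contrast Gildert draws might look like the nearest-centroid toy below; the two-number "features" are invented stand-ins for whatever a real system would extract from an image, and this is plain classical code, not D-Wave's software.

def centroid(examples):
    # Average feature vector of a set of labeled example scans.
    n = len(examples)
    return [sum(v[i] for v in examples) / n for i in range(len(examples[0]))]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

tumor_examples   = [[0.9, 0.7], [0.8, 0.9], [0.95, 0.6]]   # hypothetical feature vectors
healthy_examples = [[0.1, 0.2], [0.2, 0.1], [0.15, 0.3]]

tumor_center = centroid(tumor_examples)
healthy_center = centroid(healthy_examples)

new_scan = [0.85, 0.75]   # a new, unlabeled scan
label = "tumor" if distance(new_scan, tumor_center) < distance(new_scan, healthy_center) else "no tumor"
print(label)

No rule about tumors is ever written down here; the decision boundary comes entirely from the labeled examples, which is the property Gildert is pointing to.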

"It is unlikely that QCs will replace desktop machines any time soon," Gildert says. "In terms of years, it depends on the effort invested, available funding, and the people working on the problem. The logical assumption is that these machines will be cloud-based co-processors for existing data centers used by companies that have very difficult problems to solve. Quantum systems are very good at solving a specific class of hard problems in the fields of AI and machine learning, so we are concentrating on building tools that help introduce the potentials of quantum computing to the people who work in these areas."

Addison Snell, CEO of Intersect360 Research, an analyst firm specializing in high performance computing, says, "Quantum computing is still of interest primarily among government and defense research labs. And, while the principles of quantum computing have been described for years, it is a wholly new paradigm, and the number of applications it will work for, even theoretically at this point, is small. However, some of these applications could be relevant to national security, so a high degree of interest remains."

"Quantum computing is certainly 'on the radar' of IBM, HP, and other supercomputing vendors, but it is difficult to say how many engineers they have working on this technology. At this point, it is uncertain whether quantum computing will ever have any role beyond a small handful of boutique supercomputing installations; but if or when it does, it is not likely we'll see commercially available working systems within the next five years."

"That depends what you mean by working systems," Stamp adds. "If you believe D-Wave, we already have one system commercially available now. I think that for a genuine quantum computer, we may be talking about 10 years for something that a very big company can buy and 25 to 30 years for the ordinary consumer."

"I'd put quantum computing, even if it proves competitive and valid, 20 years out because of the very complex infrastructure that has to go with it," says Michael Peterson, analyst and CEO at Strategic Research Corporation. "Developing a new technology like this requires 'breaking the laws of physics' more than once.' However, we did it with disk technology many times over during the past 25 years, and we'll do it many times more."

Mordoff adds that there are other commercial companies evaluating quantum computers, but no one is actually 'using' them, thus far, except Lockheed and D-Wave, of course. "Whether we want this or not, we have to eventually venture into a quantum domain," Ekert says.

"Some researchers believe that general purpose quantum computers will never be developed. Instead, they will be dedicated to a narrow class of use such as the optimization engine of D-Wave Systems. This suggests architectures where traditional computers offload specific calculations to dedicated quantum acceleration engines. It's still likely to be around 10 years before the acceleration engine approach is ready for general adoption by the classes of user that can make use of them; however, they will likely be attractive offerings for cloud service providers, adds Tully."

Read more: http://goo.gl/Ai04A

