Back in 1946, the world's first general purpose electronic computer was switched on at the University of Pennsylvania. The huge processing power of ENIAC (Electronic Numerical Integrator And Computer) stunned the world, or at least the few dozen people who had any idea what it was for and why it was important.
But ENIAC had an important flaw. It could only be programmed by resetting a myriad of switches and dials, a task that could take weeks. And this seriously hindered the computer's flexibility.
The solution was not hard to find. It had already been outlined by Alan Turing, John Von Neumann and others: have a unit for number crunching and a separate electronic memory that could store instructions and data. That design meant any reprogramming could be done relatively quickly, easily and electronically.
Today, almost all modern computers use this design, now known as the Von Neumann architecture.
The exception is the quantum computer. These devices use the strange properties of the quantum world to perform huge numbers of calculations in parallel. Consequently they have the potential to vastly outperform conventional number crunchers.
Unfortunately, physicists have only a vague and fleeting power over the quantum world, and this has denied them the luxury of designing a Von Neumann-type quantum computer.
Until now. Today, Matteo Mariantoni at the University of California, Santa Barbara, and pals reveal the first quantum computer with an information processing unit and a separate random access memory.
Their machine is a superconducting device that stores quantum bits or qubits as counter-rotating currents in a circuit (this allows the qubit to be both a 0 and 1 at the same time). These qubits are manipulated using superconducting quantum logic gates, transferred using a quantum bus and stored in separate microwave resonators.
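The idea that a qubit can be both 0 and 1 at once can be sketched in a few lines of code. This is a purely illustrative simulation, not the team's method: the actual device encodes qubits physically as counter-rotating currents, while here a qubit is just a pair of complex amplitudes and a gate is a linear map on them.

```python
import math

# A qubit is a pair of complex amplitudes (alpha, beta) for the basis
# states |0> and |1>. Start in the definite state |0>.
alpha, beta = 1 + 0j, 0 + 0j

# A Hadamard gate maps |0> to an equal superposition of |0> and |1>,
# the software analogue of the "both 0 and 1 at the same time" property.
s = 1 / math.sqrt(2)
alpha, beta = s * (alpha + beta), s * (alpha - beta)

# On measurement, each outcome occurs with probability equal to the
# squared magnitude of its amplitude.
p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```

After the gate, a measurement yields 0 or 1 with equal probability, which is what lets a register of such qubits explore many values in parallel.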
Quantum computing hype
The computational power and information density of classical computers is limited by the uncertainty principle in the same way as that of quantum computers. For consumer electronics operating at room temperature, it therefore makes no difference whether you shrink the number of atoms in classical transistors to the physical limit for the sake of information density, or increase the redundancy of qubits for the sake of sufficient reliability - the effectiveness of both devices will converge to the same value.
Of course, quantum computers running at (near) zero temperature would outperform classical computers running at room temperature by a wide margin - but classical computers would also run far better at those low temperatures.