Quantum - The Biggest Development in Computing Since Computers

Jun 10, 2014

Big data is the buzzword for data sets too large and complex to be analyzed efficiently with conventional tools. Our ability to measure and analyze more and more of our surroundings has created opportunities for control and optimization never before imagined, and spawned markets that serve the new need to understand and use these massive data sets. Conventional computers have enabled the information age, and the advancements in chip architecture and software development have been monumental. But all of these advancements may soon be made obsolete by a single chip architecture that does not so much exist physically as probabilistically, is ruined by direct observation, and uses laws of physics we do not yet properly understand. Enter the quantum computer.

Moore’s Law (it’s actually an observation rather than a physical law) holds that the number of transistors in a circuit will double every 18-24 months. The prediction proved fairly accurate from its conception in the ‘60s until about 2005, driving exponential improvement in memory capacity, processing speed, and system reliability. But Moore’s Law is over. Since around 2005, gains have slowed to a comparatively modest 10-15% yearly improvement in processing power, leading to a market where add-ons and graphics improvements drive computer sales rather than raw performance.
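To see how dramatic that slowdown is, here's a quick back-of-envelope comparison of the two growth rates over a decade. The figures are illustrative compounding math using the rates cited above, not measured industry data:

```python
# Compare Moore's Law pace (doubling every ~2 years) with the
# ~10-15% yearly improvement described for the post-2005 era.
# Purely illustrative compounding, not real transistor counts.

def growth(start, years, yearly_factor):
    """Compound a starting value by yearly_factor for `years` years."""
    return start * yearly_factor ** years

# Doubling every 2 years is a yearly factor of sqrt(2) ~ 1.41.
moore = growth(1.0, 10, 2 ** 0.5)   # five doublings in a decade
modest = growth(1.0, 10, 1.15)      # 15% per year in a decade

print(f"Moore's Law pace over 10 years: {moore:.0f}x")   # ~32x
print(f"15%/year pace over 10 years:    {modest:.1f}x")  # ~4x
```

A decade of Moore's Law delivers roughly a 32x improvement; a decade at 15% per year delivers about 4x, which is why the gap feels so stark.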

For the last 9-10 years, improvements in computing capacity have relied on stringing together lots of cheap, commoditized servers and using virtualization and increasingly powerful analytic software (think Hadoop and SQL) to crunch bigger and more complex data sets. This approach is clunky and expensive: the useful life of the average server is about 3 years, and Google owns north of a million servers. That means Google replaces somewhere in the area of 333,000 servers every year, at let’s say $1,000 each, for a total of roughly $333 million/year (and the cost to own and operate a data center is far more than the price of the servers; Google spent $1.02 billion on infrastructure in the last three months of 2013 alone).
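The arithmetic behind that estimate is simple enough to sketch. All the inputs here are the article's rough figures, not real Google numbers:

```python
# Rough annual server-replacement cost using the article's estimates:
# ~1,000,000 servers, a ~3-year useful life, ~$1,000 per server.
# These are illustrative assumptions, not actual Google data.

fleet_size = 1_000_000      # total servers (article's estimate)
useful_life_years = 3       # average server lifespan in years
cost_per_server = 1_000     # dollars per replacement (assumed)

replaced_per_year = fleet_size // useful_life_years
annual_cost = replaced_per_year * cost_per_server

print(f"Servers replaced per year: {replaced_per_year:,}")
print(f"Annual replacement cost:   ${annual_cost:,}")
```

Dividing a million-server fleet by a 3-year lifespan gives about 333,000 replacements a year, hence the ~$333 million figure.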

The development of a quantum computer will change all of this. In your traditional silicon-based chip architecture, the smallest piece of information is called a bit. That bit is either a 1 or a 0, and strings of bits tell the computer what to do. In a quantum computer, the bit is replaced by a qubit (a quantum bit), which through some pretty mind-bending laws of quantum mechanics can be a 1, a 0, anything in between, or more than one state simultaneously. This multi-state phenomenon is called superposition, and it is what makes quantum computers so desirable. Using the standard 1s and 0s of a traditional computer makes it devilishly hard to solve complex optimization problems or factor the products of large primes (the foundation of cyber-security), while quantum computers are phenomenal at these tasks, performing them orders of magnitude faster than any machine to date.
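The bit-versus-qubit distinction can be made concrete with a toy simulation. A classical bit is exactly 0 or 1; a qubit carries a complex amplitude for each, and measuring it yields 0 or 1 with probability equal to the squared magnitude of the corresponding amplitude. This is a simplified sketch of the statistics, not a model of real quantum hardware:

```python
# Toy model of a single qubit as a 2-component state vector.
# Measurement collapses the superposition to a classical 0 or 1
# with probabilities |alpha|^2 and |beta|^2 respectively.

import math
import random

def measure(state, rng=random.random):
    """Collapse a qubit state (alpha, beta) to a classical bit."""
    alpha, _beta = state
    p0 = abs(alpha) ** 2            # probability of observing 0
    return 0 if rng() < p0 else 1

# An equal superposition: amplitudes 1/sqrt(2) on both |0> and |1>.
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))

# A valid state's probabilities must sum to 1.
assert abs(abs(plus[0]) ** 2 + abs(plus[1]) ** 2 - 1) < 1e-9

samples = [measure(plus) for _ in range(10_000)]
print(f"Fraction of 1s observed: {sum(samples) / len(samples):.2f}")
```

Sampling the equal superposition many times gives roughly half 0s and half 1s, while a classical bit in the same register could only ever be one or the other.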

The good news for your mobile banking session is that we, as a species, have yet to build a true quantum computer. A company called D-Wave, based in British Columbia, Canada, is closest and will probably get there first. They have built a machine with a single chip containing 512 qubits (we think they’re qubits, anyway). The really fascinating thing about this machine is that, as advanced as it is (the 10-year-old D-Wave’s computer rivals industry leader Intel’s best work), we don’t really know whether it is a quantum computer or not. In some trials it seems to be, and in others it clearly is not. The tricky thing about qubits is that you can’t look right at them, or they lose their superposition and become mundane 1s and 0s. So you use quantum entanglement (Einstein famously described this phenomenon as “spooky action at a distance”) to lock the behaviors of two quantum particles together regardless of location, and you look at only one of the pair (the apparent paradox of disrupting the entangled system is explained by quantum mechanics beyond the scope of this article). This entanglement is very fragile, requiring that the system be shielded from magnetic and vibrational interference and kept just above absolute zero, inside a futuristic-looking 10-foot black cube.
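The "look at only one of the pair" trick works because entangled measurement outcomes are perfectly correlated. A toy simulation of the statistics of the simplest entangled state (the Bell state, with equal amplitude on |00> and |11>) illustrates the point; this models only the outcome correlations, not real entanglement or quantum hardware:

```python
# Toy sketch of Bell-state correlations: in (|00> + |11>)/sqrt(2),
# the two qubits always measure to the same value, so observing one
# immediately tells you the other's result. Statistics only; this
# does not simulate actual quantum dynamics.

import random

def measure_bell_pair(rng=random.random):
    """Sample a joint measurement of both qubits of a Bell pair."""
    # Outcomes 00 and 11 each occur with probability 1/2;
    # 01 and 10 never occur for this state.
    outcome = 0 if rng() < 0.5 else 1
    return outcome, outcome  # the two qubits always agree

pairs = [measure_bell_pair() for _ in range(10_000)]
print(f"All pairs agree: {all(a == b for a, b in pairs)}")
```

Each individual result is still random, which is why entanglement can't be used to send messages; it only guarantees that the two distant readings match.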

When it’s finished, this single 10-foot computer will replace warehouses full of Google servers, use comparatively little energy, and cost peanuts. It will also break just about all existing cyber-security and solve the most complex optimization problems in minutes rather than weeks (standard computers are unlikely to disappear for quite some time after a commercially viable quantum computer arrives, as they remain competitive on non-optimization problems). Quantum computers are the holy grail of the high-tech industry, and though we’re not there yet, we’re probably not all that far off.

Category : Big Data, Digital Media
