We say ‘data is everything’ so often in the tech world it might actually replace ‘hello’ at some point, but the phrase doesn’t tell the whole story. Data isn’t worth much unless it can be interpreted correctly and, when it comes to competition, how quickly it’s interpreted can make all the difference. This is especially true in the FinTech space: imagine the power of not only accurately predicting where the economy is heading into 2021, but doing it months before anyone else. What’s needed is some science-fiction-level supercomputing and, after decades of mostly theoretical research, the era of Quantum Computing may soon be upon us.
According to Schrödinger, you may well be both planning for quantum computing and also not planning for quantum computing simultaneously… or something like that.
Quantum physics is strange. Literally - that’s a word physicists use. And that strange world is being domesticated for the new era of computing. Quantum Computers are developing so rapidly they’re making Moore’s Law look redundant. Whether or not you understand or care about what Quantum Computing (QC) is, all signs point to it being the future. Or maybe it’s already here? Again, strange.
To give a brief QC 101: QC supersedes standard computing by replacing the familiar 10010011 binary ‘bits’ with ‘qubits’, or ‘quantum bits’. These mysterious qubits are about as easy to understand as Interstellar, but in essence they can act as both 1 and 0 at the same time (superposition) and, through a process called entanglement, become correlated with one another so that measuring one instantly tells you about the other, however far apart they are. To sum up at quantum speed: for certain problems, this lets a quantum machine explore exponentially many possibilities at once. It’s math in hyperdrive.
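Superposition and entanglement sound mystical, but the bookkeeping behind them is just complex-number math. Here is a minimal, purely illustrative Python sketch (no quantum hardware and no real quantum SDK involved) that simulates two qubits as a list of four amplitudes, puts one qubit into superposition with a Hadamard gate, and entangles the pair so their measurements always agree:

```python
import math
import random

# Toy state-vector simulator: a 2-qubit state is 4 complex amplitudes,
# indexed by the basis states |00>, |01>, |10>, |11>.
# (Illustrative only: real quantum hardware does not store these numbers.)

def hadamard_on_qubit0(state):
    """Put qubit 0 into superposition: |0> -> (|0> + |1>) / sqrt(2)."""
    s = 1 / math.sqrt(2)
    a00, a01, a10, a11 = state
    # Hadamard mixes each pair of amplitudes that differ only in qubit 0
    return [s * (a00 + a10), s * (a01 + a11),
            s * (a00 - a10), s * (a01 - a11)]

def cnot(state):
    """Entangle: flip qubit 1 wherever qubit 0 is 1 (swap |10> and |11>)."""
    a00, a01, a10, a11 = state
    return [a00, a01, a11, a10]

def measure(state):
    """Sample one basis state with probability |amplitude|^2."""
    r, total = random.random(), 0.0
    for i, amp in enumerate(state):
        total += abs(amp) ** 2
        if r < total:
            return format(i, "02b")
    return "11"

# Start in |00>, apply H then CNOT -> the Bell state (|00> + |11>) / sqrt(2)
bell = cnot(hadamard_on_qubit0([1, 0, 0, 0]))
samples = {measure(bell) for _ in range(1000)}
print(samples)
```

The punchline is in the samples: every measurement comes back ‘00’ or ‘11’, never ‘01’ or ‘10’. Each individual outcome is random, yet the two qubits always agree, which is the correlation the article’s “work in synchrony” hand-waving is gesturing at.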
A few of the big names in tech have already been making quantum leaps with the technology. Last year Google, dispelling no illusions about its quest for universal domination, announced something called Quantum Supremacy: an experiment in which Google engineers used a quantum processor to perform a calculation they estimated would take a standard supercomputer 10,000 years. It took Google 200 seconds. Ten thousand years. Two hundred seconds. ‘Nuff said.
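For a sense of scale, those two figures work out to roughly a billion-fold speedup. A quick back-of-the-envelope check, using only the numbers quoted above:

```python
# Rough arithmetic on Google's quantum supremacy figures:
# 10,000 years of classical compute vs 200 seconds of quantum compute.
SECONDS_PER_YEAR = 365.25 * 24 * 3600            # ~31.6 million seconds

classical_estimate = 10_000 * SECONDS_PER_YEAR   # ~3.2e11 seconds
quantum_runtime = 200                            # seconds

speedup = classical_estimate / quantum_runtime
print(f"~{speedup:.1e}x faster")                 # on the order of 1.6 billion
```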
But what does this mean for those of us without Google’s near-mythical resources? In this time of great upheaval and disruption (in the traditional, non-startup sense), innovation might make all the difference in navigating the post-pandemic rebuild. And given the speed at which QC is developing, those ‘long-term’ strategies we’ve drawn up might soon look as oversimplified as that high-school drawing of an atom with electrons whizzing around the nucleus.
Even if the idea of spending any of that squeezed IT budget on new, mostly unusable gadgets is laughable right now, there are options out there for laying the groundwork for the future. Could you invest in training for your developer team? IBM already runs a summer school and internship program around Qiskit, its open-source quantum SDK. Google offers Cirq, its own quantum programming framework (with TensorFlow Quantum building on top of it), complete with tutorials aimed at non-physicist developers. These nascent quantum developers embedded in your IT team (who could use a little entanglement to work on all those projects simultaneously, right?) could then begin building the infrastructure for when QC really kicks off, leaving the organization ready for decades of its own version of supremacy. Indeed, there’s already a SaaS vendor out there for quantum planning: Zapata’s Orquestra platform aims to be a near one-stop shop for developing quantum algorithms and setting up automated quantum workflows.
It’s tough to look ahead at quantum solutions when most of us are still figuring out how AI might help the organization. And given that we still haven’t really figured out how to keep QCs cold enough to operate accurately (if you think your laptop fan struggles now, wait till it has to keep its core near absolute zero), it might just be the Googles of the tech world that rule supreme for now. But a little training investment for those developers could be the edge that counts when quantum supremacy gives way to quantum computing by default.