Much hype surrounds quantum processing. This is perhaps unsurprising given that it could create computing systems thousands (or millions, depending on the study) of times more powerful than current classical computing frameworks.
The power locked within quantum mechanics has been recognized by scientists for decades, but it is only in recent years that its conceptual potential has jumped the theoretical boundary and started to take form in the real world.
Since that leap, the “quantum race” has begun in earnest, with China, Russia, Germany and the U.S. out in front. Technology heavyweights such as IBM, Microsoft and Google are breaking new quantum ground each month, striving to move these processing capabilities from the laboratory into the commercial sphere.
But before getting swept up in this quantum rush, let’s look at the mechanics of this processing potential.
Classical computers are built upon a binary framework of “bits” (binary digits) of information that can exist in one of two definite states — zero or one, or “on or off.” Such systems process information in a linear, sequential fashion, much as a person works through a problem one step at a time.
In a quantum computer, bits are replaced by “qubits” (quantum bits), which can exist not only as zero or one but as a blend of both at once (referred to as quantum superposition). This means they can store much more complex data. If a bit can be thought of as a single note that starts and finishes, then a qubit is the sound of a huge orchestra playing continuously.
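To make the superposition idea a little more concrete, the toy sketch below (purely illustrative, using a plain NumPy state vector rather than any real quantum hardware or toolkit) represents a qubit as two amplitudes and shows that measurement only ever returns a definite zero or one, with probabilities set by those amplitudes.

```python
import numpy as np

# A qubit is described by two complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. A classical bit is restricted to exactly
# (1, 0) or (0, 1); a qubit can sit anywhere in between.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)   # equal superposition of 0 and 1
state = np.array([alpha, beta], dtype=complex)

# Measurement collapses the superposition: the result is 0 with
# probability |alpha|^2 and 1 with probability |beta|^2.
probabilities = np.abs(state) ** 2
outcome = np.random.choice([0, 1], p=probabilities)
print(probabilities, outcome)   # e.g. [0.5 0.5] and either 0 or 1
```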
What this state enables — largely in theory, but increasingly in practice — is the ability to process information at an exponentially faster rate. This is based on the interaction between the qubits. “Quantum entanglement” means that rather than operating as individual pieces of information, all the qubits within the system operate as a single entity.
From a computational perspective, this creates an environment where multiple computations encompassing exceptional amounts of data can be performed virtually simultaneously. Further, this beehive-like state of collective activity means that when new information is introduced, its impact is instantly transferred to all qubits within the system.
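Entanglement is easier to picture with a similar toy sketch (again illustrative only, assuming the same simple state-vector representation): the two-qubit “Bell state” below has no meaningful qubit-by-qubit description, yet its measurement outcomes are perfectly correlated.

```python
import numpy as np

# Joint state of two qubits as a vector of four amplitudes, ordered
# |00>, |01>, |10>, |11>. The Bell state (|00> + |11>) / sqrt(2)
# cannot be split into two separate one-qubit descriptions.
bell = np.zeros(4, dtype=complex)
bell[0b00] = 1 / np.sqrt(2)   # amplitude of |00>
bell[0b11] = 1 / np.sqrt(2)   # amplitude of |11>

# Sample joint measurement outcomes from the four possibilities.
probs = np.abs(bell) ** 2
rng = np.random.default_rng()
samples = rng.choice(4, size=10, p=probs)

# Every draw is 00 or 11: individually each qubit looks like a fair
# coin flip, yet the two always agree, because only the joint state
# of the system is defined.
print([format(int(s), "02b") for s in samples])
```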
To deliver the levels of interaction necessary to capitalize on quantum power requires a system with multiple qubits. And this is the big challenge. Quantum information is incredibly brittle. Building a machine that can contain and maintain these highly complex quantum states, with controls precise enough to support analysis at a commercially viable level, is a colossal task.
In March, IBM announced IBM Q — part of its ongoing efforts to create a commercially available universal quantum computing system. This included two different processors: a 16-qubit processor to allow developers and programmers to run quantum algorithms; and a 17-qubit commercial processor prototype — its most powerful quantum unit to date.
At the launch, Arvind Krishna, senior vice president and director of IBM Research and Hybrid Cloud, said: “The significant engineering improvements announced today will allow IBM to scale future processors to include 50 or more qubits, and demonstrate computational capabilities beyond today’s classical computing systems.”
IBM also devised a new metric for measuring key aspects of quantum systems, called “Quantum Volume,” which covers qubit quality, potential system error rates and levels of circuit connectivity.
According to Matthew Griffin, CEO of innovation consultants the 311 Institute, a major challenge is the simple fact that when building such systems, few components are available off-the-shelf or are anywhere near maturity.
“From compute to memory to networking and data storage,” he says, “companies are having to engineer a completely new technology stack. For example, using these new platforms, companies will be able to process huge volumes of information at near instantaneous speeds, but even today’s best and fastest networking and storage technologies will struggle to keep up with the workloads.”
In response, he adds that firms are looking at “building out DNA and atomic scale storage platforms that can scale to any size almost instantaneously,” with Microsoft aiming to have an operational system by 2020.
“Other challenges include the operating temperature of the platforms,” Griffin continues. “Today, these must be kept as close to absolute zero (minus 273.15 degrees Celsius) as possible to maintain a high degree of processing accuracy. One day, it’s hoped that these platforms will be able to operate at, or near, room temperature. And then there’s the ‘fitness’ of the software stack — after all, very few, if any, software stacks today can handle anything like the demands that quantum computing will put onto them.”
One area where quantum computing has major potential is in optimization challenges. These involve the ability to analyze immense data sets to establish the best possible solutions to achieve a particular outcome.
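As a hypothetical illustration of what such an optimization challenge looks like in miniature (the risks, premiums and budget below are invented for the example), the brute-force search in this sketch examines every possible combination; the number of candidate combinations doubles with each item added, which is exactly the kind of growth that makes large instances intractable for classical machines.

```python
from itertools import combinations

# Toy optimization problem: choose the subset of risks to cede that
# maximizes expected loss transferred without exceeding a premium
# budget. Brute force checks all 2^n subsets, so the search space
# doubles with every additional risk.
risks = {"A": (120, 30), "B": (90, 25), "C": (200, 60), "D": (150, 40)}
budget = 80  # premium budget

best_value, best_subset = 0, ()
for r in range(len(risks) + 1):
    for subset in combinations(risks, r):
        value = sum(risks[name][0] for name in subset)   # expected loss ceded
        cost = sum(risks[name][1] for name in subset)    # premium paid
        if cost <= budget and value > best_value:
            best_value, best_subset = value, subset

print(best_subset, best_value)   # ('A', 'D') 270 for this toy data
```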
And this is where quantum processing could offer the greatest benefit to the insurance arena — through improved risk analysis.
“From an insurance perspective,” Griffin says, “some opportunities will revolve around the ability to analyze more data, faster, to extrapolate better risk projections. This could allow dynamic pricing, but also help better model systemic risk patterns that are an increasing by-product of today’s world, for example, in cyber security, healthcare and the internet of things, to name but a fraction of the opportunities.”
Steve Jewson, senior vice president of model development at RMS, adds: “Insurance risk assessment is about considering many different possibilities, and quantum computers may be well suited for that task once they reach a sufficient level of maturity.”
However, he is wary of overplaying the quantum potential. “Quantum computers hold the promise of being superfast,” he says, “but probably only for certain specific tasks. They may well not change 90 percent of what we do. But for the other 10 percent, they could really have an impact.
“I see quantum computing as having the potential to be like GPUs [graphics processing units] — very good at certain specific calculations. GPUs turned out to be fantastically fast for flood risk assessment, and have revolutionized that field in the last 10 years. Quantum computers have the potential to revolutionize certain specific areas of insurance in the same way.”
It will be at least five years before quantum computing starts making a meaningful difference to businesses or society in general — and from an insurance perspective that horizon is probably much further off. “Many insurers are still battling the day-to-day challenges of digital transformation,” Griffin points out, “and the fact of the matter is that quantum computing … still comes some way down the priority list.”
“In the next five years,” says Jewson, “progress in insurance tech will be about artificial intelligence and machine learning, using GPUs, collecting data in smart ways and using the cloud to its full potential. Beyond that, it could be about quantum computing.”
According to Griffin, however, the insurance community should be seeking to understand the quantum realm. “I would suggest they explore this technology, talk to people within the quantum computing ecosystem and their peers in other industries, such as financial services, who are gently ‘prodding the bear.’ Being informed about the benefits and the pitfalls of a new technology is the first step in creating a well thought through strategy to embrace it, or not, as the case may be.”
Any new technology brings its own risks — but for quantum computing those risks take on a whole new meaning. A major concern is that quantum computers, given their astronomical processing power, could break most of today’s data encryption.
“Once ‘true’ quantum computers hit the 1,000 to 2,000 qubit mark, they will increasingly be able to be used to crack at least 70 percent of all of today’s encryption standards,” warns Griffin, “and I don’t need to spell out what that means in the hands of a cybercriminal.”
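The threat is easiest to see with widely used public-key schemes such as RSA, whose security rests on the difficulty of factoring very large numbers, a task a sufficiently large quantum computer running Shor’s algorithm could perform efficiently. The toy sketch below (purely illustrative, using a deliberately tiny textbook modulus) shows why recovering the factors amounts to recovering the private key.

```python
# Toy RSA example (illustrative only; real keys use moduli of 2,048
# bits or more, far beyond trial division). Public key: modulus
# n = p * q and exponent e. Anyone who factors n can rebuild the
# private key -- the step Shor's algorithm would make fast at scale.
p, q, e = 61, 53, 17
n = p * q                      # 3233, the public modulus

def factor_by_trial_division(n):
    """Classical factoring, feasible only for tiny n."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1

# An attacker who factors n recovers p and q, and from them the
# private exponent (modular inverse via pow needs Python 3.8+).
p_found, q_found = factor_by_trial_division(n)
phi = (p_found - 1) * (q_found - 1)
d_private = pow(e, -1, phi)

message = 1234
ciphertext = pow(message, e, n)
recovered = pow(ciphertext, d_private, n)
print(recovered == message)    # True: the encryption is fully broken
```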
Companies are already working to pre-empt this catastrophic data breach scenario, however. For example, PwC announced in June that it had “joined forces” with the Russian Quantum Center to develop commercial quantum information security systems.
“As companies apply existing and emerging technologies more aggressively in the push to digitize their operating models,” said Igor Lotakov, country managing partner at PwC Russia, following the announcement, “the need to create efficient cyber security strategies based on the latest breakthroughs has become paramount. If companies fail to earn digital trust, they risk losing their clients.”