What Does Quantum Computing Mean for Conservation?
Nothing, really . . . at least not yet. The rise of new, powerful computing techniques could transform a conservation sector that has grown increasingly reliant—though not without debate—on sophisticated modeling and visualization software to make decisions about which places are worth protecting.
Simply put, quantum computers use the foundational principles of quantum physics to exponentially expand the rate and scope of computation. What might take one of our fastest “classical” supercomputers 30 minutes to figure out could, in principle, take even the most fickle quantum computer just half a second. Like the iPhone just a decade or two ago, quantum computing today lives mainly in popular representation, in the pages of Time magazine as “the infinity machine” or in sci-fi thrillers like Hannu Rajaniemi’s wonderful Jean le Flambeur trilogy. Still, one start-up, D-Wave, says it has built a workable computer, a ten-foot-tall machine kept at a temperature 150 times colder than deep space (the machine may in fact be the coldest place in the universe!). D-Wave has serious backers. Lockheed Martin is leasing one of its machines. Google and NASA are jointly investing in another. Much of the company’s seed money came from In-Q-Tel, the CIA’s venture capital branch.
Despite the hype, D-Wave’s quantum computer may not even mean that much for quantum computing. Physicists and computer scientists are still not sure whether the device is actually using quantum mechanics to make its computations and, if it is, whether it can do so reliably and for a wide range of problems. To understand why D-Wave’s machine may not be living up to its name and why that matters, we need to make a brief, if difficult, detour through the field of subatomic physics. Hey, at least it’s not rocket science.
Your laptop and my smartphone—all our digital devices—run on the on/off binary of electronic signals, calculating with 0s and 1s that are called bits. A quantum computer, on the other hand, harnesses what we know about the behavior of particles at the subatomic level. It relies on quantum entanglement, in which the properties of these particles can be hitched together and changed simultaneously even if they are nowhere near each other. Once entangled, they can all be put into a state of superposition, in which a particle’s “spin,” for instance, points in multiple directions at the same time. Thanks to superposition, quantum bits, or qubits, are thus 0s or 1s or both at the same time. Because qubits can in effect be everything at once, quantum computers can consider all possible combinations of bits at once. Given a problem with lots of different potential solutions, a quantum computer should be able to see all possible solutions at once and give us the best one, in one fell swoop (Microsoft has a decent video explaining the physics here).
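To make superposition and measurement a bit more concrete, here is a toy classical simulation in Python (a sketch only, not real quantum hardware) of a small register of qubits held in equal superposition:

```python
# Toy classical simulation of an n-qubit register in equal superposition.
# This only illustrates the bookkeeping; it is not how real hardware works.
import numpy as np

n = 3  # number of qubits
# One amplitude per basis state: 2**n of them, all equal in magnitude.
amplitudes = np.ones(2**n) / np.sqrt(2**n)

# The state "holds" all 2**n bit strings at once...
print(f"{2**n} basis states, each with probability {abs(amplitudes[0])**2:.3f}")

# ...but measurement collapses it to a single bit string, sampled at random.
outcome = int(np.random.choice(2**n, p=np.abs(amplitudes)**2))
print(f"measured: {outcome:0{n}b}")
```

Note what the collapse means: doubling the qubits squares the number of amplitudes the register carries, but a single measurement still hands back just one bit string, which is why quantum algorithms have to be engineered so the answer we want is the one we are overwhelmingly likely to read out.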
Here’s where things actually get tricky. It’s hard to keep the particles in superposition—they’re easily knocked out of it by heat, which is why D-Wave keeps its machine in a deep freeze. Plus, the superposed spin states are only resolved at the moment of measurement. When we ask the quantum computer to calculate something, we only get our answer when we peek under the hood (or into the box, as it were). But quantum physics tells us that the very act of measurement collapses all that superpositioned ambiguity into one answer selected out of all possible ones, meaning we’re likely not going to get the best one. The work-around has been to limit the functionality of the computer to calculating a single class of problems, called optimization problems. This means we won’t be using quantum computers to watch cat videos any time soon. But, like cat videos, optimization problems are ubiquitous and important.
We solve optimization-type problems every day when we plan a trip to work or the grocery store. You may pull up Google Maps, which tells you the quickest route based on distance, current traffic conditions, and mode of transportation. Airlines solve optimization problems all the time in order to efficiently allocate their planes along routes. Coca-Cola has developed an immense algorithm that tells it how to manage its orange juice operations based on up to a quintillion (!) factors, many involving taste, time of year, hurricane damage, and customer demand. And you don’t even want to know how financial speculators use optimization algorithms. In all these problems, we ask: what’s the best solution—the best use of resources (time, planes, oranges, money)—given so many different constraining variables?
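For a feel of what “best solution given constraints” means in code, here is a minimal Python sketch of the trip-planning case. The stops and travel times are invented for illustration, and brute force works only because the problem is tiny:

```python
# Brute-force route optimization: try every order of errands, keep the cheapest.
from itertools import permutations

travel_time = {  # minutes between stops (made-up numbers)
    ("home", "work"): 20, ("home", "store"): 10, ("home", "gym"): 15,
    ("work", "store"): 12, ("work", "gym"): 25, ("store", "gym"): 8,
}

def minutes(a, b):
    # Travel times are symmetric, so look up the pair in either order.
    return travel_time.get((a, b)) or travel_time[(b, a)]

stops = ("work", "store", "gym")
best = min(
    permutations(stops),
    key=lambda order: sum(minutes(a, b) for a, b in zip(("home",) + order, order)),
)
print("best order from home:", " -> ".join(best))
```

With three stops there are only six possible orders; with thirty, there are more orders than any classical computer could ever enumerate. That combinatorial blow-up is exactly what quantum annealing is pitched at.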
Conservationists face exactly these kinds of problems. When an organization like The Nature Conservancy looks to purchase properties to protect or when a state resource agency plans where to establish a new set of reserves, they ask, what portfolio of sites provides the most benefit—in terms of species saved or ecosystem services provided—for the least cost? One way they go about answering that question is through a technique called simulated annealing. Picture a landscape of peaks and valleys with the crests representing high costs, the swales least-cost scenarios. Simulated annealing works by randomly picking a spot on the landscape, say a hill (which is really just a bundle of sites with a particularly high cost). Then it jumps to another spot and compares it to where it just was: are we closer to the valley floor? Early on it takes big jumps and will even accept a worse spot now and then, so it doesn’t get stuck in the first small dip it finds; as the process “cools,” the jumps shrink and it settles into the deepest valley it has found. It does this millions of times, which is what makes simulated annealing relatively computationally intensive. But because of superposition, a quantum computer would see all the hills and dales at once and could “tunnel” to the optimum much more quickly.
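Here is a hedged Python sketch of that hill-and-valley search. The site costs, benefit scores, and target are invented, and real conservation planning tools (Marxan, notably, uses simulated annealing) are far more elaborate, but the accept-worse-moves-while-hot logic below is the core of the technique:

```python
# Simulated annealing for a toy reserve-selection problem: choose a subset of
# candidate sites that covers a benefit target at low cost. All numbers are
# hypothetical.
import math
import random

random.seed(0)
N_SITES = 40
cost = [random.uniform(1, 10) for _ in range(N_SITES)]    # made-up site costs
benefit = [random.uniform(0, 5) for _ in range(N_SITES)]  # made-up species value
TARGET = 60.0                                             # required total benefit

def objective(portfolio):
    # Total cost, plus a stiff penalty if the portfolio misses the target.
    b = sum(benefit[i] for i in portfolio)
    return sum(cost[i] for i in portfolio) + 100 * max(0.0, TARGET - b)

current = set(random.sample(range(N_SITES), 20))  # random starting portfolio
temp = 10.0
for _ in range(20000):
    neighbor = set(current)
    neighbor.symmetric_difference_update({random.randrange(N_SITES)})  # flip one site
    delta = objective(neighbor) - objective(current)
    # Always accept improvements; accept worse moves with a probability that
    # shrinks as the "temperature" cools. This is how annealing escapes the
    # small dips (local minima) on its way to a deep valley.
    if delta < 0 or random.random() < math.exp(-delta / temp):
        current = neighbor
    temp *= 0.9995  # cooling schedule

print(f"{len(current)} sites, cost {sum(cost[i] for i in current):.1f}, "
      f"benefit {sum(benefit[i] for i in current):.1f}")
```

The temperature is the key design choice: while it is high, the search wanders freely over the hills; as it cools, the search hardens into a descent. That is the classical stand-in for the quantum “tunneling” described above.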
Now, most conservation projects are not all that resource-intensive—it might take half an hour to find the best bundle of sites to protect in Oregon’s Willamette Valley—but quantum computing would allow planners to account for far more variables. It would also facilitate conservationists’ use of machine learning algorithms, which many are already beginning to use to find patterns in “big data” on anything from real-time measures of soil moisture in national forests to the rate of logging in the Amazon. Finally, the speed of calculation promised by quantum computing would improve climate change modeling efforts by NASA and academics, which are already running up against the limits of current processors.
Conservation nonprofits and government agencies are increasingly turning to these modeling and resource allocation “decision-support” tools to make the most of scarce funding and to interest policy-makers. The idea is to model different scenarios—different sets of sites to invest in, for instance—which can then be shown to decision-makers who “pull the lever” and weigh the tradeoffs between the modeled economic and ecological outcomes of each option. The Natural Capital Project, for instance, has developed software programs that have helped it show leaders and community members in Colombia the nutrient levels and stream flows a watershed would see under various restoration projects, constrained by the limited amount of investment available. One of the programs, InVEST, even calculates the monetary value provided by different investments in ecosystem services, using a number of variables, from carbon to cultural use of nature.
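As a toy illustration of the “pull the lever” idea (this is not InVEST’s actual interface, and the scenarios and numbers below are invented), a decision-support tool essentially scores each scenario and surfaces the ones no alternative beats on every dimension:

```python
# Hypothetical restoration scenarios with modeled costs and outcomes.
scenarios = {
    "upstream buffers":   {"cost": 1.2, "nutrient_reduction": 30, "flow_gain": 5},
    "wetland creation":   {"cost": 2.5, "nutrient_reduction": 45, "flow_gain": 12},
    "bank stabilization": {"cost": 0.8, "nutrient_reduction": 10, "flow_gain": 2},
}

def dominates(a, b):
    # a dominates b if it is no more expensive and at least as good on both
    # modeled outcomes (and the two scenarios are not identical).
    return (a["cost"] <= b["cost"]
            and a["nutrient_reduction"] >= b["nutrient_reduction"]
            and a["flow_gain"] >= b["flow_gain"]
            and a != b)

frontier = [name for name, s in scenarios.items()
            if not any(dominates(other, s) for other in scenarios.values())]
print(frontier)  # the tradeoff frontier a decision-maker actually chooses among
```

The frontier is only as good as the modeled numbers feeding it, which is where the debate below picks up.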
It’s “modeling sausage,” if not a modeling witch’s brew: complex to be sure, at times arcane, and often producing unexpected results. Some conservationists question the growing use of software modeling and what each program’s underlying assumptions commit its users to. It’s part of a larger, ongoing debate amongst conservationists right now: to what extent should they accommodate tradeoffs between different projects, sacrificing one place for another, and to what degree should they work with corporations to do so? The conservationist turn to new analysis software may be useful in providing more and better-organized information about a problem, but it can also mask these underlying political questions.
It will be a long time before conservationists are using quantum computers. Quantum computing remains largely the province of academic debate, sci-fi imagination, and military-industrial intrigue. If and when it becomes available to them, conservationists will be forced to answer tough questions about the politics of computation. With what constraints of money, time, and other resources do we make our calculations, and do we challenge those constraints or accept them as given? Can bigger and better processors solve climate change, species extinctions, and all of our other environmental challenges?
Eric Nost is a Ph.D. student in UW-Madison’s Department of Geography. His research describes the concepts, tools, and institutions environmental regulators, non-profit conservationists, and private sector entrepreneurs produce and utilize to confront the effects of climate change. He is currently looking at efforts to restore coastal marshes following the 2010 Deepwater Horizon oil spill in the Gulf of Mexico. Website. Contact.