Probability, Global Warming, and Hurricane Sandy

There’s a lot about global warming we don’t know, and the more specific we try to get, the more people disagree. City planners are betting on the effects in extremely concrete ways, such as on Cermak Road here in Chicago.

[Photo: MTA flooded]

Yesterday I got annoyed at a Politico item that tried to take the bloom off the election-prediction wizards at FiveThirtyEight. My annoyance was less with the weird questions it raised about Nate Silver—yes, if Romney wins, people who don’t really understand his project might stop taking him seriously, insofar as they were taking him seriously in the first place—than with the abuse it did to the idea of probability, because dealing with probability rationally is crucial to reading the news, voting, the whole democratic project. And Frankenstorm brings that home.

From the original item, referring to the possibility that Mitt Romney, facing a 25 percent chance of winning according to Silver’s model, might buck the odds:

Prediction is the name of Silver’s game, the basis for his celebrity. So should Mitt Romney win on Nov. 6, it’s difficult to see how people can continue to put faith in the predictions of someone who has never given that candidate anything higher than a 41 percent chance of winning (way back on June 2) and — one week from the election — gives him a one-in-four chance, even as the polls have him almost neck-and-neck with the incumbent.

The criticism is self-contradictory. The model gives Romney a non-trivial chance of winning—one in four, better odds than rolling a six on a single die—so a Romney win wouldn’t invalidate Silver’s model, much less Silver’s work generally. (The Princeton Election Consortium gives Obama even better odds of more electoral votes.) If there’s something wrong with the model, the mere binary result of the presidential election won’t reveal it.
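The calibration point is easy to check numerically. A minimal sketch (my own illustration, not Silver's code): if the model's 25 percent is honest, then over many comparable forecasts the underdog should win about a quarter of the time—and any single outcome tells you almost nothing.

```python
import random

random.seed(1)

# A forecaster who says "25 percent" is claiming that, over many such
# forecasts, the event happens about one time in four. Simulate 10,000
# races where the underdog's true chance really is 0.25.
trials = 10_000
underdog_wins = sum(random.random() < 0.25 for _ in range(trials))

print(underdog_wins / trials)  # close to 0.25
```

Any one of those 10,000 races coming up for the underdog says nothing about whether the 25 percent was right; only the long-run frequency can.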

This may seem like minor media infighting, but it’s worrisome: lots and lots of public policy is based on probability, and not understanding it can kill. Silver gives a good example in his book, The Signal and the Noise, of how bad public policy came from trying to avoid just the sort of misunderstanding that Politico’s Dylan Byers—a political reporter—evinces.

In April 1997, the National Weather Service predicted that the Red River would flood, cresting at 49 feet. The levees in Grand Forks were built to withstand a 51-foot crest. The river crested at 54 feet, causing a mess you may still recall. As Silver writes, “a five-foot miss, two months in advance of a flood, is pretty reasonable—about as well as these predictions had done on average historically.” In other words, the real fault lay not with the NWS, which did as well as could be (mathematically) expected, but with the assumption that its prediction would be perfectly accurate. The NWS did the best it could with the data it had; those in charge of responding to it did not, since the NWS’s margin of error was reasonably well established.

So why didn’t the NWS warn people more about the caveats (emphasis mine)?

The problem is that the Weather Service had explicitly avoided communicating the uncertainty in their forecast to the public, emphasizing only the forty-nine-foot prediction. The forecasters later told researchers that they were afraid the public might lose confidence in the forecast if they had conveyed any uncertainty in the outlook.

Instead, of course, it would have made the public much better prepared—and possibly able to prevent the flooding by reinforcing the levees or diverting the river flow. Left to their own devices, many residents became convinced they didn’t have anything to worry about. (Very few of them bought flood insurance.) A prediction of a forty-nine-foot crest in the river, expressed without any reservation, seemed to imply that the flood would hit forty-nine feet exactly; the fifty-one-foot levees would be just enough to keep them safe. Some residents even interpreted the forecast of forty-nine feet as representing the maximum possible extent of the flood.

Silver estimates that there was a 35 percent chance the river would top the levees, based just on the NWS’s track record. Then it becomes an investment decision: is it worth spending money to protect against a 35 percent chance of flood? Byers, presumably, would ask whether the NWS was necessary—whether anyone could put faith in it ever again. And fear of exactly that reaction hurt the NWS, and Grand Forks.
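Silver's 35 percent figure can be roughly reconstructed. Assuming—my assumption, not Silver's published method—that the crest forecast errs like a normal distribution with a standard deviation of about five feet, in line with the historical average miss, the chance of topping the 51-foot levees comes out close to his number:

```python
from statistics import NormalDist

# Assumed model: actual crest ~ Normal(mean = 49 ft forecast,
#                                      sd = 5 ft historical miss)
crest = NormalDist(mu=49, sigma=5)

p_over_levees = 1 - crest.cdf(51)  # chance of topping the 51-foot levees
print(round(p_over_levees, 2))     # roughly 0.34, near Silver's 35 percent
```

A two-foot safety margin sounds comfortable until you notice it is less than half the forecast's typical error.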

It’s not just that there are lots of carpal-tunneled nerds doing this for (and to) us every day, from our governments to our insurance companies to the people who build our cars. We have to do this all the time. Practically every year I have to choose a new insurance plan, and I usually end up taking the more expensive, low-deductible one, and I’ve probably lost several thousand dollars doing that, because I haven’t had to go to the hospital. I’ve bet wrong in the sense that I chose to soften the blow of potential medical costs I didn’t incur, but next year might be the year. So I’ll probably go with the more expensive one. (I could probably calculate the odds, but I have a time cost, too, and not much to hedge with.)
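The insurance bet above is just an expected-value calculation plus risk aversion. A sketch with hypothetical numbers (every figure here is invented for illustration):

```python
# Hypothetical numbers for illustration only.
premium_low_deductible = 3600   # annual premium, low-deductible plan
premium_high_deductible = 2400  # annual premium, high-deductible plan
deductible_gap = 4000           # extra out-of-pocket if hospitalized on the cheap plan
p_hospital = 0.10               # guessed chance of a hospital visit this year

expected_cost_low = premium_low_deductible
expected_cost_high = premium_high_deductible + p_hospital * deductible_gap

print(expected_cost_low)   # 3600
print(expected_cost_high)  # 2400 + 0.10 * 4000 = 2800
```

On expectation the cheaper plan wins here, but expectation isn't everything: insurance exists precisely because a certain small loss can be preferable to a small chance of a ruinous one.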

Which brings us to Hurricane Sandy and the flooding of New York:

The waters on the city’s doorstep have been rising roughly an inch a decade over the last century as oceans have warmed and expanded. But according to scientists advising the city, that rate is accelerating, because of environmental factors, and levels could rise two feet higher than today’s by midcentury. More frequent flooding is expected to become an uncomfortable reality.

With higher seas, a common storm could prove as damaging as the rare big storm or hurricane is today, scientists say. Were sea levels to rise four feet by the 2080s, for example, 34 percent of the city’s streets could lie in the flood-risk zone, compared with just 11 percent now, a 2011 study commissioned by the state said.

[snip]

Some experts argue that the encounter with Hurricane Irene last year and a flash flood in 2007 underscored the dangers of deferring aggressive solutions.

Klaus H. Jacob, a research scientist at Columbia University’s Earth Institute, said the storm surge from Irene came, on average, just one foot short of paralyzing transportation into and out of Manhattan.

If the surge had been just that much higher, subway tunnels would have flooded, segments of the Franklin D. Roosevelt Drive and roads along the Hudson River would have turned into rivers, and sections of the commuter rail system would have been impassable or bereft of power, he said.

That was written a month and a half ago. Hurricane Sandy’s surge hit at high tide on a full moon—not good odds, those, but they happen—and Jacob’s alternate scenario came to be. But not to worry, Chicago’s planning ahead:

Climate scientists have told city planners that based on current trends, Chicago will feel more like Baton Rouge than a Northern metropolis before the end of this century.

So, Chicago is getting ready for a wetter, steamier future. Public alleyways are being repaved with materials that are permeable to water. The white oak, the state tree of Illinois, has been banned from city planting lists, and swamp oaks and sweet gum trees from the South have been given new priority.

Meet Cermak Road, the “Greenest Street in America.”

Cermak Road is a bet, one of many that city planners are making. Understanding that—and presenting it honestly to readers—is critical to understanding what they do, and to watchdogging them. Betting has always been part of public policy, but as the public grows more familiar with data, and as our technical capacity for handling it improves, understanding the odds will only become more important.

Update: Chris Mooney on Hurricane Sandy:

Take a recent Nature study by climate scientists at MIT and Princeton, looking at future storm surge scenarios under climate change. The researchers used multiple computer model runs to simulate a variety of storm surges hurled at New York City—explicitly looking at future climate and sea level rise scenarios. By 2100, New York is projected to experience between 0.5 and 1.5 meters of sea-level rise. Taking the midpoint of this estimate, or a 1-meter sea-level rise, the paper found that what is currently a 100-year storm surge event for New York could become a 20-year event by 2100.

[snip]

The point is that we have a terrible track record of dealing with long range risks in this country. This is exacerbated by a presentist, science-phobic mindset on full display in the saga of the 2012 presidential debates—which, in the wake of Hurricane Sandy, now look utterly inane.

Obviously some of it is ideological, but I can’t help but wonder how much of being “presentist” is a misunderstanding of probability.
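The shift Mooney describes—a 100-year storm surge becoming a 20-year one—is even starker when compounded over a planning horizon. A sketch of the standard return-period arithmetic (annual exceedance probability as the reciprocal of the return period, with years assumed independent):

```python
def prob_at_least_one(return_period_years, horizon_years):
    """Chance of at least one exceedance over the horizon,
    assuming independent years with p = 1 / return_period."""
    p_annual = 1 / return_period_years
    return 1 - (1 - p_annual) ** horizon_years

# Over a 30-year mortgage:
print(round(prob_at_least_one(100, 30), 2))  # 100-year event: about 0.26
print(round(prob_at_least_one(20, 30), 2))   # 20-year event: about 0.79
```

Over the life of a mortgage, the "rare" flood goes from a one-in-four shot to odds-on—which is the kind of framing a presentist mindset never confronts.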

 

Photograph: MTAPhotos (CC BY 2.0)


Comments

Sandbagger, 1 year ago:

Your comparison of Sandy's storm surge to a Grand Forks flood prediction is a good one - in that today they both use a probability distribution to establish risk. For example, the National Hurricane Center storm surge folks had established that the risk of Sandy hitting Battery Park at least as hard as either Irene or the previous record level, was something higher than 80 percent, while the 50 percent risk was that it would come in a couple feet higher than the previous record. Now that's a gutsy call! Trouble is, what really occurred is something even worse than that (maybe only a 25 percent chance :)). And that's the problem with the science and math of prediction and probabilities... your predictive capability is really only base-able on your knowledge of the total system. In both social science and physical science there are still lots of little noises we don't know how to account for, and some of them can still rear up and swamp a system. I suggest you read up on Chaos Theory.
