PROJECT M
 

Vigilance is the watchword

Historian Edward Tenner sees unintended consequences as inevitable. In this interview with PROJECT M, he applies his lessons to the financial industry

Historian Edward Tenner examined unintended consequences in his books Our Own Devices and Why Things Bite Back. In his forthcoming book, he investigates the positive unintended consequences of economic crises for creativity and serendipity. A TED speaker and a former executive editor at Princeton University Press, he is now a visiting scholar at Princeton, Rutgers and the Smithsonian.

PROJECT M

You once said that civilization may be an unintended consequence. What did you mean?

Edward Tenner

I meant that there’s archaeological and anthropological evidence that hunter-gatherers had a remarkable standard of living without much exertion. So it took other pressures to drive people to try new approaches. For example, a shortage of tin used to make bronze in the Bronze Age inspired experimentation that ultimately led to the Iron Age. At every stage of human history there have been constraints and problems, such as resource scarcity, population growth or environmental change, that led human ingenuity to find new ways of doing things. For better and worse, we owe our technological world to these innovations.

PROJECT M

So unintended consequences are a fact of life?

Edward Tenner

Yes, and trying to do away with unintended consequences might have the most serious unintended consequences of all. We need to find better ways to manage them.

PROJECT M

So how, then, do you define unintended consequences?

Edward Tenner

I’ve written about a phenomenon called the revenge effect: a well-intentioned and plausibly effective measure that ends up canceling itself out. For example, there was a recent article in The New York Times about mosquito nets that had been sent to African countries to help combat malaria. It turns out that the nets are often used for fishing, which endangers the environment because the insecticides used to treat the nets are harmful to aquatic life.

It’s not that distributing mosquito netting is a bad idea; it’s just that the complexity of the world is such that it’s easy for something that seems to be an effective measure to lead to behavior that would be hard to predict. So we have to be more tentative and experimental about innovations at an early stage.

PROJECT M

Much of your work deals with engineering. How applicable are your observations to finance?

Edward Tenner

The financial industry shares something with other human technologies: unforeseen interactions between collective behavior and precautions. For example, when the US Federal Reserve Board was established more than 100 years ago, some of its backers believed that the business cycle could be smoothed out and that the all-wise central banker would be able to avoid the panics that had plagued the 19th- and early 20th-century American economy. In his wonderful book A Nation of Deadbeats, my colleague, the historian Scott Reynolds Nelson, quotes Senator “Cotton Tom” Heflin as saying that with the new Federal Reserve, “this Republic can shake off temporary business disturbances like dew drops from a lion’s mane.”

Firefighting is analogous. The suppression of many small fires in American forests, beginning in the early 20th century, led to the build-up of large concentrations of brush. This meant that when fires did occur they were bigger and more intense than before. Some of the greatest forest fires actually occurred later as a result of fire suppression. In finance, the smaller panics prevalent before 1913 may have been reduced in impact – think how the US economy absorbed the savings and loan disaster of the 1980s thanks in part to deposit insurance – but other problems have been more intractable.

PROJECT M

Historian William H. McNeill has implied that there is a kind of natural law of the conservation of catastrophe, like the conservation of energy. Do you agree?

Edward Tenner

William McNeill was one of my graduate teachers, and I’ve always admired his insight into human behavior. I’m not sure it’s a general rule, but it is an inherent risk of any reform. When you set up a secure and protected system, you have to be aware that beyond the barriers of that system, all kinds of things are happening. While you may try to insulate yourself from them, there are actors in those fields, especially institutions and individual investors outside your own jurisdiction, who are taking advantage of whatever you have, in ways and maybe on a scale that you had not foreseen. The rise of superbugs, unfortunately, suggests that the mid-20th-century antibiotic miracle might help lead to new pandemics.

PROJECT M

In engineering there’s a theory of a 30-year cycle in bridge failures that often stems from overconfidence in new technology. Might a similar mechanism of hubris operate in the field of economics and finance?

Edward Tenner

I think there are similarities; after all, we still regularly have crises, and just when we think we’ve patched one up, along comes another. The Glass-Steagall Act, which created a firewall between commercial and investment banks, was repealed with much celebration in 1999. In hindsight, progressives tend to say the repeal of Glass-Steagall and securitization were primarily responsible for the most recent financial crisis, while conservatives tend to say it was subprime mortgages and the government’s encouragement of lending to people whose circumstances were too precarious for home ownership.

In all likelihood, none of these things was solely responsible, and that’s the problem. You take a number of well-meant reforms, and their collective effect might be to undermine one another or to act together with other forces in ways that are not easily predictable but could turn out to be disastrous. It’s very difficult to undertake holistic reform. You have to examine not only the proposed innovation but also how all kinds of other proposed innovations can affect it, for better and for worse.

PROJECT M

So what’s the solution? Are we hopeless in the face of unintended consequences?

Edward Tenner

Paradoxically, being too cautious about innovation threatens to lock in our present problems. The precautionary principle, as often expressed, is not a solution. The challenge is the great financial incentive to roll things out fully before they’re proven, so that negative externalities, like environmental costs, can be shifted to the public. By studying the historical record we can develop a certain kind of imagination, an insight into what can go wrong. That’s one of my main purposes in writing: to give people analogies that will help them recognize emerging risks.

The objective should be to avoid implementing innovations on a scale that is hard to reverse if their effects prove disastrous. I think that’s the best compromise.

PROJECT M

Vigilance is a word you have used a lot in your writing. What is its role?

Edward Tenner

I’ll give you an example: General Motors has had a much-publicized problem with faulty ignition switches that could shut off the engine while the car was being driven. This resulted in the recall of over 24 million cars and multiple investigations and lawsuits. There were indications more than a decade ago that there was a problem, yet the company ignored them.

So many of our problems stem not only from unexpected interactions of human behavior and technology, which are always going to happen, but from companies’ attempts to wish away or dismiss early warning signs. In fact they should take those warnings very seriously; they could save fortunes by choosing the safer route. Unfortunately, executives are judged by the profit they make today rather than by the long-term losses their companies could incur as a result of their actions. Independent auditors may not have the training or incentives to identify such risks.

PROJECT M

Can you provide an example of where vigilance will be required in the future?

Edward Tenner

One of relevance to the insurance industry is the self-driving car. It is sometimes assumed these vehicles will be so smart that people can let them handle all the steering. The New York Times recently reported that hands-free driving is actually legal in most states; some cars can already keep in lane and regulate speed on their own.

Yet even though there have been hundreds of thousands of miles of test drives, when everything is scaled up there will be unusual constellations of factors that no software can anticipate and that will require human intervention. But if we are not in constant practice, we will lack the skills and reactions to avoid an accident. This has become increasingly apparent in aviation incidents like the crashes of Air France Flight 447 and Asiana Flight 214. There has always been regular training with flight simulators; they may now need to become even more realistic.

PROJECT M

But this doesn’t mean we should be afraid of the technology, or that we should try to stop it?

Edward Tenner

No, not at all, but we should be aware of where potential flaws lie. Google recently changed the algorithm of its self-driving cars to keep them closer to the car in front, addressing a complaint from passengers that other cars kept cutting in. This is not based on good traffic engineering.

At a conference, a safety engineer who helps train fleet drivers said, as I recall, that the time you lose in the course of a two-hour journey from people cutting in is negligible – less than half a minute. I think it’s ironic that an algorithm should be adjusted to reflect an illusion held by average drivers rather than optimized for the safest driving. Who can tell what the consequences, if any, of this minor tweak to an algorithm will be in the future?

I also think it’s an illusion to believe that new technology can take the need for vigilance out of life. Even if a navigation system is almost perfectly reliable, it is still no better than the many other systems of a vehicle. Twelve percent of US cars have at least one bald tire, and the tread on many others is too worn for safe use in bad weather. Potholes abound. No algorithm can substitute for well-maintained vehicles and roads.
