Dr. Peter Scoblic is a co-founder and principal of the strategic foresight consultancy Event Horizon Strategies. A former executive editor at The New Republic and Foreign Policy who has written on foresight for publications including the New York Times, The Washington Post, Science, and Harvard Business Review, Peter is also a senior fellow with the International Security Program at New America, and an instructor for the Professional Development Program at Harvard University. Previously, he was deputy staff director of the Senate Committee on Foreign Relations, where he worked on approval of the New START agreement and was the chief foreign policy speechwriter for Chairman John Kerry.
On the eve of a particularly fraught election and a turbulent moment in US political history, Peter joined me for a discussion about his career, ranging from post-Cold War nuclear arms policy to the relationship between policymaking and pop culture, plus the practical question of how and to what extent we can usefully predict the future. The interview will appear on this blog in three parts, and you can read the first part here – but you can also read the interview in its entirety as a PDF download.
Your doctoral research has led to a number of outputs, including a great research paper on strategic foresight as a dynamic capability in uncertain situations, and case study work on the US Coast Guard’s scenarios programme which can be explored in both an article and podcast for the Harvard Business Review.
Is there anything you uncovered in your doctoral research which hasn’t come up in coverage of your work?
Scenario planning can be used to challenge assumptions and the mental models people have of the world, but it also has its own models and assumptions baked into it: how time works, how the future relates to the present and past.
One of the things I found interesting was that, at the Coast Guard for example, scenario participants found that the process didn’t just change their mental model of how the organization went about its mission and operations; it also changed the way they thought about time.
Scenario planning communicated its epistemological and ontological assumptions implicitly to participants. They absorbed those assumptions without realising they’d done so, finding that the process hadn’t just changed what they thought about their organization’s strategy for the future, but also how they thought about questions of strategy and the future.
Something similar comes up with students of history. I researched the lives of scholars who were refugees from the Nazis for my doctorate.
There seemed to be some evidence that German and Austrian refugee historians developed an unusually sophisticated perspective on time, using their historical research and writings to encode concerns and issues they had about the present.
Bill Schwarz’s 2004 essay “Not Even Past Yet” points to Caribbean migrant intellectuals in the UK also developing “an unusually complex consciousness of the shape of the historical past” as their writings spoke both to the distant past and the tensions of the present.
Recently, I’ve been wondering about the ways in which scenario planning is analogous to therapy, including the fact that both scenarios and “the talking cure” are a kind of change process that we don’t fully understand, even though they seem to get results.
Do you think scenarios have been used therapeutically within the policy establishment?
One would have to identify the pathology! There’s probably no shortage of those, however – including short-termism, which afflicts a lot of policymaking.
In that sense, one could view scenarios as having been a therapeutic process for the Coast Guard, which reoriented the organization toward the longer-term future. It did this not only by explicitly asking them to consider what lay twenty years ahead of them, but by inculcating this sense that doing so was important, worthwhile, constructive, and useful.
Do you think there are things we have to be cautious of, if we import that mental model?
Any discipline or methodology can transmit an orthodoxy, and you have to be careful. The ability to question is essential to both scholarship in general and this question of thinking about the future.
There is sometimes a sense that scenario planning is a corrective to a quantitative, forecasting approach to thinking about the future – the hubristic idea, born in the 1950s, that the future can be known, if we only model it carefully enough.
Scenario planners and foresight practitioners sometimes commit the same sin in the opposite direction, saying “the future is wide open and we can imagine anything”, as if there were no constraints on reality, nothing could be known about the future, and anyone who thinks tomorrow is something other than a tabula rasa is just plain wrong.
Philip Tetlock talks about beliefs as testable hypotheses, not sacred truths; I think it’s important for us to remember this when we reflect on methods of forecasting and foresight.
There’s also an element of practicality. How far do we choose to question everything that’s presented to us?
If we try to question everything, we run up against the limits of bounded rationality very quickly. There’s too much information, and there are too many possibilities, so heuristics are necessary for us to cope with the world.
You manage to bridge the worlds of both foresight and forecasting: those who imagine the future and those who seek to predict it. But by embracing forecasting, do you inevitably have to opt for a deterministic worldview in which the future can ultimately be known and calculated?
Taken to an extreme, you find yourself with Laplace’s demon – the idea that if some supernatural entity knew the location and momentum of every atom in the universe, it could play out the future with all the certainty of classical mechanics.
It’s not a comfortable place to be, and one of the economist Frank Knight’s responses was that no matter how much we detest uncertainty, we would be bereft without it: uncertainty allows for opportunity, entrepreneurship, and free will. In Knight’s conception, uncertainty about the future was what allowed for different conceptions about what awaits, which in turn allows for competitive choices and profit.
Probabilistic forecasting sidesteps this problem precisely by virtue of its focus on probabilities. It doesn’t yield the predictable certainties you’d find in a Newtonian universe, with people bouncing around like billiard balls. It attends instead to the complexities of human interaction.
I keep returning to Jerry Ravetz’s comment that after Descartes discarded the humanities in favour of geometry, “Many practitioners who nowadays receive emotional security from the belief that their spreadsheet will tell them precisely what to do with a project or company are living with the consequences of Descartes’ desperate grab for certainty.”
It’s easy to be seduced by the certainty of numbers, but stories have their own seductions too. How do you deal with that tension between those who want the confidence of numbers, those who find more benefit in stories, and the risk of being led astray by overemphasising either?
The first step is awareness. Among scenario planners, there’s great recognition that numbers can suggest a false sense of precision and that certainty can be seductive. When you put a 72% probability on a future event, does that have meaning, or is it merely a simulacrum of meaning?
However, it’s also crucial to acknowledge that stories can have the same seductive power. When we craft a scenario, especially when we use that imagined future to reflect on the present, it’s important that we don’t start confirming our imaginings by blithely seeing evidence of that scenario in everything that’s around us. Though we tell ourselves that scenarios are not predictive, we nevertheless may adopt them as what psychologists call a focal hypothesis and then start seeking out evidence to confirm them. You have to be constantly on your guard against that. One particularly good story does not a probable future make.
Holding two ideas in your head at the same time is a challenging thing for anyone, but it’s a skill worth cultivating, especially when we’re approaching uncertainty.
It’s like being Zorro, riding with one foot on each horse. Or perhaps the White Queen, believing six impossible things before breakfast.
You collaborate and co-write with Philip Tetlock. What’s the dynamic of that back-and-forth between a foresight practitioner and a forecaster?
I feel very fortunate to have the opportunity to collaborate with Phil, whose scholarship has had such an influence on me. It works well because we each bring different perspectives to the table. It’s not that we’re at opposite poles, but we come at the problem of uncertainty from two different angles. We’ve learned to constructively disagree! I also think we have complementary skills.
Phil is a great proponent of adversarial collaboration; that doesn’t quite describe our relationship, but helps explain how we work together despite having different views on certain things. It helps that we can have collegial disagreements about things and that, at the end of the day, we’re both pragmatists; while we will indulge philosophical debates, we’re also asking: “How can people make better policy?” The most constructive thing under those circumstances is to, at some point, lay aside ontological debates about the future and ensure that our work helps improve decision-making.
How has the collaboration shifted your thinking?
Previously, I’d worked in opinion journalism, in which one can sometimes be very convinced of one’s own righteousness. Working with Phil has helped me internalise more deeply the sense that one’s beliefs should be treated as testable hypotheses. That’s not to scientise our values, but no matter how open-minded we think we are, we still carry around a set of mental models which we must be willing to test if we want to get closer to the truth.
I guess science is ultimately about the commitment to revise your beliefs in the face of evidence – at which point it then becomes a discussion about what standards of evidence you accept.
Polemic has its value in order to shift people’s thinking, even though it’s very different from those testable hypotheses, or the academic work you’ve been doing lately. Do you find you still value both?
Polemic does have a place, and especially these days, there’s a time for outrage and a time for incredible strength of opinion. At one point, Phil and I had an exchange where he thought I was being alarmist, and I thought he was being complacent; the fact that we were able to find a way forward perhaps illustrates the strengths of our collaboration.
To frame beliefs as testable things, one needn’t trade objectivity for neutrality. You don’t have to be completely agnostic about everything. It’s a very different way of approaching the world, but I’ve come to value both deeply.
Like embracing both numeracy and narrative, or the Zorro-on-two-horses approach of combining forecasting and foresight, it is possible to value polemic while at the same time wishing to put “sacred truths” to the test.
If you think of them as lenses, being able to apply both will give us a clearer view of the truth, to the extent that that’s an establishable thing. Truth is a squirmy beast!
That metaphor of the lens is very useful. The Oxford Scenario Planning Approach, of course, speaks of “strategic reframing”, and at the EXBD scenarios event in Copenhagen this year, participants were given sheets of coloured cellophane representing different aspects of the future, so we could remember to look at various scenarios through the tint of a given lens.
Within different paradigms, different questions are askable, and different methods are accepted. There are times, in my research, when I’ve wondered if I was exploring science or scientism – not everything that is true can be established in a laboratory. It makes us humble, as we recognise the limits of what we can know, and how.
Humility is a difficult thing to cultivate, especially for people who have gained doctorates in particular ways of looking at the world, and who see themselves as engaged in the pursuit of truth – there’s always a danger of hubris.
Join us for the next instalment when we talk about storytelling, satire, and scenarios – plus the personal impact of scenario research on Peter’s life. You can also read the complete interview now in this PDF download.