[Farnam Street Podcast] Philip Tetlock on The Art and Science of Super Forecasting
http://podplayer.net/#/?id=24093071
Phil Tetlock forms teams of “super forecasters” who are amazingly accurate at estimating the probabilities of complex real-world events, often for clients such as the intelligence community.
The episode has many insights useful to investment management and, I am sure, to other fields that depend on probabilities.
1. Start with the outside view. Establish a baseline probability of how likely something is before you start refining with inside knowledge. E.g. in predicting how likely a specific person is to get divorced, start with the base rate of divorce in the general population (a small sketch of refining a base rate follows this list).
2. Break the problem down into steps in a decision tree, each node with its own probability. You can then work on refining each node in the tree, which makes clear what the key questions you are really asking are (see the decision-tree sketch after this list).
3. Focus on stating the prediction precisely. Many of us are managing career risk for fear of being wrong, so we create fuzzy statements that could count as right under a wide range of outcomes.
4. Being open minded is essential to being a good forecaster. We all like to think we are open minded; we usually are not. We can be more open minded about things we are not ideological about, but where we hold ideologies it is much more difficult to stay open minded.
5. A group of people with the same objective, a good debating style and, where possible, very independent thought processes can get to the right probabilities far more effectively.
6. There are lots of impediments to accurate forecasting in organisations where the real objective may be furthering your career, not rocking the political boat or not being seen to make a mistake, rather than getting to the right answer. Managing that culturally is a challenge that leadership has to take on: ensuring that the goal is accuracy, that open-mindedness is real and that mistakes are welcomed as something to learn from.
7. One of the biggest risks is conflating mistakes with probabilistic outcomes. You thought the probability of an outcome was 75% and the alternative outcome actually happened. That does not mean your probability was wrong; it could just have been the 1-in-4 chance of the other outcome coming up. The fair test is how your forecasts score across many events (see the scoring sketch after this list).
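To make point 1 concrete, here is a minimal Python sketch of anchoring on a base rate and then refining it with inside knowledge via Bayes' rule in odds form. The divorce base rate and the likelihood ratio are illustrative numbers I have chosen, not figures from the podcast.

```python
# A minimal sketch: anchor on the outside view (a base rate), then refine it
# with inside knowledge expressed as a likelihood ratio (Bayes' rule in odds
# form). All numbers below are illustrative, not from the podcast.

def update_with_evidence(base_rate: float, likelihood_ratio: float) -> float:
    """Return the updated probability after seeing evidence whose likelihood
    ratio is P(evidence | event) / P(evidence | no event)."""
    prior_odds = base_rate / (1 - base_rate)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Outside view: an assumed base rate of divorce of about 40% (illustrative).
base_rate = 0.40

# Inside view: suppose what we know about this specific couple is twice as
# likely to be seen if they eventually divorce than if they do not.
print(round(update_with_evidence(base_rate, 2.0), 2))  # -> 0.57
```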
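For point 2, a minimal sketch of a forecast broken into a decision tree, with each node carrying its own probability. The deal-completion question and all the probabilities are hypothetical; the point is that you can refine one node at a time and see which assumptions drive the compound answer.

```python
# A minimal sketch of a forecast as a decision tree where every node carries
# its own probability. The question ("will this deal close this year?") and
# all probabilities are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    prob: float                       # probability of taking this branch
    children: list = field(default_factory=list)

def leaf_probabilities(node, path_prob=1.0, prefix=""):
    """Yield (path, compound probability) for every leaf of the tree."""
    p = path_prob * node.prob
    label = f"{prefix} -> {node.name}" if prefix else node.name
    if not node.children:
        yield label, p
    else:
        for child in node.children:
            yield from leaf_probabilities(child, p, label)

tree = Node("start", 1.0, [
    Node("deal announced", 0.6, [
        Node("regulator approves", 0.7, [
            Node("closes this year", 0.8),
            Node("slips to next year", 0.2),
        ]),
        Node("regulator blocks", 0.3),
    ]),
    Node("no deal announced", 0.4),
])

for label, p in leaf_probabilities(tree):
    print(f"{label}: {p:.3f}")
# The four leaves sum to 1.0; "closes this year" works out to 0.336.
```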
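For point 7, a sketch of scoring forecasts across many events rather than judging any single call. Brier scoring is the standard way forecasting tournaments grade probability forecasts; the forecasts and outcomes below are made up for illustration.

```python
# Judge probabilistic forecasts over many events with a proper scoring rule
# (here the Brier score), not by whether one call "came true". A 75% forecast
# followed by the 25% outcome is not automatically a mistake.

def brier_score(forecasts, outcomes):
    """Mean squared error between probabilities and 0/1 outcomes.
    Lower is better; always saying 50% scores 0.25."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

forecasts = [0.75, 0.75, 0.75, 0.75, 0.60, 0.90, 0.20, 0.10]  # illustrative
outcomes  = [1,    1,    1,    0,    1,    1,    0,    0]     # what happened

print(round(brier_score(forecasts, outcomes), 3))  # -> 0.121 with these numbers
```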