📚 Finished reading The Signal and the Noise by Nate Silver.

Nate Silver describes efforts to forecast events from a wide range of domains - everything from baseball to terrorism, from global warming to poker games. This includes a chapter on forecasting contagious diseases which, despite having been written in 2012, will probably feel all too familiar in a world where a surprising number of people have unfortunately had to embed the concepts of R0 and case fatality rates into their psyches.

In doing so he largely (and very deliberately) shows the limits of our ability to accurately predict things. After all, it’s not something humans evolved to be good at. We’re optimised for survival, not for making mathematically rigorous predictions about the future. And some things are inherently easier to predict than others: we’re increasingly good at short-term weather forecasts, yet we’ve made very little progress when it comes to earthquakes.

Naively, one might think that because we have access to so much more information today than ever before - with each day increasing the store of data by historically unimaginable amounts - we should be in a great position to rapidly learn vastly more truth about the world than ever before.

But data isn’t knowledge. The total amount of information is increasing far, far faster than the amount of genuinely useful information.

We’re able to test so many hypotheses against so much data that, even if we avoid statistical issues such as overfitting and become fairly good at judging whether any particular hypothesis is likely to be true, the low base rate of hypotheses being true means we’re constantly in danger of misleading ourselves into thinking we know something that we don’t. This argument very much brings to mind Ioannidis’s famous paper, “Why Most Published Research Findings Are False”.
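To see why the base rate matters so much, here’s a back-of-the-envelope sketch in Python - with illustrative numbers of my own choosing, not Silver’s or Ioannidis’s - showing how a low base rate of true hypotheses can make a large share of “significant” findings false, even under well-behaved testing:

```python
# Hypothetical numbers: of all hypotheses we test, only 1 in 20 is true.
base_rate = 0.05  # P(hypothesis is true)
power = 0.80      # P(positive result | hypothesis is true)
alpha = 0.05      # P(positive result | hypothesis is false)

true_positives = base_rate * power          # 0.04
false_positives = (1 - base_rate) * alpha   # 0.0475

# Of all positive results, what fraction reflect a genuinely true hypothesis?
ppv = true_positives / (true_positives + false_positives)
print(f"P(true | positive result) = {ppv:.2f}")  # ~0.46: most "discoveries" are coin flips
```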

As Silver writes:

We think we want information when we really want knowledge.

Furthermore, bad incentives exist in many forecast-adjacent domains today. Political pundits want to exude certainty, economists want to preserve their reputations, social media stars want to go viral, scientists want to get promoted. None of these drivers necessarily align with making accurate predictions. Some weather services are reluctant to ever show a 50% chance of rain, even when that’s what the maths says, because consumers read it as “indecisive”.

Silver’s take is that we can improve our ability to forecast events, and hence sometimes even save lives, by altering how we think about the world. We should take up a Bayesian style of thinking. This approach allows us - in fact requires us - to quantify our pre-existing beliefs. We then update them in a formalised way as and when new information becomes available, arriving at revised, higher-quality estimates of how likely something is to be true.
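As a concrete illustration (my own toy numbers, not an example from the book), a single Bayesian update looks like this:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Revise a prior belief in a hypothesis after observing some evidence."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Start with a 5% prior, then observe evidence that is four times more
# likely if the hypothesis is true (0.8) than if it is false (0.2).
posterior = bayes_update(prior=0.05, p_evidence_if_true=0.80, p_evidence_if_false=0.20)
print(f"Updated belief: {posterior:.2f}")  # 0.17 - stronger, but far from certain
```

Each new piece of evidence feeds the posterior back in as the next prior, which is exactly the constant-updating loop Silver advocates.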

Our thinking should thus be probabilistic, not binary. It’s very rare that something is 0% predictable or 100% predictable. We can usually say something about some aspect of any given future event. As such we should become comfortable with, and learn to express, uncertainty.

We should acknowledge our existing assumptions and beliefs. No one starts from a place of zero bias, and those starting assumptions will inevitably influence how we approach a given forecasting problem.

There’s almost a kind of Protestant work ethic to it that forecasters would be wise to adopt: work hard, be honest, be modest.

One can believe that an objective truth exists - in fact, you sort of have to if you’re trying to predict it - but we should be skeptical of any forecaster who claims certainty about it.

Borrowing from no less a source than the Serenity Prayer, Silver leaves us with a final thought:

Distinguishing the signal from the noise requires both scientific knowledge and self-knowledge: the serenity to accept the things we cannot predict, the courage to predict the things we can, and the wisdom to know the difference.

My more detailed notes are here.