HN Books @HNBooksMonth

The best books of Hacker News.

Hacker News Comments on
The Signal and the Noise: Why So Many Predictions Fail--but Some Don't

Nate Silver · 4 HN comments
HN Books has aggregated all Hacker News stories and comments that mention "The Signal and the Noise: Why So Many Predictions Fail--but Some Don't" by Nate Silver.
View on Amazon [↗]
HN Books may receive an affiliate commission when you make purchases on sites after clicking through links on this page.
Amazon Summary
UPDATED FOR 2020 WITH A NEW PREFACE BY NATE SILVER

"One of the more momentous books of the decade." —The New York Times Book Review

Nate Silver built an innovative system for predicting baseball performance, predicted the 2008 election within a hair’s breadth, and became a national sensation as a blogger—all by the time he was thirty. He solidified his standing as the nation's foremost political forecaster with his near perfect prediction of the 2012 election. Silver is the founder and editor in chief of the website FiveThirtyEight.

Drawing on his own groundbreaking work, Silver examines the world of prediction, investigating how we can distinguish a true signal from a universe of noisy data. Most predictions fail, often at great cost to society, because most of us have a poor understanding of probability and uncertainty. Both experts and laypeople mistake more confident predictions for more accurate ones. But overconfidence is often the reason for failure. If our appreciation of uncertainty improves, our predictions can get better too. This is the “prediction paradox”: The more humility we have about our ability to make predictions, the more successful we can be in planning for the future.

In keeping with his own aim to seek truth from data, Silver visits the most successful forecasters in a range of areas, from hurricanes to baseball to global pandemics, from the poker table to the stock market, from Capitol Hill to the NBA. He explains and evaluates how these forecasters think and what bonds they share. What lies behind their success? Are they good—or just lucky? What patterns have they unraveled? And are their forecasts really right? He explores unanticipated commonalities and exposes unexpected juxtapositions. And sometimes, it is not so much how good a prediction is in an absolute sense that matters but how good it is relative to the competition. In other cases, prediction is still a very rudimentary—and dangerous—science.

Silver observes that the most accurate forecasters tend to have a superior command of probability, and they tend to be both humble and hardworking. They distinguish the predictable from the unpredictable, and they notice a thousand little details that lead them closer to the truth. Because of their appreciation of probability, they can distinguish the signal from the noise.

With everything from the health of the global economy to our ability to fight terrorism dependent on the quality of our predictions, Nate Silver’s insights are an essential read.
HN Books Rankings

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this book.
That’s because all probabilities are conditional, and people don’t consider that.

These two things helped me to understand probability better.

The Signal and the Noise:

https://www.amazon.com/Signal-Noise-Many-Predictions-Fail-bu...

The videos for Harvard Statistics 110:

https://projects.iq.harvard.edu/stat110/home
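The "all probabilities are conditional" point is exactly what Bayes' rule formalizes, and it is a recurring theme in both the book and Stat 110. A minimal sketch in Python using a base-rate problem of the kind those sources favor (the specific rates below are illustrative, not taken from either source):

```python
# Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E)
# Illustrative numbers: a test that is 90% sensitive and 95% specific
# for a condition with a 1% base rate.

def posterior(prior, sensitivity, specificity):
    """P(condition | positive test) via Bayes' rule."""
    # P(positive) = true positives + false positives
    p_pos = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_pos

p = posterior(prior=0.01, sensitivity=0.90, specificity=0.95)
print(f"P(condition | positive) = {p:.3f}")  # about 0.154
```

Even with a fairly accurate test, the posterior stays low because the condition is rare — the unconditional intuition ("the test is 90% accurate, so I'm 90% likely sick") ignores the base rate it is conditioned on.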

lawrenceyan
This is a great quote. I might have to steal this.
This article reminds me of Nate Silver's book [1], which takes a far more scientific and in-depth look at the failure/success of these types of predictive statistics.

The chapters on the many attempts (or many failed hopes) at predicting earthquakes were particularly interesting, including the many times the media has bought into hyped-up new charlatans who claimed to have finally figured it out, only for their methods to fail under basic statistical scrutiny.

It also has a useful soft introduction to Bayesian statistics and other useful concepts from the field of prediction that I hope more journalists read about, as this seems to be a very common theme in reporting.

Even this journalist couldn't help themselves with this line (combined with some scary looking charts described with an alarmist tone farther down):

> Was there some miscalculation of how frequently these massive flooding events occur? Or, most alarmingly, is something else happening that suggests these catastrophic weather events are becoming much more common?

The failure to mention the effects of El Niño/La Niña seems like a big oversight in this article, especially when we're just coming out of a particularly strong one. Climate stats are easy ones to get wrong - or to shape into any narrative - especially when timeframes and locations can easily be viewed too narrowly.

[1] https://www.amazon.com/Signal-Noise-Many-Predictions-Fail-bu...

Nate Silver wrote an entire book on this subject called "The Signal and the Noise" [1]. Humans are so often taken in by people claiming to be able to make predictions by combining new data points. The more unusual, or unrelated to the subject matter, the better. They make good headlines but (not surprisingly) almost always turn out to be heavily flawed in practice.

You can basically measure how wrong a pundit/expert's predictions will be by how ideological they are in their analysis. The best indicator is when they use only one or two metrics as the basis for predicting an otherwise very complex scenario.

One example from the book is a researcher who became famous before the 2000 US presidential election by claiming to predict races with 90% accuracy [2]. He claimed that by combining a) per-capita disposable income with b) the number of military casualties, you can determine whether a Democrat or a Republican gets elected, and said that historical data backed up his theory. He then proceeded to fail to predict that year's election and faded into obscurity.

Nate did his own historical analysis and demonstrated that the model was only 60% accurate instead of 90%, and even that was only if you ignored third-party candidates, since the model assumes a two-party system.
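Silver's re-check of the 90% claim is essentially a backtest: score the model's historical calls against the actual outcomes. A minimal sketch of that idea in Python — the party sequences below are made up for illustration and are NOT the "Bread and Peace" model's actual record:

```python
# Backtesting a claimed accuracy: compare predicted winners against
# actual outcomes over past elections. The data below is invented for
# illustration; it is not Hibbs's real dataset.

def accuracy(predictions, outcomes):
    """Fraction of calls that matched the outcome."""
    hits = sum(p == o for p, o in zip(predictions, outcomes))
    return hits / len(outcomes)

predicted = ["D", "R", "D", "D", "R", "R", "D", "R", "D", "R"]
actual    = ["D", "R", "R", "D", "R", "D", "D", "R", "R", "R"]
print(f"backtested accuracy: {accuracy(predicted, actual):.0%}")  # 70%
```

The point is that the headline number is only as good as the sample it was scored on: change the set of elections counted (e.g. include races with third-party spoilers), and the claimed accuracy can drop sharply.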

Plenty of other examples are provided in the book, which makes me highly suspicious of the value of the predictions made in this article.

The general idea is that we need to stop looking for simple one-off solutions to complex problems. Instead we should adopt multi-factor approaches which suffer from fewer biases and are better grounded in reality. Otherwise these predictions are just another form of anti-intellectualism.

[1] http://www.amazon.com/Signal-Noise-Many-Predictions-Fail--bu...

[2] the "Bread and Peace" model by Douglas Hibbs of the University of Gothenburg http://query.nytimes.com/gst/fullpage.html?res=9803E5DD1F3DF...

hammock
The law of averages will tell you that the more metrics (however random) you throw into your prediction engine, the closer your prediction will be to the actual result. But it's not very remarkable and it will never put you ahead consistently.
Nate Silver's The Signal and the Noise [1] is excellent.

[1] http://www.amazon.com/Signal-Noise-Many-Predictions-Fail--bu...
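The comment's caveat can be demonstrated directly: throwing random "metrics" into a prediction engine always improves the in-sample fit, but it does not improve — and usually hurts — out-of-sample predictions. A toy regression sketch (assumes NumPy is available; the data is synthetic):

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test = 50, 50

# True signal: y depends on a single real feature plus noise.
x = rng.normal(size=(n_train + n_test, 1))
y = 2.0 * x[:, 0] + rng.normal(scale=0.5, size=n_train + n_test)

def fit_mse(n_random):
    """Fit least squares with n_random pure-noise columns added;
    return (train MSE, test MSE)."""
    noise = rng.normal(size=(n_train + n_test, n_random))
    X = np.hstack([x, noise, np.ones((n_train + n_test, 1))])
    coef, *_ = np.linalg.lstsq(X[:n_train], y[:n_train], rcond=None)
    train_mse = np.mean((X[:n_train] @ coef - y[:n_train]) ** 2)
    test_mse = np.mean((X[n_train:] @ coef - y[n_train:]) ** 2)
    return train_mse, test_mse

for k in (0, 10, 40):
    tr, te = fit_mse(k)
    print(f"{k:2d} random metrics: train MSE {tr:.3f}, test MSE {te:.3f}")
```

The train MSE shrinks as random columns are added (the model can always "explain" more of the sample), while the test MSE grows — which is the signal-versus-noise distinction in miniature.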

HN Books is an independent project and is not operated by Y Combinator or Amazon.com.