HN Books @HNBooksMonth

The best books of Hacker News.

Hacker News Comments on
Superforecasting: The Art and Science of Prediction

Philip E. Tetlock, Dan Gardner · 5 HN comments
HN Books has aggregated all Hacker News stories and comments that mention "Superforecasting: The Art and Science of Prediction" by Philip E. Tetlock, Dan Gardner.
View on Amazon [↗]
HN Books may receive an affiliate commission when you make purchases on sites after clicking through links on this page.
Amazon Summary
NEW YORK TIMES BESTSELLER • NAMED ONE OF THE BEST BOOKS OF THE YEAR BY THE ECONOMIST

“The most important book on decision making since Daniel Kahneman's Thinking, Fast and Slow.” —Jason Zweig, The Wall Street Journal

Everyone would benefit from seeing further into the future, whether buying stocks, crafting policy, launching a new product, or simply planning the week’s meals. Unfortunately, people tend to be terrible forecasters. As Wharton professor Philip Tetlock showed in a landmark 2005 study, even experts’ predictions are only slightly better than chance. However, an important and underreported conclusion of that study was that some experts do have real foresight, and Tetlock has spent the past decade trying to figure out why. What makes some people so good? And can this talent be taught?

In Superforecasting, Tetlock and coauthor Dan Gardner offer a masterwork on prediction, drawing on decades of research and the results of a massive, government-funded forecasting tournament. The Good Judgment Project involves tens of thousands of ordinary people—including a Brooklyn filmmaker, a retired pipe installer, and a former ballroom dancer—who set out to forecast global events. Some of the volunteers have turned out to be astonishingly good. They’ve beaten other benchmarks, competitors, and prediction markets. They’ve even beaten the collective judgment of intelligence analysts with access to classified information. They are "superforecasters."

In this groundbreaking and accessible book, Tetlock and Gardner show us how we can learn from this elite group. Weaving together stories of forecasting successes (the raid on Osama bin Laden’s compound) and failures (the Bay of Pigs) and interviews with a range of high-level decision makers, from David Petraeus to Robert Rubin, they show that good forecasting doesn’t require powerful computers or arcane methods. It involves gathering evidence from a variety of sources, thinking probabilistically, working in teams, keeping score, and being willing to admit error and change course. Superforecasting offers the first demonstrably effective way to improve our ability to predict the future—whether in business, finance, politics, international affairs, or daily life—and is destined to become a modern classic.

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this book.
People who make good forecasts are not polarized enough for punditry.

The authors of the book Superforecasting: The Art and Science of Prediction [1] have done research in this area. They also have a website [2].

Nate Silver also tells a similar story in his book. He proved it too, by turning pundit.

[1] https://www.amazon.com/Superforecasting-Science-Prediction-P...

[2] http://www.superforecasting.com/

Tolerating some amount of cognitive dissonance is one of the best things that I was able to develop in myself recently. In coding, for example, I used to suffer mental anguish over seeing inconsistencies in codebases I'm working on and had an OCD-like drive to fix them. Learning to live with those inconsistencies freed me up to work on what actually matters.

Furthermore, the unforgiving drive for consistency is a reason why people don't update their beliefs when new evidence comes to light. Consider Superforecasting [1], a book about people with an unusual ability to forecast the future: the author says that one common trait among superforecasters is that they have a larger capacity for tolerating cognitive dissonance. The drive to avoid cognitive dissonance shackles you to your existing beliefs (see confirmation bias).

[1]: http://www.amazon.com/Superforecasting-Science-Prediction-Ph...

wwweston
Yeah.

"The test of a first-rate intelligence is the ability to hold two opposed ideas in mind at the same time and still retain the ability to function." (F. Scott Fitzgerald)

I've also noticed something about the term "cognitive dissonance" in popular discussion. It's used as a pejorative explanation when the speaker can't figure out how someone else harmonizes two apparently inconsistent or contradictory positions. But when it's a speaker responding to someone else's accusation of cognitive dissonance, something like Fitzgerald's line usually comes up.

It's probably a useful concept in the context of doing actual psychology (either science or therapy), but I've come to believe that in an argument it's often a form of ad hominem attack, like most stories participants in arguments tend to spin about each other.

danieltillett
> Furthermore, the unforgiving drive for consistency is a reason why people don't update their beliefs when new evidence comes to light.

Only if you reject the evidence that does not fit with your beliefs. If you use the new evidence to examine and update your beliefs then you can be both open to new evidence and avoid cognitive dissonance.

amasad
What if the new evidence puts you at even (or close to even) odds? Then you have to live with the cognitive dissonance until further evidence arrives.
danieltillett
If you are using a Bayesian approach then all new evidence does is adjust the probability of your belief being true. Only those impervious to new information have beliefs with 0 or 1 probabilities.
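A minimal sketch of the kind of update being described here, assuming a single binary belief and made-up likelihoods (none of these numbers come from the thread):

```python
# Bayes' rule for a binary hypothesis H:
#   P(H | E) = P(E | H) * P(H) / (P(E | H) * P(H) + P(E | not H) * P(not H))
# All numbers below are illustrative assumptions.

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return the posterior probability of H after seeing the evidence."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1.0 - prior)
    return numerator / denominator

belief = 0.70  # prior: "I think H is about 70% likely"
# Evidence that is more likely if H is false than if H is true:
belief = bayes_update(belief, p_evidence_given_h=0.2, p_evidence_given_not_h=0.6)
print(round(belief, 3))  # ~0.438: the belief moves, but never snaps to 0 or 1
```

On this reading, a prior of exactly 0 or 1 is the only belief the update cannot move, which is the "impervious to new information" case.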
Dylan16807
Cognitive dissonance is not picking X sometimes and !X sometimes, because they're both around 50%.

Cognitive dissonance is saying that X is 90% certain and that !X is also 90% certain.

deciplex
There is a difference between managing probability mass and "managing" cognitive dissonance. If one of two events will occur, or only one of two hypotheses can be true, with equal probability according to the information you possess, it is not cognitive dissonance to hold that fact in your head. Cognitive dissonance would be more like formulating two hypotheses in logical contradiction to each other, and insisting they must both be true.
dancompton
It is rare that an arbitrary function negates another for all possible inputs.
deciplex
If it negates it for the majority of inputs, but you insist it negates it for very few or no inputs, I would call that cognitive dissonance as well. In other words, if it's highly unlikely that two things are both true, but you consider it very likely they are both true, that's probably cognitive dissonance even if the two propositions are not logical contradictions.
dancompton
Do you disagree that given a bag of all functions which accept N inputs, it is unlikely that 2 chosen at random would negate each other across all N inputs?
deciplex
Obviously I don't disagree with that. Where are you going with this? People don't formulate their beliefs by picking randomly from a collection of all theoretically-possible beliefs. To expand on your example, if you are selecting from a bag of beliefs according to some criteria (based on some combination of morality, desire, etc) which are themselves in conflict (e.g. "I wish to be feared; I wish to be loved") then I think you are more likely to pick some beliefs which happen to contradict each other (again, if not logically, then for most inputs) than you would just by selecting randomly. Or, probably, if your criteria are more in tune with one another.
dancompton
How do people formulate their beliefs?
amasad
I don't think they are that different. First, the basic insight from Bayes is that beliefs of all kinds can be represented as probabilities. Second, as @dancompton alluded to, most things are not so cut and dried that you can say they are logically in contradiction. There is usually a compatibilist view or a third alternative.
deciplex
If you are presented with evidence that requires a Bayesian update, and you are smart enough to figure this out, yet you still do not perform the Bayesian update (or you calculate it incorrectly according to your whims as opposed to the evidence), then I would say you are cognitively dissonant in proportion to the change in probability mass you should have recalculated but did not.

Good enough?
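Taken literally, the "in proportion to the change in probability mass" idea above could be sketched like this; the dissonance measure and all the numbers are assumptions for illustration, not anything claimed in the thread:

```python
# Hypothetical measure: dissonance = |posterior you should now hold - belief you actually keep|.
# The probabilities below are made up for illustration.

def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    num = p_evidence_given_h * prior
    return num / (num + p_evidence_given_not_h * (1.0 - prior))

held_belief = 0.9                               # strongly held prior on H
should_hold = posterior(held_belief, 0.2, 0.8)  # evidence that clearly favours not-H
unchanged = 0.9                                 # ...but the belief is left where it was

dissonance = abs(should_hold - unchanged)
print(round(should_hold, 3), round(dissonance, 3))  # ~0.692 ~0.208
```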

dancompton
So the problem is detecting when that update should be performed, right?
deciplex
You keep replying to me with one-sentence posts that say little and have no point. If you have something to say or a point you're trying to make, I'd prefer you just do that rather than trying to Socratic-method everyone to enlightenment.

To (try to) answer your question, to the best of my ability and given limited information about what your question even is: no, it is not about detecting when that update should be performed. Remember that we are talking about brain states, i.e. the-world-as-you-think-it-is, as it is represented in your mind, and perhaps, if you want to push it, as you are consciously aware of it. If you hold conflicting beliefs but you are truly ignorant of the fact, then you can't really be said to have cognitive dissonance. In Bayes terms, I guess this is analogous to having priors, but not being aware of how to evaluate them based on evidence, and maybe even that you should do so in the first place. Cognitive dissonance, on the other hand, would be having priors, knowing how (and when) to update them, doing the update, and then ignoring the results. Alternatively, it would be performing faulty updates and being at least partially aware of the error in doing so.

Natanael_L
Like a benign version of doublethink?
jules
The ability to deal with uncertainty is not the same as cognitive dissonance. There is no cognitive dissonance involved in accepting that either of two mutually contradictory propositions P and Q could be true; you just don't know yet which one is. Cognitive dissonance is when you believe that both P and Q are true at the same time. Humans have a tendency to jump to conclusions because they don't like not knowing something, and any answer is better than no answer. It's precisely those people who can't deal with uncertainty who must deal with the most cognitive dissonance when the conclusion they jumped to turns out to be wrong, particularly when they've turned that conclusion into dogma.
amasad
Everyone must jump to a conclusion at some point -- you can't possibly collect all the evidence in the world. I think it is also a case of cognitive dissonance to believe that you can do otherwise.
Dylan16807
You can pick a conclusion without acting like it's certain.

"I decided X but these pieces of evidence induce doubt" is not cognitive dissonance. It's capacity for uncertainty.

Poorboyrise
Sure, with only anecdotal reference...

During an interview, one of my answers was: "For me, it sounds shit to live ambitions!"

And the answer was like the old Adam and Eve thing: "But shit is a super fertilizer; it gives a flimsy plantlet growth and thriving."

"Because I have seen where you're going with this," I answered, "let me tell you that some people think you can make every stand, when you only ground it."

wink

pessimizer
You never have to decide that something is absolutely true regardless of any new evidence.
bonobo3000
If you are only thinking about it, yes. If you want to make a decision based on the thought process, you have to pick a side at some point, knowing that it might not be the right side.
pessimizer
Only if you think your decisions are purely based on fact, rather than thinking of your decisions as based on the best facts you have at the time.

Cognitive dissonance is believing that two contradictory theories are most likely true based on the evidence you have. Since the evidence of the truth of one is also evidence for the untruth of the other, it means that when evaluating the evidence for one, you discount a different set of evidence than you do when evaluating the evidence for the other.

This is never rational, and never something that you have to do. It's usually done to avoid conclusions that would force you (according to your own ethics) to give up something that you have, or not take something that you want.

riprowan
> This is never rational, and never something that you have to do.

Not sure I agree with that.

The real world is full of mutually exclusive moral problems, like when "do unto others" runs directly into "first do no harm" when deciding whether to help someone whose bad behavior you might be enabling.

pessimizer
The problem is continuing to hold both beliefs. You can't have two prime directives. You can't say "never do harm to anyone" and "always defend your children." You have to explicitly prioritize beliefs that have the potential to cause dilemmas.
jules
You can make decisions in the face of uncertainty based on an assessment of the risk and the probabilities of the possible outcomes. If you get in your car you put your seatbelt on without knowing whether you're going to need it for that particular ride. There's no need to "pick a side" on the question of whether you're going to have an accident on that ride. Being uncertain about something is almost the opposite of cognitive dissonance, which comes from being quite certain about contradictory propositions.
Bluestrike2
For the most part. Cognitive dissonance is the stress that results from believing P and Q are true at the same time, along with a few other scenarios, such as acting contrary to one's beliefs or being confronted with evidence that contradicts one of your beliefs (the belief disconfirmation paradigm). We then engage in dissonance reduction to arbitrate the difference and lessen the stress, one of the easiest methods of which is ignoring or rejecting the conflicting evidence, or ignoring the voices that are--literally--causing you mental pain.

Socially, belief disconfirmation is arguably the most problematic because it turns rational discourse into something very, very ugly. Instead of persuading people, evidence and debate can further entrench a given belief. When certain issues become almost identity and lifestyle statements (climate change denial, the anti-vax movement, GMO hysteria, etc.), contrary evidence is perceived as a personal attack instead of being weighed as scientific evidence to determine fact. Unfortunately, overcoming this isn't easy. And it's made that much harder when people are able to retreat into their preferred echo chambers rather than be forced to directly confront the dissonance.

Mar 28, 2016 · qwtel on New Kid on the Blockchain
If you are sure that the Ethereum project will fail, I'd be happy to take on a 1 : infinity bet. I pay you $1 if the project is no longer around, say, 10 years from now; you pay me everything you have otherwise. It would only be rational for you to take this bet, since I'm offering you a free dollar according to your beliefs.

EDIT: The intent here was to expose overconfidence and vague predictions, not to pay fan service to Ethereum or to suggest an actual bet. If anybody is interested in how to make proper predictions, I recommend the books by Philip Tetlock, especially the latest, called Superforecasting [1].

EDIT2: I wasn't aware my views were so controversial, so here is some more background: if somebody was convinced something couldn't happen, he'd assign a probability of 0 to that event. If that person wanted to act according to her beliefs, taking on bets, no matter the odds, would have positive expected utility. Since almost nobody takes on such bets, it suggests that we generally exaggerate when we say things like "impossible" or "sorry for your loss"; hence we are being overconfident. The other issue is vagueness. By not being clear about what exactly we are predicting, we leave the door open to back out of it later. In fact, Tetlock has found that, by making vague predictions, experts could later convince themselves (and others) that they were "close", skewing their sense of accuracy. Unfortunately, when subjected to a prediction tournament with strict rules, they scored no better than random [2].

[1]: http://www.amazon.com/Superforecasting-The-Art-Science-Predi...

[2]: http://www.amazon.com/Expert-Political-Judgment-Good-Know/dp...
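A back-of-the-envelope version of the expected-utility argument in the comment above; the 1% doubt and the $100,000 cap standing in for "everything you have" are hypothetical numbers, not part of the original offer:

```python
# Expected value of the offered bet from the sceptic's side:
# win $1 if the project is gone in 10 years, lose the whole stake otherwise.
# The stake and probabilities are illustrative assumptions.

def expected_value(p_project_dead, stake):
    return p_project_dead * 1.0 - (1.0 - p_project_dead) * stake

print(round(expected_value(p_project_dead=1.00, stake=100_000), 2))  # 1.0: a truly certain sceptic gains a free dollar
print(round(expected_value(p_project_dead=0.99, stake=100_000), 2))  # ~-999.01: even 1% doubt makes the bet ruinous
```

Refusing the bet therefore amounts to admitting a probability well below 1, which is the overconfidence point the comment is making.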

mootothemax
>if the project is no longer around

You're offering a nonsensical bet.

We live in a world where people often delight in keeping obscure things alive. Even http://nyan-coin.org still has a following!

As an illustration of what I mean, I'll be incredibly surprised if bitcoin ever truly dies, and the price ever truly tanks. I'd expect coins to remain above $100, as a few people will flat-out refuse to sell for less, and minimal trading will still take place. Anything in the "real" world though? naaaaah.

qwtel
Yes, I have to admit it's vague. A price below $1, or no trading at all, would be a better condition. Maybe add some trade volume requirements as well.
mootothemax
>Price below $1 or not traded at all

All it'd take is two bored people to stop this from happening, merrily trading just for their own delight.

As with bitcoin, I'd be really surprised if any of these coins stopped being used entirely, thanks to tech enthusiasts keeping them alive.

qwtel
Generally, there has to be some condition on which to judge the success of a project.
mootothemax
>Generally, there has to be some condition on which to judge the success of a project.

So why not try offering a bet on the success of a project, rather than going through the horror of trying to prove a negative?

pavel_lishin
Debating the exact rules of this bet is already not worth a dollar of anyone's time.
n_mca
That was part of the point...
mootothemax
>That was part of the point...

Not really; my point was that some things never die, however functionally useless they become.

The top comment said that Eth will become functionally useless, and the bet offered didn't involve it becoming functionally useless.

nibs
The bet would be that joosters gets a dollar for every dollar the Ethereum industry falls short of break-even, and you get one for every dollar of surplus it creates. Or you could change it to orders of magnitude, to make it more viable.
bachback
If you would like to sell me ETH put options, let me know how we can arrange that deal.
qwtel
The bet was a rhetorical device intended to show the inconsistency of holding definitive views about the future while refusing to accept bets of the form 1 : inf. I appreciate the offer though ;)
imtringued
> If you are sure that the Ethereum project will fail
> the project is no longer around

You should work on your reading comprehension. The parent never made these claims. Whose beliefs are you even talking about? Saying something is not useful and not practical is not the same as saying it will fail or stop existing.

The only thing you're doing here is psychologically manipulating someone into accepting a bet with conditions you yourself chose that are clearly in your favour.

You didn't even define what the vague phrases "the Ethereum project will fail" or "the project is no longer around" are even supposed to mean. The betting conditions are effectively "give me everything you have when I feel like it".

qwtel
> [the intent was not to] suggest an actual bet

There have been several editions of the Global Trends reports, all interesting both for what they got right and what they got wrong. Please note that the reports are not "predictions" but "forecasts". The methodology tries to identify trends and drivers that are thought to be significant in the evolving future. For a few of them, scenarios are created to explore how different interacting trends and drivers might create a future.

HN readers interested in learning more about forecasting might read Philip Tetlock's book, Superforecasting, http://www.amazon.com/Superforecasting-The-Art-Science-Predi..., which has justifiably been on many must-read lists.

For anyone who is interested in this topic, Dr. Tetlock just released a new book about it (http://www.amazon.com/Superforecasting-The-Art-Science-Predi...).

We're also hosting a public forecasting tournament for him and his team that focuses on geo-political forecasting: https://www.gjopen.com/

techstrategist
In addition, anyone interested in Tetlock's work and how to think like a superforecaster can join us at the conference this Saturday in London. https://www.eventbrite.co.uk/e/superforecasting-and-geopolit...
MathsOX
I'm about half-way through the book. Very easy, enjoyable read. Left wondering why some of these Amazon reviews try to make it sound as if it's quite academically/technically rigorous.
HN Books is an independent project and is not operated by Y Combinator or Amazon.com.