> A policeman sees a drunk looking for his keys under a lamp post and starts looking with him. After a while he stops and asks the man whether he's sure this is where he lost them. The drunk turns around and says he has no idea, but this is the only place he can see anything.
The anecdote seems to come from an older tale about the Sufi sage Nasreddin:
> One day, Nasreddin lost his ring in the living room. After searching for a while, he went outside to look for it. When his wife asked why, he answered that the room was too dark for him to find anything, when there was so much light outside.
Both stories illustrate our tendency to overfocus on the obvious at the expense of what makes sense; on readily available information and consensus rather than noisy signals; on why we're right instead of why we're wrong.
Distracting injuries are so visually striking that they monopolize attention at the expense of others. They illustrate the point vividly. Paramedics train not to fall for this trap, yet hundreds of people die every year because minor but visually striking injuries were treated while deadly ones went unattended.
> “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” _— Mark Twain_
I found another instance of the effect at work in an experiment described by Daniel Kahneman in his book *Thinking, Fast and Slow*.
Given the following description of an individual by his neighbor: _"Steve is very shy and withdrawn, invariably helpful but with very little interest in people or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail"_. Is Steve more likely to be a librarian or a farmer?
Almost everyone answers that he's more likely to be a librarian. Yet there are far more farmers than librarians. Because this distribution wasn't mentioned, and even though any SWAG[^1] would make it obvious, we jump on the information at hand and analyze the question exclusively in its light, _replacing_ our current knowledge when we should have updated it instead[^2]. All it takes is a question about the question itself[^3].
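To make the update concrete, here is a minimal sketch of the Bayesian reasoning the question calls for. All numbers are hypothetical illustrations, not figures from Kahneman's text: suppose there are roughly 20 farmers for every librarian, and that the neighbor's description fits 40% of librarians but only 10% of farmers.

```python
# Hypothetical base rates and likelihoods -- illustrative numbers only.
prior = {"librarian": 1, "farmer": 20}          # ~20 farmers per librarian
likelihood = {"librarian": 0.4, "farmer": 0.1}  # P(description | occupation)

# Bayes' rule: posterior is proportional to prior * likelihood
unnormalized = {k: prior[k] * likelihood[k] for k in prior}
total = sum(unnormalized.values())
posterior = {k: v / total for k, v in unnormalized.items()}

print(posterior)
# Even though the description "sounds like" a librarian,
# the base rate makes "farmer" the far more probable answer.
```

The point survives any reasonable choice of numbers: unless the description is overwhelmingly diagnostic, the base rate dominates, which is exactly what jumping on the information at hand makes us forget.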
Pre-processed information, recipes and formulas hinder our ability to think.
> "There is an unmistakeable difference between a bag of rabbit parts and a rabbit."
> _— Ryan Nance_
In the same way, there's an unmistakeable difference between the ability to follow recipes and the ability to cook.
So we know we tend to overrely on *given* information.
But the streetlight can be more than a single piece of data. It can be an entire system. A financial model.
By their nature, models are man-made streetlights. Would you cross the road looking only at the traffic lights and never at the cars?
Why should financial decision-making be any different? And yet the housing market was booming right before the subprime crisis hit in 2008, because the indicators looked great.
> "The problem with models is that the assumptions that are necessary to simplify the model eventually come back to bite you." *— Rory Sutherland*
Poking and groping in the dark feels uncomfortable; but why should we expect anything else when dealing with uncertainty?
> "The first rule of fishing is fish where the fish are."
> *— Charlie Munger*
[^1]: Scientific wild-ass guess
[^2]: Which is known as Bayesian epistemology by the mathematically inclined
[^3]: Some thoughts about the role of AI in metacognition: [[Practical Questions about Practical Questions]]