The streetlight effect has its roots in the following anecdote:
*A policeman sees a drunk looking for his keys under a lamp post and starts looking with him. After a while, he stops and asks the man whether he's sure this is where he lost them. Turning around, the drunk says he has no idea, but this is the only place where he can see anything.*
This original story is attributed to Nasreddin, an Islamic cultural figure:
*One day, Nasreddin lost his ring in the living room. After looking for a while, he goes outside to search for it. Questioned by his wife about this, he answers that the room is too dark for him to find anything, when there is so much light outside.*
Both stories illustrate our tendency to overfocus on what comes easy at the expense of what makes sense; on the information we have, rather than the information that is hidden from us; on why we're right, instead of why we're wrong.
The effect is consequential because it shapes how we manage risk and allocate resources.
## Distracting Wounds
A good example of the streetlight effect in action is distracting wounds. I learned about the concept through [[Can't Hurt Me by David Goggins]].
When rescuing people, it's easy to get distracted by the most obvious or spectacular injuries, like an open leg fracture, which is rarely fatal. Treating it first in the rush of the action, without assessing the rest, might make you miss the broken rib that punctured the victim's lung. Many people die from distracting wounds every year, even though paramedics are specifically taught not to let themselves be led astray by them.
This seemingly goes against the conventional wisdom of doing the simplest task first so that you can move forward. The trick is to recognize that some situations call for one type of prioritization — because all tasks at hand are straightforward — while others need you to [[fieldnotes/Notions/Tackle the Monkey First|differentiate between the monkey and the pedestal]], between critical wounds and distracting wounds. What good is patching a scary wound if the victim is going to die from another?
## Distracting Knowledge
>“It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” *— Mark Twain*
Another example comes from an experiment described in [[Thinking Fast and Slow by Daniel Kahneman]].
You're given the following description of an individual by his neighbor: *"Steve is very shy and withdrawn, invariably helpful but with very little interest in people or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail"*. Is Steve more likely to be a librarian or a farmer?
When asked that question, almost every single person answers that he's more likely to be a librarian, failing to consider the distribution of librarians and farmers in society. Because some information has been brought to our attention, we tend to consider the question exclusively in its light, *replacing* our prior knowledge instead of updating it.[^1]
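The base-rate reasoning most people skip can be made concrete with Bayes' theorem. The numbers below are purely illustrative assumptions (not figures from Kahneman's book): suppose farmers outnumber librarians twenty to one, and the "meek and tidy" description fits 40% of librarians but only 10% of farmers.

```python
# Illustrative base-rate sketch of the librarian/farmer question.
# All numbers are assumptions for the sake of the example.
p_librarian = 1 / 21           # prior: 1 librarian for every 20 farmers
p_farmer = 20 / 21
p_desc_given_librarian = 0.4   # assumed chance the description fits a librarian
p_desc_given_farmer = 0.1      # assumed chance it fits a farmer

# Bayes' theorem: P(librarian | description)
posterior = (p_desc_given_librarian * p_librarian) / (
    p_desc_given_librarian * p_librarian + p_desc_given_farmer * p_farmer
)
print(round(posterior, 2))  # → 0.17
```

Even with a description that fits librarians four times better, the sheer number of farmers means Steve is still far more likely to be a farmer. The vivid description should *update* the prior, not replace it.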
We tend to over-rely on information, recipes, and techniques that are readily available or easily accessible, often neglecting crucial insights or edges.
What does it look like in practice? This is how [Cate Hall claims she was able to create a massive edge for herself](https://usefulfictions.substack.com/p/how-to-be-more-agentic). She went against the poker crowd by focusing on physical reads (analyzing body language and other physical cues) when everyone else was busy with model solvers yielding tiny mathematical advantages.
Asking which relevant pieces are missing from the equation, and which irrelevant pieces or constraints have crept into it, will vastly improve your thinking and your results.
## Distracting Models
> "All models are wrong, but some are useful." *— George Box*
By definition, models are derived from past events. But crucial information in any given situation tends to be a moving target. What was relevant yesterday might not be tomorrow. Moreover, models, because they're derived from past events, emphasize what we know instead of what we don't, when it's the latter that gets you.
Models are static. They encapsulate a view of the world, a state of knowledge stuck in the past, rarely, if ever, updated. Overrelying on models makes one blind to the moving boundary between relevant and irrelevant information, between signal and noise.
[[Black Swan Events]] are born from this limitation of models; they occur precisely because something wasn't part of any model. As tempting as it is, you can't model for black swan events, either. The only thing you can really predict is that they will occur.
We've seen it happen during the 2008 subprime mortgage crisis[^2]. Industry-standard models made the mortgage market look rock solid when it was on the verge of collapse. Only a handful of outliers were able to see it coming, no thanks to any model.
As Taleb puts it in [[The Bed of Procrustes by Nassim Nicholas Taleb]], *"you can expect blowups and explosive errors in fields where there is a penalty for simplicity"*.
## Procrustean Bed
>"There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we know we don't know. But there are also unknown unknowns. There are things we don't know we don't know."
>*— Donald Rumsfeld, former United States Secretary of Defense*
Procrustes was a character from Greek mythology who earned his nickname (Procrustes means "Stretcher") through his rather peculiar sense of hospitality. He abducted travelers and fed them a generous dinner, before making them lie in his special bed. Procrustes wanted the bed to fit his guest to perfection. If the guest was too tall, he'd cut off his legs. If he was too short, he'd stretch him.
Too often, we address unknown unknowns by arbitrarily stretching or amputating reality to make it fit our expectations, rather than stretching or shrinking our expectations to fit reality.
## Feedback
Besides reading the aphorisms on the subject in [[The Bed of Procrustes by Nassim Nicholas Taleb]] and actively seeking to identify the Procrustean beds surrounding us, we need feedback.
A crucial component of the anecdote we started with is the policeman stopping to ask, in essence, "why are we looking here anyway?"
Having the people and space to ask hard questions is a requirement if we want to avoid blindspots.
Don't be a quack[^3].
[^1]: To learn more about Bayes' theorem, [check out this video by 3Blue1Brown](https://www.youtube.com/watch?v=HZGCoVF3YvM)
[^2]: An excellent investigation on the topic: [Money, Power and Wall Street by Frontline](https://www.pbs.org/wgbh/frontline/documentary/money-power-wall-street/?)
[^3]: A charlatan; someone peddling fake cures, like a snake oil salesman