The streetlight effect got its name from the following anecdote:
*A policeman sees a drunk man searching for his keys under a lamppost and starts looking with him. After a while, he stops and asks the man whether he's sure this is where he lost them. Turning around, the drunk says he has no idea, but this is the only place where he can see anything.*
The original story is attributed to Nasreddin, a figure of Islamic folklore:
*One day, Nasreddin loses his ring in the living room. After searching for a while, he goes outside to look for it. When his wife asks why, he answers that the room is too dark to find anything, while there is plenty of light outside.*
Both stories illustrate our tendency to focus on what comes easily at the expense of what makes sense; on the information we have, rather than the information that is hidden from us; on why we're right, instead of why we're wrong.
This tendency is consequential because it shapes how we manage risk and allocate resources. Let's look at some of its practical consequences.
## Distracting Injuries
Distracting injuries are so visually striking that they pull our attention away from much more serious but less obvious wounds. Paramedics train not to fall for this mental trap, yet people still die to this day because deadly wounds went unnoticed and untreated while attention was fixed on a distracting one.
This illustrates the streetlight effect pretty well. What good is patching a scary wound if the victim is going to die from another? Spectacular injuries are not necessarily the most dangerous, and paramedics do their best to look for life threats before treating anything else.
Maybe there's something in there for the rest of us.
## Distracting Ego
A marketing teacher once asked his students what the biggest competitive advantage for a hot dog stand would be. Answers came flying, with students shouting out ideas revolving around location, quality, price, taste... Once the answers ran out, the teacher simply said: "A starving crowd."
We tend to default to things that are under our control, even if they have little to no influence over the situation compared to external factors. Such things are known and obvious to us, in plain sight, so to speak, and they shield our ego from the harm that could come with uncertainty.
Take, for instance, the entrepreneur who starts his journey by designing a brand, picking a name, building a website, taking out a loan... All of these are external signs of having a business, and they make him *feel* like he has one, but they are not a business. In fact, none of them are necessary to test a business idea. Focusing on them, however, protects the ego from the rejection that would come with facing the market.
This is what I call **active procrastination**: doing one thing as a proxy for another without a causal link between the two, because what you're doing feels better than what you should be doing to achieve your goal.
## Distracting Information
>“It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” *— Mark Twain*
Another example comes from an experiment described in [[Thinking Fast and Slow by Daniel Kahneman]].
You're given the following description of an individual by his neighbor: *"Steve is very shy and withdrawn, invariably helpful but with very little interest in people or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail"*. Is Steve more likely to be a librarian or a farmer?
When asked that question, almost everyone answers that he's more likely to be a librarian, failing to consider how many more farmers than librarians there are in society. Because some information has been brought to our attention, we tend to consider the question exclusively in its light, *replacing* our prior knowledge when we should be updating it instead.[^1]
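To make the mistake concrete, here is a back-of-the-envelope Bayesian sketch. The numbers are illustrative assumptions on my part, not from the book: suppose there are 20 farmers for every librarian, and suppose the description fits a librarian four times better than a farmer.

```latex
% Illustrative Bayesian update; the prior odds and the likelihood
% ratio below are assumed numbers, not taken from the source.
%
% Prior odds, assuming 20 farmers for every librarian:
\frac{P(\text{librarian})}{P(\text{farmer})} = \frac{1}{20}
%
% Assumed likelihood ratio: the description is 4 times more likely
% to fit a librarian than a farmer:
\frac{P(\text{description} \mid \text{librarian})}
     {P(\text{description} \mid \text{farmer})} = 4
%
% Bayes' rule: posterior odds = prior odds times likelihood ratio
\frac{P(\text{librarian} \mid \text{description})}
     {P(\text{farmer} \mid \text{description})}
  = \frac{1}{20} \times 4 = \frac{1}{5}
```

Under these assumptions, even a description that strongly favors "librarian" still leaves Steve five times more likely to be a farmer. The evidence should shift the prior, not overwrite it.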
Again, we tend to overrely on information, recipes, and techniques that are readily available or easily accessible, often neglecting crucial but less visible insights and edges.
## Distracting Models
>"The problems with models is that the assumptions that are necessary to simplify the model eventually come back to bite you."
>*— Rory Sutherland*
So, we overrely on given information. But the streetlight can be more than a single piece of data. It can be an entire system. A model. By their nature, models are man-made streetlights. They illuminate a specific, simplified version of the world that's based on past data, leaving the messy, unpredictable and decisive parts of the future in the dark.
At the core of every model is the hidden assumption that whatever mattered in the past will keep mattering, pushing us to mold our forecasts around what might already be irrelevant. This leads to solutions that would have worked yesterday but fail when confronted with tomorrow, because the future tends to be vastly different from our expectations. Why? Because of... unexpected events.
In short, this is the recipe for disastrous blind spots called [[Black Swan Events|black swan events]] (Taleb 2007).
Such events owe their name to a deceptively simple idea. People tend to think black swans are impossible because all the swans they've ever seen were white; and with every white swan they see, their confidence in the impossibility of a black swan grows. But a single black swan is enough to wipe out everything that was built on the assumption that all swans are white.
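One classical way to formalize that growing (and misplaced) confidence, offered here as my own illustration rather than anything from Taleb, is Laplace's rule of succession:

```latex
% Laplace's rule of succession (illustrative, not from the source):
% after observing n white swans and zero black ones, starting from a
% uniform prior, the probability that the next swan is also white is
P(\text{next swan is white} \mid n \text{ white swans observed})
  = \frac{n+1}{n+2}
% Confidence climbs with every white swan (2/3, 3/4, 4/5, ...)
% but never reaches 1: the data can never rule the black swan out.
```

Confidence rises with every observation, yet certainty is never warranted; the model quietly treats "never seen" as "cannot happen."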
We've seen it happen during the 2008 subprime mortgage crisis[^2], which many had deemed impossible. Industry-standard models made the mortgage market look rock solid when it was on the verge of collapse. Only a handful of outliers saw it coming, and no thanks to any model.
>"You can expect blowups and explosive errors in fields where there is a penalty for simplicity."
>*— Taleb (2010)*
Overrelying on models makes one blind to the moving boundary between relevant and irrelevant information, between signal and noise.
This raises a question: can we make predictions about what we did not observe based on what we did? Two observations make this puzzling: (1) how terrible we are at predicting the future, and (2) how obsessed we are with predicting it anyway.
> "All models are wrong, some are useful." *— George Box*
## Final Thoughts
Ultimately, the streetlight effect is a pervasive pattern that comes in many shapes and forms, from a gruesome yet superficial injury to a sophisticated financial model. In every case, the same error is made: we mistake the illuminated part of the map for the entire territory.
The real challenge isn't just to look harder at what's in front of us until we work up a headache; it's to have the audacity to ask hard questions, challenge the status quo, and look in the dark.
>"The first rule of fishing is fish where the fish are."
>*— Charlie Munger*
[^1]: For more on this and Bayes' theorem, [check out this video by 3Blue1Brown](https://www.youtube.com/watch?v=HZGCoVF3YvM)
[^2]: An excellent investigation on the topic: [Money, Power and Wall Street by Frontline](https://www.pbs.org/wgbh/frontline/documentary/money-power-wall-street/)