The streetlight effect got its name from the following anecdote:
>A policeman sees a drunk searching for his keys under a lamp post and starts looking with him. After a while he stops and asks the man whether he's sure this is where he lost them. Turning around, the drunk says he has no idea, but this is the only place where he can see anything.
While this is the most common version, the anecdote is often traced back to an older tale about the Sufi sage Nasreddin, an Islamic cultural figure:
>One day, Nasreddin lost his ring in his living room. After searching for a while, he went outside to keep looking for it. Questioned by his wife about this, he answered that the room was too dark for him to find anything, while there was so much light outside.
Both stories illustrate our tendency to overfocus on what comes easily at the expense of what makes sense; on the information we have, rather than the information that is hidden from us; on why we're right, instead of why we're wrong.
The effect is consequential because it shapes how we manage risk and allocate resources. Let's take a look at some of its practical consequences.
## Distracting Injuries
Distracting injuries are so visually impactful that they pull our attention away from much more serious but less obvious wounds. Paramedics train not to fall for this mental trap, yet people still die to this day because deadly wounds went unnoticed and untreated while attention was on the distracting ones.
This illustrates the streetlight effect pretty well. What good is patching a scary wound if the victim is going to die from another? Spectacular injuries are not necessarily the most dangerous, and paramedics do their best to look for life threats before treating anything.
Maybe there's something in there for the rest of us.
## Distracting Ego
A marketing teacher once asked his students what would be the biggest competitive advantage for a hotdog stand. Students started shouting out ideas revolving around location, quality, price, taste... Once answers ran out, the teacher simply said: 'A starving crowd'.
We tend to default to things that are under our control, even if they have little to no influence over the situation compared to external factors. This is because such things are known and obvious to us, in plain sight, so to speak, and they preserve our ego from the harm that could come with uncertainty.
Take for instance the entrepreneur who starts his journey by designing a brand, picking a name, making a website, taking out a loan... All of those are external signs of having a business, and they make him *feel* like he has one, but they are not a business. In fact, none of them is necessary to test a business idea. Focusing on them, however, protects the ego from the rejection that would come with facing the market.
This is what I call **active procrastination**: doing one thing as a proxy for achieving another, without any causal link between the two, because what you're doing feels better than what you should be doing to reach your goal.
## Distracting Information
>"It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so."
>*— Mark Twain*
Another example comes from an experiment described in [[Thinking Fast and Slow by Daniel Kahneman]].
You're given the following description of an individual by his neighbor: *"Steve is very shy and withdrawn, invariably helpful but with very little interest in people or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail"*. Is Steve more likely to be a librarian or a farmer?
When asked that question, almost everyone answers that Steve is more likely to be a librarian, failing to consider the distribution of librarians and farmers in society. Because some information has been brought to our attention, we tend to consider the question exclusively in its light, *replacing* our prior knowledge when we should have updated it instead.[^1]
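A back-of-the-envelope Bayesian update makes the point concrete. All the numbers below are illustrative assumptions (not figures from the book): a 20-to-1 farmer-to-librarian base rate, and a description that fits a librarian far better than a farmer. Even then, the base rate dominates.

```python
# Toy Bayesian update for the "librarian or farmer" question.
# All numbers are illustrative assumptions, not data from Kahneman's book.

# Assumed base rates: roughly 20 farmers for every librarian.
prior_librarian = 1 / 21
prior_farmer = 20 / 21

# Assumed likelihoods: how well the "meek and tidy" description
# fits a typical member of each group.
p_desc_given_librarian = 0.9
p_desc_given_farmer = 0.1

# Bayes' theorem: P(librarian | description).
evidence = (p_desc_given_librarian * prior_librarian
            + p_desc_given_farmer * prior_farmer)
posterior_librarian = p_desc_given_librarian * prior_librarian / evidence

print(f"P(librarian | description) = {posterior_librarian:.2f}")
```

Under these assumptions the posterior comes out around 0.31: despite a description that screams "librarian", Steve is still more likely to be a farmer, because there are simply so many more farmers.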
Again, we tend to over-rely on information, recipes and techniques that are readily available or easily accessible, often neglecting crucial insights or edges.
## Distracting Categories
Software specifications are often misleading. One example from my team concerned the implementation of access control for our database. Because I made the mistake of naming the feature by its category rather than describing our needs, the person I put in charge of implementing it rushed toward pre-packaged access-control solutions.
Such ideal systems covered our needs, but they also came with a ton of overhead and at ridiculous cost (something like 30% of our revenue and a quarter's worth of refactoring, give or take).
When we boiled the specifications down to what we actually needed, it became obvious that trivial solutions were available and fit *all* of our needs. They didn't cover hypothetical future needs, of course, but they did the job, are still in use, and saved the company both time and money.
## Distracting Context
When I was a student living in a dorm in Moscow, my girlfriend at the time and I were moving out. While we were carrying our things, some acquaintances from the gym, *on their way to the gym*, greeted me and wished me good luck. They could have actually used their muscles, helped us, and gotten the same amount of exercise, but because this was outside their usual frame of reference, it didn't occur to them. I was puzzled.
## Distracting Models
>"The problem with models is that the assumptions that are necessary to simplify the model eventually come back to bite you."
>*— Rory Sutherland*
We know we overrely on given information. But the streetlight can be more than a single piece of data. It can be an entire system. A model. By their nature, models are man-made streetlights. They illuminate a specific, simplified version of the world that's based on past data, leaving the messy, unpredictable and decisive parts of the future in the dark.
At the core of models is the hidden assumption that whatever was important in the past will stay that way, pushing us to mold our forecasts based on what might already be irrelevant. This leads to solutions that would have worked yesterday but fail when confronted with tomorrow, because the future tends to be vastly different from our expectations. Why? Because of... unexpected events. This is the recipe for disastrous blind spots called [[Black Swan Event|black swan events]] (Taleb 2007).
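A toy sketch of this dynamic, with made-up numbers: a model calibrated only on well-behaved historical data sets a "worst plausible case" that the future promptly blows through. Both the history and the shock below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# "History": 30 observations of small, well-behaved daily changes.
history = rng.normal(loc=0.0, scale=1.0, size=30)

# A model built under the streetlight: assume the future
# will look statistically like the past.
mu, sigma = history.mean(), history.std()

# The model's "worst plausible day": three standard deviations down.
worst_case = mu - 3 * sigma

# The future contains an event the historical data never hinted at
# (a made-up shock standing in for a black swan).
black_swan = -15.0

print(f"model's worst case: {worst_case:.2f}")
print(f"actual event:       {black_swan:.2f}")
```

The model is not wrong about the past; it is blind to anything the past never showed it, which is exactly where the damage comes from.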
## Final Thoughts
Ultimately, the streetlight effect is a pervasive bias that comes in many shapes and forms, from a gruesome yet superficial injury to a sophisticated financial model. In every case, the same error is made: we mistake the illuminated part of the map for the entire territory.
The real challenge isn't just to look harder at what's in front of us until we work up a headache; it's to ask hard questions, challenge the status quo, and look in the dark.
>"The first rule of fishing is fish where the fish are."
>*— Charlie Munger*
[^1]: For more on this and the Bayes theorem, [check out this video by 3Blue1Brown](https://www.youtube.com/watch?v=HZGCoVF3YvM)