* The fallacy of parsimony
I think statements like these are fairly common:
While these claims are temptingly easy to digest, they paint an overly simplistic model of the world: X leads to Y. Of course the real world is often much more complex: many factors lead to Y, and we just happen to be aware of X.
Jonathan Haidt calls this the fallacy of parsimony: we choose a simpler explanation (e.g. X causes Y) over the more complex one because it’s easier for us to understand. But that doesn’t make it correct.
Just because you think of a reason first doesn’t mean it’s the right reason or the entire reason. It’s probably just one reason among many.
There’s some neat mental jiu-jitsu you can use to overcome this: every time you have an explanation for something, preface it with “One of the reasons is….”
* Kaizen
Popularized by Toyota in the mid-20th century, kaizen (in its English usage) stands for “continuous improvement.”
It means that even when things are pretty good, there is still room for improvement. You aren’t satisfied with “good enough.”
Good enough operates under the assumption that things will stay as they are - what works today will work tomorrow. Which works, until it doesn’t.
When the world - or what we want out of the world - changes, then good enough quickly becomes inadequate. For example, if a business is satisfied with its sales this year, but then a competitor swoops in - then what? Or if we’re pretty satisfied with our job, but then we get laid off - then what?
It’s easy to get complacent with a good enough mentality. Kaizen is a nice reminder that there are always a few things we could be doing better.
* The half-life of facts
We take many things to be true, such as how many planets there are in the solar system, or how many states there are in the U.S.
The problem is that not all of these things stay true over time. When I was born, there were nine planets in the solar system - now there are eight, after Pluto was reclassified in 2006. Things change, and the facts change along with them.
Some facts expire faster than others, so it’s good to get a sense of which types of facts have shorter or longer half-lives. For example, scientific laws? Long half-life. The GDP growth rate of China? Short.
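The half-life metaphor can be made concrete with the same exponential decay used in physics. This is only an illustrative sketch - the 45-year half-life below is a hypothetical number, not one claimed in this piece - but it shows how quickly a body of knowledge erodes once you assume a decay rate:

```python
def fraction_still_true(years, half_life):
    # Exponential decay: after each half-life, half of the remaining
    # "facts" in a field have been revised or overturned.
    return 0.5 ** (years / half_life)

# Assuming a hypothetical 45-year half-life for some field:
print(fraction_still_true(45, 45))   # one half-life: 0.5 remains
print(fraction_still_true(90, 45))   # two half-lives: 0.25 remains
```

Even a long half-life compounds: over a career, a meaningful fraction of what you learned at the start will no longer be true at the end.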
Because facts have half-lives, as Shane Parrish puts it, “we can never be too sure of what we know.” We have to be skeptical of our own beliefs.
Peter Attia takes this to its logical conclusion: “if you are anchored to being right, you will get hosed.”
Thanks for reading,
 Suggested by Sanjay Bakshi on The Knowledge Project.
 The fancy word for this is “fallibilism.”