Entropic Certainty
The Second Law of Thermodynamics is one of the most concrete and reliable principles in nature, yet it is not absolutely guaranteed in every case. It states that in a closed system, entropy tends to increase over time - not as a rigid rule, but as a statistical inevitability.
Strictly speaking, it’s not impossible for entropy to decrease in a closed system; it’s just so overwhelmingly unlikely, especially in macroscopic systems, that we treat it as a fundamental law.
In small systems, entropy might fluctuate briefly, but even there, the chances of a consistent decrease in entropy are astronomically low. As the system gets larger, the probability of such a decrease trends towards zero. Why?
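To make that shrinking probability concrete, here's a minimal sketch using a toy model of my own (not from the text): N gas molecules bouncing randomly in a box, where a spontaneous "entropy decrease" means all N happening to gather in the left half at the same moment. Each molecule is equally likely to be in either half, so the probability is (1/2)^N.

```python
# Toy model: probability that all N randomly-moving molecules
# are found in the left half of a box at once.
# P = (1/2) ** N, which collapses toward zero as N grows.
for n in [10, 100, 1000]:
    print(f"N = {n:>4}: P = {0.5 ** n:.3e}")
```

Even at N = 1000 (a laughably tiny system next to the ~10^23 molecules in a real gas), the probability is already below 10^-300.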
Well, in short, there are absurdly more ways that a system can be "disordered" than "ordered." For a visual example, imagine dropping an egg and watching its shell shatter, its contents erupt, and a mess form. There are an incredible number of possible messes you can get from dropping an egg.
Most of those messy outcomes look more or less similar - a splatter here instead of there - but they are all distinct disordered states nonetheless.
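The counting behind this can be sketched with a coin-flip analogy of my own (an illustration, not the essay's example): treat "all 100 coins heads" as the one perfectly ordered arrangement, and "50 heads, 50 tails in any order" as the typical disordered outcome. Counting arrangements with binomial coefficients shows how lopsided the comparison is.

```python
from math import comb

# Ordered: every one of 100 coins lands heads — only 1 arrangement.
ordered = comb(100, 100)

# Disordered: 50 heads and 50 tails, in any order — roughly 10^29
# distinct arrangements, all of which "look the same" at a glance,
# just as one egg splatter looks much like another.
disordered = comb(100, 50)

print(f"ordered arrangements:    {ordered}")
print(f"disordered arrangements: {disordered}")
```

A system wandering randomly among its arrangements is therefore overwhelmingly likely to be found in a disordered one, simply because that's where almost all the arrangements are.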
In contrast, how mind-bogglingly rare would it be to see an egg unbreak? Can you imagine tossing the yolk, whites, and shattered shell into the air and watching the egg magically leap back together, reassembling perfectly into its original form?
You know such a result is unfathomably unlikely, yet all you need to do is reverse a video of the egg breaking to see this happen. If you could somehow reverse the physics of what occurred when the egg broke, you would get to witness it unbreak in person.
So, while the unbreaking of the egg is technically possible, for all practical purposes, it might as well be impossible. Let’s call this entropic certainty: something so outrageously and overwhelmingly likely (or unlikely) that it might as well be a certainty.
---
So now we can say that we’ve divided truth into two categories: the things we know to be true, and the things we know to be true with extremely high probability - i.e., entropic certainty.
However, one could argue that even what we believe to be 100% certain is actually just another instance of entropic certainty. That is, anything that has always been correct isn’t guaranteed to remain correct; it’s simply never been proven wrong - yet.
We might estimate, based on its long-standing and consistent efficacy, that there’s a 99.999… (repeating to a googolplex) percent chance our assertion is true.