Encountered a rather strange definition of a “miracle” in Shermer’s book. He says a miracle is an event with one-in-a-million odds of happening.
Littlewood defines a miracle as an exceptional event of special significance occurring at a frequency of one in a million. He assumes that during the hours in which a human is awake and alert, they will see or hear one “event” per second, which may be either exceptional or unexceptional. Additionally, Littlewood supposes that a human is alert for about eight hours per day.
Under these suppositions, a human will experience about one million events in 35 days. Accepting this definition of a miracle, one can expect to observe roughly one miraculous event every 35 days on average – and therefore, according to this reasoning, seemingly miraculous events are actually commonplace.
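For what it’s worth, the “35 days” figure does follow from his own numbers. Here’s a quick sanity check of the arithmetic (the constants are just Littlewood’s stated assumptions, not anything I’m endorsing):

```python
# Sanity-checking Littlewood's arithmetic using his own assumed figures.
EVENTS_PER_SECOND = 1           # his assumption: one event per second while alert
ALERT_HOURS_PER_DAY = 8         # his assumption: eight alert hours per day
EVENTS_PER_MIRACLE = 1_000_000  # his definition: one miracle per million events

events_per_day = EVENTS_PER_SECOND * ALERT_HOURS_PER_DAY * 3600
days_per_miracle = EVENTS_PER_MIRACLE / events_per_day

print(events_per_day)               # 28800 events per day
print(round(days_per_miracle, 1))   # 34.7 days, which he rounds to "about 35"
```

So the math is internally consistent; my problem is with the premises, not the division.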
Why is this even a law? It’s full of assumptions. How does Littlewood know that a human sees or hears 1 event/sec? What if a second passes with no event at all?
And what about that “alert for about eight hours per day” figure? What about the rest of the day? What counts as being “alert”?
By the way, this is a debate topic. I’m not asking for a definition of what a miracle is; I’ve already checked Merriam-Webster. I’m trying to figure out if anyone agrees with me that this law is bullshit, and that deriving a definition from it and using it in a book is an even more bullshit move. Note that Shermer didn’t reference this law in the book; he just gave the definition.