This is a unique book on how probability affects our everyday lives. It guides the reader on an almost chronological trip through the fascinating and amazing laws of chance, omnipresent in the natural world and in our daily lives. Along the way many intriguing topics are discussed. These include challenging probability paradoxes, "paranormal" coincidences, game odds, and causes and effects. Finally the author discusses the possibilities and limitations of learning the laws of a Universe immersed in chance events. This charming book, with its many easy-to-follow mathematical examples, will inform and entertain the scientist and non-scientist alike.

clear enough: when do we consider a sequence of experiments to be sufficiently large to ensure an acceptable estimate of a mathematical expectation? The notion of confidence interval is going to supply us with an answer. Consider the necklace-weighing example. What is an 'acceptable estimate' of the necklace weight? It all depends on the usefulness of our estimate. If we want to select pearl necklaces, maybe we can tolerate a deviation of 2 g between our

p)/n. (Note once again the division by n.) We thus arrive at a vicious circle: in order to estimate p, we need to know the standard deviation, which depends on p. The easiest way out of this vicious circle is to place ourselves in a worst-case situation of the standard deviation. Given its formula, it is easy to check that the worst case (largest spread) occurs for p = 0.5. Since we saw that a 95% confidence level corresponds to two standard deviations, we have

2 × √(0.25/n) = 1/√n.

We just
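The worst-case margin 1/√n can be checked by simulation. The following sketch (my own illustration, not from the book) estimates a proportion p from n Bernoulli trials many times and counts how often the estimate falls within 1/√n of the true value; the coverage should be about 95% when p = 0.5 and higher otherwise, precisely because 0.25 is the worst case of p(1 − p):

```python
import random

def coverage(p, n, trials=20_000):
    """Fraction of experiments where the estimate p_hat falls within
    the worst-case 95% margin 1/sqrt(n) of the true proportion p."""
    margin = 1 / n ** 0.5  # two worst-case standard deviations
    hits = 0
    for _ in range(trials):
        # Estimate p as the observed frequency in n Bernoulli trials.
        p_hat = sum(random.random() < p for _ in range(n)) / n
        if abs(p_hat - p) <= margin:
            hits += 1
    return hits / trials

random.seed(0)
print(coverage(0.5, 400))  # close to 0.95 (p = 0.5 is the worst case)
print(coverage(0.1, 400))  # noticeably above 0.95 (narrower true spread)
```

Because the margin is built for the worst case, it is conservative for every other value of p.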

determine the ratio between the number of cases favorable to the event and the total number of possible cases. Denoting the probability of an event by P(event), this gives

P(event) = (number of favorable cases) / (number of possible cases).

Let us assume that in the throw of a die we wish to determine the probability of obtaining a certain face (showing upwards). There are 6 possible outcomes or elementary events (the six faces), which we represent by the set {1, 2, 3, 4, 5, 6}. These are referred to
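This counting rule translates directly into code. A minimal sketch (my own illustration of the favorable/possible ratio, not code from the book), using exact fractions so that 1/6 stays 1/6:

```python
from fractions import Fraction

def probability(favorable, possible):
    """Classical probability: number of favorable cases over
    number of possible cases, as an exact fraction."""
    return Fraction(len(favorable), len(possible))

die = {1, 2, 3, 4, 5, 6}           # the six elementary events
print(probability({3}, die))        # a single face: 1/6
print(probability({2, 4, 6}, die))  # an even face: 1/2
```

The definition assumes all elementary events are equally likely, which holds for a fair die.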

longer observed? The modern decoherence theory presents an explanation consistent with experimental results, according to which for sufficiently large particles, where sufficient means that an interaction with other particles in the surrounding space is inevitable, the state superposition is no longer preserved. The intrinsically random nature of quantum phenomena was a source of embarrassment for many famous physicists. It was even suggested that instead of probability waves, there existed in the

practically impossible to prove whether or not a given sequence is algorithmically random. Incidentally, note that a minimal program s∗ must itself be algorithmically random. If it were not, there would then exist another program s∗∗ outputting s∗, and we would be able to generate s with a program instructing the generation of s∗ with s∗∗, followed by the generation of s with s∗. The latter program would only have a few more bits than s∗∗ (those needed for storing s∗ and running it);
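Although the true minimal program length (Kolmogorov complexity) is uncomputable, any compressor gives an upper bound on it: a sequence that compresses well certainly has a short description, while an algorithmically random one should not compress at all. A small sketch of this idea (my own illustration, using zlib as a stand-in compressor, not a method from the book):

```python
import random
import zlib

def compressed_size(s: bytes) -> int:
    """Length of a zlib-compressed description of s: an upper bound on
    (not the true, uncomputable) algorithmic complexity of s."""
    return len(zlib.compress(s, 9))

structured = b"01" * 5000  # a highly regular 10,000-byte sequence
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(10_000))  # looks random

print(compressed_size(structured))  # tiny: a short program suffices
print(compressed_size(noisy))       # near 10,000: barely compressible
```

A failure to compress never proves randomness, of course; it only shows that this particular compressor found no shorter description.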