Independence is a fundamental notion in probability theory, as in statistics and the theory of stochastic processes.
Two events are independent, statistically independent, or stochastically independent[1] if the occurrence of one does not affect the probability of occurrence of the other (equivalently, does not affect the odds). Similarly, two random variables are independent if the realization of one does not affect the probability distribution of the other.
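Concretely, independence is usually captured by a factorization condition. A standard formulation (paraphrasing the linked Wikipedia article, not quoted from this post) is:

```latex
% Events A and B are independent iff their joint probability factors:
P(A \cap B) = P(A)\,P(B)

% Random variables X and Y are independent iff, for all x and y,
% their joint distribution factors into the marginals:
P(X \le x,\; Y \le y) = P(X \le x)\,P(Y \le y)
```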
When dealing with collections of more than two events, a weak and a strong notion of independence need to be distinguished. The events are called pairwise independent if any two events in the collection are independent of each other, while saying that the events are mutually independent (or collectively independent) intuitively means that each event is independent of any combination of other events in the collection. A similar notion exists for collections of random variables.
The name "mutual independence" (same as "collective independence") seems the outcome of a educational choice, merely to distinguish the stronger notion from "pairwise independence" which is a weaker notion. In the advanced literature of probability theory, statistics, and stochastic processes, the stronger notion is simply named independence with no modifier. It is stronger since independence implies pairwise independence, but not the other way around.
https://en.wikipedia.org/wiki/Independence_(probability_theory)