Introduction to Probability
Sample Spaces
Any action with a random outcome is called a random experiment.
The sample space of a random experiment is the set of all possible outcomes, denoted $S$.
A sample space can be discrete (e.g. flipping a coin) or continuous (e.g. measuring a weight).
For any events $A$ and $B$ in $S$:
- The union $A \cup B$ is the set of all outcomes in $S$ that are in either $A$ or $B$;
- The intersection $A \cap B$ is the set of all outcomes that are in both $A$ and $B$;
- The complement of $A$ (sometimes denoted $A^c$ or $\bar{A}$) is the set of all outcomes in $S$ that are not in $A$.
If $A$ and $B$ don't have any common outcomes, they are mutually exclusive: $A \cap B = \varnothing$, the empty set.
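As a quick sketch, these set operations can be checked directly with Python sets; the die-rolling sample space and the particular events below are illustrative assumptions, not taken from the notes.

```python
# Sample space for rolling a fair six-sided die (illustrative choice).
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # event: the roll is even
B = {4, 5, 6}   # event: the roll is at least 4

print(A | B)                  # union A ∪ B        -> {2, 4, 5, 6}
print(A & B)                  # intersection A ∩ B -> {4, 6}
print(S - A)                  # complement of A    -> {1, 3, 5}
print(A & (S - A) == set())   # A and its complement are mutually exclusive -> True
```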
Counting Techniques
Addition Rule
Consider a job that can be done on one of two independent machines: the first machine OR the second machine. If the first machine can do the job in $n_1$ ways and the second machine can do it in $n_2$ ways, then the job can be done in $n_1 + n_2$ ways.
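As a tiny numerical sketch with made-up counts, suppose the first machine can do the job in 3 ways and the second in 4 ways.

```python
n1, n2 = 3, 4      # ways for machine 1 and machine 2 (assumed values)
total = n1 + n2    # the job runs on one machine OR the other, so the counts add
print(total)       # 7
```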
Multiplication Rule
A job that can be done in a $k$-stage procedure can be modeled as having $k$ bags, with $n_1$ items in the first bag, ..., $n_k$ items in the $k$-th bag. A $k$-stage process is a process for which:
- there are $n_1$ possibilities at stage 1;
- regardless of the 1st outcome, there are $n_2$ possibilities at stage 2;
- regardless of the previous outcomes, there are $n_k$ choices at stage $k$.
Then there are $n_1 \times n_2 \times \cdots \times n_k$ total ways the process can turn out.
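A short sketch of the multiplication rule, with assumed counts for a 3-stage process (3 choices at stage 1, 2 at stage 2, 4 at stage 3).

```python
import math

stage_counts = [3, 2, 4]          # n1, n2, n3 (assumed values)
total = math.prod(stage_counts)   # n1 * n2 * ... * nk
print(total)                      # 24 distinct ways the process can turn out
```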
Ordered and Unordered Samples
Ordered Samples
Two different scenarios: with replacement or without replacement.
Sampling With Replacement (order important):
- If there are $n$ possible outcomes and repetitions are allowed, then there are $n^r$ ordered samples of size $r$.
Sampling Without Replacement (order important):
- This results in $n \times (n-1) \times \cdots \times (n-r+1) = \frac{n!}{(n-r)!}$ ordered samples of size $r$.
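Both ordered-sample counts can be verified by brute-force enumeration with the standard library; the values of $n$ and $r$ below are illustrative.

```python
import math
from itertools import permutations, product

n, r = 5, 3
# With replacement: n**r ordered samples of size r.
print(n ** r, len(list(product(range(n), repeat=r))))          # 125 125
# Without replacement: n!/(n-r)! ordered samples of size r.
print(math.perm(n, r), len(list(permutations(range(n), r))))   # 60 60
```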
Unordered Samples
In this case we use the binomial coefficient $\binom{n}{r} = \frac{n!}{r!\,(n-r)!}$, the number of unordered samples of size $r$ drawn without replacement from $n$ outcomes.
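The same kind of check works for unordered samples, with the same illustrative $n$ and $r$.

```python
import math
from itertools import combinations

n, r = 5, 3
# Unordered samples without replacement: n! / (r! * (n - r)!).
print(math.comb(n, r), len(list(combinations(range(n), r))))   # 10 10
```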
Probability of an Event
For situations where we have a random experiment with exactly $N$ possible **mutually exclusive, equally likely** outcomes, we can assign a probability to an event $A$ by counting the number of outcomes that correspond to $A$. If the count is $m$, then $P(A) = m/N$.
The probability of each individual outcome is thus $1/N$.
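For instance, under the equally likely model, the probability that two fair dice sum to 7 can be found by counting outcomes; the two-dice setup is an illustrative assumption.

```python
from itertools import product

# The 36 equally likely outcomes of rolling two fair dice.
outcomes = list(product(range(1, 7), repeat=2))
N = len(outcomes)

# Event A: the two dice sum to 7.
m = sum(1 for a, b in outcomes if a + b == 7)
print(m, N, m / N)   # 6 36 0.1666...
```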
Axioms of Probability
- For any event $A$, $0 \le P(A) \le 1$.
- For the complete sample space $S$, $P(S) = 1$.
- For the empty event $\varnothing$, $P(\varnothing) = 0$.
- For two mutually exclusive events $A$ and $B$, the probability that $A$ or $B$ occurs is $P(A \cup B) = P(A) + P(B)$.
Since $A \cup A^c = S$ and $P(S) = 1$, and since $A$ and $A^c$ are mutually exclusive, $P(A) + P(A^c) = 1$, so $P(A^c) = 1 - P(A)$.
General Addition Rule
For any two events $A$ and $B$, $P(A \cup B) = P(A) + P(B) - P(A \cap B)$. When $A$ and $B$ are mutually exclusive, $P(A \cap B) = 0$ and this reduces to $P(A \cup B) = P(A) + P(B)$.
If there are more than two events, the rule expands by inclusion-exclusion; for three events, $P(A \cup B \cup C) = P(A) + P(B) + P(C) - P(A \cap B) - P(A \cap C) - P(B \cap C) + P(A \cap B \cap C)$.
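Both the two-event and three-event forms can be checked numerically on the two-dice sample space; the particular events $A$, $B$, and $C$ are illustrative choices.

```python
from fractions import Fraction
from itertools import product

outcomes = set(product(range(1, 7), repeat=2))   # two fair dice, 36 equally likely outcomes

def P(E):
    """Probability of an event (a set of outcomes) under the equally likely model."""
    return Fraction(len(E), len(outcomes))

A = {o for o in outcomes if o[0] == 6}           # first die shows 6
B = {o for o in outcomes if o[1] == 6}           # second die shows 6
C = {o for o in outcomes if sum(o) == 7}         # the dice sum to 7

# Two events: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
print(P(A | B) == P(A) + P(B) - P(A & B))        # True
# Three events: include all pairwise intersections and the triple intersection.
print(P(A | B | C) ==
      P(A) + P(B) + P(C) - P(A & B) - P(A & C) - P(B & C) + P(A & B & C))   # True
```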
Conditional Probabilities and Independent Events
Any two events $A$ and $B$ satisfying $P(A \cap B) = P(A)\,P(B)$ are said to be independent; this is a purely mathematical definition, but it agrees with the intuitive notion of independence in simple examples.
When events are not independent, we say that they are dependent or conditional.
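For example, on the two-dice sample space the events "first die is even" and "second die is even" satisfy the definition; this particular pair of events is an illustrative choice.

```python
from fractions import Fraction
from itertools import product

outcomes = set(product(range(1, 7), repeat=2))   # two fair dice

def P(E):
    return Fraction(len(E), len(outcomes))

A = {o for o in outcomes if o[0] % 2 == 0}       # first die is even
B = {o for o in outcomes if o[1] % 2 == 0}       # second die is even

# Independence: P(A ∩ B) equals P(A) * P(B) (both sides are 1/4 here).
print(P(A & B) == P(A) * P(B))                   # True
```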
Conditional Probability
The conditional probability of an event $B$, given that another event $A$ has occurred, is defined as $P(B \mid A) = \frac{P(A \cap B)}{P(A)}$, provided $P(A) > 0$.
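A direct computation of a conditional probability on the two-dice sample space; the events are again illustrative.

```python
from fractions import Fraction
from itertools import product

outcomes = set(product(range(1, 7), repeat=2))   # two fair dice

def P(E):
    return Fraction(len(E), len(outcomes))

A = {o for o in outcomes if sum(o) >= 10}        # the dice sum to at least 10
B = {o for o in outcomes if o[0] == 6}           # first die shows 6

# P(B | A) = P(A ∩ B) / P(A)
print(P(A & B) / P(A))   # 1/2: of the 6 outcomes with sum >= 10, 3 start with a 6
```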
Law of Total Probability
If $A_1, A_2, \ldots, A_k$ are mutually exclusive and exhaustive (i.e. $A_i \cap A_j = \varnothing$ for all $i \neq j$, and $A_1 \cup A_2 \cup \cdots \cup A_k = S$), then for any event $B$:
$P(B) = \sum_{i=1}^{k} P(B \mid A_i)\, P(A_i)$.
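A sketch of the law with an assumed two-urn setup; the urn contents and selection probabilities are made-up numbers for illustration.

```python
# Partition of the sample space: which urn the ball is drawn from.
P_A = {"urn1": 0.5, "urn2": 0.5}                  # P(A_i): each urn chosen equally often
P_B_given_A = {"urn1": 3 / 10, "urn2": 7 / 10}    # P(B | A_i): chance of drawing a red ball

# Law of total probability: P(B) = sum over i of P(B | A_i) * P(A_i)
P_B = sum(P_B_given_A[i] * P_A[i] for i in P_A)
print(P_B)   # 0.5
```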
Bayes' Theorem
Terminology
- $P(\text{hypothesis})$ is the probability of the hypothesis being true prior to the experiment (called the prior);
- $P(\text{hypothesis} \mid \text{data})$ is the probability of the hypothesis being true once the experimental data is taken into account (called the posterior);
- $P(\text{data} \mid \text{hypothesis})$ is the probability of the experimental data being observed assuming that the hypothesis is true (called the likelihood).
With this terminology, Bayes' Theorem states that $P(\text{hypothesis} \mid \text{data}) = \frac{P(\text{data} \mid \text{hypothesis})\, P(\text{hypothesis})}{P(\text{data})}$, where the denominator $P(\text{data})$ can be computed with the law of total probability.
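Continuing the assumed two-urn sketch from the law of total probability, Bayes' Theorem inverts the conditioning to give the posterior probability of each urn once a red ball has been drawn.

```python
P_A = {"urn1": 0.5, "urn2": 0.5}                  # priors P(hypothesis)
P_B_given_A = {"urn1": 3 / 10, "urn2": 7 / 10}    # likelihoods P(data | hypothesis)

# Evidence P(data), computed with the law of total probability.
P_B = sum(P_B_given_A[i] * P_A[i] for i in P_A)

# Posterior: P(hypothesis | data) = P(data | hypothesis) * P(hypothesis) / P(data)
posterior = {i: P_B_given_A[i] * P_A[i] / P_B for i in P_A}
print(posterior)   # {'urn1': 0.3, 'urn2': 0.7}
```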