Probability Distributions - Some Concepts
1. Discrete and continuous random variables;
A discrete random variable can take on at most a countable number of possible values, such as the outcome of a coin flip or a roll of a die.
A continuous random variable can take on an uncountable (infinite) number of possible values, such as asset returns or temperatures.
2. Range of possible outcomes of a specified random variable;
1. Coin toss
{Head, Tail}.
2. Rolling a die
{1, 2, 3, 4, 5, 6}.
3. Share returns (in percent)
Interval [-100, +∞).
3. Probability distribution;
Probability distribution specifies the probabilities of the possible outcomes of a random variable.
4. Probability function
A probability function specifies the probability that the random variable takes on a specific value: P(X = x). To determine whether a given function is a probability function, check that it satisfies the two key properties in the next LOS.
5. Properties of a probability function;
Two Key Properties of a Probability Function.
i. 0 ≤ p(x) ≤ 1 because a probability lies between 0 and 1.
ii. The sum of probabilities p(x) over all values of X equals 1.
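As an illustration, the two properties can be checked numerically; the sketch below uses the pmf of a fair die as a hypothetical example.

```python
# Minimal sketch: verifying the two key properties for a candidate
# probability function p(x). The fair-die pmf here is illustrative.
p = {x: 1/6 for x in range(1, 7)}   # hypothetical p(x) for a fair die

# Property i: every p(x) lies between 0 and 1
assert all(0 <= prob <= 1 for prob in p.values())

# Property ii: the probabilities over all values of X sum to 1
assert abs(sum(p.values()) - 1.0) < 1e-12

print("Both properties hold; p(x) is a valid probability function.")
```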
6. Cumulative distribution function
The cumulative distribution function (cdf) specifies the probability that the random variable X is less than or equal to a particular value x, P(X ≤ x). For a discrete random variable this is the sum of the probabilities for all values less than or equal to x.
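For example, the cdf of a discrete random variable can be built by accumulating the probabilities; the sketch below again uses a fair die.

```python
# Minimal sketch: building the cdf F(x) = P(X <= x) of a discrete random
# variable by summing the probabilities up to x (fair-die example).
from itertools import accumulate

values = [1, 2, 3, 4, 5, 6]
probs = [1/6] * 6

cdf = dict(zip(values, accumulate(probs)))
print(cdf[3])   # P(X <= 3) = 0.5 (up to floating point)
```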
7. Probability density function;
Probability density function (pdf): f(x) describes how probability is distributed over the values of a continuous random variable.
i. For a continuous random variable, P(X = x) = 0 for any single value x; probabilities are obtained as areas under the pdf, so P(a ≤ X ≤ b) equals the area under f(x) between a and b.
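To illustrate the "area under the pdf" idea, the sketch below numerically integrates the standard normal density over [-1, 1]; the choice of density and the simple trapezoid rule are illustrative only.

```python
# Minimal sketch: for a continuous random variable, probabilities come from
# areas under the pdf. Here f is the standard normal density, and the area
# between a and b is approximated with a trapezoid rule.
import math

def f(x):                        # standard normal pdf
    return math.exp(-x**2 / 2) / math.sqrt(2 * math.pi)

def prob(a, b, n=100_000):       # numerical area under f between a and b
    h = (b - a) / n
    return h * (f(a)/2 + f(b)/2 + sum(f(a + i*h) for i in range(1, n)))

print(prob(-1, 1))   # ≈ 0.6827, the familiar "within one sigma" probability
```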
8. Discrete uniform random variable
Discrete Uniform Random Variable: The uniform random variable X takes on a finite number of values, k, and each value has the same probability of occurring, i.e. P(X = xi) = 1/k for i = 1, 2, …, k.
Examples of simple uniform random variables:
i. Coin flip: Prob(head) = ½, Prob(tail) = ½
ii. Die: Prob(Die shows 1) = 1/6, Prob(Die shows 2) = 1/6, …, Prob(Die shows 6) = 1/6
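A minimal sketch of a discrete uniform pmf with k outcomes (the die is the k = 6 case):

```python
# Minimal sketch: a discrete uniform random variable with k outcomes,
# each occurring with probability 1/k.
k = 6
outcomes = list(range(1, k + 1))
pmf = {x: 1/k for x in outcomes}

print(pmf[3])                          # P(X = 3) = 1/6
print(sum(pmf[x] for x in outcomes))   # probabilities sum to 1
```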
9. Binomial random variable and binomial probability distribution;
A Bernoulli random variable is a binary variable that takes on one of two values, usually 1 for success or 0 for failure. Think of a single coin flip as an example of a Bernoulli r.v. A binomial random variable X ~ B(n, p) counts the number of successes in n independent Bernoulli trials, each with probability of success p. Its probability function is
p(x | n, p) = P(X = x) = nCx p^x (1 − p)^(n − x), where nCx = n! / (x!(n − x)!) is the number of ways to choose x successes out of n trials.
The distribution is symmetric when p = .5, but otherwise it is skewed.
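The formula can be evaluated directly; the sketch below uses illustrative values n = 10 and p = 0.5 and confirms that the probabilities sum to 1.

```python
# Minimal sketch: the binomial probability p(x | n, p) = nCx * p^x * (1-p)^(n-x),
# computed directly from the formula. The values of n and p are illustrative.
from math import comb

def binom_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 10, 0.5
print(binom_pmf(5, n, p))                              # most likely count when p = 0.5
print(sum(binom_pmf(x, n, p) for x in range(n + 1)))   # probabilities sum to 1
```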
9A. Expected value and variance of a binomial random variable;
For a binomial random variable X ~ B(n, p), the expected value is E(X) = np and the variance is Var(X) = np(1 − p).
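These formulas can be checked against the definitional sums over the binomial pmf; n = 10 and p = 0.3 below are illustrative values.

```python
# Minimal sketch: E(X) = n*p and Var(X) = n*p*(1-p) for X ~ B(n, p),
# compared with the mean and variance computed directly from the pmf.
from math import comb

def binom_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 10, 0.3
mean = sum(x * binom_pmf(x, n, p) for x in range(n + 1))
var = sum((x - mean)**2 * binom_pmf(x, n, p) for x in range(n + 1))

print(mean, n * p)            # both 3.0
print(var, n * p * (1 - p))   # both 2.1
```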
10. Key properties of the normal distribution;
Normal distribution is a continuous, symmetric probability distribution that is completely described by two parameters: its mean, μ, and its variance, σ². Written as N(μ, σ²).
i. The normal distribution is said to be bell-shaped with the mean showing its central location and the variance showing its “spread”.
ii. A linear combination of two or more normal random variables (for example, independent normals) is also normally distributed.
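Property ii can be illustrated by simulation; the sketch below assumes X and Y are independent normal random variables (a simplifying assumption) and checks the mean and variance of a linear combination aX + bY, which should be a·μ1 + b·μ2 and a²·σ1² + b²·σ2².

```python
# Minimal sketch: simulation check that a linear combination of independent
# normal random variables has the expected mean and variance.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(1.0, 2.0, size=1_000_000)   # X ~ N(1, 4)
y = rng.normal(3.0, 1.0, size=1_000_000)   # Y ~ N(3, 1)

z = 0.5 * x + 2.0 * y                      # linear combination aX + bY

print(z.mean())   # ≈ 0.5*1 + 2*3 = 6.5
print(z.var())    # ≈ 0.25*4 + 4*1 = 5.0
```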