a) **Random variable - Definition**

A random variable is a quantity whose outcome is uncertain.

Mutually exclusive events mean one and only one event can occur at any time.

Exhaustive events mean that one of the events must occur, i.e. the listed events cover all possible outcomes.

b) **The two defining properties of probability;**

i. The probability of any event *E* is a number between 0 and 1: *0 ≤ P(E) ≤ 1*.

ii. The sum of the probabilities of any set of mutually exclusive and exhaustive events equals 1.

c) **Empirical, a priori, and subjective probabilities;**

__Empirical__ probability is when the probability of an event occurring is estimated from data, usually in the form of a relative frequency.
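As a small illustration (using made-up return data, not figures from the text), an empirical probability can be estimated as a relative frequency:

```python
# Empirical probability: estimate P(up day) as the relative frequency
# of positive returns in a hypothetical sample of daily returns.
returns = [0.012, -0.004, 0.008, 0.015, -0.010, 0.003, -0.002, 0.007]

up_days = sum(1 for r in returns if r > 0)
p_up = up_days / len(returns)  # relative-frequency estimate
print(p_up)  # 5 up days out of 8 -> 0.625
```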

__A priori__ probability is when the probability of an event is deduced by reasoning about the structure of the problem itself.

These first two approaches to probability are sometimes referred to as objective probabilities because they should not vary from person to person.

__Subjective__ probability is when the probability of an event is based on a personal assessment without reference to any particular data.

d) **The investment consequences of probabilities that are inconsistent;**

__Inconsistent__ probabilities create profit opportunities because investors can buy and sell assets at the resulting inconsistent prices in ways that allow them to achieve profits on average. These buying and selling decisions should eliminate inconsistent prices, and probabilities, in the market.

e) **Unconditional and conditional probabilities;**

__Unconditional__ or marginal probability, *P(A),* is the probability of event A occurring without reference to any other event.

__Conditional__ probability, *P(A|B)*, is the probability of event *A* occurring given that event *B* is known to have already occurred. *P(A|B) = P(AB)/P(B)* if *P(B) ≠ 0*. Conditional probabilities are important in tests of market efficiency, where event *B* is some piece of public or private information that becomes available to the market at some point in time.
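A minimal sketch of the conditional-probability formula, using assumed values for the joint and marginal probabilities (the events and numbers below are hypothetical):

```python
# Conditional probability P(A|B) = P(AB)/P(B), with assumed inputs.
# Hypothetical events: A = "stock beats the market",
#                      B = "positive earnings surprise".
p_ab = 0.3  # assumed joint probability P(AB)
p_b = 0.5   # assumed marginal probability P(B), nonzero

p_a_given_b = p_ab / p_b
print(p_a_given_b)  # 0.6
```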

f) **Joint probability;**

__Joint__ probability, *P(AB),* is the probability of __both__ event *A* and event *B* occurring together.

g) **Multiplication rule and the joint probability of two events;**

__Multiplication Rule for probabilities__ - __Joint__ probability, *P(AB),* is

*P(AB) = P(A|B) P(B) = P(B|A) P(A)*
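A quick numerical check of the rule, with assumed probabilities (any consistent numbers would work):

```python
# Multiplication rule: P(AB) = P(B|A) P(A), with assumed inputs.
p_a = 0.5          # assumed P(A)
p_b_given_a = 0.6  # assumed conditional probability P(B|A)

p_ab = p_b_given_a * p_a  # joint probability P(AB)
print(p_ab)  # 0.3
```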

h) **The probability that at least one of two events will occur;**

__Addition Rule for probabilities__ – Given events *A* and *B*, the probability that *A* __or__ *B* occurs is equal to:

*P(A or B) = P(A ∪ B) = P(A) + P(B) − P(AB)*

If you don’t see this result then construct a Venn diagram of Events A and B that share some overlap. The sum *P(A) + P(B)* counts *P(AB)* twice, so it must be subtracted.
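The rule can also be checked on a concrete, familiar case: drawing one card from a standard 52-card deck (an assumed example, not one from the text):

```python
# Addition rule on a 52-card deck: A = "heart", B = "king".
p_a = 13 / 52   # P(heart)
p_b = 4 / 52    # P(king)
p_ab = 1 / 52   # P(king of hearts), counted in both A and B

p_a_or_b = p_a + p_b - p_ab
print(p_a_or_b)  # 16/52, about 0.3077
```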

i) **Dependent and independent events;**

__Definition of Independent Events__ – Two events *A* and *B* are independent if and only if:

*P(A|B) = P(A)* or equivalently *P(B|A) = P(B)*

If two events are dependent, then the occurrence of one of the events is related to the probability of the occurrence of the other event.
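The definition can be verified by brute-force enumeration of a sample space; the dice events below are an assumed illustration:

```python
from itertools import product

# Enumerate the 36 equally likely outcomes of two fair dice and test
# independence of A = "first die shows 6" and B = "dice sum to 7".
space = list(product(range(1, 7), repeat=2))
p_a = sum(1 for d1, d2 in space if d1 == 6) / len(space)
p_b = sum(1 for d1, d2 in space if d1 + d2 == 7) / len(space)
p_ab = sum(1 for d1, d2 in space if d1 == 6 and d1 + d2 == 7) / len(space)

# A and B are independent: P(AB) equals P(A) P(B) (both are 1/36).
print(abs(p_ab - p_a * p_b) < 1e-12)  # True
```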

j) **Joint probability of any number of independent events;**

__Multiplication Rule for Independent Events__ - __Joint__ probability of independent events *A_{1}, A_{2}, …, A_{m}* is:

*P(A_{1}A_{2}…A_{m}) = P(A_{1})P(A_{2})…P(A_{m-1})P(A_{m})*

Think about calculating the probability of getting 10 heads on ten coin flips.
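That coin-flip calculation, as a one-line sketch:

```python
# Ten independent fair coin flips all landing heads:
# P = (1/2)^10 by the multiplication rule for independent events.
p_ten_heads = 0.5 ** 10
print(p_ten_heads)  # 0.0009765625, i.e. 1/1024
```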

k) **The total probability rule**

__Total Probability Rule__ - The probability of event *A* is:

i. *P(A) = P(A|S) P(S) + P(A|S^{C}) P(S^{C})*

ii. *P(A) = P(A|S_{1}) P(S_{1}) + P(A|S_{2}) P(S_{2}) + … + P(A|S_{m}) P(S_{m})*, where *S_{1}, S_{2}, …, S_{m}* are mutually exclusive and exhaustive events.
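A numerical sketch of form (i) of the rule, with assumed scenario probabilities (the events and numbers are hypothetical):

```python
# Total probability rule with an assumed two-scenario example:
# S = "economy expands", S^C = "economy contracts", A = "stock rises".
p_s = 0.7
p_sc = 1 - p_s      # the two scenarios are mutually exclusive and exhaustive
p_a_given_s = 0.8   # assumed P(A|S)
p_a_given_sc = 0.3  # assumed P(A|S^C)

p_a = p_a_given_s * p_s + p_a_given_sc * p_sc
print(round(p_a, 2))  # 0.65
```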

l) **Expected value, variance and standard deviation;**

__Expected Value__ of a random variable is the probability-weighted average of the possible outcomes of the random variable. Expected Value of random variable X is calculated as:

E[X] = ΣP(x_{i})x_{i}

__Variance__ of a random variable is the expected value of squared deviations from the random variable’s expected value.

σ² = E[(X − E[X])²] = Σ P(x_{i})[x_{i} − E[X]]²

__Standard deviation__ is the square root of the variance.

It is a measure of risk: it shows the dispersion of possible outcomes around their expected value.
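All three quantities can be computed together for a small discrete distribution; the outcome and probability numbers below are made up for illustration:

```python
# Expected value, variance, and standard deviation of a discrete
# random variable with a hypothetical return distribution.
outcomes = [-0.05, 0.00, 0.10]   # possible returns x_i
probs = [0.20, 0.30, 0.50]       # P(x_i); must sum to 1

mean = sum(p * x for p, x in zip(probs, outcomes))                    # E[X]
variance = sum(p * (x - mean) ** 2 for p, x in zip(probs, outcomes))  # sigma^2
std_dev = variance ** 0.5
print(mean, variance, std_dev)  # about 0.04, 0.0039, 0.0624
```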

Original knol - http://knol.google.com/k/narayana-rao/probability-concepts-in-the-context-of/2utb2lsm2k7a/497