# Probability Fundamentals

## Bayes’ Theorem:

Bayes’ Theorem is a formula that calculates the probability of an event occurring based on prior knowledge of conditions that might be related to the event.

P(A|B) = (P(B|A) * P(A)) / P(B)

where: P(A|B) is the probability of event A occurring given that event B has occurred. P(B|A) is the probability of event B occurring given that event A has occurred. P(A) is the prior probability of event A occurring. P(B) is the prior probability of event B occurring.
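The formula can be sketched numerically. A minimal Python example with hypothetical numbers (a medical test with 99% sensitivity, a 5% false-positive rate, and 1% disease prevalence — all illustrative, not from the text above):

```python
# Bayes' Theorem: P(A|B) = P(B|A) * P(A) / P(B)
# A = "has disease", B = "test is positive" (illustrative numbers)
p_disease = 0.01                # P(A): prior probability of disease
p_pos_given_disease = 0.99      # P(B|A): test sensitivity
p_pos_given_healthy = 0.05      # false-positive rate among the healthy

# P(B) via the law of total probability
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# P(A|B): probability of disease given a positive test
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(p_disease_given_pos)  # ≈ 0.167 — only ~17% despite the positive test
```

Note how a small prior P(A) keeps the posterior low even for an accurate test.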

## Conditional Probability:

Conditional probability is the probability of an event occurring given that another event has already occurred.

P(A|B) = P(A and B) / P(B)

where: P(A|B) is the probability of event A occurring given that event B has occurred. P(A and B) is the probability of both event A and event B occurring. P(B) is the probability of event B occurring.
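A quick sketch of this formula with a fair die (an illustrative example, using exact fractions):

```python
from fractions import Fraction

# One fair die: A = "roll is even", B = "roll is greater than 3"
outcomes = list(range(1, 7))
p_b = Fraction(sum(1 for x in outcomes if x > 3), 6)                      # 3/6
p_a_and_b = Fraction(sum(1 for x in outcomes if x % 2 == 0 and x > 3), 6) # {4, 6} -> 2/6

# P(A|B) = P(A and B) / P(B)
p_a_given_b = p_a_and_b / p_b
print(p_a_given_b)  # 2/3
```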

## Independent Events:

Two events are considered to be independent if the occurrence of one event does not affect the occurrence of the other event.

P(A and B) = P(A) * P(B)

where: P(A and B) is the probability of both event A and event B occurring. P(A) is the probability of event A occurring. P(B) is the probability of event B occurring.
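The product rule for independent events can be checked by brute-force enumeration — a small sketch with two fair dice:

```python
from fractions import Fraction

# Two fair dice: A = "first die shows 6", B = "second die shows 6"
outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]  # 36 equally likely pairs
n = len(outcomes)

p_a = Fraction(sum(1 for a, b in outcomes if a == 6), n)                # 1/6
p_b = Fraction(sum(1 for a, b in outcomes if b == 6), n)                # 1/6
p_both = Fraction(sum(1 for a, b in outcomes if a == 6 and b == 6), n)  # 1/36

# Independence: P(A and B) = P(A) * P(B)
assert p_both == p_a * p_b
```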

# Random Variable: Discrete Random Variables

A discrete random variable takes on a countable number of possible values. The probability mass function (PMF) describes the probability distribution of a discrete random variable.

PMF: P(X=x) = p(x)

where: P(X=x) is the probability that the random variable X takes on the value x. p(x) is the probability mass function.

Mean: μ = E(X) = ∑[x * p(x)]

where: E(X) is the expected value of the random variable X. x is each possible value of the random variable X. p(x) is the probability mass function.

Variance: σ² = E(X²) – [E(X)]² = ( ∑[x² * p(x)] ) – ( ∑[x * p(x)] )²

where: E[X²] = ∑[x² * p(x)]
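These sums translate directly into code. A minimal sketch that computes the mean and variance from a PMF (the loaded-die distribution below is a hypothetical example):

```python
# PMF of a hypothetical loaded die: 6 comes up half the time
pmf = {1: 0.1, 2: 0.1, 3: 0.1, 4: 0.1, 5: 0.1, 6: 0.5}

mean = sum(x * p for x, p in pmf.items())      # E[X]  = ∑ x * p(x)
e_x2 = sum(x**2 * p for x, p in pmf.items())   # E[X²] = ∑ x² * p(x)
variance = e_x2 - mean**2                      # σ² = E[X²] − [E(X)]²

print(mean, variance)  # 4.5  3.25
```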

## Uniform Discrete Distribution:

The uniform discrete distribution describes the probability of a discrete random variable taking on values within a certain range with equal probability.

PMF: P(X=x) = 1/n , for each of the n possible values x

Mean: μ = E(X) = (n+1) / 2

where: E(X) is the expected value of the random variable X. n is the total number of possible values of X.

Variance: σ² = (n²-1) / 12
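A quick sketch verifying both closed forms against a direct computation, using a fair die (X uniform on {1, …, 6}) as the example:

```python
# Uniform discrete distribution on {1, ..., n}: a fair six-sided die
n = 6
pmf = {x: 1 / n for x in range(1, n + 1)}

mean = sum(x * p for x, p in pmf.items())
var = sum(x**2 * p for x, p in pmf.items()) - mean**2

assert abs(mean - (n + 1) / 2) < 1e-9        # (n+1)/2 = 3.5
assert abs(var - (n**2 - 1) / 12) < 1e-9     # (n²−1)/12 ≈ 2.9167
```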

## Binomial Distribution:

The binomial distribution describes the probability of a binary event with a fixed number of independent trials (usually denoted as n).

PMF: P(X=k) = C(n,k) * p^k * q^(n−k) , for k ∈ {0,1,2,…,n}, where q = 1 − p

Mean: μ = E(X) = np

where: E(X) is the expected value of the random variable X. n is the number of independent trials. p is the probability of a success in each trial (usually denoted as p= P(X=1)).

Variance: σ² = npq = np*(1-p)

where: σ² = E[(X−μ)²] is the expected value of the squared difference between the random variable X and its mean μ. n is the number of independent trials. p is the probability of a success in each trial. q = 1 − p is the probability of a failure.
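The PMF, mean, and variance formulas above can be cross-checked in a few lines (n = 10, p = 0.3 are arbitrary illustrative values):

```python
from math import comb

def binom_pmf(k, n, p):
    # P(X=k) = C(n,k) * p^k * (1-p)^(n-k)
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3
mean = sum(k * binom_pmf(k, n, p) for k in range(n + 1))
var = sum(k**2 * binom_pmf(k, n, p) for k in range(n + 1)) - mean**2

assert abs(mean - n * p) < 1e-9            # np = 3.0
assert abs(var - n * p * (1 - p)) < 1e-9   # np(1-p) = 2.1
```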

## Poisson Distribution:

The Poisson distribution describes the probability of a certain number of events occurring in a fixed interval of time or space, assuming the events occur independently and at a constant rate.

PMF: P(X=x) = (e^(−λ) * λ^x) / x! , for x ∈ {0, 1, 2, …}

Mean: μ = E(X) = λ (when the Poisson approximates a binomial with large n and small p, λ = np)

where: E(X) is the expected value of the random variable X. λ is the rate parameter (the expected number of events in the interval).

Variance: σ² = λ (equal to the mean; under the binomial approximation, λ = np)

where: σ² = E[(X−μ)²] is the expected value of the squared difference between the random variable X and its mean μ. λ is the rate parameter (the expected number of events in the interval).
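A minimal sketch of the Poisson PMF (λ = 4 is an arbitrary illustrative rate); truncating the infinite support at 100 terms makes the leftover tail negligible:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    # P(X=x) = e^(−λ) * λ^x / x!
    return exp(-lam) * lam**x / factorial(x)

lam = 4.0
total = sum(poisson_pmf(x, lam) for x in range(100))       # ≈ 1 (PMF normalizes)
mean = sum(x * poisson_pmf(x, lam) for x in range(100))    # ≈ λ
var = (sum(x**2 * poisson_pmf(x, lam) for x in range(100))
       - mean**2)                                          # ≈ λ as well
```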

# Random Variable: Continuous Random Variables

The probability density function (PDF) describes the relative likelihood of a continuous random variable taking on a given value; the probability that the variable falls within a certain range is the area under the PDF over that range.

PDF: f(x)

where: f(x) is the probability density function.

Mean: μ = E(X) = ∫x * f(x) dx

where: E(X) is the expected value of the random variable X. x is the value of the random variable X.

Variance: σ² = E(X²) – [E(X)]² = ( ∫x² * f(x) dx ) – ( ∫x * f(x) dx )²

where: E[X²] = ∫x² * f(x) dx
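The integrals can be approximated numerically. A rough sketch using a midpoint-rule sum and the hypothetical density f(x) = 2x on [0, 1] (exact answers: E[X] = 2/3, σ² = 1/18):

```python
def integrate(g, a, b, steps=100_000):
    # Midpoint-rule approximation of the definite integral of g over [a, b]
    h = (b - a) / steps
    return sum(g(a + (i + 0.5) * h) for i in range(steps)) * h

f = lambda x: 2 * x                                        # density on [0, 1]
mean = integrate(lambda x: x * f(x), 0, 1)                 # ∫ x f(x) dx
var = integrate(lambda x: x**2 * f(x), 0, 1) - mean**2     # E[X²] − [E(X)]²
```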

## Uniform Continuous Distribution:

The uniform continuous distribution describes the probability of a continuous random variable taking on values within a certain range with equal probability.

PDF: f(x) = 1 / (b-a) , for a ≤ x ≤ b

where: f(x) is the probability density function. a and b are the lower and upper bounds of the range.

Mean: μ = E(X) = (a+b) / 2

where: E(X) is the expected value of the random variable X. a and b are the lower and upper bounds of the range.

Variance: σ² = (b-a)² / 12
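The closed forms (a+b)/2 and (b−a)²/12 can be sanity-checked by sampling (a = 2, b = 10 are arbitrary illustrative bounds):

```python
import random

a, b = 2.0, 10.0
random.seed(0)  # reproducible draws
samples = [random.uniform(a, b) for _ in range(200_000)]

sample_mean = sum(samples) / len(samples)
sample_var = sum((x - sample_mean)**2 for x in samples) / len(samples)
# sample_mean ≈ (a+b)/2 = 6, sample_var ≈ (b−a)²/12 ≈ 5.33
```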

## Normal Continuous Distribution:

The normal continuous distribution describes many natural phenomena, such as heights and weights of people, IQ scores, and errors in measurements. The normal distribution is characterized by two parameters: the mean (μ) and the standard deviation (σ): N(μ, σ)

PDF: f(x) = (1 / (σ * sqrt(2π))) * e^(−(x−μ)² / (2σ²))

where: f(x) is the probability density function. μ is the mean of the distribution. σ is the standard deviation of the distribution. π is the mathematical constant (approximately equal to 3.14159). e is the mathematical constant (approximately equal to 2.71828).

The mean of a normal distribution represents the center of the distribution and is the location where the curve is symmetrically balanced. The standard deviation represents the spread or dispersion of the data points in the distribution. A larger value of σ indicates a wider spread of data, while a smaller value of σ indicates a narrower spread of data.

## Standard Normal Continuous Distribution:

The standard normal continuous distribution is a special case of the normal continuous distribution, where the mean is 0 and the standard deviation is 1.

PDF: f(x) = (1 / sqrt(2π)) * e^(−x² / 2)

where: f(x) is the probability density function. π is the mathematical constant (approximately equal to 3.14159). e is the mathematical constant (approximately equal to 2.71828).

Mean: μ = E(X) = 0

where: E(X) is the expected value of the random variable X.

Variance: σ² = 1
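Setting μ = 0 and σ = 1 in the normal PDF gives the standard normal density, whose peak at z = 0 is 1/√(2π) ≈ 0.3989:

```python
from math import exp, pi, sqrt

def std_normal_pdf(z):
    # f(z) = (1 / sqrt(2π)) * e^(−z² / 2)
    return exp(-z**2 / 2) / sqrt(2 * pi)

print(round(std_normal_pdf(0), 4))  # 0.3989 — the peak density at z = 0
```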

## If you want Part 2 tell us in comments!

“The Greater the demand, the faster we provide!”
