Probability Theory Simply Understood

Vasu Bhalothia · 18 hrs ago

PROBABILITY THEORY

 

Complete Masterclass for Competitive Examinations

 

7 Chapters

50+ Formulas

80+ MCQs

CAT | GATE | GRE | GMAT

 

 

01  Fundamentals & Axioms
02  Conditional Probability & Bayes' Theorem
03  Random Variables & Distributions
04  Expectation, Variance & Moments
05  Special Distributions
06  Joint Distributions & Limit Theorems
07  Practice MCQs with Detailed Solutions

 

Chapter 1

Fundamentals of Probability

 

1.1  Basic Definitions

  • Experiment: Any process whose outcome is uncertain (e.g., tossing a coin).
  • Sample Space (S): Set of all possible outcomes of an experiment.
  • Event (E): A subset of the sample space.
  • Mutually Exclusive Events: A and B cannot both occur; A intersection B = empty set.
  • Exhaustive Events: Events whose union equals the entire sample space.
  • Equally Likely Events: Events with identical probability of occurrence.

 

1.2  Classical Probability

When all outcomes are equally likely, the probability of event E is:

 

P(E) = (Number of favourable outcomes) / (Total outcomes in S)

 

1.3  Axiomatic Probability (Kolmogorov)

Axiom 1: P(E) >= 0  for every event E.

 

Axiom 2: P(S) = 1  (sample space has probability 1).

 

Axiom 3 (Additivity): If A intersection B = empty set, then P(A union B) = P(A) + P(B).

 

1.4  Key Theorems

Addition Rule

P(A union B) = P(A) + P(B) - P(A intersection B)

 

Complement Rule

P(A') = 1 - P(A)

 

Impossible Event

P(empty set) = 0

 

Inclusion-Exclusion (3 events)

P(AUBUC) = P(A)+P(B)+P(C) - P(AintB) - P(BintC) - P(AintC) + P(AintBintC)

 

Exam Tip

In 'at least one' problems, use the complement: P(at least one) = 1 - P(none). This is almost always faster in exam conditions.
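The complement trick is easy to sanity-check in Python. This sketch uses the standard-library fractions module for exact arithmetic, with a hypothetical "at least one six in four die rolls" example:

```python
from fractions import Fraction

def p_at_least_one(p_single, n):
    """P(at least one success in n independent trials) = 1 - P(none)."""
    return 1 - (1 - p_single) ** n

# Hypothetical example: at least one six in four rolls of a fair die.
p = p_at_least_one(Fraction(1, 6), 4)   # 1 - (5/6)^4 = 671/1296
```

Computing this directly would require summing over every winning combination; the complement is a single subtraction.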

 

Counting Techniques Reference

Operation          | Formula               | Order Matters? | Repetition?
-------------------|-----------------------|----------------|------------
Permutation        | nPr = n! / (n-r)!     | Yes            | No
Combination        | nCr = n! / [r!(n-r)!] | No             | No
Permutation (rep)  | n^r                   | Yes            | Yes
Combination (rep)  | (n+r-1)Cr             | No             | Yes
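Python's standard math module implements these counting operations directly (math.perm and math.comb need Python 3.8+); a small sketch with n = 5, r = 3:

```python
import math

n, r = 5, 3
perm     = math.perm(n, r)           # nPr = n!/(n-r)! = 60
comb     = math.comb(n, r)           # nCr = n!/[r!(n-r)!] = 10
perm_rep = n ** r                    # n^r = 125
comb_rep = math.comb(n + r - 1, r)   # (n+r-1)Cr = C(7,3) = 35
```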

 

Chapter 2

Conditional Probability & Independence

 

2.1  Conditional Probability

The probability of A given that B has occurred (P(B) > 0):

 

P(A | B) = P(A intersection B) / P(B)

 

2.2  Multiplication Rule

P(A intersection B) = P(A) * P(B | A) = P(B) * P(A | B)

 

P(A intersection B intersection C) = P(A) * P(B|A) * P(C|A intersection B)

 

2.3  Independent Events

A and B are independent if knowledge of one does not affect the other:

 

P(A intersection B) = P(A) * P(B)

Note: Mutually exclusive events (with P > 0) are NOT independent.

 

2.4  Law of Total Probability

If B1, B2, ..., Bn form a partition of S (mutually exclusive & exhaustive):

 

P(A) = P(A|B1)*P(B1) + P(A|B2)*P(B2) + ... + P(A|Bn)*P(Bn)

 

2.5  Bayes' Theorem

Used to 'reverse' conditional probabilities — most tested in competitive exams:

 

P(Bk | A) = P(A|Bk)*P(Bk) / [P(A|B1)*P(B1) + ... + P(A|Bn)*P(Bn)]

 

Worked Example — Disease Testing

Given                          | Value
-------------------------------|----------------------------------
Disease prevalence P(D)        | 0.01
P(+ve | D) — sensitivity       | 0.99
P(+ve | D') — false positive   | 0.05
P(D | +ve) via Bayes' Theorem  | approximately 0.167 (only 16.7%!)
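The disease-testing numbers can be reproduced with the law of total probability plus Bayes' theorem; a short sketch:

```python
p_d  = 0.01   # prevalence P(D)
sens = 0.99   # P(+ve | D)
fp   = 0.05   # P(+ve | D'), the false-positive rate

# Law of total probability: P(+ve) over the partition {D, D'}.
p_pos = sens * p_d + fp * (1 - p_d)    # 0.0099 + 0.0495 = 0.0594

# Bayes' theorem: P(D | +ve) = P(+ve | D) * P(D) / P(+ve).
p_d_given_pos = sens * p_d / p_pos     # = 1/6, approximately 0.167
```

Despite the 99% sensitivity, most positives are false positives because the disease is rare: the prior P(D) = 0.01 dominates the calculation.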

 

Exam Tip

Bayes' Theorem problems almost always benefit from a tree diagram. List priors on first branches, likelihoods on second — the answer is one product divided by the total.

 

Chapter 3

Random Variables

 

3.1  Discrete Random Variables

A random variable X taking countable values x1, x2, ... with probabilities p1, p2, ...

 

PMF: P(X = xi) = pi,   pi >= 0,   Sum(pi) = 1

 

CDF: F(x) = P(X <= x) = Sum of P(X = xi) for xi <= x

 

3.2  Continuous Random Variables

X is continuous if it has a probability density function (PDF) f(x):

 

f(x) >= 0 for all x;   Integral of f(x) dx = 1

 

P(a <= X <= b) = Integral from a to b of f(x) dx

 

CDF: F(x) = Integral from -inf to x of f(t) dt;   P(X = a) = 0

 

3.3  Expectation & Variance Reference Table

Quantity              | Formula
----------------------|------------------------------
E[X] (discrete)       | Sum of xi * P(X = xi)
E[X] (continuous)     | Integral of x * f(x) dx
E[aX + b]             | a*E[X] + b
E[X + Y]              | E[X] + E[Y]  (always)
Var(X)                | E[X^2] - (E[X])^2
Var(aX + b)           | a^2 * Var(X)
Var(X + Y) if indep.  | Var(X) + Var(Y)
SD(X)                 | sqrt(Var(X))
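These identities are easy to verify numerically for a small discrete distribution; a sketch using a fair six-sided die:

```python
# PMF of a fair die: each face 1..6 with probability 1/6.
pmf = {x: 1/6 for x in range(1, 7)}

mean = sum(x * p for x, p in pmf.items())       # E[X] = 3.5
ex2  = sum(x * x * p for x, p in pmf.items())   # E[X^2]
var  = ex2 - mean ** 2                          # Var(X) = 35/12

# Linearity checks: E[aX + b] = a*E[X] + b, Var(aX + b) = a^2 * Var(X).
a, b = 2, 3
mean_lin = sum((a * x + b) * p for x, p in pmf.items())
var_lin  = sum((a * x + b) ** 2 * p for x, p in pmf.items()) - mean_lin ** 2
```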

 

Chapter 4

Special Probability Distributions

 

4.1  Binomial Distribution  B(n, p)

n independent Bernoulli trials each with success probability p; X = number of successes.

 

P(X = k) = C(n,k) * p^k * (1-p)^(n-k),   k = 0, 1, ..., n

 

Property  | Value
----------|----------------
Mean      | np
Variance  | np(1-p)
Mode      | floor((n+1)p)
MGF       | (1-p+p*e^t)^n
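A minimal sketch of the binomial PMF and its moments, using only the standard library (math.comb needs Python 3.8+):

```python
import math

def binom_pmf(n, p, k):
    """P(X = k) for X ~ B(n, p)."""
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

n, p = 10, 0.4
total = sum(binom_pmf(n, p, k) for k in range(n + 1))      # PMF sums to 1
mean  = sum(k * binom_pmf(n, p, k) for k in range(n + 1))  # = np = 4.0
var   = (sum(k * k * binom_pmf(n, p, k) for k in range(n + 1))
         - mean ** 2)                                      # = np(1-p) = 2.4
```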

 

4.2  Poisson Distribution  P(lambda)

Models rare events over a fixed interval; lambda = average rate.

 

P(X = k) = e^(-lambda) * lambda^k / k!,   k = 0, 1, 2, ...

 

Property       | Value
---------------|---------------------------------------------------------
Mean           | lambda
Variance       | lambda
Approximation  | Binomial -> Poisson when n large, p small, lambda = np
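The Binomial-to-Poisson approximation can be checked numerically; a sketch with hypothetical values n = 1000, p = 0.003 (so lambda = 3):

```python
import math

def poisson_pmf(lam, k):
    """P(X = k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

n, p, k = 1000, 0.003, 2
lam = n * p                                              # 3.0
binom  = math.comb(n, k) * p ** k * (1 - p) ** (n - k)   # exact B(n,p) value
approx = poisson_pmf(lam, k)                             # Poisson approximation
# The two agree to about three decimal places.
```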

 

4.3  Geometric Distribution  Geom(p)

X = number of trials until the first success.

 

P(X = k) = (1-p)^(k-1) * p,   k = 1, 2, 3, ...

 

Property    | Value
------------|--------------------------------
Mean        | 1/p
Variance    | (1-p)/p^2
Memoryless  | P(X > m+n | X > m) = P(X > n)
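The memoryless property follows directly from P(X > n) = (1-p)^n; a sketch verifying it with hypothetical values p = 0.25, m = 3, n = 5:

```python
p = 0.25

def p_greater(n):
    """P(X > n) = (1-p)^n for X ~ Geom(p): the first n trials all fail."""
    return (1 - p) ** n

m, n = 3, 5
lhs = p_greater(m + n) / p_greater(m)   # P(X > m+n | X > m)
rhs = p_greater(n)                      # P(X > n)
# lhs == rhs: past failures carry no information about future trials.
```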

 

4.4  Uniform Distribution  U(a, b)

Every outcome in [a, b] equally likely.

 

f(x) = 1/(b-a)  for a <= x <= b,  else 0

 

Property  | Value
----------|------------------------------
Mean      | (a+b)/2
Variance  | (b-a)^2 / 12
CDF       | (x-a)/(b-a)  for a <= x <= b

 

4.5  Normal Distribution  N(mu, sigma^2)

The bell-curve — the most important distribution in statistics.

 

f(x) = (1/(sigma*sqrt(2*pi))) * exp(-(x-mu)^2 / (2*sigma^2))

 

Property         | Value
-----------------|------------------------------------------------------------------
Mean             | mu
Variance         | sigma^2
Standardisation  | Z = (X - mu) / sigma  =>  Z ~ N(0, 1)
Empirical rule   | 68% within +/-1sigma, 95% within +/-2sigma, 99.7% within +/-3sigma

 

4.6  Exponential Distribution  Exp(lambda)

Models time between events in a Poisson process.

 

f(x) = lambda * e^(-lambda*x)  for x >= 0

 

Property    | Value
------------|--------------------------------
Mean        | 1/lambda
Variance    | 1/lambda^2
Memoryless  | P(X > s+t | X > s) = P(X > t)

 

Exam Tip

Always standardise before using the Z-table: Z = (X - mu)/sigma. Normal is symmetric: Phi(-z) = 1 - Phi(z), and P(a < Z < b) = Phi(b) - Phi(a).
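Python's standard library exposes the normal CDF through statistics.NormalDist (Python 3.8+), which makes the standardisation step easy to check; a sketch with X ~ N(50, 25):

```python
from statistics import NormalDist

X = NormalDist(mu=50, sigma=5)   # X ~ N(50, 25); note sigma, not sigma^2
Z = NormalDist()                 # standard normal N(0, 1)

p_direct   = X.cdf(60) - X.cdf(40)   # P(40 < X < 60)
p_standard = Z.cdf(2) - Z.cdf(-2)    # same interval after Z = (X - 50)/5
# Both are about 0.9545, matching the 2-sigma empirical rule.
```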

 

Chapter 5

Joint Distributions & Limit Theorems

 

5.1  Joint & Marginal Distributions

Concept        | Formula
---------------|----------------------------------------------------
Joint PMF      | p(x,y) = P(X=x, Y=y)
Marginal of X  | pX(x) = Sum over y of p(x,y)
Marginal of Y  | pY(y) = Sum over x of p(x,y)
Independence   | p(x,y) = pX(x) * pY(y)  for all (x,y)
Conditional    | p(x|y) = p(x,y) / pY(y)
Covariance     | Cov(X,Y) = E[XY] - E[X]*E[Y]
Correlation    | rho = Cov(X,Y)/(sigma_X * sigma_Y);  -1 <= rho <= 1

 

5.2  Law of Large Numbers (LLN)

As n -> infinity, the sample mean converges in probability to the true mean mu:

 

X_bar_n = (X1 + X2 + ... + Xn) / n  -->  mu  as n --> infinity

 

5.3  Central Limit Theorem (CLT)

For i.i.d. samples from any distribution with finite variance, the standardised sample mean is approximately Normal for large n:

sqrt(n) * (X_bar - mu) / sigma  -->  N(0,1)  as n --> infinity

Rule of thumb: the CLT approximation is reliable for n >= 30.
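A quick simulation illustrates the CLT; this sketch averages n = 30 Uniform(0,1) draws many times and checks that the sample means concentrate around mu = 0.5 with spread sigma/sqrt(n):

```python
import random
import statistics

random.seed(42)  # fixed seed for a reproducible demo

n, reps = 30, 2000
means = [statistics.fmean(random.random() for _ in range(n))
         for _ in range(reps)]

grand_mean = statistics.fmean(means)   # close to mu = 0.5
spread     = statistics.stdev(means)   # close to sqrt(1/12)/sqrt(30) ~ 0.0527
```

A histogram of `means` would look bell-shaped even though each underlying draw is uniform, which is the content of the theorem.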

 

5.4  Chebyshev's Inequality

A non-parametric bound applicable to any distribution with finite mean and variance:

 

P(|X - mu| >= k*sigma) <= 1/k^2   for any k > 0

 

Exam Tip

Chebyshev gives a guaranteed (conservative) bound. For k=2: at least 75% of data lies within 2 standard deviations. For k=3: at least 88.9%.
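Chebyshev's bound holds for any distribution; a simulation sketch with Exponential(1) (where mu = sigma = 1) shows the bound is valid but conservative:

```python
import random

random.seed(0)  # fixed seed for a reproducible demo

k = 2
samples = [random.expovariate(1.0) for _ in range(100_000)]

# Empirical frequency of |X - mu| >= k*sigma, with mu = sigma = 1.
tail_freq = sum(abs(x - 1.0) >= k for x in samples) / len(samples)
bound = 1 / k ** 2   # Chebyshev's guarantee: 0.25
# For Exp(1) the true tail probability is e^-3 ~ 0.05,
# far below the conservative 0.25 bound.
```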

 

Chapter 6

Practice MCQs — All Levels

 

Foundational (Q1-10)

Intermediate (Q11-20)

Advanced (Q21-30)

 

Q1. A fair coin is tossed twice. P(at least one Head) = ?
(A) 1/4   (B) 1/2   (C) 3/4   (D) 1
Answer: (C) 3/4   P = 1 - P(TT) = 1 - 1/4 = 3/4

Q2. Two dice are rolled. P(sum = 7) = ?
(A) 1/6   (B) 7/36   (C) 1/12   (D) 1/9
Answer: (A) 1/6   Favourable: {(1,6),(2,5),(3,4),(4,3),(5,2),(6,1)} = 6; P = 6/36 = 1/6

Q3. Bag: 5 red, 3 blue balls. 2 drawn without replacement. P(both red) = ?
(A) 25/64   (B) 5/14   (C) 5/28   (D) 10/56
Answer: (B) 5/14   P = (5/8)*(4/7) = 20/56 = 5/14

Q4. P(A)=0.6, P(B)=0.4, A and B independent. P(A intersection B) = ?
(A) 0.24   (B) 0.10   (C) 0.76   (D) 0.20
Answer: (A) 0.24   P(A intersect B) = P(A)*P(B) = 0.6*0.4 = 0.24

Q5. P(A|B)=0.3, P(B)=0.5. Find P(A intersection B).
(A) 0.6   (B) 0.15   (C) 0.8   (D) 0.2
Answer: (B) 0.15   P(A intersect B) = P(A|B)*P(B) = 0.3*0.5 = 0.15

Q6. X ~ B(10, 0.4). E[X] = ?
(A) 4   (B) 2   (C) 6   (D) 2.4
Answer: (A) 4   E[X] = np = 10*0.4 = 4

Q7. X ~ B(10, 0.4). Var(X) = ?
(A) 4   (B) 2.4   (C) 1.6   (D) 6
Answer: (B) 2.4   Var = np(1-p) = 10*0.4*0.6 = 2.4

Q8. X ~ Poisson(3). P(X = 0) = ?
(A) 1/e^3   (B) 3/e   (C) e^3   (D) 0
Answer: (A) 1/e^3   P(X=0) = e^(-3)*3^0/0! = e^(-3) = 1/e^3

Q9. X ~ N(10, 4). P(X < 10) = ?
(A) 0.25   (B) 0.75   (C) 0.50   (D) 1
Answer: (C) 0.50   Normal is symmetric about its mean; P(X < mu) = 0.50

Q10. X is uniform on [2, 8]. E[X] = ?
(A) 4   (B) 5   (C) 6   (D) 3
Answer: (B) 5   E[X] = (a+b)/2 = (2+8)/2 = 5

 

 

Q11. Box: 4 defective, 6 good. 3 chosen. P(exactly 2 defective) = ?
(A) 3/10   (B) 6/25   (C) 12/35   (D) 1/5
Answer: (A) 3/10   C(4,2)*C(6,1)/C(10,3) = 6*6/120 = 36/120 = 3/10

Q12. P(A)=0.5, P(B|A)=0.6, P(B|A')=0.3. P(A|B) = ?
(A) 2/3   (B) 0.45   (C) 1/2   (D) 0.6
Answer: (A) 2/3   P(B)=0.5*0.6+0.5*0.3=0.45; P(A|B)=0.30/0.45=2/3

Q13. X ~ Geometric(p=0.25). E[X] = ?
(A) 4   (B) 3   (C) 0.25   (D) 12
Answer: (A) 4   E[X] = 1/p = 1/0.25 = 4

Q14. X ~ Poisson(5). P(X <= 1) = ?
(A) 6e^(-5)   (B) 5e^(-5)   (C) e^(-5)   (D) 1-6e^(-5)
Answer: (A) 6e^(-5)   P(0)+P(1) = e^(-5) + 5e^(-5) = 6e^(-5)

Q15. E[X]=3, E[Y]=4, Cov(X,Y)=2. E[XY] = ?
(A) 14   (B) 12   (C) 10   (D) 6
Answer: (A) 14   Cov(X,Y)=E[XY]-E[X]*E[Y] => E[XY]=2+12=14

Q16. X ~ N(50, 25). P(40 < X < 60) = ?
(A) 0.9544   (B) 0.6827   (C) 0.9974   (D) 0.50
Answer: (A) 0.9544   40 and 60 are +/-2sigma from mu=50 (sigma=5); by the empirical rule ~95.44%

Q17. P(A union B)=0.8, P(A)=0.5, P(B)=0.6. P(A intersection B) = ?
(A) 0.3   (B) 0.1   (C) 0.4   (D) 0.2
Answer: (A) 0.3   P(A intersect B) = 0.5+0.6-0.8 = 0.3

Q18. Var(X)=9, Var(Y)=16, independent. Var(2X-3Y) = ?
(A) 180   (B) 108   (C) 144   (D) 36
Answer: (A) 180   4*Var(X)+9*Var(Y) = 4*9+9*16 = 36+144 = 180

Q19. X ~ B(6, 0.5). P(X >= 4) = ?
(A) 11/32   (B) 21/64   (C) 15/64   (D) 5/16
Answer: (A) 11/32   [C(6,4)+C(6,5)+C(6,6)]/2^6 = (15+6+1)/64 = 22/64 = 11/32

Q20. CLT: X_bar from n=100, mu=50, sigma=10. P(X_bar > 51) = ?
(A) 0.1587   (B) 0.8413   (C) 0.3413   (D) 0.6587
Answer: (A) 0.1587   SE=10/sqrt(100)=1; Z=(51-50)/1=1; P(Z>1)=1-Phi(1)~0.1587

 

 

Q21. 3 machines produce 50%, 30%, 20% of output with defect rates 2%, 3%, 4%. Given a defective item, P(it came from machine 3) = ?
(A) 8/27   (B) 20/45   (C) 4/13   (D) 1/5
Answer: (A) 8/27   P(D)=.5*.02+.3*.03+.2*.04=.027; P(M3|D)=0.008/0.027=8/27

Q22. X,Y ~ independent N(0,1). P(X^2+Y^2 <= 4) = ?
(A) 1-e^(-2)   (B) e^(-2)   (C) 1-e^(-4)   (D) 0.5
Answer: (A) 1-e^(-2)   X^2+Y^2 ~ Chi-sq(2) = Exponential with rate 1/2; P(<=4) = 1-e^(-2) ~ 0.865

Q23. E[X^2]=13, Var(X)=4. E[X] = ?
(A) 3   (B) +/-3   (C) 9   (D) Cannot determine
Answer: (B) +/-3   Var=E[X^2]-(E[X])^2 => 4=13-(E[X])^2 => (E[X])^2=9 => E[X]=+/-3

Q24. X ~ Poisson(lambda). P(X=1)=P(X=2). Find lambda.
(A) 1   (B) 2   (C) 0.5   (D) 3
Answer: (B) 2   lambda*e^(-lambda) = lambda^2*e^(-lambda)/2 => 1 = lambda/2 => lambda = 2

Q25. Cov(X,Y)=6, sigma_X=3, sigma_Y=4. Corr(X,Y) = ?
(A) 0.5   (B) 0.75   (C) 2   (D) 0.6
Answer: (A) 0.5   rho = 6/(3*4) = 6/12 = 0.5

Q26. X ~ U[0,1]. Y = -ln(X). What distribution is Y?
(A) Uniform   (B) Exponential(1)   (C) Normal   (D) Geometric
Answer: (B) Exponential(1)   CDF: P(Y<=y)=P(-lnX<=y)=P(X>=e^(-y))=1-e^(-y); this is Exp(1)

Q27. n fair coins. P(all heads) <= 0.01. Minimum n = ?
(A) 6   (B) 7   (C) 5   (D) 8
Answer: (B) 7   (1/2)^n <= 0.01 => n >= log2(100) ~ 6.64 => n = 7

Q28. P(A)=0.4, P(B)=0.5, P(A' intersect B')=0.2. Are A, B independent?
(A) Yes   (B) No   (C) Insufficient data   (D) Mutually exclusive
Answer: (B) No   P(A union B)=0.8, so P(A intersect B)=0.4+0.5-0.8=0.1; P(A)*P(B)=0.20 != 0.1; not independent

Q29. E[X]=2, E[X^2]=8. Var(3X+5) = ?
(A) 36   (B) 9   (C) 45   (D) 12
Answer: (A) 36   Var(X)=8-4=4; Var(3X+5)=9*Var(X)=9*4=36

Q30. Chebyshev: E[X]=10, Var(X)=4. P(|X-10|>=4) <= ?
(A) 1/4   (B) 1/2   (C) 1/3   (D) 1/16
Answer: (A) 1/4   sigma=2, so 4 = 2sigma (k=2); P(|X-mu|>=2sigma) <= 1/k^2 = 1/4

 

 

Chapter 7

Quick Formula Reference Sheet

 

Core Probability Rules

P(A') = 1 - P(A)

 

P(A union B) = P(A) + P(B) - P(A intersection B)

 

P(A|B) = P(A intersection B) / P(B)

 

P(A intersection B) = P(A)*P(B)  [if independent]

 

Bayes: P(Bi|A) = P(A|Bi)*P(Bi) / Sum[P(A|Bj)*P(Bj)]

 

 

Expectation & Variance

E[aX+b] = a*E[X] + b

 

Var(aX+b) = a^2*Var(X)

 

Var(X) = E[X^2] - (E[X])^2

 

Cov(X,Y) = E[XY] - E[X]*E[Y]

 

Correlation rho = Cov(X,Y) / (sigma_X * sigma_Y)

 

 

Key Distributions (Mean | Variance)

Binomial B(n,p):      mu = np          |  sigma^2 = np(1-p)

 

Poisson P(lambda):    mu = lambda      |  sigma^2 = lambda

 

Geometric Geom(p):    mu = 1/p         |  sigma^2 = (1-p)/p^2

 

Uniform U[a,b]:       mu = (a+b)/2     |  sigma^2 = (b-a)^2/12

 

Normal N(mu,sig^2):   Standardise Z=(X-mu)/sigma   Z~N(0,1)

 

Exponential Exp(lam): mu = 1/lambda    |  sigma^2 = 1/lambda^2

 

 

Limit Theorems

LLN:       X_bar_n --> mu  as n --> infinity

 

CLT:       sqrt(n)*(X_bar - mu)/sigma --> N(0,1)  as n --> infinity

 

Chebyshev: P(|X-mu| >= k*sigma) <= 1/k^2

 

Markov:    P(X >= a) <= E[X]/a  (for X>=0, a>0)

 

 

Final Revision Note

Before every exam, review Bayes' Theorem, Binomial and Normal distributions, and the CLT. These three areas account for approximately 50% of probability questions in CAT, GATE, and GRE.

 
