Bayes' Theorem Calculator

Online calculator to compute conditional probabilities using Bayes' theorem


Bayes' Theorem

Bayes' theorem enables the computation of conditional probabilities and is fundamental to statistical inference and machine learning.

Enter probabilities
  • P(A) - probability of event A
  • P(B) - probability of event B
  • P(B|A) - conditional probability of B given A
Result: P(A|B) - the probability of A given B
Bayes' Theorem Properties

Core idea: Update probabilities based on new information

Key concepts: conditional probability, reversal of conditioning, Bayesian inference

Bayes Diagram

Bayes' theorem "reverses" conditional probabilities.
From P(B|A) we compute P(A|B).

Diagram: events A and B with their intersection A∩B, illustrating the relationship between P(A|B) and P(B|A).


What is Bayes' Theorem?

Bayes' theorem is one of the most fundamental theorems in probability theory:

  • Definition: Compute conditional probabilities by "reversing" conditioning
  • Core: Update probabilities based on new observations
  • Principle: Combine prior knowledge with new evidence
  • Applications: Statistics, AI, medicine, spam filters, diagnostics
  • Interpretation: How likely is a cause given an observation?
  • Related: Bayesian statistics, machine learning, expert systems

Interpretation of Bayes' Theorem

Bayes' theorem addresses the problem of "inverse probability":

Forward direction
  • Given: P(B|A) = "How likely is B when A occurs?"
  • Example: "How likely is fever given influenza?"
  • Note: This direction is often known or measurable
Inverse direction
  • Sought: P(A|B) = "How likely is A when B is observed?"
  • Example: "How likely is influenza given fever?"
  • Solution: Bayes' theorem computes this inverse probability
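To make the inversion concrete, here is a minimal Python sketch of the influenza/fever example; the numbers (prevalence, fever rates) are purely illustrative assumptions, not clinical values:

```python
# Illustrative numbers only - not clinical data
p_flu = 0.02                  # P(A): prior probability of influenza
p_fever_given_flu = 0.90      # P(B|A): forward direction, often known
p_fever_given_no_flu = 0.05   # P(B|¬A): fever without influenza

# Total probability of observing fever
p_fever = p_fever_given_flu * p_flu + p_fever_given_no_flu * (1 - p_flu)

# Inverse direction via Bayes' theorem: P(A|B)
p_flu_given_fever = p_fever_given_flu * p_flu / p_fever
print(f"P(influenza | fever) = {p_flu_given_fever:.3f}")  # ≈ 0.269
```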

Applications of Bayes' Theorem

Bayes' theorem underpins many modern scientific and technological applications:

Medicine & Diagnostics
  • Medical diagnostic systems
  • Interpretation of lab results
  • Epidemiological studies
  • Therapy risk assessment
Artificial Intelligence
  • Naive Bayes classifiers
  • Spam email filters
  • Speech recognition and NLP
  • Bayesian neural networks
Statistics & Research
  • Bayesian statistics and inference
  • A/B testing and experiments
  • Market research and surveys
  • Scientific hypothesis testing
Security & Risk
  • Fraud and anomaly detection
  • Risk analysis in finance
  • Quality control in production
  • Cybersecurity and intrusion detection

Formulas for Bayes' Theorem

Basic formula
\[P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)}\]

Conditional probability of A given B
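As a minimal sketch, the basic formula maps directly onto a small helper function (the name bayes is ours, not a library function); it reproduces the calculator's default result:

```python
def bayes(p_b_given_a: float, p_a: float, p_b: float) -> float:
    """Return P(A|B) = P(B|A) · P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Calculator defaults: P(A) = 0.20, P(B) = 0.45, P(B|A) = 0.60
print(bayes(0.60, 0.20, 0.45))  # 0.2666... ≈ 26.67 %
```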

Extended form
\[P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B|A) \cdot P(A) + P(B|\overline{A}) \cdot P(\overline{A})}\]

With full decomposition of the denominator
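When P(B) is not given directly, the denominator can be assembled from the two branches of the decomposition; a sketch under that assumption (the helper name is ours):

```python
def bayes_extended(p_b_given_a: float, p_a: float, p_b_given_not_a: float) -> float:
    """Return P(A|B), computing P(B) by the law of total probability."""
    p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
    return p_b_given_a * p_a / p_b

# Medical example from below: P(K) = 0.001, P(T|K) = 0.99, P(T|¬K) = 0.05
print(bayes_extended(0.99, 0.001, 0.05))  # ≈ 0.0194, i.e. about 1.94 %
```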

Bayesian terminology
\[\text{Posterior} = \frac{\text{Likelihood} \times \text{Prior}}{\text{Evidence}}\]

Classical Bayesian interpretation

Proportional form
\[P(A|B) \propto P(B|A) \cdot P(A)\]

Without normalization constant

Logarithmic form
\[\log P(A|B) = \log P(B|A) + \log P(A) - \log P(B)\]

For numerical stability
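In log space the products become sums, which avoids underflow when many small probabilities are multiplied; a minimal sketch:

```python
import math

def log_posterior(log_likelihood: float, log_prior: float, log_evidence: float) -> float:
    """log P(A|B) = log P(B|A) + log P(A) - log P(B)."""
    return log_likelihood + log_prior - log_evidence

lp = log_posterior(math.log(0.60), math.log(0.20), math.log(0.45))
print(math.exp(lp))  # 0.2666... - matches the direct computation
```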

Odds form
\[\frac{P(A|B)}{P(\overline{A}|B)} = \frac{P(B|A)}{P(B|\overline{A})} \cdot \frac{P(A)}{P(\overline{A})}\]

Bayes factor × prior odds = posterior odds
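The odds form is convenient when working with Bayes factors; a sketch with the calculator's default numbers, deriving P(B|¬A) from P(B) via the law of total probability:

```python
# Calculator defaults: P(A) = 0.20, P(B) = 0.45, P(B|A) = 0.60
p_a, p_b, p_b_given_a = 0.20, 0.45, 0.60

# P(B|¬A) follows from P(B) = P(B|A)·P(A) + P(B|¬A)·P(¬A)
p_b_given_not_a = (p_b - p_b_given_a * p_a) / (1 - p_a)  # 0.4125

prior_odds = p_a / (1 - p_a)                  # 0.25
bayes_factor = p_b_given_a / p_b_given_not_a  # ≈ 1.4545
posterior_odds = bayes_factor * prior_odds    # ≈ 0.3636

p_a_given_b = posterior_odds / (1 + posterior_odds)
print(p_a_given_b)  # 0.2666... ≈ 26.67 %, consistent with the basic formula
```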

Example calculations for Bayes' theorem

Example 1: Medical diagnostics
Disease K, positive test T
Given
  • P(K) = 0.1% (disease is rare)
  • P(T|K) = 99% (test is highly sensitive)
  • P(T|¬K) = 5% (5% false-positive rate)
Sought: P(K|T) - probability of disease given a positive test
Bayes calculation
\[P(K|T) = \frac{P(T|K) \cdot P(K)}{P(T)}\]
\[P(T) = P(T|K) \cdot P(K) + P(T|\overline{K}) \cdot P(\overline{K})\]
\[P(T) = 0.99 \times 0.001 + 0.05 \times 0.999 \approx 0.051\]
\[P(K|T) = \frac{0.99 \times 0.001}{0.051} \approx 1.94\%\]
Surprising insight: Despite the 99% sensitivity of the test, the probability of disease given a positive test is only about 2%! This is due to the low prevalence of the disease.
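A quick numerical check of this example (a self-contained sketch):

```python
p_k = 0.001             # P(K): prevalence of the disease
p_t_given_k = 0.99      # P(T|K): sensitivity of the test
p_t_given_not_k = 0.05  # P(T|¬K): false-positive rate

p_t = p_t_given_k * p_k + p_t_given_not_k * (1 - p_k)  # ≈ 0.0509
p_k_given_t = p_t_given_k * p_k / p_t
print(f"P(K|T) = {p_k_given_t:.4f}")  # ≈ 0.0194, about 1.94 %
```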
Example 2: Spam email filter
Word "WIN", email is spam
Given
  • P(Spam) = 30% (30% of emails are spam)
  • P("WIN"|Spam) = 80% (80% of spam contains "WIN")
  • P("WIN"|¬Spam) = 2% (2% of normal emails contain "WIN")
Sought: P(Spam|"WIN") - probability of spam given the word "WIN"
Step-by-step
\[P(\text{"WIN"}) = 0.8 \times 0.3 + 0.02 \times 0.7\]
\[= 0.24 + 0.014 = 0.254\]
\[P(\text{Spam}|\text{"WIN"}) = \frac{0.8 \times 0.3}{0.254}\]
\[= \frac{0.24}{0.254} \approx 94.5\%\]
Practical application: An email containing the word "WIN" is spam with a probability of 94.5% - a very strong indicator!
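The same check for the spam example:

```python
p_spam = 0.30
p_win_given_spam = 0.80
p_win_given_ham = 0.02

p_win = p_win_given_spam * p_spam + p_win_given_ham * (1 - p_spam)  # 0.254
p_spam_given_win = p_win_given_spam * p_spam / p_win
print(f"P(Spam | 'WIN') = {p_spam_given_win:.3f}")  # ≈ 0.945
```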
Example 3: Default values from the calculator
P(A) = 20%, P(B) = 45%, P(B|A) = 60%
Direct calculation
\[P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)}\]
\[= \frac{0.60 \times 0.20}{0.45}\]
\[= \frac{0.12}{0.45} \approx 0.2667\]
Result
\[P(A|B) \approx 26.67\%\]
Interpretation:
The probability of A increases from 20% to 26.67% when B is observed.
Verification: Reversing the calculation
  • Given P(A), P(B), P(B|A), sought P(A|B): P(B|A) × P(A) / P(B) = 26.67%
  • Given P(A), P(B), P(A|B), sought P(B|A): P(A|B) × P(B) / P(A) = 60.00%
Consistency check: Both directions lead to consistent results
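A sketch verifying both directions of the check above:

```python
p_a, p_b, p_b_given_a = 0.20, 0.45, 0.60

# Forward: P(A|B) from P(B|A)
p_a_given_b = p_b_given_a * p_a / p_b
print(f"P(A|B) = {p_a_given_b:.4f}")        # 0.2667

# Reverse: recover P(B|A) from the result - consistency check
p_b_given_a_check = p_a_given_b * p_b / p_a
print(f"P(B|A) = {p_b_given_a_check:.4f}")  # 0.6000
```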

Mathematical foundations of Bayes' theorem

Bayes' theorem, named after Thomas Bayes (1701-1761), is a direct consequence of the definition of conditional probabilities and forms the basis of Bayesian statistics. It changed the understanding of probability as a measure of uncertainty and belief.

Historical development

Bayes' theorem has a fascinating history and deep philosophical implications:

  • Thomas Bayes (1763): Original formulation in "An Essay towards solving a Problem in the Doctrine of Chances"
  • Pierre-Simon Laplace: Further development and popularization of Bayesian methods
  • 20th century: Debate between frequentist and Bayesian statistics
  • Computer age: Renaissance due to MCMC methods and machine learning
  • Modern applications: AI, big data, and probabilistic programming

Philosophical interpretations

Bayes' theorem leads to various interpretations of probability:

Subjective probability

Probabilities represent a degree of belief or conviction about the truth of a statement.

Epistemic interpretation

Probabilities measure the extent of our knowledge or ignorance about the world.

Logical probability

Probabilities are logical relationships between statements, similar to logical inference.

Objective Bayes interpretation

Use of non-informative priors to achieve "objective" Bayesian inference.

Mathematical derivation

Bayes' theorem follows directly from the definition of conditional probabilities:

Starting point: Definition of conditional probability
\[P(A|B) = \frac{P(A \cap B)}{P(B)} \quad \text{and} \quad P(B|A) = \frac{P(A \cap B)}{P(A)}\]
Rearrangement:
\[P(A \cap B) = P(A|B) \cdot P(B) = P(B|A) \cdot P(A)\]
Bayes' theorem:
\[P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)}\]

Generalizations and extensions

Bayes' theorem can be extended in several directions:

Total probability

For a partition {A₁, A₂, ..., Aₙ} of the sample space: \(P(B) = \sum_{i=1}^{n} P(B|A_i) \cdot P(A_i)\)
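A minimal sketch of Bayes' theorem over a partition {A₁, ..., Aₙ} (the function name is ours, and the numbers are illustrative):

```python
def bayes_partition(priors, likelihoods):
    """Posterior P(A_i|B) for each event of a partition.

    priors[i]      = P(A_i), summing to 1
    likelihoods[i] = P(B|A_i)
    """
    evidence = sum(p * l for p, l in zip(priors, likelihoods))  # P(B) by total probability
    return [p * l / evidence for p, l in zip(priors, likelihoods)]

# Three mutually exclusive hypotheses
print(bayes_partition([0.5, 0.3, 0.2], [0.1, 0.6, 0.9]))
# ≈ [0.122, 0.439, 0.439] - the posteriors sum to 1
```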

Continuous variants

For continuous random variables with densities: \(f(x|y) = \frac{f(y|x) \cdot f(x)}{f(y)}\)

Multiple evidences

Sequential application: \(P(A|B_1, B_2) = \frac{P(B_2|A, B_1) \cdot P(A|B_1)}{P(B_2|B_1)}\)
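Under the common extra assumption that the pieces of evidence are conditionally independent given A (so P(B₂|A, B₁) = P(B₂|A)), sequential updating reduces to reusing each posterior as the next prior; a sketch with illustrative numbers:

```python
def update(prior, p_e_given_a, p_e_given_not_a):
    """One Bayesian update step: returns P(A | new evidence)."""
    evidence = p_e_given_a * prior + p_e_given_not_a * (1 - prior)
    return p_e_given_a * prior / evidence

p_a = 0.20                      # prior P(A)
p_a = update(p_a, 0.60, 0.30)   # after evidence B1
p_a = update(p_a, 0.70, 0.40)   # after evidence B2 - previous posterior is the new prior
print(f"P(A | B1, B2) = {p_a:.4f}")  # ≈ 0.4667
```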

Bayesian networks

Graphical models for complex conditional dependencies between multiple variables.

Modern applications in computer science

Machine Learning

Naive Bayes classifiers, Bayesian optimization, probabilistic graphical models, Gaussian processes.
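As an illustration of the naive Bayes idea, here is a tiny spam scorer; the per-word likelihoods are made-up illustrative values, and treating words as conditionally independent given the class is exactly the "naive" assumption:

```python
import math

# Assumed per-word likelihoods P(word | class) - illustrative only
p_word_given_spam = {"win": 0.80, "free": 0.60, "meeting": 0.05}
p_word_given_ham  = {"win": 0.02, "free": 0.10, "meeting": 0.40}
p_spam = 0.30

def spam_probability(words):
    """P(spam | words) under the naive independence assumption, computed in log space."""
    log_spam = math.log(p_spam)
    log_ham = math.log(1 - p_spam)
    for w in words:
        log_spam += math.log(p_word_given_spam[w])
        log_ham += math.log(p_word_given_ham[w])
    m = max(log_spam, log_ham)                        # normalize stably
    s, h = math.exp(log_spam - m), math.exp(log_ham - m)
    return s / (s + h)

print(spam_probability(["win", "free"]))  # ≈ 0.99 -> very likely spam
print(spam_probability(["meeting"]))      # ≈ 0.05 -> likely legitimate
```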

Data analysis

A/B testing, multi-armed bandits, online learning, adaptive algorithms.

Uncertainty quantification

Bayesian neural networks, credible intervals, risk analysis in critical systems.

Decision theory

Optimal decisions under uncertainty, game theory, auction mechanisms.

Common misconceptions and pitfalls

Base rate fallacy

Neglecting the base rate (prior) when intuitively assessing probabilities, as shown in the medical example.

Prosecutor's fallacy

Confusing P(Evidence|Innocence) with P(Innocence|Evidence) in legal contexts.

Prior dependence

Strong dependence of results on the choice of prior distribution, especially with limited data.

Computational challenges

Computing the denominator (evidence) can become intractable for complex models.

Computational aspects

MCMC methods

Markov Chain Monte Carlo for complex posteriors: Metropolis-Hastings, Gibbs sampling, Hamiltonian Monte Carlo.
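As a minimal illustration (not a production sampler), here is random-walk Metropolis applied to the posterior of a coin's bias θ after, say, 7 heads in 10 tosses with a uniform prior; the data and tuning constants are assumptions for the sketch:

```python
import math, random

heads, tosses = 7, 10  # illustrative data

def log_unnorm_posterior(theta):
    """log(likelihood × prior) with a uniform prior on (0, 1)."""
    if not 0.0 < theta < 1.0:
        return float("-inf")
    return heads * math.log(theta) + (tosses - heads) * math.log(1.0 - theta)

samples, theta = [], 0.5
for _ in range(20000):
    proposal = theta + random.gauss(0.0, 0.1)  # symmetric random-walk proposal
    # Accept with probability min(1, posterior ratio) - the evidence cancels out
    if math.log(random.random()) < log_unnorm_posterior(proposal) - log_unnorm_posterior(theta):
        theta = proposal
    samples.append(theta)

posterior_mean = sum(samples[5000:]) / len(samples[5000:])
print(posterior_mean)  # ≈ 0.667, the exact mean of the Beta(8, 4) posterior
```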

Variational inference

Approximate methods for fast posterior approximation in large models and big data scenarios.

Summary

Bayes' theorem is more than just a mathematical formula - it represents a fundamental paradigm for handling uncertainty and knowledge. From Thomas Bayes' original motivations to modern AI systems, the theorem has transformed how we think about probability, learning and decision-making. In a world of increasing complexity and data, the Bayesian approach provides a principled framework for rational inference and continuous learning.


More Risk and Probability Functions

Birthday Paradox
Bayes Theorem
Central Limit Theorem