Bayes' theorem

P(parameters given data) ~ P(data given parameters) × P(parameters)

  • P(parameters given data): the posterior
  • P(data given parameters): the likelihood
  • P(parameters): the prior

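The proportionality above can be sketched on a discrete parameter grid: multiply likelihood by prior pointwise, then normalize. The coin-flip model and all numbers below are assumed for illustration, not taken from the text.

```python
# Sketch of posterior ~ likelihood x prior on a discrete grid
# (hypothetical example: coin bias theta, data = 7 heads in 10 flips)
from math import comb

thetas = [0.1, 0.3, 0.5, 0.7, 0.9]   # candidate parameter values
prior = [0.2] * 5                     # uniform prior over the grid

heads, flips = 7, 10
# likelihood: P(data | theta) under a binomial model
likelihood = [comb(flips, heads) * t**heads * (1 - t)**(flips - heads)
              for t in thetas]

# posterior ~ likelihood x prior, then normalize so it sums to 1
unnorm = [l * p for l, p in zip(likelihood, prior)]
posterior = [u / sum(unnorm) for u in unnorm]
```

With a uniform prior the posterior simply follows the likelihood, so the most probable grid value here is theta = 0.7.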

Derivation of Bayes' rule + chain rule for probabilities

It is astonishingly simple.
The conditional probability P(A|B) is defined as P(A|B) := P(A,B) / P(B).
So P(A,B) = P(A|B) P(B) - if P(B) > 0. This is the chain rule.

And from this, Bayes' rule follows:

P(A|B) = P(B|A) P(A) / P(B)
<=> P(A|B) P(B) = P(B|A) P(A)
<=> P(A,B) = P(A,B)

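The identity in the last derivation step can be checked numerically: both P(A|B) P(B) and P(B|A) P(A) recover the same joint probability P(A,B). The joint distribution below is an assumed toy example.

```python
# Numeric check of P(A|B) P(B) = P(B|A) P(A) = P(A,B)
# using an assumed joint distribution over two binary events
p_joint = {  # P(A, B) for A, B in {True, False}; values sum to 1
    (True, True): 0.12, (True, False): 0.18,
    (False, True): 0.28, (False, False): 0.42,
}
# marginals P(A) and P(B)
p_A = sum(v for (a, b), v in p_joint.items() if a)
p_B = sum(v for (a, b), v in p_joint.items() if b)
# conditionals from the definition P(X|Y) = P(X,Y) / P(Y)
p_A_given_B = p_joint[(True, True)] / p_B
p_B_given_A = p_joint[(True, True)] / p_A
```

Here P(A|B) P(B) and P(B|A) P(A) both equal the joint entry 0.12, as the derivation says they must.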
Simple example

Computing the probability that a person in NYC is a hipster, given some data (“drinking PBR”). A nice example by Student Dave.
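A computation along these lines can be sketched as follows. The probabilities below are assumed placeholder numbers, not the figures from Student Dave's tutorial; the denominator P(PBR) comes from the law of total probability.

```python
# Hypothetical numbers illustrating
# P(hipster | PBR) = P(PBR | hipster) * P(hipster) / P(PBR)
p_hipster = 0.10           # prior: fraction of hipsters in NYC (assumed)
p_pbr_given_hipster = 0.6  # likelihood: hipsters drinking PBR (assumed)
p_pbr_given_other = 0.1    # non-hipsters drinking PBR (assumed)

# total probability of the data "drinking PBR"
p_pbr = (p_pbr_given_hipster * p_hipster
         + p_pbr_given_other * (1 - p_hipster))

# posterior via Bayes' rule
p_hipster_given_pbr = p_pbr_given_hipster * p_hipster / p_pbr
```

Even with a strong likelihood, the small prior keeps the posterior well below certainty, which is the usual lesson of such examples.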

public/bayes_theorem.txt · Last modified: 2014/01/01 19:11 (external edit)