by Francesco Boi
Last Updated September 11, 2019 09:19 AM

In the book The Elements of Statistical Learning, when introducing Linear Discriminant Analysis, it says:

A simple application of Bayes theorem gives us

$Pr(G=k|X=x) = \frac{f_k(x)\pi_k}{\sum_{l=1}^K f_l(x)\pi_l}$

where $\pi_k$ is the prior probability of class $k$ and $f_k(x)$ is the class conditional probability.

- What is the class conditional probability? Is it $Pr(X=x|G=k)$?
- How is the above equation derived from Bayes' theorem? I know $Pr(G=k|X=x) = \frac{Pr(X=x|G=k)Pr(G=k)}{Pr(X=x)}$

I know that $Pr(G=k)=\pi_k$, but I do not know how to derive the rest of the equation.
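The formula can be checked numerically. Below is a minimal sketch that evaluates the posterior $\frac{f_k(x)\pi_k}{\sum_l f_l(x)\pi_l}$ for a hypothetical two-class problem with 1-D Gaussian class conditionals; the priors and the Gaussian parameters are made-up values for illustration, not from the book. The denominator is just the law of total probability, $Pr(X=x)=\sum_l Pr(X=x|G=l)Pr(G=l)$.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x -- plays the role of f_k(x)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical priors pi_k and class-conditional parameters (assumed values).
priors = [0.3, 0.7]
params = [(0.0, 1.0), (2.0, 1.0)]  # (mu_k, sigma_k) for each class

x = 1.0
# Numerators f_k(x) * pi_k of the posterior formula.
numerators = [gaussian_pdf(x, mu, s) * p for (mu, s), p in zip(params, priors)]
# Denominator Pr(X=x), by the law of total probability.
evidence = sum(numerators)
posteriors = [n / evidence for n in numerators]

print(posteriors)  # Pr(G=k | X=x) for each class; the values sum to 1
```

Note that at $x=1$, which is equidistant from both means, the two densities are equal, so the posteriors reduce to the priors.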
