

A generic classification problem involves a random variable $X$ drawn from one of $K$ classes (populations). A discriminant rule divides the data space into $K$ disjoint regions, one for each class.
The Maximum Likelihood (ML) discriminant rule allocates $x$ to the population with the largest likelihood at $x$:
$$ d(x)=\arg\max_k f_k(x), $$
where $f_k$ denotes the density (likelihood) of class $k$.
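As an illustration, here is a minimal Python sketch of the ML rule when the class-conditional densities $f_k$ are known; the three Gaussian components and their parameters are made up for the example:

```python
# Minimal sketch of the ML discriminant rule, assuming the class-conditional
# densities f_k are fully known (here: three illustrative univariate Gaussians).
import numpy as np
from scipy.stats import norm

# Hypothetical class-conditional densities f_k, one per class.
densities = [
    norm(loc=0.0, scale=1.0).pdf,   # class 1
    norm(loc=3.0, scale=2.0).pdf,   # class 2
    norm(loc=-2.0, scale=0.5).pdf,  # class 3
]

def ml_discriminant(x):
    """Allocate x to the class whose likelihood f_k(x) is largest."""
    likelihoods = [f(x) for f in densities]
    return int(np.argmax(likelihoods)) + 1  # 1-based class label

print(ml_discriminant(0.4))   # -> 1
print(ml_discriminant(4.0))   # -> 2
```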
In a simple univariate case with $K=2$, where $K_1\sim\mathcal{N}(\mu_1,\sigma_1^2)$ and $K_2\sim\mathcal{N}(\mu_2,\sigma_2^2)$, the ML rule assigns $x$ to $K_1$ if and only if $f_1(x)>f_2(x)$. Substituting the Gaussian PDFs and taking logarithms, this inequality expands to
$$ x^2(\frac{1}{\sigma^2_2}-\frac{1}{\sigma^2_1})+x(\frac{2\mu_1}{\sigma_1^2}-\frac{2\mu_2}{\sigma_2^2})+\frac{\mu_2^2}{\sigma_2^2}-\frac{\mu_1^2}{\sigma_1^2}+2\log\frac{\sigma_2}{\sigma_1}>0 $$
When $\sigma_1=\sigma_2$, the quadratic term vanishes and the rule becomes linear in $x$:
$$ 2x(\mu_1-\mu_2)+\mu_2^2-\mu_1^2>0, $$
i.e. for $\mu_1>\mu_2$, $x$ is assigned to $K_1$ exactly when $x>\frac{\mu_1+\mu_2}{2}$, the midpoint between the two means.
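As a sanity check on the algebra, the sketch below (with illustrative, assumed parameters $\mu_1=0,\sigma_1=1,\mu_2=3,\sigma_2=2$) compares the expanded quadratic inequality against a direct comparison of the two Gaussian densities:

```python
# Sketch verifying that the expanded quadratic inequality agrees with the
# direct ML comparison f_1(x) > f_2(x); parameters are illustrative only.
import numpy as np
from scipy.stats import norm

mu1, s1 = 0.0, 1.0   # class 1: N(mu1, s1^2)
mu2, s2 = 3.0, 2.0   # class 2: N(mu2, s2^2)

def assign_by_inequality(x):
    """Evaluate the expanded quadratic inequality; True means 'assign to K_1'."""
    lhs = (x**2 * (1/s2**2 - 1/s1**2)
           + x * (2*mu1/s1**2 - 2*mu2/s2**2)
           + mu2**2/s2**2 - mu1**2/s1**2
           + 2*np.log(s2/s1))
    return lhs > 0

def assign_by_density(x):
    """Direct ML rule: True iff f_1(x) > f_2(x)."""
    return norm(mu1, s1).pdf(x) > norm(mu2, s2).pdf(x)

xs = np.linspace(-5, 8, 200)
assert all(assign_by_inequality(x) == assign_by_density(x) for x in xs)
```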