Gaussian Mixture Model (GMM)

Definition - What does Gaussian Mixture Model (GMM) mean?

A Gaussian mixture model (GMM) is a category of probabilistic model which assumes that all generated data points are derived from a mixture of a finite number of Gaussian distributions with unknown parameters. The parameters of a Gaussian mixture model are typically estimated with the iterative expectation-maximization (EM) algorithm, or by maximum a posteriori (MAP) estimation starting from a well-trained prior model. Gaussian mixture models are very useful when it comes to modeling data, especially data which comes from several groups.
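The iterative estimation mentioned above can be sketched in plain Python. The following is a minimal, illustrative EM loop for a two-component one-dimensional GMM; the initialization strategy, iteration count, and toy data are all assumptions made for the example, not part of any particular library's API.

```python
import math
import random

def gauss_pdf(x, mu, var):
    """Univariate Gaussian density."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def fit_gmm_em(data, n_iter=50):
    """Fit a two-component 1-D GMM with EM.

    Returns (weights, means, variances), each a list of length 2.
    """
    # Crude initialization (an assumption for this sketch):
    # split the sorted data in half and use each half's mean.
    s = sorted(data)
    half = len(s) // 2
    mu = [sum(s[:half]) / half, sum(s[half:]) / (len(s) - half)]
    var = [1.0, 1.0]
    w = [0.5, 0.5]

    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in data:
            p = [w[k] * gauss_pdf(x, mu[k], var[k]) for k in range(2)]
            total = sum(p)
            resp.append([pk / total for pk in p])
        # M-step: re-estimate weights, means, and variances
        # from the responsibility-weighted data.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            var[k] = max(var[k], 1e-6)  # guard against variance collapse
    return w, mu, var

# Toy data: two well-separated groups, as in the "several groups" case above.
random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(200)] + \
       [random.gauss(8.0, 1.0) for _ in range(200)]
weights, means, variances = fit_gmm_em(data)
```

After fitting, the two estimated means should land near the true group centers (0 and 8), and the mixture weights should be close to 0.5 each.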

Techopedia explains Gaussian Mixture Model (GMM)

Mathematically, a Gaussian mixture model is an example of a parametric probability density function, which can be represented as a weighted sum of Gaussian component densities. In other words, a Gaussian mixture model is the weighted sum of M component Gaussian densities, and mathematically it is p(x|λ) = Σ_{i=1}^{M} w_i g(x|μ_i, Σ_i), where w_i (i = 1, …, M) are the mixture weights, x is a D-dimensional continuous-valued data vector, and g(x|μ_i, Σ_i) are the component Gaussian densities. A Gaussian mixture model is specified by the covariance matrices, mixture weights and mean vectors of every component density present. Even when each component uses only a diagonal covariance matrix, a linear combination of such Gaussians is fully capable of modeling the correlations between feature vector elements. Another feature of the Gaussian mixture model is the formation of smooth approximations to arbitrarily shaped densities.
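The weighted-sum formula p(x|λ) = Σ_{i=1}^{M} w_i g(x|μ_i, Σ_i) can be evaluated directly for the diagonal-covariance case described above. The component parameters below are illustrative assumptions chosen for the example, not values from the article.

```python
import math

def diag_gauss_density(x, mu, diag_cov):
    """Multivariate Gaussian density g(x|mu, Sigma) with diagonal covariance.

    x, mu: D-dimensional vectors; diag_cov: the D diagonal entries of Sigma.
    """
    d = len(x)
    det = 1.0   # determinant of the diagonal covariance matrix
    quad = 0.0  # quadratic form (x - mu)^T Sigma^{-1} (x - mu)
    for j in range(d):
        det *= diag_cov[j]
        quad += (x[j] - mu[j]) ** 2 / diag_cov[j]
    norm = 1.0 / math.sqrt((2 * math.pi) ** d * det)
    return norm * math.exp(-0.5 * quad)

def gmm_density(x, weights, means, diag_covs):
    """p(x|lambda) = sum_i w_i * g(x|mu_i, Sigma_i)."""
    return sum(w * diag_gauss_density(x, m, c)
               for w, m, c in zip(weights, means, diag_covs))

# Two components in 2-D (illustrative parameters).
weights = [0.6, 0.4]
means = [[0.0, 0.0], [3.0, 3.0]]
covs = [[1.0, 1.0], [2.0, 2.0]]
p = gmm_density([0.0, 0.0], weights, means, covs)
```

At the origin the first component dominates, so the density is roughly w_1/(2π) ≈ 0.095 plus a small contribution from the second component.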

Gaussian mixture models are used in biometric systems, where the parametric model helps represent speaker-specific measurements such as vocal-tract spectral features. Gaussian mixture models are also used for density estimation and are considered one of the most statistically mature techniques for clustering.
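For clustering, a fitted mixture assigns each point to the component with the highest posterior responsibility, i.e. the largest w_i g(x|μ_i, Σ_i). A minimal one-dimensional sketch, with made-up component parameters standing in for a fitted model:

```python
import math

def gauss_pdf(x, mu, var):
    """Univariate Gaussian density."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def cluster_assignments(data, weights, means, variances):
    """Assign each point to the component with the highest
    posterior score w_i * g(x|mu_i, var_i)."""
    labels = []
    for x in data:
        scores = [w * gauss_pdf(x, m, v)
                  for w, m, v in zip(weights, means, variances)]
        labels.append(scores.index(max(scores)))
    return labels

# Illustrative parameters for two 1-D components (assumed, not fitted here).
labels = cluster_assignments([-0.2, 0.1, 5.9, 6.3],
                             weights=[0.5, 0.5],
                             means=[0.0, 6.0],
                             variances=[1.0, 1.0])
# labels -> [0, 0, 1, 1]
```

Unlike hard clustering methods such as k-means, the responsibilities here are soft: a point halfway between two components receives partial membership in both, which is what makes GMM clustering a density-estimation technique as well.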
