Statistical learning via layers of mixtures of component distributions.

Deep learning is a hierarchical inference method formed by multiple successive layers of learning, which can describe complex relationships more efficiently. In this project, deep Gaussian mixture models (DGMMs) are to be explored further. A DGMM is a network of multiple layers of latent variables, where, at each layer, the variables follow a mixture of Gaussian distributions. Globally, they thus provide a nonlinear model that can describe the data in a very flexible way.
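As a rough illustration of the generative structure described above, the sketch below samples from a small two-layer model in which each layer draws a mixture component at random and applies a linear-Gaussian map to the latent variables from the layer above. All parameter values, dimensions, and function names here are illustrative assumptions, not part of any particular DGMM implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 2-layer deep Gaussian mixture: at each layer, the latent
# vector z from the layer above is passed through a randomly chosen
# linear-Gaussian component,
#   z_new = eta_j + Lambda_j @ z + u,   u ~ N(0, Psi_j).
# Parameter values are arbitrary, for illustration only.

def make_layer(n_components, d_out, d_in, rng):
    """Random parameters for one mixture layer (hypothetical values)."""
    return [
        {
            "weight": 1.0 / n_components,                  # mixing proportion
            "eta": rng.normal(size=d_out),                 # component mean
            "Lambda": rng.normal(size=(d_out, d_in)),      # loading matrix
            "Psi": np.diag(rng.uniform(0.1, 0.5, d_out)),  # noise covariance
        }
        for _ in range(n_components)
    ]

def sample_dgmm(layers, d_top, n, rng):
    """Draw n observations by propagating top-level noise down the layers."""
    z = rng.normal(size=(n, d_top))  # top layer: standard normal latents
    for layer in layers:
        weights = [c["weight"] for c in layer]
        out = np.empty((n, layer[0]["eta"].size))
        comp = rng.choice(len(layer), size=n, p=weights)
        for i in range(n):
            c = layer[comp[i]]
            noise = rng.multivariate_normal(np.zeros(c["eta"].size), c["Psi"])
            out[i] = c["eta"] + c["Lambda"] @ z[i] + noise
        z = out
    return z

# Two layers: 3 components mapping R^2 -> R^2, then 2 components R^2 -> R^5.
layers = [make_layer(3, 2, 2, rng), make_layer(2, 5, 2, rng)]
x = sample_dgmm(layers, d_top=2, n=100, rng=rng)
print(x.shape)  # (100, 5)
```

Fitting such a model (rather than merely sampling from it) typically proceeds by a stochastic EM-type algorithm, which is beyond the scope of this short sketch.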

Project members

Project Level: Honours / Masters / PhD

Professor Geoffrey McLachlan

School of Mathematics and Physics