Classification using localized mixtures of experts

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

10 Citations (Scopus)

Abstract

A mixture of experts consists of a gating network that learns to partition the input space and of expert networks assigned to those different regions. This paper focuses on the choice of the gating network. First, a localized gating network based on a mixture of linear latent variable models is proposed, extending a gating network introduced by Xu et al. that is based on Gaussian mixture models. It is shown that this localized mixture of experts model can be trained with the Expectation-Maximization algorithm. The localized model is compared, on a set of classification problems, with mixtures of experts that use single-layer or multilayer perceptrons as the gating network. It is found that the standard mixture of experts with a feed-forward network as the gate often outperforms the other models.
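As a rough illustration of the kind of model described in the abstract, the sketch below implements a toy mixture of experts with a localized, Gaussian-mixture-based gate and logistic-regression experts, trained by alternating EM-style updates. This is a minimal sketch under assumed choices (isotropic Gaussian gate components, binary logistic experts, a single responsibility-weighted gradient step as the expert M-step), not the authors' exact formulation; the class and helper names (LocalizedMoE, softmax) are hypothetical.

```python
import numpy as np


def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)


class LocalizedMoE:
    """Toy mixture of experts with a localized (Gaussian-mixture) gate.

    The gate assigns input x to expert k in proportion to
    alpha_k * N(x | mu_k, sigma_k^2 I); each expert is a logistic-regression
    classifier. Training alternates an E-step (posterior responsibilities)
    with M-steps for the gate and the experts, in the spirit of EM.
    """

    def __init__(self, n_experts=2, n_features=2, seed=0):
        rng = np.random.default_rng(seed)
        self.alpha = np.full(n_experts, 1.0 / n_experts)      # gate mixing weights
        self.mu = rng.normal(size=(n_experts, n_features))    # gate component means
        self.sigma2 = np.ones(n_experts)                      # isotropic variances
        self.W = rng.normal(scale=0.1, size=(n_experts, n_features))
        self.b = np.zeros(n_experts)

    def _gate(self, X):
        # log alpha_k + log N(x | mu_k, sigma2_k I), normalized over experts
        d = X.shape[1]
        diff = X[:, None, :] - self.mu[None, :, :]
        log_g = (np.log(self.alpha)
                 - 0.5 * d * np.log(2 * np.pi * self.sigma2)
                 - 0.5 * (diff ** 2).sum(-1) / self.sigma2)
        return softmax(log_g, axis=1)                          # shape (N, K)

    def _experts(self, X):
        # per-expert probability of class 1 (logistic experts)
        return 1.0 / (1.0 + np.exp(-(X @ self.W.T + self.b)))  # shape (N, K)

    def predict_proba(self, X):
        # mixture prediction: gate-weighted average of expert outputs
        return (self._gate(X) * self._experts(X)).sum(axis=1)

    def fit(self, X, y, n_iter=100, lr=0.5):
        for _ in range(n_iter):
            g, p = self._gate(X), self._experts(X)
            # E-step: responsibility of expert k for example (x, y)
            lik = np.where(y[:, None] == 1, p, 1.0 - p)
            h = g * lik
            h /= h.sum(axis=1, keepdims=True) + 1e-12
            # M-step for the localized gate: responsibility-weighted Gaussian updates
            nk = h.sum(axis=0) + 1e-12
            self.alpha = nk / len(X)
            self.mu = (h.T @ X) / nk[:, None]
            diff = X[:, None, :] - self.mu[None, :, :]
            self.sigma2 = (h * (diff ** 2).sum(-1)).sum(0) / (nk * X.shape[1]) + 1e-6
            # M-step for the experts: one responsibility-weighted gradient step
            grad = h * (p - y[:, None])                        # shape (N, K)
            self.W -= lr * (grad.T @ X) / len(X)
            self.b -= lr * grad.mean(axis=0)
        return self


# Usage on a toy 2-D, XOR-like classification problem
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)
model = LocalizedMoE(n_experts=4, n_features=2).fit(X, y)
print("training accuracy:", ((model.predict_proba(X) > 0.5) == y).mean())
```

On such a toy problem, a handful of localized experts can separate regions that a single linear classifier cannot, since the Gaussian gate carves the input space into soft local neighbourhoods and each expert only has to fit its own region.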

Original language: English
Title of host publication: IEE Conference Publication
Publisher: IEE
Pages: 838-843
Number of pages: 6
Edition: 470
ISBN (Print): 0852967217, 9780852967218
DOIs
Publication status: Published - 1999
Event: Proceedings of the 1999 9th International Conference on 'Artificial Neural Networks (ICANN99)' - Edinburgh, UK
Duration: 7 Sept 1999 → 10 Sept 1999

Publication series

Name: IEE Conference Publication
Number: 470
Volume: 2

Conference

Conference: Proceedings of the 1999 9th International Conference on 'Artificial Neural Networks (ICANN99)'
City: Edinburgh, UK
Period: 7/09/1999 → 10/09/1999
