Mixtures of experts estimate a posteriori probabilities

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

8 Citations (Scopus)

Abstract

The mixtures of experts (ME) model offers a modular structure suited to a divide-and-conquer approach to pattern recognition. It has a probabilistic interpretation in terms of a mixture model, which forms the basis for the error function associated with MEs. In this paper, it is shown that, for classification problems, minimizing this ME error function leads to ME outputs that estimate the a posteriori probabilities of class membership of the input vector.
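As a pointer for the reader, the sketch below spells out the standard ME error function the abstract refers to, in the usual mixtures-of-experts notation (Jacobs, Jordan et al.); the symbols (g_j for the gating outputs, phi_j for the expert densities, o_jk and y_k for the expert and combined class outputs) are assumptions for illustration and are not taken verbatim from the paper itself.

% Minimal compilable sketch of the ME likelihood-based error function
% (assumed standard notation, not copied from the paper).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
The gating network produces input-dependent mixing coefficients
$g_j(\mathbf{x})$ with $\sum_j g_j(\mathbf{x}) = 1$, and expert $j$ models a
conditional density $\phi_j(\mathbf{t}\mid\mathbf{x})$. The ME error function
is the negative log-likelihood of the resulting mixture over the training set:
\begin{equation}
  E \;=\; -\sum_{n} \ln \sum_{j} g_j(\mathbf{x}^{n})\,
          \phi_j(\mathbf{t}^{n}\mid\mathbf{x}^{n}).
\end{equation}
For classification with 1-of-$c$ target coding, the combined network output
for class $k$,
\begin{equation}
  y_k(\mathbf{x}) \;=\; \sum_{j} g_j(\mathbf{x})\, o_{jk}(\mathbf{x}),
\end{equation}
where $o_{jk}$ is expert $j$'s output for class $k$, is the quantity that
estimates the posterior probability $P(C_k\mid\mathbf{x})$ at the minimum
of $E$.
\end{document}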

Original language: English
Title of host publication: Artificial Neural Networks - ICANN 1997 - 7th International Conference, Proceedings
Editors: Wulfram Gerstner, Alain Germond, Martin Hasler, Jean-Daniel Nicoud
Publisher: Springer Verlag
Pages: 499-504
Number of pages: 6
ISBN (Print): 3540636315, 9783540636311
DOIs
Publication status: Published - 1997
Event: 7th International Conference on Artificial Neural Networks, ICANN 1997 - Lausanne, Switzerland
Duration: 8 Oct 1997 - 10 Oct 1997

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 1327

Conference

Conference: 7th International Conference on Artificial Neural Networks, ICANN 1997
Country/Territory: Switzerland
City: Lausanne
Period: 8/10/1997 - 10/10/1997
