Learning Bayesian network parameters from imperfect data: Enhancements to the EM algorithm

Rohitha Hewawasam, Kamal Premaratne

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citation

Abstract

Recent years have seen many developments in uncertainty reasoning centered around Bayesian networks (BNs). BNs allow fast and efficient probabilistic reasoning. One of the key issues researchers face in using a BN is determining its parameters and structure for a given problem. Many techniques have been developed for learning BN parameters from a dataset pertaining to a particular problem. Most of the methods developed for learning BN parameters from partially observed data have evolved around the Expectation-Maximization (EM) algorithm. In its original form, the EM algorithm is a deterministic, iterative two-step procedure that converges to the maximum-likelihood (ML) estimates. The EM algorithm mainly focuses on learning BN parameters from imperfect data where some of the values are missing. However, in many practical applications, partial observability results in a wider range of imperfections, e.g., uncertainties arising from incomplete, ambiguous, probabilistic, and belief-theoretic data. Moreover, while the EM algorithm converges to the ML estimates, it does not guarantee convergence to the underlying true parameters. In this paper, we propose an approach that enables one to learn BN parameters from a dataset containing a wider variety of imperfections. In addition, by introducing an early stopping criterion together with a new initialization method for the EM algorithm, we show how the BN parameters can be learned so that they are closer to the underlying true parameters than the converged ML estimates.
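
To make the EM procedure described above concrete, the following is a minimal, illustrative sketch of EM for a two-node Bayesian network A → B with binary variables, where some values of A are missing. The toy dataset, function names, random initialization, and the simple parameter-change stopping rule are assumptions for illustration only; the paper's proposed initialization method and early stopping criterion are more elaborate.

```python
# Minimal EM sketch for a two-node BN A -> B with binary variables,
# where some observations of A are missing (None). Illustrative only.
import random


def em_learn(data, max_iters=100, tol=1e-6, seed=0):
    """data: list of (a, b) pairs; a may be None (missing), b is 0 or 1."""
    rng = random.Random(seed)
    p_a = rng.random()                  # P(A = 1), random init (assumption)
    p_b = [rng.random(), rng.random()]  # P(B = 1 | A = a)

    for _ in range(max_iters):
        # E-step: accumulate expected counts, replacing each missing A
        # with its posterior P(A | B = b) under the current parameters.
        exp_a = [0.0, 0.0]   # expected count of A = 0, A = 1
        exp_b1 = [0.0, 0.0]  # expected count of (A = a, B = 1)
        for a, b in data:
            if a is None:
                lik1 = p_a * (p_b[1] if b else 1 - p_b[1])
                lik0 = (1 - p_a) * (p_b[0] if b else 1 - p_b[0])
                w1 = lik1 / (lik1 + lik0)
                w = [1 - w1, w1]
            else:
                w = [1 - a, a]
            for av in (0, 1):
                exp_a[av] += w[av]
                if b:
                    exp_b1[av] += w[av]

        # M-step: maximum-likelihood re-estimates from expected counts.
        new_p_a = exp_a[1] / len(data)
        new_p_b = [exp_b1[av] / exp_a[av] for av in (0, 1)]

        # Stop once parameter changes are negligible (a simple stand-in
        # for a more principled early-stopping criterion).
        delta = abs(new_p_a - p_a) + sum(abs(n - o) for n, o in zip(new_p_b, p_b))
        p_a, p_b = new_p_a, new_p_b
        if delta < tol:
            break
    return p_a, p_b


# Example usage on a toy dataset with a few missing A values.
sample = [(1, 1), (1, 1), (0, 0), (None, 1), (0, 1), (None, 0), (1, 0), (0, 0)]
print(em_learn(sample))
```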

Original language: English (US)
Title of host publication: Intelligent Computing
Subtitle of host publication: Theory and Applications V
DOIs: 10.1117/12.719290
State: Published - Nov 15 2007
Event: Intelligent Computing: Theory and Applications V - Orlando, FL, United States
Duration: Apr 9 2007 - Apr 10 2007

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 6560
ISSN (Print): 0277-786X

Other

Other: Intelligent Computing: Theory and Applications V
Country: United States
City: Orlando, FL
Period: 4/9/07 - 4/10/07

Keywords

  • Bayesian networks
  • Data uncertainties
  • EM algorithm
  • Imperfect data
  • Learning parameters

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Condensed Matter Physics


Cite this

Hewawasam, R., & Premaratne, K. (2007). Learning Bayesian network parameters from imperfect data: Enhancements to the EM algorithm. In Intelligent Computing: Theory and Applications V [65600E] (Proceedings of SPIE - The International Society for Optical Engineering; Vol. 6560). https://doi.org/10.1117/12.719290