### Abstract

Recent years have seen many developments in uncertainty reasoning centered around Bayesian networks (BNs). BNs allow fast and efficient probabilistic reasoning. One of the key issues that researchers have faced in using a BN is determining its parameters and structure for a given problem. Many techniques have been developed for learning BN parameters from a dataset pertaining to a particular problem. Most of the methods developed for learning BN parameters from partially observed data have evolved around the Expectation-Maximization (EM) algorithm. In its original form, the EM algorithm is a deterministic, iterative, two-step procedure that converges towards the maximum-likelihood (ML) estimates. The EM algorithm mainly focuses on learning BN parameters from imperfect data where some of the values are missing. However, in many practical applications, partial observability results in a wider range of imperfections, e.g., uncertainties arising from incomplete, ambiguous, probabilistic, and belief-theoretic data. Moreover, while the EM algorithm converges to the ML estimates, it does not guarantee convergence to the underlying true parameters. In this paper, we propose an approach that enables one to learn BN parameters from a dataset containing a wider variety of imperfections. In addition, by introducing an early stopping criterion together with a new initialization method for the EM algorithm, we show how the BN parameters can be learned so that they are closer to the underlying true parameters than the converged ML-estimated parameters.
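The two-step EM procedure the abstract refers to can be illustrated with a minimal sketch for a toy two-node BN (binary X → binary Y, with X sometimes unobserved). This is not the paper's enhanced method; the model, variable names, and stopping tolerance are illustrative assumptions. The E-step fills in expected counts for the missing X values via Bayes' rule, and the M-step re-estimates the parameters from those expected counts.

```python
def em_bn(data, p=0.5, q1=0.6, q0=0.4, iters=50, tol=1e-6):
    """EM for a toy BN X -> Y (both binary).

    data: list of (x, y) pairs; x may be None (missing).
    p  = P(X=1), q1 = P(Y=1|X=1), q0 = P(Y=1|X=0).
    Converges to a (local) ML estimate, as the abstract notes.
    """
    for _ in range(iters):
        # E-step: expected counts, weighting missing-X records by the
        # posterior P(X=1 | Y=y) under the current parameters.
        n1 = nY1_X1 = nY1_X0 = 0.0
        for x, y in data:
            if x is None:
                lik1 = p * (q1 if y == 1 else 1 - q1)
                lik0 = (1 - p) * (q0 if y == 1 else 1 - q0)
                w = lik1 / (lik1 + lik0)   # posterior weight for X=1
            else:
                w = float(x)               # observed: weight is 0 or 1
            n1 += w
            if y == 1:
                nY1_X1 += w
                nY1_X0 += 1 - w
        # M-step: ML parameter updates from the expected counts.
        n = len(data)
        new_p = n1 / n
        new_q1 = nY1_X1 / n1 if n1 else q1
        new_q0 = nY1_X0 / (n - n1) if n - n1 else q0
        # Stop once the parameters stop moving (plain convergence;
        # the paper's early-stopping criterion is more elaborate).
        done = max(abs(new_p - p), abs(new_q1 - q1), abs(new_q0 - q0)) < tol
        p, q1, q0 = new_p, new_q1, new_q0
        if done:
            break
    return p, q1, q0
```

With fully observed data the procedure reduces to ordinary frequency counting in a single M-step; the interesting behavior appears once some `x` values are `None`, where the estimates depend on the initialization — which is exactly the lever the paper's initialization method targets.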

| Original language | English (US) |
| --- | --- |
| Title of host publication | Intelligent Computing |
| Subtitle of host publication | Theory and Applications V |
| DOIs | https://doi.org/10.1117/12.719290 |
| State | Published - Nov 15 2007 |
| Event | Intelligent Computing: Theory and Applications V - Orlando, FL, United States. Duration: Apr 9 2007 → Apr 10 2007 |

### Publication series

| Name | Proceedings of SPIE - The International Society for Optical Engineering |
| --- | --- |
| Volume | 6560 |
| ISSN (Print) | 0277-786X |

### Other

| Other | Intelligent Computing: Theory and Applications V |
| --- | --- |
| Country | United States |
| City | Orlando, FL |
| Period | 4/9/07 → 4/10/07 |

### Keywords

- Bayesian networks
- Data uncertainties
- EM algorithm
- Imperfect data
- Learning parameters

### ASJC Scopus subject areas

- Electrical and Electronic Engineering
- Condensed Matter Physics


## Cite this

Learning Bayesian network parameters from imperfect data: Enhancements to the EM algorithm. In *Intelligent Computing: Theory and Applications V* [65600E] (Proceedings of SPIE - The International Society for Optical Engineering; Vol. 6560). https://doi.org/10.1117/12.719290