### Abstract

Estimating the entropy of a sample set is required in solving numerous learning scenarios involving information-theoretic optimization criteria. A number of entropy estimators are available in the literature; however, these require a batch of samples to operate on in order to yield an estimate. We derive a recursive formula to estimate Renyi's (1970) quadratic entropy on-line, using each new sample to update the entropy estimate, in order to obtain more accurate results in stationary situations or to track the changing entropy of a signal in nonstationary situations.
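To illustrate the idea behind the paper, the sketch below maintains an on-line estimate of Renyi's quadratic entropy H2 = -log V, where V is the "information potential" — the average of a Gaussian Parzen kernel over all sample pairs. Rather than recomputing the O(n²) double sum from scratch for each batch, the pairwise sum is updated incrementally with each new sample. This is a hypothetical reconstruction under assumed conventions (Gaussian kernel, bandwidth σ√2 from convolving two Parzen kernels), not the paper's exact recursion; the class and function names are illustrative.

```python
import numpy as np

def gauss(u, sigma):
    """Gaussian kernel with bandwidth sigma (Parzen window)."""
    return np.exp(-u**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

class RecursiveQuadraticEntropy:
    """On-line estimator of Renyi's quadratic entropy H2 = -log V,
    with V = (1/n^2) * sum_i sum_j k(x_i - x_j) (information potential).
    The double sum is updated recursively: each new sample adds its
    cross terms against the stored samples (O(n) work per update,
    instead of an O(n^2) batch recomputation).
    Illustrative sketch, not the paper's exact formula."""

    def __init__(self, sigma=1.0):
        self.sigma = sigma
        self.samples = []
        self.V = None  # current information potential

    def update(self, x):
        n_prev = len(self.samples)
        # the kernel in V has bandwidth sigma*sqrt(2): it is the
        # convolution of two Parzen kernels of bandwidth sigma
        k0 = gauss(0.0, np.sqrt(2) * self.sigma)  # self-interaction term
        if n_prev == 0:
            self.samples.append(x)
            self.V = k0
            return -np.log(self.V)
        n = n_prev + 1
        # cross terms between the new sample and all stored samples
        cross = gauss(x - np.asarray(self.samples),
                      np.sqrt(2) * self.sigma).sum()
        # rescale the old double sum (n_prev^2 terms), then add the
        # 2*(n-1) new cross terms and the new self-term
        self.V = (n_prev**2 * self.V + 2 * cross + k0) / n**2
        self.samples.append(x)
        return -np.log(self.V)  # entropy estimate after this sample
```

In a nonstationary setting, the paper's motivation suggests replacing the exact rescaling with a forgetting factor so that old samples are discounted; the exact-update form above is the stationary special case.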

| Original language | English (US) |
|---|---|
| Title of host publication | Neural Networks for Signal Processing - Proceedings of the IEEE Workshop |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Pages | 209-217 |
| Number of pages | 9 |
| Volume | 2002-January |
| ISBN (Print) | 0780376161 |
| DOIs | 10.1109/NNSP.2002.1030032 |
| State | Published - 2002 |
| Externally published | Yes |
| Event | 12th IEEE Workshop on Neural Networks for Signal Processing, NNSP 2002 - Martigny, Switzerland. Duration: Sep 6 2002 → … |

### Other

| Other | 12th IEEE Workshop on Neural Networks for Signal Processing, NNSP 2002 |
|---|---|
| Country | Switzerland |
| City | Martigny |
| Period | 9/6/02 → … |

### ASJC Scopus subject areas

- Electrical and Electronic Engineering
- Artificial Intelligence
- Software
- Computer Networks and Communications
- Signal Processing


## Cite this

Erdogmus, D., Principe, J. C., Kim, S. P., & Sanchez, J. C. (2002). A recursive Renyi's entropy estimator. In *Neural Networks for Signal Processing - Proceedings of the IEEE Workshop* (Vol. 2002-January, pp. 209-217). [1030032] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/NNSP.2002.1030032