A recursive Renyi's entropy estimator

D. Erdogmus, J. C. Principe, Sung Phil Kim, Justin C. Sanchez

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

21 Scopus citations

Abstract

Estimating the entropy of a sample set is required in numerous learning scenarios involving information-theoretic optimization criteria. A number of entropy estimators are available in the literature; however, they require a batch of samples to operate on in order to yield an estimate. We derive a recursive formula to estimate Renyi's (1970) quadratic entropy on-line, using each new sample to update the entropy estimate, either to obtain more accurate results in stationary situations or to track the changing entropy of a signal in nonstationary situations.
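As a rough illustration of the kind of update the abstract describes, the sketch below maintains the Parzen-window "information potential" V (whose negative logarithm is Renyi's quadratic entropy) and folds each new sample into it recursively instead of recomputing the full pairwise double sum. This is a minimal sketch, not the paper's exact estimator: the Gaussian kernel, the kernel width sigma, and the class name RecursiveQuadraticEntropy are illustrative assumptions, and the paper's recursion (aimed at tracking nonstationary signals) may instead use a forgetting factor with a fixed-cost update rather than storing all past samples.

```python
import numpy as np


def gaussian_kernel(u, sigma):
    """1-D Gaussian (Parzen) kernel with standard deviation sigma."""
    return np.exp(-0.5 * (u / sigma) ** 2) / (np.sqrt(2.0 * np.pi) * sigma)


class RecursiveQuadraticEntropy:
    """Online estimate of Renyi's quadratic entropy H2 = -log V, where
    V_N = (1/N^2) * sum_i sum_j k(x_i - x_j) is the information potential.

    Each new sample updates V recursively:
        V_N = ((N-1)/N)^2 * V_{N-1} + (1/N^2) * [2 * sum_{i<N} k(x_N - x_i) + k(0)]
    so the full double sum over all pairs is never recomputed.
    """

    def __init__(self, sigma=1.0):
        self.sigma = sigma      # kernel width (illustrative choice)
        self.samples = []       # past samples, needed for the cross terms
        self.V = 0.0            # current information potential estimate

    def update(self, x):
        """Incorporate a new sample x and return the updated entropy estimate."""
        n = len(self.samples) + 1
        if n == 1:
            # A single sample contributes only the self term k(0).
            self.V = gaussian_kernel(0.0, self.sigma)
        else:
            cross = np.sum(gaussian_kernel(x - np.asarray(self.samples), self.sigma))
            self.V = (((n - 1) / n) ** 2) * self.V \
                + (2.0 * cross + gaussian_kernel(0.0, self.sigma)) / n ** 2
        self.samples.append(x)
        return -np.log(self.V)  # Renyi's quadratic entropy estimate


# Usage example: feed a stream of Gaussian samples and read the running estimate.
rng = np.random.default_rng(0)
estimator = RecursiveQuadraticEntropy(sigma=0.5)
h = None
for sample in rng.standard_normal(2000):
    h = estimator.update(sample)
print(f"online H2 estimate after 2000 samples: {h:.3f}")
```

Note that this exact-recursion sketch still costs O(N) per update because it keeps every past sample for the cross-kernel sum; a forgetting-factor variant trades that exactness for constant per-sample cost, which is what makes tracking a time-varying entropy practical.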

Original language: English (US)
Title of host publication: Neural Networks for Signal Processing - Proceedings of the IEEE Workshop
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 209-217
Number of pages: 9
Volume: 2002-January
ISBN (Print): 0780376161
DOIs
State: Published - 2002
Externally published: Yes
Event: 12th IEEE Workshop on Neural Networks for Signal Processing, NNSP 2002 - Martigny, Switzerland
Duration: Sep 6 2002 → …


ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Artificial Intelligence
  • Software
  • Computer Networks and Communications
  • Signal Processing

