A recursive Renyi's entropy estimator

D. Erdogmus, J. C. Principe, Sung Phil Kim, Justin C. Sanchez

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

21 Citations (Scopus)

Abstract

Estimating the entropy of a sample set is required in numerous learning scenarios that involve information-theoretic optimization criteria. A number of entropy estimators are available in the literature; however, these operate on a batch of samples to yield an estimate. We derive a recursive formula to estimate Renyi's (1970) quadratic entropy on-line, using each new sample to update the entropy estimate, either to obtain more accurate results in stationary situations or to track the changing entropy of a signal in nonstationary situations.
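The recursion the paper derives targets Renyi's quadratic entropy, H2 = -log(V), where V is the quadratic information potential estimated with Parzen windows. The following is a minimal Python sketch of one way such an on-line update can be organized; the class name, the Gaussian kernel, and the bandwidth parameter sigma are illustrative assumptions, and the update shown is the exact sample-by-sample recursion for the batch Parzen estimate rather than the paper's own formula.

import numpy as np

def gaussian_kernel(u, sigma):
    # One-dimensional Gaussian (Parzen) kernel with bandwidth sigma.
    return np.exp(-u ** 2 / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

class RecursiveQuadraticEntropy:
    # Illustrative sketch (not the paper's exact estimator): maintains the
    # Parzen-window information potential
    #   V_n = (1 / n^2) * sum_{i,j} G_{sigma*sqrt(2)}(x_i - x_j)
    # and returns Renyi's quadratic entropy H2 = -log(V_n) after each sample.
    def __init__(self, sigma=0.5):
        self.sigma = sigma * np.sqrt(2.0)  # bandwidth of the convolved kernel
        self.samples = []                  # past samples, needed for the cross terms
        self.V = 0.0                       # current information potential estimate

    def update(self, x_new):
        n = len(self.samples)
        if n == 0:
            self.V = gaussian_kernel(0.0, self.sigma)
        else:
            cross = np.sum(gaussian_kernel(x_new - np.asarray(self.samples), self.sigma))
            # Exact recursion: rescale the old double sum, then add the new
            # row/column of kernel evaluations and the new diagonal term.
            self.V = (n ** 2 * self.V + 2.0 * cross
                      + gaussian_kernel(0.0, self.sigma)) / (n + 1) ** 2
        self.samples.append(x_new)
        return -np.log(self.V)  # entropy estimate after seeing x_new

Example use: estimator = RecursiveQuadraticEntropy(sigma=0.5), then h2 = estimator.update(x) for each incoming sample x. For nonstationary signals, the 1/(n+1)^2 weighting would typically give way to a fixed forgetting factor so that older samples are discounted and the estimate can track a changing entropy; that variant is not reproduced here.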

Original language: English (US)
Title of host publication: Neural Networks for Signal Processing - Proceedings of the IEEE Workshop
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 209-217
Number of pages: 9
Volume: 2002-January
ISBN (Print): 0780376161
DOI: 10.1109/NNSP.2002.1030032
State: Published - 2002
Externally published: Yes
Event: 12th IEEE Workshop on Neural Networks for Signal Processing, NNSP 2002 - Martigny, Switzerland
Duration: Sep 6 2002 → …

Other

Other: 12th IEEE Workshop on Neural Networks for Signal Processing, NNSP 2002
Country: Switzerland
City: Martigny
Period: 9/6/02 → …

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Artificial Intelligence
  • Software
  • Computer Networks and Communications
  • Signal Processing

Cite this

Erdogmus, D., Principe, J. C., Kim, S. P., & Sanchez, J. C. (2002). A recursive Renyi's entropy estimator. In Neural Networks for Signal Processing - Proceedings of the IEEE Workshop (Vol. 2002-January, pp. 209-217). [1030032] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/NNSP.2002.1030032

