Global convergence and empirical consistency of the generalized Lloyd algorithm

M. J. Sabin and R. M. Gray. IEEE Transactions on Information Theory, 32(2): 148-155 (March 1986)
DOI: 10.1109/TIT.1986.1057168

Abstract

The generalized Lloyd algorithm for vector quantizer design is analyzed as a descent algorithm for nonlinear programming. A broad class of convex distortion functions is considered, and any input distribution that has no singular-continuous part is allowed. A well-known convergence theorem is applied to show that iterative applications of the algorithm produce a sequence of quantizers that approaches the set of fixed-point quantizers. The methods of the theorem are extended to sequences of algorithms, yielding results on the behavior of the algorithm when an unknown distribution is approximated by a training sequence of observations. It is shown that as the length of the training sequence grows large, 1) fixed-point quantizers for the training sequence approach the set of fixed-point quantizers for the true distribution, and 2) limiting quantizers produced by the algorithm with the training sequence distribution perform no worse than limiting quantizers produced by the algorithm with the true distribution.
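
To make the iteration concrete, the following is a minimal sketch of the two-step generalized Lloyd iteration run on an empirical (training-sequence) distribution, using squared-error distortion as one representative member of the convex distortion class the paper treats. The function name and parameters are illustrative, not from the paper; the descent property (non-increasing distortion toward a fixed point) is what the convergence analysis formalizes.

    import numpy as np

    def generalized_lloyd(train, k, iters=100, tol=1e-10, seed=0):
        """Run the generalized Lloyd iteration on a training sequence
        `train` (n x d array) with a k-codeword codebook, using
        squared-error distortion. Returns the codebook and the history
        of mean distortions; a (numerical) fixed point is declared when
        the distortion stops decreasing."""
        rng = np.random.default_rng(seed)
        # Initialize the codebook with k distinct training vectors.
        codebook = train[rng.choice(len(train), size=k, replace=False)].astype(float)
        history = []
        for _ in range(iters):
            # Step 1: nearest-neighbor partition of the training sequence.
            d2 = ((train[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
            assign = d2.argmin(axis=1)
            history.append(d2[np.arange(len(train)), assign].mean())
            # Step 2: replace each codeword by the generalized centroid of
            # its cell (the cell mean, for squared-error distortion).
            for j in range(k):
                members = train[assign == j]
                if len(members):          # leave empty cells unchanged
                    codebook[j] = members.mean(axis=0)
            # Descent: distortion is non-increasing; stop at a fixed point.
            if len(history) > 1 and history[-2] - history[-1] < tol:
                break
        return codebook, history

A short usage check, with synthetic data standing in for the unknown source distribution:

    rng = np.random.default_rng(1)
    train = rng.normal(size=(2000, 2))   # training sequence of observations
    codebook, hist = generalized_lloyd(train, k=8)
    assert all(a >= b for a, b in zip(hist, hist[1:]))  # descent property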

