
tf.contrib.distributions.Mixture.entropy_lower_bound()

tf.contrib.distributions.Mixture.entropy_lower_bound(name='entropy_lower_bound')

A lower bound on the entropy of this mixture model.

The bound below is not always very tight, and its usefulness depends on the mixture probabilities and the components in use.

A lower bound is useful for the ELBO (evidence lower bound) when the Mixture is the variational distribution:

\log p(x) \geq \text{ELBO} = \int q(z) \log p(x, z) \, dz + H[q]

where p is the prior distribution, q is the variational distribution, and H[q] is the entropy of q. If there is a lower bound G[q] such that H[q] \geq G[q], then it can be used in place of H[q].
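
As a minimal sketch (assuming the TF 1.x graph-mode API; the log_joint function below is a purely illustrative stand-in, not part of this API), the bound can substitute for H[q] in a Monte Carlo ELBO estimate. Since H[q] \geq G[q], the result still lower-bounds \log p(x):

import tensorflow as tf

tfd = tf.contrib.distributions

# Variational distribution q(z): a two-component Gaussian mixture.
q = tfd.Mixture(
    cat=tfd.Categorical(probs=[0.3, 0.7]),
    components=[tfd.Normal(loc=-1.0, scale=0.5),
                tfd.Normal(loc=2.0, scale=1.0)])

# Stand-in for log p(x, z); a real model would define its own joint.
def log_joint(z):
  return tfd.Normal(loc=0.0, scale=2.0).log_prob(z)

# Monte Carlo estimate of E_q[log p(x, z)], plus G[q] in place of the
# intractable H[q]. Since H[q] >= G[q], this still lower-bounds log p(x).
z = q.sample(1000)
elbo_bound = tf.reduce_mean(log_joint(z)) + q.entropy_lower_bound()

with tf.Session() as sess:
  print(sess.run(elbo_bound))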

For a mixture of distributions q(Z) = \sum_i c_i q_i(Z) with \sum_i c_i = 1, by the concavity of f(x) = -x \log x, a simple lower bound is:

\begin{align}
H[q] &= -\int q(z) \log q(z) \, dz \\
&= -\int \Big(\sum_i c_i q_i(z)\Big) \log\Big(\sum_i c_i q_i(z)\Big) \, dz \\
&\geq -\sum_i c_i \int q_i(z) \log q_i(z) \, dz \\
&= \sum_i c_i H[q_i]
\end{align}

This is the term we calculate below for G[q].
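
As a quick sanity check (a sketch assuming the TF 1.x graph-mode API; the mixture weights and component parameters are arbitrary), \sum_i c_i H[q_i] computed by hand should match the value returned by entropy_lower_bound():

import tensorflow as tf

tfd = tf.contrib.distributions

probs = [0.3, 0.7]
q = tfd.Mixture(
    cat=tfd.Categorical(probs=probs),
    components=[tfd.Normal(loc=-1.0, scale=0.5),
                tfd.Normal(loc=2.0, scale=1.0)])

# G[q] as returned by this method, and the same sum computed manually.
bound = q.entropy_lower_bound()
manual = sum(c * comp.entropy() for c, comp in zip(probs, q.components))

with tf.Session() as sess:
  print(sess.run([bound, manual]))  # The two values should agree.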

Args:
  • name: A name for this operation (optional).
Returns:

A lower bound on the Mixture's entropy.
