tf.contrib.distributions.Mixture.entropy_lower_bound(name='entropy_lower_bound')
A lower bound on the entropy of this mixture model.
The bound below is not always very tight, and its usefulness depends on the mixture probabilities and the components in use.
A lower bound is useful, for example, in the ELBO when the Mixture is the variational distribution:
\( \log p(x) \geq \mathrm{ELBO} = \int q(z) \log p(x, z) \, dz + H[q] \)
where \( p(x, z) \) is the joint distribution of the model, \( q \) is the variational distribution, and \( H[q] \) is the entropy of \( q \). If there is a lower bound \( G[q] \) such that \( H[q] \geq G[q] \), then it can be used in place of \( H[q] \).
For a mixture of distributions \( q(Z) = \sum_i c_i q_i(Z) \) with \( \sum_i c_i = 1 \), by the concavity of \( f(x) = -x \log x \), a simple lower bound is:
\( \begin{align} H[q] & = -\int q(z) \log q(z) \, dz \\ & = -\int \left(\sum_i c_i q_i(z)\right) \log\left(\sum_i c_i q_i(z)\right) dz \\ & \geq -\sum_i c_i \int q_i(z) \log q_i(z) \, dz \\ & = \sum_i c_i H[q_i] \end{align} \)
This is the term we calculate below for \( G[q] \).
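For concreteness, the following is a small NumPy sketch (not part of the TensorFlow API; the mixture weights and Gaussian components are illustrative) that computes \( G[q] = \sum_i c_i H[q_i] \) in closed form and compares it against a Monte Carlo estimate of the true mixture entropy:

```python
import numpy as np

c = np.array([0.3, 0.7])      # mixture probabilities, sum to 1
mu = np.array([-2.0, 1.0])    # component means
sigma = np.array([0.5, 1.5])  # component standard deviations

# Closed-form entropy of each Gaussian component: 0.5 * log(2 * pi * e * sigma^2).
component_entropies = 0.5 * np.log(2.0 * np.pi * np.e * sigma ** 2)
lower_bound = np.sum(c * component_entropies)  # G[q] = sum_i c_i H[q_i]

# Monte Carlo estimate of the true mixture entropy H[q] for comparison.
rng = np.random.default_rng(0)
idx = rng.choice(2, size=200_000, p=c)
samples = rng.normal(mu[idx], sigma[idx])
mixture_density = np.sum(
    c * np.exp(-0.5 * ((samples[:, None] - mu) / sigma) ** 2)
      / (sigma * np.sqrt(2.0 * np.pi)),
    axis=1)
true_entropy = -np.mean(np.log(mixture_density))

print(lower_bound, true_entropy)  # lower_bound <= true_entropy
```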
Args:
- `name`: A name for this operation (optional).
Returns:
A lower bound on the Mixture's entropy.
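A minimal usage sketch, assuming the TF 1.x `tf.contrib.distributions` API (constructor argument names such as `probs`, `loc`, and `scale` vary across contrib versions, so treat the parameter spelling as an assumption):

```python
import tensorflow as tf

ds = tf.contrib.distributions

# Illustrative two-component Gaussian mixture.
mixture = ds.Mixture(
    cat=ds.Categorical(probs=[0.3, 0.7]),
    components=[
        ds.Normal(loc=-2.0, scale=0.5),
        ds.Normal(loc=1.0, scale=1.5),
    ])

# G[q] = sum_i c_i H[q_i], a lower bound on the mixture entropy H[q].
lower_bound = mixture.entropy_lower_bound()

with tf.Session() as sess:
    print(sess.run(lower_bound))
```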