tf.contrib.distributions.MultivariateNormalDiag.variance()

tf.contrib.distributions.MultivariateNormalDiag.variance(name='variance') Variance.
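For a diagonal-covariance multivariate normal, the per-dimension variance is simply the square of the corresponding diagonal standard deviation. A minimal pure-Python sketch of that relationship (diag_stdev here stands in for the per-dimension standard deviations this class is parameterized by):

```python
# Variance of a diagonal-covariance MVN: element-wise square of the
# per-dimension standard deviations (the diagonal of the scale matrix).
def mvn_diag_variance(diag_stdev):
    return [s * s for s in diag_stdev]

print(mvn_diag_variance([1.0, 2.0, 0.5]))  # [1.0, 4.0, 0.25]
```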

tf.TFRecordReader.supports_serialize

tf.TFRecordReader.supports_serialize Whether the Reader implementation can serialize its state.

tf.contrib.layers.one_hot_encoding()

tf.contrib.layers.one_hot_encoding(*args, **kwargs) Transform numeric labels into onehot_labels using tf.one_hot.
Args:
  labels: [batch_size] target labels.
  num_classes: Total number of classes.
  on_value: A scalar defining the on-value.
  off_value: A scalar defining the off-value.
  outputs_collections: Collection to add the outputs.
  scope: Optional scope for name_scope.
Returns:
  One-hot encoding of the labels.
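The transformation is equivalent to applying tf.one_hot to the label vector. A small pure-Python sketch of the same semantics (the function name here is illustrative, not part of the API):

```python
def one_hot_encode(labels, num_classes, on_value=1.0, off_value=0.0):
    """Return a [batch_size, num_classes] list of one-hot rows."""
    rows = []
    for label in labels:
        row = [off_value] * num_classes  # everything off ...
        row[label] = on_value            # ... except the target class
        rows.append(row)
    return rows

print(one_hot_encode([0, 2], 3))  # [[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]]
```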

tf.is_non_decreasing()

tf.is_non_decreasing(x, name=None) Returns True if x is non-decreasing.
Elements of x are compared in row-major order. The tensor [x[0], ...] is non-decreasing if for every adjacent pair we have x[i] <= x[i+1]. If x has fewer than two elements, it is trivially non-decreasing.
See also: is_strictly_increasing
Args:
  x: Numeric Tensor.
  name: A name for this operation (optional). Defaults to "is_non_decreasing".
Returns:
  Boolean Tensor, equal to True iff x is non-decreasing.
Raises:
  TypeError: if x is not a numeric tensor.
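The adjacent-pair check described above can be sketched in pure Python, including the trivial case of fewer than two elements:

```python
def is_non_decreasing(xs):
    """True iff every adjacent pair satisfies xs[i] <= xs[i + 1].

    Sequences with fewer than two elements are trivially non-decreasing.
    """
    return all(a <= b for a, b in zip(xs, xs[1:]))

print(is_non_decreasing([1, 1, 2, 3]))  # True (equal neighbors allowed)
print(is_non_decreasing([3, 2]))        # False
print(is_non_decreasing([]))            # True (trivially)
```

A strictly-increasing variant would use `<` instead of `<=`, which is the only difference from is_strictly_increasing.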

tf.contrib.bayesflow.stochastic_tensor.BinomialTensor

class tf.contrib.bayesflow.stochastic_tensor.BinomialTensor BinomialTensor is a StochasticTensor backed by the distribution Binomial.

tf.sparse_reduce_sum()

tf.sparse_reduce_sum(sp_input, reduction_axes=None, keep_dims=False) Computes the sum of elements across dimensions of a SparseTensor.
This Op takes a SparseTensor and is the sparse counterpart to tf.reduce_sum(). In particular, this Op also returns a dense Tensor instead of a sparse one.
Reduces sp_input along the dimensions given in reduction_axes. Unless keep_dims is true, the rank of the tensor is reduced by 1 for each entry in reduction_axes. If keep_dims is true, the reduced dimensions are retained with length 1.
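A sparse tensor stores only its nonzero entries as (index, value) pairs, so reducing along an axis means accumulating each stored value into the output slot given by its remaining index. A pure-Python sketch for the 2-D single-axis case (the COO-style argument names are illustrative):

```python
def sparse_reduce_sum_2d(indices, values, dense_shape, axis):
    """Sum a 2-D COO sparse tensor along one axis, returning a dense list."""
    out_len = dense_shape[1 - axis]          # length of the surviving axis
    out = [0] * out_len                      # dense result, like tf.reduce_sum
    for (i, j), v in zip(indices, values):
        out[j if axis == 0 else i] += v      # accumulate into surviving index
    return out

# Sparse representation of [[1, 0, 2], [0, 3, 0]]:
idx, vals, shape = [(0, 0), (0, 2), (1, 1)], [1, 2, 3], (2, 3)
print(sparse_reduce_sum_2d(idx, vals, shape, axis=1))  # row sums: [3, 3]
print(sparse_reduce_sum_2d(idx, vals, shape, axis=0))  # column sums: [1, 3, 2]
```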

tf.contrib.distributions.DirichletMultinomial.log_prob()

tf.contrib.distributions.DirichletMultinomial.log_prob(value, name='log_prob') Log probability density/mass function (depending on is_continuous).
Additional documentation from DirichletMultinomial: For each batch of counts [n_1, ..., n_k], P[counts] is the probability that after sampling n draws from this Dirichlet-Multinomial distribution, the number of draws falling in class j is n_j. Note that different sequences of draws can result in the same counts, thus the probability includes a combinatorial coefficient.
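The Dirichlet-Multinomial log-pmf, including the combinatorial coefficient mentioned above, can be written entirely in terms of log-gamma functions. A sketch assuming a concentration vector alpha (the parameterization this class uses):

```python
import math

def dirichlet_multinomial_log_prob(counts, alpha):
    """Log pmf of the Dirichlet-Multinomial, including the multinomial coefficient."""
    n = sum(counts)
    a0 = sum(alpha)
    # log of the multinomial coefficient n! / (n_1! ... n_k!)
    log_coeff = math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)
    # log of Beta(counts + alpha) / Beta(alpha), via lgamma
    log_beta_ratio = (math.lgamma(a0) - math.lgamma(n + a0)
                      + sum(math.lgamma(c + a) - math.lgamma(a)
                            for c, a in zip(counts, alpha)))
    return log_coeff + log_beta_ratio

# With alpha = [1, 1] and n = 2 draws, every count split {0..2} is equally
# likely, so P[counts = [1, 1]] = 1/3.
print(math.exp(dirichlet_multinomial_log_prob([1, 1], [1.0, 1.0])))
```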

tf.contrib.losses.sum_of_squares()

tf.contrib.losses.sum_of_squares(*args, **kwargs) Adds a Sum-of-Squares loss to the training procedure. (deprecated)
THIS FUNCTION IS DEPRECATED. It will be removed after 2016-10-01. Instructions for updating: Use mean_squared_error.
weight acts as a coefficient for the loss. If a scalar is provided, then the loss is simply scaled by the given value. If weight is a tensor of size [batch_size], then the total loss for each sample of the batch is rescaled by the corresponding element in the weight vector.
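The scalar-versus-per-sample weighting described above can be sketched in pure Python. This is a simplified illustration of the weighting semantics, not the exact reduction the deprecated function performs:

```python
def mean_squared_error(predictions, targets, weight=1.0):
    """Mean of squared differences, weighted by a scalar or per-sample vector."""
    n = len(predictions)
    # A scalar weight scales every sample; a vector rescales per sample.
    weights = [weight] * n if isinstance(weight, (int, float)) else weight
    total = sum(w * (p - t) ** 2
                for p, t, w in zip(predictions, targets, weights))
    return total / n

print(mean_squared_error([1.0, 2.0], [0.0, 0.0]))              # 2.5
print(mean_squared_error([1.0, 2.0], [0.0, 0.0], [1.0, 0.0]))  # 0.5
```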

tf.contrib.losses.get_losses()

tf.contrib.losses.get_losses(scope=None, loss_collection='losses') Gets the list of losses from the loss_collection.
Args:
  scope: An optional scope for filtering the losses to return.
  loss_collection: Optional losses collection.
Returns:
  A list of loss tensors.
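The underlying pattern is a named registry that losses are added to and later retrieved from, optionally filtered by a scope prefix. A toy sketch of that collections pattern (the registry and helper names are illustrative, not TensorFlow internals):

```python
# Toy loss-collection registry: losses are stored under a named collection
# together with their scope-qualified names.
_collections = {}

def add_loss(loss, name, loss_collection='losses'):
    _collections.setdefault(loss_collection, []).append((name, loss))

def get_losses(scope=None, loss_collection='losses'):
    entries = _collections.get(loss_collection, [])
    if scope is not None:
        # Filter by scope prefix, mirroring name-scope filtering.
        entries = [(n, l) for n, l in entries if n.startswith(scope)]
    return [l for _, l in entries]

add_loss(0.5, 'tower_0/mse')
add_loss(0.7, 'tower_1/mse')
print(get_losses())                 # [0.5, 0.7]
print(get_losses(scope='tower_1'))  # [0.7]
```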

tf.contrib.distributions.GammaWithSoftplusAlphaBeta

class tf.contrib.distributions.GammaWithSoftplusAlphaBeta Gamma with softplus transform on alpha and beta.
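The softplus transform, softplus(x) = log(1 + exp(x)), maps any real input to a positive value, which is how this class keeps alpha and beta in the Gamma distribution's valid range. A numerically stable sketch:

```python
import math

def softplus(x):
    """softplus(x) = log(1 + exp(x)); maps any real number to a positive one."""
    # Stable form: max(x, 0) + log1p(exp(-|x|)) avoids overflow for large |x|.
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

print(softplus(0.0))    # log(2) ~ 0.6931
print(softplus(-10.0))  # small but strictly positive
```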