tf.contrib.bayesflow.stochastic_tensor.MultivariateNormalDiagTensor.distribution

tf.argmin()

tf.argmin(input, dimension, name=None)
Returns the index with the smallest value across dimensions of a tensor.
Args:
  input: A Tensor. Must be one of the following types: float32, float64, int64, int32, uint8, uint16, int16, int8, complex64, complex128, qint8, quint8, qint32, half.
  dimension: A Tensor. Must be one of the following types: int32, int64. 0 <= dimension < rank(input). Describes which dimension of the input Tensor to reduce across. For vectors, use dimension = 0.
  na
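The reduction semantics can be illustrated with NumPy's np.argmin, which behaves the same way (a sketch of the semantics, not the TF implementation; the example array is invented):

```python
import numpy as np

# Reduce across the given dimension, returning indices of the smallest values.
x = np.array([[3.0, 1.0, 2.0],
              [0.5, 4.0, 0.1]])

# dimension=0: index of the smallest value down each column
print(np.argmin(x, axis=0))  # [1 0 1]

# dimension=1: index of the smallest value along each row
print(np.argmin(x, axis=1))  # [1 2]
```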

tf.contrib.training.NextQueuedSequenceBatch.length

tf.contrib.training.NextQueuedSequenceBatch.length The lengths of the given truncated unrolled examples. For initial iterations, for which sequence * num_unroll < length, this number is num_unroll. For the remainder, this number is between 0 and num_unroll. Returns: An integer vector of length batch_size, the lengths.
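The bookkeeping described above can be sketched in plain Python (segment_lengths is a hypothetical helper, not part of the TF API): a sequence of total length `length` is cut into segments of `num_unroll` steps, every full segment reports num_unroll, and the final segment reports the remainder.

```python
def segment_lengths(length, num_unroll):
    # Lengths reported for each unrolled iteration over one example.
    lengths = []
    while length > 0:
        lengths.append(min(num_unroll, length))
        length -= num_unroll
    return lengths

print(segment_lengths(10, 4))  # [4, 4, 2]
```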

tf.contrib.bayesflow.stochastic_tensor.DirichletTensor.mean()

tf.contrib.bayesflow.stochastic_tensor.DirichletTensor.mean(name='mean')
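For reference, the mean of a Dirichlet with concentration vector alpha is alpha_i / sum(alpha); a minimal sketch (not the TF implementation):

```python
def dirichlet_mean(alpha):
    # Each entry of the mean is alpha_i normalized by the total concentration.
    total = sum(alpha)
    return [a / total for a in alpha]

mean = dirichlet_mean([1.0, 2.0, 3.0])
print(mean)  # entries are 1/6, 1/3, 1/2
```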

tf.contrib.distributions.TransformedDistribution.is_reparameterized

tf.contrib.distributions.LaplaceWithSoftplusScale.log_survival_function()

tf.contrib.distributions.LaplaceWithSoftplusScale.log_survival_function(value, name='log_survival_function')
Log survival function. Given random variable X, the survival function is defined:
  log_survival_function(x) = Log[ P[X > x] ]
                           = Log[ 1 - P[X <= x] ]
                           = Log[ 1 - cdf(x) ]
Typically, different numerical approximations can be used for the log survival function, which are more accurate than 1 - cdf(x) when x >> 1.
Args:
  value: fl
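The numerical point can be made concrete with a standard Laplace (loc=0, scale=1), for which P[X > x] = 0.5 * exp(-x) when x >= 0. This is a sketch of the stability argument, not the TF implementation:

```python
import math

def laplace_log_sf(x):
    # Direct log-survival formula for a standard Laplace distribution.
    if x < 0:
        return math.log1p(-0.5 * math.exp(x))   # log(1 - cdf(x)), cdf(x) = 0.5*exp(x)
    return math.log(0.5) - x                    # exact: log(0.5 * exp(-x))

# At x = 50, cdf(x) rounds to 1.0 in float64, so log(1 - cdf(x)) would be -inf;
# the direct formula stays finite.
print(laplace_log_sf(50.0))
```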

tf.nn.rnn_cell.LSTMCell

class tf.nn.rnn_cell.LSTMCell
Long short-term memory unit (LSTM) recurrent network cell.
The default non-peephole implementation is based on:
  http://deeplearning.cs.cmu.edu/pdfs/Hochreiter97_lstm.pdf
  S. Hochreiter and J. Schmidhuber. "Long Short-Term Memory". Neural Computation, 9(8):1735-1780, 1997.
The peephole implementation is based on:
  https://research.google.com/pubs/archive/43905.pdf
  Hasim Sak, Andrew Senior, and Francoise Beaufays. "Long short-term memory recurrent neural network archi
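A single non-peephole LSTM step can be sketched in NumPy, following the Hochreiter & Schmidhuber formulation cited above (the gate ordering and fused weight layout here are illustrative assumptions, not tf.nn.rnn_cell's actual code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b):
    # One fused matmul produces all four gate pre-activations.
    z = np.concatenate([x, h]) @ W + b
    i, j, f, o = np.split(z, 4)                  # input, candidate, forget, output
    c_new = sigmoid(f) * c + sigmoid(i) * np.tanh(j)
    h_new = sigmoid(o) * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
num_units, input_size = 3, 2
W = rng.normal(size=(input_size + num_units, 4 * num_units))
b = np.zeros(4 * num_units)
h, c = lstm_step(rng.normal(size=input_size),
                 np.zeros(num_units), np.zeros(num_units), W, b)
print(h.shape, c.shape)  # (3,) (3,)
```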

tf.where()

tf.where(input, name=None) Returns locations of true values in a boolean tensor. This operation returns the coordinates of true elements in input. The coordinates are returned in a 2-D tensor where the first dimension (rows) represents the number of true elements, and the second dimension (columns) represents the coordinates of the true elements. Keep in mind, the shape of the output tensor can vary depending on how many true values there are in input. Indices are output in row-major order. Fo
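NumPy's np.argwhere is a close analogue of this single-argument form (a sketch of the semantics, not the TF implementation): it also returns the row-major coordinates of True elements as a 2-D array.

```python
import numpy as np

mask = np.array([[True, False],
                 [False, True],
                 [True,  True]])

# One row per True element; columns are the coordinates.
coords = np.argwhere(mask)
print(coords)
# [[0 0]
#  [1 1]
#  [2 0]
#  [2 1]]
```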

tf.contrib.distributions.Mixture.__init__()

tf.contrib.distributions.Mixture.__init__(cat, components, validate_args=False, allow_nan_stats=True, name='Mixture')
Initialize a Mixture distribution. A Mixture is defined by a Categorical (cat, representing the mixture probabilities) and a list of Distribution objects all having matching dtype, batch shape, event shape, and continuity properties (the components). The num_classes of cat must be possible to infer at graph construction time and match len(components).
Args:
  cat: A Categorical
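The density a Mixture represents is p(x) = sum_k cat_probs[k] * p_k(x); a minimal sketch with two hypothetical Gaussian components (not the TF implementation):

```python
import math

def normal_pdf(x, mean, std):
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def mixture_pdf(x, cat_probs, components):
    # Weighted sum of component densities; components are (mean, std) pairs.
    return sum(p * normal_pdf(x, m, s)
               for p, (m, s) in zip(cat_probs, components))

print(mixture_pdf(0.0, [0.3, 0.7], [(0.0, 1.0), (2.0, 1.0)]))
```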

tf.contrib.distributions.InverseGamma.pmf()

tf.contrib.distributions.InverseGamma.pmf(value, name='pmf')
Probability mass function.
Args:
  value: float or double Tensor.
  name: The name to give this op.
Returns:
  pmf: a Tensor of shape sample_shape(x) + self.batch_shape with values of type self.dtype.
Raises:
  TypeError: if is_continuous.
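The TypeError reflects that InverseGamma is continuous: it has a density, pdf(x) = beta**alpha / Gamma(alpha) * x**(-alpha - 1) * exp(-beta / x) for x > 0, but no probability mass function. A sketch of that density (not the TF implementation):

```python
import math

def inverse_gamma_pdf(x, alpha, beta):
    # Density of InverseGamma(alpha, beta) on its support x > 0.
    if x <= 0:
        raise ValueError("support is x > 0")
    return beta ** alpha / math.gamma(alpha) * x ** (-alpha - 1) * math.exp(-beta / x)

print(inverse_gamma_pdf(1.0, 3.0, 1.0))  # = exp(-1) / 2
```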