tf.contrib.bayesflow.stochastic_tensor.PoissonTensor.name

tf.contrib.bayesflow.stochastic_tensor.PoissonTensor.name

tf.contrib.bayesflow.stochastic_tensor.MultinomialTensor.entropy()

tf.contrib.bayesflow.stochastic_tensor.MultinomialTensor.entropy(name='entropy')

tf.contrib.distributions.StudentT.event_shape()

tf.contrib.distributions.StudentT.event_shape(name='event_shape')

Shape of a single sample from a single batch as a 1-D int32 Tensor.

Args:
  name: name to give to the op.

Returns:
  event_shape: Tensor.
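
A minimal usage sketch, assuming the contrib.distributions API of this era (the constructor parameters df, mu, and sigma are assumptions based on that API):

  import tensorflow as tf

  # StudentT is a scalar distribution, so its event shape is empty.
  dist = tf.contrib.distributions.StudentT(df=3.0, mu=0.0, sigma=1.0)
  event_shape = dist.event_shape()
  with tf.Session() as sess:
      print(sess.run(event_shape))  # []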

tf.contrib.rnn.LayerNormBasicLSTMCell

class tf.contrib.rnn.LayerNormBasicLSTMCell

LSTM unit with layer normalization and recurrent dropout.

This class adds layer normalization and recurrent dropout to a basic LSTM unit. Layer normalization is applied before the internal nonlinearities; the implementation is based on:

https://arxiv.org/abs/1607.06450 "Layer Normalization" Jimmy Lei Ba, Jamie Ryan Kiros, Geoffrey E. Hinton

Recurrent dropout is based on:

https://arxiv.org/abs/1603.05118 "Recurrent Dropout without Memory Loss" Stanislau Semeniuta, Aliaksei Severyn, Erhardt Barth
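
A minimal sketch of how the cell might be used, assuming the TF 1.x contrib API (the layer_norm and dropout_keep_prob constructor arguments come from that API; the shapes are illustrative):

  import tensorflow as tf

  cell = tf.contrib.rnn.LayerNormBasicLSTMCell(
      num_units=64,
      layer_norm=True,        # apply layer normalization before the nonlinearities
      dropout_keep_prob=0.9)  # recurrent dropout on the candidate cell state

  # inputs: [batch, time, features]
  inputs = tf.placeholder(tf.float32, [None, 20, 32])
  outputs, state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)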

tf.contrib.distributions.Bernoulli.mode()

tf.contrib.distributions.Bernoulli.mode(name='mode')

Mode.

Additional documentation from Bernoulli:

Returns 1 if p > 1-p and 0 otherwise.
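
A short illustration of the rule above, assuming the contrib.distributions Bernoulli of this era (the p constructor argument is an assumption based on that API):

  import tensorflow as tf

  # mode() is 1 where p > 1 - p (i.e. p > 0.5) and 0 otherwise,
  # so p = 0.5 maps to 0.
  dist = tf.contrib.distributions.Bernoulli(p=[0.2, 0.5, 0.8])
  with tf.Session() as sess:
      print(sess.run(dist.mode()))  # [0, 0, 1]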

tf.contrib.learn.monitors.StepCounter.run_on_all_workers

tf.contrib.learn.monitors.StepCounter.run_on_all_workers

tf.contrib.bayesflow.stochastic_tensor.MultinomialTensor.graph

tf.contrib.bayesflow.stochastic_tensor.MultinomialTensor.graph

tf.contrib.bayesflow.stochastic_tensor.GammaTensor.value_type

tf.contrib.bayesflow.stochastic_tensor.GammaTensor.value_type
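
A minimal sketch, assuming the bayesflow stochastic tensor API of this era: value_type reports the value type (e.g. MeanValue or SampleValue) in effect when the tensor was constructed. The Gamma parameters alpha and beta are assumptions:

  import tensorflow as tf

  st = tf.contrib.bayesflow.stochastic_tensor

  # Build the stochastic tensor under an explicit value type, then read it back.
  with st.value_type(st.MeanValue()):
      gamma = st.GammaTensor(alpha=2.0, beta=1.0)
  print(gamma.value_type)  # MeanValue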

tf.contrib.bayesflow.stochastic_tensor.ExponentialTensor.mean()

tf.contrib.bayesflow.stochastic_tensor.ExponentialTensor.mean(name='mean')

tensorflow::TensorShape::InsertDim()

void tensorflow::TensorShape::InsertDim(int d, int64 size)

Insert a dimension somewhere in the TensorShape.

REQUIRES: 0 <= d <= dims()
REQUIRES: size >= 0
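
The Python tf.TensorShape has no InsertDim method; a hypothetical sketch of the equivalent shape manipulation from Python, rebuilding the shape by hand, would be:

  import tensorflow as tf

  # Equivalent of InsertDim(1, 5) on a [2, 3] shape,
  # respecting 0 <= d <= dims() and size >= 0.
  shape = tf.TensorShape([2, 3])
  dims = shape.as_list()
  dims.insert(1, 5)
  print(tf.TensorShape(dims))  # (2, 5, 3)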