tf.contrib.bayesflow.stochastic_tensor.NormalWithSoftplusSigmaTensor.__init__()

tf.contrib.bayesflow.stochastic_tensor.NormalWithSoftplusSigmaTensor.__init__(name=None, dist_value_type=None, loss_fn=score_function, **dist_args)
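A minimal usage sketch, assuming the TF 0.12-era stochastic_tensor API (value_type/SampleValue) and that the mu/sigma keyword arguments are forwarded via **dist_args to the underlying NormalWithSoftplusSigma distribution:

    import tensorflow as tf

    st = tf.contrib.bayesflow.stochastic_tensor

    mu = tf.zeros([3])
    raw_sigma = tf.ones([3])  # passed through softplus internally to keep sigma positive

    # dist_value_type can also be passed directly; the value_type context is assumed here.
    with st.value_type(st.SampleValue()):
        x = st.NormalWithSoftplusSigmaTensor(mu=mu, sigma=raw_sigma)

    sample = x.value()  # the sampled value under the SampleValue value type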

tf.contrib.distributions.MultivariateNormalDiag.log_prob()

tf.contrib.distributions.MultivariateNormalDiag.log_prob(value, name='log_prob') Log probability density/mass function (depending on is_continuous).
Additional documentation from _MultivariateNormalOperatorPD: x is a batch vector with compatible shape if x is a Tensor whose shape can be broadcast up to either self.batch_shape + self.event_shape or [M1,...,Mm] + self.batch_shape + self.event_shape.
Args: value: float or double Tensor. name: The name to give this op.
Returns: log_prob: a Tensor of shape sample_shape(x) + self.batch_shape with values of type self.dtype.
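A hedged sketch of calling log_prob, assuming the contrib constructor of this era takes mu and diag_stdev:

    import tensorflow as tf

    ds = tf.contrib.distributions
    mvn = ds.MultivariateNormalDiag(mu=[0., 0.], diag_stdev=[1., 1.])

    x = tf.constant([[1., 0.], [0., 2.]])  # [M1] + batch_shape + event_shape = [2, 2]
    lp = mvn.log_prob(x)                   # shape [2]: one log density per event vector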

tf.nn.rnn_cell.BasicRNNCell.__init__()

tf.nn.rnn_cell.BasicRNNCell.__init__(num_units, input_size=None, activation=tanh)
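A short sketch of wiring the cell into tf.nn.dynamic_rnn; the shapes are illustrative:

    import tensorflow as tf

    cell = tf.nn.rnn_cell.BasicRNNCell(num_units=128)     # tanh activation by default
    inputs = tf.placeholder(tf.float32, [None, 20, 64])   # [batch, time, features]
    outputs, final_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)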

tf.parse_example()

tf.parse_example(serialized, features, name=None, example_names=None) Parses Example protos into a dict of tensors.
Parses a number of serialized Example protos given in serialized. example_names may contain descriptive names for the corresponding serialized protos. These may be useful for debugging purposes, but they have no effect on the output. If not None, example_names must be the same length as serialized.
This op parses serialized examples into a dictionary mapping keys to Tensor and SparseTensor objects.
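A sketch of a typical call; the feature keys and shapes are made up for illustration:

    import tensorflow as tf

    features = {
        "age": tf.FixedLenFeature([1], tf.int64),   # parsed into a dense Tensor
        "tags": tf.VarLenFeature(tf.string),        # parsed into a SparseTensor
    }
    serialized = tf.placeholder(tf.string, shape=[None])  # batch of serialized Example protos
    parsed = tf.parse_example(serialized, features)
    # parsed["age"] is a Tensor, parsed["tags"] is a SparseTensor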

tf.SparseTensor.dtype

tf.SparseTensor.dtype The DType of elements in this tensor.
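A tiny illustration; the pre-1.0 constructor keyword shape (later dense_shape) is assumed:

    import tensorflow as tf

    sp = tf.SparseTensor(indices=[[0, 0], [1, 2]], values=[1.0, 2.0], shape=[3, 4])
    print(sp.dtype)  # tf.float32, the dtype of the values tensor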

tf.contrib.bayesflow.entropy.entropy_shannon()

tf.contrib.bayesflow.entropy.entropy_shannon(p, z=None, n=None, seed=None, form=None, name='entropy_shannon') Monte Carlo or deterministic computation of Shannon's entropy.
Depending on the kwarg form, this Op returns either the analytic entropy of the distribution p, or the sampled entropy:
-n^{-1} sum_{i=1}^n p.log_prob(z_i), where z_i ~ p,
\approx - E_p[ Log[p(Z)] ] = Entropy[p]
The user supplies either a Tensor of samples z, or the number of samples to draw, n.
Args: p: tf.contrib.distributions.Distribution.
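A hedged sketch of the Monte Carlo form, assuming the contrib Normal constructor of this era (mu/sigma):

    import tensorflow as tf

    ds = tf.contrib.distributions
    entropy = tf.contrib.bayesflow.entropy

    p = ds.Normal(mu=0.0, sigma=1.0)
    h_mc = entropy.entropy_shannon(p, n=1000, seed=0)  # sampled estimate of Entropy[p]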

tf.contrib.graph_editor.ControlOutputs.get()

tf.contrib.graph_editor.ControlOutputs.get(op) Return the control outputs of op.
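A sketch: build the control-output map for a graph, then query one op. The ControlOutputs(graph) constructor is assumed from the graph_editor module of this era:

    import tensorflow as tf

    ge = tf.contrib.graph_editor

    g = tf.Graph()
    with g.as_default():
        a = tf.constant(1, name="a")
        with tf.control_dependencies([a]):
            b = tf.constant(2, name="b")   # b carries a control dependency on a

    control_outputs = ge.ControlOutputs(g)
    print(control_outputs.get(a.op))       # ops that have a as a control input, i.e. b's op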

tf.contrib.distributions.Mixture.log_pmf()

tf.contrib.distributions.Mixture.log_pmf(value, name='log_pmf') Log probability mass function.
Args: value: float or double Tensor. name: The name to give this op.
Returns: log_pmf: a Tensor of shape sample_shape(x) + self.batch_shape with values of type self.dtype.
Raises: TypeError: if is_continuous.
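A hedged sketch with discrete components (log_pmf is only valid when the mixture is not continuous); the cat/components constructor arguments and the Categorical/Bernoulli keywords are assumptions for this era of the contrib API:

    import tensorflow as tf

    ds = tf.contrib.distributions
    mix = ds.Mixture(
        cat=ds.Categorical(logits=tf.log([0.3, 0.7])),
        components=[ds.Bernoulli(p=0.2), ds.Bernoulli(p=0.9)])

    lp = mix.log_pmf(tf.constant(1))  # log P(X = 1) under the two-component mixture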

tf.contrib.losses.get_total_loss()

tf.contrib.losses.get_total_loss(add_regularization_losses=True, name='total_loss') Returns a tensor whose value represents the total loss.
Notice that the function adds the given losses to the regularization losses.
Args: add_regularization_losses: A boolean indicating whether or not to use the regularization losses in the sum. name: The name of the returned tensor.
Returns: A Tensor whose value represents the total loss.
Raises: ValueError: if losses is not iterable.
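A sketch, assuming losses are registered in the standard losses collection via tf.contrib.losses.add_loss (or by the built-in loss functions in tf.contrib.losses):

    import tensorflow as tf

    predictions = tf.placeholder(tf.float32, [None, 1])
    labels = tf.placeholder(tf.float32, [None, 1])

    my_loss = tf.reduce_mean(tf.square(predictions - labels))
    tf.contrib.losses.add_loss(my_loss)   # register the loss so get_total_loss can find it

    total_loss = tf.contrib.losses.get_total_loss(add_regularization_losses=True)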

tf.matrix_set_diag()

tf.matrix_set_diag(input, diagonal, name=None) Returns a batched matrix tensor with new batched diagonal values.
Given input and diagonal, this operation returns a tensor with the same shape and values as input, except for the diagonals of the innermost matrices. These will be overwritten by the values in diagonal. The batched matrices must be square.
The output is computed as follows: assume input has k+1 dimensions [I, J, K, ..., N, N] and diagonal has k dimensions [I, J, K, ..., N]. Then the output has the same shape as input, where output[i, j, k, ..., m, n] = diagonal[i, j, k, ..., n] for m == n, and output[i, j, k, ..., m, n] = input[i, j, k, ..., m, n] for m != n.
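A small sketch overwriting the diagonals of a batch of square matrices:

    import tensorflow as tf

    inp = tf.zeros([2, 3, 3])               # batch of two 3x3 matrices
    diag = tf.constant([[1., 2., 3.],
                        [4., 5., 6.]])      # one diagonal per matrix
    out = tf.matrix_set_diag(inp, diag)
    # out[0] has diagonal [1, 2, 3]; off-diagonal entries keep the values of inp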