tf.contrib.distributions.BetaWithSoftplusAB.mode()

tf.contrib.distributions.BetaWithSoftplusAB.mode(name='mode') Mode. Additional documentation from Beta: Note that the mode for the Beta distribution is only defined when a > 1, b > 1. This returns the mode when a > 1 and b > 1, and NaN otherwise. If self.allow_nan_stats is False, an exception will be raised rather than returning NaN.
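
A minimal sketch of the documented behavior, assuming the tf.contrib API of this release. BetaWithSoftplusAB passes the constructor's a and b through softplus, so both effective parameters here exceed 1 and the mode (a - 1) / (a + b - 2) is well defined:

    import tensorflow as tf

    # softplus(2.0) ~= 2.13 and softplus(3.0) ~= 3.05, both > 1, so mode()
    # returns (a - 1) / (a + b - 2) ~= 0.35 rather than NaN.
    dist = tf.contrib.distributions.BetaWithSoftplusAB(a=2.0, b=3.0)
    with tf.Session() as sess:
        print(sess.run(dist.mode()))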

tensorflow::SessionOptions::env

Env* tensorflow::SessionOptions::env The environment to use.

tf.contrib.distributions.MultivariateNormalDiagPlusVDVT.log_pmf()

tf.contrib.distributions.MultivariateNormalDiagPlusVDVT.log_pmf(value, name='log_pmf') Log probability mass function.

Args:
  value: float or double Tensor.
  name: The name to give this op.

Returns:
  log_pmf: a Tensor of shape sample_shape(x) + self.batch_shape with values of type self.dtype.

Raises:
  TypeError: if is_continuous.
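
Since this distribution is continuous, log_pmf raises the documented TypeError. A minimal sketch; the constructor keyword names (mu, diag_large, v) are assumptions based on the contemporaneous tf.contrib API and may differ:

    import tensorflow as tf

    ds = tf.contrib.distributions
    # Keyword names below are assumptions; the covariance combines a diagonal
    # matrix with a low-rank V D V^T update.
    mvn = ds.MultivariateNormalDiagPlusVDVT(
        mu=[0., 0.], diag_large=[1., 1.], v=[[1.], [1.]])
    try:
        mvn.log_pmf([0.5, -0.5])    # continuous, so this raises TypeError
    except TypeError:
        log_density = mvn.log_pdf([0.5, -0.5])  # the density counterpart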

tf.contrib.learn.monitors.CaptureVariable

class tf.contrib.learn.monitors.CaptureVariable Captures a variable's values into a collection. This monitor is useful for unit testing. You should exercise caution when using this monitor in production, since it never discards values. This is an EveryN monitor and has consistent semantics for every_n and first_n.
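
A minimal usage sketch, assuming this release's tf.contrib.learn API; the variable name 'linear/bias_weight:0' is an assumption about LinearRegressor's internals and may need adjusting:

    import numpy as np
    import tensorflow as tf
    from tensorflow.contrib import learn

    x = np.random.rand(100, 1).astype(np.float32)
    y = 2. * x[:, 0] + 1.
    cols = [tf.contrib.layers.real_valued_column('', dimension=1)]
    est = learn.LinearRegressor(feature_columns=cols)

    # var_name is the full tensor name of the variable to capture; every_n
    # and first_n control the capture schedule.
    capture = learn.monitors.CaptureVariable(
        var_name='linear/bias_weight:0', every_n=10, first_n=1)
    est.fit(x, y, steps=50, monitors=[capture])
    print(capture.values)  # dict mapping global step -> captured value

Because every captured value stays in the values dict, memory grows with training length, which is why the class is recommended for tests rather than production.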

tf.contrib.learn.monitors.StepCounter.run_on_all_workers

tf.contrib.learn.monitors.StepCounter.run_on_all_workers

tf.contrib.learn.DNNRegressor.linear_bias_

tf.contrib.learn.DNNRegressor.linear_bias_ Returns the bias of the linear part.
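
A minimal sketch; DNNRegressor in this release was built on the combined linear/DNN estimator, hence the linear part. Assumes a previously fitted regressor:

    # Assuming `reg` is a fitted tf.contrib.learn.DNNRegressor:
    bias = reg.linear_bias_  # numpy array holding the linear part's bias term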

tf.contrib.distributions.Normal

class tf.contrib.distributions.Normal The scalar Normal distribution with mean and stddev parameters mu, sigma.
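
A minimal sketch, assuming this release's constructor, which takes mu and sigma directly:

    import tensorflow as tf

    dist = tf.contrib.distributions.Normal(mu=0., sigma=1.)
    with tf.Session() as sess:
        samples = sess.run(dist.sample(5))        # five draws from N(0, 1)
        log_density = sess.run(dist.log_pdf(0.))  # log density at x = 0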

tf.contrib.distributions.Bernoulli.mode()

tf.contrib.distributions.Bernoulli.mode(name='mode') Mode. Additional documentation from Bernoulli: Returns 1 if p > 1-p and 0 otherwise.
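
A minimal sketch, assuming the era's p keyword. With p = 0.7 we have p > 1 - p, so mode() evaluates to 1:

    import tensorflow as tf

    dist = tf.contrib.distributions.Bernoulli(p=0.7)
    with tf.Session() as sess:
        print(sess.run(dist.mode()))  # 0.7 > 0.3, so this prints 1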

tf.contrib.rnn.LayerNormBasicLSTMCell

class tf.contrib.rnn.LayerNormBasicLSTMCell LSTM unit with layer normalization and recurrent dropout. This class adds layer normalization and recurrent dropout to a basic LSTM unit. The layer normalization implementation is based on "Layer Normalization" (https://arxiv.org/abs/1607.06450) by Jimmy Lei Ba, Jamie Ryan Kiros, and Geoffrey E. Hinton, and is applied before the internal nonlinearities. Recurrent dropout is based on "Recurrent Dropout without Memory Loss" (https://arxiv.org/abs/1603.05118) by Stanislau Semeniuta, Aliaksei Severyn, and Erhardt Barth.
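
A minimal sketch of wiring the cell into a dynamic RNN; the keyword names follow this release's constructor and are assumptions beyond num_units:

    import tensorflow as tf

    # layer_norm=False would disable normalization; dropout_keep_prob < 1
    # enables recurrent dropout on the cell update.
    cell = tf.contrib.rnn.LayerNormBasicLSTMCell(
        num_units=64, layer_norm=True, dropout_keep_prob=0.9)
    inputs = tf.placeholder(tf.float32, [None, 20, 32])  # batch, time, depth
    outputs, state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)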

tf.contrib.framework.with_shape()

tf.contrib.framework.with_shape(expected_shape, tensor) Asserts tensor has expected shape. If tensor's shape and expected_shape are fully defined, assert they match. Otherwise, add an assert op that will validate the shape when tensor is evaluated, and set the shape on tensor.

Args:
  expected_shape: Expected shape to assert, as a 1D array of ints, or tensor of same.
  tensor: Tensor whose shape we're validating.

Returns:
  tensor, perhaps with a dependent assert operation.

Raises:
  ValueError: if tensor has an invalid shape.
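
A minimal sketch covering both branches: a statically known shape is checked at graph construction, while an unknown shape gets a runtime assert:

    import tensorflow as tf

    # Static case: shape is fully defined, so the check happens immediately.
    y = tf.constant([[1., 2., 3.], [4., 5., 6.]])
    y = tf.contrib.framework.with_shape([2, 3], y)

    # Dynamic case: shape is unknown, so an assert op runs when x is
    # evaluated, and the static shape [2, 3] is set on the returned tensor.
    x = tf.placeholder(tf.float32)
    x = tf.contrib.framework.with_shape([2, 3], x)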