tf.contrib.bayesflow.stochastic_tensor.DirichletTensor.name

tf.contrib.bayesflow.stochastic_tensor.DirichletTensor.name

tf.contrib.distributions.QuantizedDistribution.param_shapes()

tf.contrib.distributions.QuantizedDistribution.param_shapes(cls, sample_shape, name='DistributionParamShapes')

Shapes of parameters given the desired shape of a call to sample(). Subclasses should override static method _param_shapes.

Args:
  sample_shape: Tensor or python list/tuple. Desired shape of a call to sample().
  name: name to prepend ops with.

Returns:
  dict of parameter name to Tensor shapes.
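The delegation pattern described above (a public classmethod forwarding to an overridable static `_param_shapes`) can be sketched in plain Python. The class and parameter names below are illustrative, not the actual TensorFlow implementation:

```python
class Distribution:
    @classmethod
    def param_shapes(cls, sample_shape):
        # The public method delegates; subclasses override _param_shapes.
        return cls._param_shapes(sample_shape)


class Normal(Distribution):
    @staticmethod
    def _param_shapes(sample_shape):
        # Each parameter broadcasts to the requested sample shape.
        return {"mu": list(sample_shape), "sigma": list(sample_shape)}


print(Normal.param_shapes([100]))  # {'mu': [100], 'sigma': [100]}
```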

tf.image.central_crop()

tf.image.central_crop(image, central_fraction)

Crop the central region of the image. Remove the outer parts of an image but retain the central region of the image along each dimension. If we specify central_fraction = 0.5, this function returns the region marked with "X" in the below diagram.

     --------
    |        |
    |  XXXX  |
    |  XXXX  |
    |        |
     --------

where "X" is the central 50% of the image.

Args:
  image: 3-D float Tensor of shape [height, width, depth]
  central_fraction: float (0, 1]
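The behaviour can be sketched in NumPy. The symmetric-trim offsets here are a simple approximation of what the docstring describes, not necessarily identical to TensorFlow's internal rounding:

```python
import numpy as np


def central_crop(image, central_fraction):
    """Keep the central `central_fraction` of each spatial dimension (sketch)."""
    assert 0.0 < central_fraction <= 1.0
    h, w = image.shape[:2]
    # Trim an equal margin from each side of height and width.
    top = int((h - h * central_fraction) / 2)
    left = int((w - w * central_fraction) / 2)
    return image[top:h - top, left:w - left]


img = np.arange(16.0).reshape(4, 4, 1)
print(central_crop(img, 0.5).shape)  # (2, 2, 1)
```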

tf.contrib.distributions.MultivariateNormalDiag.allow_nan_stats

tf.contrib.distributions.MultivariateNormalDiag.allow_nan_stats

Python boolean describing behavior when a stat is undefined.

Stats return +/- infinity when it makes sense. E.g., the variance of a Cauchy distribution is infinity. However, sometimes the statistic is undefined, e.g., if a distribution's pdf does not achieve a maximum within the support of the distribution, the mode is undefined. If the mean is undefined, then by definition the variance is undefined. E.g. the mean for Student's T for df = 1 is undefined, so the variance is undefined as well.
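The return-NaN-versus-raise behaviour described above can be illustrated with a small pure-Python sketch. The function name and signature are hypothetical, chosen only to mirror the Student's T example from the docstring:

```python
import math


def student_t_mean(df, allow_nan_stats=True):
    # The mean of Student's T is undefined for df <= 1. allow_nan_stats
    # chooses between returning NaN and raising an error (illustrative).
    if df > 1:
        return 0.0
    if allow_nan_stats:
        return float("nan")
    raise ValueError("mean is undefined for df <= 1")


print(student_t_mean(3.0))  # 0.0
print(student_t_mean(0.5))  # nan
```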

tf.contrib.learn.monitors.StepCounter.every_n_post_step()

tf.contrib.learn.monitors.StepCounter.every_n_post_step(step, session)

Callback after a step is finished or end() is called.

Args:
  step: int, the current value of the global step.
  session: Session object.
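The every-N callback pattern behind this monitor can be sketched in plain Python. The class and method names below mimic the monitor API but are simplified stand-ins, not the real implementation:

```python
class EveryN:
    """Minimal sketch of an every-N training monitor (illustrative names)."""

    def __init__(self, every_n_steps):
        self.every_n_steps = every_n_steps
        self.calls = 0

    def every_n_post_step(self, step, session=None):
        # Fires only on every N-th step; here we just count invocations.
        self.calls += 1

    def post_step(self, step, session=None):
        if step % self.every_n_steps == 0:
            self.every_n_post_step(step, session)


m = EveryN(10)
for step in range(1, 31):
    m.post_step(step)
print(m.calls)  # 3
```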

tf.contrib.distributions.Normal.sigma

tf.contrib.distributions.Normal.sigma Distribution parameter for standard deviation.

tf.contrib.layers.l1_regularizer()

tf.contrib.layers.l1_regularizer(scale, scope=None)

Returns a function that can be used to apply L1 regularization to weights. L1 regularization encourages sparsity.

Args:
  scale: A scalar multiplier Tensor. 0.0 disables the regularizer.
  scope: An optional scope name.

Returns:
  A function with signature l1(weights) that applies L1 regularization.

Raises:
  ValueError: If scale is negative or if scale is not a float.
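The factory shape described above can be sketched with NumPy: the returned closure computes `scale * sum(|w|)`, which is the L1 penalty. This is an illustrative analogue, not the TensorFlow implementation:

```python
import numpy as np


def l1_regularizer(scale):
    # Sketch of the factory: validates scale, returns l1(weights).
    if scale < 0:
        raise ValueError("scale must be non-negative")

    def l1(weights):
        # L1 penalty: scale times the sum of absolute weight values.
        return scale * np.sum(np.abs(weights))

    return l1


reg = l1_regularizer(0.1)
print(reg(np.array([1.0, -2.0, 3.0])))  # ~0.6
```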

tf.contrib.graph_editor.select_ts()

tf.contrib.graph_editor.select_ts(*args, **kwargs)

Helper to select tensors.

Args:
  *args: list of 1) regular expressions (compiled or not) or 2) (array of) tf.Tensor. tf.Operation instances are silently ignored.
  **kwargs:
    'graph': tf.Graph in which to perform the regex query. This is required when using regex.
    'positive_filter': an elem is selected only if positive_filter(elem) is True. This is optional.
    'restrict_ts_regex': a regular expression is ignored if it doesn't start with the substring
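The regex-plus-filter selection idea can be sketched over plain name strings. The real select_ts operates on tf.Tensor objects inside a tf.Graph; the function below is a hypothetical pure-Python analogue:

```python
import re


def select_by_regex(pattern, names, positive_filter=None):
    """Select names matching a regex, optionally filtered (sketch)."""
    # Accept either a pattern string or a precompiled regex, as select_ts does.
    regex = re.compile(pattern) if isinstance(pattern, str) else pattern
    selected = [n for n in names if regex.search(n)]
    if positive_filter is not None:
        selected = [n for n in selected if positive_filter(n)]
    return selected


names = ["conv1/weights:0", "conv1/bias:0", "fc/weights:0"]
print(select_by_regex(r"weights", names))  # ['conv1/weights:0', 'fc/weights:0']
```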

tf.contrib.distributions.Gamma.entropy()

tf.contrib.distributions.Gamma.entropy(name='entropy')

Shannon entropy in nats.

Additional documentation from Gamma:

This is defined to be

  entropy = alpha - log(beta) + log(Gamma(alpha)) + (1 - alpha) * digamma(alpha)

where digamma(alpha) is the digamma function.
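The formula above can be evaluated numerically with the standard library. Since `math` has no digamma, this sketch approximates it with a central difference of `lgamma`; for alpha = 1, beta = 1 the Gamma distribution is Exp(1), whose entropy is exactly 1 nat:

```python
import math


def gamma_entropy(alpha, beta):
    # entropy = alpha - log(beta) + log(Gamma(alpha)) + (1 - alpha) * digamma(alpha)
    # digamma approximated as the derivative of lgamma (central difference).
    h = 1e-6
    digamma = (math.lgamma(alpha + h) - math.lgamma(alpha - h)) / (2 * h)
    return alpha - math.log(beta) + math.lgamma(alpha) + (1 - alpha) * digamma


print(round(gamma_entropy(1.0, 1.0), 6))  # 1.0 (entropy of Exp(1))
```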

tf.contrib.bayesflow.stochastic_tensor.NormalWithSoftplusSigmaTensor.loss()

tf.contrib.bayesflow.stochastic_tensor.NormalWithSoftplusSigmaTensor.loss(final_loss, name='Loss')
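The class name suggests the scale parameter is passed through softplus, log(1 + exp(x)), which maps any real input to a positive value. A minimal sketch of that transform, under that naming assumption:

```python
import math


def softplus(x):
    # softplus(x) = log(1 + exp(x)); always positive, so it is a common way
    # to parameterize a standard deviation from an unconstrained variable.
    return math.log1p(math.exp(x))


print(softplus(0.0))  # log(2) ~ 0.6931
```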