tf.contrib.learn.monitors.EveryN.step_end()

tf.contrib.learn.monitors.EveryN.step_end(step, output)

Overrides BaseMonitor.step_end. When overriding this method, you must call the super implementation.

Args:
  step: int, the current value of the global step.
  output: dict mapping string values representing tensor names to the values resulting from running these tensors. Values may be either scalars, for scalar tensors, or a Numpy array, for non-scalar tensors.

Returns:
  bool, the result of every_n_step_end, if that was called this step, or False otherwise.
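
A minimal sketch of how this hooks in, assuming the legacy tf.contrib.learn monitor API: rather than overriding step_end directly, a subclass usually implements every_n_step_begin/every_n_step_end, which step_end dispatches to on the scheduled steps. The monitor class and tensor name below are hypothetical.

    import tensorflow as tf

    monitors = tf.contrib.learn.monitors

    class PrintLossEveryN(monitors.EveryN):          # hypothetical monitor
        def __init__(self, loss_tensor_name, every_n_steps=100):
            super(PrintLossEveryN, self).__init__(every_n_steps=every_n_steps)
            self._loss_name = loss_tensor_name       # illustrative tensor name

        def every_n_step_begin(self, step):
            # Request the loss tensor on the steps this monitor fires.
            return [self._loss_name]

        def every_n_step_end(self, step, outputs):
            # `outputs` maps the requested tensor names to their values.
            print('step %d, loss %s' % (step, outputs[self._loss_name]))
            return False    # returning True would request that training stop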

tf.contrib.distributions.BetaWithSoftplusAB.param_static_shapes()

tf.contrib.distributions.BetaWithSoftplusAB.param_static_shapes(cls, sample_shape)

param_shapes with static (i.e. TensorShape) shapes.

Args:
  sample_shape: TensorShape or python list/tuple. Desired shape of a call to sample().

Returns:
  dict of parameter name to TensorShape.

Raises:
  ValueError: if sample_shape is a TensorShape and is not fully defined.
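
A minimal sketch of the static variant, assuming the legacy tf.contrib.distributions API; the sample shape is illustrative, and the exact keys of the returned dict depend on the distribution's parameterization (for the Beta family, a and b).

    import tensorflow as tf

    ds = tf.contrib.distributions

    # Ask the class which parameter shapes produce samples of shape [32].
    shapes = ds.BetaWithSoftplusAB.param_static_shapes(sample_shape=[32])
    print(shapes)    # expected: parameter name -> TensorShape, e.g. a and b -> [32]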

tf.contrib.distributions.BetaWithSoftplusAB.param_shapes()

tf.contrib.distributions.BetaWithSoftplusAB.param_shapes(cls, sample_shape, name='DistributionParamShapes')

Shapes of parameters given the desired shape of a call to sample(). Subclasses should override static method _param_shapes.

Args:
  sample_shape: Tensor or python list/tuple. Desired shape of a call to sample().
  name: name to prepend ops with.

Returns:
  dict of parameter name to Tensor shapes.
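
A minimal sketch of the dynamic counterpart, where the desired sample shape is itself a Tensor and the returned shapes are Tensors evaluated at run time; the placeholder is illustrative.

    import tensorflow as tf

    ds = tf.contrib.distributions

    sample_shape = tf.placeholder(tf.int32, shape=[1])         # e.g. fed as [32]
    shape_tensors = ds.BetaWithSoftplusAB.param_shapes(sample_shape)
    # Each value in shape_tensors is a Tensor describing the required parameter shape.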

tf.contrib.learn.TensorFlowRNNRegressor.fit()

tf.contrib.learn.TensorFlowRNNRegressor.fit(x, y, steps=None, monitors=None, logdir=None)

Builds a neural network model from the provided model_fn and training data.

Note: the first call constructs the graph and initializes variables. Consecutive calls continue training the same model. This logic follows the partial_fit() interface in scikit-learn. To restart learning, create a new estimator.

Args:
  x: matrix or tensor of shape [n_samples, n_features...]. Can be an iterator that returns arrays of features.
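
A minimal sketch of the fit/partial-fit behaviour, assuming the legacy tf.contrib.learn API and an already constructed TensorFlowRNNRegressor called regressor; the data shapes and step counts are illustrative.

    import numpy as np

    x_train = np.random.rand(100, 10, 8).astype(np.float32)   # hypothetical sequence features
    y_train = np.random.rand(100).astype(np.float32)

    regressor.fit(x_train, y_train, steps=200)   # first call builds the graph and initializes variables
    regressor.fit(x_train, y_train, steps=200)   # later calls keep training the same model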

tf.contrib.bayesflow.stochastic_tensor.MultivariateNormalCholeskyTensor.distribution

tf.contrib.bayesflow.stochastic_tensor.MultivariateNormalCholeskyTensor.distribution

tf.contrib.bayesflow.stochastic_tensor.BetaWithSoftplusABTensor.__init__()

tf.contrib.bayesflow.stochastic_tensor.BetaWithSoftplusABTensor.__init__(name=None, dist_value_type=None, loss_fn=score_function, **dist_args)
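
A minimal sketch, assuming the legacy tf.contrib.bayesflow API: the keyword arguments captured by **dist_args are forwarded to the underlying BetaWithSoftplusAB distribution, whose parameters are a and b; the values below are illustrative.

    import tensorflow as tf

    st = tf.contrib.bayesflow.stochastic_tensor

    beta_st = st.BetaWithSoftplusABTensor(a=[1.0, 2.0], b=[2.0, 3.0])
    sample = beta_st.value()        # a draw from the wrapped distribution
    dist = beta_st.distribution     # the underlying BetaWithSoftplusAB instance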

tf.FIFOQueue.__init__()

tf.FIFOQueue.__init__(capacity, dtypes, shapes=None, names=None, shared_name=None, name='fifo_queue')

Creates a queue that dequeues elements in a first-in first-out order.

A FIFOQueue has bounded capacity; supports multiple concurrent producers and consumers; and provides exactly-once delivery.

A FIFOQueue holds a list of up to capacity elements. Each element is a fixed-length tuple of tensors whose dtypes are described by dtypes, and whose shapes are optionally described by the shapes argument.
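
A minimal sketch of a FIFOQueue holding scalar float elements; the capacity and values are illustrative.

    import tensorflow as tf

    q = tf.FIFOQueue(capacity=3, dtypes=[tf.float32], shapes=[[]])
    enqueue_op = q.enqueue([10.0])
    dequeue_op = q.dequeue()

    with tf.Session() as sess:
        sess.run(enqueue_op)
        print(sess.run(dequeue_op))   # 10.0, elements come out in the order they went in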

tf.contrib.distributions.Poisson.pdf()

tf.contrib.distributions.Poisson.pdf(value, name='pdf')

Probability density function.

Args:
  value: float or double Tensor.
  name: The name to give this op.

Returns:
  prob: a Tensor of shape sample_shape(x) + self.batch_shape with values of type self.dtype.

Raises:
  TypeError: if not is_continuous.
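
A minimal sketch of why the TypeError matters here: Poisson is discrete (is_continuous is False), so pdf() raises and pmf() is the matching method; the rate and counts are illustrative, and the parameter name follows the legacy contrib API.

    import tensorflow as tf

    poisson = tf.contrib.distributions.Poisson(lam=3.0)
    probs = poisson.pmf([0.0, 1.0, 2.0])     # probability mass at each count
    # poisson.pdf([0.0, 1.0, 2.0])           # would raise TypeError: not is_continuous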

tf.contrib.distributions.Chi2WithAbsDf.pdf()

tf.contrib.distributions.Chi2WithAbsDf.pdf(value, name='pdf')

Probability density function.

Args:
  value: float or double Tensor.
  name: The name to give this op.

Returns:
  prob: a Tensor of shape sample_shape(x) + self.batch_shape with values of type self.dtype.

Raises:
  TypeError: if not is_continuous.
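
A minimal sketch for the continuous case: Chi2WithAbsDf is continuous, so pdf() evaluates the density directly; the degrees of freedom and evaluation points are illustrative.

    import tensorflow as tf

    chi2 = tf.contrib.distributions.Chi2WithAbsDf(df=[-4.0, 3.0])   # negative df is accepted; the class uses |df|
    density = chi2.pdf([1.0, 2.5])    # shape: sample_shape(x) + batch_shape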

tf.contrib.rnn.LSTMBlockCell.zero_state()

tf.contrib.rnn.LSTMBlockCell.zero_state(batch_size, dtype)

Return zero-filled state tensor(s).

Args:
  batch_size: int, float, or unit Tensor representing the batch size.
  dtype: the data type to use for the state.

Returns:
  If state_size is an int or TensorShape, then the return value is an N-D tensor of shape [batch_size x state_size] filled with zeros. If state_size is a nested list or tuple, then the return value is a nested list or tuple (of the same structure) of 2-D tensors with the shapes [batch_size x s] for each s in state_size.
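
A minimal sketch, assuming the legacy tf.contrib.rnn API; the cell size and batch size are illustrative.

    import tensorflow as tf

    cell = tf.contrib.rnn.LSTMBlockCell(num_units=64)
    init_state = cell.zero_state(batch_size=32, dtype=tf.float32)
    # For an LSTM cell the state is typically a (c, h) pair of [32, 64] zero tensors.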