tf.contrib.learn.monitors.StopAtStep.begin()

tf.contrib.learn.monitors.StopAtStep.begin(max_steps=None)

Called at the beginning of training. When called, the default graph is the one we are executing.

Args:
  max_steps: int, the maximum global step this training will run until.

Raises:
  ValueError: if we've already begun a run.
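In practice begin() is invoked by the training loop; the monitor itself is handed to an estimator's fit() call. A minimal sketch follows, assuming a StopAtStep constructor argument named num_steps and a monitors= keyword on fit(); both names are assumptions and should be checked against the installed contrib.learn version.

    # Sketch: stop training after a fixed number of global steps via a monitor.
    # num_steps and monitors= are assumptions about this contrib.learn version.
    import tensorflow as tf

    stop_monitor = tf.contrib.learn.monitors.StopAtStep(num_steps=1000)
    regressor = tf.contrib.learn.LinearRegressor(
        feature_columns=[tf.contrib.layers.real_valued_column("x")])
    regressor.fit(input_fn=my_train_input_fn,   # my_train_input_fn: user-supplied
                  monitors=[stop_monitor])      # fit() then calls begin() on the monitor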

tf.contrib.learn.monitors.GraphDump

class tf.contrib.learn.monitors.GraphDump

Dumps almost all tensors in the graph at every step. Note: this is very expensive; prefer PrintTensor in production.
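Both monitors plug into fit() the same way, so swapping between them while debugging is cheap. A hedged sketch follows; the PrintTensor constructor arguments shown (tensor_names, every_n) are assumptions about this contrib version.

    # Sketch: dump everything only in short local debugging runs, print a few
    # named tensors otherwise. Argument names are assumptions for this version.
    import tensorflow as tf

    debug_monitor = tf.contrib.learn.monitors.GraphDump()       # every tensor, every step
    cheap_monitor = tf.contrib.learn.monitors.PrintTensor(
        tensor_names=["loss"], every_n=100)                     # "loss" is a hypothetical name

    estimator.fit(input_fn=my_train_input_fn,      # estimator: any contrib.learn estimator
                  monitors=[cheap_monitor])        # swap in debug_monitor for short runs only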

tf.contrib.distributions.DirichletMultinomial.prob()

tf.contrib.distributions.DirichletMultinomial.prob(value, name='prob')

Probability density/mass function (depending on is_continuous).

Additional documentation from DirichletMultinomial:

For each batch of counts [n_1,...,n_k], P[counts] is the probability that after sampling n draws from this Dirichlet Multinomial distribution, the number of draws falling in class j is n_j. Note that different sequences of draws can result in the same counts, thus the probability includes a combinatorial coefficient.
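As a concrete illustration of the combinatorial coefficient: with alpha = [1, 1] and n = 2 draws, the counts [1, 1] can arise from two draw orders, so prob([1, 1]) evaluates to 2 * Beta([2, 2]) / Beta([1, 1]) = 1/3 rather than 1/6. A rough sketch, assuming constructor arguments named n and alpha (an assumption for this version):

    # Sketch: counts [1., 1.] with n = 2 include the coefficient 2!/(1!*1!) = 2.
    import tensorflow as tf

    dist = tf.contrib.distributions.DirichletMultinomial(n=2., alpha=[1., 1.])
    with tf.Session() as sess:
        print(sess.run(dist.prob([1., 1.])))   # ~0.3333, i.e. 2 * (1/6) / 1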

tf.contrib.learn.TensorFlowEstimator.get_tensor()

tf.contrib.learn.TensorFlowEstimator.get_tensor(name)

Returns tensor by name.

Args:
  name: string, name of the tensor.

Returns:
  Tensor.
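A small usage sketch; the tensor name shown is hypothetical and depends on how the underlying model names its graph nodes.

    # Sketch: look up a tensor in a fitted TensorFlowEstimator's graph by name.
    # "logistic_regression/weights" is a hypothetical name used for illustration.
    weights_tensor = estimator.get_tensor("logistic_regression/weights")
    print(weights_tensor)   # a tf.Tensor handle from the estimator's graph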

tf.contrib.distributions.TransformedDistribution.cdf()

tf.contrib.distributions.TransformedDistribution.cdf(value, name='cdf')

Cumulative distribution function. Given random variable X, the cumulative distribution function cdf is:

  cdf(x) := P[X <= x]

Args:
  value: float or double Tensor.
  name: The name to give this op.

Returns:
  cdf: a Tensor of shape sample_shape(x) + self.batch_shape with values of type self.dtype.
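cdf() is shared by the Distribution base class, so a simpler distribution illustrates the semantics. The sketch below uses Normal and assumes the older mu/sigma constructor arguments of this contrib release.

    # Sketch: cdf(x) = P[X <= x]; for a standard Normal, cdf(0.) is 0.5.
    import tensorflow as tf

    dist = tf.contrib.distributions.Normal(mu=0., sigma=1.)
    with tf.Session() as sess:
        print(sess.run(dist.cdf(0.)))   # ~0.5, half the mass lies below the mean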

tf.contrib.bayesflow.stochastic_tensor.InverseGammaWithSoftplusAlphaBetaTensor.value()

tf.contrib.bayesflow.stochastic_tensor.InverseGammaWithSoftplusAlphaBetaTensor.value(name='value')

tf.contrib.learn.DNNRegressor

class tf.contrib.learn.DNNRegressor

A regressor for TensorFlow DNN models.

Example:

  education = sparse_column_with_hash_bucket(column_name="education",
                                             hash_bucket_size=1000)
  occupation = sparse_column_with_hash_bucket(column_name="occupation",
                                              hash_bucket_size=1000)

  education_emb = embedding_column(sparse_id_column=education, dimension=16,
                                   combiner="sum")
  occupation_emb = embedding_column(sparse_id_column=occupation, dimension=16,
                                    combiner="sum")
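The excerpt above stops mid-example. A hedged sketch of how such columns are typically fed into the estimator follows; the hidden_units values and input function names are illustrative placeholders, not taken from this document.

    # Sketch: wiring the embedding columns into a DNNRegressor and training it.
    estimator = DNNRegressor(feature_columns=[education_emb, occupation_emb],
                             hidden_units=[1024, 512, 256])

    estimator.fit(input_fn=input_fn_train)        # input_fn_train: user-supplied
    estimator.evaluate(input_fn=input_fn_eval)    # returns a dict of metrics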

tf.contrib.bayesflow.stochastic_tensor.ExponentialTensor.input_dict

tf.contrib.bayesflow.stochastic_tensor.ExponentialTensor.input_dict

tf.contrib.bayesflow.stochastic_tensor.ExponentialTensor.name

tf.contrib.bayesflow.stochastic_tensor.ExponentialTensor.name

tf.contrib.graph_editor.select_ops_and_ts()

tf.contrib.graph_editor.select_ops_and_ts(*args, **kwargs)

Helper to select operations and tensors.

Args:
  *args: list of 1) regular expressions (compiled or not) or 2) (array of) tf.Operation 3) (array of) tf.Tensor. Regular expressions matching tensors must start with the comment "(?#ts)", for instance: "(?#ts)^foo/.*".
  **kwargs: 'graph': tf.Graph in which to perform the regex query. This is required when using regex. 'positive_filter': an elem is selected only if positive_filter(elem) is True.
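A rough usage sketch; the assumption that the helper returns an (ops, tensors) pair is not stated in the excerpt above and should be checked against the graph_editor source.

    # Sketch: select ops under "foo/" and tensors under "foo/" in one call.
    import tensorflow as tf
    from tensorflow.contrib import graph_editor as ge

    g = tf.Graph()
    with g.as_default():
        a = tf.constant(1.0, name="foo/a")
        b = tf.constant(2.0, name="foo/b")
        c = tf.add(a, b, name="bar/c")

    ops, ts = ge.select_ops_and_ts("^foo/.*",        # plain regex: matches operations
                                   "(?#ts)^foo/.*",  # "(?#ts)" prefix: matches tensors
                                   graph=g)          # graph= is required with regexes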