tf.assert_non_positive()

tf.assert_non_positive(x, data=None, summarize=None, message=None, name=None)

Assert the condition x <= 0 holds element-wise.

Example of adding a dependency to an operation:

  with tf.control_dependencies([tf.assert_non_positive(x)]):
    output = tf.reduce_sum(x)

Example of adding a dependency to the tensor being checked:

  x = tf.with_dependencies([tf.assert_non_positive(x)], x)

Non-positive means, for every element x[i] of x, we have x[i] <= 0. If x is empty this is trivially satisfied.

Args:
  x: Numeric Tensor.
  data: The tensors to print out if the condition is False. Defaults to error message and first few entries of x.
  summarize: Print this many entries of each tensor.
  message: A string to prefix to the default message.
  name: A name for this operation (optional). Defaults to "assert_non_positive".

Returns:
  Op raising InvalidArgumentError unless x is all non-positive.
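A minimal runnable sketch (graph-mode API of this release; the input values are illustrative):

  import tensorflow as tf

  x = tf.constant([-1.0, -2.0, 0.0])
  # The assert op raises InvalidArgumentError at run time if any element is > 0.
  with tf.control_dependencies([tf.assert_non_positive(x)]):
    output = tf.reduce_sum(x)

  with tf.Session() as sess:
    print(sess.run(output))  # -3.0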

tf.contrib.training.NextQueuedSequenceBatch.key

tf.contrib.training.NextQueuedSequenceBatch.key

The key names of the given truncated unrolled examples. The format of the key is:

  "%05d_of_%05d:%s" % (sequence, sequence_count, original_key)

where original_key is the unique key read in by the prefetcher.

Returns:
  A string vector of length batch_size, the keys.
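A small sketch of what such a key looks like, using the documented format string (the values here are made up for illustration):

  sequence, sequence_count, original_key = 2, 5, "example_a"
  key = "%05d_of_%05d:%s" % (sequence, sequence_count, original_key)
  # key == "00002_of_00005:example_a"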

tf.contrib.learn.LinearClassifier

class tf.contrib.learn.LinearClassifier

Linear classifier model.

Train a linear model to classify instances into one of multiple possible classes. When the number of possible classes is 2, this is binary classification.

Example:

  education = sparse_column_with_hash_bucket(column_name="education",
                                             hash_bucket_size=1000)
  occupation = sparse_column_with_hash_bucket(column_name="occupation",
                                              hash_bucket_size=1000)
  education_x_occupation = crossed_column(columns=[education, occupation],
                                          hash_bucket_size=10000)

  # Estimator using the default optimizer.
  estimator = LinearClassifier(
      feature_columns=[occupation, education_x_occupation])
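A hedged, minimal runnable sketch of driving such an estimator end to end (feature names and data are made up; this uses the contrib.learn-era API, including tf.SparseTensor's pre-1.0 shape argument):

  import tensorflow as tf
  from tensorflow.contrib import layers, learn

  occupation = layers.sparse_column_with_hash_bucket("occupation",
                                                     hash_bucket_size=10)

  def input_fn():
    # Two toy examples with one sparse feature each.
    features = {
        "occupation": tf.SparseTensor(
            indices=[[0, 0], [1, 0]],
            values=["engineer", "teacher"],
            shape=[2, 1]),
    }
    labels = tf.constant([[1], [0]])
    return features, labels

  estimator = learn.LinearClassifier(feature_columns=[occupation])
  estimator.fit(input_fn=input_fn, steps=10)
  print(estimator.evaluate(input_fn=input_fn, steps=1))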

tf.contrib.distributions.RegisterKL.__call__()

tf.contrib.distributions.RegisterKL.__call__(kl_fn)

Perform the KL registration.

Args:
  kl_fn: The function to use for the KL divergence.

Returns:
  kl_fn

Raises:
  TypeError: if kl_fn is not a callable.
  ValueError: if a KL divergence function has already been registered for the given argument classes.
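Because __call__ returns kl_fn, RegisterKL is normally used as a decorator. A hedged sketch (MyDist is a hypothetical subclass introduced here so the registration does not collide with the built-in Normal/Normal entry):

  import tensorflow as tf
  from tensorflow.contrib import distributions

  class MyDist(distributions.Normal):  # hypothetical, for illustration
    pass

  @distributions.RegisterKL(MyDist, MyDist)
  def _kl_my_dist(dist_a, dist_b, name=None):
    # Closed-form KL between two normals (valid since MyDist subclasses Normal).
    var_a = tf.square(dist_a.sigma)
    var_b = tf.square(dist_b.sigma)
    return (tf.log(dist_b.sigma / dist_a.sigma)
            + (var_a + tf.square(dist_a.mu - dist_b.mu)) / (2.0 * var_b)
            - 0.5)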

tf.get_session_handle()

tf.get_session_handle(data, name=None)

Return the handle of data.

This is EXPERIMENTAL and subject to change.

Keep data "in-place" in the runtime and create a handle that can be used to retrieve data in a subsequent run(). Combined with get_session_tensor, we can keep a tensor produced in one run call in place, and use it as the input in a future run call.

Args:
  data: A tensor to be stored in the session.
  name: Optional name prefix for the return tensor.

Returns:
  A scalar string tensor representing a unique handle for data.
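A hedged sketch of the round trip with get_session_tensor (values are illustrative; tf.mul is the pre-1.0 name for multiplication):

  import tensorflow as tf

  sess = tf.Session()

  # First run: compute a value and keep it in the runtime via a handle.
  a = tf.constant(10.0)
  b = tf.constant(5.0)
  c = tf.mul(a, b)
  h = sess.run(tf.get_session_handle(c))

  # Later run: feed the handle back in and keep computing from it.
  p, x = tf.get_session_tensor(h.handle, tf.float32)
  y = tf.mul(x, 10.0)
  print(sess.run(y, feed_dict={p: h.handle}))  # 500.0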

tf.contrib.distributions.MultivariateNormalFull.sigma

tf.contrib.distributions.MultivariateNormalFull.sigma

Dense (batch) covariance matrix, if available.
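A minimal sketch (parameter values are made up; MultivariateNormalFull takes a mean vector mu and a full covariance matrix sigma):

  import tensorflow as tf
  from tensorflow.contrib import distributions

  mu = tf.constant([1.0, 2.0])
  sigma = tf.constant([[1.0, 0.5],
                       [0.5, 2.0]])
  dist = distributions.MultivariateNormalFull(mu, sigma)
  with tf.Session() as sess:
    print(sess.run(dist.sigma))  # the dense 2x2 covariance matrix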

tf.contrib.graph_editor.transform_op_if_inside_handler()

tf.contrib.graph_editor.transform_op_if_inside_handler(info, op, keep_if_possible=True)

Transform an optional op only if it is inside the subgraph.

This handler is typically used to handle original ops: it is fine to keep them if they are inside the subgraph; otherwise they are just ignored.

Args:
  info: Transform._Info instance.
  op: the optional op to transform (or ignore).
  keep_if_possible: re-attach to the original op if possible, that is, if the source graph and the destination graph are the same.

Returns:
  The transformed op or None.
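A hedged sketch of where this handler plugs in (this assumes the contrib.graph_editor Transformer of this era exposes a transform_original_op_handler attribute defaulting to this function):

  from tensorflow.contrib import graph_editor as ge

  transformer = ge.Transformer()
  # Keep original ops when the source and destination graphs are the same.
  transformer.transform_original_op_handler = ge.transform_op_if_inside_handler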

tf.decode_raw()

tf.decode_raw(bytes, out_type, little_endian=None, name=None)

Reinterpret the bytes of a string as a vector of numbers.

Args:
  bytes: A Tensor of type string. All the elements must have the same length.
  out_type: A tf.DType from: tf.float32, tf.float64, tf.int32, tf.uint8, tf.int16, tf.int8, tf.int64.
  little_endian: An optional bool. Defaults to True. Whether the input bytes are in little-endian order. Ignored for out_type values that are stored in a single byte like uint8.
  name: A name for the operation (optional).

Returns:
  A Tensor of type out_type, with one more dimension than the input bytes. The added dimension will have size equal to the length of the elements of bytes divided by the number of bytes to represent out_type.
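A minimal runnable sketch (the byte string is made up for illustration):

  import tensorflow as tf

  records = tf.constant([b"\x01\x02\x03\x04"])
  # Reinterpret each 4-byte string as four uint8 values.
  values = tf.decode_raw(records, tf.uint8)
  with tf.Session() as sess:
    print(sess.run(values))  # [[1 2 3 4]]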

tf.contrib.bayesflow.stochastic_tensor.MultivariateNormalDiagTensor.__init__()

tf.contrib.bayesflow.stochastic_tensor.MultivariateNormalDiagTensor.__init__(name=None, dist_value_type=None, loss_fn=score_function, **dist_args)
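A hedged sketch of constructing this stochastic tensor (this assumes dist_args are forwarded to MultivariateNormalDiag, whose parameters in this release are mu and diag_stdev; values are illustrative):

  import tensorflow as tf
  from tensorflow.contrib.bayesflow import stochastic_tensor as st

  mu = tf.zeros([3])
  diag_stdev = tf.ones([3])
  dist_tensor = st.MultivariateNormalDiagTensor(mu=mu, diag_stdev=diag_stdev)
  # The stochastic tensor behaves like a Tensor holding a sample.
  sample = dist_tensor.value()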

tf.contrib.distributions.DirichletMultinomial.mean()

tf.contrib.distributions.DirichletMultinomial.mean(name='mean')

Mean.
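A minimal sketch (parameter values are illustrative; n is the number of trials and alpha the Dirichlet concentration):

  import tensorflow as tf
  from tensorflow.contrib import distributions

  dist = distributions.DirichletMultinomial(n=10.0, alpha=[1.0, 2.0, 3.0])
  with tf.Session() as sess:
    # Mean is n * alpha / sum(alpha) -> [10/6, 20/6, 30/6]
    print(sess.run(dist.mean()))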