tf.contrib.distributions.QuantizedDistribution.log_prob()

tf.contrib.distributions.QuantizedDistribution.log_prob(value, name='log_prob') Log probability density/mass function (depending on is_continuous).

Additional documentation from QuantizedDistribution: for whole numbers y,

  P[Y = y] := P[X <= lower_cutoff],     if y == lower_cutoff,
           := P[X > upper_cutoff - 1],  if y == upper_cutoff,
           := 0,                        if y < lower_cutoff or y > upper_cutoff,
           := P[y - 1 < X <= y],        all other y.

The base distribution's log_cdf method must be defined on y - 1.
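A minimal sketch of the piecewise rule above, computed directly from a base Normal's cdf rather than through QuantizedDistribution itself; the Normal(mu=..., sigma=...) parameter names and the example cutoffs are assumptions, not part of the documented signature.

  import tensorflow as tf

  ds = tf.contrib.distributions

  # Base continuous distribution X (mu/sigma parameter names assumed here).
  normal = ds.Normal(mu=0., sigma=1.)

  lower_cutoff, upper_cutoff = -3.0, 3.0

  # Interior whole number y: P[Y = y] = P[y - 1 < X <= y] = cdf(y) - cdf(y - 1).
  y = tf.constant(1.0)
  interior_mass = normal.cdf(y) - normal.cdf(y - 1.0)

  # At the cutoffs the tails are folded in, per the definition above.
  mass_at_lower = normal.cdf(lower_cutoff)              # P[X <= lower_cutoff]
  mass_at_upper = 1.0 - normal.cdf(upper_cutoff - 1.0)  # P[X > upper_cutoff - 1]

  with tf.Session() as sess:
      print(sess.run([interior_mass, mass_at_lower, mass_at_upper]))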

tf.assert_non_negative()

tf.assert_non_negative(x, data=None, summarize=None, message=None, name=None) Assert the condition x >= 0 holds element-wise.

Example of adding a dependency to an operation:

  with tf.control_dependencies([tf.assert_non_negative(x)]):
    output = tf.reduce_sum(x)

Example of adding dependency to the tensor being checked:

  x = tf.with_dependencies([tf.assert_non_negative(x)], x)

Non-negative means, for every element x[i] of x, we have x[i] >= 0. If x is empty this is trivially satisfied.
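A minimal runnable sketch of the control-dependency pattern above (graph-mode TensorFlow; the values are illustrative only):

  import tensorflow as tf

  x = tf.constant([1.0, 2.0, 3.0])

  # The reduction only runs once the assertion op has executed.
  with tf.control_dependencies([tf.assert_non_negative(x)]):
      output = tf.reduce_sum(x)

  with tf.Session() as sess:
      print(sess.run(output))  # 6.0

  # If x contained a negative element, running the same graph would raise
  # tf.errors.InvalidArgumentError at run time, not at graph-construction time.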

tf.contrib.graph_editor.SubGraphView.remap_inputs()

tf.contrib.graph_editor.SubGraphView.remap_inputs(new_input_indices) Remap the inputs of the subgraph. If the inputs of the original subgraph are [t0, t1, t2], remapping to [2, 0] will create a new instance whose inputs are [t2, t0]. Note that this only modifies the view: the underlying tf.Graph is not affected.

Args:
  new_input_indices: an iterable of integers representing a mapping between the old inputs and the new ones. This mapping can be under-complete and must be without repetitions.
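A small sketch of the remapping, assuming ge.sgv is available as the SubGraphView factory in this contrib version:

  import tensorflow as tf

  ge = tf.contrib.graph_editor

  a = tf.placeholder(tf.float32, name="a")
  b = tf.placeholder(tf.float32, name="b")
  c = tf.add(a, b, name="c")

  sgv = ge.sgv(c.op)                        # view over the single add op
  print([t.name for t in sgv.inputs])       # ['a:0', 'b:0']

  # Remapping only changes the view; the underlying tf.Graph is untouched.
  swapped = sgv.remap_inputs([1, 0])
  print([t.name for t in swapped.inputs])   # ['b:0', 'a:0']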

tf.contrib.learn.TensorFlowEstimator.get_params()

tf.contrib.learn.TensorFlowEstimator.get_params(deep=True) Get parameters for this estimator.

Args:
  deep: boolean, optional. If True, will return the parameters for this estimator and contained subobjects that are estimators.

Returns:
  params: mapping of string to any. Parameter names mapped to their values.
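A hedged sketch, assuming TensorFlowEstimator accepts a model_fn and n_classes as in this contrib.learn generation; my_model is a hypothetical model function, not part of the documented API.

  import tensorflow as tf
  from tensorflow.contrib import learn

  def my_model(x, y):
      # Hypothetical model function: plain linear regression.
      return learn.models.linear_regression(x, y)

  est = learn.TensorFlowEstimator(model_fn=my_model, n_classes=0)

  # get_params() exposes the constructor arguments, scikit-learn style.
  for name, value in est.get_params(deep=True).items():
      print(name, value)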

tf.nn.rnn_cell.OutputProjectionWrapper.output_size

Integer or TensorShape: size of outputs produced by this cell. For OutputProjectionWrapper this is the projection size passed to the wrapper's constructor, not the wrapped cell's own output size.
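For example, wrapping a 128-unit GRU cell with a projection down to 10 units (a minimal sketch; the cell sizes are arbitrary):

  import tensorflow as tf

  base_cell = tf.nn.rnn_cell.GRUCell(num_units=128)
  projected = tf.nn.rnn_cell.OutputProjectionWrapper(base_cell, 10)

  print(base_cell.output_size)   # 128
  print(projected.output_size)   # 10, the projection size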

tf.contrib.distributions.ExponentialWithSoftplusLam.sample()

tf.contrib.distributions.ExponentialWithSoftplusLam.sample(sample_shape=(), seed=None, name='sample') Generate samples of the specified shape. Note that a call to sample() without arguments will generate a single sample.

Args:
  sample_shape: 0D or 1D int32 Tensor. Shape of the generated samples.
  seed: Python integer seed for RNG.
  name: name to give to the op.

Returns:
  samples: a Tensor with prepended dimensions sample_shape.
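A minimal sketch; the lam constructor parameter name is assumed from this contrib version (it is passed through a softplus internally, so unconstrained values are fine):

  import tensorflow as tf

  ds = tf.contrib.distributions

  # Three rate parameters; softplus maps them to positive rates internally.
  dist = ds.ExponentialWithSoftplusLam(lam=[-2.0, 0.0, 3.0])

  # sample_shape is prepended to the batch shape: result has shape [5, 3].
  samples = dist.sample(sample_shape=[5], seed=42)

  with tf.Session() as sess:
      print(sess.run(samples).shape)  # (5, 3)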

tf.contrib.losses.sigmoid_cross_entropy()

tf.contrib.losses.sigmoid_cross_entropy(logits, multi_class_labels, weight=1.0, label_smoothing=0, scope=None) Creates a cross-entropy loss using tf.nn.sigmoid_cross_entropy_with_logits. weight acts as a coefficient for the loss. If a scalar is provided, then the loss is simply scaled by the given value. If weight is a tensor of size [batch_size], then the loss weights apply to each corresponding sample. If label_smoothing is nonzero, smooth the labels towards 1/2:

  new_multiclass_labels = multi_class_labels * (1 - label_smoothing) + 0.5 * label_smoothing
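A short sketch using the signature above; the logits and multi-hot labels are illustrative values with matching shapes:

  import tensorflow as tf

  logits = tf.constant([[2.0, -1.0, 0.5],
                        [-0.3, 1.2, -2.0]])
  labels = tf.constant([[1.0, 0.0, 1.0],
                        [0.0, 1.0, 0.0]])

  # Scalar weight plus a little smoothing of the labels towards 1/2.
  loss = tf.contrib.losses.sigmoid_cross_entropy(
      logits, labels, weight=1.0, label_smoothing=0.1)

  with tf.Session() as sess:
      print(sess.run(loss))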

tf.contrib.distributions.StudentT.allow_nan_stats

tf.contrib.distributions.StudentT.allow_nan_stats Python boolean describing behavior when a stat is undefined. Stats return +/- infinity when it makes sense. E.g., the variance of a Cauchy distribution is infinity. However, sometimes the statistic is undefined, e.g., if a distribution's pdf does not achieve a maximum within the support of the distribution, the mode is undefined. If the mean is undefined, then by definition the variance is undefined. E.g., the mean for Student's T for df = 1 is undefined (there is no clear way to say it is either + or - infinity), so the variance = E[(X - mean)**2] is also undefined.
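A short sketch of the two behaviors, assuming the df/mu/sigma constructor parameter names of this contrib version:

  import tensorflow as tf

  ds = tf.contrib.distributions

  # Default allow_nan_stats=True: undefined stats are returned as NaN.
  permissive = ds.StudentT(df=1., mu=0., sigma=1., allow_nan_stats=True)

  # allow_nan_stats=False: asking for an undefined stat raises an error instead.
  strict = ds.StudentT(df=1., mu=0., sigma=1., allow_nan_stats=False)

  with tf.Session() as sess:
      print(sess.run(permissive.mean()))  # nan, the mean is undefined at df = 1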

tensorflow::RandomAccessFile

A file abstraction for randomly reading the contents of a file.

Member Details

tensorflow::RandomAccessFile::RandomAccessFile()

tensorflow::RandomAccessFile::~RandomAccessFile()

virtual Status tensorflow::RandomAccessFile::Read(uint64 offset, size_t n, StringPiece *result, char *scratch) const =0

Reads up to n bytes from the file starting at offset. scratch[0..n-1] may be written by this routine. Sets *result to the data that was read (including if fewer than n bytes were successfully read). May set *result to point at data in scratch[0..n-1], so scratch[0..n-1] must be live when *result is used.

tf.contrib.metrics.streaming_pearson_correlation()

tf.contrib.metrics.streaming_pearson_correlation(predictions, labels, weights=None, metrics_collections=None, updates_collections=None, name=None) Computes the Pearson correlation coefficient between predictions and labels. The streaming_pearson_correlation function delegates to streaming_covariance the tracking of three [co]variances:

- streaming_covariance(predictions, labels), i.e. covariance
- streaming_covariance(predictions, predictions), i.e. variance
- streaming_covariance(labels, labels), i.e. variance

The product-moment correlation ultimately returned is cov(predictions, labels) / sqrt(var(predictions) * var(labels)).
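A hedged sketch of the streaming usage: the metric returns a (value, update_op) pair, and its accumulators live in local variables (initialized here with the tf.initialize_local_variables of this TF generation); the batch values are illustrative.

  import tensorflow as tf

  predictions = tf.placeholder(tf.float32, [None])
  labels = tf.placeholder(tf.float32, [None])

  pearson_r, update_op = tf.contrib.metrics.streaming_pearson_correlation(
      predictions, labels)

  with tf.Session() as sess:
      sess.run(tf.initialize_local_variables())
      # Fold two batches into the running [co]variance accumulators.
      for p, l in [([1., 2., 3.], [1.1, 1.9, 3.2]),
                   ([4., 5., 6.], [3.8, 5.1, 6.2])]:
          sess.run(update_op, {predictions: p, labels: l})
      print(sess.run(pearson_r))  # close to 1.0 for these nearly linear batches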