tf.contrib.distributions.Distribution.__init__()

tf.contrib.distributions.Distribution.__init__(dtype, parameters, is_continuous, is_reparameterized, validate_args, allow_nan_stats, name=None)

Constructs the Distribution. This is a private method for subclass use.

Args:
  dtype: The type of the event samples. None implies no type-enforcement.
  parameters: Python dictionary of parameters used by this Distribution.
  is_continuous: Python boolean. If True, this Distribution is continuous over its supported domain.
  is_reparameterized: Python bool
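
A minimal sketch of how a subclass might forward its parameters to this constructor, assuming only the signature documented above; the MyDistribution class and its loc parameter are hypothetical:

    import tensorflow as tf
    from tensorflow.contrib import distributions

    class MyDistribution(distributions.Distribution):
        # Hypothetical subclass: converts its single parameter and hands the
        # bookkeeping arguments to the base constructor documented above.
        def __init__(self, loc, validate_args=False, allow_nan_stats=True,
                     name="MyDistribution"):
            loc = tf.convert_to_tensor(loc, name="loc")
            super(MyDistribution, self).__init__(
                dtype=loc.dtype,
                parameters={"loc": loc},
                is_continuous=True,
                is_reparameterized=False,
                validate_args=validate_args,
                allow_nan_stats=allow_nan_stats,
                name=name)
            self._loc = loc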

tf.contrib.graph_editor.SubGraphView.remap_inputs()

tf.contrib.graph_editor.SubGraphView.remap_inputs(new_input_indices)

Remap the inputs of the subgraph. If the inputs of the original subgraph are [t0, t1, t2], remapping to [2, 0] creates a new instance whose inputs are [t2, t0]. Note that this only modifies the view: the underlying tf.Graph is not affected.

Args:
  new_input_indices: an iterable of integers representing a mapping between the old inputs and the new ones. This mapping can be under-complete and must be without repetitions.
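
A small sketch, assuming ge.sgv() is used to build the view from an op (the placeholder names a and b are illustrative):

    import tensorflow as tf
    from tensorflow.contrib import graph_editor as ge

    # Two inputs feeding a single add op.
    a = tf.placeholder(tf.float32, name="a")
    b = tf.placeholder(tf.float32, name="b")
    c = tf.add(a, b, name="c")

    # View the add op as a subgraph; its inputs are [a, b].
    sgv = ge.sgv(c.op)

    # Swap the input order in the view only; the tf.Graph is unchanged.
    swapped = sgv.remap_inputs([1, 0])
    print(swapped.inputs)  # [b, a] from the view's perspective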

tf.contrib.distributions.QuantizedDistribution.variance()

tf.contrib.distributions.QuantizedDistribution.variance(name='variance') Variance.

tf.contrib.learn.infer()

tf.contrib.learn.infer(restore_checkpoint_path, output_dict, feed_dict=None)

Restore graph from restore_checkpoint_path and run output_dict tensors. If restore_checkpoint_path is supplied, restore from checkpoint. Otherwise, init all variables.

Args:
  restore_checkpoint_path: A string containing the path to a checkpoint to restore.
  output_dict: A dict mapping string names to Tensor objects to run. Tensors must all be from the same graph.
  feed_dict: dict object mapping Tensor objects to input
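
A hedged sketch of the restore-and-run pattern; the checkpoint path and the variable are placeholders for illustration, and the return value is assumed to mirror output_dict's keys:

    import tensorflow as tf
    from tensorflow.contrib import learn

    # A graph matching one saved earlier to /tmp/model.ckpt (the path is an
    # assumption for illustration).
    w = tf.Variable(tf.zeros([3]), name="w")
    y = tf.reduce_sum(w, name="y")

    # Restore the checkpoint and evaluate the requested tensors.
    results = learn.infer(
        restore_checkpoint_path="/tmp/model.ckpt",
        output_dict={"y": y})
    print(results["y"])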

tf.contrib.distributions.Uniform.get_batch_shape()

tf.contrib.distributions.Uniform.get_batch_shape() Shape of a single sample from a single event index as a TensorShape. Same meaning as batch_shape. May be only partially defined. Returns: batch_shape: TensorShape, possibly unknown.
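
For example, under the contrib-era a/b parameterization of Uniform (an assumption here), vector-valued parameters produce a batch of distributions and the batch shape reflects that:

    import tensorflow as tf
    from tensorflow.contrib import distributions

    # Three Uniform distributions built from vector-valued parameters.
    u = distributions.Uniform(a=[0.0, 0.0, 0.0], b=[1.0, 2.0, 3.0])
    print(u.get_batch_shape())  # TensorShape([3]) -- statically known here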

tf.contrib.distributions.Binomial.get_event_shape()

tf.contrib.distributions.Binomial.get_event_shape() Shape of a single sample from a single batch as a TensorShape. Same meaning as event_shape. May be only partially defined. Returns: event_shape: TensorShape, possibly unknown.
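
For instance, Binomial is a scalar-event distribution, so its event shape is empty; the n/p argument names below are assumed from the contrib-era API:

    import tensorflow as tf
    from tensorflow.contrib import distributions

    b = distributions.Binomial(n=5.0, p=0.3)
    print(b.get_event_shape())  # TensorShape([]) -- scalar events
    print(b.get_batch_shape())  # TensorShape([]) -- a single distribution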

tf.sparse_split()

tf.sparse_split(split_dim, num_split, sp_input, name=None)

Split a SparseTensor into num_split tensors along split_dim. If sp_input.shape[split_dim] is not an integer multiple of num_split, each of the slices 0 through shape[split_dim] % num_split - 1 gets one extra element along split_dim. For example, if split_dim = 1, num_split = 2 and the input is:

    input_tensor = shape = [2, 7]
    [    a   d e  ]
    [b c          ]

Graphically the output tensors are:

    output_tensor[0] = shape = [2, 4]
    [    a   ]
    [b c     ]

    output_tensor[1] = shape = [2, 3]
    [ d e ]
    [     ]
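
The example above can be reproduced roughly as follows; the letter values stand in for arbitrary entries:

    import tensorflow as tf

    # Sparse [2, 7] input from the example:
    #   [    a   d e  ]
    #   [b c          ]
    sp = tf.SparseTensor(
        [[0, 2], [0, 4], [0, 5], [1, 0], [1, 1]],  # indices
        ["a", "d", "e", "b", "c"],                 # values
        [2, 7])                                    # dense shape

    # Split along the column dimension into shapes [2, 4] and [2, 3].
    pieces = tf.sparse_split(split_dim=1, num_split=2, sp_input=sp)

    with tf.Session() as sess:
        for piece in sess.run(pieces):
            print(piece)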

tf.contrib.distributions.Gamma.param_static_shapes()

tf.contrib.distributions.Gamma.param_static_shapes(cls, sample_shape)

param_shapes with static (i.e. TensorShape) shapes.

Args:
  sample_shape: TensorShape or python list/tuple. Desired shape of a call to sample().

Returns:
  dict of parameter name to TensorShape.

Raises:
  ValueError: if sample_shape is a TensorShape and is not fully defined.
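
A brief sketch; the alpha/beta parameter names reflect the contrib-era Gamma parameterization and the exact shapes returned are not asserted here:

    import tensorflow as tf
    from tensorflow.contrib import distributions

    # Static parameter shapes needed for sample() to return a [5, 2]-shaped
    # batch of scalar Gamma draws.
    shapes = distributions.Gamma.param_static_shapes(sample_shape=[5, 2])
    print(shapes)  # dict of parameter name (e.g. "alpha", "beta") to TensorShape

    # Passing a partially defined TensorShape raises ValueError, as documented above.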

tf.OpError.error_code

tf.OpError.error_code The integer error code that describes the error.
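
For example, a failed file read surfaces as an OpError subclass whose error_code can be inspected (the path below is deliberately nonexistent):

    import tensorflow as tf

    try:
        with tf.Session() as sess:
            sess.run(tf.read_file("/nonexistent/path"))
    except tf.OpError as e:
        # Integer code corresponding to the error kind, e.g. NOT_FOUND.
        print(e.error_code)
        print(e.message)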

tf.contrib.learn.TensorFlowEstimator.fit()

tf.contrib.learn.TensorFlowEstimator.fit(x, y, steps=None, monitors=None, logdir=None)

Builds a neural network model from the provided model_fn and training data. Note: the first call constructs the graph and initializes variables; consecutive calls continue training the same model. This logic follows the partial_fit() interface in scikit-learn. To restart learning, create a new estimator.

Args:
  x: matrix or tensor of shape [n_samples, n_features...]. Can be an iterator that returns arrays of features.
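
A hedged end-to-end sketch; my_model, the n_classes argument, and learn.models.logistic_regression are assumptions about the contrib.learn API of that era, not a confirmed recipe:

    import numpy as np
    import tensorflow as tf
    from tensorflow.contrib import learn

    # Hypothetical model_fn: plain logistic regression over the features.
    def my_model(X, y):
        return learn.models.logistic_regression(X, y)

    # Toy data: 100 samples, 4 features, 3 classes.
    x_train = np.random.rand(100, 4).astype(np.float32)
    y_train = np.random.randint(0, 3, size=100)

    estimator = learn.TensorFlowEstimator(model_fn=my_model, n_classes=3)
    estimator.fit(x_train, y_train, steps=200)  # first call: builds graph, trains
    estimator.fit(x_train, y_train, steps=200)  # later calls: continue training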