tf.contrib.layers.optimize_loss()

tf.contrib.layers.optimize_loss(loss, global_step, learning_rate, optimizer, gradient_noise_scale=None, gradient_multipliers=None, clip_gradients=None, learning_rate_decay_fn=None) Given loss and parameters for optimizer, returns a training op.

2016-10-14 13:05:23
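To illustrate one of the knobs above: when clip_gradients is set, gradients are rescaled so their global norm does not exceed the threshold. A minimal NumPy sketch of that behavior (assuming it mirrors tf.clip_by_global_norm semantics; not the library code itself):

```python
import numpy as np

def clip_by_global_norm(grads, clip_norm):
    # Global norm across all gradient tensors.
    global_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    # Rescale every gradient by the same factor if the norm is too large.
    if global_norm > clip_norm:
        grads = [g * (clip_norm / global_norm) for g in grads]
    return grads, global_norm

grads = [np.array([3.0, 4.0])]          # global norm is 5.0
clipped, norm = clip_by_global_norm(grads, 1.0)
# clipped[0] → [0.6, 0.8], norm → 5.0
```

Scaling all gradients by one shared factor preserves their relative directions, unlike per-tensor clipping.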
tf.contrib.layers.xavier_initializer()

tf.contrib.layers.xavier_initializer(uniform=True, seed=None, dtype=tf.float32) Returns an initializer performing "Xavier" initialization for weights.

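A NumPy sketch of what the uniform Xavier (Glorot) variant computes: weights are drawn from U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out)). This is an illustration of the scheme, not the TF initializer:

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, seed=None):
    rng = np.random.default_rng(seed)
    # Bound chosen so the variance of activations stays roughly constant
    # across layers: limit = sqrt(6 / (fan_in + fan_out)).
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

w = xavier_uniform(256, 128, seed=0)
# w.shape → (256, 128); all entries lie in [-0.125, 0.125]
```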
tf.contrib.layers.summarize_tensors()

tf.contrib.layers.summarize_tensors(tensors, summarizer=summarize_tensor) Summarize a set of tensors.

tf.contrib.layers.batch_norm()

tf.contrib.layers.batch_norm(*args, **kwargs) Adds a Batch Normalization layer from http://arxiv.org/abs/1502.03167.

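The core transform behind batch normalization can be sketched in NumPy: normalize each feature by its batch statistics, then apply a learned scale (gamma) and shift (beta). This omits the moving averages and training/inference distinction the real layer manages:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-3):
    # Per-feature statistics over the batch (first) axis.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    # Normalize, then scale and shift.
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.array([[1.0, 2.0], [3.0, 4.0]])
y = batch_norm(x, gamma=1.0, beta=0.0)
# Each column of y now has mean 0 and (approximately) unit variance.
```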
tf.contrib.layers.flatten()

tf.contrib.layers.flatten(*args, **kwargs) Flattens the input while maintaining the batch_size. Assumes that the first dimension represents the batch.

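The equivalent operation is a reshape that keeps the first (batch) dimension and collapses the rest, sketched here in NumPy:

```python
import numpy as np

def flatten(x):
    # Keep the batch dimension, collapse everything else into one axis.
    return x.reshape(x.shape[0], -1)

x = np.zeros((32, 7, 7, 64))
y = flatten(x)
# y.shape → (32, 3136), since 7 * 7 * 64 = 3136
```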
tf.contrib.layers.convolution2d()

tf.contrib.layers.convolution2d(*args, **kwargs) Adds a 2D convolution followed by an optional batch_norm layer.

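For intuition, the underlying convolution can be sketched in NumPy for a single channel and a "valid" (no padding) window; the real layer additionally handles batches, multiple channels, padding, strides, and an optional batch_norm:

```python
import numpy as np

def conv2d_valid(image, kernel):
    # Slide the kernel over every valid position and take the dot product.
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

img = np.arange(16.0).reshape(4, 4)
out = conv2d_valid(img, np.ones((2, 2)))
# out.shape → (3, 3); out[0, 0] = 0 + 1 + 4 + 5 = 10
```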
tf.contrib.layers.l1_regularizer()

tf.contrib.layers.l1_regularizer(scale, scope=None) Returns a function that can be used to apply L1 regularization to weights.

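Note the factory style: the call returns a function that computes the penalty. A NumPy sketch of what such a returned function evaluates (scale times the sum of absolute weights):

```python
import numpy as np

def l1_regularizer(scale):
    # Returns a function, mirroring the factory style of the API.
    def l1(weights):
        return scale * np.sum(np.abs(weights))
    return l1

reg = l1_regularizer(0.01)
penalty = reg(np.array([1.0, -2.0, 3.0]))
# penalty → 0.01 * (1 + 2 + 3) = 0.06
```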
tf.contrib.layers.convolution2d_in_plane()

tf.contrib.layers.convolution2d_in_plane(*args, **kwargs) Performs the same in-plane convolution to each channel independently.

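"In-plane" means one 2-D kernel is shared across all channels, each convolved on its own. A NumPy sketch of that idea for a single image with "valid" padding (an illustration, not the TF op):

```python
import numpy as np

def conv2d_in_plane(image, kernel):
    # image: (H, W, C); the single 2-D kernel is applied to every
    # channel independently, producing one output map per channel.
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow, image.shape[2]))
    for c in range(image.shape[2]):
        for i in range(oh):
            for j in range(ow):
                out[i, j, c] = np.sum(image[i:i + kh, j:j + kw, c] * kernel)
    return out

img = np.ones((4, 4, 3))
out = conv2d_in_plane(img, np.ones((2, 2)) / 4.0)
# out.shape → (3, 3, 3); averaging a constant image leaves it unchanged
```

This is useful for channel-agnostic filtering such as blurring or edge detection applied uniformly to every channel.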
tf.contrib.layers.summarize_activation()

tf.contrib.layers.summarize_activation(op) Summarize an activation. This applies the given activation and adds useful summaries specific to the activation.

tf.contrib.layers.sum_regularizer()

tf.contrib.layers.sum_regularizer(regularizer_list, scope=None) Returns a function that applies the sum of multiple regularizers.

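The same composition can be sketched in plain Python: combine several penalty functions into one that returns their sum (an illustration of the pattern, not the TF code):

```python
import numpy as np

def sum_regularizer(regularizer_list):
    # Returns a single function that sums the penalties of all
    # regularizers in the list.
    def summed(weights):
        return sum(r(weights) for r in regularizer_list)
    return summed

l1 = lambda w: 0.01 * np.sum(np.abs(w))    # L1-style penalty
l2 = lambda w: 0.005 * np.sum(w ** 2)      # L2-style penalty
reg = sum_regularizer([l1, l2])
penalty = reg(np.array([1.0, -2.0]))
# penalty → 0.01 * 3 + 0.005 * 5 = 0.055 (elastic-net-style combination)
```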