tf.contrib.layers.optimize_loss(loss, global_step, learning_rate, optimizer, gradient_noise_scale=None, gradient_multipliers=None, clip_gradients=None, learning_rate_decay_fn=None, ...) Given loss and parameters for optimizer, returns a training op.
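A minimal NumPy sketch of what optimize_loss does conceptually (compute gradients, optionally clip them by global norm as with the clip_gradients argument, then apply an optimizer update); the function names below are illustrative, not the tf.contrib implementation.

```python
import numpy as np

def clip_by_global_norm(grads, clip_norm):
    # Scale all gradients so their combined L2 norm is at most clip_norm,
    # mirroring the effect of optimize_loss's clip_gradients argument.
    global_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    scale = min(1.0, clip_norm / (global_norm + 1e-12))
    return [g * scale for g in grads]

def sgd_step(params, grads, learning_rate, clip_gradients=None):
    # One plain gradient-descent update, the simplest optimizer
    # optimize_loss can wrap.
    if clip_gradients is not None:
        grads = clip_by_global_norm(grads, clip_gradients)
    return [p - learning_rate * g for p, g in zip(params, grads)]
```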
tf.contrib.layers.xavier_initializer(uniform=True, seed=None, dtype=tf.float32) Returns an initializer performing "Xavier" initialization for weights.
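A NumPy sketch of what the uniform=True case computes: weights drawn from U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out)), which keeps activation variance roughly constant across layers. The helper name is illustrative.

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, seed=None):
    # Xavier/Glorot uniform initialization:
    # limit = sqrt(6 / (fan_in + fan_out)), sample from U(-limit, limit).
    rng = np.random.default_rng(seed)
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))
```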
tf.contrib.layers.summarize_tensors(tensors, summarizer=summarize_tensor) Summarize a set of tensors.
tf.contrib.layers.batch_norm(*args, **kwargs) Adds a Batch Normalization layer from http://arxiv.org/abs/1502.03167.
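A NumPy sketch of the core batch-normalization computation (training-mode statistics only; the real layer also tracks moving averages for inference). gamma and beta stand in for the layer's learned scale and shift parameters.

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-3):
    # Normalize each feature over the batch dimension to zero mean and
    # unit variance, then apply the learned scale (gamma) and shift (beta).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta
```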
tf.contrib.layers.flatten(*args, **kwargs) Flattens the input while maintaining the batch_size. Assumes that the first dimension represents the batch.
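The flatten operation amounts to a single reshape that preserves the leading batch dimension; a NumPy sketch:

```python
import numpy as np

def flatten(x):
    # Collapse all dimensions except the first (batch) one,
    # e.g. (batch, h, w, c) -> (batch, h * w * c).
    return x.reshape(x.shape[0], -1)
```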
tf.contrib.layers.convolution2d(*args, **kwargs) Adds a 2D convolution followed by an optional batch_norm layer.
tf.contrib.layers.l1_regularizer(scale, scope=None) Returns a function that can be used to apply L1 regularization to weights.
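The returned function computes scale * sum(|w|), the L1 penalty added to the loss to encourage sparse weights. A NumPy sketch of the same closure pattern:

```python
import numpy as np

def l1_regularizer(scale):
    # Returns a function computing scale * sum(|w|) for a weight tensor.
    def l1(weights):
        return scale * np.sum(np.abs(weights))
    return l1
```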
tf.contrib.layers.convolution2d_in_plane(*args, **kwargs) Performs the same in-plane convolution to each channel independently.
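"In-plane" means one shared 2D kernel is slid over each channel separately, with no mixing across channels. A NumPy sketch (valid padding, stride 1, single image; like TF, this computes cross-correlation rather than flipping the kernel):

```python
import numpy as np

def convolution_in_plane(x, kernel):
    # Apply the same 2D kernel to every channel independently.
    # x: (h, w, channels); kernel: (kh, kw).
    h, w, c = x.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1, c))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = x[i:i + kh, j:j + kw, :]  # (kh, kw, c)
            # Contract over the spatial axes, leaving the channel axis.
            out[i, j, :] = np.tensordot(kernel, patch, axes=([0, 1], [0, 1]))
    return out
```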
tf.contrib.layers.summarize_activation(op) Summarize an activation. This applies the given activation and adds useful summaries specific to the activation.
tf.contrib.layers.sum_regularizer(regularizer_list, scope=None) Returns a function that applies the sum of multiple regularizers.
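The combined regularizer simply evaluates each regularizer in the list on the same weights and sums the penalties; a NumPy sketch:

```python
import numpy as np

def sum_regularizer(regularizer_list):
    # Returns a function that applies every regularizer in the list
    # to the weights and adds up the resulting penalties.
    def summed(weights):
        return sum(reg(weights) for reg in regularizer_list)
    return summed
```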