tf.contrib.losses.sigmoid_cross_entropy(logits, multi_class_labels, weight=1.0, label_smoothing=0, scope=None)
Creates a cross-entropy loss using tf.nn.sigmoid_cross_entropy_with_logits.
weight acts as a coefficient for the loss. If a scalar is provided, then the loss is simply scaled by the given value. If weight is a tensor of size [batch_size], then the loss weights apply to each corresponding sample.
If label_smoothing is nonzero, smooth the labels towards 1/2: new_multi_class_labels = multi_class_labels * (1 - label_smoothing) + 0.5 * label_smoothing
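Below is a minimal usage sketch, assuming a TensorFlow 1.x environment where tf.contrib is available; the logits and label values are made up for illustration. It shows a scalar weight scaling the whole loss and a small label_smoothing value pulling the targets towards 1/2, per the formula above.

```python
import tensorflow as tf  # assumes TensorFlow 1.x, where tf.contrib exists

# Hypothetical batch of 2 examples with 3 independent binary labels each.
logits = tf.constant([[1.0, -1.0, 3.0],
                      [0.5, 2.0, -0.5]])
multi_class_labels = tf.constant([[1.0, 0.0, 1.0],
                                  [0.0, 1.0, 0.0]])

# Scalar weight scales the whole loss; passing a [batch_size] tensor
# instead would weight each sample individually.
loss = tf.contrib.losses.sigmoid_cross_entropy(
    logits,
    multi_class_labels,
    weight=1.0,
    label_smoothing=0.1)  # targets become labels * 0.9 + 0.05

with tf.Session() as sess:
    print(sess.run(loss))
```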