tf.contrib.bayesflow.stochastic_tensor.NormalTensor.__init__()

tf.contrib.bayesflow.stochastic_tensor.NormalTensor.__init__(name=None, dist_value_type=None, loss_fn=score_function, **dist_args)
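
A minimal usage sketch, assuming a contrib-era TensorFlow 1.x build where the auto-generated NormalTensor class is available and the wrapped Normal distribution takes mu and sigma; those parameter names are an assumption, since the signature above only shows **dist_args.

import tensorflow as tf
from tensorflow.contrib.bayesflow import stochastic_tensor as st

mu = tf.zeros([3])
sigma = tf.ones([3])

# dist_args (here mu and sigma) are forwarded to the underlying Normal
# distribution. loss_fn (the score function by default) is used when building
# surrogate losses for gradient estimation through the sample.
normal_tensor = st.NormalTensor(mu=mu, sigma=sigma)

# A stochastic tensor can be mixed into an ordinary graph like a regular Tensor.
y = tf.reduce_sum(normal_tensor)

with tf.Session() as sess:
    print(sess.run(y))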

tf.contrib.graph_editor.assign_renamed_collections_handler()

tf.contrib.graph_editor.assign_renamed_collections_handler(info, elem, elem_)

Add the transformed elem to the (renamed) collections of elem.

Args:
  info: Transform._Info instance.
  elem: the original element (tf.Tensor or tf.Operation).
  elem_: the transformed element.

tf.contrib.distributions.Dirichlet.sample_n()

tf.contrib.distributions.Dirichlet.sample_n(n, seed=None, name='sample_n')

Generate n samples.

Args:
  n: Scalar Tensor of type int32 or int64, the number of observations to sample.
  seed: Python integer seed for RNG.
  name: name to give to the op.

Returns:
  samples: a Tensor with a prepended dimension (n,).

Raises:
  TypeError: if n is not an integer type.
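
A short sketch of drawing samples, assuming the contrib-era Dirichlet constructor that takes a concentration vector named alpha (that parameter name is an assumption for this release):

import tensorflow as tf

ds = tf.contrib.distributions

# A Dirichlet over 3 categories.
dist = ds.Dirichlet(alpha=[1.0, 2.0, 3.0])

# Draw 5 observations; the sample count is prepended, so the result has shape (5, 3).
samples = dist.sample_n(5, seed=42)

with tf.Session() as sess:
    print(sess.run(samples))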

tf.image.non_max_suppression()

tf.image.non_max_suppression(boxes, scores, max_output_size, iou_threshold=None, name=None)

Greedily selects a subset of bounding boxes in descending order of score, pruning away boxes that have high intersection-over-union (IOU) overlap with previously selected boxes.

Bounding boxes are supplied as [y1, x1, y2, x2], where (y1, x1) and (y2, x2) are the coordinates of any diagonal pair of box corners, and the coordinates can be provided as normalized (i.e., lying in the interval [0, 1]) or absolute.
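
A minimal sketch with hypothetical boxes and scores, showing that the op returns indices into the input rather than the boxes themselves:

import tensorflow as tf

# Boxes are [y1, x1, y2, x2] in normalized coordinates; scores rank the boxes.
boxes = tf.constant([[0.00, 0.00, 0.50, 0.50],
                     [0.01, 0.01, 0.51, 0.51],
                     [0.50, 0.50, 1.00, 1.00]])
scores = tf.constant([0.9, 0.8, 0.7])

# Keep at most 2 boxes, suppressing any box whose IOU with an already selected
# box exceeds 0.5. The result is a vector of indices into `boxes`.
selected_indices = tf.image.non_max_suppression(
    boxes, scores, max_output_size=2, iou_threshold=0.5)
selected_boxes = tf.gather(boxes, selected_indices)

with tf.Session() as sess:
    print(sess.run(selected_boxes))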

tf.image.crop_and_resize()

tf.image.crop_and_resize(image, boxes, box_ind, crop_size, method=None, extrapolation_value=None, name=None)

Extracts crops from the input image tensor and bilinearly resizes them (possibly with aspect ratio change) to a common output size specified by crop_size. This is more general than the crop_to_bounding_box op, which extracts a fixed-size slice from the input image and does not allow resizing or aspect ratio change.

Returns a tensor with crops from the input image at positions defined at the bounding box locations in boxes.
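
A short sketch on a hypothetical batch of two images, showing how boxes and box_ind work together:

import tensorflow as tf

# A batch of 2 RGB images of size 100x100.
image = tf.random_uniform([2, 100, 100, 3])

# Boxes in normalized [y1, x1, y2, x2] coordinates; box_ind says which image in
# the batch each box refers to.
boxes = tf.constant([[0.00, 0.00, 0.50, 0.50],
                     [0.25, 0.25, 0.75, 0.75]])
box_ind = tf.constant([0, 1])

# Every crop is bilinearly resized to the same 32x32 output, regardless of the
# aspect ratio of its box. Result shape: [num_boxes, 32, 32, 3].
crops = tf.image.crop_and_resize(image, boxes, box_ind, crop_size=[32, 32])

with tf.Session() as sess:
    print(sess.run(crops).shape)  # (2, 32, 32, 3)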

tf.image.resize_image_with_crop_or_pad()

tf.image.resize_image_with_crop_or_pad(image, target_height, target_width)

Crops and/or pads an image to a target width and height.

Resizes an image to a target width and height by either centrally cropping the image or padding it evenly with zeros. If width or height is greater than the specified target_width or target_height respectively, this op centrally crops along that dimension. If width or height is smaller than the specified target_width or target_height respectively, this op centrally pads the image with zeros along that dimension.
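
A small sketch on a hypothetical image, showing the crop and pad behaviour in one call:

import tensorflow as tf

# A 3-D image tensor of shape [height, width, channels].
image = tf.random_uniform([90, 120, 3])

# Width 120 > 100, so the width is centrally cropped; height 90 < 100, so the
# height is centrally padded with zeros. Output shape: [100, 100, 3].
resized = tf.image.resize_image_with_crop_or_pad(
    image, target_height=100, target_width=100)

with tf.Session() as sess:
    print(sess.run(resized).shape)  # (100, 100, 3)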

tf.contrib.distributions.MultivariateNormalDiagPlusVDVT.pmf()

tf.contrib.distributions.MultivariateNormalDiagPlusVDVT.pmf(value, name='pmf')

Probability mass function.

Args:
  value: float or double Tensor.
  name: The name to give this op.

Returns:
  pmf: a Tensor of shape sample_shape(x) + self.batch_shape with values of type self.dtype.

Raises:
  TypeError: if is_continuous.

tf.image.pad_to_bounding_box()

tf.image.pad_to_bounding_box(image, offset_height, offset_width, target_height, target_width)

Pad image with zeros to the specified height and width.

Adds offset_height rows of zeros on top, offset_width columns of zeros on the left, and then pads the image on the bottom and right with zeros until it has dimensions target_height, target_width. This op does nothing if offset_* is zero and the image already has size target_height by target_width.

Args:
  image: 3-D tensor with shape [height, width, channels].
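
A brief sketch placing a hypothetical image inside a larger zero canvas:

import tensorflow as tf

# A 3-D image of shape [height, width, channels].
image = tf.ones([40, 60, 3])

# Add 10 rows of zeros on top and 20 columns of zeros on the left, then pad the
# bottom and right with zeros until the result is 100 x 100.
padded = tf.image.pad_to_bounding_box(image, offset_height=10, offset_width=20,
                                      target_height=100, target_width=100)

with tf.Session() as sess:
    print(sess.run(padded).shape)  # (100, 100, 3)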

tf.contrib.graph_editor.make_view()

tf.contrib.graph_editor.make_view(*args, **kwargs)

Create a SubGraphView from selected operations and passthrough tensors.

Args:
  *args: list of 1) regular expressions (compiled or not), 2) (arrays of) tf.Operation, or 3) (arrays of) tf.Tensor. Those objects will be converted into a list of operations and a list of candidates for passthrough tensors.
  **kwargs: the keyword graph is used 1) to check that the ops and tensors are from the correct graph and 2) for regular expression queries.

Returns:
  A subgraph view.
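
A minimal sketch, assuming a contrib-era TensorFlow 1.x build, that builds a SubGraphView from an explicit list of operations (the simplest of the accepted *args forms):

import tensorflow as tf
from tensorflow.contrib import graph_editor as ge

g = tf.Graph()
with g.as_default():
    a = tf.constant(1.0, name="a")
    b = tf.constant(2.0, name="b")
    c = tf.add(a, b, name="c")

# Select a subset of the graph by passing operations directly; the graph
# keyword lets make_view verify that all selected ops belong to g.
sgv = ge.make_view([a.op, c.op], graph=g)
print(sgv.ops)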

tf.contrib.bayesflow.stochastic_tensor.QuantizedDistributionTensor.value_type

tf.contrib.bayesflow.stochastic_tensor.QuantizedDistributionTensor.value_type
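
A hedged sketch of reading value_type, assuming the constructor forwards **dist_args to tf.contrib.distributions.QuantizedDistribution; the distribution, lower_cutoff and upper_cutoff arguments below, and the Normal parameter names, are assumptions for this contrib-era API.

import tensorflow as tf
from tensorflow.contrib import distributions as ds
from tensorflow.contrib.bayesflow import stochastic_tensor as st

# value_type reports the value type (e.g. SampleValue) that was active in the
# surrounding st.value_type(...) context when this stochastic tensor was built.
with st.value_type(st.SampleValue()):
    qdt = st.QuantizedDistributionTensor(
        distribution=ds.Normal(mu=0.0, sigma=1.0),  # assumed dist_args
        lower_cutoff=-3.0,                          # assumed dist_args
        upper_cutoff=3.0)                           # assumed dist_args

print(qdt.value_type)  # the SampleValue instance from the context above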