tf.contrib.distributions.WishartCholesky.mean()

tf.contrib.distributions.WishartCholesky.mean(name='mean') Mean.
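
For a Wishart distribution with df degrees of freedom and scale matrix S, the mean is df * S. A minimal pure-Python check of that identity (no TensorFlow required; the function name is illustrative, not part of the API):

```python
def wishart_mean(df, scale):
    """Mean of a Wishart distribution: df * scale, computed element-wise
    on a matrix given as nested lists."""
    return [[df * x for x in row] for row in scale]

scale = [[1.0, 0.5],
         [0.5, 2.0]]
print(wishart_mean(3.0, scale))  # [[3.0, 1.5], [1.5, 6.0]]
```

WishartCholesky parameterizes the scale by its Cholesky factor, but the mean it reports is still df * S for the full scale matrix S.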

tf.scan()

tf.scan(fn, elems, initializer=None, parallel_iterations=10, back_prop=True, swap_memory=False, infer_shape=True, name=None) scan on the list of tensors unpacked from elems on dimension 0. The simplest version of scan repeatedly applies the callable fn to a sequence of elements from first to last. The elements are made of the tensors unpacked from elems on dimension 0. The callable fn takes two tensors as arguments. The first argument is the accumulated value computed from the preceding invocation of fn, and the second is the current element. If initializer is None, elems must contain at least one element, and its first element is used as the initializer.
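
The accumulate-over-dimension-0 behavior can be sketched in pure Python. This is a reference model of the semantics only, not TensorFlow's implementation; graph-mode details such as parallel_iterations, back_prop, and swap_memory are omitted:

```python
def scan_reference(fn, elems, initializer=None):
    """Reference semantics for tf.scan: fold fn over elems in order,
    keeping every intermediate accumulator. With initializer=None,
    elems[0] seeds the accumulator (as in tf.scan)."""
    elems = list(elems)
    if initializer is None:
        acc, rest, outputs = elems[0], elems[1:], [elems[0]]
    else:
        acc, rest, outputs = initializer, elems, []
    for x in rest:
        acc = fn(acc, x)       # fn(accumulated value, current element)
        outputs.append(acc)
    return outputs

print(scan_reference(lambda a, x: a + x, [1, 2, 3, 4]))  # [1, 3, 6, 10]
```

The equivalent TensorFlow call, tf.scan(lambda a, x: a + x, elems), produces the same running sums, stacked along dimension 0.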

tf.contrib.learn.monitors.GraphDump.end()

tf.contrib.learn.monitors.GraphDump.end(session=None) Callback at the end of training/evaluation.

Args:
  session: A tf.Session object that can be used to run ops.

Raises:
  ValueError: if we've not begun a run.
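
The begin/end contract described here (end raises ValueError if a run was never begun) can be illustrated with a minimal monitor-like class. The class and its internals are hypothetical, not the tf.contrib.learn implementation:

```python
class MinimalMonitor:
    """Sketch of the monitor lifecycle contract: end() is only valid
    after begin() has started a run."""
    def __init__(self):
        self._begun = False

    def begin(self):
        self._begun = True

    def end(self, session=None):
        if not self._begun:
            raise ValueError("end() called before begin(); no run to finish.")
        self._begun = False

m = MinimalMonitor()
m.begin()
m.end()  # valid: a run was begun
```

Calling m.end() again at this point would raise ValueError, mirroring the Raises clause above.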

tf.contrib.bayesflow.stochastic_tensor.PoissonTensor.mean()

tf.contrib.bayesflow.stochastic_tensor.PoissonTensor.mean(name='mean')

tensorflow::Tensor::vec()

TTypes<T>::ConstVec tensorflow::Tensor::vec() const Const version of vec(): returns the tensor data as a read-only rank-1 Eigen view (TTypes<T>::ConstVec).

tf.contrib.bayesflow.stochastic_tensor.MultivariateNormalFullTensor.mean()

tf.contrib.bayesflow.stochastic_tensor.MultivariateNormalFullTensor.mean(name='mean')

tf.contrib.bayesflow.stochastic_tensor.WishartFullTensor.distribution

tf.contrib.bayesflow.stochastic_tensor.WishartFullTensor.distribution

tf.contrib.bayesflow.stochastic_tensor.DirichletTensor.loss()

tf.contrib.bayesflow.stochastic_tensor.DirichletTensor.loss(final_loss, name='Loss')

tf.contrib.rnn.GridLSTMCell.state_size

tf.contrib.rnn.GridLSTMCell.state_size

tf.contrib.learn.monitors.RunHookAdapterForMonitors.before_run()

tf.contrib.learn.monitors.RunHookAdapterForMonitors.before_run(run_context)
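
RunHookAdapterForMonitors wraps legacy monitors in the SessionRunHook interface, with before_run invoked just before each Session.run call. The adapter pattern itself can be sketched generically; all names below are hypothetical stand-ins, not TensorFlow's code:

```python
class LegacyMonitor:
    """Stand-in for a legacy monitor with a per-step callback."""
    def step_begin(self, step):
        return []  # extra ops this monitor wants run (none here)

class RunHookAdapter:
    """Adapter sketch: exposes a before_run(run_context)-style hook that
    forwards each step to the wrapped monitors' step_begin callbacks."""
    def __init__(self, monitors):
        self._monitors = monitors
        self._step = 0

    def before_run(self, run_context=None):
        requests = []
        for m in self._monitors:
            requests.extend(m.step_begin(self._step))
        self._step += 1
        return requests  # ops requested for this Session.run

hook = RunHookAdapter([LegacyMonitor()])
hook.before_run(None)  # forwards step 0 to the wrapped monitor
```

In the real adapter, the returned requests become a SessionRunArgs object so the monitors' fetches are added to the run.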