tf.contrib.rnn.CoupledInputForgetGateLSTMCell.state_size

tf.contrib.rnn.CoupledInputForgetGateLSTMCell.state_size Size(s) of state(s) used by this cell.

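A minimal sketch of reading this property (TF 1.x contrib API; passing only num_units and leaving the remaining constructor arguments at their defaults is an assumption):

    import tensorflow as tf

    # Build the cell; state_size describes the recurrent state carried between
    # time steps, output_size the width of the per-step output.
    cell = tf.contrib.rnn.CoupledInputForgetGateLSTMCell(num_units=64)
    print(cell.state_size)
    print(cell.output_size)
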
tf.contrib.rnn.GridLSTMCell.__call__()

tf.contrib.rnn.GridLSTMCell.__call__(inputs, state, scope=None) Run one step of LSTM.

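The __call__(inputs, state, scope=None) signature is the standard one-step RNNCell interface shared by the cells in this section. A sketch of that contract, shown with the simpler LSTMBlockCell because GridLSTMCell also needs frequency-related constructor arguments that this entry does not document:

    import tensorflow as tf

    cell = tf.contrib.rnn.LSTMBlockCell(num_units=32)

    inputs = tf.placeholder(tf.float32, [16, 40])              # one time step, [batch, depth]
    state = cell.zero_state(batch_size=16, dtype=tf.float32)   # initial recurrent state
    output, new_state = cell(inputs, state)                    # run one step of LSTM
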
tf.contrib.rnn.LayerNormBasicLSTMCell.__call__()

tf.contrib.rnn.LayerNormBasicLSTMCell.__call__(inputs, state, scope=None) LSTM cell with layer normalization and recurrent dropout.

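A sketch of building the cell with layer normalization enabled and recurrent dropout applied, then running a single step; the keyword names layer_norm and dropout_keep_prob are assumptions about the constructor:

    import tensorflow as tf

    cell = tf.contrib.rnn.LayerNormBasicLSTMCell(
        num_units=128,
        layer_norm=True,        # normalize the gate pre-activations
        dropout_keep_prob=0.9)  # recurrent dropout on the candidate cell state

    inputs = tf.placeholder(tf.float32, [32, 80])
    state = cell.zero_state(batch_size=32, dtype=tf.float32)
    output, new_state = cell(inputs, state)
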
tf.contrib.rnn.GRUBlockCell.__init__()

tf.contrib.rnn.GRUBlockCell.__init__(cell_size) Initialize the Block GRU cell.

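cell_size is the only required constructor argument listed here. A sketch that builds the cell and unrolls it over a [batch, time, depth] sequence; driving it with tf.nn.dynamic_rnn is a usage assumption, not part of the GRUBlockCell API itself:

    import tensorflow as tf

    cell = tf.contrib.rnn.GRUBlockCell(cell_size=256)   # GRU backed by a fused block op

    # Unroll over 20 time steps of 64-dimensional features.
    inputs = tf.placeholder(tf.float32, [None, 20, 64])
    outputs, final_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
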
tf.contrib.rnn.CoupledInputForgetGateLSTMCell.__call__()

tf.contrib.rnn.CoupledInputForgetGateLSTMCell.__call__(inputs, state, scope=None) Run one step of LSTM.

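A sketch of running two consecutive steps by threading the returned state back in; the explicit variable-scope reuse reflects the TF 1.x idiom for calling a cell more than once, and passing only num_units to the constructor is an assumption:

    import tensorflow as tf

    cell = tf.contrib.rnn.CoupledInputForgetGateLSTMCell(num_units=64)

    x_t0 = tf.placeholder(tf.float32, [8, 32])
    x_t1 = tf.placeholder(tf.float32, [8, 32])

    state = cell.zero_state(batch_size=8, dtype=tf.float32)
    with tf.variable_scope("coupled_lstm") as scope:
        out0, state = cell(x_t0, state)   # step t=0 creates the cell's variables
        scope.reuse_variables()           # share the same weights across steps
        out1, state = cell(x_t1, state)   # step t=1 reuses them
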
tf.contrib.rnn.AttentionCellWrapper.__call__()

tf.contrib.rnn.AttentionCellWrapper.__call__(inputs, state, scope=None) Long short-term memory cell with attention (LSTMA).

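A sketch of wrapping a plain LSTM cell so that each step attends over a window of its recent outputs; attn_length sets the window size, and state_is_tuple=True is an assumption made because the wrapped cell carries a tuple state:

    import tensorflow as tf

    base_cell = tf.contrib.rnn.LSTMBlockCell(num_units=128)
    attn_cell = tf.contrib.rnn.AttentionCellWrapper(
        base_cell, attn_length=10, state_is_tuple=True)

    inputs = tf.placeholder(tf.float32, [4, 64])
    state = attn_cell.zero_state(batch_size=4, dtype=tf.float32)
    output, new_state = attn_cell(inputs, state)   # one attention-augmented LSTM step
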
tf.contrib.rnn.LSTMBlockCell.output_size

tf.contrib.rnn.LSTMBlockCell.output_size Integer or TensorShape: size of outputs produced by this cell.

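A minimal sketch: LSTMBlockCell has no output projection, so its per-step output width equals num_units:

    import tensorflow as tf

    cell = tf.contrib.rnn.LSTMBlockCell(num_units=512)
    print(cell.output_size)   # 512
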
tf.contrib.rnn.TimeFreqLSTMCell.zero_state()

tf.contrib.rnn.TimeFreqLSTMCell.zero_state(batch_size, dtype) Return zero-filled state tensor(s).

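zero_state is the generic RNNCell helper: it returns zero-filled tensor(s) matching the cell's advertised state_size for the given batch size and dtype. A sketch, with the constructor values (feature_size, frequency_skip) chosen purely for illustration:

    import tensorflow as tf

    cell = tf.contrib.rnn.TimeFreqLSTMCell(
        num_units=32, feature_size=8, frequency_skip=8)

    # All-zero initial state for a batch of 16, as float32.
    init_state = cell.zero_state(batch_size=16, dtype=tf.float32)
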
tf.contrib.rnn.TimeFreqLSTMCell.__init__()

tf.contrib.rnn.TimeFreqLSTMCell.__init__(num_units, use_peepholes=False, cell_clip=None, initializer=None, num_unit_shards=1, forget_bias=1.0, feature_size=None, frequency_skip=None)

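A sketch of the constructor with several of the parameters above spelled out; the comments on what feature_size and frequency_skip control (slicing the input into frequency blocks) are assumptions about this frequency-time cell, not taken from the entry:

    import tensorflow as tf

    cell = tf.contrib.rnn.TimeFreqLSTMCell(
        num_units=64,
        use_peepholes=True,    # peephole connections from the cell state to the gates
        cell_clip=10.0,        # clip the cell state into [-10, 10]
        forget_bias=1.0,       # bias added to the forget gate at initialization
        feature_size=8,        # assumed: width of each frequency block of the input
        frequency_skip=8)      # assumed: stride between successive frequency blocks
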
tf.contrib.rnn.GridLSTMCell

class tf.contrib.rnn.GridLSTMCell Grid Long short-term memory unit (LSTM) recurrent network cell.

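A construction-only sketch of the class; the keyword arguments shown (use_peepholes, feature_size, frequency_skip) mirror the TimeFreqLSTMCell parameters above and are assumptions for GridLSTMCell, which follows the same one-step __call__(inputs, state, scope=None) interface documented earlier:

    import tensorflow as tf

    cell = tf.contrib.rnn.GridLSTMCell(
        num_units=32,
        use_peepholes=True,
        feature_size=8,      # assumed: width of each frequency block of the input
        frequency_skip=8)    # assumed: stride between successive frequency blocks
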