tf.contrib.rnn.CoupledInputForgetGateLSTMCell.__call__

tf.contrib.rnn.CoupledInputForgetGateLSTMCell.__call__(inputs, state, scope=None) Run one step of LSTM.

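A minimal sketch of one step, assuming a TensorFlow 1.x graph-mode environment where tf.contrib is available; the batch size, input depth, and unit count below are illustrative placeholders:

    import tensorflow as tf

    batch_size, input_depth, num_units = 32, 128, 64
    cell = tf.contrib.rnn.CoupledInputForgetGateLSTMCell(num_units)

    # One time step of input and an all-zeros starting state.
    inputs = tf.placeholder(tf.float32, [batch_size, input_depth])
    state = cell.zero_state(batch_size, tf.float32)

    # __call__ returns this step's output and the updated cell state.
    output, new_state = cell(inputs, state)
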
tf.contrib.rnn.GRUBlockCell.state_size

tf.contrib.rnn.GRUBlockCell.state_size

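As a quick illustration (assuming TensorFlow 1.x with tf.contrib; 64 units is an arbitrary choice), state_size for a GRU cell is simply its number of units, since a GRU keeps no separate cell state:

    import tensorflow as tf

    cell = tf.contrib.rnn.GRUBlockCell(64)
    print(cell.state_size)  # expected: 64
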
tf.contrib.rnn.LSTMBlockCell.state_size

tf.contrib.rnn.LSTMBlockCell.state_size

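By contrast, an LSTM-style cell tracks both a cell state c and a hidden state h; a hedged sketch (TensorFlow 1.x, illustrative unit count):

    import tensorflow as tf

    cell = tf.contrib.rnn.LSTMBlockCell(64)
    # Expected to be an LSTMStateTuple with c and h of size 64 each in the
    # TF 1.x builds assumed here.
    print(cell.state_size)
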
tf.contrib.rnn.GridLSTMCell

class tf.contrib.rnn.GridLSTMCell Grid Long short-term memory unit (LSTM) recurrent network cell. The default is based on: Nal Kalchbrenner, Ivo Danihelka and Alex Graves, "Grid LSTM" (http://arxiv.org/abs/1507.01526).

tf.contrib.rnn.CoupledInputForgetGateLSTMCell.output_size

tf.contrib.rnn.CoupledInputForgetGateLSTMCell.output_size

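A small sketch, assuming TensorFlow 1.x and that the cell's optional projection layer is configured through a num_proj argument (as with tf.contrib.rnn.LSTMCell): output_size follows num_units unless a projection is set, in which case it follows num_proj.

    import tensorflow as tf

    plain = tf.contrib.rnn.CoupledInputForgetGateLSTMCell(num_units=64)
    print(plain.output_size)      # expected: 64

    projected = tf.contrib.rnn.CoupledInputForgetGateLSTMCell(num_units=64,
                                                              num_proj=32)
    print(projected.output_size)  # expected: 32
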
tf.contrib.rnn.AttentionCellWrapper.__call__

tf.contrib.rnn.AttentionCellWrapper.__call__(inputs, state, scope=None) Long short-term memory cell with attention (LSTMA).

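A minimal single-step sketch (TensorFlow 1.x; the wrapped LSTMCell, attn_length, and sizes are illustrative choices):

    import tensorflow as tf

    batch_size, input_depth, num_units = 32, 128, 64
    base_cell = tf.contrib.rnn.LSTMCell(num_units, state_is_tuple=True)
    attn_cell = tf.contrib.rnn.AttentionCellWrapper(base_cell, attn_length=10,
                                                    state_is_tuple=True)

    inputs = tf.placeholder(tf.float32, [batch_size, input_depth])
    state = attn_cell.zero_state(batch_size, tf.float32)

    # One step: the output is augmented with attention over the last
    # attn_length outputs, and the wrapper's state is updated.
    output, new_state = attn_cell(inputs, state)
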
tf.contrib.rnn.AttentionCellWrapper.output_size

tf.contrib.rnn.AttentionCellWrapper.output_size

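For illustration (TensorFlow 1.x; hedged, since the value depends on the attn_size argument): when attn_size is left at its default, the wrapper's output_size follows the wrapped cell's output_size.

    import tensorflow as tf

    base_cell = tf.contrib.rnn.LSTMCell(64)
    attn_cell = tf.contrib.rnn.AttentionCellWrapper(base_cell, attn_length=10)
    print(attn_cell.output_size)  # expected: 64 with the defaults used here
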
tf.contrib.rnn.AttentionCellWrapper

class tf.contrib.rnn.AttentionCellWrapper Basic attention cell wrapper. Implementation based on https://arxiv.org/abs/1409.0473.

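A sketch of the wrapper in a full unrolled run, assuming TensorFlow 1.x: it plugs into the standard RNN drivers such as tf.nn.dynamic_rnn, and every shape below is a placeholder.

    import tensorflow as tf

    batch_size, max_time, input_depth, num_units = 32, 20, 128, 64
    cell = tf.contrib.rnn.AttentionCellWrapper(
        tf.contrib.rnn.LSTMCell(num_units), attn_length=10, state_is_tuple=True)

    sequences = tf.placeholder(tf.float32, [batch_size, max_time, input_depth])
    outputs, final_state = tf.nn.dynamic_rnn(cell, sequences, dtype=tf.float32)
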
tf.contrib.rnn.GridLSTMCell.output_size

tf.contrib.rnn.GridLSTMCell.output_size

tf.contrib.rnn.LayerNormBasicLSTMCell.output_size

tf.contrib.rnn.LayerNormBasicLSTMCell.output_size

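One last small sketch (TensorFlow 1.x, illustrative unit count): the cell has no projection option, so output_size simply matches the number of units it was built with.

    import tensorflow as tf

    cell = tf.contrib.rnn.LayerNormBasicLSTMCell(64)
    print(cell.output_size)  # expected: 64
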