tf.contrib.rnn.AttentionCellWrapper.__init__(cell, attn_length, attn_size=None, attn_vec_size=None, input_size=None, state_is_tuple=False)
Create a cell with attention.
Args:
- cell: an RNNCell, attention is added to it.
- attn_length: integer, the size of an attention window.
- attn_size: integer, the size of an attention vector. Equal to cell.output_size by default.
- attn_vec_size: integer, the number of convolutional features calculated on attention state and the size of the hidden layer built from base cell state. Equal to attn_size by default.
- input_size: integer, the size of a hidden linear layer, built from inputs and attention. Derived from the input tensor by default.
- state_is_tuple: If True, accepted and returned states are n-tuples, where n = len(cells). By default (False), the states are all concatenated along the column axis.
Raises:
- TypeError: if cell is not an RNNCell.
- ValueError: if cell returns a state tuple but the flag state_is_tuple is False, or if attn_length is zero or less.
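A minimal usage sketch, assuming TensorFlow 1.x with tf.contrib.rnn available; the cell size, window length, and input shapes are illustrative. The wrapper is built around a base cell and then used like any other RNNCell, with state_is_tuple matched to the wrapped cell to avoid the ValueError above:

import tensorflow as tf

# Base LSTM cell; the wrapper layers attention on top of it.
base_cell = tf.contrib.rnn.BasicLSTMCell(num_units=128, state_is_tuple=True)

# Attend over a window of the last 16 cell outputs.
attn_cell = tf.contrib.rnn.AttentionCellWrapper(
    base_cell,
    attn_length=16,       # size of the attention window
    attn_size=None,       # defaults to cell.output_size
    attn_vec_size=None,   # defaults to attn_size
    state_is_tuple=True)  # must match the wrapped cell's state format

# A batch of sequences: [batch_size, max_time, input_depth].
inputs = tf.placeholder(tf.float32, [None, 20, 64])
outputs, final_state = tf.nn.dynamic_rnn(attn_cell, inputs, dtype=tf.float32)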