tf.nn.rnn_cell.LSTMCell.__init__(num_units, input_size=None, use_peepholes=False, cell_clip=None, initializer=None, num_proj=None, proj_clip=None, num_unit_shards=1, num_proj_shards=1, forget_bias=1.0, state_is_tuple=True, activation=tanh)
Initialize the parameters for an LSTM cell.
Args:
- num_units: int, The number of units in the LSTM cell.
- input_size: Deprecated and unused.
- use_peepholes: bool, set True to enable diagonal/peephole connections.
- cell_clip: (optional) A float value; if provided, the cell state is clipped by this value prior to the cell output activation.
- initializer: (optional) The initializer to use for the weight and projection matrices.
- num_proj: (optional) int, The output dimensionality for the projection matrices. If None, no projection is performed.
- proj_clip: (optional) A float value. If num_proj > 0 and proj_clip is provided, the projected values are clipped elementwise to within [-proj_clip, proj_clip].
- num_unit_shards: How to split the weight matrix. If >1, the weight matrix is stored across num_unit_shards.
- num_proj_shards: How to split the projection matrix. If >1, the projection matrix is stored across num_proj_shards.
- forget_bias: Biases of the forget gate are initialized by default to 1 in order to reduce the scale of forgetting at the beginning of training.
- state_is_tuple: If True, accepted and returned states are 2-tuples of the c_state and m_state. If False, they are concatenated along the column axis. The latter behavior will soon be deprecated.
- activation: Activation function of the inner states.
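A minimal usage sketch follows, assuming the pre-1.0 tf.nn.rnn_cell namespace shown in the signature above; the batch size, input dimensionality, and hyperparameter values are illustrative, not prescribed by this API.

    import tensorflow as tf

    batch_size, input_dim = 32, 128  # illustrative shapes

    # Build an LSTM cell with peepholes and an output projection.
    cell = tf.nn.rnn_cell.LSTMCell(
        num_units=256,        # size of the internal cell state
        use_peepholes=True,   # enable diagonal/peephole connections
        num_proj=64,          # project the output down to 64 dimensions
        proj_clip=5.0,        # clip projected values to [-5.0, 5.0]
        forget_bias=1.0,      # default forget-gate bias
        state_is_tuple=True)  # state is a (c_state, m_state) 2-tuple

    inputs = tf.placeholder(tf.float32, [batch_size, input_dim])
    state = cell.zero_state(batch_size, tf.float32)

    # One step of the cell. Because num_proj is set, output has shape
    # [batch_size, num_proj]; otherwise it would be [batch_size, num_units].
    output, new_state = cell(inputs, state)

With state_is_tuple=True, new_state is a 2-tuple of (c_state, m_state); pass it back into the cell on the next time step.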