How do I use a multilayer bidirectional LSTM in TensorFlow?
I have already implemented a bidirectional LSTM, but I want to compare that model against one with multiple layers added.
How should I add code for that in this part?
# Assumes: import tensorflow as tf; from tensorflow.contrib import rnn
# and that x, n_hidden, n_steps, batch_size, n_classes, weights, biases are defined.
x = tf.unstack(tf.transpose(x, perm=[1, 0, 2]))
# print(x[0].get_shape())

# Define lstm cells with tensorflow
# Forward direction cell
lstm_fw_cell = rnn.BasicLSTMCell(n_hidden, forget_bias=1.0)
# Backward direction cell
lstm_bw_cell = rnn.BasicLSTMCell(n_hidden, forget_bias=1.0)

# Get lstm cell output
try:
    outputs, _, _ = rnn.static_bidirectional_rnn(lstm_fw_cell, lstm_bw_cell, x,
                                                 dtype=tf.float32)
except Exception:  # Old TensorFlow version only returns outputs, not states
    outputs = rnn.static_bidirectional_rnn(lstm_fw_cell, lstm_bw_cell, x,
                                           dtype=tf.float32)

# Linear activation, using rnn inner loop last output
outputs = tf.stack(outputs, axis=1)
outputs = tf.reshape(outputs, (batch_size * n_steps, n_hidden * 2))
outputs = tf.matmul(outputs, weights['out']) + biases['out']
outputs = tf.reshape(outputs, (batch_size, n_steps, n_classes))
I tried this and got this error: ValueError: Variable bidirectional_rnn/fw/lstm_cell/kernel already exists, disallowed. Did you mean to set reuse=True in VarScope? Could you provide a working example? – Rahul
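For context, the reuse error above is what TF 1.x raises when `static_bidirectional_rnn` is called a second time inside the same variable scope, which is what naively repeating the snippet to stack layers does. A minimal sketch of the kind of stacking I am attempting, with each layer wrapped in its own `tf.variable_scope` so the weights do not collide (the function name `stacked_bilstm` and the scope names are my own, not from any library), looks like this:

```python
import tensorflow as tf
from tensorflow.contrib import rnn  # TF 1.x API assumed


def stacked_bilstm(x, n_hidden, n_layers):
    """x: time-major list of [batch, features] tensors, as produced by tf.unstack.

    Each layer gets a unique variable scope, so the second layer creates
    fresh kernels instead of tripping the "already exists" ValueError.
    The output of each bidirectional layer (size 2 * n_hidden per step)
    is fed as the input sequence of the next layer.
    """
    for i in range(n_layers):
        with tf.variable_scope("bilstm_layer_%d" % i):
            fw_cell = rnn.BasicLSTMCell(n_hidden, forget_bias=1.0)
            bw_cell = rnn.BasicLSTMCell(n_hidden, forget_bias=1.0)
            x, _, _ = rnn.static_bidirectional_rnn(fw_cell, bw_cell, x,
                                                   dtype=tf.float32)
    return x
```

This is only a sketch under the assumption that the per-layer scope is the missing piece; the rest of the snippet above (the stack/reshape/matmul tail) would stay the same, applied to the returned `x`.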