Using the Adam optimizer TWICE in tensorflow

I want to use the Adam optimizer twice, to minimize different tensors in my code. I have tried using GradientDescentOptimizer twice and that works fine, but when I use the Adam optimizer twice I get an error message. I asked about this in another question: tensorflow Variable RNNLM/RNNLM/embedding/Adam_2/ does not exist, but the solution there does not work here. I also looked at https://github.com/tensorflow/tensorflow/issues/6220, but I still don't understand it.

Here is my code. I get the error message: ValueError: Variable NN/NN/W/Adam_2/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?

I then tried the solution from tensorflow Variable RNNLM/RNNLM/embedding/Adam_2/ does not exist, but it did not work here either.
import tensorflow as tf

def main():
    optimizer = tf.train.GradientDescentOptimizer(0.005)
    # optimizer = tf.train.AdamOptimizer(0.005)
    with tf.variable_scope('NN') as scope:
        W = tf.get_variable(name='W', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        X = tf.get_variable(name='X', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        y_ = tf.get_variable(name='y_', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        y1 = W + X
        loss_1 = tf.reduce_mean(tf.abs(y_ - y1))

        # train_op1 = tf.train.GradientDescentOptimizer(0.005).minimize(loss_1)
        train_op1 = tf.train.AdamOptimizer(0.005).minimize(loss_1)
        # with tf.variable_scope('opt'):
        #     train_op1 = tf.train.AdamOptimizer(0.005).minimize(loss_1)

        ######################################################################

        scope.reuse_variables()
        W2 = tf.get_variable(name='W', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        X2 = tf.get_variable(name='X', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        b = tf.Variable(tf.random_normal(shape=[5, 1], dtype=tf.float32))
        y2 = W2 + X2 + b
        loss_2 = tf.reduce_mean(tf.abs(y_ - y2))

        # train_op2 = tf.train.GradientDescentOptimizer(0.005).minimize(loss_2)
        train_op2 = tf.train.AdamOptimizer(0.005).minimize(loss_2)
        # with tf.variable_scope('opt'):
        #     train_op2 = tf.train.AdamOptimizer(0.005).minimize(loss_2)

if __name__ == '__main__':
    main()
Could you try putting your second optimizer in a different scope? – rmeertens
Thank you, it works if I put the second optimizer in a different scope! But I still don't understand why the error happened in my code. – Joey
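For the record, the reason: AdamOptimizer.minimize() creates extra "slot" variables (the moment accumulators m and v) via tf.get_variable() under whatever variable scope is active when minimize() is called, whereas GradientDescentOptimizer creates no slots, which is why it works twice. The first Adam call creates slots like NN/NN/W/Adam and NN/NN/W/Adam_1. After scope.reuse_variables(), the scope has reuse=True, so tf.get_variable() may only look up existing variables; the second Adam's new slot NN/NN/W/Adam_2 cannot be created, hence the ValueError. A minimal sketch of one fix, assuming TF 1.x semantics (written against tf.compat.v1 so it also runs on TF 2.x): build the second minimize() outside the reused scope, so its slots are created under the root scope where reuse is still False.

```python
import tensorflow as tf
tf.compat.v1.disable_v2_behavior()
tf1 = tf.compat.v1

with tf1.variable_scope('NN') as scope:
    W = tf1.get_variable('W', initializer=tf1.random_uniform([5, 1]))
    X = tf1.get_variable('X', initializer=tf1.random_uniform([5, 1]))
    y_ = tf1.get_variable('y_', initializer=tf1.random_uniform([5, 1]))
    loss_1 = tf1.reduce_mean(tf1.abs(y_ - (W + X)))
    # reuse is still False here, so this Adam may create its slot
    # variables (NN/NN/W/Adam, NN/NN/W/Adam_1, ...) via tf.get_variable()
    train_op1 = tf1.train.AdamOptimizer(0.005).minimize(loss_1)

    scope.reuse_variables()      # from here on, reuse=True inside 'NN'
    W2 = tf1.get_variable('W')   # OK: looks up the existing NN/W
    X2 = tf1.get_variable('X')
    b = tf1.Variable(tf1.random_normal([5, 1]))
    loss_2 = tf1.reduce_mean(tf1.abs(y_ - (W2 + X2 + b)))

# Build the second optimizer OUTSIDE the reused scope: its new slot
# variables are created under the root scope, where reuse is False,
# so tf.get_variable() is allowed to create them
train_op2 = tf1.train.AdamOptimizer(0.005).minimize(loss_2)

with tf1.Session() as sess:
    sess.run(tf1.global_variables_initializer())
    sess.run([train_op1, train_op2])  # both optimizers step without error
```

An alternative with the same effect is a wrapper scope created with reuse explicitly off for the optimizer's variables; the key point is simply that the slots must be created somewhere reuse=True is not in force.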