Using the Adam optimizer TWICE in tensorflow

2017-07-07

I want to use the Adam optimizer twice to minimize different tensors in my code. I tried using GradientDescentOptimizer twice and that works fine, but when I use the Adam optimizer twice I get an error message. I asked another question about this at tensorflowVariable RNNLM/RNNLM/embedding/Adam_2/ does not exist, but the solution there does not work here. I also looked at https://github.com/tensorflow/tensorflow/issues/6220, but I still don't understand what is going on.

Here is my code. I get the error message: ValueError: Variable NN/NN/W/Adam_2/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?

I then tried the fix from tensorflowVariable RNNLM/RNNLM/embedding/Adam_2/ does not exist, but it did not work.

import tensorflow as tf

def main():
    optimizer = tf.train.GradientDescentOptimizer(0.005)
    # optimizer = tf.train.AdamOptimizer(0.005)

    with tf.variable_scope('NN') as scope:
        W = tf.get_variable(name='W', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        X = tf.get_variable(name='X', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        y_ = tf.get_variable(name='y_', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        y1 = W + X
        loss_1 = tf.reduce_mean(tf.abs(y_ - y1))

        # train_op1 = tf.train.GradientDescentOptimizer(0.005).minimize(loss_1)
        train_op1 = tf.train.AdamOptimizer(0.005).minimize(loss_1)
        # with tf.variable_scope('opt'):
        #     train_op1 = tf.train.AdamOptimizer(0.005).minimize(loss_1)

        ##############################################################################################
        scope.reuse_variables()

        W2 = tf.get_variable(name='W', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        X2 = tf.get_variable(name='X', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        b = tf.Variable(tf.random_normal(shape=[5, 1], dtype=tf.float32))
        y2 = W2 + X2 + b
        loss_2 = tf.reduce_mean(tf.abs(y_ - y2))

        # train_op2 = tf.train.GradientDescentOptimizer(0.005).minimize(loss_2)
        # The next line raises the ValueError: the scope is now reusing, so the
        # second Adam optimizer cannot create its new slot variables here.
        train_op2 = tf.train.AdamOptimizer(0.005).minimize(loss_2)
        # with tf.variable_scope('opt'):
        #     train_op2 = tf.train.AdamOptimizer(0.005).minimize(loss_2)

if __name__ == '__main__':
    main()
Could you try putting your second optimizer in a different scope? – rmeertens

Thanks, it works if I put the second optimizer in a different scope! But I still don't know why the error happens in my code. – Joey

Answers

The simplest way to solve this is to put the second optimizer in a different scope. That way the naming does not cause any confusion.
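For reference, a minimal sketch of that fix, assuming the TF 1.x API used in the question (the scope name 'opt2' is an arbitrary choice, not from the original answer). Adam creates per-variable slot variables (its m and v accumulators) via tf.get_variable, and once scope.reuse_variables() has been called, no new variables may be created under that scope, which is exactly what raises the ValueError. Building the second minimize op outside the reusing scope avoids the clash:

import tensorflow as tf

def main():
    with tf.variable_scope('NN') as scope:
        W = tf.get_variable(name='W', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        X = tf.get_variable(name='X', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        y_ = tf.get_variable(name='y_', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        y1 = W + X
        loss_1 = tf.reduce_mean(tf.abs(y_ - y1))
        # First Adam: the scope is not reusing yet, so its slots can be created.
        train_op1 = tf.train.AdamOptimizer(0.005).minimize(loss_1)

        scope.reuse_variables()
        W2 = tf.get_variable(name='W')  # fetches the existing NN/W
        X2 = tf.get_variable(name='X')  # fetches the existing NN/X
        b = tf.Variable(tf.random_normal(shape=[5, 1], dtype=tf.float32))
        y2 = W2 + X2 + b
        loss_2 = tf.reduce_mean(tf.abs(y_ - y2))

    # The second Adam lives in its own, non-reusing scope ('opt2' is a
    # hypothetical name), so it is free to create a fresh set of slot variables.
    with tf.variable_scope('opt2'):
        train_op2 = tf.train.AdamOptimizer(0.005).minimize(loss_2)

if __name__ == '__main__':
    main()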

If you absolutely must do this in the same scope, make sure all of the variables are defined ahead of time. I would have to do more research into why it works this way, but the optimizer settings are locked into a lower level of the graph and are no longer dynamically accessible.

Minimal working example:

import tensorflow as tf

def main():
    with tf.variable_scope('NN') as scope:
        # All variables are created here, while the scope is not yet reusing.
        assert not scope.reuse
        W2 = tf.get_variable(name='W', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        X2 = tf.get_variable(name='X', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        y2_ = tf.get_variable(name='y_', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        b = tf.get_variable(name='b', initializer=tf.random_normal(shape=[5, 1], dtype=tf.float32))
        y2 = W2 + X2 + b
        loss_2 = tf.reduce_mean(tf.abs(y2_ - y2))

        # train_op2 = tf.train.GradientDescentOptimizer(0.005).minimize(loss_2)
        train_op2 = tf.train.AdamOptimizer(0.005).minimize(loss_2)

    ##############################################################################################
    with tf.variable_scope('NN', reuse=True) as scope:
        # The same variables are fetched again, now with reuse enabled.
        W = tf.get_variable(name='W', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        X = tf.get_variable(name='X', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        y_ = tf.get_variable(name='y_', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        b = tf.get_variable(name='b', initializer=tf.random_normal(shape=[5, 1], dtype=tf.float32))

        y1 = W + X
        loss_1 = tf.reduce_mean(tf.abs(y_ - y1))

        # train_op1 = tf.train.GradientDescentOptimizer(0.005).minimize(loss_1)
        train_op1 = tf.train.AdamOptimizer(0.005).minimize(loss_1)

if __name__ == '__main__':
    main()
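As a hedged addition (not part of the original answer): to check that both train ops actually run, something like the following could be appended at the end of main() above. The step count and print format are illustrative.

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(100):
            # Each optimizer updates the shared variables through its own slots.
            _, l2 = sess.run([train_op2, loss_2])
            _, l1 = sess.run([train_op1, loss_1])
        print('loss_1 = %.4f, loss_2 = %.4f' % (l1, l2))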