
Tensorflow: What is the name of the output node in the Cifar-10 model?

I am trying to understand TensorFlow, and I have been looking at one of the official examples, the Cifar-10 model.

In cifar10.py, in inference(), you can see the following lines:

with tf.variable_scope('softmax_linear') as scope:
    weights = _variable_with_weight_decay('weights', [192, NUM_CLASSES],
                                          stddev=1/192.0, wd=0.0)
    biases = _variable_on_cpu('biases', [NUM_CLASSES],
                              tf.constant_initializer(0.0))
    softmax_linear = tf.add(tf.matmul(local4, weights), biases, name=scope.name)
    _activation_summary(softmax_linear)

scope.name should be softmax_linear, and that should be the node's name. I save the graph proto with the following lines (they differ from the tutorial):

with tf.Graph().as_default(): 
    global_step = tf.Variable(0, trainable=False) 

    # Get images and labels 
    images, labels = cifar10.distorted_inputs() 


    # Build a Graph that computes the logits predictions from the 
    # inference model. 
    logits = cifar10.inference(images) 

    # Calculate loss. 
    loss = cifar10.loss(logits, labels) 

    # Build a Graph that trains the model with one batch of examples and 
    # updates the model parameters. 
    train_op = cifar10.train(loss, global_step) 

    # Create a saver. 
    saver = tf.train.Saver(tf.global_variables()) 

    # Build the summary operation based on the TF collection of Summaries. 
    summary_op = tf.summary.merge_all() 

    # Build an initialization operation to run below. 
    init = tf.global_variables_initializer() 

    # Start running operations on the Graph. 
    sess = tf.Session(config=tf.ConfigProto(
        log_device_placement=FLAGS.log_device_placement))
    sess.run(init) 

    # save the graph 
    tf.train.write_graph(sess.graph_def, FLAGS.train_dir, 'model.pbtxt') 

    .... 

But I cannot see a node named softmax_linear in model.pbtxt. What am I doing wrong? I just want the name of the output node so that I can export the graph.
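For reference, this is roughly how I am looking for the node (a minimal sketch, assuming TensorFlow 1.x and that model.pbtxt is read back from FLAGS.train_dir):

import tensorflow as tf
from google.protobuf import text_format

# Parse the text-format GraphDef written by tf.train.write_graph above.
graph_def = tf.GraphDef()
with open('model.pbtxt', 'r') as f:   # path under FLAGS.train_dir
    text_format.Merge(f.read(), graph_def)

# Print every node whose name mentions "softmax", to locate the output op.
for node in graph_def.node:
    if 'softmax' in node.name:
        print(node.name, node.op)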

Answer


The op's name will not be "softmax_linear". tf.name_scope() prefixes op names with the scope name, separated by /. Each op has its own name. For example, if you write

with tf.name_scope("foo"): 
    a = tf.constant(1, name="bar") 

then that constant will have the name "foo/bar".
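(A quick sketch you can run to confirm the resulting name, assuming TensorFlow 1.x; it is not part of the original answer:)

import tensorflow as tf

with tf.Graph().as_default() as g:
    with tf.name_scope("foo"):
        a = tf.constant(1, name="bar")
    print(a.op.name)                               # prints "foo/bar"
    print([op.name for op in g.get_operations()])  # lists every op in the graph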

Hope that helps!


Thanks! You are right: softmax_linear/softmax_linear does exist. – Kuranes
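(Follow-up note, since the original goal was exporting the graph: a hedged sketch of how that node name would then be used to freeze the graph with TF 1.x APIs; the checkpoint path model.ckpt is hypothetical.)

import tensorflow as tf

with tf.Session() as sess:
    # Restore the trained variables from a checkpoint (hypothetical path).
    saver = tf.train.import_meta_graph('model.ckpt.meta')
    saver.restore(sess, 'model.ckpt')
    # Fold variables into constants, keeping only ops that feed the output node.
    frozen = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph.as_graph_def(), ['softmax_linear/softmax_linear'])
    tf.train.write_graph(frozen, '.', 'frozen_cifar10.pb', as_text=False)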
