2016-03-02
def compileActivation(self, net, layerNum):
    variable = net.x if layerNum == 0 else net.varArrayA[layerNum - 1]

    # Apply the dropout vector as a column so it broadcasts across the batch,
    # multiply by the weights, and add the bias as a column vector.
    mask = tf.expand_dims(net.dropOutVectors[layerNum], 1) if self.dropout else 1.0
    z = tf.matmul(net.varWeights[layerNum]['w'], variable * mask) \
        + tf.expand_dims(net.varWeights[layerNum]['b'], 1)

    a = self.activation(z, self.pool_size)
    net.varArrayA.append(a)

I am running the same layer computation: it computes z and passes it to a sigmoid activation function. When I try to execute the function above, I get the following TensorFlow error:

ValueError: Shapes TensorShape([Dimension(-2)]) and TensorShape([Dimension(None), Dimension(None)]) must have the same rank 

The equivalent Theano computation of z works just fine:

z = T.dot(net.varWeights[layerNum]['w'], variable * (net.dropOutVectors[layerNum].dimshuffle(0, 'x') if self.dropout else 1.0)) + net.varWeights[layerNum]['b'].dimshuffle(0, 'x') 
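For reference, Theano's `dimshuffle(0, 'x')` and TensorFlow's `tf.expand_dims(v, 1)` do the same thing: they turn a length-n vector into an (n, 1) column that broadcasts across the batch dimension. A minimal NumPy sketch of the intended broadcasting (the sizes here are made up for illustration):

```python
import numpy as np

# Hypothetical layer sizes, for illustration only.
n_out, n_in, batch = 3, 4, 5

w = np.ones((n_out, n_in))      # weights, shape (3, 4)
x = np.ones((n_in, batch))      # activations, shape (4, 5)
drop = np.ones(n_in)            # per-unit dropout vector, shape (4,)
b = np.ones(n_out)              # bias, shape (3,)

# expand_dims(drop, 1) gives shape (4, 1), which broadcasts over the
# batch axis of x; expand_dims(b, 1) does the same for the bias.
z = w @ (x * np.expand_dims(drop, 1)) + np.expand_dims(b, 1)
print(z.shape)  # (3, 5)
```

If any tensor in the chain has an unknown or corrupted rank, this broadcasting is exactly where the rank-mismatch error would surface.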

The code looks syntactically correct, but it seems that some object in 'net' has a corrupted shape. In particular, 'TensorShape([Dimension(-2)])' should never occur; this was tested on TensorFlow 0.7.0, so you may get a more helpful error message if you upgrade. – mrry


Thanks. I will try upgrading. –

Answer


Mihir,

When I ran into this problem, it was because my placeholders were the wrong size for my feed dictionary. You should also know how to run the graph in a session: tf.Session.run(fetches, feed_dict=None)

Here is my code to create the placeholders:

# Note this place holder is for the input data feed-dict definition 
input_placeholder = tf.placeholder(tf.float32, shape=(batch_size, FLAGS.InputLayer)) 
# Not sure yet what this will be used for. 
desired_output_placeholder = tf.placeholder(tf.float32, shape=(batch_size, FLAGS.OutputLayer)) 

Here is my fill-feed-dictionary function:

def fill_feed_dict(data_sets_train, input_pl, output_pl):
    ti_feed, dto_feed = data_sets_train.next_batch(FLAGS.batch_size)

    feed_dict = {
        input_pl: ti_feed,
        output_pl: dto_feed
    }
    return feed_dict
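To show how the pieces fit together, here is a minimal sketch of the same feed pattern with a stand-in dataset object; the `FakeDataSet` class, the sizes, and the string keys are made up for illustration (in the real code `next_batch` comes from the dataset helper and the keys are TensorFlow placeholders):

```python
import numpy as np

# Hypothetical sizes, for illustration only.
batch_size, n_in, n_out = 4, 8, 2

class FakeDataSet:
    """Stand-in for data_sets.train: returns one batch of inputs and targets."""
    def next_batch(self, size):
        return (np.zeros((size, n_in), dtype=np.float32),
                np.zeros((size, n_out), dtype=np.float32))

def fill_feed_dict(data_set, input_pl, output_pl):
    ti_feed, dto_feed = data_set.next_batch(batch_size)
    return {input_pl: ti_feed, output_pl: dto_feed}

# Plain string keys stand in for the placeholders in this sketch.
feed = fill_feed_dict(FakeDataSet(), "input_pl", "output_pl")
print(feed["input_pl"].shape)  # (4, 8)
```

The point of the answer is that every array in this dictionary must match the shape declared on its placeholder, or Session.run raises a shape error.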

Later I do this:

# Fill a feed dictionary with the actual set of images and labels 
# for this particular training step. 
feed_dict = fill_feed_dict(data_sets.train, input_placeholder, desired_output_placeholder) 

Then, to run the session and fetch the outputs, I have this line:

_, l = sess.run([train_op, loss], feed_dict=feed_dict) 
_, l = sess.run([train_op, loss], feed_dict=feed_dict)