Keras pretrained CNN with TimeDistributed

Here is my problem: I want to use one of the pretrained CNN networks inside a TimeDistributed layer, but I am having some trouble implementing it.
Here is my model:
import keras.applications.resnet50 as ResNet50
from keras import backend as K
from keras.layers import (Input, TimeDistributed, Lambda, Bidirectional,
                          LSTM, Dropout, Dense)
from keras.models import Model
from keras.optimizers import Adam

def bnn_model(max_len):
    # sequence length and ResNet input size
    x = Input(shape=(max_len, 224, 224, 3))

    base_model = ResNet50.ResNet50(weights='imagenet', include_top=False)
    for layer in base_model.layers:
        layer.trainable = False

    som = TimeDistributed(base_model)(x)
    # the output of the base model is (1, 1, 2048) per frame, so squeeze the spatial dims
    som = Lambda(lambda t: K.squeeze(K.squeeze(t, 2), 2))(som)

    bnn = Bidirectional(LSTM(300))(som)
    bnn = Dropout(0.5)(bnn)
    pred = Dense(1, activation='sigmoid')(bnn)

    model = Model(inputs=x, outputs=pred)
    model.compile(optimizer=Adam(lr=1.0e-5), loss="mse", metrics=["accuracy"])
    return model
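As a side note, a minimal runnable sketch of the same pipeline with a tiny stand-in CNN in place of ResNet50 (an assumption, purely to keep the example light) and current tf.keras names builds and predicts without errors:

```python
import numpy as np
from tensorflow.keras import layers, Model

T, H, W, C = 4, 16, 16, 3   # short sequence of small frames (stand-in sizes)

# Tiny per-frame CNN standing in for ResNet50, just to keep the sketch runnable
frame_in = layers.Input(shape=(H, W, C))
f = layers.Conv2D(8, 3, activation='relu')(frame_in)
f = layers.GlobalAveragePooling2D()(f)   # (batch, 8) per frame -- no squeeze needed
frame_model = Model(frame_in, f)
frame_model.trainable = False            # frozen, as in the question

seq_in = layers.Input(shape=(T, H, W, C))
s = layers.TimeDistributed(frame_model)(seq_in)   # (batch, T, 8)
s = layers.Bidirectional(layers.LSTM(16))(s)      # (batch, 32)
s = layers.Dropout(0.5)(s)
pred = layers.Dense(1, activation='sigmoid')(s)

model = Model(seq_in, pred)
model.compile(optimizer='adam', loss='mse')

out = model.predict(np.zeros((2, T, H, W, C), dtype='float32'), verbose=0)
print(out.shape)  # (2, 1)
```

Using GlobalAveragePooling2D (or pooling='avg' on the base model) also avoids the double K.squeeze entirely.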
The model compiles without errors, but when I start training I get the following error:
tensorflow/core/framework/op_kernel.cc:975] Invalid argument: You must feed a value for placeholder tensor 'input_2' with dtype float
[[Node: input_2 = Placeholder[dtype=DT_FLOAT, shape=[], _device="/job:localhost/replica:0/task:0/gpu:0"]()]]
I checked, and I do feed float32, but that is for input_1; input_2 is the input that exists inside the pretrained ResNet.
Just to give an overview, here is the model summary. (Note: it is strange that it does not show what happens inside ResNet, but never mind.)
____________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
====================================================================================================
input_1 (InputLayer) (None, 179, 224, 224, 0
____________________________________________________________________________________________________
timedistributed_1 (TimeDistribut (None, 179, 1, 1, 204 23587712 input_1[0][0]
____________________________________________________________________________________________________
lambda_1 (Lambda) (None, 179, 2048) 0 timedistributed_1[0][0]
____________________________________________________________________________________________________
bidirectional_1 (Bidirectional) (None, 600) 5637600 lambda_1[0][0]
____________________________________________________________________________________________________
dropout_1 (Dropout) (None, 600) 0 bidirectional_1[0][0]
____________________________________________________________________________________________________
dense_1 (Dense) (None, 1) 601 dropout_1[0][0]
====================================================================================================
Total params: 29,225,913
Trainable params: 5,638,201
Non-trainable params: 23,587,712
____________________________________________________________________________________________________
I guess I am not using TimeDistributed correctly, and I have not seen anyone else try to do this, so I hope someone can guide me.
EDIT:
The problem comes from the fact that ResNet50.ResNet50(weights='imagenet', include_top=False) creates its own input placeholder in the graph.
So I guess I need to do something like ResNet50.ResNet50(weights='imagenet', input_tensor=x, include_top=False), but I do not see how to couple it with TimeDistributed.
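One way to sketch that coupling in current tf.keras: give the base model an explicit per-frame Input via input_tensor, then wrap the resulting Model in TimeDistributed. Here weights=None and a 32x32 frame size are assumptions made only to keep the sketch light; in practice you would use weights='imagenet' and 224x224:

```python
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import ResNet50

# One frame's input; 32x32 is ResNet50's minimum accepted size
frame_in = layers.Input(shape=(32, 32, 3))
# weights=None is an assumption here, only to avoid the ImageNet download;
# pooling='avg' makes the per-frame output (batch, 2048), so no squeezing is needed
base = ResNet50(weights=None, include_top=False,
                input_tensor=frame_in, pooling='avg')
base.trainable = False

seq_in = layers.Input(shape=(2, 32, 32, 3))   # sequence of 2 frames
feats = layers.TimeDistributed(base)(seq_in)  # (batch, 2, 2048)
m = Model(seq_in, feats)
print(m.output_shape)  # (None, 2, 2048)
```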
I tried

base_model = Lambda(lambda x : ResNet50.ResNet50(weights='imagenet', input_tensor=x, include_top=False))
som = TimeDistributed(base_model)(in_ten)

but it does not work.
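For what it's worth, the Lambda attempt above fails because the lambda returns a Model object rather than a tensor (and would rebuild ResNet on every call). The model has to be instantiated once, outside, and then wrapped directly, sketched here with a small stand-in model rather than ResNet50 (an assumption to keep it light):

```python
from tensorflow.keras import layers, Model

# Build the per-frame model ONCE, outside any Lambda
frame_in = layers.Input(shape=(8, 8, 3))
feat = layers.GlobalAveragePooling2D()(layers.Conv2D(4, 3)(frame_in))
frame_model = Model(frame_in, feat)

seq_in = layers.Input(shape=(5, 8, 8, 3))
# Wrong: Lambda(lambda x: SomeConstructor(...)) -- the lambda returns a Model,
# not a tensor, and would re-create the network on every call.
# Instead, wrap the already-built model directly:
out = layers.TimeDistributed(frame_model)(seq_in)   # (batch, 5, 4)
m = Model(seq_in, out)
print(m.output_shape)  # (None, 5, 4)
```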
It seems to be asking for a float value for a placeholder. Can you trace what is being passed as feed_dict in the tf.Session.run call? – drpng
In tensorflow_backend.py I printed the feed_dict and got this: '[,, dtype = bool>,]'. ResNet is still defined with a placeholder, but it should not be. – rAyyy
I am pretty sure I should do something like ResNet50.ResNet50(weights='imagenet', input_tensor=x, include_top=False) so that there is no placeholder in base_model, but I do not understand how to do that with TimeDistributed. – rAyyy