ValueError: Dimensions must be equal, but are 2 and 64 for '{{node binary_crossentropy/mul}}' with input shapes [?,2], [?,64]
I am trying to do binary classification of text with a bi-LSTM model, but I get this error: ValueError: Dimensions must be equal, but are 2 and 64 for '{{node binary_crossentropy/mul}} = Mul[T=DT_FLOAT](binary_crossentropy/Cast, binary_crossentropy/Log)' with input shapes: [?,2], [?,64]. I am a beginner, so any pointers toward a solution would be appreciated.
import pandas as pd
import tensorflow as tf
from tensorflow.keras.preprocessing.sequence import pad_sequences
from sklearn.model_selection import train_test_split

text = df['text']
label = df['label']
# X: the integer-encoded sequences built from `text` (tokenization step not shown in the post)
X = pad_sequences(X, maxlen=max_len, padding=pad_type, truncating=trunc_type)
Y = pd.get_dummies(label).values  # one-hot labels, shape (n_samples, 2)
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.20)
print(X_train.shape, Y_train.shape)
print(X_test.shape, Y_test.shape)
# model creation
model = tf.keras.Sequential([
    # add an embedding layer
    tf.keras.layers.Embedding(word_count, 16, input_length=max_len),
    tf.keras.layers.Dropout(0.2),
    # add a bi-lstm layer
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(2, return_sequences=True)),
    # add dense layers
    tf.keras.layers.Dense(32, activation=tf.keras.activations.relu),
    tf.keras.layers.Dense(32, activation=tf.keras.activations.relu),
    tf.keras.layers.Dense(32, activation=tf.keras.activations.relu),
    tf.keras.layers.Dense(32, activation=tf.keras.activations.softmax),
    # add the prediction layer
    tf.keras.layers.Dense(1, activation=tf.keras.activations.sigmoid),
])
model.compile(loss=tf.keras.losses.BinaryCrossentropy(),
              optimizer=tf.keras.optimizers.Adam(),
              metrics=['accuracy'])
model.summary()
history = model.fit(X_train, Y_train,
                    validation_data=(X_test, Y_test),
                    epochs=10, batch_size=batch_size,
                    callbacks=[callback_func], verbose=1)
Answers
ki-ljl answered:
This answer was accepted.
The prediction layer's output dimension should be 2, to match the two-column one-hot labels produced by pd.get_dummies:
# add the prediction layer
tf.keras.layers.Dense(2, activation=tf.keras.activations.sigmoid)
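Changing the output dimension alone does not resolve the error, though: because the Bidirectional LSTM uses return_sequences=True, every Dense layer after it is applied to each timestep, so the model still emits one prediction per timestep rather than one per sample (max_len is presumably 64, which would explain the [?,64] side of the error). A minimal shape check illustrating this, as a sketch that assumes TF 2.x and placeholder values max_len = 64, word_count = 1000 (neither value appears in the post):

import tensorflow as tf

# Placeholder values for illustration only; the post does not show them.
max_len, word_count = 64, 1000

seq_model = tf.keras.Sequential([
    tf.keras.layers.Embedding(word_count, 16, input_length=max_len),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(2, return_sequences=True)),
    tf.keras.layers.Dense(2, activation=tf.keras.activations.sigmoid),
])
# Still one prediction per timestep, not per sample:
print(seq_model.output_shape)   # (None, 64, 2)

Flattening the sequence output (next step) collapses the per-timestep features into a single vector, so the final Dense layer produces one (batch, 2) prediction per sample.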
And add a Flatten layer:
# model creation
model = tf.keras.Sequential([
    # add an embedding layer
    tf.keras.layers.Embedding(word_count, 16, input_length=max_len),
    tf.keras.layers.Dropout(0.2),
    # add a bi-lstm layer
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(2, return_sequences=True)),
    # add flatten
    tf.keras.layers.Flatten(),  # <========================
    # add dense layers
    tf.keras.layers.Dense(32, activation=tf.keras.activations.relu),
    tf.keras.layers.Dense(32, activation=tf.keras.activations.relu),
    tf.keras.layers.Dense(32, activation=tf.keras.activations.relu),
    tf.keras.layers.Dense(32, activation=tf.keras.activations.softmax),
    # add the prediction layer
    tf.keras.layers.Dense(2, activation=tf.keras.activations.sigmoid),
])
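With both changes the model ends in a (None, 2) output that matches the two-column one-hot Y from pd.get_dummies, so the compile and fit calls from the question work unchanged. A small usage sketch for mapping the two-column output back to integer class labels afterwards (np.argmax here is an illustrative choice, not something from the original post):

import numpy as np

probs = model.predict(X_test)             # shape (n_samples, 2)
pred_classes = np.argmax(probs, axis=1)   # predicted class index, 0 or 1
true_classes = np.argmax(Y_test, axis=1)  # recover integer labels from the one-hot Y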
2 years ago