Data Augmentation Layer in Keras Sequential Model
Tags: tensorflow
I'm trying to add data augmentation as a layer to the model, but I'm running into what I believe is a shape problem. I also tried specifying the input shape in the augmentation layer:

```python
preprocessing.RandomFlip('horizontal', input_shape=(224, 224, 3))
```

When I take the data_augmentation layer out of the model, it runs fine.
```python
from tensorflow import keras
from tensorflow.keras.layers import (Activation, Conv2D, Dense,
                                     Flatten, MaxPool2D)
from tensorflow.keras.layers.experimental import preprocessing

data_augmentation_layer = keras.Sequential([
    preprocessing.RandomFlip('horizontal'),
    preprocessing.RandomRotation(0.2),
    preprocessing.RandomZoom(0.2),
    preprocessing.RandomWidth(0.2),
    preprocessing.RandomHeight(0.2),
    preprocessing.RandomContrast(0.2)
], name='data_augmentation')

model = keras.Sequential([
    data_augmentation_layer,
    Conv2D(filters=32,
           kernel_size=1,
           strides=1,
           input_shape=(224, 224, 3)),
    Activation(activation='relu'),
    MaxPool2D(),
    Conv2D(filters=32,
           kernel_size=1,
           strides=1),
    Activation(activation='relu'),
    MaxPool2D(),
    Flatten(),
    Dense(1, activation='sigmoid')
])
```
```
The last dimension of the inputs to a Dense layer should be defined. Found None. Full input shape received: (None, None)

Call arguments received:
  • inputs=tf.Tensor(shape=(None, 224, 224, 3), dtype=float32)
  • training=True
  • mask=None
```
Answer by AloneTogether:
The layers RandomWidth and RandomHeight are causing this error, because they produce None dimensions. See the comment here: […] "RandomHeight will lead to a None shape on the height dimension, as not all outputs from the layer will have the same height (by design). That is fine for something like a Conv2D layer, which can accept image inputs with variable shapes (None on some dimensions). It will not work for a subsequent call to Flatten followed by a Dense, because the flattened batches will also have variable size (due to the variable height), and the Dense layer needs a fixed shape for its last dimension. You could probably pad the output of Flatten before the Dense, but if you want this architecture you may just want to avoid image augmentation layers that lead to variable output shapes."
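To make the quoted point concrete, here is a small sketch (assuming the `tf.keras.layers` versions of these layers) that inspects the static output shape of RandomHeight; the height axis is None at graph time even though every concrete batch gets some definite height at run time:

```python
import tensorflow as tf

# Sketch: RandomHeight's static (graph-time) output shape has None on the
# height axis by design, because each call may produce a different height.
layer = tf.keras.layers.RandomHeight(0.2)
static_shape = layer.compute_output_shape((None, 224, 224, 3))

# At run time each batch does get a concrete height, but it can differ
# between calls when training=True, which is why Flatten + Dense cannot
# rely on a fixed flattened size.
images = tf.random.uniform((4, 224, 224, 3))
out = layer(images, training=True)
```

This is exactly the situation the error message describes: the Dense layer downstream of Flatten sees a last dimension of None.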
So instead of a Flatten layer you could, for example, use a GlobalMaxPool2D layer, which does not need to know the other dimensions beforehand:

```python
import tensorflow as tf

data_augmentation_layer = tf.keras.Sequential([
    tf.keras.layers.RandomFlip('horizontal', input_shape=(224, 224, 3)),
    tf.keras.layers.RandomRotation(0.2),
    tf.keras.layers.RandomZoom(0.2),
    tf.keras.layers.RandomWidth(0.2),
    tf.keras.layers.RandomHeight(0.2),
    tf.keras.layers.RandomContrast(0.2)
], name='data_augmentation')

model = tf.keras.Sequential([
    data_augmentation_layer,
    tf.keras.layers.Conv2D(filters=32,
                           kernel_size=1,
                           strides=1),
    tf.keras.layers.Activation(activation='relu'),
    tf.keras.layers.MaxPool2D(),
    tf.keras.layers.Conv2D(filters=32,
                           kernel_size=1,
                           strides=1),
    tf.keras.layers.Activation(activation='relu'),
    tf.keras.layers.GlobalMaxPool2D(),
    tf.keras.layers.Dense(1, activation='sigmoid')
])

print(model.summary())
```
```
Model: "sequential_4"
_________________________________________________________________
 Layer (type)                Output Shape              Param #
=================================================================
 data_augmentation (Sequenti (None, None, None, 3)     0
 al)

 conv2d_8 (Conv2D)           (None, None, None, 32)    128

 activation_8 (Activation)   (None, None, None, 32)    0

 max_pooling2d_6 (MaxPooling (None, None, None, 32)    0
 2D)

 conv2d_9 (Conv2D)           (None, None, None, 32)    1056

 activation_9 (Activation)   (None, None, None, 32)    0

 global_max_pooling2d_1 (Glo (None, 32)                0
 balMaxPooling2D)

 dense_4 (Dense)             (None, 1)                 33

=================================================================
Total params: 1,217
Trainable params: 1,217
Non-trainable params: 0
_________________________________________________________________
None
```
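As a quick sanity check, a reduced sketch of the same idea (with made-up random data, not the asker's dataset) builds and trains end to end, because GlobalMaxPool2D collapses the None spatial dimensions into a fixed-size vector before the Dense layer:

```python
import numpy as np
import tensorflow as tf

# Minimal sketch with dummy data: the None spatial dims introduced by
# RandomHeight are collapsed by GlobalMaxPool2D, so the Dense layer
# receives a fixed last dimension (32) and training proceeds normally.
model = tf.keras.Sequential([
    tf.keras.layers.RandomFlip('horizontal', input_shape=(224, 224, 3)),
    tf.keras.layers.RandomHeight(0.2),
    tf.keras.layers.Conv2D(filters=32, kernel_size=1, strides=1),
    tf.keras.layers.Activation('relu'),
    tf.keras.layers.GlobalMaxPool2D(),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')

x = np.random.rand(8, 224, 224, 3).astype('float32')
y = np.random.randint(0, 2, size=(8, 1)).astype('float32')
history = model.fit(x, y, epochs=1, verbose=0)
```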