'NoneType' object is not subscriptable : tf.keras.layers.concatenate : A Concatenate layer should be called on a list of at least 1 input
I am trying to implement U-Net with TensorFlow 2.7 and Python 3.8. I am following a guide, but I run into an error while building the architecture.
The problem occurs when concatenating the inputs with the Conv2DTranspose layer.
I have also looked at various U-Net implementations, but they are all written in different styles and for different versions of TensorFlow.
Please let me know a possible solution, since this code looks clean and I don't want to switch to a different, more complex version.
import os
from skimage.transform import resize
from skimage.io import imsave
import numpy as np
import tensorflow as tf
image_height, image_width = 96, 96
smoothness = 1.0
work_dir = ''
def convolution_layer(filters, kernel=(3, 3), activation='relu', input_shape=None):
    if input_shape is None:
        return tf.keras.layers.Conv2D(
            filters=filters,
            kernel_size=kernel,
            activation=activation,
            padding='same')
    else:
        return tf.keras.layers.Conv2D(
            filters=filters,
            kernel_size=kernel,
            activation=activation,
            input_shape=input_shape,
            padding='same')

def concatenated_de_convolution_layer(filters):
    return tf.keras.layers.concatenate([
        tf.keras.layers.Conv2DTranspose(
            filters=filters,
            kernel_size=(2, 2),
            strides=(2, 2),
            padding='same'
        )],
        axis=3
    )

def pooling_layer():
    return tf.keras.layers.MaxPooling2D(pool_size=(2, 2))
unet = tf.keras.models.Sequential()
inputs = tf.keras.layers.Input((image_height, image_width, 1))
input_shape = (image_height, image_width, 1)
unet.add(convolution_layer(32, input_shape=input_shape))
unet.add(convolution_layer(32))
unet.add(pooling_layer())
unet.add(convolution_layer(64))
unet.add(convolution_layer(64))
unet.add(pooling_layer())
unet.add(convolution_layer(128))
unet.add(convolution_layer(128))
unet.add(pooling_layer())
unet.add(convolution_layer(256))
unet.add(convolution_layer(256))
unet.add(pooling_layer())
unet.add(convolution_layer(512))
unet.add(convolution_layer(512))
unet.add(concatenated_de_convolution_layer(256))  # <-- this call raises the error
unet.add(convolution_layer(256))
unet.add(convolution_layer(256))
unet.add(concatenated_de_convolution_layer(128))
unet.add(convolution_layer(128))
unet.add(convolution_layer(128))
unet.add(concatenated_de_convolution_layer(64))
unet.add(convolution_layer(64))
unet.add(convolution_layer(64))
unet.add(concatenated_de_convolution_layer(32))
unet.add(convolution_layer(32))
unet.add(convolution_layer(32))
unet.add(convolution_layer(1, kernel=(1, 1), activation='sigmoid'))
unet.compile(optimizer=tf.keras.optimizers.Adam(lr=1e-5),
             loss=dice_coefficient_loss,
             metrics=[dice_coefficient])
Solution 1:[1]
You cannot concatenate a single Conv2DTranspose layer, because there is nothing to concatenate it with. Try this:
def concatenated_de_convolution_layer(filters):
    return tf.keras.layers.Conv2DTranspose(
        filters=filters,
        kernel_size=(2, 2),
        strides=(2, 2),
        padding='same')
If you want to concatenate two layers, you have to feed the Concatenate layer two inputs, but you are making your life unnecessarily difficult by using the Sequential API. Generally, I would recommend using the functional Keras API for building a U-Net model. Here is a good starting point.
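For illustration, below is a minimal sketch of one encoder level, a bottleneck, and one decoder level written with the functional API; the skip connection gives concatenate two tensors to join, which the Sequential version cannot do. The filter counts and variable names are illustrative assumptions, not taken from the question or the guide.
import tensorflow as tf

inputs = tf.keras.layers.Input((96, 96, 1))

# Encoder level: two convolutions, then downsampling.
c1 = tf.keras.layers.Conv2D(32, (3, 3), activation='relu', padding='same')(inputs)
c1 = tf.keras.layers.Conv2D(32, (3, 3), activation='relu', padding='same')(c1)
p1 = tf.keras.layers.MaxPooling2D(pool_size=(2, 2))(c1)

# Bottleneck.
b = tf.keras.layers.Conv2D(64, (3, 3), activation='relu', padding='same')(p1)

# Decoder level: upsample, then concatenate with the matching encoder output.
u1 = tf.keras.layers.Conv2DTranspose(32, (2, 2), strides=(2, 2), padding='same')(b)
u1 = tf.keras.layers.concatenate([u1, c1], axis=3)  # two tensors, so there is something to join
c2 = tf.keras.layers.Conv2D(32, (3, 3), activation='relu', padding='same')(u1)
c2 = tf.keras.layers.Conv2D(32, (3, 3), activation='relu', padding='same')(c2)

outputs = tf.keras.layers.Conv2D(1, (1, 1), activation='sigmoid')(c2)
unet = tf.keras.Model(inputs=inputs, outputs=outputs)
A full U-Net simply repeats the encoder and decoder levels (four of each in the question's code) and keeps a reference to every encoder output so it can be concatenated with the corresponding upsampled tensor on the way back up.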
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | |


