
I have trained a binary classification model with a CNN; here is my code:

from keras.models import Sequential
from keras.layers import Convolution2D, MaxPooling2D, Activation, Dropout, Flatten, Dense

model = Sequential()
model.add(Convolution2D(nb_filters, kernel_size[0], kernel_size[1],
                        border_mode='valid',
                        input_shape=input_shape))
model.add(Activation('relu'))
model.add(Convolution2D(nb_filters, kernel_size[0], kernel_size[1]))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=pool_size))
# (16, 16, 32)
model.add(Convolution2D(nb_filters*2, kernel_size[0], kernel_size[1]))
model.add(Activation('relu'))
model.add(Convolution2D(nb_filters*2, kernel_size[0], kernel_size[1]))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=pool_size))
# (8, 8, 64) = (2048)
model.add(Flatten())
model.add(Dense(1024))
model.add(Activation('relu'))
model.add(Dropout(0.5))
model.add(Dense(2))  # define a binary classification problem
model.add(Activation('softmax'))

model.compile(loss='categorical_crossentropy',
              optimizer='adadelta',
              metrics=['accuracy'])
model.fit(x_train, y_train,
          batch_size=batch_size,
          nb_epoch=nb_epoch,
          verbose=1,
          validation_data=(x_test, y_test))

Now I want to get the output of each layer, just like in TensorFlow. How can I do that?

 Answers


You can easily get the output of any layer by using model.layers[index].output

For all layers use this:

import numpy as np
from keras import backend as K

inp = model.input                                           # input placeholder
outputs = [layer.output for layer in model.layers]          # all layer outputs
functors = [K.function([inp, K.learning_phase()], [out]) for out in outputs]  # evaluation functions

# Testing
test = np.random.random(input_shape)[np.newaxis, ...]
layer_outs = [func([test, 1.]) for func in functors]
print(layer_outs)

Note: to simulate Dropout, pass 1. as the learning_phase value to get training-time behavior; pass 0. for test-time behavior. A short example follows.
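For instance, a minimal sketch contrasting the two phases with the functors defined above:

train_outs = [func([test, 1.]) for func in functors]   # 1. = training phase, Dropout active
infer_outs = [func([test, 0.]) for func in functors]   # 0. = test phase, Dropout bypassed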

Edit: (based on comments)

K.function creates Theano/TensorFlow tensor functions, which are later used to get the output from the symbolic graph given the input.

Now K.learning_phase() is required as an input because many Keras layers, such as Dropout and BatchNormalization, depend on it to switch behavior between training and test time.

So if you remove the Dropout layer from your code, you can simply use:

import numpy as np
from keras import backend as K

inp = model.input                                           # input placeholder
outputs = [layer.output for layer in model.layers]          # all layer outputs
functors = [K.function([inp], [out]) for out in outputs]    # evaluation functions

# Testing
test = np.random.random(input_shape)[np.newaxis, ...]
layer_outs = [func([test]) for func in functors]
print(layer_outs)

Edit 2: More optimized

I just realized that the previous answer is not that well optimized: for each function evaluation, the data is transferred from CPU to GPU memory, and the tensor calculations for the lower layers are repeated over and over.

This is a much better way, as you don't need multiple functions; a single function gives you the list of all outputs:

import numpy as np
from keras import backend as K

inp = model.input                                           # input placeholder
outputs = [layer.output for layer in model.layers]          # all layer outputs
functor = K.function([inp, K.learning_phase()], outputs)    # single evaluation function

# Testing
test = np.random.random(input_shape)[np.newaxis, ...]
layer_outs = functor([test, 1.])
print(layer_outs)
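If you are on modern tf.keras (2.x, eager execution by default), K.learning_phase() no longer exists. A rough equivalent sketch, assuming model is a built tf.keras model and input_shape is defined as above, uses one Model that returns every layer's output and the training argument to control Dropout:

import numpy as np
import tensorflow as tf

# Sketch: a single Model whose outputs are all intermediate layer outputs.
extractor = tf.keras.Model(inputs=model.input,
                           outputs=[layer.output for layer in model.layers])

test = np.random.random(input_shape)[np.newaxis, ...]
layer_outs = extractor(test, training=False)  # training=True keeps Dropout active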
answered by TheTechnicalPaladin

When you saved your model using:

old_model.save('my_model.h5')

it will save the following:

  1. The architecture of the model, allowing the model to be re-created.
  2. The weights of the model.
  3. The training configuration of the model (loss, optimizer).
  4. The state of the optimizer, allowing training to resume exactly where you left off.

So then, when you load the model:

res50_model = load_model('my_model.h5')

you should get the same model back. You can verify this using:

res50_model.summary()
res50_model.get_weights()
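For a stricter check than comparing summaries by eye, a small sketch (assuming old_model is still in scope and numpy is available) compares the weights tensor by tensor:

import numpy as np

# Sketch: assert every weight tensor survived the save/load round trip.
for w_old, w_new in zip(old_model.get_weights(), res50_model.get_weights()):
    np.testing.assert_allclose(w_old, w_new)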

Now you can pop the input layer and add your own using:

res50_model.layers.pop(0)
res50_model.summary()

Then add a new input layer:

from keras.layers import Input
from keras.models import Model

newInput = Input(batch_shape=(None, 299, 299, 3))  # the new input layer; None = variable batch size
newOutputs = res50_model(newInput)
newModel = Model(newInput, newOutputs)

newModel.summary()
res50_model.summary()
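One caveat: in newer Keras/tf.keras releases, res50_model.layers.pop(0) only mutates the Python list of layers and does not rewire the underlying graph, so it is the call on the new tensor that actually does the work. A minimal hedged sketch, assuming the same 299x299x3 input:

from keras.layers import Input
from keras.models import Model

# The loaded model is used as a single callable layer; no pop() needed.
newInput = Input(shape=(299, 299, 3))  # batch dimension is added by Keras
newModel = Model(newInput, res50_model(newInput))
newModel.summary()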
answered by Ticksy

The problem is input_shape.

It should contain only 3 dimensions; internally, Keras adds the batch dimension, making it 4.

Since you probably used an input_shape with 4 dimensions (batch included), Keras is adding a 5th.

You should use input_shape=(32,32,1).
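For example, a minimal sketch (using the Keras 2 Conv2D name and an arbitrary filter count) showing the batch dimension being added for you:

from keras.models import Sequential
from keras.layers import Conv2D

model = Sequential()
model.add(Conv2D(16, (3, 3), input_shape=(32, 32, 1)))  # 3 dims, no batch
print(model.input_shape)  # (None, 32, 32, 1) -- Keras prepended the batch dim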

answered by TheCarver

You could re-order the levels of your factor and add the color adjustment:

dat %>% ggplot(aes(x = rt,
                   fill = factor(acc, levels = c(1, 0)))) +
  geom_density(aes(y = ..count.. * 0.03), alpha = 0.6) +
  scale_fill_manual(values = c("1" = "#00BFC4", "0" = "#F8766D"))
answered by T.J. Crowder

Your best bet may be to write a custom train loop via train_on_batch or fit. The former is only at a disadvantage if you need use_multiprocessing=True or callbacks, which is not the case here. Below is an implementation with train_on_batch; if you use fit instead (for multiprocessing, callbacks, etc.), make sure you feed only one batch at a time and provide no validation data (use model.evaluate instead), or the control flow breaks. A fit-based sketch of the inner step follows the loop. (A custom Callback is also a valid, but more involved, alternative.)


CUSTOM TRAIN LOOP
iters_per_epoch = len(train_it)  # a DirectoryIterator's len() is already the number of batches
num_epochs = 5
outs_store_freq = 20 # in iters
print_loss_freq = 20 # in iters

iter_num = 0
epoch_num = 0
model_outputs = []
loss_history  = []

while epoch_num < num_epochs:
    while iter_num < iters_per_epoch:
        x_train, y_train = next(train_it)
        loss_history += [model3.train_on_batch(x_train, y_train)]

        x_test, y_test = next(test_it)
        if iter_num % outs_store_freq == 0:
            model_outputs += [model3.predict(x_test)]
        if iter_num % print_loss_freq == 0:
            print("Iter {} loss: {}".format(iter_num, loss_history[-1]))

        iter_num += 1
    print("EPOCH {} FINISHED".format(epoch_num + 1))
    epoch_num += 1
    iter_num = 0 # reset counter
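As mentioned above, if you prefer fit, a hedged sketch of the equivalent inner loop body (exactly one batch per call, no validation_data) would be:

# Sketch: fit-based variant of the train_on_batch line in the loop above.
x_train, y_train = next(train_it)
hist = model3.fit(x_train, y_train, epochs=1, verbose=0)  # exactly one batch
loss_history += hist.history['loss']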


FULL CODE
from keras.models import Sequential
from keras.layers import Dense, Conv2D, GlobalAveragePooling2D
from keras.models import Model
from keras.optimizers import SGD
from keras.applications.vgg16 import VGG16
from keras.preprocessing.image import ImageDataGenerator
from keras import backend as K   # used by the bonus snippet below

model = VGG16(include_top=False, weights='imagenet')
model.summary()  # summary() prints directly and returns None

# add new layers on top of the VGG16 base
z = Conv2D(1, (3, 3), activation='relu')(model.output)
z = Conv2D(1,(1,1), activation='relu')(z)
z = GlobalAveragePooling2D()(z)
predictions3 = Dense(2, activation='softmax')(z)
model3 = Model(inputs=model.input, outputs=predictions3)

for layer in model3.layers[:20]:
    layer.trainable = False
for layer in model3.layers[20:]:
    layer.trainable = True

model3.compile(optimizer=SGD(lr=0.0001, momentum=0.9), 
               loss='categorical_crossentropy')
batch_size = 1
datagen = ImageDataGenerator()
train_it = datagen.flow_from_directory('DATA/C_Train/', 
                                        class_mode='categorical', 
                                        batch_size=batch_size)
test_it = datagen.flow_from_directory('DATA/C_Test/', 
                                      class_mode='categorical', 
                                      batch_size=batch_size)

[custom train loop here]


BONUS CODE: to get the outputs of any layer, use the function below:

def get_layer_outputs(model, layer_name, input_data, learning_phase=1):
    outputs = [layer.output for layer in model.layers if layer_name in layer.name]
    layers_fn = K.function([model.input, K.learning_phase()], outputs)
    return layers_fn([input_data, learning_phase])

outs = get_layer_outputs(model3, 'dense_1', x_test, 0)  # 0 == inference mode; dense_1 lives in model3
answered by tplaner