My validation accuracy curve lies above the training accuracy curve. Is this normal?

by Kaeldric   Last Updated October 13, 2018 01:19 AM

I'm trying to build a text classifier using a CNN with word embeddings in Keras and TensorFlow.

Graph from TensorBoard

Here is a snippet of the code that shows the model construction:

# Imports (standalone Keras; use the tensorflow.keras equivalents if that is what you have installed)
from keras.models import Model
from keras.layers import Input, Conv1D, MaxPooling1D, Dropout, Concatenate, Flatten, Dense
from keras.callbacks import ModelCheckpoint, TensorBoard, EarlyStopping

# Model (Convolutional NN)
inp = Input(shape=(maxlen,), dtype='int32')
embedding = embedding_layer(inp)

# Parallel convolutional branches, one per kernel size
stacks = []
for kernel_size in [4, 8, 16, 32, 50]:
    conv = Conv1D(64, kernel_size, padding='same', activation='relu', strides=1)(embedding)
    pool = MaxPooling1D(pool_size=3)(conv)
    drop = Dropout(0.7)(pool)
    stacks.append(drop)

merged = Concatenate()(stacks)
flatten = Flatten()(merged)
drop = Dropout(0.7)(flatten)
outp = Dense(len(int_category), activation='softmax')(drop)

TextCNN = Model(inputs=inp, outputs=outp)
TextCNN.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

TextCNN.summary()

# Train
textcnn_history = TextCNN.fit(x_train,
                              y_train,
                              batch_size=1024,
                              epochs=200,
                              validation_data=(x_val, y_val),
                              callbacks=
                              [ModelCheckpoint("../checkpoints/ck{epoch:02d}-{acc:.2f}.hdf5", monitor="acc",
                                               save_best_only=True),
                               TensorBoard(),
                               EarlyStopping(monitor='loss', min_delta=0.0001, patience=1)]
                              )

After some modifications I was able to reach the following results:

Train and evaluation accuracies

I already used dropout regularization after every convolutional block and before the final dense layer.
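For context, here is roughly how the dropout placement looks with the rate pulled out into a single variable (a trimmed repeat of the construction code above; the name dropout_rate is mine, not from the original script):

# Dropout rate factored out so both Dropout layers change together
dropout_rate = 0.5  # the value used before the change described below

stacks = []
for kernel_size in [4, 8, 16, 32, 50]:
    conv = Conv1D(64, kernel_size, padding='same', activation='relu', strides=1)(embedding)
    pool = MaxPooling1D(pool_size=3)(conv)
    stacks.append(Dropout(dropout_rate)(pool))   # dropout after each conv/pool block

merged = Concatenate()(stacks)
flatten = Flatten()(merged)
drop = Dropout(dropout_rate)(flatten)            # dropout before the final dense layer
outp = Dense(len(int_category), activation='softmax')(drop)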

Then I decided to increase the dropout rate (from 0.5 to 0.7) to raise the validation accuracy curve a little, but after some iterations this is what happened:

Train and evaluation accuracies after increasing the dropout rate

Why is the validation accuracy curve above the training one? Shouldn't it always be below it?
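In case it helps, here is a minimal sketch of how I can compare the two sets on the already-trained model (assuming the TextCNN, x_train/y_train and x_val/y_val from above). Note that model.evaluate() runs in inference mode, so dropout is disabled there, unlike the per-epoch training accuracy logged by fit():

# Evaluate the trained model on both sets; evaluate() disables dropout,
# whereas the training accuracy curve from fit() is computed with dropout active.
train_loss, train_acc = TextCNN.evaluate(x_train, y_train, batch_size=1024)
val_loss, val_acc = TextCNN.evaluate(x_val, y_val, batch_size=1024)
print("train acc (inference mode): %.4f  |  val acc: %.4f" % (train_acc, val_acc))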

Thank you.


