# How to add a function (get F1 score) to Keras metrics and record the F1 value after each epoch

## Background

I was building a binary classifier (0 or 1) Multi-Layer Perceptron model using Keras for the Kaggle Quora competition. Since the classes (0 and 1) are imbalanced, I used the F1 score as the evaluation metric.

## Objective

Integrate a user-defined function, such as one that computes the F1 score, into Keras metrics so that it is evaluated on both the training and validation data.

## What will you get?

You will get the training and validation F1 score after each epoch.

## Challenge

By default, the F1 score is not among Keras's built-in metrics, so we can't simply pass it to `metrics` when compiling the model and get results. Keras does provide other evaluation metrics such as accuracy and categorical accuracy, so to get the training and validation F1 score after each epoch, we need to put in a little extra work.

## Solution

**Step 1**: Import the libraries.

```python
from keras.callbacks import Callback, ModelCheckpoint
from keras.models import Sequential, load_model
from keras.layers import Dense, Dropout
from keras.wrappers.scikit_learn import KerasClassifier
import keras.backend as K
```

**Step 2**: Assume you have the training and validation data ready in DTM (Document-Term Matrix) form:

- Training data: `(X_train_tfidf, y_train)`
- Validation data: `(X_val_tfidf, y_val)`
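If you don't have these matrices yet, a minimal sketch of building them with scikit-learn's `TfidfVectorizer` might look like the following (the sample texts, labels, and `max_features` value are illustrative, not from the original article):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
import numpy as np

# Hypothetical raw text and 0/1 labels; in the Quora data these would be
# the question texts and their target labels.
train_texts = ["how do I learn python", "is the earth flat",
               "how to cook rice", "why is the sky blue"]
y_train = np.array([0, 1, 0, 0])
val_texts = ["how do planes fly", "is time travel possible"]
y_val = np.array([0, 1])

# Fit the vectorizer on training text only, then reuse it on validation
# text so both matrices share the same vocabulary (columns).
vectorizer = TfidfVectorizer(max_features=50000)
X_train_tfidf = vectorizer.fit_transform(train_texts)
X_val_tfidf = vectorizer.transform(val_texts)

print(X_train_tfidf.shape[0], X_val_tfidf.shape[0])  # 4 2
```

Fitting only on the training text avoids leaking validation vocabulary into the model.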

**Step 3**: Most important step of article — declaring function to get f1 score.

**def** get_f1(y_true, y_pred): *#taken from old keras source code*

true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))

possible_positives = K.sum(K.round(K.clip(y_true, 0, 1)))

predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))

precision = true_positives / (predicted_positives + K.epsilon())

recall = true_positives / (possible_positives + K.epsilon())

f1_val = 2*(precision*recall)/(precision+recall+K.epsilon())

**return** f1_val
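Note that Keras evaluates metric functions per batch and averages the results, so this only approximates the F1 score over the whole epoch (this batch-wise behaviour is the reason F1 was dropped from the Keras core metrics). The formula itself can be sanity-checked with plain NumPy, independent of Keras (a sketch; `f1_numpy` and the sample labels below are illustrative):

```python
import numpy as np

EPS = 1e-7  # stand-in for K.epsilon()

def f1_numpy(y_true, y_pred):
    # Round predicted probabilities to 0/1, then count true positives,
    # mirroring the K.round/K.clip logic in get_f1.
    y_true = np.clip(np.asarray(y_true, dtype=float), 0, 1)
    y_pred = np.round(np.clip(np.asarray(y_pred, dtype=float), 0, 1))
    tp = np.sum(y_true * y_pred)
    precision = tp / (np.sum(y_pred) + EPS)
    recall = tp / (np.sum(y_true) + EPS)
    return 2 * precision * recall / (precision + recall + EPS)

# 3 true positives, 1 false positive, 1 false negative:
# precision = 3/4, recall = 3/4, so F1 = 0.75
print(round(float(f1_numpy([1, 1, 1, 1, 0, 0],
                           [0.9, 0.8, 0.7, 0.2, 0.6, 0.1])), 3))  # 0.75
```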

I have my YouTube Channel (Analytics Diksha — https://bit.ly/2kDoJ1t) on which I talk about Data Science topics. Please subscribe !!

**Step 4**: Define the model structure and pass `get_f1` to `metrics` when compiling the model.

```python
## Define Model
input_shape = X_train_tfidf.shape[1]

def mlp_v2():
    mdl = Sequential()
    mdl.add(Dense(512, kernel_initializer='glorot_uniform',
                  activation='relu', input_shape=(input_shape,)))
    mdl.add(Dropout(0.5))
    mdl.add(Dense(128, kernel_initializer='glorot_uniform',
                  activation='relu'))
    mdl.add(Dropout(0.5))
    mdl.add(Dense(1, activation='sigmoid'))
    mdl.compile(loss='binary_crossentropy', optimizer='adadelta',
                metrics=[get_f1])
    mdl.summary()
    return mdl

model_path = '../models/mlp_v2.h5'
callbacks = [ModelCheckpoint(filepath=model_path, save_best_only=True)]
```

**Step 5**: Run the model and see the training and validation F1 score after each epoch.

```python
## Run Model
estimator = KerasClassifier(build_fn=mlp_v2, epochs=5, batch_size=128)
history = estimator.fit(X_train_tfidf.toarray(), y_train,
                        validation_data=(X_val_tfidf.toarray(), y_val),
                        callbacks=callbacks)
```
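With this setup, Keras prints `get_f1` and `val_get_f1` after every epoch (the keys follow the metric function's name), and the same values are stored in `history.history`. A sketch of reading them back; the dictionary below stands in for `history.history` from a hypothetical three-epoch run, with made-up numbers:

```python
# Illustrative stand-in for history.history after a 3-epoch run.
history_dict = {
    'get_f1': [0.41, 0.55, 0.62],
    'val_get_f1': [0.39, 0.52, 0.58],
}

# Print the per-epoch training and validation F1 side by side.
for epoch, (tr, va) in enumerate(zip(history_dict['get_f1'],
                                     history_dict['val_get_f1']), start=1):
    print(f"epoch {epoch}: train F1={tr:.2f}, val F1={va:.2f}")
```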

For more on Keras metrics, refer to the official Keras documentation.

## Conclusion

Hurray! You have learned how to add your own evaluation function to Keras metrics. Now you can try other functions as well, such as computing the AUC value.


Please clap if this article helped you, and share it with your friends as well.