Best Confusion Matrix Guide With Sklearn Python


Today is the last day you will be confused about the confusion matrix.

How many times have you read about the confusion matrix, only to forget the true positive, false negative, and other terms a while later? Even after implementing a confusion matrix with sklearn or tensorflow, we still get confused about each component of the matrix.

After reading this article, you will never forget the confusion matrix again.

So forget everything you have learned so far, and start fresh now.

The end goal of the machine learning journey is not just building the model. The real journey begins when we start measuring the performance of the model we built.

We have numerous ways to quantify the performance of a model. The confusion matrix is one of them; this performance metric is popularly used to quantify the performance of classification models, which fall under supervised learning algorithms.

Before we dive further, let me explain what you are about to learn in this article.

Why we need a confusion matrix

Let's understand why we need the confusion matrix as a performance metric for classification models.

Accuracy is a popular model evaluation method used for the majority of classification models in supervised learning. This makes us think about the question below.

If we already have accuracy as a measure of a classification model's performance, why do we need another measure to quantify it?

Let’s understand this with an example.

Suppose we are building a binary classification model on data with an imbalanced target class.

Let me give you an example of an imbalanced dataset. In a target-class-imbalanced dataset, the target classes are not properly balanced.

What is target class imbalance?

Let's say we have two expected classes for the target variable, which are:

  1. Positive 
  2. Negative

95% of the time we get the positive class and only 5% of the time we get the negative class. Here the positive class dominates the negative class; this kind of uneven distribution among the target classes is called class imbalance.

Examples of the imbalanced dataset

Below are some examples of imbalanced datasets:

  • Email spam detection, where most emails are not spam
  • Credit card fraud detection, where most transactions are genuine
  • Rare disease diagnosis, where most patients are healthy

In the above examples, the target class distribution is not equal. In general, target class imbalance means the target classes are not equally distributed; one class dominates the others.

Let's go back to our question: why do we need the confusion matrix when we have accuracy to measure the performance of a classification model?

In the above example, we have seen that the positive class appears in 95% of cases whereas the negative class appears only 5% of the time.

Without building any machine learning model, if we simply predict every target as positive, what do you think our model accuracy would be?

Yes, you are correct: our model accuracy is 95%.
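
To see this concretely, here is a minimal sketch (assuming 100 samples with the 95/5 split described above) of a "model" that blindly predicts positive for every sample:

    # Hypothetical imbalanced data: 95 positive and 5 negative samples
    actual = ["Positive"] * 95 + ["Negative"] * 5

    # A "model" that predicts positive for everything, without learning anything
    predicted = ["Positive"] * 100

    # Accuracy = correct predictions / total predictions
    accuracy = sum(a == p for a, p in zip(actual, predicted)) / len(actual)
    print(accuracy)  # 0.95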

But do you think this is the correct way of quantifying the performance of the model?

A big No!

So when we are dealing with imbalanced target class datasets, accuracy is not the best performance measure.

In such cases, we will use the confusion matrix to measure the efficiency of the classification model.

In the next section of this article, we will learn more about the confusion matrix representation.

What is a confusion matrix?

To understand the confusion matrix at a much deeper level, consider the example dataset below.

S no. | Actual Target | Predicted Target
----- | ------------- | ----------------
1     | Positive      | Positive
2     | Positive      | Negative
3     | Positive      | Positive
4     | Positive      | Negative
5     | Positive      | Positive
6     | Negative      | Negative
7     | Negative      | Positive
8     | Positive      | Positive
9     | Positive      | Positive
10    | Positive      | Positive

The above table contains the actual target class and the predicted class information.

If we calculate the accuracy on this data, it will be 70%, as the predicted target values match the actual targets in 7 out of 10 cases.
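
As a quick check, we can recreate the table above as two Python lists and compute the accuracy ourselves:

    # Actual and predicted targets from the example table above
    actual    = ["Positive", "Positive", "Positive", "Positive", "Positive",
                 "Negative", "Negative", "Positive", "Positive", "Positive"]
    predicted = ["Positive", "Negative", "Positive", "Negative", "Positive",
                 "Negative", "Positive", "Positive", "Positive", "Positive"]

    # Accuracy = number of matching predictions / total predictions
    accuracy = sum(a == p for a, p in zip(actual, predicted)) / len(actual)
    print(accuracy)  # 0.7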

Accuracy is not able to answer the questions below.

  • How many actual positive targets are predicted as positive?
  • How many actual positive targets are predicted as negative?
  • How many actual negative targets are predicted as positive?
  • How many actual negative targets are predicted as negative?

We can answer all these questions with a confusion matrix; below is a pictorial representation that answers them all.

Confusion matrix representation

A confusion matrix is a matrix representation showing how well the trained model predicts each target class, in terms of counts. The confusion matrix is mainly used for classification algorithms, which fall under supervised learning.

Below are the descriptions of the terms used in the confusion matrix:

  • True positive: Target is positive and the model predicted it as positive
  • False negative: Target is positive and the model predicted it as negative
  • False positive: Target is negative and the model predicted it as positive
  • True negative: Target is negative and the model predicted it as negative

Using the table of positive and negative targets above, we will populate the matrix, which gives a much clearer understanding of how the confusion matrix is constructed.

Confusion matrix representation for a binary classification problem

Confusion matrix for binary classification

The above image represents the confusion matrix for a binary classification problem; each cell value of the matrix is calculated from the example dataset we showed before.

  • TP: Out of 8 actual positive cases, the model predicted positive in 6 cases.
  • FN: (8 - 6), the remaining 2 positive cases were predicted as negative, so they fall into the false negative cell.
  • FP: Out of the 2 actual negative cases, 1 was predicted as positive.
  • TN: Out of 2 negative cases, the model predicted 1 negative case correctly.
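
We can verify these counts with a few lines of Python, reusing the actual and predicted lists from the accuracy sketch above:

    # Count each confusion matrix cell from the example data
    tp = sum(a == "Positive" and p == "Positive" for a, p in zip(actual, predicted))
    fn = sum(a == "Positive" and p == "Negative" for a, p in zip(actual, predicted))
    fp = sum(a == "Negative" and p == "Positive" for a, p in zip(actual, predicted))
    tn = sum(a == "Negative" and p == "Negative" for a, p in zip(actual, predicted))
    print(tp, fn, fp, tn)  # 6 2 1 1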

So these cell values of the confusion matrix address the questions we raised above.

By looking at the matrix representation, we can understand where the model is accurate and clearly see where it is not able to predict properly. This helps in tuning the right model parameters to reduce the false positives and false negatives.

By now we have a clear understanding of each component of the confusion matrix, but TP, TN, FP, and FN are still hard to remember; we know the concepts, but these terms are a bit confusing.

So let's understand how we can remember these terms forever.

True positives and negatives graph

In the above image we split each term into two characters. The second character represents what the model is predicting: in our case, whether the model predicts the positive class or the negative class. The first character represents whether the model's prediction is correct or not.

Let's understand this with an example.

Take TN, for example: the second character, N, means the model predicted the negative class, and the first character, T, means the model's prediction was correct.

Hope this gives a clear picture of the individual components of the matrix.

What is the ideal model?

The ideal machine learning model is one that always predicts the correct target values. If we use accuracy as the measure to quantify the performance of the model, the ideal model would get 100% accuracy.

In the same way, for a model to be ideal in terms of the confusion matrix, it should have zero false positive and zero false negative cases; these are called type 1 and type 2 errors.

What are Type 1 and Type 2 errors?

Below are the two error types we can represent with the confusion matrix:

  • Type 1 (False positive)
  • Type 2 (False negative)
Type 1 error
Type 2 Error

The above images clearly explain the difference between Type 1 and Type 2 errors. Below are the key differences between them.

Difference between Type 1 and Type 2 errors

Type 1 Error

  • False positive from confusion matrix.
  • Predicting negative as positive.

Type 2 Error

  • False negative from confusion matrix.
  • Predicting positive as negative.

By now we clearly understand how the confusion matrix is built, and we are aware of its components.

Now we will learn how to implement the confusion matrix in different ways. Before that, below is the full representation of what we have learned, in one picture.

Confusion matrix full representation

Implementing the confusion matrix in Python

We are going to implement the confusion matrix in two different ways.

  1. Confusion matrix with Sklearn
  2. Confusion matrix with Tensorflow

Confusion matrix implementation with sklearn

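Below is a minimal sketch using sklearn's confusion_matrix function on the ten-row example dataset from earlier:

    from sklearn.metrics import confusion_matrix

    # Actual and predicted targets from the example table
    actual    = ["Positive", "Positive", "Positive", "Positive", "Positive",
                 "Negative", "Negative", "Positive", "Positive", "Positive"]
    predicted = ["Positive", "Negative", "Positive", "Negative", "Positive",
                 "Negative", "Positive", "Positive", "Positive", "Positive"]

    # By default the labels are ordered alphabetically, so "Negative" comes first
    print(confusion_matrix(actual, predicted))
    # [[1 1]
    #  [2 6]]

    # Pass labels explicitly to put the positive class first, as in this article
    print(confusion_matrix(actual, predicted, labels=["Positive", "Negative"]))
    # [[6 2]
    #  [1 1]]

For a binary problem, you can also unpack the four counts directly with confusion_matrix(actual, predicted).ravel(), which returns them in the order tn, fp, fn, tp.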

The scikit-learn confusion matrix output can look a bit different from the representation above: scikit-learn puts the actual target classes on the rows and the predicted classes on the columns, and by default it orders the labels alphabetically, so the negative class appears first.

Confusion matrix implementation with Tensorflow

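Here is a minimal sketch using TensorFlow's built-in tf.math.confusion_matrix, assuming the same example data encoded as integers (1 = positive, 0 = negative):

    import tensorflow as tf

    # Same example data encoded as integers: 1 = Positive, 0 = Negative
    actual    = [1, 1, 1, 1, 1, 0, 0, 1, 1, 1]
    predicted = [1, 0, 1, 0, 1, 0, 1, 1, 1, 1]

    # Rows are the actual classes, columns are the predicted classes,
    # ordered by label value (0 first, then 1)
    cm = tf.math.confusion_matrix(labels=actual, predictions=predicted)
    print(cm.numpy())
    # [[1 1]
    #  [2 6]]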

How to plot the confusion matrix

Using the code below, we can easily plot the confusion matrix; we use a seaborn heatmap to visualize the confusion matrix in a more representative way.

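Here is a minimal sketch of such a plot, assuming seaborn and matplotlib are installed; it reuses the small example dataset, but in practice you would pass in your model's predictions on the test set:

    import seaborn as sns
    import matplotlib.pyplot as plt
    from sklearn.metrics import confusion_matrix

    actual    = ["Positive", "Positive", "Positive", "Positive", "Positive",
                 "Negative", "Negative", "Positive", "Positive", "Positive"]
    predicted = ["Positive", "Negative", "Positive", "Negative", "Positive",
                 "Negative", "Positive", "Positive", "Positive", "Positive"]
    cm = confusion_matrix(actual, predicted, labels=["Positive", "Negative"])

    # Draw the confusion matrix as an annotated heatmap
    sns.heatmap(cm, annot=True, fmt="d", cmap="Blues",
                xticklabels=["Positive", "Negative"],
                yticklabels=["Positive", "Negative"])
    plt.xlabel("Predicted target")
    plt.ylabel("Actual target")
    plt.title("Confusion matrix")
    plt.show()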

If we run the above code we will get the kind of graph shown below; this particular graph is the confusion matrix created for an email spam classification model.

Confusion matrix on the test dataset

By now we know the different components of the confusion matrix; using these components, we can derive multiple model performance metrics to quantify the performance of the trained model. We will cover those in another article.

Below are the different measures we can calculate using the confusion matrix.

  • Accuracy 
  • Misclassification rate
  • True positive rate or Recall or Sensitivity
  • False positive rate
  • True negative rate or Specificity
  • Precision
  • F1 score

We will learn about these measures in the upcoming article.

Complete Code

Below is the code for implementing the confusion matrix in sklearn and tensorflow, along with the visualization code.

You can also clone this code from our GitHub.
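
Below is a consolidated sketch that puts the sklearn, TensorFlow, and seaborn pieces together. It uses the small example dataset from this article, so in a real project you would swap in your own model's actual and predicted labels:

    # Confusion matrix with sklearn, TensorFlow, and a seaborn heatmap
    import tensorflow as tf
    import seaborn as sns
    import matplotlib.pyplot as plt
    from sklearn.metrics import confusion_matrix

    # Example data from this article: 8 actual positives, 2 actual negatives
    actual    = ["Positive", "Positive", "Positive", "Positive", "Positive",
                 "Negative", "Negative", "Positive", "Positive", "Positive"]
    predicted = ["Positive", "Negative", "Positive", "Negative", "Positive",
                 "Negative", "Positive", "Positive", "Positive", "Positive"]

    # 1. sklearn confusion matrix (positive class first, as in the article)
    cm = confusion_matrix(actual, predicted, labels=["Positive", "Negative"])
    print("sklearn confusion matrix:\n", cm)

    # 2. TensorFlow confusion matrix (expects integer labels: 1 = Positive, 0 = Negative)
    actual_int = [1 if a == "Positive" else 0 for a in actual]
    predicted_int = [1 if p == "Positive" else 0 for p in predicted]
    cm_tf = tf.math.confusion_matrix(labels=actual_int, predictions=predicted_int)
    print("TensorFlow confusion matrix:\n", cm_tf.numpy())

    # 3. Visualize the sklearn matrix as a seaborn heatmap
    sns.heatmap(cm, annot=True, fmt="d", cmap="Blues",
                xticklabels=["Positive", "Negative"],
                yticklabels=["Positive", "Negative"])
    plt.xlabel("Predicted target")
    plt.ylabel("Actual target")
    plt.title("Confusion matrix")
    plt.show()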

Conclusion

In this article we learned why we need the confusion matrix, the different components of the confusion matrix, how to implement it with sklearn and TensorFlow, and we also saw the code to visualize it.



I hope you like this post. If you have any questions, or want me to write an article on a specific topic, feel free to comment below.
