A Simple Explanation - By Varsha Saini

Normalization is a feature scaling technique that brings all features onto a comparable scale, which can improve model performance. Feature scaling is one of the important data preprocessing steps in machine learning.

Why is Feature Scaling Required?

A dataset can have features with very different ranges, which can cause features with large values to be given more importance than others simply because of their magnitude.

For example, in a house price prediction dataset, the area of the house takes values in a much higher range than the number of bedrooms. Performing feature scaling brings all the features into the same range.

Min-Max Scaling

It is a normalization technique that rescales each feature's values into the range 0 to 1. Scikit-Learn provides a transformer called MinMaxScaler for this purpose.


Xn = (X - Xminimum) / (Xmaximum - Xminimum)
  • Xn = normalized value
  • X = original (unscaled) value
  • Xmaximum = maximum value of the feature
  • Xminimum = minimum value of the feature
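As a minimal sketch of the formula above, the following applies MinMaxScaler to a small made-up house dataset (the area and bedroom values are illustrative, not from the source) and checks the result against a by-hand application of the same formula:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Hypothetical features: [area in sq ft, number of bedrooms]
X = np.array([[1500.0, 3.0],
              [2400.0, 4.0],
              [900.0, 2.0]])

# Scikit-Learn's MinMaxScaler rescales each column to [0, 1]
scaler = MinMaxScaler()
X_scaled = scaler.fit_transform(X)

# The same result via the formula Xn = (X - Xmin) / (Xmax - Xmin),
# computed per feature (i.e., per column)
X_manual = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

print(X_scaled)
print(np.allclose(X_scaled, X_manual))  # True: both match the formula
```

After scaling, the area column and the bedroom column both lie in [0, 1], so neither dominates the other purely because of its original magnitude.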