Linear Regression in Machine Learning | Introduction to Machine Learning

Machine Learning

Machine learning is an application of artificial intelligence that gives machines the ability to learn and improve from experience without being explicitly programmed. Using a large amount of data, a model is trained to learn patterns, which it then uses to make predictions on unseen data.

Types of Machine Learning Algorithms

  1. Supervised Machine Learning Algorithm
  2. Unsupervised Machine Learning Algorithm
  3. Reinforcement Machine Learning Algorithm

1. Supervised Machine Learning Algorithm

Supervised ML algorithms are those which have a target variable. There are two types of supervised machine learning algorithms:

  1. Regression Models
  2. Classification Models

a. Regression Models

It is a type of supervised machine learning algorithm in which the target variable takes continuous values.
A few regression algorithms are Linear Regression and Decision Tree Regressor.
Example: House Price Prediction, where the target variable is the house price, a continuous value.

b. Classification Models

It is a type of supervised machine learning algorithm in which the target variable takes discrete values called classes.
A few classification algorithms are Logistic Regression, SVM, Decision Tree, Naive Bayes, and K-NN.

Classification can be further divided into two parts:
1. Binary Classification: classification problems that have two classes.
Example: predicting whether an email is spam or not; it has two classes.
2. Multiclass Classification: Classification algorithms that have more than two classes.
Example: In the Iris dataset, predicting if the flower category is virginica, setosa, or versicolor. It has three classes.
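As a minimal sketch of multiclass classification (assuming scikit-learn is installed; the Iris dataset ships with it), a Logistic Regression model can be trained to predict the three flower classes:

```python
# Multiclass classification sketch on the Iris dataset (3 classes).
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)                     # 4 features, 3 classes
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

clf = LogisticRegression(max_iter=1000)               # handles the 3 classes automatically
clf.fit(X_train, y_train)
print("Test accuracy:", clf.score(X_test, y_test))
```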

2. Unsupervised Machine Learning Algorithm

Unsupervised machine learning algorithms are those which do not have a target variable.
A few unsupervised algorithms are K-Means, K-Medoids, and PCA.
Example: Recommendation System
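As a small illustrative sketch (scikit-learn and NumPy; the two blobs of points are synthetic), K-Means groups unlabeled points into clusters without any target variable:

```python
# K-Means clustering sketch: only features, no target variable.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)),             # blob around (0, 0)
               rng.normal(5, 1, (50, 2))])            # blob around (5, 5)

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(kmeans.cluster_centers_)                         # learned cluster centres
print(kmeans.labels_[:10])                             # cluster assigned to each point
```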

3. Reinforcement Learning

It is a machine learning approach in which an agent learns how to act in a particular situation on its own. The agent is rewarded with positive points for correct actions and punished with negative points for incorrect actions, with the aim of maximizing the overall reward.
Example: Self-Driving Car

Types of Feature Variables in Dataset

1. Independent Variables: All the features that influence the output or target variable are called independent variables.
2. Dependent Variable: It is the variable that is determined by the independent variables.

Example: In the Iris dataset, sepal_length, sepal_width, petal_length, and petal_width are independent variables, and Species is the dependent variable.

Machine learning models are built to find the pattern of how these independent variables determine the target (dependent) variable.
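A minimal sketch of separating the two kinds of variables on the Iris data (pandas and scikit-learn assumed; the column is renamed to Species here for illustration, and the bundled feature names differ slightly from those above):

```python
# Split a DataFrame into independent variables (X) and the dependent variable (y).
import pandas as pd
from sklearn.datasets import load_iris

iris = load_iris(as_frame=True)
df = iris.frame.rename(columns={"target": "Species"})

X = df.drop(columns=["Species"])    # independent variables (the four measurements)
y = df["Species"]                   # dependent (target) variable
print(list(X.columns), "->", y.name)
```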

Datasets in Machine Learning

1. Training Dataset: The data used for training the model. It has both independent and dependent variables.
2. Testing Dataset: The data used for testing the model. Only the independent variables are fed to the model; the actual dependent variable (y-actual) is then compared with the predicted values (y-predicted) to measure the performance of the model.
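Continuing the sketch above (scikit-learn; the 80/20 split ratio is illustrative), the data is usually split into the two datasets before the model is fit:

```python
# Hold out a test set; the model is fit only on the training portion.
from sklearn.model_selection import train_test_split

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)
print(len(X_train), "training rows,", len(X_test), "testing rows")
```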

Linear Regression

It is a statistical method for finding the relationship between independent variables and dependent variables.

How do the Independent Variables Determine the Dependent Variable?

Linear Regression tries to create a linear equation between independent and dependent variables.

Types of Linear Regression

  1. Simple Linear Regression
  2. Multiple Linear Regression
  3. Polynomial Regression

1. Simple Linear Regression

In Simple Linear Regression, we try to find the relationship between one independent variable (x) and one dependent variable (y) by creating the best fit line between them.

Equation: y = mx + c
The above equation represents a straight line,
where
y = predicted value
x = independent variable
m = slope
c = intercept
m and c are the regression coefficients.
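A minimal sketch of fitting y = mx + c with scikit-learn (the data here is synthetic, generated with a true slope of 3 and intercept of 5); the learned m and c are exposed as coef_ and intercept_:

```python
# Fit y = m*x + c on noisy synthetic data and read back m and c.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=(100, 1))                     # one independent variable
y = 3.0 * x[:, 0] + 5.0 + rng.normal(0, 1, 100)           # true m = 3, c = 5, plus noise

model = LinearRegression().fit(x, y)
print("m (slope):    ", model.coef_[0])                   # close to 3
print("c (intercept):", model.intercept_)                 # close to 5
```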

 

Best Fit Line

If we can somehow find the values of m and c such that our prediction is close to the actual value, the above equation will be called a Best Fit Line.

How to find the Best Value for m and c?

1. Gradient Descent: an optimization algorithm that iteratively updates the coefficients so that the overall cost function decreases.
2. Normal Equation: a method that gives the coefficient values analytically using the equation below; a short sketch of both approaches follows at the end of this section.

Equation: θ = (XᵀX)⁻¹ Xᵀ y
where X is the matrix of independent variables (with a column of ones for the intercept), y is the vector of actual values, and θ is the vector of coefficients (the slopes and the intercept).

The coefficients can be found directly from the above equation. However, it involves computationally expensive operations such as the matrix inverse, so it is suitable for small datasets but not for large ones.
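As a minimal NumPy sketch (synthetic data; the learning rate and iteration count are illustrative), both approaches can be written in a few lines for simple linear regression, and the two sets of coefficients should roughly agree:

```python
# Finding m and c by gradient descent and by the normal equation.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 3.0 * x + 5.0 + rng.normal(0, 1, 200)                 # true m = 3, c = 5

# --- Gradient descent on the MSE cost ---
m, c, lr, n = 0.0, 0.0, 0.01, len(x)
for _ in range(5000):
    y_pred = m * x + c
    dm = (-2 / n) * np.sum(x * (y - y_pred))              # dJ/dm
    dc = (-2 / n) * np.sum(y - y_pred)                    # dJ/dc
    m, c = m - lr * dm, c - lr * dc
print("gradient descent:", m, c)

# --- Normal equation: theta = (X^T X)^-1 X^T y ---
X = np.column_stack([x, np.ones(n)])                      # column of ones for the intercept
theta = np.linalg.inv(X.T @ X) @ (X.T @ y)
print("normal equation :", theta[0], theta[1])
```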

Cost Function in Linear Regression

  • The cost function is used to evaluate the performance of the model by comparing the predicted values to the actual values.
  • A good model has predicted values close to the actual values; therefore, the cost function needs to be minimized.
  • We take the mean of the squared differences between the actual values and the predicted values (the Mean Squared Error).
Equation: J(m, c) = (1/n) Σ (y_actual - y_predicted)², where y_predicted = mx + c
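For example (NumPy; the values are illustrative), the cost for a handful of points can be computed directly:

```python
import numpy as np

y_actual    = np.array([3.0, 5.0, 7.0, 9.0])
y_predicted = np.array([2.8, 5.3, 6.9, 9.4])

cost = np.mean((y_actual - y_predicted) ** 2)   # mean squared error, to be minimized
print(cost)                                     # 0.075
```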

2. Multiple Linear Regression

In Multiple Linear Regression, we try to find the relationship between two or more independent variables (x1, x2, ..., xn) and one dependent variable (y) by creating the best fit line (a hyperplane in more than two dimensions) between them.

Equation: y = m1x1 + m2x2 + ... + mnxn + c
where
y = predicted value
x1, x2, ..., xn = independent variables
m1, m2, ..., mn = slopes
c = intercept

Simple Linear Regression helps to understand the best fit line in two-dimensional space, but real-world problems usually involve many features, so multiple linear regression is used. However, regression models can suffer from problems such as overfitting and underfitting.
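A minimal sketch with several features (scikit-learn; the bundled diabetes dataset is used only as an example of multiple independent variables):

```python
# Multiple linear regression: one slope per feature plus an intercept.
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)                     # 10 independent variables
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LinearRegression().fit(X_train, y_train)
print("slopes m1..mn:", model.coef_)
print("intercept c  :", model.intercept_)
print("R^2 on test  :", model.score(X_test, y_test))
```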

Overfitting

It is a situation in which the model performs well on training data, but very badly on testing data.

When does Overfitting happen?

  • The model learns the training data too well, including its noise.
  • The values of the coefficients are very high.

How to solve Overfitting?

  • Regularization: a method to handle overfitting by adding an extra penalty term to the error function.
    Lasso Regression (L1 regularization) and Ridge Regression (L2 regularization) are the common variants; a short sketch follows below.
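A minimal sketch (scikit-learn; the alpha values controlling the penalty strength are illustrative) showing how the regularized variants are used in place of plain LinearRegression:

```python
# Ridge (L2) and Lasso (L1) add a penalty on the size of the coefficients.
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Lasso, Ridge
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ridge = Ridge(alpha=1.0).fit(X_train, y_train)    # L2 penalty: shrinks the coefficients
lasso = Lasso(alpha=0.1).fit(X_train, y_train)    # L1 penalty: can drive some coefficients to 0

print("ridge coefficients:", ridge.coef_)
print("lasso coefficients:", lasso.coef_)
```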

Underfitting

It is a situation in which the model performs well neither on the training data nor on the testing data.

When Does Underfitting Occur?

  • Too few features are present.
  • Not enough patterns are learned during training.

How to Resolve Underfitting?

  • Polynomial Regression
  • Increasing Training data

Bias Variance Tradeoff

Bias and variance complement each other: decreasing one typically increases the other, and vice versa.

Loosely speaking, bias corresponds to the error on the training data and variance to the error on the testing data.

  • In Overfitting, the model performs very well on the training dataset hence bias is low and it performs badly on the testing dataset hence variance is high.
  • In Underfitting, the model neither performs well on training data nor on testing data hence both variance and bias are high.
  • For the Balanced model, error on both training and testing should be low hence both bias and variance should be low.

3. Polynomial Regression

Polynomial Regression is a special case of Multiple Linear Regression in which the features are powers of the original variable up to degree n.

Equation: y = m1x + m2x^2 + ... + mnx^n + c
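A minimal sketch (scikit-learn; degree 2 is illustrative): PolynomialFeatures expands x into its powers, and an ordinary linear regression is then fit on the expanded features:

```python
# Polynomial regression = linear regression on polynomial features of x.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=(100, 1))
y = 2.0 * x[:, 0] ** 2 - x[:, 0] + 1.0 + rng.normal(0, 0.5, 100)   # quadratic target

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(x, y)
print(model.predict([[1.5]]))   # roughly 2*1.5^2 - 1.5 + 1 = 4.0
```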

Multicollinearity in Regression

Multicollinearity occurs when two or more independent variables are correlated with each other. The correlation between two features (independent variables) can be measured with the Pearson correlation coefficient. Two variables are said to be perfectly correlated if the correlation between them is +1 or -1.
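As a small sketch (pandas; the column names and values are purely illustrative), the pairwise Pearson correlations between features can be inspected with DataFrame.corr():

```python
# Pearson correlation between independent variables; values near +1 or -1 signal multicollinearity.
import pandas as pd

df = pd.DataFrame({
    "size_sqft": [500, 750, 1000, 1250, 1500],
    "size_sqm":  [46.5, 69.7, 92.9, 116.1, 139.4],   # almost a rescaled copy of size_sqft
    "num_rooms": [1, 2, 2, 3, 4],
})
print(df.corr(method="pearson"))   # size_sqft and size_sqm correlate at ~1.0
```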

Why is Multicollinearity a Problem in Linear Regression?

Multicollinearity makes the estimated coefficients unstable and hard to interpret, and it reduces the statistical power of the regression model. Therefore, it is important to detect and handle it.

Performance Metrics

Why is Model Evaluation Necessary?

We need evaluation metrics to check how well the trained model performs on unseen data.

 Performance Metrics for Regression

  1. Mean Absolute Error (MAE)
  2. Mean Squared Error (MSE)
  3. Root Mean Squared Error (RMSE)
  4. Root Mean Squared Log Error (RMSLE)
  5. R squared
  6. Adjusted R Squared
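A minimal sketch (scikit-learn and NumPy; the arrays and the feature count k are illustrative) computing these metrics. RMSE is the square root of MSE, RMSLE applies the same idea to log-transformed values, and Adjusted R squared corrects R squared for the number of features:

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

y_actual    = np.array([3.0, 5.0, 7.0, 9.0])
y_predicted = np.array([2.8, 5.3, 6.9, 9.4])

mae   = mean_absolute_error(y_actual, y_predicted)
mse   = mean_squared_error(y_actual, y_predicted)
rmse  = np.sqrt(mse)
rmsle = np.sqrt(mean_squared_error(np.log1p(y_actual), np.log1p(y_predicted)))
r2    = r2_score(y_actual, y_predicted)

n, k = len(y_actual), 1                                  # k = number of independent variables
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)

print(mae, mse, rmse, rmsle, r2, adj_r2)
```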

End Notes

Thank you for reading this article. By the end of this article, you should be familiar with the Linear Regression algorithm in machine learning. You can read about other machine learning algorithms here.

I hope this article was informative. Feel free to ask any query or give your feedback in the comment box below.

Happy Learning!