Bias and variance trade off against each other: reducing one typically increases the other, and vice versa.
Informally, bias shows up as error on the training data, while variance shows up as the gap between training and testing error (how much the model's predictions change with the particular training sample).
- In overfitting, the model performs very well on the training dataset (low bias) but badly on the testing dataset (high variance).
- In underfitting, the model performs poorly on both the training and testing datasets: bias is high, and variance is low, because the model is too simple to react to the data.
- For a balanced model, error on both training and testing data is low, hence both bias and variance are low.
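The three regimes above can be sketched with a small experiment: fitting polynomials of increasing degree to noisy data and comparing training and testing error. This is a minimal illustration using NumPy's `polyfit`; the data, degrees, and noise level are arbitrary choices, not values from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a sine wave plus noise (hypothetical example)
x = np.linspace(0, 1, 40)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, size=x.shape)

# Alternate points into train and test splits
x_train, y_train = x[::2], y[::2]
x_test, y_test = x[1::2], y[1::2]

def train_test_mse(degree):
    # Fit a polynomial of the given degree on the training split only
    coeffs = np.polyfit(x_train, y_train, degree)
    err_train = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    err_test = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return err_train, err_test

# degree 1: underfit (both errors high)
# degree 5: balanced (both errors low)
# degree 12: overfit (train error near zero, test error higher)
for d in (1, 5, 12):
    tr, te = train_test_mse(d)
    print(f"degree {d:2d}: train MSE {tr:.3f}, test MSE {te:.3f}")
```

Running this shows the pattern described above: the underfit line misses both splits, the balanced fit has low error on both, and the overfit polynomial drives training error toward zero while the testing error grows again.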