A machine learning or deep learning model is configured by the parameters passed to it. These parameters affect the training process, and hence the overall performance, so choosing optimal values for them is essential to getting the best performance from the model.
A hyperparameter is a parameter that controls the learning process and whose value is selected before training starts. Hyperparameters are said to be external to the model because the model cannot change their values during learning/training.
In libraries like sklearn, every model has a default value for each hyperparameter, which can be overridden. If you don’t pass a value for a parameter when creating the model, the default value is used.
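This default-value convention can be sketched with a minimal estimator class. Note that `DecisionStump` below is a hypothetical example written to mimic the sklearn pattern, not a real sklearn class; sklearn’s own estimators expose their hyperparameters through a `get_params` method in the same way.

```python
# Hypothetical estimator mimicking the sklearn convention: every
# hyperparameter gets a default value in __init__, which the caller
# may override, and which the model never changes during training.
class DecisionStump:
    def __init__(self, max_depth=1, min_samples_split=2):
        self.max_depth = max_depth                  # default used if not passed
        self.min_samples_split = min_samples_split  # default used if not passed

    def get_params(self):
        # Hyperparameters are external, so they can be listed up front.
        return {"max_depth": self.max_depth,
                "min_samples_split": self.min_samples_split}

# Defaults apply when no value is passed...
print(DecisionStump().get_params())
# ...and are overridden when one is:
print(DecisionStump(max_depth=3).get_params())
```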
A few examples of hyperparameters are:
- Number of trees in a Random Forest
- Learning rate in gradient descent
- Choice of optimization algorithm (e.g. batch gradient descent, stochastic gradient descent)
- Choice of activation function in a neural network (e.g. Sigmoid, ReLU, Tanh)
- Train–test split ratio
- Number of clusters in clustering algorithms
- The K in the K-Nearest Neighbours algorithm
- Number of activation units in each layer
- Dropout probability in a neural network
- Number of hidden layers in a neural network
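The learning rate from the list above is a good concrete illustration of the parameter/hyperparameter distinction. In the sketch below (a toy example, not taken from any library), gradient descent minimizes `f(x) = (x - 3)^2`: the variable `x` is the parameter being learned, while `learning_rate` is fixed before training and never touched by the algorithm itself.

```python
# Gradient descent on f(x) = (x - 3)^2.
# x is the model parameter (updated each step); learning_rate is a
# hyperparameter (chosen externally, constant throughout training).
def gradient_descent(learning_rate, steps=100, x0=0.0):
    x = x0
    for _ in range(steps):
        grad = 2 * (x - 3)          # derivative of (x - 3)^2
        x -= learning_rate * grad   # update rule uses the fixed learning rate
    return x

# A well-chosen learning rate converges to the minimum at x = 3.
print(round(gradient_descent(0.1), 4))  # → 3.0
```

Rerunning with a learning rate that is too large (e.g. `1.1`) makes the updates overshoot and diverge, which is exactly why such values must be chosen carefully before training.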