Machine Learning Model Optimization with Hyper Parameter Tuning Approach

Authors

  • Md Riyad Hossain

  • Dr. Douglas Timmer

Keywords

machine learning, hyper parameter optimization, grid search, random search, BO-GP

Abstract

Hyper-parameter tuning is a key step in finding the optimal parameters of a machine learning model. Determining the best hyper-parameters takes a good deal of time, especially when the objective function is costly to evaluate or a large number of parameters must be tuned. In contrast to conventional machine learning algorithms, neural networks require more hyper-parameter tuning because they process many parameters together, and depending on the fine-tuning, model accuracy can vary between 25% and 90%. Among the most effective techniques for tuning hyper-parameters in deep learning methods are grid search, random search, and Bayesian optimization. Each method has advantages and disadvantages relative to the others. For example, grid search has proven to be an effective tuning technique, but it has drawbacks such as trying too many combinations and performing poorly when many parameters must be tuned at once. In our work, we determine, show, and analyze the efficiencies of different parameters and tuning methods on a real-world synthetic-polymer dataset.
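The grid-search drawback mentioned above (cost multiplying with every added hyper-parameter) and the random-search alternative can be sketched in a few lines. This is a minimal stand-alone illustration, not the paper's implementation: the objective `score` is a toy stand-in for validation accuracy, and the parameter names (`lr`, `depth`, `reg`) are hypothetical.

```python
import itertools
import random

# Toy stand-in for a validation-accuracy objective (assumption, not from
# the paper); its maximum is at lr=0.1, depth=6, reg=0.5.
def score(lr, depth, reg):
    return -((lr - 0.1) ** 2 + 0.01 * (depth - 6) ** 2 + (reg - 0.5) ** 2)

grid = {
    "lr":    [0.001, 0.01, 0.1, 1.0],
    "depth": [2, 4, 6, 8],
    "reg":   [0.0, 0.5, 1.0],
}

# Grid search: exhaustively evaluate every combination.
# Cost = 4 * 4 * 3 = 48 evaluations, and each added parameter
# multiplies that count — the combinatorial drawback noted above.
combos = list(itertools.product(*grid.values()))
best_grid = max(combos, key=lambda c: score(*c))

# Random search: evaluate only a fixed budget of sampled combinations,
# trading exhaustiveness for a controllable number of evaluations.
random.seed(0)
budget = 10
samples = [tuple(random.choice(vals) for vals in grid.values())
           for _ in range(budget)]
best_random = max(samples, key=lambda c: score(*c))

print(len(combos), best_grid)   # 48 evaluations vs. a budget of 10
```

Bayesian optimization (BO-GP in the keywords) goes further by using the scores already observed to choose the next combination to try, rather than sampling blindly.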

How to Cite

Machine Learning Model Optimization with Hyper Parameter Tuning Approach. (2021). Global Journal of Computer Science and Technology, 21(D2), 7-13. https://testing.computerresearch.org/index.php/computer/article/view/2059


Published

2021-05-15
