The Transactions of the Korean Institute of Electrical Engineers

References

1. T. Young, D. Hazarika, S. Poria, E. Cambria, 2018, Recent Trends in Deep Learning Based Natural Language Processing, IEEE Computational Intelligence Magazine, Vol. 13, pp. 55-75.
2. M. I. Jordan, T. M. Mitchell, Jul. 2015, Machine learning: Trends, perspectives, and prospects, Science, Vol. 349, No. 6245, pp. 255-260.
3. R. Elshawi, M. Maher, S. Sakr, 2019, Automated Machine Learning: State-of-The-Art and Open Challenges, arXiv:1906.02287 [cs, stat].
4. S. Abreu, 2019, Automated Architecture Design for Deep Neural Networks, arXiv preprint.
5. K. He, X. Zhang, S. Ren, J. Sun, 2016, Deep Residual Learning for Image Recognition, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 770-778.
6. N. Ma, X. Zhang, H.-T. Zheng, J. Sun, 2018, ShuffleNet V2: Practical Guidelines for Efficient CNN Architecture Design, CoRR, Vol. abs/1807.11164.
7. J. Bergstra, R. Bardenet, Y. Bengio, B. Kégl, 2011, Algorithms for Hyper-Parameter Optimization, Advances in Neural Information Processing Systems, Vol. 24, pp. 2546-2554.
8. J. Bergstra, Y. Bengio, 2012, Random Search for Hyper-Parameter Optimization, J. Mach. Learn. Res., Vol. 13, No. 10, pp. 281-305.
9. B. Shahriari, K. Swersky, Z. Wang, R. P. Adams, N. de Freitas, 2016, Taking the Human Out of the Loop: A Review of Bayesian Optimization, Proc. IEEE, Vol. 104, No. 1, pp. 148-175.
10. T. Akiba, S. Sano, T. Yanase, T. Ohta, M. Koyama, 2019, Optuna: A Next-generation Hyperparameter Optimization Framework, Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 2623-2631.
11. J. Bergstra, B. Komer, C. Eliasmith, D. Yamins, D. D. Cox, 2015, Hyperopt: a Python library for model selection and hyperparameter optimization, Comput. Sci. Discov., Vol. 8, No. 1, p. 014008.
12. 2022, KerasTuner.
13. P. Probst, A.-L. Boulesteix, B. Bischl, 2019, Tunability: Importance of Hyperparameters of Machine Learning Algorithms, J. Mach. Learn. Res., Vol. 20, No. 53, pp. 1-32.
14. M. Claesen, B. De Moor, Apr. 6, 2015, Hyperparameter Search in Machine Learning, arXiv preprint.
15. H. J. P. Weerts, A. C. Mueller, J. Vanschoren, Jul. 15, 2020, Importance of Tuning Hyperparameters of Machine Learning Algorithms, arXiv preprint.
16. V. Nair, G. E. Hinton, 2010, Rectified linear units improve restricted Boltzmann machines, in Proceedings of the 27th International Conference on Machine Learning, Madison, WI, USA, pp. 807-814.
17. J. Brownlee, Jan. 22, 2019, How to Configure the Learning Rate When Training Deep Learning Neural Networks, Machine Learning Mastery.
18. Y. Bengio, 2012, Practical Recommendations for Gradient-Based Training of Deep Architectures, in Neural Networks: Tricks of the Trade, 2nd ed., G. Montavon, G. B. Orr, and K.-R. Müller, Eds., Berlin, Heidelberg: Springer, pp. 437-478.
19. S. Agrawal, 2021, Hyperparameters in Deep Learning, Medium.
20. Neural Networks for Machine Learning (University of Toronto), Coursera.
21. D. P. Kingma, J. Ba, 2015, Adam: A Method for Stochastic Optimization, in 3rd International Conference on Learning Representations (ICLR), San Diego, CA, USA, May 7-9, 2015, Conference Track Proceedings.
22. J. Duchi, E. Hazan, Y. Singer, 2011, Adaptive Subgradient Methods for Online Learning and Stochastic Optimization, Journal of Machine Learning Research, Vol. 12, No. 7.
23. P. Liashchynskyi, P. Liashchynskyi, 2019, Grid search, random search, genetic algorithm: a big comparison for NAS, arXiv preprint arXiv:1912.06059.
24. M. A. J. Idrissi, H. Ramchoun, Y. Ghanou, M. Ettaouil, 2016, Genetic algorithm for neural network architecture optimization, in 2016 3rd International Conference on Logistics Operations Management (GOL), pp. 1-4.
25. J. Bergstra, R. Bardenet, Y. Bengio, B. Kégl, 2011, Algorithms for Hyper-Parameter Optimization, in Advances in Neural Information Processing Systems, Vol. 24.
26. R. Joseph, 2018, Grid Search for model tuning, Medium.
27. M.-A. Zöller, M. F. Huber, 2021, Benchmark and Survey of Automated Machine Learning Frameworks, arXiv:1904.12054 [cs, stat].
28. A. Klein, S. Falkner, S. Bartels, P. Hennig, F. Hutter, 2017, Fast Bayesian Optimization of Machine Learning Hyperparameters on Large Datasets, in Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, pp. 528-536.
29. M. Seeger, 2004, Gaussian processes for machine learning, Int. J. Neural Syst., Vol. 14, No. 2, pp. 69-106.
30. F. Hutter, H. H. Hoos, K. Leyton-Brown, 2011, Sequential Model-Based Optimization for General Algorithm Configuration, in Learning and Intelligent Optimization, pp. 507-523.
31. D. Maclaurin, D. Duvenaud, R. Adams, 2015, Gradient-based Hyperparameter Optimization through Reversible Learning, in Proceedings of the 32nd International Conference on Machine Learning, pp. 2113-2122.
32. A. S. Wicaksono, A. A. Supianto, 2018, Hyper Parameter Optimization using Genetic Algorithm on Machine Learning Methods for Online News Popularity Prediction, Int. J. Adv. Comput. Sci. Appl. (IJACSA), Vol. 9, No. 12.
33. L. Li, K. Jamieson, G. DeSalvo, A. Rostamizadeh, A. Talwalkar, 2017, Hyperband: Bandit-Based Configuration Evaluation for Hyperparameter Optimization, International Conference on Learning Representations.
34. 2022, BayesianOptimization: A Python implementation of global optimization with Gaussian processes, GitHub, https://github.com/fmfn/BayesianOptimization.
35. Mar. 18, 2022, Optuna: A hyperparameter optimization framework, optuna.
36. Jan. 12, 2023, Hyperopt: Distributed Hyperparameter Optimization, hyperopt.
37. K. Team, Jan. 13, 2023, Keras documentation: KerasTuner.
38. L. Li, K. Jamieson, G. DeSalvo, A. Rostamizadeh, A. Talwalkar, 2017, Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization, pp. 6765-6816.
39. Md. H. A. Banna, 2021, Attention-Based Bi-Directional Long-Short Term Memory Network for Earthquake Prediction, IEEE Access, Vol. 9, pp. 56589-56603.
40. M. Koklu, I. A. Ozkan, 2020, Multiclass classification of dry beans using computer vision and machine learning techniques, Comput. Electron. Agric., Vol. 174, p. 105507.
41. İ. Çinar, M. Koklu, Ş. Taşdemir, Dec. 2020, Classification of Raisin Grains Using Machine Vision and Artificial Intelligence Methods, Gazi Mühendis. Bilim. Derg., Vol. 6, No. 3.
42. L. Candillier, V. Lemaire, Aug. 2013, Active learning in the real-world design and analysis of the Nomao challenge, in The 2013 International Joint Conference on Neural Networks (IJCNN), pp. 1-8.
43. A. Krizhevsky, I. Sutskever, G. E. Hinton, 2012, ImageNet Classification with Deep Convolutional Neural Networks, in Advances in Neural Information Processing Systems, Vol. 25.
44. P. Srinivas, R. Katarya, Mar. 2022, hyOPTXg: OPTUNA hyper-parameter optimization framework for predicting cardiovascular disease using XGBoost, Biomed. Signal Process. Control, Vol. 73, p. 103456.
45. J.-P. Lai, Y.-L. Lin, H.-C. Lin, C.-Y. Shih, Y.-P. Wang, P.-F. Pai, Feb. 2023, Tree-Based Machine Learning Models with Optuna in Predicting Impedance Values for Circuit Analysis, Micromachines, Vol. 14, No. 2.
46. J. Joy, M. P. Selvan, 2022, A comprehensive study on the performance of different Multi-class Classification Algorithms and Hyperparameter Tuning Techniques using Optuna, in 2022 International Conference on Computing, Communication, Security and Intelligent Systems (IC3SIS), pp. 1-5.
47. Y. Nishitsuji, J. Nasseri, Mar. 2022, LSTM with forget gates optimized by Optuna for lithofacies prediction.
48. I. Ekundayo, 2020, OPTUNA Optimization Based CNN-LSTM Model for Predicting Electric Power Consumption, master's thesis, National College of Ireland, Dublin.
49. S. Putatunda, K. Rama, 2018, A Comparative Analysis of Hyperopt as Against Other Approaches for Hyper-Parameter Optimization of XGBoost, in Proceedings of the 2018 International Conference on Signal Processing and Machine Learning, Shanghai, China, pp. 6-10.
50. R. J. Borgli, H. Kvale Stensland, M. A. Riegler, P. Halvorsen, 2019, Automatic Hyperparameter Optimization for Transfer Learning on Medical Image Datasets Using Bayesian Optimization, in 2019 13th International Symposium on Medical Information and Communication Technology (ISMICT), pp. 1-6.
51. J. Zhang, Q. Wang, W. Shen, Dec. 2022, Hyper-parameter optimization of multiple machine learning algorithms for molecular property prediction using hyperopt library, Chin. J. Chem. Eng., Vol. 52.
52. N. Schwemmle, T.-Y. Ma, May 2021, Hyperparameter Optimization for Neural Network based Taxi Demand Prediction, presented at the BIVEC-GIBET Benelux Interuniversity Association of Transport Researchers: Transport Research Days 2021.
53. B. Abdellaoui, A. Moumen, Y. Idrissi, A. Remaida, 2021, Training the Fer2013 Dataset with Keras Tuner, p. 412.
54. A. Jafar, M. Lee, 2021, High-speed hyperparameter optimization for deep ResNet models in image recognition, Cluster Computing, pp. 1-9.
55. A. Jafar, L. Myungho, Aug. 2020, Hyperparameter Optimization for Deep Residual Learning in Image Classification, in 2020 IEEE International Conference on Autonomic Computing and Self-Organizing Systems Companion (ACSOS-C), pp. 24-29.