Data classification arises in many practical applications, such as the classification of words in natural language processing, of meteorological conditions, and of degrees of environmental pollution. The artificial neural network is a basic method for data classification, and a well-chosen optimization algorithm yields better results for the loss function of such a network, so the study and improvement of these optimization algorithms has been a focus of the field. Given the variety of optimizers used in building neural networks, this paper proposes an improved NAdam algorithm (RNAdam) after discussing and comparing several algorithms related to the Adam algorithm. The proposed algorithm combines the advantages of the RAdam algorithm while retaining the convergence behavior of the NAdam algorithm. A classification experiment is carried out on a data set of 300 sample points generated by the make_moons function. The experimental results show that, when the data are classified by a three-layer neural network, the RNAdam algorithm outperforms the SGDM, Adam, and NAdam algorithms in terms of the loss and accuracy measured between the network output and the true labels. The classification effect can therefore be expected to improve when this algorithm is applied to neural networks in practical data classification problems.
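The experimental setup described in the abstract can be sketched as follows. This is a minimal illustration rather than the authors' code: it assumes scikit-learn's make_moons data generator, a small three-layer fully connected network, and PyTorch's built-in SGD-with-momentum, Adam, and NAdam optimizers; the noise level, hidden-layer sizes, learning rates, and epoch count are illustrative assumptions that the abstract does not specify, and the proposed RNAdam optimizer itself would have to be implemented separately (see the second sketch below).

```python
# Sketch of the abstract's experiment: 300 make_moons points classified by a
# three-layer network, comparing built-in optimizers on final loss/accuracy.
import torch
import torch.nn as nn
from sklearn.datasets import make_moons

X, y = make_moons(n_samples=300, noise=0.2, random_state=0)
X = torch.tensor(X, dtype=torch.float32)
y = torch.tensor(y, dtype=torch.long)

def make_model():
    # Three-layer fully connected classifier for the 2-D moons data
    return nn.Sequential(
        nn.Linear(2, 16), nn.ReLU(),
        nn.Linear(16, 16), nn.ReLU(),
        nn.Linear(16, 2),
    )

optimizers = {
    "SGDM":  lambda p: torch.optim.SGD(p, lr=0.05, momentum=0.9),
    "Adam":  lambda p: torch.optim.Adam(p, lr=0.01),
    "NAdam": lambda p: torch.optim.NAdam(p, lr=0.01),
}

loss_fn = nn.CrossEntropyLoss()
for name, make_opt in optimizers.items():
    torch.manual_seed(0)
    model = make_model()
    opt = make_opt(model.parameters())
    for epoch in range(200):
        opt.zero_grad()
        logits = model(X)
        loss = loss_fn(logits, y)
        loss.backward()
        opt.step()
    acc = (logits.argmax(dim=1) == y).float().mean().item()
    print(f"{name}: loss={loss.item():.4f}, accuracy={acc:.3f}")
```

Likewise, a hedged sketch of one RNAdam-style update step, read as NAdam's Nesterov-corrected first moment combined with RAdam's variance rectification; the paper's exact formulation may differ:

```python
import numpy as np

def rnadam_step(theta, grad, m, v, t, lr=1e-3,
                beta1=0.9, beta2=0.999, eps=1e-8):
    """One parameter update combining a NAdam-style Nesterov momentum
    estimate with RAdam-style variance rectification (an illustrative
    reading of RNAdam, not necessarily the paper's exact algorithm)."""
    # Exponential moving averages of the gradient and squared gradient
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2

    # NAdam: Nesterov-style bias-corrected first moment
    m_hat = (beta1 * m / (1 - beta1 ** (t + 1))
             + (1 - beta1) * grad / (1 - beta1 ** t))

    # RAdam: length of the approximated SMA and its maximum value
    rho_inf = 2.0 / (1.0 - beta2) - 1.0
    rho_t = rho_inf - 2.0 * t * beta2 ** t / (1.0 - beta2 ** t)

    if rho_t > 4.0:
        # Variance of the adaptive learning rate is tractable: rectify it
        v_hat = np.sqrt(v / (1 - beta2 ** t))
        r_t = np.sqrt(((rho_t - 4) * (rho_t - 2) * rho_inf)
                      / ((rho_inf - 4) * (rho_inf - 2) * rho_t))
        theta = theta - lr * r_t * m_hat / (v_hat + eps)
    else:
        # Early steps: fall back to a plain momentum update
        theta = theta - lr * m_hat
    return theta, m, v
```

In this reading, the earliest updates, where the variance of the adaptive learning rate is still large, fall back to a plain momentum step, which is what allows RAdam-style methods to dispense with a separate warmup schedule.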
Published in | American Journal of Computer Science and Technology (Volume 4, Issue 4) |
DOI | 10.11648/j.ajcst.20210404.13 |
Page(s) | 106-110 |
Creative Commons | This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited. |
Copyright | Copyright © The Author(s), 2021. Published by Science Publishing Group |
Keywords | Data Classification, Artificial Neural Network, Optimization Algorithm, Loss Function |
APA Style
Zhu Zhixuan, Hou Zaien. (2021). Research and Application of Rectified-NAdam Optimization Algorithm in Data Classification. American Journal of Computer Science and Technology, 4(4), 106-110. https://doi.org/10.11648/j.ajcst.20210404.13
ACS Style
Zhu Zhixuan; Hou Zaien. Research and Application of Rectified-NAdam Optimization Algorithm in Data Classification. Am. J. Comput. Sci. Technol. 2021, 4(4), 106-110. doi: 10.11648/j.ajcst.20210404.13
AMA Style
Zhu Zhixuan, Hou Zaien. Research and Application of Rectified-NAdam Optimization Algorithm in Data Classification. Am J Comput Sci Technol. 2021;4(4):106-110. doi: 10.11648/j.ajcst.20210404.13