The scientific journal Modeling, Optimization and Information Technology
Online edition
ISSN 2310-6018

MODIFICATION OF NEURAL NETWORK MODEL RKELM WITH ADDITIONAL TRAINING

Asanov Y.A.   Beletskaya S.Y.  

UDC 681.3
DOI: 10.26102/2310-6018/2019.27.4.040


The aim of this work is to develop an artificial neural network (ANN) model capable of working in dynamically changing conditions. Despite a large amount of research and development in this area, there are still no models that fit within the limited resources of mobile systems (primarily, their performance). This article proposes a modification of Huang's Extreme Learning Machine that differs from the original approach in its training process: the model is first trained on common conditions, without growing the weight matrix or the training sample, and is then additionally trained for specific conditions. A dataset from the open machine-learning repository UCI was used as the test data. Extensive experiments were performed to identify the most suitable base model, with the choice made among RKELM, SVM and ELM; the selection criteria were performance and classification accuracy. Huang's extreme learning model turned out to be the most suitable and was used as the basis of the developed modification. The results of comparing the original and modified models are presented. The proposed approach surpassed its competitors in speed and performance; it was only slightly inferior in classification accuracy under the initial conditions, but turned out to be much more accurate under new conditions for which the model had not been trained.
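To illustrate the kind of scheme the abstract describes, the sketch below shows a basic Extreme Learning Machine (Huang et al., 2006) with a fixed random hidden layer, followed by an additional-training step that updates only the output weights on new data, without growing the weight matrix. The incremental update here uses a recursive least-squares formulation (as in OS-ELM); it is an illustrative assumption, not necessarily the authors' exact modification, and all function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_elm(n_inputs, n_hidden):
    """Random, fixed hidden layer of a basic ELM."""
    W = rng.standard_normal((n_inputs, n_hidden))  # input weights, never trained
    b = rng.standard_normal(n_hidden)              # hidden biases, never trained
    return W, b

def hidden(X, W, b):
    """Hidden-layer output matrix H."""
    return np.tanh(X @ W + b)

def train_elm(X, y, W, b):
    """Batch training: output weights beta by regularized least squares."""
    H = hidden(X, W, b)
    P = np.linalg.inv(H.T @ H + 1e-6 * np.eye(H.shape[1]))
    beta = P @ H.T @ y
    return beta, P  # P is kept for later incremental updates

def update_elm(X_new, y_new, W, b, beta, P):
    """Additional training on new data: RLS update of beta only,
    so the weight matrix does not grow (OS-ELM-style)."""
    H = hidden(X_new, W, b)
    K = np.linalg.inv(np.eye(H.shape[0]) + H @ P @ H.T)
    P = P - P @ H.T @ K @ H @ P
    beta = beta + P @ H.T @ (y_new - H @ beta)
    return beta, P

# Toy regression: train on y = sin(x) over [0, pi] ("common conditions"),
# then additionally train on [pi, 2*pi] ("new conditions").
W, b = init_elm(1, 40)
X1 = np.linspace(0, np.pi, 200).reshape(-1, 1)
beta, P = train_elm(X1, np.sin(X1).ravel(), W, b)

X2 = np.linspace(np.pi, 2 * np.pi, 200).reshape(-1, 1)
beta, P = update_elm(X2, np.sin(X2).ravel(), W, b, beta, P)

err = np.max(np.abs(hidden(X2, W, b) @ beta - np.sin(X2).ravel()))
```

Only `beta` changes during additional training; the random hidden layer is fixed, which is what keeps the per-update cost low enough for resource-constrained mobile systems.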

1. Anguita D., Ghio A., Oneto L., Parra X., Reyes-Ortiz J.L. Human activity recognition on smartphones using a multiclass hardware-friendly support vector machine. In Ambient Assisted Living and Home Care. Springer. 2012:216–223.

2. Lvovich Ya.E., Beletskaya S.Yu. Increasing the efficiency of parametric synthesis procedures for complex systems based on the transformation of optimization problems. Information technologies. 2002;10:31–35.

3. Huang, G.B., Zhu, Q.Y., & Siew, C.K. Extreme learning machine: theory and applications. Neurocomputing. 2006:489–501.

4. Lee Y.J., Huang S.Y. Reduced support vector machines: a statistical theory. IEEE Transactions on Neural Networks. 2007:1–13.

5. Deng, W., Zheng, Q., & Zhang, K. Reduced Kernel Extreme Learning Machine. In Proceedings of the 8th international conference on computer recognition systems CORES. 2013:63–69.

6. Blake C.L., Merz C.J. UCI Repository of machine learning databases. http://www.ics.uci.edu/mlearn/MLRepository.html. Irvine, CA: University of California, Department of Information and Computer Science. 1998:55–78.

7. Yilun Chen, Zhicheng Wang, Yuxiang Peng, Zhiqiang Zhang, Gang Yu, and Jian Sun. Cascaded pyramid network for multi-person pose estimation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 2018:7103–7112.

8. Liang-Chieh Chen, Maxwell Collins, Yukun Zhu, George Papandreou, Barret Zoph, Florian Schroff, Hartwig Adam, and Jon Shlens. Searching for efficient multi-scale architectures for dense image prediction. In Advances in Neural Information Processing Systems. 2018:8699–8710.

Asanov Yury Anatolievich

Email: asanovjura@mail.ru

Voronezh State Technical University

Voronezh, Russian Federation

Beletskaya Svetlana Yuryevna
Doctor of Technical Sciences, Professor
Email: su_bel@mail.ru

Voronezh State Technical University

Voronezh, Russian Federation

Keywords: artificial neural network, RKELM modification, model with additional training.

For citation: Asanov Y.A., Beletskaya S.Y. MODIFICATION OF NEURAL NETWORK MODEL RKELM WITH ADDITIONAL TRAINING. Modeling, Optimization and Information Technology. 2019;7(4). Available from: https://moit.vivt.ru/wp-content/uploads/2019/11/AsanovBeletckaya_4_19_1.pdf DOI: 10.26102/2310-6018/2019.27.4.040 (In Russ.).

