ABSTRACTS OF ARTICLES OF THE JOURNAL "INFORMATION TECHNOLOGIES".
No. 1. Vol. 25. 2019

DOI: 10.17587/it.25.41-45

K. Sh. Ismayilova, Is_kamalya@yahoo.com, Azerbaijan State University of Oil and Industry, Baku, AZ1010, Azerbaijan Republic

Application of Various Optimization Methods for Calculating the Neural Network Error for the Diagnosis of Neuromuscular Diseases

This work is devoted to calculating the error of a neural network in order to compare training methods and select the optimal one for this problem. A review of the literature shows that the problem of finding the optimal number of hidden-layer neurons has no unique solution, since no established methodology exists. Four specific constraints distinguish the training of a neurocomputer from common optimization tasks: the astronomical number of parameters, the need for high parallelism in learning, the multicriteria nature of the problems being solved, and the need to find a sufficiently wide region in which the values of all minimized functions are close to minimal. The most important and most accessible indicators of such systems are their errors, the most common of which, and at the same time the easiest to calculate, are the absolute and relative errors. The OUTPUT layer of the network contains four elements: 1 is the norm, 2 is polyneuropathy, 3 is carpal tunnel syndrome, and 4 is cubital tunnel syndrome. The computer experiment was carried out in the NeuroPro 0.25 software environment, and the following optimization methods were chosen for network training: gradient descent, Modified Par Tan, conjugate gradients, and BFGS. To compare the optimization methods, the number of error values falling within selected intervals is used; five intervals are chosen for the absolute errors and five for the relative errors. Comparison of the optimization methods by means of the network errors leads to the conclusion that, for this task, the best results were obtained with the BFGS optimization method.
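As an illustration of the comparison described above, the following minimal Python/NumPy sketch counts how many absolute and relative errors fall into each of five intervals for two hypothetical optimizers. The interval boundaries, the synthetic targets and outputs, and the helper name error_histogram are assumptions for illustration only; they are not taken from the NeuroPro 0.25 experiment.

    # Illustrative only: compare optimizers by counting how many output
    # errors fall into each of five error intervals. Targets, outputs and
    # interval edges are synthetic, not data from the original study.
    import numpy as np

    def error_histogram(targets, outputs, edges):
        """Count absolute and relative errors per interval (5 bins for 6 edges)."""
        abs_err = np.abs(outputs - targets)
        # Guard against division by zero when a target value is zero.
        rel_err = abs_err / np.maximum(np.abs(targets), 1e-12)
        abs_counts, _ = np.histogram(abs_err, bins=edges)
        rel_counts, _ = np.histogram(rel_err, bins=edges)
        return abs_counts, rel_counts

    # Five assumed intervals; errors above the last edge are simply not counted.
    edges = np.array([0.0, 0.05, 0.10, 0.20, 0.40, 1.00])

    rng = np.random.default_rng(0)
    targets = rng.integers(1, 5, size=200).astype(float)  # class codes 1..4

    # Stand-ins for outputs of networks trained with different optimizers.
    results = {
        "Gradient descent": targets + rng.normal(0.0, 0.30, 200),
        "BFGS": targets + rng.normal(0.0, 0.10, 200),
    }

    for name, outputs in results.items():
        abs_counts, rel_counts = error_histogram(targets, outputs, edges)
        print(name, "abs:", abs_counts, "rel:", rel_counts)

In the study itself, such per-interval counts were compared across gradient descent, Modified Par Tan, conjugate gradients, and BFGS, with BFGS yielding the best results.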
Keywords: neural network errors, optimization methods, neuromuscular diseases, absolute error, relative error, gradient descent, modified Par Tan, conjugate gradients, BFGS

P. 41–45
