Abstract: Neural networks based on the Group Method of Data Handling (GMDH-NN), whose nodes are two-input N-Adalines, are well known. Each N-Adaline contains a set of adjustable synaptic weights that are estimated by the standard least squares method and provides a quadratic approximation of the nonlinear mapping being restored. On the other hand, to ensure the required approximation quality such a network may need a considerable number of hidden layers. The approximating properties of GMDH-NN can be improved by combining the Group Method of Data Handling with radial basis function networks, which have only one hidden layer formed by so-called R-neurons. Learning in such networks reduces, as a rule, to tuning the synaptic weights of the output layer, which is formed by adaptive linear associators. In contrast to the neurons of multilayer structures with polynomial or sigmoidal activation functions, R-neurons have bell-shaped activation functions. In this paper multidimensional Epanechnikov’s kernels are used as activation functions. Their advantage is that the derivatives are linear in all parameters, which makes it possible to adjust quite simply not only the synaptic weights but also the centers and receptive fields. The proposed network combines the Group Method of Data Handling, radial basis function networks and cascade networks; it is not prone to the “curse of dimensionality” and can process information in real time by adapting its parameters and structure to the conditions of the problem. The use of multidimensional Epanechnikov’s kernels as activation functions allowed numerically simple learning algorithms to be introduced, which are characterized by high speed.
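For orientation only, one common parameterization of a multidimensional Epanechnikov kernel used as an R-neuron activation function, with center c and receptive field matrix Σ, is sketched below; the exact form adopted in the paper may differ, so this is an illustrative assumption rather than the authors’ definition:

\varphi(x; c, \Sigma) = \max\{0,\ 1 - (x - c)^{T}\,\Sigma^{-1}\,(x - c)\}.

On its support, \partial\varphi/\partial c = 2\,\Sigma^{-1}(x - c) and \partial\varphi/\partial\Sigma^{-1} = -(x - c)(x - c)^{T}, i.e. the derivatives are linear in the center and do not depend on the receptive field matrix itself, which illustrates why gradient-based tuning of centers and receptive fields can remain numerically simple.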
Keywords: evolving neural network, cascade networks, radial-basis neural network, Group Method of Data Handling, multidimensional Epanechnikov’s kernels.
ACM Classification Keywords: F.1 Computation by abstract devices – Self-modifying machines (e.g., neural networks), I.2.6 Learning – Connectionism and neural nets, G.1.2 Approximation – Nonlinear approximation.
Link:
EVOLVING CASCADE NEURAL NETWORK BASED ON MULTIDIMENSIONAL EPANECHNIKOV’S KERNELS AND ITS LEARNING ALGORITHM
Yevgeniy Bodyanskiy, Paul Grimm, Nataliya Teslenko
http://foibg.com/ijitk/ijitk-vol05/ijitk05-1-p02.pdf