Abstract: In this paper, an ontogenic artificial neural network (ANN) is proposed. The network uses orthogonal
activation functions, which allow a significant reduction in computational complexity. Another advantage is numerical
stability, because the system of activation functions is linearly independent by definition. A learning procedure for
the proposed ANN with guaranteed convergence to the global minimum of the error function in the parameter space is
developed. An algorithm for network structure adaptation is also proposed. The algorithm allows adding or
deleting a node in real time without retraining the network. Simulation results confirm the efficiency of the
proposed approach.
Keywords: ontogenic artificial neural network, orthogonal activation functions, time-series forecasting.
ACM Classification Keywords: I.2.6 Learning – Connectionism and neural nets
Link:
GROWING NEURAL NETWORKS USING NONCONVENTIONAL ACTIVATION FUNCTIONS
Yevgeniy Bodyanskiy, Iryna Pliss, Oleksandr Slipchenko
http://www.foibg.com/ijita/vol14/ijita14-3-p13.pdf
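The abstract's claims follow from the network output being linear in the weights of an orthogonal basis: the quadratic error then has a single global minimum, and adding a node leaves the existing weights essentially valid. The sketch below illustrates this idea only; the choice of Chebyshev polynomials, batch least-squares training, and residual projection for the new node are illustrative assumptions, not the authors' exact procedure.

```python
# Minimal sketch (not the paper's code): a one-layer network whose activation
# functions are Chebyshev polynomials -- one concrete choice of an orthogonal
# system; the paper does not fix a particular family in the abstract.
import numpy as np

def chebyshev_features(x, n_nodes):
    """Evaluate the first n_nodes Chebyshev polynomials T_0..T_{n-1} on x in [-1, 1]."""
    T = np.empty((len(x), n_nodes))
    T[:, 0] = 1.0
    if n_nodes > 1:
        T[:, 1] = x
    for k in range(2, n_nodes):
        T[:, k] = 2.0 * x * T[:, k - 1] - T[:, k - 2]  # recurrence T_k = 2x*T_{k-1} - T_{k-2}
    return T

def train(x, y, n_nodes):
    """Least-squares fit of the output weights.

    Because the network output is linear in the weights, the error surface is
    quadratic with a single (global) minimum -- the property behind the
    guaranteed-convergence claim.
    """
    Phi = chebyshev_features(x, n_nodes)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def grow(x, y, w):
    """Add one node: with (near-)orthogonal activations the old weights remain
    a good solution, so only the new node's weight is estimated (by projecting
    the current residual onto the new basis function)."""
    n_new = len(w) + 1
    Phi = chebyshev_features(x, n_new)
    residual = y - Phi[:, :-1] @ w
    w_new = (Phi[:, -1] @ residual) / (Phi[:, -1] @ Phi[:, -1])
    return np.append(w, w_new)

# Toy usage on a sampled signal (forecasting-style curve fit)
x = np.linspace(-1.0, 1.0, 200)
y = np.sin(3.0 * x) + 0.05 * np.random.default_rng(0).standard_normal(200)
w = train(x, y, n_nodes=5)
w_grown = grow(x, y, w)  # one extra node, no full retraining
```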