Vinesha Peiris, Rational activation functions in neural network with uniform norm based loss function and its application in classification, Vol. 2022 (2022), Article ID 3, pp. 1-25

DOI: 10.23952/cot.2022.3

Received October 12, 2021; Accepted February 15, 2022; Published March 8, 2022


Abstract. In this paper, we demonstrate an application of generalised rational uniform (Chebyshev) approximation to neural networks. In particular, our activation functions are rational functions of degree (1,1), and the loss function is based on the uniform norm. In this setting, when the coefficients of the rational activation function are fixed, the overall optimisation problem of the neural network can be formulated as a generalised rational approximation problem with additional linear constraints, in which the weights and the bias of the network are the decision variables. To optimise the decision variables, we suggest two prominent methods: the bisection method and the differential correction algorithm. We illustrate the efficiency of this application by performing numerical experiments on two-class classification problems and report the classification accuracy obtained by the network using the bisection method and the differential correction algorithm, together with the standard MATLAB toolbox, which uses a least-squares loss function. We show that the choice of the uniform-norm-based loss function with rational activation functions leads to better classification accuracy when the training dataset is very small or the classes are imbalanced.
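The following minimal Python sketch (not the author's implementation) illustrates the bisection idea behind the paper on the simplest related instance: degree-(1,1) uniform rational approximation of scalar data. Each trial error level theta is tested by a linear feasibility problem, solved here with SciPy's linprog; the function names, the positivity margin delta, and the initial upper bound on theta are illustrative assumptions.

import numpy as np
from scipy.optimize import linprog

def feasible(theta, x, y, delta=1e-6):
    # Does some R(t) = (a0 + a1*t) / (b0 + b1*t) with q(x_i) >= delta satisfy
    # |y_i - R(x_i)| <= theta for all i?  Since q > 0, this is equivalent to
    # |y_i*q(x_i) - p(x_i)| <= theta*q(x_i), which is linear in the
    # coefficient vector [a0, a1, b0, b1], so the test is an LP feasibility check.
    n = len(x)
    P = np.column_stack([np.ones(n), x])   # rows give p(x_i) = P[i] @ [a0, a1]
    Q = np.column_stack([np.ones(n), x])   # rows give q(x_i) = Q[i] @ [b0, b1]
    A_ub = np.vstack([
        np.hstack([-P, (y[:, None] - theta) * Q]),   # y*q - p <= theta*q
        np.hstack([ P, (-y[:, None] - theta) * Q]),  # p - y*q <= theta*q
        np.hstack([np.zeros((n, 2)), -Q]),           # q(x_i) >= delta
    ])
    b_ub = np.concatenate([np.zeros(2 * n), -delta * np.ones(n)])
    res = linprog(np.zeros(4), A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * 4, method="highs")
    return res.status == 0   # status 0: a feasible coefficient vector exists

def bisect_uniform_error(x, y, tol=1e-6):
    # Bisection on the uniform error level: shrink [lo, hi] while keeping
    # hi always achievable.
    lo, hi = 0.0, float(np.max(np.abs(y)))   # R = 0/1 achieves hi, so hi is feasible
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if feasible(mid, x, y):
            hi = mid
        else:
            lo = mid
    return hi

# Example: best uniform (1,1)-rational approximation error to exp on [0, 1].
x = np.linspace(0.0, 1.0, 41)
print(bisect_uniform_error(x, np.exp(x)))

In the paper's setting the rational coefficients are instead fixed, and the weights and the bias of the network become the decision variables, with additional linear constraints entering each feasibility subproblem; the structure that justifies the level search is the same. The differential correction algorithm, the second method suggested in the abstract, would replace this level search with a sequence of linear programmes.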


How to Cite this Article:
V. Peiris, Rational activation functions in neural network with uniform norm based loss function and its application in classification, Commun. Optim. Theory 2022 (2022) 3.