Comparison of ReLU and linear saturated activation functions in neural network for universal approximation
June 2019 • Dominik Štursa, Petr Doležel
Activation functions used in hidden layers directly affect the ability of a feedforward neural network to describe nonlinear systems. Furthermore, linear-based activation functions are less computationally demanding than their nonlinear alternatives. In addition, feedforward neural networks with linear-based activation functions can be used to advantage for the control of nonlinear systems, as shown in the authors' previous publications. This paper aims to compare two types of linear-based functions…
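Both activation families discussed here are piecewise linear, which is what makes them cheap to evaluate. Since the excerpt does not spell out the exact form of the linear saturated function, the sketch below assumes the common "satlin" variant (identity between two saturation bounds, clipped outside); the function names and the [0, 1] bounds are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x), zero for negative inputs, identity otherwise.
    return np.maximum(0.0, x)

def satlin(x, lower=0.0, upper=1.0):
    # Linear saturated activation (assumed "satlin" form):
    # identity on [lower, upper], clipped to the bounds outside.
    return np.clip(x, lower, upper)

# Quick comparison on a small grid of inputs.
x = np.linspace(-2.0, 2.0, 9)
print(relu(x))    # unbounded above
print(satlin(x))  # saturates at both bounds
```

The practical distinction visible in the sketch is that ReLU is unbounded above while the saturated linear unit clips on both sides; both avoid the exponentials that make sigmoidal activations more expensive.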