Rectifier (Neural Networks)
Comparison of ReLU and linear saturated activation functions in neural network for universal approximation
2019
Activation functions used in hidden layers directly affect the possibilities for describing nonlinear systems using a feedforward neural network. Furthermore, linear based activation functions are less computationally demanding than their nonlinear alternatives. In addition, feedforward neural networks with linear based activation functions can be advantageously used for control of nonlinear systems, as shown in previous authors' publications. This paper aims to compare two types of linear based functions…
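As a minimal sketch of the two function families the paper compares, the following defines a ReLU and a linear saturated activation. The saturation bounds [0, 1] are an assumption here (the common MATLAB-style `satlin` convention), not taken from the paper itself:

```python
import numpy as np

def relu(x):
    # ReLU: linear for positive inputs, zero otherwise; unbounded above
    return np.maximum(0.0, x)

def satlin(x):
    # Linear saturated activation (assumed bounds [0, 1]):
    # linear in the middle, clipped flat at both ends
    return np.clip(x, 0.0, 1.0)

x = np.array([-1.0, 0.25, 0.75, 2.0])
print(relu(x))    # negative inputs zeroed, positives pass through
print(satlin(x))  # additionally saturates at 1 for large inputs
```

The two functions agree on the interval [0, 1]; they differ only in whether large positive inputs saturate, which is the distinction the paper's universal-approximation comparison turns on.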

Rectifier (Neural Networks)

Activation function

In the context of artificial neural networks, the rectifier or ReLU (rectified linear unit) activation function is an activation function defined as the positive part of its argument:

f(x) = max(0, x)
where x is the input to a neuron. This is also known as a ramp function and is analogous to half-wave rectification in electrical engineering. This activation function was introduced by Kunihiko Fukushima in 1969 in the context of visual feature extraction in hierarchical neural networks. It was later argued to have strong biological motivations and mathematical justifications. In 2011 it was found to enable better training of deeper networks than the activation functions widely used before then, such as the logistic sigmoid and the hyperbolic tangent.
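The definition above can be written directly as a one-line function; this sketch also shows the ramp shape on a few sample inputs:

```python
import numpy as np

def relu(x):
    """Rectified linear unit: the positive part of the argument, max(0, x)."""
    return np.maximum(0.0, x)

# Negative inputs are rectified to zero; positive inputs pass through unchanged,
# giving the characteristic ramp shape.
samples = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(samples))
```

Because the positive branch is the identity, the gradient there is exactly 1, which is one reason ReLU trains deep networks better than saturating functions whose gradients shrink toward zero.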
