Activation functions are essential for an artificial neural network to learn and represent complex, non-linear functional mappings between the inputs and the response variable. They introduce non-linear properties into the network. Their main purpose is to convert the input signal of a node in an ANN into an output signal, which is then used as an input by the next layer in the stack.
We apply an activation function f(x) to make the network more powerful, giving it the ability to learn complex patterns from data and to represent arbitrary non-linear functional mappings between inputs and outputs. Without a non-linearity, a stack of linear layers collapses into a single linear transformation, as the sketch below illustrates.
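Here is a minimal NumPy sketch of that collapse; the weights and input are made-up illustrative values, not taken from any real network:

import numpy as np

# Made-up illustrative weights for a tiny two-layer network.
W1 = np.array([[1.0, -2.0],
               [0.5,  1.5]])
W2 = np.array([[2.0, 1.0]])
x  = np.array([[0.3],
               [-0.7]])

# Without an activation, two linear layers collapse into one linear map:
# W2 @ (W1 @ x) equals (W2 @ W1) @ x for every input x.
print(np.allclose(W2 @ (W1 @ x), (W2 @ W1) @ x))  # True

# Inserting a non-linearity between the layers breaks that collapse,
# which is what lets the network represent non-linear mappings.
relu = lambda z: np.maximum(0.0, z)
print(W2 @ relu(W1 @ x))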
Another important property of an activation function is that it should be differentiable. Differentiability is what makes backpropagation possible: while propagating backwards through the network we compute the gradients of the error (loss) with respect to the weights, and then adjust the weights using gradient descent, or any other optimization technique, to reduce the error. A single update step is sketched below.
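As a concrete illustration, here is a minimal sketch of one gradient-descent step for a one-weight, one-neuron network; the input, target, initial weight, and learning rate are arbitrary illustrative values:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # Derivative of the sigmoid: f'(z) = f(z) * (1 - f(z)),
    # which backpropagation uses via the chain rule.
    s = sigmoid(z)
    return s * (1.0 - s)

# Arbitrary illustrative values, not from the text.
x, target, w, lr = 0.5, 1.0, 0.2, 0.1

z = w * x                       # pre-activation
y = sigmoid(z)                  # activation output
loss = 0.5 * (y - target) ** 2  # squared error

# Chain rule: dL/dw = dL/dy * dy/dz * dz/dw
grad_w = (y - target) * sigmoid_prime(z) * x
w -= lr * grad_w                # gradient-descent update
print(loss, grad_w, w)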
The most popular activation functions:
Sigmoid or Logistic - Outputs a value between 0 and 1. Returns f(x) = 1 / (1 + exp(-x))
TanH (Hyperbolic Tangent) - Outputs a value between -1 and 1. Returns f(x) = tanh(x). Both functions are sketched in NumPy below.
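The output ranges above can be checked directly with NumPy; the sample inputs here are arbitrary:

import numpy as np

def sigmoid(z):
    # Squashes any real input into the open interval (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])  # arbitrary sample inputs
print(sigmoid(z))   # approaches 0 for large negative z, 1 for large positive z
print(np.tanh(z))   # approaches -1 for large negative z, 1 for large positive z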