[1] Explanation of important aspects
This section covers a few important aspects of the neural network whose importance is easy to overlook. The following components, which play a key role in the artificial neural network (ANN), will be covered:
- Activation Functions (AF)
Let's start by introducing a few definitions that will be used in this section:
- Nj : Neuron j in layer J
- Netj : The weighted sum of the inputs to Nj (before application of the AF)
- Oj : Output of neuron Nj (after applying the AF to Netj)
- Wij : Weight of the connection between Ni and Nj
Activation functions play a key role in, and have a major influence on, the output of the network. When we zoom in on one 'Neuron' in the network, we can see that it is 'split into two parts', each of which performs its own task during the feedforward pass. The first half computes the weighted sum:
Netj = sum(i=1 -> n) [Oi * Wij]
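As a minimal sketch of this first half (the variable and function names are chosen for illustration and are not taken from any particular library), the weighted sum for a single neuron Nj could be computed like this:

```python
# Sketch of the weighted-sum step for one neuron Nj.
# 'inputs' holds the outputs Oi of the previous layer, 'weights' the corresponding Wij values.
def weighted_sum(inputs, weights):
    # Netj = sum over i of Oi * Wij
    return sum(o_i * w_ij for o_i, w_ij in zip(inputs, weights))

# Example with three incoming connections
net_j = weighted_sum([0.5, 0.1, 0.9], [0.4, -0.2, 0.7])
print(net_j)  # roughly 0.81
```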
To determine the actual output of this neuron Nj, we need one more mathematical operation. This is where the AF comes in: it is the mathematical operation applied to Netj. There are multiple options for the AF, but one popular choice is the Sigmoid function, sigmoid(x) = 1 / (1 + e^(-x)), which always results in a value between 0 and 1.
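A sketch of the Sigmoid AF and the full computation of Oj from Netj might look as follows (again, the function names are illustrative, not taken from any specific framework):

```python
import math

def sigmoid(x):
    # Sigmoid AF: always returns a value strictly between 0 and 1
    return 1.0 / (1.0 + math.exp(-x))

def neuron_output(inputs, weights):
    # First half: compute the weighted sum Netj
    net_j = sum(o_i * w_ij for o_i, w_ij in zip(inputs, weights))
    # Second half: apply the AF to obtain Oj
    return sigmoid(net_j)

o_j = neuron_output([0.5, 0.1, 0.9], [0.4, -0.2, 0.7])
print(o_j)  # roughly 0.692, i.e. sigmoid(0.81)
```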
It is important to notice that the choice of AF has a huge impact on the output of the network. A network that uses the Sigmoid AF can never be trained to produce an output of 6, or -73, because after applying the AF the output will always lie between 0 and 1.
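To illustrate why such targets are unreachable, here is a small check (reusing the sigmoid sketch from above) showing that even very large or very negative weighted sums are squashed into the open interval (0, 1):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Extreme values of Netj still map strictly between 0 and 1,
# so an output of 6 or -73 can never be produced.
for net_j in [-73.0, -5.0, 0.0, 5.0, 73.0]:
    print(net_j, "->", sigmoid(net_j))
# -73.0 -> close to 0 (but never exactly 0)
#  73.0 -> close to 1 (but never exactly 1)
```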