Artificial Neural Network Part 4


Why Neural Networks?

In the last few years, neural networks have attracted a great deal of attention and have been successfully applied across a range of domains such as stock-market forecasting, handwriting recognition and image recognition. Let us understand the working of an ANN with the help of an example.

Suppose a bank wants to develop an algorithm to predict the probability of default of a new customer. The bank has the following data on which the predicted probability of default would be based: the higher the predicted probability of default, the higher the chance that the customer will default.



First, let us try to understand the working of an ANN without hidden layers. Such a very basic neural network would look like the following figure.



A neural network without a hidden layer works more or less like a regular regression model, where we try to predict the target variable from a few independent variables: we develop a regression equation, and a weight is assigned to each input variable.



In an equation of the form ŷ = w1x1 + w2x2 + w3x3 + w4x4, the weights w1 to w4 are assigned to the independent variables x1 to x4. So, in the absence of any hidden layer, our neural network would predict the target variable simply by assigning a weight to each input variable.
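The weighted-sum idea above can be sketched in a few lines of Python. The weights below are illustrative values chosen for the example, not fitted ones, and the sigmoid squashes the weighted sum into a probability, making this no-hidden-layer network equivalent to logistic regression:

```python
import math

# Hypothetical weights w1..w4 for the four input variables
# (illustrative values, not fitted to real data).
w = [0.4, -0.2, 0.7, -0.5]
b = 0.1  # bias term

def predict_default_probability(x):
    """No-hidden-layer network: a weighted sum of the inputs
    passed through a sigmoid to give a probability of default."""
    z = b + sum(wi * xi for wi, xi in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z))

# One hypothetical customer's (scaled) input variables x1..x4.
p = predict_default_probability([1.0, 2.0, 0.5, 1.5])
```

With no hidden layer there is exactly one weight per input variable, which is why this model behaves just like the regression described above.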

But the real power of neural networks comes from the hidden layers and from how the distribution of weights varies across the neurons within them. So let us understand the working of a neural network with a multi-layer example.



Consider the top neuron of the hidden layer first. All the neurons in the input layer feed into it, forming a function, an equation with a certain weight assigned to each input variable. The same input variables also form another equation for the second neuron of the hidden layer, but the weights assigned there are different from those assigned for the first neuron. Not all of the weights will be significant: for a particular neuron, a few variables might be assigned a weight of zero while others are assigned non-zero weights, and this distribution changes as the ANN moves from one neuron to the next.

So it is not just the hidden layers that make an ANN such a powerful algorithm; the real power lies in this redistribution of weights across the neurons of the hidden layers. It allows the ANN to predict the target variable with very high accuracy, because the weighting adjusts to the characteristics of each customer.

Key points to understand:

  1. The hidden layers within an ANN act like filters: they filter out some input variables and let others pass through to the output layer, so the impact or significance of each input variable varies from one neuron to another.

  2. The effective weighting of the input variables varies for each observation, and that is why an ANN can predict the target variable with very high accuracy.
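Point 2 can be seen concretely with a single hidden neuron. Its weights are fixed, but the ReLU activation can switch the neuron off entirely for some observations, so the neuron contributes for one customer and not for another (weights here are made up for illustration):

```python
def relu(z):
    return max(0.0, z)

# One hidden neuron with fixed, illustrative weights over two inputs.
w = [1.0, -1.0]

def activation(x):
    return relu(sum(wi * xi for wi, xi in zip(w, x)))

# For one observation the neuron passes a signal through...
a1 = activation([3.0, 1.0])   # weighted sum 3 - 1 = 2, stays active
# ...for another the ReLU clips it to zero, filtering it out.
a2 = activation([1.0, 3.0])   # weighted sum 1 - 3 = -2, clipped to 0
```

So even with fixed weights, which inputs actually influence the output depends on the observation, which is the "filtering" behaviour described in the key points.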

Next Section:  Artificial Neural Network Part 5


