What is the role of bias in a neural network?

Simon Benavides Pinjosovsky, PhD
4 min read · Apr 11, 2024

When we talk about bias in the context of neural networks, we refer to the constant added to the product of features and weights. It allows us to account for any asymmetry or shift in the data source.

A neural network represents a function built by combining the outputs of each neuron in the network. In practice, the bias is an additional parameter of each neuron that gives this function the flexibility to represent patterns that do not pass through the origin (0,0).

Put simply: while the weights control the strength of the connections between neurons, the bias shifts the activation function of each neuron.
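This idea can be sketched in a few lines of code. The function below is a minimal single neuron (names like `neuron` and `sigmoid` are illustrative, not from any particular library): it computes the weighted sum of its inputs, adds the bias, and applies a sigmoid activation. With a zero bias, the pre-activation is always zero at the origin; a non-zero bias shifts it.

```python
import math

def sigmoid(z):
    # Standard logistic activation: maps any real z to (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def neuron(x, weights, bias):
    # A single neuron: weighted sum of the inputs, plus the bias,
    # passed through the activation function
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return sigmoid(z)

# At the origin, only the bias determines the output:
print(neuron([0.0, 0.0], [0.5, -0.3], bias=0.0))  # sigmoid(0) = 0.5
print(neuron([0.0, 0.0], [0.5, -0.3], bias=2.0))  # sigmoid(2) ≈ 0.88
```

Without the `+ bias` term, this neuron could never output anything other than 0.5 for a zero input, no matter how the weights are trained.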

A little example

It might help to look at a simple example. Consider this 1-input, 1-output network that has no bias:

The output of the network is computed by multiplying the input (x) by the weight (w0) and passing the result through some kind of activation function (e.g. a sigmoid function).

Here is the function that this network computes, for various values of w0:
