Perceptron
The Bias Weight

You have seen that a perceptron can be trained to produce correct outputs by adjusting its regular weights.

Sometimes, however, a small additional adjustment is needed to make the perceptron more accurate. That supporting role is played by the bias weight: it is paired with a fixed input value of 1 and starts out with a random weight value.

So now the weighted sum equation should look like:

weighted\ sum = x_1w_1 + x_2w_2 + ... + x_nw_n + 1 \cdot w_b
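
For instance, with inputs x_1 = 0 and x_2 = 1, weights w_1 = 0.4 and w_2 = -0.2, and bias weight w_b = 0.3 (values chosen purely for illustration), the bias input of 1 simply adds w_b to the total:

weighted\ sum = (0)(0.4) + (1)(-0.2) + (1)(0.3) = 0.1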

How does this change the code so far? You only have to consider two small changes:

  • Add a 1 to the set of inputs (now there are 3 inputs instead of 2)
  • Add a bias weight to the list of weights (now there are 3 weights instead of 2)

We’ll automatically make these replacements in the code so you should be good to go!
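
If you want to see the shape of those changes, here is a minimal sketch in Python. The variable names, the random weight range, and the step activation at the end are assumptions for illustration and may differ from the code you have written so far.

    import random

    # Inputs: the two regular inputs plus a fixed 1 for the bias weight
    inputs = [0, 1, 1]  # x1, x2, and the bias input (always 1)

    # Weights: one per regular input plus the bias weight, all starting at random values
    weights = [random.uniform(-1, 1) for _ in range(3)]  # w1, w2, wb

    # Weighted sum: x1*w1 + x2*w2 + 1*wb
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))

    # A simple step activation turns the weighted sum into the perceptron's output
    output = 1 if weighted_sum > 0 else 0
    print(output)

Because the bias input is always 1, its contribution to the weighted sum is just the value of w_b, which training can nudge up or down like any other weight.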
