There are two main reasons to normalize your input features:

  •  Numerical stability
  •  Many algorithms expect the input to be normalized

Numerical stability is very important when you run any machine learning algorithm, especially when you are adding many small numbers to a very large number.

Let us try a simple example in Python to demonstrate the numerical stability problem.

1,000,000,000

+ 0.000001 <- 1,000,000 times

- 1,000,000,000

= 0.953… in Python, but it should have been 1.0

Code

```python
big_value = 1000000000
small_value = 0.000001

# Add the small value one million times; the exact answer
# would be big_value + 1.0.
for i in range(1000000):
    big_value = big_value + small_value

print(big_value - 1000000000)
# Output: 0.95367431640625 (expected 1.0)
```

Try replacing big_value with 1 and you will see that the error becomes much smaller.
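The comparison can be made concrete with a small helper that runs the same loop from any starting value (the function name `accumulated_error` is just for illustration, not from the original code):

```python
def accumulated_error(base, increment=0.000001, steps=1000000):
    """Add `increment` to `base` `steps` times and return how far the
    result drifts from the exact answer (base + increment * steps)."""
    total = base
    for _ in range(steps):
        total = total + increment
    exact_sum = increment * steps  # 1.0 for the defaults
    return abs((total - base) - exact_sum)

print(accumulated_error(1000000000))  # -> 0.04632568359375 (large error)
print(accumulated_error(1.0))         # tiny error, many orders of magnitude smaller
```

Starting from 1,000,000,000, each addition of 0.000001 is rounded to the nearest representable double, and those rounding errors accumulate into a visible 0.046. Starting from 1, the increment is easily representable relative to the running total, so the error stays negligible.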

Conclusion: Because of this numerical stability problem, and because most algorithms work best with zero-mean, equal-variance inputs, you should always use zero-mean, equal-variance features, or transform your features to have those properties, as far as possible.
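As a minimal sketch of that transformation, here is one way to standardize features to zero mean and unit variance with NumPy (the feature matrix `X` below is a made-up example with two columns on wildly different scales):

```python
import numpy as np

# Hypothetical feature matrix: 3 samples, 2 features on very different scales.
X = np.array([[1000000000.0, 0.000001],
              [1000000002.0, 0.000003],
              [1000000004.0, 0.000005]])

# Standardize each column: subtract the column mean, divide by the column std.
mean = X.mean(axis=0)
std = X.std(axis=0)
X_scaled = (X - mean) / std

print(X_scaled.mean(axis=0))  # approximately [0, 0]
print(X_scaled.std(axis=0))   # approximately [1, 1]
```

In practice you would typically use a library helper such as scikit-learn's `StandardScaler`, which does the same computation and remembers the mean and std so the identical transform can be applied to test data.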
