
Leaky ReLU Derivative Python Implementation with Explanation

Leaky ReLU solves the problem of dead neurons because its output, and hence its gradient, is non-zero even for negative inputs. Let’s see the Leaky ReLU derivative in Python.

Leaky ReLU & its Derivative –

Let’s see the mathematical expression for Leaky ReLU.

f(x) = x        if x ≥ 0
f(x) = α · x    if x < 0

where α is a small positive constant, typically 0.01.

Here we can see that Leaky ReLU returns the input unchanged whenever it is positive. For negative inputs, instead of returning zero as plain ReLU does, it returns the input scaled by the small constant α. Now let’s see the mathematical expression of the derivative of Leaky ReLU.

f'(x) = 1    if x ≥ 0
f'(x) = α    if x < 0

This constant α is the only difference between Leaky ReLU and ReLU. For example, at x = -4 with α = 0.01, ReLU outputs 0 with gradient 0 (so the neuron stops learning), while Leaky ReLU outputs -0.04 with gradient 0.01, so a small update still flows. In the conclusion below, we will see why this matters.

Leaky ReLU Derivative Python Implementation –

In the above section, we saw the mathematical expression. Now let’s see the Leaky ReLU derivative Python implementation.

def leaky_relu(x, alpha=0.01):
  # Scale negative inputs by the small constant alpha;
  # pass positive inputs through unchanged.
  return alpha * x if x < 0 else x

def leaky_relu_derivative(x, alpha=0.01):
  # The slope is alpha on the negative side and 1 elsewhere.
  return alpha if x < 0 else 1
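As a quick sanity check, we can evaluate both functions at a few sample inputs (expected outputs shown in the comments):

for x in (-10.0, -0.5, 0.0, 2.0):
  print(x, leaky_relu(x), leaky_relu_derivative(x))
# -10.0 -0.1 0.01
# -0.5 -0.005 0.01
# 0.0 0.0 1
# 2.0 2.0 1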

Graphical Representation of Leaky ReLU –

In the earlier section, we saw the mathematical expression and Python code for Leaky ReLU. Now let’s see the graph of Leaky ReLU.

[Figure: graph of the Leaky ReLU function, linear with slope 1 for positive inputs and a small slope (0.01) for negative inputs. Source: [1]]

Graphical Representation of Leaky ReLU Derivative –

Let’s see the graphical representation of the Leaky ReLU derivative. Here we need to be careful: for negative values the curve looks approximately zero, but it is actually the small constant α (0.01), not zero.

[Figure: graph of the Leaky ReLU derivative, equal to 1 for positive inputs and 0.01 for negative inputs. Source: [2]]
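If you want to reproduce these graphs yourself, here is a minimal plotting sketch (assuming numpy and matplotlib are installed; the exact styling of the original figures will differ):

import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-10, 10, 400)
y = np.where(x < 0, 0.01 * x, x)   # Leaky ReLU with alpha = 0.01
dy = np.where(x < 0, 0.01, 1.0)    # its derivative

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(x, y)
ax1.set_title("Leaky ReLU")
ax2.plot(x, dy)
ax2.set_title("Leaky ReLU Derivative")
plt.show()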

Conclusion –

Leaky ReLU was an important step forward for neural networks: it helps with the vanishing gradient problem in deep networks, which is a clear reason for its rise in deep learning. The sigmoid function’s derivative lies in the range (0, 0.25], so when the chain rule multiplies many of these small derivatives together during backpropagation, the gradient in the early layers tends to zero and those neurons effectively stop learning. Leaky ReLU’s derivative is 1 for positive inputs and never exactly zero, so gradients keep flowing.
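To make this concrete, here is a tiny illustration of how chained sigmoid derivatives shrink a gradient while Leaky ReLU derivatives do not (illustrative numbers only):

# Best case for sigmoid: every layer contributes its maximum
# derivative of 0.25, so after 10 layers the chained factor is
print(0.25 ** 10)   # 9.5367431640625e-07, effectively vanished

# Leaky ReLU contributes a derivative of 1 for positive inputs,
# so the same product stays at 1
print(1.0 ** 10)    # 1.0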

I hope you now have an intuition of Leaky ReLU and how it helps neural networks. If you still have any doubts about Leaky ReLU, please comment below in the comment box. We also appreciate you reading the whole article till the end.

Thanks 

Data Science Learner Team

References

1. https://paperswithcode.com/method/leaky-relu

2. https://www.analyticsvidhya.com/blog/2020/01/fundamentals-deep-learning-activation-functions-when-to-use-them/