The Softmax Function

The softmax function is commonly used in classification problems, for example in the output layer of neural networks and in multinomial logistic regression. It is a generalisation of the logistic function:

f(x) = 1 / (1 + e^(-k(x - x0)))

The output of the softmax function can be read as a categorical distribution: each outcome is assigned a probability. The probability of outcome i depends not only on its own score Zi but on the scores of all the other outcomes as well:

Yi = e^Zi / sum_j(e^Zj)
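To make the formula concrete, here is a small worked example (the input vector (1, 2, 3) is my own choice) showing that the outputs form a probability distribution:

```python
import numpy as np

# Worked example: softmax of z = (1, 2, 3).
z = np.array([1.0, 2.0, 3.0])
y = np.exp(z) / np.exp(z).sum()
print(y)        # ≈ [0.0900, 0.2447, 0.6652]
print(y.sum())  # sums to 1 (up to floating-point rounding)
```

Note how the largest score gets the largest probability, but every score contributes to the denominator.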

The beauty of this function is that its derivative with respect to Zi has an elegant closed form. For the derivative of Yi with respect to its own input Zi:

dYi/dZi = Yi(1 - Yi)

So it is very easy to work with.
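As a quick sanity check on that derivative, the sketch below (the helper names and test vector are my own, not from the post) compares the closed form against a central finite difference:

```python
import numpy as np

def softmax(z):
    ez = np.exp(z)
    return ez / ez.sum()

# Verify dYi/dZi = Yi * (1 - Yi) numerically with a central difference.
z = np.array([1.0, 2.0, 3.0])
y = softmax(z)
h = 1e-6
for i in range(len(z)):
    zp, zm = z.copy(), z.copy()
    zp[i] += h
    zm[i] -= h
    numeric = (softmax(zp)[i] - softmax(zm)[i]) / (2 * h)
    analytic = y[i] * (1 - y[i])
    assert abs(numeric - analytic) < 1e-6
```

The same check can be repeated for any input vector; the analytic form always matches.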

Some Python…

Let's implement the softmax function in Python. It receives as input the array to which we would like to apply the softmax function, and returns the probability for each item in the array:

import numpy as np

# Define our softmax function
def softmax(x):
    ex = np.exp(x)
    return ex / np.sum(ex)


print(softmax([1, 2, 3]))
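One caveat worth noting (not covered in the post itself): np.exp overflows for large scores, so a common variant subtracts the maximum score first, which leaves the output unchanged. A minimal sketch, with the function name softmax_stable being my own:

```python
import numpy as np

def softmax_stable(x):
    x = np.asarray(x, dtype=float)
    shifted = x - np.max(x)  # shifting by a constant does not change the result
    ex = np.exp(shifted)
    return ex / np.sum(ex)

# The naive version would overflow on exp(1000); this one does not.
print(softmax_stable([1000, 1001, 1002]))
```

Because softmax only depends on the differences between scores, softmax_stable([1000, 1001, 1002]) equals softmax([0, 1, 2]).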

Softmax Function in Python

Yoni Amishav


Tech lead, blogger, Node.js, Angular and more ...
