It is a smooth approximation (in fact, an analytic function) to the ramp function, which is known as the rectifier or ReLU (rectified linear unit) in machine learning. For large negative $x$ it is approximately $e^x$, so just above 0, while for large positive $x$ it is approximately $x + e^{-x}$, so just above $x$.
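One way to see this (a standard identity, spelled out here for clarity): since

$$\ln(1 + e^x) = \max(0, x) + \ln\!\left(1 + e^{-|x|}\right),$$

the gap between softplus and the ramp function $\max(0, x)$ is $\ln(1 + e^{-|x|})$, which is at most $\ln 2$ (attained at $x = 0$) and decays exponentially as $|x| \to \infty$.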
The names softplus[1][2] and SmoothReLU[3] are used in machine learning. The name "softplus" (2000), by analogy with the earlier softmax (1989), is presumably because it is a smooth (soft) approximation of the positive part of $x$, which is sometimes denoted with a superscript plus, $x^+ := \max(0, x)$.
The multivariable generalization of single-variable softplus is the LogSumExp with the first argument set to zero:

$$\operatorname{LSE}_0^+(x_1, \dots, x_n) := \operatorname{LSE}(0, x_1, \dots, x_n) = \ln\!\left(1 + e^{x_1} + \cdots + e^{x_n}\right).$$

The LogSumExp function itself is

$$\operatorname{LSE}(x_1, \dots, x_n) = \ln\!\left(e^{x_1} + \cdots + e^{x_n}\right),$$
and its gradient is the softmax; the softmax with the first argument set to zero is the multivariable generalization of the logistic function. Both LogSumExp and softmax are used in machine learning.
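For concreteness, here is a minimal NumPy sketch of these relationships (the helper names are ours, not from any particular library; SciPy offers equivalents such as scipy.special.logsumexp and scipy.special.softmax):

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: ln(1 + e^x) = max(x, 0) + ln(1 + e^{-|x|})
    return np.maximum(x, 0) + np.log1p(np.exp(-np.abs(x)))

def logsumexp(xs):
    # Stable LogSumExp: shift by the maximum before exponentiating
    m = np.max(xs)
    return m + np.log(np.sum(np.exp(xs - m)))

def softmax(xs):
    # Softmax is the gradient of LogSumExp
    e = np.exp(xs - np.max(xs))
    return e / np.sum(e)

# Single-variable softplus agrees with LSE(0, x)
x = 2.0
assert np.isclose(softplus(x), logsumexp(np.array([0.0, x])))

# Multivariable softplus: LSE with the first argument set to zero
xs = np.array([1.5, -0.3, 0.7])
print(logsumexp(np.concatenate(([0.0], xs))))

# The logistic function is softmax with the first argument set to zero
print(softmax(np.array([0.0, x]))[1], 1 / (1 + np.exp(-x)))
```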
Convex conjugate
The convex conjugate (specifically, the Legendre transformation) of the softplus function is the negative binary entropy (with base e). This is because, under the Legendre transformation, the derivatives of a function and of its conjugate are inverse functions of each other: the derivative of softplus is the logistic function, whose inverse is the logit, which in turn is the derivative of the negative binary entropy.
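Concretely (a short derivation, writing $f(x) = \ln(1 + e^x)$): the supremum in $f^*(y) = \sup_x \, (xy - f(x))$ is attained where $f'(x) = \tfrac{1}{1 + e^{-x}} = y$, i.e. at $x = \operatorname{logit}(y) = \ln\tfrac{y}{1-y}$, which gives

$$f^*(y) = y \ln y + (1 - y)\ln(1 - y), \qquad y \in (0, 1),$$

the negative binary entropy.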
Softplus can be interpreted as logistic loss (as a positive number), so, by duality, minimizing logistic loss corresponds to maximizing entropy. This justifies the principle of maximum entropy as loss minimization.
References
1. Dugas, Charles; Bengio, Yoshua; Bélisle, François; Nadeau, Claude; Garcia, René (2000). "Incorporating second-order functional knowledge for better option pricing" (PDF). Proceedings of the 13th International Conference on Neural Information Processing Systems (NIPS'00). MIT Press: 451–457. "Since the sigmoid h has a positive first derivative, its primitive, which we call softplus, is convex."
2. Glorot, Xavier; Bordes, Antoine; Bengio, Yoshua (2011-06-14). "Deep Sparse Rectifier Neural Networks". Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics. JMLR Workshop and Conference Proceedings: 315–323. "Rectifier and softplus activation functions. The second one is a smooth version of the first."