def applyActivationFunction_LeakyReLU(inputData, leakiness):
    """Apply a leaky rectifier element-wise to the input tensor.

    leakiness : float
        Slope for negative input, usually between 0 and 1.
        A leakiness of 0 gives the standard rectifier,
        a leakiness of 1 gives a linear activation function,
        and any value in between gives a leaky rectifier.

    The input is a tensor of shape (batchSize, FeatMaps, xDim, yDim, zDim).

    [1] Maas et al. (2013):
        Rectifier Nonlinearities Improve Neural Network Acoustic Models,
        http://web.stanford.edu/~awni/papers/relu_hybrid_icml2013_final.pdf
    """
    # Rewrite max(x, leakiness*x) as a single algebraic expression:
    # 0.5*(1 + leakiness)*x + 0.5*(1 - leakiness)*|x|
    pos = 0.5 * (1 + leakiness)
    neg = 0.5 * (1 - leakiness)
    output = pos * inputData + neg * abs(inputData)
    return output
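
# Minimal usage sketch (an assumption for illustration: the function is applied
# to a NumPy array, so the builtin abs() dispatches to NumPy's element-wise
# absolute value; the 5-D shape below is only an example).
import numpy as np

x = np.random.randn(2, 4, 3, 3, 3)  # (batchSize, FeatMaps, xDim, yDim, zDim)
y = applyActivationFunction_LeakyReLU(x, leakiness=0.2)
# The algebraic form above is equivalent to max(x, leakiness * x):
assert np.allclose(y, np.where(x > 0, x, 0.2 * x))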
# *** There are actually several ways to implement PReLU activations ***
# PReLU activations (from Kamnitsas); one common variant is sketched below
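
# A minimal NumPy sketch of one such variant (an assumption for illustration,
# not Kamnitsas's original Theano code): PReLU (He et al., 2015) makes the
# negative slope a learnable parameter, typically one scalar per feature map,
# broadcast over the batch and spatial dimensions. The function and parameter
# names below are hypothetical.
def applyActivationFunction_PReLU(inputData, aPrelu):
    """aPrelu : array of shape (FeatMaps,), one learnable slope per feature map.
    inputData : tensor of shape (batchSize, FeatMaps, xDim, yDim, zDim)."""
    # Reshape the slopes so they broadcast over batch and spatial dimensions.
    slope = aPrelu.reshape(1, -1, 1, 1, 1)
    # Same algebraic identity as the leaky rectifier, with a per-map slope.
    pos = 0.5 * (1 + slope)
    neg = 0.5 * (1 - slope)
    return pos * inputData + neg * np.abs(inputData)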