ActivationFunctions.py source code


Project: LiviaNET    Author: josedolz    Language: Python
def applyActivationFunction_LeakyReLU(inputData, leakiness):
    """Apply a leaky rectifier elementwise.

    inputData : tensor of shape (batchSize, FeatMaps, xDim, yDim, zDim)
    leakiness : float
        Slope for negative input, usually between 0 and 1.
        A leakiness of 0 gives the standard rectifier,
        a leakiness of 1 gives a linear activation function,
        and any value in between gives a leaky rectifier.

        [1] Maas et al. (2013):
        Rectifier Nonlinearities Improve Neural Network Acoustic Models,
        http://web.stanford.edu/~awni/papers/relu_hybrid_icml2013_final.pdf
    """
    # pos + neg = 1 and pos - neg = leakiness, so the expression below is
    # the identity for inputData >= 0 and leakiness * inputData otherwise.
    pos = 0.5 * (1 + leakiness)
    neg = 0.5 * (1 - leakiness)

    output = pos * inputData + neg * abs(inputData)

    return output
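A quick sanity check of the three regimes described in the docstring (a NumPy sketch; in LiviaNET the input would be a Theano tensor, but the expression is purely elementwise, so a plain array behaves the same way):

import numpy as np

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0], dtype=np.float32)

print(applyActivationFunction_LeakyReLU(x, 0.0))  # [ 0.   0.   0.   0.5  2. ]  standard rectifier
print(applyActivationFunction_LeakyReLU(x, 0.2))  # [-0.4 -0.1  0.   0.5  2. ]  leaky rectifier
print(applyActivationFunction_LeakyReLU(x, 1.0))  # [-2.  -0.5  0.   0.5  2. ]  linear (identity)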

# *** There are actually several ways to implement PReLU activations ***

# PReLU activations (from Kamnitsas)
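The variant the comment points at learns the negative-side slope instead of fixing it: PReLU keeps one trainable slope per feature map (He et al. 2015). Below is a minimal NumPy sketch of that idea; the function name and shapes are illustrative assumptions, not the Kamnitsas/LiviaNET implementation, where the slopes would be shared Theano variables updated during training:

import numpy as np

def applyActivationFunction_PReLU(inputData, aPrelu):
    """PReLU sketch (assumed helper): one learnable slope per feature map.

    inputData : array of shape (batchSize, FeatMaps, xDim, yDim, zDim)
    aPrelu    : array of shape (FeatMaps,)
    """
    # Reshape the per-channel slopes so they broadcast along axis 1.
    slope = aPrelu.reshape(1, -1, 1, 1, 1)
    pos = np.maximum(inputData, 0)           # identity on the positive side
    neg = slope * np.minimum(inputData, 0)   # learned slope on the negative side
    return pos + neg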