ActivationFunctions.py source code

Language: Python

Project: LiviaNET | Author: josedolz
import theano.tensor as T

def applyActivationFunction_PReLU(inputData, PreluActivations):
    """Parametric Rectified Linear Unit.
    It follows:
    `f(x) = alpha * x for x < 0`,
    `f(x) = x for x >= 0`,
    where `alpha` is a learned vector with one coefficient per feature
    map, broadcast over the batch and spatial axes of x.

    - The input is a tensor of shape (batchSize, FeatMaps, xDim, yDim, zDim)
    """
    # Reshape the per-feature-map alphas from (FeatMaps,) to
    # (1, FeatMaps, 1, 1, 1) so they broadcast against the 5-D input.
    preluActivationsAsRow = PreluActivations.dimshuffle('x', 0, 'x', 'x', 'x')

    # PReLU(x) = max(0, x) + alpha * min(0, x),
    # where min(0, x) is computed via the identity (x - |x|) / 2.
    pos = T.maximum(0, inputData)
    neg = preluActivationsAsRow * (inputData - T.abs_(inputData)) * 0.5

    return pos + neg
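
For context, here is a minimal usage sketch, not part of the LiviaNET source: it assumes a Theano backend with one learnable alpha per feature map, compiles the expression, and checks it against the PReLU definition on random data. The names `numFeatMaps`, `alpha`, and `prelu` are hypothetical.

import numpy as np
import theano
import theano.tensor as T

# Hypothetical setup: 4 feature maps, so 4 learnable PReLU coefficients,
# initialised to 0.25 (a common PReLU starting value).
numFeatMaps = 4
alpha = theano.shared(
    np.full((numFeatMaps,), 0.25, dtype=theano.config.floatX),
    name='prelu_alpha')

# A symbolic 5-D input of shape (batchSize, FeatMaps, xDim, yDim, zDim).
x = T.TensorType(theano.config.floatX, (False,) * 5)('x')
prelu = theano.function([x], applyActivationFunction_PReLU(x, alpha))

data = np.random.randn(2, numFeatMaps, 3, 3, 3).astype(theano.config.floatX)
out = prelu(data)

# Check against the definition: f(x) = x if x >= 0 else alpha * x.
expected = np.where(data >= 0, data, 0.25 * data)
assert np.allclose(out, expected, atol=1e-5)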
