ActivationFunctions.py source code

python

Project: LiviaNET    Author: josedolz
def applyActivationFunction_PReLU_v2(inputData, PreluActivations):
    """ inputData is a 5D tensor with shape:
        (batchSize,
         number of feature maps,
         convolvedImageShape[0],
         convolvedImageShape[1],
         convolvedImageShape[2]) """

    # The input is a tensor of shape (batchSize, FeatMaps, xDim, yDim, zDim).
    # Broadcast the per-feature-map PReLU slopes over batch and spatial dimensions.
    preluActivationsAsRow = PreluActivations.dimshuffle('x', 0, 'x', 'x', 'x')

    # PReLU(x) = max(0, x) + a * min(0, x)
    pos = (inputData + abs(inputData)) / 2.0                            # max(0, x)
    neg = preluActivationsAsRow * ((inputData - abs(inputData)) / 2.0)  # a * min(0, x)
    output = pos + neg

    return output

# --- version 3 ---
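Not part of the original file: a minimal usage sketch, assuming a plain Theano setup. The names numberOfFeatureMaps, featureMaps, and prelu_a are illustrative placeholders; the PReLU slopes are stored as a shared variable with one entry per feature map, which the function above broadcasts over the batch and spatial dimensions.

# Minimal usage sketch (assumption: Theano backend, as used by LiviaNET).
import numpy as np
import theano
import theano.tensor as T

numberOfFeatureMaps = 8

# One learnable slope per feature map, initialised to 0.25 (illustrative value).
PreluActivations = theano.shared(
    np.full((numberOfFeatureMaps,), 0.25, dtype=theano.config.floatX),
    name='prelu_a')

# Symbolic 5D input: (batchSize, featMaps, xDim, yDim, zDim)
featureMaps = T.tensor5('featureMaps')

activated = applyActivationFunction_PReLU_v2(featureMaps, PreluActivations)

f = theano.function([featureMaps], activated)
sample = np.random.randn(2, numberOfFeatureMaps, 5, 5, 5).astype(theano.config.floatX)
print(f(sample).shape)   # (2, 8, 5, 5, 5)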