layers.py source code

python

Project: tefla    Author: openAGI
import tensorflow as tf


def selu(x, alpha=None, scale=None, name='selu', outputs_collections=None, **unused):
    """
    Computes selu

    Args:
        x: a `Tensor` with type `float`, `double`, `int32`, `int64`, `uint8`, `int16`, or `int8`.
        alpha: float, SELU parameter derived from the zero-mean, unit-variance fixed point;
            defaults to the standard value when None
        scale: float, SELU parameter derived from the zero-mean, unit-variance fixed point;
            defaults to the standard value when None
        name: an optional scope/name for the layer
        outputs_collections: The collections to which the outputs are added.

    Returns:
        A `Tensor` representing the results of the selu activation operation.
    """
    _check_unused(unused, name)
    with tf.name_scope(name):
        if None in (alpha, scale):
            # using parameters from 0 mean, unit variance points
            alpha = 1.6732632423543772848170429916717
            scale = 1.0507009873554804934193349852946
        output = scale * tf.where(x >= 0.0, x, alpha * tf.nn.elu(x))
        return _collect_named_outputs(outputs_collections, name, output)
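The activation itself can be checked outside of TensorFlow with plain Python: for `x >= 0` it returns `scale * x`, otherwise `scale * alpha * (exp(x) - 1)`, using the same fixed-point constants as above. A minimal sketch (the helper name `selu_scalar` is my own, not part of tefla):

```python
import math

# Standard SELU constants, chosen so the activation has a
# zero-mean, unit-variance fixed point (same values as the layer above).
ALPHA = 1.6732632423543772848170429916717
SCALE = 1.0507009873554804934193349852946


def selu_scalar(x):
    """Scalar SELU: scale * x for x >= 0, scale * alpha * (exp(x) - 1) otherwise."""
    if x >= 0.0:
        return SCALE * x
    return SCALE * ALPHA * (math.exp(x) - 1.0)
```

Note that as `x` goes to negative infinity the output saturates at `-scale * alpha` (about -1.7581), which bounds the negative part of the activation.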