attention.py source code

Project: extkeras    Author: andhus
# Excerpt: constructor of the GravesSequenceAttention layer from
# extkeras/layers/attention.py. The parent class and
# AlexGravesSequenceAttentionParams are defined elsewhere in the module.
from keras.engine import InputSpec  # InputSpec location in Keras 2.x

def __init__(
        self,
        n_components,
        alpha_activation=None,
        beta_activation=None,
        kappa_activation=None,
        *args,
        **kwargs
    ):
        super(GravesSequenceAttention, self).__init__(*args, **kwargs)
        # Maps the recurrent state to the mixture parameters of the
        # attention window: alpha (component weights), beta (widths)
        # and kappa (locations), as in Graves (2013).
        self.distribution = AlexGravesSequenceAttentionParams(
            n_components,
            alpha_activation,
            beta_activation,
            kappa_activation,
        )
        # Recurrent attention state carried between timesteps: the
        # previous attention vector and the previous window location kappa.
        self._attention_states = [None, None]
        self._attention_state_spec = [
            InputSpec(ndim=2),          # attention (tm1)
            InputSpec(shape=(None, 1))  # kappa
        ]
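For reference, alpha, beta and kappa above are the mixture weights, widths and locations of the Gaussian-style attention window introduced in Graves (2013), "Generating Sequences With Recurrent Neural Networks". Below is a minimal NumPy sketch of how such a window weights input positions; it is illustrative only, not the extkeras implementation, and the function name and variable names are assumptions:

import numpy as np

def graves_attention_window(alpha, beta, kappa, seq_len):
    """Attention weights phi[u] over input positions u = 0..seq_len-1.

    alpha, beta, kappa: arrays of shape (n_components,) holding the
    mixture parameters for a single timestep.
    """
    u = np.arange(seq_len)
    # phi(u) = sum_k alpha_k * exp(-beta_k * (kappa_k - u)**2)
    return np.sum(
        alpha[:, None] * np.exp(-beta[:, None] * (kappa[:, None] - u) ** 2),
        axis=0,
    )

# Example: three components attending over a length-10 sequence.
phi = graves_attention_window(
    alpha=np.array([0.5, 0.3, 0.2]),
    beta=np.array([1.0, 0.5, 2.0]),
    kappa=np.array([2.0, 5.0, 8.0]),
    seq_len=10,
)
context = phi @ np.random.rand(10, 4)  # weighted sum over a (10, 4) sequence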