attention.py source code

python

Project: extkeras  Author: andhus
def attention_step(
    self,
    attended,
    attention_states,
    step_input,
    recurrent_states
):
    """One step of the attention recurrence.

    attention_states carries [attention_tm1, kappa_tm1] from the
    previous timestep; returns the new attention vector together
    with the updated state list.
    """
    # Unpack the previous attention output and window position.
    [attention_tm1, kappa_tm1] = attention_states
    # Derive attention parameters from the current step input and
    # the first recurrent state.
    params = self.params_layer(
        concatenate([step_input, recurrent_states[0]])
    )
    # Compute the new attention vector and advance kappa.
    attention, kappa = self._get_attention_and_kappa(
        attended,
        params,
        kappa_tm1
    )
    return attention, [attention, kappa]
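The `kappa` state carried between steps suggests Graves-style Gaussian window attention, in which `kappa` is a monotonically advancing window position over the attended sequence. Below is a minimal NumPy sketch of what `_get_attention_and_kappa` might compute under that assumption; the function name, parameter layout, and shapes here are illustrative guesses, not taken from the extkeras source.

```python
import numpy as np

def get_attention_and_kappa(attended, params, kappa_tm1):
    """Sketch of a Gaussian-window attention step (assumed semantics).

    attended:  (timesteps, features) sequence to attend over.
    params:    (3 * n_components,) raw parameters, assumed to pack
               [alpha, beta, kappa_delta] for each mixture component.
    kappa_tm1: (n_components,) window positions from the previous step.
    """
    n = params.shape[0] // 3
    alpha = np.exp(params[:n])          # component weights
    beta = np.exp(params[n:2 * n])      # component precisions (window widths)
    # Positions only ever move forward: kappa_t = kappa_{t-1} + exp(delta).
    kappa = kappa_tm1 + np.exp(params[2 * n:])
    # phi[t] = sum_k alpha_k * exp(-beta_k * (kappa_k - t)^2)
    u = np.arange(attended.shape[0])[:, None]          # (timesteps, 1)
    phi = (alpha * np.exp(-beta * (kappa - u) ** 2)).sum(axis=1)
    # Attention vector is the phi-weighted sum over timesteps.
    attention = phi @ attended                          # (features,)
    return attention, kappa
```

With this layout, a call on a 10-step, 4-feature `attended` array and three components yields an `attention` vector of shape `(4,)` and an updated `kappa` of shape `(3,)` that is strictly greater than `kappa_tm1`.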