attention.py source code

Language: Python

Project: extkeras | Author: andhus
def __init__(self, recurrent_layer,
                 return_attention=False,
                 concatenate_input=True,
                 attend_after=False,
                 **kwargs):
        super(RecurrentAttention, self).__init__(**kwargs)
        self.recurrent_layer = self.add_child(
            'recurrent_layer',
            recurrent_layer
        )
        self.return_attention = return_attention
        self.concatenate_input = concatenate_input
        self.attend_after = attend_after

        # main input is 3D (samples, timesteps, features); the attended
        # input is validated separately via _attended_spec, hence None here
        self.input_spec = [InputSpec(ndim=3), None]

        # the attended vector and the per-step attention output are both 2D
        self._attended_spec = InputSpec(ndim=2)
        self._attention_step_output_spec = InputSpec(ndim=2)
        self._attention_state_spec = [InputSpec(ndim=2)]
        self._attention_states = [None]

        # will be set in call, then passed to step by get_constants
        self._attended = None
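To make the constructor's bookkeeping pattern concrete, here is a minimal self-contained sketch of the same attribute setup. `InputSpec` and `RecurrentAttentionSketch` below are simplified stand-ins written for illustration, not the real Keras/extkeras classes (in particular, `add_child` from extkeras and Keras's own `InputSpec` are not reproduced here):

```python
class InputSpec:
    """Stand-in: records the expected rank (ndim) of an input tensor."""
    def __init__(self, ndim=None):
        self.ndim = ndim


class RecurrentAttentionSketch:
    """Illustrative stand-in mirroring the constructor's bookkeeping."""
    def __init__(self, recurrent_layer,
                 return_attention=False,
                 concatenate_input=True,
                 attend_after=False):
        # the wrapped recurrent layer (extkeras registers it via add_child)
        self.recurrent_layer = recurrent_layer
        self.return_attention = return_attention
        self.concatenate_input = concatenate_input
        self.attend_after = attend_after

        # main input is 3D; the attended input has no spec here because
        # it is checked separately against _attended_spec
        self.input_spec = [InputSpec(ndim=3), None]
        self._attended_spec = InputSpec(ndim=2)
        self._attention_step_output_spec = InputSpec(ndim=2)
        self._attention_state_spec = [InputSpec(ndim=2)]
        self._attention_states = [None]

        # set in call, then passed to the step function via constants
        self._attended = None


# usage: wrap any recurrent layer object (placeholder string here)
layer = RecurrentAttentionSketch(recurrent_layer="lstm_placeholder",
                                 return_attention=True)
```

The sketch only shows the state the wrapper tracks at construction time; the actual attention computation happens later, in `call` and the step function.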