layers.py source code

Project: keras-text · Author: raghakot
```python
# Imports assumed by this snippet (Keras 2.x era, matching keras-text):
from keras import initializers, regularizers, constraints
from keras.engine.topology import Layer


class AttentionLayer(Layer):
    def __init__(self,
                 kernel_initializer='he_normal',
                 kernel_regularizer=None,
                 kernel_constraint=None,
                 use_bias=True,
                 bias_initializer='zeros',
                 bias_regularizer=None,
                 bias_constraint=None,
                 use_context=True,
                 context_initializer='he_normal',
                 context_regularizer=None,
                 context_constraint=None,
                 attention_dims=None,
                 **kwargs):
        """
        Args:
            attention_dims: The dimensionality of the inner attention-scoring network.
                For input `(32, 10, 300)`, with `attention_dims` of 100, the output is `(32, 10, 100)`,
                i.e., the attended words are 100-dimensional. This is then collapsed via summation to
                `(32, 10, 1)` to indicate the attention weights for the 10 words.
                If set to None, the input's feature dimensionality is used as `attention_dims`. (Default value: None)
        """
        if 'input_shape' not in kwargs and 'input_dim' in kwargs:
            kwargs['input_shape'] = (kwargs.pop('input_dim'),)

        super(AttentionLayer, self).__init__(**kwargs)
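        # Weights of the inner attention projection (the "kernel").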
        self.kernel_initializer = initializers.get(kernel_initializer)
        self.kernel_regularizer = regularizers.get(kernel_regularizer)
        self.kernel_constraint = constraints.get(kernel_constraint)

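        # Optional bias for the attention projection.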
        self.use_bias = use_bias
        self.bias_initializer = initializers.get(bias_initializer)
        self.bias_regularizer = regularizers.get(bias_regularizer)
        self.bias_constraint = constraints.get(bias_constraint)

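        # Optional learned context vector used to score each timestep.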
        self.use_context = use_context
        self.context_initializer = initializers.get(context_initializer)
        self.context_regularizer = regularizers.get(context_regularizer)
        self.context_constraint = constraints.get(context_constraint)

        self.attention_dims = attention_dims
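        # Declare mask support so masks from e.g. Embedding(mask_zero=True) propagate.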
        self.supports_masking = True
```
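A minimal usage sketch may help place the constructor in context. The surrounding model below is an assumption for illustration (it is not part of this file); it only presumes the `AttentionLayer` class above is in scope, and it reuses the docstring's shape example of `(32, 10, 300)` inputs with `attention_dims=100`:

```python
# Hypothetical usage sketch -- the encoder around AttentionLayer is assumed,
# not taken from keras-text itself.
from keras.models import Sequential
from keras.layers import Embedding, Bidirectional, LSTM

model = Sequential()
# 10-token sequences embedded into 300 dims -> (batch, 10, 300)
model.add(Embedding(input_dim=20000, output_dim=300, input_length=10))
# 150 units per direction, concatenated -> features stay 300-dimensional
model.add(Bidirectional(LSTM(150, return_sequences=True)))
# Inner attention network of width 100; per the docstring, the per-word
# attention scores collapse to shape (batch, 10, 1)
model.add(AttentionLayer(attention_dims=100))
```

With `attention_dims=None` (the default), the layer would instead size its inner network to the 300 input features.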