rnn.py source code

python

Project: seq2seq   Author: eske
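The call method below belongs to a GRU-style RNN cell with optional layer normalization (tf.contrib.layers.layer_norm), written against the TensorFlow 1.x API.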
def call(self, inputs, state):
        inputs = tf.concat(inputs, axis=1)
        input_size = inputs.shape[1]
        state_size = state.shape[1]
        dtype = inputs.dtype

        # Reset and update gates: a single matmul over [inputs, state] yields
        # both gates, which are split after the sigmoid.
        with tf.variable_scope("gates"):
            bias_initializer = self._bias_initializer
            if self._bias_initializer is None and not self._layer_norm:  # bias of 1 for layer norm?
                # init_ops.constant_initializer is tf.constant_initializer in TF 1.x
                bias_initializer = init_ops.constant_initializer(1.0, dtype=dtype)

            bias = tf.get_variable('bias', [2 * self._num_units], dtype=dtype, initializer=bias_initializer)
            weights = tf.get_variable('kernel', [input_size + state_size, 2 * self._num_units], dtype=dtype,
                                      initializer=self._kernel_initializer)

            inputs_ = tf.matmul(inputs, weights[:input_size])
            state_ = tf.matmul(state, weights[input_size:])

            if self._layer_norm:
                inputs_ = tf.contrib.layers.layer_norm(inputs_, scope='inputs')
                state_ = tf.contrib.layers.layer_norm(state_, scope='state')

            value = tf.nn.sigmoid(inputs_ + state_ + bias)
            r, u = tf.split(value=value, num_or_size_splits=2, axis=1)

        # Candidate state: the reset gate r scales the previous state before it
        # is concatenated with the inputs.
        with tf.variable_scope("candidate"):
            bias = tf.get_variable('bias', [self._num_units], dtype=dtype, initializer=self._bias_initializer)
            weights = tf.get_variable('kernel', [input_size + state_size, self._num_units], dtype=dtype,
                                      initializer=self._kernel_initializer)

            c = tf.matmul(tf.concat([inputs, r * state], axis=1), weights)

            if self._layer_norm:
                c = tf.contrib.layers.layer_norm(c)

            c = self._activation(c + bias)

        # Standard GRU interpolation between the previous state and the candidate.
        new_h = u * state + (1 - u) * c
        # Output and new state are the same tensor, as in a plain GRUCell.
        return new_h, new_h
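
For reference, the computation above is the standard GRU update, with layer normalization, when enabled, applied separately to the input and state projections before they are summed:

    [r; u] = sigmoid([x, h] · W_g + b_g)          reset and update gates
    c      = activation([x, r * h] · W_c + b_c)   candidate state
    h_new  = u * h + (1 - u) * c                  interpolation between old state and candidate

A minimal usage sketch for driving a cell with this call signature under the TF 1.x API is shown below. The built-in tf.nn.rnn_cell.GRUCell is used as a stand-in so the snippet runs on its own; the batch size, sequence length, and feature sizes are illustrative, not taken from the original project.

    import tensorflow as tf  # TF 1.x API

    # Stand-in for the layer-norm GRU cell defined above; any RNNCell whose
    # call(inputs, state) returns (output, new_state) can be swapped in here.
    cell = tf.nn.rnn_cell.GRUCell(num_units=256)

    # [batch, time, input_size] -- shapes are illustrative.
    inputs = tf.placeholder(tf.float32, [None, 50, 128])

    # dynamic_rnn unrolls the cell over the time dimension.
    outputs, final_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)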