lnRNNCell.py — file source code

Project: Automatic_Speech_Recognition    Author: zzw922cn    Language: Python
def call(self, inputs, state):
    """Gated recurrent unit (GRU) with nunits cells."""
    with tf.variable_scope('layer_normalization'):
      gain1 = tf.get_variable('gain1', shape=[2*self._num_units], initializer=tf.ones_initializer())
      bias1 = tf.get_variable('bias1', shape=[2*self._num_units], initializer=tf.zeros_initializer())
      gain2 = tf.get_variable('gain2', shape=[self._num_units], initializer=tf.ones_initializer())
      bias2 = tf.get_variable('bias2', shape=[self._num_units], initializer=tf.zeros_initializer())

    with vs.variable_scope("gates"):  # Reset gate and update gate.
      # We start with bias of 1.0 to not reset and not update.
      bias_ones = self._bias_initializer
      if self._bias_initializer is None:
        dtype = [a.dtype for a in [inputs, state]][0]
        bias_ones = tf.constant_initializer(1.0, dtype=dtype)
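      # Joint reset/update gate pre-activation, layer-normalized before the
      # sigmoid. `_linear` (affine projection) and `ln` (layer normalization)
      # are helpers defined or imported elsewhere in this file.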
      value = tf.nn.sigmoid(ln(
          _linear([inputs, state], 2 * self._num_units, True, bias_ones,
                  self._kernel_initializer), gain1, bias1))
      r, u = array_ops.split(value=value, num_or_size_splits=2, axis=1)
    with vs.variable_scope("candidate"):
      c = self._activation(ln(
          _linear([inputs, r * state], self._num_units, True,
                  self._bias_initializer, self._kernel_initializer), gain2, bias2))
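    # GRU state update: interpolate between the previous state and the
    # candidate, weighted by the update gate.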
    new_h = u * state + (1 - u) * c
    return new_h, new_h
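
The `ln` helper called above is not included in this excerpt. A minimal layer-normalization sketch that matches the call signature used here, ln(tensor, gain, bias), could look like the following; the normalization axis and epsilon value are assumptions, and the actual implementation in lnRNNCell.py may differ.

def ln(tensor, scale, shift, epsilon=1e-5):
  """Layer normalization over the feature axis (sketch only).

  Normalizes each row of a [batch, features] tensor to zero mean and unit
  variance, then applies the learned gain (`scale`) and bias (`shift`).
  """
  mean, variance = tf.nn.moments(tensor, axes=[1], keep_dims=True)
  normalized = (tensor - mean) / tf.sqrt(variance + epsilon)
  return normalized * scale + shift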