dizzyRNNCellOptHackyReLU.py source code

python

Project: dizzy_layer   Author: Pastromhaug
def __call__(self, inputs, state, scope=None):
    # Assumed file-level imports (not shown on this page): tensorflow as tf,
    # tensorflow.python.ops.variable_scope as vs, and
    # tensorflow.python.ops.init_ops (TensorFlow 1.x).
    with vs.variable_scope(scope or type(self).__name__):
        # Apply the learned sequence of 2-D rotations to the (transposed) state.
        t_state = tf.transpose(state)
        state_out = doRotations(t_state, self._rotations)
        state_out = tf.transpose(state_out)

        # Project the input to num_units; the shared bias is added separately below.
        input_out = linearTransformWithBias(
            [inputs], self._num_units, bias=False, scope=scope)

        bias = vs.get_variable(
            "Bias", [self._num_units],
            dtype=tf.float32,
            initializer=init_ops.constant_initializer(0.0, dtype=tf.float32))

        # ReLU nonlinearity; the cell's output also serves as its next state.
        output = tf.nn.relu(state_out + input_out + bias)
    return output, output
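
Since only the __call__ body appears on this page, below is a minimal, self-contained NumPy sketch of the recurrence it computes, h_t = relu(R * h_{t-1} + W * x_t + b). The project's doRotations and linearTransformWithBias helpers are stood in for by plain matrices R and W; those stand-ins, and all names and shapes below, are assumptions made purely for illustration.

import numpy as np

rng = np.random.default_rng(0)
num_units, input_dim = 4, 3

# Stand-in parameters: the real R is built from a sequence of 2-D rotations
# and is therefore orthogonal, which a QR decomposition mimics here.
R = np.linalg.qr(rng.normal(size=(num_units, num_units)))[0]
W = rng.normal(size=(input_dim, num_units))
b = np.zeros(num_units)

def step(x, h):
    """One recurrent step: rotate the state, transform the input, add bias, ReLU."""
    return np.maximum(h @ R.T + x @ W + b, 0.0)

h = np.zeros(num_units)
for x in rng.normal(size=(5, input_dim)):   # a toy length-5 input sequence
    h = step(x, h)
print(h)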