kernels.py source code

Project: GPflow    Author: GPflow
# Method of a GPflow kernel class, shown with the imports it relies on.
import tensorflow as tf

from .quadrature import mvnquad  # GPflow's Gauss-Hermite quadrature helper


def exKxz(self, Z, Xmu, Xcov):
    """
    Computes <x_t K_{x_t z}>_q(x) for the same x_t.
    :param Z: Fixed inputs (MxD).
    :param Xmu: X means (TxD).
    :param Xcov: TxDxD. Contains covariances for each x_t.
    :return: (TxMxD).
    """
    self._check_quadrature()
    # Slicing is NOT needed here. The desired behaviour is to *still* return a
    # TxMxD matrix: even when the kernel does not depend on certain inputs, the
    # output must still contain the outer product between the mean of x_t and
    # K_{x_t Z}. The code below does this correctly and automatically, since
    # the quadrature is still done over the distribution of x_t; the kernel
    # simply ignores the inputs it does not depend on.
    # However, this does mean that at the time of running this function we need
    # to know the input *size* of Xmu, not just `input_dim`.
    M = tf.shape(Z)[0]
    # Number of actual input dimensions
    D = self.input_size if hasattr(self, 'input_size') else self.input_dim

    msg = "Numerical quadrature needs to know the correct shape of Xmu."
    assert_shape = tf.assert_equal(tf.shape(Xmu)[1], D, message=msg)
    with tf.control_dependencies([assert_shape]):
        Xmu = tf.identity(Xmu)

    def integrand(x):
        # (T, M, 1) * (T, 1, D) -> (T, M, D)
        return tf.expand_dims(self.K(x, Z), 2) * tf.expand_dims(x, 1)

    num_points = self.num_gauss_hermite_points
    return mvnquad(integrand, Xmu, Xcov, num_points, D, Dout=(M, D))
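The heavy lifting above happens in `mvnquad`, which approximates the expectation of `integrand` under each Gaussian q(x_t) via Gauss-Hermite quadrature. To make the mechanism concrete, here is a minimal NumPy-only sketch of the same idea for a single Gaussian. The function name `mvnquad_sketch` and its exact signature are illustrative assumptions, not GPflow's API; the change of variables x = mu + sqrt(2) L xi (with cov = L L^T) is the standard trick that turns the Gaussian expectation into a Hermite-weighted integral.

```python
import numpy as np

def mvnquad_sketch(func, mu, cov, num_points):
    """Approximate E[func(x)] for x ~ N(mu, cov) by Gauss-Hermite quadrature.

    A simplified NumPy analogue of GPflow's mvnquad (an illustrative sketch,
    not the library's implementation). mu: (D,), cov: (D, D).
    """
    D = mu.shape[0]
    # 1-D nodes/weights for the weight function exp(-t^2).
    xi, w = np.polynomial.hermite.hermgauss(num_points)
    # Build the D-dimensional tensor-product grid of nodes and weights.
    grids = np.meshgrid(*([xi] * D), indexing='ij')
    pts = np.stack([g.ravel() for g in grids], axis=-1)       # (n**D, D)
    wgrids = np.meshgrid(*([w] * D), indexing='ij')
    weights = np.prod(np.stack([g.ravel() for g in wgrids], axis=-1), axis=-1)
    # Change of variables: x = mu + sqrt(2) * L @ xi, where cov = L @ L.T.
    L = np.linalg.cholesky(cov)
    x = mu + np.sqrt(2.0) * pts @ L.T
    vals = np.array([func(xp) for xp in x])
    # Weighted sum, with the pi^(-D/2) normalisation from the substitution.
    return np.tensordot(weights, vals, axes=1) * np.pi ** (-D / 2)
```

With enough points the rule is exact for polynomials, so `mvnquad_sketch(lambda x: x, mu, cov, 10)` recovers `mu` and `lambda x: np.outer(x, x)` recovers `cov + np.outer(mu, mu)`; the real `mvnquad` additionally batches this over all T Gaussians and reshapes the output to `Dout`.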