shared.py source code (Python)

Project: adversarial-variational-bayes | Author: gdikov
# gamma comes from scipy.special; knn_distances and volume_of_the_unit_ball
# are helper functions defined elsewhere in this module (shared.py).
from scipy.special import gamma


def estimate_i_alpha(y, co):
    r""" Estimate i_alpha = \int p^{\alpha}(y) dy.

    The Renyi and Tsallis entropies are simple functions of this quantity. 

    Parameters
    ----------
    y : (number of samples, dimension)-ndarray
        One row of y corresponds to one sample.
    co : cost object; details below.
    co.knn_method : str
                    kNN computation method; 'cKDTree' or 'KDTree'.
    co.k : int, >= 1
           k-nearest neighbors.
    co.eps : float, >= 0
             the k^th returned value is guaranteed to be no further than 
             (1+eps) times the distance to the real kNN.
    co.alpha : float
               alpha in the definition of i_alpha

    Returns
    -------
    i_alpha : float
              Estimated i_alpha value.

    Examples
    --------
    i_alpha = estimate_i_alpha(y, co)

    """

    num_of_samples, dim = y.shape
    # kNN distances of each sample to the other samples in y; the query
    # returns (distances, indices), so [0] keeps the distance array.
    distances_yy = knn_distances(y, y, True, co.knn_method, co.k, co.eps,
                                 2)[0]
    v = volume_of_the_unit_ball(dim)

    # Solution-1 (normal k):
    c = (gamma(co.k)/gamma(co.k + 1 - co.alpha))**(1 / (1 - co.alpha))

    # Solution-2 (if k is extremely large, say co.k = 180 [ =>
    #            gamma(co.k) = inf], then use this alternative form of
    #            'c', after importing gammaln from scipy.special and exp
    #            from numpy). Note: we used the
    #            'gamma(a) / gamma(b) = exp(gammaln(a) - gammaln(b))'
    #            identity.
    # c = exp(gammaln(co.k) - gammaln(co.k+1-co.alpha))**(1 / (1-co.alpha))

    # i_alpha is the sample mean of [(N-1) * c * v * rho_k(i)^dim]^(1-alpha),
    # where rho_k(i) is the distance from sample i to its k-th nearest
    # neighbor, i.e. a kNN plug-in estimate of \int p^{\alpha}(y) dy.
    s = sum(distances_yy[:, co.k-1]**(dim * (1 - co.alpha)))
    i_alpha = \
        (num_of_samples - 1) / num_of_samples * v**(1 - co.alpha) * \
        c**(1 - co.alpha) * s / (num_of_samples - 1)**co.alpha

    return i_alpha
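
A minimal usage sketch, under two assumptions not in the original file: the rest of shared.py (knn_distances, volume_of_the_unit_ball) is in scope, and the SimpleNamespace below is a hypothetical stand-in for ITE's real cost classes. It also shows how the Renyi and Tsallis entropies mentioned in the docstring follow from i_alpha:

import numpy as np
from types import SimpleNamespace

# Hypothetical cost object standing in for the package's cost classes.
co = SimpleNamespace(knn_method='cKDTree', k=3, eps=0.0, alpha=0.99)

# 1000 samples from a 2-D standard normal; one row per sample.
y = np.random.randn(1000, 2)

i_alpha = estimate_i_alpha(y, co)

# Renyi entropy:   H_alpha = log(i_alpha) / (1 - alpha)
h_renyi = np.log(i_alpha) / (1 - co.alpha)
# Tsallis entropy: H_T = (1 - i_alpha) / (alpha - 1)
h_tsallis = (1 - i_alpha) / (co.alpha - 1)
print(h_renyi, h_tsallis)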