statistics.py source code

Language: Python

Project: PyME  Author: vikramsunkara
import numpy


def kl_divergence(p, q):
    """
    Returns KL-divergence of distribution q from distribution p.

    The Kullback-Leibler (KL) divergence is defined as

    .. math::

           \\textrm{KL-divergence}(p, q) :=
           \\sum_{x} p(x) \\log{} \\frac{p(x)}{q(x)}

    Warning: this function uses numpy's scalar floating point types to
    perform the evaluation. Therefore, the result may be non-finite.
    For example, if the state x has non-zero probability for distribution p,
    but zero probability for distribution q, then the result will be
    non-finite.
    """
    accum = 0.0
    for x in p:
        # The original used numpy.float_, which was removed in NumPy 2.0;
        # numpy.float64 is the equivalent scalar type.
        p_x = numpy.float64(p[x])
        if p_x != 0.0:
            q_x = numpy.float64(q.get(x, 0.0))
            accum += p_x * numpy.log(p_x / q_x)
    return accum
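A quick usage sketch of the function above, with a minimal copy of its body so the snippet runs standalone (the distribution names `p`, `q`, and `r` are illustrative, not from the original project):

```python
import numpy


def kl_divergence(p, q):
    # Minimal copy of the function above, using numpy.float64
    # (the NumPy 2.0 spelling of numpy.float_).
    accum = 0.0
    for x in p:
        p_x = numpy.float64(p[x])
        if p_x != 0.0:
            q_x = numpy.float64(q.get(x, 0.0))
            accum += p_x * numpy.log(p_x / q_x)
    return accum


# A fair coin vs. a biased coin: states are dict keys, probabilities values.
p = {"heads": 0.5, "tails": 0.5}
q = {"heads": 0.75, "tails": 0.25}
print(kl_divergence(p, q))   # ~0.1438 nats (natural log)

# Identical distributions: divergence is zero.
print(kl_divergence(p, p))   # 0.0

# A state with p > 0 but q == 0: the result is non-finite (inf),
# exactly the case the docstring warns about.
r = {"heads": 1.0}
print(kl_divergence(p, r))   # inf
```

Note the asymmetry: `kl_divergence(p, q)` and `kl_divergence(q, p)` generally differ, so the KL divergence is not a metric.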