entropy_estimators.py source code

python

Project: cgpm    Author: probcomp
from math import log

import numpy.random as nr
import scipy.spatial as ss
from scipy.special import digamma

# zip2 and avgdigamma are helpers defined elsewhere in entropy_estimators.py
# (a sketch of both follows the function below).


def cmi(x, y, z, k=3, base=2):
  """Mutual information of x and y, conditioned on z.

  x, y, z should each be a list of vectors, e.g. x = [[1.3], [3.7], [5.1], [2.4]]
  if x is one-dimensional and there are four samples.
  """
  assert len(x) == len(y), 'Lists should have same length.'
  assert k <= len(x) - 1, 'Set k smaller than num samples - 1.'
  intens = 1e-10  # Small noise to break degeneracy, see doc.
  x = [list(p + intens*nr.rand(len(x[0]))) for p in x]
  y = [list(p + intens*nr.rand(len(y[0]))) for p in y]
  z = [list(p + intens*nr.rand(len(z[0]))) for p in z]
  # Concatenate coordinates to form points in the joint (x, y, z) space.
  points = zip2(x, y, z)
  # Find nearest neighbors in joint space, p=inf means max-norm.
  tree = ss.cKDTree(points)
  dvec = [tree.query(point, k+1, p=float('inf'))[0][k] for point in points]
  # Average digamma of neighbor counts within dvec in each marginal space.
  a = avgdigamma(zip2(x, z), dvec)
  b = avgdigamma(zip2(y, z), dvec)
  c = avgdigamma(z, dvec)
  d = digamma(k)
  # Kraskov-style estimate; divide by log(base) to convert from nats.
  return (-a - b + c + d) / log(base)
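
The function computes a Kraskov-style k-nearest-neighbor estimate, digamma(k) minus the averaged digamma terms for the (x,z), (y,z), and z marginal spaces, divided by log(base) to convert from nats. It relies on two helpers, zip2 and avgdigamma, whose definitions are not shown on this page. The following is a minimal sketch of what they plausibly look like, inferred only from how they are called above; treat it as an illustration, not the file's actual code.

import scipy.spatial as ss
from scipy.special import digamma

def zip2(*args):
  # Concatenate corresponding vectors from each argument into one joint vector,
  # e.g. zip2([[1], [2]], [[3], [4]]) -> [[1, 3], [2, 4]].
  return [sum(rows, []) for rows in zip(*args)]

def avgdigamma(points, dvec):
  # For each point, count the neighbors (including the point itself) that lie
  # within the joint-space radius dvec[i] under the max-norm, then average
  # digamma(count) over all points.
  tree = ss.cKDTree(points)
  n = len(points)
  avg = 0.0
  for i in range(n):
    # Shrink the radius slightly so points exactly on the boundary are excluded.
    count = len(tree.query_ball_point(points[i], dvec[i] - 1e-15, p=float('inf')))
    avg += digamma(count) / n
  return avg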
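
A hypothetical usage example (the data and variable names below are made up for illustration): with y a noisy copy of x and z independent noise, cmi should return a clearly positive value in bits, whereas an independent y would drive the estimate toward zero.

import numpy as np

rng = np.random.RandomState(0)
n = 500
x = [[v] for v in rng.randn(n)]
y = [[xi[0] + 0.1 * rng.randn()] for xi in x]   # y depends on x
z = [[v] for v in rng.randn(n)]                 # z is independent noise
print(cmi(x, y, z, k=3, base=2))  # expected: clearly positive (in bits)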