def softmax_cross_entropy(
        x, t, use_cudnn=True, normalize=True, cache_score=True,
        class_weight=None):
"""Computes cross entropy loss for pre-softmax activations.
Args:
x (~chainer.Variable): Variable holding a multidimensional array whose
element indicates unnormalized log probability: the first axis of
the variable represents the number of samples, and the second axis
represents the number of classes. While this function computes
a usual softmax cross entropy if the number of dimensions is equal
to 2, it computes a cross entropy of the replicated softmax if the
number of dimensions is greater than 2.
t (~chainer.Variable): Variable holding an int32 vector of ground truth
labels. If ``t[i] == -1``, corresponding ``x[i]`` is ignored.
normalize (bool): If ``True``, this function normalizes the cross
entropy loss across all instances. If ``False``, it only
normalizes along a batch size.
cache_score (bool): When it is ``True``, the function stores result
of forward computation to use it on backward computation. It
reduces computational cost though consumes more memory.
class_weight (~numpy.ndarray or ~cupy.ndarray): An array that contains
constant weights that will be multiplied with the loss values along
with the second dimension. The shape of this array should be
``(x.shape[1],)``.
Returns:
Variable: A variable holding a scalar array of the cross entropy loss.
.. note::
This function is differentiable only by ``x``.
"""
    return SoftmaxCrossEntropy(
        use_cudnn, normalize, cache_score, class_weight)(x, t)
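

# A minimal usage sketch (an illustrative addition, not part of the original
# module). It assumes the rest of the module, including the
# ``SoftmaxCrossEntropy`` function class used above, is present, and that
# plain NumPy arrays are accepted as inputs, as is usual for Chainer
# functions.
if __name__ == '__main__':
    import numpy as np

    # ``x``: unnormalized log probabilities with shape (batch, classes).
    x = np.array([[2.0, 0.5, -1.0],
                  [0.1, 0.2, 0.3]], dtype=np.float32)
    # ``t``: int32 ground truth labels with shape (batch,); a label of -1
    # would mark that sample as ignored.
    t = np.array([0, 2], dtype=np.int32)

    loss = softmax_cross_entropy(x, t)
    print(loss.data)  # scalar array: mean cross entropy over the batch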