Source code: rntnmodel.py

python

Project: SentimentAnalysis    Author: Conchylicultor
def backpropagate(self, sample):
        """
        Compute the derivative at each level of the sample and return
        their sum (stored in a gradient object)
        """
        # Notations:
        #   a: Output at root node (after activation)
        #   z: Output before softmax (z=Ws*a + bs)
        #   y: Output after softmax, final prediction (y=softmax(z))
        #   E: Cost of the current prediction (E = cost(softmax(Ws*a + bs)) = cost(y))
        #   t: Ground truth prediction
        # We then have:
        #   x -> a -> x -> a -> ... -> x -> a (last layer) -> z (projection on dim 5) -> y (softmax prediction) -> E (cost)

        return self._backpropagate(sample.root, None) # No incoming error for the root node (except the one coming from softmax)
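
The comments above trace the forward pass at the root: a -> z = Ws*a + bs -> y = softmax(z) -> E = cost(y). Below is a minimal sketch of the corresponding output-layer gradients. It assumes the cost is cross-entropy (the snippet only says "cost"), and the function name softmax_layer_gradients and the parameter shapes are illustrative, not taken from the project.

import numpy as np

def softmax(z):
    # Numerically stable softmax
    e = np.exp(z - np.max(z))
    return e / e.sum()

def softmax_layer_gradients(a, t, Ws, bs):
    """
    Illustrative sketch (not project code) of the gradients at the
    softmax output layer described in the comments above.

    a  : root activation, shape (d, 1)
    t  : one-hot ground-truth label, shape (5, 1)
    Ws : softmax projection matrix, shape (5, d)   (assumed shape)
    bs : softmax bias, shape (5, 1)                (assumed shape)

    Assumes a cross-entropy cost E = -sum(t * log(y)); with that choice
    the error at z simplifies to (y - t).
    """
    z = Ws @ a + bs           # z = Ws*a + bs (projection on dim 5)
    y = softmax(z)            # y = softmax(z), final prediction
    delta_z = y - t           # dE/dz for softmax + cross-entropy
    grad_Ws = delta_z @ a.T   # dE/dWs
    grad_bs = delta_z         # dE/dbs
    delta_a = Ws.T @ delta_z  # error passed back into the root activation a
    return grad_Ws, grad_bs, delta_a

The delta_a returned here corresponds to the "incoming error ... coming from softmax" mentioned in the final comment: it is the term a recursive _backpropagate would combine with the error arriving from the parent node, which is None at the root.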