NN.py source code

python

Project: MachineLearningProjects  Author: geallen
import numpy as np

def calcGrads(X, Z1, Z2, E1, E2, Eb1):
    ## YOUR CODE HERE ##

    ## Compute the gradients by multiplying each layer's back-propagated
    ## error with the activations feeding into that layer.

    # d_W2: gradient of the weights between the hidden layer and the output layer.
    d_W2 = np.dot(np.transpose(E2), Z1)
    # d_W1: gradient of the weights between the input layer and the hidden layer.
    d_W1 = np.dot(E1, X)
    # d_b2: gradient of the output layer bias weights.
    d_b2 = np.dot(np.transpose(E2), Eb1)
    # d_b1: gradient of the hidden layer bias weights.
    d_b1 = np.dot(np.transpose(E1), 1)

    ####################
    return d_W1, d_W2, d_b1, d_b2

# update the weights between units and the bias weights using a learning rate of alpha
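A minimal sketch of that update step, assuming plain gradient descent on weight matrices W1, W2 and bias vectors b1, b2 using the gradients returned by calcGrads above; the function name updateWeights and its signature are assumptions for illustration, not taken from the project source:

def updateWeights(W1, W2, b1, b2, d_W1, d_W2, d_b1, d_b2, alpha=0.01):
    # Hypothetical gradient-descent step: move each parameter against its
    # gradient, scaled by the learning rate alpha.
    W1 = W1 - alpha * d_W1
    W2 = W2 - alpha * d_W2
    b1 = b1 - alpha * d_b1
    b2 = b2 - alpha * d_b2
    return W1, W2, b1, b2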