lstm_theanompi_outdated.py — source file

python

Project: Theano-MPI · Author: uoguelph-mlrg · source file
def cleanup(self, *args, **kwargs):

    import numpy
    from theanompi.models.lstm import zipp, unzip, get_minibatches_idx, pred_error

    # Restore the best parameters seen during training, if any were recorded.
    if self.best_p is not None:
        zipp(self.best_p, self.tparams)
    else:
        self.best_p = unzip(self.tparams)

    # Turn off dropout noise before evaluating.
    self.use_noise.set_value(0.)
    kf_train_sorted = get_minibatches_idx(len(self.train[0]), self.model_options['batch_size'])
    train_err = pred_error(self.f_pred, self.prepare_data, self.train, kf_train_sorted)
    # The original referenced bare `kf_valid` / `kf_test`, which are undefined in
    # this scope; assuming they are attributes set up elsewhere on the model.
    valid_err = pred_error(self.f_pred, self.prepare_data, self.valid, self.kf_valid)
    test_err = pred_error(self.f_pred, self.prepare_data, self.test, self.kf_test)

    if self.rank == 0:
        print('Train ', train_err, 'Valid ', valid_err, 'Test ', test_err)
    # The original tested a bare `saveto`, also undefined here; the save path
    # lives in model_options, as the savez call below already assumes.
    if self.model_options['saveto']:
        numpy.savez(self.model_options['saveto'], train_err=train_err,
                    valid_err=valid_err, test_err=test_err,
                    history_errs=self.history_errs, **self.best_p)
    # print('The code run for %d epochs, with %f sec/epochs' % (
    #     (self.eidx + 1), (end_time - start_time) / (1. * (self.eidx + 1))))
    # print(('Training took %.1fs' %
    #        (end_time - start_time)), file=sys.stderr)
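
The helper `get_minibatches_idx` imported above partitions the dataset indices into minibatches before `pred_error` iterates over them. A minimal sketch of that behavior, assuming it matches the classic Theano LSTM tutorial helper of the same name (the real implementation lives in `theanompi.models.lstm`):

```python
import numpy as np

def get_minibatches_idx(n, minibatch_size, shuffle=False):
    """Split indices 0..n-1 into minibatches of at most `minibatch_size`.

    Sketch of the tutorial-style helper: returns a list of
    (batch_index, array_of_example_indices) pairs; the last batch
    may be smaller than `minibatch_size`.
    """
    idx_list = np.arange(n, dtype='int64')
    if shuffle:
        np.random.shuffle(idx_list)
    minibatches = [idx_list[i:i + minibatch_size]
                   for i in range(0, n, minibatch_size)]
    return list(enumerate(minibatches))

# Example: 10 training examples, batch size 4 -> batches of 4, 4, 2,
# together covering every example index exactly once.
batches = get_minibatches_idx(10, 4)
```

With `shuffle=False` (as in the sorted pass over the training set above), evaluation visits examples in a deterministic order, so the reported errors are reproducible across runs.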