tuning.py source code

python

Project: kaggle-Kobe-Bryant-Shot-Selection  Author: shiba24
def tune(self, train_X, train_y, test_X, max_evals=2500):
        # Requires numpy as np and hyperopt's Trials at module level;
        # self.optimize and self.score are defined elsewhere in this class.
        self.train_X = train_X
        self.train_y = train_y.reshape(len(train_y),)  # flatten targets to 1-D
        self.test_X = test_X
        np.random.seed(0)  # fix the seed so the search is reproducible
        trials = Trials()  # hyperopt object that records every evaluation
        params = self.optimize(trials, max_evals=max_evals)

        #     Average of best iteration 64.5
        #     Score 0.6018852
        # best parameters {'colsample_bytree': 0.6000000000000001, 'min_child_weight': 7.0, 'subsample': 0.9, 'eta': 0.2, 'max_depth': 6.0, 'gamma': 0.9}

        # best parameters {'colsample_bytree': 0.55, 'learning_rate': 0.03,
        #                  'min_child_weight': 9.0, 'n_estimators': 580.0,
        #                  'subsample': 1.0, 'eta': 0.2, 'max_depth': 7.0, 'gamma': 0.75}
        # best params : 2
        #                 {'colsample_bytree': 0.45, 'eta': 0.2,
        #                  'gamma': 0.9500000000000001, 'learning_rate': 0.04,
        #                  'max_depth': 6.0, 'min_child_weight': 9.0,
        #                  'n_estimators': 750.0, 'subsample': 1.84}


        # Adapt best params
        # params = {'objective': 'multi:softprob',
        #           'eval_metric': 'mlogloss',
        #           'colsample_bytree': 0.55,
        #           'min_child_weight': 9.0, 
        #           'subsample': 1.0, 
        #           'learning_rate': 0.03,
        #           'eta': 0.2, 
        #           'max_depth': 7.0, 
        #           'gamma': 0.75,
        #           'num_class': 2,
        #           'n_estimators': 580.0
        #           }


        params_result = self.score(params)  # evaluate the chosen parameters

        # Training with params : 
        # train-mlogloss:0.564660 eval-mlogloss:0.608842
        # Average of best iteration 32.0
        # Score 0.6000522
        return params, params_result
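The `self.optimize` call above is defined elsewhere in the project's class; in the original it drives a hyperopt search over XGBoost parameters and returns the best set found. As a rough stand-in that avoids the hyperopt and XGBoost dependencies, the same idea can be sketched as a plain random search over parameter ranges mirroring those in the logged results. The ranges, the `random_search` helper, and the quadratic toy objective are all assumptions for illustration, not the project's actual search space or loss:

```python
import numpy as np

# Assumed parameter ranges, loosely matching the logged best-parameter values.
SPACE = {
    "eta": (0.05, 0.3),
    "max_depth": (3, 10),          # integer-valued
    "min_child_weight": (1, 10),   # integer-valued
    "subsample": (0.5, 1.0),
    "colsample_bytree": (0.4, 1.0),
    "gamma": (0.0, 1.0),
}

def random_search(score_fn, max_evals=200, seed=0):
    """Sample max_evals parameter sets; keep the one with the lowest score."""
    rng = np.random.default_rng(seed)
    best_params, best_score = None, float("inf")
    for _ in range(max_evals):
        params = {}
        for name, (lo, hi) in SPACE.items():
            if name in ("max_depth", "min_child_weight"):
                params[name] = int(rng.integers(lo, hi + 1))
            else:
                params[name] = float(rng.uniform(lo, hi))
        score = score_fn(params)
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Hypothetical objective standing in for a cross-validated log-loss:
# lowest where eta is near 0.2 and subsample near 0.9.
def toy_score(p):
    return (p["eta"] - 0.2) ** 2 + (p["subsample"] - 0.9) ** 2

best, score = random_search(toy_score, max_evals=200)
```

Hyperopt's TPE sampler does the same loop but proposes each new point from a model of past trials, which is why the real code threads a `Trials()` object through `self.optimize`.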