How can I get predictions from XGBoost and the XGBoost Scikit-Learn wrapper to match?

Posted on 2021-01-29 18:15:12

I'm new to XGBoost in Python, so apologies if the answer is obvious, but I'm trying to take a pandas DataFrame and get XGBoost in Python to give me the same predictions I get when I use the Scikit-Learn wrapper for the same exercise. So far I've been unable to do so. To give an example, here I take the Boston dataset, convert it to a pandas DataFrame, train on the first 500 observations, and then predict the last 6. I do this first with XGBoost directly and then with the Scikit-Learn wrapper, and even though I set the model parameters to be the same, I get different predictions. Specifically, the array predictions looks completely different from the array predictions2 (see the code below). Any help would be much appreciated!

from sklearn import datasets
import pandas as pd
import xgboost as xgb
from xgboost.sklearn import XGBClassifier
from xgboost.sklearn import XGBRegressor

### Use the boston data as an example, train on first 500, predict last 6 
boston_data = datasets.load_boston()
df_boston = pd.DataFrame(boston_data.data,columns=boston_data.feature_names)
df_boston['target'] = pd.Series(boston_data.target)


#### Code using XGBoost
Sub_train = df_boston.head(500)
target = Sub_train["target"]
Sub_train = Sub_train.drop('target', axis=1)

Sub_predict = df_boston.tail(6)
Sub_predict = Sub_predict.drop('target', axis=1)

xgtrain = xgb.DMatrix(Sub_train.values, label=target.tolist())
xgtest = xgb.DMatrix(Sub_predict.values)

params = {'booster': 'gblinear', 'objective': 'reg:linear',
          'max_depth': 2, 'learning_rate': .1, 'n_estimators': 500,
          'min_child_weight': 3, 'colsample_bytree': .7,
          'subsample': .8, 'gamma': 0, 'reg_alpha': 1}

model = xgb.train(dtrain=xgtrain, params=params)

predictions = model.predict(xgtest)

#### Code using Sk learn Wrapper for XGBoost
model = XGBRegressor(learning_rate=.1, n_estimators=500,
                     max_depth=2, min_child_weight=3, gamma=0,
                     subsample=.8, colsample_bytree=.7, reg_alpha=1,
                     objective='reg:linear')

target = "target"

Sub_train = df_boston.head(500)
Sub_predict = df_boston.tail(6)
Sub_predict = Sub_predict.drop('target', axis=1)

Ex_List = ['target']

predictors = [i for i in Sub_train.columns if i not in Ex_List]

model = model.fit(Sub_train[predictors], Sub_train[target])

predictions2 = model.predict(Sub_predict)
1 Answer
面试哥 answered 2021-01-29

Please see this answer here.

xgboost.train ignores the parameter n_estimators, while xgboost.XGBRegressor accepts it. In xgboost.train, the number of boosting iterations (i.e. n_estimators) is controlled by num_boost_round (default: 10).

It suggests removing n_estimators from the params supplied to xgb.train and replacing it with num_boost_round, as in the sketch below.
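As a quick check, here is a minimal sketch (reusing the xgtrain DMatrix defined in the question; get_dump() returns one entry per boosted tree) showing that xgb.train builds only 10 trees when num_boost_round is left at its default:

# Minimal sketch: train with the default num_boost_round and count the trees.
# Assumes xgtrain from the question's code is already defined.
bst = xgb.train(dtrain=xgtrain, params={'objective': 'reg:linear'})
print(len(bst.get_dump()))  # prints 10 -- one tree per boosting round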

So change your params like this:

params = {'objective': 'reg:linear',
          'max_depth': 2, 'learning_rate': .1,
          'min_child_weight': 3, 'colsample_bytree': .7,
          'subsample': .8, 'gamma': 0, 'alpha': 1}
    

Train with xgb.train like this:

model = xgb.train(dtrain=xgtrain, params=params, num_boost_round=500)
    

You will get the same results.

Alternatively, keep xgb.train unchanged and change the XGBRegressor like this:

model = XGBRegressor(learning_rate=.1, n_estimators=10,
                     max_depth=2, min_child_weight=3, gamma=0,
                     subsample=.8, colsample_bytree=.7, reg_alpha=1,
                     objective='reg:linear')
    

Then you will also get the same results.
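To verify, here is a short sketch comparing the two arrays after either fix is applied (assuming predictions and predictions2 come from the code above; both APIs fall back to the same default random seed, so the subsample/colsample draws also match):

import numpy as np

# Minimal sketch: the two prediction arrays should now agree.
print(predictions)
print(predictions2)
print(np.allclose(predictions, predictions2))  # expect True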


