ML regression prediction: regression on an autonomous-driving dataset [5+1] with 6 individual + 2 ensemble models (LassoR, KernelRidgeR, ElasticNetR, GBR, XGBR, LGBMR; Avg, Stacking)

Date: 2020-07-15


 

Contents

Regression on the autonomous-driving dataset [5+1] with 6 individual + 2 ensemble models (LassoR, KernelRidgeR, ElasticNetR, GBR, XGBR, LGBMR; Avg, Stacking)

Output results

Implementation code

 

Regression on the autonomous-driving dataset [5+1] with 6 individual + 2 ensemble models (LassoR, KernelRidgeR, ElasticNetR, GBR, XGBR, LGBMR; Avg, Stacking)

Output results

kf: <class 'int'> 5
LassoR 5f-CV: 0.9303 (0.0088)
kf: <class 'int'> 5
LassoR02 5f-CV: 0.9652 (0.0064)
LassoR02, train set: 0.9661743820406197
LassoR02, test set:  0.9729422633172817
-------------------------------------
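The scores above read as 5-fold cross-validated R² reported as mean (standard deviation); since the article's code has not been posted yet, both the metric and the Lasso settings below are assumptions. A minimal sketch of how such lines could be produced with scikit-learn:

# Minimal sketch (assumption, not the author's code): 5-fold CV scoring that
# would print lines like "LassoR 5f-CV: 0.9303 (0.0088)" and "kf: <class 'int'> 5".
import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import RobustScaler
from sklearn.linear_model import Lasso

def cv_score(name, model, X, y, kf=5):
    print('kf:', type(kf), kf)          # kf is a plain int, hence "<class 'int'> 5"
    folds = KFold(n_splits=kf, shuffle=True, random_state=42)
    scores = cross_val_score(model, X, y, scoring='r2', cv=folds)
    print('%s %df-CV: %.4f (%.4f)' % (name, kf, scores.mean(), scores.std()))
    return scores

# Illustrative estimator; alpha is an assumed value, not the author's tuned one.
lasso = make_pipeline(RobustScaler(), Lasso(alpha=0.0005, random_state=1))
# cv_score('LassoR', lasso, X_train, y_train)   # X_train / y_train assumed to exist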
kf: <class 'int'> 5
KernelRidgeR 5f-CV: 0.9756 (0.0065)
KernelRidgeR, train set: 0.9779833615150171
KernelRidgeR, test set:  0.9735860613699098
-------------------------------------
kf: <class 'int'> 5
ElasticNetR 5f-CV: 0.8869 (0.0152)
kf: <class 'int'> 5
ElasticNetR02 5f-CV: 0.9652 (0.0064)
ElasticNetR02, train set: 0.9661740611722313
ElasticNetR02, test set:  0.9729302413806289
ElasticNetR02, new-data test set: 0.9012162287120226
-------------------------------------
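Besides cross-validation, each model is refit on the training split and scored on the train set, the held-out test set, and a separate batch of new data. Assuming the metric is again R², a minimal sketch of that reporting step (function and variable names are illustrative, not the author's):

# Minimal sketch (assumption): refit on the training split and report R^2 on
# train / test / a separate new-data test set, matching the log lines above.
from sklearn.metrics import r2_score

def report_scores(name, model, X_train, y_train, X_test, y_test, X_new=None, y_new=None):
    model.fit(X_train, y_train)
    print(name + ', train set:', r2_score(y_train, model.predict(X_train)))
    print(name + ', test set: ', r2_score(y_test, model.predict(X_test)))
    if X_new is not None:
        print(name + ', new-data test set:', r2_score(y_new, model.predict(X_new)))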
kf: <class 'int'> 5
GBR 5f-CV: 0.9817 (0.0056)
GBR, train set: 0.9953533672376947
GBR, test set:  0.9795040374323927
GBR, new-data test set: 0.9362198547614025
-------------------------------------
kf: <class 'int'> 5
XGBR 5f-CV: 0.9798 (0.0049)
XGBR, train set: 0.9968314776157515
XGBR, test set:  0.9756711778757111
-------------------------------------
kf: <class 'int'> 5
[LightGBM] [Warning] feature_fraction is set=0.2319, colsample_bytree=1.0 will be ignored. Current value: feature_fraction=0.2319
[LightGBM] [Warning] min_data_in_leaf is set=6, min_child_samples=20 will be ignored. Current value: min_data_in_leaf=6
[LightGBM] [Warning] min_sum_hessian_in_leaf is set=11, min_child_weight=0.001 will be ignored. Current value: min_sum_hessian_in_leaf=11
[LightGBM] [Warning] bagging_fraction is set=0.8, subsample=1.0 will be ignored. Current value: bagging_fraction=0.8
[LightGBM] [Warning] bagging_freq is set=5, subsample_freq=0 will be ignored. Current value: bagging_freq=5
(the five warnings above are repeated verbatim for each of the 5 CV folds and for the final fit; duplicates omitted here)
LGBMR 5f-CV: 0.9748 (0.0050)
LGBMR, train set: 0.9808755176740463
LGBMR, test set:  0.9692198833898547
LGBMR, new-data test set: 0.9201215322025592
-------------------------------------
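The [LightGBM] warnings are harmless: they appear when the native parameter names (feature_fraction, bagging_fraction, bagging_freq, min_data_in_leaf, min_sum_hessian_in_leaf) are passed to the sklearn wrapper LGBMRegressor, whose own aliases (colsample_bytree, subsample, subsample_freq, min_child_samples, min_child_weight) are then ignored. A sketch of both spellings, with the values taken from the warnings and everything else an assumption:

import lightgbm as lgb

# This spelling triggers the alias warnings seen above (native parameter names
# passed through the sklearn wrapper):
lgbm_warn = lgb.LGBMRegressor(
    feature_fraction=0.2319, bagging_fraction=0.8, bagging_freq=5,
    min_data_in_leaf=6, min_sum_hessian_in_leaf=11)

# Equivalent settings using the wrapper's own parameter names keep the log clean:
lgbm_clean = lgb.LGBMRegressor(
    colsample_bytree=0.2319, subsample=0.8, subsample_freq=5,
    min_child_samples=6, min_child_weight=11)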
kf: <class 'int'> 5
AverageModels of LassoR, ElasticNetR, GBR, score after ensembling the base models: 0.9527 (0.0058)
Avg_models, train set: 0.9589290712084004
Avg_models, test set:  0.9491595320322921
Avg_models, new-data test set: 0.9235325595276671
-------------------------------------
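Avg_models simply averages the predictions of LassoR, ElasticNetR and GBR. The article's own class is not shown, but a minimal sketch of the usual scikit-learn-style averaging ensemble looks like this (class and variable names are assumptions):

# Minimal sketch (assumption): clone each base model, fit all of them and
# average their predictions.
import numpy as np
from sklearn.base import BaseEstimator, RegressorMixin, TransformerMixin, clone

class AveragingModels(BaseEstimator, RegressorMixin, TransformerMixin):
    def __init__(self, models):
        self.models = models

    def fit(self, X, y):
        self.models_ = [clone(m) for m in self.models]
        for m in self.models_:
            m.fit(X, y)
        return self

    def predict(self, X):
        preds = np.column_stack([m.predict(X) for m in self.models_])
        return np.mean(preds, axis=1)

# avg_models = AveragingModels(models=(lasso, enet, gbr))   # base estimators assumed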
kf: <class 'int'> 5
StackingAverageModels, score after ensembling the base models: 0.9728 (0.0052)
stacked_averaged_model, train set: 0.9840574443555236
stacked_averaged_model, test set:  0.9727198834799341
stacked_averaged_model, new-data test set: 0.9585729656542717
Timed with the datetime method, run time: 0:02:23.166606
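The stacked ensemble generalizes best to the new data (0.9586). The article's stacking class is not shown either; below is a minimal sketch of the usual out-of-fold stacking recipe, plus the datetime-based timing of the last log line (the base/meta model choices and all names here are assumptions, not the author's code):

# Minimal sketch (assumption): out-of-fold stacking -- each base model is
# trained per fold, its out-of-fold predictions become the features on which
# a meta-model is fitted.
import datetime
import numpy as np
from sklearn.base import BaseEstimator, RegressorMixin, TransformerMixin, clone
from sklearn.model_selection import KFold

class StackingAveragedModels(BaseEstimator, RegressorMixin, TransformerMixin):
    def __init__(self, base_models, meta_model, n_folds=5):
        self.base_models = base_models
        self.meta_model = meta_model
        self.n_folds = n_folds

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        self.base_models_ = [[] for _ in self.base_models]
        self.meta_model_ = clone(self.meta_model)
        kfold = KFold(n_splits=self.n_folds, shuffle=True, random_state=156)
        oof_preds = np.zeros((X.shape[0], len(self.base_models)))
        for i, model in enumerate(self.base_models):
            for train_idx, holdout_idx in kfold.split(X, y):
                inst = clone(model)
                inst.fit(X[train_idx], y[train_idx])
                self.base_models_[i].append(inst)
                oof_preds[holdout_idx, i] = inst.predict(X[holdout_idx])
        self.meta_model_.fit(oof_preds, y)
        return self

    def predict(self, X):
        meta_features = np.column_stack([
            np.column_stack([m.predict(X) for m in insts]).mean(axis=1)
            for insts in self.base_models_])
        return self.meta_model_.predict(meta_features)

# Timing in the spirit of the last log line:
# start_time = datetime.datetime.now()
# ...run the whole pipeline...
# print('Timed with the datetime method, run time:', datetime.datetime.now() - start_time)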

 

Implementation code

To be updated……
