0  2021-01-26 02:21:35  556753
1  2021-01-26 02:33:16  556754
2  2021-01-26 02:39:49  556755
3  2021-01-26 02:47:53  556756
4  2021-01-26 03:06:30  556757
0  2021-02-06 10:03:24  130231
1  2021-02-06 10:03:26  130232
2  2021-02-06 10:03:27  130233
3  2021-02-06 10:03:29  130234
4  2021-02-06 10:03:35  130235
0  130231
1  130232
2  130233
3  130234
4  130235
count  450000         450000
mean   369143.0808      7763.244016
std    131146.9064      5592.880135
min    118350              0
25%    257342.75        2805
50%    369842.5         6754
75%    482342.25       11965
max    594842          21566
/shared-libs/python3.7/py/lib/python3.7/site-packages/seaborn/_decorators.py:43: FutureWarning: Pass the following variable as a keyword arg: x. From version 0.12, the only valid positional argument will be `data`, and passing other arguments without an explicit keyword will result in an error or misinterpretation.
FutureWarning
0  24  0
1  41  0
2   8  0
3  32  0
4  51  0
0  25   747
1   1    75
2  47  2214
3   1  1020
4   8  7284
358446   1  0
 66053  74  0
132910  43  0
339902  56  0
366660  82  1
358446   1  781
 66053  74  687
132910  43  801
339902  56   44
366660  82  164
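The second preview above repeats the first with an extra column that looks like a per-key frequency or group size joined back onto each row. Under that assumption (the real notebook's columns are not labeled), the stdlib equivalent of a count-then-map step is:

```python
from collections import Counter

# Hypothetical keys; the real notebook presumably counted occurrences of some
# id column and merged the counts back onto the frame.
keys = ["a", "b", "a", "c", "a", "b"]
counts = Counter(keys)

# Each row keeps its key plus that key's total frequency.
enriched = [(k, counts[k]) for k in keys]
print(enriched[:3])  # [('a', 3), ('b', 2), ('a', 3)]
```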
Epoch 1/100
836/836 [==============================] - 6s 6ms/step - loss: 1.9728 - accuracy: 0.5193 - val_loss: 0.5777 - val_accuracy: 0.9351
Epoch 2/100
836/836 [==============================] - 4s 5ms/step - loss: 0.6416 - accuracy: 0.6200 - val_loss: 0.5870 - val_accuracy: 0.9008
Epoch 3/100
836/836 [==============================] - 4s 4ms/step - loss: 0.6347 - accuracy: 0.6435 - val_loss: 0.5469 - val_accuracy: 0.8839
Epoch 4/100
836/836 [==============================] - 4s 4ms/step - loss: 0.6311 - accuracy: 0.6537 - val_loss: 0.5383 - val_accuracy: 0.9017
Epoch 5/100
836/836 [==============================] - 4s 5ms/step - loss: 0.6280 - accuracy: 0.6629 - val_loss: 0.5340 - val_accuracy: 0.9030
... (epochs 6-99 trimmed: training loss drifted from ~0.63 down to ~0.59 and training accuracy plateaued near 0.69, while val_loss bottomed out at 0.4450 with val_accuracy 0.9288 at epoch 67) ...
Epoch 100/100
836/836 [==============================] - 5s 5ms/step - loss: 0.5888 - accuracy: 0.6930 - val_loss: 0.5571 - val_accuracy: 0.8334
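In the log above, val_loss stops improving long before epoch 100, which is the situation Keras' `EarlyStopping(monitor='val_loss', patience=...)` callback is designed to catch. A minimal stdlib sketch of that patience rule (an illustration, not the Keras implementation):

```python
def early_stop_epoch(val_losses, patience=10):
    """1-based epoch at which training would halt, or None if it never does.

    Stops once val_loss has failed to improve on its best value for
    `patience` consecutive epochs.
    """
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, wait = loss, 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return None

print(early_stop_epoch([0.58, 0.54, 0.55, 0.56, 0.53, 0.60, 0.61], patience=2))  # 4
```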
[0] validation_0-auc:0.67882
/root/venv/lib/python3.7/site-packages/xgboost/sklearn.py:1224: UserWarning: The use of label encoder in XGBClassifier is deprecated and will be removed in a future release. To remove this warning, do the following: 1) Pass option use_label_encoder=False when constructing XGBClassifier object; and 2) Encode your labels (y) as integers starting with 0, i.e. 0, 1, 2, ..., [num_class - 1].
warnings.warn(label_encoder_deprecation_msg, UserWarning)
[1] validation_0-auc:0.70817
[2] validation_0-auc:0.72359
[3] validation_0-auc:0.72547
[4] validation_0-auc:0.72669
[5] validation_0-auc:0.72907
[6] validation_0-auc:0.72948
[7] validation_0-auc:0.72867
[8] validation_0-auc:0.72995
[9] validation_0-auc:0.73036
[10] validation_0-auc:0.72951
[11] validation_0-auc:0.72946
[12] validation_0-auc:0.72974
[13] validation_0-auc:0.72998
[14] validation_0-auc:0.73113
[15] validation_0-auc:0.73125
[16] validation_0-auc:0.73158
[17] validation_0-auc:0.73235
[18] validation_0-auc:0.73244
[19] validation_0-auc:0.73272
[20] validation_0-auc:0.73235
[21] validation_0-auc:0.73232
[22] validation_0-auc:0.73264
[23] validation_0-auc:0.73244
[24] validation_0-auc:0.73227
[25] validation_0-auc:0.73172
[26] validation_0-auc:0.73178
[27] validation_0-auc:0.73142
[28] validation_0-auc:0.73123
[29] validation_0-auc:0.73089
Test Accuracy: 0.90936
Train Accuracy: 0.7087026440779386
AUC Score: 0.6705428952864724
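The reported AUC (~0.67) is much lower than the test accuracy (~0.91), which is typical when classes are imbalanced: accuracy rewards predicting the majority class, while AUC measures ranking quality. For reference, ROC AUC equals the probability that a randomly chosen positive is scored above a randomly chosen negative; a direct (O(n²), illustration-only) stdlib version of that definition:

```python
def roc_auc(y_true, scores):
    """Mann-Whitney formulation of ROC AUC; ties count as half a win."""
    pos = [s for y, s in zip(y_true, scores) if y == 1]
    neg = [s for y, s in zip(y_true, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```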
ERROR: Could not find a version that satisfies the requirement lighgbm (from versions: none)
ERROR: No matching distribution found for lighgbm
Note: you may need to restart the kernel to use updated packages.
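Both pip errors stem from a typo in the package name: there is no `lighgbm` on PyPI; the package is `lightgbm` (with a 't'). The fix, assuming the same pip environment:

```shell
pip install lightgbm  # "lighgbm" in the failed command is missing the 't'
```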
[LightGBM] [Warning] Unknown parameter: gamma
Test Accuracy: 0.8898311111111111
Train Accuracy: 0.7485321066606829
AUC Score: 0.6769455957303959
/shared-libs/python3.7/py/lib/python3.7/site-packages/sklearn/base.py:451: UserWarning: X does not have valid feature names, but RandomForestClassifier was fitted with feature names
"X does not have valid feature names, but"
[13:01:38] WARNING: ../src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior.
/shared-libs/python3.7/py/lib/python3.7/site-packages/sklearn/base.py:444: UserWarning: X has feature names, but RandomForestClassifier was fitted without feature names
f"X has feature names, but {self.__class__.__name__} was fitted without"
0  130231  0
1  130232  0
2  130233  0
3  130234  0
4  130235  0
5  130237  0
6  130236  0
7  130238  0
8  130239  0
9  130240  0
Saved file to disk.
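The final preview and "Saved file to disk." message suggest a two-column predictions file written out at the end, presumably via `to_csv`. A stdlib sketch of the same step (the `id`/`prediction` header names are assumptions, as the real column names are not shown):

```python
import csv
import io

# Mirror of the first rows of the preview above.
rows = [(130231, 0), (130232, 0), (130233, 0)]

buf = io.StringIO()  # stand-in for an on-disk file
writer = csv.writer(buf)
writer.writerow(["id", "prediction"])  # header names are guesses
writer.writerows(rows)

print(buf.getvalue().splitlines()[1])  # 130231,0
```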