
[Kaggle][Transcription] Costa Rican Household Poverty (3)

bisi 2020. 9. 22. 12:44

This post covers the Costa Rican Household Poverty competition.

The goal is to predict income qualification for some of the world's poorest households on behalf of the Inter-American Development Bank (IDB).

 

The world's poorest households usually have a hard time proving their eligibility for aid, so in Latin America income qualification is verified algorithmically.

For example, a Proxy Means Test (PMT) considers observable household attributes such as the material of the walls and ceiling, or the assets found in the home.

 

A wide range of features is provided on this basis; here we predict income qualification with LGBMClassifier.

 

This transcription is based on 이유한's code.

 

 

Contents

Costa Rican Household Poverty (1)


1. Check datasets

1.1 Read datasets

1.2 Make description df

1.3 Check null data

1.4 Fill missing values

Costa Rican Household Poverty (2)


2. Feature Engineering

2.1 Object features

2.1.1 dependency

2.1.2 edjefe

2.1.3 edjefa

2.1.4 roof and electricity

 

2.2 Extract categorical variables

 

2.3 Create new features from continuous variables

2.3.1 Extract continuous-variable columns

2.3.2 Create new features

2.3.3 Household loan ratio

2.3.4 Household rooms ratio

2.3.5 Household beds ratio

2.3.6 Household tablet-ownership ratio

2.3.7 Household phone-ownership ratio

2.3.8 Years since schooling per household

2.3.9 Rich features

 

2.4 Aggregate features

Costa Rican Household Poverty (3)


3. Feature Selection Using shap

4. Model Development

4.1 Prediction and feature importances with LightGBM

4.2 Randomized Search



Costa Rican Household Poverty Level Prediction

 

3. Feature Selection Using Shap

In [94]:
# There are duplicate columns; drop them (kept commented out below) and collect the binary categorical features.
# train = train.T.drop_duplicates().T
binary_cat_features = [col for col in train.columns if train[col].value_counts().shape[0] == 2]
object_features = ['edjefe', 'edjefa']

categorical_feats = binary_cat_features + object_features
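The duplicate-column drop (left commented out above) and the binary scan can be sketched on a toy frame; the frame and its column names below are made up for illustration:

```python
import pandas as pd

# Toy frame: 'b' duplicates 'a' exactly, 'flag' is binary, 'x' is continuous.
df = pd.DataFrame({
    'a':    [1, 2, 3, 4],
    'b':    [1, 2, 3, 4],        # exact duplicate of 'a'
    'flag': [0, 1, 1, 0],        # two distinct values -> binary categorical
    'x':    [0.1, 0.5, 0.9, 1.3],
})

# Transposing turns columns into rows, so drop_duplicates removes duplicate columns.
deduped = df.T.drop_duplicates().T
print(list(deduped.columns))     # 'b' is gone

# A column counts as binary when it holds exactly two distinct values.
binary_cols = [c for c in deduped.columns if deduped[c].value_counts().shape[0] == 2]
print(binary_cols)
```

One caveat: `df.T` upcasts mixed dtypes to `object`, so the transpose trick can be slow and memory-hungry on a large frame.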
In [95]:
def evaluate_macroF1_lgb(truth, predictions):
    # This follows the discussion in https://github.com/Microsoft/LightGBM/issues/1483
    # LightGBM passes multiclass scores to a custom metric as one flat, class-major array.
    pred_labels = predictions.reshape(len(np.unique(truth)), -1).argmax(axis=0)
    f1 = f1_score(truth, pred_labels, average='macro')
    # Custom LightGBM eval metrics return (name, value, is_higher_better).
    return ('macroF1', f1, True)
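The reshape inside `evaluate_macroF1_lgb` is needed because LightGBM hands multiclass scores to a custom metric as a single flat array in class-major order (all class-0 scores first, then class-1, and so on). A minimal NumPy sketch of the label recovery, with made-up scores:

```python
import numpy as np

truth = np.array([0, 1, 2, 0])           # 4 samples, 3 classes
num_class = len(np.unique(truth))

# Flat, class-major scores as LightGBM passes them: shape (num_class * n_samples,)
predictions = np.array([
    0.8, 0.1, 0.2, 0.7,   # class-0 scores for samples 0..3
    0.1, 0.7, 0.3, 0.2,   # class-1 scores
    0.1, 0.2, 0.5, 0.1,   # class-2 scores
])

# reshape to (num_class, n_samples); argmax over axis 0 picks the best class per sample
pred_labels = predictions.reshape(num_class, -1).argmax(axis=0)
print(pred_labels)  # [0 1 2 0]
```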
In [96]:
y = train['Target']
train.drop(columns=['Target'], inplace=True)
In [97]:
def print_execution_time(start):
    end = time.time()
    hours, rem = divmod(end-start, 3600)
    minutes, seconds = divmod(rem, 60)
    print('*'*20, "Execution ended in {:0>2}h {:0>2}m {:05.2f}s".format(int(hours),int(minutes),seconds), '*'*20)
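The two `divmod` calls split an elapsed second count into hours, minutes, and seconds; for instance:

```python
# 4357.25 seconds -> 1 h, 12 m, 37.25 s
elapsed = 4357.25
hours, rem = divmod(elapsed, 3600)   # hours = 1.0, rem = 757.25
minutes, seconds = divmod(rem, 60)   # minutes = 12.0, seconds = 37.25
formatted = "{:0>2}h {:0>2}m {:05.2f}s".format(int(hours), int(minutes), seconds)
print(formatted)  # 01h 12m 37.25s
```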
In [98]:
# LGBMClassifier parameter tuning guide: https://lightgbm.readthedocs.io/en/latest/Parameters-Tuning.html
# max_depth: maximum depth of each tree
# learning_rate: boosting learning rate
# n_estimators: maximum number of boosting rounds
# num_leaves: maximum number of leaves in one tree
# min_child_weight: minimum sum of hessian required in a leaf
# reg_alpha: L1 regularization, reg_lambda: L2 regularization
In [99]:
def extract_good_features_using_shap_LGB(params, SEED):
    clf = lgb.LGBMClassifier(objective='multiclass',
                             random_state=1989,
                             max_depth=params['max_depth'], 
                             learning_rate=params['learning_rate'],  
                             silent=True, 
                             metric='multi_logloss',
                             n_jobs=-1, n_estimators=10000, 
                             class_weight='balanced',
                             colsample_bytree=params['colsample_bytree'],
                             min_split_gain=params['min_split_gain'],
                             bagging_freq=params['bagging_freq'],
                             min_child_weight=params['min_child_weight'],
                             num_leaves=params['num_leaves'],
                             subsample=params['subsample'],
                             reg_alpha=params['reg_alpha'],
                             reg_lambda=params['reg_lambda'],
                             num_class=len(np.unique(y)),
                             bagging_seed=SEED,
                             seed=SEED,
                            )

    kfold = 5
    kf = StratifiedKFold(n_splits=kfold, shuffle=True)
    feat_importance_df  = pd.DataFrame()

    for i, (train_index, test_index) in enumerate(kf.split(train, y)):
        print('='*30, '{} of {} folds'.format(i+1, kfold), '='*30)
        start = time.time()
        X_train, X_val = train.iloc[train_index], train.iloc[test_index]
        y_train, y_val = y.iloc[train_index], y.iloc[test_index]
        clf.fit(X_train, y_train, eval_set=[(X_train, y_train), (X_val, y_val)], eval_metric=evaluate_macroF1_lgb, categorical_feature=categorical_feats,
                early_stopping_rounds=500, verbose=500)
        shap_values = shap.TreeExplainer(clf.booster_).shap_values(X_train)
        fold_importance_df  = pd.DataFrame()
        fold_importance_df['feature'] = X_train.columns
        fold_importance_df['shap_values'] = abs(np.array(shap_values)[:, :].mean(1).mean(0))
        fold_importance_df['feat_imp'] = clf.feature_importances_
        feat_importance_df = pd.concat([feat_importance_df, fold_importance_df])
        print_execution_time(start)

    feat_importance_df_shap = feat_importance_df.groupby('feature').mean().sort_values('shap_values', ascending=False).reset_index()
#     feat_importance_df_shap['shap_cumsum'] = feat_importance_df_shap['shap_values'].cumsum() / feat_importance_df_shap['shap_values'].sum()
#     good_features = feat_importance_df_shap.loc[feat_importance_df_shap['shap_cumsum'] < 0.999].feature
    return feat_importance_df_shap
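For a multiclass booster, `shap_values` comes back (in shap versions of this era) as one `(n_samples, n_features)` array per class, so `np.array(shap_values)` stacks to `(num_class, n_samples, n_features)`, and `.mean(1).mean(0)` averages over samples and then classes, leaving one score per feature. A shape-only sketch with random stand-in data:

```python
import numpy as np

rng = np.random.default_rng(0)
num_class, n_samples, n_features = 4, 100, 7

# Stand-in for TreeExplainer(...).shap_values(X_train): one array per class
shap_values = [rng.normal(size=(n_samples, n_features)) for _ in range(num_class)]

stacked = np.array(shap_values)              # (num_class, n_samples, n_features)
per_feature = abs(stacked.mean(1).mean(0))   # -> (n_features,): one score per feature
print(per_feature.shape)  # (7,)
```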
In [100]:
total_shap_df  = pd.DataFrame()
NUM_ITERATIONS = 50
for SEED in range(NUM_ITERATIONS):
    print('#'*40, '{} of {} iterations'.format(SEED+1, NUM_ITERATIONS), '#' * 40)
    params = {'max_depth': np.random.choice([5, 6, 7, 8, 10, 12, -1]),
              'learning_rate': np.random.rand() * 0.02,
              'colsample_bytree': np.random.rand() * (1 - 0.5) + 0.5,
              'subsample': np.random.rand() * (1 - 0.5) + 0.5,
              'min_split_gain': np.random.rand() * 0.2,
              'num_leaves': np.random.choice([32, 48, 64]),
              'reg_alpha': np.random.rand() * 2,
              'reg_lambda': np.random.rand() * 2,
              'bagging_freq': np.random.randint(4) + 1,
              'min_child_weight': np.random.randint(100) + 20
             }
    temp_shap_df = extract_good_features_using_shap_LGB(params, SEED)
    total_shap_df = pd.concat([total_shap_df, temp_shap_df])
 
######################################## 1 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.00485	training's macroF1: 0.557042	valid_1's multi_logloss: 1.04836	valid_1's macroF1: 0.377357
Early stopping, best iteration is:
[49]	training's multi_logloss: 1.30129	training's macroF1: 0.487104	valid_1's multi_logloss: 1.27716	valid_1's macroF1: 0.387977
******************** Execution ended in 00h 00m 20.92s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.00997	training's macroF1: 0.560006	valid_1's multi_logloss: 1.08823	valid_1's macroF1: 0.38931
[1000]	training's multi_logloss: 0.895372	training's macroF1: 0.618211	valid_1's multi_logloss: 1.07981	valid_1's macroF1: 0.397377
[1500]	training's multi_logloss: 0.818834	training's macroF1: 0.662662	valid_1's multi_logloss: 1.07889	valid_1's macroF1: 0.390747
Early stopping, best iteration is:
[1032]	training's multi_logloss: 0.889825	training's macroF1: 0.620445	valid_1's multi_logloss: 1.07929	valid_1's macroF1: 0.401916
******************** Execution ended in 00h 00m 57.44s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.01797	training's macroF1: 0.555856	valid_1's multi_logloss: 1.01584	valid_1's macroF1: 0.454784
[1000]	training's multi_logloss: 0.903931	training's macroF1: 0.605618	valid_1's multi_logloss: 0.986396	valid_1's macroF1: 0.459385
Early stopping, best iteration is:
[701]	training's multi_logloss: 0.964902	training's macroF1: 0.578769	valid_1's multi_logloss: 0.996749	valid_1's macroF1: 0.471793
******************** Execution ended in 00h 00m 46.32s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.02129	training's macroF1: 0.561591	valid_1's multi_logloss: 1.0278	valid_1's macroF1: 0.422317
Early stopping, best iteration is:
[388]	training's multi_logloss: 1.06019	training's macroF1: 0.550715	valid_1's multi_logloss: 1.04633	valid_1's macroF1: 0.436703
******************** Execution ended in 00h 00m 34.18s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.01423	training's macroF1: 0.56672	valid_1's multi_logloss: 1.07017	valid_1's macroF1: 0.411672
Early stopping, best iteration is:
[462]	training's multi_logloss: 1.02643	training's macroF1: 0.56065	valid_1's multi_logloss: 1.07366	valid_1's macroF1: 0.416939
******************** Execution ended in 00h 00m 37.52s ********************
######################################## 2 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.15349	training's macroF1: 0.494975	valid_1's multi_logloss: 1.12155	valid_1's macroF1: 0.367781
[1000]	training's multi_logloss: 1.06031	training's macroF1: 0.530256	valid_1's multi_logloss: 1.0632	valid_1's macroF1: 0.374037
Early stopping, best iteration is:
[562]	training's multi_logloss: 1.13825	training's macroF1: 0.5007	valid_1's multi_logloss: 1.1094	valid_1's macroF1: 0.381025
******************** Execution ended in 00h 01m 06.75s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.15701	training's macroF1: 0.492258	valid_1's multi_logloss: 1.13662	valid_1's macroF1: 0.417573
[1000]	training's multi_logloss: 1.06583	training's macroF1: 0.52717	valid_1's multi_logloss: 1.08027	valid_1's macroF1: 0.420325
Early stopping, best iteration is:
[782]	training's multi_logloss: 1.09846	training's macroF1: 0.51704	valid_1's multi_logloss: 1.09675	valid_1's macroF1: 0.432322
******************** Execution ended in 00h 01m 20.97s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.15252	training's macroF1: 0.495734	valid_1's multi_logloss: 1.14348	valid_1's macroF1: 0.40653
[1000]	training's multi_logloss: 1.05594	training's macroF1: 0.523409	valid_1's multi_logloss: 1.09	valid_1's macroF1: 0.390093
Early stopping, best iteration is:
[722]	training's multi_logloss: 1.10135	training's macroF1: 0.516633	valid_1's multi_logloss: 1.1108	valid_1's macroF1: 0.414585
******************** Execution ended in 00h 01m 15.52s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.15287	training's macroF1: 0.494296	valid_1's multi_logloss: 1.13193	valid_1's macroF1: 0.384456
[1000]	training's multi_logloss: 1.05863	training's macroF1: 0.530018	valid_1's multi_logloss: 1.07259	valid_1's macroF1: 0.390425
[1500]	training's multi_logloss: 1.00248	training's macroF1: 0.557114	valid_1's multi_logloss: 1.05278	valid_1's macroF1: 0.385397
Early stopping, best iteration is:
[1167]	training's multi_logloss: 1.03744	training's macroF1: 0.540135	valid_1's multi_logloss: 1.06327	valid_1's macroF1: 0.39476
******************** Execution ended in 00h 01m 43.83s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.162	training's macroF1: 0.494477	valid_1's multi_logloss: 1.1371	valid_1's macroF1: 0.421501
[1000]	training's multi_logloss: 1.06962	training's macroF1: 0.518801	valid_1's multi_logloss: 1.07597	valid_1's macroF1: 0.424294
Early stopping, best iteration is:
[606]	training's multi_logloss: 1.137	training's macroF1: 0.502814	valid_1's multi_logloss: 1.11745	valid_1's macroF1: 0.431822
******************** Execution ended in 00h 01m 13.06s ********************
######################################## 3 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.809865	training's macroF1: 0.70056	valid_1's multi_logloss: 0.995402	valid_1's macroF1: 0.444196
Early stopping, best iteration is:
[490]	training's multi_logloss: 0.81464	training's macroF1: 0.696039	valid_1's multi_logloss: 0.995938	valid_1's macroF1: 0.451588
******************** Execution ended in 00h 01m 05.31s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.825917	training's macroF1: 0.681596	valid_1's multi_logloss: 1.02221	valid_1's macroF1: 0.442533
Early stopping, best iteration is:
[310]	training's multi_logloss: 0.929943	training's macroF1: 0.635213	valid_1's multi_logloss: 1.04201	valid_1's macroF1: 0.454028
******************** Execution ended in 00h 00m 53.24s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.827298	training's macroF1: 0.669178	valid_1's multi_logloss: 0.968586	valid_1's macroF1: 0.441924
Early stopping, best iteration is:
[344]	training's multi_logloss: 0.910514	training's macroF1: 0.634273	valid_1's multi_logloss: 0.98736	valid_1's macroF1: 0.45069
******************** Execution ended in 00h 00m 56.89s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.820977	training's macroF1: 0.671463	valid_1's multi_logloss: 1.03319	valid_1's macroF1: 0.380203
Early stopping, best iteration is:
[3]	training's multi_logloss: 1.37461	training's macroF1: 0.495498	valid_1's multi_logloss: 1.37252	valid_1's macroF1: 0.394692
******************** Execution ended in 00h 00m 33.10s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.814642	training's macroF1: 0.685804	valid_1's multi_logloss: 1.06496	valid_1's macroF1: 0.421855
[1000]	training's multi_logloss: 0.647217	training's macroF1: 0.759141	valid_1's multi_logloss: 1.06244	valid_1's macroF1: 0.414225
Early stopping, best iteration is:
[521]	training's multi_logloss: 0.805173	training's macroF1: 0.693018	valid_1's multi_logloss: 1.06418	valid_1's macroF1: 0.422102
******************** Execution ended in 00h 01m 10.16s ********************
######################################## 4 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.01083	training's macroF1: 0.537277	valid_1's multi_logloss: 1.05847	valid_1's macroF1: 0.40936
Early stopping, best iteration is:
[379]	training's multi_logloss: 1.04687	training's macroF1: 0.523113	valid_1's multi_logloss: 1.0641	valid_1's macroF1: 0.428405
******************** Execution ended in 00h 00m 44.50s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.01284	training's macroF1: 0.543027	valid_1's multi_logloss: 1.02977	valid_1's macroF1: 0.423176
[1000]	training's multi_logloss: 0.913201	training's macroF1: 0.596816	valid_1's multi_logloss: 1.01665	valid_1's macroF1: 0.426669
[1500]	training's multi_logloss: 0.845772	training's macroF1: 0.628095	valid_1's multi_logloss: 1.01106	valid_1's macroF1: 0.427882
Early stopping, best iteration is:
[1252]	training's multi_logloss: 0.876021	training's macroF1: 0.615787	valid_1's multi_logloss: 1.01373	valid_1's macroF1: 0.434092
******************** Execution ended in 00h 01m 22.45s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.01234	training's macroF1: 0.548258	valid_1's multi_logloss: 1.08506	valid_1's macroF1: 0.369514
[1000]	training's multi_logloss: 0.911285	training's macroF1: 0.607829	valid_1's multi_logloss: 1.07324	valid_1's macroF1: 0.382539
Early stopping, best iteration is:
[859]	training's multi_logloss: 0.933664	training's macroF1: 0.590257	valid_1's multi_logloss: 1.07817	valid_1's macroF1: 0.38561
******************** Execution ended in 00h 01m 04.50s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.01314	training's macroF1: 0.539969	valid_1's multi_logloss: 1.07433	valid_1's macroF1: 0.399989
[1000]	training's multi_logloss: 0.912796	training's macroF1: 0.59641	valid_1's multi_logloss: 1.06692	valid_1's macroF1: 0.40821
Early stopping, best iteration is:
[675]	training's multi_logloss: 0.971183	training's macroF1: 0.565023	valid_1's multi_logloss: 1.07018	valid_1's macroF1: 0.413991
******************** Execution ended in 00h 00m 56.07s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.00459	training's macroF1: 0.544863	valid_1's multi_logloss: 1.04974	valid_1's macroF1: 0.401321
Early stopping, best iteration is:
[177]	training's multi_logloss: 1.1427	training's macroF1: 0.493032	valid_1's multi_logloss: 1.10439	valid_1's macroF1: 0.420533
******************** Execution ended in 00h 00m 33.07s ********************
######################################## 5 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.992275	training's macroF1: 0.554661	valid_1's multi_logloss: 1.04224	valid_1's macroF1: 0.350008
Early stopping, best iteration is:
[42]	training's multi_logloss: 1.28725	training's macroF1: 0.456181	valid_1's multi_logloss: 1.25924	valid_1's macroF1: 0.369595
******************** Execution ended in 00h 00m 36.03s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.998066	training's macroF1: 0.554626	valid_1's multi_logloss: 1.07297	valid_1's macroF1: 0.394617
Early stopping, best iteration is:
[177]	training's multi_logloss: 1.12975	training's macroF1: 0.496828	valid_1's multi_logloss: 1.12768	valid_1's macroF1: 0.411119
******************** Execution ended in 00h 00m 40.77s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.991497	training's macroF1: 0.547388	valid_1's multi_logloss: 1.05748	valid_1's macroF1: 0.413745
[1000]	training's multi_logloss: 0.888162	training's macroF1: 0.598206	valid_1's multi_logloss: 1.05411	valid_1's macroF1: 0.420612
Early stopping, best iteration is:
[947]	training's multi_logloss: 0.896833	training's macroF1: 0.590849	valid_1's multi_logloss: 1.05515	valid_1's macroF1: 0.423638
******************** Execution ended in 00h 01m 25.98s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.00452	training's macroF1: 0.542783	valid_1's multi_logloss: 1.05158	valid_1's macroF1: 0.414599
Early stopping, best iteration is:
[372]	training's multi_logloss: 1.04301	training's macroF1: 0.52251	valid_1's multi_logloss: 1.06469	valid_1's macroF1: 0.423675
******************** Execution ended in 00h 00m 52.88s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.00984	training's macroF1: 0.544674	valid_1's multi_logloss: 1.07357	valid_1's macroF1: 0.410082
Early stopping, best iteration is:
[316]	training's multi_logloss: 1.06984	training's macroF1: 0.515931	valid_1's multi_logloss: 1.08716	valid_1's macroF1: 0.424683
******************** Execution ended in 00h 00m 49.64s ********************
######################################## 6 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.14213	training's macroF1: 0.475743	valid_1's multi_logloss: 1.07646	valid_1's macroF1: 0.428338
[1000]	training's multi_logloss: 1.08016	training's macroF1: 0.502444	valid_1's multi_logloss: 1.0567	valid_1's macroF1: 0.421571
Early stopping, best iteration is:
[693]	training's multi_logloss: 1.11319	training's macroF1: 0.490205	valid_1's multi_logloss: 1.06397	valid_1's macroF1: 0.436966
******************** Execution ended in 00h 00m 42.05s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.13654	training's macroF1: 0.476825	valid_1's multi_logloss: 1.10079	valid_1's macroF1: 0.38883
Early stopping, best iteration is:
[138]	training's multi_logloss: 1.25913	training's macroF1: 0.449623	valid_1's multi_logloss: 1.21325	valid_1's macroF1: 0.400343
******************** Execution ended in 00h 00m 22.95s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.12307	training's macroF1: 0.486005	valid_1's multi_logloss: 1.10988	valid_1's macroF1: 0.397362
Early stopping, best iteration is:
[257]	training's multi_logloss: 1.18855	training's macroF1: 0.473953	valid_1's multi_logloss: 1.15157	valid_1's macroF1: 0.416235
******************** Execution ended in 00h 00m 27.46s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.12768	training's macroF1: 0.488627	valid_1's multi_logloss: 1.12428	valid_1's macroF1: 0.393269
[1000]	training's multi_logloss: 1.06535	training's macroF1: 0.499936	valid_1's multi_logloss: 1.11425	valid_1's macroF1: 0.389314
Early stopping, best iteration is:
[606]	training's multi_logloss: 1.11056	training's macroF1: 0.496642	valid_1's multi_logloss: 1.11785	valid_1's macroF1: 0.394677
******************** Execution ended in 00h 00m 39.21s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.13176	training's macroF1: 0.472903	valid_1's multi_logloss: 1.07774	valid_1's macroF1: 0.372848
[1000]	training's multi_logloss: 1.07215	training's macroF1: 0.502397	valid_1's multi_logloss: 1.05157	valid_1's macroF1: 0.392767
[1500]	training's multi_logloss: 1.03591	training's macroF1: 0.521544	valid_1's multi_logloss: 1.04452	valid_1's macroF1: 0.399919
Early stopping, best iteration is:
[1367]	training's multi_logloss: 1.04458	training's macroF1: 0.517019	valid_1's multi_logloss: 1.04357	valid_1's macroF1: 0.409965
******************** Execution ended in 00h 01m 04.88s ********************
######################################## 7 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.05164	training's macroF1: 0.550302	valid_1's multi_logloss: 1.06148	valid_1's macroF1: 0.383299
[1000]	training's multi_logloss: 0.930645	training's macroF1: 0.601107	valid_1's multi_logloss: 1.00967	valid_1's macroF1: 0.406095
[1500]	training's multi_logloss: 0.8504	training's macroF1: 0.643756	valid_1's multi_logloss: 0.993597	valid_1's macroF1: 0.400182
Early stopping, best iteration is:
[1138]	training's multi_logloss: 0.905869	training's macroF1: 0.610902	valid_1's multi_logloss: 1.00348	valid_1's macroF1: 0.41111
******************** Execution ended in 00h 01m 56.51s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.0452	training's macroF1: 0.577034	valid_1's multi_logloss: 1.07369	valid_1's macroF1: 0.406377
Early stopping, best iteration is:
[7]	training's multi_logloss: 1.37621	training's macroF1: 0.477851	valid_1's multi_logloss: 1.37415	valid_1's macroF1: 0.424352
******************** Execution ended in 00h 00m 36.99s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.04699	training's macroF1: 0.580629	valid_1's multi_logloss: 1.06364	valid_1's macroF1: 0.403479
[1000]	training's multi_logloss: 0.9211	training's macroF1: 0.614351	valid_1's multi_logloss: 1.01444	valid_1's macroF1: 0.416426
[1500]	training's multi_logloss: 0.838359	training's macroF1: 0.646375	valid_1's multi_logloss: 0.998619	valid_1's macroF1: 0.417353
Early stopping, best iteration is:
[1171]	training's multi_logloss: 0.889932	training's macroF1: 0.628065	valid_1's multi_logloss: 1.00701	valid_1's macroF1: 0.424089
******************** Execution ended in 00h 02m 03.77s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.04777	training's macroF1: 0.566903	valid_1's multi_logloss: 1.10848	valid_1's macroF1: 0.421771
[1000]	training's multi_logloss: 0.92302	training's macroF1: 0.607525	valid_1's multi_logloss: 1.07036	valid_1's macroF1: 0.403037
Early stopping, best iteration is:
[623]	training's multi_logloss: 1.00933	training's macroF1: 0.57663	valid_1's multi_logloss: 1.09255	valid_1's macroF1: 0.424716
******************** Execution ended in 00h 01m 20.67s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.04324	training's macroF1: 0.569833	valid_1's multi_logloss: 1.08703	valid_1's macroF1: 0.394684
Early stopping, best iteration is:
[1]	training's multi_logloss: 1.38485	training's macroF1: 0.44307	valid_1's multi_logloss: 1.38456	valid_1's macroF1: 0.424377
******************** Execution ended in 00h 00m 36.32s ********************
######################################## 8 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.16149	training's macroF1: 0.543075	valid_1's multi_logloss: 1.17371	valid_1's macroF1: 0.387537
[1000]	training's multi_logloss: 1.04611	training's macroF1: 0.574457	valid_1's multi_logloss: 1.10192	valid_1's macroF1: 0.386237
Early stopping, best iteration is:
[573]	training's multi_logloss: 1.14036	training's macroF1: 0.548412	valid_1's multi_logloss: 1.15857	valid_1's macroF1: 0.400353
******************** Execution ended in 00h 01m 35.76s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.16426	training's macroF1: 0.534868	valid_1's multi_logloss: 1.17475	valid_1's macroF1: 0.398161
[1000]	training's multi_logloss: 1.04831	training's macroF1: 0.567904	valid_1's multi_logloss: 1.10382	valid_1's macroF1: 0.416745
[1500]	training's multi_logloss: 0.974683	training's macroF1: 0.595193	valid_1's multi_logloss: 1.07643	valid_1's macroF1: 0.412397
[2000]	training's multi_logloss: 0.920059	training's macroF1: 0.61623	valid_1's multi_logloss: 1.06484	valid_1's macroF1: 0.417813
Early stopping, best iteration is:
[1752]	training's multi_logloss: 0.94565	training's macroF1: 0.606602	valid_1's multi_logloss: 1.06978	valid_1's macroF1: 0.425967
******************** Execution ended in 00h 03m 17.94s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.16205	training's macroF1: 0.528716	valid_1's multi_logloss: 1.14379	valid_1's macroF1: 0.402361
Early stopping, best iteration is:
[1]	training's multi_logloss: 1.3856	training's macroF1: 0.449768	valid_1's multi_logloss: 1.38543	valid_1's macroF1: 0.416367
******************** Execution ended in 00h 00m 46.12s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.15768	training's macroF1: 0.532378	valid_1's multi_logloss: 1.16844	valid_1's macroF1: 0.398
[1000]	training's multi_logloss: 1.03994	training's macroF1: 0.570291	valid_1's multi_logloss: 1.09988	valid_1's macroF1: 0.399108
Early stopping, best iteration is:
[679]	training's multi_logloss: 1.10844	training's macroF1: 0.546831	valid_1's multi_logloss: 1.13583	valid_1's macroF1: 0.414
******************** Execution ended in 00h 01m 47.68s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.16801	training's macroF1: 0.536208	valid_1's multi_logloss: 1.15321	valid_1's macroF1: 0.459141
[1000]	training's multi_logloss: 1.05757	training's macroF1: 0.564505	valid_1's multi_logloss: 1.0696	valid_1's macroF1: 0.458623
[1500]	training's multi_logloss: 0.984854	training's macroF1: 0.591915	valid_1's multi_logloss: 1.03504	valid_1's macroF1: 0.472119
[2000]	training's multi_logloss: 0.93066	training's macroF1: 0.610184	valid_1's multi_logloss: 1.01801	valid_1's macroF1: 0.462045
Early stopping, best iteration is:
[1510]	training's multi_logloss: 0.983687	training's macroF1: 0.5917	valid_1's multi_logloss: 1.03466	valid_1's macroF1: 0.477436
******************** Execution ended in 00h 02m 56.35s ********************
######################################## 9 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.960583	training's macroF1: 0.579914	valid_1's multi_logloss: 1.03474	valid_1's macroF1: 0.39926
Early stopping, best iteration is:
[408]	training's multi_logloss: 0.994213	training's macroF1: 0.568429	valid_1's multi_logloss: 1.04096	valid_1's macroF1: 0.40628
******************** Execution ended in 00h 00m 48.61s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.970497	training's macroF1: 0.583039	valid_1's multi_logloss: 1.06043	valid_1's macroF1: 0.417215
Early stopping, best iteration is:
[167]	training's multi_logloss: 1.14265	training's macroF1: 0.524687	valid_1's multi_logloss: 1.13427	valid_1's macroF1: 0.429021
******************** Execution ended in 00h 00m 34.70s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.968217	training's macroF1: 0.586098	valid_1's multi_logloss: 1.04779	valid_1's macroF1: 0.409036
Early stopping, best iteration is:
[217]	training's multi_logloss: 1.10033	training's macroF1: 0.537756	valid_1's multi_logloss: 1.10205	valid_1's macroF1: 0.429907
******************** Execution ended in 00h 00m 38.41s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.970681	training's macroF1: 0.593783	valid_1's multi_logloss: 1.0127	valid_1's macroF1: 0.432998
Early stopping, best iteration is:
[300]	training's multi_logloss: 1.05387	training's macroF1: 0.561329	valid_1's multi_logloss: 1.03988	valid_1's macroF1: 0.447737
******************** Execution ended in 00h 00m 42.99s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.979745	training's macroF1: 0.572951	valid_1's multi_logloss: 1.019	valid_1's macroF1: 0.423471
[1000]	training's multi_logloss: 0.854143	training's macroF1: 0.645413	valid_1's multi_logloss: 0.988961	valid_1's macroF1: 0.449642
[1500]	training's multi_logloss: 0.770033	training's macroF1: 0.694107	valid_1's multi_logloss: 0.981734	valid_1's macroF1: 0.4499
[2000]	training's multi_logloss: 0.706969	training's macroF1: 0.728366	valid_1's multi_logloss: 0.978016	valid_1's macroF1: 0.459857
Early stopping, best iteration is:
[1715]	training's multi_logloss: 0.74066	training's macroF1: 0.709213	valid_1's multi_logloss: 0.978958	valid_1's macroF1: 0.469395
******************** Execution ended in 00h 01m 52.54s ********************
######################################## 10 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.05739	training's macroF1: 0.530952	valid_1's multi_logloss: 1.07139	valid_1's macroF1: 0.391319
Early stopping, best iteration is:
[333]	training's multi_logloss: 1.11348	training's macroF1: 0.51335	valid_1's multi_logloss: 1.09858	valid_1's macroF1: 0.412825
******************** Execution ended in 00h 00m 52.83s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.06447	training's macroF1: 0.52675	valid_1's multi_logloss: 1.05993	valid_1's macroF1: 0.414848
[1000]	training's multi_logloss: 0.96818	training's macroF1: 0.561465	valid_1's multi_logloss: 1.02944	valid_1's macroF1: 0.418391
Early stopping, best iteration is:
[713]	training's multi_logloss: 1.01657	training's macroF1: 0.544651	valid_1's multi_logloss: 1.0394	valid_1's macroF1: 0.426892
******************** Execution ended in 00h 01m 18.15s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.04849	training's macroF1: 0.543154	valid_1's multi_logloss: 1.08344	valid_1's macroF1: 0.376857
Early stopping, best iteration is:
[335]	training's multi_logloss: 1.10386	training's macroF1: 0.523546	valid_1's multi_logloss: 1.10624	valid_1's macroF1: 0.392807
******************** Execution ended in 00h 00m 52.42s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.04849	training's macroF1: 0.536687	valid_1's multi_logloss: 1.08009	valid_1's macroF1: 0.41184
[1000]	training's multi_logloss: 0.947811	training's macroF1: 0.58389	valid_1's multi_logloss: 1.06999	valid_1's macroF1: 0.411491
Early stopping, best iteration is:
[692]	training's multi_logloss: 1.0028	training's macroF1: 0.55583	valid_1's multi_logloss: 1.07183	valid_1's macroF1: 0.425535
******************** Execution ended in 00h 01m 13.43s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.04776	training's macroF1: 0.525231	valid_1's multi_logloss: 1.05478	valid_1's macroF1: 0.394999
[1000]	training's multi_logloss: 0.953525	training's macroF1: 0.570049	valid_1's multi_logloss: 1.03083	valid_1's macroF1: 0.387886
Early stopping, best iteration is:
[600]	training's multi_logloss: 1.02328	training's macroF1: 0.536102	valid_1's multi_logloss: 1.0459	valid_1's macroF1: 0.406943
******************** Execution ended in 00h 01m 09.34s ********************
######################################## 11 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.979067	training's macroF1: 0.580296	valid_1's multi_logloss: 1.03406	valid_1's macroF1: 0.41965
[1000]	training's multi_logloss: 0.85012	training's macroF1: 0.644015	valid_1's multi_logloss: 1.00996	valid_1's macroF1: 0.42395
Early stopping, best iteration is:
[550]	training's multi_logloss: 0.962447	training's macroF1: 0.589368	valid_1's multi_logloss: 1.02926	valid_1's macroF1: 0.432322
******************** Execution ended in 00h 00m 55.44s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.975745	training's macroF1: 0.581412	valid_1's multi_logloss: 1.01378	valid_1's macroF1: 0.414806
Early stopping, best iteration is:
[414]	training's multi_logloss: 1.00842	training's macroF1: 0.567243	valid_1's multi_logloss: 1.02664	valid_1's macroF1: 0.422131
******************** Execution ended in 00h 00m 48.16s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.976941	training's macroF1: 0.591674	valid_1's multi_logloss: 1.05171	valid_1's macroF1: 0.420491
Early stopping, best iteration is:
[154]	training's multi_logloss: 1.16688	training's macroF1: 0.531293	valid_1's multi_logloss: 1.15214	valid_1's macroF1: 0.441034
******************** Execution ended in 00h 00m 33.84s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.971184	training's macroF1: 0.583924	valid_1's multi_logloss: 1.03994	valid_1's macroF1: 0.402042
Early stopping, best iteration is:
[361]	training's multi_logloss: 1.02742	training's macroF1: 0.568314	valid_1's multi_logloss: 1.05436	valid_1's macroF1: 0.411159
******************** Execution ended in 00h 00m 45.24s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.980007	training's macroF1: 0.592463	valid_1's multi_logloss: 1.08213	valid_1's macroF1: 0.418053
[1000]	training's multi_logloss: 0.851693	training's macroF1: 0.660205	valid_1's multi_logloss: 1.06363	valid_1's macroF1: 0.428683
Early stopping, best iteration is:
[848]	training's multi_logloss: 0.884041	training's macroF1: 0.637129	valid_1's multi_logloss: 1.06712	valid_1's macroF1: 0.429805
******************** Execution ended in 00h 01m 10.30s ********************
######################################## 12 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.29246	training's macroF1: 0.475911	valid_1's multi_logloss: 1.27084	valid_1's macroF1: 0.348424
Early stopping, best iteration is:
[80]	training's multi_logloss: 1.36876	training's macroF1: 0.450804	valid_1's multi_logloss: 1.36327	valid_1's macroF1: 0.354663
******************** Execution ended in 00h 00m 39.81s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.29905	training's macroF1: 0.465427	valid_1's multi_logloss: 1.28232	valid_1's macroF1: 0.37384
Early stopping, best iteration is:
[158]	training's multi_logloss: 1.35494	training's macroF1: 0.456206	valid_1's multi_logloss: 1.34719	valid_1's macroF1: 0.384269
******************** Execution ended in 00h 00m 45.47s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.30009	training's macroF1: 0.460667	valid_1's multi_logloss: 1.27615	valid_1's macroF1: 0.395346
Early stopping, best iteration is:
[7]	training's multi_logloss: 1.3848	training's macroF1: 0.432936	valid_1's multi_logloss: 1.38439	valid_1's macroF1: 0.417813
******************** Execution ended in 00h 00m 36.17s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.29841	training's macroF1: 0.483632	valid_1's multi_logloss: 1.27353	valid_1's macroF1: 0.410683
Early stopping, best iteration is:
[7]	training's multi_logloss: 1.38483	training's macroF1: 0.451192	valid_1's multi_logloss: 1.38425	valid_1's macroF1: 0.42068
******************** Execution ended in 00h 00m 39.64s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.29901	training's macroF1: 0.481841	valid_1's multi_logloss: 1.28631	valid_1's macroF1: 0.408987
[1000]	training's multi_logloss: 1.23559	training's macroF1: 0.497927	valid_1's multi_logloss: 1.22382	valid_1's macroF1: 0.418935
[1500]	training's multi_logloss: 1.18594	training's macroF1: 0.5059	valid_1's multi_logloss: 1.18069	valid_1's macroF1: 0.416248
Early stopping, best iteration is:
[1151]	training's multi_logloss: 1.2195	training's macroF1: 0.499133	valid_1's multi_logloss: 1.20936	valid_1's macroF1: 0.424885
******************** Execution ended in 00h 01m 58.60s ********************
######################################## 13 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.796053	training's macroF1: 0.6769	valid_1's multi_logloss: 1.06737	valid_1's macroF1: 0.366604
[1000]	training's multi_logloss: 0.639473	training's macroF1: 0.749079	valid_1's multi_logloss: 1.06012	valid_1's macroF1: 0.374075
Early stopping, best iteration is:
[621]	training's multi_logloss: 0.748547	training's macroF1: 0.697439	valid_1's multi_logloss: 1.06386	valid_1's macroF1: 0.384188
******************** Execution ended in 00h 01m 02.29s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.810608	training's macroF1: 0.688407	valid_1's multi_logloss: 1.00303	valid_1's macroF1: 0.410421
Early stopping, best iteration is:
[36]	training's multi_logloss: 1.26059	training's macroF1: 0.529108	valid_1's multi_logloss: 1.23036	valid_1's macroF1: 0.444541
******************** Execution ended in 00h 00m 29.62s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.81863	training's macroF1: 0.668042	valid_1's multi_logloss: 0.984283	valid_1's macroF1: 0.426227
[1000]	training's multi_logloss: 0.659517	training's macroF1: 0.741733	valid_1's multi_logloss: 0.961818	valid_1's macroF1: 0.431293
[1500]	training's multi_logloss: 0.56535	training's macroF1: 0.785813	valid_1's multi_logloss: 0.951673	valid_1's macroF1: 0.42772
[2000]	training's multi_logloss: 0.501424	training's macroF1: 0.805714	valid_1's multi_logloss: 0.948709	valid_1's macroF1: 0.421404
Early stopping, best iteration is:
[1522]	training's multi_logloss: 0.56183	training's macroF1: 0.787145	valid_1's multi_logloss: 0.952727	valid_1's macroF1: 0.438539
******************** Execution ended in 00h 01m 46.77s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.814833	training's macroF1: 0.68109	valid_1's multi_logloss: 0.989224	valid_1's macroF1: 0.423091
Early stopping, best iteration is:
[67]	training's multi_logloss: 1.18275	training's macroF1: 0.543441	valid_1's multi_logloss: 1.16114	valid_1's macroF1: 0.448106
******************** Execution ended in 00h 00m 32.09s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.795034	training's macroF1: 0.683981	valid_1's multi_logloss: 1.04802	valid_1's macroF1: 0.415655
Early stopping, best iteration is:
[285]	training's multi_logloss: 0.913446	training's macroF1: 0.634247	valid_1's multi_logloss: 1.05315	valid_1's macroF1: 0.427517
******************** Execution ended in 00h 00m 45.17s ********************
######################################## 14 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.11515	training's macroF1: 0.502067	valid_1's multi_logloss: 1.13088	valid_1's macroF1: 0.382403
[1000]	training's multi_logloss: 1.02928	training's macroF1: 0.533553	valid_1's multi_logloss: 1.10583	valid_1's macroF1: 0.386869
[1500]	training's multi_logloss: 0.976349	training's macroF1: 0.554248	valid_1's multi_logloss: 1.1039	valid_1's macroF1: 0.390098
Early stopping, best iteration is:
[1439]	training's multi_logloss: 0.981934	training's macroF1: 0.553774	valid_1's multi_logloss: 1.10414	valid_1's macroF1: 0.392735
******************** Execution ended in 00h 01m 37.46s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.12244	training's macroF1: 0.502836	valid_1's multi_logloss: 1.09195	valid_1's macroF1: 0.401618
Early stopping, best iteration is:
[173]	training's multi_logloss: 1.24366	training's macroF1: 0.47021	valid_1's multi_logloss: 1.19787	valid_1's macroF1: 0.416747
******************** Execution ended in 00h 00m 33.20s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.1167	training's macroF1: 0.494135	valid_1's multi_logloss: 1.05546	valid_1's macroF1: 0.404235
[1000]	training's multi_logloss: 1.03462	training's macroF1: 0.538682	valid_1's multi_logloss: 1.01899	valid_1's macroF1: 0.410165
Early stopping, best iteration is:
[623]	training's multi_logloss: 1.09099	training's macroF1: 0.507756	valid_1's multi_logloss: 1.04091	valid_1's macroF1: 0.415521
******************** Execution ended in 00h 00m 58.58s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.1164	training's macroF1: 0.511266	valid_1's multi_logloss: 1.13824	valid_1's macroF1: 0.382079
Early stopping, best iteration is:
[53]	training's multi_logloss: 1.32614	training's macroF1: 0.456518	valid_1's multi_logloss: 1.3151	valid_1's macroF1: 0.385225
******************** Execution ended in 00h 00m 27.48s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.11219	training's macroF1: 0.491069	valid_1's multi_logloss: 1.07265	valid_1's macroF1: 0.385793
[1000]	training's multi_logloss: 1.02932	training's macroF1: 0.527314	valid_1's multi_logloss: 1.03614	valid_1's macroF1: 0.390766
[1500]	training's multi_logloss: 0.978791	training's macroF1: 0.549414	valid_1's multi_logloss: 1.02757	valid_1's macroF1: 0.399675
[2000]	training's multi_logloss: 0.939191	training's macroF1: 0.576691	valid_1's multi_logloss: 1.02427	valid_1's macroF1: 0.390172
Early stopping, best iteration is:
[1765]	training's multi_logloss: 0.956839	training's macroF1: 0.563314	valid_1's multi_logloss: 1.02566	valid_1's macroF1: 0.404754
******************** Execution ended in 00h 01m 48.52s ********************
######################################## 15 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.719719	training's macroF1: 0.718996	valid_1's multi_logloss: 1.04115	valid_1's macroF1: 0.38858
Early stopping, best iteration is:
[103]	training's multi_logloss: 1.07686	training's macroF1: 0.580143	valid_1's multi_logloss: 1.13666	valid_1's macroF1: 0.403155
******************** Execution ended in 00h 00m 51.63s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.723133	training's macroF1: 0.729887	valid_1's multi_logloss: 0.987832	valid_1's macroF1: 0.415537
Early stopping, best iteration is:
[326]	training's multi_logloss: 0.831968	training's macroF1: 0.684592	valid_1's multi_logloss: 0.998096	valid_1's macroF1: 0.432349
******************** Execution ended in 00h 01m 10.56s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.718914	training's macroF1: 0.723201	valid_1's multi_logloss: 0.970824	valid_1's macroF1: 0.399813
Early stopping, best iteration is:
[157]	training's multi_logloss: 0.985924	training's macroF1: 0.629349	valid_1's multi_logloss: 1.03996	valid_1's macroF1: 0.41978
******************** Execution ended in 00h 00m 55.50s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.710193	training's macroF1: 0.733349	valid_1's multi_logloss: 1.05624	valid_1's macroF1: 0.416228
[1000]	training's multi_logloss: 0.52302	training's macroF1: 0.814478	valid_1's multi_logloss: 1.04124	valid_1's macroF1: 0.397145
Early stopping, best iteration is:
[504]	training's multi_logloss: 0.708226	training's macroF1: 0.735863	valid_1's multi_logloss: 1.05606	valid_1's macroF1: 0.417131
******************** Execution ended in 00h 01m 24.97s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.720316	training's macroF1: 0.720217	valid_1's multi_logloss: 0.998861	valid_1's macroF1: 0.437373
[1000]	training's multi_logloss: 0.53047	training's macroF1: 0.800926	valid_1's multi_logloss: 0.97258	valid_1's macroF1: 0.45171
[1500]	training's multi_logloss: 0.423284	training's macroF1: 0.852537	valid_1's multi_logloss: 0.96553	valid_1's macroF1: 0.454637
Early stopping, best iteration is:
[1278]	training's multi_logloss: 0.464182	training's macroF1: 0.835227	valid_1's multi_logloss: 0.968516	valid_1's macroF1: 0.461752
******************** Execution ended in 00h 02m 27.49s ********************
######################################## 16 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.991341	training's macroF1: 0.548778	valid_1's multi_logloss: 1.07581	valid_1's macroF1: 0.366344
Early stopping, best iteration is:
[361]	training's multi_logloss: 1.03856	training's macroF1: 0.533282	valid_1's multi_logloss: 1.08662	valid_1's macroF1: 0.379251
******************** Execution ended in 00h 00m 41.76s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.01557	training's macroF1: 0.546137	valid_1's multi_logloss: 1.04278	valid_1's macroF1: 0.393345
Early stopping, best iteration is:
[213]	training's multi_logloss: 1.13356	training's macroF1: 0.510934	valid_1's multi_logloss: 1.1085	valid_1's macroF1: 0.420432
******************** Execution ended in 00h 00m 37.91s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.00182	training's macroF1: 0.570637	valid_1's multi_logloss: 1.05628	valid_1's macroF1: 0.410853
Early stopping, best iteration is:
[395]	training's multi_logloss: 1.03678	training's macroF1: 0.552593	valid_1's multi_logloss: 1.06752	valid_1's macroF1: 0.415816
******************** Execution ended in 00h 00m 44.91s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.01552	training's macroF1: 0.556334	valid_1's multi_logloss: 1.05619	valid_1's macroF1: 0.405569
Early stopping, best iteration is:
[153]	training's multi_logloss: 1.18173	training's macroF1: 0.506857	valid_1's multi_logloss: 1.15633	valid_1's macroF1: 0.410264
******************** Execution ended in 00h 00m 32.93s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.01017	training's macroF1: 0.547826	valid_1's multi_logloss: 1.05511	valid_1's macroF1: 0.406426
[1000]	training's multi_logloss: 0.899636	training's macroF1: 0.599848	valid_1's multi_logloss: 1.04141	valid_1's macroF1: 0.431214
[1500]	training's multi_logloss: 0.826699	training's macroF1: 0.639014	valid_1's multi_logloss: 1.03901	valid_1's macroF1: 0.431922
[2000]	training's multi_logloss: 0.77062	training's macroF1: 0.672196	valid_1's multi_logloss: 1.03768	valid_1's macroF1: 0.434137
[2500]	training's multi_logloss: 0.725227	training's macroF1: 0.699865	valid_1's multi_logloss: 1.038	valid_1's macroF1: 0.441531
Early stopping, best iteration is:
[2107]	training's multi_logloss: 0.760165	training's macroF1: 0.680114	valid_1's multi_logloss: 1.03825	valid_1's macroF1: 0.44639
******************** Execution ended in 00h 02m 06.20s ********************
######################################## 17 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.16144	training's macroF1: 0.485398	valid_1's multi_logloss: 1.10625	valid_1's macroF1: 0.412715
Early stopping, best iteration is:
[430]	training's multi_logloss: 1.17939	training's macroF1: 0.477711	valid_1's multi_logloss: 1.12221	valid_1's macroF1: 0.428811
******************** Execution ended in 00h 00m 45.84s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.15249	training's macroF1: 0.485752	valid_1's multi_logloss: 1.12388	valid_1's macroF1: 0.417605
[1000]	training's multi_logloss: 1.06534	training's macroF1: 0.513267	valid_1's multi_logloss: 1.07632	valid_1's macroF1: 0.414977
Early stopping, best iteration is:
[630]	training's multi_logloss: 1.1237	training's macroF1: 0.495489	valid_1's multi_logloss: 1.10269	valid_1's macroF1: 0.42889
******************** Execution ended in 00h 00m 54.39s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.15398	training's macroF1: 0.483677	valid_1's multi_logloss: 1.09515	valid_1's macroF1: 0.405397
[1000]	training's multi_logloss: 1.06827	training's macroF1: 0.519403	valid_1's multi_logloss: 1.03886	valid_1's macroF1: 0.425609
[1500]	training's multi_logloss: 1.01702	training's macroF1: 0.544883	valid_1's multi_logloss: 1.02252	valid_1's macroF1: 0.413679
Early stopping, best iteration is:
[1199]	training's multi_logloss: 1.04571	training's macroF1: 0.53007	valid_1's multi_logloss: 1.03036	valid_1's macroF1: 0.428099
******************** Execution ended in 00h 01m 21.86s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.14959	training's macroF1: 0.487923	valid_1's multi_logloss: 1.12724	valid_1's macroF1: 0.371036
Early stopping, best iteration is:
[472]	training's multi_logloss: 1.15639	training's macroF1: 0.486354	valid_1's multi_logloss: 1.13165	valid_1's macroF1: 0.375175
******************** Execution ended in 00h 00m 48.98s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.15383	training's macroF1: 0.496044	valid_1's multi_logloss: 1.15408	valid_1's macroF1: 0.381772
[1000]	training's multi_logloss: 1.07192	training's macroF1: 0.519119	valid_1's multi_logloss: 1.11394	valid_1's macroF1: 0.402939
[1500]	training's multi_logloss: 1.02229	training's macroF1: 0.539649	valid_1's multi_logloss: 1.10419	valid_1's macroF1: 0.406172
Early stopping, best iteration is:
[1422]	training's multi_logloss: 1.02903	training's macroF1: 0.533826	valid_1's multi_logloss: 1.10546	valid_1's macroF1: 0.410275
******************** Execution ended in 00h 01m 33.57s ********************
######################################## 18 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.10311	training's macroF1: 0.560559	valid_1's multi_logloss: 1.0851	valid_1's macroF1: 0.415552
Early stopping, best iteration is:
[168]	training's multi_logloss: 1.25194	training's macroF1: 0.527789	valid_1's multi_logloss: 1.22333	valid_1's macroF1: 0.434388
******************** Execution ended in 00h 00m 46.63s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.09991	training's macroF1: 0.560437	valid_1's multi_logloss: 1.09192	valid_1's macroF1: 0.42913
Early stopping, best iteration is:
[21]	training's multi_logloss: 1.36434	training's macroF1: 0.494973	valid_1's multi_logloss: 1.3571	valid_1's macroF1: 0.450147
******************** Execution ended in 00h 00m 40.47s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.10035	training's macroF1: 0.555457	valid_1's multi_logloss: 1.08409	valid_1's macroF1: 0.447443
Early stopping, best iteration is:
[325]	training's multi_logloss: 1.16711	training's macroF1: 0.535076	valid_1's multi_logloss: 1.14041	valid_1's macroF1: 0.449101
******************** Execution ended in 00h 00m 56.82s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.07951	training's macroF1: 0.556213	valid_1's multi_logloss: 1.12567	valid_1's macroF1: 0.389544
Early stopping, best iteration is:
[16]	training's multi_logloss: 1.36801	training's macroF1: 0.486192	valid_1's multi_logloss: 1.36709	valid_1's macroF1: 0.397304
******************** Execution ended in 00h 00m 35.74s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.08637	training's macroF1: 0.563228	valid_1's multi_logloss: 1.13928	valid_1's macroF1: 0.377075
Early stopping, best iteration is:
[18]	training's multi_logloss: 1.36655	training's macroF1: 0.489961	valid_1's multi_logloss: 1.36515	valid_1's macroF1: 0.392596
******************** Execution ended in 00h 00m 35.37s ********************
######################################## 19 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.16096	training's macroF1: 0.528093	valid_1's multi_logloss: 1.12317	valid_1's macroF1: 0.422496
Early stopping, best iteration is:
[455]	training's multi_logloss: 1.17441	training's macroF1: 0.525833	valid_1's multi_logloss: 1.13537	valid_1's macroF1: 0.430164
******************** Execution ended in 00h 01m 16.98s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.1651	training's macroF1: 0.520922	valid_1's multi_logloss: 1.14566	valid_1's macroF1: 0.395906
Early stopping, best iteration is:
[332]	training's multi_logloss: 1.21984	training's macroF1: 0.498022	valid_1's multi_logloss: 1.19586	valid_1's macroF1: 0.409149
******************** Execution ended in 00h 01m 06.14s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.15709	training's macroF1: 0.520787	valid_1's multi_logloss: 1.14558	valid_1's macroF1: 0.391539
Early stopping, best iteration is:
[2]	training's multi_logloss: 1.38474	training's macroF1: 0.445344	valid_1's multi_logloss: 1.38438	valid_1's macroF1: 0.417154
******************** Execution ended in 00h 00m 39.71s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.1501	training's macroF1: 0.507617	valid_1's multi_logloss: 1.17402	valid_1's macroF1: 0.363787
Early stopping, best iteration is:
[3]	training's multi_logloss: 1.38396	training's macroF1: 0.442248	valid_1's multi_logloss: 1.38362	valid_1's macroF1: 0.376224
******************** Execution ended in 00h 00m 41.43s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.15215	training's macroF1: 0.525629	valid_1's multi_logloss: 1.17457	valid_1's macroF1: 0.361562
[1000]	training's multi_logloss: 1.03978	training's macroF1: 0.558132	valid_1's multi_logloss: 1.11233	valid_1's macroF1: 0.379311
Early stopping, best iteration is:
[905]	training's multi_logloss: 1.05657	training's macroF1: 0.54828	valid_1's multi_logloss: 1.11952	valid_1's macroF1: 0.383473
******************** Execution ended in 00h 01m 53.64s ********************
######################################## 20 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.04039	training's macroF1: 0.527414	valid_1's multi_logloss: 1.0534	valid_1's macroF1: 0.42005
[1000]	training's multi_logloss: 0.945869	training's macroF1: 0.580036	valid_1's multi_logloss: 1.03864	valid_1's macroF1: 0.406612
Early stopping, best iteration is:
[628]	training's multi_logloss: 1.01068	training's macroF1: 0.545473	valid_1's multi_logloss: 1.04708	valid_1's macroF1: 0.42716
******************** Execution ended in 00h 00m 54.36s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.03631	training's macroF1: 0.51629	valid_1's multi_logloss: 1.06409	valid_1's macroF1: 0.407714
Early stopping, best iteration is:
[259]	training's multi_logloss: 1.11757	training's macroF1: 0.492983	valid_1's multi_logloss: 1.09678	valid_1's macroF1: 0.429086
******************** Execution ended in 00h 00m 37.09s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.0356	training's macroF1: 0.54323	valid_1's multi_logloss: 1.0763	valid_1's macroF1: 0.399144
Early stopping, best iteration is:
[309]	training's multi_logloss: 1.09434	training's macroF1: 0.513538	valid_1's multi_logloss: 1.09515	valid_1's macroF1: 0.41528
******************** Execution ended in 00h 00m 43.94s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.02369	training's macroF1: 0.548811	valid_1's multi_logloss: 1.11332	valid_1's macroF1: 0.387036
[1000]	training's multi_logloss: 0.92829	training's macroF1: 0.587162	valid_1's multi_logloss: 1.11578	valid_1's macroF1: 0.395637
Early stopping, best iteration is:
[544]	training's multi_logloss: 1.01359	training's macroF1: 0.554965	valid_1's multi_logloss: 1.11091	valid_1's macroF1: 0.381769
******************** Execution ended in 00h 00m 50.26s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.03283	training's macroF1: 0.522996	valid_1's multi_logloss: 1.03461	valid_1's macroF1: 0.420961
[1000]	training's multi_logloss: 0.940988	training's macroF1: 0.560025	valid_1's multi_logloss: 1.01778	valid_1's macroF1: 0.41203
Early stopping, best iteration is:
[605]	training's multi_logloss: 1.00911	training's macroF1: 0.534159	valid_1's multi_logloss: 1.02658	valid_1's macroF1: 0.436975
******************** Execution ended in 00h 00m 52.41s ********************
######################################## 21 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.8295	training's macroF1: 0.716314	valid_1's multi_logloss: 1.01155	valid_1's macroF1: 0.422958
Early stopping, best iteration is:
[13]	training's multi_logloss: 1.34781	training's macroF1: 0.525285	valid_1's multi_logloss: 1.3452	valid_1's macroF1: 0.435518
******************** Execution ended in 00h 00m 29.77s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.821289	training's macroF1: 0.696045	valid_1's multi_logloss: 0.983	valid_1's macroF1: 0.39235
Early stopping, best iteration is:
[12]	training's multi_logloss: 1.35023	training's macroF1: 0.506874	valid_1's multi_logloss: 1.34009	valid_1's macroF1: 0.429818
******************** Execution ended in 00h 00m 30.32s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.80744	training's macroF1: 0.697536	valid_1's multi_logloss: 1.00423	valid_1's macroF1: 0.383837
Early stopping, best iteration is:
[9]	training's multi_logloss: 1.35814	training's macroF1: 0.523172	valid_1's multi_logloss: 1.35401	valid_1's macroF1: 0.402235
******************** Execution ended in 00h 00m 29.81s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.823709	training's macroF1: 0.68925	valid_1's multi_logloss: 1.02915	valid_1's macroF1: 0.400118
Early stopping, best iteration is:
[386]	training's multi_logloss: 0.885753	training's macroF1: 0.66877	valid_1's multi_logloss: 1.04237	valid_1's macroF1: 0.417867
******************** Execution ended in 00h 00m 55.31s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.828871	training's macroF1: 0.697904	valid_1's multi_logloss: 1.01026	valid_1's macroF1: 0.448786
[1000]	training's multi_logloss: 0.647254	training's macroF1: 0.765756	valid_1's multi_logloss: 0.985841	valid_1's macroF1: 0.442639
Early stopping, best iteration is:
[503]	training's multi_logloss: 0.827499	training's macroF1: 0.699775	valid_1's multi_logloss: 1.0104	valid_1's macroF1: 0.452063
******************** Execution ended in 00h 01m 02.04s ********************
######################################## 22 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.853062	training's macroF1: 0.67499	valid_1's multi_logloss: 1.03501	valid_1's macroF1: 0.422689
[1000]	training's multi_logloss: 0.673052	training's macroF1: 0.752262	valid_1's multi_logloss: 1.00532	valid_1's macroF1: 0.418821
Early stopping, best iteration is:
[501]	training's multi_logloss: 0.852552	training's macroF1: 0.674994	valid_1's multi_logloss: 1.03481	valid_1's macroF1: 0.42654
******************** Execution ended in 00h 01m 27.92s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.854253	training's macroF1: 0.697153	valid_1's multi_logloss: 1.02194	valid_1's macroF1: 0.414942
[1000]	training's multi_logloss: 0.672828	training's macroF1: 0.766584	valid_1's multi_logloss: 0.999475	valid_1's macroF1: 0.432806
Early stopping, best iteration is:
[886]	training's multi_logloss: 0.704918	training's macroF1: 0.758429	valid_1's multi_logloss: 1.0024	valid_1's macroF1: 0.436967
******************** Execution ended in 00h 02m 01.38s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.853408	training's macroF1: 0.686406	valid_1's multi_logloss: 0.997469	valid_1's macroF1: 0.40929
[1000]	training's multi_logloss: 0.674496	training's macroF1: 0.759269	valid_1's multi_logloss: 0.965511	valid_1's macroF1: 0.424854
Early stopping, best iteration is:
[860]	training's multi_logloss: 0.713759	training's macroF1: 0.745124	valid_1's multi_logloss: 0.970084	valid_1's macroF1: 0.434274
******************** Execution ended in 00h 02m 02.50s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.855503	training's macroF1: 0.682209	valid_1's multi_logloss: 1.00245	valid_1's macroF1: 0.408855
Early stopping, best iteration is:
[221]	training's multi_logloss: 1.04339	training's macroF1: 0.618299	valid_1's multi_logloss: 1.07951	valid_1's macroF1: 0.420676
******************** Execution ended in 00h 01m 01.94s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.851665	training's macroF1: 0.679089	valid_1's multi_logloss: 1.0501	valid_1's macroF1: 0.390139
Early stopping, best iteration is:
[8]	training's multi_logloss: 1.36429	training's macroF1: 0.499563	valid_1's multi_logloss: 1.36444	valid_1's macroF1: 0.39964
******************** Execution ended in 00h 00m 43.59s ********************
######################################## 23 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.07651	training's macroF1: 0.576284	valid_1's multi_logloss: 1.09385	valid_1's macroF1: 0.406565
Early stopping, best iteration is:
[452]	training's multi_logloss: 1.09473	training's macroF1: 0.57358	valid_1's multi_logloss: 1.10612	valid_1's macroF1: 0.414289
******************** Execution ended in 00h 01m 11.73s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.07664	training's macroF1: 0.562097	valid_1's multi_logloss: 1.10442	valid_1's macroF1: 0.380507
Early stopping, best iteration is:
[5]	training's multi_logloss: 1.38067	training's macroF1: 0.468954	valid_1's multi_logloss: 1.3799	valid_1's macroF1: 0.392206
******************** Execution ended in 00h 00m 37.61s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.07919	training's macroF1: 0.56974	valid_1's multi_logloss: 1.10431	valid_1's macroF1: 0.403978
Early stopping, best iteration is:
[314]	training's multi_logloss: 1.15861	training's macroF1: 0.556978	valid_1's multi_logloss: 1.15816	valid_1's macroF1: 0.417018
******************** Execution ended in 00h 01m 03.50s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.08752	training's macroF1: 0.570788	valid_1's multi_logloss: 1.09413	valid_1's macroF1: 0.419409
[1000]	training's multi_logloss: 0.955432	training's macroF1: 0.613257	valid_1's multi_logloss: 1.02056	valid_1's macroF1: 0.435846
[1500]	training's multi_logloss: 0.869719	training's macroF1: 0.654768	valid_1's multi_logloss: 0.995543	valid_1's macroF1: 0.438266
[2000]	training's multi_logloss: 0.803634	training's macroF1: 0.691102	valid_1's multi_logloss: 0.983469	valid_1's macroF1: 0.453552
[2500]	training's multi_logloss: 0.750031	training's macroF1: 0.716025	valid_1's multi_logloss: 0.975284	valid_1's macroF1: 0.4533
Early stopping, best iteration is:
[2137]	training's multi_logloss: 0.787831	training's macroF1: 0.698501	valid_1's multi_logloss: 0.980244	valid_1's macroF1: 0.458939
******************** Execution ended in 00h 03m 11.28s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.07572	training's macroF1: 0.56772	valid_1's multi_logloss: 1.10394	valid_1's macroF1: 0.401905
Early stopping, best iteration is:
[496]	training's multi_logloss: 1.07714	training's macroF1: 0.567087	valid_1's multi_logloss: 1.10453	valid_1's macroF1: 0.409293
******************** Execution ended in 00h 01m 14.87s ********************
######################################## 24 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.36872	training's macroF1: 0.476035	valid_1's multi_logloss: 1.36434	valid_1's macroF1: 0.371415
Early stopping, best iteration is:
[4]	training's multi_logloss: 1.38615	training's macroF1: 0.45417	valid_1's multi_logloss: 1.38612	valid_1's macroF1: 0.377204
******************** Execution ended in 00h 00m 35.62s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.36778	training's macroF1: 0.475893	valid_1's multi_logloss: 1.36227	valid_1's macroF1: 0.380175
Early stopping, best iteration is:
[44]	training's multi_logloss: 1.38461	training's macroF1: 0.476616	valid_1's multi_logloss: 1.3841	valid_1's macroF1: 0.410753
******************** Execution ended in 00h 00m 35.43s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.36791	training's macroF1: 0.493256	valid_1's multi_logloss: 1.36258	valid_1's macroF1: 0.371391
Early stopping, best iteration is:
[2]	training's multi_logloss: 1.38622	training's macroF1: 0.446181	valid_1's multi_logloss: 1.38619	valid_1's macroF1: 0.394467
******************** Execution ended in 00h 00m 32.39s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.36786	training's macroF1: 0.470686	valid_1's multi_logloss: 1.36366	valid_1's macroF1: 0.394163
Early stopping, best iteration is:
[14]	training's multi_logloss: 1.38577	training's macroF1: 0.482172	valid_1's multi_logloss: 1.38564	valid_1's macroF1: 0.377765
******************** Execution ended in 00h 00m 31.95s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.3677	training's macroF1: 0.462665	valid_1's multi_logloss: 1.36474	valid_1's macroF1: 0.396157
Early stopping, best iteration is:
[58]	training's multi_logloss: 1.38406	training's macroF1: 0.472283	valid_1's multi_logloss: 1.38367	valid_1's macroF1: 0.397819
******************** Execution ended in 00h 00m 34.09s ********************
######################################## 25 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.11832	training's macroF1: 0.488679	valid_1's multi_logloss: 1.06334	valid_1's macroF1: 0.406784
[1000]	training's multi_logloss: 1.05304	training's macroF1: 0.524668	valid_1's multi_logloss: 1.043	valid_1's macroF1: 0.415379
[1500]	training's multi_logloss: 1.01016	training's macroF1: 0.536585	valid_1's multi_logloss: 1.0435	valid_1's macroF1: 0.406986
Early stopping, best iteration is:
[1120]	training's multi_logloss: 1.04213	training's macroF1: 0.526782	valid_1's multi_logloss: 1.03906	valid_1's macroF1: 0.413274
******************** Execution ended in 00h 01m 01.05s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.11296	training's macroF1: 0.492108	valid_1's multi_logloss: 1.05919	valid_1's macroF1: 0.393494
Early stopping, best iteration is:
[30]	training's multi_logloss: 1.34225	training's macroF1: 0.442494	valid_1's multi_logloss: 1.32048	valid_1's macroF1: 0.418895
******************** Execution ended in 00h 00m 20.37s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.10973	training's macroF1: 0.492034	valid_1's multi_logloss: 1.10373	valid_1's macroF1: 0.391655
Early stopping, best iteration is:
[156]	training's multi_logloss: 1.22373	training's macroF1: 0.448961	valid_1's multi_logloss: 1.1895	valid_1's macroF1: 0.404394
******************** Execution ended in 00h 00m 25.40s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.1092	training's macroF1: 0.488976	valid_1's multi_logloss: 1.09781	valid_1's macroF1: 0.425358
[1000]	training's multi_logloss: 1.04403	training's macroF1: 0.516806	valid_1's multi_logloss: 1.08192	valid_1's macroF1: 0.413425
Early stopping, best iteration is:
[517]	training's multi_logloss: 1.10579	training's macroF1: 0.488927	valid_1's multi_logloss: 1.09574	valid_1's macroF1: 0.429606
******************** Execution ended in 00h 00m 38.88s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.1075	training's macroF1: 0.499065	valid_1's multi_logloss: 1.09136	valid_1's macroF1: 0.383037
Early stopping, best iteration is:
[148]	training's multi_logloss: 1.22991	training's macroF1: 0.468456	valid_1's multi_logloss: 1.18295	valid_1's macroF1: 0.398795
******************** Execution ended in 00h 00m 25.06s ********************
######################################## 26 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.898636	training's macroF1: 0.602686	valid_1's multi_logloss: 1.01344	valid_1's macroF1: 0.416381
[1000]	training's multi_logloss: 0.754101	training's macroF1: 0.678634	valid_1's multi_logloss: 1.00585	valid_1's macroF1: 0.42001
Early stopping, best iteration is:
[964]	training's multi_logloss: 0.761773	training's macroF1: 0.673882	valid_1's multi_logloss: 1.00345	valid_1's macroF1: 0.421104
******************** Execution ended in 00h 01m 49.52s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.907838	training's macroF1: 0.619255	valid_1's multi_logloss: 1.04572	valid_1's macroF1: 0.403503
[1000]	training's multi_logloss: 0.765939	training's macroF1: 0.680588	valid_1's multi_logloss: 1.03142	valid_1's macroF1: 0.415198
[1500]	training's multi_logloss: 0.674712	training's macroF1: 0.733827	valid_1's multi_logloss: 1.01706	valid_1's macroF1: 0.427487
[2000]	training's multi_logloss: 0.610025	training's macroF1: 0.766093	valid_1's multi_logloss: 1.01442	valid_1's macroF1: 0.425556
[2500]	training's multi_logloss: 0.559445	training's macroF1: 0.791199	valid_1's multi_logloss: 1.01278	valid_1's macroF1: 0.433576
Early stopping, best iteration is:
[2064]	training's multi_logloss: 0.602841	training's macroF1: 0.770507	valid_1's multi_logloss: 1.01589	valid_1's macroF1: 0.438177
******************** Execution ended in 00h 03m 09.63s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.895739	training's macroF1: 0.626521	valid_1's multi_logloss: 1.08062	valid_1's macroF1: 0.385384
Early stopping, best iteration is:
[26]	training's multi_logloss: 1.31069	training's macroF1: 0.480752	valid_1's multi_logloss: 1.3042	valid_1's macroF1: 0.420754
******************** Execution ended in 00h 00m 41.80s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.922218	training's macroF1: 0.615951	valid_1's multi_logloss: 0.956162	valid_1's macroF1: 0.486686
Early stopping, best iteration is:
[431]	training's multi_logloss: 0.950119	training's macroF1: 0.596526	valid_1's multi_logloss: 0.963887	valid_1's macroF1: 0.492032
******************** Execution ended in 00h 01m 11.48s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.90137	training's macroF1: 0.618391	valid_1's multi_logloss: 1.07553	valid_1's macroF1: 0.389735
Early stopping, best iteration is:
[407]	training's multi_logloss: 0.940399	training's macroF1: 0.602883	valid_1's multi_logloss: 1.0817	valid_1's macroF1: 0.405909
******************** Execution ended in 00h 01m 09.97s ********************
######################################## 27 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.964246	training's macroF1: 0.584653	valid_1's multi_logloss: 1.02534	valid_1's macroF1: 0.452632
[1000]	training's multi_logloss: 0.83076	training's macroF1: 0.64756	valid_1's multi_logloss: 1.00605	valid_1's macroF1: 0.463799
Early stopping, best iteration is:
[848]	training's multi_logloss: 0.864505	training's macroF1: 0.638089	valid_1's multi_logloss: 1.01154	valid_1's macroF1: 0.467193
******************** Execution ended in 00h 01m 15.27s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.953429	training's macroF1: 0.587224	valid_1's multi_logloss: 1.03686	valid_1's macroF1: 0.400687
Early stopping, best iteration is:
[45]	training's multi_logloss: 1.29274	training's macroF1: 0.495714	valid_1's multi_logloss: 1.27677	valid_1's macroF1: 0.408356
******************** Execution ended in 00h 00m 31.04s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.949186	training's macroF1: 0.595898	valid_1's multi_logloss: 1.0566	valid_1's macroF1: 0.368725
Early stopping, best iteration is:
[14]	training's multi_logloss: 1.35418	training's macroF1: 0.464546	valid_1's multi_logloss: 1.34839	valid_1's macroF1: 0.380296
******************** Execution ended in 00h 00m 28.75s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.959856	training's macroF1: 0.591938	valid_1's multi_logloss: 1.02286	valid_1's macroF1: 0.436531
[1000]	training's multi_logloss: 0.825508	training's macroF1: 0.665823	valid_1's multi_logloss: 1.01105	valid_1's macroF1: 0.448329
[1500]	training's multi_logloss: 0.737028	training's macroF1: 0.710174	valid_1's multi_logloss: 1.00841	valid_1's macroF1: 0.448708
[2000]	training's multi_logloss: 0.671124	training's macroF1: 0.741354	valid_1's multi_logloss: 1.00777	valid_1's macroF1: 0.464259
Early stopping, best iteration is:
[1839]	training's multi_logloss: 0.690865	training's macroF1: 0.730756	valid_1's multi_logloss: 1.00276	valid_1's macroF1: 0.456267
******************** Execution ended in 00h 02m 09.13s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.962607	training's macroF1: 0.596246	valid_1's multi_logloss: 1.02851	valid_1's macroF1: 0.429108
[1000]	training's multi_logloss: 0.82895	training's macroF1: 0.66177	valid_1's multi_logloss: 1.00816	valid_1's macroF1: 0.422174
Early stopping, best iteration is:
[613]	training's multi_logloss: 0.924642	training's macroF1: 0.610231	valid_1's multi_logloss: 1.01979	valid_1's macroF1: 0.437229
******************** Execution ended in 00h 01m 02.93s ********************
######################################## 28 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.1064	training's macroF1: 0.621124	valid_1's multi_logloss: 1.1217	valid_1's macroF1: 0.386253
Early stopping, best iteration is:
[20]	training's multi_logloss: 1.36984	training's macroF1: 0.525691	valid_1's multi_logloss: 1.36778	valid_1's macroF1: 0.412845
******************** Execution ended in 00h 00m 48.20s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.10942	training's macroF1: 0.61295	valid_1's multi_logloss: 1.1186	valid_1's macroF1: 0.404855
[1000]	training's multi_logloss: 0.958931	training's macroF1: 0.659805	valid_1's multi_logloss: 1.0312	valid_1's macroF1: 0.409979
Early stopping, best iteration is:
[576]	training's multi_logloss: 1.0818	training's macroF1: 0.620317	valid_1's multi_logloss: 1.10014	valid_1's macroF1: 0.417177
******************** Execution ended in 00h 01m 50.96s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.1085	training's macroF1: 0.611803	valid_1's multi_logloss: 1.13382	valid_1's macroF1: 0.426722
Early stopping, best iteration is:
[58]	training's multi_logloss: 1.34197	training's macroF1: 0.559251	valid_1's multi_logloss: 1.33674	valid_1's macroF1: 0.434148
******************** Execution ended in 00h 00m 52.16s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.10688	training's macroF1: 0.623805	valid_1's multi_logloss: 1.15181	valid_1's macroF1: 0.419124
[1000]	training's multi_logloss: 0.955646	training's macroF1: 0.678811	valid_1's multi_logloss: 1.07043	valid_1's macroF1: 0.421485
Early stopping, best iteration is:
[691]	training's multi_logloss: 1.04024	training's macroF1: 0.641187	valid_1's multi_logloss: 1.11263	valid_1's macroF1: 0.428778
******************** Execution ended in 00h 02m 01.56s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.10045	training's macroF1: 0.612875	valid_1's multi_logloss: 1.15994	valid_1's macroF1: 0.368683
[1000]	training's multi_logloss: 0.945354	training's macroF1: 0.655267	valid_1's multi_logloss: 1.09103	valid_1's macroF1: 0.366159
Early stopping, best iteration is:
[526]	training's multi_logloss: 1.0904	training's macroF1: 0.618306	valid_1's multi_logloss: 1.15403	valid_1's macroF1: 0.377613
******************** Execution ended in 00h 01m 42.45s ********************
######################################## 29 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.03807	training's macroF1: 0.525745	valid_1's multi_logloss: 1.09533	valid_1's macroF1: 0.36886
Early stopping, best iteration is:
[21]	training's multi_logloss: 1.34868	training's macroF1: 0.464098	valid_1's multi_logloss: 1.34108	valid_1's macroF1: 0.380429
******************** Execution ended in 00h 00m 26.17s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.05208	training's macroF1: 0.54688	valid_1's multi_logloss: 1.08021	valid_1's macroF1: 0.404179
Early stopping, best iteration is:
[38]	training's multi_logloss: 1.32729	training's macroF1: 0.466128	valid_1's multi_logloss: 1.30951	valid_1's macroF1: 0.428354
******************** Execution ended in 00h 00m 26.05s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.06078	training's macroF1: 0.536397	valid_1's multi_logloss: 1.06636	valid_1's macroF1: 0.422467
[1000]	training's multi_logloss: 0.96103	training's macroF1: 0.575414	valid_1's multi_logloss: 1.03914	valid_1's macroF1: 0.430793
Early stopping, best iteration is:
[735]	training's multi_logloss: 1.00676	training's macroF1: 0.555576	valid_1's multi_logloss: 1.0481	valid_1's macroF1: 0.44291
******************** Execution ended in 00h 00m 58.41s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.05332	training's macroF1: 0.5335	valid_1's multi_logloss: 1.02225	valid_1's macroF1: 0.413933
Early stopping, best iteration is:
[400]	training's multi_logloss: 1.08394	training's macroF1: 0.526896	valid_1's multi_logloss: 1.03935	valid_1's macroF1: 0.423188
******************** Execution ended in 00h 00m 43.51s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.04599	training's macroF1: 0.543112	valid_1's multi_logloss: 1.07255	valid_1's macroF1: 0.406737
[1000]	training's multi_logloss: 0.942729	training's macroF1: 0.582952	valid_1's multi_logloss: 1.05751	valid_1's macroF1: 0.394757
Early stopping, best iteration is:
[712]	training's multi_logloss: 0.994178	training's macroF1: 0.561105	valid_1's multi_logloss: 1.06215	valid_1's macroF1: 0.414359
******************** Execution ended in 00h 00m 58.65s ********************
######################################## 30 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.645191	training's macroF1: 0.782338	valid_1's multi_logloss: 0.971783	valid_1's macroF1: 0.401896
Early stopping, best iteration is:
[231]	training's multi_logloss: 0.866802	training's macroF1: 0.72021	valid_1's multi_logloss: 1.02972	valid_1's macroF1: 0.424846
******************** Execution ended in 00h 01m 13.83s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.63992	training's macroF1: 0.779209	valid_1's multi_logloss: 0.972584	valid_1's macroF1: 0.416659
Early stopping, best iteration is:
[30]	training's multi_logloss: 1.26468	training's macroF1: 0.592983	valid_1's multi_logloss: 1.26968	valid_1's macroF1: 0.448569
******************** Execution ended in 00h 00m 49.72s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.644894	training's macroF1: 0.774475	valid_1's multi_logloss: 1.00835	valid_1's macroF1: 0.41355
Early stopping, best iteration is:
[57]	training's multi_logloss: 1.17835	training's macroF1: 0.628495	valid_1's multi_logloss: 1.21817	valid_1's macroF1: 0.435933
******************** Execution ended in 00h 00m 51.30s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.639858	training's macroF1: 0.7926	valid_1's multi_logloss: 0.997143	valid_1's macroF1: 0.420676
[1000]	training's multi_logloss: 0.447524	training's macroF1: 0.85148	valid_1's multi_logloss: 0.982106	valid_1's macroF1: 0.42299
[1500]	training's multi_logloss: 0.357004	training's macroF1: 0.886419	valid_1's multi_logloss: 0.978627	valid_1's macroF1: 0.429918
Early stopping, best iteration is:
[1392]	training's multi_logloss: 0.372025	training's macroF1: 0.880198	valid_1's multi_logloss: 0.979598	valid_1's macroF1: 0.432092
******************** Execution ended in 00h 02m 54.87s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.64265	training's macroF1: 0.774327	valid_1's multi_logloss: 1.00109	valid_1's macroF1: 0.384584
Early stopping, best iteration is:
[20]	training's multi_logloss: 1.3018	training's macroF1: 0.595838	valid_1's multi_logloss: 1.30393	valid_1's macroF1: 0.402713
******************** Execution ended in 00h 00m 49.37s ********************
######################################## 31 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.904224	training's macroF1: 0.611103	valid_1's multi_logloss: 1.06779	valid_1's macroF1: 0.385332
Early stopping, best iteration is:
[192]	training's multi_logloss: 1.05865	training's macroF1: 0.545753	valid_1's multi_logloss: 1.08913	valid_1's macroF1: 0.401227
******************** Execution ended in 00h 00m 42.10s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.909831	training's macroF1: 0.609324	valid_1's multi_logloss: 1.05674	valid_1's macroF1: 0.402736
Early stopping, best iteration is:
[353]	training's multi_logloss: 0.966867	training's macroF1: 0.578357	valid_1's multi_logloss: 1.06388	valid_1's macroF1: 0.413389
******************** Execution ended in 00h 00m 52.05s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.908514	training's macroF1: 0.605167	valid_1's multi_logloss: 1.05894	valid_1's macroF1: 0.397606
Early stopping, best iteration is:
[334]	training's multi_logloss: 0.97626	training's macroF1: 0.578249	valid_1's multi_logloss: 1.06431	valid_1's macroF1: 0.413256
******************** Execution ended in 00h 00m 51.43s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.916911	training's macroF1: 0.606472	valid_1's multi_logloss: 1.00978	valid_1's macroF1: 0.444424
[1000]	training's multi_logloss: 0.788544	training's macroF1: 0.670914	valid_1's multi_logloss: 0.992011	valid_1's macroF1: 0.439945
Early stopping, best iteration is:
[579]	training's multi_logloss: 0.891163	training's macroF1: 0.620844	valid_1's multi_logloss: 1.00626	valid_1's macroF1: 0.446838
******************** Execution ended in 00h 01m 05.15s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.909625	training's macroF1: 0.602566	valid_1's multi_logloss: 1.0206	valid_1's macroF1: 0.42355
Early stopping, best iteration is:
[192]	training's multi_logloss: 1.0617	training's macroF1: 0.541712	valid_1's multi_logloss: 1.06113	valid_1's macroF1: 0.446153
******************** Execution ended in 00h 00m 41.95s ********************
######################################## 32 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.05833	training's macroF1: 0.550869	valid_1's multi_logloss: 1.06454	valid_1's macroF1: 0.370216
Early stopping, best iteration is:
[343]	training's multi_logloss: 1.11429	training's macroF1: 0.522876	valid_1's multi_logloss: 1.09302	valid_1's macroF1: 0.383378
******************** Execution ended in 00h 00m 39.39s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.06549	training's macroF1: 0.534841	valid_1's multi_logloss: 1.0534	valid_1's macroF1: 0.384535
[1000]	training's multi_logloss: 0.962423	training's macroF1: 0.580359	valid_1's multi_logloss: 1.02345	valid_1's macroF1: 0.399813
[1500]	training's multi_logloss: 0.895626	training's macroF1: 0.615252	valid_1's multi_logloss: 1.01704	valid_1's macroF1: 0.406948
[2000]	training's multi_logloss: 0.84534	training's macroF1: 0.639609	valid_1's multi_logloss: 1.01367	valid_1's macroF1: 0.412471
Early stopping, best iteration is:
[1966]	training's multi_logloss: 0.848541	training's macroF1: 0.636192	valid_1's multi_logloss: 1.01381	valid_1's macroF1: 0.420376
******************** Execution ended in 00h 01m 37.62s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.06965	training's macroF1: 0.540439	valid_1's multi_logloss: 1.06914	valid_1's macroF1: 0.41693
Early stopping, best iteration is:
[417]	training's multi_logloss: 1.09667	training's macroF1: 0.531259	valid_1's multi_logloss: 1.08425	valid_1's macroF1: 0.431676
******************** Execution ended in 00h 00m 38.11s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.06988	training's macroF1: 0.538012	valid_1's multi_logloss: 1.07106	valid_1's macroF1: 0.372955
Early stopping, best iteration is:
[121]	training's multi_logloss: 1.25043	training's macroF1: 0.491658	valid_1's multi_logloss: 1.2189	valid_1's macroF1: 0.398413
******************** Execution ended in 00h 00m 26.03s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.06315	training's macroF1: 0.548751	valid_1's multi_logloss: 1.08975	valid_1's macroF1: 0.395899
[1000]	training's multi_logloss: 0.961405	training's macroF1: 0.576417	valid_1's multi_logloss: 1.06971	valid_1's macroF1: 0.400837
Early stopping, best iteration is:
[846]	training's multi_logloss: 0.986443	training's macroF1: 0.563439	valid_1's multi_logloss: 1.07041	valid_1's macroF1: 0.408452
******************** Execution ended in 00h 00m 54.59s ********************
######################################## 33 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.869955	training's macroF1: 0.660329	valid_1's multi_logloss: 0.994284	valid_1's macroF1: 0.444177
Early stopping, best iteration is:
[271]	training's multi_logloss: 1.00289	training's macroF1: 0.612345	valid_1's multi_logloss: 1.0425	valid_1's macroF1: 0.447601
******************** Execution ended in 00h 01m 08.85s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.864985	training's macroF1: 0.659996	valid_1's multi_logloss: 1.01526	valid_1's macroF1: 0.430609
[1000]	training's multi_logloss: 0.697442	training's macroF1: 0.739491	valid_1's multi_logloss: 0.997228	valid_1's macroF1: 0.434441
Early stopping, best iteration is:
[613]	training's multi_logloss: 0.817406	training's macroF1: 0.678198	valid_1's multi_logloss: 1.00738	valid_1's macroF1: 0.441088
******************** Execution ended in 00h 01m 38.79s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.862471	training's macroF1: 0.658302	valid_1's multi_logloss: 1.00202	valid_1's macroF1: 0.411821
Early stopping, best iteration is:
[272]	training's multi_logloss: 0.995109	training's macroF1: 0.595925	valid_1's multi_logloss: 1.04182	valid_1's macroF1: 0.424301
******************** Execution ended in 00h 01m 08.35s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.864335	training's macroF1: 0.675712	valid_1's multi_logloss: 1.05802	valid_1's macroF1: 0.405308
Early stopping, best iteration is:
[211]	training's multi_logloss: 1.04696	training's macroF1: 0.616735	valid_1's multi_logloss: 1.11385	valid_1's macroF1: 0.424722
******************** Execution ended in 00h 01m 02.63s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.853958	training's macroF1: 0.652339	valid_1's multi_logloss: 1.05299	valid_1's macroF1: 0.383835
[1000]	training's multi_logloss: 0.690527	training's macroF1: 0.72592	valid_1's multi_logloss: 1.03531	valid_1's macroF1: 0.392891
Early stopping, best iteration is:
[755]	training's multi_logloss: 0.758989	training's macroF1: 0.693673	valid_1's multi_logloss: 1.04183	valid_1's macroF1: 0.405534
******************** Execution ended in 00h 01m 55.76s ********************
######################################## 34 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.08916	training's macroF1: 0.50764	valid_1's multi_logloss: 1.10321	valid_1's macroF1: 0.415162
[1000]	training's multi_logloss: 1.01135	training's macroF1: 0.545349	valid_1's multi_logloss: 1.08305	valid_1's macroF1: 0.421022
Early stopping, best iteration is:
[540]	training's multi_logloss: 1.08063	training's macroF1: 0.505099	valid_1's multi_logloss: 1.10216	valid_1's macroF1: 0.4334
******************** Execution ended in 00h 00m 33.70s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.08482	training's macroF1: 0.521816	valid_1's multi_logloss: 1.05027	valid_1's macroF1: 0.396627
[1000]	training's multi_logloss: 1.00221	training's macroF1: 0.545941	valid_1's multi_logloss: 1.03549	valid_1's macroF1: 0.401351
[1500]	training's multi_logloss: 0.947808	training's macroF1: 0.578846	valid_1's multi_logloss: 1.03253	valid_1's macroF1: 0.399981
Early stopping, best iteration is:
[1377]	training's multi_logloss: 0.959963	training's macroF1: 0.569978	valid_1's multi_logloss: 1.03275	valid_1's macroF1: 0.413781
******************** Execution ended in 00h 00m 59.51s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.07866	training's macroF1: 0.522615	valid_1's multi_logloss: 1.11704	valid_1's macroF1: 0.410071
Early stopping, best iteration is:
[378]	training's multi_logloss: 1.11008	training's macroF1: 0.518178	valid_1's multi_logloss: 1.12543	valid_1's macroF1: 0.414591
******************** Execution ended in 00h 00m 28.61s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.08065	training's macroF1: 0.509672	valid_1's multi_logloss: 1.06682	valid_1's macroF1: 0.415609
[1000]	training's multi_logloss: 0.99958	training's macroF1: 0.545949	valid_1's multi_logloss: 1.04398	valid_1's macroF1: 0.419905
[1500]	training's multi_logloss: 0.94819	training's macroF1: 0.574207	valid_1's multi_logloss: 1.04215	valid_1's macroF1: 0.412599
Early stopping, best iteration is:
[1238]	training's multi_logloss: 0.972718	training's macroF1: 0.556604	valid_1's multi_logloss: 1.04332	valid_1's macroF1: 0.427397
******************** Execution ended in 00h 00m 55.03s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.0755	training's macroF1: 0.505342	valid_1's multi_logloss: 1.02663	valid_1's macroF1: 0.390627
Early stopping, best iteration is:
[10]	training's multi_logloss: 1.36838	training's macroF1: 0.424073	valid_1's multi_logloss: 1.35602	valid_1's macroF1: 0.415529
******************** Execution ended in 00h 00m 16.59s ********************
######################################## 35 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.12479	training's macroF1: 0.566932	valid_1's multi_logloss: 1.10265	valid_1's macroF1: 0.451481
[1000]	training's multi_logloss: 1.00988	training's macroF1: 0.592613	valid_1's multi_logloss: 1.03074	valid_1's macroF1: 0.431464
Early stopping, best iteration is:
[530]	training's multi_logloss: 1.1158	training's macroF1: 0.568653	valid_1's multi_logloss: 1.09559	valid_1's macroF1: 0.454459
******************** Execution ended in 00h 00m 52.75s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.11502	training's macroF1: 0.550964	valid_1's multi_logloss: 1.1411	valid_1's macroF1: 0.37313
Early stopping, best iteration is:
[111]	training's multi_logloss: 1.29554	training's macroF1: 0.504182	valid_1's multi_logloss: 1.29028	valid_1's macroF1: 0.390227
******************** Execution ended in 00h 00m 30.84s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.1116	training's macroF1: 0.558426	valid_1's multi_logloss: 1.10985	valid_1's macroF1: 0.412323
[1000]	training's multi_logloss: 0.99325	training's macroF1: 0.60369	valid_1's multi_logloss: 1.05112	valid_1's macroF1: 0.420128
Early stopping, best iteration is:
[938]	training's multi_logloss: 1.00475	training's macroF1: 0.597078	valid_1's multi_logloss: 1.05506	valid_1's macroF1: 0.423926
******************** Execution ended in 00h 01m 13.84s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.12692	training's macroF1: 0.555206	valid_1's multi_logloss: 1.11367	valid_1's macroF1: 0.397537
Early stopping, best iteration is:
[285]	training's multi_logloss: 1.20772	training's macroF1: 0.521734	valid_1's multi_logloss: 1.18411	valid_1's macroF1: 0.408667
******************** Execution ended in 00h 00m 39.81s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.11754	training's macroF1: 0.551218	valid_1's multi_logloss: 1.10194	valid_1's macroF1: 0.411145
Early stopping, best iteration is:
[44]	training's multi_logloss: 1.34662	training's macroF1: 0.492833	valid_1's multi_logloss: 1.33712	valid_1's macroF1: 0.415093
******************** Execution ended in 00h 00m 26.95s ********************
######################################## 36 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.594738	training's macroF1: 0.802879	valid_1's multi_logloss: 0.955162	valid_1's macroF1: 0.411936
Early stopping, best iteration is:
[178]	training's multi_logloss: 0.907338	training's macroF1: 0.713464	valid_1's multi_logloss: 1.03527	valid_1's macroF1: 0.433103
******************** Execution ended in 00h 01m 08.64s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.591779	training's macroF1: 0.80527	valid_1's multi_logloss: 1.03648	valid_1's macroF1: 0.387512
[1000]	training's multi_logloss: 0.413517	training's macroF1: 0.865685	valid_1's multi_logloss: 1.0168	valid_1's macroF1: 0.377673
Early stopping, best iteration is:
[943]	training's multi_logloss: 0.426294	training's macroF1: 0.863753	valid_1's multi_logloss: 1.01865	valid_1's macroF1: 0.398903
******************** Execution ended in 00h 02m 29.20s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.582462	training's macroF1: 0.817363	valid_1's multi_logloss: 0.975324	valid_1's macroF1: 0.382348
[1000]	training's multi_logloss: 0.406282	training's macroF1: 0.866316	valid_1's multi_logloss: 0.972768	valid_1's macroF1: 0.386858
Early stopping, best iteration is:
[639]	training's multi_logloss: 0.513666	training's macroF1: 0.837524	valid_1's multi_logloss: 0.971406	valid_1's macroF1: 0.395855
******************** Execution ended in 00h 01m 57.08s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.604397	training's macroF1: 0.800887	valid_1's multi_logloss: 0.953704	valid_1's macroF1: 0.408685
Early stopping, best iteration is:
[449]	training's multi_logloss: 0.63664	training's macroF1: 0.784531	valid_1's multi_logloss: 0.961957	valid_1's macroF1: 0.424926
******************** Execution ended in 00h 01m 38.29s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.59178	training's macroF1: 0.803338	valid_1's multi_logloss: 1.01139	valid_1's macroF1: 0.408832
[1000]	training's multi_logloss: 0.414392	training's macroF1: 0.86242	valid_1's multi_logloss: 0.986926	valid_1's macroF1: 0.438745
[1500]	training's multi_logloss: 0.339732	training's macroF1: 0.886646	valid_1's multi_logloss: 0.979916	valid_1's macroF1: 0.426275
Early stopping, best iteration is:
[1145]	training's multi_logloss: 0.387347	training's macroF1: 0.869148	valid_1's multi_logloss: 0.984027	valid_1's macroF1: 0.443808
******************** Execution ended in 00h 02m 40.09s ********************
######################################## 37 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.2577	training's macroF1: 0.488215	valid_1's multi_logloss: 1.2345	valid_1's macroF1: 0.374227
Early stopping, best iteration is:
[7]	training's multi_logloss: 1.38388	training's macroF1: 0.447279	valid_1's multi_logloss: 1.38309	valid_1's macroF1: 0.388279
******************** Execution ended in 00h 00m 35.23s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.25996	training's macroF1: 0.480892	valid_1's multi_logloss: 1.24163	valid_1's macroF1: 0.390542
Early stopping, best iteration is:
[51]	training's multi_logloss: 1.36978	training's macroF1: 0.462242	valid_1's multi_logloss: 1.36549	valid_1's macroF1: 0.410506
******************** Execution ended in 00h 00m 40.25s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.25776	training's macroF1: 0.491685	valid_1's multi_logloss: 1.23015	valid_1's macroF1: 0.391796
[1000]	training's multi_logloss: 1.1749	training's macroF1: 0.514329	valid_1's multi_logloss: 1.15178	valid_1's macroF1: 0.390986
Early stopping, best iteration is:
[751]	training's multi_logloss: 1.21168	training's macroF1: 0.50341	valid_1's multi_logloss: 1.18475	valid_1's macroF1: 0.402807
******************** Execution ended in 00h 01m 35.28s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.26241	training's macroF1: 0.494631	valid_1's multi_logloss: 1.23409	valid_1's macroF1: 0.412751
Early stopping, best iteration is:
[402]	training's multi_logloss: 1.28258	training's macroF1: 0.484972	valid_1's multi_logloss: 1.25643	valid_1's macroF1: 0.428322
******************** Execution ended in 00h 01m 05.95s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.2558	training's macroF1: 0.517966	valid_1's multi_logloss: 1.24026	valid_1's macroF1: 0.379645
Early stopping, best iteration is:
[16]	training's multi_logloss: 1.38082	training's macroF1: 0.470095	valid_1's multi_logloss: 1.37941	valid_1's macroF1: 0.3966
******************** Execution ended in 00h 00m 37.38s ********************
######################################## 38 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.14074	training's macroF1: 0.516031	valid_1's multi_logloss: 1.12059	valid_1's macroF1: 0.392192
[1000]	training's multi_logloss: 1.03391	training's macroF1: 0.548701	valid_1's multi_logloss: 1.05962	valid_1's macroF1: 0.402237
Early stopping, best iteration is:
[978]	training's multi_logloss: 1.0376	training's macroF1: 0.544943	valid_1's multi_logloss: 1.06128	valid_1's macroF1: 0.409218
******************** Execution ended in 00h 01m 27.50s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.13431	training's macroF1: 0.533157	valid_1's multi_logloss: 1.13219	valid_1's macroF1: 0.383539
[1000]	training's multi_logloss: 1.02554	training's macroF1: 0.564812	valid_1's multi_logloss: 1.07394	valid_1's macroF1: 0.395119
[1500]	training's multi_logloss: 0.958499	training's macroF1: 0.587929	valid_1's multi_logloss: 1.06045	valid_1's macroF1: 0.398654
Early stopping, best iteration is:
[1293]	training's multi_logloss: 0.983309	training's macroF1: 0.579312	valid_1's multi_logloss: 1.06314	valid_1's macroF1: 0.404405
******************** Execution ended in 00h 01m 43.33s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.14383	training's macroF1: 0.52709	valid_1's multi_logloss: 1.12558	valid_1's macroF1: 0.404605
Early stopping, best iteration is:
[390]	training's multi_logloss: 1.17823	training's macroF1: 0.518382	valid_1's multi_logloss: 1.15295	valid_1's macroF1: 0.414989
******************** Execution ended in 00h 00m 49.31s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.14268	training's macroF1: 0.531082	valid_1's multi_logloss: 1.10133	valid_1's macroF1: 0.398754
Early stopping, best iteration is:
[27]	training's multi_logloss: 1.36433	training's macroF1: 0.45498	valid_1's multi_logloss: 1.35415	valid_1's macroF1: 0.426706
******************** Execution ended in 00h 00m 29.24s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.14384	training's macroF1: 0.533249	valid_1's multi_logloss: 1.12562	valid_1's macroF1: 0.414502
[1000]	training's multi_logloss: 1.03884	training's macroF1: 0.564221	valid_1's multi_logloss: 1.06224	valid_1's macroF1: 0.407688
Early stopping, best iteration is:
[527]	training's multi_logloss: 1.13625	training's macroF1: 0.535358	valid_1's multi_logloss: 1.12004	valid_1's macroF1: 0.416617
******************** Execution ended in 00h 00m 57.38s ********************
######################################## 39 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.11294	training's macroF1: 0.484685	valid_1's multi_logloss: 1.08443	valid_1's macroF1: 0.412734
Early stopping, best iteration is:
[260]	training's multi_logloss: 1.17315	training's macroF1: 0.475675	valid_1's multi_logloss: 1.12432	valid_1's macroF1: 0.427565
******************** Execution ended in 00h 00m 21.88s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.10516	training's macroF1: 0.484149	valid_1's multi_logloss: 1.06738	valid_1's macroF1: 0.389824
Early stopping, best iteration is:
[441]	training's multi_logloss: 1.11608	training's macroF1: 0.485254	valid_1's multi_logloss: 1.07286	valid_1's macroF1: 0.39981
******************** Execution ended in 00h 00m 26.72s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.1143	training's macroF1: 0.481821	valid_1's multi_logloss: 1.07888	valid_1's macroF1: 0.401004
Early stopping, best iteration is:
[335]	training's multi_logloss: 1.15022	training's macroF1: 0.475125	valid_1's multi_logloss: 1.09236	valid_1's macroF1: 0.419557
******************** Execution ended in 00h 00m 24.04s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.10833	training's macroF1: 0.483019	valid_1's multi_logloss: 1.09362	valid_1's macroF1: 0.394489
[1000]	training's multi_logloss: 1.04626	training's macroF1: 0.517864	valid_1's multi_logloss: 1.08117	valid_1's macroF1: 0.402033
Early stopping, best iteration is:
[992]	training's multi_logloss: 1.04705	training's macroF1: 0.514815	valid_1's multi_logloss: 1.08122	valid_1's macroF1: 0.405579
******************** Execution ended in 00h 00m 42.15s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.113	training's macroF1: 0.482034	valid_1's multi_logloss: 1.1112	valid_1's macroF1: 0.393716
Early stopping, best iteration is:
[10]	training's multi_logloss: 1.37023	training's macroF1: 0.412393	valid_1's multi_logloss: 1.36237	valid_1's macroF1: 0.402828
******************** Execution ended in 00h 00m 15.02s ********************
######################################## 40 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.04014	training's macroF1: 0.520467	valid_1's multi_logloss: 1.02889	valid_1's macroF1: 0.426815
Early stopping, best iteration is:
[373]	training's multi_logloss: 1.07351	training's macroF1: 0.506163	valid_1's multi_logloss: 1.0348	valid_1's macroF1: 0.43991
******************** Execution ended in 00h 00m 39.45s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.03129	training's macroF1: 0.535138	valid_1's multi_logloss: 1.05016	valid_1's macroF1: 0.416594
[1000]	training's multi_logloss: 0.946173	training's macroF1: 0.576429	valid_1's multi_logloss: 1.04836	valid_1's macroF1: 0.425101
Early stopping, best iteration is:
[890]	training's multi_logloss: 0.961534	training's macroF1: 0.56605	valid_1's multi_logloss: 1.05001	valid_1's macroF1: 0.433966
******************** Execution ended in 00h 00m 57.00s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.02408	training's macroF1: 0.532041	valid_1's multi_logloss: 1.1079	valid_1's macroF1: 0.38635
Early stopping, best iteration is:
[259]	training's multi_logloss: 1.09935	training's macroF1: 0.4964	valid_1's multi_logloss: 1.12086	valid_1's macroF1: 0.397631
******************** Execution ended in 00h 00m 31.36s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.02369	training's macroF1: 0.532613	valid_1's multi_logloss: 1.05888	valid_1's macroF1: 0.39882
Early stopping, best iteration is:
[5]	training's multi_logloss: 1.37137	training's macroF1: 0.416943	valid_1's multi_logloss: 1.36531	valid_1's macroF1: 0.410584
******************** Execution ended in 00h 00m 21.11s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.02547	training's macroF1: 0.533093	valid_1's multi_logloss: 1.08358	valid_1's macroF1: 0.388974
Early stopping, best iteration is:
[104]	training's multi_logloss: 1.19854	training's macroF1: 0.473315	valid_1's multi_logloss: 1.16579	valid_1's macroF1: 0.405119
******************** Execution ended in 00h 00m 25.32s ********************
######################################## 41 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.23509	training's macroF1: 0.503037	valid_1's multi_logloss: 1.20367	valid_1's macroF1: 0.392491
Early stopping, best iteration is:
[54]	training's multi_logloss: 1.36366	training's macroF1: 0.470994	valid_1's multi_logloss: 1.3557	valid_1's macroF1: 0.414576
******************** Execution ended in 00h 00m 25.78s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.23808	training's macroF1: 0.47522	valid_1's multi_logloss: 1.19177	valid_1's macroF1: 0.390301
Early stopping, best iteration is:
[23]	training's multi_logloss: 1.37648	training's macroF1: 0.448003	valid_1's multi_logloss: 1.37203	valid_1's macroF1: 0.396884
******************** Execution ended in 00h 00m 23.78s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.2318	training's macroF1: 0.499995	valid_1's multi_logloss: 1.20854	valid_1's macroF1: 0.363396
Early stopping, best iteration is:
[1]	training's multi_logloss: 1.38584	training's macroF1: 0.417351	valid_1's multi_logloss: 1.38565	valid_1's macroF1: 0.382147
******************** Execution ended in 00h 00m 22.49s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.23005	training's macroF1: 0.482272	valid_1's multi_logloss: 1.2131	valid_1's macroF1: 0.368922
Early stopping, best iteration is:
[165]	training's multi_logloss: 1.32168	training's macroF1: 0.465481	valid_1's multi_logloss: 1.3094	valid_1's macroF1: 0.389235
******************** Execution ended in 00h 00m 29.59s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.23676	training's macroF1: 0.497106	valid_1's multi_logloss: 1.18902	valid_1's macroF1: 0.401446
Early stopping, best iteration is:
[165]	training's multi_logloss: 1.32543	training's macroF1: 0.473608	valid_1's multi_logloss: 1.29781	valid_1's macroF1: 0.428447
******************** Execution ended in 00h 00m 29.77s ********************
######################################## 42 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.889546	training's macroF1: 0.652826	valid_1's multi_logloss: 1.0798	valid_1's macroF1: 0.383314
Early stopping, best iteration is:
[32]	training's multi_logloss: 1.3063	training's macroF1: 0.531342	valid_1's multi_logloss: 1.309	valid_1's macroF1: 0.385634
******************** Execution ended in 00h 00m 42.10s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.896787	training's macroF1: 0.647364	valid_1's multi_logloss: 1.01983	valid_1's macroF1: 0.438611
[1000]	training's multi_logloss: 0.741668	training's macroF1: 0.71529	valid_1's multi_logloss: 0.992336	valid_1's macroF1: 0.444097
Early stopping, best iteration is:
[780]	training's multi_logloss: 0.801303	training's macroF1: 0.681928	valid_1's multi_logloss: 1.00041	valid_1's macroF1: 0.457308
******************** Execution ended in 00h 01m 40.72s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.907669	training's macroF1: 0.629332	valid_1's multi_logloss: 0.974822	valid_1's macroF1: 0.448167
[1000]	training's multi_logloss: 0.746819	training's macroF1: 0.697101	valid_1's multi_logloss: 0.942006	valid_1's macroF1: 0.450647
[1500]	training's multi_logloss: 0.638433	training's macroF1: 0.752848	valid_1's multi_logloss: 0.930953	valid_1's macroF1: 0.467112
[2000]	training's multi_logloss: 0.560742	training's macroF1: 0.789906	valid_1's multi_logloss: 0.923515	valid_1's macroF1: 0.472055
[2500]	training's multi_logloss: 0.501909	training's macroF1: 0.815554	valid_1's multi_logloss: 0.920454	valid_1's macroF1: 0.483169
Early stopping, best iteration is:
[2249]	training's multi_logloss: 0.529697	training's macroF1: 0.801963	valid_1's multi_logloss: 0.922251	valid_1's macroF1: 0.486116
******************** Execution ended in 00h 03m 27.72s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.898249	training's macroF1: 0.649344	valid_1's multi_logloss: 1.04874	valid_1's macroF1: 0.422564
Early stopping, best iteration is:
[397]	training's multi_logloss: 0.944895	training's macroF1: 0.631595	valid_1's multi_logloss: 1.06135	valid_1's macroF1: 0.437156
******************** Execution ended in 00h 01m 13.37s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.898129	training's macroF1: 0.639093	valid_1's multi_logloss: 1.03037	valid_1's macroF1: 0.413664
Early stopping, best iteration is:
[121]	training's multi_logloss: 1.16143	training's macroF1: 0.554013	valid_1's multi_logloss: 1.16101	valid_1's macroF1: 0.427538
******************** Execution ended in 00h 00m 51.84s ********************
######################################## 43 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.11196	training's macroF1: 0.504259	valid_1's multi_logloss: 1.0803	valid_1's macroF1: 0.389258
[1000]	training's multi_logloss: 1.03571	training's macroF1: 0.524563	valid_1's multi_logloss: 1.05815	valid_1's macroF1: 0.382977
Early stopping, best iteration is:
[877]	training's multi_logloss: 1.04996	training's macroF1: 0.519737	valid_1's multi_logloss: 1.0593	valid_1's macroF1: 0.408931
******************** Execution ended in 00h 00m 48.36s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.12878	training's macroF1: 0.493121	valid_1's multi_logloss: 1.09028	valid_1's macroF1: 0.409471
[1000]	training's multi_logloss: 1.05606	training's macroF1: 0.519611	valid_1's multi_logloss: 1.05386	valid_1's macroF1: 0.419115
Early stopping, best iteration is:
[979]	training's multi_logloss: 1.0583	training's macroF1: 0.520011	valid_1's multi_logloss: 1.05508	valid_1's macroF1: 0.420111
******************** Execution ended in 00h 00m 51.35s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.12305	training's macroF1: 0.490997	valid_1's multi_logloss: 1.06293	valid_1's macroF1: 0.465243
Early stopping, best iteration is:
[384]	training's multi_logloss: 1.15304	training's macroF1: 0.475243	valid_1's multi_logloss: 1.08284	valid_1's macroF1: 0.467708
******************** Execution ended in 00h 00m 31.41s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.12727	training's macroF1: 0.499517	valid_1's multi_logloss: 1.11572	valid_1's macroF1: 0.361759
Early stopping, best iteration is:
[60]	training's multi_logloss: 1.31666	training's macroF1: 0.453537	valid_1's multi_logloss: 1.29174	valid_1's macroF1: 0.380608
******************** Execution ended in 00h 00m 20.51s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.11776	training's macroF1: 0.486488	valid_1's multi_logloss: 1.12242	valid_1's macroF1: 0.391126
[1000]	training's multi_logloss: 1.04245	training's macroF1: 0.524056	valid_1's multi_logloss: 1.10541	valid_1's macroF1: 0.408292
[1500]	training's multi_logloss: 0.995414	training's macroF1: 0.551524	valid_1's multi_logloss: 1.10192	valid_1's macroF1: 0.416645
[2000]	training's multi_logloss: 0.959362	training's macroF1: 0.563075	valid_1's multi_logloss: 1.10111	valid_1's macroF1: 0.411329
Early stopping, best iteration is:
[1572]	training's multi_logloss: 0.989864	training's macroF1: 0.552538	valid_1's multi_logloss: 1.10227	valid_1's macroF1: 0.42512
******************** Execution ended in 00h 01m 13.23s ********************
######################################## 44 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.820078	training's macroF1: 0.665156	valid_1's multi_logloss: 1.02735	valid_1's macroF1: 0.414773
Early stopping, best iteration is:
[187]	training's multi_logloss: 1.01082	training's macroF1: 0.572393	valid_1's multi_logloss: 1.06414	valid_1's macroF1: 0.441662
******************** Execution ended in 00h 00m 44.45s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.80254	training's macroF1: 0.666306	valid_1's multi_logloss: 1.04256	valid_1's macroF1: 0.388427
[1000]	training's multi_logloss: 0.647958	training's macroF1: 0.742893	valid_1's multi_logloss: 1.03417	valid_1's macroF1: 0.388625
Early stopping, best iteration is:
[787]	training's multi_logloss: 0.702192	training's macroF1: 0.71535	valid_1's multi_logloss: 1.03752	valid_1's macroF1: 0.405027
******************** Execution ended in 00h 01m 22.27s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.808003	training's macroF1: 0.678017	valid_1's multi_logloss: 1.01835	valid_1's macroF1: 0.432496
Early stopping, best iteration is:
[189]	training's multi_logloss: 1.00249	training's macroF1: 0.584859	valid_1's multi_logloss: 1.0591	valid_1's macroF1: 0.443377
******************** Execution ended in 00h 00m 45.26s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.822026	training's macroF1: 0.664212	valid_1's multi_logloss: 0.983132	valid_1's macroF1: 0.422824
Early stopping, best iteration is:
[131]	training's multi_logloss: 1.07427	training's macroF1: 0.559816	valid_1's multi_logloss: 1.07254	valid_1's macroF1: 0.442215
******************** Execution ended in 00h 00m 40.11s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.797009	training's macroF1: 0.667564	valid_1's multi_logloss: 1.03259	valid_1's macroF1: 0.402937
Early stopping, best iteration is:
[111]	training's multi_logloss: 1.09313	training's macroF1: 0.552966	valid_1's multi_logloss: 1.09779	valid_1's macroF1: 0.421831
******************** Execution ended in 00h 00m 43.89s ********************
######################################## 45 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.13046	training's macroF1: 0.542273	valid_1's multi_logloss: 1.14415	valid_1's macroF1: 0.407926
Early stopping, best iteration is:
[2]	training's multi_logloss: 1.38451	training's macroF1: 0.444451	valid_1's multi_logloss: 1.38406	valid_1's macroF1: 0.418685
******************** Execution ended in 00h 00m 32.22s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.1383	training's macroF1: 0.530167	valid_1's multi_logloss: 1.11988	valid_1's macroF1: 0.41558
[1000]	training's multi_logloss: 1.0312	training's macroF1: 0.561026	valid_1's multi_logloss: 1.04854	valid_1's macroF1: 0.416117
Early stopping, best iteration is:
[743]	training's multi_logloss: 1.07809	training's macroF1: 0.542512	valid_1's multi_logloss: 1.07468	valid_1's macroF1: 0.423817
******************** Execution ended in 00h 01m 19.51s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.13462	training's macroF1: 0.527095	valid_1's multi_logloss: 1.11482	valid_1's macroF1: 0.389953
Early stopping, best iteration is:
[323]	training's multi_logloss: 1.19679	training's macroF1: 0.508338	valid_1's multi_logloss: 1.16876	valid_1's macroF1: 0.406442
******************** Execution ended in 00h 00m 52.42s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.13123	training's macroF1: 0.539203	valid_1's multi_logloss: 1.12962	valid_1's macroF1: 0.377098
[1000]	training's multi_logloss: 1.02152	training's macroF1: 0.572134	valid_1's multi_logloss: 1.0657	valid_1's macroF1: 0.386884
[1500]	training's multi_logloss: 0.953087	training's macroF1: 0.594816	valid_1's multi_logloss: 1.04669	valid_1's macroF1: 0.392709
Early stopping, best iteration is:
[1469]	training's multi_logloss: 0.956754	training's macroF1: 0.593971	valid_1's multi_logloss: 1.0474	valid_1's macroF1: 0.399109
******************** Execution ended in 00h 02m 02.05s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.14035	training's macroF1: 0.530468	valid_1's multi_logloss: 1.13482	valid_1's macroF1: 0.391258
[1000]	training's multi_logloss: 1.02942	training's macroF1: 0.558691	valid_1's multi_logloss: 1.06955	valid_1's macroF1: 0.397508
[1500]	training's multi_logloss: 0.960629	training's macroF1: 0.590336	valid_1's multi_logloss: 1.04884	valid_1's macroF1: 0.423985
[2000]	training's multi_logloss: 0.909239	training's macroF1: 0.610841	valid_1's multi_logloss: 1.04082	valid_1's macroF1: 0.424011
Early stopping, best iteration is:
[1665]	training's multi_logloss: 0.942381	training's macroF1: 0.597739	valid_1's multi_logloss: 1.04549	valid_1's macroF1: 0.42744
******************** Execution ended in 00h 02m 12.88s ********************
######################################## 46 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.23944	training's macroF1: 0.466684	valid_1's multi_logloss: 1.1629	valid_1's macroF1: 0.438267
Early stopping, best iteration is:
[39]	training's multi_logloss: 1.37005	training's macroF1: 0.437431	valid_1's multi_logloss: 1.35939	valid_1's macroF1: 0.442367
******************** Execution ended in 00h 00m 26.36s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.2395	training's macroF1: 0.473864	valid_1's multi_logloss: 1.18588	valid_1's macroF1: 0.419919
Early stopping, best iteration is:
[58]	training's multi_logloss: 1.3622	training's macroF1: 0.440625	valid_1's multi_logloss: 1.3493	valid_1's macroF1: 0.42835
******************** Execution ended in 00h 00m 26.33s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.23113	training's macroF1: 0.472421	valid_1's multi_logloss: 1.19936	valid_1's macroF1: 0.379946
Early stopping, best iteration is:
[458]	training's multi_logloss: 1.2402	training's macroF1: 0.471238	valid_1's multi_logloss: 1.20833	valid_1's macroF1: 0.389444
******************** Execution ended in 00h 00m 45.80s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.23263	training's macroF1: 0.460599	valid_1's multi_logloss: 1.20267	valid_1's macroF1: 0.369858
Early stopping, best iteration is:
[362]	training's multi_logloss: 1.26398	training's macroF1: 0.450829	valid_1's multi_logloss: 1.23575	valid_1's macroF1: 0.380211
******************** Execution ended in 00h 00m 42.09s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.22938	training's macroF1: 0.478254	valid_1's multi_logloss: 1.21713	valid_1's macroF1: 0.387789
Early stopping, best iteration is:
[404]	training's multi_logloss: 1.25086	training's macroF1: 0.472834	valid_1's multi_logloss: 1.23782	valid_1's macroF1: 0.393494
******************** Execution ended in 00h 00m 47.69s ********************
######################################## 47 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.00031	training's macroF1: 0.626342	valid_1's multi_logloss: 1.06127	valid_1's macroF1: 0.409881
Early stopping, best iteration is:
[428]	training's multi_logloss: 1.03115	training's macroF1: 0.616777	valid_1's multi_logloss: 1.07478	valid_1's macroF1: 0.422387
******************** Execution ended in 00h 00m 49.58s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.00173	training's macroF1: 0.614745	valid_1's multi_logloss: 1.0657	valid_1's macroF1: 0.419639
[1000]	training's multi_logloss: 0.855671	training's macroF1: 0.659258	valid_1's multi_logloss: 1.0181	valid_1's macroF1: 0.417516
[1500]	training's multi_logloss: 0.760623	training's macroF1: 0.703882	valid_1's multi_logloss: 1.00042	valid_1's macroF1: 0.428468
[2000]	training's multi_logloss: 0.689996	training's macroF1: 0.737662	valid_1's multi_logloss: 0.991489	valid_1's macroF1: 0.423894
Early stopping, best iteration is:
[1729]	training's multi_logloss: 0.725886	training's macroF1: 0.721141	valid_1's multi_logloss: 0.995122	valid_1's macroF1: 0.43257
******************** Execution ended in 00h 01m 59.01s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.00725	training's macroF1: 0.618185	valid_1's multi_logloss: 1.04318	valid_1's macroF1: 0.421554
Early stopping, best iteration is:
[5]	training's multi_logloss: 1.37856	training's macroF1: 0.484139	valid_1's multi_logloss: 1.37578	valid_1's macroF1: 0.433348
******************** Execution ended in 00h 00m 24.84s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.0089	training's macroF1: 0.607344	valid_1's multi_logloss: 1.04721	valid_1's macroF1: 0.405481
Early stopping, best iteration is:
[71]	training's multi_logloss: 1.2926	training's macroF1: 0.53826	valid_1's multi_logloss: 1.27648	valid_1's macroF1: 0.422041
******************** Execution ended in 00h 00m 28.76s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.998805	training's macroF1: 0.609789	valid_1's multi_logloss: 1.04472	valid_1's macroF1: 0.374976
Early stopping, best iteration is:
[178]	training's multi_logloss: 1.18252	training's macroF1: 0.566823	valid_1's multi_logloss: 1.17038	valid_1's macroF1: 0.393856
******************** Execution ended in 00h 00m 35.03s ********************
######################################## 48 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.8071	training's macroF1: 0.662279	valid_1's multi_logloss: 1.01148	valid_1's macroF1: 0.404122
Early stopping, best iteration is:
[462]	training's multi_logloss: 0.824204	training's macroF1: 0.656333	valid_1's multi_logloss: 1.01275	valid_1's macroF1: 0.412464
******************** Execution ended in 00h 00m 51.20s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.799244	training's macroF1: 0.674325	valid_1's multi_logloss: 0.971178	valid_1's macroF1: 0.448099
Early stopping, best iteration is:
[283]	training's multi_logloss: 0.918767	training's macroF1: 0.622818	valid_1's multi_logloss: 0.989862	valid_1's macroF1: 0.474071
******************** Execution ended in 00h 00m 41.54s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.802105	training's macroF1: 0.681719	valid_1's multi_logloss: 1.02001	valid_1's macroF1: 0.410017
Early stopping, best iteration is:
[338]	training's multi_logloss: 0.883788	training's macroF1: 0.643205	valid_1's multi_logloss: 1.02797	valid_1's macroF1: 0.429726
******************** Execution ended in 00h 00m 42.93s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.788106	training's macroF1: 0.676333	valid_1's multi_logloss: 1.07599	valid_1's macroF1: 0.367092
Early stopping, best iteration is:
[104]	training's multi_logloss: 1.08921	training's macroF1: 0.559944	valid_1's multi_logloss: 1.13602	valid_1's macroF1: 0.398403
******************** Execution ended in 00h 00m 30.57s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.794707	training's macroF1: 0.668448	valid_1's multi_logloss: 1.04107	valid_1's macroF1: 0.392949
Early stopping, best iteration is:
[173]	training's multi_logloss: 1.00595	training's macroF1: 0.572598	valid_1's multi_logloss: 1.07324	valid_1's macroF1: 0.409823
******************** Execution ended in 00h 00m 34.36s ********************
######################################## 49 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.26168	training's macroF1: 0.487832	valid_1's multi_logloss: 1.22688	valid_1's macroF1: 0.391319
Early stopping, best iteration is:
[13]	training's multi_logloss: 1.38197	training's macroF1: 0.433916	valid_1's multi_logloss: 1.38039	valid_1's macroF1: 0.399763
******************** Execution ended in 00h 00m 26.78s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.25725	training's macroF1: 0.49344	valid_1's multi_logloss: 1.23395	valid_1's macroF1: 0.363941
Early stopping, best iteration is:
[28]	training's multi_logloss: 1.37696	training's macroF1: 0.46196	valid_1's multi_logloss: 1.3748	valid_1's macroF1: 0.382518
******************** Execution ended in 00h 00m 28.31s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.25907	training's macroF1: 0.492664	valid_1's multi_logloss: 1.23247	valid_1's macroF1: 0.384621
Early stopping, best iteration is:
[16]	training's multi_logloss: 1.38094	training's macroF1: 0.462733	valid_1's multi_logloss: 1.37894	valid_1's macroF1: 0.417659
******************** Execution ended in 00h 00m 27.27s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.25378	training's macroF1: 0.503521	valid_1's multi_logloss: 1.23431	valid_1's macroF1: 0.384534
Early stopping, best iteration is:
[33]	training's multi_logloss: 1.37508	training's macroF1: 0.445598	valid_1's multi_logloss: 1.37155	valid_1's macroF1: 0.403611
******************** Execution ended in 00h 00m 28.30s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.25653	training's macroF1: 0.492158	valid_1's multi_logloss: 1.22414	valid_1's macroF1: 0.388131
Early stopping, best iteration is:
[62]	training's multi_logloss: 1.36623	training's macroF1: 0.462166	valid_1's multi_logloss: 1.35954	valid_1's macroF1: 0.410976
******************** Execution ended in 00h 00m 30.75s ********************
######################################## 50 of 50 iterations ########################################
============================== 1 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.20308	training's macroF1: 0.492346	valid_1's multi_logloss: 1.13667	valid_1's macroF1: 0.439857
[1000]	training's multi_logloss: 1.11532	training's macroF1: 0.509183	valid_1's multi_logloss: 1.05185	valid_1's macroF1: 0.436571
Early stopping, best iteration is:
[542]	training's multi_logloss: 1.19321	training's macroF1: 0.489722	valid_1's multi_logloss: 1.1252	valid_1's macroF1: 0.452032
******************** Execution ended in 00h 00m 58.62s ********************
============================== 2 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.20211	training's macroF1: 0.502901	valid_1's multi_logloss: 1.16359	valid_1's macroF1: 0.394206
[1000]	training's multi_logloss: 1.11377	training's macroF1: 0.53092	valid_1's multi_logloss: 1.09168	valid_1's macroF1: 0.40153
[1500]	training's multi_logloss: 1.05951	training's macroF1: 0.5409	valid_1's multi_logloss: 1.06526	valid_1's macroF1: 0.404961
Early stopping, best iteration is:
[1432]	training's multi_logloss: 1.06584	training's macroF1: 0.539975	valid_1's multi_logloss: 1.06719	valid_1's macroF1: 0.411713
******************** Execution ended in 00h 01m 39.81s ********************
============================== 3 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.19885	training's macroF1: 0.489688	valid_1's multi_logloss: 1.14254	valid_1's macroF1: 0.412708
Early stopping, best iteration is:
[136]	training's multi_logloss: 1.31544	training's macroF1: 0.461815	valid_1's multi_logloss: 1.28364	valid_1's macroF1: 0.422691
******************** Execution ended in 00h 00m 32.58s ********************
============================== 4 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.19172	training's macroF1: 0.497777	valid_1's multi_logloss: 1.17207	valid_1's macroF1: 0.382614
Early stopping, best iteration is:
[38]	training's multi_logloss: 1.36292	training's macroF1: 0.454936	valid_1's multi_logloss: 1.3571	valid_1's macroF1: 0.403982
******************** Execution ended in 00h 00m 27.01s ********************
============================== 5 of 5 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.18896	training's macroF1: 0.498736	valid_1's multi_logloss: 1.18409	valid_1's macroF1: 0.353528
Early stopping, best iteration is:
[147]	training's multi_logloss: 1.30661	training's macroF1: 0.47536	valid_1's multi_logloss: 1.29646	valid_1's macroF1: 0.365157
******************** Execution ended in 00h 00m 32.80s ********************
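The macroF1 column in the logs above comes from the custom eval function `evaluate_macroF1_lgb` defined earlier. As noted in the linked LightGBM issue, the sklearn wrapper hands a custom feval the raw multiclass scores as one flat array in class-major order, so labels are recovered by reshaping to `(n_classes, n_rows)` and taking the argmax over the class axis. A minimal sketch of that trick with made-up scores:

```python
import numpy as np

# LightGBM's sklearn API passes multiclass scores to a custom feval as a flat
# array in class-major order: [c0_row0, c0_row1, ..., c1_row0, c1_row1, ...].
# Reshape to (n_classes, n_rows) and argmax over axis 0 to get per-row labels,
# exactly as evaluate_macroF1_lgb does before calling f1_score.
def scores_to_labels(scores, n_classes):
    return scores.reshape(n_classes, -1).argmax(axis=0)

# 3 rows, 4 classes: each group of 3 values is one class's score for rows 0..2
scores = np.array([
    0.1, 0.7, 0.2,   # class 0
    0.6, 0.1, 0.1,   # class 1
    0.2, 0.1, 0.3,   # class 2
    0.1, 0.1, 0.4,   # class 3
])
print(scores_to_labels(scores, 4))  # -> [1 0 3]
```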
In [101]:
shap_sorted_df = total_shap_df.groupby('feature').mean().sort_values('shap_values', ascending=False).reset_index()
feat_imp_sorted_df = total_shap_df.groupby('feature').mean().sort_values('feat_imp', ascending=False).reset_index()
features_top_shap = shap_sorted_df['feature'][:500]          # keep the top 500 features ranked by mean SHAP value
features_top_feat_imp = feat_imp_sorted_df['feature'][:500]  # keep the top 500 features ranked by LightGBM importance

top_features = pd.Series(features_top_shap.tolist() + features_top_feat_imp.tolist())
top_features = top_features.unique()  # drop the overlap between the two lists, keeping each feature once
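As a toy illustration of the union step above (with made-up feature names), `pd.Series.unique` keeps each name once in order of first appearance, so the SHAP-ranked features come first:

```python
import pandas as pd

# Hypothetical top-k lists from the two rankings.
shap_top = ['meaneduc', 'overcrowding', 'age']
imp_top = ['age', 'escolari', 'meaneduc']

# Concatenate and deduplicate, preserving first-seen order.
top = pd.Series(shap_top + imp_top).unique()
print(list(top))  # ['meaneduc', 'overcrowding', 'age', 'escolari']
```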
 

4. Model Development

In [104]:
new_train = train[top_features].copy()
new_test = test[top_features].copy()
In [106]:
print('new train shape:', new_train.shape, 'new test shape:', new_test.shape)
 
new train shape: (2973, 532) new test shape: (23856, 532)
 

4.1 Prediction and Feature Importance with LGB

In [107]:
# Build the list of categorical features that survived the feature selection
new_categorical_feats = [col for col in top_features if col in categorical_feats]
In [110]:
# Define an out-of-fold (OOF) prediction function using LightGBM (Light Gradient Boosting Machine).
# Note: the categorical_feats and N_FOLDs arguments are not used inside the function,
# which reads the globals new_categorical_feats, new_train, new_test, and y instead.
def LGB_OOF(params, categorical_feats, N_FOLDs, SEED=1989):
    clf = lgb.LGBMClassifier(objective='multiclass',
                             random_state=1989,
                             max_depth=params['max_depth'], 
                             learning_rate=params['learning_rate'],  
                             silent=True, 
                             metric='multi_logloss',
                             n_jobs=-1, n_estimators=10000, 
                             class_weight='balanced',
                             colsample_bytree = params['colsample_bytree'], 
                             min_split_gain= params['min_split_gain'], 
                             bagging_freq = params['bagging_freq'],
                             min_child_weight=params['min_child_weight'],
                             num_leaves = params['num_leaves'], 
                             subsample = params['subsample'],
                             reg_alpha= params['reg_alpha'],
                             reg_lambda= params['reg_lambda'],
                             num_class=len(np.unique(y)),
                             bagging_seed=SEED,
                             seed=SEED,
                            )

    kfold = 10  # hard-coded to 10 folds (the N_FOLDs argument is ignored)
    kf = StratifiedKFold(n_splits=kfold, shuffle=True)
    feat_importance_df  = pd.DataFrame()
    predicts_result = []

    for i, (train_index, test_index) in enumerate(kf.split(new_train, y)):
        print('='*30, '{} of {} folds'.format(i+1, kfold), '='*30)
        start = time.time()
        X_train, X_val = new_train.iloc[train_index], new_train.iloc[test_index]
        y_train, y_val = y.iloc[train_index], y.iloc[test_index]
        clf.fit(X_train, y_train, eval_set=[(X_train, y_train), (X_val, y_val)], eval_metric=evaluate_macroF1_lgb,categorical_feature=new_categorical_feats,
                early_stopping_rounds=500, verbose=500)
        shap_values = shap.TreeExplainer(clf.booster_).shap_values(X_train)
        fold_importance_df  = pd.DataFrame()
        fold_importance_df['feature'] = X_train.columns
        fold_importance_df['shap_values'] = abs(np.array(shap_values)[:, :].mean(1).mean(0))
        fold_importance_df['feat_imp'] = clf.feature_importances_
        feat_importance_df = pd.concat([feat_importance_df, fold_importance_df])
        predicts_result.append(clf.predict(new_test))
        print_execution_time(start)
    return predicts_result, feat_importance_df
In [111]:
# Set the hyperparameter values
params = {'max_depth': 6,
         'learning_rate': 0.002,
          'colsample_bytree': 0.8,
          'subsample': 0.8,
          'min_split_gain': 0.02,
          'num_leaves': 48,
          'reg_alpha': 0.04,
          'reg_lambda': 0.073,
          'bagging_freq': 2,
          'min_child_weight': 40
         }

N_Folds = 20
SEED = 1989
predicts_result, feat_importance_df = LGB_OOF(params, new_categorical_feats, N_Folds, SEED=1989)
 
============================== 1 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.16249	training's macroF1: 0.569094	valid_1's multi_logloss: 1.18146	valid_1's macroF1: 0.42358
Early stopping, best iteration is:
[164]	training's multi_logloss: 1.29343	training's macroF1: 0.537105	valid_1's multi_logloss: 1.29374	valid_1's macroF1: 0.439404
******************** Execution ended in 00h 00m 19.45s ********************
============================== 2 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.15839	training's macroF1: 0.575707	valid_1's multi_logloss: 1.1554	valid_1's macroF1: 0.394559
Early stopping, best iteration is:
[148]	training's multi_logloss: 1.29955	training's macroF1: 0.544585	valid_1's multi_logloss: 1.28984	valid_1's macroF1: 0.399304
******************** Execution ended in 00h 00m 18.94s ********************
============================== 3 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.1647	training's macroF1: 0.576162	valid_1's multi_logloss: 1.15358	valid_1's macroF1: 0.426385
Early stopping, best iteration is:
[29]	training's multi_logloss: 1.36818	training's macroF1: 0.524073	valid_1's multi_logloss: 1.36502	valid_1's macroF1: 0.448364
******************** Execution ended in 00h 00m 13.28s ********************
============================== 4 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.15892	training's macroF1: 0.567091	valid_1's multi_logloss: 1.15141	valid_1's macroF1: 0.407524
Early stopping, best iteration is:
[32]	training's multi_logloss: 1.36582	training's macroF1: 0.538317	valid_1's multi_logloss: 1.36181	valid_1's macroF1: 0.416631
******************** Execution ended in 00h 00m 13.53s ********************
============================== 5 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.16022	training's macroF1: 0.569641	valid_1's multi_logloss: 1.17066	valid_1's macroF1: 0.38358
Early stopping, best iteration is:
[4]	training's multi_logloss: 1.38358	training's macroF1: 0.487092	valid_1's multi_logloss: 1.38315	valid_1's macroF1: 0.409667
******************** Execution ended in 00h 00m 12.30s ********************
============================== 6 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.15697	training's macroF1: 0.571246	valid_1's multi_logloss: 1.18832	valid_1's macroF1: 0.384436
Early stopping, best iteration is:
[118]	training's multi_logloss: 1.31517	training's macroF1: 0.542032	valid_1's multi_logloss: 1.31653	valid_1's macroF1: 0.405468
******************** Execution ended in 00h 00m 17.61s ********************
============================== 7 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.16527	training's macroF1: 0.56447	valid_1's multi_logloss: 1.15593	valid_1's macroF1: 0.438815
[1000]	training's multi_logloss: 1.04307	training's macroF1: 0.59581	valid_1's multi_logloss: 1.06816	valid_1's macroF1: 0.467058
[1500]	training's multi_logloss: 0.962391	training's macroF1: 0.620582	valid_1's multi_logloss: 1.03027	valid_1's macroF1: 0.464022
Early stopping, best iteration is:
[1040]	training's multi_logloss: 1.03559	training's macroF1: 0.598135	valid_1's multi_logloss: 1.06401	valid_1's macroF1: 0.471391
******************** Execution ended in 00h 00m 57.97s ********************
============================== 8 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.16038	training's macroF1: 0.563742	valid_1's multi_logloss: 1.16608	valid_1's macroF1: 0.386054
Early stopping, best iteration is:
[4]	training's multi_logloss: 1.38361	training's macroF1: 0.489458	valid_1's multi_logloss: 1.38324	valid_1's macroF1: 0.414727
******************** Execution ended in 00h 00m 12.39s ********************
============================== 9 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.15672	training's macroF1: 0.564723	valid_1's multi_logloss: 1.16849	valid_1's macroF1: 0.409507
Early stopping, best iteration is:
[444]	training's multi_logloss: 1.17521	training's macroF1: 0.561	valid_1's multi_logloss: 1.1833	valid_1's macroF1: 0.418126
******************** Execution ended in 00h 00m 32.01s ********************
============================== 10 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 1.159	training's macroF1: 0.56456	valid_1's multi_logloss: 1.16643	valid_1's macroF1: 0.38565
Early stopping, best iteration is:
[367]	training's multi_logloss: 1.20482	training's macroF1: 0.555876	valid_1's multi_logloss: 1.20274	valid_1's macroF1: 0.396325
******************** Execution ended in 00h 00m 27.79s ********************
In [112]:
# Plot the feature importances
fig, ax = plt.subplots(1, 2, figsize=(20, 20))
feat_importance_df_shap = feat_importance_df.groupby('feature').mean().sort_values('shap_values', ascending=False).reset_index()

num_features = 50
sns.barplot(x=feat_importance_df_shap.shap_values[:num_features], y=feat_importance_df_shap.feature[:num_features], ax=ax[0])
ax[0].set_title('Feature importance based on shap values')

feat_importance_df = feat_importance_df.groupby('feature').mean().sort_values('feat_imp', ascending=False).reset_index()

num_features = 50
sns.barplot(x=feat_importance_df.feat_imp[:num_features], y=feat_importance_df.feature[:num_features], ax=ax[1])  # plot feat_imp (not shap_values) in the LightGBM-importance panel
ax[1].set_title('Feature importance based on feature importance from lgbm')
plt.show()
 
In [113]:
# Save the averaged fold predictions as a CSV file
submission['Target'] = np.array(predicts_result).mean(axis=0).round().astype(int)
submission.to_csv('submission_with_new_feature_set.csv', index=False)
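The submission cell above averages the per-fold predicted class labels and rounds. A quick sketch with made-up fold predictions shows how this differs from a per-sample majority vote, the more common way to ensemble hard labels:

```python
import numpy as np

# Hypothetical label predictions for 3 samples from 3 folds (classes 1-4).
fold_preds = np.array([[1, 4, 2],
                       [1, 3, 2],
                       [4, 4, 2]])

# Mean-and-round, as in the submission cell: the first sample becomes
# class 2 even though no fold ever predicted 2 for it.
mean_round = fold_preds.mean(axis=0).round().astype(int)
print(mean_round)  # [2 4 2]

# A per-sample majority vote keeps the first sample at class 1.
vote = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, fold_preds)
print(vote)  # [1 4 2]
```

Mean-and-round treats the labels as ordinal (reasonable here, since Target encodes ordered poverty levels), while majority vote treats them as purely categorical.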
 

4.2 Randomized Search

In [114]:
optimized_param = None
lowest_cv = 1000
total_iteration = 100
for i in range(total_iteration):
    print('-'*20, 'For {} of {} iterations'.format(i+1, total_iteration), '-'*20)
    n_folds = 3

    num_class = len(np.unique(y))
    
    # Sample each hyperparameter value uniformly at random within a preset range
    params = {}
    params['application'] = 'multiclass'
    params['metric'] = 'multi_logloss'
    params['num_class'] = num_class
    params['class_weight'] = 'balanced'
    params['num_leaves'] = np.random.randint(24, 48)
    params['max_depth'] = np.random.randint(5, 8)
    params['min_child_weight'] = np.random.randint(5, 50)
    params['min_split_gain'] = np.random.rand() * 0.09
    params['colsample_bytree'] = np.random.rand() * (0.9 - 0.1) + 0.1
    params['subsample'] = np.random.rand() * (1 - 0.8) + 0.8
    params['bagging_freq'] = np.random.randint(1, 5)
    params['bagging_seed'] = np.random.randint(1, 5)
    params['reg_alpha'] = np.random.rand() * 2
    params['reg_lambda'] = np.random.rand() * 2
    params['learning_rate'] = np.random.rand() * 0.02
    params['seed'] = 1989

    d_train = lgb.Dataset(data=new_train, label=y.values-1, categorical_feature=new_categorical_feats, free_raw_data=False)
    cv_results = lgb.cv(params=params, train_set=d_train, num_boost_round=10000, categorical_feature=new_categorical_feats,
                        nfold=n_folds, stratified=True, shuffle=True, early_stopping_rounds=1, verbose_eval=1000)

    min_cv_results = min(cv_results['multi_logloss-mean'])

    # Keep the params with the lowest mean CV loss as the optimized params
    if min_cv_results < lowest_cv:
        lowest_cv = min_cv_results
        optimized_param = params
 
-------------------- For 1 of 100 iterations --------------------
-------------------- For 2 of 100 iterations --------------------
-------------------- For 3 of 100 iterations --------------------
-------------------- For 4 of 100 iterations --------------------
-------------------- For 5 of 100 iterations --------------------
-------------------- For 6 of 100 iterations --------------------
-------------------- For 7 of 100 iterations --------------------
-------------------- For 8 of 100 iterations --------------------
-------------------- For 9 of 100 iterations --------------------
[1000]	cv_agg's multi_logloss: 0.850644 + 0.0107643
-------------------- For 10 of 100 iterations --------------------
-------------------- For 11 of 100 iterations --------------------
-------------------- For 12 of 100 iterations --------------------
-------------------- For 13 of 100 iterations --------------------
[1000]	cv_agg's multi_logloss: 0.844449 + 0.0113269
-------------------- For 14 of 100 iterations --------------------
-------------------- For 15 of 100 iterations --------------------
-------------------- For 16 of 100 iterations --------------------
-------------------- For 17 of 100 iterations --------------------
-------------------- For 18 of 100 iterations --------------------
-------------------- For 19 of 100 iterations --------------------
-------------------- For 20 of 100 iterations --------------------
-------------------- For 21 of 100 iterations --------------------
-------------------- For 22 of 100 iterations --------------------
-------------------- For 23 of 100 iterations --------------------
[1000]	cv_agg's multi_logloss: 0.874355 + 0.00846433
[2000]	cv_agg's multi_logloss: 0.838224 + 0.0124167
-------------------- For 24 of 100 iterations --------------------
-------------------- For 25 of 100 iterations --------------------
[1000]	cv_agg's multi_logloss: 0.840324 + 0.0121589
-------------------- For 26 of 100 iterations --------------------
-------------------- For 27 of 100 iterations --------------------
-------------------- For 28 of 100 iterations --------------------
[1000]	cv_agg's multi_logloss: 0.96267 + 0.0018782
[2000]	cv_agg's multi_logloss: 0.930341 + 0.00400037
[3000]	cv_agg's multi_logloss: 0.906175 + 0.00582314
[4000]	cv_agg's multi_logloss: 0.88784 + 0.00736017
[5000]	cv_agg's multi_logloss: 0.873672 + 0.0085177
[6000]	cv_agg's multi_logloss: 0.862664 + 0.00952161
[7000]	cv_agg's multi_logloss: 0.853967 + 0.0103318
[8000]	cv_agg's multi_logloss: 0.847137 + 0.0110746
[9000]	cv_agg's multi_logloss: 0.841789 + 0.0118185
[10000]	cv_agg's multi_logloss: 0.837651 + 0.0124768
-------------------- For 29 of 100 iterations --------------------
-------------------- For 30 of 100 iterations --------------------
-------------------- For 31 of 100 iterations --------------------
-------------------- For 32 of 100 iterations --------------------
-------------------- For 33 of 100 iterations --------------------
[1000]	cv_agg's multi_logloss: 0.845318 + 0.0122234
-------------------- For 34 of 100 iterations --------------------
-------------------- For 35 of 100 iterations --------------------
-------------------- For 36 of 100 iterations --------------------
-------------------- For 37 of 100 iterations --------------------
-------------------- For 38 of 100 iterations --------------------
-------------------- For 39 of 100 iterations --------------------
[1000]	cv_agg's multi_logloss: 0.844441 + 0.0112841
-------------------- For 40 of 100 iterations --------------------
[1000]	cv_agg's multi_logloss: 0.840559 + 0.0119178
-------------------- For 41 of 100 iterations --------------------
-------------------- For 42 of 100 iterations --------------------
-------------------- For 43 of 100 iterations --------------------
-------------------- For 44 of 100 iterations --------------------
-------------------- For 45 of 100 iterations --------------------
-------------------- For 46 of 100 iterations --------------------
-------------------- For 47 of 100 iterations --------------------
-------------------- For 48 of 100 iterations --------------------
-------------------- For 49 of 100 iterations --------------------
-------------------- For 50 of 100 iterations --------------------
-------------------- For 51 of 100 iterations --------------------
-------------------- For 52 of 100 iterations --------------------
-------------------- For 53 of 100 iterations --------------------
-------------------- For 54 of 100 iterations --------------------
-------------------- For 55 of 100 iterations --------------------
[1000]	cv_agg's multi_logloss: 0.933994 + 0.00295004
[2000]	cv_agg's multi_logloss: 0.891992 + 0.00616865
[3000]	cv_agg's multi_logloss: 0.866375 + 0.0085591
[4000]	cv_agg's multi_logloss: 0.850178 + 0.0105231
[5000]	cv_agg's multi_logloss: 0.840013 + 0.0122332
[6000]	cv_agg's multi_logloss: 0.83367 + 0.0137947
-------------------- For 56 of 100 iterations --------------------
-------------------- For 57 of 100 iterations --------------------
-------------------- For 58 of 100 iterations --------------------
-------------------- For 59 of 100 iterations --------------------
-------------------- For 60 of 100 iterations --------------------
-------------------- For 61 of 100 iterations --------------------
[1000]	cv_agg's multi_logloss: 0.959375 + 0.00139668
[2000]	cv_agg's multi_logloss: 0.925476 + 0.00324838
[3000]	cv_agg's multi_logloss: 0.90067 + 0.00512656
[4000]	cv_agg's multi_logloss: 0.882047 + 0.00678869
[5000]	cv_agg's multi_logloss: 0.867985 + 0.00814784
[6000]	cv_agg's multi_logloss: 0.857164 + 0.0093546
[7000]	cv_agg's multi_logloss: 0.848876 + 0.0103916
[8000]	cv_agg's multi_logloss: 0.842528 + 0.0113579
[9000]	cv_agg's multi_logloss: 0.837696 + 0.0122748
[10000]	cv_agg's multi_logloss: 0.834037 + 0.0132079
-------------------- For 62 of 100 iterations --------------------
-------------------- For 63 of 100 iterations --------------------
-------------------- For 64 of 100 iterations --------------------
-------------------- For 65 of 100 iterations --------------------
-------------------- For 66 of 100 iterations --------------------
-------------------- For 67 of 100 iterations --------------------
-------------------- For 68 of 100 iterations --------------------
-------------------- For 69 of 100 iterations --------------------
-------------------- For 70 of 100 iterations --------------------
-------------------- For 71 of 100 iterations --------------------
[1000]	cv_agg's multi_logloss: 0.829763 + 0.0154857
-------------------- For 72 of 100 iterations --------------------
-------------------- For 73 of 100 iterations --------------------
-------------------- For 74 of 100 iterations --------------------
-------------------- For 75 of 100 iterations --------------------
-------------------- For 76 of 100 iterations --------------------
[1000]	cv_agg's multi_logloss: 0.880608 + 0.00730383
[2000]	cv_agg's multi_logloss: 0.843414 + 0.0122766
-------------------- For 77 of 100 iterations --------------------
[1000]	cv_agg's multi_logloss: 0.953842 + 0.00181947
[2000]	cv_agg's multi_logloss: 0.917862 + 0.00409414
[3000]	cv_agg's multi_logloss: 0.892608 + 0.0061033
[4000]	cv_agg's multi_logloss: 0.874444 + 0.00772182
[5000]	cv_agg's multi_logloss: 0.861021 + 0.00908268
[6000]	cv_agg's multi_logloss: 0.851073 + 0.0102535
[7000]	cv_agg's multi_logloss: 0.843687 + 0.011348
[8000]	cv_agg's multi_logloss: 0.838232 + 0.0123942
[9000]	cv_agg's multi_logloss: 0.83417 + 0.0133474
[10000]	cv_agg's multi_logloss: 0.831216 + 0.0142597
-------------------- For 78 of 100 iterations --------------------
[1000]	cv_agg's multi_logloss: 0.835838 + 0.013082
-------------------- For 79 of 100 iterations --------------------
-------------------- For 80 of 100 iterations --------------------
-------------------- For 81 of 100 iterations --------------------
-------------------- For 82 of 100 iterations --------------------
[1000]	cv_agg's multi_logloss: 0.837136 + 0.0125166
-------------------- For 83 of 100 iterations --------------------
[1000]	cv_agg's multi_logloss: 0.882841 + 0.00739006
[2000]	cv_agg's multi_logloss: 0.843938 + 0.0117782
[3000]	cv_agg's multi_logloss: 0.830597 + 0.0151525
-------------------- For 84 of 100 iterations --------------------
-------------------- For 85 of 100 iterations --------------------
-------------------- For 86 of 100 iterations --------------------
-------------------- For 87 of 100 iterations --------------------
-------------------- For 88 of 100 iterations --------------------
-------------------- For 89 of 100 iterations --------------------
-------------------- For 90 of 100 iterations --------------------
-------------------- For 91 of 100 iterations --------------------
-------------------- For 92 of 100 iterations --------------------
-------------------- For 93 of 100 iterations --------------------
-------------------- For 94 of 100 iterations --------------------
-------------------- For 95 of 100 iterations --------------------
-------------------- For 96 of 100 iterations --------------------
-------------------- For 97 of 100 iterations --------------------
[1000]	cv_agg's multi_logloss: 0.833212 + 0.0148136
-------------------- For 98 of 100 iterations --------------------
-------------------- For 99 of 100 iterations --------------------
-------------------- For 100 of 100 iterations --------------------
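`lgb.cv` returns a dict of per-round aggregated metrics, and the loop above keeps the parameter set whose minimum mean `multi_logloss` is lowest. A minimal sketch with a hypothetical, shortened result dict (the values are made up):

```python
import numpy as np

# Hypothetical cv result: one mean/stdv entry per boosting round.
cv_results = {'multi_logloss-mean': [1.21, 0.98, 0.86, 0.84, 0.85],
              'multi_logloss-stdv': [0.01, 0.01, 0.01, 0.01, 0.02]}

min_cv = min(cv_results['multi_logloss-mean'])                     # lowest mean CV loss
best_round = int(np.argmin(cv_results['multi_logloss-mean'])) + 1  # the round it occurred at
print(min_cv, best_round)  # 0.84 4
```

The best round could also be fed back as `num_boost_round` when retraining on the full data, instead of relying on early stopping again.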
In [115]:
N_Folds = 20
SEED = 1989
# Re-run the LGB OOF function with the best parameters found by the randomized search
predicts_result, feat_importance_df = LGB_OOF(optimized_param, new_categorical_feats, N_Folds, SEED=1989)
 
============================== 1 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.643909	training's macroF1: 0.767897	valid_1's multi_logloss: 1.01731	valid_1's macroF1: 0.419311
Early stopping, best iteration is:
[455]	training's multi_logloss: 0.66926	training's macroF1: 0.756981	valid_1's multi_logloss: 1.02123	valid_1's macroF1: 0.424335
******************** Execution ended in 00h 00m 31.75s ********************
============================== 2 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.642988	training's macroF1: 0.772115	valid_1's multi_logloss: 0.973459	valid_1's macroF1: 0.441129
Early stopping, best iteration is:
[246]	training's multi_logloss: 0.836124	training's macroF1: 0.70146	valid_1's multi_logloss: 1.00646	valid_1's macroF1: 0.460188
******************** Execution ended in 00h 00m 22.09s ********************
============================== 3 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.646518	training's macroF1: 0.764841	valid_1's multi_logloss: 0.90879	valid_1's macroF1: 0.425241
Early stopping, best iteration is:
[301]	training's multi_logloss: 0.785451	training's macroF1: 0.71514	valid_1's multi_logloss: 0.941626	valid_1's macroF1: 0.43586
******************** Execution ended in 00h 00m 24.43s ********************
============================== 4 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.642282	training's macroF1: 0.776681	valid_1's multi_logloss: 1.00217	valid_1's macroF1: 0.454198
[1000]	training's multi_logloss: 0.448314	training's macroF1: 0.847268	valid_1's multi_logloss: 0.976682	valid_1's macroF1: 0.466582
Early stopping, best iteration is:
[722]	training's multi_logloss: 0.536986	training's macroF1: 0.814864	valid_1's multi_logloss: 0.985533	valid_1's macroF1: 0.475647
******************** Execution ended in 00h 00m 42.88s ********************
============================== 5 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.642159	training's macroF1: 0.772177	valid_1's multi_logloss: 0.95837	valid_1's macroF1: 0.421447
Early stopping, best iteration is:
[30]	training's multi_logloss: 1.24539	training's macroF1: 0.586624	valid_1's multi_logloss: 1.24092	valid_1's macroF1: 0.438978
******************** Execution ended in 00h 00m 11.48s ********************
============================== 6 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.642027	training's macroF1: 0.767923	valid_1's multi_logloss: 1.06385	valid_1's macroF1: 0.415905
Early stopping, best iteration is:
[355]	training's multi_logloss: 0.734352	training's macroF1: 0.730868	valid_1's multi_logloss: 1.06958	valid_1's macroF1: 0.419991
******************** Execution ended in 00h 00m 27.10s ********************
============================== 7 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.65071	training's macroF1: 0.769133	valid_1's multi_logloss: 0.944704	valid_1's macroF1: 0.407626
[1000]	training's multi_logloss: 0.454591	training's macroF1: 0.841963	valid_1's multi_logloss: 0.921269	valid_1's macroF1: 0.411259
Early stopping, best iteration is:
[820]	training's multi_logloss: 0.509099	training's macroF1: 0.827846	valid_1's multi_logloss: 0.925363	valid_1's macroF1: 0.427033
******************** Execution ended in 00h 00m 47.39s ********************
============================== 8 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.641123	training's macroF1: 0.773204	valid_1's multi_logloss: 0.970383	valid_1's macroF1: 0.396786
[1000]	training's multi_logloss: 0.449023	training's macroF1: 0.853571	valid_1's multi_logloss: 0.94899	valid_1's macroF1: 0.398541
Early stopping, best iteration is:
[968]	training's multi_logloss: 0.457699	training's macroF1: 0.847806	valid_1's multi_logloss: 0.949025	valid_1's macroF1: 0.414443
******************** Execution ended in 00h 00m 51.53s ********************
============================== 9 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.63771	training's macroF1: 0.773218	valid_1's multi_logloss: 0.987514	valid_1's macroF1: 0.439933
Early stopping, best iteration is:
[115]	training's multi_logloss: 1.01131	training's macroF1: 0.6531	valid_1's multi_logloss: 1.08709	valid_1's macroF1: 0.469625
******************** Execution ended in 00h 00m 15.76s ********************
============================== 10 of 10 folds ==============================
Training until validation scores don't improve for 500 rounds.
[500]	training's multi_logloss: 0.640941	training's macroF1: 0.777775	valid_1's multi_logloss: 1.03424	valid_1's macroF1: 0.404788
[1000]	training's multi_logloss: 0.445307	training's macroF1: 0.848409	valid_1's multi_logloss: 1.0132	valid_1's macroF1: 0.420932
[1500]	training's multi_logloss: 0.348561	training's macroF1: 0.882879	valid_1's multi_logloss: 1.01566	valid_1's macroF1: 0.411427
Early stopping, best iteration is:
[1068]	training's multi_logloss: 0.4287	training's macroF1: 0.853714	valid_1's multi_logloss: 1.01169	valid_1's macroF1: 0.42216
******************** Execution ended in 00h 00m 56.61s ********************
In [ ]:
submission['Target'] = np.array(predicts_result).mean(axis=0).round().astype(int)
submission.to_csv('submission_shap_randomized_search.csv', index = False)

 


View the source code on GitHub