2023-11-12 20:52:32,863 INFO [train.py:699] (0/4) Training started
2023-11-12 20:52:32,875 INFO [train.py:709] (0/4) Device: cuda:0
2023-11-12 20:52:34,062 INFO [train.py:716] (0/4) {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': -1, 'log_interval': 50, 'valid_interval': 200, 'env_info': {'k2-version': '1.24.4', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '44a9d5682af9fd3ef77074777e15278ec6d390eb', 'k2-git-date': 'Wed Sep 27 11:22:55 2023', 'lhotse-version': '0.0.0+unknown.version', 'torch-version': '1.10.0+cu102', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'vits', 'icefall-git-sha1': 'f55e80a7-dirty', 'icefall-git-date': 'Mon Nov 6 15:05:49 2023', 'icefall-path': '/star-zw/workspace/tts/icefall_tts', 'k2-path': '/star-zw/workspace/k2/k2/k2/python/k2/__init__.py', 'lhotse-path': '/star-zw/workspace/lhotse/lhotse_dev/lhotse/__init__.py', 'hostname': 'de-74279-k2-train-6-0423201309-7c68fd68fb-qfn6b', 'IP address': '10.177.58.19'}, 'sampling_rate': 22050, 'frame_shift': 256, 'frame_length': 1024, 'feature_dim': 513, 'n_mels': 80, 'lambda_adv': 1.0, 'lambda_mel': 45.0, 'lambda_feat_match': 2.0, 'lambda_dur': 1.0, 'lambda_kl': 1.0, 'world_size': 4, 'master_port': 12354, 'tensorboard': True, 'num_epochs': 1000, 'start_epoch': 1, 'exp_dir': PosixPath('vits/exp-g2p-conformer-text-encoder-new'), 'tokens': 'data/tokens.txt', 'lr': 0.0002, 'seed': 42, 'print_diagnostics': False, 'inf_check': False, 'save_every_n': 20, 'use_fp16': True, 'manifest_dir': PosixPath('data/spectrogram'), 'max_duration': 500, 'bucketing_sampler': True, 'num_buckets': 30, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': False, 'num_workers': 2, 'input_strategy': 'PrecomputedFeatures', 'blank_id': 0, 'oov_id': 2, 'vocab_size': 79}
2023-11-12 20:52:34,062 INFO [train.py:718] (0/4) About to create model
2023-11-12 20:52:37,030 INFO [train.py:724] (0/4) Number of parameters in generator: 35621746
2023-11-12 20:52:37,031 INFO [train.py:726] (0/4) Number of parameters in discriminator: 50974956
2023-11-12 20:52:37,032 INFO [train.py:727] (0/4) Total number of parameters: 86596702
2023-11-12 20:52:43,991 INFO [train.py:734] (0/4) Using DDP
2023-11-12 20:52:44,568 INFO [tts_datamodule.py:314] (0/4) About to get train cuts
2023-11-12 20:52:44,858 INFO [tts_datamodule.py:169] (0/4) About to create train dataset
2023-11-12 20:52:44,858 INFO [tts_datamodule.py:195] (0/4) Using DynamicBucketingSampler.
2023-11-12 20:52:46,863 INFO [tts_datamodule.py:210] (0/4) About to create train dataloader
2023-11-12 20:52:46,866 INFO [tts_datamodule.py:321] (0/4) About to get validation cuts
2023-11-12 20:52:46,868 INFO [tts_datamodule.py:233] (0/4) About to create dev dataset
2023-11-12 20:52:46,880 INFO [tts_datamodule.py:262] (0/4) About to create valid dataloader
2023-11-12 20:52:46,881 INFO [train.py:628] (0/4) Sanity check -- see if any of the batches in epoch 1 would cause OOM.
2023-11-12 20:52:56,710 INFO [train.py:674] (0/4) Maximum memory allocated so far is 14361MB
2023-11-12 20:53:00,876 INFO [train.py:674] (0/4) Maximum memory allocated so far is 14708MB
2023-11-12 20:53:05,535 INFO [train.py:674] (0/4) Maximum memory allocated so far is 16163MB
2023-11-12 20:53:10,191 INFO [train.py:674] (0/4) Maximum memory allocated so far is 16163MB
2023-11-12 20:53:18,696 INFO [train.py:674] (0/4) Maximum memory allocated so far is 26296MB
2023-11-12 20:53:27,187 INFO [train.py:674] (0/4) Maximum memory allocated so far is 26299MB
2023-11-12 20:53:27,213 INFO [train.py:811] (0/4) Start epoch 1
2023-11-12 20:53:41,210 INFO [train.py:467] (0/4) Epoch 1, batch 0, global_batch_idx: 0, batch size: 51, loss[discriminator_loss=5.934, discriminator_real_loss=5.934, discriminator_fake_loss=0.001226, generator_loss=930.5, generator_mel_loss=73.51, generator_kl_loss=850, generator_dur_loss=2.055, generator_adv_loss=4.762, generator_feat_match_loss=0.2446, over 51.00 samples.], tot_loss[discriminator_loss=5.934, discriminator_real_loss=5.934, discriminator_fake_loss=0.001226, generator_loss=930.5, generator_mel_loss=73.51, generator_kl_loss=850, generator_dur_loss=2.055, generator_adv_loss=4.762, generator_feat_match_loss=0.2446, over 51.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 2.0
2023-11-12 20:53:43,655 INFO [train.py:508] (0/4) Computing validation loss
2023-11-12 20:53:52,377 INFO [train.py:517] (0/4) Epoch 1, validation: discriminator_loss=4.839, discriminator_real_loss=4.761, discriminator_fake_loss=0.0778, generator_loss=489.8, generator_mel_loss=72.73, generator_kl_loss=409.9, generator_dur_loss=2.185, generator_adv_loss=4.762, generator_feat_match_loss=0.2584, over 100.00 samples.
2023-11-12 20:53:52,378 INFO [train.py:518] (0/4) Maximum memory allocated so far is 26299MB
2023-11-12 20:57:06,409 INFO [train.py:811] (0/4) Start epoch 2
2023-11-12 20:58:36,595 INFO [train.py:467] (0/4) Epoch 2, batch 13, global_batch_idx: 50, batch size: 52, loss[discriminator_loss=3.416, discriminator_real_loss=2.496, discriminator_fake_loss=0.9199, generator_loss=61.74, generator_mel_loss=42.48, generator_kl_loss=12.86, generator_dur_loss=1.846, generator_adv_loss=2.266, generator_feat_match_loss=2.285, over 52.00 samples.], tot_loss[discriminator_loss=2.642, discriminator_real_loss=1.617, discriminator_fake_loss=1.025, generator_loss=66.25, generator_mel_loss=43.47, generator_kl_loss=15.36, generator_dur_loss=1.809, generator_adv_loss=2.45, generator_feat_match_loss=3.156, over 925.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 2.0
2023-11-12 21:00:35,274 INFO [train.py:811] (0/4) Start epoch 3
2023-11-12 21:02:59,029 INFO [train.py:467] (0/4) Epoch 3, batch 26, global_batch_idx: 100, batch size: 53, loss[discriminator_loss=3.324, discriminator_real_loss=1.785, discriminator_fake_loss=1.539, generator_loss=52.03, generator_mel_loss=39.44, generator_kl_loss=6.416, generator_dur_loss=1.878, generator_adv_loss=2.055, generator_feat_match_loss=2.238, over 53.00 samples.], tot_loss[discriminator_loss=2.593, discriminator_real_loss=1.466, discriminator_fake_loss=1.126, generator_loss=55.01, generator_mel_loss=39.73, generator_kl_loss=7.472, generator_dur_loss=1.865, generator_adv_loss=2.478, generator_feat_match_loss=3.461, over 1735.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 4.0
2023-11-12 21:04:04,553 INFO [train.py:811] (0/4) Start epoch 4
2023-11-12 21:07:36,812 INFO [train.py:811] (0/4) Start epoch 5
2023-11-12 21:08:06,212 INFO [train.py:467] (0/4) Epoch 5, batch 2, global_batch_idx: 150, batch size: 110, loss[discriminator_loss=2.736, discriminator_real_loss=1.488, discriminator_fake_loss=1.248, generator_loss=47.92, generator_mel_loss=37.7, generator_kl_loss=4.514, generator_dur_loss=1.865, generator_adv_loss=1.676, generator_feat_match_loss=2.162, over 110.00 samples.], tot_loss[discriminator_loss=2.827, discriminator_real_loss=1.701, discriminator_fake_loss=1.126, generator_loss=48.31, generator_mel_loss=38.02, generator_kl_loss=4.402, generator_dur_loss=1.875, generator_adv_loss=1.869, generator_feat_match_loss=2.137, over 260.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 4.0
2023-11-12 21:11:08,800 INFO [train.py:811] (0/4) Start epoch 6
2023-11-12 21:12:49,842 INFO [train.py:467] (0/4) Epoch 6, batch 15, global_batch_idx: 200, batch size: 71, loss[discriminator_loss=2.619, discriminator_real_loss=1.406, discriminator_fake_loss=1.213, generator_loss=46.18, generator_mel_loss=37.01, generator_kl_loss=3.365, generator_dur_loss=1.907, generator_adv_loss=1.764, generator_feat_match_loss=2.139, over 71.00 samples.], tot_loss[discriminator_loss=2.645, discriminator_real_loss=1.407, discriminator_fake_loss=1.238, generator_loss=45.91, generator_mel_loss=36.59, generator_kl_loss=3.449, generator_dur_loss=1.891, generator_adv_loss=1.945, generator_feat_match_loss=2.03, over 1249.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 8.0
2023-11-12 21:12:50,294 INFO [train.py:508] (0/4) Computing validation loss
2023-11-12 21:13:01,088 INFO [train.py:517] (0/4) Epoch 6, validation: discriminator_loss=2.551, discriminator_real_loss=1.182, discriminator_fake_loss=1.369, generator_loss=46.58, generator_mel_loss=36.93, generator_kl_loss=3.442, generator_dur_loss=1.942, generator_adv_loss=1.789, generator_feat_match_loss=2.469, over 100.00 samples.
2023-11-12 21:13:01,089 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27004MB
2023-11-12 21:14:53,755 INFO [train.py:811] (0/4) Start epoch 7
2023-11-12 21:17:39,664 INFO [train.py:467] (0/4) Epoch 7, batch 28, global_batch_idx: 250, batch size: 58, loss[discriminator_loss=2.66, discriminator_real_loss=1.202, discriminator_fake_loss=1.458, generator_loss=44.01, generator_mel_loss=35.31, generator_kl_loss=2.662, generator_dur_loss=1.89, generator_adv_loss=2.062, generator_feat_match_loss=2.086, over 58.00 samples.], tot_loss[discriminator_loss=2.635, discriminator_real_loss=1.396, discriminator_fake_loss=1.24, generator_loss=45.68, generator_mel_loss=36.46, generator_kl_loss=2.93, generator_dur_loss=1.901, generator_adv_loss=2.081, generator_feat_match_loss=2.311, over 2075.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 8.0
2023-11-12 21:18:23,440 INFO [train.py:811] (0/4) Start epoch 8
2023-11-12 21:21:51,483 INFO [train.py:811] (0/4) Start epoch 9
2023-11-12 21:22:30,273 INFO [train.py:467] (0/4) Epoch 9, batch 4, global_batch_idx: 300, batch size: 65, loss[discriminator_loss=2.516, discriminator_real_loss=1.668, discriminator_fake_loss=0.8477, generator_loss=43.77, generator_mel_loss=34.82, generator_kl_loss=2.511, generator_dur_loss=1.919, generator_adv_loss=1.848, generator_feat_match_loss=2.674, over 65.00 samples.], tot_loss[discriminator_loss=2.538, discriminator_real_loss=1.368, discriminator_fake_loss=1.17, generator_loss=43.64, generator_mel_loss=34.77, generator_kl_loss=2.539, generator_dur_loss=1.906, generator_adv_loss=2.092, generator_feat_match_loss=2.334, over 411.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 8.0
2023-11-12 21:25:27,645 INFO [train.py:811] (0/4) Start epoch 10
2023-11-12 21:27:16,839 INFO [train.py:467] (0/4) Epoch 10, batch 17, global_batch_idx: 350, batch size: 153, loss[discriminator_loss=2.312, discriminator_real_loss=1.099, discriminator_fake_loss=1.215, generator_loss=45.14, generator_mel_loss=35.29, generator_kl_loss=2.431, generator_dur_loss=1.877, generator_adv_loss=2.42, generator_feat_match_loss=3.125, over 153.00 samples.], tot_loss[discriminator_loss=2.474, discriminator_real_loss=1.324, discriminator_fake_loss=1.15, generator_loss=44.04, generator_mel_loss=34.73, generator_kl_loss=2.368, generator_dur_loss=1.9, generator_adv_loss=2.181, generator_feat_match_loss=2.862, over 1488.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 8.0
2023-11-12 21:28:59,461 INFO [train.py:811] (0/4) Start epoch 11
2023-11-12 21:31:55,300 INFO [train.py:467] (0/4) Epoch 11, batch 30, global_batch_idx: 400, batch size: 59, loss[discriminator_loss=2.652, discriminator_real_loss=1.557, discriminator_fake_loss=1.095, generator_loss=42.53, generator_mel_loss=34.72, generator_kl_loss=2.147, generator_dur_loss=1.918, generator_adv_loss=1.954, generator_feat_match_loss=1.783, over 59.00 samples.], tot_loss[discriminator_loss=2.725, discriminator_real_loss=1.461, discriminator_fake_loss=1.264, generator_loss=42.41, generator_mel_loss=34.2, generator_kl_loss=2.199, generator_dur_loss=1.912, generator_adv_loss=1.957, generator_feat_match_loss=2.147, over 2077.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 16.0
2023-11-12 21:31:55,868 INFO [train.py:508] (0/4) Computing validation loss
2023-11-12 21:32:06,359 INFO [train.py:517] (0/4) Epoch 11, validation: discriminator_loss=2.597, discriminator_real_loss=1.44, discriminator_fake_loss=1.157, generator_loss=43.15, generator_mel_loss=34.91, generator_kl_loss=2.243, generator_dur_loss=1.931, generator_adv_loss=1.948, generator_feat_match_loss=2.121, over 100.00 samples.
2023-11-12 21:32:06,360 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27006MB
2023-11-12 21:32:43,654 INFO [train.py:811] (0/4) Start epoch 12
2023-11-12 21:36:16,235 INFO [train.py:811] (0/4) Start epoch 13
2023-11-12 21:37:05,186 INFO [train.py:467] (0/4) Epoch 13, batch 6, global_batch_idx: 450, batch size: 65, loss[discriminator_loss=2.699, discriminator_real_loss=1.556, discriminator_fake_loss=1.145, generator_loss=41.13, generator_mel_loss=33.47, generator_kl_loss=2.079, generator_dur_loss=1.931, generator_adv_loss=1.923, generator_feat_match_loss=1.723, over 65.00 samples.], tot_loss[discriminator_loss=2.689, discriminator_real_loss=1.409, discriminator_fake_loss=1.28, generator_loss=42.08, generator_mel_loss=33.93, generator_kl_loss=2.116, generator_dur_loss=1.904, generator_adv_loss=2.054, generator_feat_match_loss=2.076, over 601.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 16.0
2023-11-12 21:39:51,349 INFO [train.py:811] (0/4) Start epoch 14
2023-11-12 21:41:39,936 INFO [train.py:467] (0/4) Epoch 14, batch 19, global_batch_idx: 500, batch size: 52, loss[discriminator_loss=2.723, discriminator_real_loss=1.336, discriminator_fake_loss=1.387, generator_loss=39.38, generator_mel_loss=32.02, generator_kl_loss=1.979, generator_dur_loss=1.922, generator_adv_loss=1.74, generator_feat_match_loss=1.721, over 52.00 samples.], tot_loss[discriminator_loss=2.795, discriminator_real_loss=1.473, discriminator_fake_loss=1.322, generator_loss=40.28, generator_mel_loss=32.87, generator_kl_loss=2.028, generator_dur_loss=1.917, generator_adv_loss=1.827, generator_feat_match_loss=1.633, over 1274.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 16.0
2023-11-12 21:43:16,327 INFO [train.py:811] (0/4) Start epoch 15
2023-11-12 21:46:18,656 INFO [train.py:467] (0/4) Epoch 15, batch 32, global_batch_idx: 550, batch size: 67, loss[discriminator_loss=2.781, discriminator_real_loss=1.51, discriminator_fake_loss=1.271, generator_loss=40.25, generator_mel_loss=32.47, generator_kl_loss=2.033, generator_dur_loss=1.924, generator_adv_loss=1.881, generator_feat_match_loss=1.949, over 67.00 samples.], tot_loss[discriminator_loss=2.712, discriminator_real_loss=1.411, discriminator_fake_loss=1.3, generator_loss=40.28, generator_mel_loss=32.48, generator_kl_loss=2.063, generator_dur_loss=1.914, generator_adv_loss=1.931, generator_feat_match_loss=1.897, over 2379.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 16.0
2023-11-12 21:46:42,757 INFO [train.py:811] (0/4) Start epoch 16
2023-11-12 21:50:15,626 INFO [train.py:811] (0/4) Start epoch 17
2023-11-12 21:51:20,388 INFO [train.py:467] (0/4) Epoch 17, batch 8, global_batch_idx: 600, batch size: 50, loss[discriminator_loss=2.814, discriminator_real_loss=1.755, discriminator_fake_loss=1.06, generator_loss=39.37, generator_mel_loss=31.88, generator_kl_loss=2.113, generator_dur_loss=1.941, generator_adv_loss=1.596, generator_feat_match_loss=1.835, over 50.00 samples.], tot_loss[discriminator_loss=2.644, discriminator_real_loss=1.362, discriminator_fake_loss=1.283, generator_loss=40.12, generator_mel_loss=32.03, generator_kl_loss=2.157, generator_dur_loss=1.91, generator_adv_loss=1.921, generator_feat_match_loss=2.098, over 700.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 16.0
2023-11-12 21:51:21,009 INFO [train.py:508] (0/4) Computing validation loss
2023-11-12 21:51:31,350 INFO [train.py:517] (0/4) Epoch 17, validation: discriminator_loss=2.587, discriminator_real_loss=1.127, discriminator_fake_loss=1.46, generator_loss=40.38, generator_mel_loss=32.44, generator_kl_loss=2.307, generator_dur_loss=1.91, generator_adv_loss=1.636, generator_feat_match_loss=2.091, over 100.00 samples.
2023-11-12 21:51:31,351 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27009MB
2023-11-12 21:54:04,377 INFO [train.py:811] (0/4) Start epoch 18
2023-11-12 21:56:14,742 INFO [train.py:467] (0/4) Epoch 18, batch 21, global_batch_idx: 650, batch size: 81, loss[discriminator_loss=2.582, discriminator_real_loss=1.141, discriminator_fake_loss=1.441, generator_loss=39.65, generator_mel_loss=31.13, generator_kl_loss=2.134, generator_dur_loss=1.912, generator_adv_loss=2.092, generator_feat_match_loss=2.383, over 81.00 samples.], tot_loss[discriminator_loss=2.675, discriminator_real_loss=1.388, discriminator_fake_loss=1.286, generator_loss=39.95, generator_mel_loss=31.71, generator_kl_loss=2.15, generator_dur_loss=1.924, generator_adv_loss=1.97, generator_feat_match_loss=2.189, over 1412.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 16.0
2023-11-12 21:57:41,250 INFO [train.py:811] (0/4) Start epoch 19
2023-11-12 22:00:56,410 INFO [train.py:467] (0/4) Epoch 19, batch 34, global_batch_idx: 700, batch size: 50, loss[discriminator_loss=2.672, discriminator_real_loss=1.35, discriminator_fake_loss=1.322, generator_loss=40.65, generator_mel_loss=32.13, generator_kl_loss=2.04, generator_dur_loss=1.924, generator_adv_loss=2.219, generator_feat_match_loss=2.328, over 50.00 samples.], tot_loss[discriminator_loss=2.592, discriminator_real_loss=1.36, discriminator_fake_loss=1.233, generator_loss=39.99, generator_mel_loss=31.45, generator_kl_loss=2.069, generator_dur_loss=1.912, generator_adv_loss=2.06, generator_feat_match_loss=2.497, over 2541.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 16.0
2023-11-12 22:01:12,340 INFO [train.py:811] (0/4) Start epoch 20
2023-11-12 22:04:47,326 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-20.pt
2023-11-12 22:04:53,058 INFO [train.py:811] (0/4) Start epoch 21
2023-11-12 22:06:00,298 INFO [train.py:467] (0/4) Epoch 21, batch 10, global_batch_idx: 750, batch size: 56, loss[discriminator_loss=2.613, discriminator_real_loss=1.353, discriminator_fake_loss=1.261, generator_loss=39.07, generator_mel_loss=30.69, generator_kl_loss=2.052, generator_dur_loss=1.932, generator_adv_loss=2.113, generator_feat_match_loss=2.279, over 56.00 samples.], tot_loss[discriminator_loss=2.651, discriminator_real_loss=1.426, discriminator_fake_loss=1.225, generator_loss=39.23, generator_mel_loss=30.95, generator_kl_loss=2.035, generator_dur_loss=1.925, generator_adv_loss=2.024, generator_feat_match_loss=2.295, over 721.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 16.0
2023-11-12 22:08:26,226 INFO [train.py:811] (0/4) Start epoch 22
2023-11-12 22:10:46,134 INFO [train.py:467] (0/4) Epoch 22, batch 23, global_batch_idx: 800, batch size: 59, loss[discriminator_loss=2.574, discriminator_real_loss=1.209, discriminator_fake_loss=1.365, generator_loss=38.48, generator_mel_loss=30.59, generator_kl_loss=2.043, generator_dur_loss=1.91, generator_adv_loss=1.557, generator_feat_match_loss=2.375, over 59.00 samples.], tot_loss[discriminator_loss=2.616, discriminator_real_loss=1.384, discriminator_fake_loss=1.232, generator_loss=39.28, generator_mel_loss=30.97, generator_kl_loss=2.061, generator_dur_loss=1.91, generator_adv_loss=1.982, generator_feat_match_loss=2.362, over 1883.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 22:10:46,931 INFO [train.py:508] (0/4) Computing validation loss
2023-11-12 22:10:56,991 INFO [train.py:517] (0/4) Epoch 22, validation: discriminator_loss=2.534, discriminator_real_loss=0.9679, discriminator_fake_loss=1.566, generator_loss=40.08, generator_mel_loss=31.58, generator_kl_loss=2.227, generator_dur_loss=1.91, generator_adv_loss=1.604, generator_feat_match_loss=2.761, over 100.00 samples.
2023-11-12 22:10:56,992 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27125MB
2023-11-12 22:12:09,526 INFO [train.py:811] (0/4) Start epoch 23
2023-11-12 22:15:46,199 INFO [train.py:467] (0/4) Epoch 23, batch 36, global_batch_idx: 850, batch size: 52, loss[discriminator_loss=2.824, discriminator_real_loss=1.474, discriminator_fake_loss=1.35, generator_loss=37.92, generator_mel_loss=30.51, generator_kl_loss=1.959, generator_dur_loss=1.954, generator_adv_loss=1.626, generator_feat_match_loss=1.87, over 52.00 samples.], tot_loss[discriminator_loss=2.602, discriminator_real_loss=1.385, discriminator_fake_loss=1.218, generator_loss=38.71, generator_mel_loss=30.29, generator_kl_loss=2.012, generator_dur_loss=1.923, generator_adv_loss=2.063, generator_feat_match_loss=2.424, over 2411.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 22:15:47,708 INFO [train.py:811] (0/4) Start epoch 24
2023-11-12 22:19:14,642 INFO [train.py:811] (0/4) Start epoch 25
2023-11-12 22:20:35,908 INFO [train.py:467] (0/4) Epoch 25, batch 12, global_batch_idx: 900, batch size: 110, loss[discriminator_loss=2.535, discriminator_real_loss=1.32, discriminator_fake_loss=1.214, generator_loss=39.36, generator_mel_loss=30.75, generator_kl_loss=2.043, generator_dur_loss=1.881, generator_adv_loss=2.203, generator_feat_match_loss=2.479, over 110.00 samples.], tot_loss[discriminator_loss=2.616, discriminator_real_loss=1.324, discriminator_fake_loss=1.292, generator_loss=38.32, generator_mel_loss=30.27, generator_kl_loss=1.998, generator_dur_loss=1.91, generator_adv_loss=1.953, generator_feat_match_loss=2.189, over 985.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 22:22:43,249 INFO [train.py:811] (0/4) Start epoch 26
2023-11-12 22:25:14,062 INFO [train.py:467] (0/4) Epoch 26, batch 25, global_batch_idx: 950, batch size: 51, loss[discriminator_loss=2.605, discriminator_real_loss=1.274, discriminator_fake_loss=1.33, generator_loss=39.06, generator_mel_loss=30.47, generator_kl_loss=1.963, generator_dur_loss=1.943, generator_adv_loss=2.082, generator_feat_match_loss=2.602, over 51.00 samples.], tot_loss[discriminator_loss=2.637, discriminator_real_loss=1.386, discriminator_fake_loss=1.251, generator_loss=38.17, generator_mel_loss=29.99, generator_kl_loss=1.965, generator_dur_loss=1.911, generator_adv_loss=1.998, generator_feat_match_loss=2.307, over 1983.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 22:26:18,533 INFO [train.py:811] (0/4) Start epoch 27
2023-11-12 22:29:49,114 INFO [train.py:811] (0/4) Start epoch 28
2023-11-12 22:30:10,027 INFO [train.py:467] (0/4) Epoch 28, batch 1, global_batch_idx: 1000, batch size: 76, loss[discriminator_loss=2.629, discriminator_real_loss=1.266, discriminator_fake_loss=1.364, generator_loss=37.16, generator_mel_loss=29.11, generator_kl_loss=2.033, generator_dur_loss=1.916, generator_adv_loss=1.934, generator_feat_match_loss=2.172, over 76.00 samples.], tot_loss[discriminator_loss=2.631, discriminator_real_loss=1.314, discriminator_fake_loss=1.318, generator_loss=37.27, generator_mel_loss=29.35, generator_kl_loss=1.989, generator_dur_loss=1.911, generator_adv_loss=1.858, generator_feat_match_loss=2.155, over 134.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 22:30:10,686 INFO [train.py:508] (0/4) Computing validation loss
2023-11-12 22:30:21,704 INFO [train.py:517] (0/4) Epoch 28, validation: discriminator_loss=2.516, discriminator_real_loss=1.224, discriminator_fake_loss=1.292, generator_loss=38.54, generator_mel_loss=30.3, generator_kl_loss=2.126, generator_dur_loss=1.907, generator_adv_loss=1.876, generator_feat_match_loss=2.332, over 100.00 samples.
2023-11-12 22:30:21,705 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27125MB
2023-11-12 22:33:37,437 INFO [train.py:811] (0/4) Start epoch 29
2023-11-12 22:35:09,470 INFO [train.py:467] (0/4) Epoch 29, batch 14, global_batch_idx: 1050, batch size: 59, loss[discriminator_loss=2.633, discriminator_real_loss=1.332, discriminator_fake_loss=1.3, generator_loss=37.41, generator_mel_loss=29.54, generator_kl_loss=2.075, generator_dur_loss=1.936, generator_adv_loss=1.827, generator_feat_match_loss=2.025, over 59.00 samples.], tot_loss[discriminator_loss=2.662, discriminator_real_loss=1.416, discriminator_fake_loss=1.246, generator_loss=37.87, generator_mel_loss=29.72, generator_kl_loss=1.988, generator_dur_loss=1.921, generator_adv_loss=1.983, generator_feat_match_loss=2.253, over 1067.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 22:37:10,306 INFO [train.py:811] (0/4) Start epoch 30
2023-11-12 22:39:46,618 INFO [train.py:467] (0/4) Epoch 30, batch 27, global_batch_idx: 1100, batch size: 126, loss[discriminator_loss=2.566, discriminator_real_loss=1.203, discriminator_fake_loss=1.364, generator_loss=38.5, generator_mel_loss=30.16, generator_kl_loss=2.109, generator_dur_loss=1.882, generator_adv_loss=1.993, generator_feat_match_loss=2.359, over 126.00 samples.], tot_loss[discriminator_loss=2.636, discriminator_real_loss=1.359, discriminator_fake_loss=1.277, generator_loss=37.88, generator_mel_loss=29.63, generator_kl_loss=1.992, generator_dur_loss=1.92, generator_adv_loss=1.987, generator_feat_match_loss=2.345, over 1880.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 22:40:37,744 INFO [train.py:811] (0/4) Start epoch 31
2023-11-12 22:44:07,624 INFO [train.py:811] (0/4) Start epoch 32
2023-11-12 22:44:43,084 INFO [train.py:467] (0/4) Epoch 32, batch 3, global_batch_idx: 1150, batch size: 55, loss[discriminator_loss=2.566, discriminator_real_loss=1.261, discriminator_fake_loss=1.305, generator_loss=39.27, generator_mel_loss=30.62, generator_kl_loss=2.026, generator_dur_loss=1.967, generator_adv_loss=2.174, generator_feat_match_loss=2.477, over 55.00 samples.], tot_loss[discriminator_loss=2.674, discriminator_real_loss=1.424, discriminator_fake_loss=1.25, generator_loss=37.75, generator_mel_loss=29.72, generator_kl_loss=1.96, generator_dur_loss=1.912, generator_adv_loss=1.91, generator_feat_match_loss=2.242, over 372.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 22:47:35,368 INFO [train.py:811] (0/4) Start epoch 33
2023-11-12 22:49:25,324 INFO [train.py:467] (0/4) Epoch 33, batch 16, global_batch_idx: 1200, batch size: 58, loss[discriminator_loss=2.568, discriminator_real_loss=1.263, discriminator_fake_loss=1.306, generator_loss=37.49, generator_mel_loss=29.14, generator_kl_loss=1.936, generator_dur_loss=1.919, generator_adv_loss=2.158, generator_feat_match_loss=2.34, over 58.00 samples.], tot_loss[discriminator_loss=2.78, discriminator_real_loss=1.48, discriminator_fake_loss=1.3, generator_loss=37.35, generator_mel_loss=29.32, generator_kl_loss=1.929, generator_dur_loss=1.906, generator_adv_loss=2.03, generator_feat_match_loss=2.157, over 1414.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 22:49:26,082 INFO [train.py:508] (0/4) Computing validation loss
2023-11-12 22:49:36,161 INFO [train.py:517] (0/4) Epoch 33, validation: discriminator_loss=2.588, discriminator_real_loss=1.459, discriminator_fake_loss=1.129, generator_loss=39.25, generator_mel_loss=30.77, generator_kl_loss=2.195, generator_dur_loss=1.925, generator_adv_loss=2.007, generator_feat_match_loss=2.35, over 100.00 samples.
2023-11-12 22:49:36,162 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27125MB
2023-11-12 22:51:17,731 INFO [train.py:811] (0/4) Start epoch 34
2023-11-12 22:54:07,331 INFO [train.py:467] (0/4) Epoch 34, batch 29, global_batch_idx: 1250, batch size: 90, loss[discriminator_loss=2.75, discriminator_real_loss=1.492, discriminator_fake_loss=1.257, generator_loss=37.63, generator_mel_loss=29.73, generator_kl_loss=1.997, generator_dur_loss=1.927, generator_adv_loss=1.854, generator_feat_match_loss=2.121, over 90.00 samples.], tot_loss[discriminator_loss=2.67, discriminator_real_loss=1.381, discriminator_fake_loss=1.289, generator_loss=37.06, generator_mel_loss=29.09, generator_kl_loss=1.971, generator_dur_loss=1.918, generator_adv_loss=1.93, generator_feat_match_loss=2.15, over 2123.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 22:54:46,063 INFO [train.py:811] (0/4) Start epoch 35
2023-11-12 22:58:21,238 INFO [train.py:811] (0/4) Start epoch 36
2023-11-12 22:59:01,728 INFO [train.py:467] (0/4) Epoch 36, batch 5, global_batch_idx: 1300, batch size: 58, loss[discriminator_loss=2.512, discriminator_real_loss=1.374, discriminator_fake_loss=1.137, generator_loss=37, generator_mel_loss=28.34, generator_kl_loss=1.981, generator_dur_loss=1.912, generator_adv_loss=2.076, generator_feat_match_loss=2.688, over 58.00 samples.], tot_loss[discriminator_loss=2.626, discriminator_real_loss=1.373, discriminator_fake_loss=1.253, generator_loss=36.95, generator_mel_loss=28.72, generator_kl_loss=1.964, generator_dur_loss=1.918, generator_adv_loss=1.966, generator_feat_match_loss=2.383, over 426.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 23:01:54,265 INFO [train.py:811] (0/4) Start epoch 37
2023-11-12 23:03:44,587 INFO [train.py:467] (0/4) Epoch 37, batch 18, global_batch_idx: 1350, batch size: 79, loss[discriminator_loss=2.537, discriminator_real_loss=1.107, discriminator_fake_loss=1.43, generator_loss=37.48, generator_mel_loss=28.88, generator_kl_loss=1.949, generator_dur_loss=1.919, generator_adv_loss=2.17, generator_feat_match_loss=2.566, over 79.00 samples.], tot_loss[discriminator_loss=2.667, discriminator_real_loss=1.389, discriminator_fake_loss=1.278, generator_loss=37.11, generator_mel_loss=28.86, generator_kl_loss=1.952, generator_dur_loss=1.935, generator_adv_loss=2.022, generator_feat_match_loss=2.344, over 1186.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 23:05:25,598 INFO [train.py:811] (0/4) Start epoch 38
2023-11-12 23:08:29,493 INFO [train.py:467] (0/4) Epoch 38, batch 31, global_batch_idx: 1400, batch size: 69, loss[discriminator_loss=2.637, discriminator_real_loss=1.541, discriminator_fake_loss=1.097, generator_loss=37.39, generator_mel_loss=28.85, generator_kl_loss=1.9, generator_dur_loss=1.928, generator_adv_loss=2.121, generator_feat_match_loss=2.588, over 69.00 samples.], tot_loss[discriminator_loss=2.667, discriminator_real_loss=1.385, discriminator_fake_loss=1.281, generator_loss=36.83, generator_mel_loss=28.66, generator_kl_loss=1.97, generator_dur_loss=1.929, generator_adv_loss=1.991, generator_feat_match_loss=2.281, over 2318.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 23:08:30,127 INFO [train.py:508] (0/4) Computing validation loss
2023-11-12 23:08:40,524 INFO [train.py:517] (0/4) Epoch 38, validation: discriminator_loss=2.645, discriminator_real_loss=1.425, discriminator_fake_loss=1.22, generator_loss=39.17, generator_mel_loss=30.61, generator_kl_loss=2.023, generator_dur_loss=1.978, generator_adv_loss=1.968, generator_feat_match_loss=2.591, over 100.00 samples.
2023-11-12 23:08:40,525 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27125MB
2023-11-12 23:09:05,452 INFO [train.py:811] (0/4) Start epoch 39
2023-11-12 23:12:38,023 INFO [train.py:811] (0/4) Start epoch 40
2023-11-12 23:13:39,083 INFO [train.py:467] (0/4) Epoch 40, batch 7, global_batch_idx: 1450, batch size: 71, loss[discriminator_loss=2.703, discriminator_real_loss=1.424, discriminator_fake_loss=1.278, generator_loss=35.53, generator_mel_loss=27.64, generator_kl_loss=1.916, generator_dur_loss=1.937, generator_adv_loss=1.967, generator_feat_match_loss=2.07, over 71.00 samples.], tot_loss[discriminator_loss=2.678, discriminator_real_loss=1.402, discriminator_fake_loss=1.276, generator_loss=36.16, generator_mel_loss=28.23, generator_kl_loss=1.893, generator_dur_loss=1.924, generator_adv_loss=1.922, generator_feat_match_loss=2.186, over 597.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 23:16:10,928 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-40.pt
2023-11-12 23:16:17,004 INFO [train.py:811] (0/4) Start epoch 41
2023-11-12 23:18:22,774 INFO [train.py:467] (0/4) Epoch 41, batch 20, global_batch_idx: 1500, batch size: 59, loss[discriminator_loss=2.842, discriminator_real_loss=1.334, discriminator_fake_loss=1.508, generator_loss=36.47, generator_mel_loss=28.24, generator_kl_loss=1.805, generator_dur_loss=1.927, generator_adv_loss=2.234, generator_feat_match_loss=2.262, over 59.00 samples.], tot_loss[discriminator_loss=2.672, discriminator_real_loss=1.387, discriminator_fake_loss=1.285, generator_loss=36.29, generator_mel_loss=28.17, generator_kl_loss=1.905, generator_dur_loss=1.932, generator_adv_loss=2.013, generator_feat_match_loss=2.27, over 1524.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 23:19:52,328 INFO [train.py:811] (0/4) Start epoch 42
2023-11-12 23:23:08,829 INFO [train.py:467] (0/4) Epoch 42, batch 33, global_batch_idx: 1550, batch size: 59, loss[discriminator_loss=2.66, discriminator_real_loss=1.242, discriminator_fake_loss=1.419, generator_loss=36.56, generator_mel_loss=28.27, generator_kl_loss=1.919, generator_dur_loss=1.954, generator_adv_loss=2.084, generator_feat_match_loss=2.338, over 59.00 samples.], tot_loss[discriminator_loss=2.649, discriminator_real_loss=1.392, discriminator_fake_loss=1.257, generator_loss=36.07, generator_mel_loss=27.93, generator_kl_loss=1.927, generator_dur_loss=1.933, generator_adv_loss=1.991, generator_feat_match_loss=2.294, over 2344.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 23:23:26,133 INFO [train.py:811] (0/4) Start epoch 43
2023-11-12 23:26:52,245 INFO [train.py:811] (0/4) Start epoch 44
2023-11-12 23:27:57,949 INFO [train.py:467] (0/4) Epoch 44, batch 9, global_batch_idx: 1600, batch size: 90, loss[discriminator_loss=2.662, discriminator_real_loss=1.515, discriminator_fake_loss=1.147, generator_loss=36.26, generator_mel_loss=27.83, generator_kl_loss=1.955, generator_dur_loss=1.919, generator_adv_loss=2.27, generator_feat_match_loss=2.283, over 90.00 samples.], tot_loss[discriminator_loss=2.622, discriminator_real_loss=1.337, discriminator_fake_loss=1.285, generator_loss=36.41, generator_mel_loss=28.16, generator_kl_loss=1.949, generator_dur_loss=1.936, generator_adv_loss=1.976, generator_feat_match_loss=2.39, over 827.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 23:27:58,574 INFO [train.py:508] (0/4) Computing validation loss
2023-11-12 23:28:08,903 INFO [train.py:517] (0/4) Epoch 44, validation: discriminator_loss=2.572, discriminator_real_loss=1.465, discriminator_fake_loss=1.107, generator_loss=37.81, generator_mel_loss=29.18, generator_kl_loss=1.971, generator_dur_loss=1.932, generator_adv_loss=2.111, generator_feat_match_loss=2.616, over 100.00 samples.
2023-11-12 23:28:08,904 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27125MB
2023-11-12 23:30:29,484 INFO [train.py:811] (0/4) Start epoch 45
2023-11-12 23:32:41,411 INFO [train.py:467] (0/4) Epoch 45, batch 22, global_batch_idx: 1650, batch size: 52, loss[discriminator_loss=2.691, discriminator_real_loss=1.386, discriminator_fake_loss=1.306, generator_loss=34.52, generator_mel_loss=26.75, generator_kl_loss=1.913, generator_dur_loss=1.935, generator_adv_loss=1.977, generator_feat_match_loss=1.941, over 52.00 samples.], tot_loss[discriminator_loss=2.695, discriminator_real_loss=1.42, discriminator_fake_loss=1.275, generator_loss=35.72, generator_mel_loss=27.7, generator_kl_loss=1.943, generator_dur_loss=1.935, generator_adv_loss=1.93, generator_feat_match_loss=2.211, over 1687.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 23:33:56,794 INFO [train.py:811] (0/4) Start epoch 46
2023-11-12 23:37:24,934 INFO [train.py:467] (0/4) Epoch 46, batch 35, global_batch_idx: 1700, batch size: 90, loss[discriminator_loss=2.711, discriminator_real_loss=1.201, discriminator_fake_loss=1.51, generator_loss=35.64, generator_mel_loss=27.42, generator_kl_loss=1.99, generator_dur_loss=1.941, generator_adv_loss=2.072, generator_feat_match_loss=2.209, over 90.00 samples.], tot_loss[discriminator_loss=2.694, discriminator_real_loss=1.406, discriminator_fake_loss=1.288, generator_loss=35.54, generator_mel_loss=27.49, generator_kl_loss=1.953, generator_dur_loss=1.946, generator_adv_loss=1.936, generator_feat_match_loss=2.214, over 2646.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 23:37:29,964 INFO [train.py:811] (0/4) Start epoch 47
2023-11-12 23:41:01,028 INFO [train.py:811] (0/4) Start epoch 48
2023-11-12 23:42:16,604 INFO [train.py:467] (0/4) Epoch 48, batch 11, global_batch_idx: 1750, batch size: 59, loss[discriminator_loss=2.654, discriminator_real_loss=1.281, discriminator_fake_loss=1.373, generator_loss=34.32, generator_mel_loss=26.26, generator_kl_loss=1.933, generator_dur_loss=1.948, generator_adv_loss=2.018, generator_feat_match_loss=2.164, over 59.00 samples.], tot_loss[discriminator_loss=2.67, discriminator_real_loss=1.345, discriminator_fake_loss=1.324, generator_loss=35.39, generator_mel_loss=27.36, generator_kl_loss=1.982, generator_dur_loss=1.94, generator_adv_loss=1.922, generator_feat_match_loss=2.187, over 1032.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 23:44:30,581 INFO [train.py:811] (0/4) Start epoch 49
2023-11-12 23:46:57,491 INFO [train.py:467] (0/4) Epoch 49, batch 24, global_batch_idx: 1800, batch size: 55, loss[discriminator_loss=2.766, discriminator_real_loss=1.537, discriminator_fake_loss=1.229, generator_loss=34.61, generator_mel_loss=26.79, generator_kl_loss=1.95, generator_dur_loss=1.988, generator_adv_loss=1.816, generator_feat_match_loss=2.062, over 55.00 samples.], tot_loss[discriminator_loss=2.714, discriminator_real_loss=1.43, discriminator_fake_loss=1.284, generator_loss=35.07, generator_mel_loss=27.08, generator_kl_loss=2.015, generator_dur_loss=1.948, generator_adv_loss=1.911, generator_feat_match_loss=2.119, over 1777.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 23:46:58,154 INFO [train.py:508] (0/4) Computing validation loss
2023-11-12 23:47:08,424 INFO [train.py:517] (0/4) Epoch 49, validation: discriminator_loss=2.643, discriminator_real_loss=1.239, discriminator_fake_loss=1.404, generator_loss=36.36, generator_mel_loss=28.35, generator_kl_loss=2.026, generator_dur_loss=1.943, generator_adv_loss=1.79, generator_feat_match_loss=2.245, over 100.00 samples.
2023-11-12 23:47:08,425 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27125MB
2023-11-12 23:48:15,299 INFO [train.py:811] (0/4) Start epoch 50
2023-11-12 23:51:47,403 INFO [train.py:811] (0/4) Start epoch 51
2023-11-12 23:52:02,395 INFO [train.py:467] (0/4) Epoch 51, batch 0, global_batch_idx: 1850, batch size: 64, loss[discriminator_loss=2.703, discriminator_real_loss=1.229, discriminator_fake_loss=1.474, generator_loss=34.9, generator_mel_loss=26.8, generator_kl_loss=1.976, generator_dur_loss=1.939, generator_adv_loss=2.066, generator_feat_match_loss=2.117, over 64.00 samples.], tot_loss[discriminator_loss=2.703, discriminator_real_loss=1.229, discriminator_fake_loss=1.474, generator_loss=34.9, generator_mel_loss=26.8, generator_kl_loss=1.976, generator_dur_loss=1.939, generator_adv_loss=2.066, generator_feat_match_loss=2.117, over 64.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 23:55:18,084 INFO [train.py:811] (0/4) Start epoch 52
2023-11-12 23:56:40,076 INFO [train.py:467] (0/4) Epoch 52, batch 13, global_batch_idx: 1900, batch size: 52, loss[discriminator_loss=2.707, discriminator_real_loss=1.529, discriminator_fake_loss=1.177, generator_loss=33.34, generator_mel_loss=25.81, generator_kl_loss=1.982, generator_dur_loss=1.96, generator_adv_loss=1.729, generator_feat_match_loss=1.863, over 52.00 samples.], tot_loss[discriminator_loss=2.752, discriminator_real_loss=1.456, discriminator_fake_loss=1.296, generator_loss=34.1, generator_mel_loss=26.28, generator_kl_loss=2.042, generator_dur_loss=1.945, generator_adv_loss=1.887, generator_feat_match_loss=1.948, over 967.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-12 23:58:45,832 INFO [train.py:811] (0/4) Start epoch 53
2023-11-13 00:01:25,379 INFO [train.py:467] (0/4) Epoch 53, batch 26, global_batch_idx: 1950, batch size: 53, loss[discriminator_loss=2.754, discriminator_real_loss=1.512, discriminator_fake_loss=1.243, generator_loss=33.63, generator_mel_loss=26.14, generator_kl_loss=1.931, generator_dur_loss=1.949, generator_adv_loss=1.767, generator_feat_match_loss=1.845, over 53.00 samples.], tot_loss[discriminator_loss=2.757, discriminator_real_loss=1.437, discriminator_fake_loss=1.32, generator_loss=34.21, generator_mel_loss=26.24, generator_kl_loss=2.041, generator_dur_loss=1.933, generator_adv_loss=1.913, generator_feat_match_loss=2.08, over 2089.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2023-11-13 00:02:18,439 INFO [train.py:811] (0/4) Start epoch 54
2023-11-13 00:05:44,563 INFO [train.py:811] (0/4) Start epoch 55
2023-11-13 00:06:08,606 INFO [train.py:467] (0/4) Epoch 55, batch 2, global_batch_idx: 2000, batch size: 69, loss[discriminator_loss=2.641, discriminator_real_loss=1.356, discriminator_fake_loss=1.285, generator_loss=34.82, generator_mel_loss=26.78, generator_kl_loss=1.959, generator_dur_loss=1.934, generator_adv_loss=1.966, generator_feat_match_loss=2.188, over 69.00 samples.], tot_loss[discriminator_loss=2.684, discriminator_real_loss=1.311, discriminator_fake_loss=1.374, generator_loss=34.49, generator_mel_loss=26.52, generator_kl_loss=1.984, generator_dur_loss=1.94, generator_adv_loss=1.937, generator_feat_match_loss=2.109, over 194.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 64.0
2023-11-13 00:06:09,190 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 00:06:20,172 INFO [train.py:517] (0/4) Epoch 55, validation: discriminator_loss=2.637, discriminator_real_loss=1.39, discriminator_fake_loss=1.247, generator_loss=34.51, generator_mel_loss=26.79, generator_kl_loss=1.921, generator_dur_loss=1.915, generator_adv_loss=1.903, generator_feat_match_loss=1.974, over 100.00 samples.
2023-11-13 00:06:20,173 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27125MB
2023-11-13 00:09:25,017 INFO [train.py:811] (0/4) Start epoch 56
2023-11-13 00:11:05,083 INFO [train.py:467] (0/4) Epoch 56, batch 15, global_batch_idx: 2050, batch size: 51, loss[discriminator_loss=2.836, discriminator_real_loss=1.475, discriminator_fake_loss=1.36, generator_loss=33.81, generator_mel_loss=26.23, generator_kl_loss=1.936, generator_dur_loss=1.945, generator_adv_loss=1.803, generator_feat_match_loss=1.9, over 51.00 samples.], tot_loss[discriminator_loss=2.743, discriminator_real_loss=1.43, discriminator_fake_loss=1.312, generator_loss=33.81, generator_mel_loss=26.05, generator_kl_loss=1.967, generator_dur_loss=1.911, generator_adv_loss=1.872, generator_feat_match_loss=2.015, over 1308.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 64.0
2023-11-13 00:12:58,089 INFO [train.py:811] (0/4) Start epoch 57
2023-11-13 00:15:54,469 INFO [train.py:467] (0/4) Epoch 57, batch 28, global_batch_idx: 2100, batch size: 90, loss[discriminator_loss=2.734, discriminator_real_loss=1.354, discriminator_fake_loss=1.379, generator_loss=33.46, generator_mel_loss=25.58, generator_kl_loss=2.023, generator_dur_loss=1.895, generator_adv_loss=2.004, generator_feat_match_loss=1.959, over 90.00 samples.], tot_loss[discriminator_loss=2.724, discriminator_real_loss=1.401, discriminator_fake_loss=1.323, generator_loss=33.57, generator_mel_loss=25.73, generator_kl_loss=1.976, generator_dur_loss=1.907, generator_adv_loss=1.888, generator_feat_match_loss=2.064, over 2207.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 64.0
2023-11-13 00:16:33,640 INFO [train.py:811] (0/4) Start epoch 58
2023-11-13 00:20:09,560 INFO [train.py:811] (0/4) Start epoch 59
2023-11-13 00:20:44,051 INFO [train.py:467] (0/4) Epoch 59, batch 4, global_batch_idx: 2150, batch size: 52, loss[discriminator_loss=2.758, discriminator_real_loss=1.069, discriminator_fake_loss=1.688, generator_loss=33.15, generator_mel_loss=25.36, generator_kl_loss=1.908, generator_dur_loss=1.927, generator_adv_loss=1.95, generator_feat_match_loss=2.006, over 52.00 samples.], tot_loss[discriminator_loss=2.721, discriminator_real_loss=1.377, discriminator_fake_loss=1.343, generator_loss=33.01, generator_mel_loss=25.26, generator_kl_loss=1.923, generator_dur_loss=1.906, generator_adv_loss=1.867, generator_feat_match_loss=2.054, over 286.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 64.0
2023-11-13 00:23:42,612 INFO [train.py:811] (0/4) Start epoch 60
2023-11-13 00:25:37,871 INFO [train.py:467] (0/4) Epoch 60, batch 17, global_batch_idx: 2200, batch size: 101, loss[discriminator_loss=2.701, discriminator_real_loss=1.41, discriminator_fake_loss=1.291, generator_loss=33.8, generator_mel_loss=25.97, generator_kl_loss=2.015, generator_dur_loss=1.869, generator_adv_loss=1.873, generator_feat_match_loss=2.072, over 101.00 samples.], tot_loss[discriminator_loss=2.716, discriminator_real_loss=1.392, discriminator_fake_loss=1.324, generator_loss=33.31, generator_mel_loss=25.44, generator_kl_loss=1.985, generator_dur_loss=1.888, generator_adv_loss=1.915, generator_feat_match_loss=2.081, over 1202.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 64.0
2023-11-13 00:25:38,360 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 00:25:49,058 INFO [train.py:517] (0/4) Epoch 60, validation: discriminator_loss=2.641, discriminator_real_loss=1.307, discriminator_fake_loss=1.334, generator_loss=34.97, generator_mel_loss=27.14, generator_kl_loss=1.974, generator_dur_loss=1.873, generator_adv_loss=1.789, generator_feat_match_loss=2.193, over 100.00 samples.
2023-11-13 00:25:49,059 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27125MB
2023-11-13 00:27:28,377 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-60.pt
2023-11-13 00:27:34,053 INFO [train.py:811] (0/4) Start epoch 61
2023-11-13 00:30:40,005 INFO [train.py:467] (0/4) Epoch 61, batch 30, global_batch_idx: 2250, batch size: 153, loss[discriminator_loss=2.762, discriminator_real_loss=1.366, discriminator_fake_loss=1.396, generator_loss=34.06, generator_mel_loss=26.05, generator_kl_loss=2.043, generator_dur_loss=1.843, generator_adv_loss=1.805, generator_feat_match_loss=2.32, over 153.00 samples.], tot_loss[discriminator_loss=2.724, discriminator_real_loss=1.399, discriminator_fake_loss=1.325, generator_loss=33.14, generator_mel_loss=25.3, generator_kl_loss=1.962, generator_dur_loss=1.874, generator_adv_loss=1.894, generator_feat_match_loss=2.109, over 2526.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 64.0
2023-11-13 00:31:07,350 INFO [train.py:811] (0/4) Start epoch 62
2023-11-13 00:34:46,088 INFO [train.py:811] (0/4) Start epoch 63
2023-11-13 00:35:39,835 INFO [train.py:467] (0/4) Epoch 63, batch 6, global_batch_idx: 2300, batch size: 64, loss[discriminator_loss=2.688, discriminator_real_loss=1.355, discriminator_fake_loss=1.332, generator_loss=32.49, generator_mel_loss=24.79, generator_kl_loss=1.982, generator_dur_loss=1.903, generator_adv_loss=1.727, generator_feat_match_loss=2.08, over 64.00 samples.], tot_loss[discriminator_loss=2.768, discriminator_real_loss=1.404, discriminator_fake_loss=1.364, generator_loss=33.1, generator_mel_loss=25.28, generator_kl_loss=1.955, generator_dur_loss=1.87, generator_adv_loss=1.925, generator_feat_match_loss=2.071, over 534.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2023-11-13 00:38:19,736 INFO [train.py:811] (0/4) Start epoch 64
2023-11-13 00:40:24,067 INFO [train.py:467] (0/4) Epoch 64, batch 19, global_batch_idx: 2350, batch size: 67, loss[discriminator_loss=2.816, discriminator_real_loss=1.456, discriminator_fake_loss=1.359, generator_loss=32.84, generator_mel_loss=25.31, generator_kl_loss=2.012, generator_dur_loss=1.868, generator_adv_loss=1.773, generator_feat_match_loss=1.877, over 67.00 samples.], tot_loss[discriminator_loss=2.759, discriminator_real_loss=1.416, discriminator_fake_loss=1.343, generator_loss=32.69, generator_mel_loss=25.1, generator_kl_loss=1.983, generator_dur_loss=1.864, generator_adv_loss=1.85, generator_feat_match_loss=1.895, over 1510.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2023-11-13 00:41:52,767 INFO [train.py:811] (0/4) Start epoch 65
2023-11-13 00:45:07,196 INFO [train.py:467] (0/4) Epoch 65, batch 32, global_batch_idx: 2400, batch size: 51, loss[discriminator_loss=2.746, discriminator_real_loss=1.395, discriminator_fake_loss=1.352, generator_loss=33.02, generator_mel_loss=25.3, generator_kl_loss=2.066, generator_dur_loss=1.881, generator_adv_loss=1.874, generator_feat_match_loss=1.891, over 51.00 samples.], tot_loss[discriminator_loss=2.732, discriminator_real_loss=1.388, discriminator_fake_loss=1.344, generator_loss=32.69, generator_mel_loss=24.97, generator_kl_loss=1.985, generator_dur_loss=1.858, generator_adv_loss=1.878, generator_feat_match_loss=1.996, over 2473.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2023-11-13 00:45:07,786 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 00:45:18,048 INFO [train.py:517] (0/4) Epoch 65, validation: discriminator_loss=2.64, discriminator_real_loss=1.369, discriminator_fake_loss=1.272, generator_loss=33.82, generator_mel_loss=25.91, generator_kl_loss=2.019, generator_dur_loss=1.845, generator_adv_loss=1.855, generator_feat_match_loss=2.192, over 100.00 samples.
2023-11-13 00:45:18,050 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27125MB
2023-11-13 00:45:36,307 INFO [train.py:811] (0/4) Start epoch 66
2023-11-13 00:49:07,823 INFO [train.py:811] (0/4) Start epoch 67
2023-11-13 00:50:03,768 INFO [train.py:467] (0/4) Epoch 67, batch 8, global_batch_idx: 2450, batch size: 55, loss[discriminator_loss=2.676, discriminator_real_loss=1.365, discriminator_fake_loss=1.31, generator_loss=33.15, generator_mel_loss=25.37, generator_kl_loss=2.068, generator_dur_loss=1.874, generator_adv_loss=1.793, generator_feat_match_loss=2.045, over 55.00 samples.], tot_loss[discriminator_loss=2.722, discriminator_real_loss=1.392, discriminator_fake_loss=1.331, generator_loss=32.32, generator_mel_loss=24.62, generator_kl_loss=1.958, generator_dur_loss=1.871, generator_adv_loss=1.871, generator_feat_match_loss=1.991, over 650.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2023-11-13 00:52:45,320 INFO [train.py:811] (0/4) Start epoch 68
2023-11-13 00:55:03,159 INFO [train.py:467] (0/4) Epoch 68, batch 21, global_batch_idx: 2500, batch size: 73, loss[discriminator_loss=2.68, discriminator_real_loss=1.406, discriminator_fake_loss=1.274, generator_loss=32.84, generator_mel_loss=25.31, generator_kl_loss=1.996, generator_dur_loss=1.855, generator_adv_loss=1.568, generator_feat_match_loss=2.109, over 73.00 samples.], tot_loss[discriminator_loss=2.731, discriminator_real_loss=1.414, discriminator_fake_loss=1.318, generator_loss=32.53, generator_mel_loss=24.79, generator_kl_loss=2.018, generator_dur_loss=1.846, generator_adv_loss=1.855, generator_feat_match_loss=2.022, over 1643.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2023-11-13 00:56:22,221 INFO [train.py:811] (0/4) Start epoch 69
2023-11-13 00:59:44,468 INFO [train.py:467] (0/4) Epoch 69, batch 34, global_batch_idx: 2550, batch size: 50, loss[discriminator_loss=2.746, discriminator_real_loss=1.284, discriminator_fake_loss=1.463, generator_loss=32.45, generator_mel_loss=24.59, generator_kl_loss=1.987, generator_dur_loss=1.855, generator_adv_loss=1.919, generator_feat_match_loss=2.104, over 50.00 samples.], tot_loss[discriminator_loss=2.745, discriminator_real_loss=1.392, discriminator_fake_loss=1.353, generator_loss=32.55, generator_mel_loss=24.78, generator_kl_loss=1.994, generator_dur_loss=1.847, generator_adv_loss=1.894, generator_feat_match_loss=2.032, over 2649.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2023-11-13 00:59:55,343 INFO [train.py:811] (0/4) Start epoch 70
2023-11-13 01:03:19,274 INFO [train.py:811] (0/4) Start epoch 71
2023-11-13 01:04:28,866 INFO [train.py:467] (0/4) Epoch 71, batch 10, global_batch_idx: 2600, batch size: 153, loss[discriminator_loss=2.678, discriminator_real_loss=1.307, discriminator_fake_loss=1.371, generator_loss=32.89, generator_mel_loss=24.86, generator_kl_loss=2.01, generator_dur_loss=1.809, generator_adv_loss=1.847, generator_feat_match_loss=2.359, over 153.00 samples.], tot_loss[discriminator_loss=2.724, discriminator_real_loss=1.363, discriminator_fake_loss=1.361, generator_loss=32.21, generator_mel_loss=24.46, generator_kl_loss=1.974, generator_dur_loss=1.838, generator_adv_loss=1.846, generator_feat_match_loss=2.09, over 808.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2023-11-13 01:04:29,545 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 01:04:39,258 INFO [train.py:517] (0/4) Epoch 71, validation: discriminator_loss=2.675, discriminator_real_loss=1.302, discriminator_fake_loss=1.373, generator_loss=33.04, generator_mel_loss=25.21, generator_kl_loss=2.058, generator_dur_loss=1.819, generator_adv_loss=1.796, generator_feat_match_loss=2.157, over 100.00 samples.
2023-11-13 01:04:39,259 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27125MB
2023-11-13 01:06:57,306 INFO [train.py:811] (0/4) Start epoch 72
2023-11-13 01:09:24,980 INFO [train.py:467] (0/4) Epoch 72, batch 23, global_batch_idx: 2650, batch size: 90, loss[discriminator_loss=2.793, discriminator_real_loss=1.677, discriminator_fake_loss=1.115, generator_loss=32.3, generator_mel_loss=24.67, generator_kl_loss=2.013, generator_dur_loss=1.857, generator_adv_loss=1.718, generator_feat_match_loss=2.043, over 90.00 samples.], tot_loss[discriminator_loss=2.743, discriminator_real_loss=1.413, discriminator_fake_loss=1.33, generator_loss=32.2, generator_mel_loss=24.46, generator_kl_loss=1.988, generator_dur_loss=1.833, generator_adv_loss=1.874, generator_feat_match_loss=2.041, over 1872.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2023-11-13 01:10:28,893 INFO [train.py:811] (0/4) Start epoch 73
2023-11-13 01:14:02,920 INFO [train.py:467] (0/4) Epoch 73, batch 36, global_batch_idx: 2700, batch size: 101, loss[discriminator_loss=2.805, discriminator_real_loss=1.182, discriminator_fake_loss=1.623, generator_loss=32.33, generator_mel_loss=24.41, generator_kl_loss=2.123, generator_dur_loss=1.818, generator_adv_loss=1.889, generator_feat_match_loss=2.088, over 101.00 samples.], tot_loss[discriminator_loss=2.73, discriminator_real_loss=1.389, discriminator_fake_loss=1.341, generator_loss=32.11, generator_mel_loss=24.32, generator_kl_loss=1.988, generator_dur_loss=1.837, generator_adv_loss=1.883, generator_feat_match_loss=2.076, over 2498.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2023-11-13 01:14:04,106 INFO [train.py:811] (0/4) Start epoch 74
2023-11-13 01:17:38,757 INFO [train.py:811] (0/4) Start epoch 75
2023-11-13 01:18:57,344 INFO [train.py:467] (0/4) Epoch 75, batch 12, global_batch_idx: 2750, batch size: 51, loss[discriminator_loss=2.82, discriminator_real_loss=1.239, discriminator_fake_loss=1.58, generator_loss=32.75, generator_mel_loss=24.43, generator_kl_loss=1.945, generator_dur_loss=1.843, generator_adv_loss=2.32, generator_feat_match_loss=2.203, over 51.00 samples.], tot_loss[discriminator_loss=2.734, discriminator_real_loss=1.372, discriminator_fake_loss=1.362, generator_loss=32.2, generator_mel_loss=24.36, generator_kl_loss=1.972, generator_dur_loss=1.824, generator_adv_loss=1.918, generator_feat_match_loss=2.127, over 1004.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2023-11-13 01:21:05,289 INFO [train.py:811] (0/4) Start epoch 76
2023-11-13 01:23:36,700 INFO [train.py:467] (0/4) Epoch 76, batch 25, global_batch_idx: 2800, batch size: 50, loss[discriminator_loss=2.75, discriminator_real_loss=1.428, discriminator_fake_loss=1.321, generator_loss=31.38, generator_mel_loss=23.56, generator_kl_loss=1.979, generator_dur_loss=1.822, generator_adv_loss=1.982, generator_feat_match_loss=2.033, over 50.00 samples.], tot_loss[discriminator_loss=2.728, discriminator_real_loss=1.388, discriminator_fake_loss=1.34, generator_loss=32.01, generator_mel_loss=24.22, generator_kl_loss=1.966, generator_dur_loss=1.828, generator_adv_loss=1.901, generator_feat_match_loss=2.101, over 1959.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2023-11-13 01:23:37,491 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 01:23:47,555 INFO [train.py:517] (0/4) Epoch 76, validation: discriminator_loss=2.666, discriminator_real_loss=1.444, discriminator_fake_loss=1.222, generator_loss=33.52, generator_mel_loss=25.64, generator_kl_loss=2.048, generator_dur_loss=1.806, generator_adv_loss=1.952, generator_feat_match_loss=2.076, over 100.00 samples.
2023-11-13 01:23:47,556 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27125MB
2023-11-13 01:24:48,380 INFO [train.py:811] (0/4) Start epoch 77
2023-11-13 01:28:19,135 INFO [train.py:811] (0/4) Start epoch 78
2023-11-13 01:28:39,643 INFO [train.py:467] (0/4) Epoch 78, batch 1, global_batch_idx: 2850, batch size: 81, loss[discriminator_loss=2.857, discriminator_real_loss=1.691, discriminator_fake_loss=1.166, generator_loss=31.93, generator_mel_loss=24.41, generator_kl_loss=1.984, generator_dur_loss=1.825, generator_adv_loss=1.732, generator_feat_match_loss=1.977, over 81.00 samples.], tot_loss[discriminator_loss=2.838, discriminator_real_loss=1.489, discriminator_fake_loss=1.35, generator_loss=32, generator_mel_loss=24.32, generator_kl_loss=2.001, generator_dur_loss=1.835, generator_adv_loss=1.907, generator_feat_match_loss=1.94, over 133.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2023-11-13 01:31:48,721 INFO [train.py:811] (0/4) Start epoch 79
2023-11-13 01:33:16,502 INFO [train.py:467] (0/4) Epoch 79, batch 14, global_batch_idx: 2900, batch size: 58, loss[discriminator_loss=2.66, discriminator_real_loss=1.261, discriminator_fake_loss=1.4, generator_loss=32.6, generator_mel_loss=24.5, generator_kl_loss=2.033, generator_dur_loss=1.839, generator_adv_loss=2.02, generator_feat_match_loss=2.207, over 58.00 samples.], tot_loss[discriminator_loss=2.749, discriminator_real_loss=1.411, discriminator_fake_loss=1.338, generator_loss=32.05, generator_mel_loss=24.23, generator_kl_loss=1.992, generator_dur_loss=1.824, generator_adv_loss=1.892, generator_feat_match_loss=2.108, over 1042.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2023-11-13 01:35:20,881 INFO [train.py:811] (0/4) Start epoch 80
2023-11-13 01:38:04,839 INFO [train.py:467] (0/4) Epoch 80, batch 27, global_batch_idx: 2950, batch size: 51, loss[discriminator_loss=2.805, discriminator_real_loss=1.422, discriminator_fake_loss=1.382, generator_loss=31.35, generator_mel_loss=23.9, generator_kl_loss=2.003, generator_dur_loss=1.816, generator_adv_loss=1.743, generator_feat_match_loss=1.889, over 51.00 samples.], tot_loss[discriminator_loss=2.734, discriminator_real_loss=1.42, discriminator_fake_loss=1.314, generator_loss=31.66, generator_mel_loss=23.91, generator_kl_loss=1.983, generator_dur_loss=1.816, generator_adv_loss=1.879, generator_feat_match_loss=2.075, over 2064.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2023-11-13 01:38:48,579 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-80.pt
2023-11-13 01:38:54,595 INFO [train.py:811] (0/4) Start epoch 81
2023-11-13 01:42:32,750 INFO [train.py:811] (0/4) Start epoch 82
2023-11-13 01:43:03,087 INFO [train.py:467] (0/4) Epoch 82, batch 3, global_batch_idx: 3000, batch size: 65, loss[discriminator_loss=2.789, discriminator_real_loss=1.239, discriminator_fake_loss=1.549, generator_loss=31.29, generator_mel_loss=23.37, generator_kl_loss=1.941, generator_dur_loss=1.831, generator_adv_loss=2.092, generator_feat_match_loss=2.059, over 65.00 samples.], tot_loss[discriminator_loss=2.764, discriminator_real_loss=1.354, discriminator_fake_loss=1.41, generator_loss=31.59, generator_mel_loss=23.75, generator_kl_loss=1.944, generator_dur_loss=1.836, generator_adv_loss=1.955, generator_feat_match_loss=2.106, over 241.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2023-11-13 01:43:03,761 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 01:43:14,687 INFO [train.py:517] (0/4) Epoch 82, validation: discriminator_loss=2.629, discriminator_real_loss=1.475, discriminator_fake_loss=1.154, generator_loss=32.79, generator_mel_loss=24.73, generator_kl_loss=2.072, generator_dur_loss=1.789, generator_adv_loss=2.068, generator_feat_match_loss=2.126, over 100.00 samples. 2023-11-13 01:43:14,687 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB 2023-11-13 01:46:15,551 INFO [train.py:811] (0/4) Start epoch 83 2023-11-13 01:48:02,214 INFO [train.py:467] (0/4) Epoch 83, batch 16, global_batch_idx: 3050, batch size: 81, loss[discriminator_loss=2.77, discriminator_real_loss=1.492, discriminator_fake_loss=1.276, generator_loss=30.71, generator_mel_loss=23.32, generator_kl_loss=1.971, generator_dur_loss=1.815, generator_adv_loss=1.698, generator_feat_match_loss=1.904, over 81.00 samples.], tot_loss[discriminator_loss=2.756, discriminator_real_loss=1.397, discriminator_fake_loss=1.359, generator_loss=31.82, generator_mel_loss=23.93, generator_kl_loss=2.005, generator_dur_loss=1.81, generator_adv_loss=1.948, generator_feat_match_loss=2.129, over 1352.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2023-11-13 01:49:53,631 INFO [train.py:811] (0/4) Start epoch 84 2023-11-13 01:52:42,028 INFO [train.py:467] (0/4) Epoch 84, batch 29, global_batch_idx: 3100, batch size: 81, loss[discriminator_loss=2.652, discriminator_real_loss=1.261, discriminator_fake_loss=1.393, generator_loss=31.8, generator_mel_loss=23.66, generator_kl_loss=1.918, generator_dur_loss=1.787, generator_adv_loss=2.195, generator_feat_match_loss=2.242, over 81.00 samples.], tot_loss[discriminator_loss=2.722, discriminator_real_loss=1.396, discriminator_fake_loss=1.326, generator_loss=31.86, generator_mel_loss=23.99, generator_kl_loss=1.987, generator_dur_loss=1.806, generator_adv_loss=1.901, generator_feat_match_loss=2.176, over 2519.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2023-11-13 01:53:25,275 INFO [train.py:811] (0/4) Start epoch 85 2023-11-13 01:56:53,183 INFO [train.py:811] (0/4) Start epoch 86 2023-11-13 01:57:32,040 INFO [train.py:467] (0/4) Epoch 86, batch 5, global_batch_idx: 3150, batch size: 51, loss[discriminator_loss=2.84, discriminator_real_loss=1.369, discriminator_fake_loss=1.471, generator_loss=30.84, generator_mel_loss=23.14, generator_kl_loss=2.009, generator_dur_loss=1.835, generator_adv_loss=1.893, generator_feat_match_loss=1.963, over 51.00 samples.], tot_loss[discriminator_loss=2.766, discriminator_real_loss=1.382, discriminator_fake_loss=1.383, generator_loss=31.36, generator_mel_loss=23.7, generator_kl_loss=1.987, generator_dur_loss=1.801, generator_adv_loss=1.854, generator_feat_match_loss=2.021, 
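[Annotation] The cadence of these entries looks fixed: per-batch loss lines land on multiples of 50 global batches (2850, 2900, 2950, ...) and each "Computing validation loss" on multiples of 200 (2800, 3000, ...). A sketch of that trigger logic follows, with interval constants inferred from the spacing of the entries rather than taken from the run's actual configuration.

# Intervals inferred from the global_batch_idx values in the log above.
LOG_INTERVAL = 50
VALID_INTERVAL = 200

def on_batch_end(global_batch_idx, log_stats, run_validation):
    # Train-loss entries appear whenever the global batch counter hits a
    # multiple of 50; validation runs (and is logged) on multiples of 200.
    if global_batch_idx % LOG_INTERVAL == 0:
        log_stats()
    if global_batch_idx % VALID_INTERVAL == 0:
        run_validation()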
over 450.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2023-11-13 02:00:25,720 INFO [train.py:811] (0/4) Start epoch 87 2023-11-13 02:02:21,702 INFO [train.py:467] (0/4) Epoch 87, batch 18, global_batch_idx: 3200, batch size: 55, loss[discriminator_loss=2.777, discriminator_real_loss=1.442, discriminator_fake_loss=1.334, generator_loss=30.79, generator_mel_loss=23.12, generator_kl_loss=1.95, generator_dur_loss=1.83, generator_adv_loss=1.833, generator_feat_match_loss=2.059, over 55.00 samples.], tot_loss[discriminator_loss=2.716, discriminator_real_loss=1.38, discriminator_fake_loss=1.336, generator_loss=31.64, generator_mel_loss=23.77, generator_kl_loss=1.979, generator_dur_loss=1.813, generator_adv_loss=1.907, generator_feat_match_loss=2.164, over 1163.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2023-11-13 02:02:22,300 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 02:02:32,500 INFO [train.py:517] (0/4) Epoch 87, validation: discriminator_loss=2.654, discriminator_real_loss=1.294, discriminator_fake_loss=1.36, generator_loss=32.68, generator_mel_loss=24.71, generator_kl_loss=2.052, generator_dur_loss=1.781, generator_adv_loss=1.804, generator_feat_match_loss=2.333, over 100.00 samples. 2023-11-13 02:02:32,501 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB 2023-11-13 02:04:11,445 INFO [train.py:811] (0/4) Start epoch 88 2023-11-13 02:07:12,341 INFO [train.py:467] (0/4) Epoch 88, batch 31, global_batch_idx: 3250, batch size: 54, loss[discriminator_loss=2.789, discriminator_real_loss=1.682, discriminator_fake_loss=1.107, generator_loss=31.32, generator_mel_loss=23.93, generator_kl_loss=1.882, generator_dur_loss=1.807, generator_adv_loss=1.6, generator_feat_match_loss=2.098, over 54.00 samples.], tot_loss[discriminator_loss=2.731, discriminator_real_loss=1.396, discriminator_fake_loss=1.335, generator_loss=31.49, generator_mel_loss=23.66, generator_kl_loss=1.971, generator_dur_loss=1.799, generator_adv_loss=1.888, generator_feat_match_loss=2.177, over 2383.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2023-11-13 02:07:43,121 INFO [train.py:811] (0/4) Start epoch 89 2023-11-13 02:11:19,140 INFO [train.py:811] (0/4) Start epoch 90 2023-11-13 02:12:13,401 INFO [train.py:467] (0/4) Epoch 90, batch 7, global_batch_idx: 3300, batch size: 60, loss[discriminator_loss=2.828, discriminator_real_loss=1.405, discriminator_fake_loss=1.423, generator_loss=31.12, generator_mel_loss=23.38, generator_kl_loss=1.913, generator_dur_loss=1.796, generator_adv_loss=2.07, generator_feat_match_loss=1.958, over 60.00 samples.], tot_loss[discriminator_loss=2.774, discriminator_real_loss=1.421, discriminator_fake_loss=1.354, generator_loss=31.69, generator_mel_loss=23.88, generator_kl_loss=1.999, generator_dur_loss=1.811, generator_adv_loss=1.915, generator_feat_match_loss=2.082, over 606.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2023-11-13 02:14:56,214 INFO [train.py:811] (0/4) Start epoch 91 2023-11-13 02:17:00,004 INFO [train.py:467] (0/4) Epoch 91, batch 20, global_batch_idx: 3350, batch size: 63, loss[discriminator_loss=2.711, discriminator_real_loss=1.392, discriminator_fake_loss=1.318, generator_loss=30.93, generator_mel_loss=23.24, generator_kl_loss=1.91, generator_dur_loss=1.796, generator_adv_loss=1.879, generator_feat_match_loss=2.109, over 63.00 samples.], tot_loss[discriminator_loss=2.753, discriminator_real_loss=1.42, discriminator_fake_loss=1.333, 
generator_loss=31.36, generator_mel_loss=23.51, generator_kl_loss=2.001, generator_dur_loss=1.802, generator_adv_loss=1.899, generator_feat_match_loss=2.145, over 1407.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2023-11-13 02:18:23,062 INFO [train.py:811] (0/4) Start epoch 92 2023-11-13 02:21:30,565 INFO [train.py:467] (0/4) Epoch 92, batch 33, global_batch_idx: 3400, batch size: 153, loss[discriminator_loss=2.832, discriminator_real_loss=1.262, discriminator_fake_loss=1.57, generator_loss=32.02, generator_mel_loss=23.94, generator_kl_loss=1.915, generator_dur_loss=1.761, generator_adv_loss=2.111, generator_feat_match_loss=2.293, over 153.00 samples.], tot_loss[discriminator_loss=2.735, discriminator_real_loss=1.397, discriminator_fake_loss=1.338, generator_loss=31.58, generator_mel_loss=23.68, generator_kl_loss=1.982, generator_dur_loss=1.797, generator_adv_loss=1.915, generator_feat_match_loss=2.207, over 2643.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2023-11-13 02:21:31,186 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 02:21:41,597 INFO [train.py:517] (0/4) Epoch 92, validation: discriminator_loss=2.768, discriminator_real_loss=1.521, discriminator_fake_loss=1.246, generator_loss=33.17, generator_mel_loss=24.83, generator_kl_loss=2.05, generator_dur_loss=1.77, generator_adv_loss=2.067, generator_feat_match_loss=2.454, over 100.00 samples. 2023-11-13 02:21:41,598 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB 2023-11-13 02:21:59,765 INFO [train.py:811] (0/4) Start epoch 93 2023-11-13 02:25:30,851 INFO [train.py:811] (0/4) Start epoch 94 2023-11-13 02:26:32,244 INFO [train.py:467] (0/4) Epoch 94, batch 9, global_batch_idx: 3450, batch size: 53, loss[discriminator_loss=2.773, discriminator_real_loss=1.39, discriminator_fake_loss=1.384, generator_loss=31.39, generator_mel_loss=23.55, generator_kl_loss=1.948, generator_dur_loss=1.807, generator_adv_loss=1.959, generator_feat_match_loss=2.127, over 53.00 samples.], tot_loss[discriminator_loss=2.783, discriminator_real_loss=1.416, discriminator_fake_loss=1.367, generator_loss=31.2, generator_mel_loss=23.49, generator_kl_loss=1.955, generator_dur_loss=1.798, generator_adv_loss=1.866, generator_feat_match_loss=2.089, over 665.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2023-11-13 02:29:02,058 INFO [train.py:811] (0/4) Start epoch 95 2023-11-13 02:31:26,332 INFO [train.py:467] (0/4) Epoch 95, batch 22, global_batch_idx: 3500, batch size: 54, loss[discriminator_loss=2.758, discriminator_real_loss=1.63, discriminator_fake_loss=1.128, generator_loss=31.1, generator_mel_loss=23.33, generator_kl_loss=1.983, generator_dur_loss=1.842, generator_adv_loss=1.943, generator_feat_match_loss=2.004, over 54.00 samples.], tot_loss[discriminator_loss=2.764, discriminator_real_loss=1.394, discriminator_fake_loss=1.37, generator_loss=31.15, generator_mel_loss=23.43, generator_kl_loss=1.962, generator_dur_loss=1.792, generator_adv_loss=1.892, generator_feat_match_loss=2.076, over 1794.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2023-11-13 02:32:38,800 INFO [train.py:811] (0/4) Start epoch 96 2023-11-13 02:35:56,134 INFO [train.py:467] (0/4) Epoch 96, batch 35, global_batch_idx: 3550, batch size: 67, loss[discriminator_loss=2.801, discriminator_real_loss=1.291, discriminator_fake_loss=1.51, generator_loss=31.38, generator_mel_loss=23.46, generator_kl_loss=2.042, generator_dur_loss=1.805, generator_adv_loss=1.881, 
generator_feat_match_loss=2.195, over 67.00 samples.], tot_loss[discriminator_loss=2.736, discriminator_real_loss=1.389, discriminator_fake_loss=1.347, generator_loss=31.19, generator_mel_loss=23.41, generator_kl_loss=1.953, generator_dur_loss=1.796, generator_adv_loss=1.892, generator_feat_match_loss=2.137, over 2515.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2023-11-13 02:36:02,619 INFO [train.py:811] (0/4) Start epoch 97 2023-11-13 02:39:34,269 INFO [train.py:811] (0/4) Start epoch 98 2023-11-13 02:40:52,655 INFO [train.py:467] (0/4) Epoch 98, batch 11, global_batch_idx: 3600, batch size: 95, loss[discriminator_loss=2.762, discriminator_real_loss=1.352, discriminator_fake_loss=1.41, generator_loss=31.65, generator_mel_loss=23.89, generator_kl_loss=1.97, generator_dur_loss=1.77, generator_adv_loss=1.881, generator_feat_match_loss=2.133, over 95.00 samples.], tot_loss[discriminator_loss=2.754, discriminator_real_loss=1.382, discriminator_fake_loss=1.372, generator_loss=31.22, generator_mel_loss=23.55, generator_kl_loss=1.973, generator_dur_loss=1.785, generator_adv_loss=1.843, generator_feat_match_loss=2.066, over 1004.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2023-11-13 02:40:53,248 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 02:41:03,531 INFO [train.py:517] (0/4) Epoch 98, validation: discriminator_loss=2.675, discriminator_real_loss=1.354, discriminator_fake_loss=1.321, generator_loss=32.06, generator_mel_loss=24.26, generator_kl_loss=2.058, generator_dur_loss=1.766, generator_adv_loss=1.862, generator_feat_match_loss=2.108, over 100.00 samples. 2023-11-13 02:41:03,532 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB 2023-11-13 02:43:14,238 INFO [train.py:811] (0/4) Start epoch 99 2023-11-13 02:45:42,505 INFO [train.py:467] (0/4) Epoch 99, batch 24, global_batch_idx: 3650, batch size: 67, loss[discriminator_loss=2.824, discriminator_real_loss=1.703, discriminator_fake_loss=1.12, generator_loss=31.03, generator_mel_loss=23.13, generator_kl_loss=1.961, generator_dur_loss=1.807, generator_adv_loss=1.783, generator_feat_match_loss=2.355, over 67.00 samples.], tot_loss[discriminator_loss=2.753, discriminator_real_loss=1.407, discriminator_fake_loss=1.346, generator_loss=31.17, generator_mel_loss=23.37, generator_kl_loss=1.974, generator_dur_loss=1.783, generator_adv_loss=1.896, generator_feat_match_loss=2.151, over 1791.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2023-11-13 02:46:48,844 INFO [train.py:811] (0/4) Start epoch 100 2023-11-13 02:50:27,381 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-100.pt 2023-11-13 02:50:30,717 INFO [train.py:811] (0/4) Start epoch 101 2023-11-13 02:50:46,151 INFO [train.py:467] (0/4) Epoch 101, batch 0, global_batch_idx: 3700, batch size: 95, loss[discriminator_loss=2.795, discriminator_real_loss=1.276, discriminator_fake_loss=1.519, generator_loss=30.5, generator_mel_loss=23, generator_kl_loss=1.97, generator_dur_loss=1.746, generator_adv_loss=1.73, generator_feat_match_loss=2.059, over 95.00 samples.], tot_loss[discriminator_loss=2.795, discriminator_real_loss=1.276, discriminator_fake_loss=1.519, generator_loss=30.5, generator_mel_loss=23, generator_kl_loss=1.97, generator_dur_loss=1.746, generator_adv_loss=1.73, generator_feat_match_loss=2.059, over 95.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2023-11-13 02:54:03,308 INFO [train.py:811] (0/4) Start epoch 102 
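[Annotation] Checkpoints in this run appear every 20 epochs (epoch-80.pt above, epoch-100.pt here, and epoch-120/140/160.pt further down), always under the same experiment directory. A hypothetical equivalent of those utils.py saves is sketched below, assuming a plain state_dict dump; the real checkpoint contents may carry more state (optimizer, grad scaler, sampler).

import torch

SAVE_EVERY_N = 20  # inferred from the epoch-80/100/120/140/160.pt filenames
EXP_DIR = "vits/exp-g2p-conformer-text-encoder-new"

def maybe_save_checkpoint(epoch, model):
    # Mirrors the "Saving checkpoint to .../epoch-<N>.pt" entries above.
    if epoch % SAVE_EVERY_N == 0:
        torch.save({"epoch": epoch, "model": model.state_dict()},
                   f"{EXP_DIR}/epoch-{epoch}.pt")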
2023-11-13 02:55:22,961 INFO [train.py:467] (0/4) Epoch 102, batch 13, global_batch_idx: 3750, batch size: 54, loss[discriminator_loss=2.707, discriminator_real_loss=1.553, discriminator_fake_loss=1.153, generator_loss=30.67, generator_mel_loss=23.05, generator_kl_loss=1.921, generator_dur_loss=1.8, generator_adv_loss=1.704, generator_feat_match_loss=2.191, over 54.00 samples.], tot_loss[discriminator_loss=2.722, discriminator_real_loss=1.394, discriminator_fake_loss=1.328, generator_loss=30.54, generator_mel_loss=22.87, generator_kl_loss=1.951, generator_dur_loss=1.792, generator_adv_loss=1.846, generator_feat_match_loss=2.08, over 958.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 64.0 2023-11-13 02:57:40,841 INFO [train.py:811] (0/4) Start epoch 103 2023-11-13 03:00:04,358 INFO [train.py:467] (0/4) Epoch 103, batch 26, global_batch_idx: 3800, batch size: 60, loss[discriminator_loss=2.73, discriminator_real_loss=1.359, discriminator_fake_loss=1.37, generator_loss=31.46, generator_mel_loss=23.62, generator_kl_loss=1.954, generator_dur_loss=1.785, generator_adv_loss=1.98, generator_feat_match_loss=2.119, over 60.00 samples.], tot_loss[discriminator_loss=2.733, discriminator_real_loss=1.383, discriminator_fake_loss=1.35, generator_loss=31.07, generator_mel_loss=23.3, generator_kl_loss=1.974, generator_dur_loss=1.788, generator_adv_loss=1.862, generator_feat_match_loss=2.145, over 1883.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 64.0 2023-11-13 03:00:05,022 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 03:00:15,275 INFO [train.py:517] (0/4) Epoch 103, validation: discriminator_loss=2.699, discriminator_real_loss=1.437, discriminator_fake_loss=1.263, generator_loss=31.77, generator_mel_loss=23.74, generator_kl_loss=2.119, generator_dur_loss=1.75, generator_adv_loss=1.923, generator_feat_match_loss=2.244, over 100.00 samples. 
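[Annotation] The grad_scale field is the dynamic loss scale used for fp16 training, and its trajectory here matches PyTorch's GradScaler behavior: it doubles from 64 to 128 at global batch 4000 just below after a long stretch of stable steps, and later halves back through 64, 32, and 16 when overflows are hit (epochs 157-168). A generic AMP step showing where that scale lives; this is the standard pattern, not the literal train.py loop.

import torch
from torch.cuda.amp import GradScaler, autocast

scaler = GradScaler()  # scaler.get_scale() is what the log prints as grad_scale

def amp_step(model, optimizer, batch, loss_fn):
    optimizer.zero_grad()
    with autocast():
        loss = loss_fn(model, batch)
    scaler.scale(loss).backward()  # backward on the scaled loss
    scaler.step(optimizer)         # unscales first; skips the step on inf/nan grads
    scaler.update()                # grows the scale when stable, halves it on overflow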
2023-11-13 03:00:15,276 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB 2023-11-13 03:01:13,935 INFO [train.py:811] (0/4) Start epoch 104 2023-11-13 03:04:45,304 INFO [train.py:811] (0/4) Start epoch 105 2023-11-13 03:05:16,147 INFO [train.py:467] (0/4) Epoch 105, batch 2, global_batch_idx: 3850, batch size: 153, loss[discriminator_loss=2.789, discriminator_real_loss=1.446, discriminator_fake_loss=1.344, generator_loss=31.57, generator_mel_loss=23.53, generator_kl_loss=2.008, generator_dur_loss=1.768, generator_adv_loss=2.057, generator_feat_match_loss=2.211, over 153.00 samples.], tot_loss[discriminator_loss=2.749, discriminator_real_loss=1.389, discriminator_fake_loss=1.361, generator_loss=31.37, generator_mel_loss=23.36, generator_kl_loss=2.006, generator_dur_loss=1.771, generator_adv_loss=1.989, generator_feat_match_loss=2.244, over 359.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 64.0 2023-11-13 03:08:13,755 INFO [train.py:811] (0/4) Start epoch 106 2023-11-13 03:09:43,355 INFO [train.py:467] (0/4) Epoch 106, batch 15, global_batch_idx: 3900, batch size: 51, loss[discriminator_loss=2.812, discriminator_real_loss=1.575, discriminator_fake_loss=1.236, generator_loss=31.15, generator_mel_loss=23.1, generator_kl_loss=2.031, generator_dur_loss=1.8, generator_adv_loss=2.01, generator_feat_match_loss=2.209, over 51.00 samples.], tot_loss[discriminator_loss=2.78, discriminator_real_loss=1.424, discriminator_fake_loss=1.356, generator_loss=30.91, generator_mel_loss=23.22, generator_kl_loss=1.964, generator_dur_loss=1.786, generator_adv_loss=1.847, generator_feat_match_loss=2.09, over 1036.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 64.0 2023-11-13 03:11:40,339 INFO [train.py:811] (0/4) Start epoch 107 2023-11-13 03:14:19,916 INFO [train.py:467] (0/4) Epoch 107, batch 28, global_batch_idx: 3950, batch size: 51, loss[discriminator_loss=2.773, discriminator_real_loss=1.485, discriminator_fake_loss=1.287, generator_loss=30.03, generator_mel_loss=22.49, generator_kl_loss=1.953, generator_dur_loss=1.813, generator_adv_loss=1.838, generator_feat_match_loss=1.929, over 51.00 samples.], tot_loss[discriminator_loss=2.769, discriminator_real_loss=1.422, discriminator_fake_loss=1.347, generator_loss=31.05, generator_mel_loss=23.26, generator_kl_loss=1.967, generator_dur_loss=1.78, generator_adv_loss=1.874, generator_feat_match_loss=2.173, over 2139.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 64.0 2023-11-13 03:15:13,790 INFO [train.py:811] (0/4) Start epoch 108 2023-11-13 03:18:43,025 INFO [train.py:811] (0/4) Start epoch 109 2023-11-13 03:19:19,560 INFO [train.py:467] (0/4) Epoch 109, batch 4, global_batch_idx: 4000, batch size: 55, loss[discriminator_loss=2.789, discriminator_real_loss=1.289, discriminator_fake_loss=1.5, generator_loss=31.03, generator_mel_loss=23.23, generator_kl_loss=1.976, generator_dur_loss=1.803, generator_adv_loss=1.912, generator_feat_match_loss=2.105, over 55.00 samples.], tot_loss[discriminator_loss=2.739, discriminator_real_loss=1.398, discriminator_fake_loss=1.341, generator_loss=30.29, generator_mel_loss=22.72, generator_kl_loss=1.921, generator_dur_loss=1.788, generator_adv_loss=1.83, generator_feat_match_loss=2.033, over 355.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 03:19:20,147 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 03:19:31,526 INFO [train.py:517] (0/4) Epoch 109, validation: discriminator_loss=2.658, 
discriminator_real_loss=1.349, discriminator_fake_loss=1.309, generator_loss=31.43, generator_mel_loss=23.69, generator_kl_loss=1.953, generator_dur_loss=1.745, generator_adv_loss=1.882, generator_feat_match_loss=2.155, over 100.00 samples. 2023-11-13 03:19:31,527 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB 2023-11-13 03:22:31,718 INFO [train.py:811] (0/4) Start epoch 110 2023-11-13 03:24:17,771 INFO [train.py:467] (0/4) Epoch 110, batch 17, global_batch_idx: 4050, batch size: 54, loss[discriminator_loss=2.723, discriminator_real_loss=1.228, discriminator_fake_loss=1.495, generator_loss=31.77, generator_mel_loss=23.61, generator_kl_loss=1.898, generator_dur_loss=1.801, generator_adv_loss=2.137, generator_feat_match_loss=2.326, over 54.00 samples.], tot_loss[discriminator_loss=2.73, discriminator_real_loss=1.376, discriminator_fake_loss=1.354, generator_loss=30.93, generator_mel_loss=23.06, generator_kl_loss=1.953, generator_dur_loss=1.781, generator_adv_loss=1.875, generator_feat_match_loss=2.264, over 1244.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 03:26:04,304 INFO [train.py:811] (0/4) Start epoch 111 2023-11-13 03:29:08,004 INFO [train.py:467] (0/4) Epoch 111, batch 30, global_batch_idx: 4100, batch size: 101, loss[discriminator_loss=2.889, discriminator_real_loss=1.754, discriminator_fake_loss=1.135, generator_loss=30.21, generator_mel_loss=22.81, generator_kl_loss=1.825, generator_dur_loss=1.777, generator_adv_loss=1.618, generator_feat_match_loss=2.18, over 101.00 samples.], tot_loss[discriminator_loss=2.78, discriminator_real_loss=1.421, discriminator_fake_loss=1.359, generator_loss=30.9, generator_mel_loss=23.09, generator_kl_loss=1.972, generator_dur_loss=1.776, generator_adv_loss=1.887, generator_feat_match_loss=2.176, over 2449.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 03:29:39,137 INFO [train.py:811] (0/4) Start epoch 112 2023-11-13 03:33:13,572 INFO [train.py:811] (0/4) Start epoch 113 2023-11-13 03:33:57,745 INFO [train.py:467] (0/4) Epoch 113, batch 6, global_batch_idx: 4150, batch size: 79, loss[discriminator_loss=2.73, discriminator_real_loss=1.219, discriminator_fake_loss=1.512, generator_loss=31.19, generator_mel_loss=23.09, generator_kl_loss=1.989, generator_dur_loss=1.776, generator_adv_loss=2.125, generator_feat_match_loss=2.207, over 79.00 samples.], tot_loss[discriminator_loss=2.734, discriminator_real_loss=1.371, discriminator_fake_loss=1.363, generator_loss=30.82, generator_mel_loss=23, generator_kl_loss=1.994, generator_dur_loss=1.778, generator_adv_loss=1.886, generator_feat_match_loss=2.158, over 488.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 03:36:39,971 INFO [train.py:811] (0/4) Start epoch 114 2023-11-13 03:38:42,397 INFO [train.py:467] (0/4) Epoch 114, batch 19, global_batch_idx: 4200, batch size: 110, loss[discriminator_loss=2.738, discriminator_real_loss=1.394, discriminator_fake_loss=1.346, generator_loss=31.14, generator_mel_loss=23.24, generator_kl_loss=2.001, generator_dur_loss=1.758, generator_adv_loss=2.018, generator_feat_match_loss=2.125, over 110.00 samples.], tot_loss[discriminator_loss=2.751, discriminator_real_loss=1.406, discriminator_fake_loss=1.345, generator_loss=30.88, generator_mel_loss=23.08, generator_kl_loss=1.964, generator_dur_loss=1.776, generator_adv_loss=1.854, generator_feat_match_loss=2.207, over 1546.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 
03:38:43,029 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 03:38:54,190 INFO [train.py:517] (0/4) Epoch 114, validation: discriminator_loss=2.653, discriminator_real_loss=1.35, discriminator_fake_loss=1.303, generator_loss=31.92, generator_mel_loss=23.96, generator_kl_loss=2.078, generator_dur_loss=1.744, generator_adv_loss=1.877, generator_feat_match_loss=2.255, over 100.00 samples. 2023-11-13 03:38:54,191 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB 2023-11-13 03:40:19,582 INFO [train.py:811] (0/4) Start epoch 115 2023-11-13 03:43:25,655 INFO [train.py:467] (0/4) Epoch 115, batch 32, global_batch_idx: 4250, batch size: 153, loss[discriminator_loss=2.748, discriminator_real_loss=1.411, discriminator_fake_loss=1.337, generator_loss=30.69, generator_mel_loss=22.88, generator_kl_loss=1.96, generator_dur_loss=1.751, generator_adv_loss=1.84, generator_feat_match_loss=2.254, over 153.00 samples.], tot_loss[discriminator_loss=2.753, discriminator_real_loss=1.392, discriminator_fake_loss=1.361, generator_loss=30.63, generator_mel_loss=22.83, generator_kl_loss=1.971, generator_dur_loss=1.766, generator_adv_loss=1.878, generator_feat_match_loss=2.186, over 2426.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 03:43:48,549 INFO [train.py:811] (0/4) Start epoch 116 2023-11-13 03:47:13,960 INFO [train.py:811] (0/4) Start epoch 117 2023-11-13 03:48:13,602 INFO [train.py:467] (0/4) Epoch 117, batch 8, global_batch_idx: 4300, batch size: 61, loss[discriminator_loss=2.73, discriminator_real_loss=1.553, discriminator_fake_loss=1.178, generator_loss=30.74, generator_mel_loss=22.87, generator_kl_loss=1.878, generator_dur_loss=1.788, generator_adv_loss=1.863, generator_feat_match_loss=2.34, over 61.00 samples.], tot_loss[discriminator_loss=2.761, discriminator_real_loss=1.418, discriminator_fake_loss=1.343, generator_loss=30.38, generator_mel_loss=22.62, generator_kl_loss=1.946, generator_dur_loss=1.779, generator_adv_loss=1.854, generator_feat_match_loss=2.18, over 535.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 03:50:52,073 INFO [train.py:811] (0/4) Start epoch 118 2023-11-13 03:53:08,225 INFO [train.py:467] (0/4) Epoch 118, batch 21, global_batch_idx: 4350, batch size: 49, loss[discriminator_loss=2.766, discriminator_real_loss=1.376, discriminator_fake_loss=1.39, generator_loss=30.47, generator_mel_loss=22.75, generator_kl_loss=1.919, generator_dur_loss=1.762, generator_adv_loss=2.002, generator_feat_match_loss=2.039, over 49.00 samples.], tot_loss[discriminator_loss=2.787, discriminator_real_loss=1.417, discriminator_fake_loss=1.371, generator_loss=30.57, generator_mel_loss=22.88, generator_kl_loss=1.936, generator_dur_loss=1.771, generator_adv_loss=1.88, generator_feat_match_loss=2.113, over 1534.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 03:54:26,545 INFO [train.py:811] (0/4) Start epoch 119 2023-11-13 03:57:50,227 INFO [train.py:467] (0/4) Epoch 119, batch 34, global_batch_idx: 4400, batch size: 64, loss[discriminator_loss=2.668, discriminator_real_loss=1.263, discriminator_fake_loss=1.404, generator_loss=30.62, generator_mel_loss=22.75, generator_kl_loss=1.954, generator_dur_loss=1.766, generator_adv_loss=1.819, generator_feat_match_loss=2.326, over 64.00 samples.], tot_loss[discriminator_loss=2.751, discriminator_real_loss=1.399, discriminator_fake_loss=1.352, generator_loss=30.48, generator_mel_loss=22.69, generator_kl_loss=1.961, 
generator_dur_loss=1.774, generator_adv_loss=1.867, generator_feat_match_loss=2.189, over 2292.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 03:57:50,863 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 03:58:00,794 INFO [train.py:517] (0/4) Epoch 119, validation: discriminator_loss=2.664, discriminator_real_loss=1.22, discriminator_fake_loss=1.443, generator_loss=31.92, generator_mel_loss=23.95, generator_kl_loss=1.953, generator_dur_loss=1.737, generator_adv_loss=1.761, generator_feat_match_loss=2.518, over 100.00 samples. 2023-11-13 03:58:00,794 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB 2023-11-13 03:58:10,945 INFO [train.py:811] (0/4) Start epoch 120 2023-11-13 04:01:43,037 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-120.pt 2023-11-13 04:01:46,243 INFO [train.py:811] (0/4) Start epoch 121 2023-11-13 04:02:50,439 INFO [train.py:467] (0/4) Epoch 121, batch 10, global_batch_idx: 4450, batch size: 81, loss[discriminator_loss=2.77, discriminator_real_loss=1.431, discriminator_fake_loss=1.34, generator_loss=30.25, generator_mel_loss=22.62, generator_kl_loss=1.935, generator_dur_loss=1.761, generator_adv_loss=1.815, generator_feat_match_loss=2.109, over 81.00 samples.], tot_loss[discriminator_loss=2.754, discriminator_real_loss=1.418, discriminator_fake_loss=1.337, generator_loss=30.63, generator_mel_loss=22.89, generator_kl_loss=1.957, generator_dur_loss=1.763, generator_adv_loss=1.843, generator_feat_match_loss=2.179, over 809.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 04:05:14,769 INFO [train.py:811] (0/4) Start epoch 122 2023-11-13 04:07:37,392 INFO [train.py:467] (0/4) Epoch 122, batch 23, global_batch_idx: 4500, batch size: 76, loss[discriminator_loss=2.672, discriminator_real_loss=1.376, discriminator_fake_loss=1.295, generator_loss=31.04, generator_mel_loss=23.09, generator_kl_loss=1.893, generator_dur_loss=1.749, generator_adv_loss=1.859, generator_feat_match_loss=2.451, over 76.00 samples.], tot_loss[discriminator_loss=2.739, discriminator_real_loss=1.387, discriminator_fake_loss=1.352, generator_loss=30.66, generator_mel_loss=22.81, generator_kl_loss=1.977, generator_dur_loss=1.766, generator_adv_loss=1.867, generator_feat_match_loss=2.232, over 1900.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 04:08:46,289 INFO [train.py:811] (0/4) Start epoch 123 2023-11-13 04:12:13,050 INFO [train.py:467] (0/4) Epoch 123, batch 36, global_batch_idx: 4550, batch size: 52, loss[discriminator_loss=2.752, discriminator_real_loss=1.236, discriminator_fake_loss=1.516, generator_loss=30.76, generator_mel_loss=22.69, generator_kl_loss=1.942, generator_dur_loss=1.768, generator_adv_loss=2.189, generator_feat_match_loss=2.172, over 52.00 samples.], tot_loss[discriminator_loss=2.774, discriminator_real_loss=1.411, discriminator_fake_loss=1.363, generator_loss=30.41, generator_mel_loss=22.72, generator_kl_loss=1.966, generator_dur_loss=1.765, generator_adv_loss=1.848, generator_feat_match_loss=2.105, over 2594.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 04:12:14,181 INFO [train.py:811] (0/4) Start epoch 124 2023-11-13 04:15:43,123 INFO [train.py:811] (0/4) Start epoch 125 2023-11-13 04:17:08,801 INFO [train.py:467] (0/4) Epoch 125, batch 12, global_batch_idx: 4600, batch size: 55, loss[discriminator_loss=2.664, discriminator_real_loss=1.354, 
discriminator_fake_loss=1.311, generator_loss=31.18, generator_mel_loss=23.22, generator_kl_loss=1.935, generator_dur_loss=1.78, generator_adv_loss=1.832, generator_feat_match_loss=2.41, over 55.00 samples.], tot_loss[discriminator_loss=2.713, discriminator_real_loss=1.358, discriminator_fake_loss=1.355, generator_loss=30.72, generator_mel_loss=22.83, generator_kl_loss=1.96, generator_dur_loss=1.754, generator_adv_loss=1.87, generator_feat_match_loss=2.305, over 1107.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 04:17:09,427 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 04:17:19,763 INFO [train.py:517] (0/4) Epoch 125, validation: discriminator_loss=2.659, discriminator_real_loss=1.309, discriminator_fake_loss=1.35, generator_loss=31.88, generator_mel_loss=23.86, generator_kl_loss=1.967, generator_dur_loss=1.736, generator_adv_loss=1.793, generator_feat_match_loss=2.523, over 100.00 samples. 2023-11-13 04:17:19,764 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB 2023-11-13 04:19:29,555 INFO [train.py:811] (0/4) Start epoch 126 2023-11-13 04:21:56,585 INFO [train.py:467] (0/4) Epoch 126, batch 25, global_batch_idx: 4650, batch size: 60, loss[discriminator_loss=2.723, discriminator_real_loss=1.45, discriminator_fake_loss=1.273, generator_loss=30.03, generator_mel_loss=22.2, generator_kl_loss=1.961, generator_dur_loss=1.789, generator_adv_loss=1.864, generator_feat_match_loss=2.221, over 60.00 samples.], tot_loss[discriminator_loss=2.735, discriminator_real_loss=1.377, discriminator_fake_loss=1.358, generator_loss=30.71, generator_mel_loss=22.8, generator_kl_loss=1.984, generator_dur_loss=1.764, generator_adv_loss=1.876, generator_feat_match_loss=2.287, over 1760.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 04:22:56,042 INFO [train.py:811] (0/4) Start epoch 127 2023-11-13 04:26:33,858 INFO [train.py:811] (0/4) Start epoch 128 2023-11-13 04:26:58,830 INFO [train.py:467] (0/4) Epoch 128, batch 1, global_batch_idx: 4700, batch size: 60, loss[discriminator_loss=2.734, discriminator_real_loss=1.383, discriminator_fake_loss=1.351, generator_loss=31.62, generator_mel_loss=23.47, generator_kl_loss=2.028, generator_dur_loss=1.746, generator_adv_loss=2.086, generator_feat_match_loss=2.289, over 60.00 samples.], tot_loss[discriminator_loss=2.74, discriminator_real_loss=1.278, discriminator_fake_loss=1.461, generator_loss=31.42, generator_mel_loss=23.34, generator_kl_loss=2.08, generator_dur_loss=1.758, generator_adv_loss=1.91, generator_feat_match_loss=2.338, over 186.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 04:30:11,247 INFO [train.py:811] (0/4) Start epoch 129 2023-11-13 04:31:40,383 INFO [train.py:467] (0/4) Epoch 129, batch 14, global_batch_idx: 4750, batch size: 95, loss[discriminator_loss=2.797, discriminator_real_loss=1.192, discriminator_fake_loss=1.604, generator_loss=30.67, generator_mel_loss=22.62, generator_kl_loss=1.909, generator_dur_loss=1.748, generator_adv_loss=2.197, generator_feat_match_loss=2.195, over 95.00 samples.], tot_loss[discriminator_loss=2.772, discriminator_real_loss=1.401, discriminator_fake_loss=1.371, generator_loss=30.49, generator_mel_loss=22.67, generator_kl_loss=1.944, generator_dur_loss=1.763, generator_adv_loss=1.904, generator_feat_match_loss=2.212, over 1061.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 04:33:45,678 INFO [train.py:811] (0/4) Start epoch 130 2023-11-13 
04:36:24,731 INFO [train.py:467] (0/4) Epoch 130, batch 27, global_batch_idx: 4800, batch size: 153, loss[discriminator_loss=2.736, discriminator_real_loss=1.417, discriminator_fake_loss=1.319, generator_loss=30.37, generator_mel_loss=22.49, generator_kl_loss=2.025, generator_dur_loss=1.75, generator_adv_loss=1.791, generator_feat_match_loss=2.318, over 153.00 samples.], tot_loss[discriminator_loss=2.796, discriminator_real_loss=1.433, discriminator_fake_loss=1.363, generator_loss=30.07, generator_mel_loss=22.41, generator_kl_loss=1.943, generator_dur_loss=1.756, generator_adv_loss=1.843, generator_feat_match_loss=2.115, over 1940.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 04:36:25,286 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 04:36:35,810 INFO [train.py:517] (0/4) Epoch 130, validation: discriminator_loss=2.648, discriminator_real_loss=1.241, discriminator_fake_loss=1.407, generator_loss=30.79, generator_mel_loss=23.05, generator_kl_loss=2.004, generator_dur_loss=1.733, generator_adv_loss=1.731, generator_feat_match_loss=2.275, over 100.00 samples. 2023-11-13 04:36:35,811 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB 2023-11-13 04:37:26,033 INFO [train.py:811] (0/4) Start epoch 131 2023-11-13 04:40:55,941 INFO [train.py:811] (0/4) Start epoch 132 2023-11-13 04:41:29,787 INFO [train.py:467] (0/4) Epoch 132, batch 3, global_batch_idx: 4850, batch size: 71, loss[discriminator_loss=2.711, discriminator_real_loss=1.407, discriminator_fake_loss=1.304, generator_loss=31.32, generator_mel_loss=23.25, generator_kl_loss=2.129, generator_dur_loss=1.749, generator_adv_loss=1.893, generator_feat_match_loss=2.295, over 71.00 samples.], tot_loss[discriminator_loss=2.74, discriminator_real_loss=1.395, discriminator_fake_loss=1.346, generator_loss=30.94, generator_mel_loss=22.97, generator_kl_loss=2.008, generator_dur_loss=1.755, generator_adv_loss=1.894, generator_feat_match_loss=2.318, over 323.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 04:44:25,830 INFO [train.py:811] (0/4) Start epoch 133 2023-11-13 04:46:02,989 INFO [train.py:467] (0/4) Epoch 133, batch 16, global_batch_idx: 4900, batch size: 60, loss[discriminator_loss=2.762, discriminator_real_loss=1.395, discriminator_fake_loss=1.367, generator_loss=30.78, generator_mel_loss=22.92, generator_kl_loss=1.945, generator_dur_loss=1.756, generator_adv_loss=1.851, generator_feat_match_loss=2.307, over 60.00 samples.], tot_loss[discriminator_loss=2.765, discriminator_real_loss=1.421, discriminator_fake_loss=1.344, generator_loss=30.5, generator_mel_loss=22.71, generator_kl_loss=1.937, generator_dur_loss=1.765, generator_adv_loss=1.866, generator_feat_match_loss=2.228, over 985.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 04:47:56,960 INFO [train.py:811] (0/4) Start epoch 134 2023-11-13 04:50:46,841 INFO [train.py:467] (0/4) Epoch 134, batch 29, global_batch_idx: 4950, batch size: 58, loss[discriminator_loss=2.746, discriminator_real_loss=1.374, discriminator_fake_loss=1.372, generator_loss=30.39, generator_mel_loss=22.39, generator_kl_loss=2.001, generator_dur_loss=1.739, generator_adv_loss=1.953, generator_feat_match_loss=2.309, over 58.00 samples.], tot_loss[discriminator_loss=2.802, discriminator_real_loss=1.414, discriminator_fake_loss=1.387, generator_loss=30.34, generator_mel_loss=22.55, generator_kl_loss=1.975, generator_dur_loss=1.756, generator_adv_loss=1.868, 
generator_feat_match_loss=2.188, over 2165.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 04:51:26,207 INFO [train.py:811] (0/4) Start epoch 135 2023-11-13 04:54:55,788 INFO [train.py:811] (0/4) Start epoch 136 2023-11-13 04:55:38,412 INFO [train.py:467] (0/4) Epoch 136, batch 5, global_batch_idx: 5000, batch size: 71, loss[discriminator_loss=2.734, discriminator_real_loss=1.3, discriminator_fake_loss=1.435, generator_loss=30.86, generator_mel_loss=22.97, generator_kl_loss=1.918, generator_dur_loss=1.759, generator_adv_loss=1.81, generator_feat_match_loss=2.402, over 71.00 samples.], tot_loss[discriminator_loss=2.723, discriminator_real_loss=1.369, discriminator_fake_loss=1.354, generator_loss=30.63, generator_mel_loss=22.74, generator_kl_loss=1.925, generator_dur_loss=1.759, generator_adv_loss=1.848, generator_feat_match_loss=2.357, over 410.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 04:55:38,964 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 04:55:50,229 INFO [train.py:517] (0/4) Epoch 136, validation: discriminator_loss=2.601, discriminator_real_loss=1.183, discriminator_fake_loss=1.418, generator_loss=31.35, generator_mel_loss=23.4, generator_kl_loss=1.903, generator_dur_loss=1.72, generator_adv_loss=1.731, generator_feat_match_loss=2.595, over 100.00 samples. 2023-11-13 04:55:50,230 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB 2023-11-13 04:58:35,223 INFO [train.py:811] (0/4) Start epoch 137 2023-11-13 05:00:30,210 INFO [train.py:467] (0/4) Epoch 137, batch 18, global_batch_idx: 5050, batch size: 153, loss[discriminator_loss=2.746, discriminator_real_loss=1.162, discriminator_fake_loss=1.583, generator_loss=30.87, generator_mel_loss=22.74, generator_kl_loss=1.929, generator_dur_loss=1.737, generator_adv_loss=2.186, generator_feat_match_loss=2.275, over 153.00 samples.], tot_loss[discriminator_loss=2.779, discriminator_real_loss=1.387, discriminator_fake_loss=1.392, generator_loss=30.22, generator_mel_loss=22.45, generator_kl_loss=1.956, generator_dur_loss=1.757, generator_adv_loss=1.891, generator_feat_match_loss=2.16, over 1315.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 05:02:10,251 INFO [train.py:811] (0/4) Start epoch 138 2023-11-13 05:05:17,213 INFO [train.py:467] (0/4) Epoch 138, batch 31, global_batch_idx: 5100, batch size: 53, loss[discriminator_loss=2.768, discriminator_real_loss=1.387, discriminator_fake_loss=1.381, generator_loss=30.67, generator_mel_loss=22.77, generator_kl_loss=2.026, generator_dur_loss=1.778, generator_adv_loss=1.9, generator_feat_match_loss=2.195, over 53.00 samples.], tot_loss[discriminator_loss=2.772, discriminator_real_loss=1.417, discriminator_fake_loss=1.356, generator_loss=30.28, generator_mel_loss=22.5, generator_kl_loss=1.969, generator_dur_loss=1.75, generator_adv_loss=1.858, generator_feat_match_loss=2.202, over 2300.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 05:05:45,553 INFO [train.py:811] (0/4) Start epoch 139 2023-11-13 05:09:14,744 INFO [train.py:811] (0/4) Start epoch 140 2023-11-13 05:10:06,738 INFO [train.py:467] (0/4) Epoch 140, batch 7, global_batch_idx: 5150, batch size: 51, loss[discriminator_loss=2.787, discriminator_real_loss=1.575, discriminator_fake_loss=1.212, generator_loss=30.39, generator_mel_loss=22.71, generator_kl_loss=2.009, generator_dur_loss=1.771, generator_adv_loss=1.828, generator_feat_match_loss=2.074, over 51.00 
samples.], tot_loss[discriminator_loss=2.791, discriminator_real_loss=1.408, discriminator_fake_loss=1.383, generator_loss=30.36, generator_mel_loss=22.58, generator_kl_loss=1.985, generator_dur_loss=1.746, generator_adv_loss=1.869, generator_feat_match_loss=2.175, over 595.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 05:12:47,474 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-140.pt 2023-11-13 05:12:51,007 INFO [train.py:811] (0/4) Start epoch 141 2023-11-13 05:14:54,160 INFO [train.py:467] (0/4) Epoch 141, batch 20, global_batch_idx: 5200, batch size: 58, loss[discriminator_loss=2.82, discriminator_real_loss=1.464, discriminator_fake_loss=1.356, generator_loss=29.49, generator_mel_loss=21.89, generator_kl_loss=1.94, generator_dur_loss=1.725, generator_adv_loss=1.766, generator_feat_match_loss=2.168, over 58.00 samples.], tot_loss[discriminator_loss=2.758, discriminator_real_loss=1.398, discriminator_fake_loss=1.359, generator_loss=30.44, generator_mel_loss=22.55, generator_kl_loss=1.957, generator_dur_loss=1.747, generator_adv_loss=1.869, generator_feat_match_loss=2.318, over 1472.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 05:14:54,733 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 05:15:05,080 INFO [train.py:517] (0/4) Epoch 141, validation: discriminator_loss=2.734, discriminator_real_loss=1.313, discriminator_fake_loss=1.421, generator_loss=31.4, generator_mel_loss=23.61, generator_kl_loss=2.005, generator_dur_loss=1.718, generator_adv_loss=1.729, generator_feat_match_loss=2.339, over 100.00 samples. 2023-11-13 05:15:05,081 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB 2023-11-13 05:16:26,951 INFO [train.py:811] (0/4) Start epoch 142 2023-11-13 05:19:39,470 INFO [train.py:467] (0/4) Epoch 142, batch 33, global_batch_idx: 5250, batch size: 95, loss[discriminator_loss=2.754, discriminator_real_loss=1.378, discriminator_fake_loss=1.377, generator_loss=30.16, generator_mel_loss=22.3, generator_kl_loss=2.022, generator_dur_loss=1.741, generator_adv_loss=1.815, generator_feat_match_loss=2.273, over 95.00 samples.], tot_loss[discriminator_loss=2.75, discriminator_real_loss=1.396, discriminator_fake_loss=1.354, generator_loss=30.29, generator_mel_loss=22.43, generator_kl_loss=1.962, generator_dur_loss=1.748, generator_adv_loss=1.864, generator_feat_match_loss=2.29, over 2318.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2023-11-13 05:19:57,149 INFO [train.py:811] (0/4) Start epoch 143 2023-11-13 05:23:26,112 INFO [train.py:811] (0/4) Start epoch 144 2023-11-13 05:24:33,417 INFO [train.py:467] (0/4) Epoch 144, batch 9, global_batch_idx: 5300, batch size: 90, loss[discriminator_loss=2.73, discriminator_real_loss=1.477, discriminator_fake_loss=1.255, generator_loss=30.42, generator_mel_loss=22.56, generator_kl_loss=2.007, generator_dur_loss=1.737, generator_adv_loss=1.735, generator_feat_match_loss=2.379, over 90.00 samples.], tot_loss[discriminator_loss=2.724, discriminator_real_loss=1.377, discriminator_fake_loss=1.347, generator_loss=30.26, generator_mel_loss=22.34, generator_kl_loss=2.005, generator_dur_loss=1.742, generator_adv_loss=1.858, generator_feat_match_loss=2.318, over 891.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0 2023-11-13 05:27:00,417 INFO [train.py:811] (0/4) Start epoch 145 2023-11-13 05:29:15,246 INFO [train.py:467] (0/4) Epoch 145, batch 22, global_batch_idx: 
5350, batch size: 53, loss[discriminator_loss=2.676, discriminator_real_loss=1.281, discriminator_fake_loss=1.395, generator_loss=29.62, generator_mel_loss=21.7, generator_kl_loss=1.966, generator_dur_loss=1.726, generator_adv_loss=1.933, generator_feat_match_loss=2.293, over 53.00 samples.], tot_loss[discriminator_loss=2.752, discriminator_real_loss=1.399, discriminator_fake_loss=1.352, generator_loss=30.38, generator_mel_loss=22.5, generator_kl_loss=2.017, generator_dur_loss=1.749, generator_adv_loss=1.861, generator_feat_match_loss=2.249, over 1611.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0 2023-11-13 05:30:35,931 INFO [train.py:811] (0/4) Start epoch 146 2023-11-13 05:34:02,583 INFO [train.py:467] (0/4) Epoch 146, batch 35, global_batch_idx: 5400, batch size: 85, loss[discriminator_loss=2.816, discriminator_real_loss=1.255, discriminator_fake_loss=1.562, generator_loss=30.83, generator_mel_loss=22.56, generator_kl_loss=2.024, generator_dur_loss=1.749, generator_adv_loss=2.096, generator_feat_match_loss=2.396, over 85.00 samples.], tot_loss[discriminator_loss=2.756, discriminator_real_loss=1.388, discriminator_fake_loss=1.368, generator_loss=30.41, generator_mel_loss=22.46, generator_kl_loss=1.969, generator_dur_loss=1.744, generator_adv_loss=1.886, generator_feat_match_loss=2.347, over 2634.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0 2023-11-13 05:34:03,120 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 05:34:13,540 INFO [train.py:517] (0/4) Epoch 146, validation: discriminator_loss=2.781, discriminator_real_loss=1.589, discriminator_fake_loss=1.192, generator_loss=31.94, generator_mel_loss=23.76, generator_kl_loss=2.033, generator_dur_loss=1.711, generator_adv_loss=2.037, generator_feat_match_loss=2.401, over 100.00 samples. 
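[Annotation] One more reading note: each entry prints both a per-batch loss[... over N samples] and a tot_loss[... over M samples]. At the first logged batch of an epoch the two coincide (see the Epoch 151, batch 0 entry below), which suggests tot_loss is a sample-weighted running average that resets with the epoch. A small tracker with that behavior, hypothetical names throughout:

class RunningLoss:
    # Sample-weighted running average, assumed to reset at the start of each epoch.
    def __init__(self):
        self.weighted_sum = 0.0
        self.num_samples = 0

    def update(self, batch_loss, batch_size):
        self.weighted_sum += batch_loss * batch_size
        self.num_samples += batch_size

    @property
    def average(self):
        return self.weighted_sum / max(self.num_samples, 1)

tot = RunningLoss()
tot.update(30.65, 52)   # epoch 151, batch 0: generator_loss over 52 samples
print(tot.average)      # 30.65 -- equal to the batch loss, as in the entry below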
2023-11-13 05:34:13,541 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB 2023-11-13 05:34:18,946 INFO [train.py:811] (0/4) Start epoch 147 2023-11-13 05:37:48,491 INFO [train.py:811] (0/4) Start epoch 148 2023-11-13 05:38:57,700 INFO [train.py:467] (0/4) Epoch 148, batch 11, global_batch_idx: 5450, batch size: 76, loss[discriminator_loss=2.754, discriminator_real_loss=1.364, discriminator_fake_loss=1.391, generator_loss=30.1, generator_mel_loss=22.3, generator_kl_loss=1.968, generator_dur_loss=1.778, generator_adv_loss=1.859, generator_feat_match_loss=2.199, over 76.00 samples.], tot_loss[discriminator_loss=2.76, discriminator_real_loss=1.385, discriminator_fake_loss=1.375, generator_loss=30.45, generator_mel_loss=22.61, generator_kl_loss=1.997, generator_dur_loss=1.751, generator_adv_loss=1.838, generator_feat_match_loss=2.254, over 885.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0 2023-11-13 05:41:17,293 INFO [train.py:811] (0/4) Start epoch 149 2023-11-13 05:43:45,085 INFO [train.py:467] (0/4) Epoch 149, batch 24, global_batch_idx: 5500, batch size: 52, loss[discriminator_loss=2.75, discriminator_real_loss=1.521, discriminator_fake_loss=1.228, generator_loss=29.79, generator_mel_loss=21.97, generator_kl_loss=1.934, generator_dur_loss=1.745, generator_adv_loss=2.01, generator_feat_match_loss=2.129, over 52.00 samples.], tot_loss[discriminator_loss=2.753, discriminator_real_loss=1.406, discriminator_fake_loss=1.347, generator_loss=30.27, generator_mel_loss=22.35, generator_kl_loss=1.964, generator_dur_loss=1.741, generator_adv_loss=1.887, generator_feat_match_loss=2.324, over 1852.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0 2023-11-13 05:44:52,147 INFO [train.py:811] (0/4) Start epoch 150 2023-11-13 05:48:19,214 INFO [train.py:811] (0/4) Start epoch 151 2023-11-13 05:48:33,665 INFO [train.py:467] (0/4) Epoch 151, batch 0, global_batch_idx: 5550, batch size: 52, loss[discriminator_loss=2.68, discriminator_real_loss=1.344, discriminator_fake_loss=1.335, generator_loss=30.65, generator_mel_loss=22.57, generator_kl_loss=1.994, generator_dur_loss=1.773, generator_adv_loss=1.978, generator_feat_match_loss=2.336, over 52.00 samples.], tot_loss[discriminator_loss=2.68, discriminator_real_loss=1.344, discriminator_fake_loss=1.335, generator_loss=30.65, generator_mel_loss=22.57, generator_kl_loss=1.994, generator_dur_loss=1.773, generator_adv_loss=1.978, generator_feat_match_loss=2.336, over 52.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0 2023-11-13 05:51:49,050 INFO [train.py:811] (0/4) Start epoch 152 2023-11-13 05:53:09,420 INFO [train.py:467] (0/4) Epoch 152, batch 13, global_batch_idx: 5600, batch size: 53, loss[discriminator_loss=2.76, discriminator_real_loss=1.428, discriminator_fake_loss=1.332, generator_loss=29.11, generator_mel_loss=21.42, generator_kl_loss=1.942, generator_dur_loss=1.743, generator_adv_loss=1.93, generator_feat_match_loss=2.072, over 53.00 samples.], tot_loss[discriminator_loss=2.749, discriminator_real_loss=1.381, discriminator_fake_loss=1.368, generator_loss=30.26, generator_mel_loss=22.41, generator_kl_loss=1.969, generator_dur_loss=1.752, generator_adv_loss=1.842, generator_feat_match_loss=2.288, over 836.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0 2023-11-13 05:53:09,901 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 05:53:20,499 INFO [train.py:517] (0/4) Epoch 152, validation: discriminator_loss=2.72, 
discriminator_real_loss=1.382, discriminator_fake_loss=1.337, generator_loss=31.62, generator_mel_loss=23.69, generator_kl_loss=2.044, generator_dur_loss=1.705, generator_adv_loss=1.797, generator_feat_match_loss=2.381, over 100.00 samples.
2023-11-13 05:53:20,500 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB
2023-11-13 05:55:27,421 INFO [train.py:811] (0/4) Start epoch 153
2023-11-13 05:58:05,084 INFO [train.py:467] (0/4) Epoch 153, batch 26, global_batch_idx: 5650, batch size: 73, loss[discriminator_loss=2.686, discriminator_real_loss=1.362, discriminator_fake_loss=1.323, generator_loss=30.42, generator_mel_loss=22.27, generator_kl_loss=2.076, generator_dur_loss=1.76, generator_adv_loss=1.923, generator_feat_match_loss=2.391, over 73.00 samples.], tot_loss[discriminator_loss=2.758, discriminator_real_loss=1.39, discriminator_fake_loss=1.368, generator_loss=30.16, generator_mel_loss=22.31, generator_kl_loss=1.968, generator_dur_loss=1.74, generator_adv_loss=1.866, generator_feat_match_loss=2.274, over 1766.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0
2023-11-13 05:59:03,380 INFO [train.py:811] (0/4) Start epoch 154
2023-11-13 06:02:31,536 INFO [train.py:811] (0/4) Start epoch 155
2023-11-13 06:02:58,830 INFO [train.py:467] (0/4) Epoch 155, batch 2, global_batch_idx: 5700, batch size: 65, loss[discriminator_loss=2.801, discriminator_real_loss=1.312, discriminator_fake_loss=1.49, generator_loss=30.24, generator_mel_loss=22.28, generator_kl_loss=1.947, generator_dur_loss=1.739, generator_adv_loss=1.968, generator_feat_match_loss=2.307, over 65.00 samples.], tot_loss[discriminator_loss=2.761, discriminator_real_loss=1.402, discriminator_fake_loss=1.359, generator_loss=30.45, generator_mel_loss=22.48, generator_kl_loss=1.956, generator_dur_loss=1.728, generator_adv_loss=1.937, generator_feat_match_loss=2.348, over 219.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0
2023-11-13 06:06:06,795 INFO [train.py:811] (0/4) Start epoch 156
2023-11-13 06:07:45,296 INFO [train.py:467] (0/4) Epoch 156, batch 15, global_batch_idx: 5750, batch size: 64, loss[discriminator_loss=2.717, discriminator_real_loss=1.269, discriminator_fake_loss=1.448, generator_loss=30.2, generator_mel_loss=22.11, generator_kl_loss=1.923, generator_dur_loss=1.763, generator_adv_loss=2.016, generator_feat_match_loss=2.387, over 64.00 samples.], tot_loss[discriminator_loss=2.741, discriminator_real_loss=1.377, discriminator_fake_loss=1.364, generator_loss=30.28, generator_mel_loss=22.32, generator_kl_loss=1.939, generator_dur_loss=1.744, generator_adv_loss=1.894, generator_feat_match_loss=2.379, over 1117.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0
2023-11-13 06:09:43,611 INFO [train.py:811] (0/4) Start epoch 157
2023-11-13 06:12:34,803 INFO [train.py:467] (0/4) Epoch 157, batch 28, global_batch_idx: 5800, batch size: 153, loss[discriminator_loss=2.77, discriminator_real_loss=1.231, discriminator_fake_loss=1.537, generator_loss=30.6, generator_mel_loss=22.51, generator_kl_loss=2.008, generator_dur_loss=1.705, generator_adv_loss=2.029, generator_feat_match_loss=2.348, over 153.00 samples.], tot_loss[discriminator_loss=2.762, discriminator_real_loss=1.387, discriminator_fake_loss=1.375, generator_loss=30.38, generator_mel_loss=22.35, generator_kl_loss=1.974, generator_dur_loss=1.729, generator_adv_loss=1.924, generator_feat_match_loss=2.405, over 2405.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 64.0
2023-11-13 06:12:35,335 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 06:12:46,006 INFO [train.py:517] (0/4) Epoch 157, validation: discriminator_loss=2.75, discriminator_real_loss=1.503, discriminator_fake_loss=1.247, generator_loss=31.28, generator_mel_loss=23.14, generator_kl_loss=2.07, generator_dur_loss=1.715, generator_adv_loss=1.982, generator_feat_match_loss=2.37, over 100.00 samples.
2023-11-13 06:12:46,007 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB
2023-11-13 06:13:25,195 INFO [train.py:811] (0/4) Start epoch 158
2023-11-13 06:16:56,459 INFO [train.py:811] (0/4) Start epoch 159
2023-11-13 06:17:37,077 INFO [train.py:467] (0/4) Epoch 159, batch 4, global_batch_idx: 5850, batch size: 52, loss[discriminator_loss=2.66, discriminator_real_loss=1.302, discriminator_fake_loss=1.359, generator_loss=30.44, generator_mel_loss=22.32, generator_kl_loss=2.034, generator_dur_loss=1.734, generator_adv_loss=1.872, generator_feat_match_loss=2.477, over 52.00 samples.], tot_loss[discriminator_loss=2.694, discriminator_real_loss=1.338, discriminator_fake_loss=1.356, generator_loss=30.68, generator_mel_loss=22.54, generator_kl_loss=2.01, generator_dur_loss=1.732, generator_adv_loss=1.908, generator_feat_match_loss=2.481, over 327.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 64.0
2023-11-13 06:20:24,488 INFO [train.py:811] (0/4) Start epoch 160
2023-11-13 06:22:15,097 INFO [train.py:467] (0/4) Epoch 160, batch 17, global_batch_idx: 5900, batch size: 110, loss[discriminator_loss=2.529, discriminator_real_loss=1.339, discriminator_fake_loss=1.19, generator_loss=31.1, generator_mel_loss=22.3, generator_kl_loss=1.981, generator_dur_loss=1.725, generator_adv_loss=1.978, generator_feat_match_loss=3.111, over 110.00 samples.], tot_loss[discriminator_loss=2.686, discriminator_real_loss=1.337, discriminator_fake_loss=1.349, generator_loss=31.14, generator_mel_loss=22.48, generator_kl_loss=1.996, generator_dur_loss=1.734, generator_adv_loss=2.112, generator_feat_match_loss=2.815, over 1390.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 32.0
2023-11-13 06:23:58,056 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-160.pt
2023-11-13 06:24:01,252 INFO [train.py:811] (0/4) Start epoch 161
2023-11-13 06:27:03,377 INFO [train.py:467] (0/4) Epoch 161, batch 30, global_batch_idx: 5950, batch size: 61, loss[discriminator_loss=2.805, discriminator_real_loss=1.221, discriminator_fake_loss=1.584, generator_loss=30.39, generator_mel_loss=22.14, generator_kl_loss=1.975, generator_dur_loss=1.739, generator_adv_loss=2.016, generator_feat_match_loss=2.521, over 61.00 samples.], tot_loss[discriminator_loss=2.657, discriminator_real_loss=1.316, discriminator_fake_loss=1.341, generator_loss=30.97, generator_mel_loss=22.4, generator_kl_loss=1.959, generator_dur_loss=1.734, generator_adv_loss=2.102, generator_feat_match_loss=2.78, over 2295.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 32.0
2023-11-13 06:27:31,311 INFO [train.py:811] (0/4) Start epoch 162
2023-11-13 06:31:06,586 INFO [train.py:811] (0/4) Start epoch 163
2023-11-13 06:31:59,804 INFO [train.py:467] (0/4) Epoch 163, batch 6, global_batch_idx: 6000, batch size: 65, loss[discriminator_loss=2.691, discriminator_real_loss=1.337, discriminator_fake_loss=1.354, generator_loss=30.29, generator_mel_loss=22.26, generator_kl_loss=2.005, generator_dur_loss=1.741, generator_adv_loss=1.846, generator_feat_match_loss=2.439, over 65.00 samples.], tot_loss[discriminator_loss=2.72, discriminator_real_loss=1.391, discriminator_fake_loss=1.329, generator_loss=30.08, generator_mel_loss=22.12, generator_kl_loss=1.947, generator_dur_loss=1.734, generator_adv_loss=1.903, generator_feat_match_loss=2.374, over 564.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 32.0
2023-11-13 06:32:00,389 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 06:32:13,073 INFO [train.py:517] (0/4) Epoch 163, validation: discriminator_loss=2.666, discriminator_real_loss=1.218, discriminator_fake_loss=1.447, generator_loss=31.02, generator_mel_loss=23.15, generator_kl_loss=2.002, generator_dur_loss=1.701, generator_adv_loss=1.687, generator_feat_match_loss=2.479, over 100.00 samples.
2023-11-13 06:32:13,074 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB
2023-11-13 06:34:52,665 INFO [train.py:811] (0/4) Start epoch 164
2023-11-13 06:36:51,819 INFO [train.py:467] (0/4) Epoch 164, batch 19, global_batch_idx: 6050, batch size: 73, loss[discriminator_loss=2.551, discriminator_real_loss=1.303, discriminator_fake_loss=1.248, generator_loss=30.02, generator_mel_loss=21.66, generator_kl_loss=2.027, generator_dur_loss=1.733, generator_adv_loss=1.896, generator_feat_match_loss=2.695, over 73.00 samples.], tot_loss[discriminator_loss=2.702, discriminator_real_loss=1.394, discriminator_fake_loss=1.308, generator_loss=31.01, generator_mel_loss=22.25, generator_kl_loss=1.947, generator_dur_loss=1.738, generator_adv_loss=2.163, generator_feat_match_loss=2.905, over 1456.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 32.0
2023-11-13 06:38:26,608 INFO [train.py:811] (0/4) Start epoch 165
2023-11-13 06:41:30,873 INFO [train.py:467] (0/4) Epoch 165, batch 32, global_batch_idx: 6100, batch size: 63, loss[discriminator_loss=2.562, discriminator_real_loss=1.32, discriminator_fake_loss=1.242, generator_loss=30.96, generator_mel_loss=22.32, generator_kl_loss=1.93, generator_dur_loss=1.733, generator_adv_loss=2.117, generator_feat_match_loss=2.863, over 63.00 samples.], tot_loss[discriminator_loss=2.65, discriminator_real_loss=1.33, discriminator_fake_loss=1.32, generator_loss=30.68, generator_mel_loss=22.13, generator_kl_loss=1.937, generator_dur_loss=1.73, generator_adv_loss=2.102, generator_feat_match_loss=2.782, over 2444.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 16.0
2023-11-13 06:41:51,962 INFO [train.py:811] (0/4) Start epoch 166
2023-11-13 06:45:29,212 INFO [train.py:811] (0/4) Start epoch 167
2023-11-13 06:46:33,464 INFO [train.py:467] (0/4) Epoch 167, batch 8, global_batch_idx: 6150, batch size: 60, loss[discriminator_loss=2.539, discriminator_real_loss=1.341, discriminator_fake_loss=1.197, generator_loss=30.52, generator_mel_loss=21.84, generator_kl_loss=1.935, generator_dur_loss=1.732, generator_adv_loss=2.254, generator_feat_match_loss=2.766, over 60.00 samples.], tot_loss[discriminator_loss=2.675, discriminator_real_loss=1.348, discriminator_fake_loss=1.327, generator_loss=30.49, generator_mel_loss=22.05, generator_kl_loss=1.973, generator_dur_loss=1.726, generator_adv_loss=2.051, generator_feat_match_loss=2.688, over 838.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 16.0
2023-11-13 06:49:02,990 INFO [train.py:811] (0/4) Start epoch 168
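Throughout these entries the composite losses decompose, to the printed precision, into the per-term values that follow them: discriminator_loss is the sum of its real and fake terms, and generator_loss is the sum of the mel, KL, duration, adversarial, and feature-matching terms (whether each printed component already includes a loss weight is not visible in the log itself). As a worked check against the epoch 153, batch 26 entry above:

    \[
    L_D = L_{D,\mathrm{real}} + L_{D,\mathrm{fake}} = 1.362 + 1.323 \approx 2.686
    \]
    \[
    L_G = L_{\mathrm{mel}} + L_{\mathrm{kl}} + L_{\mathrm{dur}} + L_{\mathrm{adv}} + L_{\mathrm{fm}} = 22.27 + 2.076 + 1.76 + 1.923 + 2.391 = 30.42
    \]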
2023-11-13 06:51:24,131 INFO [train.py:467] (0/4) Epoch 168, batch 21, global_batch_idx: 6200, batch size: 59, loss[discriminator_loss=3.016, discriminator_real_loss=1.485, discriminator_fake_loss=1.531, generator_loss=29.6, generator_mel_loss=21.97, generator_kl_loss=1.874, generator_dur_loss=1.746, generator_adv_loss=1.723, generator_feat_match_loss=2.295, over 59.00 samples.], tot_loss[discriminator_loss=2.697, discriminator_real_loss=1.372, discriminator_fake_loss=1.324, generator_loss=31.03, generator_mel_loss=22.23, generator_kl_loss=1.958, generator_dur_loss=1.731, generator_adv_loss=2.172, generator_feat_match_loss=2.941, over 1506.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 16.0
2023-11-13 06:51:24,675 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 06:51:35,085 INFO [train.py:517] (0/4) Epoch 168, validation: discriminator_loss=2.937, discriminator_real_loss=1.581, discriminator_fake_loss=1.356, generator_loss=30.96, generator_mel_loss=22.99, generator_kl_loss=1.986, generator_dur_loss=1.7, generator_adv_loss=1.922, generator_feat_match_loss=2.362, over 100.00 samples.
2023-11-13 06:51:35,085 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB
2023-11-13 06:52:50,441 INFO [train.py:811] (0/4) Start epoch 169
2023-11-13 06:56:14,825 INFO [train.py:467] (0/4) Epoch 169, batch 34, global_batch_idx: 6250, batch size: 69, loss[discriminator_loss=2.723, discriminator_real_loss=1.27, discriminator_fake_loss=1.454, generator_loss=30.78, generator_mel_loss=21.86, generator_kl_loss=1.925, generator_dur_loss=1.73, generator_adv_loss=2.189, generator_feat_match_loss=3.072, over 69.00 samples.], tot_loss[discriminator_loss=2.658, discriminator_real_loss=1.341, discriminator_fake_loss=1.317, generator_loss=30.61, generator_mel_loss=22.25, generator_kl_loss=1.952, generator_dur_loss=1.725, generator_adv_loss=2.011, generator_feat_match_loss=2.679, over 2804.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 16.0
2023-11-13 06:56:24,860 INFO [train.py:811] (0/4) Start epoch 170
2023-11-13 06:59:57,642 INFO [train.py:811] (0/4) Start epoch 171
2023-11-13 07:01:11,275 INFO [train.py:467] (0/4) Epoch 171, batch 10, global_batch_idx: 6300, batch size: 73, loss[discriminator_loss=2.641, discriminator_real_loss=1.371, discriminator_fake_loss=1.269, generator_loss=31.45, generator_mel_loss=21.98, generator_kl_loss=2.005, generator_dur_loss=1.751, generator_adv_loss=2.293, generator_feat_match_loss=3.428, over 73.00 samples.], tot_loss[discriminator_loss=2.686, discriminator_real_loss=1.362, discriminator_fake_loss=1.324, generator_loss=31.24, generator_mel_loss=22.17, generator_kl_loss=1.991, generator_dur_loss=1.724, generator_adv_loss=2.271, generator_feat_match_loss=3.084, over 937.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 16.0
2023-11-13 07:03:36,364 INFO [train.py:811] (0/4) Start epoch 172
2023-11-13 07:05:49,542 INFO [train.py:467] (0/4) Epoch 172, batch 23, global_batch_idx: 6350, batch size: 53, loss[discriminator_loss=2.615, discriminator_real_loss=1.367, discriminator_fake_loss=1.248, generator_loss=30.65, generator_mel_loss=22.14, generator_kl_loss=1.913, generator_dur_loss=1.735, generator_adv_loss=2.074, generator_feat_match_loss=2.787, over 53.00 samples.], tot_loss[discriminator_loss=2.668, discriminator_real_loss=1.334, discriminator_fake_loss=1.334, generator_loss=30.48, generator_mel_loss=22.05, generator_kl_loss=1.935, generator_dur_loss=1.733, generator_adv_loss=2.032, generator_feat_match_loss=2.731, over 1515.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 16.0
2023-11-13 07:07:06,122 INFO [train.py:811] (0/4) Start epoch 173
2023-11-13 07:10:37,998 INFO [train.py:467] (0/4) Epoch 173, batch 36, global_batch_idx: 6400, batch size: 54, loss[discriminator_loss=2.73, discriminator_real_loss=1.506, discriminator_fake_loss=1.225, generator_loss=29.56, generator_mel_loss=21.73, generator_kl_loss=1.959, generator_dur_loss=1.754, generator_adv_loss=1.801, generator_feat_match_loss=2.314, over 54.00 samples.], tot_loss[discriminator_loss=2.644, discriminator_real_loss=1.346, discriminator_fake_loss=1.298, generator_loss=31.17, generator_mel_loss=22.13, generator_kl_loss=1.979, generator_dur_loss=1.723, generator_adv_loss=2.237, generator_feat_match_loss=3.099, over 2705.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 32.0
2023-11-13 07:10:38,551 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 07:10:50,207 INFO [train.py:517] (0/4) Epoch 173, validation: discriminator_loss=2.858, discriminator_real_loss=1.201, discriminator_fake_loss=1.657, generator_loss=30.66, generator_mel_loss=22.74, generator_kl_loss=1.988, generator_dur_loss=1.701, generator_adv_loss=1.712, generator_feat_match_loss=2.527, over 100.00 samples.
2023-11-13 07:10:50,208 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB
2023-11-13 07:10:50,926 INFO [train.py:811] (0/4) Start epoch 174
2023-11-13 07:14:25,866 INFO [train.py:811] (0/4) Start epoch 175
2023-11-13 07:15:51,311 INFO [train.py:467] (0/4) Epoch 175, batch 12, global_batch_idx: 6450, batch size: 126, loss[discriminator_loss=2.443, discriminator_real_loss=1.197, discriminator_fake_loss=1.246, generator_loss=31.96, generator_mel_loss=22.51, generator_kl_loss=1.963, generator_dur_loss=1.724, generator_adv_loss=2.129, generator_feat_match_loss=3.641, over 126.00 samples.], tot_loss[discriminator_loss=2.72, discriminator_real_loss=1.415, discriminator_fake_loss=1.305, generator_loss=31.01, generator_mel_loss=22.09, generator_kl_loss=1.967, generator_dur_loss=1.725, generator_adv_loss=2.191, generator_feat_match_loss=3.046, over 848.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 16.0
2023-11-13 07:17:58,992 INFO [train.py:811] (0/4) Start epoch 176
2023-11-13 07:20:24,769 INFO [train.py:467] (0/4) Epoch 176, batch 25, global_batch_idx: 6500, batch size: 56, loss[discriminator_loss=2.797, discriminator_real_loss=1.557, discriminator_fake_loss=1.24, generator_loss=29.57, generator_mel_loss=21.45, generator_kl_loss=2.024, generator_dur_loss=1.74, generator_adv_loss=2.096, generator_feat_match_loss=2.27, over 56.00 samples.], tot_loss[discriminator_loss=2.69, discriminator_real_loss=1.347, discriminator_fake_loss=1.343, generator_loss=30.23, generator_mel_loss=22.1, generator_kl_loss=1.949, generator_dur_loss=1.721, generator_adv_loss=1.928, generator_feat_match_loss=2.523, over 1902.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 16.0
2023-11-13 07:21:29,796 INFO [train.py:811] (0/4) Start epoch 177
2023-11-13 07:25:01,258 INFO [train.py:811] (0/4) Start epoch 178
2023-11-13 07:25:21,847 INFO [train.py:467] (0/4) Epoch 178, batch 1, global_batch_idx: 6550, batch size: 85, loss[discriminator_loss=2.592, discriminator_real_loss=1.24, discriminator_fake_loss=1.352, generator_loss=31.12, generator_mel_loss=22.18, generator_kl_loss=1.888, generator_dur_loss=1.735, generator_adv_loss=2.484, generator_feat_match_loss=2.83, over 85.00 samples.], tot_loss[discriminator_loss=2.6, discriminator_real_loss=1.259, discriminator_fake_loss=1.341, generator_loss=31, generator_mel_loss=21.88, generator_kl_loss=1.93, generator_dur_loss=1.737, generator_adv_loss=2.403, generator_feat_match_loss=3.057, over 170.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 16.0
2023-11-13 07:28:34,310 INFO [train.py:811] (0/4) Start epoch 179
2023-11-13 07:30:03,343 INFO [train.py:467] (0/4) Epoch 179, batch 14, global_batch_idx: 6600, batch size: 101, loss[discriminator_loss=3.254, discriminator_real_loss=1.914, discriminator_fake_loss=1.341, generator_loss=30.98, generator_mel_loss=22.42, generator_kl_loss=2.055, generator_dur_loss=1.727, generator_adv_loss=2.291, generator_feat_match_loss=2.49, over 101.00 samples.], tot_loss[discriminator_loss=2.768, discriminator_real_loss=1.423, discriminator_fake_loss=1.344, generator_loss=31.12, generator_mel_loss=22.29, generator_kl_loss=1.936, generator_dur_loss=1.724, generator_adv_loss=2.238, generator_feat_match_loss=2.934, over 1036.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 16.0
2023-11-13 07:30:03,847 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 07:30:14,354 INFO [train.py:517] (0/4) Epoch 179, validation: discriminator_loss=2.851, discriminator_real_loss=1.733, discriminator_fake_loss=1.118, generator_loss=31.68, generator_mel_loss=23.22, generator_kl_loss=1.976, generator_dur_loss=1.695, generator_adv_loss=2.28, generator_feat_match_loss=2.509, over 100.00 samples.
2023-11-13 07:30:14,355 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB
2023-11-13 07:32:14,087 INFO [train.py:811] (0/4) Start epoch 180
2023-11-13 07:35:02,494 INFO [train.py:467] (0/4) Epoch 180, batch 27, global_batch_idx: 6650, batch size: 58, loss[discriminator_loss=2.676, discriminator_real_loss=1.389, discriminator_fake_loss=1.286, generator_loss=30.05, generator_mel_loss=21.46, generator_kl_loss=1.917, generator_dur_loss=1.711, generator_adv_loss=2.125, generator_feat_match_loss=2.836, over 58.00 samples.], tot_loss[discriminator_loss=2.696, discriminator_real_loss=1.37, discriminator_fake_loss=1.326, generator_loss=30.14, generator_mel_loss=21.97, generator_kl_loss=1.954, generator_dur_loss=1.723, generator_adv_loss=2.012, generator_feat_match_loss=2.479, over 1984.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 16.0
2023-11-13 07:35:48,504 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-180.pt
2023-11-13 07:35:51,788 INFO [train.py:811] (0/4) Start epoch 181
2023-11-13 07:39:21,213 INFO [train.py:811] (0/4) Start epoch 182
2023-11-13 07:39:52,279 INFO [train.py:467] (0/4) Epoch 182, batch 3, global_batch_idx: 6700, batch size: 73, loss[discriminator_loss=2.992, discriminator_real_loss=1.465, discriminator_fake_loss=1.527, generator_loss=29.91, generator_mel_loss=22.12, generator_kl_loss=1.989, generator_dur_loss=1.706, generator_adv_loss=1.857, generator_feat_match_loss=2.242, over 73.00 samples.], tot_loss[discriminator_loss=2.698, discriminator_real_loss=1.354, discriminator_fake_loss=1.343, generator_loss=30.96, generator_mel_loss=21.92, generator_kl_loss=2, generator_dur_loss=1.721, generator_adv_loss=2.272, generator_feat_match_loss=3.044, over 291.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 16.0
2023-11-13 07:42:47,306 INFO [train.py:811] (0/4) Start epoch 183
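The grad_scale field tracks dynamic mixed-precision loss scaling rather than a training hyperparameter: it sits at 128.0 around epoch 153, halves through 64.0 and 32.0 down to 16.0, and doubles back to 32.0 by epoch 173, the signature of a scaler that backs off on overflowing fp16 gradients and grows again after a long run of clean steps. A minimal sketch of that pattern, assuming the standard torch.cuda.amp API; the model, optimizer, and data below are placeholders, not the script's actual objects:

    import torch
    from torch.cuda.amp import GradScaler, autocast

    model = torch.nn.Linear(80, 80).cuda()      # stand-in module, not the real generator
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-4)
    scaler = GradScaler()                       # defaults: halve on overflow, double after 2000 clean steps

    for step in range(1000):
        x = torch.randn(16, 80, device="cuda")
        optimizer.zero_grad()
        with autocast():                        # fp16 forward pass
            loss = (model(x) - x).pow(2).mean()
        scaler.scale(loss).backward()           # backward through the scaled loss
        scaler.step(optimizer)                  # skipped internally if grads hit inf/nan
        scaler.update()                         # scaler.get_scale() is the grad_scale seen in the log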
2023-11-13 07:44:30,906 INFO [train.py:467] (0/4) Epoch 183, batch 16, global_batch_idx: 6750, batch size: 85, loss[discriminator_loss=2.582, discriminator_real_loss=1.239, discriminator_fake_loss=1.343, generator_loss=30.5, generator_mel_loss=22.2, generator_kl_loss=1.862, generator_dur_loss=1.716, generator_adv_loss=2.051, generator_feat_match_loss=2.664, over 85.00 samples.], tot_loss[discriminator_loss=2.637, discriminator_real_loss=1.326, discriminator_fake_loss=1.312, generator_loss=30.16, generator_mel_loss=21.93, generator_kl_loss=1.921, generator_dur_loss=1.724, generator_adv_loss=1.957, generator_feat_match_loss=2.634, over 1114.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 16.0
2023-11-13 07:46:19,817 INFO [train.py:811] (0/4) Start epoch 184
2023-11-13 07:49:18,123 INFO [train.py:467] (0/4) Epoch 184, batch 29, global_batch_idx: 6800, batch size: 73, loss[discriminator_loss=2.566, discriminator_real_loss=1.193, discriminator_fake_loss=1.374, generator_loss=29.98, generator_mel_loss=21.56, generator_kl_loss=1.947, generator_dur_loss=1.735, generator_adv_loss=2.08, generator_feat_match_loss=2.664, over 73.00 samples.], tot_loss[discriminator_loss=2.653, discriminator_real_loss=1.355, discriminator_fake_loss=1.298, generator_loss=30.92, generator_mel_loss=22.07, generator_kl_loss=1.941, generator_dur_loss=1.728, generator_adv_loss=2.182, generator_feat_match_loss=2.992, over 2060.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 32.0
2023-11-13 07:49:18,623 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 07:49:29,713 INFO [train.py:517] (0/4) Epoch 184, validation: discriminator_loss=2.926, discriminator_real_loss=1.293, discriminator_fake_loss=1.633, generator_loss=30.87, generator_mel_loss=23.04, generator_kl_loss=2.013, generator_dur_loss=1.695, generator_adv_loss=1.598, generator_feat_match_loss=2.517, over 100.00 samples.
2023-11-13 07:49:29,714 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB
2023-11-13 07:50:04,027 INFO [train.py:811] (0/4) Start epoch 185
2023-11-13 07:53:35,847 INFO [train.py:811] (0/4) Start epoch 186
2023-11-13 07:54:17,593 INFO [train.py:467] (0/4) Epoch 186, batch 5, global_batch_idx: 6850, batch size: 55, loss[discriminator_loss=2.971, discriminator_real_loss=1.305, discriminator_fake_loss=1.666, generator_loss=30.29, generator_mel_loss=22.18, generator_kl_loss=1.961, generator_dur_loss=1.779, generator_adv_loss=1.961, generator_feat_match_loss=2.404, over 55.00 samples.], tot_loss[discriminator_loss=2.595, discriminator_real_loss=1.298, discriminator_fake_loss=1.296, generator_loss=31.12, generator_mel_loss=22.13, generator_kl_loss=1.928, generator_dur_loss=1.731, generator_adv_loss=2.135, generator_feat_match_loss=3.206, over 382.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 07:57:04,086 INFO [train.py:811] (0/4) Start epoch 187
2023-11-13 07:58:58,544 INFO [train.py:467] (0/4) Epoch 187, batch 18, global_batch_idx: 6900, batch size: 50, loss[discriminator_loss=2.82, discriminator_real_loss=1.629, discriminator_fake_loss=1.191, generator_loss=30.3, generator_mel_loss=21.37, generator_kl_loss=1.844, generator_dur_loss=1.732, generator_adv_loss=2.377, generator_feat_match_loss=2.971, over 50.00 samples.], tot_loss[discriminator_loss=2.615, discriminator_real_loss=1.317, discriminator_fake_loss=1.298, generator_loss=30.57, generator_mel_loss=21.73, generator_kl_loss=1.941, generator_dur_loss=1.722, generator_adv_loss=2.161, generator_feat_match_loss=3.018, over 1247.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 08:00:37,746 INFO [train.py:811] (0/4) Start epoch 188
2023-11-13 08:03:33,558 INFO [train.py:467] (0/4) Epoch 188, batch 31, global_batch_idx: 6950, batch size: 85, loss[discriminator_loss=2.412, discriminator_real_loss=1.322, discriminator_fake_loss=1.09, generator_loss=32.79, generator_mel_loss=22.28, generator_kl_loss=2.031, generator_dur_loss=1.707, generator_adv_loss=2.619, generator_feat_match_loss=4.148, over 85.00 samples.], tot_loss[discriminator_loss=2.595, discriminator_real_loss=1.317, discriminator_fake_loss=1.278, generator_loss=31, generator_mel_loss=21.91, generator_kl_loss=1.949, generator_dur_loss=1.724, generator_adv_loss=2.247, generator_feat_match_loss=3.171, over 2152.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 08:04:05,256 INFO [train.py:811] (0/4) Start epoch 189
2023-11-13 08:07:34,993 INFO [train.py:811] (0/4) Start epoch 190
2023-11-13 08:08:35,969 INFO [train.py:467] (0/4) Epoch 190, batch 7, global_batch_idx: 7000, batch size: 61, loss[discriminator_loss=2.625, discriminator_real_loss=1.244, discriminator_fake_loss=1.381, generator_loss=29.9, generator_mel_loss=21.79, generator_kl_loss=1.952, generator_dur_loss=1.716, generator_adv_loss=1.942, generator_feat_match_loss=2.496, over 61.00 samples.], tot_loss[discriminator_loss=2.623, discriminator_real_loss=1.328, discriminator_fake_loss=1.294, generator_loss=30.36, generator_mel_loss=21.93, generator_kl_loss=1.938, generator_dur_loss=1.719, generator_adv_loss=2.017, generator_feat_match_loss=2.756, over 713.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 08:08:36,453 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 08:08:47,371 INFO [train.py:517] (0/4) Epoch 190, validation: discriminator_loss=2.559, discriminator_real_loss=1.268, discriminator_fake_loss=1.291, generator_loss=31.28, generator_mel_loss=22.84, generator_kl_loss=1.986, generator_dur_loss=1.694, generator_adv_loss=1.96, generator_feat_match_loss=2.801, over 100.00 samples.
2023-11-13 08:08:47,372 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB
2023-11-13 08:11:22,316 INFO [train.py:811] (0/4) Start epoch 191
2023-11-13 08:13:23,833 INFO [train.py:467] (0/4) Epoch 191, batch 20, global_batch_idx: 7050, batch size: 52, loss[discriminator_loss=2.449, discriminator_real_loss=1.158, discriminator_fake_loss=1.29, generator_loss=31.36, generator_mel_loss=22.1, generator_kl_loss=1.832, generator_dur_loss=1.737, generator_adv_loss=2.227, generator_feat_match_loss=3.471, over 52.00 samples.], tot_loss[discriminator_loss=2.666, discriminator_real_loss=1.359, discriminator_fake_loss=1.307, generator_loss=30.67, generator_mel_loss=21.76, generator_kl_loss=1.943, generator_dur_loss=1.721, generator_adv_loss=2.171, generator_feat_match_loss=3.083, over 1387.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 08:14:52,721 INFO [train.py:811] (0/4) Start epoch 192
2023-11-13 08:18:07,468 INFO [train.py:467] (0/4) Epoch 192, batch 33, global_batch_idx: 7100, batch size: 51, loss[discriminator_loss=2.543, discriminator_real_loss=1.212, discriminator_fake_loss=1.33, generator_loss=31.07, generator_mel_loss=22.08, generator_kl_loss=1.979, generator_dur_loss=1.732, generator_adv_loss=2.188, generator_feat_match_loss=3.086, over 51.00 samples.], tot_loss[discriminator_loss=2.644, discriminator_real_loss=1.336, discriminator_fake_loss=1.308, generator_loss=30.58, generator_mel_loss=21.99, generator_kl_loss=1.963, generator_dur_loss=1.722, generator_adv_loss=2.066, generator_feat_match_loss=2.833, over 2397.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 08:18:24,272 INFO [train.py:811] (0/4) Start epoch 193
2023-11-13 08:21:46,847 INFO [train.py:811] (0/4) Start epoch 194
2023-11-13 08:22:44,465 INFO [train.py:467] (0/4) Epoch 194, batch 9, global_batch_idx: 7150, batch size: 49, loss[discriminator_loss=2.52, discriminator_real_loss=1.133, discriminator_fake_loss=1.388, generator_loss=31.79, generator_mel_loss=21.78, generator_kl_loss=1.926, generator_dur_loss=1.712, generator_adv_loss=2.664, generator_feat_match_loss=3.709, over 49.00 samples.], tot_loss[discriminator_loss=2.585, discriminator_real_loss=1.292, discriminator_fake_loss=1.293, generator_loss=31.36, generator_mel_loss=21.91, generator_kl_loss=1.983, generator_dur_loss=1.724, generator_adv_loss=2.365, generator_feat_match_loss=3.371, over 730.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 08:25:12,722 INFO [train.py:811] (0/4) Start epoch 195
2023-11-13 08:27:14,726 INFO [train.py:467] (0/4) Epoch 195, batch 22, global_batch_idx: 7200, batch size: 95, loss[discriminator_loss=2.699, discriminator_real_loss=1.376, discriminator_fake_loss=1.324, generator_loss=31.24, generator_mel_loss=22.15, generator_kl_loss=1.957, generator_dur_loss=1.719, generator_adv_loss=2.254, generator_feat_match_loss=3.158, over 95.00 samples.], tot_loss[discriminator_loss=2.628, discriminator_real_loss=1.328, discriminator_fake_loss=1.3, generator_loss=30.58, generator_mel_loss=21.9, generator_kl_loss=1.97, generator_dur_loss=1.723, generator_adv_loss=2.094, generator_feat_match_loss=2.89, over 1488.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 32.0
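The cadence of the entries is regular: a batch-level line appears whenever global_batch_idx is a multiple of 50, and every multiple of 200 (6800, 7000, 7200 in this stretch) is followed by a validation pass over the same 100-sample dev set plus a peak-memory report. A hypothetical sketch of that trigger logic, with interval constants read off the log and placeholder function bodies:

    LOG_INTERVAL = 50     # batch-level loss lines
    VALID_INTERVAL = 200  # "Computing validation loss" lines

    def log_running_losses(global_batch_idx: int) -> None:
        print(f"tot_loss at global_batch_idx {global_batch_idx}")  # placeholder

    def compute_validation_loss() -> None:
        print("validation over 100.00 samples")  # placeholder

    def on_batch_end(global_batch_idx: int) -> None:
        if global_batch_idx % LOG_INTERVAL == 0:
            log_running_losses(global_batch_idx)
        if global_batch_idx % VALID_INTERVAL == 0:
            compute_validation_loss()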
2023-11-13 08:27:15,223 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 08:27:25,456 INFO [train.py:517] (0/4) Epoch 195, validation: discriminator_loss=2.731, discriminator_real_loss=1.228, discriminator_fake_loss=1.504, generator_loss=31.59, generator_mel_loss=23.37, generator_kl_loss=2.062, generator_dur_loss=1.684, generator_adv_loss=1.696, generator_feat_match_loss=2.776, over 100.00 samples.
2023-11-13 08:27:25,457 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB
2023-11-13 08:28:48,369 INFO [train.py:811] (0/4) Start epoch 196
2023-11-13 08:32:12,709 INFO [train.py:467] (0/4) Epoch 196, batch 35, global_batch_idx: 7250, batch size: 54, loss[discriminator_loss=2.43, discriminator_real_loss=1.204, discriminator_fake_loss=1.225, generator_loss=31.45, generator_mel_loss=21.95, generator_kl_loss=2, generator_dur_loss=1.732, generator_adv_loss=2.32, generator_feat_match_loss=3.449, over 54.00 samples.], tot_loss[discriminator_loss=2.655, discriminator_real_loss=1.328, discriminator_fake_loss=1.327, generator_loss=30.75, generator_mel_loss=21.79, generator_kl_loss=1.956, generator_dur_loss=1.716, generator_adv_loss=2.201, generator_feat_match_loss=3.086, over 2586.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 08:32:18,271 INFO [train.py:811] (0/4) Start epoch 197
2023-11-13 08:35:48,454 INFO [train.py:811] (0/4) Start epoch 198
2023-11-13 08:37:03,828 INFO [train.py:467] (0/4) Epoch 198, batch 11, global_batch_idx: 7300, batch size: 56, loss[discriminator_loss=2.67, discriminator_real_loss=1.334, discriminator_fake_loss=1.336, generator_loss=30.97, generator_mel_loss=22.07, generator_kl_loss=1.912, generator_dur_loss=1.719, generator_adv_loss=2.197, generator_feat_match_loss=3.076, over 56.00 samples.], tot_loss[discriminator_loss=2.652, discriminator_real_loss=1.328, discriminator_fake_loss=1.324, generator_loss=30.51, generator_mel_loss=21.66, generator_kl_loss=1.936, generator_dur_loss=1.717, generator_adv_loss=2.143, generator_feat_match_loss=3.055, over 951.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 08:39:16,778 INFO [train.py:811] (0/4) Start epoch 199
2023-11-13 08:41:45,035 INFO [train.py:467] (0/4) Epoch 199, batch 24, global_batch_idx: 7350, batch size: 90, loss[discriminator_loss=2.506, discriminator_real_loss=1.37, discriminator_fake_loss=1.136, generator_loss=31.4, generator_mel_loss=22.09, generator_kl_loss=1.877, generator_dur_loss=1.752, generator_adv_loss=2.346, generator_feat_match_loss=3.342, over 90.00 samples.], tot_loss[discriminator_loss=2.627, discriminator_real_loss=1.345, discriminator_fake_loss=1.283, generator_loss=30.93, generator_mel_loss=21.97, generator_kl_loss=1.961, generator_dur_loss=1.717, generator_adv_loss=2.213, generator_feat_match_loss=3.07, over 1913.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 08:42:50,821 INFO [train.py:811] (0/4) Start epoch 200
2023-11-13 08:46:22,653 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-200.pt
2023-11-13 08:46:25,957 INFO [train.py:811] (0/4) Start epoch 201
2023-11-13 08:46:43,021 INFO [train.py:467] (0/4) Epoch 201, batch 0, global_batch_idx: 7400, batch size: 126, loss[discriminator_loss=2.479, discriminator_real_loss=1.277, discriminator_fake_loss=1.201, generator_loss=31.53, generator_mel_loss=22.13, generator_kl_loss=1.987, generator_dur_loss=1.719, generator_adv_loss=2.26, generator_feat_match_loss=3.434, over 126.00 samples.], tot_loss[discriminator_loss=2.479, discriminator_real_loss=1.277, discriminator_fake_loss=1.201, generator_loss=31.53, generator_mel_loss=22.13, generator_kl_loss=1.987, generator_dur_loss=1.719, generator_adv_loss=2.26, generator_feat_match_loss=3.434, over 126.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 08:46:43,619 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 08:46:55,224 INFO [train.py:517] (0/4) Epoch 201, validation: discriminator_loss=2.565, discriminator_real_loss=1.226, discriminator_fake_loss=1.339, generator_loss=31.02, generator_mel_loss=22.61, generator_kl_loss=2.034, generator_dur_loss=1.688, generator_adv_loss=1.874, generator_feat_match_loss=2.82, over 100.00 samples.
2023-11-13 08:46:55,225 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB
2023-11-13 08:50:15,807 INFO [train.py:811] (0/4) Start epoch 202
2023-11-13 08:51:44,167 INFO [train.py:467] (0/4) Epoch 202, batch 13, global_batch_idx: 7450, batch size: 67, loss[discriminator_loss=2.305, discriminator_real_loss=1.213, discriminator_fake_loss=1.093, generator_loss=32.21, generator_mel_loss=21.67, generator_kl_loss=1.874, generator_dur_loss=1.73, generator_adv_loss=2.555, generator_feat_match_loss=4.383, over 67.00 samples.], tot_loss[discriminator_loss=2.503, discriminator_real_loss=1.276, discriminator_fake_loss=1.227, generator_loss=31.2, generator_mel_loss=21.74, generator_kl_loss=1.948, generator_dur_loss=1.716, generator_adv_loss=2.3, generator_feat_match_loss=3.489, over 1117.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 08:53:50,229 INFO [train.py:811] (0/4) Start epoch 203
2023-11-13 08:56:20,604 INFO [train.py:467] (0/4) Epoch 203, batch 26, global_batch_idx: 7500, batch size: 101, loss[discriminator_loss=2.453, discriminator_real_loss=1.244, discriminator_fake_loss=1.208, generator_loss=31.47, generator_mel_loss=22.19, generator_kl_loss=1.854, generator_dur_loss=1.693, generator_adv_loss=2.482, generator_feat_match_loss=3.25, over 101.00 samples.], tot_loss[discriminator_loss=2.457, discriminator_real_loss=1.255, discriminator_fake_loss=1.202, generator_loss=31.11, generator_mel_loss=21.44, generator_kl_loss=1.924, generator_dur_loss=1.719, generator_adv_loss=2.382, generator_feat_match_loss=3.644, over 1839.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 08:57:21,269 INFO [train.py:811] (0/4) Start epoch 204
2023-11-13 09:00:58,172 INFO [train.py:811] (0/4) Start epoch 205
2023-11-13 09:01:25,513 INFO [train.py:467] (0/4) Epoch 205, batch 2, global_batch_idx: 7550, batch size: 64, loss[discriminator_loss=2.238, discriminator_real_loss=1.097, discriminator_fake_loss=1.142, generator_loss=32.76, generator_mel_loss=21.85, generator_kl_loss=1.885, generator_dur_loss=1.738, generator_adv_loss=2.527, generator_feat_match_loss=4.762, over 64.00 samples.], tot_loss[discriminator_loss=2.337, discriminator_real_loss=1.199, discriminator_fake_loss=1.138, generator_loss=32.28, generator_mel_loss=21.85, generator_kl_loss=1.95, generator_dur_loss=1.709, generator_adv_loss=2.47, generator_feat_match_loss=4.299, over 193.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 09:04:25,159 INFO [train.py:811] (0/4) Start epoch 206
2023-11-13 09:06:00,665 INFO [train.py:467] (0/4) Epoch 206, batch 15, global_batch_idx: 7600, batch size: 73, loss[discriminator_loss=2.328, discriminator_real_loss=1.042, discriminator_fake_loss=1.285, generator_loss=31.64, generator_mel_loss=21.43, generator_kl_loss=2.012, generator_dur_loss=1.704, generator_adv_loss=2.506, generator_feat_match_loss=3.994, over 73.00 samples.], tot_loss[discriminator_loss=2.507, discriminator_real_loss=1.244, discriminator_fake_loss=1.263, generator_loss=30.6, generator_mel_loss=21.16, generator_kl_loss=1.94, generator_dur_loss=1.716, generator_adv_loss=2.321, generator_feat_match_loss=3.463, over 1076.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 32.0
2023-11-13 09:06:01,178 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 09:06:11,530 INFO [train.py:517] (0/4) Epoch 206, validation: discriminator_loss=2.259, discriminator_real_loss=1.189, discriminator_fake_loss=1.07, generator_loss=32.32, generator_mel_loss=22.16, generator_kl_loss=2.046, generator_dur_loss=1.682, generator_adv_loss=2.501, generator_feat_match_loss=3.928, over 100.00 samples.
2023-11-13 09:06:11,531 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB
2023-11-13 09:08:09,408 INFO [train.py:811] (0/4) Start epoch 207
2023-11-13 09:10:59,129 INFO [train.py:467] (0/4) Epoch 207, batch 28, global_batch_idx: 7650, batch size: 76, loss[discriminator_loss=2.695, discriminator_real_loss=1.369, discriminator_fake_loss=1.326, generator_loss=29.99, generator_mel_loss=21.71, generator_kl_loss=1.95, generator_dur_loss=1.727, generator_adv_loss=1.982, generator_feat_match_loss=2.619, over 76.00 samples.], tot_loss[discriminator_loss=2.629, discriminator_real_loss=1.32, discriminator_fake_loss=1.309, generator_loss=30.78, generator_mel_loss=21.83, generator_kl_loss=1.981, generator_dur_loss=1.717, generator_adv_loss=2.174, generator_feat_match_loss=3.084, over 2115.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 09:11:41,372 INFO [train.py:811] (0/4) Start epoch 208
2023-11-13 09:15:09,808 INFO [train.py:811] (0/4) Start epoch 209
2023-11-13 09:15:44,311 INFO [train.py:467] (0/4) Epoch 209, batch 4, global_batch_idx: 7700, batch size: 56, loss[discriminator_loss=2.498, discriminator_real_loss=1.234, discriminator_fake_loss=1.264, generator_loss=30.73, generator_mel_loss=21.51, generator_kl_loss=1.955, generator_dur_loss=1.693, generator_adv_loss=2.002, generator_feat_match_loss=3.566, over 56.00 samples.], tot_loss[discriminator_loss=2.669, discriminator_real_loss=1.355, discriminator_fake_loss=1.313, generator_loss=30.61, generator_mel_loss=21.71, generator_kl_loss=1.946, generator_dur_loss=1.703, generator_adv_loss=2.209, generator_feat_match_loss=3.038, over 392.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 09:18:42,589 INFO [train.py:811] (0/4) Start epoch 210
2023-11-13 09:20:35,052 INFO [train.py:467] (0/4) Epoch 210, batch 17, global_batch_idx: 7750, batch size: 101, loss[discriminator_loss=2.912, discriminator_real_loss=1.693, discriminator_fake_loss=1.219, generator_loss=30.77, generator_mel_loss=21.88, generator_kl_loss=2.005, generator_dur_loss=1.697, generator_adv_loss=2.311, generator_feat_match_loss=2.877, over 101.00 samples.], tot_loss[discriminator_loss=2.65, discriminator_real_loss=1.333, discriminator_fake_loss=1.316, generator_loss=30.86, generator_mel_loss=21.68, generator_kl_loss=1.936, generator_dur_loss=1.716, generator_adv_loss=2.266, generator_feat_match_loss=3.257, over 1188.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 09:22:21,552 INFO [train.py:811] (0/4) Start epoch 211
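Checkpoints land on a fixed epoch grid: epoch-160.pt, epoch-180.pt, and epoch-200.pt above, with epoch-220.pt and epoch-240.pt following the same pattern below, i.e. one save every 20 epochs into the experiment directory. A minimal sketch of such a save-every-N policy, assuming a plain torch.save helper; the function name is illustrative, not the script's actual API:

    from pathlib import Path
    import torch

    EXP_DIR = Path("vits/exp-g2p-conformer-text-encoder-new")
    SAVE_EVERY_N = 20  # inferred from the epoch-160/180/200/220/240 filenames

    def maybe_save_checkpoint(epoch: int, model: torch.nn.Module) -> None:
        if epoch % SAVE_EVERY_N == 0:
            path = EXP_DIR / f"epoch-{epoch}.pt"
            torch.save({"epoch": epoch, "model": model.state_dict()}, path)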
2023-11-13 09:25:29,373 INFO [train.py:467] (0/4) Epoch 211, batch 30, global_batch_idx: 7800, batch size: 73, loss[discriminator_loss=2.699, discriminator_real_loss=1.477, discriminator_fake_loss=1.222, generator_loss=30.67, generator_mel_loss=21.57, generator_kl_loss=1.967, generator_dur_loss=1.716, generator_adv_loss=2.191, generator_feat_match_loss=3.227, over 73.00 samples.], tot_loss[discriminator_loss=2.662, discriminator_real_loss=1.338, discriminator_fake_loss=1.324, generator_loss=30.58, generator_mel_loss=21.78, generator_kl_loss=1.953, generator_dur_loss=1.716, generator_adv_loss=2.14, generator_feat_match_loss=2.985, over 2276.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 09:25:29,946 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 09:25:40,564 INFO [train.py:517] (0/4) Epoch 211, validation: discriminator_loss=2.683, discriminator_real_loss=1.262, discriminator_fake_loss=1.421, generator_loss=30.94, generator_mel_loss=22.57, generator_kl_loss=1.999, generator_dur_loss=1.684, generator_adv_loss=1.803, generator_feat_match_loss=2.877, over 100.00 samples.
2023-11-13 09:25:40,565 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB
2023-11-13 09:26:08,324 INFO [train.py:811] (0/4) Start epoch 212
2023-11-13 09:29:29,007 INFO [train.py:811] (0/4) Start epoch 213
2023-11-13 09:30:20,023 INFO [train.py:467] (0/4) Epoch 213, batch 6, global_batch_idx: 7850, batch size: 76, loss[discriminator_loss=2.406, discriminator_real_loss=1.268, discriminator_fake_loss=1.138, generator_loss=30.83, generator_mel_loss=21.21, generator_kl_loss=1.968, generator_dur_loss=1.702, generator_adv_loss=2.48, generator_feat_match_loss=3.469, over 76.00 samples.], tot_loss[discriminator_loss=2.76, discriminator_real_loss=1.377, discriminator_fake_loss=1.383, generator_loss=30.32, generator_mel_loss=21.42, generator_kl_loss=1.912, generator_dur_loss=1.706, generator_adv_loss=2.227, generator_feat_match_loss=3.06, over 534.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 09:32:56,777 INFO [train.py:811] (0/4) Start epoch 214
2023-11-13 09:34:53,769 INFO [train.py:467] (0/4) Epoch 214, batch 19, global_batch_idx: 7900, batch size: 51, loss[discriminator_loss=2.332, discriminator_real_loss=1.25, discriminator_fake_loss=1.083, generator_loss=31.19, generator_mel_loss=21.51, generator_kl_loss=1.841, generator_dur_loss=1.715, generator_adv_loss=2.539, generator_feat_match_loss=3.582, over 51.00 samples.], tot_loss[discriminator_loss=2.577, discriminator_real_loss=1.3, discriminator_fake_loss=1.277, generator_loss=31.09, generator_mel_loss=21.76, generator_kl_loss=1.955, generator_dur_loss=1.712, generator_adv_loss=2.28, generator_feat_match_loss=3.385, over 1539.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 09:36:31,122 INFO [train.py:811] (0/4) Start epoch 215
2023-11-13 09:39:38,443 INFO [train.py:467] (0/4) Epoch 215, batch 32, global_batch_idx: 7950, batch size: 95, loss[discriminator_loss=2.875, discriminator_real_loss=1.317, discriminator_fake_loss=1.559, generator_loss=30.7, generator_mel_loss=21.79, generator_kl_loss=1.885, generator_dur_loss=1.709, generator_adv_loss=2.318, generator_feat_match_loss=3, over 95.00 samples.], tot_loss[discriminator_loss=2.637, discriminator_real_loss=1.321, discriminator_fake_loss=1.316, generator_loss=30.69, generator_mel_loss=21.84, generator_kl_loss=1.936, generator_dur_loss=1.714, generator_adv_loss=2.162, generator_feat_match_loss=3.035, over 2337.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 8.0
2023-11-13 09:40:02,014 INFO [train.py:811] (0/4) Start epoch 216
2023-11-13 09:43:37,349 INFO [train.py:811] (0/4) Start epoch 217
2023-11-13 09:44:42,976 INFO [train.py:467] (0/4) Epoch 217, batch 8, global_batch_idx: 8000, batch size: 126, loss[discriminator_loss=2.77, discriminator_real_loss=1.472, discriminator_fake_loss=1.297, generator_loss=31.39, generator_mel_loss=22.22, generator_kl_loss=1.969, generator_dur_loss=1.709, generator_adv_loss=2.443, generator_feat_match_loss=3.057, over 126.00 samples.], tot_loss[discriminator_loss=2.68, discriminator_real_loss=1.374, discriminator_fake_loss=1.306, generator_loss=30.65, generator_mel_loss=21.79, generator_kl_loss=1.935, generator_dur_loss=1.716, generator_adv_loss=2.197, generator_feat_match_loss=3.004, over 626.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 09:44:43,528 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 09:44:53,740 INFO [train.py:517] (0/4) Epoch 217, validation: discriminator_loss=2.619, discriminator_real_loss=1.454, discriminator_fake_loss=1.165, generator_loss=31.28, generator_mel_loss=22.22, generator_kl_loss=1.973, generator_dur_loss=1.683, generator_adv_loss=2.305, generator_feat_match_loss=3.1, over 100.00 samples.
2023-11-13 09:44:53,741 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB
2023-11-13 09:47:21,192 INFO [train.py:811] (0/4) Start epoch 218
2023-11-13 09:49:24,602 INFO [train.py:467] (0/4) Epoch 218, batch 21, global_batch_idx: 8050, batch size: 56, loss[discriminator_loss=2.766, discriminator_real_loss=1.639, discriminator_fake_loss=1.127, generator_loss=30.86, generator_mel_loss=21.54, generator_kl_loss=2.063, generator_dur_loss=1.712, generator_adv_loss=2.277, generator_feat_match_loss=3.266, over 56.00 samples.], tot_loss[discriminator_loss=2.554, discriminator_real_loss=1.283, discriminator_fake_loss=1.271, generator_loss=30.96, generator_mel_loss=21.54, generator_kl_loss=1.937, generator_dur_loss=1.716, generator_adv_loss=2.3, generator_feat_match_loss=3.466, over 1396.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 09:50:44,483 INFO [train.py:811] (0/4) Start epoch 219
2023-11-13 09:54:06,214 INFO [train.py:467] (0/4) Epoch 219, batch 34, global_batch_idx: 8100, batch size: 55, loss[discriminator_loss=2.354, discriminator_real_loss=1.098, discriminator_fake_loss=1.256, generator_loss=31.67, generator_mel_loss=21.26, generator_kl_loss=2.024, generator_dur_loss=1.735, generator_adv_loss=2.564, generator_feat_match_loss=4.078, over 55.00 samples.], tot_loss[discriminator_loss=2.576, discriminator_real_loss=1.303, discriminator_fake_loss=1.274, generator_loss=30.89, generator_mel_loss=21.73, generator_kl_loss=1.969, generator_dur_loss=1.714, generator_adv_loss=2.214, generator_feat_match_loss=3.258, over 2569.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 09:54:15,014 INFO [train.py:811] (0/4) Start epoch 220
2023-11-13 09:57:43,784 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-220.pt
2023-11-13 09:57:46,925 INFO [train.py:811] (0/4) Start epoch 221
2023-11-13 09:58:59,655 INFO [train.py:467] (0/4) Epoch 221, batch 10, global_batch_idx: 8150, batch size: 53, loss[discriminator_loss=2.629, discriminator_real_loss=1.413, discriminator_fake_loss=1.217, generator_loss=29.64, generator_mel_loss=21.03, generator_kl_loss=1.864, generator_dur_loss=1.709, generator_adv_loss=2.217, generator_feat_match_loss=2.818, over 53.00 samples.], tot_loss[discriminator_loss=2.595, discriminator_real_loss=1.261, discriminator_fake_loss=1.334, generator_loss=30.61, generator_mel_loss=21.42, generator_kl_loss=1.929, generator_dur_loss=1.707, generator_adv_loss=2.264, generator_feat_match_loss=3.29, over 712.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 10:01:15,649 INFO [train.py:811] (0/4) Start epoch 222
2023-11-13 10:03:37,497 INFO [train.py:467] (0/4) Epoch 222, batch 23, global_batch_idx: 8200, batch size: 61, loss[discriminator_loss=2.93, discriminator_real_loss=1.45, discriminator_fake_loss=1.479, generator_loss=30.38, generator_mel_loss=21.96, generator_kl_loss=1.973, generator_dur_loss=1.712, generator_adv_loss=2, generator_feat_match_loss=2.734, over 61.00 samples.], tot_loss[discriminator_loss=2.623, discriminator_real_loss=1.302, discriminator_fake_loss=1.322, generator_loss=31.03, generator_mel_loss=21.8, generator_kl_loss=1.958, generator_dur_loss=1.716, generator_adv_loss=2.255, generator_feat_match_loss=3.305, over 1599.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 10:03:37,990 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 10:03:48,042 INFO [train.py:517] (0/4) Epoch 222, validation: discriminator_loss=2.788, discriminator_real_loss=1.489, discriminator_fake_loss=1.298, generator_loss=31.44, generator_mel_loss=22.81, generator_kl_loss=1.963, generator_dur_loss=1.681, generator_adv_loss=2.115, generator_feat_match_loss=2.867, over 100.00 samples.
2023-11-13 10:03:48,042 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB
2023-11-13 10:05:01,440 INFO [train.py:811] (0/4) Start epoch 223
2023-11-13 10:08:33,693 INFO [train.py:467] (0/4) Epoch 223, batch 36, global_batch_idx: 8250, batch size: 79, loss[discriminator_loss=2.354, discriminator_real_loss=1.109, discriminator_fake_loss=1.244, generator_loss=31.59, generator_mel_loss=21.41, generator_kl_loss=1.988, generator_dur_loss=1.706, generator_adv_loss=2.426, generator_feat_match_loss=4.059, over 79.00 samples.], tot_loss[discriminator_loss=2.657, discriminator_real_loss=1.352, discriminator_fake_loss=1.305, generator_loss=30.64, generator_mel_loss=21.7, generator_kl_loss=1.96, generator_dur_loss=1.708, generator_adv_loss=2.159, generator_feat_match_loss=3.121, over 2880.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2023-11-13 10:08:35,086 INFO [train.py:811] (0/4) Start epoch 224
2023-11-13 10:12:11,058 INFO [train.py:811] (0/4) Start epoch 225
2023-11-13 10:13:30,014 INFO [train.py:467] (0/4) Epoch 225, batch 12, global_batch_idx: 8300, batch size: 55, loss[discriminator_loss=2.699, discriminator_real_loss=1.209, discriminator_fake_loss=1.489, generator_loss=30.1, generator_mel_loss=21.79, generator_kl_loss=1.952, generator_dur_loss=1.741, generator_adv_loss=1.846, generator_feat_match_loss=2.77, over 55.00 samples.], tot_loss[discriminator_loss=2.622, discriminator_real_loss=1.297, discriminator_fake_loss=1.326, generator_loss=30.16, generator_mel_loss=21.56, generator_kl_loss=1.95, generator_dur_loss=1.714, generator_adv_loss=2.047, generator_feat_match_loss=2.892, over 971.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 10:15:42,818 INFO [train.py:811] (0/4) Start epoch 226
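cur_lr_g and cur_lr_d move in lockstep and decay very slowly: 1.96e-04 through roughly epoch 183, 1.95e-04 from epoch 184, and 1.94e-04 from epoch 225 onward. That is consistent with a per-epoch exponential schedule whose decay factor is very close to 1, applied identically to the generator and discriminator optimizers. The sketch below assumes torch.optim.lr_scheduler.ExponentialLR with a gamma of 0.999875, a value chosen here only because it reproduces roughly this trajectory from a 2e-4 starting point; the actual constant is not visible in this part of the log:

    import torch

    gen, disc = torch.nn.Linear(8, 8), torch.nn.Linear(8, 8)  # stand-ins
    opt_g = torch.optim.AdamW(gen.parameters(), lr=2e-4)
    opt_d = torch.optim.AdamW(disc.parameters(), lr=2e-4)

    GAMMA = 0.999875  # assumed; 2e-4 * GAMMA**225 ~= 1.94e-04
    sched_g = torch.optim.lr_scheduler.ExponentialLR(opt_g, gamma=GAMMA)
    sched_d = torch.optim.lr_scheduler.ExponentialLR(opt_d, gamma=GAMMA)

    for epoch in range(1, 226):
        # ... one epoch of generator/discriminator updates would run here ...
        sched_g.step()  # one decay step per epoch for both optimizers
        sched_d.step()
    print(f"{sched_g.get_last_lr()[0]:.2e}")  # ~1.94e-04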
2023-11-13 10:18:17,245 INFO [train.py:467] (0/4) Epoch 226, batch 25, global_batch_idx: 8350, batch size: 101, loss[discriminator_loss=2.678, discriminator_real_loss=1.197, discriminator_fake_loss=1.48, generator_loss=30.48, generator_mel_loss=21.66, generator_kl_loss=2.052, generator_dur_loss=1.684, generator_adv_loss=2.156, generator_feat_match_loss=2.934, over 101.00 samples.], tot_loss[discriminator_loss=2.589, discriminator_real_loss=1.293, discriminator_fake_loss=1.296, generator_loss=30.84, generator_mel_loss=21.78, generator_kl_loss=1.946, generator_dur_loss=1.71, generator_adv_loss=2.201, generator_feat_match_loss=3.198, over 2034.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 10:19:17,377 INFO [train.py:811] (0/4) Start epoch 227
2023-11-13 10:22:52,844 INFO [train.py:811] (0/4) Start epoch 228
2023-11-13 10:23:13,932 INFO [train.py:467] (0/4) Epoch 228, batch 1, global_batch_idx: 8400, batch size: 71, loss[discriminator_loss=2.605, discriminator_real_loss=1.275, discriminator_fake_loss=1.329, generator_loss=30.26, generator_mel_loss=21.62, generator_kl_loss=2.027, generator_dur_loss=1.689, generator_adv_loss=2.053, generator_feat_match_loss=2.865, over 71.00 samples.], tot_loss[discriminator_loss=2.6, discriminator_real_loss=1.343, discriminator_fake_loss=1.257, generator_loss=30.44, generator_mel_loss=21.63, generator_kl_loss=1.996, generator_dur_loss=1.696, generator_adv_loss=2.136, generator_feat_match_loss=2.978, over 140.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 32.0
2023-11-13 10:23:14,373 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 10:23:25,527 INFO [train.py:517] (0/4) Epoch 228, validation: discriminator_loss=2.709, discriminator_real_loss=1.193, discriminator_fake_loss=1.516, generator_loss=29.8, generator_mel_loss=22.04, generator_kl_loss=1.973, generator_dur_loss=1.681, generator_adv_loss=1.645, generator_feat_match_loss=2.456, over 100.00 samples.
2023-11-13 10:23:25,527 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB
2023-11-13 10:26:38,287 INFO [train.py:811] (0/4) Start epoch 229
2023-11-13 10:27:57,103 INFO [train.py:467] (0/4) Epoch 229, batch 14, global_batch_idx: 8450, batch size: 73, loss[discriminator_loss=2.367, discriminator_real_loss=1.226, discriminator_fake_loss=1.143, generator_loss=31.42, generator_mel_loss=21.01, generator_kl_loss=1.96, generator_dur_loss=1.715, generator_adv_loss=2.557, generator_feat_match_loss=4.172, over 73.00 samples.], tot_loss[discriminator_loss=2.527, discriminator_real_loss=1.273, discriminator_fake_loss=1.255, generator_loss=31.11, generator_mel_loss=21.71, generator_kl_loss=1.941, generator_dur_loss=1.714, generator_adv_loss=2.275, generator_feat_match_loss=3.476, over 954.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 10:30:04,478 INFO [train.py:811] (0/4) Start epoch 230
2023-11-13 10:32:46,502 INFO [train.py:467] (0/4) Epoch 230, batch 27, global_batch_idx: 8500, batch size: 52, loss[discriminator_loss=2.551, discriminator_real_loss=1.318, discriminator_fake_loss=1.231, generator_loss=31.39, generator_mel_loss=21.42, generator_kl_loss=1.933, generator_dur_loss=1.699, generator_adv_loss=2.543, generator_feat_match_loss=3.789, over 52.00 samples.], tot_loss[discriminator_loss=2.603, discriminator_real_loss=1.302, discriminator_fake_loss=1.301, generator_loss=30.79, generator_mel_loss=21.68, generator_kl_loss=1.967, generator_dur_loss=1.712, generator_adv_loss=2.199, generator_feat_match_loss=3.235, over 1856.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 10:33:39,014 INFO [train.py:811] (0/4) Start epoch 231
2023-11-13 10:37:09,253 INFO [train.py:811] (0/4) Start epoch 232
2023-11-13 10:37:40,334 INFO [train.py:467] (0/4) Epoch 232, batch 3, global_batch_idx: 8550, batch size: 81, loss[discriminator_loss=2.668, discriminator_real_loss=1.375, discriminator_fake_loss=1.294, generator_loss=30.03, generator_mel_loss=21.71, generator_kl_loss=1.966, generator_dur_loss=1.718, generator_adv_loss=1.935, generator_feat_match_loss=2.705, over 81.00 samples.], tot_loss[discriminator_loss=2.665, discriminator_real_loss=1.352, discriminator_fake_loss=1.313, generator_loss=30.35, generator_mel_loss=21.9, generator_kl_loss=1.966, generator_dur_loss=1.709, generator_adv_loss=1.979, generator_feat_match_loss=2.797, over 254.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 10:40:43,536 INFO [train.py:811] (0/4) Start epoch 233
2023-11-13 10:42:29,914 INFO [train.py:467] (0/4) Epoch 233, batch 16, global_batch_idx: 8600, batch size: 69, loss[discriminator_loss=2.352, discriminator_real_loss=1.223, discriminator_fake_loss=1.129, generator_loss=31.96, generator_mel_loss=21.48, generator_kl_loss=1.939, generator_dur_loss=1.68, generator_adv_loss=2.857, generator_feat_match_loss=4, over 69.00 samples.], tot_loss[discriminator_loss=2.768, discriminator_real_loss=1.528, discriminator_fake_loss=1.239, generator_loss=31.48, generator_mel_loss=21.6, generator_kl_loss=1.949, generator_dur_loss=1.698, generator_adv_loss=2.559, generator_feat_match_loss=3.672, over 1361.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 10:42:30,403 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 10:42:41,017 INFO [train.py:517] (0/4) Epoch 233, validation: discriminator_loss=2.891, discriminator_real_loss=1.424, discriminator_fake_loss=1.467, generator_loss=33.18, generator_mel_loss=22.47, generator_kl_loss=2.009, generator_dur_loss=1.672, generator_adv_loss=2.831, generator_feat_match_loss=4.196, over 100.00 samples.
2023-11-13 10:42:41,018 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27135MB
2023-11-13 10:44:22,937 INFO [train.py:811] (0/4) Start epoch 234
2023-11-13 10:47:18,927 INFO [train.py:467] (0/4) Epoch 234, batch 29, global_batch_idx: 8650, batch size: 90, loss[discriminator_loss=2.812, discriminator_real_loss=1.445, discriminator_fake_loss=1.367, generator_loss=30.39, generator_mel_loss=21.46, generator_kl_loss=1.953, generator_dur_loss=1.724, generator_adv_loss=2.285, generator_feat_match_loss=2.975, over 90.00 samples.], tot_loss[discriminator_loss=2.787, discriminator_real_loss=1.433, discriminator_fake_loss=1.354, generator_loss=29.42, generator_mel_loss=21.28, generator_kl_loss=1.928, generator_dur_loss=1.705, generator_adv_loss=1.971, generator_feat_match_loss=2.532, over 2211.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 10:47:57,117 INFO [train.py:811] (0/4) Start epoch 235
2023-11-13 10:51:26,116 INFO [train.py:811] (0/4) Start epoch 236
2023-11-13 10:52:11,343 INFO [train.py:467] (0/4) Epoch 236, batch 5, global_batch_idx: 8700, batch size: 90, loss[discriminator_loss=2.668, discriminator_real_loss=1.273, discriminator_fake_loss=1.395, generator_loss=30.11, generator_mel_loss=21.53, generator_kl_loss=1.988, generator_dur_loss=1.69, generator_adv_loss=1.983, generator_feat_match_loss=2.926, over 90.00 samples.], tot_loss[discriminator_loss=2.764, discriminator_real_loss=1.383, discriminator_fake_loss=1.381, generator_loss=29.97, generator_mel_loss=21.58, generator_kl_loss=1.958, generator_dur_loss=1.7, generator_adv_loss=1.953, generator_feat_match_loss=2.778, over 618.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 10:54:56,136 INFO [train.py:811] (0/4) Start epoch 237
2023-11-13 10:56:45,522 INFO [train.py:467] (0/4) Epoch 237, batch 18, global_batch_idx: 8750, batch size: 101, loss[discriminator_loss=2.723, discriminator_real_loss=1.156, discriminator_fake_loss=1.567, generator_loss=29.81, generator_mel_loss=21.37, generator_kl_loss=1.912, generator_dur_loss=1.681, generator_adv_loss=2.172, generator_feat_match_loss=2.672, over 101.00 samples.], tot_loss[discriminator_loss=2.702, discriminator_real_loss=1.368, discriminator_fake_loss=1.334, generator_loss=30.23, generator_mel_loss=21.63, generator_kl_loss=1.925, generator_dur_loss=1.705, generator_adv_loss=2.077, generator_feat_match_loss=2.895, over 1359.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 10:58:28,238 INFO [train.py:811] (0/4) Start epoch 238
2023-11-13 11:01:28,606 INFO [train.py:467] (0/4) Epoch 238, batch 31, global_batch_idx: 8800, batch size: 85, loss[discriminator_loss=2.732, discriminator_real_loss=1.399, discriminator_fake_loss=1.333, generator_loss=30.34, generator_mel_loss=21.65, generator_kl_loss=1.914, generator_dur_loss=1.71, generator_adv_loss=2.098, generator_feat_match_loss=2.969, over 85.00 samples.], tot_loss[discriminator_loss=2.665, discriminator_real_loss=1.348, discriminator_fake_loss=1.317, generator_loss=30.26, generator_mel_loss=21.69, generator_kl_loss=1.944, generator_dur_loss=1.707, generator_adv_loss=2.022, generator_feat_match_loss=2.898, over 2218.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 32.0
2023-11-13 11:01:29,145 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 11:01:39,346 INFO [train.py:517] (0/4) Epoch 238, validation: discriminator_loss=2.529, discriminator_real_loss=1.238, discriminator_fake_loss=1.29, generator_loss=31.39, generator_mel_loss=22.59, generator_kl_loss=1.976, generator_dur_loss=1.667, generator_adv_loss=1.969, generator_feat_match_loss=3.19, over 100.00 samples.
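Within each batch entry, loss[...] covers only the current batch, while tot_loss[...] is a sample-weighted running average over the epoch so far: its "over N samples" counter climbs between reports (140.00 samples at epoch 228, batch 1; 954.00 by epoch 229, batch 14; 2211.00 by epoch 234, batch 29) and starts over with each new epoch. A hypothetical sketch of such an accumulator, not the script's actual implementation:

    from collections import defaultdict
    from typing import Dict

    class RunningLoss:
        """Sample-weighted running average over one epoch (illustrative)."""

        def __init__(self) -> None:
            self.sums: Dict[str, float] = defaultdict(float)
            self.num_samples = 0.0

        def update(self, batch_losses: Dict[str, float], batch_size: int) -> None:
            for name, value in batch_losses.items():
                self.sums[name] += value * batch_size  # weight each batch by its size
            self.num_samples += batch_size

        def average(self) -> Dict[str, float]:
            return {k: v / self.num_samples for k, v in self.sums.items()}

    tot = RunningLoss()
    tot.update({"generator_mel_loss": 21.62}, 71)
    tot.update({"generator_mel_loss": 21.01}, 73)
    print(tot.average(), f"over {tot.num_samples:.2f} samples")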
2023-11-13 11:01:39,347 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB 2023-11-13 11:02:11,948 INFO [train.py:811] (0/4) Start epoch 239 2023-11-13 11:05:43,871 INFO [train.py:811] (0/4) Start epoch 240 2023-11-13 11:06:41,792 INFO [train.py:467] (0/4) Epoch 240, batch 7, global_batch_idx: 8850, batch size: 81, loss[discriminator_loss=2.629, discriminator_real_loss=1.151, discriminator_fake_loss=1.479, generator_loss=31.32, generator_mel_loss=21.74, generator_kl_loss=1.937, generator_dur_loss=1.693, generator_adv_loss=2.412, generator_feat_match_loss=3.531, over 81.00 samples.], tot_loss[discriminator_loss=2.695, discriminator_real_loss=1.343, discriminator_fake_loss=1.352, generator_loss=30.84, generator_mel_loss=21.92, generator_kl_loss=1.954, generator_dur_loss=1.692, generator_adv_loss=2.219, generator_feat_match_loss=3.056, over 634.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0 2023-11-13 11:09:14,894 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-240.pt 2023-11-13 11:09:18,047 INFO [train.py:811] (0/4) Start epoch 241 2023-11-13 11:11:24,357 INFO [train.py:467] (0/4) Epoch 241, batch 20, global_batch_idx: 8900, batch size: 63, loss[discriminator_loss=2.895, discriminator_real_loss=1.603, discriminator_fake_loss=1.293, generator_loss=29.55, generator_mel_loss=21.43, generator_kl_loss=2.003, generator_dur_loss=1.703, generator_adv_loss=1.759, generator_feat_match_loss=2.654, over 63.00 samples.], tot_loss[discriminator_loss=2.781, discriminator_real_loss=1.432, discriminator_fake_loss=1.348, generator_loss=30.22, generator_mel_loss=21.44, generator_kl_loss=1.932, generator_dur_loss=1.708, generator_adv_loss=2.146, generator_feat_match_loss=2.994, over 1526.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0 2023-11-13 11:12:43,766 INFO [train.py:811] (0/4) Start epoch 242 2023-11-13 11:15:59,096 INFO [train.py:467] (0/4) Epoch 242, batch 33, global_batch_idx: 8950, batch size: 73, loss[discriminator_loss=2.719, discriminator_real_loss=1.271, discriminator_fake_loss=1.448, generator_loss=30.39, generator_mel_loss=21.8, generator_kl_loss=1.876, generator_dur_loss=1.686, generator_adv_loss=2.117, generator_feat_match_loss=2.908, over 73.00 samples.], tot_loss[discriminator_loss=2.687, discriminator_real_loss=1.363, discriminator_fake_loss=1.324, generator_loss=30.28, generator_mel_loss=21.76, generator_kl_loss=1.967, generator_dur_loss=1.701, generator_adv_loss=2.005, generator_feat_match_loss=2.853, over 2409.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0 2023-11-13 11:16:14,370 INFO [train.py:811] (0/4) Start epoch 243 2023-11-13 11:19:44,042 INFO [train.py:811] (0/4) Start epoch 244 2023-11-13 11:20:50,558 INFO [train.py:467] (0/4) Epoch 244, batch 9, global_batch_idx: 9000, batch size: 81, loss[discriminator_loss=2.67, discriminator_real_loss=1.458, discriminator_fake_loss=1.212, generator_loss=29.75, generator_mel_loss=21.25, generator_kl_loss=1.952, generator_dur_loss=1.684, generator_adv_loss=2.072, generator_feat_match_loss=2.789, over 81.00 samples.], tot_loss[discriminator_loss=2.715, discriminator_real_loss=1.362, discriminator_fake_loss=1.352, generator_loss=30.63, generator_mel_loss=21.87, generator_kl_loss=1.916, generator_dur_loss=1.693, generator_adv_loss=2.163, generator_feat_match_loss=2.982, over 891.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0 2023-11-13 11:20:51,089 INFO [train.py:508] (0/4) Computing 
2023-11-13 11:20:51,089 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 11:21:01,412 INFO [train.py:517] (0/4) Epoch 244, validation: discriminator_loss=2.725, discriminator_real_loss=1.361, discriminator_fake_loss=1.365, generator_loss=30.89, generator_mel_loss=22.52, generator_kl_loss=2.073, generator_dur_loss=1.669, generator_adv_loss=1.862, generator_feat_match_loss=2.76, over 100.00 samples.
2023-11-13 11:21:01,413 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB
2023-11-13 11:23:20,640 INFO [train.py:811] (0/4) Start epoch 245
2023-11-13 11:25:37,306 INFO [train.py:467] (0/4) Epoch 245, batch 22, global_batch_idx: 9050, batch size: 60, loss[discriminator_loss=2.664, discriminator_real_loss=1.312, discriminator_fake_loss=1.353, generator_loss=30.85, generator_mel_loss=22.43, generator_kl_loss=1.949, generator_dur_loss=1.713, generator_adv_loss=2.049, generator_feat_match_loss=2.705, over 60.00 samples.], tot_loss[discriminator_loss=2.671, discriminator_real_loss=1.345, discriminator_fake_loss=1.326, generator_loss=30.36, generator_mel_loss=21.77, generator_kl_loss=1.937, generator_dur_loss=1.704, generator_adv_loss=2.041, generator_feat_match_loss=2.912, over 1631.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 11:26:54,860 INFO [train.py:811] (0/4) Start epoch 246
2023-11-13 11:30:22,506 INFO [train.py:467] (0/4) Epoch 246, batch 35, global_batch_idx: 9100, batch size: 82, loss[discriminator_loss=2.684, discriminator_real_loss=1.559, discriminator_fake_loss=1.125, generator_loss=31.36, generator_mel_loss=21.6, generator_kl_loss=1.971, generator_dur_loss=1.708, generator_adv_loss=2.383, generator_feat_match_loss=3.691, over 82.00 samples.], tot_loss[discriminator_loss=2.645, discriminator_real_loss=1.327, discriminator_fake_loss=1.317, generator_loss=30.74, generator_mel_loss=21.61, generator_kl_loss=1.929, generator_dur_loss=1.701, generator_adv_loss=2.21, generator_feat_match_loss=3.291, over 2793.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 11:30:31,451 INFO [train.py:811] (0/4) Start epoch 247
2023-11-13 11:34:00,834 INFO [train.py:811] (0/4) Start epoch 248
2023-11-13 11:35:20,479 INFO [train.py:467] (0/4) Epoch 248, batch 11, global_batch_idx: 9150, batch size: 69, loss[discriminator_loss=2.938, discriminator_real_loss=1.253, discriminator_fake_loss=1.685, generator_loss=28.86, generator_mel_loss=20.81, generator_kl_loss=1.979, generator_dur_loss=1.703, generator_adv_loss=1.854, generator_feat_match_loss=2.516, over 69.00 samples.], tot_loss[discriminator_loss=2.599, discriminator_real_loss=1.28, discriminator_fake_loss=1.319, generator_loss=30.6, generator_mel_loss=21.41, generator_kl_loss=1.944, generator_dur_loss=1.707, generator_adv_loss=2.184, generator_feat_match_loss=3.354, over 897.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 11:37:36,986 INFO [train.py:811] (0/4) Start epoch 249
2023-11-13 11:40:01,577 INFO [train.py:467] (0/4) Epoch 249, batch 24, global_batch_idx: 9200, batch size: 153, loss[discriminator_loss=2.715, discriminator_real_loss=1.418, discriminator_fake_loss=1.297, generator_loss=30.6, generator_mel_loss=21.75, generator_kl_loss=1.936, generator_dur_loss=1.677, generator_adv_loss=2.178, generator_feat_match_loss=3.053, over 153.00 samples.], tot_loss[discriminator_loss=2.627, discriminator_real_loss=1.328, discriminator_fake_loss=1.299, generator_loss=30.47, generator_mel_loss=21.48, generator_kl_loss=1.936, generator_dur_loss=1.7, generator_adv_loss=2.157, generator_feat_match_loss=3.203, over 1966.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 32.0
2023-11-13 11:40:02,126 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 11:40:12,656 INFO [train.py:517] (0/4) Epoch 249, validation: discriminator_loss=2.564, discriminator_real_loss=1.379, discriminator_fake_loss=1.185, generator_loss=31.21, generator_mel_loss=22.41, generator_kl_loss=1.995, generator_dur_loss=1.668, generator_adv_loss=2.13, generator_feat_match_loss=3.014, over 100.00 samples.
2023-11-13 11:40:12,657 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB
2023-11-13 11:41:14,930 INFO [train.py:811] (0/4) Start epoch 250
2023-11-13 11:44:43,405 INFO [train.py:811] (0/4) Start epoch 251
2023-11-13 11:44:59,085 INFO [train.py:467] (0/4) Epoch 251, batch 0, global_batch_idx: 9250, batch size: 69, loss[discriminator_loss=2.574, discriminator_real_loss=1.508, discriminator_fake_loss=1.065, generator_loss=31.12, generator_mel_loss=21.4, generator_kl_loss=1.933, generator_dur_loss=1.714, generator_adv_loss=2.393, generator_feat_match_loss=3.682, over 69.00 samples.], tot_loss[discriminator_loss=2.574, discriminator_real_loss=1.508, discriminator_fake_loss=1.065, generator_loss=31.12, generator_mel_loss=21.4, generator_kl_loss=1.933, generator_dur_loss=1.714, generator_adv_loss=2.393, generator_feat_match_loss=3.682, over 69.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 11:48:15,338 INFO [train.py:811] (0/4) Start epoch 252
2023-11-13 11:49:38,868 INFO [train.py:467] (0/4) Epoch 252, batch 13, global_batch_idx: 9300, batch size: 90, loss[discriminator_loss=2.562, discriminator_real_loss=1.342, discriminator_fake_loss=1.222, generator_loss=31.43, generator_mel_loss=22.05, generator_kl_loss=1.918, generator_dur_loss=1.685, generator_adv_loss=2.238, generator_feat_match_loss=3.539, over 90.00 samples.], tot_loss[discriminator_loss=2.637, discriminator_real_loss=1.33, discriminator_fake_loss=1.307, generator_loss=30.52, generator_mel_loss=21.75, generator_kl_loss=1.954, generator_dur_loss=1.705, generator_adv_loss=2.069, generator_feat_match_loss=3.04, over 1055.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 11:51:46,597 INFO [train.py:811] (0/4) Start epoch 253
2023-11-13 11:54:37,190 INFO [train.py:467] (0/4) Epoch 253, batch 26, global_batch_idx: 9350, batch size: 58, loss[discriminator_loss=2.807, discriminator_real_loss=1.183, discriminator_fake_loss=1.624, generator_loss=30.2, generator_mel_loss=21.5, generator_kl_loss=1.926, generator_dur_loss=1.702, generator_adv_loss=2.193, generator_feat_match_loss=2.885, over 58.00 samples.], tot_loss[discriminator_loss=2.6, discriminator_real_loss=1.311, discriminator_fake_loss=1.289, generator_loss=30.83, generator_mel_loss=21.46, generator_kl_loss=1.941, generator_dur_loss=1.698, generator_adv_loss=2.31, generator_feat_match_loss=3.416, over 1920.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 11:55:23,985 INFO [train.py:811] (0/4) Start epoch 254
2023-11-13 11:58:55,773 INFO [train.py:811] (0/4) Start epoch 255
2023-11-13 11:59:23,499 INFO [train.py:467] (0/4) Epoch 255, batch 2, global_batch_idx: 9400, batch size: 55, loss[discriminator_loss=2.734, discriminator_real_loss=1.338, discriminator_fake_loss=1.396, generator_loss=29.93, generator_mel_loss=21.54, generator_kl_loss=2.057, generator_dur_loss=1.732, generator_adv_loss=1.899, generator_feat_match_loss=2.701, over 55.00 samples.], tot_loss[discriminator_loss=2.62, discriminator_real_loss=1.265, discriminator_fake_loss=1.355, generator_loss=30.35, generator_mel_loss=21.56, generator_kl_loss=2.001, generator_dur_loss=1.705, generator_adv_loss=2.074, generator_feat_match_loss=3.005, over 182.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
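
The grad_scale field is the dynamic loss scale of the fp16 training, which is why it only ever takes power-of-two values and oscillates between 16.0 and 32.0 in this stretch: the scaler doubles the scale after a long enough run of overflow-free steps and halves it whenever a step overflows. A sketch of the standard PyTorch mechanism (the exact wiring in train.py may differ):

    import torch

    scaler = torch.cuda.amp.GradScaler()

    def training_step(loss, optimizer):
        optimizer.zero_grad()
        scaler.scale(loss).backward()
        scaler.step(optimizer)     # skipped internally if the gradients overflowed
        scaler.update()            # grows the scale periodically, halves it on overflow
        return scaler.get_scale()  # the value logged above as grad_scale
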
2023-11-13 11:59:23,992 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 11:59:35,607 INFO [train.py:517] (0/4) Epoch 255, validation: discriminator_loss=2.662, discriminator_real_loss=1.208, discriminator_fake_loss=1.454, generator_loss=31.34, generator_mel_loss=22.31, generator_kl_loss=2.044, generator_dur_loss=1.678, generator_adv_loss=2.136, generator_feat_match_loss=3.177, over 100.00 samples.
2023-11-13 11:59:35,608 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB
2023-11-13 12:02:43,982 INFO [train.py:811] (0/4) Start epoch 256
2023-11-13 12:04:14,554 INFO [train.py:467] (0/4) Epoch 256, batch 15, global_batch_idx: 9450, batch size: 110, loss[discriminator_loss=2.561, discriminator_real_loss=1.289, discriminator_fake_loss=1.271, generator_loss=31.01, generator_mel_loss=21.77, generator_kl_loss=1.984, generator_dur_loss=1.691, generator_adv_loss=2.371, generator_feat_match_loss=3.199, over 110.00 samples.], tot_loss[discriminator_loss=2.671, discriminator_real_loss=1.35, discriminator_fake_loss=1.321, generator_loss=30.23, generator_mel_loss=21.46, generator_kl_loss=1.952, generator_dur_loss=1.698, generator_adv_loss=2.124, generator_feat_match_loss=3, over 1168.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 12:06:17,364 INFO [train.py:811] (0/4) Start epoch 257
2023-11-13 12:09:11,862 INFO [train.py:467] (0/4) Epoch 257, batch 28, global_batch_idx: 9500, batch size: 55, loss[discriminator_loss=2.578, discriminator_real_loss=1.256, discriminator_fake_loss=1.321, generator_loss=30.3, generator_mel_loss=21.07, generator_kl_loss=1.878, generator_dur_loss=1.72, generator_adv_loss=2.088, generator_feat_match_loss=3.541, over 55.00 samples.], tot_loss[discriminator_loss=2.678, discriminator_real_loss=1.368, discriminator_fake_loss=1.31, generator_loss=30.67, generator_mel_loss=21.67, generator_kl_loss=1.95, generator_dur_loss=1.699, generator_adv_loss=2.163, generator_feat_match_loss=3.192, over 2179.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 12:09:52,305 INFO [train.py:811] (0/4) Start epoch 258
2023-11-13 12:13:22,967 INFO [train.py:811] (0/4) Start epoch 259
2023-11-13 12:14:01,654 INFO [train.py:467] (0/4) Epoch 259, batch 4, global_batch_idx: 9550, batch size: 85, loss[discriminator_loss=2.52, discriminator_real_loss=1.269, discriminator_fake_loss=1.25, generator_loss=31.56, generator_mel_loss=21.71, generator_kl_loss=1.901, generator_dur_loss=1.711, generator_adv_loss=2.461, generator_feat_match_loss=3.775, over 85.00 samples.], tot_loss[discriminator_loss=2.686, discriminator_real_loss=1.291, discriminator_fake_loss=1.395, generator_loss=30.34, generator_mel_loss=21.4, generator_kl_loss=1.901, generator_dur_loss=1.695, generator_adv_loss=2.088, generator_feat_match_loss=3.256, over 347.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 12:16:53,760 INFO [train.py:811] (0/4) Start epoch 260
2023-11-13 12:18:39,467 INFO [train.py:467] (0/4) Epoch 260, batch 17, global_batch_idx: 9600, batch size: 52, loss[discriminator_loss=2.816, discriminator_real_loss=1.547, discriminator_fake_loss=1.271, generator_loss=29.83, generator_mel_loss=21.77, generator_kl_loss=1.906, generator_dur_loss=1.696, generator_adv_loss=1.856, generator_feat_match_loss=2.602, over 52.00 samples.], tot_loss[discriminator_loss=2.663, discriminator_real_loss=1.341, discriminator_fake_loss=1.321, generator_loss=30.43, generator_mel_loss=21.78, generator_kl_loss=1.934, generator_dur_loss=1.698, generator_adv_loss=2.051, generator_feat_match_loss=2.964, over 1245.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 32.0
2023-11-13 12:18:39,971 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 12:18:51,923 INFO [train.py:517] (0/4) Epoch 260, validation: discriminator_loss=2.6, discriminator_real_loss=1.279, discriminator_fake_loss=1.322, generator_loss=31.63, generator_mel_loss=22.92, generator_kl_loss=1.995, generator_dur_loss=1.666, generator_adv_loss=1.953, generator_feat_match_loss=3.099, over 100.00 samples.
2023-11-13 12:18:51,924 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB
2023-11-13 12:20:37,591 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-260.pt
2023-11-13 12:20:41,017 INFO [train.py:811] (0/4) Start epoch 261
2023-11-13 12:23:39,499 INFO [train.py:467] (0/4) Epoch 261, batch 30, global_batch_idx: 9650, batch size: 101, loss[discriminator_loss=3.047, discriminator_real_loss=1.441, discriminator_fake_loss=1.606, generator_loss=30.11, generator_mel_loss=21.68, generator_kl_loss=1.994, generator_dur_loss=1.695, generator_adv_loss=1.822, generator_feat_match_loss=2.916, over 101.00 samples.], tot_loss[discriminator_loss=2.62, discriminator_real_loss=1.306, discriminator_fake_loss=1.314, generator_loss=30.88, generator_mel_loss=21.51, generator_kl_loss=1.951, generator_dur_loss=1.694, generator_adv_loss=2.268, generator_feat_match_loss=3.459, over 2419.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 12:24:11,383 INFO [train.py:811] (0/4) Start epoch 262
2023-11-13 12:27:37,974 INFO [train.py:811] (0/4) Start epoch 263
2023-11-13 12:28:24,539 INFO [train.py:467] (0/4) Epoch 263, batch 6, global_batch_idx: 9700, batch size: 85, loss[discriminator_loss=2.684, discriminator_real_loss=1.288, discriminator_fake_loss=1.395, generator_loss=30.15, generator_mel_loss=21.25, generator_kl_loss=2.059, generator_dur_loss=1.7, generator_adv_loss=2.174, generator_feat_match_loss=2.959, over 85.00 samples.], tot_loss[discriminator_loss=2.637, discriminator_real_loss=1.323, discriminator_fake_loss=1.313, generator_loss=30.37, generator_mel_loss=21.61, generator_kl_loss=2.008, generator_dur_loss=1.69, generator_adv_loss=2.064, generator_feat_match_loss=2.994, over 625.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 12:31:11,135 INFO [train.py:811] (0/4) Start epoch 264
2023-11-13 12:33:01,577 INFO [train.py:467] (0/4) Epoch 264, batch 19, global_batch_idx: 9750, batch size: 52, loss[discriminator_loss=2.68, discriminator_real_loss=1.398, discriminator_fake_loss=1.282, generator_loss=29.87, generator_mel_loss=21.03, generator_kl_loss=1.971, generator_dur_loss=1.686, generator_adv_loss=2.219, generator_feat_match_loss=2.957, over 52.00 samples.], tot_loss[discriminator_loss=2.659, discriminator_real_loss=1.346, discriminator_fake_loss=1.313, generator_loss=30.25, generator_mel_loss=21.43, generator_kl_loss=1.944, generator_dur_loss=1.696, generator_adv_loss=2.099, generator_feat_match_loss=3.076, over 1354.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 12:34:31,251 INFO [train.py:811] (0/4) Start epoch 265
2023-11-13 12:37:46,631 INFO [train.py:467] (0/4) Epoch 265, batch 32, global_batch_idx: 9800, batch size: 64, loss[discriminator_loss=2.332, discriminator_real_loss=1.109, discriminator_fake_loss=1.224, generator_loss=31.59, generator_mel_loss=21.07, generator_kl_loss=1.948, generator_dur_loss=1.688, generator_adv_loss=2.543, generator_feat_match_loss=4.336, over 64.00 samples.], tot_loss[discriminator_loss=2.642, discriminator_real_loss=1.341, discriminator_fake_loss=1.301, generator_loss=30.81, generator_mel_loss=21.61, generator_kl_loss=1.937, generator_dur_loss=1.7, generator_adv_loss=2.239, generator_feat_match_loss=3.325, over 2297.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2023-11-13 12:37:47,168 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 12:37:57,626 INFO [train.py:517] (0/4) Epoch 265, validation: discriminator_loss=2.392, discriminator_real_loss=1.221, discriminator_fake_loss=1.171, generator_loss=32.52, generator_mel_loss=22.43, generator_kl_loss=2.05, generator_dur_loss=1.678, generator_adv_loss=2.219, generator_feat_match_loss=4.139, over 100.00 samples.
2023-11-13 12:37:57,627 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB
2023-11-13 12:38:15,208 INFO [train.py:811] (0/4) Start epoch 266
2023-11-13 12:41:48,851 INFO [train.py:811] (0/4) Start epoch 267
2023-11-13 12:42:51,443 INFO [train.py:467] (0/4) Epoch 267, batch 8, global_batch_idx: 9850, batch size: 79, loss[discriminator_loss=2.643, discriminator_real_loss=1.37, discriminator_fake_loss=1.272, generator_loss=29.82, generator_mel_loss=21.35, generator_kl_loss=1.858, generator_dur_loss=1.687, generator_adv_loss=1.961, generator_feat_match_loss=2.963, over 79.00 samples.], tot_loss[discriminator_loss=2.634, discriminator_real_loss=1.341, discriminator_fake_loss=1.293, generator_loss=30.29, generator_mel_loss=21.7, generator_kl_loss=1.952, generator_dur_loss=1.694, generator_adv_loss=1.983, generator_feat_match_loss=2.96, over 860.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 12:45:23,329 INFO [train.py:811] (0/4) Start epoch 268
2023-11-13 12:47:39,261 INFO [train.py:467] (0/4) Epoch 268, batch 21, global_batch_idx: 9900, batch size: 81, loss[discriminator_loss=2.781, discriminator_real_loss=1.596, discriminator_fake_loss=1.185, generator_loss=31.03, generator_mel_loss=21.82, generator_kl_loss=1.971, generator_dur_loss=1.691, generator_adv_loss=2.367, generator_feat_match_loss=3.182, over 81.00 samples.], tot_loss[discriminator_loss=2.672, discriminator_real_loss=1.334, discriminator_fake_loss=1.337, generator_loss=30.77, generator_mel_loss=21.67, generator_kl_loss=1.968, generator_dur_loss=1.697, generator_adv_loss=2.189, generator_feat_match_loss=3.241, over 1671.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 12:48:59,495 INFO [train.py:811] (0/4) Start epoch 269
2023-11-13 12:52:21,586 INFO [train.py:467] (0/4) Epoch 269, batch 34, global_batch_idx: 9950, batch size: 85, loss[discriminator_loss=2.387, discriminator_real_loss=1.179, discriminator_fake_loss=1.208, generator_loss=32.3, generator_mel_loss=21.85, generator_kl_loss=1.915, generator_dur_loss=1.699, generator_adv_loss=2.467, generator_feat_match_loss=4.379, over 85.00 samples.], tot_loss[discriminator_loss=2.644, discriminator_real_loss=1.354, discriminator_fake_loss=1.29, generator_loss=30.55, generator_mel_loss=21.48, generator_kl_loss=1.929, generator_dur_loss=1.698, generator_adv_loss=2.191, generator_feat_match_loss=3.259, over 2711.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
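
cur_lr_g and cur_lr_d tick down in lockstep: 1.94e-04 above, 1.93e-04 from global batch 9850 (epoch 267), and 1.92e-04 later in the log, having started from an initial 2e-4. That pace is consistent with a per-epoch exponential decay with a gamma just under 1; 0.999875 is a common choice in VITS recipes and reproduces the logged values, but the exact gamma is an assumption, not something this log states:

    import torch

    # Stand-in parameter; only the schedule arithmetic matters here.
    params = [torch.nn.Parameter(torch.zeros(1))]
    opt = torch.optim.AdamW(params, lr=2.0e-4)
    sched = torch.optim.lr_scheduler.ExponentialLR(opt, gamma=0.999875)  # assumed gamma

    for _ in range(267):  # one scheduler step per completed epoch
        opt.step()
        sched.step()

    print(f"{opt.param_groups[0]['lr']:.2e}")  # 1.93e-04, as logged at epoch 267
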
2023-11-13 12:52:32,463 INFO [train.py:811] (0/4) Start epoch 270
2023-11-13 12:55:58,792 INFO [train.py:811] (0/4) Start epoch 271
2023-11-13 12:57:08,093 INFO [train.py:467] (0/4) Epoch 271, batch 10, global_batch_idx: 10000, batch size: 60, loss[discriminator_loss=2.637, discriminator_real_loss=1.326, discriminator_fake_loss=1.312, generator_loss=30.48, generator_mel_loss=21.49, generator_kl_loss=1.939, generator_dur_loss=1.725, generator_adv_loss=2.129, generator_feat_match_loss=3.193, over 60.00 samples.], tot_loss[discriminator_loss=2.641, discriminator_real_loss=1.343, discriminator_fake_loss=1.298, generator_loss=30.03, generator_mel_loss=21.46, generator_kl_loss=1.928, generator_dur_loss=1.699, generator_adv_loss=2.018, generator_feat_match_loss=2.922, over 740.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 32.0
2023-11-13 12:57:08,642 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 12:57:18,949 INFO [train.py:517] (0/4) Epoch 271, validation: discriminator_loss=2.664, discriminator_real_loss=1.391, discriminator_fake_loss=1.273, generator_loss=31.42, generator_mel_loss=22.54, generator_kl_loss=1.989, generator_dur_loss=1.673, generator_adv_loss=2.122, generator_feat_match_loss=3.091, over 100.00 samples.
2023-11-13 12:57:18,950 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB
2023-11-13 12:59:41,374 INFO [train.py:811] (0/4) Start epoch 272
2023-11-13 13:01:58,791 INFO [train.py:467] (0/4) Epoch 272, batch 23, global_batch_idx: 10050, batch size: 51, loss[discriminator_loss=2.504, discriminator_real_loss=1.286, discriminator_fake_loss=1.218, generator_loss=30.77, generator_mel_loss=21.4, generator_kl_loss=1.903, generator_dur_loss=1.703, generator_adv_loss=2.252, generator_feat_match_loss=3.518, over 51.00 samples.], tot_loss[discriminator_loss=2.636, discriminator_real_loss=1.333, discriminator_fake_loss=1.303, generator_loss=30.46, generator_mel_loss=21.49, generator_kl_loss=1.941, generator_dur_loss=1.701, generator_adv_loss=2.144, generator_feat_match_loss=3.186, over 1546.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 13:03:12,751 INFO [train.py:811] (0/4) Start epoch 273
2023-11-13 13:06:49,433 INFO [train.py:467] (0/4) Epoch 273, batch 36, global_batch_idx: 10100, batch size: 126, loss[discriminator_loss=2.652, discriminator_real_loss=1.299, discriminator_fake_loss=1.354, generator_loss=30.55, generator_mel_loss=21.73, generator_kl_loss=1.961, generator_dur_loss=1.677, generator_adv_loss=1.956, generator_feat_match_loss=3.223, over 126.00 samples.], tot_loss[discriminator_loss=2.665, discriminator_real_loss=1.352, discriminator_fake_loss=1.312, generator_loss=30.44, generator_mel_loss=21.41, generator_kl_loss=1.928, generator_dur_loss=1.694, generator_adv_loss=2.157, generator_feat_match_loss=3.246, over 2850.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 13:06:50,556 INFO [train.py:811] (0/4) Start epoch 274
2023-11-13 13:10:22,941 INFO [train.py:811] (0/4) Start epoch 275
2023-11-13 13:11:43,904 INFO [train.py:467] (0/4) Epoch 275, batch 12, global_batch_idx: 10150, batch size: 61, loss[discriminator_loss=2.805, discriminator_real_loss=1.428, discriminator_fake_loss=1.376, generator_loss=29.62, generator_mel_loss=21.4, generator_kl_loss=1.958, generator_dur_loss=1.7, generator_adv_loss=1.967, generator_feat_match_loss=2.602, over 61.00 samples.], tot_loss[discriminator_loss=2.691, discriminator_real_loss=1.357, discriminator_fake_loss=1.334, generator_loss=30.46, generator_mel_loss=21.52, generator_kl_loss=1.921, generator_dur_loss=1.684, generator_adv_loss=2.106, generator_feat_match_loss=3.231, over 1117.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 13:13:56,078 INFO [train.py:811] (0/4) Start epoch 276
2023-11-13 13:16:19,341 INFO [train.py:467] (0/4) Epoch 276, batch 25, global_batch_idx: 10200, batch size: 51, loss[discriminator_loss=2.58, discriminator_real_loss=1.396, discriminator_fake_loss=1.184, generator_loss=30.98, generator_mel_loss=21.26, generator_kl_loss=2.01, generator_dur_loss=1.712, generator_adv_loss=2.385, generator_feat_match_loss=3.609, over 51.00 samples.], tot_loss[discriminator_loss=2.634, discriminator_real_loss=1.327, discriminator_fake_loss=1.307, generator_loss=30.46, generator_mel_loss=21.55, generator_kl_loss=1.953, generator_dur_loss=1.697, generator_adv_loss=2.115, generator_feat_match_loss=3.14, over 1888.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 13:16:19,833 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 13:16:30,527 INFO [train.py:517] (0/4) Epoch 276, validation: discriminator_loss=2.581, discriminator_real_loss=1.203, discriminator_fake_loss=1.377, generator_loss=31.56, generator_mel_loss=22.84, generator_kl_loss=2.009, generator_dur_loss=1.67, generator_adv_loss=1.857, generator_feat_match_loss=3.182, over 100.00 samples.
2023-11-13 13:16:30,528 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB
2023-11-13 13:17:37,571 INFO [train.py:811] (0/4) Start epoch 277
2023-11-13 13:21:10,520 INFO [train.py:811] (0/4) Start epoch 278
2023-11-13 13:21:33,445 INFO [train.py:467] (0/4) Epoch 278, batch 1, global_batch_idx: 10250, batch size: 85, loss[discriminator_loss=2.684, discriminator_real_loss=1.293, discriminator_fake_loss=1.392, generator_loss=30.08, generator_mel_loss=21.48, generator_kl_loss=1.976, generator_dur_loss=1.705, generator_adv_loss=2.004, generator_feat_match_loss=2.922, over 85.00 samples.], tot_loss[discriminator_loss=2.694, discriminator_real_loss=1.326, discriminator_fake_loss=1.369, generator_loss=29.94, generator_mel_loss=21.44, generator_kl_loss=1.912, generator_dur_loss=1.705, generator_adv_loss=1.977, generator_feat_match_loss=2.911, over 135.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 13:24:40,092 INFO [train.py:811] (0/4) Start epoch 279
2023-11-13 13:26:13,088 INFO [train.py:467] (0/4) Epoch 279, batch 14, global_batch_idx: 10300, batch size: 65, loss[discriminator_loss=2.645, discriminator_real_loss=1.203, discriminator_fake_loss=1.44, generator_loss=31.54, generator_mel_loss=22.01, generator_kl_loss=1.936, generator_dur_loss=1.696, generator_adv_loss=2.426, generator_feat_match_loss=3.469, over 65.00 samples.], tot_loss[discriminator_loss=2.688, discriminator_real_loss=1.35, discriminator_fake_loss=1.337, generator_loss=30.59, generator_mel_loss=21.6, generator_kl_loss=1.951, generator_dur_loss=1.692, generator_adv_loss=2.149, generator_feat_match_loss=3.202, over 1035.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 13:28:15,341 INFO [train.py:811] (0/4) Start epoch 280
2023-11-13 13:30:56,381 INFO [train.py:467] (0/4) Epoch 280, batch 27, global_batch_idx: 10350, batch size: 76, loss[discriminator_loss=2.77, discriminator_real_loss=1.471, discriminator_fake_loss=1.3, generator_loss=30.58, generator_mel_loss=21.97, generator_kl_loss=1.857, generator_dur_loss=1.697, generator_adv_loss=2.207, generator_feat_match_loss=2.85, over 76.00 samples.], tot_loss[discriminator_loss=2.641, discriminator_real_loss=1.35, discriminator_fake_loss=1.291, generator_loss=30.78, generator_mel_loss=21.51, generator_kl_loss=1.944, generator_dur_loss=1.693, generator_adv_loss=2.225, generator_feat_match_loss=3.41, over 2318.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 13:31:43,621 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-280.pt
2023-11-13 13:31:46,780 INFO [train.py:811] (0/4) Start epoch 281
2023-11-13 13:35:19,278 INFO [train.py:811] (0/4) Start epoch 282
2023-11-13 13:35:52,401 INFO [train.py:467] (0/4) Epoch 282, batch 3, global_batch_idx: 10400, batch size: 61, loss[discriminator_loss=2.641, discriminator_real_loss=1.349, discriminator_fake_loss=1.291, generator_loss=30.56, generator_mel_loss=21.38, generator_kl_loss=2.037, generator_dur_loss=1.685, generator_adv_loss=2.309, generator_feat_match_loss=3.143, over 61.00 samples.], tot_loss[discriminator_loss=2.591, discriminator_real_loss=1.256, discriminator_fake_loss=1.335, generator_loss=30.85, generator_mel_loss=21.47, generator_kl_loss=1.974, generator_dur_loss=1.689, generator_adv_loss=2.225, generator_feat_match_loss=3.489, over 360.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 32.0
2023-11-13 13:35:52,882 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 13:36:04,174 INFO [train.py:517] (0/4) Epoch 282, validation: discriminator_loss=2.715, discriminator_real_loss=1.545, discriminator_fake_loss=1.17, generator_loss=31.62, generator_mel_loss=22.3, generator_kl_loss=2.052, generator_dur_loss=1.661, generator_adv_loss=2.386, generator_feat_match_loss=3.221, over 100.00 samples.
2023-11-13 13:36:04,175 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB
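
Each validation pass above ("Computing validation loss" followed by an "Epoch N, validation:" line) takes roughly ten seconds by the timestamps and always reports "over 100.00 samples.", i.e. a fixed dev set of 100 samples. A pass of that shape is typically forward-only; the sketch below uses illustrative names, since the real signatures in train.py are not shown in this log:

    import torch

    def compute_validation_loss(model, valid_loader):
        # Forward-only pass over the fixed dev set; produces numbers shaped
        # like the "Epoch N, validation:" lines (sample-weighted averages).
        model.eval()
        totals, num_samples = {}, 0
        with torch.no_grad():
            for batch in valid_loader:
                losses, batch_size = model(batch)  # illustrative: ({name: tensor}, int)
                num_samples += batch_size
                for name, value in losses.items():
                    totals[name] = totals.get(name, 0.0) + float(value) * batch_size
        model.train()
        return {name: total / num_samples for name, total in totals.items()}
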
2023-11-13 13:39:00,247 INFO [train.py:811] (0/4) Start epoch 283
2023-11-13 13:40:33,494 INFO [train.py:467] (0/4) Epoch 283, batch 16, global_batch_idx: 10450, batch size: 71, loss[discriminator_loss=2.469, discriminator_real_loss=1.202, discriminator_fake_loss=1.268, generator_loss=31.02, generator_mel_loss=21.6, generator_kl_loss=1.938, generator_dur_loss=1.692, generator_adv_loss=2.184, generator_feat_match_loss=3.604, over 71.00 samples.], tot_loss[discriminator_loss=2.607, discriminator_real_loss=1.317, discriminator_fake_loss=1.291, generator_loss=30.4, generator_mel_loss=21.45, generator_kl_loss=1.924, generator_dur_loss=1.697, generator_adv_loss=2.139, generator_feat_match_loss=3.187, over 1187.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 13:42:27,448 INFO [train.py:811] (0/4) Start epoch 284
2023-11-13 13:45:23,179 INFO [train.py:467] (0/4) Epoch 284, batch 29, global_batch_idx: 10500, batch size: 52, loss[discriminator_loss=2.656, discriminator_real_loss=1.434, discriminator_fake_loss=1.224, generator_loss=30.36, generator_mel_loss=21.61, generator_kl_loss=1.883, generator_dur_loss=1.684, generator_adv_loss=2.072, generator_feat_match_loss=3.109, over 52.00 samples.], tot_loss[discriminator_loss=2.647, discriminator_real_loss=1.337, discriminator_fake_loss=1.311, generator_loss=30.4, generator_mel_loss=21.43, generator_kl_loss=1.934, generator_dur_loss=1.693, generator_adv_loss=2.125, generator_feat_match_loss=3.215, over 2287.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 13:46:06,771 INFO [train.py:811] (0/4) Start epoch 285
2023-11-13 13:49:43,111 INFO [train.py:811] (0/4) Start epoch 286
2023-11-13 13:50:24,561 INFO [train.py:467] (0/4) Epoch 286, batch 5, global_batch_idx: 10550, batch size: 58, loss[discriminator_loss=2.809, discriminator_real_loss=1.368, discriminator_fake_loss=1.441, generator_loss=29.38, generator_mel_loss=21.02, generator_kl_loss=1.902, generator_dur_loss=1.697, generator_adv_loss=1.982, generator_feat_match_loss=2.771, over 58.00 samples.], tot_loss[discriminator_loss=2.719, discriminator_real_loss=1.34, discriminator_fake_loss=1.379, generator_loss=30.13, generator_mel_loss=21.33, generator_kl_loss=1.924, generator_dur_loss=1.697, generator_adv_loss=2.029, generator_feat_match_loss=3.155, over 470.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 13:53:17,354 INFO [train.py:811] (0/4) Start epoch 287
2023-11-13 13:55:11,090 INFO [train.py:467] (0/4) Epoch 287, batch 18, global_batch_idx: 10600, batch size: 50, loss[discriminator_loss=2.553, discriminator_real_loss=1.374, discriminator_fake_loss=1.179, generator_loss=31.16, generator_mel_loss=21.61, generator_kl_loss=1.881, generator_dur_loss=1.665, generator_adv_loss=2.35, generator_feat_match_loss=3.658, over 50.00 samples.], tot_loss[discriminator_loss=2.582, discriminator_real_loss=1.297, discriminator_fake_loss=1.285, generator_loss=30.97, generator_mel_loss=21.67, generator_kl_loss=1.958, generator_dur_loss=1.692, generator_adv_loss=2.199, generator_feat_match_loss=3.454, over 1426.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 13:55:11,567 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 13:55:22,043 INFO [train.py:517] (0/4) Epoch 287, validation: discriminator_loss=2.661, discriminator_real_loss=1.14, discriminator_fake_loss=1.521, generator_loss=30.62, generator_mel_loss=21.94, generator_kl_loss=2.034, generator_dur_loss=1.667, generator_adv_loss=1.799, generator_feat_match_loss=3.179, over 100.00 samples.
2023-11-13 13:55:22,043 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB
2023-11-13 13:56:58,474 INFO [train.py:811] (0/4) Start epoch 288
2023-11-13 13:59:59,273 INFO [train.py:467] (0/4) Epoch 288, batch 31, global_batch_idx: 10650, batch size: 49, loss[discriminator_loss=2.645, discriminator_real_loss=1.489, discriminator_fake_loss=1.155, generator_loss=29.73, generator_mel_loss=21.01, generator_kl_loss=1.963, generator_dur_loss=1.698, generator_adv_loss=2.045, generator_feat_match_loss=3.02, over 49.00 samples.], tot_loss[discriminator_loss=2.645, discriminator_real_loss=1.333, discriminator_fake_loss=1.312, generator_loss=30.51, generator_mel_loss=21.31, generator_kl_loss=1.951, generator_dur_loss=1.691, generator_adv_loss=2.184, generator_feat_match_loss=3.374, over 2303.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 14:00:30,369 INFO [train.py:811] (0/4) Start epoch 289
2023-11-13 14:04:01,499 INFO [train.py:811] (0/4) Start epoch 290
2023-11-13 14:05:00,428 INFO [train.py:467] (0/4) Epoch 290, batch 7, global_batch_idx: 10700, batch size: 54, loss[discriminator_loss=2.629, discriminator_real_loss=1.45, discriminator_fake_loss=1.179, generator_loss=30.02, generator_mel_loss=21.06, generator_kl_loss=1.952, generator_dur_loss=1.727, generator_adv_loss=2.219, generator_feat_match_loss=3.066, over 54.00 samples.], tot_loss[discriminator_loss=2.596, discriminator_real_loss=1.326, discriminator_fake_loss=1.27, generator_loss=30.54, generator_mel_loss=21.25, generator_kl_loss=1.941, generator_dur_loss=1.704, generator_adv_loss=2.232, generator_feat_match_loss=3.405, over 559.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 14:07:32,105 INFO [train.py:811] (0/4) Start epoch 291
2023-11-13 14:09:36,450 INFO [train.py:467] (0/4) Epoch 291, batch 20, global_batch_idx: 10750, batch size: 69, loss[discriminator_loss=2.75, discriminator_real_loss=1.896, discriminator_fake_loss=0.854, generator_loss=30.88, generator_mel_loss=21.24, generator_kl_loss=1.942, generator_dur_loss=1.673, generator_adv_loss=2.324, generator_feat_match_loss=3.699, over 69.00 samples.], tot_loss[discriminator_loss=2.601, discriminator_real_loss=1.324, discriminator_fake_loss=1.277, generator_loss=30.88, generator_mel_loss=21.43, generator_kl_loss=1.956, generator_dur_loss=1.692, generator_adv_loss=2.269, generator_feat_match_loss=3.531, over 1574.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 14:11:01,017 INFO [train.py:811] (0/4) Start epoch 292
2023-11-13 14:14:14,508 INFO [train.py:467] (0/4) Epoch 292, batch 33, global_batch_idx: 10800, batch size: 95, loss[discriminator_loss=2.648, discriminator_real_loss=1.321, discriminator_fake_loss=1.327, generator_loss=30.35, generator_mel_loss=21.74, generator_kl_loss=1.916, generator_dur_loss=1.703, generator_adv_loss=2.012, generator_feat_match_loss=2.984, over 95.00 samples.], tot_loss[discriminator_loss=2.632, discriminator_real_loss=1.348, discriminator_fake_loss=1.284, generator_loss=30.4, generator_mel_loss=21.36, generator_kl_loss=1.935, generator_dur_loss=1.691, generator_adv_loss=2.14, generator_feat_match_loss=3.273, over 2563.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 32.0
2023-11-13 14:14:14,955 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 14:14:25,392 INFO [train.py:517] (0/4) Epoch 292, validation: discriminator_loss=2.609, discriminator_real_loss=1.31, discriminator_fake_loss=1.299, generator_loss=30.94, generator_mel_loss=22.17, generator_kl_loss=1.969, generator_dur_loss=1.665, generator_adv_loss=2.03, generator_feat_match_loss=3.101, over 100.00 samples.
2023-11-13 14:14:25,393 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB
2023-11-13 14:14:46,060 INFO [train.py:811] (0/4) Start epoch 293
2023-11-13 14:18:20,690 INFO [train.py:811] (0/4) Start epoch 294
2023-11-13 14:19:23,584 INFO [train.py:467] (0/4) Epoch 294, batch 9, global_batch_idx: 10850, batch size: 79, loss[discriminator_loss=2.59, discriminator_real_loss=1.235, discriminator_fake_loss=1.354, generator_loss=30.36, generator_mel_loss=21.65, generator_kl_loss=1.942, generator_dur_loss=1.68, generator_adv_loss=1.902, generator_feat_match_loss=3.188, over 79.00 samples.], tot_loss[discriminator_loss=2.624, discriminator_real_loss=1.328, discriminator_fake_loss=1.296, generator_loss=30.43, generator_mel_loss=21.49, generator_kl_loss=1.928, generator_dur_loss=1.695, generator_adv_loss=2.087, generator_feat_match_loss=3.232, over 709.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 14:21:52,355 INFO [train.py:811] (0/4) Start epoch 295
2023-11-13 14:23:59,385 INFO [train.py:467] (0/4) Epoch 295, batch 22, global_batch_idx: 10900, batch size: 53, loss[discriminator_loss=2.463, discriminator_real_loss=1.379, discriminator_fake_loss=1.084, generator_loss=31.66, generator_mel_loss=21.85, generator_kl_loss=1.921, generator_dur_loss=1.692, generator_adv_loss=2.43, generator_feat_match_loss=3.77, over 53.00 samples.], tot_loss[discriminator_loss=2.585, discriminator_real_loss=1.322, discriminator_fake_loss=1.264, generator_loss=30.85, generator_mel_loss=21.25, generator_kl_loss=1.941, generator_dur_loss=1.692, generator_adv_loss=2.307, generator_feat_match_loss=3.661, over 1559.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 14:25:18,270 INFO [train.py:811] (0/4) Start epoch 296
2023-11-13 14:28:45,969 INFO [train.py:467] (0/4) Epoch 296, batch 35, global_batch_idx: 10950, batch size: 58, loss[discriminator_loss=2.953, discriminator_real_loss=1.561, discriminator_fake_loss=1.394, generator_loss=29.99, generator_mel_loss=21.39, generator_kl_loss=1.964, generator_dur_loss=1.69, generator_adv_loss=2.168, generator_feat_match_loss=2.783, over 58.00 samples.], tot_loss[discriminator_loss=2.634, discriminator_real_loss=1.333, discriminator_fake_loss=1.301, generator_loss=30.59, generator_mel_loss=21.4, generator_kl_loss=1.942, generator_dur_loss=1.689, generator_adv_loss=2.192, generator_feat_match_loss=3.366, over 2502.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 14:28:52,639 INFO [train.py:811] (0/4) Start epoch 297
2023-11-13 14:32:25,204 INFO [train.py:811] (0/4) Start epoch 298
2023-11-13 14:33:34,832 INFO [train.py:467] (0/4) Epoch 298, batch 11, global_batch_idx: 11000, batch size: 54, loss[discriminator_loss=2.502, discriminator_real_loss=1.28, discriminator_fake_loss=1.222, generator_loss=31.82, generator_mel_loss=21.56, generator_kl_loss=1.941, generator_dur_loss=1.728, generator_adv_loss=2.457, generator_feat_match_loss=4.133, over 54.00 samples.], tot_loss[discriminator_loss=2.563, discriminator_real_loss=1.284, discriminator_fake_loss=1.279, generator_loss=30.58, generator_mel_loss=21.31, generator_kl_loss=1.952, generator_dur_loss=1.698, generator_adv_loss=2.224, generator_feat_match_loss=3.394, over 882.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 14:33:35,366 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 14:33:46,167 INFO [train.py:517] (0/4) Epoch 298, validation: discriminator_loss=2.492, discriminator_real_loss=1.252, discriminator_fake_loss=1.24, generator_loss=30.79, generator_mel_loss=21.56, generator_kl_loss=1.967, generator_dur_loss=1.664, generator_adv_loss=2.107, generator_feat_match_loss=3.486, over 100.00 samples.
2023-11-13 14:33:46,168 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB
2023-11-13 14:36:01,043 INFO [train.py:811] (0/4) Start epoch 299
2023-11-13 14:38:28,906 INFO [train.py:467] (0/4) Epoch 299, batch 24, global_batch_idx: 11050, batch size: 52, loss[discriminator_loss=2.719, discriminator_real_loss=1.385, discriminator_fake_loss=1.333, generator_loss=29.28, generator_mel_loss=20.89, generator_kl_loss=1.896, generator_dur_loss=1.683, generator_adv_loss=1.94, generator_feat_match_loss=2.871, over 52.00 samples.], tot_loss[discriminator_loss=2.626, discriminator_real_loss=1.332, discriminator_fake_loss=1.294, generator_loss=30.62, generator_mel_loss=21.21, generator_kl_loss=1.931, generator_dur_loss=1.694, generator_adv_loss=2.263, generator_feat_match_loss=3.522, over 1864.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 14:39:35,465 INFO [train.py:811] (0/4) Start epoch 300
2023-11-13 14:43:07,533 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-300.pt
2023-11-13 14:43:10,794 INFO [train.py:811] (0/4) Start epoch 301
2023-11-13 14:43:25,730 INFO [train.py:467] (0/4) Epoch 301, batch 0, global_batch_idx: 11100, batch size: 59, loss[discriminator_loss=2.711, discriminator_real_loss=1.365, discriminator_fake_loss=1.347, generator_loss=30.22, generator_mel_loss=21.58, generator_kl_loss=1.919, generator_dur_loss=1.705, generator_adv_loss=2.01, generator_feat_match_loss=3.014, over 59.00 samples.], tot_loss[discriminator_loss=2.711, discriminator_real_loss=1.365, discriminator_fake_loss=1.347, generator_loss=30.22, generator_mel_loss=21.58, generator_kl_loss=1.919, generator_dur_loss=1.705, generator_adv_loss=2.01, generator_feat_match_loss=3.014, over 59.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 14:46:38,672 INFO [train.py:811] (0/4) Start epoch 302
2023-11-13 14:47:59,368 INFO [train.py:467] (0/4) Epoch 302, batch 13, global_batch_idx: 11150, batch size: 85, loss[discriminator_loss=2.508, discriminator_real_loss=1.301, discriminator_fake_loss=1.207, generator_loss=30.97, generator_mel_loss=21.32, generator_kl_loss=1.973, generator_dur_loss=1.706, generator_adv_loss=2.199, generator_feat_match_loss=3.773, over 85.00 samples.], tot_loss[discriminator_loss=2.627, discriminator_real_loss=1.334, discriminator_fake_loss=1.292, generator_loss=30.98, generator_mel_loss=21.55, generator_kl_loss=1.95, generator_dur_loss=1.689, generator_adv_loss=2.23, generator_feat_match_loss=3.565, over 1130.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 14:50:08,674 INFO [train.py:811] (0/4) Start epoch 303
2023-11-13 14:52:45,866 INFO [train.py:467] (0/4) Epoch 303, batch 26, global_batch_idx: 11200, batch size: 63, loss[discriminator_loss=2.916, discriminator_real_loss=1.629, discriminator_fake_loss=1.287, generator_loss=29.53, generator_mel_loss=20.97, generator_kl_loss=1.962, generator_dur_loss=1.692, generator_adv_loss=1.998, generator_feat_match_loss=2.912, over 63.00 samples.], tot_loss[discriminator_loss=2.61, discriminator_real_loss=1.317, discriminator_fake_loss=1.294, generator_loss=30.89, generator_mel_loss=21.4, generator_kl_loss=1.932, generator_dur_loss=1.693, generator_adv_loss=2.299, generator_feat_match_loss=3.563, over 2048.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
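
Two loss blocks appear in every [train.py:467] entry: loss[...] is the current batch alone, while tot_loss[...] is a running, sample-weighted average over the epoch so far, which is why its "over N samples" count keeps growing and why the two blocks coincide exactly at batch 0 (Epoch 301 above, over 59.00 samples). A minimal accumulator of that shape (a sketch, not the MetricsTracker actually used):

    class RunningLoss:
        # Sample-weighted running average, reset at the start of each epoch.
        def __init__(self):
            self.sums = {}
            self.num_samples = 0

        def update(self, batch_losses, batch_size):
            self.num_samples += batch_size
            for name, value in batch_losses.items():
                self.sums[name] = self.sums.get(name, 0.0) + float(value) * batch_size

        def average(self):
            return {name: s / self.num_samples for name, s in self.sums.items()}
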
2023-11-13 14:52:46,305 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 14:52:57,010 INFO [train.py:517] (0/4) Epoch 303, validation: discriminator_loss=2.558, discriminator_real_loss=1.27, discriminator_fake_loss=1.287, generator_loss=31.35, generator_mel_loss=22.14, generator_kl_loss=2.025, generator_dur_loss=1.664, generator_adv_loss=2.113, generator_feat_match_loss=3.409, over 100.00 samples.
2023-11-13 14:52:57,011 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB
2023-11-13 14:53:49,203 INFO [train.py:811] (0/4) Start epoch 304
2023-11-13 14:57:24,635 INFO [train.py:811] (0/4) Start epoch 305
2023-11-13 14:57:50,870 INFO [train.py:467] (0/4) Epoch 305, batch 2, global_batch_idx: 11250, batch size: 52, loss[discriminator_loss=2.568, discriminator_real_loss=1.286, discriminator_fake_loss=1.282, generator_loss=30.13, generator_mel_loss=20.86, generator_kl_loss=1.951, generator_dur_loss=1.705, generator_adv_loss=2.205, generator_feat_match_loss=3.41, over 52.00 samples.], tot_loss[discriminator_loss=2.597, discriminator_real_loss=1.285, discriminator_fake_loss=1.313, generator_loss=30.26, generator_mel_loss=21.23, generator_kl_loss=1.947, generator_dur_loss=1.694, generator_adv_loss=2.126, generator_feat_match_loss=3.258, over 198.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 15:00:51,782 INFO [train.py:811] (0/4) Start epoch 306
2023-11-13 15:02:25,466 INFO [train.py:467] (0/4) Epoch 306, batch 15, global_batch_idx: 11300, batch size: 90, loss[discriminator_loss=2.648, discriminator_real_loss=1.394, discriminator_fake_loss=1.255, generator_loss=29.58, generator_mel_loss=20.89, generator_kl_loss=1.89, generator_dur_loss=1.685, generator_adv_loss=2.162, generator_feat_match_loss=2.953, over 90.00 samples.], tot_loss[discriminator_loss=2.57, discriminator_real_loss=1.285, discriminator_fake_loss=1.285, generator_loss=30.54, generator_mel_loss=21.24, generator_kl_loss=1.937, generator_dur_loss=1.693, generator_adv_loss=2.225, generator_feat_match_loss=3.437, over 1112.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0
2023-11-13 15:04:23,210 INFO [train.py:811] (0/4) Start epoch 307
2023-11-13 15:07:05,922 INFO [train.py:467] (0/4) Epoch 307, batch 28, global_batch_idx: 11350, batch size: 64, loss[discriminator_loss=2.471, discriminator_real_loss=1.272, discriminator_fake_loss=1.198, generator_loss=30.34, generator_mel_loss=20.92, generator_kl_loss=1.993, generator_dur_loss=1.678, generator_adv_loss=2.295, generator_feat_match_loss=3.449, over 64.00 samples.], tot_loss[discriminator_loss=2.592, discriminator_real_loss=1.318, discriminator_fake_loss=1.274, generator_loss=30.92, generator_mel_loss=21.42, generator_kl_loss=1.959, generator_dur_loss=1.69, generator_adv_loss=2.249, generator_feat_match_loss=3.595, over 2142.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0
2023-11-13 15:07:52,291 INFO [train.py:811] (0/4) Start epoch 308
2023-11-13 15:11:22,203 INFO [train.py:811] (0/4) Start epoch 309
2023-11-13 15:11:56,203 INFO [train.py:467] (0/4) Epoch 309, batch 4, global_batch_idx: 11400, batch size: 60, loss[discriminator_loss=2.539, discriminator_real_loss=1.375, discriminator_fake_loss=1.164, generator_loss=30.31, generator_mel_loss=20.94, generator_kl_loss=1.916, generator_dur_loss=1.706, generator_adv_loss=2.078, generator_feat_match_loss=3.674, over 60.00 samples.], tot_loss[discriminator_loss=2.621, discriminator_real_loss=1.336, discriminator_fake_loss=1.285, generator_loss=30.23, generator_mel_loss=21.23, generator_kl_loss=1.897, generator_dur_loss=1.686, generator_adv_loss=2.124, generator_feat_match_loss=3.292, over 365.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0
2023-11-13 15:11:56,679 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 15:12:08,156 INFO [train.py:517] (0/4) Epoch 309, validation: discriminator_loss=2.516, discriminator_real_loss=1.074, discriminator_fake_loss=1.442, generator_loss=31.1, generator_mel_loss=22.33, generator_kl_loss=1.964, generator_dur_loss=1.66, generator_adv_loss=1.733, generator_feat_match_loss=3.416, over 100.00 samples.
2023-11-13 15:12:08,157 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB
2023-11-13 15:15:09,924 INFO [train.py:811] (0/4) Start epoch 310
2023-11-13 15:17:04,693 INFO [train.py:467] (0/4) Epoch 310, batch 17, global_batch_idx: 11450, batch size: 65, loss[discriminator_loss=2.713, discriminator_real_loss=1.287, discriminator_fake_loss=1.426, generator_loss=29.3, generator_mel_loss=20.83, generator_kl_loss=1.922, generator_dur_loss=1.698, generator_adv_loss=1.882, generator_feat_match_loss=2.961, over 65.00 samples.], tot_loss[discriminator_loss=2.639, discriminator_real_loss=1.353, discriminator_fake_loss=1.286, generator_loss=29.96, generator_mel_loss=21.08, generator_kl_loss=1.919, generator_dur_loss=1.694, generator_adv_loss=2.073, generator_feat_match_loss=3.191, over 1317.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0
2023-11-13 15:18:42,233 INFO [train.py:811] (0/4) Start epoch 311
2023-11-13 15:21:34,876 INFO [train.py:467] (0/4) Epoch 311, batch 30, global_batch_idx: 11500, batch size: 53, loss[discriminator_loss=2.746, discriminator_real_loss=1.296, discriminator_fake_loss=1.45, generator_loss=30.05, generator_mel_loss=21.31, generator_kl_loss=1.948, generator_dur_loss=1.687, generator_adv_loss=1.99, generator_feat_match_loss=3.115, over 53.00 samples.], tot_loss[discriminator_loss=2.651, discriminator_real_loss=1.347, discriminator_fake_loss=1.305, generator_loss=30.33, generator_mel_loss=21.42, generator_kl_loss=1.96, generator_dur_loss=1.681, generator_adv_loss=2.079, generator_feat_match_loss=3.191, over 2204.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0
2023-11-13 15:22:09,987 INFO [train.py:811] (0/4) Start epoch 312
2023-11-13 15:25:38,539 INFO [train.py:811] (0/4) Start epoch 313
2023-11-13 15:26:27,472 INFO [train.py:467] (0/4) Epoch 313, batch 6, global_batch_idx: 11550, batch size: 52, loss[discriminator_loss=2.676, discriminator_real_loss=1.3, discriminator_fake_loss=1.375, generator_loss=30.02, generator_mel_loss=21, generator_kl_loss=1.932, generator_dur_loss=1.682, generator_adv_loss=2.105, generator_feat_match_loss=3.295, over 52.00 samples.], tot_loss[discriminator_loss=2.671, discriminator_real_loss=1.335, discriminator_fake_loss=1.336, generator_loss=30.77, generator_mel_loss=21.62, generator_kl_loss=1.969, generator_dur_loss=1.682, generator_adv_loss=2.111, generator_feat_match_loss=3.389, over 576.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0
2023-11-13 15:29:11,553 INFO [train.py:811] (0/4) Start epoch 314
2023-11-13 15:31:19,101 INFO [train.py:467] (0/4) Epoch 314, batch 19, global_batch_idx: 11600, batch size: 55, loss[discriminator_loss=2.658, discriminator_real_loss=1.292, discriminator_fake_loss=1.366, generator_loss=29.82, generator_mel_loss=21.23, generator_kl_loss=1.938, generator_dur_loss=1.716, generator_adv_loss=1.957, generator_feat_match_loss=2.973, over 55.00 samples.], tot_loss[discriminator_loss=2.679, discriminator_real_loss=1.351, discriminator_fake_loss=1.328, generator_loss=30.43, generator_mel_loss=21.44, generator_kl_loss=1.933, generator_dur_loss=1.691, generator_adv_loss=2.123, generator_feat_match_loss=3.238, over 1412.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 32.0
2023-11-13 15:31:19,587 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 15:31:30,951 INFO [train.py:517] (0/4) Epoch 314, validation: discriminator_loss=2.669, discriminator_real_loss=1.306, discriminator_fake_loss=1.363, generator_loss=30.58, generator_mel_loss=22.17, generator_kl_loss=1.924, generator_dur_loss=1.659, generator_adv_loss=1.836, generator_feat_match_loss=2.989, over 100.00 samples.
2023-11-13 15:31:30,952 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB
2023-11-13 15:32:58,424 INFO [train.py:811] (0/4) Start epoch 315
2023-11-13 15:36:17,437 INFO [train.py:467] (0/4) Epoch 315, batch 32, global_batch_idx: 11650, batch size: 63, loss[discriminator_loss=2.379, discriminator_real_loss=1.223, discriminator_fake_loss=1.157, generator_loss=31.62, generator_mel_loss=20.68, generator_kl_loss=1.898, generator_dur_loss=1.685, generator_adv_loss=2.844, generator_feat_match_loss=4.508, over 63.00 samples.], tot_loss[discriminator_loss=2.616, discriminator_real_loss=1.324, discriminator_fake_loss=1.292, generator_loss=30.74, generator_mel_loss=21.4, generator_kl_loss=1.948, generator_dur_loss=1.683, generator_adv_loss=2.213, generator_feat_match_loss=3.492, over 2578.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0
2023-11-13 15:36:37,015 INFO [train.py:811] (0/4) Start epoch 316
2023-11-13 15:40:13,636 INFO [train.py:811] (0/4) Start epoch 317
2023-11-13 15:41:17,931 INFO [train.py:467] (0/4) Epoch 317, batch 8, global_batch_idx: 11700, batch size: 61, loss[discriminator_loss=2.783, discriminator_real_loss=1.365, discriminator_fake_loss=1.418, generator_loss=28.89, generator_mel_loss=21, generator_kl_loss=1.837, generator_dur_loss=1.678, generator_adv_loss=1.812, generator_feat_match_loss=2.557, over 61.00 samples.], tot_loss[discriminator_loss=2.652, discriminator_real_loss=1.323, discriminator_fake_loss=1.328, generator_loss=30.1, generator_mel_loss=21.08, generator_kl_loss=1.902, generator_dur_loss=1.679, generator_adv_loss=2.161, generator_feat_match_loss=3.27, over 828.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0
2023-11-13 15:43:45,717 INFO [train.py:811] (0/4) Start epoch 318
2023-11-13 15:46:01,184 INFO [train.py:467] (0/4) Epoch 318, batch 21, global_batch_idx: 11750, batch size: 67, loss[discriminator_loss=2.605, discriminator_real_loss=1.331, discriminator_fake_loss=1.275, generator_loss=29.89, generator_mel_loss=21.06, generator_kl_loss=1.944, generator_dur_loss=1.695, generator_adv_loss=2.053, generator_feat_match_loss=3.141, over 67.00 samples.], tot_loss[discriminator_loss=2.612, discriminator_real_loss=1.324, discriminator_fake_loss=1.289, generator_loss=30.19, generator_mel_loss=21.13, generator_kl_loss=1.915, generator_dur_loss=1.68, generator_adv_loss=2.146, generator_feat_match_loss=3.316, over 1616.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0
2023-11-13 15:47:19,101 INFO [train.py:811] (0/4) Start epoch 319
2023-11-13 15:50:47,673 INFO [train.py:467] (0/4) Epoch 319, batch 34, global_batch_idx: 11800, batch size: 79, loss[discriminator_loss=2.523, discriminator_real_loss=1.296, discriminator_fake_loss=1.227, generator_loss=31.08, generator_mel_loss=21.21, generator_kl_loss=1.919, generator_dur_loss=1.707, generator_adv_loss=2.506, generator_feat_match_loss=3.734, over 79.00 samples.], tot_loss[discriminator_loss=2.594, discriminator_real_loss=1.318, discriminator_fake_loss=1.276, generator_loss=30.63, generator_mel_loss=21.15, generator_kl_loss=1.944, generator_dur_loss=1.687, generator_adv_loss=2.29, generator_feat_match_loss=3.559, over 2474.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0
2023-11-13 15:50:48,189 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 15:51:00,598 INFO [train.py:517] (0/4) Epoch 319, validation: discriminator_loss=2.383, discriminator_real_loss=1.151, discriminator_fake_loss=1.232, generator_loss=31.38, generator_mel_loss=22.12, generator_kl_loss=1.926, generator_dur_loss=1.659, generator_adv_loss=2.087, generator_feat_match_loss=3.581, over 100.00 samples.
2023-11-13 15:51:00,599 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB
2023-11-13 15:51:10,354 INFO [train.py:811] (0/4) Start epoch 320
2023-11-13 15:54:48,097 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-320.pt
2023-11-13 15:54:51,652 INFO [train.py:811] (0/4) Start epoch 321
2023-11-13 15:55:58,404 INFO [train.py:467] (0/4) Epoch 321, batch 10, global_batch_idx: 11850, batch size: 56, loss[discriminator_loss=2.707, discriminator_real_loss=1.183, discriminator_fake_loss=1.523, generator_loss=30.49, generator_mel_loss=21.47, generator_kl_loss=1.959, generator_dur_loss=1.687, generator_adv_loss=2.027, generator_feat_match_loss=3.346, over 56.00 samples.], tot_loss[discriminator_loss=2.648, discriminator_real_loss=1.335, discriminator_fake_loss=1.313, generator_loss=30.52, generator_mel_loss=21.37, generator_kl_loss=1.955, generator_dur_loss=1.685, generator_adv_loss=2.161, generator_feat_match_loss=3.343, over 708.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0
2023-11-13 15:58:24,654 INFO [train.py:811] (0/4) Start epoch 322
2023-11-13 16:00:41,159 INFO [train.py:467] (0/4) Epoch 322, batch 23, global_batch_idx: 11900, batch size: 110, loss[discriminator_loss=2.562, discriminator_real_loss=1.316, discriminator_fake_loss=1.246, generator_loss=30.27, generator_mel_loss=21.22, generator_kl_loss=1.962, generator_dur_loss=1.66, generator_adv_loss=2.199, generator_feat_match_loss=3.232, over 110.00 samples.], tot_loss[discriminator_loss=2.622, discriminator_real_loss=1.316, discriminator_fake_loss=1.306, generator_loss=30.21, generator_mel_loss=21.2, generator_kl_loss=1.946, generator_dur_loss=1.682, generator_adv_loss=2.111, generator_feat_match_loss=3.273, over 1868.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0
2023-11-13 16:01:56,213 INFO [train.py:811] (0/4) Start epoch 323
2023-11-13 16:05:28,919 INFO [train.py:467] (0/4) Epoch 323, batch 36, global_batch_idx: 11950, batch size: 59, loss[discriminator_loss=2.928, discriminator_real_loss=1.787, discriminator_fake_loss=1.141, generator_loss=29.74, generator_mel_loss=21.11, generator_kl_loss=1.859, generator_dur_loss=1.705, generator_adv_loss=2.174, generator_feat_match_loss=2.891, over 59.00 samples.], tot_loss[discriminator_loss=2.652, discriminator_real_loss=1.364, discriminator_fake_loss=1.288, generator_loss=30.7, generator_mel_loss=21.4, generator_kl_loss=1.958, generator_dur_loss=1.688, generator_adv_loss=2.196, generator_feat_match_loss=3.455, over 2715.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0
2023-11-13 16:05:30,363 INFO [train.py:811] (0/4) Start epoch 324
2023-11-13 16:09:06,148 INFO [train.py:811] (0/4) Start epoch 325
2023-11-13 16:10:33,126 INFO [train.py:467] (0/4) Epoch 325, batch 12, global_batch_idx: 12000, batch size: 64, loss[discriminator_loss=2.672, discriminator_real_loss=1.332, discriminator_fake_loss=1.34, generator_loss=29.84, generator_mel_loss=21.24, generator_kl_loss=1.952, generator_dur_loss=1.675, generator_adv_loss=2.006, generator_feat_match_loss=2.963, over 64.00 samples.], tot_loss[discriminator_loss=2.615, discriminator_real_loss=1.316, discriminator_fake_loss=1.298, generator_loss=30.33, generator_mel_loss=21.22, generator_kl_loss=1.934, generator_dur_loss=1.686, generator_adv_loss=2.151, generator_feat_match_loss=3.341, over 1022.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 32.0
2023-11-13 16:10:33,629 INFO [train.py:508] (0/4) Computing validation loss
2023-11-13 16:10:45,041 INFO [train.py:517] (0/4) Epoch 325, validation: discriminator_loss=2.661, discriminator_real_loss=1.313, discriminator_fake_loss=1.347, generator_loss=31.06, generator_mel_loss=22.35, generator_kl_loss=2.023, generator_dur_loss=1.656, generator_adv_loss=1.906, generator_feat_match_loss=3.13, over 100.00 samples.
2023-11-13 16:10:45,042 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB 2023-11-13 16:12:57,385 INFO [train.py:811] (0/4) Start epoch 326 2023-11-13 16:15:34,117 INFO [train.py:467] (0/4) Epoch 326, batch 25, global_batch_idx: 12050, batch size: 49, loss[discriminator_loss=2.686, discriminator_real_loss=1.297, discriminator_fake_loss=1.389, generator_loss=31.62, generator_mel_loss=21.82, generator_kl_loss=1.955, generator_dur_loss=1.683, generator_adv_loss=2.369, generator_feat_match_loss=3.791, over 49.00 samples.], tot_loss[discriminator_loss=2.612, discriminator_real_loss=1.323, discriminator_fake_loss=1.288, generator_loss=30.69, generator_mel_loss=21.41, generator_kl_loss=1.938, generator_dur_loss=1.681, generator_adv_loss=2.19, generator_feat_match_loss=3.47, over 1955.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2023-11-13 16:16:33,764 INFO [train.py:811] (0/4) Start epoch 327 2023-11-13 16:20:13,773 INFO [train.py:811] (0/4) Start epoch 328 2023-11-13 16:20:35,994 INFO [train.py:467] (0/4) Epoch 328, batch 1, global_batch_idx: 12100, batch size: 65, loss[discriminator_loss=2.641, discriminator_real_loss=1.31, discriminator_fake_loss=1.331, generator_loss=30.61, generator_mel_loss=21.26, generator_kl_loss=1.986, generator_dur_loss=1.688, generator_adv_loss=2.129, generator_feat_match_loss=3.543, over 65.00 samples.], tot_loss[discriminator_loss=2.633, discriminator_real_loss=1.369, discriminator_fake_loss=1.264, generator_loss=30.56, generator_mel_loss=21.2, generator_kl_loss=1.916, generator_dur_loss=1.687, generator_adv_loss=2.131, generator_feat_match_loss=3.629, over 116.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2023-11-13 16:23:46,578 INFO [train.py:811] (0/4) Start epoch 329 2023-11-13 16:25:14,812 INFO [train.py:467] (0/4) Epoch 329, batch 14, global_batch_idx: 12150, batch size: 101, loss[discriminator_loss=2.602, discriminator_real_loss=1.292, discriminator_fake_loss=1.309, generator_loss=30.54, generator_mel_loss=21.63, generator_kl_loss=1.981, generator_dur_loss=1.671, generator_adv_loss=2.094, generator_feat_match_loss=3.162, over 101.00 samples.], tot_loss[discriminator_loss=2.651, discriminator_real_loss=1.341, discriminator_fake_loss=1.31, generator_loss=30.42, generator_mel_loss=21.41, generator_kl_loss=1.945, generator_dur_loss=1.681, generator_adv_loss=2.112, generator_feat_match_loss=3.273, over 1050.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2023-11-13 16:27:13,312 INFO [train.py:811] (0/4) Start epoch 330 2023-11-13 16:29:59,142 INFO [train.py:467] (0/4) Epoch 330, batch 27, global_batch_idx: 12200, batch size: 61, loss[discriminator_loss=2.699, discriminator_real_loss=1.43, discriminator_fake_loss=1.27, generator_loss=29.92, generator_mel_loss=21.34, generator_kl_loss=1.94, generator_dur_loss=1.693, generator_adv_loss=2.047, generator_feat_match_loss=2.9, over 61.00 samples.], tot_loss[discriminator_loss=2.696, discriminator_real_loss=1.384, discriminator_fake_loss=1.313, generator_loss=30.11, generator_mel_loss=21.13, generator_kl_loss=1.909, generator_dur_loss=1.686, generator_adv_loss=2.107, generator_feat_match_loss=3.283, over 1931.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2023-11-13 16:29:59,666 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 16:30:10,578 INFO [train.py:517] (0/4) Epoch 330, validation: discriminator_loss=2.728, discriminator_real_loss=1.36, discriminator_fake_loss=1.368, generator_loss=30.56, 
generator_mel_loss=22.03, generator_kl_loss=2.067, generator_dur_loss=1.651, generator_adv_loss=1.844, generator_feat_match_loss=2.969, over 100.00 samples. 2023-11-13 16:30:10,579 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB 2023-11-13 16:30:56,912 INFO [train.py:811] (0/4) Start epoch 331 2023-11-13 16:34:26,019 INFO [train.py:811] (0/4) Start epoch 332 2023-11-13 16:34:55,294 INFO [train.py:467] (0/4) Epoch 332, batch 3, global_batch_idx: 12250, batch size: 65, loss[discriminator_loss=2.66, discriminator_real_loss=1.342, discriminator_fake_loss=1.318, generator_loss=30.03, generator_mel_loss=21.38, generator_kl_loss=1.937, generator_dur_loss=1.707, generator_adv_loss=1.979, generator_feat_match_loss=3.033, over 65.00 samples.], tot_loss[discriminator_loss=2.686, discriminator_real_loss=1.354, discriminator_fake_loss=1.332, generator_loss=30.02, generator_mel_loss=21.3, generator_kl_loss=1.934, generator_dur_loss=1.692, generator_adv_loss=2.031, generator_feat_match_loss=3.062, over 265.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2023-11-13 16:37:57,077 INFO [train.py:811] (0/4) Start epoch 333 2023-11-13 16:39:39,110 INFO [train.py:467] (0/4) Epoch 333, batch 16, global_batch_idx: 12300, batch size: 52, loss[discriminator_loss=2.654, discriminator_real_loss=1.345, discriminator_fake_loss=1.31, generator_loss=29.48, generator_mel_loss=20.76, generator_kl_loss=1.819, generator_dur_loss=1.732, generator_adv_loss=2.25, generator_feat_match_loss=2.92, over 52.00 samples.], tot_loss[discriminator_loss=2.651, discriminator_real_loss=1.334, discriminator_fake_loss=1.317, generator_loss=30.43, generator_mel_loss=21.37, generator_kl_loss=1.944, generator_dur_loss=1.69, generator_adv_loss=2.111, generator_feat_match_loss=3.31, over 1134.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2023-11-13 16:41:36,079 INFO [train.py:811] (0/4) Start epoch 334 2023-11-13 16:44:27,115 INFO [train.py:467] (0/4) Epoch 334, batch 29, global_batch_idx: 12350, batch size: 85, loss[discriminator_loss=3.098, discriminator_real_loss=1.541, discriminator_fake_loss=1.557, generator_loss=29.69, generator_mel_loss=21.26, generator_kl_loss=1.851, generator_dur_loss=1.663, generator_adv_loss=2.027, generator_feat_match_loss=2.893, over 85.00 samples.], tot_loss[discriminator_loss=2.574, discriminator_real_loss=1.294, discriminator_fake_loss=1.28, generator_loss=30.7, generator_mel_loss=21.09, generator_kl_loss=1.931, generator_dur_loss=1.679, generator_adv_loss=2.297, generator_feat_match_loss=3.704, over 2301.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2023-11-13 16:45:06,555 INFO [train.py:811] (0/4) Start epoch 335 2023-11-13 16:48:44,441 INFO [train.py:811] (0/4) Start epoch 336 2023-11-13 16:49:29,985 INFO [train.py:467] (0/4) Epoch 336, batch 5, global_batch_idx: 12400, batch size: 64, loss[discriminator_loss=2.623, discriminator_real_loss=1.227, discriminator_fake_loss=1.396, generator_loss=30.56, generator_mel_loss=21.41, generator_kl_loss=2.02, generator_dur_loss=1.719, generator_adv_loss=2.195, generator_feat_match_loss=3.221, over 64.00 samples.], tot_loss[discriminator_loss=2.66, discriminator_real_loss=1.344, discriminator_fake_loss=1.316, generator_loss=30.25, generator_mel_loss=21.26, generator_kl_loss=1.931, generator_dur_loss=1.685, generator_adv_loss=2.116, generator_feat_match_loss=3.259, over 391.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 32.0 2023-11-13 16:49:30,614 INFO 
[train.py:508] (0/4) Computing validation loss 2023-11-13 16:49:42,791 INFO [train.py:517] (0/4) Epoch 336, validation: discriminator_loss=2.72, discriminator_real_loss=1.378, discriminator_fake_loss=1.342, generator_loss=30.43, generator_mel_loss=21.91, generator_kl_loss=2.01, generator_dur_loss=1.664, generator_adv_loss=1.865, generator_feat_match_loss=2.978, over 100.00 samples. 2023-11-13 16:49:42,792 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB 2023-11-13 16:52:32,858 INFO [train.py:811] (0/4) Start epoch 337 2023-11-13 16:54:26,079 INFO [train.py:467] (0/4) Epoch 337, batch 18, global_batch_idx: 12450, batch size: 56, loss[discriminator_loss=2.576, discriminator_real_loss=1.158, discriminator_fake_loss=1.418, generator_loss=30.52, generator_mel_loss=20.95, generator_kl_loss=1.855, generator_dur_loss=1.689, generator_adv_loss=2.154, generator_feat_match_loss=3.871, over 56.00 samples.], tot_loss[discriminator_loss=2.626, discriminator_real_loss=1.306, discriminator_fake_loss=1.319, generator_loss=30.45, generator_mel_loss=21.18, generator_kl_loss=1.934, generator_dur_loss=1.678, generator_adv_loss=2.182, generator_feat_match_loss=3.469, over 1554.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2023-11-13 16:56:04,490 INFO [train.py:811] (0/4) Start epoch 338 2023-11-13 16:59:07,650 INFO [train.py:467] (0/4) Epoch 338, batch 31, global_batch_idx: 12500, batch size: 101, loss[discriminator_loss=2.648, discriminator_real_loss=1.398, discriminator_fake_loss=1.249, generator_loss=30.64, generator_mel_loss=21.67, generator_kl_loss=1.991, generator_dur_loss=1.687, generator_adv_loss=2.01, generator_feat_match_loss=3.277, over 101.00 samples.], tot_loss[discriminator_loss=2.628, discriminator_real_loss=1.326, discriminator_fake_loss=1.302, generator_loss=30.29, generator_mel_loss=21.19, generator_kl_loss=1.957, generator_dur_loss=1.687, generator_adv_loss=2.126, generator_feat_match_loss=3.337, over 2173.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2023-11-13 16:59:39,727 INFO [train.py:811] (0/4) Start epoch 339 2023-11-13 17:03:14,400 INFO [train.py:811] (0/4) Start epoch 340 2023-11-13 17:04:11,826 INFO [train.py:467] (0/4) Epoch 340, batch 7, global_batch_idx: 12550, batch size: 153, loss[discriminator_loss=2.613, discriminator_real_loss=1.34, discriminator_fake_loss=1.273, generator_loss=30.86, generator_mel_loss=21.53, generator_kl_loss=1.975, generator_dur_loss=1.664, generator_adv_loss=1.997, generator_feat_match_loss=3.691, over 153.00 samples.], tot_loss[discriminator_loss=2.638, discriminator_real_loss=1.368, discriminator_fake_loss=1.271, generator_loss=30.52, generator_mel_loss=21.24, generator_kl_loss=1.947, generator_dur_loss=1.677, generator_adv_loss=2.127, generator_feat_match_loss=3.533, over 677.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2023-11-13 17:06:47,913 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-340.pt 2023-11-13 17:06:51,389 INFO [train.py:811] (0/4) Start epoch 341 2023-11-13 17:09:05,220 INFO [train.py:467] (0/4) Epoch 341, batch 20, global_batch_idx: 12600, batch size: 60, loss[discriminator_loss=2.621, discriminator_real_loss=1.418, discriminator_fake_loss=1.202, generator_loss=31.15, generator_mel_loss=21.52, generator_kl_loss=1.958, generator_dur_loss=1.683, generator_adv_loss=2.4, generator_feat_match_loss=3.586, over 60.00 samples.], tot_loss[discriminator_loss=2.642, discriminator_real_loss=1.316, 
discriminator_fake_loss=1.327, generator_loss=30.73, generator_mel_loss=21.4, generator_kl_loss=1.944, generator_dur_loss=1.679, generator_adv_loss=2.174, generator_feat_match_loss=3.529, over 1670.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2023-11-13 17:09:05,732 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 17:09:16,644 INFO [train.py:517] (0/4) Epoch 341, validation: discriminator_loss=2.477, discriminator_real_loss=1.291, discriminator_fake_loss=1.186, generator_loss=31.98, generator_mel_loss=22.19, generator_kl_loss=2.086, generator_dur_loss=1.654, generator_adv_loss=2.258, generator_feat_match_loss=3.797, over 100.00 samples. 2023-11-13 17:09:16,645 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB 2023-11-13 17:10:39,575 INFO [train.py:811] (0/4) Start epoch 342 2023-11-13 17:13:57,610 INFO [train.py:467] (0/4) Epoch 342, batch 33, global_batch_idx: 12650, batch size: 95, loss[discriminator_loss=2.363, discriminator_real_loss=1.231, discriminator_fake_loss=1.131, generator_loss=31.4, generator_mel_loss=21.15, generator_kl_loss=1.973, generator_dur_loss=1.671, generator_adv_loss=2.406, generator_feat_match_loss=4.207, over 95.00 samples.], tot_loss[discriminator_loss=2.606, discriminator_real_loss=1.323, discriminator_fake_loss=1.283, generator_loss=30.8, generator_mel_loss=21.26, generator_kl_loss=1.958, generator_dur_loss=1.678, generator_adv_loss=2.248, generator_feat_match_loss=3.658, over 2456.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2023-11-13 17:14:14,241 INFO [train.py:811] (0/4) Start epoch 343 2023-11-13 17:17:50,690 INFO [train.py:811] (0/4) Start epoch 344 2023-11-13 17:18:55,259 INFO [train.py:467] (0/4) Epoch 344, batch 9, global_batch_idx: 12700, batch size: 73, loss[discriminator_loss=2.676, discriminator_real_loss=1.199, discriminator_fake_loss=1.477, generator_loss=30.77, generator_mel_loss=21.51, generator_kl_loss=1.907, generator_dur_loss=1.671, generator_adv_loss=2.252, generator_feat_match_loss=3.43, over 73.00 samples.], tot_loss[discriminator_loss=2.595, discriminator_real_loss=1.295, discriminator_fake_loss=1.299, generator_loss=30.68, generator_mel_loss=21.34, generator_kl_loss=1.945, generator_dur_loss=1.674, generator_adv_loss=2.166, generator_feat_match_loss=3.555, over 731.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2023-11-13 17:21:26,430 INFO [train.py:811] (0/4) Start epoch 345 2023-11-13 17:23:36,720 INFO [train.py:467] (0/4) Epoch 345, batch 22, global_batch_idx: 12750, batch size: 90, loss[discriminator_loss=2.758, discriminator_real_loss=1.407, discriminator_fake_loss=1.351, generator_loss=29.84, generator_mel_loss=20.95, generator_kl_loss=1.96, generator_dur_loss=1.72, generator_adv_loss=2.059, generator_feat_match_loss=3.154, over 90.00 samples.], tot_loss[discriminator_loss=2.654, discriminator_real_loss=1.34, discriminator_fake_loss=1.314, generator_loss=30.23, generator_mel_loss=21.1, generator_kl_loss=1.931, generator_dur_loss=1.684, generator_adv_loss=2.137, generator_feat_match_loss=3.383, over 1530.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2023-11-13 17:25:02,158 INFO [train.py:811] (0/4) Start epoch 346 2023-11-13 17:28:34,579 INFO [train.py:467] (0/4) Epoch 346, batch 35, global_batch_idx: 12800, batch size: 81, loss[discriminator_loss=2.566, discriminator_real_loss=1.185, discriminator_fake_loss=1.383, generator_loss=30.94, generator_mel_loss=21.38, generator_kl_loss=1.893, 
generator_dur_loss=1.686, generator_adv_loss=2.326, generator_feat_match_loss=3.648, over 81.00 samples.], tot_loss[discriminator_loss=2.644, discriminator_real_loss=1.342, discriminator_fake_loss=1.301, generator_loss=30.47, generator_mel_loss=21.27, generator_kl_loss=1.938, generator_dur_loss=1.679, generator_adv_loss=2.147, generator_feat_match_loss=3.438, over 2790.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 32.0 2023-11-13 17:28:35,079 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 17:28:46,962 INFO [train.py:517] (0/4) Epoch 346, validation: discriminator_loss=2.767, discriminator_real_loss=1.383, discriminator_fake_loss=1.384, generator_loss=30.62, generator_mel_loss=22.02, generator_kl_loss=2.037, generator_dur_loss=1.65, generator_adv_loss=1.855, generator_feat_match_loss=3.06, over 100.00 samples. 2023-11-13 17:28:46,963 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27145MB 2023-11-13 17:28:52,232 INFO [train.py:811] (0/4) Start epoch 347 2023-11-13 17:32:24,480 INFO [train.py:811] (0/4) Start epoch 348 2023-11-13 17:33:38,087 INFO [train.py:467] (0/4) Epoch 348, batch 11, global_batch_idx: 12850, batch size: 76, loss[discriminator_loss=2.443, discriminator_real_loss=1.225, discriminator_fake_loss=1.219, generator_loss=30.72, generator_mel_loss=20.88, generator_kl_loss=1.894, generator_dur_loss=1.661, generator_adv_loss=2.346, generator_feat_match_loss=3.938, over 76.00 samples.], tot_loss[discriminator_loss=2.575, discriminator_real_loss=1.324, discriminator_fake_loss=1.252, generator_loss=30.92, generator_mel_loss=21.24, generator_kl_loss=1.931, generator_dur_loss=1.675, generator_adv_loss=2.318, generator_feat_match_loss=3.763, over 849.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2023-11-13 17:35:54,273 INFO [train.py:811] (0/4) Start epoch 349 2023-11-13 17:38:21,651 INFO [train.py:467] (0/4) Epoch 349, batch 24, global_batch_idx: 12900, batch size: 126, loss[discriminator_loss=2.701, discriminator_real_loss=1.297, discriminator_fake_loss=1.404, generator_loss=30.39, generator_mel_loss=21.53, generator_kl_loss=1.961, generator_dur_loss=1.671, generator_adv_loss=2, generator_feat_match_loss=3.234, over 126.00 samples.], tot_loss[discriminator_loss=2.693, discriminator_real_loss=1.381, discriminator_fake_loss=1.312, generator_loss=30.1, generator_mel_loss=21.22, generator_kl_loss=1.929, generator_dur_loss=1.677, generator_adv_loss=2.059, generator_feat_match_loss=3.212, over 1946.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 17:39:22,596 INFO [train.py:811] (0/4) Start epoch 350 2023-11-13 17:42:58,401 INFO [train.py:811] (0/4) Start epoch 351 2023-11-13 17:43:14,241 INFO [train.py:467] (0/4) Epoch 351, batch 0, global_batch_idx: 12950, batch size: 50, loss[discriminator_loss=2.668, discriminator_real_loss=1.273, discriminator_fake_loss=1.395, generator_loss=31.36, generator_mel_loss=22.32, generator_kl_loss=1.86, generator_dur_loss=1.732, generator_adv_loss=1.971, generator_feat_match_loss=3.473, over 50.00 samples.], tot_loss[discriminator_loss=2.668, discriminator_real_loss=1.273, discriminator_fake_loss=1.395, generator_loss=31.36, generator_mel_loss=22.32, generator_kl_loss=1.86, generator_dur_loss=1.732, generator_adv_loss=1.971, generator_feat_match_loss=3.473, over 50.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 17:46:33,105 INFO [train.py:811] (0/4) Start epoch 352 2023-11-13 17:47:51,758 INFO [train.py:467] (0/4) 
Epoch 352, batch 13, global_batch_idx: 13000, batch size: 55, loss[discriminator_loss=2.693, discriminator_real_loss=1.229, discriminator_fake_loss=1.464, generator_loss=30, generator_mel_loss=21.14, generator_kl_loss=1.923, generator_dur_loss=1.69, generator_adv_loss=1.92, generator_feat_match_loss=3.33, over 55.00 samples.], tot_loss[discriminator_loss=2.651, discriminator_real_loss=1.332, discriminator_fake_loss=1.319, generator_loss=30.45, generator_mel_loss=21.18, generator_kl_loss=1.944, generator_dur_loss=1.684, generator_adv_loss=2.16, generator_feat_match_loss=3.48, over 915.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 17:47:52,283 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 17:48:03,374 INFO [train.py:517] (0/4) Epoch 352, validation: discriminator_loss=2.671, discriminator_real_loss=1.116, discriminator_fake_loss=1.555, generator_loss=30.09, generator_mel_loss=21.66, generator_kl_loss=2.014, generator_dur_loss=1.65, generator_adv_loss=1.618, generator_feat_match_loss=3.148, over 100.00 samples. 2023-11-13 17:48:03,375 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27258MB 2023-11-13 17:50:17,474 INFO [train.py:811] (0/4) Start epoch 353 2023-11-13 17:52:59,124 INFO [train.py:467] (0/4) Epoch 353, batch 26, global_batch_idx: 13050, batch size: 126, loss[discriminator_loss=2.859, discriminator_real_loss=1.199, discriminator_fake_loss=1.659, generator_loss=30.05, generator_mel_loss=21.42, generator_kl_loss=1.937, generator_dur_loss=1.662, generator_adv_loss=2.107, generator_feat_match_loss=2.918, over 126.00 samples.], tot_loss[discriminator_loss=2.646, discriminator_real_loss=1.347, discriminator_fake_loss=1.299, generator_loss=30.69, generator_mel_loss=21.3, generator_kl_loss=1.958, generator_dur_loss=1.682, generator_adv_loss=2.199, generator_feat_match_loss=3.555, over 2021.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 17:53:53,706 INFO [train.py:811] (0/4) Start epoch 354 2023-11-13 17:57:27,458 INFO [train.py:811] (0/4) Start epoch 355 2023-11-13 17:57:57,720 INFO [train.py:467] (0/4) Epoch 355, batch 2, global_batch_idx: 13100, batch size: 58, loss[discriminator_loss=2.637, discriminator_real_loss=1.336, discriminator_fake_loss=1.301, generator_loss=29.72, generator_mel_loss=20.85, generator_kl_loss=1.939, generator_dur_loss=1.692, generator_adv_loss=2.004, generator_feat_match_loss=3.242, over 58.00 samples.], tot_loss[discriminator_loss=2.667, discriminator_real_loss=1.374, discriminator_fake_loss=1.293, generator_loss=30.14, generator_mel_loss=21.35, generator_kl_loss=1.922, generator_dur_loss=1.683, generator_adv_loss=1.984, generator_feat_match_loss=3.204, over 244.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 18:00:58,491 INFO [train.py:811] (0/4) Start epoch 356 2023-11-13 18:02:38,870 INFO [train.py:467] (0/4) Epoch 356, batch 15, global_batch_idx: 13150, batch size: 81, loss[discriminator_loss=2.613, discriminator_real_loss=1.381, discriminator_fake_loss=1.232, generator_loss=30.17, generator_mel_loss=20.96, generator_kl_loss=1.852, generator_dur_loss=1.681, generator_adv_loss=2.209, generator_feat_match_loss=3.461, over 81.00 samples.], tot_loss[discriminator_loss=2.652, discriminator_real_loss=1.356, discriminator_fake_loss=1.296, generator_loss=30.28, generator_mel_loss=21.25, generator_kl_loss=1.931, generator_dur_loss=1.687, generator_adv_loss=2.101, generator_feat_match_loss=3.313, over 1137.00 samples.], cur_lr_g: 
1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 18:04:30,338 INFO [train.py:811] (0/4) Start epoch 357 2023-11-13 18:07:13,673 INFO [train.py:467] (0/4) Epoch 357, batch 28, global_batch_idx: 13200, batch size: 55, loss[discriminator_loss=2.557, discriminator_real_loss=1.261, discriminator_fake_loss=1.296, generator_loss=30.51, generator_mel_loss=20.94, generator_kl_loss=1.99, generator_dur_loss=1.696, generator_adv_loss=2.336, generator_feat_match_loss=3.547, over 55.00 samples.], tot_loss[discriminator_loss=2.656, discriminator_real_loss=1.348, discriminator_fake_loss=1.308, generator_loss=30.24, generator_mel_loss=21.14, generator_kl_loss=1.943, generator_dur_loss=1.678, generator_adv_loss=2.116, generator_feat_match_loss=3.368, over 2064.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 32.0 2023-11-13 18:07:14,163 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 18:07:25,638 INFO [train.py:517] (0/4) Epoch 357, validation: discriminator_loss=2.6, discriminator_real_loss=1.316, discriminator_fake_loss=1.283, generator_loss=31.01, generator_mel_loss=21.87, generator_kl_loss=2.01, generator_dur_loss=1.656, generator_adv_loss=1.989, generator_feat_match_loss=3.487, over 100.00 samples. 2023-11-13 18:07:25,639 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27258MB 2023-11-13 18:08:08,133 INFO [train.py:811] (0/4) Start epoch 358 2023-11-13 18:11:43,287 INFO [train.py:811] (0/4) Start epoch 359 2023-11-13 18:12:22,396 INFO [train.py:467] (0/4) Epoch 359, batch 4, global_batch_idx: 13250, batch size: 73, loss[discriminator_loss=3.156, discriminator_real_loss=1.747, discriminator_fake_loss=1.41, generator_loss=30.27, generator_mel_loss=21.1, generator_kl_loss=1.879, generator_dur_loss=1.697, generator_adv_loss=2.414, generator_feat_match_loss=3.172, over 73.00 samples.], tot_loss[discriminator_loss=2.829, discriminator_real_loss=1.404, discriminator_fake_loss=1.425, generator_loss=30.44, generator_mel_loss=21.06, generator_kl_loss=1.972, generator_dur_loss=1.687, generator_adv_loss=2.375, generator_feat_match_loss=3.338, over 389.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 18:15:07,846 INFO [train.py:811] (0/4) Start epoch 360 2023-11-13 18:17:09,761 INFO [train.py:467] (0/4) Epoch 360, batch 17, global_batch_idx: 13300, batch size: 53, loss[discriminator_loss=2.562, discriminator_real_loss=1.334, discriminator_fake_loss=1.229, generator_loss=29.71, generator_mel_loss=20.54, generator_kl_loss=1.979, generator_dur_loss=1.69, generator_adv_loss=2.32, generator_feat_match_loss=3.178, over 53.00 samples.], tot_loss[discriminator_loss=2.667, discriminator_real_loss=1.363, discriminator_fake_loss=1.304, generator_loss=30.3, generator_mel_loss=20.98, generator_kl_loss=1.983, generator_dur_loss=1.675, generator_adv_loss=2.17, generator_feat_match_loss=3.494, over 1528.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 18:18:47,594 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-360.pt 2023-11-13 18:18:51,041 INFO [train.py:811] (0/4) Start epoch 361 2023-11-13 18:21:58,732 INFO [train.py:467] (0/4) Epoch 361, batch 30, global_batch_idx: 13350, batch size: 126, loss[discriminator_loss=2.598, discriminator_real_loss=1.16, discriminator_fake_loss=1.438, generator_loss=30.62, generator_mel_loss=20.98, generator_kl_loss=1.964, generator_dur_loss=1.669, generator_adv_loss=2.25, generator_feat_match_loss=3.754, over 126.00 samples.], 
tot_loss[discriminator_loss=2.641, discriminator_real_loss=1.348, discriminator_fake_loss=1.293, generator_loss=30.58, generator_mel_loss=21.19, generator_kl_loss=1.962, generator_dur_loss=1.678, generator_adv_loss=2.196, generator_feat_match_loss=3.553, over 2090.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 18:22:31,604 INFO [train.py:811] (0/4) Start epoch 362 2023-11-13 18:26:08,472 INFO [train.py:811] (0/4) Start epoch 363 2023-11-13 18:26:49,613 INFO [train.py:467] (0/4) Epoch 363, batch 6, global_batch_idx: 13400, batch size: 58, loss[discriminator_loss=2.727, discriminator_real_loss=1.348, discriminator_fake_loss=1.378, generator_loss=29.86, generator_mel_loss=21.08, generator_kl_loss=1.907, generator_dur_loss=1.674, generator_adv_loss=2.09, generator_feat_match_loss=3.113, over 58.00 samples.], tot_loss[discriminator_loss=2.669, discriminator_real_loss=1.357, discriminator_fake_loss=1.312, generator_loss=30.13, generator_mel_loss=21.12, generator_kl_loss=1.948, generator_dur_loss=1.688, generator_adv_loss=2.087, generator_feat_match_loss=3.283, over 417.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 18:26:50,114 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 18:27:02,178 INFO [train.py:517] (0/4) Epoch 363, validation: discriminator_loss=2.582, discriminator_real_loss=1.272, discriminator_fake_loss=1.31, generator_loss=30.95, generator_mel_loss=21.97, generator_kl_loss=2.024, generator_dur_loss=1.651, generator_adv_loss=1.956, generator_feat_match_loss=3.342, over 100.00 samples. 2023-11-13 18:27:02,179 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27258MB 2023-11-13 18:29:57,992 INFO [train.py:811] (0/4) Start epoch 364 2023-11-13 18:32:03,109 INFO [train.py:467] (0/4) Epoch 364, batch 19, global_batch_idx: 13450, batch size: 95, loss[discriminator_loss=2.566, discriminator_real_loss=1.191, discriminator_fake_loss=1.375, generator_loss=30.67, generator_mel_loss=21.18, generator_kl_loss=1.971, generator_dur_loss=1.652, generator_adv_loss=2.012, generator_feat_match_loss=3.857, over 95.00 samples.], tot_loss[discriminator_loss=2.703, discriminator_real_loss=1.41, discriminator_fake_loss=1.293, generator_loss=30.42, generator_mel_loss=21.11, generator_kl_loss=1.926, generator_dur_loss=1.671, generator_adv_loss=2.204, generator_feat_match_loss=3.509, over 1571.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 18:33:36,392 INFO [train.py:811] (0/4) Start epoch 365 2023-11-13 18:36:45,117 INFO [train.py:467] (0/4) Epoch 365, batch 32, global_batch_idx: 13500, batch size: 56, loss[discriminator_loss=2.518, discriminator_real_loss=1.289, discriminator_fake_loss=1.229, generator_loss=31.31, generator_mel_loss=21.31, generator_kl_loss=2.046, generator_dur_loss=1.666, generator_adv_loss=2.359, generator_feat_match_loss=3.928, over 56.00 samples.], tot_loss[discriminator_loss=2.596, discriminator_real_loss=1.317, discriminator_fake_loss=1.278, generator_loss=30.28, generator_mel_loss=21.02, generator_kl_loss=1.95, generator_dur_loss=1.675, generator_adv_loss=2.154, generator_feat_match_loss=3.476, over 2442.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 18:37:07,789 INFO [train.py:811] (0/4) Start epoch 366 2023-11-13 18:40:43,269 INFO [train.py:811] (0/4) Start epoch 367 2023-11-13 18:41:44,774 INFO [train.py:467] (0/4) Epoch 367, batch 8, global_batch_idx: 13550, batch size: 110, loss[discriminator_loss=2.578, 
discriminator_real_loss=1.41, discriminator_fake_loss=1.168, generator_loss=30.5, generator_mel_loss=21.33, generator_kl_loss=1.905, generator_dur_loss=1.647, generator_adv_loss=2.09, generator_feat_match_loss=3.527, over 110.00 samples.], tot_loss[discriminator_loss=2.644, discriminator_real_loss=1.367, discriminator_fake_loss=1.277, generator_loss=30.22, generator_mel_loss=21.14, generator_kl_loss=1.937, generator_dur_loss=1.673, generator_adv_loss=2.074, generator_feat_match_loss=3.396, over 780.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 18:44:16,022 INFO [train.py:811] (0/4) Start epoch 368 2023-11-13 18:46:36,932 INFO [train.py:467] (0/4) Epoch 368, batch 21, global_batch_idx: 13600, batch size: 101, loss[discriminator_loss=2.572, discriminator_real_loss=1.326, discriminator_fake_loss=1.246, generator_loss=30.84, generator_mel_loss=21, generator_kl_loss=1.963, generator_dur_loss=1.666, generator_adv_loss=2.148, generator_feat_match_loss=4.062, over 101.00 samples.], tot_loss[discriminator_loss=2.664, discriminator_real_loss=1.362, discriminator_fake_loss=1.302, generator_loss=30.22, generator_mel_loss=21.12, generator_kl_loss=1.932, generator_dur_loss=1.68, generator_adv_loss=2.129, generator_feat_match_loss=3.35, over 1657.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 32.0 2023-11-13 18:46:37,430 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 18:46:48,969 INFO [train.py:517] (0/4) Epoch 368, validation: discriminator_loss=2.436, discriminator_real_loss=1.103, discriminator_fake_loss=1.333, generator_loss=31.34, generator_mel_loss=21.77, generator_kl_loss=2.029, generator_dur_loss=1.646, generator_adv_loss=1.987, generator_feat_match_loss=3.909, over 100.00 samples. 2023-11-13 18:46:48,970 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27258MB 2023-11-13 18:48:07,651 INFO [train.py:811] (0/4) Start epoch 369 2023-11-13 18:51:27,666 INFO [train.py:467] (0/4) Epoch 369, batch 34, global_batch_idx: 13650, batch size: 71, loss[discriminator_loss=2.582, discriminator_real_loss=1.34, discriminator_fake_loss=1.242, generator_loss=30.22, generator_mel_loss=20.99, generator_kl_loss=1.936, generator_dur_loss=1.67, generator_adv_loss=2.191, generator_feat_match_loss=3.426, over 71.00 samples.], tot_loss[discriminator_loss=2.637, discriminator_real_loss=1.324, discriminator_fake_loss=1.313, generator_loss=30.28, generator_mel_loss=21.14, generator_kl_loss=1.952, generator_dur_loss=1.67, generator_adv_loss=2.105, generator_feat_match_loss=3.411, over 2563.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 18:51:41,924 INFO [train.py:811] (0/4) Start epoch 370 2023-11-13 18:55:16,050 INFO [train.py:811] (0/4) Start epoch 371 2023-11-13 18:56:25,841 INFO [train.py:467] (0/4) Epoch 371, batch 10, global_batch_idx: 13700, batch size: 50, loss[discriminator_loss=2.586, discriminator_real_loss=1.274, discriminator_fake_loss=1.312, generator_loss=30.33, generator_mel_loss=20.4, generator_kl_loss=1.97, generator_dur_loss=1.712, generator_adv_loss=2.24, generator_feat_match_loss=4.004, over 50.00 samples.], tot_loss[discriminator_loss=2.586, discriminator_real_loss=1.325, discriminator_fake_loss=1.261, generator_loss=30.25, generator_mel_loss=20.76, generator_kl_loss=1.936, generator_dur_loss=1.681, generator_adv_loss=2.232, generator_feat_match_loss=3.631, over 726.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 18:58:47,466 INFO [train.py:811] (0/4) 
Start epoch 372 2023-11-13 19:01:09,309 INFO [train.py:467] (0/4) Epoch 372, batch 23, global_batch_idx: 13750, batch size: 110, loss[discriminator_loss=2.551, discriminator_real_loss=1.239, discriminator_fake_loss=1.311, generator_loss=31.06, generator_mel_loss=21.81, generator_kl_loss=1.906, generator_dur_loss=1.663, generator_adv_loss=2.139, generator_feat_match_loss=3.541, over 110.00 samples.], tot_loss[discriminator_loss=2.585, discriminator_real_loss=1.294, discriminator_fake_loss=1.292, generator_loss=30.53, generator_mel_loss=21.16, generator_kl_loss=1.942, generator_dur_loss=1.675, generator_adv_loss=2.164, generator_feat_match_loss=3.589, over 1931.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 19:02:21,029 INFO [train.py:811] (0/4) Start epoch 373 2023-11-13 19:05:55,184 INFO [train.py:467] (0/4) Epoch 373, batch 36, global_batch_idx: 13800, batch size: 153, loss[discriminator_loss=2.457, discriminator_real_loss=1.283, discriminator_fake_loss=1.174, generator_loss=30.94, generator_mel_loss=20.98, generator_kl_loss=1.94, generator_dur_loss=1.641, generator_adv_loss=2.326, generator_feat_match_loss=4.051, over 153.00 samples.], tot_loss[discriminator_loss=2.611, discriminator_real_loss=1.325, discriminator_fake_loss=1.287, generator_loss=30.66, generator_mel_loss=21.04, generator_kl_loss=1.935, generator_dur_loss=1.674, generator_adv_loss=2.259, generator_feat_match_loss=3.755, over 2760.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 8.0 2023-11-13 19:05:55,683 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 19:06:07,435 INFO [train.py:517] (0/4) Epoch 373, validation: discriminator_loss=2.515, discriminator_real_loss=1.122, discriminator_fake_loss=1.393, generator_loss=31.29, generator_mel_loss=21.7, generator_kl_loss=2.09, generator_dur_loss=1.652, generator_adv_loss=2.007, generator_feat_match_loss=3.836, over 100.00 samples. 
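The grad_scale field swings between 8, 16, and 32 across these entries, which is the signature of fp16 training with a dynamic loss scaler: the scale is halved when scaled gradients overflow and grown again after a run of clean steps. Below is a minimal sketch of the standard torch.cuda.amp pattern, not the recipe's actual loop (which runs separate generator and discriminator updates); `model`, `optimizer`, `criterion`, and `batch` are placeholders.

```python
# Minimal sketch of dynamic loss scaling with torch.cuda.amp. With PyTorch's
# defaults the scale halves on overflow and doubles after ~2000 clean steps;
# the exact cadence seen in this run depends on the recipe's settings.
import torch

scaler = torch.cuda.amp.GradScaler()

def training_step(model, optimizer, criterion, batch):
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():       # forward in mixed precision
        loss = criterion(model(batch))
    scaler.scale(loss).backward()         # backward on the scaled loss
    scaler.step(optimizer)                # step is skipped on inf/nan grads
    scaler.update()                       # adjust the scale for the next step
    return loss.detach(), scaler.get_scale()  # get_scale() == logged grad_scale
```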
2023-11-13 19:06:07,436 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27258MB 2023-11-13 19:06:08,084 INFO [train.py:811] (0/4) Start epoch 374 2023-11-13 19:09:38,960 INFO [train.py:811] (0/4) Start epoch 375 2023-11-13 19:10:53,233 INFO [train.py:467] (0/4) Epoch 375, batch 12, global_batch_idx: 13850, batch size: 153, loss[discriminator_loss=2.598, discriminator_real_loss=1.24, discriminator_fake_loss=1.357, generator_loss=30.94, generator_mel_loss=21.21, generator_kl_loss=1.979, generator_dur_loss=1.652, generator_adv_loss=2.295, generator_feat_match_loss=3.807, over 153.00 samples.], tot_loss[discriminator_loss=2.631, discriminator_real_loss=1.328, discriminator_fake_loss=1.303, generator_loss=30.3, generator_mel_loss=21.07, generator_kl_loss=1.95, generator_dur_loss=1.671, generator_adv_loss=2.158, generator_feat_match_loss=3.455, over 866.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 8.0 2023-11-13 19:13:09,564 INFO [train.py:811] (0/4) Start epoch 376 2023-11-13 19:15:39,023 INFO [train.py:467] (0/4) Epoch 376, batch 25, global_batch_idx: 13900, batch size: 85, loss[discriminator_loss=2.844, discriminator_real_loss=1.576, discriminator_fake_loss=1.267, generator_loss=30.39, generator_mel_loss=21.01, generator_kl_loss=1.979, generator_dur_loss=1.693, generator_adv_loss=2.27, generator_feat_match_loss=3.43, over 85.00 samples.], tot_loss[discriminator_loss=2.635, discriminator_real_loss=1.345, discriminator_fake_loss=1.291, generator_loss=30.62, generator_mel_loss=21.11, generator_kl_loss=1.957, generator_dur_loss=1.679, generator_adv_loss=2.242, generator_feat_match_loss=3.632, over 1884.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 8.0 2023-11-13 19:16:40,683 INFO [train.py:811] (0/4) Start epoch 377 2023-11-13 19:20:13,300 INFO [train.py:811] (0/4) Start epoch 378 2023-11-13 19:20:36,033 INFO [train.py:467] (0/4) Epoch 378, batch 1, global_batch_idx: 13950, batch size: 81, loss[discriminator_loss=2.613, discriminator_real_loss=1.311, discriminator_fake_loss=1.302, generator_loss=30.4, generator_mel_loss=21.21, generator_kl_loss=1.916, generator_dur_loss=1.657, generator_adv_loss=2.045, generator_feat_match_loss=3.568, over 81.00 samples.], tot_loss[discriminator_loss=2.656, discriminator_real_loss=1.421, discriminator_fake_loss=1.235, generator_loss=30.26, generator_mel_loss=21.12, generator_kl_loss=1.912, generator_dur_loss=1.667, generator_adv_loss=2.071, generator_feat_match_loss=3.483, over 162.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 8.0 2023-11-13 19:23:53,740 INFO [train.py:811] (0/4) Start epoch 379 2023-11-13 19:25:22,644 INFO [train.py:467] (0/4) Epoch 379, batch 14, global_batch_idx: 14000, batch size: 101, loss[discriminator_loss=2.811, discriminator_real_loss=1.328, discriminator_fake_loss=1.482, generator_loss=30.3, generator_mel_loss=21.36, generator_kl_loss=1.951, generator_dur_loss=1.649, generator_adv_loss=2.18, generator_feat_match_loss=3.162, over 101.00 samples.], tot_loss[discriminator_loss=2.644, discriminator_real_loss=1.326, discriminator_fake_loss=1.318, generator_loss=30.4, generator_mel_loss=21.18, generator_kl_loss=1.95, generator_dur_loss=1.673, generator_adv_loss=2.121, generator_feat_match_loss=3.477, over 1207.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 19:25:23,145 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 19:25:34,172 INFO [train.py:517] (0/4) Epoch 379, validation: discriminator_loss=2.685, 
discriminator_real_loss=1.545, discriminator_fake_loss=1.14, generator_loss=31.28, generator_mel_loss=21.94, generator_kl_loss=2.153, generator_dur_loss=1.648, generator_adv_loss=2.175, generator_feat_match_loss=3.365, over 100.00 samples. 2023-11-13 19:25:34,174 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27258MB 2023-11-13 19:27:37,064 INFO [train.py:811] (0/4) Start epoch 380 2023-11-13 19:30:11,278 INFO [train.py:467] (0/4) Epoch 380, batch 27, global_batch_idx: 14050, batch size: 63, loss[discriminator_loss=2.613, discriminator_real_loss=1.431, discriminator_fake_loss=1.182, generator_loss=30.61, generator_mel_loss=21.17, generator_kl_loss=1.967, generator_dur_loss=1.694, generator_adv_loss=2.332, generator_feat_match_loss=3.449, over 63.00 samples.], tot_loss[discriminator_loss=2.664, discriminator_real_loss=1.35, discriminator_fake_loss=1.315, generator_loss=30.6, generator_mel_loss=21.27, generator_kl_loss=1.935, generator_dur_loss=1.671, generator_adv_loss=2.197, generator_feat_match_loss=3.524, over 1981.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 19:31:03,327 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-380.pt 2023-11-13 19:31:06,794 INFO [train.py:811] (0/4) Start epoch 381 2023-11-13 19:34:38,425 INFO [train.py:811] (0/4) Start epoch 382 2023-11-13 19:35:11,664 INFO [train.py:467] (0/4) Epoch 382, batch 3, global_batch_idx: 14100, batch size: 59, loss[discriminator_loss=2.547, discriminator_real_loss=1.361, discriminator_fake_loss=1.186, generator_loss=30.46, generator_mel_loss=21.03, generator_kl_loss=1.914, generator_dur_loss=1.676, generator_adv_loss=2.176, generator_feat_match_loss=3.666, over 59.00 samples.], tot_loss[discriminator_loss=2.614, discriminator_real_loss=1.348, discriminator_fake_loss=1.267, generator_loss=30.27, generator_mel_loss=20.98, generator_kl_loss=1.902, generator_dur_loss=1.678, generator_adv_loss=2.182, generator_feat_match_loss=3.534, over 266.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 19:38:13,257 INFO [train.py:811] (0/4) Start epoch 383 2023-11-13 19:40:00,779 INFO [train.py:467] (0/4) Epoch 383, batch 16, global_batch_idx: 14150, batch size: 76, loss[discriminator_loss=2.584, discriminator_real_loss=1.229, discriminator_fake_loss=1.355, generator_loss=31.07, generator_mel_loss=21.08, generator_kl_loss=1.931, generator_dur_loss=1.69, generator_adv_loss=2.168, generator_feat_match_loss=4.199, over 76.00 samples.], tot_loss[discriminator_loss=2.629, discriminator_real_loss=1.337, discriminator_fake_loss=1.293, generator_loss=30.54, generator_mel_loss=21.07, generator_kl_loss=1.909, generator_dur_loss=1.675, generator_adv_loss=2.245, generator_feat_match_loss=3.641, over 1287.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 8.0 2023-11-13 19:41:50,242 INFO [train.py:811] (0/4) Start epoch 384 2023-11-13 19:44:47,527 INFO [train.py:467] (0/4) Epoch 384, batch 29, global_batch_idx: 14200, batch size: 69, loss[discriminator_loss=2.715, discriminator_real_loss=1.193, discriminator_fake_loss=1.521, generator_loss=29.9, generator_mel_loss=20.79, generator_kl_loss=1.955, generator_dur_loss=1.665, generator_adv_loss=1.912, generator_feat_match_loss=3.58, over 69.00 samples.], tot_loss[discriminator_loss=2.594, discriminator_real_loss=1.318, discriminator_fake_loss=1.276, generator_loss=30.35, generator_mel_loss=20.82, generator_kl_loss=1.937, generator_dur_loss=1.677, generator_adv_loss=2.204, 
generator_feat_match_loss=3.722, over 2217.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 8.0 2023-11-13 19:44:48,151 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 19:44:59,377 INFO [train.py:517] (0/4) Epoch 384, validation: discriminator_loss=2.806, discriminator_real_loss=1.156, discriminator_fake_loss=1.65, generator_loss=30.87, generator_mel_loss=22.28, generator_kl_loss=1.928, generator_dur_loss=1.651, generator_adv_loss=1.578, generator_feat_match_loss=3.435, over 100.00 samples. 2023-11-13 19:44:59,379 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27258MB 2023-11-13 19:45:34,897 INFO [train.py:811] (0/4) Start epoch 385 2023-11-13 19:49:10,613 INFO [train.py:811] (0/4) Start epoch 386 2023-11-13 19:49:54,851 INFO [train.py:467] (0/4) Epoch 386, batch 5, global_batch_idx: 14250, batch size: 81, loss[discriminator_loss=2.867, discriminator_real_loss=1.611, discriminator_fake_loss=1.255, generator_loss=31.13, generator_mel_loss=21.14, generator_kl_loss=1.982, generator_dur_loss=1.693, generator_adv_loss=2.539, generator_feat_match_loss=3.777, over 81.00 samples.], tot_loss[discriminator_loss=2.717, discriminator_real_loss=1.432, discriminator_fake_loss=1.284, generator_loss=30.84, generator_mel_loss=21.15, generator_kl_loss=1.949, generator_dur_loss=1.682, generator_adv_loss=2.342, generator_feat_match_loss=3.719, over 380.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 8.0 2023-11-13 19:52:48,385 INFO [train.py:811] (0/4) Start epoch 387 2023-11-13 19:54:42,890 INFO [train.py:467] (0/4) Epoch 387, batch 18, global_batch_idx: 14300, batch size: 65, loss[discriminator_loss=2.688, discriminator_real_loss=1.284, discriminator_fake_loss=1.402, generator_loss=30.33, generator_mel_loss=21.31, generator_kl_loss=1.918, generator_dur_loss=1.671, generator_adv_loss=2.162, generator_feat_match_loss=3.27, over 65.00 samples.], tot_loss[discriminator_loss=2.661, discriminator_real_loss=1.354, discriminator_fake_loss=1.307, generator_loss=30.03, generator_mel_loss=21.06, generator_kl_loss=1.953, generator_dur_loss=1.673, generator_adv_loss=2.056, generator_feat_match_loss=3.28, over 1374.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 8.0 2023-11-13 19:56:22,267 INFO [train.py:811] (0/4) Start epoch 388 2023-11-13 19:59:33,603 INFO [train.py:467] (0/4) Epoch 388, batch 31, global_batch_idx: 14350, batch size: 64, loss[discriminator_loss=2.564, discriminator_real_loss=1.432, discriminator_fake_loss=1.133, generator_loss=30.25, generator_mel_loss=20.27, generator_kl_loss=1.907, generator_dur_loss=1.668, generator_adv_loss=2.375, generator_feat_match_loss=4.031, over 64.00 samples.], tot_loss[discriminator_loss=2.684, discriminator_real_loss=1.39, discriminator_fake_loss=1.294, generator_loss=30.23, generator_mel_loss=20.88, generator_kl_loss=1.948, generator_dur_loss=1.675, generator_adv_loss=2.188, generator_feat_match_loss=3.543, over 2217.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 8.0 2023-11-13 19:59:56,139 INFO [train.py:811] (0/4) Start epoch 389 2023-11-13 20:03:27,415 INFO [train.py:811] (0/4) Start epoch 390 2023-11-13 20:04:17,262 INFO [train.py:467] (0/4) Epoch 390, batch 7, global_batch_idx: 14400, batch size: 49, loss[discriminator_loss=2.754, discriminator_real_loss=1.562, discriminator_fake_loss=1.191, generator_loss=29.68, generator_mel_loss=20.87, generator_kl_loss=1.887, generator_dur_loss=1.663, generator_adv_loss=2.115, generator_feat_match_loss=3.143, over 49.00 
samples.], tot_loss[discriminator_loss=2.68, discriminator_real_loss=1.362, discriminator_fake_loss=1.317, generator_loss=30.03, generator_mel_loss=20.96, generator_kl_loss=1.948, generator_dur_loss=1.676, generator_adv_loss=2.127, generator_feat_match_loss=3.321, over 497.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2023-11-13 20:04:17,731 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 20:04:28,951 INFO [train.py:517] (0/4) Epoch 390, validation: discriminator_loss=2.847, discriminator_real_loss=1.415, discriminator_fake_loss=1.432, generator_loss=30.63, generator_mel_loss=21.9, generator_kl_loss=2.055, generator_dur_loss=1.644, generator_adv_loss=1.844, generator_feat_match_loss=3.191, over 100.00 samples. 2023-11-13 20:04:28,952 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27258MB 2023-11-13 20:07:11,060 INFO [train.py:811] (0/4) Start epoch 391 2023-11-13 20:09:14,854 INFO [train.py:467] (0/4) Epoch 391, batch 20, global_batch_idx: 14450, batch size: 64, loss[discriminator_loss=2.637, discriminator_real_loss=1.396, discriminator_fake_loss=1.241, generator_loss=29.91, generator_mel_loss=20.76, generator_kl_loss=1.938, generator_dur_loss=1.681, generator_adv_loss=2.123, generator_feat_match_loss=3.406, over 64.00 samples.], tot_loss[discriminator_loss=2.668, discriminator_real_loss=1.352, discriminator_fake_loss=1.316, generator_loss=30.22, generator_mel_loss=21.05, generator_kl_loss=1.921, generator_dur_loss=1.674, generator_adv_loss=2.118, generator_feat_match_loss=3.46, over 1619.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 20:10:43,980 INFO [train.py:811] (0/4) Start epoch 392 2023-11-13 20:13:52,960 INFO [train.py:467] (0/4) Epoch 392, batch 33, global_batch_idx: 14500, batch size: 101, loss[discriminator_loss=2.682, discriminator_real_loss=1.529, discriminator_fake_loss=1.152, generator_loss=30.79, generator_mel_loss=21.51, generator_kl_loss=1.97, generator_dur_loss=1.688, generator_adv_loss=1.871, generator_feat_match_loss=3.746, over 101.00 samples.], tot_loss[discriminator_loss=2.677, discriminator_real_loss=1.364, discriminator_fake_loss=1.314, generator_loss=30.49, generator_mel_loss=21.31, generator_kl_loss=1.958, generator_dur_loss=1.673, generator_adv_loss=2.102, generator_feat_match_loss=3.441, over 2430.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 20:14:14,844 INFO [train.py:811] (0/4) Start epoch 393 2023-11-13 20:17:40,905 INFO [train.py:811] (0/4) Start epoch 394 2023-11-13 20:18:50,284 INFO [train.py:467] (0/4) Epoch 394, batch 9, global_batch_idx: 14550, batch size: 58, loss[discriminator_loss=2.615, discriminator_real_loss=1.26, discriminator_fake_loss=1.355, generator_loss=30.61, generator_mel_loss=21.44, generator_kl_loss=1.959, generator_dur_loss=1.675, generator_adv_loss=2.141, generator_feat_match_loss=3.398, over 58.00 samples.], tot_loss[discriminator_loss=2.61, discriminator_real_loss=1.323, discriminator_fake_loss=1.287, generator_loss=30.52, generator_mel_loss=21.15, generator_kl_loss=1.976, generator_dur_loss=1.671, generator_adv_loss=2.137, generator_feat_match_loss=3.588, over 883.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 20:21:12,305 INFO [train.py:811] (0/4) Start epoch 395 2023-11-13 20:23:27,814 INFO [train.py:467] (0/4) Epoch 395, batch 22, global_batch_idx: 14600, batch size: 101, loss[discriminator_loss=2.523, discriminator_real_loss=1.252, discriminator_fake_loss=1.271, 
generator_loss=31.17, generator_mel_loss=21.27, generator_kl_loss=2.074, generator_dur_loss=1.636, generator_adv_loss=2.273, generator_feat_match_loss=3.918, over 101.00 samples.], tot_loss[discriminator_loss=2.574, discriminator_real_loss=1.302, discriminator_fake_loss=1.272, generator_loss=30.4, generator_mel_loss=20.95, generator_kl_loss=1.971, generator_dur_loss=1.671, generator_adv_loss=2.187, generator_feat_match_loss=3.621, over 1687.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 20:23:28,359 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 20:23:39,079 INFO [train.py:517] (0/4) Epoch 395, validation: discriminator_loss=2.607, discriminator_real_loss=1.199, discriminator_fake_loss=1.408, generator_loss=31.02, generator_mel_loss=21.93, generator_kl_loss=2.106, generator_dur_loss=1.653, generator_adv_loss=1.804, generator_feat_match_loss=3.531, over 100.00 samples. 2023-11-13 20:23:39,081 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27258MB 2023-11-13 20:24:56,979 INFO [train.py:811] (0/4) Start epoch 396 2023-11-13 20:28:20,203 INFO [train.py:467] (0/4) Epoch 396, batch 35, global_batch_idx: 14650, batch size: 65, loss[discriminator_loss=2.602, discriminator_real_loss=1.409, discriminator_fake_loss=1.192, generator_loss=30.44, generator_mel_loss=21.16, generator_kl_loss=1.987, generator_dur_loss=1.66, generator_adv_loss=1.998, generator_feat_match_loss=3.631, over 65.00 samples.], tot_loss[discriminator_loss=2.615, discriminator_real_loss=1.33, discriminator_fake_loss=1.285, generator_loss=30.56, generator_mel_loss=20.98, generator_kl_loss=1.951, generator_dur_loss=1.666, generator_adv_loss=2.224, generator_feat_match_loss=3.737, over 2647.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 20:28:27,959 INFO [train.py:811] (0/4) Start epoch 397 2023-11-13 20:32:03,282 INFO [train.py:811] (0/4) Start epoch 398 2023-11-13 20:33:20,244 INFO [train.py:467] (0/4) Epoch 398, batch 11, global_batch_idx: 14700, batch size: 50, loss[discriminator_loss=2.779, discriminator_real_loss=1.404, discriminator_fake_loss=1.375, generator_loss=30.24, generator_mel_loss=21.45, generator_kl_loss=2.057, generator_dur_loss=1.682, generator_adv_loss=2.004, generator_feat_match_loss=3.047, over 50.00 samples.], tot_loss[discriminator_loss=2.626, discriminator_real_loss=1.323, discriminator_fake_loss=1.303, generator_loss=30.33, generator_mel_loss=21.08, generator_kl_loss=1.968, generator_dur_loss=1.674, generator_adv_loss=2.119, generator_feat_match_loss=3.486, over 893.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 20:35:35,874 INFO [train.py:811] (0/4) Start epoch 399 2023-11-13 20:38:01,191 INFO [train.py:467] (0/4) Epoch 399, batch 24, global_batch_idx: 14750, batch size: 58, loss[discriminator_loss=2.672, discriminator_real_loss=1.334, discriminator_fake_loss=1.338, generator_loss=30.35, generator_mel_loss=20.99, generator_kl_loss=2.016, generator_dur_loss=1.677, generator_adv_loss=2.217, generator_feat_match_loss=3.457, over 58.00 samples.], tot_loss[discriminator_loss=2.646, discriminator_real_loss=1.347, discriminator_fake_loss=1.298, generator_loss=30.36, generator_mel_loss=21.16, generator_kl_loss=1.989, generator_dur_loss=1.678, generator_adv_loss=2.107, generator_feat_match_loss=3.434, over 1862.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 20:39:10,537 INFO [train.py:811] (0/4) Start epoch 400 2023-11-13 20:42:42,886 INFO 
[utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-400.pt 2023-11-13 20:42:46,112 INFO [train.py:811] (0/4) Start epoch 401 2023-11-13 20:42:59,068 INFO [train.py:467] (0/4) Epoch 401, batch 0, global_batch_idx: 14800, batch size: 76, loss[discriminator_loss=2.676, discriminator_real_loss=1.447, discriminator_fake_loss=1.229, generator_loss=30.24, generator_mel_loss=21.03, generator_kl_loss=1.99, generator_dur_loss=1.688, generator_adv_loss=1.863, generator_feat_match_loss=3.664, over 76.00 samples.], tot_loss[discriminator_loss=2.676, discriminator_real_loss=1.447, discriminator_fake_loss=1.229, generator_loss=30.24, generator_mel_loss=21.03, generator_kl_loss=1.99, generator_dur_loss=1.688, generator_adv_loss=1.863, generator_feat_match_loss=3.664, over 76.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 20:42:59,607 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 20:43:10,795 INFO [train.py:517] (0/4) Epoch 401, validation: discriminator_loss=2.685, discriminator_real_loss=1.134, discriminator_fake_loss=1.55, generator_loss=30.75, generator_mel_loss=21.67, generator_kl_loss=2.09, generator_dur_loss=1.65, generator_adv_loss=1.705, generator_feat_match_loss=3.639, over 100.00 samples. 2023-11-13 20:43:10,796 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-13 20:46:21,586 INFO [train.py:811] (0/4) Start epoch 402 2023-11-13 20:47:50,186 INFO [train.py:467] (0/4) Epoch 402, batch 13, global_batch_idx: 14850, batch size: 79, loss[discriminator_loss=2.469, discriminator_real_loss=1.195, discriminator_fake_loss=1.274, generator_loss=31.03, generator_mel_loss=20.71, generator_kl_loss=1.928, generator_dur_loss=1.654, generator_adv_loss=2.393, generator_feat_match_loss=4.344, over 79.00 samples.], tot_loss[discriminator_loss=2.66, discriminator_real_loss=1.383, discriminator_fake_loss=1.277, generator_loss=30.7, generator_mel_loss=21.02, generator_kl_loss=1.955, generator_dur_loss=1.679, generator_adv_loss=2.236, generator_feat_match_loss=3.815, over 977.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 20:49:53,083 INFO [train.py:811] (0/4) Start epoch 403 2023-11-13 20:52:22,030 INFO [train.py:467] (0/4) Epoch 403, batch 26, global_batch_idx: 14900, batch size: 90, loss[discriminator_loss=2.662, discriminator_real_loss=1.391, discriminator_fake_loss=1.271, generator_loss=30.29, generator_mel_loss=21.23, generator_kl_loss=1.859, generator_dur_loss=1.681, generator_adv_loss=1.936, generator_feat_match_loss=3.58, over 90.00 samples.], tot_loss[discriminator_loss=2.642, discriminator_real_loss=1.342, discriminator_fake_loss=1.3, generator_loss=30.34, generator_mel_loss=21.07, generator_kl_loss=1.959, generator_dur_loss=1.678, generator_adv_loss=2.115, generator_feat_match_loss=3.512, over 1754.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 20:53:20,764 INFO [train.py:811] (0/4) Start epoch 404 2023-11-13 20:56:47,738 INFO [train.py:811] (0/4) Start epoch 405 2023-11-13 20:57:17,928 INFO [train.py:467] (0/4) Epoch 405, batch 2, global_batch_idx: 14950, batch size: 59, loss[discriminator_loss=2.516, discriminator_real_loss=1.24, discriminator_fake_loss=1.276, generator_loss=31.25, generator_mel_loss=21.41, generator_kl_loss=2.036, generator_dur_loss=1.702, generator_adv_loss=2.377, generator_feat_match_loss=3.727, over 59.00 samples.], tot_loss[discriminator_loss=2.543, discriminator_real_loss=1.258, 
discriminator_fake_loss=1.285, generator_loss=31.1, generator_mel_loss=21.33, generator_kl_loss=1.967, generator_dur_loss=1.674, generator_adv_loss=2.224, generator_feat_match_loss=3.899, over 322.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 21:00:21,030 INFO [train.py:811] (0/4) Start epoch 406 2023-11-13 21:01:56,683 INFO [train.py:467] (0/4) Epoch 406, batch 15, global_batch_idx: 15000, batch size: 81, loss[discriminator_loss=2.668, discriminator_real_loss=1.15, discriminator_fake_loss=1.518, generator_loss=30.5, generator_mel_loss=21.11, generator_kl_loss=2.063, generator_dur_loss=1.68, generator_adv_loss=2.186, generator_feat_match_loss=3.461, over 81.00 samples.], tot_loss[discriminator_loss=2.613, discriminator_real_loss=1.302, discriminator_fake_loss=1.312, generator_loss=30.98, generator_mel_loss=21.2, generator_kl_loss=1.987, generator_dur_loss=1.671, generator_adv_loss=2.265, generator_feat_match_loss=3.852, over 1264.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 21:01:57,144 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 21:02:07,736 INFO [train.py:517] (0/4) Epoch 406, validation: discriminator_loss=2.92, discriminator_real_loss=1.335, discriminator_fake_loss=1.585, generator_loss=30.66, generator_mel_loss=21.84, generator_kl_loss=2.085, generator_dur_loss=1.638, generator_adv_loss=1.74, generator_feat_match_loss=3.348, over 100.00 samples. 2023-11-13 21:02:07,737 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-13 21:04:08,464 INFO [train.py:811] (0/4) Start epoch 407 2023-11-13 21:06:57,073 INFO [train.py:467] (0/4) Epoch 407, batch 28, global_batch_idx: 15050, batch size: 153, loss[discriminator_loss=2.75, discriminator_real_loss=1.505, discriminator_fake_loss=1.245, generator_loss=29.95, generator_mel_loss=20.95, generator_kl_loss=1.952, generator_dur_loss=1.654, generator_adv_loss=1.947, generator_feat_match_loss=3.451, over 153.00 samples.], tot_loss[discriminator_loss=2.639, discriminator_real_loss=1.351, discriminator_fake_loss=1.288, generator_loss=30.1, generator_mel_loss=20.78, generator_kl_loss=1.958, generator_dur_loss=1.667, generator_adv_loss=2.142, generator_feat_match_loss=3.559, over 2168.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 21:07:41,752 INFO [train.py:811] (0/4) Start epoch 408 2023-11-13 21:11:10,036 INFO [train.py:811] (0/4) Start epoch 409 2023-11-13 21:11:44,665 INFO [train.py:467] (0/4) Epoch 409, batch 4, global_batch_idx: 15100, batch size: 63, loss[discriminator_loss=2.686, discriminator_real_loss=1.326, discriminator_fake_loss=1.359, generator_loss=29.57, generator_mel_loss=20.62, generator_kl_loss=2.018, generator_dur_loss=1.686, generator_adv_loss=2.062, generator_feat_match_loss=3.184, over 63.00 samples.], tot_loss[discriminator_loss=2.669, discriminator_real_loss=1.373, discriminator_fake_loss=1.295, generator_loss=30.38, generator_mel_loss=21.22, generator_kl_loss=1.991, generator_dur_loss=1.669, generator_adv_loss=2.091, generator_feat_match_loss=3.406, over 387.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 21:14:40,439 INFO [train.py:811] (0/4) Start epoch 410 2023-11-13 21:16:30,390 INFO [train.py:467] (0/4) Epoch 410, batch 17, global_batch_idx: 15150, batch size: 73, loss[discriminator_loss=2.652, discriminator_real_loss=1.446, discriminator_fake_loss=1.206, generator_loss=30.31, generator_mel_loss=21.27, generator_kl_loss=1.99, 
generator_dur_loss=1.644, generator_adv_loss=1.959, generator_feat_match_loss=3.447, over 73.00 samples.], tot_loss[discriminator_loss=2.631, discriminator_real_loss=1.328, discriminator_fake_loss=1.302, generator_loss=30.35, generator_mel_loss=21.05, generator_kl_loss=1.972, generator_dur_loss=1.67, generator_adv_loss=2.153, generator_feat_match_loss=3.513, over 1487.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 21:18:08,670 INFO [train.py:811] (0/4) Start epoch 411 2023-11-13 21:21:10,649 INFO [train.py:467] (0/4) Epoch 411, batch 30, global_batch_idx: 15200, batch size: 60, loss[discriminator_loss=2.875, discriminator_real_loss=1.678, discriminator_fake_loss=1.196, generator_loss=29.92, generator_mel_loss=20.97, generator_kl_loss=1.998, generator_dur_loss=1.677, generator_adv_loss=2.186, generator_feat_match_loss=3.09, over 60.00 samples.], tot_loss[discriminator_loss=2.643, discriminator_real_loss=1.348, discriminator_fake_loss=1.295, generator_loss=30.55, generator_mel_loss=21, generator_kl_loss=1.956, generator_dur_loss=1.662, generator_adv_loss=2.237, generator_feat_match_loss=3.702, over 2367.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 32.0 2023-11-13 21:21:11,126 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 21:21:21,733 INFO [train.py:517] (0/4) Epoch 411, validation: discriminator_loss=2.778, discriminator_real_loss=1.515, discriminator_fake_loss=1.263, generator_loss=30.97, generator_mel_loss=21.71, generator_kl_loss=2.018, generator_dur_loss=1.643, generator_adv_loss=2.269, generator_feat_match_loss=3.328, over 100.00 samples. 2023-11-13 21:21:21,734 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-13 21:21:51,667 INFO [train.py:811] (0/4) Start epoch 412 2023-11-13 21:25:24,642 INFO [train.py:811] (0/4) Start epoch 413 2023-11-13 21:26:11,831 INFO [train.py:467] (0/4) Epoch 413, batch 6, global_batch_idx: 15250, batch size: 79, loss[discriminator_loss=2.633, discriminator_real_loss=1.29, discriminator_fake_loss=1.343, generator_loss=29.85, generator_mel_loss=20.82, generator_kl_loss=1.935, generator_dur_loss=1.666, generator_adv_loss=2.035, generator_feat_match_loss=3.387, over 79.00 samples.], tot_loss[discriminator_loss=2.603, discriminator_real_loss=1.299, discriminator_fake_loss=1.303, generator_loss=30.13, generator_mel_loss=21.01, generator_kl_loss=1.974, generator_dur_loss=1.679, generator_adv_loss=2.044, generator_feat_match_loss=3.423, over 530.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 21:28:57,438 INFO [train.py:811] (0/4) Start epoch 414 2023-11-13 21:30:49,771 INFO [train.py:467] (0/4) Epoch 414, batch 19, global_batch_idx: 15300, batch size: 101, loss[discriminator_loss=2.492, discriminator_real_loss=1.205, discriminator_fake_loss=1.286, generator_loss=30.95, generator_mel_loss=20.77, generator_kl_loss=2.013, generator_dur_loss=1.664, generator_adv_loss=2.359, generator_feat_match_loss=4.141, over 101.00 samples.], tot_loss[discriminator_loss=2.587, discriminator_real_loss=1.314, discriminator_fake_loss=1.273, generator_loss=30.59, generator_mel_loss=20.76, generator_kl_loss=1.96, generator_dur_loss=1.676, generator_adv_loss=2.277, generator_feat_match_loss=3.912, over 1232.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 21:32:26,928 INFO [train.py:811] (0/4) Start epoch 415 2023-11-13 21:35:34,913 INFO [train.py:467] (0/4) Epoch 415, batch 32, global_batch_idx: 15350, batch size: 50, 
loss[discriminator_loss=2.652, discriminator_real_loss=1.494, discriminator_fake_loss=1.158, generator_loss=29.69, generator_mel_loss=20.82, generator_kl_loss=1.895, generator_dur_loss=1.649, generator_adv_loss=1.943, generator_feat_match_loss=3.383, over 50.00 samples.], tot_loss[discriminator_loss=2.645, discriminator_real_loss=1.345, discriminator_fake_loss=1.3, generator_loss=30.16, generator_mel_loss=21, generator_kl_loss=1.962, generator_dur_loss=1.675, generator_adv_loss=2.089, generator_feat_match_loss=3.434, over 2379.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 21:36:01,093 INFO [train.py:811] (0/4) Start epoch 416 2023-11-13 21:39:30,079 INFO [train.py:811] (0/4) Start epoch 417 2023-11-13 21:40:26,689 INFO [train.py:467] (0/4) Epoch 417, batch 8, global_batch_idx: 15400, batch size: 65, loss[discriminator_loss=2.572, discriminator_real_loss=1.235, discriminator_fake_loss=1.337, generator_loss=30.48, generator_mel_loss=20.98, generator_kl_loss=2.02, generator_dur_loss=1.67, generator_adv_loss=2.129, generator_feat_match_loss=3.684, over 65.00 samples.], tot_loss[discriminator_loss=2.594, discriminator_real_loss=1.302, discriminator_fake_loss=1.291, generator_loss=30.78, generator_mel_loss=21.23, generator_kl_loss=1.998, generator_dur_loss=1.675, generator_adv_loss=2.183, generator_feat_match_loss=3.694, over 608.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 21:40:27,154 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 21:40:38,427 INFO [train.py:517] (0/4) Epoch 417, validation: discriminator_loss=2.501, discriminator_real_loss=1.233, discriminator_fake_loss=1.268, generator_loss=31.27, generator_mel_loss=21.79, generator_kl_loss=1.954, generator_dur_loss=1.645, generator_adv_loss=2.051, generator_feat_match_loss=3.831, over 100.00 samples. 
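Note: each "Computing validation loss" line above is followed by a single summary in which every loss term is averaged "over 100.00 samples". A minimal sketch of that kind of accumulation is below; the helper name, the (losses, batch_size) return shape, and the dict layout are illustrative assumptions, not the exact train.py internals.

import torch

def compute_validation_loss(model, valid_loader):
    # Accumulate every loss term weighted by batch size, then average, which
    # reproduces summaries like "generator_mel_loss=21.79 ... over 100.00 samples".
    # The (losses, batch_size) return shape is an assumption for illustration.
    model.eval()
    totals = {}
    num_samples = 0
    with torch.no_grad():
        for batch in valid_loader:
            losses, batch_size = model(batch)  # assumed: dict of scalar tensors
            for name, value in losses.items():
                totals[name] = totals.get(name, 0.0) + value.item() * batch_size
            num_samples += batch_size
    model.train()
    return {name: total / num_samples for name, total in totals.items()}

The per-batch "tot_loss[...]" summaries in the training lines report the same kind of sample-weighted running average, accumulated since the start of the logging window rather than over a fixed validation set.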
2023-11-13 21:40:38,428 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-13 21:43:11,563 INFO [train.py:811] (0/4) Start epoch 418 2023-11-13 21:45:28,070 INFO [train.py:467] (0/4) Epoch 418, batch 21, global_batch_idx: 15450, batch size: 81, loss[discriminator_loss=2.543, discriminator_real_loss=1.214, discriminator_fake_loss=1.33, generator_loss=31.32, generator_mel_loss=21.37, generator_kl_loss=1.95, generator_dur_loss=1.66, generator_adv_loss=2.383, generator_feat_match_loss=3.955, over 81.00 samples.], tot_loss[discriminator_loss=2.64, discriminator_real_loss=1.336, discriminator_fake_loss=1.304, generator_loss=30.57, generator_mel_loss=21.15, generator_kl_loss=1.944, generator_dur_loss=1.667, generator_adv_loss=2.182, generator_feat_match_loss=3.627, over 1662.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 21:46:43,849 INFO [train.py:811] (0/4) Start epoch 419 2023-11-13 21:50:02,940 INFO [train.py:467] (0/4) Epoch 419, batch 34, global_batch_idx: 15500, batch size: 153, loss[discriminator_loss=2.666, discriminator_real_loss=1.365, discriminator_fake_loss=1.301, generator_loss=30.01, generator_mel_loss=21.04, generator_kl_loss=1.971, generator_dur_loss=1.637, generator_adv_loss=1.971, generator_feat_match_loss=3.387, over 153.00 samples.], tot_loss[discriminator_loss=2.672, discriminator_real_loss=1.356, discriminator_fake_loss=1.315, generator_loss=30.27, generator_mel_loss=21.08, generator_kl_loss=1.97, generator_dur_loss=1.673, generator_adv_loss=2.106, generator_feat_match_loss=3.441, over 2469.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 21:50:16,788 INFO [train.py:811] (0/4) Start epoch 420 2023-11-13 21:53:53,503 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-420.pt 2023-11-13 21:53:57,129 INFO [train.py:811] (0/4) Start epoch 421 2023-11-13 21:55:14,911 INFO [train.py:467] (0/4) Epoch 421, batch 10, global_batch_idx: 15550, batch size: 101, loss[discriminator_loss=2.57, discriminator_real_loss=1.216, discriminator_fake_loss=1.354, generator_loss=30.87, generator_mel_loss=21.06, generator_kl_loss=1.965, generator_dur_loss=1.657, generator_adv_loss=2.408, generator_feat_match_loss=3.771, over 101.00 samples.], tot_loss[discriminator_loss=2.586, discriminator_real_loss=1.312, discriminator_fake_loss=1.274, generator_loss=30.66, generator_mel_loss=21.05, generator_kl_loss=1.968, generator_dur_loss=1.667, generator_adv_loss=2.201, generator_feat_match_loss=3.772, over 934.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 21:57:33,342 INFO [train.py:811] (0/4) Start epoch 422 2023-11-13 21:59:51,611 INFO [train.py:467] (0/4) Epoch 422, batch 23, global_batch_idx: 15600, batch size: 110, loss[discriminator_loss=2.596, discriminator_real_loss=1.145, discriminator_fake_loss=1.451, generator_loss=30.9, generator_mel_loss=21.06, generator_kl_loss=2.026, generator_dur_loss=1.669, generator_adv_loss=2.373, generator_feat_match_loss=3.771, over 110.00 samples.], tot_loss[discriminator_loss=2.56, discriminator_real_loss=1.286, discriminator_fake_loss=1.273, generator_loss=30.52, generator_mel_loss=20.7, generator_kl_loss=1.953, generator_dur_loss=1.67, generator_adv_loss=2.256, generator_feat_match_loss=3.941, over 1819.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 32.0 2023-11-13 21:59:52,146 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 22:00:02,681 INFO [train.py:517] (0/4) 
Epoch 422, validation: discriminator_loss=2.598, discriminator_real_loss=1.279, discriminator_fake_loss=1.319, generator_loss=30.89, generator_mel_loss=21.57, generator_kl_loss=2.034, generator_dur_loss=1.642, generator_adv_loss=1.964, generator_feat_match_loss=3.671, over 100.00 samples. 2023-11-13 22:00:02,682 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-13 22:01:12,423 INFO [train.py:811] (0/4) Start epoch 423 2023-11-13 22:04:45,460 INFO [train.py:467] (0/4) Epoch 423, batch 36, global_batch_idx: 15650, batch size: 56, loss[discriminator_loss=2.531, discriminator_real_loss=1.336, discriminator_fake_loss=1.196, generator_loss=31.14, generator_mel_loss=21.25, generator_kl_loss=1.974, generator_dur_loss=1.655, generator_adv_loss=2.295, generator_feat_match_loss=3.969, over 56.00 samples.], tot_loss[discriminator_loss=2.613, discriminator_real_loss=1.327, discriminator_fake_loss=1.286, generator_loss=30.36, generator_mel_loss=20.93, generator_kl_loss=1.975, generator_dur_loss=1.668, generator_adv_loss=2.141, generator_feat_match_loss=3.648, over 2862.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 22:04:46,710 INFO [train.py:811] (0/4) Start epoch 424 2023-11-13 22:08:05,988 INFO [train.py:811] (0/4) Start epoch 425 2023-11-13 22:09:22,653 INFO [train.py:467] (0/4) Epoch 425, batch 12, global_batch_idx: 15700, batch size: 59, loss[discriminator_loss=2.654, discriminator_real_loss=1.251, discriminator_fake_loss=1.403, generator_loss=30.93, generator_mel_loss=21.4, generator_kl_loss=2.034, generator_dur_loss=1.665, generator_adv_loss=2.236, generator_feat_match_loss=3.588, over 59.00 samples.], tot_loss[discriminator_loss=2.648, discriminator_real_loss=1.36, discriminator_fake_loss=1.288, generator_loss=30.56, generator_mel_loss=21.07, generator_kl_loss=1.956, generator_dur_loss=1.673, generator_adv_loss=2.217, generator_feat_match_loss=3.647, over 790.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 22:11:36,648 INFO [train.py:811] (0/4) Start epoch 426 2023-11-13 22:13:59,507 INFO [train.py:467] (0/4) Epoch 426, batch 25, global_batch_idx: 15750, batch size: 65, loss[discriminator_loss=2.672, discriminator_real_loss=1.365, discriminator_fake_loss=1.308, generator_loss=30.15, generator_mel_loss=20.2, generator_kl_loss=1.968, generator_dur_loss=1.655, generator_adv_loss=2.352, generator_feat_match_loss=3.977, over 65.00 samples.], tot_loss[discriminator_loss=2.605, discriminator_real_loss=1.312, discriminator_fake_loss=1.293, generator_loss=30.9, generator_mel_loss=21.1, generator_kl_loss=1.989, generator_dur_loss=1.67, generator_adv_loss=2.298, generator_feat_match_loss=3.846, over 1984.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 22:15:08,082 INFO [train.py:811] (0/4) Start epoch 427 2023-11-13 22:18:41,188 INFO [train.py:811] (0/4) Start epoch 428 2023-11-13 22:19:03,537 INFO [train.py:467] (0/4) Epoch 428, batch 1, global_batch_idx: 15800, batch size: 52, loss[discriminator_loss=2.635, discriminator_real_loss=1.226, discriminator_fake_loss=1.409, generator_loss=29.27, generator_mel_loss=20.29, generator_kl_loss=2.099, generator_dur_loss=1.679, generator_adv_loss=2.012, generator_feat_match_loss=3.188, over 52.00 samples.], tot_loss[discriminator_loss=2.64, discriminator_real_loss=1.211, discriminator_fake_loss=1.428, generator_loss=29.66, generator_mel_loss=20.75, generator_kl_loss=2.005, generator_dur_loss=1.68, generator_adv_loss=1.882, 
generator_feat_match_loss=3.343, over 142.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 22:19:04,055 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 22:19:15,685 INFO [train.py:517] (0/4) Epoch 428, validation: discriminator_loss=2.552, discriminator_real_loss=1.217, discriminator_fake_loss=1.334, generator_loss=30.8, generator_mel_loss=21.6, generator_kl_loss=2.089, generator_dur_loss=1.651, generator_adv_loss=1.891, generator_feat_match_loss=3.575, over 100.00 samples. 2023-11-13 22:19:15,686 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-13 22:22:25,577 INFO [train.py:811] (0/4) Start epoch 429 2023-11-13 22:23:54,903 INFO [train.py:467] (0/4) Epoch 429, batch 14, global_batch_idx: 15850, batch size: 71, loss[discriminator_loss=2.51, discriminator_real_loss=1.308, discriminator_fake_loss=1.202, generator_loss=30.43, generator_mel_loss=20.48, generator_kl_loss=1.904, generator_dur_loss=1.671, generator_adv_loss=2.15, generator_feat_match_loss=4.227, over 71.00 samples.], tot_loss[discriminator_loss=2.652, discriminator_real_loss=1.369, discriminator_fake_loss=1.283, generator_loss=30.74, generator_mel_loss=21, generator_kl_loss=1.984, generator_dur_loss=1.667, generator_adv_loss=2.249, generator_feat_match_loss=3.838, over 1137.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 22:25:58,825 INFO [train.py:811] (0/4) Start epoch 430 2023-11-13 22:28:46,611 INFO [train.py:467] (0/4) Epoch 430, batch 27, global_batch_idx: 15900, batch size: 58, loss[discriminator_loss=2.695, discriminator_real_loss=1.509, discriminator_fake_loss=1.186, generator_loss=29.85, generator_mel_loss=21.02, generator_kl_loss=1.951, generator_dur_loss=1.662, generator_adv_loss=2.012, generator_feat_match_loss=3.203, over 58.00 samples.], tot_loss[discriminator_loss=2.633, discriminator_real_loss=1.328, discriminator_fake_loss=1.305, generator_loss=30.43, generator_mel_loss=21, generator_kl_loss=1.982, generator_dur_loss=1.668, generator_adv_loss=2.139, generator_feat_match_loss=3.642, over 1875.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 22:29:34,208 INFO [train.py:811] (0/4) Start epoch 431 2023-11-13 22:33:09,213 INFO [train.py:811] (0/4) Start epoch 432 2023-11-13 22:33:45,695 INFO [train.py:467] (0/4) Epoch 432, batch 3, global_batch_idx: 15950, batch size: 85, loss[discriminator_loss=2.68, discriminator_real_loss=1.452, discriminator_fake_loss=1.227, generator_loss=30.09, generator_mel_loss=20.65, generator_kl_loss=1.961, generator_dur_loss=1.677, generator_adv_loss=2.184, generator_feat_match_loss=3.617, over 85.00 samples.], tot_loss[discriminator_loss=2.722, discriminator_real_loss=1.407, discriminator_fake_loss=1.315, generator_loss=30.1, generator_mel_loss=20.96, generator_kl_loss=1.964, generator_dur_loss=1.68, generator_adv_loss=2.062, generator_feat_match_loss=3.432, over 272.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0 2023-11-13 22:36:47,643 INFO [train.py:811] (0/4) Start epoch 433 2023-11-13 22:38:23,780 INFO [train.py:467] (0/4) Epoch 433, batch 16, global_batch_idx: 16000, batch size: 63, loss[discriminator_loss=2.666, discriminator_real_loss=1.359, discriminator_fake_loss=1.307, generator_loss=30.35, generator_mel_loss=20.81, generator_kl_loss=1.886, generator_dur_loss=1.674, generator_adv_loss=2.344, generator_feat_match_loss=3.631, over 63.00 samples.], tot_loss[discriminator_loss=2.644, discriminator_real_loss=1.338, 
discriminator_fake_loss=1.306, generator_loss=30.49, generator_mel_loss=21.03, generator_kl_loss=1.964, generator_dur_loss=1.672, generator_adv_loss=2.189, generator_feat_match_loss=3.631, over 1222.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 32.0 2023-11-13 22:38:24,360 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 22:38:34,696 INFO [train.py:517] (0/4) Epoch 433, validation: discriminator_loss=2.623, discriminator_real_loss=1.341, discriminator_fake_loss=1.282, generator_loss=31.39, generator_mel_loss=21.95, generator_kl_loss=2.056, generator_dur_loss=1.648, generator_adv_loss=2.076, generator_feat_match_loss=3.654, over 100.00 samples. 2023-11-13 22:38:34,697 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-13 22:40:30,192 INFO [train.py:811] (0/4) Start epoch 434 2023-11-13 22:43:22,585 INFO [train.py:467] (0/4) Epoch 434, batch 29, global_batch_idx: 16050, batch size: 153, loss[discriminator_loss=2.605, discriminator_real_loss=1.399, discriminator_fake_loss=1.205, generator_loss=30.84, generator_mel_loss=21.29, generator_kl_loss=2.036, generator_dur_loss=1.671, generator_adv_loss=2.137, generator_feat_match_loss=3.709, over 153.00 samples.], tot_loss[discriminator_loss=2.666, discriminator_real_loss=1.359, discriminator_fake_loss=1.307, generator_loss=30.26, generator_mel_loss=21, generator_kl_loss=1.984, generator_dur_loss=1.667, generator_adv_loss=2.102, generator_feat_match_loss=3.507, over 2170.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-13 22:44:01,622 INFO [train.py:811] (0/4) Start epoch 435 2023-11-13 22:47:32,067 INFO [train.py:811] (0/4) Start epoch 436 2023-11-13 22:48:13,604 INFO [train.py:467] (0/4) Epoch 436, batch 5, global_batch_idx: 16100, batch size: 76, loss[discriminator_loss=2.746, discriminator_real_loss=1.602, discriminator_fake_loss=1.145, generator_loss=30.26, generator_mel_loss=20.79, generator_kl_loss=1.935, generator_dur_loss=1.651, generator_adv_loss=2.244, generator_feat_match_loss=3.643, over 76.00 samples.], tot_loss[discriminator_loss=2.627, discriminator_real_loss=1.351, discriminator_fake_loss=1.275, generator_loss=30.34, generator_mel_loss=20.64, generator_kl_loss=1.954, generator_dur_loss=1.673, generator_adv_loss=2.268, generator_feat_match_loss=3.803, over 368.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-13 22:51:05,724 INFO [train.py:811] (0/4) Start epoch 437 2023-11-13 22:52:55,838 INFO [train.py:467] (0/4) Epoch 437, batch 18, global_batch_idx: 16150, batch size: 49, loss[discriminator_loss=2.77, discriminator_real_loss=1.484, discriminator_fake_loss=1.285, generator_loss=29.74, generator_mel_loss=20.42, generator_kl_loss=2.025, generator_dur_loss=1.652, generator_adv_loss=2.18, generator_feat_match_loss=3.465, over 49.00 samples.], tot_loss[discriminator_loss=2.578, discriminator_real_loss=1.317, discriminator_fake_loss=1.261, generator_loss=30.7, generator_mel_loss=20.72, generator_kl_loss=1.968, generator_dur_loss=1.668, generator_adv_loss=2.303, generator_feat_match_loss=4.035, over 1309.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-13 22:54:32,596 INFO [train.py:811] (0/4) Start epoch 438 2023-11-13 22:57:42,669 INFO [train.py:467] (0/4) Epoch 438, batch 31, global_batch_idx: 16200, batch size: 52, loss[discriminator_loss=2.594, discriminator_real_loss=1.278, discriminator_fake_loss=1.316, generator_loss=29.55, generator_mel_loss=20.45, generator_kl_loss=2.029, 
generator_dur_loss=1.665, generator_adv_loss=2.037, generator_feat_match_loss=3.373, over 52.00 samples.], tot_loss[discriminator_loss=2.601, discriminator_real_loss=1.312, discriminator_fake_loss=1.289, generator_loss=30.27, generator_mel_loss=20.88, generator_kl_loss=2, generator_dur_loss=1.668, generator_adv_loss=2.115, generator_feat_match_loss=3.608, over 2628.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-13 22:57:43,269 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 22:57:53,380 INFO [train.py:517] (0/4) Epoch 438, validation: discriminator_loss=2.524, discriminator_real_loss=1.23, discriminator_fake_loss=1.294, generator_loss=31.6, generator_mel_loss=21.79, generator_kl_loss=2.175, generator_dur_loss=1.647, generator_adv_loss=2.047, generator_feat_match_loss=3.945, over 100.00 samples. 2023-11-13 22:57:53,381 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-13 22:58:17,401 INFO [train.py:811] (0/4) Start epoch 439 2023-11-13 23:01:50,554 INFO [train.py:811] (0/4) Start epoch 440 2023-11-13 23:02:41,574 INFO [train.py:467] (0/4) Epoch 440, batch 7, global_batch_idx: 16250, batch size: 55, loss[discriminator_loss=2.605, discriminator_real_loss=1.36, discriminator_fake_loss=1.244, generator_loss=29.93, generator_mel_loss=20.69, generator_kl_loss=2.038, generator_dur_loss=1.69, generator_adv_loss=2.074, generator_feat_match_loss=3.441, over 55.00 samples.], tot_loss[discriminator_loss=2.564, discriminator_real_loss=1.264, discriminator_fake_loss=1.3, generator_loss=29.78, generator_mel_loss=20.56, generator_kl_loss=1.953, generator_dur_loss=1.667, generator_adv_loss=2.101, generator_feat_match_loss=3.501, over 555.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-13 23:05:22,476 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-440.pt 2023-11-13 23:05:25,844 INFO [train.py:811] (0/4) Start epoch 441 2023-11-13 23:07:31,653 INFO [train.py:467] (0/4) Epoch 441, batch 20, global_batch_idx: 16300, batch size: 85, loss[discriminator_loss=2.758, discriminator_real_loss=1.222, discriminator_fake_loss=1.537, generator_loss=29.72, generator_mel_loss=20.82, generator_kl_loss=1.856, generator_dur_loss=1.684, generator_adv_loss=2.01, generator_feat_match_loss=3.35, over 85.00 samples.], tot_loss[discriminator_loss=2.615, discriminator_real_loss=1.315, discriminator_fake_loss=1.299, generator_loss=30.5, generator_mel_loss=20.87, generator_kl_loss=1.979, generator_dur_loss=1.668, generator_adv_loss=2.229, generator_feat_match_loss=3.753, over 1589.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-13 23:08:52,490 INFO [train.py:811] (0/4) Start epoch 442 2023-11-13 23:12:10,653 INFO [train.py:467] (0/4) Epoch 442, batch 33, global_batch_idx: 16350, batch size: 61, loss[discriminator_loss=2.457, discriminator_real_loss=1.301, discriminator_fake_loss=1.156, generator_loss=30.68, generator_mel_loss=20.64, generator_kl_loss=1.889, generator_dur_loss=1.653, generator_adv_loss=2.293, generator_feat_match_loss=4.203, over 61.00 samples.], tot_loss[discriminator_loss=2.532, discriminator_real_loss=1.278, discriminator_fake_loss=1.254, generator_loss=30.73, generator_mel_loss=20.85, generator_kl_loss=1.944, generator_dur_loss=1.67, generator_adv_loss=2.286, generator_feat_match_loss=3.978, over 2434.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-13 23:12:32,529 INFO [train.py:811] (0/4) Start epoch 443 
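Note: the "Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-N.pt" lines land every 20 epochs in this excerpt (epoch-400, epoch-420, epoch-440, and later epoch-460 and epoch-480). A minimal sketch of that cadence, assuming a save_every_n setting and a plain torch.save of the usual state dicts; the field names are assumptions, and utils.py:245 may store more (schedulers, the AMP scaler, sampler state).

import torch
from pathlib import Path

def maybe_save_checkpoint(epoch, exp_dir, generator, discriminator,
                          optim_g, optim_d, save_every_n=20):
    # Save a full training snapshot every `save_every_n` epochs, matching the
    # epoch-400.pt / epoch-420.pt / epoch-440.pt files in the log above.
    if epoch % save_every_n != 0:
        return None
    checkpoint = {
        "epoch": epoch,
        "generator": generator.state_dict(),
        "discriminator": discriminator.state_dict(),
        "optim_g": optim_g.state_dict(),
        "optim_d": optim_d.state_dict(),
    }
    filename = Path(exp_dir) / f"epoch-{epoch}.pt"
    torch.save(checkpoint, filename)
    return filename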
2023-11-13 23:16:04,884 INFO [train.py:811] (0/4) Start epoch 444 2023-11-13 23:17:12,829 INFO [train.py:467] (0/4) Epoch 444, batch 9, global_batch_idx: 16400, batch size: 65, loss[discriminator_loss=2.676, discriminator_real_loss=1.337, discriminator_fake_loss=1.34, generator_loss=30.26, generator_mel_loss=20.72, generator_kl_loss=2.015, generator_dur_loss=1.652, generator_adv_loss=2.225, generator_feat_match_loss=3.641, over 65.00 samples.], tot_loss[discriminator_loss=2.549, discriminator_real_loss=1.288, discriminator_fake_loss=1.261, generator_loss=30.33, generator_mel_loss=20.6, generator_kl_loss=1.96, generator_dur_loss=1.665, generator_adv_loss=2.265, generator_feat_match_loss=3.835, over 733.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 32.0 2023-11-13 23:17:13,799 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 23:17:23,918 INFO [train.py:517] (0/4) Epoch 444, validation: discriminator_loss=2.579, discriminator_real_loss=1.265, discriminator_fake_loss=1.314, generator_loss=30.75, generator_mel_loss=21.29, generator_kl_loss=2.088, generator_dur_loss=1.643, generator_adv_loss=2.073, generator_feat_match_loss=3.658, over 100.00 samples. 2023-11-13 23:17:23,919 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-13 23:19:49,837 INFO [train.py:811] (0/4) Start epoch 445 2023-11-13 23:22:03,111 INFO [train.py:467] (0/4) Epoch 445, batch 22, global_batch_idx: 16450, batch size: 49, loss[discriminator_loss=2.594, discriminator_real_loss=1.386, discriminator_fake_loss=1.208, generator_loss=30.91, generator_mel_loss=20.78, generator_kl_loss=2.042, generator_dur_loss=1.661, generator_adv_loss=2.26, generator_feat_match_loss=4.164, over 49.00 samples.], tot_loss[discriminator_loss=2.604, discriminator_real_loss=1.328, discriminator_fake_loss=1.275, generator_loss=30.25, generator_mel_loss=20.76, generator_kl_loss=1.984, generator_dur_loss=1.668, generator_adv_loss=2.18, generator_feat_match_loss=3.665, over 1655.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-13 23:23:20,531 INFO [train.py:811] (0/4) Start epoch 446 2023-11-13 23:26:47,937 INFO [train.py:467] (0/4) Epoch 446, batch 35, global_batch_idx: 16500, batch size: 90, loss[discriminator_loss=2.609, discriminator_real_loss=1.308, discriminator_fake_loss=1.301, generator_loss=29.84, generator_mel_loss=20.48, generator_kl_loss=1.965, generator_dur_loss=1.662, generator_adv_loss=2.201, generator_feat_match_loss=3.527, over 90.00 samples.], tot_loss[discriminator_loss=2.615, discriminator_real_loss=1.331, discriminator_fake_loss=1.285, generator_loss=30.25, generator_mel_loss=20.7, generator_kl_loss=1.998, generator_dur_loss=1.664, generator_adv_loss=2.165, generator_feat_match_loss=3.724, over 2703.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-13 23:26:53,747 INFO [train.py:811] (0/4) Start epoch 447 2023-11-13 23:30:19,556 INFO [train.py:811] (0/4) Start epoch 448 2023-11-13 23:31:37,757 INFO [train.py:467] (0/4) Epoch 448, batch 11, global_batch_idx: 16550, batch size: 50, loss[discriminator_loss=2.654, discriminator_real_loss=1.287, discriminator_fake_loss=1.367, generator_loss=30.25, generator_mel_loss=20.87, generator_kl_loss=1.96, generator_dur_loss=1.683, generator_adv_loss=2.17, generator_feat_match_loss=3.566, over 50.00 samples.], tot_loss[discriminator_loss=2.656, discriminator_real_loss=1.35, discriminator_fake_loss=1.306, generator_loss=30.03, generator_mel_loss=20.89, generator_kl_loss=1.967, 
generator_dur_loss=1.674, generator_adv_loss=2.072, generator_feat_match_loss=3.434, over 676.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-13 23:33:54,366 INFO [train.py:811] (0/4) Start epoch 449 2023-11-13 23:36:23,416 INFO [train.py:467] (0/4) Epoch 449, batch 24, global_batch_idx: 16600, batch size: 58, loss[discriminator_loss=2.688, discriminator_real_loss=1.309, discriminator_fake_loss=1.38, generator_loss=29.94, generator_mel_loss=20.67, generator_kl_loss=2, generator_dur_loss=1.673, generator_adv_loss=2.182, generator_feat_match_loss=3.414, over 58.00 samples.], tot_loss[discriminator_loss=2.619, discriminator_real_loss=1.316, discriminator_fake_loss=1.303, generator_loss=30.22, generator_mel_loss=20.85, generator_kl_loss=1.94, generator_dur_loss=1.663, generator_adv_loss=2.109, generator_feat_match_loss=3.664, over 2030.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-13 23:36:23,902 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 23:36:34,682 INFO [train.py:517] (0/4) Epoch 449, validation: discriminator_loss=2.759, discriminator_real_loss=1.395, discriminator_fake_loss=1.364, generator_loss=31.01, generator_mel_loss=21.7, generator_kl_loss=2.131, generator_dur_loss=1.646, generator_adv_loss=1.918, generator_feat_match_loss=3.617, over 100.00 samples. 2023-11-13 23:36:34,683 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-13 23:37:40,529 INFO [train.py:811] (0/4) Start epoch 450 2023-11-13 23:41:16,016 INFO [train.py:811] (0/4) Start epoch 451 2023-11-13 23:41:32,093 INFO [train.py:467] (0/4) Epoch 451, batch 0, global_batch_idx: 16650, batch size: 101, loss[discriminator_loss=2.684, discriminator_real_loss=1.255, discriminator_fake_loss=1.43, generator_loss=30.15, generator_mel_loss=21.03, generator_kl_loss=2.062, generator_dur_loss=1.632, generator_adv_loss=1.96, generator_feat_match_loss=3.469, over 101.00 samples.], tot_loss[discriminator_loss=2.684, discriminator_real_loss=1.255, discriminator_fake_loss=1.43, generator_loss=30.15, generator_mel_loss=21.03, generator_kl_loss=2.062, generator_dur_loss=1.632, generator_adv_loss=1.96, generator_feat_match_loss=3.469, over 101.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-13 23:44:48,645 INFO [train.py:811] (0/4) Start epoch 452 2023-11-13 23:46:12,804 INFO [train.py:467] (0/4) Epoch 452, batch 13, global_batch_idx: 16700, batch size: 69, loss[discriminator_loss=2.676, discriminator_real_loss=1.49, discriminator_fake_loss=1.185, generator_loss=30.31, generator_mel_loss=20.77, generator_kl_loss=1.943, generator_dur_loss=1.657, generator_adv_loss=2.088, generator_feat_match_loss=3.855, over 69.00 samples.], tot_loss[discriminator_loss=2.599, discriminator_real_loss=1.318, discriminator_fake_loss=1.281, generator_loss=30.48, generator_mel_loss=20.86, generator_kl_loss=1.958, generator_dur_loss=1.664, generator_adv_loss=2.237, generator_feat_match_loss=3.756, over 968.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-13 23:48:14,881 INFO [train.py:811] (0/4) Start epoch 453 2023-11-13 23:50:45,783 INFO [train.py:467] (0/4) Epoch 453, batch 26, global_batch_idx: 16750, batch size: 85, loss[discriminator_loss=2.613, discriminator_real_loss=1.396, discriminator_fake_loss=1.218, generator_loss=29.83, generator_mel_loss=20.75, generator_kl_loss=1.997, generator_dur_loss=1.67, generator_adv_loss=1.803, generator_feat_match_loss=3.609, over 85.00 samples.], 
tot_loss[discriminator_loss=2.607, discriminator_real_loss=1.317, discriminator_fake_loss=1.289, generator_loss=30.09, generator_mel_loss=20.69, generator_kl_loss=1.965, generator_dur_loss=1.673, generator_adv_loss=2.115, generator_feat_match_loss=3.643, over 1756.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-13 23:51:41,794 INFO [train.py:811] (0/4) Start epoch 454 2023-11-13 23:55:12,436 INFO [train.py:811] (0/4) Start epoch 455 2023-11-13 23:55:36,271 INFO [train.py:467] (0/4) Epoch 455, batch 2, global_batch_idx: 16800, batch size: 79, loss[discriminator_loss=2.668, discriminator_real_loss=1.361, discriminator_fake_loss=1.307, generator_loss=30.1, generator_mel_loss=20.82, generator_kl_loss=1.984, generator_dur_loss=1.667, generator_adv_loss=2.086, generator_feat_match_loss=3.545, over 79.00 samples.], tot_loss[discriminator_loss=2.655, discriminator_real_loss=1.33, discriminator_fake_loss=1.325, generator_loss=30.25, generator_mel_loss=20.95, generator_kl_loss=1.963, generator_dur_loss=1.673, generator_adv_loss=2.109, generator_feat_match_loss=3.549, over 213.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 32.0 2023-11-13 23:55:36,871 INFO [train.py:508] (0/4) Computing validation loss 2023-11-13 23:55:47,919 INFO [train.py:517] (0/4) Epoch 455, validation: discriminator_loss=2.686, discriminator_real_loss=1.338, discriminator_fake_loss=1.347, generator_loss=30.89, generator_mel_loss=21.51, generator_kl_loss=2.185, generator_dur_loss=1.65, generator_adv_loss=1.963, generator_feat_match_loss=3.585, over 100.00 samples. 2023-11-13 23:55:47,920 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-13 23:58:54,481 INFO [train.py:811] (0/4) Start epoch 456 2023-11-14 00:00:36,737 INFO [train.py:467] (0/4) Epoch 456, batch 15, global_batch_idx: 16850, batch size: 69, loss[discriminator_loss=2.516, discriminator_real_loss=1.292, discriminator_fake_loss=1.223, generator_loss=29.86, generator_mel_loss=20.36, generator_kl_loss=1.938, generator_dur_loss=1.671, generator_adv_loss=2.193, generator_feat_match_loss=3.697, over 69.00 samples.], tot_loss[discriminator_loss=2.659, discriminator_real_loss=1.355, discriminator_fake_loss=1.304, generator_loss=30.02, generator_mel_loss=20.63, generator_kl_loss=1.954, generator_dur_loss=1.666, generator_adv_loss=2.134, generator_feat_match_loss=3.634, over 1179.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-14 00:02:31,776 INFO [train.py:811] (0/4) Start epoch 457 2023-11-14 00:05:07,824 INFO [train.py:467] (0/4) Epoch 457, batch 28, global_batch_idx: 16900, batch size: 53, loss[discriminator_loss=2.641, discriminator_real_loss=1.344, discriminator_fake_loss=1.297, generator_loss=30.47, generator_mel_loss=20.89, generator_kl_loss=1.998, generator_dur_loss=1.672, generator_adv_loss=2.158, generator_feat_match_loss=3.75, over 53.00 samples.], tot_loss[discriminator_loss=2.644, discriminator_real_loss=1.346, discriminator_fake_loss=1.298, generator_loss=30.35, generator_mel_loss=20.87, generator_kl_loss=1.991, generator_dur_loss=1.665, generator_adv_loss=2.146, generator_feat_match_loss=3.677, over 2093.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-14 00:05:59,056 INFO [train.py:811] (0/4) Start epoch 458 2023-11-14 00:09:25,614 INFO [train.py:811] (0/4) Start epoch 459 2023-11-14 00:10:04,099 INFO [train.py:467] (0/4) Epoch 459, batch 4, global_batch_idx: 16950, batch size: 55, loss[discriminator_loss=2.637, 
discriminator_real_loss=1.43, discriminator_fake_loss=1.208, generator_loss=30.23, generator_mel_loss=20.83, generator_kl_loss=1.948, generator_dur_loss=1.696, generator_adv_loss=2.25, generator_feat_match_loss=3.506, over 55.00 samples.], tot_loss[discriminator_loss=2.623, discriminator_real_loss=1.344, discriminator_fake_loss=1.279, generator_loss=30.59, generator_mel_loss=21.19, generator_kl_loss=1.955, generator_dur_loss=1.681, generator_adv_loss=2.155, generator_feat_match_loss=3.607, over 388.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-14 00:12:58,592 INFO [train.py:811] (0/4) Start epoch 460 2023-11-14 00:14:48,869 INFO [train.py:467] (0/4) Epoch 460, batch 17, global_batch_idx: 17000, batch size: 63, loss[discriminator_loss=2.975, discriminator_real_loss=1.411, discriminator_fake_loss=1.563, generator_loss=29.47, generator_mel_loss=20.78, generator_kl_loss=1.948, generator_dur_loss=1.664, generator_adv_loss=1.965, generator_feat_match_loss=3.115, over 63.00 samples.], tot_loss[discriminator_loss=2.657, discriminator_real_loss=1.345, discriminator_fake_loss=1.312, generator_loss=30.87, generator_mel_loss=20.92, generator_kl_loss=1.999, generator_dur_loss=1.666, generator_adv_loss=2.32, generator_feat_match_loss=3.962, over 1439.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-14 00:14:49,367 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 00:15:00,459 INFO [train.py:517] (0/4) Epoch 460, validation: discriminator_loss=2.665, discriminator_real_loss=1.441, discriminator_fake_loss=1.223, generator_loss=30.68, generator_mel_loss=21.55, generator_kl_loss=2.03, generator_dur_loss=1.65, generator_adv_loss=2.015, generator_feat_match_loss=3.434, over 100.00 samples. 2023-11-14 00:15:00,460 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 00:16:50,838 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-460.pt 2023-11-14 00:16:54,292 INFO [train.py:811] (0/4) Start epoch 461 2023-11-14 00:19:53,241 INFO [train.py:467] (0/4) Epoch 461, batch 30, global_batch_idx: 17050, batch size: 69, loss[discriminator_loss=2.656, discriminator_real_loss=1.364, discriminator_fake_loss=1.291, generator_loss=29.98, generator_mel_loss=20.7, generator_kl_loss=1.969, generator_dur_loss=1.669, generator_adv_loss=2.043, generator_feat_match_loss=3.596, over 69.00 samples.], tot_loss[discriminator_loss=2.626, discriminator_real_loss=1.332, discriminator_fake_loss=1.294, generator_loss=30.17, generator_mel_loss=20.84, generator_kl_loss=1.998, generator_dur_loss=1.666, generator_adv_loss=2.076, generator_feat_match_loss=3.585, over 2228.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-14 00:20:26,987 INFO [train.py:811] (0/4) Start epoch 462 2023-11-14 00:23:57,342 INFO [train.py:811] (0/4) Start epoch 463 2023-11-14 00:24:50,004 INFO [train.py:467] (0/4) Epoch 463, batch 6, global_batch_idx: 17100, batch size: 90, loss[discriminator_loss=2.451, discriminator_real_loss=1.147, discriminator_fake_loss=1.304, generator_loss=30.53, generator_mel_loss=20.58, generator_kl_loss=1.955, generator_dur_loss=1.645, generator_adv_loss=2.246, generator_feat_match_loss=4.109, over 90.00 samples.], tot_loss[discriminator_loss=2.491, discriminator_real_loss=1.227, discriminator_fake_loss=1.264, generator_loss=30.67, generator_mel_loss=20.78, generator_kl_loss=1.962, generator_dur_loss=1.653, generator_adv_loss=2.238, generator_feat_match_loss=4.039, over 
522.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-14 00:27:31,405 INFO [train.py:811] (0/4) Start epoch 464 2023-11-14 00:29:25,879 INFO [train.py:467] (0/4) Epoch 464, batch 19, global_batch_idx: 17150, batch size: 65, loss[discriminator_loss=2.57, discriminator_real_loss=1.225, discriminator_fake_loss=1.346, generator_loss=30.37, generator_mel_loss=20.69, generator_kl_loss=1.934, generator_dur_loss=1.678, generator_adv_loss=2.111, generator_feat_match_loss=3.961, over 65.00 samples.], tot_loss[discriminator_loss=2.578, discriminator_real_loss=1.3, discriminator_fake_loss=1.278, generator_loss=30.75, generator_mel_loss=21.03, generator_kl_loss=1.986, generator_dur_loss=1.668, generator_adv_loss=2.197, generator_feat_match_loss=3.871, over 1513.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-14 00:30:59,245 INFO [train.py:811] (0/4) Start epoch 465 2023-11-14 00:34:09,568 INFO [train.py:467] (0/4) Epoch 465, batch 32, global_batch_idx: 17200, batch size: 101, loss[discriminator_loss=2.699, discriminator_real_loss=1.277, discriminator_fake_loss=1.422, generator_loss=30.27, generator_mel_loss=20.97, generator_kl_loss=1.991, generator_dur_loss=1.649, generator_adv_loss=2.105, generator_feat_match_loss=3.555, over 101.00 samples.], tot_loss[discriminator_loss=2.619, discriminator_real_loss=1.334, discriminator_fake_loss=1.285, generator_loss=30.54, generator_mel_loss=20.84, generator_kl_loss=1.984, generator_dur_loss=1.657, generator_adv_loss=2.216, generator_feat_match_loss=3.845, over 2512.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 32.0 2023-11-14 00:34:10,150 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 00:34:20,938 INFO [train.py:517] (0/4) Epoch 465, validation: discriminator_loss=2.597, discriminator_real_loss=1.297, discriminator_fake_loss=1.3, generator_loss=31.33, generator_mel_loss=21.9, generator_kl_loss=2.121, generator_dur_loss=1.642, generator_adv_loss=2.012, generator_feat_match_loss=3.655, over 100.00 samples. 
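Note: grad_scale in these summaries climbs from 16.0 to 32.0 at some steps (e.g. global batch 16400 and 17200 above) and drops back to 16.0 shortly after, which is the usual signature of dynamic loss scaling when training with fp16. A minimal sketch of the mechanism with torch.cuda.amp; `model`, `optimizer`, and `batch` are placeholders, and the grow/backoff behaviour shown is GradScaler's default (double after a run of overflow-free steps, halve on overflow).

import torch
from torch.cuda.amp import GradScaler, autocast

scaler = GradScaler()  # owns the dynamic "grad_scale" value reported in the log

def fp16_train_step(model, optimizer, batch):
    # One AMP training step; assumed: model(batch) returns a scalar loss.
    optimizer.zero_grad()
    with autocast():                  # forward in fp16 where safe
        loss = model(batch)
    scaler.scale(loss).backward()     # backprop the scaled loss
    scaler.step(optimizer)            # unscale grads; skip the step on inf/NaN
    scaler.update()                   # grow the scale after clean steps, halve on overflow
    return scaler.get_scale()         # e.g. 16.0 or 32.0, matching grad_scale above

Separately, the slow drift of cur_lr_g / cur_lr_d across these epochs (1.90e-04, then 1.89e-04, then 1.88e-04) is consistent with a per-epoch exponential decay whose gamma is very close to 1; the exact schedule is not shown in this excerpt.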
2023-11-14 00:34:20,939 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 00:34:43,390 INFO [train.py:811] (0/4) Start epoch 466 2023-11-14 00:38:13,533 INFO [train.py:811] (0/4) Start epoch 467 2023-11-14 00:39:09,843 INFO [train.py:467] (0/4) Epoch 467, batch 8, global_batch_idx: 17250, batch size: 101, loss[discriminator_loss=2.633, discriminator_real_loss=1.351, discriminator_fake_loss=1.283, generator_loss=29.72, generator_mel_loss=20.52, generator_kl_loss=2.047, generator_dur_loss=1.658, generator_adv_loss=1.971, generator_feat_match_loss=3.523, over 101.00 samples.], tot_loss[discriminator_loss=2.604, discriminator_real_loss=1.321, discriminator_fake_loss=1.282, generator_loss=29.97, generator_mel_loss=20.64, generator_kl_loss=1.985, generator_dur_loss=1.67, generator_adv_loss=2.076, generator_feat_match_loss=3.593, over 595.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-14 00:41:42,646 INFO [train.py:811] (0/4) Start epoch 468 2023-11-14 00:43:57,948 INFO [train.py:467] (0/4) Epoch 468, batch 21, global_batch_idx: 17300, batch size: 90, loss[discriminator_loss=2.539, discriminator_real_loss=1.176, discriminator_fake_loss=1.364, generator_loss=30.21, generator_mel_loss=20.4, generator_kl_loss=2.015, generator_dur_loss=1.677, generator_adv_loss=2.182, generator_feat_match_loss=3.939, over 90.00 samples.], tot_loss[discriminator_loss=2.607, discriminator_real_loss=1.334, discriminator_fake_loss=1.273, generator_loss=30.38, generator_mel_loss=20.59, generator_kl_loss=1.956, generator_dur_loss=1.659, generator_adv_loss=2.23, generator_feat_match_loss=3.951, over 1705.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-14 00:45:19,456 INFO [train.py:811] (0/4) Start epoch 469 2023-11-14 00:48:44,769 INFO [train.py:467] (0/4) Epoch 469, batch 34, global_batch_idx: 17350, batch size: 67, loss[discriminator_loss=2.555, discriminator_real_loss=1.242, discriminator_fake_loss=1.312, generator_loss=30.21, generator_mel_loss=20.47, generator_kl_loss=1.956, generator_dur_loss=1.676, generator_adv_loss=2.203, generator_feat_match_loss=3.898, over 67.00 samples.], tot_loss[discriminator_loss=2.615, discriminator_real_loss=1.331, discriminator_fake_loss=1.284, generator_loss=30.26, generator_mel_loss=20.82, generator_kl_loss=2.007, generator_dur_loss=1.663, generator_adv_loss=2.098, generator_feat_match_loss=3.676, over 2830.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-14 00:48:53,863 INFO [train.py:811] (0/4) Start epoch 470 2023-11-14 00:52:27,242 INFO [train.py:811] (0/4) Start epoch 471 2023-11-14 00:53:43,786 INFO [train.py:467] (0/4) Epoch 471, batch 10, global_batch_idx: 17400, batch size: 110, loss[discriminator_loss=2.576, discriminator_real_loss=1.326, discriminator_fake_loss=1.25, generator_loss=31.21, generator_mel_loss=21.3, generator_kl_loss=2.004, generator_dur_loss=1.66, generator_adv_loss=2.348, generator_feat_match_loss=3.895, over 110.00 samples.], tot_loss[discriminator_loss=2.624, discriminator_real_loss=1.348, discriminator_fake_loss=1.277, generator_loss=30.73, generator_mel_loss=20.98, generator_kl_loss=2.007, generator_dur_loss=1.665, generator_adv_loss=2.221, generator_feat_match_loss=3.855, over 965.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-14 00:53:44,309 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 00:53:55,626 INFO [train.py:517] (0/4) Epoch 471, validation: discriminator_loss=2.586, 
discriminator_real_loss=1.263, discriminator_fake_loss=1.323, generator_loss=31.27, generator_mel_loss=21.78, generator_kl_loss=2.146, generator_dur_loss=1.636, generator_adv_loss=1.985, generator_feat_match_loss=3.723, over 100.00 samples. 2023-11-14 00:53:55,627 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 00:56:15,854 INFO [train.py:811] (0/4) Start epoch 472 2023-11-14 00:58:31,896 INFO [train.py:467] (0/4) Epoch 472, batch 23, global_batch_idx: 17450, batch size: 55, loss[discriminator_loss=2.631, discriminator_real_loss=1.312, discriminator_fake_loss=1.319, generator_loss=30.31, generator_mel_loss=20.81, generator_kl_loss=1.972, generator_dur_loss=1.701, generator_adv_loss=2.1, generator_feat_match_loss=3.725, over 55.00 samples.], tot_loss[discriminator_loss=2.662, discriminator_real_loss=1.356, discriminator_fake_loss=1.307, generator_loss=30.45, generator_mel_loss=20.99, generator_kl_loss=1.979, generator_dur_loss=1.665, generator_adv_loss=2.139, generator_feat_match_loss=3.675, over 1682.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-14 00:59:52,670 INFO [train.py:811] (0/4) Start epoch 473 2023-11-14 01:03:24,349 INFO [train.py:467] (0/4) Epoch 473, batch 36, global_batch_idx: 17500, batch size: 60, loss[discriminator_loss=2.703, discriminator_real_loss=1.333, discriminator_fake_loss=1.371, generator_loss=29.85, generator_mel_loss=20.67, generator_kl_loss=1.979, generator_dur_loss=1.671, generator_adv_loss=2.055, generator_feat_match_loss=3.477, over 60.00 samples.], tot_loss[discriminator_loss=2.629, discriminator_real_loss=1.346, discriminator_fake_loss=1.284, generator_loss=30.46, generator_mel_loss=20.92, generator_kl_loss=1.962, generator_dur_loss=1.66, generator_adv_loss=2.188, generator_feat_match_loss=3.732, over 2682.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0 2023-11-14 01:03:25,463 INFO [train.py:811] (0/4) Start epoch 474 2023-11-14 01:06:51,822 INFO [train.py:811] (0/4) Start epoch 475 2023-11-14 01:08:17,284 INFO [train.py:467] (0/4) Epoch 475, batch 12, global_batch_idx: 17550, batch size: 52, loss[discriminator_loss=2.73, discriminator_real_loss=1.274, discriminator_fake_loss=1.457, generator_loss=29.63, generator_mel_loss=20.82, generator_kl_loss=1.984, generator_dur_loss=1.663, generator_adv_loss=2.08, generator_feat_match_loss=3.078, over 52.00 samples.], tot_loss[discriminator_loss=2.661, discriminator_real_loss=1.371, discriminator_fake_loss=1.289, generator_loss=30.36, generator_mel_loss=20.64, generator_kl_loss=1.976, generator_dur_loss=1.656, generator_adv_loss=2.242, generator_feat_match_loss=3.845, over 1022.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 01:10:25,002 INFO [train.py:811] (0/4) Start epoch 476 2023-11-14 01:13:03,753 INFO [train.py:467] (0/4) Epoch 476, batch 25, global_batch_idx: 17600, batch size: 65, loss[discriminator_loss=2.613, discriminator_real_loss=1.357, discriminator_fake_loss=1.257, generator_loss=29.88, generator_mel_loss=20.53, generator_kl_loss=1.979, generator_dur_loss=1.657, generator_adv_loss=2.043, generator_feat_match_loss=3.666, over 65.00 samples.], tot_loss[discriminator_loss=2.603, discriminator_real_loss=1.328, discriminator_fake_loss=1.275, generator_loss=30.17, generator_mel_loss=20.78, generator_kl_loss=1.981, generator_dur_loss=1.658, generator_adv_loss=2.073, generator_feat_match_loss=3.681, over 1802.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 32.0 2023-11-14 
01:13:04,259 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 01:13:14,838 INFO [train.py:517] (0/4) Epoch 476, validation: discriminator_loss=2.622, discriminator_real_loss=1.184, discriminator_fake_loss=1.438, generator_loss=31.07, generator_mel_loss=21.51, generator_kl_loss=2.189, generator_dur_loss=1.645, generator_adv_loss=1.809, generator_feat_match_loss=3.92, over 100.00 samples. 2023-11-14 01:13:14,839 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 01:14:08,165 INFO [train.py:811] (0/4) Start epoch 477 2023-11-14 01:17:42,960 INFO [train.py:811] (0/4) Start epoch 478 2023-11-14 01:18:05,263 INFO [train.py:467] (0/4) Epoch 478, batch 1, global_batch_idx: 17650, batch size: 76, loss[discriminator_loss=2.559, discriminator_real_loss=1.317, discriminator_fake_loss=1.24, generator_loss=30.09, generator_mel_loss=20.55, generator_kl_loss=2.033, generator_dur_loss=1.655, generator_adv_loss=2.193, generator_feat_match_loss=3.658, over 76.00 samples.], tot_loss[discriminator_loss=2.588, discriminator_real_loss=1.333, discriminator_fake_loss=1.254, generator_loss=30.45, generator_mel_loss=20.95, generator_kl_loss=2.026, generator_dur_loss=1.661, generator_adv_loss=2.117, generator_feat_match_loss=3.7, over 135.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 32.0 2023-11-14 01:21:20,213 INFO [train.py:811] (0/4) Start epoch 479 2023-11-14 01:22:52,518 INFO [train.py:467] (0/4) Epoch 479, batch 14, global_batch_idx: 17700, batch size: 52, loss[discriminator_loss=2.41, discriminator_real_loss=1.207, discriminator_fake_loss=1.202, generator_loss=30.78, generator_mel_loss=20.11, generator_kl_loss=2.013, generator_dur_loss=1.65, generator_adv_loss=2.4, generator_feat_match_loss=4.602, over 52.00 samples.], tot_loss[discriminator_loss=2.593, discriminator_real_loss=1.329, discriminator_fake_loss=1.264, generator_loss=30.71, generator_mel_loss=20.74, generator_kl_loss=1.979, generator_dur_loss=1.659, generator_adv_loss=2.284, generator_feat_match_loss=4.053, over 1256.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 01:24:53,154 INFO [train.py:811] (0/4) Start epoch 480 2023-11-14 01:27:37,042 INFO [train.py:467] (0/4) Epoch 480, batch 27, global_batch_idx: 17750, batch size: 53, loss[discriminator_loss=2.555, discriminator_real_loss=1.293, discriminator_fake_loss=1.262, generator_loss=30.05, generator_mel_loss=20.58, generator_kl_loss=1.937, generator_dur_loss=1.685, generator_adv_loss=2.104, generator_feat_match_loss=3.744, over 53.00 samples.], tot_loss[discriminator_loss=2.601, discriminator_real_loss=1.312, discriminator_fake_loss=1.289, generator_loss=30.1, generator_mel_loss=20.56, generator_kl_loss=1.979, generator_dur_loss=1.661, generator_adv_loss=2.126, generator_feat_match_loss=3.771, over 1791.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 01:28:22,237 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-480.pt 2023-11-14 01:28:25,539 INFO [train.py:811] (0/4) Start epoch 481 2023-11-14 01:31:54,854 INFO [train.py:811] (0/4) Start epoch 482 2023-11-14 01:32:24,365 INFO [train.py:467] (0/4) Epoch 482, batch 3, global_batch_idx: 17800, batch size: 85, loss[discriminator_loss=2.562, discriminator_real_loss=1.289, discriminator_fake_loss=1.273, generator_loss=31.2, generator_mel_loss=21.05, generator_kl_loss=1.884, generator_dur_loss=1.647, generator_adv_loss=2.691, generator_feat_match_loss=3.926, over 85.00 samples.], 
tot_loss[discriminator_loss=2.569, discriminator_real_loss=1.319, discriminator_fake_loss=1.25, generator_loss=31, generator_mel_loss=21, generator_kl_loss=1.997, generator_dur_loss=1.668, generator_adv_loss=2.362, generator_feat_match_loss=3.972, over 266.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 01:32:24,849 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 01:32:36,718 INFO [train.py:517] (0/4) Epoch 482, validation: discriminator_loss=2.663, discriminator_real_loss=1.5, discriminator_fake_loss=1.163, generator_loss=31.29, generator_mel_loss=21.22, generator_kl_loss=2.11, generator_dur_loss=1.642, generator_adv_loss=2.364, generator_feat_match_loss=3.958, over 100.00 samples. 2023-11-14 01:32:36,719 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 01:35:40,266 INFO [train.py:811] (0/4) Start epoch 483 2023-11-14 01:37:24,204 INFO [train.py:467] (0/4) Epoch 483, batch 16, global_batch_idx: 17850, batch size: 53, loss[discriminator_loss=2.662, discriminator_real_loss=1.359, discriminator_fake_loss=1.303, generator_loss=29.84, generator_mel_loss=20.45, generator_kl_loss=2.014, generator_dur_loss=1.672, generator_adv_loss=2.168, generator_feat_match_loss=3.531, over 53.00 samples.], tot_loss[discriminator_loss=2.669, discriminator_real_loss=1.361, discriminator_fake_loss=1.308, generator_loss=30.66, generator_mel_loss=21.06, generator_kl_loss=2.016, generator_dur_loss=1.664, generator_adv_loss=2.16, generator_feat_match_loss=3.768, over 1375.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 01:39:16,047 INFO [train.py:811] (0/4) Start epoch 484 2023-11-14 01:41:58,325 INFO [train.py:467] (0/4) Epoch 484, batch 29, global_batch_idx: 17900, batch size: 58, loss[discriminator_loss=2.486, discriminator_real_loss=1.326, discriminator_fake_loss=1.16, generator_loss=30.78, generator_mel_loss=20.63, generator_kl_loss=1.937, generator_dur_loss=1.653, generator_adv_loss=2.25, generator_feat_match_loss=4.309, over 58.00 samples.], tot_loss[discriminator_loss=2.641, discriminator_real_loss=1.339, discriminator_fake_loss=1.302, generator_loss=30.5, generator_mel_loss=20.85, generator_kl_loss=2.001, generator_dur_loss=1.664, generator_adv_loss=2.203, generator_feat_match_loss=3.773, over 2109.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 01:42:43,016 INFO [train.py:811] (0/4) Start epoch 485 2023-11-14 01:46:21,828 INFO [train.py:811] (0/4) Start epoch 486 2023-11-14 01:47:01,154 INFO [train.py:467] (0/4) Epoch 486, batch 5, global_batch_idx: 17950, batch size: 54, loss[discriminator_loss=2.414, discriminator_real_loss=1.171, discriminator_fake_loss=1.242, generator_loss=31.36, generator_mel_loss=20.65, generator_kl_loss=2.142, generator_dur_loss=1.701, generator_adv_loss=2.607, generator_feat_match_loss=4.258, over 54.00 samples.], tot_loss[discriminator_loss=2.553, discriminator_real_loss=1.288, discriminator_fake_loss=1.265, generator_loss=30.5, generator_mel_loss=20.55, generator_kl_loss=2.009, generator_dur_loss=1.672, generator_adv_loss=2.269, generator_feat_match_loss=4.002, over 437.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 01:49:57,339 INFO [train.py:811] (0/4) Start epoch 487 2023-11-14 01:51:46,759 INFO [train.py:467] (0/4) Epoch 487, batch 18, global_batch_idx: 18000, batch size: 71, loss[discriminator_loss=2.543, discriminator_real_loss=1.365, discriminator_fake_loss=1.177, generator_loss=31.22, 
generator_mel_loss=21.15, generator_kl_loss=1.958, generator_dur_loss=1.656, generator_adv_loss=2.34, generator_feat_match_loss=4.117, over 71.00 samples.], tot_loss[discriminator_loss=2.551, discriminator_real_loss=1.295, discriminator_fake_loss=1.256, generator_loss=30.47, generator_mel_loss=20.74, generator_kl_loss=2.019, generator_dur_loss=1.659, generator_adv_loss=2.177, generator_feat_match_loss=3.874, over 1335.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 01:51:47,240 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 01:51:57,849 INFO [train.py:517] (0/4) Epoch 487, validation: discriminator_loss=2.564, discriminator_real_loss=1.254, discriminator_fake_loss=1.31, generator_loss=31.05, generator_mel_loss=21.43, generator_kl_loss=2.126, generator_dur_loss=1.647, generator_adv_loss=1.936, generator_feat_match_loss=3.913, over 100.00 samples. 2023-11-14 01:51:57,850 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 01:53:34,628 INFO [train.py:811] (0/4) Start epoch 488 2023-11-14 01:56:43,692 INFO [train.py:467] (0/4) Epoch 488, batch 31, global_batch_idx: 18050, batch size: 79, loss[discriminator_loss=2.605, discriminator_real_loss=1.238, discriminator_fake_loss=1.366, generator_loss=30.32, generator_mel_loss=20.79, generator_kl_loss=2.019, generator_dur_loss=1.648, generator_adv_loss=2.08, generator_feat_match_loss=3.785, over 79.00 samples.], tot_loss[discriminator_loss=2.569, discriminator_real_loss=1.301, discriminator_fake_loss=1.268, generator_loss=30.42, generator_mel_loss=20.55, generator_kl_loss=1.99, generator_dur_loss=1.662, generator_adv_loss=2.226, generator_feat_match_loss=3.987, over 2169.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 01:57:11,618 INFO [train.py:811] (0/4) Start epoch 489 2023-11-14 02:00:41,266 INFO [train.py:811] (0/4) Start epoch 490 2023-11-14 02:01:41,370 INFO [train.py:467] (0/4) Epoch 490, batch 7, global_batch_idx: 18100, batch size: 153, loss[discriminator_loss=2.713, discriminator_real_loss=1.514, discriminator_fake_loss=1.199, generator_loss=30.55, generator_mel_loss=20.95, generator_kl_loss=1.983, generator_dur_loss=1.649, generator_adv_loss=2.01, generator_feat_match_loss=3.949, over 153.00 samples.], tot_loss[discriminator_loss=2.67, discriminator_real_loss=1.382, discriminator_fake_loss=1.288, generator_loss=30.61, generator_mel_loss=20.89, generator_kl_loss=1.989, generator_dur_loss=1.665, generator_adv_loss=2.162, generator_feat_match_loss=3.904, over 585.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 02:04:13,386 INFO [train.py:811] (0/4) Start epoch 491 2023-11-14 02:06:06,607 INFO [train.py:467] (0/4) Epoch 491, batch 20, global_batch_idx: 18150, batch size: 85, loss[discriminator_loss=2.613, discriminator_real_loss=1.367, discriminator_fake_loss=1.246, generator_loss=29.81, generator_mel_loss=20.42, generator_kl_loss=2.066, generator_dur_loss=1.654, generator_adv_loss=2.18, generator_feat_match_loss=3.49, over 85.00 samples.], tot_loss[discriminator_loss=2.645, discriminator_real_loss=1.351, discriminator_fake_loss=1.294, generator_loss=30.23, generator_mel_loss=20.78, generator_kl_loss=1.98, generator_dur_loss=1.668, generator_adv_loss=2.16, generator_feat_match_loss=3.648, over 1526.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 02:07:44,489 INFO [train.py:811] (0/4) Start epoch 492 2023-11-14 02:11:00,680 INFO [train.py:467] (0/4) Epoch 492, batch 
33, global_batch_idx: 18200, batch size: 90, loss[discriminator_loss=2.605, discriminator_real_loss=1.325, discriminator_fake_loss=1.28, generator_loss=30.81, generator_mel_loss=20.9, generator_kl_loss=1.944, generator_dur_loss=1.633, generator_adv_loss=2.371, generator_feat_match_loss=3.965, over 90.00 samples.], tot_loss[discriminator_loss=2.627, discriminator_real_loss=1.325, discriminator_fake_loss=1.302, generator_loss=30.47, generator_mel_loss=20.87, generator_kl_loss=1.994, generator_dur_loss=1.665, generator_adv_loss=2.184, generator_feat_match_loss=3.765, over 2354.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 02:11:01,219 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 02:11:12,291 INFO [train.py:517] (0/4) Epoch 492, validation: discriminator_loss=2.553, discriminator_real_loss=1.265, discriminator_fake_loss=1.288, generator_loss=31.02, generator_mel_loss=21.31, generator_kl_loss=2.021, generator_dur_loss=1.642, generator_adv_loss=2.181, generator_feat_match_loss=3.864, over 100.00 samples. 2023-11-14 02:11:12,292 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 02:11:30,485 INFO [train.py:811] (0/4) Start epoch 493 2023-11-14 02:15:04,321 INFO [train.py:811] (0/4) Start epoch 494 2023-11-14 02:16:06,156 INFO [train.py:467] (0/4) Epoch 494, batch 9, global_batch_idx: 18250, batch size: 101, loss[discriminator_loss=2.527, discriminator_real_loss=1.308, discriminator_fake_loss=1.221, generator_loss=30.86, generator_mel_loss=20.98, generator_kl_loss=2.005, generator_dur_loss=1.657, generator_adv_loss=2.172, generator_feat_match_loss=4.055, over 101.00 samples.], tot_loss[discriminator_loss=2.597, discriminator_real_loss=1.327, discriminator_fake_loss=1.27, generator_loss=30.34, generator_mel_loss=20.7, generator_kl_loss=1.991, generator_dur_loss=1.668, generator_adv_loss=2.143, generator_feat_match_loss=3.843, over 683.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 02:18:35,820 INFO [train.py:811] (0/4) Start epoch 495 2023-11-14 02:20:43,500 INFO [train.py:467] (0/4) Epoch 495, batch 22, global_batch_idx: 18300, batch size: 61, loss[discriminator_loss=2.703, discriminator_real_loss=1.254, discriminator_fake_loss=1.45, generator_loss=30.49, generator_mel_loss=21.09, generator_kl_loss=2.091, generator_dur_loss=1.643, generator_adv_loss=2.068, generator_feat_match_loss=3.596, over 61.00 samples.], tot_loss[discriminator_loss=2.628, discriminator_real_loss=1.342, discriminator_fake_loss=1.286, generator_loss=30.35, generator_mel_loss=20.75, generator_kl_loss=2.016, generator_dur_loss=1.66, generator_adv_loss=2.15, generator_feat_match_loss=3.773, over 1679.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 02:22:06,595 INFO [train.py:811] (0/4) Start epoch 496 2023-11-14 02:25:33,416 INFO [train.py:467] (0/4) Epoch 496, batch 35, global_batch_idx: 18350, batch size: 95, loss[discriminator_loss=2.699, discriminator_real_loss=1.48, discriminator_fake_loss=1.219, generator_loss=29.76, generator_mel_loss=20.76, generator_kl_loss=2.008, generator_dur_loss=1.649, generator_adv_loss=1.928, generator_feat_match_loss=3.412, over 95.00 samples.], tot_loss[discriminator_loss=2.662, discriminator_real_loss=1.352, discriminator_fake_loss=1.311, generator_loss=30.19, generator_mel_loss=20.78, generator_kl_loss=1.991, generator_dur_loss=1.663, generator_adv_loss=2.098, generator_feat_match_loss=3.658, over 2496.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 
1.88e-04, grad_scale: 16.0 2023-11-14 02:25:41,675 INFO [train.py:811] (0/4) Start epoch 497 2023-11-14 02:29:15,810 INFO [train.py:811] (0/4) Start epoch 498 2023-11-14 02:30:30,799 INFO [train.py:467] (0/4) Epoch 498, batch 11, global_batch_idx: 18400, batch size: 101, loss[discriminator_loss=2.77, discriminator_real_loss=1.576, discriminator_fake_loss=1.193, generator_loss=30.68, generator_mel_loss=21.03, generator_kl_loss=2.053, generator_dur_loss=1.661, generator_adv_loss=2.43, generator_feat_match_loss=3.506, over 101.00 samples.], tot_loss[discriminator_loss=2.699, discriminator_real_loss=1.387, discriminator_fake_loss=1.312, generator_loss=30.68, generator_mel_loss=21, generator_kl_loss=2.031, generator_dur_loss=1.656, generator_adv_loss=2.191, generator_feat_match_loss=3.795, over 945.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 32.0 2023-11-14 02:30:31,345 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 02:30:41,818 INFO [train.py:517] (0/4) Epoch 498, validation: discriminator_loss=2.681, discriminator_real_loss=1.591, discriminator_fake_loss=1.09, generator_loss=31.44, generator_mel_loss=21.5, generator_kl_loss=2.11, generator_dur_loss=1.651, generator_adv_loss=2.465, generator_feat_match_loss=3.706, over 100.00 samples. 2023-11-14 02:30:41,819 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 02:33:01,563 INFO [train.py:811] (0/4) Start epoch 499 2023-11-14 02:35:21,426 INFO [train.py:467] (0/4) Epoch 499, batch 24, global_batch_idx: 18450, batch size: 76, loss[discriminator_loss=2.549, discriminator_real_loss=1.273, discriminator_fake_loss=1.275, generator_loss=30.82, generator_mel_loss=20.97, generator_kl_loss=2.011, generator_dur_loss=1.682, generator_adv_loss=2.109, generator_feat_match_loss=4.055, over 76.00 samples.], tot_loss[discriminator_loss=2.614, discriminator_real_loss=1.324, discriminator_fake_loss=1.29, generator_loss=30.38, generator_mel_loss=20.84, generator_kl_loss=2.001, generator_dur_loss=1.662, generator_adv_loss=2.096, generator_feat_match_loss=3.787, over 1701.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 02:36:31,810 INFO [train.py:811] (0/4) Start epoch 500 2023-11-14 02:40:01,857 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-500.pt 2023-11-14 02:40:05,357 INFO [train.py:811] (0/4) Start epoch 501 2023-11-14 02:40:20,071 INFO [train.py:467] (0/4) Epoch 501, batch 0, global_batch_idx: 18500, batch size: 52, loss[discriminator_loss=2.617, discriminator_real_loss=1.372, discriminator_fake_loss=1.244, generator_loss=29.61, generator_mel_loss=20.58, generator_kl_loss=1.862, generator_dur_loss=1.643, generator_adv_loss=2.025, generator_feat_match_loss=3.504, over 52.00 samples.], tot_loss[discriminator_loss=2.617, discriminator_real_loss=1.372, discriminator_fake_loss=1.244, generator_loss=29.61, generator_mel_loss=20.58, generator_kl_loss=1.862, generator_dur_loss=1.643, generator_adv_loss=2.025, generator_feat_match_loss=3.504, over 52.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 02:43:35,460 INFO [train.py:811] (0/4) Start epoch 502 2023-11-14 02:45:08,588 INFO [train.py:467] (0/4) Epoch 502, batch 13, global_batch_idx: 18550, batch size: 126, loss[discriminator_loss=2.609, discriminator_real_loss=1.363, discriminator_fake_loss=1.245, generator_loss=30.24, generator_mel_loss=20.67, generator_kl_loss=2.002, generator_dur_loss=1.649, generator_adv_loss=2.133, 
generator_feat_match_loss=3.791, over 126.00 samples.], tot_loss[discriminator_loss=2.643, discriminator_real_loss=1.337, discriminator_fake_loss=1.306, generator_loss=30.27, generator_mel_loss=20.79, generator_kl_loss=1.99, generator_dur_loss=1.662, generator_adv_loss=2.129, generator_feat_match_loss=3.691, over 1135.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 02:47:09,886 INFO [train.py:811] (0/4) Start epoch 503 2023-11-14 02:49:51,290 INFO [train.py:467] (0/4) Epoch 503, batch 26, global_batch_idx: 18600, batch size: 153, loss[discriminator_loss=2.648, discriminator_real_loss=1.42, discriminator_fake_loss=1.229, generator_loss=31.04, generator_mel_loss=21.26, generator_kl_loss=2.05, generator_dur_loss=1.619, generator_adv_loss=2.236, generator_feat_match_loss=3.871, over 153.00 samples.], tot_loss[discriminator_loss=2.599, discriminator_real_loss=1.336, discriminator_fake_loss=1.263, generator_loss=30.61, generator_mel_loss=20.82, generator_kl_loss=1.986, generator_dur_loss=1.659, generator_adv_loss=2.239, generator_feat_match_loss=3.906, over 2061.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 02:49:51,770 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 02:50:03,295 INFO [train.py:517] (0/4) Epoch 503, validation: discriminator_loss=2.597, discriminator_real_loss=1.343, discriminator_fake_loss=1.253, generator_loss=31.87, generator_mel_loss=21.7, generator_kl_loss=2.231, generator_dur_loss=1.642, generator_adv_loss=2.25, generator_feat_match_loss=4.045, over 100.00 samples. 2023-11-14 02:50:03,296 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 02:50:51,866 INFO [train.py:811] (0/4) Start epoch 504 2023-11-14 02:54:22,032 INFO [train.py:811] (0/4) Start epoch 505 2023-11-14 02:54:50,921 INFO [train.py:467] (0/4) Epoch 505, batch 2, global_batch_idx: 18650, batch size: 85, loss[discriminator_loss=2.656, discriminator_real_loss=1.355, discriminator_fake_loss=1.302, generator_loss=29.99, generator_mel_loss=20.63, generator_kl_loss=1.928, generator_dur_loss=1.664, generator_adv_loss=2.072, generator_feat_match_loss=3.703, over 85.00 samples.], tot_loss[discriminator_loss=2.625, discriminator_real_loss=1.337, discriminator_fake_loss=1.288, generator_loss=30.19, generator_mel_loss=20.78, generator_kl_loss=1.989, generator_dur_loss=1.658, generator_adv_loss=2.072, generator_feat_match_loss=3.696, over 203.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 02:57:54,704 INFO [train.py:811] (0/4) Start epoch 506 2023-11-14 02:59:26,016 INFO [train.py:467] (0/4) Epoch 506, batch 15, global_batch_idx: 18700, batch size: 69, loss[discriminator_loss=2.637, discriminator_real_loss=1.083, discriminator_fake_loss=1.553, generator_loss=30.03, generator_mel_loss=20.5, generator_kl_loss=1.868, generator_dur_loss=1.679, generator_adv_loss=2.207, generator_feat_match_loss=3.773, over 69.00 samples.], tot_loss[discriminator_loss=2.609, discriminator_real_loss=1.286, discriminator_fake_loss=1.323, generator_loss=30.31, generator_mel_loss=20.55, generator_kl_loss=1.965, generator_dur_loss=1.66, generator_adv_loss=2.208, generator_feat_match_loss=3.925, over 1127.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 03:01:26,304 INFO [train.py:811] (0/4) Start epoch 507 2023-11-14 03:04:12,582 INFO [train.py:467] (0/4) Epoch 507, batch 28, global_batch_idx: 18750, batch size: 153, loss[discriminator_loss=2.928, 
discriminator_real_loss=1.436, discriminator_fake_loss=1.492, generator_loss=29.97, generator_mel_loss=20.62, generator_kl_loss=2.043, generator_dur_loss=1.661, generator_adv_loss=1.992, generator_feat_match_loss=3.654, over 153.00 samples.], tot_loss[discriminator_loss=2.541, discriminator_real_loss=1.283, discriminator_fake_loss=1.259, generator_loss=30.68, generator_mel_loss=20.48, generator_kl_loss=1.966, generator_dur_loss=1.662, generator_adv_loss=2.338, generator_feat_match_loss=4.235, over 1998.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0 2023-11-14 03:04:57,839 INFO [train.py:811] (0/4) Start epoch 508 2023-11-14 03:08:34,573 INFO [train.py:811] (0/4) Start epoch 509 2023-11-14 03:09:08,979 INFO [train.py:467] (0/4) Epoch 509, batch 4, global_batch_idx: 18800, batch size: 49, loss[discriminator_loss=2.623, discriminator_real_loss=1.269, discriminator_fake_loss=1.354, generator_loss=30.31, generator_mel_loss=20.91, generator_kl_loss=1.997, generator_dur_loss=1.649, generator_adv_loss=2.045, generator_feat_match_loss=3.701, over 49.00 samples.], tot_loss[discriminator_loss=2.595, discriminator_real_loss=1.319, discriminator_fake_loss=1.276, generator_loss=30.33, generator_mel_loss=20.71, generator_kl_loss=1.981, generator_dur_loss=1.665, generator_adv_loss=2.113, generator_feat_match_loss=3.858, over 303.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 03:09:09,538 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 03:09:20,574 INFO [train.py:517] (0/4) Epoch 509, validation: discriminator_loss=2.57, discriminator_real_loss=1.201, discriminator_fake_loss=1.368, generator_loss=30.81, generator_mel_loss=21.53, generator_kl_loss=2.057, generator_dur_loss=1.642, generator_adv_loss=1.856, generator_feat_match_loss=3.719, over 100.00 samples. 
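
The validation entries above let one sanity-check how the loss bookkeeping is assembled: the five logged generator components appear to sum to the logged generator_loss, and discriminator_real_loss plus discriminator_fake_loss reproduces discriminator_loss, which suggests each component is recorded after its loss weight has already been applied. A minimal check in Python against the Epoch 509 validation numbers just above (the post-weighting reading is an inference from the arithmetic, not something the log states):

    # Sketch: check the loss identities against the Epoch 509 validation entry.
    components = {
        "generator_mel_loss": 21.53,
        "generator_kl_loss": 2.057,
        "generator_dur_loss": 1.642,
        "generator_adv_loss": 1.856,
        "generator_feat_match_loss": 3.719,
    }
    total = sum(components.values())            # 30.804
    assert abs(total - 30.81) < 0.01            # matches generator_loss=30.81
    assert abs((1.201 + 1.368) - 2.57) < 0.01   # real + fake = discriminator_loss
    print(f"generator components sum to {total:.3f}")

The same identities hold for the per-batch loss[...] and tot_loss[...] entries, up to the four-significant-figure rounding used in the log.
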
2023-11-14 03:09:20,575 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 03:12:16,512 INFO [train.py:811] (0/4) Start epoch 510 2023-11-14 03:14:01,905 INFO [train.py:467] (0/4) Epoch 510, batch 17, global_batch_idx: 18850, batch size: 64, loss[discriminator_loss=2.609, discriminator_real_loss=1.227, discriminator_fake_loss=1.382, generator_loss=30.75, generator_mel_loss=20.75, generator_kl_loss=1.916, generator_dur_loss=1.678, generator_adv_loss=2.455, generator_feat_match_loss=3.957, over 64.00 samples.], tot_loss[discriminator_loss=2.617, discriminator_real_loss=1.329, discriminator_fake_loss=1.287, generator_loss=30.49, generator_mel_loss=20.85, generator_kl_loss=1.981, generator_dur_loss=1.662, generator_adv_loss=2.165, generator_feat_match_loss=3.839, over 1246.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 03:15:45,576 INFO [train.py:811] (0/4) Start epoch 511 2023-11-14 03:18:44,820 INFO [train.py:467] (0/4) Epoch 511, batch 30, global_batch_idx: 18900, batch size: 56, loss[discriminator_loss=2.717, discriminator_real_loss=1.433, discriminator_fake_loss=1.284, generator_loss=30.34, generator_mel_loss=20.75, generator_kl_loss=2.072, generator_dur_loss=1.64, generator_adv_loss=2.1, generator_feat_match_loss=3.773, over 56.00 samples.], tot_loss[discriminator_loss=2.639, discriminator_real_loss=1.338, discriminator_fake_loss=1.301, generator_loss=30.37, generator_mel_loss=20.78, generator_kl_loss=1.964, generator_dur_loss=1.656, generator_adv_loss=2.178, generator_feat_match_loss=3.796, over 2387.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 03:19:21,588 INFO [train.py:811] (0/4) Start epoch 512 2023-11-14 03:22:51,622 INFO [train.py:811] (0/4) Start epoch 513 2023-11-14 03:23:44,017 INFO [train.py:467] (0/4) Epoch 513, batch 6, global_batch_idx: 18950, batch size: 63, loss[discriminator_loss=2.672, discriminator_real_loss=1.337, discriminator_fake_loss=1.334, generator_loss=29.94, generator_mel_loss=20.14, generator_kl_loss=2.011, generator_dur_loss=1.678, generator_adv_loss=2.396, generator_feat_match_loss=3.709, over 63.00 samples.], tot_loss[discriminator_loss=2.642, discriminator_real_loss=1.344, discriminator_fake_loss=1.297, generator_loss=30.56, generator_mel_loss=20.79, generator_kl_loss=1.99, generator_dur_loss=1.66, generator_adv_loss=2.195, generator_feat_match_loss=3.928, over 645.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 03:26:27,655 INFO [train.py:811] (0/4) Start epoch 514 2023-11-14 03:28:28,783 INFO [train.py:467] (0/4) Epoch 514, batch 19, global_batch_idx: 19000, batch size: 64, loss[discriminator_loss=2.582, discriminator_real_loss=1.384, discriminator_fake_loss=1.198, generator_loss=29.4, generator_mel_loss=19.71, generator_kl_loss=2.085, generator_dur_loss=1.674, generator_adv_loss=2.246, generator_feat_match_loss=3.693, over 64.00 samples.], tot_loss[discriminator_loss=2.581, discriminator_real_loss=1.297, discriminator_fake_loss=1.284, generator_loss=30.34, generator_mel_loss=20.66, generator_kl_loss=2.007, generator_dur_loss=1.66, generator_adv_loss=2.147, generator_feat_match_loss=3.869, over 1681.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0 2023-11-14 03:28:29,314 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 03:28:39,818 INFO [train.py:517] (0/4) Epoch 514, validation: discriminator_loss=2.611, discriminator_real_loss=1.291, discriminator_fake_loss=1.319, 
generator_loss=30.79, generator_mel_loss=21.2, generator_kl_loss=2.047, generator_dur_loss=1.646, generator_adv_loss=1.98, generator_feat_match_loss=3.917, over 100.00 samples. 2023-11-14 03:28:39,819 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 03:30:10,412 INFO [train.py:811] (0/4) Start epoch 515 2023-11-14 03:33:17,828 INFO [train.py:467] (0/4) Epoch 515, batch 32, global_batch_idx: 19050, batch size: 53, loss[discriminator_loss=2.492, discriminator_real_loss=1.314, discriminator_fake_loss=1.179, generator_loss=31.44, generator_mel_loss=20.24, generator_kl_loss=2.03, generator_dur_loss=1.66, generator_adv_loss=2.604, generator_feat_match_loss=4.91, over 53.00 samples.], tot_loss[discriminator_loss=2.579, discriminator_real_loss=1.301, discriminator_fake_loss=1.278, generator_loss=30.64, generator_mel_loss=20.74, generator_kl_loss=1.999, generator_dur_loss=1.656, generator_adv_loss=2.247, generator_feat_match_loss=3.998, over 2376.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0 2023-11-14 03:33:45,562 INFO [train.py:811] (0/4) Start epoch 516 2023-11-14 03:37:17,824 INFO [train.py:811] (0/4) Start epoch 517 2023-11-14 03:38:13,533 INFO [train.py:467] (0/4) Epoch 517, batch 8, global_batch_idx: 19100, batch size: 95, loss[discriminator_loss=2.555, discriminator_real_loss=1.272, discriminator_fake_loss=1.281, generator_loss=30.33, generator_mel_loss=20.71, generator_kl_loss=2.011, generator_dur_loss=1.642, generator_adv_loss=2.117, generator_feat_match_loss=3.85, over 95.00 samples.], tot_loss[discriminator_loss=2.608, discriminator_real_loss=1.322, discriminator_fake_loss=1.286, generator_loss=30.26, generator_mel_loss=20.73, generator_kl_loss=1.983, generator_dur_loss=1.659, generator_adv_loss=2.1, generator_feat_match_loss=3.787, over 658.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0 2023-11-14 03:40:52,865 INFO [train.py:811] (0/4) Start epoch 518 2023-11-14 03:42:59,120 INFO [train.py:467] (0/4) Epoch 518, batch 21, global_batch_idx: 19150, batch size: 90, loss[discriminator_loss=2.568, discriminator_real_loss=1.326, discriminator_fake_loss=1.242, generator_loss=29.79, generator_mel_loss=20.42, generator_kl_loss=1.963, generator_dur_loss=1.677, generator_adv_loss=2.086, generator_feat_match_loss=3.652, over 90.00 samples.], tot_loss[discriminator_loss=2.654, discriminator_real_loss=1.349, discriminator_fake_loss=1.305, generator_loss=30.26, generator_mel_loss=20.77, generator_kl_loss=1.979, generator_dur_loss=1.669, generator_adv_loss=2.114, generator_feat_match_loss=3.73, over 1379.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2023-11-14 03:44:30,031 INFO [train.py:811] (0/4) Start epoch 519 2023-11-14 03:47:46,550 INFO [train.py:467] (0/4) Epoch 519, batch 34, global_batch_idx: 19200, batch size: 49, loss[discriminator_loss=2.668, discriminator_real_loss=1.349, discriminator_fake_loss=1.32, generator_loss=30.25, generator_mel_loss=20.97, generator_kl_loss=1.918, generator_dur_loss=1.677, generator_adv_loss=1.996, generator_feat_match_loss=3.68, over 49.00 samples.], tot_loss[discriminator_loss=2.614, discriminator_real_loss=1.33, discriminator_fake_loss=1.284, generator_loss=30.45, generator_mel_loss=20.76, generator_kl_loss=1.977, generator_dur_loss=1.658, generator_adv_loss=2.184, generator_feat_match_loss=3.871, over 2460.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2023-11-14 03:47:47,058 INFO [train.py:508] (0/4) Computing validation loss 
2023-11-14 03:47:57,493 INFO [train.py:517] (0/4) Epoch 519, validation: discriminator_loss=2.64, discriminator_real_loss=1.206, discriminator_fake_loss=1.434, generator_loss=30.82, generator_mel_loss=21.42, generator_kl_loss=2.125, generator_dur_loss=1.645, generator_adv_loss=1.822, generator_feat_match_loss=3.804, over 100.00 samples. 2023-11-14 03:47:57,494 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 03:48:06,389 INFO [train.py:811] (0/4) Start epoch 520 2023-11-14 03:51:42,369 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-520.pt 2023-11-14 03:51:45,701 INFO [train.py:811] (0/4) Start epoch 521 2023-11-14 03:52:53,620 INFO [train.py:467] (0/4) Epoch 521, batch 10, global_batch_idx: 19250, batch size: 53, loss[discriminator_loss=2.648, discriminator_real_loss=1.275, discriminator_fake_loss=1.372, generator_loss=29.67, generator_mel_loss=20.32, generator_kl_loss=1.945, generator_dur_loss=1.659, generator_adv_loss=2.098, generator_feat_match_loss=3.654, over 53.00 samples.], tot_loss[discriminator_loss=2.699, discriminator_real_loss=1.371, discriminator_fake_loss=1.328, generator_loss=30.07, generator_mel_loss=20.58, generator_kl_loss=1.988, generator_dur_loss=1.663, generator_adv_loss=2.1, generator_feat_match_loss=3.742, over 860.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2023-11-14 03:55:13,980 INFO [train.py:811] (0/4) Start epoch 522 2023-11-14 03:57:33,665 INFO [train.py:467] (0/4) Epoch 522, batch 23, global_batch_idx: 19300, batch size: 59, loss[discriminator_loss=2.781, discriminator_real_loss=1.474, discriminator_fake_loss=1.309, generator_loss=30.12, generator_mel_loss=21.04, generator_kl_loss=2.019, generator_dur_loss=1.671, generator_adv_loss=1.96, generator_feat_match_loss=3.422, over 59.00 samples.], tot_loss[discriminator_loss=2.55, discriminator_real_loss=1.283, discriminator_fake_loss=1.267, generator_loss=30.8, generator_mel_loss=20.65, generator_kl_loss=2.004, generator_dur_loss=1.662, generator_adv_loss=2.298, generator_feat_match_loss=4.18, over 1725.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2023-11-14 03:58:39,272 INFO [train.py:811] (0/4) Start epoch 523 2023-11-14 04:02:15,350 INFO [train.py:467] (0/4) Epoch 523, batch 36, global_batch_idx: 19350, batch size: 126, loss[discriminator_loss=2.654, discriminator_real_loss=1.443, discriminator_fake_loss=1.211, generator_loss=30.79, generator_mel_loss=21.02, generator_kl_loss=2.025, generator_dur_loss=1.651, generator_adv_loss=2.156, generator_feat_match_loss=3.941, over 126.00 samples.], tot_loss[discriminator_loss=2.626, discriminator_real_loss=1.339, discriminator_fake_loss=1.287, generator_loss=30.19, generator_mel_loss=20.62, generator_kl_loss=1.975, generator_dur_loss=1.657, generator_adv_loss=2.128, generator_feat_match_loss=3.816, over 2657.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2023-11-14 04:02:16,507 INFO [train.py:811] (0/4) Start epoch 524 2023-11-14 04:05:43,737 INFO [train.py:811] (0/4) Start epoch 525 2023-11-14 04:07:04,518 INFO [train.py:467] (0/4) Epoch 525, batch 12, global_batch_idx: 19400, batch size: 81, loss[discriminator_loss=2.723, discriminator_real_loss=1.303, discriminator_fake_loss=1.42, generator_loss=30.13, generator_mel_loss=20.87, generator_kl_loss=2.005, generator_dur_loss=1.661, generator_adv_loss=2.025, generator_feat_match_loss=3.566, over 81.00 samples.], tot_loss[discriminator_loss=2.603, 
discriminator_real_loss=1.306, discriminator_fake_loss=1.297, generator_loss=30.4, generator_mel_loss=20.75, generator_kl_loss=2.012, generator_dur_loss=1.652, generator_adv_loss=2.138, generator_feat_match_loss=3.849, over 974.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2023-11-14 04:07:05,123 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 04:07:15,969 INFO [train.py:517] (0/4) Epoch 525, validation: discriminator_loss=2.768, discriminator_real_loss=1.453, discriminator_fake_loss=1.315, generator_loss=31.06, generator_mel_loss=21.68, generator_kl_loss=2.127, generator_dur_loss=1.643, generator_adv_loss=1.993, generator_feat_match_loss=3.615, over 100.00 samples. 2023-11-14 04:07:15,969 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 04:09:28,240 INFO [train.py:811] (0/4) Start epoch 526 2023-11-14 04:12:00,390 INFO [train.py:467] (0/4) Epoch 526, batch 25, global_batch_idx: 19450, batch size: 55, loss[discriminator_loss=2.676, discriminator_real_loss=1.265, discriminator_fake_loss=1.41, generator_loss=29.75, generator_mel_loss=20.41, generator_kl_loss=2.094, generator_dur_loss=1.691, generator_adv_loss=2.127, generator_feat_match_loss=3.422, over 55.00 samples.], tot_loss[discriminator_loss=2.649, discriminator_real_loss=1.346, discriminator_fake_loss=1.304, generator_loss=30.45, generator_mel_loss=20.9, generator_kl_loss=1.996, generator_dur_loss=1.664, generator_adv_loss=2.115, generator_feat_match_loss=3.776, over 1896.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2023-11-14 04:13:01,125 INFO [train.py:811] (0/4) Start epoch 527 2023-11-14 04:16:32,606 INFO [train.py:811] (0/4) Start epoch 528 2023-11-14 04:16:50,711 INFO [train.py:467] (0/4) Epoch 528, batch 1, global_batch_idx: 19500, batch size: 56, loss[discriminator_loss=2.434, discriminator_real_loss=1.165, discriminator_fake_loss=1.27, generator_loss=31.75, generator_mel_loss=21.25, generator_kl_loss=1.958, generator_dur_loss=1.653, generator_adv_loss=2.318, generator_feat_match_loss=4.57, over 56.00 samples.], tot_loss[discriminator_loss=2.45, discriminator_real_loss=1.229, discriminator_fake_loss=1.222, generator_loss=30.9, generator_mel_loss=20.57, generator_kl_loss=1.962, generator_dur_loss=1.654, generator_adv_loss=2.316, generator_feat_match_loss=4.399, over 105.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2023-11-14 04:20:04,757 INFO [train.py:811] (0/4) Start epoch 529 2023-11-14 04:21:42,279 INFO [train.py:467] (0/4) Epoch 529, batch 14, global_batch_idx: 19550, batch size: 52, loss[discriminator_loss=2.555, discriminator_real_loss=1.252, discriminator_fake_loss=1.304, generator_loss=30.23, generator_mel_loss=20.4, generator_kl_loss=1.999, generator_dur_loss=1.652, generator_adv_loss=2.1, generator_feat_match_loss=4.074, over 52.00 samples.], tot_loss[discriminator_loss=2.618, discriminator_real_loss=1.318, discriminator_fake_loss=1.3, generator_loss=30.31, generator_mel_loss=20.76, generator_kl_loss=1.975, generator_dur_loss=1.655, generator_adv_loss=2.082, generator_feat_match_loss=3.834, over 1317.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2023-11-14 04:23:43,242 INFO [train.py:811] (0/4) Start epoch 530 2023-11-14 04:26:23,975 INFO [train.py:467] (0/4) Epoch 530, batch 27, global_batch_idx: 19600, batch size: 64, loss[discriminator_loss=2.465, discriminator_real_loss=1.129, discriminator_fake_loss=1.337, generator_loss=30.64, generator_mel_loss=20.28, 
generator_kl_loss=1.965, generator_dur_loss=1.667, generator_adv_loss=2.219, generator_feat_match_loss=4.512, over 64.00 samples.], tot_loss[discriminator_loss=2.562, discriminator_real_loss=1.307, discriminator_fake_loss=1.256, generator_loss=30.6, generator_mel_loss=20.42, generator_kl_loss=1.993, generator_dur_loss=1.657, generator_adv_loss=2.287, generator_feat_match_loss=4.245, over 1910.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 32.0 2023-11-14 04:26:24,545 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 04:26:34,811 INFO [train.py:517] (0/4) Epoch 530, validation: discriminator_loss=2.489, discriminator_real_loss=1.078, discriminator_fake_loss=1.411, generator_loss=30.36, generator_mel_loss=20.86, generator_kl_loss=2.071, generator_dur_loss=1.644, generator_adv_loss=1.811, generator_feat_match_loss=3.977, over 100.00 samples. 2023-11-14 04:26:34,812 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 04:27:25,819 INFO [train.py:811] (0/4) Start epoch 531 2023-11-14 04:30:54,822 INFO [train.py:811] (0/4) Start epoch 532 2023-11-14 04:31:25,647 INFO [train.py:467] (0/4) Epoch 532, batch 3, global_batch_idx: 19650, batch size: 110, loss[discriminator_loss=2.414, discriminator_real_loss=1.153, discriminator_fake_loss=1.262, generator_loss=31.74, generator_mel_loss=21.05, generator_kl_loss=2.03, generator_dur_loss=1.639, generator_adv_loss=2.215, generator_feat_match_loss=4.801, over 110.00 samples.], tot_loss[discriminator_loss=2.642, discriminator_real_loss=1.417, discriminator_fake_loss=1.226, generator_loss=30.57, generator_mel_loss=20.61, generator_kl_loss=1.986, generator_dur_loss=1.654, generator_adv_loss=2.184, generator_feat_match_loss=4.139, over 281.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2023-11-14 04:34:24,395 INFO [train.py:811] (0/4) Start epoch 533 2023-11-14 04:36:10,393 INFO [train.py:467] (0/4) Epoch 533, batch 16, global_batch_idx: 19700, batch size: 49, loss[discriminator_loss=2.582, discriminator_real_loss=1.218, discriminator_fake_loss=1.365, generator_loss=30.25, generator_mel_loss=20.75, generator_kl_loss=1.801, generator_dur_loss=1.646, generator_adv_loss=2.211, generator_feat_match_loss=3.836, over 49.00 samples.], tot_loss[discriminator_loss=2.586, discriminator_real_loss=1.312, discriminator_fake_loss=1.274, generator_loss=30.38, generator_mel_loss=20.63, generator_kl_loss=2.007, generator_dur_loss=1.656, generator_adv_loss=2.162, generator_feat_match_loss=3.93, over 1220.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2023-11-14 04:37:59,030 INFO [train.py:811] (0/4) Start epoch 534 2023-11-14 04:40:49,994 INFO [train.py:467] (0/4) Epoch 534, batch 29, global_batch_idx: 19750, batch size: 53, loss[discriminator_loss=2.633, discriminator_real_loss=1.193, discriminator_fake_loss=1.438, generator_loss=31.15, generator_mel_loss=21.16, generator_kl_loss=2.009, generator_dur_loss=1.673, generator_adv_loss=2.354, generator_feat_match_loss=3.963, over 53.00 samples.], tot_loss[discriminator_loss=2.647, discriminator_real_loss=1.331, discriminator_fake_loss=1.316, generator_loss=30.51, generator_mel_loss=20.84, generator_kl_loss=1.991, generator_dur_loss=1.662, generator_adv_loss=2.151, generator_feat_match_loss=3.859, over 2280.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2023-11-14 04:41:29,095 INFO [train.py:811] (0/4) Start epoch 535 2023-11-14 04:45:03,924 INFO [train.py:811] (0/4) Start epoch 536 2023-11-14 04:45:44,555 
INFO [train.py:467] (0/4) Epoch 536, batch 5, global_batch_idx: 19800, batch size: 59, loss[discriminator_loss=2.586, discriminator_real_loss=1.288, discriminator_fake_loss=1.298, generator_loss=29.69, generator_mel_loss=20.32, generator_kl_loss=2.005, generator_dur_loss=1.651, generator_adv_loss=2.004, generator_feat_match_loss=3.709, over 59.00 samples.], tot_loss[discriminator_loss=2.56, discriminator_real_loss=1.301, discriminator_fake_loss=1.259, generator_loss=30.42, generator_mel_loss=20.61, generator_kl_loss=1.958, generator_dur_loss=1.653, generator_adv_loss=2.203, generator_feat_match_loss=3.994, over 411.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2023-11-14 04:45:45,101 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 04:45:56,195 INFO [train.py:517] (0/4) Epoch 536, validation: discriminator_loss=2.601, discriminator_real_loss=1.14, discriminator_fake_loss=1.461, generator_loss=30.86, generator_mel_loss=21.6, generator_kl_loss=2.091, generator_dur_loss=1.634, generator_adv_loss=1.735, generator_feat_match_loss=3.803, over 100.00 samples. 2023-11-14 04:45:56,196 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 04:48:52,330 INFO [train.py:811] (0/4) Start epoch 537 2023-11-14 04:50:42,092 INFO [train.py:467] (0/4) Epoch 537, batch 18, global_batch_idx: 19850, batch size: 60, loss[discriminator_loss=2.604, discriminator_real_loss=1.328, discriminator_fake_loss=1.275, generator_loss=29.7, generator_mel_loss=20.31, generator_kl_loss=1.984, generator_dur_loss=1.673, generator_adv_loss=2.045, generator_feat_match_loss=3.689, over 60.00 samples.], tot_loss[discriminator_loss=2.569, discriminator_real_loss=1.321, discriminator_fake_loss=1.249, generator_loss=30.34, generator_mel_loss=20.51, generator_kl_loss=2, generator_dur_loss=1.66, generator_adv_loss=2.223, generator_feat_match_loss=3.943, over 1328.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2023-11-14 04:52:20,196 INFO [train.py:811] (0/4) Start epoch 538 2023-11-14 04:55:22,326 INFO [train.py:467] (0/4) Epoch 538, batch 31, global_batch_idx: 19900, batch size: 49, loss[discriminator_loss=2.494, discriminator_real_loss=1.236, discriminator_fake_loss=1.258, generator_loss=31.24, generator_mel_loss=20.52, generator_kl_loss=1.982, generator_dur_loss=1.67, generator_adv_loss=2.293, generator_feat_match_loss=4.77, over 49.00 samples.], tot_loss[discriminator_loss=2.607, discriminator_real_loss=1.327, discriminator_fake_loss=1.28, generator_loss=30.54, generator_mel_loss=20.67, generator_kl_loss=1.996, generator_dur_loss=1.661, generator_adv_loss=2.204, generator_feat_match_loss=4.012, over 2199.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2023-11-14 04:55:55,136 INFO [train.py:811] (0/4) Start epoch 539 2023-11-14 04:59:27,960 INFO [train.py:811] (0/4) Start epoch 540 2023-11-14 05:00:28,233 INFO [train.py:467] (0/4) Epoch 540, batch 7, global_batch_idx: 19950, batch size: 126, loss[discriminator_loss=2.562, discriminator_real_loss=1.262, discriminator_fake_loss=1.301, generator_loss=30.93, generator_mel_loss=21.18, generator_kl_loss=2.104, generator_dur_loss=1.621, generator_adv_loss=2.078, generator_feat_match_loss=3.943, over 126.00 samples.], tot_loss[discriminator_loss=2.574, discriminator_real_loss=1.298, discriminator_fake_loss=1.275, generator_loss=30.39, generator_mel_loss=20.7, generator_kl_loss=2.035, generator_dur_loss=1.661, generator_adv_loss=2.128, generator_feat_match_loss=3.866, over 607.00 
samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2023-11-14 05:02:57,584 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-540.pt 2023-11-14 05:03:00,967 INFO [train.py:811] (0/4) Start epoch 541 2023-11-14 05:05:03,644 INFO [train.py:467] (0/4) Epoch 541, batch 20, global_batch_idx: 20000, batch size: 53, loss[discriminator_loss=2.596, discriminator_real_loss=1.275, discriminator_fake_loss=1.32, generator_loss=30.39, generator_mel_loss=20.59, generator_kl_loss=2.032, generator_dur_loss=1.647, generator_adv_loss=2.293, generator_feat_match_loss=3.83, over 53.00 samples.], tot_loss[discriminator_loss=2.61, discriminator_real_loss=1.326, discriminator_fake_loss=1.285, generator_loss=30.25, generator_mel_loss=20.68, generator_kl_loss=1.976, generator_dur_loss=1.652, generator_adv_loss=2.134, generator_feat_match_loss=3.81, over 1464.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2023-11-14 05:05:04,117 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 05:05:14,430 INFO [train.py:517] (0/4) Epoch 541, validation: discriminator_loss=2.567, discriminator_real_loss=1.315, discriminator_fake_loss=1.253, generator_loss=30.51, generator_mel_loss=20.92, generator_kl_loss=2.065, generator_dur_loss=1.654, generator_adv_loss=2.058, generator_feat_match_loss=3.816, over 100.00 samples. 2023-11-14 05:05:14,431 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 05:06:45,437 INFO [train.py:811] (0/4) Start epoch 542 2023-11-14 05:10:00,813 INFO [train.py:467] (0/4) Epoch 542, batch 33, global_batch_idx: 20050, batch size: 111, loss[discriminator_loss=2.496, discriminator_real_loss=1.313, discriminator_fake_loss=1.184, generator_loss=30.85, generator_mel_loss=20.79, generator_kl_loss=1.978, generator_dur_loss=1.633, generator_adv_loss=2.377, generator_feat_match_loss=4.078, over 111.00 samples.], tot_loss[discriminator_loss=2.658, discriminator_real_loss=1.355, discriminator_fake_loss=1.303, generator_loss=30.4, generator_mel_loss=20.81, generator_kl_loss=1.985, generator_dur_loss=1.655, generator_adv_loss=2.133, generator_feat_match_loss=3.821, over 2677.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2023-11-14 05:10:17,229 INFO [train.py:811] (0/4) Start epoch 543 2023-11-14 05:13:50,212 INFO [train.py:811] (0/4) Start epoch 544 2023-11-14 05:14:54,310 INFO [train.py:467] (0/4) Epoch 544, batch 9, global_batch_idx: 20100, batch size: 53, loss[discriminator_loss=2.699, discriminator_real_loss=1.25, discriminator_fake_loss=1.448, generator_loss=30.45, generator_mel_loss=20.7, generator_kl_loss=1.939, generator_dur_loss=1.651, generator_adv_loss=2.32, generator_feat_match_loss=3.834, over 53.00 samples.], tot_loss[discriminator_loss=2.64, discriminator_real_loss=1.328, discriminator_fake_loss=1.312, generator_loss=30.56, generator_mel_loss=20.86, generator_kl_loss=1.986, generator_dur_loss=1.663, generator_adv_loss=2.181, generator_feat_match_loss=3.866, over 683.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2023-11-14 05:17:26,999 INFO [train.py:811] (0/4) Start epoch 545 2023-11-14 05:19:42,817 INFO [train.py:467] (0/4) Epoch 545, batch 22, global_batch_idx: 20150, batch size: 63, loss[discriminator_loss=2.377, discriminator_real_loss=1.144, discriminator_fake_loss=1.233, generator_loss=32.13, generator_mel_loss=20.62, generator_kl_loss=1.962, generator_dur_loss=1.654, generator_adv_loss=2.725, generator_feat_match_loss=5.172, over 
63.00 samples.], tot_loss[discriminator_loss=2.572, discriminator_real_loss=1.315, discriminator_fake_loss=1.258, generator_loss=30.73, generator_mel_loss=20.61, generator_kl_loss=1.971, generator_dur_loss=1.656, generator_adv_loss=2.311, generator_feat_match_loss=4.184, over 1613.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2023-11-14 05:21:02,931 INFO [train.py:811] (0/4) Start epoch 546 2023-11-14 05:24:30,271 INFO [train.py:467] (0/4) Epoch 546, batch 35, global_batch_idx: 20200, batch size: 153, loss[discriminator_loss=2.648, discriminator_real_loss=1.324, discriminator_fake_loss=1.323, generator_loss=30.36, generator_mel_loss=20.59, generator_kl_loss=2.017, generator_dur_loss=1.642, generator_adv_loss=2.146, generator_feat_match_loss=3.969, over 153.00 samples.], tot_loss[discriminator_loss=2.617, discriminator_real_loss=1.329, discriminator_fake_loss=1.288, generator_loss=30.26, generator_mel_loss=20.48, generator_kl_loss=1.986, generator_dur_loss=1.653, generator_adv_loss=2.169, generator_feat_match_loss=3.968, over 2792.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2023-11-14 05:24:30,781 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 05:24:41,466 INFO [train.py:517] (0/4) Epoch 546, validation: discriminator_loss=2.577, discriminator_real_loss=1.25, discriminator_fake_loss=1.327, generator_loss=30.97, generator_mel_loss=21.25, generator_kl_loss=2.117, generator_dur_loss=1.65, generator_adv_loss=1.969, generator_feat_match_loss=3.982, over 100.00 samples. 2023-11-14 05:24:41,467 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 05:24:46,219 INFO [train.py:811] (0/4) Start epoch 547 2023-11-14 05:28:22,841 INFO [train.py:811] (0/4) Start epoch 548 2023-11-14 05:29:36,699 INFO [train.py:467] (0/4) Epoch 548, batch 11, global_batch_idx: 20250, batch size: 59, loss[discriminator_loss=2.564, discriminator_real_loss=1.361, discriminator_fake_loss=1.203, generator_loss=30.47, generator_mel_loss=20.58, generator_kl_loss=1.944, generator_dur_loss=1.649, generator_adv_loss=2.316, generator_feat_match_loss=3.977, over 59.00 samples.], tot_loss[discriminator_loss=2.665, discriminator_real_loss=1.36, discriminator_fake_loss=1.306, generator_loss=30.34, generator_mel_loss=20.71, generator_kl_loss=2.024, generator_dur_loss=1.666, generator_adv_loss=2.137, generator_feat_match_loss=3.804, over 849.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2023-11-14 05:31:58,369 INFO [train.py:811] (0/4) Start epoch 549 2023-11-14 05:34:19,716 INFO [train.py:467] (0/4) Epoch 549, batch 24, global_batch_idx: 20300, batch size: 65, loss[discriminator_loss=2.617, discriminator_real_loss=1.385, discriminator_fake_loss=1.232, generator_loss=30.31, generator_mel_loss=20.85, generator_kl_loss=1.987, generator_dur_loss=1.665, generator_adv_loss=2.113, generator_feat_match_loss=3.695, over 65.00 samples.], tot_loss[discriminator_loss=2.596, discriminator_real_loss=1.315, discriminator_fake_loss=1.281, generator_loss=30.64, generator_mel_loss=20.76, generator_kl_loss=1.994, generator_dur_loss=1.657, generator_adv_loss=2.202, generator_feat_match_loss=4.024, over 1903.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2023-11-14 05:35:25,998 INFO [train.py:811] (0/4) Start epoch 550 2023-11-14 05:38:56,564 INFO [train.py:811] (0/4) Start epoch 551 2023-11-14 05:39:12,511 INFO [train.py:467] (0/4) Epoch 551, batch 0, global_batch_idx: 20350, batch size: 52, 
loss[discriminator_loss=2.516, discriminator_real_loss=1.212, discriminator_fake_loss=1.305, generator_loss=30.51, generator_mel_loss=20.42, generator_kl_loss=1.993, generator_dur_loss=1.68, generator_adv_loss=2.525, generator_feat_match_loss=3.887, over 52.00 samples.], tot_loss[discriminator_loss=2.516, discriminator_real_loss=1.212, discriminator_fake_loss=1.305, generator_loss=30.51, generator_mel_loss=20.42, generator_kl_loss=1.993, generator_dur_loss=1.68, generator_adv_loss=2.525, generator_feat_match_loss=3.887, over 52.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2023-11-14 05:42:33,846 INFO [train.py:811] (0/4) Start epoch 552 2023-11-14 05:43:59,721 INFO [train.py:467] (0/4) Epoch 552, batch 13, global_batch_idx: 20400, batch size: 110, loss[discriminator_loss=2.656, discriminator_real_loss=1.261, discriminator_fake_loss=1.396, generator_loss=30.62, generator_mel_loss=21.04, generator_kl_loss=2.025, generator_dur_loss=1.642, generator_adv_loss=2.107, generator_feat_match_loss=3.801, over 110.00 samples.], tot_loss[discriminator_loss=2.613, discriminator_real_loss=1.323, discriminator_fake_loss=1.29, generator_loss=30.29, generator_mel_loss=20.62, generator_kl_loss=1.982, generator_dur_loss=1.661, generator_adv_loss=2.133, generator_feat_match_loss=3.898, over 1004.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2023-11-14 05:44:00,355 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 05:44:10,831 INFO [train.py:517] (0/4) Epoch 552, validation: discriminator_loss=2.596, discriminator_real_loss=1.278, discriminator_fake_loss=1.318, generator_loss=31.21, generator_mel_loss=21.3, generator_kl_loss=2.196, generator_dur_loss=1.634, generator_adv_loss=2.004, generator_feat_match_loss=4.074, over 100.00 samples. 
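
Entries in this shape are regular enough to scrape, e.g. to plot the averaged generator loss against the global step. A sketch under stated assumptions: "train.log" is a hypothetical path to this log, the regex only relies on the "global_batch_idx: N ... tot_loss[... generator_loss=X" pattern seen in these entries, and re.S lets a match span entries that were wrapped across physical lines:

    import re

    pattern = re.compile(
        r"global_batch_idx: (\d+).*?tot_loss\[.*?generator_loss=([\d.]+)",
        re.S,
    )
    with open("train.log") as f:
        points = [(int(b), float(g)) for b, g in pattern.findall(f.read())]
    print(points[:5])   # first few (global step, averaged loss) pairs for plotting

Only the training entries are captured this way; the validation entries carry no global_batch_idx or tot_loss[...] block and would need a separate pattern keyed on the "validation:" marker.
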
2023-11-14 05:44:10,832 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 05:46:20,111 INFO [train.py:811] (0/4) Start epoch 553 2023-11-14 05:48:58,516 INFO [train.py:467] (0/4) Epoch 553, batch 26, global_batch_idx: 20450, batch size: 56, loss[discriminator_loss=2.553, discriminator_real_loss=1.281, discriminator_fake_loss=1.271, generator_loss=29.89, generator_mel_loss=20.4, generator_kl_loss=1.97, generator_dur_loss=1.64, generator_adv_loss=2.047, generator_feat_match_loss=3.83, over 56.00 samples.], tot_loss[discriminator_loss=2.656, discriminator_real_loss=1.347, discriminator_fake_loss=1.308, generator_loss=30.33, generator_mel_loss=20.6, generator_kl_loss=1.979, generator_dur_loss=1.655, generator_adv_loss=2.189, generator_feat_match_loss=3.905, over 2083.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2023-11-14 05:49:54,751 INFO [train.py:811] (0/4) Start epoch 554 2023-11-14 05:53:23,246 INFO [train.py:811] (0/4) Start epoch 555 2023-11-14 05:53:51,126 INFO [train.py:467] (0/4) Epoch 555, batch 2, global_batch_idx: 20500, batch size: 50, loss[discriminator_loss=2.723, discriminator_real_loss=1.307, discriminator_fake_loss=1.417, generator_loss=29.41, generator_mel_loss=20.16, generator_kl_loss=1.929, generator_dur_loss=1.681, generator_adv_loss=1.926, generator_feat_match_loss=3.717, over 50.00 samples.], tot_loss[discriminator_loss=2.656, discriminator_real_loss=1.375, discriminator_fake_loss=1.282, generator_loss=30.48, generator_mel_loss=20.66, generator_kl_loss=1.949, generator_dur_loss=1.677, generator_adv_loss=2.103, generator_feat_match_loss=4.099, over 184.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2023-11-14 05:56:53,225 INFO [train.py:811] (0/4) Start epoch 556 2023-11-14 05:58:30,001 INFO [train.py:467] (0/4) Epoch 556, batch 15, global_batch_idx: 20550, batch size: 90, loss[discriminator_loss=2.469, discriminator_real_loss=1.253, discriminator_fake_loss=1.216, generator_loss=30.54, generator_mel_loss=20.25, generator_kl_loss=1.994, generator_dur_loss=1.638, generator_adv_loss=2.256, generator_feat_match_loss=4.398, over 90.00 samples.], tot_loss[discriminator_loss=2.512, discriminator_real_loss=1.269, discriminator_fake_loss=1.242, generator_loss=30.51, generator_mel_loss=20.29, generator_kl_loss=1.974, generator_dur_loss=1.654, generator_adv_loss=2.315, generator_feat_match_loss=4.286, over 1179.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2023-11-14 06:00:27,035 INFO [train.py:811] (0/4) Start epoch 557 2023-11-14 06:03:10,834 INFO [train.py:467] (0/4) Epoch 557, batch 28, global_batch_idx: 20600, batch size: 126, loss[discriminator_loss=2.828, discriminator_real_loss=1.587, discriminator_fake_loss=1.242, generator_loss=29.78, generator_mel_loss=20.29, generator_kl_loss=1.974, generator_dur_loss=1.645, generator_adv_loss=2.279, generator_feat_match_loss=3.586, over 126.00 samples.], tot_loss[discriminator_loss=2.563, discriminator_real_loss=1.297, discriminator_fake_loss=1.267, generator_loss=30.76, generator_mel_loss=20.53, generator_kl_loss=1.99, generator_dur_loss=1.658, generator_adv_loss=2.334, generator_feat_match_loss=4.256, over 2096.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2023-11-14 06:03:11,368 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 06:03:21,704 INFO [train.py:517] (0/4) Epoch 557, validation: discriminator_loss=2.709, discriminator_real_loss=1.569, discriminator_fake_loss=1.14, 
generator_loss=31.36, generator_mel_loss=21.37, generator_kl_loss=2.118, generator_dur_loss=1.641, generator_adv_loss=2.378, generator_feat_match_loss=3.859, over 100.00 samples. 2023-11-14 06:03:21,705 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 06:04:05,162 INFO [train.py:811] (0/4) Start epoch 558 2023-11-14 06:07:34,893 INFO [train.py:811] (0/4) Start epoch 559 2023-11-14 06:08:16,272 INFO [train.py:467] (0/4) Epoch 559, batch 4, global_batch_idx: 20650, batch size: 56, loss[discriminator_loss=2.609, discriminator_real_loss=1.354, discriminator_fake_loss=1.254, generator_loss=30.36, generator_mel_loss=20.58, generator_kl_loss=1.95, generator_dur_loss=1.658, generator_adv_loss=2.277, generator_feat_match_loss=3.904, over 56.00 samples.], tot_loss[discriminator_loss=2.572, discriminator_real_loss=1.302, discriminator_fake_loss=1.269, generator_loss=30.46, generator_mel_loss=20.68, generator_kl_loss=1.969, generator_dur_loss=1.648, generator_adv_loss=2.165, generator_feat_match_loss=4, over 410.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2023-11-14 06:11:05,011 INFO [train.py:811] (0/4) Start epoch 560 2023-11-14 06:12:51,399 INFO [train.py:467] (0/4) Epoch 560, batch 17, global_batch_idx: 20700, batch size: 52, loss[discriminator_loss=2.576, discriminator_real_loss=1.241, discriminator_fake_loss=1.335, generator_loss=30.91, generator_mel_loss=20.35, generator_kl_loss=1.921, generator_dur_loss=1.653, generator_adv_loss=2.48, generator_feat_match_loss=4.504, over 52.00 samples.], tot_loss[discriminator_loss=2.647, discriminator_real_loss=1.348, discriminator_fake_loss=1.298, generator_loss=30.39, generator_mel_loss=20.48, generator_kl_loss=1.989, generator_dur_loss=1.654, generator_adv_loss=2.244, generator_feat_match_loss=4.028, over 1290.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2023-11-14 06:14:39,837 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-560.pt 2023-11-14 06:14:43,139 INFO [train.py:811] (0/4) Start epoch 561 2023-11-14 06:17:38,132 INFO [train.py:467] (0/4) Epoch 561, batch 30, global_batch_idx: 20750, batch size: 54, loss[discriminator_loss=2.547, discriminator_real_loss=1.376, discriminator_fake_loss=1.172, generator_loss=29.91, generator_mel_loss=20.46, generator_kl_loss=1.943, generator_dur_loss=1.69, generator_adv_loss=1.994, generator_feat_match_loss=3.832, over 54.00 samples.], tot_loss[discriminator_loss=2.544, discriminator_real_loss=1.289, discriminator_fake_loss=1.254, generator_loss=30.41, generator_mel_loss=20.48, generator_kl_loss=1.993, generator_dur_loss=1.665, generator_adv_loss=2.217, generator_feat_match_loss=4.058, over 1994.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0 2023-11-14 06:18:15,750 INFO [train.py:811] (0/4) Start epoch 562 2023-11-14 06:21:45,229 INFO [train.py:811] (0/4) Start epoch 563 2023-11-14 06:22:31,596 INFO [train.py:467] (0/4) Epoch 563, batch 6, global_batch_idx: 20800, batch size: 64, loss[discriminator_loss=2.859, discriminator_real_loss=1.621, discriminator_fake_loss=1.239, generator_loss=29.29, generator_mel_loss=20.21, generator_kl_loss=1.895, generator_dur_loss=1.651, generator_adv_loss=2.172, generator_feat_match_loss=3.361, over 64.00 samples.], tot_loss[discriminator_loss=2.725, discriminator_real_loss=1.426, discriminator_fake_loss=1.299, generator_loss=30.72, generator_mel_loss=20.65, generator_kl_loss=1.974, generator_dur_loss=1.65, generator_adv_loss=2.338, 
generator_feat_match_loss=4.107, over 563.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0 2023-11-14 06:22:32,178 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 06:22:43,366 INFO [train.py:517] (0/4) Epoch 563, validation: discriminator_loss=2.729, discriminator_real_loss=1.466, discriminator_fake_loss=1.263, generator_loss=30.51, generator_mel_loss=21, generator_kl_loss=2.106, generator_dur_loss=1.645, generator_adv_loss=2.2, generator_feat_match_loss=3.556, over 100.00 samples. 2023-11-14 06:22:43,367 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 06:25:33,244 INFO [train.py:811] (0/4) Start epoch 564 2023-11-14 06:27:34,767 INFO [train.py:467] (0/4) Epoch 564, batch 19, global_batch_idx: 20850, batch size: 61, loss[discriminator_loss=2.566, discriminator_real_loss=1.272, discriminator_fake_loss=1.295, generator_loss=30.29, generator_mel_loss=20.51, generator_kl_loss=2.017, generator_dur_loss=1.647, generator_adv_loss=2.18, generator_feat_match_loss=3.932, over 61.00 samples.], tot_loss[discriminator_loss=2.611, discriminator_real_loss=1.338, discriminator_fake_loss=1.272, generator_loss=30.42, generator_mel_loss=20.72, generator_kl_loss=2.017, generator_dur_loss=1.658, generator_adv_loss=2.139, generator_feat_match_loss=3.886, over 1393.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0 2023-11-14 06:29:06,314 INFO [train.py:811] (0/4) Start epoch 565 2023-11-14 06:32:14,058 INFO [train.py:467] (0/4) Epoch 565, batch 32, global_batch_idx: 20900, batch size: 60, loss[discriminator_loss=2.676, discriminator_real_loss=1.464, discriminator_fake_loss=1.213, generator_loss=30.63, generator_mel_loss=20.53, generator_kl_loss=1.896, generator_dur_loss=1.672, generator_adv_loss=2.346, generator_feat_match_loss=4.188, over 60.00 samples.], tot_loss[discriminator_loss=2.604, discriminator_real_loss=1.341, discriminator_fake_loss=1.263, generator_loss=30.57, generator_mel_loss=20.44, generator_kl_loss=1.974, generator_dur_loss=1.652, generator_adv_loss=2.31, generator_feat_match_loss=4.199, over 2522.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0 2023-11-14 06:32:37,806 INFO [train.py:811] (0/4) Start epoch 566 2023-11-14 06:36:06,612 INFO [train.py:811] (0/4) Start epoch 567 2023-11-14 06:37:03,296 INFO [train.py:467] (0/4) Epoch 567, batch 8, global_batch_idx: 20950, batch size: 58, loss[discriminator_loss=2.613, discriminator_real_loss=1.365, discriminator_fake_loss=1.249, generator_loss=30.32, generator_mel_loss=20.52, generator_kl_loss=1.978, generator_dur_loss=1.632, generator_adv_loss=2.244, generator_feat_match_loss=3.945, over 58.00 samples.], tot_loss[discriminator_loss=2.624, discriminator_real_loss=1.331, discriminator_fake_loss=1.292, generator_loss=30.38, generator_mel_loss=20.64, generator_kl_loss=1.974, generator_dur_loss=1.66, generator_adv_loss=2.148, generator_feat_match_loss=3.952, over 669.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0 2023-11-14 06:39:37,645 INFO [train.py:811] (0/4) Start epoch 568 2023-11-14 06:41:45,550 INFO [train.py:467] (0/4) Epoch 568, batch 21, global_batch_idx: 21000, batch size: 90, loss[discriminator_loss=2.57, discriminator_real_loss=1.357, discriminator_fake_loss=1.213, generator_loss=30.58, generator_mel_loss=20.56, generator_kl_loss=2.09, generator_dur_loss=1.685, generator_adv_loss=2.287, generator_feat_match_loss=3.965, over 90.00 samples.], tot_loss[discriminator_loss=2.634, discriminator_real_loss=1.342, 
discriminator_fake_loss=1.292, generator_loss=30.24, generator_mel_loss=20.57, generator_kl_loss=1.984, generator_dur_loss=1.657, generator_adv_loss=2.144, generator_feat_match_loss=3.882, over 1411.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 06:41:46,046 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 06:41:56,461 INFO [train.py:517] (0/4) Epoch 568, validation: discriminator_loss=2.506, discriminator_real_loss=1.235, discriminator_fake_loss=1.271, generator_loss=30.96, generator_mel_loss=21.07, generator_kl_loss=2.146, generator_dur_loss=1.636, generator_adv_loss=2.12, generator_feat_match_loss=3.986, over 100.00 samples.
2023-11-14 06:41:56,461 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 06:43:21,107 INFO [train.py:811] (0/4) Start epoch 569
2023-11-14 06:46:45,355 INFO [train.py:467] (0/4) Epoch 569, batch 34, global_batch_idx: 21050, batch size: 85, loss[discriminator_loss=2.684, discriminator_real_loss=1.335, discriminator_fake_loss=1.35, generator_loss=30.09, generator_mel_loss=20.7, generator_kl_loss=2.006, generator_dur_loss=1.664, generator_adv_loss=2.033, generator_feat_match_loss=3.693, over 85.00 samples.], tot_loss[discriminator_loss=2.673, discriminator_real_loss=1.354, discriminator_fake_loss=1.32, generator_loss=30.35, generator_mel_loss=20.74, generator_kl_loss=2.003, generator_dur_loss=1.655, generator_adv_loss=2.15, generator_feat_match_loss=3.808, over 2564.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 06:46:55,735 INFO [train.py:811] (0/4) Start epoch 570
2023-11-14 06:50:27,798 INFO [train.py:811] (0/4) Start epoch 571
2023-11-14 06:51:46,405 INFO [train.py:467] (0/4) Epoch 571, batch 10, global_batch_idx: 21100, batch size: 60, loss[discriminator_loss=2.461, discriminator_real_loss=1.349, discriminator_fake_loss=1.111, generator_loss=30.87, generator_mel_loss=20.3, generator_kl_loss=2.007, generator_dur_loss=1.674, generator_adv_loss=2.543, generator_feat_match_loss=4.352, over 60.00 samples.], tot_loss[discriminator_loss=2.525, discriminator_real_loss=1.284, discriminator_fake_loss=1.241, generator_loss=30.94, generator_mel_loss=20.44, generator_kl_loss=1.978, generator_dur_loss=1.65, generator_adv_loss=2.409, generator_feat_match_loss=4.459, over 930.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2023-11-14 06:53:56,959 INFO [train.py:811] (0/4) Start epoch 572
2023-11-14 06:56:21,862 INFO [train.py:467] (0/4) Epoch 572, batch 23, global_batch_idx: 21150, batch size: 69, loss[discriminator_loss=2.371, discriminator_real_loss=1.231, discriminator_fake_loss=1.139, generator_loss=31.28, generator_mel_loss=20.39, generator_kl_loss=1.998, generator_dur_loss=1.635, generator_adv_loss=2.506, generator_feat_match_loss=4.754, over 69.00 samples.], tot_loss[discriminator_loss=2.485, discriminator_real_loss=1.269, discriminator_fake_loss=1.216, generator_loss=30.86, generator_mel_loss=20.26, generator_kl_loss=1.972, generator_dur_loss=1.656, generator_adv_loss=2.402, generator_feat_match_loss=4.571, over 1679.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2023-11-14 06:57:31,533 INFO [train.py:811] (0/4) Start epoch 573
2023-11-14 07:00:53,143 INFO [train.py:467] (0/4) Epoch 573, batch 36, global_batch_idx: 21200, batch size: 153, loss[discriminator_loss=2.648, discriminator_real_loss=1.316, discriminator_fake_loss=1.332, generator_loss=31.05, generator_mel_loss=21.02, generator_kl_loss=2.063, generator_dur_loss=1.64, generator_adv_loss=2.23, generator_feat_match_loss=4.09, over 153.00 samples.], tot_loss[discriminator_loss=2.611, discriminator_real_loss=1.329, discriminator_fake_loss=1.283, generator_loss=30.27, generator_mel_loss=20.56, generator_kl_loss=2.002, generator_dur_loss=1.651, generator_adv_loss=2.134, generator_feat_match_loss=3.926, over 2874.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 07:00:53,658 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 07:01:04,806 INFO [train.py:517] (0/4) Epoch 573, validation: discriminator_loss=2.648, discriminator_real_loss=1.389, discriminator_fake_loss=1.26, generator_loss=30.9, generator_mel_loss=21.23, generator_kl_loss=2.15, generator_dur_loss=1.635, generator_adv_loss=2.081, generator_feat_match_loss=3.801, over 100.00 samples.
2023-11-14 07:01:04,807 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 07:01:05,630 INFO [train.py:811] (0/4) Start epoch 574
2023-11-14 07:04:36,803 INFO [train.py:811] (0/4) Start epoch 575
2023-11-14 07:05:59,237 INFO [train.py:467] (0/4) Epoch 575, batch 12, global_batch_idx: 21250, batch size: 56, loss[discriminator_loss=2.49, discriminator_real_loss=1.244, discriminator_fake_loss=1.246, generator_loss=30.73, generator_mel_loss=20.6, generator_kl_loss=2.036, generator_dur_loss=1.633, generator_adv_loss=2.373, generator_feat_match_loss=4.086, over 56.00 samples.], tot_loss[discriminator_loss=2.527, discriminator_real_loss=1.286, discriminator_fake_loss=1.241, generator_loss=30.94, generator_mel_loss=20.6, generator_kl_loss=1.978, generator_dur_loss=1.654, generator_adv_loss=2.379, generator_feat_match_loss=4.327, over 857.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 07:08:07,730 INFO [train.py:811] (0/4) Start epoch 576
2023-11-14 07:10:37,035 INFO [train.py:467] (0/4) Epoch 576, batch 25, global_batch_idx: 21300, batch size: 95, loss[discriminator_loss=2.588, discriminator_real_loss=1.395, discriminator_fake_loss=1.193, generator_loss=30.34, generator_mel_loss=20.88, generator_kl_loss=1.948, generator_dur_loss=1.624, generator_adv_loss=2.016, generator_feat_match_loss=3.879, over 95.00 samples.], tot_loss[discriminator_loss=2.602, discriminator_real_loss=1.326, discriminator_fake_loss=1.275, generator_loss=30.26, generator_mel_loss=20.52, generator_kl_loss=2.01, generator_dur_loss=1.661, generator_adv_loss=2.157, generator_feat_match_loss=3.915, over 1749.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 07:11:43,748 INFO [train.py:811] (0/4) Start epoch 577
2023-11-14 07:15:09,248 INFO [train.py:811] (0/4) Start epoch 578
2023-11-14 07:15:29,927 INFO [train.py:467] (0/4) Epoch 578, batch 1, global_batch_idx: 21350, batch size: 73, loss[discriminator_loss=2.637, discriminator_real_loss=1.359, discriminator_fake_loss=1.278, generator_loss=30.5, generator_mel_loss=20.91, generator_kl_loss=2.031, generator_dur_loss=1.66, generator_adv_loss=2.109, generator_feat_match_loss=3.791, over 73.00 samples.], tot_loss[discriminator_loss=2.574, discriminator_real_loss=1.319, discriminator_fake_loss=1.256, generator_loss=30.54, generator_mel_loss=20.87, generator_kl_loss=1.954, generator_dur_loss=1.662, generator_adv_loss=2.219, generator_feat_match_loss=3.832, over 142.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 07:18:39,448 INFO [train.py:811] (0/4) Start epoch 579
2023-11-14 07:20:15,465 INFO [train.py:467] (0/4) Epoch 579, batch 14, global_batch_idx: 21400, batch size: 49, loss[discriminator_loss=2.506, discriminator_real_loss=1.103, discriminator_fake_loss=1.403, generator_loss=29.98, generator_mel_loss=20.05, generator_kl_loss=1.923, generator_dur_loss=1.663, generator_adv_loss=2.146, generator_feat_match_loss=4.191, over 49.00 samples.], tot_loss[discriminator_loss=2.605, discriminator_real_loss=1.306, discriminator_fake_loss=1.299, generator_loss=30.22, generator_mel_loss=20.41, generator_kl_loss=1.969, generator_dur_loss=1.655, generator_adv_loss=2.164, generator_feat_match_loss=4.03, over 951.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 07:20:16,040 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 07:20:26,430 INFO [train.py:517] (0/4) Epoch 579, validation: discriminator_loss=2.57, discriminator_real_loss=1.246, discriminator_fake_loss=1.323, generator_loss=30.95, generator_mel_loss=21.05, generator_kl_loss=2.034, generator_dur_loss=1.638, generator_adv_loss=1.944, generator_feat_match_loss=4.283, over 100.00 samples.
2023-11-14 07:20:26,431 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 07:22:25,671 INFO [train.py:811] (0/4) Start epoch 580
2023-11-14 07:25:00,815 INFO [train.py:467] (0/4) Epoch 580, batch 27, global_batch_idx: 21450, batch size: 153, loss[discriminator_loss=2.535, discriminator_real_loss=1.275, discriminator_fake_loss=1.261, generator_loss=30.99, generator_mel_loss=20.64, generator_kl_loss=2.097, generator_dur_loss=1.639, generator_adv_loss=2.285, generator_feat_match_loss=4.324, over 153.00 samples.], tot_loss[discriminator_loss=2.593, discriminator_real_loss=1.311, discriminator_fake_loss=1.282, generator_loss=30.4, generator_mel_loss=20.58, generator_kl_loss=2.009, generator_dur_loss=1.661, generator_adv_loss=2.151, generator_feat_match_loss=3.999, over 2018.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 07:25:54,016 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-580.pt
2023-11-14 07:25:57,240 INFO [train.py:811] (0/4) Start epoch 581
2023-11-14 07:29:22,475 INFO [train.py:811] (0/4) Start epoch 582
2023-11-14 07:29:59,151 INFO [train.py:467] (0/4) Epoch 582, batch 3, global_batch_idx: 21500, batch size: 65, loss[discriminator_loss=2.648, discriminator_real_loss=1.204, discriminator_fake_loss=1.444, generator_loss=29.92, generator_mel_loss=20.43, generator_kl_loss=1.95, generator_dur_loss=1.673, generator_adv_loss=1.994, generator_feat_match_loss=3.873, over 65.00 samples.], tot_loss[discriminator_loss=2.674, discriminator_real_loss=1.36, discriminator_fake_loss=1.314, generator_loss=30.04, generator_mel_loss=20.61, generator_kl_loss=1.987, generator_dur_loss=1.659, generator_adv_loss=1.942, generator_feat_match_loss=3.834, over 349.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 07:32:55,233 INFO [train.py:811] (0/4) Start epoch 583
2023-11-14 07:34:38,767 INFO [train.py:467] (0/4) Epoch 583, batch 16, global_batch_idx: 21550, batch size: 71, loss[discriminator_loss=2.543, discriminator_real_loss=1.261, discriminator_fake_loss=1.283, generator_loss=30.42, generator_mel_loss=20.44, generator_kl_loss=1.956, generator_dur_loss=1.652, generator_adv_loss=2.242, generator_feat_match_loss=4.133, over 71.00 samples.], tot_loss[discriminator_loss=2.585, discriminator_real_loss=1.296, discriminator_fake_loss=1.289, generator_loss=30.22, generator_mel_loss=20.36, generator_kl_loss=1.977, generator_dur_loss=1.651, generator_adv_loss=2.154, generator_feat_match_loss=4.085, over 1290.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 07:36:28,803 INFO [train.py:811] (0/4) Start epoch 584
2023-11-14 07:39:14,225 INFO [train.py:467] (0/4) Epoch 584, batch 29, global_batch_idx: 21600, batch size: 73, loss[discriminator_loss=2.854, discriminator_real_loss=1.705, discriminator_fake_loss=1.148, generator_loss=30.08, generator_mel_loss=20.41, generator_kl_loss=2.049, generator_dur_loss=1.642, generator_adv_loss=2.385, generator_feat_match_loss=3.596, over 73.00 samples.], tot_loss[discriminator_loss=2.575, discriminator_real_loss=1.306, discriminator_fake_loss=1.269, generator_loss=30.59, generator_mel_loss=20.48, generator_kl_loss=1.981, generator_dur_loss=1.65, generator_adv_loss=2.274, generator_feat_match_loss=4.204, over 2049.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 07:39:14,723 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 07:39:25,422 INFO [train.py:517] (0/4) Epoch 584, validation: discriminator_loss=2.718, discriminator_real_loss=1.526, discriminator_fake_loss=1.192, generator_loss=31.35, generator_mel_loss=21.09, generator_kl_loss=2.166, generator_dur_loss=1.646, generator_adv_loss=2.405, generator_feat_match_loss=4.042, over 100.00 samples.
2023-11-14 07:39:25,423 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 07:40:06,990 INFO [train.py:811] (0/4) Start epoch 585
2023-11-14 07:43:38,618 INFO [train.py:811] (0/4) Start epoch 586
2023-11-14 07:44:22,634 INFO [train.py:467] (0/4) Epoch 586, batch 5, global_batch_idx: 21650, batch size: 49, loss[discriminator_loss=2.543, discriminator_real_loss=1.318, discriminator_fake_loss=1.225, generator_loss=29.6, generator_mel_loss=20, generator_kl_loss=1.992, generator_dur_loss=1.647, generator_adv_loss=2.115, generator_feat_match_loss=3.844, over 49.00 samples.], tot_loss[discriminator_loss=2.599, discriminator_real_loss=1.306, discriminator_fake_loss=1.293, generator_loss=30.25, generator_mel_loss=20.62, generator_kl_loss=1.983, generator_dur_loss=1.643, generator_adv_loss=2.082, generator_feat_match_loss=3.916, over 477.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 07:47:06,672 INFO [train.py:811] (0/4) Start epoch 587
2023-11-14 07:48:54,959 INFO [train.py:467] (0/4) Epoch 587, batch 18, global_batch_idx: 21700, batch size: 126, loss[discriminator_loss=2.656, discriminator_real_loss=1.367, discriminator_fake_loss=1.289, generator_loss=31, generator_mel_loss=20.85, generator_kl_loss=2.071, generator_dur_loss=1.646, generator_adv_loss=2.285, generator_feat_match_loss=4.141, over 126.00 samples.], tot_loss[discriminator_loss=2.597, discriminator_real_loss=1.304, discriminator_fake_loss=1.293, generator_loss=30.76, generator_mel_loss=20.7, generator_kl_loss=2.025, generator_dur_loss=1.651, generator_adv_loss=2.222, generator_feat_match_loss=4.161, over 1364.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 07:50:37,671 INFO [train.py:811] (0/4) Start epoch 588
2023-11-14 07:53:36,492 INFO [train.py:467] (0/4) Epoch 588, batch 31, global_batch_idx: 21750, batch size: 69, loss[discriminator_loss=2.582, discriminator_real_loss=1.381, discriminator_fake_loss=1.2, generator_loss=30.51, generator_mel_loss=20.49, generator_kl_loss=1.977, generator_dur_loss=1.643, generator_adv_loss=2.227, generator_feat_match_loss=4.18, over 69.00 samples.], tot_loss[discriminator_loss=2.557, discriminator_real_loss=1.29, discriminator_fake_loss=1.267, generator_loss=30.69, generator_mel_loss=20.53, generator_kl_loss=1.99, generator_dur_loss=1.648, generator_adv_loss=2.291, generator_feat_match_loss=4.233, over 2503.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2023-11-14 07:54:08,907 INFO [train.py:811] (0/4) Start epoch 589
2023-11-14 07:57:40,945 INFO [train.py:811] (0/4) Start epoch 590
2023-11-14 07:58:33,763 INFO [train.py:467] (0/4) Epoch 590, batch 7, global_batch_idx: 21800, batch size: 60, loss[discriminator_loss=2.646, discriminator_real_loss=1.1, discriminator_fake_loss=1.547, generator_loss=29.34, generator_mel_loss=19.94, generator_kl_loss=2.066, generator_dur_loss=1.678, generator_adv_loss=1.992, generator_feat_match_loss=3.66, over 60.00 samples.], tot_loss[discriminator_loss=2.564, discriminator_real_loss=1.262, discriminator_fake_loss=1.302, generator_loss=30.57, generator_mel_loss=20.26, generator_kl_loss=2.026, generator_dur_loss=1.644, generator_adv_loss=2.335, generator_feat_match_loss=4.3, over 650.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2023-11-14 07:58:34,344 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 07:58:45,633 INFO [train.py:517] (0/4) Epoch 590, validation: discriminator_loss=2.695, discriminator_real_loss=1.171, discriminator_fake_loss=1.524, generator_loss=29.96, generator_mel_loss=20.78, generator_kl_loss=2.099, generator_dur_loss=1.633, generator_adv_loss=1.747, generator_feat_match_loss=3.7, over 100.00 samples.
2023-11-14 07:58:45,634 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 08:01:21,335 INFO [train.py:811] (0/4) Start epoch 591
2023-11-14 08:03:28,916 INFO [train.py:467] (0/4) Epoch 591, batch 20, global_batch_idx: 21850, batch size: 63, loss[discriminator_loss=2.371, discriminator_real_loss=1.184, discriminator_fake_loss=1.187, generator_loss=31.24, generator_mel_loss=20.13, generator_kl_loss=1.972, generator_dur_loss=1.662, generator_adv_loss=2.586, generator_feat_match_loss=4.891, over 63.00 samples.], tot_loss[discriminator_loss=2.548, discriminator_real_loss=1.275, discriminator_fake_loss=1.273, generator_loss=30.79, generator_mel_loss=20.55, generator_kl_loss=2.015, generator_dur_loss=1.655, generator_adv_loss=2.264, generator_feat_match_loss=4.305, over 1639.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2023-11-14 08:04:53,773 INFO [train.py:811] (0/4) Start epoch 592
2023-11-14 08:08:10,886 INFO [train.py:467] (0/4) Epoch 592, batch 33, global_batch_idx: 21900, batch size: 52, loss[discriminator_loss=2.543, discriminator_real_loss=1.383, discriminator_fake_loss=1.16, generator_loss=30.07, generator_mel_loss=20.31, generator_kl_loss=1.948, generator_dur_loss=1.65, generator_adv_loss=2.232, generator_feat_match_loss=3.922, over 52.00 samples.], tot_loss[discriminator_loss=2.589, discriminator_real_loss=1.311, discriminator_fake_loss=1.278, generator_loss=30.48, generator_mel_loss=20.45, generator_kl_loss=1.983, generator_dur_loss=1.652, generator_adv_loss=2.231, generator_feat_match_loss=4.168, over 2381.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2023-11-14 08:08:27,603 INFO [train.py:811] (0/4) Start epoch 593
2023-11-14 08:12:00,397 INFO [train.py:811] (0/4) Start epoch 594
2023-11-14 08:13:06,924 INFO [train.py:467] (0/4) Epoch 594, batch 9, global_batch_idx: 21950, batch size: 49, loss[discriminator_loss=2.629, discriminator_real_loss=1.172, discriminator_fake_loss=1.456, generator_loss=30.04, generator_mel_loss=20.29, generator_kl_loss=1.987, generator_dur_loss=1.649, generator_adv_loss=2.307, generator_feat_match_loss=3.799, over 49.00 samples.], tot_loss[discriminator_loss=2.389, discriminator_real_loss=1.237, discriminator_fake_loss=1.151, generator_loss=31.49, generator_mel_loss=20.38, generator_kl_loss=2.011, generator_dur_loss=1.652, generator_adv_loss=2.486, generator_feat_match_loss=4.961, over 707.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2023-11-14 08:15:28,722 INFO [train.py:811] (0/4) Start epoch 595
2023-11-14 08:17:43,021 INFO [train.py:467] (0/4) Epoch 595, batch 22, global_batch_idx: 22000, batch size: 65, loss[discriminator_loss=2.641, discriminator_real_loss=1.278, discriminator_fake_loss=1.361, generator_loss=29.82, generator_mel_loss=20.39, generator_kl_loss=1.971, generator_dur_loss=1.653, generator_adv_loss=1.956, generator_feat_match_loss=3.852, over 65.00 samples.], tot_loss[discriminator_loss=2.639, discriminator_real_loss=1.339, discriminator_fake_loss=1.3, generator_loss=30.12, generator_mel_loss=20.5, generator_kl_loss=1.999, generator_dur_loss=1.649, generator_adv_loss=2.094, generator_feat_match_loss=3.872, over 1703.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 08:17:43,590 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 08:17:54,307 INFO [train.py:517] (0/4) Epoch 595, validation: discriminator_loss=2.616, discriminator_real_loss=1.26, discriminator_fake_loss=1.356, generator_loss=30.43, generator_mel_loss=21.16, generator_kl_loss=2.069, generator_dur_loss=1.636, generator_adv_loss=1.879, generator_feat_match_loss=3.686, over 100.00 samples.
2023-11-14 08:17:54,308 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 08:19:07,833 INFO [train.py:811] (0/4) Start epoch 596
2023-11-14 08:22:32,994 INFO [train.py:467] (0/4) Epoch 596, batch 35, global_batch_idx: 22050, batch size: 52, loss[discriminator_loss=2.693, discriminator_real_loss=1.37, discriminator_fake_loss=1.323, generator_loss=30.2, generator_mel_loss=20.83, generator_kl_loss=2.004, generator_dur_loss=1.693, generator_adv_loss=2.033, generator_feat_match_loss=3.641, over 52.00 samples.], tot_loss[discriminator_loss=2.627, discriminator_real_loss=1.335, discriminator_fake_loss=1.292, generator_loss=30.24, generator_mel_loss=20.53, generator_kl_loss=2.001, generator_dur_loss=1.653, generator_adv_loss=2.148, generator_feat_match_loss=3.903, over 2496.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 08:22:38,224 INFO [train.py:811] (0/4) Start epoch 597
2023-11-14 08:26:08,961 INFO [train.py:811] (0/4) Start epoch 598
2023-11-14 08:27:25,779 INFO [train.py:467] (0/4) Epoch 598, batch 11, global_batch_idx: 22100, batch size: 67, loss[discriminator_loss=2.695, discriminator_real_loss=1.324, discriminator_fake_loss=1.37, generator_loss=30.34, generator_mel_loss=21.01, generator_kl_loss=1.976, generator_dur_loss=1.654, generator_adv_loss=1.828, generator_feat_match_loss=3.875, over 67.00 samples.], tot_loss[discriminator_loss=2.667, discriminator_real_loss=1.359, discriminator_fake_loss=1.308, generator_loss=30.07, generator_mel_loss=20.5, generator_kl_loss=1.949, generator_dur_loss=1.652, generator_adv_loss=2.128, generator_feat_match_loss=3.847, over 810.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 08:29:42,244 INFO [train.py:811] (0/4) Start epoch 599
2023-11-14 08:32:02,701 INFO [train.py:467] (0/4) Epoch 599, batch 24, global_batch_idx: 22150, batch size: 60, loss[discriminator_loss=2.67, discriminator_real_loss=1.49, discriminator_fake_loss=1.18, generator_loss=30.55, generator_mel_loss=20.33, generator_kl_loss=1.934, generator_dur_loss=1.655, generator_adv_loss=2.334, generator_feat_match_loss=4.297, over 60.00 samples.], tot_loss[discriminator_loss=2.598, discriminator_real_loss=1.321, discriminator_fake_loss=1.277, generator_loss=30.56, generator_mel_loss=20.56, generator_kl_loss=1.995, generator_dur_loss=1.649, generator_adv_loss=2.245, generator_feat_match_loss=4.12, over 1843.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 08:33:16,316 INFO [train.py:811] (0/4) Start epoch 600
2023-11-14 08:36:52,389 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-600.pt
2023-11-14 08:36:55,725 INFO [train.py:811] (0/4) Start epoch 601
2023-11-14 08:37:09,005 INFO [train.py:467] (0/4) Epoch 601, batch 0, global_batch_idx: 22200, batch size: 79, loss[discriminator_loss=2.561, discriminator_real_loss=1.347, discriminator_fake_loss=1.214, generator_loss=30.66, generator_mel_loss=20.7, generator_kl_loss=2.074, generator_dur_loss=1.642, generator_adv_loss=2.104, generator_feat_match_loss=4.133, over 79.00 samples.], tot_loss[discriminator_loss=2.561, discriminator_real_loss=1.347, discriminator_fake_loss=1.214, generator_loss=30.66, generator_mel_loss=20.7, generator_kl_loss=2.074, generator_dur_loss=1.642, generator_adv_loss=2.104, generator_feat_match_loss=4.133, over 79.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 08:37:09,485 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 08:37:20,803 INFO [train.py:517] (0/4) Epoch 601, validation: discriminator_loss=2.585, discriminator_real_loss=1.242, discriminator_fake_loss=1.343, generator_loss=30.81, generator_mel_loss=20.9, generator_kl_loss=2.05, generator_dur_loss=1.638, generator_adv_loss=2.001, generator_feat_match_loss=4.224, over 100.00 samples.
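Each loss[...] and tot_loss[...] block in these records is internally consistent: discriminator_loss is the sum of discriminator_real_loss and discriminator_fake_loss, and generator_loss is the sum of the five generator terms (mel, kl, dur, adv, feat_match), each already multiplied by its loss weight before logging. A minimal sketch of a consistency check for this record format follows; the regex and the rounding tolerance are illustrative assumptions, not part of train.py.

import math
import re

# One record copied verbatim from the Epoch 601, batch 0 entry above.
record = (
    "loss[discriminator_loss=2.561, discriminator_real_loss=1.347, "
    "discriminator_fake_loss=1.214, generator_loss=30.66, generator_mel_loss=20.7, "
    "generator_kl_loss=2.074, generator_dur_loss=1.642, generator_adv_loss=2.104, "
    "generator_feat_match_loss=4.133, over 79.00 samples.]"
)

# Pull out the "name=value" pairs.
vals = {k: float(v) for k, v in re.findall(r"(\w+)=([\d.]+)", record)}

# discriminator_loss = real + fake
assert math.isclose(
    vals["discriminator_loss"],
    vals["discriminator_real_loss"] + vals["discriminator_fake_loss"],
    abs_tol=0.05,  # logged values are rounded to ~4 significant digits
)

# generator_loss = mel + kl + dur + adv + feat_match (each term already weighted)
gen_sum = sum(
    vals[k]
    for k in (
        "generator_mel_loss",
        "generator_kl_loss",
        "generator_dur_loss",
        "generator_adv_loss",
        "generator_feat_match_loss",
    )
)
assert math.isclose(vals["generator_loss"], gen_sum, abs_tol=0.05)
print("record is internally consistent")

The same check applies to the validation records; only the sample count (over 100.00 samples) differs.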
2023-11-14 08:37:20,804 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 08:40:40,254 INFO [train.py:811] (0/4) Start epoch 602
2023-11-14 08:42:07,720 INFO [train.py:467] (0/4) Epoch 602, batch 13, global_batch_idx: 22250, batch size: 110, loss[discriminator_loss=2.652, discriminator_real_loss=1.503, discriminator_fake_loss=1.148, generator_loss=30.97, generator_mel_loss=20.96, generator_kl_loss=2.099, generator_dur_loss=1.656, generator_adv_loss=2.211, generator_feat_match_loss=4.051, over 110.00 samples.], tot_loss[discriminator_loss=2.62, discriminator_real_loss=1.345, discriminator_fake_loss=1.275, generator_loss=30.86, generator_mel_loss=20.92, generator_kl_loss=2.017, generator_dur_loss=1.652, generator_adv_loss=2.188, generator_feat_match_loss=4.08, over 1003.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 08:44:12,828 INFO [train.py:811] (0/4) Start epoch 603
2023-11-14 08:46:49,343 INFO [train.py:467] (0/4) Epoch 603, batch 26, global_batch_idx: 22300, batch size: 65, loss[discriminator_loss=2.551, discriminator_real_loss=1.245, discriminator_fake_loss=1.306, generator_loss=30.26, generator_mel_loss=20.48, generator_kl_loss=1.919, generator_dur_loss=1.656, generator_adv_loss=2.051, generator_feat_match_loss=4.156, over 65.00 samples.], tot_loss[discriminator_loss=2.615, discriminator_real_loss=1.334, discriminator_fake_loss=1.282, generator_loss=30.64, generator_mel_loss=20.59, generator_kl_loss=1.998, generator_dur_loss=1.655, generator_adv_loss=2.246, generator_feat_match_loss=4.147, over 1964.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2023-11-14 08:47:48,222 INFO [train.py:811] (0/4) Start epoch 604
2023-11-14 08:51:19,149 INFO [train.py:811] (0/4) Start epoch 605
2023-11-14 08:51:43,952 INFO [train.py:467] (0/4) Epoch 605, batch 2, global_batch_idx: 22350, batch size: 51, loss[discriminator_loss=2.594, discriminator_real_loss=1.367, discriminator_fake_loss=1.228, generator_loss=30.68, generator_mel_loss=20.91, generator_kl_loss=1.881, generator_dur_loss=1.664, generator_adv_loss=2.043, generator_feat_match_loss=4.188, over 51.00 samples.], tot_loss[discriminator_loss=2.618, discriminator_real_loss=1.371, discriminator_fake_loss=1.248, generator_loss=30.45, generator_mel_loss=20.59, generator_kl_loss=2.008, generator_dur_loss=1.657, generator_adv_loss=2.171, generator_feat_match_loss=4.015, over 214.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 08:54:52,248 INFO [train.py:811] (0/4) Start epoch 606
2023-11-14 08:56:29,682 INFO [train.py:467] (0/4) Epoch 606, batch 15, global_batch_idx: 22400, batch size: 50, loss[discriminator_loss=2.613, discriminator_real_loss=1.34, discriminator_fake_loss=1.273, generator_loss=30.37, generator_mel_loss=20.61, generator_kl_loss=1.957, generator_dur_loss=1.646, generator_adv_loss=2.127, generator_feat_match_loss=4.031, over 50.00 samples.], tot_loss[discriminator_loss=2.656, discriminator_real_loss=1.346, discriminator_fake_loss=1.31, generator_loss=30.53, generator_mel_loss=20.82, generator_kl_loss=2.005, generator_dur_loss=1.659, generator_adv_loss=2.113, generator_feat_match_loss=3.938, over 1162.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 16.0
2023-11-14 08:56:30,181 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 08:56:40,711 INFO [train.py:517] (0/4) Epoch 606, validation: discriminator_loss=2.547, discriminator_real_loss=1.237, discriminator_fake_loss=1.31, generator_loss=30.86, generator_mel_loss=21.06, generator_kl_loss=2.2, generator_dur_loss=1.637, generator_adv_loss=2.034, generator_feat_match_loss=3.924, over 100.00 samples.
2023-11-14 08:56:40,712 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 08:58:38,593 INFO [train.py:811] (0/4) Start epoch 607
2023-11-14 09:01:30,564 INFO [train.py:467] (0/4) Epoch 607, batch 28, global_batch_idx: 22450, batch size: 52, loss[discriminator_loss=2.604, discriminator_real_loss=1.348, discriminator_fake_loss=1.256, generator_loss=30.65, generator_mel_loss=20.41, generator_kl_loss=2.019, generator_dur_loss=1.652, generator_adv_loss=2.303, generator_feat_match_loss=4.273, over 52.00 samples.], tot_loss[discriminator_loss=2.623, discriminator_real_loss=1.329, discriminator_fake_loss=1.294, generator_loss=30.36, generator_mel_loss=20.57, generator_kl_loss=2, generator_dur_loss=1.65, generator_adv_loss=2.142, generator_feat_match_loss=3.993, over 2165.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 16.0
2023-11-14 09:02:07,457 INFO [train.py:811] (0/4) Start epoch 608
2023-11-14 09:05:37,458 INFO [train.py:811] (0/4) Start epoch 609
2023-11-14 09:06:13,803 INFO [train.py:467] (0/4) Epoch 609, batch 4, global_batch_idx: 22500, batch size: 73, loss[discriminator_loss=2.629, discriminator_real_loss=1.361, discriminator_fake_loss=1.267, generator_loss=30.45, generator_mel_loss=20.61, generator_kl_loss=1.99, generator_dur_loss=1.66, generator_adv_loss=2.229, generator_feat_match_loss=3.961, over 73.00 samples.], tot_loss[discriminator_loss=2.671, discriminator_real_loss=1.339, discriminator_fake_loss=1.332, generator_loss=30.46, generator_mel_loss=20.84, generator_kl_loss=1.981, generator_dur_loss=1.651, generator_adv_loss=2.106, generator_feat_match_loss=3.881, over 353.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 16.0
2023-11-14 09:09:12,629 INFO [train.py:811] (0/4) Start epoch 610
2023-11-14 09:10:59,050 INFO [train.py:467] (0/4) Epoch 610, batch 17, global_batch_idx: 22550, batch size: 71, loss[discriminator_loss=2.453, discriminator_real_loss=1.228, discriminator_fake_loss=1.226, generator_loss=32.27, generator_mel_loss=21.03, generator_kl_loss=1.991, generator_dur_loss=1.651, generator_adv_loss=2.439, generator_feat_match_loss=5.156, over 71.00 samples.], tot_loss[discriminator_loss=2.531, discriminator_real_loss=1.275, discriminator_fake_loss=1.256, generator_loss=30.9, generator_mel_loss=20.66, generator_kl_loss=1.987, generator_dur_loss=1.654, generator_adv_loss=2.289, generator_feat_match_loss=4.31, over 1375.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 09:12:50,102 INFO [train.py:811] (0/4) Start epoch 611
2023-11-14 09:15:56,619 INFO [train.py:467] (0/4) Epoch 611, batch 30, global_batch_idx: 22600, batch size: 64, loss[discriminator_loss=2.535, discriminator_real_loss=1.266, discriminator_fake_loss=1.269, generator_loss=30.11, generator_mel_loss=20.3, generator_kl_loss=2.029, generator_dur_loss=1.641, generator_adv_loss=2.08, generator_feat_match_loss=4.051, over 64.00 samples.], tot_loss[discriminator_loss=2.55, discriminator_real_loss=1.286, discriminator_fake_loss=1.264, generator_loss=30.35, generator_mel_loss=20.27, generator_kl_loss=1.989, generator_dur_loss=1.649, generator_adv_loss=2.252, generator_feat_match_loss=4.192, over 2385.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 09:15:57,111 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 09:16:07,668 INFO [train.py:517] (0/4) Epoch 611, validation: discriminator_loss=2.545, discriminator_real_loss=1.245, discriminator_fake_loss=1.3, generator_loss=30.97, generator_mel_loss=20.94, generator_kl_loss=2.134, generator_dur_loss=1.63, generator_adv_loss=2.076, generator_feat_match_loss=4.18, over 100.00 samples.
2023-11-14 09:16:07,669 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 09:16:34,363 INFO [train.py:811] (0/4) Start epoch 612
2023-11-14 09:20:06,645 INFO [train.py:811] (0/4) Start epoch 613
2023-11-14 09:20:51,692 INFO [train.py:467] (0/4) Epoch 613, batch 6, global_batch_idx: 22650, batch size: 154, loss[discriminator_loss=2.582, discriminator_real_loss=1.267, discriminator_fake_loss=1.314, generator_loss=31.11, generator_mel_loss=20.6, generator_kl_loss=2.137, generator_dur_loss=1.646, generator_adv_loss=2.264, generator_feat_match_loss=4.469, over 154.00 samples.], tot_loss[discriminator_loss=2.526, discriminator_real_loss=1.273, discriminator_fake_loss=1.253, generator_loss=31.05, generator_mel_loss=20.41, generator_kl_loss=2.081, generator_dur_loss=1.663, generator_adv_loss=2.368, generator_feat_match_loss=4.532, over 496.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 09:23:36,801 INFO [train.py:811] (0/4) Start epoch 614
2023-11-14 09:25:41,066 INFO [train.py:467] (0/4) Epoch 614, batch 19, global_batch_idx: 22700, batch size: 126, loss[discriminator_loss=2.537, discriminator_real_loss=1.317, discriminator_fake_loss=1.22, generator_loss=31.43, generator_mel_loss=20.97, generator_kl_loss=2.006, generator_dur_loss=1.638, generator_adv_loss=2.326, generator_feat_match_loss=4.496, over 126.00 samples.], tot_loss[discriminator_loss=2.512, discriminator_real_loss=1.271, discriminator_fake_loss=1.24, generator_loss=30.45, generator_mel_loss=20.13, generator_kl_loss=1.981, generator_dur_loss=1.648, generator_adv_loss=2.305, generator_feat_match_loss=4.394, over 1339.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 09:27:09,731 INFO [train.py:811] (0/4) Start epoch 615
2023-11-14 09:30:16,623 INFO [train.py:467] (0/4) Epoch 615, batch 32, global_batch_idx: 22750, batch size: 51, loss[discriminator_loss=2.746, discriminator_real_loss=1.225, discriminator_fake_loss=1.521, generator_loss=30.64, generator_mel_loss=20.42, generator_kl_loss=2.044, generator_dur_loss=1.676, generator_adv_loss=2.412, generator_feat_match_loss=4.086, over 51.00 samples.], tot_loss[discriminator_loss=2.605, discriminator_real_loss=1.323, discriminator_fake_loss=1.282, generator_loss=30.49, generator_mel_loss=20.53, generator_kl_loss=2.001, generator_dur_loss=1.648, generator_adv_loss=2.189, generator_feat_match_loss=4.12, over 2468.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 09:30:41,023 INFO [train.py:811] (0/4) Start epoch 616
2023-11-14 09:34:12,987 INFO [train.py:811] (0/4) Start epoch 617
2023-11-14 09:35:08,966 INFO [train.py:467] (0/4) Epoch 617, batch 8, global_batch_idx: 22800, batch size: 59, loss[discriminator_loss=2.648, discriminator_real_loss=1.35, discriminator_fake_loss=1.3, generator_loss=29.97, generator_mel_loss=20.53, generator_kl_loss=2.004, generator_dur_loss=1.648, generator_adv_loss=2.078, generator_feat_match_loss=3.713, over 59.00 samples.], tot_loss[discriminator_loss=2.59, discriminator_real_loss=1.32, discriminator_fake_loss=1.269, generator_loss=30.57, generator_mel_loss=20.7, generator_kl_loss=2.027, generator_dur_loss=1.657, generator_adv_loss=2.157, generator_feat_match_loss=4.023, over 655.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 16.0
2023-11-14 09:35:09,682 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 09:35:20,813 INFO [train.py:517] (0/4) Epoch 617, validation: discriminator_loss=2.624, discriminator_real_loss=1.329, discriminator_fake_loss=1.295, generator_loss=31.14, generator_mel_loss=21.46, generator_kl_loss=2.126, generator_dur_loss=1.642, generator_adv_loss=2.023, generator_feat_match_loss=3.887, over 100.00 samples.
2023-11-14 09:35:20,814 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 09:38:01,969 INFO [train.py:811] (0/4) Start epoch 618
2023-11-14 09:40:09,689 INFO [train.py:467] (0/4) Epoch 618, batch 21, global_batch_idx: 22850, batch size: 101, loss[discriminator_loss=2.602, discriminator_real_loss=1.333, discriminator_fake_loss=1.268, generator_loss=30.98, generator_mel_loss=20.89, generator_kl_loss=2.054, generator_dur_loss=1.631, generator_adv_loss=2.285, generator_feat_match_loss=4.121, over 101.00 samples.], tot_loss[discriminator_loss=2.631, discriminator_real_loss=1.33, discriminator_fake_loss=1.301, generator_loss=30.41, generator_mel_loss=20.6, generator_kl_loss=2.018, generator_dur_loss=1.647, generator_adv_loss=2.181, generator_feat_match_loss=3.963, over 1538.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 16.0
2023-11-14 09:41:29,415 INFO [train.py:811] (0/4) Start epoch 619
2023-11-14 09:44:53,781 INFO [train.py:467] (0/4) Epoch 619, batch 34, global_batch_idx: 22900, batch size: 69, loss[discriminator_loss=2.496, discriminator_real_loss=1.31, discriminator_fake_loss=1.188, generator_loss=31.07, generator_mel_loss=20.13, generator_kl_loss=2.011, generator_dur_loss=1.679, generator_adv_loss=2.254, generator_feat_match_loss=5, over 69.00 samples.], tot_loss[discriminator_loss=2.513, discriminator_real_loss=1.288, discriminator_fake_loss=1.225, generator_loss=31.03, generator_mel_loss=20.31, generator_kl_loss=1.986, generator_dur_loss=1.646, generator_adv_loss=2.393, generator_feat_match_loss=4.692, over 2810.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 09:45:03,844 INFO [train.py:811] (0/4) Start epoch 620
2023-11-14 09:48:40,912 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-620.pt
2023-11-14 09:48:44,263 INFO [train.py:811] (0/4) Start epoch 621
2023-11-14 09:49:58,335 INFO [train.py:467] (0/4) Epoch 621, batch 10, global_batch_idx: 22950, batch size: 50, loss[discriminator_loss=2.605, discriminator_real_loss=1.257, discriminator_fake_loss=1.35, generator_loss=30.45, generator_mel_loss=20.72, generator_kl_loss=1.965, generator_dur_loss=1.654, generator_adv_loss=2.275, generator_feat_match_loss=3.836, over 50.00 samples.], tot_loss[discriminator_loss=2.561, discriminator_real_loss=1.276, discriminator_fake_loss=1.286, generator_loss=30.43, generator_mel_loss=20.62, generator_kl_loss=1.961, generator_dur_loss=1.652, generator_adv_loss=2.141, generator_feat_match_loss=4.062, over 775.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 09:52:13,974 INFO [train.py:811] (0/4) Start epoch 622
2023-11-14 09:54:25,813 INFO [train.py:467] (0/4) Epoch 622, batch 23, global_batch_idx: 23000, batch size: 52, loss[discriminator_loss=2.703, discriminator_real_loss=1.415, discriminator_fake_loss=1.289, generator_loss=30.19, generator_mel_loss=20.63, generator_kl_loss=1.944, generator_dur_loss=1.636, generator_adv_loss=2.188, generator_feat_match_loss=3.785, over 52.00 samples.], tot_loss[discriminator_loss=2.635, discriminator_real_loss=1.333, discriminator_fake_loss=1.302, generator_loss=30.28, generator_mel_loss=20.55, generator_kl_loss=1.999, generator_dur_loss=1.645, generator_adv_loss=2.126, generator_feat_match_loss=3.959, over 1811.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 09:54:26,355 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 09:54:37,391 INFO [train.py:517] (0/4) Epoch 622, validation: discriminator_loss=2.656, discriminator_real_loss=1.36, discriminator_fake_loss=1.297, generator_loss=31.1, generator_mel_loss=21.42, generator_kl_loss=2.152, generator_dur_loss=1.639, generator_adv_loss=1.983, generator_feat_match_loss=3.901, over 100.00 samples.
2023-11-14 09:54:37,392 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 09:55:57,022 INFO [train.py:811] (0/4) Start epoch 623
2023-11-14 09:59:26,703 INFO [train.py:467] (0/4) Epoch 623, batch 36, global_batch_idx: 23050, batch size: 69, loss[discriminator_loss=2.564, discriminator_real_loss=1.201, discriminator_fake_loss=1.363, generator_loss=30.76, generator_mel_loss=20.53, generator_kl_loss=2.016, generator_dur_loss=1.647, generator_adv_loss=2.297, generator_feat_match_loss=4.273, over 69.00 samples.], tot_loss[discriminator_loss=2.651, discriminator_real_loss=1.351, discriminator_fake_loss=1.299, generator_loss=30.31, generator_mel_loss=20.59, generator_kl_loss=1.995, generator_dur_loss=1.65, generator_adv_loss=2.148, generator_feat_match_loss=3.933, over 2705.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 09:59:27,836 INFO [train.py:811] (0/4) Start epoch 624
2023-11-14 10:02:54,685 INFO [train.py:811] (0/4) Start epoch 625
2023-11-14 10:04:14,202 INFO [train.py:467] (0/4) Epoch 625, batch 12, global_batch_idx: 23100, batch size: 110, loss[discriminator_loss=2.602, discriminator_real_loss=1.246, discriminator_fake_loss=1.356, generator_loss=30.96, generator_mel_loss=20.57, generator_kl_loss=1.965, generator_dur_loss=1.661, generator_adv_loss=2.416, generator_feat_match_loss=4.348, over 110.00 samples.], tot_loss[discriminator_loss=2.597, discriminator_real_loss=1.322, discriminator_fake_loss=1.276, generator_loss=30.75, generator_mel_loss=20.5, generator_kl_loss=2, generator_dur_loss=1.648, generator_adv_loss=2.297, generator_feat_match_loss=4.303, over 920.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 10:06:27,632 INFO [train.py:811] (0/4) Start epoch 626
2023-11-14 10:08:53,799 INFO [train.py:467] (0/4) Epoch 626, batch 25, global_batch_idx: 23150, batch size: 50, loss[discriminator_loss=2.584, discriminator_real_loss=1.244, discriminator_fake_loss=1.34, generator_loss=30.38, generator_mel_loss=20.51, generator_kl_loss=1.966, generator_dur_loss=1.683, generator_adv_loss=2.186, generator_feat_match_loss=4.043, over 50.00 samples.], tot_loss[discriminator_loss=2.557, discriminator_real_loss=1.281, discriminator_fake_loss=1.276, generator_loss=30.48, generator_mel_loss=20.4, generator_kl_loss=2, generator_dur_loss=1.655, generator_adv_loss=2.199, generator_feat_match_loss=4.229, over 1882.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 10:09:57,285 INFO [train.py:811] (0/4) Start epoch 627
2023-11-14 10:13:35,050 INFO [train.py:811] (0/4) Start epoch 628
2023-11-14 10:13:57,587 INFO [train.py:467] (0/4) Epoch 628, batch 1, global_batch_idx: 23200, batch size: 101, loss[discriminator_loss=2.824, discriminator_real_loss=1.406, discriminator_fake_loss=1.417, generator_loss=29.31, generator_mel_loss=20.05, generator_kl_loss=2.049, generator_dur_loss=1.626, generator_adv_loss=2.02, generator_feat_match_loss=3.562, over 101.00 samples.], tot_loss[discriminator_loss=2.805, discriminator_real_loss=1.376, discriminator_fake_loss=1.429, generator_loss=29.22, generator_mel_loss=19.96, generator_kl_loss=2.022, generator_dur_loss=1.635, generator_adv_loss=2.026, generator_feat_match_loss=3.584, over 150.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 16.0
2023-11-14 10:13:58,156 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 10:14:09,968 INFO [train.py:517] (0/4) Epoch 628, validation: discriminator_loss=2.78, discriminator_real_loss=1.448, discriminator_fake_loss=1.331, generator_loss=30.2, generator_mel_loss=20.83, generator_kl_loss=2.13, generator_dur_loss=1.641, generator_adv_loss=1.933, generator_feat_match_loss=3.66, over 100.00 samples.
2023-11-14 10:14:09,969 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 10:17:15,587 INFO [train.py:811] (0/4) Start epoch 629
2023-11-14 10:18:45,977 INFO [train.py:467] (0/4) Epoch 629, batch 14, global_batch_idx: 23250, batch size: 79, loss[discriminator_loss=2.562, discriminator_real_loss=1.229, discriminator_fake_loss=1.334, generator_loss=30.84, generator_mel_loss=20.54, generator_kl_loss=2.077, generator_dur_loss=1.643, generator_adv_loss=2.355, generator_feat_match_loss=4.227, over 79.00 samples.], tot_loss[discriminator_loss=2.562, discriminator_real_loss=1.302, discriminator_fake_loss=1.26, generator_loss=30.31, generator_mel_loss=20.42, generator_kl_loss=2.001, generator_dur_loss=1.649, generator_adv_loss=2.167, generator_feat_match_loss=4.074, over 972.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 16.0
2023-11-14 10:20:46,992 INFO [train.py:811] (0/4) Start epoch 630
2023-11-14 10:23:35,014 INFO [train.py:467] (0/4) Epoch 630, batch 27, global_batch_idx: 23300, batch size: 57, loss[discriminator_loss=2.648, discriminator_real_loss=1.355, discriminator_fake_loss=1.294, generator_loss=29.85, generator_mel_loss=19.94, generator_kl_loss=1.943, generator_dur_loss=1.663, generator_adv_loss=2.332, generator_feat_match_loss=3.98, over 57.00 samples.], tot_loss[discriminator_loss=2.587, discriminator_real_loss=1.308, discriminator_fake_loss=1.279, generator_loss=30.18, generator_mel_loss=20.19, generator_kl_loss=2.014, generator_dur_loss=1.649, generator_adv_loss=2.188, generator_feat_match_loss=4.14, over 2194.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 16.0
2023-11-14 10:24:20,078 INFO [train.py:811] (0/4) Start epoch 631
2023-11-14 10:27:52,545 INFO [train.py:811] (0/4) Start epoch 632
2023-11-14 10:28:21,178 INFO [train.py:467] (0/4) Epoch 632, batch 3, global_batch_idx: 23350, batch size: 63, loss[discriminator_loss=2.383, discriminator_real_loss=1.236, discriminator_fake_loss=1.146, generator_loss=31.31, generator_mel_loss=20.23, generator_kl_loss=1.936, generator_dur_loss=1.653, generator_adv_loss=2.41, generator_feat_match_loss=5.078, over 63.00 samples.], tot_loss[discriminator_loss=2.531, discriminator_real_loss=1.335, discriminator_fake_loss=1.196, generator_loss=30.58, generator_mel_loss=20.14, generator_kl_loss=1.97, generator_dur_loss=1.662, generator_adv_loss=2.422, generator_feat_match_loss=4.39, over 249.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 10:31:22,587 INFO [train.py:811] (0/4) Start epoch 633
2023-11-14 10:33:10,319 INFO [train.py:467] (0/4) Epoch 633, batch 16, global_batch_idx: 23400, batch size: 52, loss[discriminator_loss=2.617, discriminator_real_loss=1.337, discriminator_fake_loss=1.281, generator_loss=30.17, generator_mel_loss=20.4, generator_kl_loss=2.004, generator_dur_loss=1.634, generator_adv_loss=2.174, generator_feat_match_loss=3.957, over 52.00 samples.], tot_loss[discriminator_loss=2.576, discriminator_real_loss=1.289, discriminator_fake_loss=1.287, generator_loss=30.38, generator_mel_loss=20.42, generator_kl_loss=2.049, generator_dur_loss=1.649, generator_adv_loss=2.153, generator_feat_match_loss=4.11, over 1290.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 10:33:10,848 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 10:33:21,993 INFO [train.py:517] (0/4) Epoch 633, validation: discriminator_loss=2.527, discriminator_real_loss=1.207, discriminator_fake_loss=1.32, generator_loss=30.54, generator_mel_loss=20.69, generator_kl_loss=2.167, generator_dur_loss=1.634, generator_adv_loss=2.041, generator_feat_match_loss=4.006, over 100.00 samples.
2023-11-14 10:33:21,995 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 10:35:03,982 INFO [train.py:811] (0/4) Start epoch 634
2023-11-14 10:37:54,456 INFO [train.py:467] (0/4) Epoch 634, batch 29, global_batch_idx: 23450, batch size: 61, loss[discriminator_loss=2.549, discriminator_real_loss=1.318, discriminator_fake_loss=1.23, generator_loss=29.3, generator_mel_loss=19.56, generator_kl_loss=1.967, generator_dur_loss=1.652, generator_adv_loss=2.189, generator_feat_match_loss=3.928, over 61.00 samples.], tot_loss[discriminator_loss=2.559, discriminator_real_loss=1.277, discriminator_fake_loss=1.282, generator_loss=30.61, generator_mel_loss=20.34, generator_kl_loss=1.991, generator_dur_loss=1.645, generator_adv_loss=2.287, generator_feat_match_loss=4.347, over 2301.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 10:38:37,695 INFO [train.py:811] (0/4) Start epoch 635
2023-11-14 10:42:19,061 INFO [train.py:811] (0/4) Start epoch 636
2023-11-14 10:42:58,855 INFO [train.py:467] (0/4) Epoch 636, batch 5, global_batch_idx: 23500, batch size: 79, loss[discriminator_loss=2.309, discriminator_real_loss=1.121, discriminator_fake_loss=1.188, generator_loss=31.59, generator_mel_loss=19.93, generator_kl_loss=1.971, generator_dur_loss=1.618, generator_adv_loss=2.775, generator_feat_match_loss=5.297, over 79.00 samples.], tot_loss[discriminator_loss=2.369, discriminator_real_loss=1.234, discriminator_fake_loss=1.135, generator_loss=30.9, generator_mel_loss=19.96, generator_kl_loss=1.973, generator_dur_loss=1.646, generator_adv_loss=2.443, generator_feat_match_loss=4.885, over 375.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 10:45:55,020 INFO [train.py:811] (0/4) Start epoch 637
2023-11-14 10:47:54,918 INFO [train.py:467] (0/4) Epoch 637, batch 18, global_batch_idx: 23550, batch size: 71, loss[discriminator_loss=2.633, discriminator_real_loss=1.186, discriminator_fake_loss=1.446, generator_loss=29.88, generator_mel_loss=20.27, generator_kl_loss=2.086, generator_dur_loss=1.629, generator_adv_loss=2.053, generator_feat_match_loss=3.844, over 71.00 samples.], tot_loss[discriminator_loss=2.622, discriminator_real_loss=1.349, discriminator_fake_loss=1.273, generator_loss=30.33, generator_mel_loss=20.45, generator_kl_loss=2.018, generator_dur_loss=1.644, generator_adv_loss=2.136, generator_feat_match_loss=4.084, over 1564.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 10:49:25,727 INFO [train.py:811] (0/4) Start epoch 638
2023-11-14 10:52:32,079 INFO [train.py:467] (0/4) Epoch 638, batch 31, global_batch_idx: 23600, batch size: 76, loss[discriminator_loss=2.652, discriminator_real_loss=1.288, discriminator_fake_loss=1.363, generator_loss=30.62, generator_mel_loss=20.71, generator_kl_loss=1.996, generator_dur_loss=1.653, generator_adv_loss=2.143, generator_feat_match_loss=4.121, over 76.00 samples.], tot_loss[discriminator_loss=2.623, discriminator_real_loss=1.329, discriminator_fake_loss=1.294, generator_loss=30.48, generator_mel_loss=20.59, generator_kl_loss=2.038, generator_dur_loss=1.645, generator_adv_loss=2.138, generator_feat_match_loss=4.064, over 2349.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 16.0
2023-11-14 10:52:32,718 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 10:52:43,774 INFO [train.py:517] (0/4) Epoch 638, validation: discriminator_loss=2.672, discriminator_real_loss=1.26, discriminator_fake_loss=1.412, generator_loss=30.22, generator_mel_loss=20.97, generator_kl_loss=2.202, generator_dur_loss=1.635, generator_adv_loss=1.772, generator_feat_match_loss=3.642, over 100.00 samples.
2023-11-14 10:52:43,775 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 10:53:10,291 INFO [train.py:811] (0/4) Start epoch 639
2023-11-14 10:56:44,420 INFO [train.py:811] (0/4) Start epoch 640
2023-11-14 10:57:40,773 INFO [train.py:467] (0/4) Epoch 640, batch 7, global_batch_idx: 23650, batch size: 71, loss[discriminator_loss=2.533, discriminator_real_loss=1.156, discriminator_fake_loss=1.377, generator_loss=30.49, generator_mel_loss=20.24, generator_kl_loss=1.916, generator_dur_loss=1.634, generator_adv_loss=2.145, generator_feat_match_loss=4.559, over 71.00 samples.], tot_loss[discriminator_loss=2.633, discriminator_real_loss=1.322, discriminator_fake_loss=1.31, generator_loss=30.37, generator_mel_loss=20.39, generator_kl_loss=1.982, generator_dur_loss=1.652, generator_adv_loss=2.212, generator_feat_match_loss=4.14, over 652.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 16.0
2023-11-14 11:00:13,388 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-640.pt
2023-11-14 11:00:16,818 INFO [train.py:811] (0/4) Start epoch 641
2023-11-14 11:02:21,391 INFO [train.py:467] (0/4) Epoch 641, batch 20, global_batch_idx: 23700, batch size: 56, loss[discriminator_loss=2.68, discriminator_real_loss=1.43, discriminator_fake_loss=1.25, generator_loss=30.36, generator_mel_loss=20.43, generator_kl_loss=1.954, generator_dur_loss=1.648, generator_adv_loss=2.203, generator_feat_match_loss=4.117, over 56.00 samples.], tot_loss[discriminator_loss=2.602, discriminator_real_loss=1.33, discriminator_fake_loss=1.271, generator_loss=30.45, generator_mel_loss=20.4, generator_kl_loss=2.007, generator_dur_loss=1.646, generator_adv_loss=2.203, generator_feat_match_loss=4.186, over 1640.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 16.0
2023-11-14 11:03:49,080 INFO [train.py:811] (0/4) Start epoch 642
2023-11-14 11:07:06,494 INFO [train.py:467] (0/4) Epoch 642, batch 33, global_batch_idx: 23750, batch size: 73, loss[discriminator_loss=2.482, discriminator_real_loss=1.332, discriminator_fake_loss=1.15, generator_loss=30.42, generator_mel_loss=20.42, generator_kl_loss=2.027, generator_dur_loss=1.649, generator_adv_loss=2.146, generator_feat_match_loss=4.18, over 73.00 samples.], tot_loss[discriminator_loss=2.562, discriminator_real_loss=1.305, discriminator_fake_loss=1.258, generator_loss=30.58, generator_mel_loss=20.28, generator_kl_loss=2.015, generator_dur_loss=1.645, generator_adv_loss=2.298, generator_feat_match_loss=4.35, over 2394.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 11:07:21,935 INFO [train.py:811] (0/4) Start epoch 643
2023-11-14 11:10:59,624 INFO [train.py:811] (0/4) Start epoch 644
2023-11-14 11:12:06,790 INFO [train.py:467] (0/4) Epoch 644, batch 9, global_batch_idx: 23800, batch size: 76, loss[discriminator_loss=2.338, discriminator_real_loss=1.125, discriminator_fake_loss=1.213, generator_loss=31.22, generator_mel_loss=20, generator_kl_loss=2.019, generator_dur_loss=1.65, generator_adv_loss=2.465, generator_feat_match_loss=5.086, over 76.00 samples.], tot_loss[discriminator_loss=2.442, discriminator_real_loss=1.241, discriminator_fake_loss=1.201, generator_loss=30.95, generator_mel_loss=20.07, generator_kl_loss=1.998, generator_dur_loss=1.648, generator_adv_loss=2.442, generator_feat_match_loss=4.791, over 706.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 11:12:07,475 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 11:12:18,215 INFO [train.py:517] (0/4) Epoch 644, validation: discriminator_loss=2.394, discriminator_real_loss=1.111, discriminator_fake_loss=1.283, generator_loss=31.18, generator_mel_loss=20.58, generator_kl_loss=2.156, generator_dur_loss=1.638, generator_adv_loss=1.979, generator_feat_match_loss=4.832, over 100.00 samples.
2023-11-14 11:12:18,216 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 11:14:49,433 INFO [train.py:811] (0/4) Start epoch 645
2023-11-14 11:17:05,345 INFO [train.py:467] (0/4) Epoch 645, batch 22, global_batch_idx: 23850, batch size: 49, loss[discriminator_loss=2.527, discriminator_real_loss=1.306, discriminator_fake_loss=1.223, generator_loss=30.09, generator_mel_loss=20.31, generator_kl_loss=2.037, generator_dur_loss=1.655, generator_adv_loss=2.205, generator_feat_match_loss=3.887, over 49.00 samples.], tot_loss[discriminator_loss=2.571, discriminator_real_loss=1.307, discriminator_fake_loss=1.264, generator_loss=30.39, generator_mel_loss=20.15, generator_kl_loss=2.006, generator_dur_loss=1.649, generator_adv_loss=2.241, generator_feat_match_loss=4.343, over 1659.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 11:18:21,254 INFO [train.py:811] (0/4) Start epoch 646
2023-11-14 11:21:51,468 INFO [train.py:467] (0/4) Epoch 646, batch 35, global_batch_idx: 23900, batch size: 85, loss[discriminator_loss=2.725, discriminator_real_loss=1.481, discriminator_fake_loss=1.243, generator_loss=29.85, generator_mel_loss=20.17, generator_kl_loss=1.972, generator_dur_loss=1.669, generator_adv_loss=2.273, generator_feat_match_loss=3.766, over 85.00 samples.], tot_loss[discriminator_loss=2.605, discriminator_real_loss=1.327, discriminator_fake_loss=1.278, generator_loss=30.58, generator_mel_loss=20.54, generator_kl_loss=2.046, generator_dur_loss=1.644, generator_adv_loss=2.172, generator_feat_match_loss=4.179, over 2769.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2023-11-14 11:21:56,629 INFO [train.py:811] (0/4) Start epoch 647
2023-11-14 11:25:22,520 INFO [train.py:811] (0/4) Start epoch 648
2023-11-14 11:26:36,090 INFO [train.py:467] (0/4) Epoch 648, batch 11, global_batch_idx: 23950, batch size: 53, loss[discriminator_loss=2.619, discriminator_real_loss=1.338, discriminator_fake_loss=1.281, generator_loss=29.89, generator_mel_loss=20.23, generator_kl_loss=2.05, generator_dur_loss=1.664, generator_adv_loss=2.105, generator_feat_match_loss=3.844, over 53.00 samples.], tot_loss[discriminator_loss=2.628, discriminator_real_loss=1.334, discriminator_fake_loss=1.294, generator_loss=30.11, generator_mel_loss=20.35, generator_kl_loss=1.972, generator_dur_loss=1.647, generator_adv_loss=2.133, generator_feat_match_loss=4, over 825.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2023-11-14 11:28:54,973 INFO [train.py:811] (0/4) Start epoch 649
2023-11-14 11:31:26,759 INFO [train.py:467] (0/4) Epoch 649, batch 24, global_batch_idx: 24000, batch size: 126, loss[discriminator_loss=2.459, discriminator_real_loss=1.309, discriminator_fake_loss=1.15, generator_loss=31.14, generator_mel_loss=20.66, generator_kl_loss=2.095, generator_dur_loss=1.618, generator_adv_loss=2.102, generator_feat_match_loss=4.664, over 126.00 samples.], tot_loss[discriminator_loss=2.62, discriminator_real_loss=1.333, discriminator_fake_loss=1.287, generator_loss=30.6, generator_mel_loss=20.55, generator_kl_loss=2.038, generator_dur_loss=1.639, generator_adv_loss=2.207, generator_feat_match_loss=4.174, over 2141.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 16.0
2023-11-14 11:31:27,348 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 11:31:38,306 INFO [train.py:517] (0/4) Epoch 649, validation: discriminator_loss=2.62, discriminator_real_loss=1.149, discriminator_fake_loss=1.472, generator_loss=31.22, generator_mel_loss=21.2, generator_kl_loss=2.164, generator_dur_loss=1.641, generator_adv_loss=1.789, generator_feat_match_loss=4.427, over 100.00 samples.
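The grad_scale field in these records moves back and forth between 8.0 and 16.0 over time. That pattern matches the usual dynamic loss-scaling policy in mixed-precision training: the scale is halved when a backward pass produces inf/nan gradients (and that optimizer step is skipped) and is raised again after a long enough run of clean steps. The toy sketch below illustrates that policy; the class and its constants are illustrative (modeled on the common backoff/growth scheme, e.g. the torch.cuda.amp.GradScaler defaults), not the actual code behind this log.

# Minimal sketch of dynamic loss scaling, assuming the usual
# backoff/growth policy; not taken from train.py.
class ToyGradScaler:
    def __init__(self, init_scale=16.0, backoff=0.5, growth=2.0, growth_interval=2000):
        self.scale = init_scale
        self.backoff = backoff            # factor applied on overflow
        self.growth = growth              # factor applied after many clean steps
        self.growth_interval = growth_interval
        self._good_steps = 0

    def update(self, found_inf: bool) -> None:
        if found_inf:
            # Overflow: skip the step and halve the scale (16.0 -> 8.0).
            self.scale *= self.backoff
            self._good_steps = 0
        else:
            # After enough consecutive clean steps, double it (8.0 -> 16.0).
            self._good_steps += 1
            if self._good_steps >= self.growth_interval:
                self.scale *= self.growth
                self._good_steps = 0

scaler = ToyGradScaler()
scaler.update(found_inf=True)   # one overflow: 16.0 -> 8.0, as seen between nearby records
print(scaler.scale)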
2023-11-14 11:31:38,307 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 11:32:40,933 INFO [train.py:811] (0/4) Start epoch 650
2023-11-14 11:36:15,292 INFO [train.py:811] (0/4) Start epoch 651
2023-11-14 11:36:31,695 INFO [train.py:467] (0/4) Epoch 651, batch 0, global_batch_idx: 24050, batch size: 73, loss[discriminator_loss=2.543, discriminator_real_loss=1.174, discriminator_fake_loss=1.37, generator_loss=29.85, generator_mel_loss=19.89, generator_kl_loss=1.965, generator_dur_loss=1.647, generator_adv_loss=2.266, generator_feat_match_loss=4.086, over 73.00 samples.], tot_loss[discriminator_loss=2.543, discriminator_real_loss=1.174, discriminator_fake_loss=1.37, generator_loss=29.85, generator_mel_loss=19.89, generator_kl_loss=1.965, generator_dur_loss=1.647, generator_adv_loss=2.266, generator_feat_match_loss=4.086, over 73.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2023-11-14 11:39:52,419 INFO [train.py:811] (0/4) Start epoch 652
2023-11-14 11:41:29,894 INFO [train.py:467] (0/4) Epoch 652, batch 13, global_batch_idx: 24100, batch size: 54, loss[discriminator_loss=2.703, discriminator_real_loss=1.498, discriminator_fake_loss=1.206, generator_loss=29.77, generator_mel_loss=19.98, generator_kl_loss=1.968, generator_dur_loss=1.692, generator_adv_loss=2.129, generator_feat_match_loss=3.996, over 54.00 samples.], tot_loss[discriminator_loss=2.594, discriminator_real_loss=1.319, discriminator_fake_loss=1.276, generator_loss=30.54, generator_mel_loss=20.25, generator_kl_loss=1.999, generator_dur_loss=1.649, generator_adv_loss=2.255, generator_feat_match_loss=4.385, over 1043.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2023-11-14 11:43:33,115 INFO [train.py:811] (0/4) Start epoch 653
2023-11-14 11:46:13,094 INFO [train.py:467] (0/4) Epoch 653, batch 26, global_batch_idx: 24150, batch size: 71, loss[discriminator_loss=2.535, discriminator_real_loss=1.359, discriminator_fake_loss=1.176, generator_loss=30.48, generator_mel_loss=20.37, generator_kl_loss=2.048, generator_dur_loss=1.623, generator_adv_loss=2.283, generator_feat_match_loss=4.156, over 71.00 samples.], tot_loss[discriminator_loss=2.569, discriminator_real_loss=1.297, discriminator_fake_loss=1.272, generator_loss=30.23, generator_mel_loss=20.3, generator_kl_loss=2.012, generator_dur_loss=1.646, generator_adv_loss=2.142, generator_feat_match_loss=4.129, over 1962.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2023-11-14 11:47:09,437 INFO [train.py:811] (0/4) Start epoch 654
2023-11-14 11:50:39,298 INFO [train.py:811] (0/4) Start epoch 655
2023-11-14 11:51:04,864 INFO [train.py:467] (0/4) Epoch 655, batch 2, global_batch_idx: 24200, batch size: 49, loss[discriminator_loss=2.473, discriminator_real_loss=1.271, discriminator_fake_loss=1.2, generator_loss=30.5, generator_mel_loss=19.89, generator_kl_loss=2.001, generator_dur_loss=1.655, generator_adv_loss=2.445, generator_feat_match_loss=4.508, over 49.00 samples.], tot_loss[discriminator_loss=2.501, discriminator_real_loss=1.255, discriminator_fake_loss=1.245, generator_loss=30.28, generator_mel_loss=19.93, generator_kl_loss=1.981, generator_dur_loss=1.658, generator_adv_loss=2.332, generator_feat_match_loss=4.382, over 178.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2023-11-14 11:51:05,400 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 11:51:17,871 INFO [train.py:517] (0/4) Epoch 655, validation: discriminator_loss=2.579, discriminator_real_loss=1.29, discriminator_fake_loss=1.289, generator_loss=31.08, generator_mel_loss=20.77, generator_kl_loss=2.158, generator_dur_loss=1.64, generator_adv_loss=2.05, generator_feat_match_loss=4.467, over 100.00 samples.
2023-11-14 11:51:17,872 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 11:54:30,084 INFO [train.py:811] (0/4) Start epoch 656
2023-11-14 11:56:12,192 INFO [train.py:467] (0/4) Epoch 656, batch 15, global_batch_idx: 24250, batch size: 110, loss[discriminator_loss=2.535, discriminator_real_loss=1.296, discriminator_fake_loss=1.238, generator_loss=30.61, generator_mel_loss=20.68, generator_kl_loss=2.016, generator_dur_loss=1.641, generator_adv_loss=2.082, generator_feat_match_loss=4.191, over 110.00 samples.], tot_loss[discriminator_loss=2.555, discriminator_real_loss=1.313, discriminator_fake_loss=1.242, generator_loss=30.59, generator_mel_loss=20.41, generator_kl_loss=2.028, generator_dur_loss=1.647, generator_adv_loss=2.2, generator_feat_match_loss=4.298, over 1167.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2023-11-14 11:58:04,127 INFO [train.py:811] (0/4) Start epoch 657
2023-11-14 12:00:59,481 INFO [train.py:467] (0/4) Epoch 657, batch 28, global_batch_idx: 24300, batch size: 153, loss[discriminator_loss=2.531, discriminator_real_loss=1.211, discriminator_fake_loss=1.319, generator_loss=29.92, generator_mel_loss=20.05, generator_kl_loss=1.952, generator_dur_loss=1.636, generator_adv_loss=2.207, generator_feat_match_loss=4.07, over 153.00 samples.], tot_loss[discriminator_loss=2.612, discriminator_real_loss=1.317, discriminator_fake_loss=1.295, generator_loss=30.47, generator_mel_loss=20.4, generator_kl_loss=2.012, generator_dur_loss=1.645, generator_adv_loss=2.191, generator_feat_match_loss=4.226, over 2013.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2023-11-14 12:01:44,859 INFO [train.py:811] (0/4) Start epoch 658
2023-11-14 12:05:15,318 INFO [train.py:811] (0/4) Start epoch 659
2023-11-14 12:06:02,220 INFO [train.py:467] (0/4) Epoch 659, batch 4, global_batch_idx: 24350, batch size: 52, loss[discriminator_loss=2.312, discriminator_real_loss=1.141, discriminator_fake_loss=1.173, generator_loss=32.02, generator_mel_loss=20.33, generator_kl_loss=1.992, generator_dur_loss=1.65, generator_adv_loss=2.693, generator_feat_match_loss=5.355, over 52.00 samples.], tot_loss[discriminator_loss=2.483, discriminator_real_loss=1.26, discriminator_fake_loss=1.224, generator_loss=31.17, generator_mel_loss=20.31, generator_kl_loss=1.983, generator_dur_loss=1.653, generator_adv_loss=2.418, generator_feat_match_loss=4.808, over 360.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2023-11-14 12:08:52,145 INFO [train.py:811] (0/4) Start epoch 660
2023-11-14 12:10:43,925 INFO [train.py:467] (0/4) Epoch 660, batch 17, global_batch_idx: 24400, batch size: 60, loss[discriminator_loss=2.535, discriminator_real_loss=1.346, discriminator_fake_loss=1.19, generator_loss=30.62, generator_mel_loss=20.6, generator_kl_loss=2.176, generator_dur_loss=1.66, generator_adv_loss=2.053, generator_feat_match_loss=4.129, over 60.00 samples.], tot_loss[discriminator_loss=2.55, discriminator_real_loss=1.286, discriminator_fake_loss=1.264, generator_loss=30.38, generator_mel_loss=20.21, generator_kl_loss=1.999, generator_dur_loss=1.648, generator_adv_loss=2.23, generator_feat_match_loss=4.29, over 1307.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 16.0
2023-11-14 12:10:44,453 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 12:10:55,820 INFO [train.py:517] (0/4) Epoch 660, validation: discriminator_loss=2.644, discriminator_real_loss=1.146, discriminator_fake_loss=1.498, generator_loss=30.52, generator_mel_loss=20.88, generator_kl_loss=2.231, generator_dur_loss=1.644, generator_adv_loss=1.815, generator_feat_match_loss=3.949, over 100.00 samples.
2023-11-14 12:10:55,822 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 12:12:42,327 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-660.pt
2023-11-14 12:12:45,887 INFO [train.py:811] (0/4) Start epoch 661
2023-11-14 12:15:44,437 INFO [train.py:467] (0/4) Epoch 661, batch 30, global_batch_idx: 24450, batch size: 54, loss[discriminator_loss=2.383, discriminator_real_loss=1.238, discriminator_fake_loss=1.144, generator_loss=31.86, generator_mel_loss=20.7, generator_kl_loss=2.017, generator_dur_loss=1.666, generator_adv_loss=2.32, generator_feat_match_loss=5.16, over 54.00 samples.], tot_loss[discriminator_loss=2.591, discriminator_real_loss=1.338, discriminator_fake_loss=1.253, generator_loss=30.79, generator_mel_loss=20.37, generator_kl_loss=2.043, generator_dur_loss=1.646, generator_adv_loss=2.298, generator_feat_match_loss=4.433, over 2234.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2023-11-14 12:16:19,744 INFO [train.py:811] (0/4) Start epoch 662
2023-11-14 12:19:55,104 INFO [train.py:811] (0/4) Start epoch 663
2023-11-14 12:20:42,989 INFO [train.py:467] (0/4) Epoch 663, batch 6, global_batch_idx: 24500, batch size: 63, loss[discriminator_loss=2.572, discriminator_real_loss=1.271, discriminator_fake_loss=1.301, generator_loss=30.73, generator_mel_loss=20.56, generator_kl_loss=1.997, generator_dur_loss=1.642, generator_adv_loss=2.154, generator_feat_match_loss=4.379, over 63.00 samples.], tot_loss[discriminator_loss=2.601, discriminator_real_loss=1.326, discriminator_fake_loss=1.275, generator_loss=30.15, generator_mel_loss=20.17, generator_kl_loss=2.041, generator_dur_loss=1.654, generator_adv_loss=2.154, generator_feat_match_loss=4.123, over 424.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2023-11-14 12:23:27,262 INFO [train.py:811] (0/4) Start epoch 664
2023-11-14 12:25:28,976 INFO [train.py:467] (0/4) Epoch 664, batch 19, global_batch_idx: 24550, batch size: 95, loss[discriminator_loss=2.574, discriminator_real_loss=1.367, discriminator_fake_loss=1.207, generator_loss=31, generator_mel_loss=20.64, generator_kl_loss=2.089, generator_dur_loss=1.635, generator_adv_loss=2.273, generator_feat_match_loss=4.363, over 95.00 samples.], tot_loss[discriminator_loss=2.601, discriminator_real_loss=1.325, discriminator_fake_loss=1.277, generator_loss=30.68, generator_mel_loss=20.43, generator_kl_loss=2.031, generator_dur_loss=1.645, generator_adv_loss=2.239, generator_feat_match_loss=4.341, over 1525.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2023-11-14 12:27:02,701 INFO [train.py:811] (0/4) Start epoch 665
2023-11-14 12:30:08,304 INFO [train.py:467] (0/4) Epoch 665, batch 32, global_batch_idx: 24600, batch size: 64, loss[discriminator_loss=2.547, discriminator_real_loss=1.248, discriminator_fake_loss=1.298, generator_loss=30.89, generator_mel_loss=20.35, generator_kl_loss=1.957, generator_dur_loss=1.643, generator_adv_loss=2.543, generator_feat_match_loss=4.398, over 64.00 samples.], tot_loss[discriminator_loss=2.578, discriminator_real_loss=1.295,
discriminator_fake_loss=1.282, generator_loss=30.64, generator_mel_loss=20.42, generator_kl_loss=2.028, generator_dur_loss=1.645, generator_adv_loss=2.236, generator_feat_match_loss=4.305, over 2369.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2023-11-14 12:30:08,977 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 12:30:19,981 INFO [train.py:517] (0/4) Epoch 665, validation: discriminator_loss=2.543, discriminator_real_loss=1.435, discriminator_fake_loss=1.108, generator_loss=31.54, generator_mel_loss=20.99, generator_kl_loss=2.102, generator_dur_loss=1.638, generator_adv_loss=2.564, generator_feat_match_loss=4.24, over 100.00 samples. 2023-11-14 12:30:19,983 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 12:30:43,559 INFO [train.py:811] (0/4) Start epoch 666 2023-11-14 12:34:21,518 INFO [train.py:811] (0/4) Start epoch 667 2023-11-14 12:35:22,755 INFO [train.py:467] (0/4) Epoch 667, batch 8, global_batch_idx: 24650, batch size: 49, loss[discriminator_loss=2.715, discriminator_real_loss=1.145, discriminator_fake_loss=1.571, generator_loss=29.84, generator_mel_loss=20.1, generator_kl_loss=1.983, generator_dur_loss=1.662, generator_adv_loss=2.406, generator_feat_match_loss=3.689, over 49.00 samples.], tot_loss[discriminator_loss=2.566, discriminator_real_loss=1.276, discriminator_fake_loss=1.29, generator_loss=31.08, generator_mel_loss=20.43, generator_kl_loss=2.005, generator_dur_loss=1.649, generator_adv_loss=2.391, generator_feat_match_loss=4.602, over 626.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2023-11-14 12:37:53,992 INFO [train.py:811] (0/4) Start epoch 668 2023-11-14 12:40:02,959 INFO [train.py:467] (0/4) Epoch 668, batch 21, global_batch_idx: 24700, batch size: 69, loss[discriminator_loss=2.58, discriminator_real_loss=1.289, discriminator_fake_loss=1.291, generator_loss=30, generator_mel_loss=20.22, generator_kl_loss=1.993, generator_dur_loss=1.633, generator_adv_loss=2.082, generator_feat_match_loss=4.07, over 69.00 samples.], tot_loss[discriminator_loss=2.6, discriminator_real_loss=1.317, discriminator_fake_loss=1.283, generator_loss=30.16, generator_mel_loss=20.22, generator_kl_loss=2.002, generator_dur_loss=1.645, generator_adv_loss=2.142, generator_feat_match_loss=4.156, over 1473.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2023-11-14 12:41:33,346 INFO [train.py:811] (0/4) Start epoch 669 2023-11-14 12:44:52,286 INFO [train.py:467] (0/4) Epoch 669, batch 34, global_batch_idx: 24750, batch size: 81, loss[discriminator_loss=2.535, discriminator_real_loss=1.134, discriminator_fake_loss=1.402, generator_loss=30.41, generator_mel_loss=20.02, generator_kl_loss=2.104, generator_dur_loss=1.633, generator_adv_loss=2.281, generator_feat_match_loss=4.363, over 81.00 samples.], tot_loss[discriminator_loss=2.608, discriminator_real_loss=1.322, discriminator_fake_loss=1.286, generator_loss=30.53, generator_mel_loss=20.35, generator_kl_loss=2.015, generator_dur_loss=1.646, generator_adv_loss=2.24, generator_feat_match_loss=4.275, over 2432.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2023-11-14 12:45:05,631 INFO [train.py:811] (0/4) Start epoch 670 2023-11-14 12:48:36,864 INFO [train.py:811] (0/4) Start epoch 671 2023-11-14 12:49:48,455 INFO [train.py:467] (0/4) Epoch 671, batch 10, global_batch_idx: 24800, batch size: 67, loss[discriminator_loss=2.547, discriminator_real_loss=1.402, discriminator_fake_loss=1.144, generator_loss=31.43, 
generator_mel_loss=20.6, generator_kl_loss=2.064, generator_dur_loss=1.644, generator_adv_loss=2.359, generator_feat_match_loss=4.766, over 67.00 samples.], tot_loss[discriminator_loss=2.521, discriminator_real_loss=1.24, discriminator_fake_loss=1.28, generator_loss=30.56, generator_mel_loss=20.22, generator_kl_loss=2.044, generator_dur_loss=1.645, generator_adv_loss=2.226, generator_feat_match_loss=4.429, over 892.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 16.0 2023-11-14 12:49:49,025 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 12:50:00,620 INFO [train.py:517] (0/4) Epoch 671, validation: discriminator_loss=2.513, discriminator_real_loss=1.152, discriminator_fake_loss=1.361, generator_loss=30.93, generator_mel_loss=20.81, generator_kl_loss=2.15, generator_dur_loss=1.638, generator_adv_loss=1.923, generator_feat_match_loss=4.411, over 100.00 samples. 2023-11-14 12:50:00,621 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 12:52:23,762 INFO [train.py:811] (0/4) Start epoch 672 2023-11-14 12:54:45,591 INFO [train.py:467] (0/4) Epoch 672, batch 23, global_batch_idx: 24850, batch size: 126, loss[discriminator_loss=2.592, discriminator_real_loss=1.299, discriminator_fake_loss=1.293, generator_loss=30.72, generator_mel_loss=20.44, generator_kl_loss=2.091, generator_dur_loss=1.633, generator_adv_loss=2.256, generator_feat_match_loss=4.305, over 126.00 samples.], tot_loss[discriminator_loss=2.567, discriminator_real_loss=1.291, discriminator_fake_loss=1.276, generator_loss=30.35, generator_mel_loss=20.22, generator_kl_loss=2.013, generator_dur_loss=1.646, generator_adv_loss=2.219, generator_feat_match_loss=4.255, over 1684.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2023-11-14 12:55:56,339 INFO [train.py:811] (0/4) Start epoch 673 2023-11-14 12:59:29,807 INFO [train.py:467] (0/4) Epoch 673, batch 36, global_batch_idx: 24900, batch size: 95, loss[discriminator_loss=2.555, discriminator_real_loss=1.32, discriminator_fake_loss=1.235, generator_loss=30.55, generator_mel_loss=20.44, generator_kl_loss=2.107, generator_dur_loss=1.658, generator_adv_loss=2.129, generator_feat_match_loss=4.211, over 95.00 samples.], tot_loss[discriminator_loss=2.6, discriminator_real_loss=1.321, discriminator_fake_loss=1.279, generator_loss=30.46, generator_mel_loss=20.44, generator_kl_loss=2.048, generator_dur_loss=1.645, generator_adv_loss=2.151, generator_feat_match_loss=4.17, over 2868.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2023-11-14 12:59:31,031 INFO [train.py:811] (0/4) Start epoch 674 2023-11-14 13:03:02,022 INFO [train.py:811] (0/4) Start epoch 675 2023-11-14 13:04:20,033 INFO [train.py:467] (0/4) Epoch 675, batch 12, global_batch_idx: 24950, batch size: 71, loss[discriminator_loss=2.367, discriminator_real_loss=1.249, discriminator_fake_loss=1.117, generator_loss=32.07, generator_mel_loss=20.11, generator_kl_loss=2.001, generator_dur_loss=1.652, generator_adv_loss=2.576, generator_feat_match_loss=5.727, over 71.00 samples.], tot_loss[discriminator_loss=2.555, discriminator_real_loss=1.334, discriminator_fake_loss=1.222, generator_loss=30.95, generator_mel_loss=20.06, generator_kl_loss=1.992, generator_dur_loss=1.645, generator_adv_loss=2.409, generator_feat_match_loss=4.842, over 925.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2023-11-14 13:06:42,769 INFO [train.py:811] (0/4) Start epoch 676 2023-11-14 13:09:11,608 INFO [train.py:467] (0/4) Epoch 676, batch 
25, global_batch_idx: 25000, batch size: 126, loss[discriminator_loss=2.602, discriminator_real_loss=1.254, discriminator_fake_loss=1.347, generator_loss=30.63, generator_mel_loss=20.55, generator_kl_loss=2.077, generator_dur_loss=1.639, generator_adv_loss=2.168, generator_feat_match_loss=4.203, over 126.00 samples.], tot_loss[discriminator_loss=2.577, discriminator_real_loss=1.297, discriminator_fake_loss=1.28, generator_loss=30.38, generator_mel_loss=20.39, generator_kl_loss=2.044, generator_dur_loss=1.647, generator_adv_loss=2.169, generator_feat_match_loss=4.135, over 1710.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2023-11-14 13:09:12,149 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 13:09:23,431 INFO [train.py:517] (0/4) Epoch 676, validation: discriminator_loss=2.535, discriminator_real_loss=1.263, discriminator_fake_loss=1.272, generator_loss=31.18, generator_mel_loss=20.96, generator_kl_loss=2.146, generator_dur_loss=1.636, generator_adv_loss=2.061, generator_feat_match_loss=4.378, over 100.00 samples. 2023-11-14 13:09:23,432 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 13:10:27,044 INFO [train.py:811] (0/4) Start epoch 677 2023-11-14 13:13:59,594 INFO [train.py:811] (0/4) Start epoch 678 2023-11-14 13:14:22,106 INFO [train.py:467] (0/4) Epoch 678, batch 1, global_batch_idx: 25050, batch size: 56, loss[discriminator_loss=2.465, discriminator_real_loss=1.295, discriminator_fake_loss=1.171, generator_loss=30.6, generator_mel_loss=19.91, generator_kl_loss=1.992, generator_dur_loss=1.667, generator_adv_loss=2.33, generator_feat_match_loss=4.703, over 56.00 samples.], tot_loss[discriminator_loss=2.544, discriminator_real_loss=1.363, discriminator_fake_loss=1.181, generator_loss=30.54, generator_mel_loss=19.98, generator_kl_loss=2.019, generator_dur_loss=1.655, generator_adv_loss=2.341, generator_feat_match_loss=4.55, over 114.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2023-11-14 13:17:27,936 INFO [train.py:811] (0/4) Start epoch 679 2023-11-14 13:19:06,708 INFO [train.py:467] (0/4) Epoch 679, batch 14, global_batch_idx: 25100, batch size: 49, loss[discriminator_loss=2.559, discriminator_real_loss=1.375, discriminator_fake_loss=1.184, generator_loss=31.01, generator_mel_loss=20.2, generator_kl_loss=1.938, generator_dur_loss=1.662, generator_adv_loss=2.551, generator_feat_match_loss=4.656, over 49.00 samples.], tot_loss[discriminator_loss=2.625, discriminator_real_loss=1.327, discriminator_fake_loss=1.298, generator_loss=30.51, generator_mel_loss=20.28, generator_kl_loss=2.005, generator_dur_loss=1.64, generator_adv_loss=2.282, generator_feat_match_loss=4.303, over 1138.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2023-11-14 13:21:04,009 INFO [train.py:811] (0/4) Start epoch 680 2023-11-14 13:23:45,500 INFO [train.py:467] (0/4) Epoch 680, batch 27, global_batch_idx: 25150, batch size: 153, loss[discriminator_loss=2.582, discriminator_real_loss=1.396, discriminator_fake_loss=1.188, generator_loss=31.46, generator_mel_loss=20.39, generator_kl_loss=2.002, generator_dur_loss=1.627, generator_adv_loss=2.455, generator_feat_match_loss=4.988, over 153.00 samples.], tot_loss[discriminator_loss=2.552, discriminator_real_loss=1.29, discriminator_fake_loss=1.262, generator_loss=30.83, generator_mel_loss=20.36, generator_kl_loss=2.02, generator_dur_loss=1.643, generator_adv_loss=2.292, generator_feat_match_loss=4.515, over 2044.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 
1.84e-04, grad_scale: 8.0 2023-11-14 13:24:32,548 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-680.pt 2023-11-14 13:24:35,959 INFO [train.py:811] (0/4) Start epoch 681 2023-11-14 13:28:07,803 INFO [train.py:811] (0/4) Start epoch 682 2023-11-14 13:28:42,565 INFO [train.py:467] (0/4) Epoch 682, batch 3, global_batch_idx: 25200, batch size: 101, loss[discriminator_loss=2.33, discriminator_real_loss=1.21, discriminator_fake_loss=1.12, generator_loss=31.38, generator_mel_loss=19.88, generator_kl_loss=1.982, generator_dur_loss=1.625, generator_adv_loss=2.602, generator_feat_match_loss=5.297, over 101.00 samples.], tot_loss[discriminator_loss=2.363, discriminator_real_loss=1.206, discriminator_fake_loss=1.158, generator_loss=31.48, generator_mel_loss=20.04, generator_kl_loss=2.013, generator_dur_loss=1.652, generator_adv_loss=2.568, generator_feat_match_loss=5.21, over 343.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 16.0 2023-11-14 13:28:43,157 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 13:28:56,170 INFO [train.py:517] (0/4) Epoch 682, validation: discriminator_loss=2.282, discriminator_real_loss=1.218, discriminator_fake_loss=1.064, generator_loss=32.04, generator_mel_loss=20.55, generator_kl_loss=2.159, generator_dur_loss=1.636, generator_adv_loss=2.418, generator_feat_match_loss=5.276, over 100.00 samples. 2023-11-14 13:28:56,171 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 13:31:51,276 INFO [train.py:811] (0/4) Start epoch 683 2023-11-14 13:33:37,772 INFO [train.py:467] (0/4) Epoch 683, batch 16, global_batch_idx: 25250, batch size: 101, loss[discriminator_loss=2.602, discriminator_real_loss=1.364, discriminator_fake_loss=1.238, generator_loss=30.75, generator_mel_loss=20.79, generator_kl_loss=2.052, generator_dur_loss=1.655, generator_adv_loss=2.057, generator_feat_match_loss=4.195, over 101.00 samples.], tot_loss[discriminator_loss=2.635, discriminator_real_loss=1.353, discriminator_fake_loss=1.283, generator_loss=30.44, generator_mel_loss=20.43, generator_kl_loss=2.026, generator_dur_loss=1.646, generator_adv_loss=2.15, generator_feat_match_loss=4.193, over 1467.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 16.0 2023-11-14 13:35:24,550 INFO [train.py:811] (0/4) Start epoch 684 2023-11-14 13:38:20,622 INFO [train.py:467] (0/4) Epoch 684, batch 29, global_batch_idx: 25300, batch size: 153, loss[discriminator_loss=2.646, discriminator_real_loss=1.318, discriminator_fake_loss=1.328, generator_loss=30.83, generator_mel_loss=20.67, generator_kl_loss=2.034, generator_dur_loss=1.649, generator_adv_loss=2.359, generator_feat_match_loss=4.117, over 153.00 samples.], tot_loss[discriminator_loss=2.62, discriminator_real_loss=1.328, discriminator_fake_loss=1.291, generator_loss=30.45, generator_mel_loss=20.34, generator_kl_loss=2.007, generator_dur_loss=1.644, generator_adv_loss=2.231, generator_feat_match_loss=4.23, over 2164.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 16.0 2023-11-14 13:38:59,242 INFO [train.py:811] (0/4) Start epoch 685 2023-11-14 13:42:38,302 INFO [train.py:811] (0/4) Start epoch 686 2023-11-14 13:43:29,168 INFO [train.py:467] (0/4) Epoch 686, batch 5, global_batch_idx: 25350, batch size: 79, loss[discriminator_loss=2.578, discriminator_real_loss=1.268, discriminator_fake_loss=1.311, generator_loss=29.74, generator_mel_loss=19.85, generator_kl_loss=1.981, generator_dur_loss=1.641, generator_adv_loss=2.25, 
generator_feat_match_loss=4.02, over 79.00 samples.], tot_loss[discriminator_loss=2.568, discriminator_real_loss=1.322, discriminator_fake_loss=1.246, generator_loss=30.53, generator_mel_loss=20.03, generator_kl_loss=2.024, generator_dur_loss=1.636, generator_adv_loss=2.287, generator_feat_match_loss=4.556, over 506.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 16.0 2023-11-14 13:46:12,378 INFO [train.py:811] (0/4) Start epoch 687 2023-11-14 13:48:08,380 INFO [train.py:467] (0/4) Epoch 687, batch 18, global_batch_idx: 25400, batch size: 64, loss[discriminator_loss=2.318, discriminator_real_loss=1.124, discriminator_fake_loss=1.194, generator_loss=30.63, generator_mel_loss=19.47, generator_kl_loss=1.948, generator_dur_loss=1.612, generator_adv_loss=2.371, generator_feat_match_loss=5.227, over 64.00 samples.], tot_loss[discriminator_loss=2.502, discriminator_real_loss=1.253, discriminator_fake_loss=1.248, generator_loss=30.76, generator_mel_loss=20.02, generator_kl_loss=2.003, generator_dur_loss=1.644, generator_adv_loss=2.373, generator_feat_match_loss=4.726, over 1464.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2023-11-14 13:48:08,974 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 13:48:20,529 INFO [train.py:517] (0/4) Epoch 687, validation: discriminator_loss=2.486, discriminator_real_loss=1.159, discriminator_fake_loss=1.328, generator_loss=30.94, generator_mel_loss=20.78, generator_kl_loss=2.145, generator_dur_loss=1.627, generator_adv_loss=1.893, generator_feat_match_loss=4.492, over 100.00 samples. 2023-11-14 13:48:20,530 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 13:50:01,097 INFO [train.py:811] (0/4) Start epoch 688 2023-11-14 13:53:05,994 INFO [train.py:467] (0/4) Epoch 688, batch 31, global_batch_idx: 25450, batch size: 51, loss[discriminator_loss=2.562, discriminator_real_loss=1.357, discriminator_fake_loss=1.205, generator_loss=30.51, generator_mel_loss=20.34, generator_kl_loss=2.052, generator_dur_loss=1.629, generator_adv_loss=2.107, generator_feat_match_loss=4.383, over 51.00 samples.], tot_loss[discriminator_loss=2.587, discriminator_real_loss=1.32, discriminator_fake_loss=1.267, generator_loss=30.25, generator_mel_loss=20.27, generator_kl_loss=2.04, generator_dur_loss=1.646, generator_adv_loss=2.132, generator_feat_match_loss=4.164, over 2264.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2023-11-14 13:53:36,034 INFO [train.py:811] (0/4) Start epoch 689 2023-11-14 13:57:05,692 INFO [train.py:811] (0/4) Start epoch 690 2023-11-14 13:58:01,024 INFO [train.py:467] (0/4) Epoch 690, batch 7, global_batch_idx: 25500, batch size: 51, loss[discriminator_loss=2.635, discriminator_real_loss=1.366, discriminator_fake_loss=1.269, generator_loss=30.63, generator_mel_loss=20.65, generator_kl_loss=2.042, generator_dur_loss=1.647, generator_adv_loss=1.984, generator_feat_match_loss=4.301, over 51.00 samples.], tot_loss[discriminator_loss=2.668, discriminator_real_loss=1.387, discriminator_fake_loss=1.281, generator_loss=30.54, generator_mel_loss=20.46, generator_kl_loss=2.028, generator_dur_loss=1.643, generator_adv_loss=2.238, generator_feat_match_loss=4.165, over 575.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 14:00:44,146 INFO [train.py:811] (0/4) Start epoch 691 2023-11-14 14:02:49,329 INFO [train.py:467] (0/4) Epoch 691, batch 20, global_batch_idx: 25550, batch size: 101, loss[discriminator_loss=2.463, 
discriminator_real_loss=1.236, discriminator_fake_loss=1.227, generator_loss=31.18, generator_mel_loss=20.3, generator_kl_loss=2.076, generator_dur_loss=1.631, generator_adv_loss=2.387, generator_feat_match_loss=4.789, over 101.00 samples.], tot_loss[discriminator_loss=2.574, discriminator_real_loss=1.29, discriminator_fake_loss=1.284, generator_loss=30.46, generator_mel_loss=19.95, generator_kl_loss=2.014, generator_dur_loss=1.638, generator_adv_loss=2.291, generator_feat_match_loss=4.562, over 1562.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 14:04:19,878 INFO [train.py:811] (0/4) Start epoch 692 2023-11-14 14:07:38,791 INFO [train.py:467] (0/4) Epoch 692, batch 33, global_batch_idx: 25600, batch size: 85, loss[discriminator_loss=2.586, discriminator_real_loss=1.33, discriminator_fake_loss=1.255, generator_loss=30.66, generator_mel_loss=20.15, generator_kl_loss=1.955, generator_dur_loss=1.644, generator_adv_loss=2.34, generator_feat_match_loss=4.57, over 85.00 samples.], tot_loss[discriminator_loss=2.555, discriminator_real_loss=1.288, discriminator_fake_loss=1.267, generator_loss=30.45, generator_mel_loss=20.28, generator_kl_loss=2.03, generator_dur_loss=1.65, generator_adv_loss=2.177, generator_feat_match_loss=4.31, over 2347.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 16.0 2023-11-14 14:07:39,261 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 14:07:50,033 INFO [train.py:517] (0/4) Epoch 692, validation: discriminator_loss=2.648, discriminator_real_loss=1.176, discriminator_fake_loss=1.471, generator_loss=30.91, generator_mel_loss=21.14, generator_kl_loss=2.264, generator_dur_loss=1.628, generator_adv_loss=1.743, generator_feat_match_loss=4.13, over 100.00 samples. 2023-11-14 14:07:50,034 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 14:08:02,753 INFO [train.py:811] (0/4) Start epoch 693 2023-11-14 14:11:34,129 INFO [train.py:811] (0/4) Start epoch 694 2023-11-14 14:12:37,417 INFO [train.py:467] (0/4) Epoch 694, batch 9, global_batch_idx: 25650, batch size: 69, loss[discriminator_loss=2.627, discriminator_real_loss=1.318, discriminator_fake_loss=1.309, generator_loss=30.5, generator_mel_loss=20.41, generator_kl_loss=2.047, generator_dur_loss=1.634, generator_adv_loss=2.127, generator_feat_match_loss=4.277, over 69.00 samples.], tot_loss[discriminator_loss=2.586, discriminator_real_loss=1.31, discriminator_fake_loss=1.276, generator_loss=30.43, generator_mel_loss=20.22, generator_kl_loss=2.022, generator_dur_loss=1.653, generator_adv_loss=2.219, generator_feat_match_loss=4.315, over 656.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 14:15:06,010 INFO [train.py:811] (0/4) Start epoch 695 2023-11-14 14:17:23,154 INFO [train.py:467] (0/4) Epoch 695, batch 22, global_batch_idx: 25700, batch size: 49, loss[discriminator_loss=2.598, discriminator_real_loss=1.354, discriminator_fake_loss=1.244, generator_loss=30.55, generator_mel_loss=20.45, generator_kl_loss=1.934, generator_dur_loss=1.663, generator_adv_loss=2.09, generator_feat_match_loss=4.41, over 49.00 samples.], tot_loss[discriminator_loss=2.588, discriminator_real_loss=1.302, discriminator_fake_loss=1.285, generator_loss=30.34, generator_mel_loss=20.22, generator_kl_loss=2.016, generator_dur_loss=1.643, generator_adv_loss=2.188, generator_feat_match_loss=4.277, over 1660.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 14:18:36,545 INFO [train.py:811] (0/4) Start 
epoch 696 2023-11-14 14:21:53,199 INFO [train.py:467] (0/4) Epoch 696, batch 35, global_batch_idx: 25750, batch size: 110, loss[discriminator_loss=2.656, discriminator_real_loss=1.172, discriminator_fake_loss=1.483, generator_loss=30.4, generator_mel_loss=20.49, generator_kl_loss=2.046, generator_dur_loss=1.634, generator_adv_loss=1.98, generator_feat_match_loss=4.254, over 110.00 samples.], tot_loss[discriminator_loss=2.547, discriminator_real_loss=1.283, discriminator_fake_loss=1.264, generator_loss=30.66, generator_mel_loss=20.13, generator_kl_loss=2.027, generator_dur_loss=1.644, generator_adv_loss=2.322, generator_feat_match_loss=4.542, over 2464.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 14:22:01,678 INFO [train.py:811] (0/4) Start epoch 697 2023-11-14 14:25:36,645 INFO [train.py:811] (0/4) Start epoch 698 2023-11-14 14:26:58,971 INFO [train.py:467] (0/4) Epoch 698, batch 11, global_batch_idx: 25800, batch size: 73, loss[discriminator_loss=2.801, discriminator_real_loss=1.446, discriminator_fake_loss=1.355, generator_loss=29.71, generator_mel_loss=19.8, generator_kl_loss=2.119, generator_dur_loss=1.642, generator_adv_loss=2.049, generator_feat_match_loss=4.094, over 73.00 samples.], tot_loss[discriminator_loss=2.435, discriminator_real_loss=1.244, discriminator_fake_loss=1.191, generator_loss=31.01, generator_mel_loss=19.95, generator_kl_loss=1.997, generator_dur_loss=1.642, generator_adv_loss=2.415, generator_feat_match_loss=4.998, over 905.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 14:26:59,473 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 14:27:10,093 INFO [train.py:517] (0/4) Epoch 698, validation: discriminator_loss=2.54, discriminator_real_loss=1.189, discriminator_fake_loss=1.35, generator_loss=30.44, generator_mel_loss=20.34, generator_kl_loss=2.23, generator_dur_loss=1.631, generator_adv_loss=2.048, generator_feat_match_loss=4.19, over 100.00 samples. 
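A note on reading these entries: the loss[...] block is the current batch, while tot_loss[...] accumulates over every sample seen so far in the epoch, which is why its "over N samples" counter grows batch to batch (e.g. 905 samples by epoch 698, batch 11) while the per-batch counter stays at the batch size. Validation entries fire at every 200th global batch (24200, 24400, ..., 25800 in this stretch). Below is a minimal sketch of such a sample-weighted running average; it is illustrative only and not taken from train.py. The demo reuses two per-batch values that appear earlier in this log (epoch 651 batch 0, 73 samples; epoch 652 batch 13, 54 samples).

from collections import defaultdict

class RunningLoss:
    """Sample-weighted running average matching the tot_loss[... over N samples] pattern."""

    def __init__(self):
        self.sums = defaultdict(float)  # loss term -> weighted sum
        self.num_samples = 0            # the "over N samples" counter

    def update(self, losses, batch_size):
        # Weight each term by the batch size so the average is per sample, not per batch.
        for name, value in losses.items():
            self.sums[name] += value * batch_size
        self.num_samples += batch_size

    def averages(self):
        return {name: total / self.num_samples for name, total in self.sums.items()}

# Toy demo with per-batch values logged above.
tot = RunningLoss()
tot.update({"generator_mel_loss": 19.89, "discriminator_loss": 2.543}, batch_size=73)
tot.update({"generator_mel_loss": 19.98, "discriminator_loss": 2.703}, batch_size=54)
print(tot.num_samples)  # 127
print(tot.averages())   # per-term means over all 127 samples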
2023-11-14 14:27:10,094 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 14:29:22,836 INFO [train.py:811] (0/4) Start epoch 699 2023-11-14 14:31:40,901 INFO [train.py:467] (0/4) Epoch 699, batch 24, global_batch_idx: 25850, batch size: 101, loss[discriminator_loss=2.564, discriminator_real_loss=1.28, discriminator_fake_loss=1.284, generator_loss=30.72, generator_mel_loss=20.54, generator_kl_loss=2.048, generator_dur_loss=1.624, generator_adv_loss=2.268, generator_feat_match_loss=4.242, over 101.00 samples.], tot_loss[discriminator_loss=2.543, discriminator_real_loss=1.275, discriminator_fake_loss=1.268, generator_loss=30.46, generator_mel_loss=20.26, generator_kl_loss=2.029, generator_dur_loss=1.646, generator_adv_loss=2.196, generator_feat_match_loss=4.329, over 1797.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 14:32:52,140 INFO [train.py:811] (0/4) Start epoch 700 2023-11-14 14:36:23,012 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-700.pt 2023-11-14 14:36:26,398 INFO [train.py:811] (0/4) Start epoch 701 2023-11-14 14:36:40,067 INFO [train.py:467] (0/4) Epoch 701, batch 0, global_batch_idx: 25900, batch size: 90, loss[discriminator_loss=2.748, discriminator_real_loss=1.205, discriminator_fake_loss=1.543, generator_loss=29.79, generator_mel_loss=19.97, generator_kl_loss=1.986, generator_dur_loss=1.648, generator_adv_loss=2.209, generator_feat_match_loss=3.977, over 90.00 samples.], tot_loss[discriminator_loss=2.748, discriminator_real_loss=1.205, discriminator_fake_loss=1.543, generator_loss=29.79, generator_mel_loss=19.97, generator_kl_loss=1.986, generator_dur_loss=1.648, generator_adv_loss=2.209, generator_feat_match_loss=3.977, over 90.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 14:39:57,490 INFO [train.py:811] (0/4) Start epoch 702 2023-11-14 14:41:25,237 INFO [train.py:467] (0/4) Epoch 702, batch 13, global_batch_idx: 25950, batch size: 154, loss[discriminator_loss=2.539, discriminator_real_loss=1.215, discriminator_fake_loss=1.324, generator_loss=31.1, generator_mel_loss=20.59, generator_kl_loss=2.088, generator_dur_loss=1.637, generator_adv_loss=2.254, generator_feat_match_loss=4.531, over 154.00 samples.], tot_loss[discriminator_loss=2.548, discriminator_real_loss=1.272, discriminator_fake_loss=1.276, generator_loss=30.48, generator_mel_loss=20.25, generator_kl_loss=2.043, generator_dur_loss=1.638, generator_adv_loss=2.202, generator_feat_match_loss=4.349, over 1193.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 14:43:31,251 INFO [train.py:811] (0/4) Start epoch 703 2023-11-14 14:46:10,229 INFO [train.py:467] (0/4) Epoch 703, batch 26, global_batch_idx: 26000, batch size: 79, loss[discriminator_loss=2.58, discriminator_real_loss=1.346, discriminator_fake_loss=1.234, generator_loss=30.46, generator_mel_loss=20.13, generator_kl_loss=2.032, generator_dur_loss=1.623, generator_adv_loss=2.309, generator_feat_match_loss=4.367, over 79.00 samples.], tot_loss[discriminator_loss=2.613, discriminator_real_loss=1.334, discriminator_fake_loss=1.279, generator_loss=30.47, generator_mel_loss=20.38, generator_kl_loss=2.025, generator_dur_loss=1.638, generator_adv_loss=2.188, generator_feat_match_loss=4.241, over 2015.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 16.0 2023-11-14 14:46:10,791 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 14:46:21,215 INFO [train.py:517] (0/4) 
Epoch 703, validation: discriminator_loss=2.567, discriminator_real_loss=1.261, discriminator_fake_loss=1.305, generator_loss=31.06, generator_mel_loss=20.82, generator_kl_loss=2.197, generator_dur_loss=1.635, generator_adv_loss=2.174, generator_feat_match_loss=4.234, over 100.00 samples. 2023-11-14 14:46:21,216 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 14:47:22,518 INFO [train.py:811] (0/4) Start epoch 704 2023-11-14 14:50:54,825 INFO [train.py:811] (0/4) Start epoch 705 2023-11-14 14:51:19,627 INFO [train.py:467] (0/4) Epoch 705, batch 2, global_batch_idx: 26050, batch size: 69, loss[discriminator_loss=2.729, discriminator_real_loss=1.359, discriminator_fake_loss=1.369, generator_loss=30.52, generator_mel_loss=20.5, generator_kl_loss=2.024, generator_dur_loss=1.65, generator_adv_loss=2.242, generator_feat_match_loss=4.113, over 69.00 samples.], tot_loss[discriminator_loss=2.719, discriminator_real_loss=1.368, discriminator_fake_loss=1.352, generator_loss=30.59, generator_mel_loss=20.62, generator_kl_loss=2.038, generator_dur_loss=1.655, generator_adv_loss=2.163, generator_feat_match_loss=4.115, over 175.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 16.0 2023-11-14 14:54:27,657 INFO [train.py:811] (0/4) Start epoch 706 2023-11-14 14:56:04,328 INFO [train.py:467] (0/4) Epoch 706, batch 15, global_batch_idx: 26100, batch size: 58, loss[discriminator_loss=2.645, discriminator_real_loss=1.391, discriminator_fake_loss=1.253, generator_loss=29.97, generator_mel_loss=20.19, generator_kl_loss=1.936, generator_dur_loss=1.634, generator_adv_loss=2.07, generator_feat_match_loss=4.141, over 58.00 samples.], tot_loss[discriminator_loss=2.634, discriminator_real_loss=1.35, discriminator_fake_loss=1.284, generator_loss=30.35, generator_mel_loss=20.3, generator_kl_loss=1.993, generator_dur_loss=1.637, generator_adv_loss=2.205, generator_feat_match_loss=4.218, over 1156.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 16.0 2023-11-14 14:57:56,751 INFO [train.py:811] (0/4) Start epoch 707 2023-11-14 15:00:41,826 INFO [train.py:467] (0/4) Epoch 707, batch 28, global_batch_idx: 26150, batch size: 110, loss[discriminator_loss=2.59, discriminator_real_loss=1.234, discriminator_fake_loss=1.356, generator_loss=29.61, generator_mel_loss=19.81, generator_kl_loss=2.031, generator_dur_loss=1.666, generator_adv_loss=2.002, generator_feat_match_loss=4.102, over 110.00 samples.], tot_loss[discriminator_loss=2.509, discriminator_real_loss=1.259, discriminator_fake_loss=1.25, generator_loss=30.82, generator_mel_loss=20.09, generator_kl_loss=2.007, generator_dur_loss=1.64, generator_adv_loss=2.369, generator_feat_match_loss=4.708, over 2253.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 15:01:28,877 INFO [train.py:811] (0/4) Start epoch 708 2023-11-14 15:05:04,082 INFO [train.py:811] (0/4) Start epoch 709 2023-11-14 15:05:38,945 INFO [train.py:467] (0/4) Epoch 709, batch 4, global_batch_idx: 26200, batch size: 85, loss[discriminator_loss=2.484, discriminator_real_loss=1.32, discriminator_fake_loss=1.165, generator_loss=30.16, generator_mel_loss=19.93, generator_kl_loss=1.959, generator_dur_loss=1.637, generator_adv_loss=2.205, generator_feat_match_loss=4.426, over 85.00 samples.], tot_loss[discriminator_loss=2.557, discriminator_real_loss=1.324, discriminator_fake_loss=1.233, generator_loss=30.32, generator_mel_loss=20.09, generator_kl_loss=2.027, generator_dur_loss=1.641, generator_adv_loss=2.274, 
generator_feat_match_loss=4.286, over 363.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 15:05:39,463 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 15:05:50,930 INFO [train.py:517] (0/4) Epoch 709, validation: discriminator_loss=2.445, discriminator_real_loss=1.196, discriminator_fake_loss=1.249, generator_loss=31.68, generator_mel_loss=20.94, generator_kl_loss=2.097, generator_dur_loss=1.63, generator_adv_loss=2.2, generator_feat_match_loss=4.81, over 100.00 samples. 2023-11-14 15:05:50,932 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 15:08:42,300 INFO [train.py:811] (0/4) Start epoch 710 2023-11-14 15:10:39,648 INFO [train.py:467] (0/4) Epoch 710, batch 17, global_batch_idx: 26250, batch size: 69, loss[discriminator_loss=2.496, discriminator_real_loss=1.279, discriminator_fake_loss=1.217, generator_loss=30.81, generator_mel_loss=20.15, generator_kl_loss=2.1, generator_dur_loss=1.65, generator_adv_loss=2.471, generator_feat_match_loss=4.441, over 69.00 samples.], tot_loss[discriminator_loss=2.565, discriminator_real_loss=1.294, discriminator_fake_loss=1.271, generator_loss=30.43, generator_mel_loss=19.97, generator_kl_loss=2.021, generator_dur_loss=1.639, generator_adv_loss=2.248, generator_feat_match_loss=4.545, over 1541.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 15:12:18,825 INFO [train.py:811] (0/4) Start epoch 711 2023-11-14 15:15:28,180 INFO [train.py:467] (0/4) Epoch 711, batch 30, global_batch_idx: 26300, batch size: 55, loss[discriminator_loss=2.535, discriminator_real_loss=1.224, discriminator_fake_loss=1.312, generator_loss=31.17, generator_mel_loss=20.54, generator_kl_loss=2.005, generator_dur_loss=1.658, generator_adv_loss=2.277, generator_feat_match_loss=4.691, over 55.00 samples.], tot_loss[discriminator_loss=2.577, discriminator_real_loss=1.304, discriminator_fake_loss=1.272, generator_loss=30.65, generator_mel_loss=20.41, generator_kl_loss=2.033, generator_dur_loss=1.642, generator_adv_loss=2.19, generator_feat_match_loss=4.381, over 2185.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 15:15:56,954 INFO [train.py:811] (0/4) Start epoch 712 2023-11-14 15:19:27,519 INFO [train.py:811] (0/4) Start epoch 713 2023-11-14 15:20:19,623 INFO [train.py:467] (0/4) Epoch 713, batch 6, global_batch_idx: 26350, batch size: 50, loss[discriminator_loss=2.617, discriminator_real_loss=1.344, discriminator_fake_loss=1.272, generator_loss=30.43, generator_mel_loss=20.2, generator_kl_loss=2.1, generator_dur_loss=1.651, generator_adv_loss=2.164, generator_feat_match_loss=4.316, over 50.00 samples.], tot_loss[discriminator_loss=2.559, discriminator_real_loss=1.301, discriminator_fake_loss=1.259, generator_loss=30.56, generator_mel_loss=20.28, generator_kl_loss=2.041, generator_dur_loss=1.648, generator_adv_loss=2.187, generator_feat_match_loss=4.403, over 655.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 15:23:03,218 INFO [train.py:811] (0/4) Start epoch 714 2023-11-14 15:25:03,858 INFO [train.py:467] (0/4) Epoch 714, batch 19, global_batch_idx: 26400, batch size: 65, loss[discriminator_loss=2.555, discriminator_real_loss=1.258, discriminator_fake_loss=1.296, generator_loss=30.51, generator_mel_loss=20.07, generator_kl_loss=2.02, generator_dur_loss=1.638, generator_adv_loss=2.098, generator_feat_match_loss=4.684, over 65.00 samples.], tot_loss[discriminator_loss=2.596, discriminator_real_loss=1.298, 
discriminator_fake_loss=1.297, generator_loss=30.62, generator_mel_loss=20.47, generator_kl_loss=2.051, generator_dur_loss=1.644, generator_adv_loss=2.161, generator_feat_match_loss=4.289, over 1571.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 16.0 2023-11-14 15:25:04,344 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 15:25:15,258 INFO [train.py:517] (0/4) Epoch 714, validation: discriminator_loss=2.513, discriminator_real_loss=1.094, discriminator_fake_loss=1.419, generator_loss=31.62, generator_mel_loss=21.33, generator_kl_loss=2.166, generator_dur_loss=1.632, generator_adv_loss=1.889, generator_feat_match_loss=4.603, over 100.00 samples. 2023-11-14 15:25:15,259 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 15:26:49,408 INFO [train.py:811] (0/4) Start epoch 715 2023-11-14 15:30:04,967 INFO [train.py:467] (0/4) Epoch 715, batch 32, global_batch_idx: 26450, batch size: 61, loss[discriminator_loss=2.59, discriminator_real_loss=1.229, discriminator_fake_loss=1.361, generator_loss=29.88, generator_mel_loss=19.6, generator_kl_loss=2.011, generator_dur_loss=1.644, generator_adv_loss=2.293, generator_feat_match_loss=4.336, over 61.00 samples.], tot_loss[discriminator_loss=2.553, discriminator_real_loss=1.3, discriminator_fake_loss=1.252, generator_loss=30.48, generator_mel_loss=20.09, generator_kl_loss=2.035, generator_dur_loss=1.643, generator_adv_loss=2.254, generator_feat_match_loss=4.456, over 2407.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 15:30:22,386 INFO [train.py:811] (0/4) Start epoch 716 2023-11-14 15:33:51,646 INFO [train.py:811] (0/4) Start epoch 717 2023-11-14 15:34:46,950 INFO [train.py:467] (0/4) Epoch 717, batch 8, global_batch_idx: 26500, batch size: 76, loss[discriminator_loss=2.562, discriminator_real_loss=1.346, discriminator_fake_loss=1.218, generator_loss=30.89, generator_mel_loss=20.48, generator_kl_loss=2.051, generator_dur_loss=1.672, generator_adv_loss=2.391, generator_feat_match_loss=4.293, over 76.00 samples.], tot_loss[discriminator_loss=2.529, discriminator_real_loss=1.254, discriminator_fake_loss=1.275, generator_loss=30.51, generator_mel_loss=20.18, generator_kl_loss=2.001, generator_dur_loss=1.653, generator_adv_loss=2.246, generator_feat_match_loss=4.431, over 649.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 15:37:21,502 INFO [train.py:811] (0/4) Start epoch 718 2023-11-14 15:39:34,793 INFO [train.py:467] (0/4) Epoch 718, batch 21, global_batch_idx: 26550, batch size: 71, loss[discriminator_loss=2.504, discriminator_real_loss=1.291, discriminator_fake_loss=1.214, generator_loss=29.97, generator_mel_loss=20.13, generator_kl_loss=2.065, generator_dur_loss=1.64, generator_adv_loss=2.051, generator_feat_match_loss=4.086, over 71.00 samples.], tot_loss[discriminator_loss=2.574, discriminator_real_loss=1.314, discriminator_fake_loss=1.26, generator_loss=30.43, generator_mel_loss=20.13, generator_kl_loss=2.025, generator_dur_loss=1.644, generator_adv_loss=2.226, generator_feat_match_loss=4.402, over 1559.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 15:40:53,981 INFO [train.py:811] (0/4) Start epoch 719 2023-11-14 15:44:15,633 INFO [train.py:467] (0/4) Epoch 719, batch 34, global_batch_idx: 26600, batch size: 69, loss[discriminator_loss=2.842, discriminator_real_loss=1.453, discriminator_fake_loss=1.389, generator_loss=28.99, generator_mel_loss=19.45, generator_kl_loss=2.007, 
generator_dur_loss=1.65, generator_adv_loss=2.051, generator_feat_match_loss=3.828, over 69.00 samples.], tot_loss[discriminator_loss=2.498, discriminator_real_loss=1.263, discriminator_fake_loss=1.236, generator_loss=30.75, generator_mel_loss=19.99, generator_kl_loss=2.009, generator_dur_loss=1.64, generator_adv_loss=2.378, generator_feat_match_loss=4.736, over 2484.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 15:44:16,119 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 15:44:26,713 INFO [train.py:517] (0/4) Epoch 719, validation: discriminator_loss=2.578, discriminator_real_loss=1.256, discriminator_fake_loss=1.322, generator_loss=30.62, generator_mel_loss=20.52, generator_kl_loss=2.137, generator_dur_loss=1.636, generator_adv_loss=2.032, generator_feat_match_loss=4.302, over 100.00 samples. 2023-11-14 15:44:26,714 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 15:44:37,095 INFO [train.py:811] (0/4) Start epoch 720 2023-11-14 15:48:07,668 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-720.pt 2023-11-14 15:48:10,992 INFO [train.py:811] (0/4) Start epoch 721 2023-11-14 15:49:16,281 INFO [train.py:467] (0/4) Epoch 721, batch 10, global_batch_idx: 26650, batch size: 90, loss[discriminator_loss=2.568, discriminator_real_loss=1.287, discriminator_fake_loss=1.281, generator_loss=29.88, generator_mel_loss=19.93, generator_kl_loss=1.958, generator_dur_loss=1.636, generator_adv_loss=2.121, generator_feat_match_loss=4.238, over 90.00 samples.], tot_loss[discriminator_loss=2.537, discriminator_real_loss=1.27, discriminator_fake_loss=1.266, generator_loss=30.04, generator_mel_loss=20.02, generator_kl_loss=2.026, generator_dur_loss=1.649, generator_adv_loss=2.14, generator_feat_match_loss=4.211, over 693.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 15:51:34,844 INFO [train.py:811] (0/4) Start epoch 722 2023-11-14 15:53:58,890 INFO [train.py:467] (0/4) Epoch 722, batch 23, global_batch_idx: 26700, batch size: 110, loss[discriminator_loss=2.477, discriminator_real_loss=1.333, discriminator_fake_loss=1.144, generator_loss=30.82, generator_mel_loss=20.25, generator_kl_loss=1.999, generator_dur_loss=1.639, generator_adv_loss=2.262, generator_feat_match_loss=4.668, over 110.00 samples.], tot_loss[discriminator_loss=2.577, discriminator_real_loss=1.313, discriminator_fake_loss=1.264, generator_loss=30.55, generator_mel_loss=20.19, generator_kl_loss=2.001, generator_dur_loss=1.644, generator_adv_loss=2.265, generator_feat_match_loss=4.447, over 1771.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 15:55:07,242 INFO [train.py:811] (0/4) Start epoch 723 2023-11-14 15:58:43,671 INFO [train.py:467] (0/4) Epoch 723, batch 36, global_batch_idx: 26750, batch size: 85, loss[discriminator_loss=2.637, discriminator_real_loss=1.383, discriminator_fake_loss=1.255, generator_loss=30.65, generator_mel_loss=19.83, generator_kl_loss=1.957, generator_dur_loss=1.653, generator_adv_loss=2.625, generator_feat_match_loss=4.582, over 85.00 samples.], tot_loss[discriminator_loss=2.566, discriminator_real_loss=1.294, discriminator_fake_loss=1.272, generator_loss=30.54, generator_mel_loss=20.24, generator_kl_loss=2.019, generator_dur_loss=1.645, generator_adv_loss=2.237, generator_feat_match_loss=4.401, over 2679.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 15:58:44,803 INFO [train.py:811] (0/4) Start 
epoch 724 2023-11-14 16:02:17,672 INFO [train.py:811] (0/4) Start epoch 725 2023-11-14 16:03:36,471 INFO [train.py:467] (0/4) Epoch 725, batch 12, global_batch_idx: 26800, batch size: 60, loss[discriminator_loss=2.564, discriminator_real_loss=1.364, discriminator_fake_loss=1.2, generator_loss=30.54, generator_mel_loss=20.43, generator_kl_loss=2.084, generator_dur_loss=1.626, generator_adv_loss=2.242, generator_feat_match_loss=4.152, over 60.00 samples.], tot_loss[discriminator_loss=2.571, discriminator_real_loss=1.301, discriminator_fake_loss=1.27, generator_loss=30.41, generator_mel_loss=20.19, generator_kl_loss=2.024, generator_dur_loss=1.644, generator_adv_loss=2.216, generator_feat_match_loss=4.341, over 872.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 16.0 2023-11-14 16:03:36,975 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 16:03:47,692 INFO [train.py:517] (0/4) Epoch 725, validation: discriminator_loss=2.553, discriminator_real_loss=1.265, discriminator_fake_loss=1.288, generator_loss=30.26, generator_mel_loss=20.49, generator_kl_loss=2.119, generator_dur_loss=1.638, generator_adv_loss=2.055, generator_feat_match_loss=3.957, over 100.00 samples. 2023-11-14 16:03:47,693 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 16:06:02,723 INFO [train.py:811] (0/4) Start epoch 726 2023-11-14 16:08:30,014 INFO [train.py:467] (0/4) Epoch 726, batch 25, global_batch_idx: 26850, batch size: 52, loss[discriminator_loss=2.512, discriminator_real_loss=1.301, discriminator_fake_loss=1.211, generator_loss=30.43, generator_mel_loss=20.27, generator_kl_loss=2.083, generator_dur_loss=1.655, generator_adv_loss=2.088, generator_feat_match_loss=4.336, over 52.00 samples.], tot_loss[discriminator_loss=2.567, discriminator_real_loss=1.297, discriminator_fake_loss=1.27, generator_loss=30.44, generator_mel_loss=20.18, generator_kl_loss=2.008, generator_dur_loss=1.644, generator_adv_loss=2.215, generator_feat_match_loss=4.393, over 1821.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 16:09:33,801 INFO [train.py:811] (0/4) Start epoch 727 2023-11-14 16:13:05,300 INFO [train.py:811] (0/4) Start epoch 728 2023-11-14 16:13:28,059 INFO [train.py:467] (0/4) Epoch 728, batch 1, global_batch_idx: 26900, batch size: 126, loss[discriminator_loss=2.59, discriminator_real_loss=1.272, discriminator_fake_loss=1.318, generator_loss=30.26, generator_mel_loss=20.12, generator_kl_loss=1.974, generator_dur_loss=1.623, generator_adv_loss=2.285, generator_feat_match_loss=4.254, over 126.00 samples.], tot_loss[discriminator_loss=2.584, discriminator_real_loss=1.266, discriminator_fake_loss=1.319, generator_loss=30.14, generator_mel_loss=20.01, generator_kl_loss=1.988, generator_dur_loss=1.627, generator_adv_loss=2.258, generator_feat_match_loss=4.252, over 176.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 16:16:38,069 INFO [train.py:811] (0/4) Start epoch 729 2023-11-14 16:18:11,076 INFO [train.py:467] (0/4) Epoch 729, batch 14, global_batch_idx: 26950, batch size: 76, loss[discriminator_loss=2.705, discriminator_real_loss=1.337, discriminator_fake_loss=1.368, generator_loss=30.96, generator_mel_loss=21.15, generator_kl_loss=1.99, generator_dur_loss=1.65, generator_adv_loss=2.031, generator_feat_match_loss=4.141, over 76.00 samples.], tot_loss[discriminator_loss=2.628, discriminator_real_loss=1.323, discriminator_fake_loss=1.305, generator_loss=30.54, generator_mel_loss=20.55, 
generator_kl_loss=2.037, generator_dur_loss=1.647, generator_adv_loss=2.132, generator_feat_match_loss=4.169, over 937.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 16:20:13,624 INFO [train.py:811] (0/4) Start epoch 730 2023-11-14 16:22:59,290 INFO [train.py:467] (0/4) Epoch 730, batch 27, global_batch_idx: 27000, batch size: 54, loss[discriminator_loss=2.598, discriminator_real_loss=1.348, discriminator_fake_loss=1.251, generator_loss=30.3, generator_mel_loss=20.34, generator_kl_loss=2.006, generator_dur_loss=1.674, generator_adv_loss=2.059, generator_feat_match_loss=4.215, over 54.00 samples.], tot_loss[discriminator_loss=2.642, discriminator_real_loss=1.351, discriminator_fake_loss=1.291, generator_loss=30.35, generator_mel_loss=20.3, generator_kl_loss=2.014, generator_dur_loss=1.647, generator_adv_loss=2.157, generator_feat_match_loss=4.235, over 1981.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 16:22:59,771 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 16:23:10,798 INFO [train.py:517] (0/4) Epoch 730, validation: discriminator_loss=2.568, discriminator_real_loss=1.117, discriminator_fake_loss=1.45, generator_loss=30.82, generator_mel_loss=20.99, generator_kl_loss=2.167, generator_dur_loss=1.636, generator_adv_loss=1.875, generator_feat_match_loss=4.148, over 100.00 samples. 2023-11-14 16:23:10,799 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 16:23:59,554 INFO [train.py:811] (0/4) Start epoch 731 2023-11-14 16:27:30,247 INFO [train.py:811] (0/4) Start epoch 732 2023-11-14 16:28:04,145 INFO [train.py:467] (0/4) Epoch 732, batch 3, global_batch_idx: 27050, batch size: 53, loss[discriminator_loss=2.719, discriminator_real_loss=1.343, discriminator_fake_loss=1.376, generator_loss=30.34, generator_mel_loss=20.58, generator_kl_loss=1.949, generator_dur_loss=1.654, generator_adv_loss=2.092, generator_feat_match_loss=4.07, over 53.00 samples.], tot_loss[discriminator_loss=2.684, discriminator_real_loss=1.394, discriminator_fake_loss=1.29, generator_loss=30.51, generator_mel_loss=20.42, generator_kl_loss=2.002, generator_dur_loss=1.653, generator_adv_loss=2.196, generator_feat_match_loss=4.244, over 226.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 16:30:57,329 INFO [train.py:811] (0/4) Start epoch 733 2023-11-14 16:32:39,210 INFO [train.py:467] (0/4) Epoch 733, batch 16, global_batch_idx: 27100, batch size: 110, loss[discriminator_loss=2.537, discriminator_real_loss=1.379, discriminator_fake_loss=1.158, generator_loss=30.33, generator_mel_loss=20.06, generator_kl_loss=1.957, generator_dur_loss=1.654, generator_adv_loss=2.279, generator_feat_match_loss=4.379, over 110.00 samples.], tot_loss[discriminator_loss=2.56, discriminator_real_loss=1.303, discriminator_fake_loss=1.258, generator_loss=30.64, generator_mel_loss=20.23, generator_kl_loss=2.025, generator_dur_loss=1.64, generator_adv_loss=2.288, generator_feat_match_loss=4.45, over 1270.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2023-11-14 16:34:26,584 INFO [train.py:811] (0/4) Start epoch 734 2023-11-14 16:37:13,394 INFO [train.py:467] (0/4) Epoch 734, batch 29, global_batch_idx: 27150, batch size: 126, loss[discriminator_loss=2.551, discriminator_real_loss=1.255, discriminator_fake_loss=1.296, generator_loss=30.87, generator_mel_loss=20.39, generator_kl_loss=2.043, generator_dur_loss=1.646, generator_adv_loss=2.045, generator_feat_match_loss=4.746, over 
126.00 samples.], tot_loss[discriminator_loss=2.622, discriminator_real_loss=1.363, discriminator_fake_loss=1.259, generator_loss=30.62, generator_mel_loss=19.93, generator_kl_loss=2.011, generator_dur_loss=1.642, generator_adv_loss=2.297, generator_feat_match_loss=4.733, over 2156.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2023-11-14 16:37:54,726 INFO [train.py:811] (0/4) Start epoch 735 2023-11-14 16:41:26,478 INFO [train.py:811] (0/4) Start epoch 736 2023-11-14 16:42:14,508 INFO [train.py:467] (0/4) Epoch 736, batch 5, global_batch_idx: 27200, batch size: 69, loss[discriminator_loss=2.52, discriminator_real_loss=1.289, discriminator_fake_loss=1.229, generator_loss=30.57, generator_mel_loss=20.3, generator_kl_loss=1.994, generator_dur_loss=1.622, generator_adv_loss=2.164, generator_feat_match_loss=4.484, over 69.00 samples.], tot_loss[discriminator_loss=2.536, discriminator_real_loss=1.282, discriminator_fake_loss=1.254, generator_loss=30.57, generator_mel_loss=20.25, generator_kl_loss=2.024, generator_dur_loss=1.642, generator_adv_loss=2.214, generator_feat_match_loss=4.448, over 502.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 16.0 2023-11-14 16:42:15,010 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 16:42:26,536 INFO [train.py:517] (0/4) Epoch 736, validation: discriminator_loss=2.571, discriminator_real_loss=1.193, discriminator_fake_loss=1.377, generator_loss=31.04, generator_mel_loss=20.92, generator_kl_loss=2.167, generator_dur_loss=1.631, generator_adv_loss=1.9, generator_feat_match_loss=4.418, over 100.00 samples. 2023-11-14 16:42:26,537 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 16:45:13,235 INFO [train.py:811] (0/4) Start epoch 737 2023-11-14 16:47:14,262 INFO [train.py:467] (0/4) Epoch 737, batch 18, global_batch_idx: 27250, batch size: 49, loss[discriminator_loss=2.598, discriminator_real_loss=1.255, discriminator_fake_loss=1.342, generator_loss=29.71, generator_mel_loss=19.83, generator_kl_loss=1.962, generator_dur_loss=1.659, generator_adv_loss=2.172, generator_feat_match_loss=4.094, over 49.00 samples.], tot_loss[discriminator_loss=2.557, discriminator_real_loss=1.324, discriminator_fake_loss=1.233, generator_loss=30.51, generator_mel_loss=19.94, generator_kl_loss=1.996, generator_dur_loss=1.64, generator_adv_loss=2.352, generator_feat_match_loss=4.578, over 1314.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2023-11-14 16:48:47,121 INFO [train.py:811] (0/4) Start epoch 738 2023-11-14 16:51:48,710 INFO [train.py:467] (0/4) Epoch 738, batch 31, global_batch_idx: 27300, batch size: 126, loss[discriminator_loss=2.564, discriminator_real_loss=1.221, discriminator_fake_loss=1.344, generator_loss=31.35, generator_mel_loss=20.62, generator_kl_loss=2.077, generator_dur_loss=1.637, generator_adv_loss=2.383, generator_feat_match_loss=4.637, over 126.00 samples.], tot_loss[discriminator_loss=2.582, discriminator_real_loss=1.312, discriminator_fake_loss=1.27, generator_loss=30.65, generator_mel_loss=20.4, generator_kl_loss=2.036, generator_dur_loss=1.642, generator_adv_loss=2.215, generator_feat_match_loss=4.365, over 2439.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2023-11-14 16:52:16,205 INFO [train.py:811] (0/4) Start epoch 739 2023-11-14 16:55:49,341 INFO [train.py:811] (0/4) Start epoch 740 2023-11-14 16:56:38,600 INFO [train.py:467] (0/4) Epoch 740, batch 7, global_batch_idx: 27350, batch size: 101, loss[discriminator_loss=2.547, 
discriminator_real_loss=1.338, discriminator_fake_loss=1.209, generator_loss=30.19, generator_mel_loss=20.12, generator_kl_loss=1.998, generator_dur_loss=1.65, generator_adv_loss=2.191, generator_feat_match_loss=4.23, over 101.00 samples.], tot_loss[discriminator_loss=2.607, discriminator_real_loss=1.3, discriminator_fake_loss=1.307, generator_loss=30.12, generator_mel_loss=20.16, generator_kl_loss=2.017, generator_dur_loss=1.64, generator_adv_loss=2.115, generator_feat_match_loss=4.187, over 594.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2023-11-14 16:59:22,073 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-740.pt 2023-11-14 16:59:25,518 INFO [train.py:811] (0/4) Start epoch 741 2023-11-14 17:01:30,389 INFO [train.py:467] (0/4) Epoch 741, batch 20, global_batch_idx: 27400, batch size: 153, loss[discriminator_loss=2.59, discriminator_real_loss=1.334, discriminator_fake_loss=1.255, generator_loss=30.77, generator_mel_loss=20.14, generator_kl_loss=2.03, generator_dur_loss=1.604, generator_adv_loss=2.381, generator_feat_match_loss=4.613, over 153.00 samples.], tot_loss[discriminator_loss=2.544, discriminator_real_loss=1.281, discriminator_fake_loss=1.262, generator_loss=30.53, generator_mel_loss=20.12, generator_kl_loss=2.031, generator_dur_loss=1.636, generator_adv_loss=2.242, generator_feat_match_loss=4.496, over 1533.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2023-11-14 17:01:30,876 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 17:01:41,522 INFO [train.py:517] (0/4) Epoch 741, validation: discriminator_loss=2.556, discriminator_real_loss=1.263, discriminator_fake_loss=1.293, generator_loss=31.45, generator_mel_loss=21.03, generator_kl_loss=2.177, generator_dur_loss=1.634, generator_adv_loss=2.05, generator_feat_match_loss=4.559, over 100.00 samples. 
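Two periodic behaviours are visible in this stretch of the log: checkpoints are written every 20 epochs (epoch-660.pt through epoch-740.pt under vits/exp-g2p-conformer-text-encoder-new), and grad_scale flips between 8.0 and 16.0, the signature of a dynamic fp16 loss scaler that doubles its scale after a run of overflow-free steps and halves it on overflow. The sketch below shows both patterns with torch.cuda.amp.GradScaler; SAVE_EVERY_N, the toy model, and the loop bounds are illustrative assumptions, not code from train.py.

from pathlib import Path

import torch
import torch.nn as nn

EXP_DIR = Path("vits/exp-g2p-conformer-text-encoder-new")  # dir in the "Saving checkpoint" lines
SAVE_EVERY_N = 20  # checkpoints above land at epochs 660, 680, 700, 720, 740
EXP_DIR.mkdir(parents=True, exist_ok=True)

model = nn.Linear(8, 8)  # toy stand-in for the generator/discriminator pair
optimizer = torch.optim.AdamW(model.parameters(), lr=1.83e-4)  # lr as in cur_lr_g above
# The scaler doubles its scale after a long run of overflow-free fp16 steps and
# halves it when gradients overflow, hence grad_scale alternating 8.0 <-> 16.0.
scaler = torch.cuda.amp.GradScaler(enabled=torch.cuda.is_available())

for epoch in range(660, 741):
    optimizer.zero_grad()
    # autocast around the forward pass is omitted for brevity
    loss = model(torch.randn(4, 8)).pow(2).mean()
    scaler.scale(loss).backward()    # scale the loss before backprop
    scaler.step(optimizer)           # unscales grads; skips the step on overflow
    scaler.update()                  # grow or back off the scale
    grad_scale = scaler.get_scale()  # the value logged as "grad_scale: ..."
    if epoch % SAVE_EVERY_N == 0:
        torch.save({"model": model.state_dict()}, EXP_DIR / f"epoch-{epoch}.pt")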
2023-11-14 17:01:41,524 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 17:03:10,388 INFO [train.py:811] (0/4) Start epoch 742
2023-11-14 17:06:26,355 INFO [train.py:467] (0/4) Epoch 742, batch 33, global_batch_idx: 27450, batch size: 85, loss[discriminator_loss=2.631, discriminator_real_loss=1.326, discriminator_fake_loss=1.305, generator_loss=29.98, generator_mel_loss=19.96, generator_kl_loss=2.117, generator_dur_loss=1.641, generator_adv_loss=2.025, generator_feat_match_loss=4.246, over 85.00 samples.], tot_loss[discriminator_loss=2.558, discriminator_real_loss=1.297, discriminator_fake_loss=1.261, generator_loss=30.55, generator_mel_loss=20.05, generator_kl_loss=2.011, generator_dur_loss=1.641, generator_adv_loss=2.294, generator_feat_match_loss=4.556, over 2498.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2023-11-14 17:06:44,783 INFO [train.py:811] (0/4) Start epoch 743
2023-11-14 17:10:20,522 INFO [train.py:811] (0/4) Start epoch 744
2023-11-14 17:11:24,024 INFO [train.py:467] (0/4) Epoch 744, batch 9, global_batch_idx: 27500, batch size: 65, loss[discriminator_loss=2.65, discriminator_real_loss=1.441, discriminator_fake_loss=1.209, generator_loss=30.72, generator_mel_loss=20.85, generator_kl_loss=1.964, generator_dur_loss=1.657, generator_adv_loss=2.072, generator_feat_match_loss=4.18, over 65.00 samples.], tot_loss[discriminator_loss=2.601, discriminator_real_loss=1.32, discriminator_fake_loss=1.281, generator_loss=30.4, generator_mel_loss=20.37, generator_kl_loss=2.006, generator_dur_loss=1.649, generator_adv_loss=2.131, generator_feat_match_loss=4.242, over 724.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2023-11-14 17:13:54,221 INFO [train.py:811] (0/4) Start epoch 745
2023-11-14 17:16:09,446 INFO [train.py:467] (0/4) Epoch 745, batch 22, global_batch_idx: 27550, batch size: 153, loss[discriminator_loss=2.814, discriminator_real_loss=1.367, discriminator_fake_loss=1.447, generator_loss=29.4, generator_mel_loss=20.15, generator_kl_loss=1.937, generator_dur_loss=1.625, generator_adv_loss=1.869, generator_feat_match_loss=3.812, over 153.00 samples.], tot_loss[discriminator_loss=2.662, discriminator_real_loss=1.374, discriminator_fake_loss=1.288, generator_loss=30.62, generator_mel_loss=20.23, generator_kl_loss=1.997, generator_dur_loss=1.634, generator_adv_loss=2.296, generator_feat_match_loss=4.468, over 1821.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2023-11-14 17:17:23,415 INFO [train.py:811] (0/4) Start epoch 746
2023-11-14 17:20:45,013 INFO [train.py:467] (0/4) Epoch 746, batch 35, global_batch_idx: 27600, batch size: 52, loss[discriminator_loss=2.561, discriminator_real_loss=1.305, discriminator_fake_loss=1.256, generator_loss=30.03, generator_mel_loss=19.82, generator_kl_loss=1.937, generator_dur_loss=1.656, generator_adv_loss=2.248, generator_feat_match_loss=4.367, over 52.00 samples.], tot_loss[discriminator_loss=2.589, discriminator_real_loss=1.307, discriminator_fake_loss=1.282, generator_loss=30.39, generator_mel_loss=20.21, generator_kl_loss=2.04, generator_dur_loss=1.644, generator_adv_loss=2.179, generator_feat_match_loss=4.317, over 2553.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 16.0
2023-11-14 17:20:45,638 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 17:20:56,459 INFO [train.py:517] (0/4) Epoch 746, validation: discriminator_loss=2.5, discriminator_real_loss=1.217, discriminator_fake_loss=1.283, generator_loss=30.84, generator_mel_loss=20.69, generator_kl_loss=2.212, generator_dur_loss=1.631, generator_adv_loss=2.049, generator_feat_match_loss=4.261, over 100.00 samples.
2023-11-14 17:20:56,460 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 17:21:03,707 INFO [train.py:811] (0/4) Start epoch 747
2023-11-14 17:24:28,953 INFO [train.py:811] (0/4) Start epoch 748
2023-11-14 17:25:48,692 INFO [train.py:467] (0/4) Epoch 748, batch 11, global_batch_idx: 27650, batch size: 53, loss[discriminator_loss=2.586, discriminator_real_loss=1.183, discriminator_fake_loss=1.402, generator_loss=30.04, generator_mel_loss=19.87, generator_kl_loss=2.022, generator_dur_loss=1.635, generator_adv_loss=2.227, generator_feat_match_loss=4.289, over 53.00 samples.], tot_loss[discriminator_loss=2.6, discriminator_real_loss=1.334, discriminator_fake_loss=1.265, generator_loss=30.2, generator_mel_loss=19.99, generator_kl_loss=1.968, generator_dur_loss=1.639, generator_adv_loss=2.22, generator_feat_match_loss=4.383, over 888.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 16.0
2023-11-14 17:28:05,970 INFO [train.py:811] (0/4) Start epoch 749
2023-11-14 17:30:37,282 INFO [train.py:467] (0/4) Epoch 749, batch 24, global_batch_idx: 27700, batch size: 55, loss[discriminator_loss=2.453, discriminator_real_loss=1.367, discriminator_fake_loss=1.086, generator_loss=31.23, generator_mel_loss=20.36, generator_kl_loss=1.985, generator_dur_loss=1.634, generator_adv_loss=2.375, generator_feat_match_loss=4.875, over 55.00 samples.], tot_loss[discriminator_loss=2.571, discriminator_real_loss=1.302, discriminator_fake_loss=1.269, generator_loss=30.45, generator_mel_loss=20.11, generator_kl_loss=2.001, generator_dur_loss=1.635, generator_adv_loss=2.235, generator_feat_match_loss=4.472, over 2107.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 16.0
2023-11-14 17:31:40,091 INFO [train.py:811] (0/4) Start epoch 750
2023-11-14 17:35:12,304 INFO [train.py:811] (0/4) Start epoch 751
2023-11-14 17:35:29,033 INFO [train.py:467] (0/4) Epoch 751, batch 0, global_batch_idx: 27750, batch size: 73, loss[discriminator_loss=2.564, discriminator_real_loss=1.418, discriminator_fake_loss=1.146, generator_loss=30.76, generator_mel_loss=20.23, generator_kl_loss=1.998, generator_dur_loss=1.632, generator_adv_loss=2.273, generator_feat_match_loss=4.629, over 73.00 samples.], tot_loss[discriminator_loss=2.564, discriminator_real_loss=1.418, discriminator_fake_loss=1.146, generator_loss=30.76, generator_mel_loss=20.23, generator_kl_loss=1.998, generator_dur_loss=1.632, generator_adv_loss=2.273, generator_feat_match_loss=4.629, over 73.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 16.0
2023-11-14 17:38:41,340 INFO [train.py:811] (0/4) Start epoch 752
2023-11-14 17:40:07,599 INFO [train.py:467] (0/4) Epoch 752, batch 13, global_batch_idx: 27800, batch size: 64, loss[discriminator_loss=2.682, discriminator_real_loss=1.308, discriminator_fake_loss=1.374, generator_loss=30.09, generator_mel_loss=20.16, generator_kl_loss=2.045, generator_dur_loss=1.624, generator_adv_loss=2.072, generator_feat_match_loss=4.188, over 64.00 samples.], tot_loss[discriminator_loss=2.57, discriminator_real_loss=1.29, discriminator_fake_loss=1.28, generator_loss=30.5, generator_mel_loss=20.13, generator_kl_loss=2.032, generator_dur_loss=1.64, generator_adv_loss=2.234, generator_feat_match_loss=4.471, over 833.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 16.0
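Note: each "Computing validation loss" pass above (and immediately below) takes roughly ten seconds and always reports "over 100.00 samples.", i.e. a small fixed dev set. A minimal sketch of what such a pass typically looks like; the loss_fn callable and its return shape are assumptions for illustration, not read from train.py:

import torch

def compute_validation_loss(model, dev_loader, loss_fn):
    # Sample-weighted average of each loss term over the fixed dev set.
    # loss_fn is an assumed helper returning ({name: scalar tensor}, batch_size).
    model.eval()
    totals, num_samples = {}, 0
    with torch.no_grad():
        for batch in dev_loader:
            losses, batch_size = loss_fn(model, batch)
            for name, value in losses.items():
                totals[name] = totals.get(name, 0.0) + value.item() * batch_size
            num_samples += batch_size
    model.train()
    return {name: total / num_samples for name, total in totals.items()}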
2023-11-14 17:40:08,154 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 17:40:18,589 INFO [train.py:517] (0/4) Epoch 752, validation: discriminator_loss=2.55, discriminator_real_loss=1.234, discriminator_fake_loss=1.316, generator_loss=31.17, generator_mel_loss=20.9, generator_kl_loss=2.111, generator_dur_loss=1.63, generator_adv_loss=2.089, generator_feat_match_loss=4.446, over 100.00 samples.
2023-11-14 17:40:18,590 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 17:42:26,032 INFO [train.py:811] (0/4) Start epoch 753
2023-11-14 17:45:01,995 INFO [train.py:467] (0/4) Epoch 753, batch 26, global_batch_idx: 27850, batch size: 79, loss[discriminator_loss=2.605, discriminator_real_loss=1.413, discriminator_fake_loss=1.191, generator_loss=29.48, generator_mel_loss=19.59, generator_kl_loss=1.963, generator_dur_loss=1.634, generator_adv_loss=2.143, generator_feat_match_loss=4.152, over 79.00 samples.], tot_loss[discriminator_loss=2.557, discriminator_real_loss=1.312, discriminator_fake_loss=1.245, generator_loss=30.43, generator_mel_loss=19.98, generator_kl_loss=2.005, generator_dur_loss=1.641, generator_adv_loss=2.269, generator_feat_match_loss=4.54, over 1899.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2023-11-14 17:45:57,947 INFO [train.py:811] (0/4) Start epoch 754
2023-11-14 17:49:36,578 INFO [train.py:811] (0/4) Start epoch 755
2023-11-14 17:50:04,937 INFO [train.py:467] (0/4) Epoch 755, batch 2, global_batch_idx: 27900, batch size: 101, loss[discriminator_loss=2.625, discriminator_real_loss=1.435, discriminator_fake_loss=1.189, generator_loss=30.32, generator_mel_loss=20.08, generator_kl_loss=2.009, generator_dur_loss=1.63, generator_adv_loss=2.184, generator_feat_match_loss=4.422, over 101.00 samples.], tot_loss[discriminator_loss=2.591, discriminator_real_loss=1.381, discriminator_fake_loss=1.21, generator_loss=30.49, generator_mel_loss=20.17, generator_kl_loss=2.005, generator_dur_loss=1.632, generator_adv_loss=2.336, generator_feat_match_loss=4.343, over 286.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2023-11-14 17:53:10,129 INFO [train.py:811] (0/4) Start epoch 756
2023-11-14 17:54:47,959 INFO [train.py:467] (0/4) Epoch 756, batch 15, global_batch_idx: 27950, batch size: 110, loss[discriminator_loss=2.762, discriminator_real_loss=1.349, discriminator_fake_loss=1.412, generator_loss=29.83, generator_mel_loss=20.27, generator_kl_loss=2.013, generator_dur_loss=1.639, generator_adv_loss=2.033, generator_feat_match_loss=3.879, over 110.00 samples.], tot_loss[discriminator_loss=2.539, discriminator_real_loss=1.281, discriminator_fake_loss=1.258, generator_loss=30.89, generator_mel_loss=20.26, generator_kl_loss=2.026, generator_dur_loss=1.642, generator_adv_loss=2.311, generator_feat_match_loss=4.655, over 1122.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2023-11-14 17:56:40,942 INFO [train.py:811] (0/4) Start epoch 757
2023-11-14 17:59:26,254 INFO [train.py:467] (0/4) Epoch 757, batch 28, global_batch_idx: 28000, batch size: 50, loss[discriminator_loss=2.602, discriminator_real_loss=1.291, discriminator_fake_loss=1.312, generator_loss=29.68, generator_mel_loss=19.74, generator_kl_loss=2.015, generator_dur_loss=1.618, generator_adv_loss=2.219, generator_feat_match_loss=4.082, over 50.00 samples.], tot_loss[discriminator_loss=2.535, discriminator_real_loss=1.277, discriminator_fake_loss=1.258, generator_loss=30.39, generator_mel_loss=20.01, generator_kl_loss=2.033, generator_dur_loss=1.647, generator_adv_loss=2.219, generator_feat_match_loss=4.482, over 1990.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 16.0
2023-11-14 17:59:26,743 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 17:59:37,435 INFO [train.py:517] (0/4) Epoch 757, validation: discriminator_loss=2.594, discriminator_real_loss=1.328, discriminator_fake_loss=1.265, generator_loss=30.85, generator_mel_loss=20.74, generator_kl_loss=2.225, generator_dur_loss=1.632, generator_adv_loss=2.084, generator_feat_match_loss=4.173, over 100.00 samples.
2023-11-14 17:59:37,436 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 18:00:27,448 INFO [train.py:811] (0/4) Start epoch 758
2023-11-14 18:03:56,886 INFO [train.py:811] (0/4) Start epoch 759
2023-11-14 18:04:33,814 INFO [train.py:467] (0/4) Epoch 759, batch 4, global_batch_idx: 28050, batch size: 51, loss[discriminator_loss=2.771, discriminator_real_loss=1.366, discriminator_fake_loss=1.405, generator_loss=29.82, generator_mel_loss=20.13, generator_kl_loss=2.071, generator_dur_loss=1.626, generator_adv_loss=2.031, generator_feat_match_loss=3.959, over 51.00 samples.], tot_loss[discriminator_loss=2.714, discriminator_real_loss=1.368, discriminator_fake_loss=1.345, generator_loss=30.63, generator_mel_loss=20.54, generator_kl_loss=2.018, generator_dur_loss=1.651, generator_adv_loss=2.143, generator_feat_match_loss=4.272, over 311.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 16.0
2023-11-14 18:07:31,592 INFO [train.py:811] (0/4) Start epoch 760
2023-11-14 18:09:31,056 INFO [train.py:467] (0/4) Epoch 760, batch 17, global_batch_idx: 28100, batch size: 51, loss[discriminator_loss=2.633, discriminator_real_loss=1.356, discriminator_fake_loss=1.277, generator_loss=31.05, generator_mel_loss=20.5, generator_kl_loss=1.993, generator_dur_loss=1.624, generator_adv_loss=2.262, generator_feat_match_loss=4.668, over 51.00 samples.], tot_loss[discriminator_loss=2.578, discriminator_real_loss=1.32, discriminator_fake_loss=1.258, generator_loss=30.45, generator_mel_loss=20.1, generator_kl_loss=2.025, generator_dur_loss=1.629, generator_adv_loss=2.228, generator_feat_match_loss=4.465, over 1421.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2023-11-14 18:11:07,365 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-760.pt
2023-11-14 18:11:10,974 INFO [train.py:811] (0/4) Start epoch 761
2023-11-14 18:14:03,879 INFO [train.py:467] (0/4) Epoch 761, batch 30, global_batch_idx: 28150, batch size: 63, loss[discriminator_loss=2.422, discriminator_real_loss=1.261, discriminator_fake_loss=1.161, generator_loss=30.64, generator_mel_loss=19.97, generator_kl_loss=1.95, generator_dur_loss=1.656, generator_adv_loss=2.383, generator_feat_match_loss=4.684, over 63.00 samples.], tot_loss[discriminator_loss=2.52, discriminator_real_loss=1.284, discriminator_fake_loss=1.236, generator_loss=30.45, generator_mel_loss=20, generator_kl_loss=2.018, generator_dur_loss=1.636, generator_adv_loss=2.27, generator_feat_match_loss=4.524, over 2093.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2023-11-14 18:14:41,852 INFO [train.py:811] (0/4) Start epoch 762
2023-11-14 18:18:15,761 INFO [train.py:811] (0/4) Start epoch 763
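Note: in each batch record, loss[...] is the current batch while tot_loss[...] is reported "over" a sample count that grows within the epoch and resets at the next "Start epoch" line (e.g. Epoch 761, batch 30 above: 63 samples in the batch, 2093 accumulated). A minimal sketch of such a sample-weighted running average; the class and method names are illustrative, not taken from train.py:

class RunningLoss:
    # Sample-weighted running average, reset at the start of each epoch.
    def __init__(self):
        self.sums = {}
        self.num_samples = 0

    def update(self, losses: dict, batch_size: int):
        for name, value in losses.items():
            self.sums[name] = self.sums.get(name, 0.0) + value * batch_size
        self.num_samples += batch_size

    def averages(self) -> dict:
        return {name: s / self.num_samples for name, s in self.sums.items()}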
2023-11-14 18:19:10,711 INFO [train.py:467] (0/4) Epoch 763, batch 6, global_batch_idx: 28200, batch size: 73, loss[discriminator_loss=2.59, discriminator_real_loss=1.426, discriminator_fake_loss=1.165, generator_loss=30.21, generator_mel_loss=19.49, generator_kl_loss=2.102, generator_dur_loss=1.634, generator_adv_loss=2.355, generator_feat_match_loss=4.629, over 73.00 samples.], tot_loss[discriminator_loss=2.568, discriminator_real_loss=1.298, discriminator_fake_loss=1.27, generator_loss=30.42, generator_mel_loss=19.86, generator_kl_loss=2.058, generator_dur_loss=1.631, generator_adv_loss=2.27, generator_feat_match_loss=4.597, over 576.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2023-11-14 18:19:11,190 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 18:19:22,746 INFO [train.py:517] (0/4) Epoch 763, validation: discriminator_loss=2.437, discriminator_real_loss=1.185, discriminator_fake_loss=1.252, generator_loss=31.03, generator_mel_loss=20.34, generator_kl_loss=2.192, generator_dur_loss=1.639, generator_adv_loss=2.205, generator_feat_match_loss=4.655, over 100.00 samples.
2023-11-14 18:19:22,748 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 18:22:02,822 INFO [train.py:811] (0/4) Start epoch 764
2023-11-14 18:23:56,322 INFO [train.py:467] (0/4) Epoch 764, batch 19, global_batch_idx: 28250, batch size: 54, loss[discriminator_loss=2.672, discriminator_real_loss=1.418, discriminator_fake_loss=1.253, generator_loss=30.69, generator_mel_loss=20.48, generator_kl_loss=2.043, generator_dur_loss=1.668, generator_adv_loss=2.164, generator_feat_match_loss=4.328, over 54.00 samples.], tot_loss[discriminator_loss=2.57, discriminator_real_loss=1.304, discriminator_fake_loss=1.266, generator_loss=30.48, generator_mel_loss=20.18, generator_kl_loss=2.033, generator_dur_loss=1.642, generator_adv_loss=2.218, generator_feat_match_loss=4.406, over 1293.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2023-11-14 18:25:32,403 INFO [train.py:811] (0/4) Start epoch 765
2023-11-14 18:28:42,572 INFO [train.py:467] (0/4) Epoch 765, batch 32, global_batch_idx: 28300, batch size: 49, loss[discriminator_loss=2.578, discriminator_real_loss=1.268, discriminator_fake_loss=1.312, generator_loss=29.76, generator_mel_loss=19.59, generator_kl_loss=1.97, generator_dur_loss=1.623, generator_adv_loss=2.275, generator_feat_match_loss=4.297, over 49.00 samples.], tot_loss[discriminator_loss=2.594, discriminator_real_loss=1.308, discriminator_fake_loss=1.286, generator_loss=30.45, generator_mel_loss=20.13, generator_kl_loss=2.029, generator_dur_loss=1.637, generator_adv_loss=2.254, generator_feat_match_loss=4.408, over 2289.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2023-11-14 18:29:01,022 INFO [train.py:811] (0/4) Start epoch 766
2023-11-14 18:32:25,211 INFO [train.py:811] (0/4) Start epoch 767
2023-11-14 18:33:20,964 INFO [train.py:467] (0/4) Epoch 767, batch 8, global_batch_idx: 28350, batch size: 110, loss[discriminator_loss=2.541, discriminator_real_loss=1.22, discriminator_fake_loss=1.321, generator_loss=31.4, generator_mel_loss=20.67, generator_kl_loss=2.043, generator_dur_loss=1.659, generator_adv_loss=2.373, generator_feat_match_loss=4.66, over 110.00 samples.], tot_loss[discriminator_loss=2.571, discriminator_real_loss=1.292, discriminator_fake_loss=1.279, generator_loss=30.45, generator_mel_loss=20.15, generator_kl_loss=2.042, generator_dur_loss=1.645, generator_adv_loss=2.211, generator_feat_match_loss=4.404, over 595.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2023-11-14 18:35:56,279 INFO [train.py:811] (0/4) Start epoch 768
2023-11-14 18:38:02,857 INFO [train.py:467] (0/4) Epoch 768, batch 21, global_batch_idx: 28400, batch size: 73, loss[discriminator_loss=2.613, discriminator_real_loss=1.314, discriminator_fake_loss=1.3, generator_loss=30, generator_mel_loss=19.86, generator_kl_loss=2.042, generator_dur_loss=1.622, generator_adv_loss=2.188, generator_feat_match_loss=4.285, over 73.00 samples.], tot_loss[discriminator_loss=2.576, discriminator_real_loss=1.289, discriminator_fake_loss=1.288, generator_loss=30.75, generator_mel_loss=20.29, generator_kl_loss=2.045, generator_dur_loss=1.639, generator_adv_loss=2.25, generator_feat_match_loss=4.528, over 1571.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 16.0
2023-11-14 18:38:03,351 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 18:38:14,259 INFO [train.py:517] (0/4) Epoch 768, validation: discriminator_loss=2.528, discriminator_real_loss=1.15, discriminator_fake_loss=1.378, generator_loss=31.68, generator_mel_loss=20.95, generator_kl_loss=2.279, generator_dur_loss=1.63, generator_adv_loss=2.086, generator_feat_match_loss=4.736, over 100.00 samples.
2023-11-14 18:38:14,260 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 18:39:42,211 INFO [train.py:811] (0/4) Start epoch 769
2023-11-14 18:43:10,861 INFO [train.py:467] (0/4) Epoch 769, batch 34, global_batch_idx: 28450, batch size: 60, loss[discriminator_loss=2.57, discriminator_real_loss=1.285, discriminator_fake_loss=1.284, generator_loss=29.76, generator_mel_loss=19.65, generator_kl_loss=2.071, generator_dur_loss=1.64, generator_adv_loss=2.27, generator_feat_match_loss=4.125, over 60.00 samples.], tot_loss[discriminator_loss=2.529, discriminator_real_loss=1.284, discriminator_fake_loss=1.245, generator_loss=30.63, generator_mel_loss=20.03, generator_kl_loss=1.994, generator_dur_loss=1.635, generator_adv_loss=2.323, generator_feat_match_loss=4.643, over 2741.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 16.0
2023-11-14 18:43:19,833 INFO [train.py:811] (0/4) Start epoch 770
2023-11-14 18:46:55,424 INFO [train.py:811] (0/4) Start epoch 771
2023-11-14 18:48:01,760 INFO [train.py:467] (0/4) Epoch 771, batch 10, global_batch_idx: 28500, batch size: 52, loss[discriminator_loss=2.416, discriminator_real_loss=1.2, discriminator_fake_loss=1.216, generator_loss=31.67, generator_mel_loss=20.38, generator_kl_loss=1.986, generator_dur_loss=1.622, generator_adv_loss=2.359, generator_feat_match_loss=5.324, over 52.00 samples.], tot_loss[discriminator_loss=2.502, discriminator_real_loss=1.275, discriminator_fake_loss=1.227, generator_loss=30.84, generator_mel_loss=20.23, generator_kl_loss=2.009, generator_dur_loss=1.643, generator_adv_loss=2.299, generator_feat_match_loss=4.662, over 707.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2023-11-14 18:50:31,270 INFO [train.py:811] (0/4) Start epoch 772
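Note: grad_scale bounces between 8.0 and 16.0 in these records, which is the signature of dynamic loss scaling for fp16 training: the scale doubles after a run of overflow-free steps and is cut back when gradients overflow. A minimal sketch of the usual torch.cuda.amp pattern; the loader, optimizer, and compute_loss names are placeholders, and whether train.py uses exactly these defaults is an assumption:

import torch

scaler = torch.cuda.amp.GradScaler()      # default: doubles the scale after a
                                          # stretch of finite-gradient steps,
                                          # halves it when inf/nan appears

for batch in train_loader:                # placeholder loader
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():
        loss = compute_loss(batch)        # placeholder loss computation
    scaler.scale(loss).backward()         # backprop on the scaled loss
    scaler.step(optimizer)                # unscales grads; skips step on overflow
    scaler.update()                       # adjusts the scale logged as grad_scale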
2023-11-14 18:52:56,372 INFO [train.py:467] (0/4) Epoch 772, batch 23, global_batch_idx: 28550, batch size: 56, loss[discriminator_loss=2.586, discriminator_real_loss=1.46, discriminator_fake_loss=1.125, generator_loss=31.12, generator_mel_loss=20.46, generator_kl_loss=2.114, generator_dur_loss=1.625, generator_adv_loss=2.332, generator_feat_match_loss=4.594, over 56.00 samples.], tot_loss[discriminator_loss=2.57, discriminator_real_loss=1.309, discriminator_fake_loss=1.261, generator_loss=30.52, generator_mel_loss=19.85, generator_kl_loss=2.017, generator_dur_loss=1.635, generator_adv_loss=2.337, generator_feat_match_loss=4.679, over 1699.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2023-11-14 18:54:04,230 INFO [train.py:811] (0/4) Start epoch 773
2023-11-14 18:57:42,133 INFO [train.py:467] (0/4) Epoch 773, batch 36, global_batch_idx: 28600, batch size: 110, loss[discriminator_loss=2.555, discriminator_real_loss=1.279, discriminator_fake_loss=1.274, generator_loss=31.11, generator_mel_loss=20.36, generator_kl_loss=2.013, generator_dur_loss=1.607, generator_adv_loss=2.381, generator_feat_match_loss=4.742, over 110.00 samples.], tot_loss[discriminator_loss=2.593, discriminator_real_loss=1.309, discriminator_fake_loss=1.284, generator_loss=30.59, generator_mel_loss=20.31, generator_kl_loss=2.019, generator_dur_loss=1.639, generator_adv_loss=2.19, generator_feat_match_loss=4.425, over 2797.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2023-11-14 18:57:42,602 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 18:57:53,392 INFO [train.py:517] (0/4) Epoch 773, validation: discriminator_loss=2.675, discriminator_real_loss=1.355, discriminator_fake_loss=1.32, generator_loss=30.56, generator_mel_loss=20.63, generator_kl_loss=2.129, generator_dur_loss=1.633, generator_adv_loss=2.041, generator_feat_match_loss=4.135, over 100.00 samples.
2023-11-14 18:57:53,393 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 18:57:54,045 INFO [train.py:811] (0/4) Start epoch 774
2023-11-14 19:01:28,294 INFO [train.py:811] (0/4) Start epoch 775
2023-11-14 19:02:44,144 INFO [train.py:467] (0/4) Epoch 775, batch 12, global_batch_idx: 28650, batch size: 110, loss[discriminator_loss=2.438, discriminator_real_loss=1.244, discriminator_fake_loss=1.194, generator_loss=31.25, generator_mel_loss=20.12, generator_kl_loss=2.042, generator_dur_loss=1.634, generator_adv_loss=2.406, generator_feat_match_loss=5.047, over 110.00 samples.], tot_loss[discriminator_loss=2.573, discriminator_real_loss=1.314, discriminator_fake_loss=1.259, generator_loss=30.37, generator_mel_loss=19.84, generator_kl_loss=1.994, generator_dur_loss=1.64, generator_adv_loss=2.338, generator_feat_match_loss=4.55, over 900.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2023-11-14 19:05:01,559 INFO [train.py:811] (0/4) Start epoch 776
2023-11-14 19:07:36,707 INFO [train.py:467] (0/4) Epoch 776, batch 25, global_batch_idx: 28700, batch size: 67, loss[discriminator_loss=2.668, discriminator_real_loss=1.191, discriminator_fake_loss=1.477, generator_loss=30.81, generator_mel_loss=20.25, generator_kl_loss=1.998, generator_dur_loss=1.655, generator_adv_loss=2.23, generator_feat_match_loss=4.68, over 67.00 samples.], tot_loss[discriminator_loss=2.541, discriminator_real_loss=1.28, discriminator_fake_loss=1.261, generator_loss=30.71, generator_mel_loss=20.16, generator_kl_loss=2.039, generator_dur_loss=1.637, generator_adv_loss=2.254, generator_feat_match_loss=4.621, over 1824.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2023-11-14 19:08:35,906 INFO [train.py:811] (0/4) Start epoch 777
2023-11-14 19:12:11,940 INFO [train.py:811] (0/4) Start epoch 778
2023-11-14 19:12:31,628 INFO [train.py:467] (0/4) Epoch 778, batch 1, global_batch_idx: 28750, batch size: 60, loss[discriminator_loss=2.469, discriminator_real_loss=1.224, discriminator_fake_loss=1.245, generator_loss=31.32, generator_mel_loss=20.27, generator_kl_loss=2.042, generator_dur_loss=1.62, generator_adv_loss=2.443, generator_feat_match_loss=4.953, over 60.00 samples.], tot_loss[discriminator_loss=2.518, discriminator_real_loss=1.321, discriminator_fake_loss=1.197, generator_loss=30.65, generator_mel_loss=19.9, generator_kl_loss=1.986, generator_dur_loss=1.619, generator_adv_loss=2.377, generator_feat_match_loss=4.776, over 141.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 19:15:47,953 INFO [train.py:811] (0/4) Start epoch 779
2023-11-14 19:17:21,549 INFO [train.py:467] (0/4) Epoch 779, batch 14, global_batch_idx: 28800, batch size: 63, loss[discriminator_loss=2.363, discriminator_real_loss=1.22, discriminator_fake_loss=1.143, generator_loss=31.48, generator_mel_loss=20.34, generator_kl_loss=2.027, generator_dur_loss=1.665, generator_adv_loss=2.477, generator_feat_match_loss=4.977, over 63.00 samples.], tot_loss[discriminator_loss=2.517, discriminator_real_loss=1.257, discriminator_fake_loss=1.259, generator_loss=30.96, generator_mel_loss=20.19, generator_kl_loss=2.013, generator_dur_loss=1.636, generator_adv_loss=2.329, generator_feat_match_loss=4.79, over 997.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 16.0
2023-11-14 19:17:22,082 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 19:17:32,665 INFO [train.py:517] (0/4) Epoch 779, validation: discriminator_loss=2.42, discriminator_real_loss=1.15, discriminator_fake_loss=1.27, generator_loss=30.91, generator_mel_loss=20.48, generator_kl_loss=2.196, generator_dur_loss=1.632, generator_adv_loss=2.052, generator_feat_match_loss=4.545, over 100.00 samples.
2023-11-14 19:17:32,666 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 19:19:25,334 INFO [train.py:811] (0/4) Start epoch 780
2023-11-14 19:21:58,809 INFO [train.py:467] (0/4) Epoch 780, batch 27, global_batch_idx: 28850, batch size: 59, loss[discriminator_loss=2.494, discriminator_real_loss=1.256, discriminator_fake_loss=1.238, generator_loss=30.29, generator_mel_loss=19.82, generator_kl_loss=2.137, generator_dur_loss=1.644, generator_adv_loss=2.104, generator_feat_match_loss=4.586, over 59.00 samples.], tot_loss[discriminator_loss=2.494, discriminator_real_loss=1.256, discriminator_fake_loss=1.238, generator_loss=30.7, generator_mel_loss=19.92, generator_kl_loss=2.008, generator_dur_loss=1.639, generator_adv_loss=2.325, generator_feat_match_loss=4.814, over 1968.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 19:22:53,626 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-780.pt
2023-11-14 19:22:57,134 INFO [train.py:811] (0/4) Start epoch 781
2023-11-14 19:26:27,332 INFO [train.py:811] (0/4) Start epoch 782
2023-11-14 19:26:58,522 INFO [train.py:467] (0/4) Epoch 782, batch 3, global_batch_idx: 28900, batch size: 54, loss[discriminator_loss=2.312, discriminator_real_loss=1.205, discriminator_fake_loss=1.108, generator_loss=31.94, generator_mel_loss=20.13, generator_kl_loss=2.03, generator_dur_loss=1.662, generator_adv_loss=2.398, generator_feat_match_loss=5.727, over 54.00 samples.], tot_loss[discriminator_loss=2.448, discriminator_real_loss=1.343, discriminator_fake_loss=1.106, generator_loss=31.21, generator_mel_loss=19.84, generator_kl_loss=1.973, generator_dur_loss=1.648, generator_adv_loss=2.44, generator_feat_match_loss=5.303, over 268.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 19:29:55,831 INFO [train.py:811] (0/4) Start epoch 783
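Note: cur_lr_g and cur_lr_d tick down in lockstep; the displayed value drops from 1.82e-04 to 1.81e-04 at Epoch 778 above, and later to 1.80e-04. Those transition points are consistent with a per-epoch exponential decay; the initial lr and gamma below are inferred from the logged values, not read from train.py:

import torch

# Inferred for illustration: lr = 2e-4 * 0.999875**epoch reproduces the log:
#   epoch 735 -> ~1.824e-4 (printed 1.82e-04)
#   epoch 778 -> ~1.815e-4 (first epoch printed 1.81e-04)
#   epoch 822 -> ~1.805e-4 (first epoch printed 1.80e-04)
optimizer_g = torch.optim.AdamW(generator.parameters(), lr=2e-4)  # placeholder module
scheduler_g = torch.optim.lr_scheduler.ExponentialLR(optimizer_g, gamma=0.999875)
for epoch in range(1, num_epochs + 1):    # placeholder epoch count
    train_one_epoch()                     # placeholder training step
    scheduler_g.step()                    # decay once per epoch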
2023-11-14 19:31:32,924 INFO [train.py:467] (0/4) Epoch 783, batch 16, global_batch_idx: 28950, batch size: 110, loss[discriminator_loss=2.559, discriminator_real_loss=1.281, discriminator_fake_loss=1.276, generator_loss=30.25, generator_mel_loss=20.12, generator_kl_loss=2.006, generator_dur_loss=1.625, generator_adv_loss=2.238, generator_feat_match_loss=4.254, over 110.00 samples.], tot_loss[discriminator_loss=2.573, discriminator_real_loss=1.303, discriminator_fake_loss=1.271, generator_loss=30.11, generator_mel_loss=20.01, generator_kl_loss=2.02, generator_dur_loss=1.639, generator_adv_loss=2.156, generator_feat_match_loss=4.285, over 1156.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 19:33:27,478 INFO [train.py:811] (0/4) Start epoch 784
2023-11-14 19:36:12,966 INFO [train.py:467] (0/4) Epoch 784, batch 29, global_batch_idx: 29000, batch size: 73, loss[discriminator_loss=2.562, discriminator_real_loss=1.34, discriminator_fake_loss=1.223, generator_loss=30.82, generator_mel_loss=20.29, generator_kl_loss=2.09, generator_dur_loss=1.619, generator_adv_loss=2.162, generator_feat_match_loss=4.664, over 73.00 samples.], tot_loss[discriminator_loss=2.604, discriminator_real_loss=1.323, discriminator_fake_loss=1.281, generator_loss=30.48, generator_mel_loss=20.22, generator_kl_loss=2.016, generator_dur_loss=1.638, generator_adv_loss=2.211, generator_feat_match_loss=4.4, over 2073.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 19:36:13,459 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 19:36:23,998 INFO [train.py:517] (0/4) Epoch 784, validation: discriminator_loss=2.483, discriminator_real_loss=1.163, discriminator_fake_loss=1.32, generator_loss=30.95, generator_mel_loss=20.64, generator_kl_loss=2.146, generator_dur_loss=1.631, generator_adv_loss=1.886, generator_feat_match_loss=4.646, over 100.00 samples.
2023-11-14 19:36:23,999 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 19:37:06,657 INFO [train.py:811] (0/4) Start epoch 785
2023-11-14 19:40:37,301 INFO [train.py:811] (0/4) Start epoch 786
2023-11-14 19:41:19,866 INFO [train.py:467] (0/4) Epoch 786, batch 5, global_batch_idx: 29050, batch size: 53, loss[discriminator_loss=2.605, discriminator_real_loss=1.375, discriminator_fake_loss=1.229, generator_loss=31.13, generator_mel_loss=20.21, generator_kl_loss=1.997, generator_dur_loss=1.645, generator_adv_loss=2.537, generator_feat_match_loss=4.734, over 53.00 samples.], tot_loss[discriminator_loss=2.585, discriminator_real_loss=1.277, discriminator_fake_loss=1.308, generator_loss=30.52, generator_mel_loss=20.03, generator_kl_loss=2.033, generator_dur_loss=1.636, generator_adv_loss=2.25, generator_feat_match_loss=4.578, over 391.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 19:44:08,596 INFO [train.py:811] (0/4) Start epoch 787
2023-11-14 19:46:00,369 INFO [train.py:467] (0/4) Epoch 787, batch 18, global_batch_idx: 29100, batch size: 101, loss[discriminator_loss=2.5, discriminator_real_loss=1.192, discriminator_fake_loss=1.309, generator_loss=31.34, generator_mel_loss=20.09, generator_kl_loss=2.002, generator_dur_loss=1.612, generator_adv_loss=2.521, generator_feat_match_loss=5.113, over 101.00 samples.], tot_loss[discriminator_loss=2.513, discriminator_real_loss=1.264, discriminator_fake_loss=1.249, generator_loss=31.02, generator_mel_loss=20.08, generator_kl_loss=2.018, generator_dur_loss=1.632, generator_adv_loss=2.366, generator_feat_match_loss=4.917, over 1444.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 19:47:40,802 INFO [train.py:811] (0/4) Start epoch 788
2023-11-14 19:50:37,936 INFO [train.py:467] (0/4) Epoch 788, batch 31, global_batch_idx: 29150, batch size: 101, loss[discriminator_loss=2.523, discriminator_real_loss=1.258, discriminator_fake_loss=1.267, generator_loss=30.01, generator_mel_loss=19.71, generator_kl_loss=2.005, generator_dur_loss=1.618, generator_adv_loss=2.146, generator_feat_match_loss=4.527, over 101.00 samples.], tot_loss[discriminator_loss=2.491, discriminator_real_loss=1.263, discriminator_fake_loss=1.227, generator_loss=30.85, generator_mel_loss=19.8, generator_kl_loss=2.032, generator_dur_loss=1.636, generator_adv_loss=2.392, generator_feat_match_loss=4.984, over 2434.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 19:51:10,064 INFO [train.py:811] (0/4) Start epoch 789
2023-11-14 19:54:42,931 INFO [train.py:811] (0/4) Start epoch 790
2023-11-14 19:55:32,813 INFO [train.py:467] (0/4) Epoch 790, batch 7, global_batch_idx: 29200, batch size: 67, loss[discriminator_loss=2.59, discriminator_real_loss=1.28, discriminator_fake_loss=1.309, generator_loss=30.44, generator_mel_loss=20.38, generator_kl_loss=1.939, generator_dur_loss=1.643, generator_adv_loss=2.193, generator_feat_match_loss=4.281, over 67.00 samples.], tot_loss[discriminator_loss=2.608, discriminator_real_loss=1.32, discriminator_fake_loss=1.288, generator_loss=30.58, generator_mel_loss=20.34, generator_kl_loss=2.029, generator_dur_loss=1.652, generator_adv_loss=2.175, generator_feat_match_loss=4.386, over 543.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 16.0
2023-11-14 19:55:33,342 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 19:55:44,844 INFO [train.py:517] (0/4) Epoch 790, validation: discriminator_loss=2.615, discriminator_real_loss=1.313, discriminator_fake_loss=1.301, generator_loss=31.74, generator_mel_loss=21.48, generator_kl_loss=2.282, generator_dur_loss=1.629, generator_adv_loss=1.995, generator_feat_match_loss=4.356, over 100.00 samples.
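Note: the "Maximum memory allocated so far is 27269MB" lines (such as the one that follows) have been flat for this entire stretch, meaning the peak was set early in the run and no later batch exceeded it. A plausible way to produce such a line from PyTorch's allocator statistics; an illustration, not the actual train.py code:

import logging
import torch

def log_peak_memory(device: torch.device):
    # Peak bytes ever allocated on this device since process start
    # (or since torch.cuda.reset_peak_memory_stats was last called).
    peak_mb = torch.cuda.max_memory_allocated(device) // (1024 * 1024)
    logging.info(f"Maximum memory allocated so far is {peak_mb}MB")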
2023-11-14 19:55:44,845 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 19:58:31,970 INFO [train.py:811] (0/4) Start epoch 791
2023-11-14 20:00:37,040 INFO [train.py:467] (0/4) Epoch 791, batch 20, global_batch_idx: 29250, batch size: 76, loss[discriminator_loss=2.508, discriminator_real_loss=1.166, discriminator_fake_loss=1.342, generator_loss=30.95, generator_mel_loss=20.08, generator_kl_loss=2.052, generator_dur_loss=1.629, generator_adv_loss=2.354, generator_feat_match_loss=4.836, over 76.00 samples.], tot_loss[discriminator_loss=2.602, discriminator_real_loss=1.31, discriminator_fake_loss=1.292, generator_loss=30.61, generator_mel_loss=20.19, generator_kl_loss=2.013, generator_dur_loss=1.639, generator_adv_loss=2.235, generator_feat_match_loss=4.524, over 1424.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 16.0
2023-11-14 20:02:08,330 INFO [train.py:811] (0/4) Start epoch 792
2023-11-14 20:05:28,730 INFO [train.py:467] (0/4) Epoch 792, batch 33, global_batch_idx: 29300, batch size: 85, loss[discriminator_loss=2.299, discriminator_real_loss=1.121, discriminator_fake_loss=1.178, generator_loss=31.55, generator_mel_loss=19.86, generator_kl_loss=2.101, generator_dur_loss=1.628, generator_adv_loss=2.5, generator_feat_match_loss=5.457, over 85.00 samples.], tot_loss[discriminator_loss=2.486, discriminator_real_loss=1.253, discriminator_fake_loss=1.233, generator_loss=30.83, generator_mel_loss=19.96, generator_kl_loss=2.013, generator_dur_loss=1.633, generator_adv_loss=2.364, generator_feat_match_loss=4.862, over 2714.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 20:05:46,672 INFO [train.py:811] (0/4) Start epoch 793
2023-11-14 20:09:12,131 INFO [train.py:811] (0/4) Start epoch 794
2023-11-14 20:10:12,414 INFO [train.py:467] (0/4) Epoch 794, batch 9, global_batch_idx: 29350, batch size: 61, loss[discriminator_loss=2.576, discriminator_real_loss=1.225, discriminator_fake_loss=1.352, generator_loss=30.32, generator_mel_loss=19.95, generator_kl_loss=2.108, generator_dur_loss=1.647, generator_adv_loss=2.186, generator_feat_match_loss=4.43, over 61.00 samples.], tot_loss[discriminator_loss=2.557, discriminator_real_loss=1.309, discriminator_fake_loss=1.248, generator_loss=30.36, generator_mel_loss=20.06, generator_kl_loss=2.011, generator_dur_loss=1.634, generator_adv_loss=2.182, generator_feat_match_loss=4.476, over 656.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 20:12:45,768 INFO [train.py:811] (0/4) Start epoch 795
2023-11-14 20:15:04,215 INFO [train.py:467] (0/4) Epoch 795, batch 22, global_batch_idx: 29400, batch size: 90, loss[discriminator_loss=2.629, discriminator_real_loss=1.424, discriminator_fake_loss=1.205, generator_loss=31.26, generator_mel_loss=20.67, generator_kl_loss=2.041, generator_dur_loss=1.66, generator_adv_loss=2.139, generator_feat_match_loss=4.746, over 90.00 samples.], tot_loss[discriminator_loss=2.636, discriminator_real_loss=1.337, discriminator_fake_loss=1.298, generator_loss=30.49, generator_mel_loss=20.3, generator_kl_loss=2.05, generator_dur_loss=1.638, generator_adv_loss=2.15, generator_feat_match_loss=4.35, over 1566.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 20:15:04,745 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 20:15:15,188 INFO [train.py:517] (0/4) Epoch 795, validation: discriminator_loss=2.594, discriminator_real_loss=1.251, discriminator_fake_loss=1.343, generator_loss=30.8, generator_mel_loss=20.47, generator_kl_loss=2.226, generator_dur_loss=1.628, generator_adv_loss=1.974, generator_feat_match_loss=4.503, over 100.00 samples.
2023-11-14 20:15:15,190 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 20:16:30,266 INFO [train.py:811] (0/4) Start epoch 796
2023-11-14 20:20:00,184 INFO [train.py:467] (0/4) Epoch 796, batch 35, global_batch_idx: 29450, batch size: 79, loss[discriminator_loss=2.602, discriminator_real_loss=1.296, discriminator_fake_loss=1.307, generator_loss=30.07, generator_mel_loss=19.82, generator_kl_loss=2.003, generator_dur_loss=1.642, generator_adv_loss=2.193, generator_feat_match_loss=4.41, over 79.00 samples.], tot_loss[discriminator_loss=2.571, discriminator_real_loss=1.297, discriminator_fake_loss=1.274, generator_loss=30.52, generator_mel_loss=20.01, generator_kl_loss=2.023, generator_dur_loss=1.634, generator_adv_loss=2.254, generator_feat_match_loss=4.603, over 2701.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 20:20:06,780 INFO [train.py:811] (0/4) Start epoch 797
2023-11-14 20:23:40,366 INFO [train.py:811] (0/4) Start epoch 798
2023-11-14 20:24:56,848 INFO [train.py:467] (0/4) Epoch 798, batch 11, global_batch_idx: 29500, batch size: 49, loss[discriminator_loss=2.475, discriminator_real_loss=1.22, discriminator_fake_loss=1.255, generator_loss=30.26, generator_mel_loss=19.82, generator_kl_loss=2.028, generator_dur_loss=1.615, generator_adv_loss=2.148, generator_feat_match_loss=4.648, over 49.00 samples.], tot_loss[discriminator_loss=2.535, discriminator_real_loss=1.302, discriminator_fake_loss=1.233, generator_loss=30.53, generator_mel_loss=19.89, generator_kl_loss=1.97, generator_dur_loss=1.634, generator_adv_loss=2.336, generator_feat_match_loss=4.699, over 728.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 20:27:12,399 INFO [train.py:811] (0/4) Start epoch 799
2023-11-14 20:29:34,274 INFO [train.py:467] (0/4) Epoch 799, batch 24, global_batch_idx: 29550, batch size: 53, loss[discriminator_loss=2.467, discriminator_real_loss=1.287, discriminator_fake_loss=1.18, generator_loss=29.78, generator_mel_loss=19.25, generator_kl_loss=2.07, generator_dur_loss=1.647, generator_adv_loss=2.227, generator_feat_match_loss=4.586, over 53.00 samples.], tot_loss[discriminator_loss=2.55, discriminator_real_loss=1.286, discriminator_fake_loss=1.263, generator_loss=30.3, generator_mel_loss=19.89, generator_kl_loss=2.011, generator_dur_loss=1.635, generator_adv_loss=2.237, generator_feat_match_loss=4.526, over 1798.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 20:30:43,379 INFO [train.py:811] (0/4) Start epoch 800
2023-11-14 20:34:11,857 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-800.pt
2023-11-14 20:34:15,232 INFO [train.py:811] (0/4) Start epoch 801
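Note: checkpoints land every 20 epochs in this stretch of the log (epoch-740.pt, epoch-760.pt, epoch-780.pt, epoch-800.pt). A minimal sketch of that cadence, assuming a plain torch.save of the module state; the helper below is illustrative, not utils.py's actual code:

import logging
import torch

def maybe_save_checkpoint(model, exp_dir, epoch, save_every_n=20):
    if epoch % save_every_n != 0:
        return
    path = f"{exp_dir}/epoch-{epoch}.pt"
    logging.info(f"Saving checkpoint to {path}")
    # Unwrap DDP so the saved keys carry no "module." prefix.
    module = model.module if hasattr(model, "module") else model
    torch.save({"model": module.state_dict(), "epoch": epoch}, path)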
2023-11-14 20:34:31,176 INFO [train.py:467] (0/4) Epoch 801, batch 0, global_batch_idx: 29600, batch size: 56, loss[discriminator_loss=2.371, discriminator_real_loss=1.207, discriminator_fake_loss=1.163, generator_loss=30.96, generator_mel_loss=19.38, generator_kl_loss=2.029, generator_dur_loss=1.634, generator_adv_loss=2.512, generator_feat_match_loss=5.406, over 56.00 samples.], tot_loss[discriminator_loss=2.371, discriminator_real_loss=1.207, discriminator_fake_loss=1.163, generator_loss=30.96, generator_mel_loss=19.38, generator_kl_loss=2.029, generator_dur_loss=1.634, generator_adv_loss=2.512, generator_feat_match_loss=5.406, over 56.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 16.0
2023-11-14 20:34:31,705 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 20:34:43,063 INFO [train.py:517] (0/4) Epoch 801, validation: discriminator_loss=2.309, discriminator_real_loss=1.035, discriminator_fake_loss=1.273, generator_loss=31.85, generator_mel_loss=20.11, generator_kl_loss=2.18, generator_dur_loss=1.636, generator_adv_loss=2.45, generator_feat_match_loss=5.474, over 100.00 samples.
2023-11-14 20:34:43,065 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 20:38:02,160 INFO [train.py:811] (0/4) Start epoch 802
2023-11-14 20:39:22,630 INFO [train.py:467] (0/4) Epoch 802, batch 13, global_batch_idx: 29650, batch size: 56, loss[discriminator_loss=2.502, discriminator_real_loss=1.355, discriminator_fake_loss=1.146, generator_loss=30.76, generator_mel_loss=19.98, generator_kl_loss=2.041, generator_dur_loss=1.63, generator_adv_loss=2.396, generator_feat_match_loss=4.711, over 56.00 samples.], tot_loss[discriminator_loss=2.5, discriminator_real_loss=1.246, discriminator_fake_loss=1.254, generator_loss=30.64, generator_mel_loss=19.85, generator_kl_loss=2.033, generator_dur_loss=1.631, generator_adv_loss=2.34, generator_feat_match_loss=4.789, over 979.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 20:41:39,789 INFO [train.py:811] (0/4) Start epoch 803
2023-11-14 20:44:13,185 INFO [train.py:467] (0/4) Epoch 803, batch 26, global_batch_idx: 29700, batch size: 79, loss[discriminator_loss=2.727, discriminator_real_loss=1.258, discriminator_fake_loss=1.47, generator_loss=29.5, generator_mel_loss=19.77, generator_kl_loss=1.978, generator_dur_loss=1.632, generator_adv_loss=2.158, generator_feat_match_loss=3.967, over 79.00 samples.], tot_loss[discriminator_loss=2.553, discriminator_real_loss=1.292, discriminator_fake_loss=1.262, generator_loss=30.76, generator_mel_loss=20.18, generator_kl_loss=2.044, generator_dur_loss=1.638, generator_adv_loss=2.284, generator_feat_match_loss=4.611, over 1830.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 20:45:14,646 INFO [train.py:811] (0/4) Start epoch 804
2023-11-14 20:48:49,114 INFO [train.py:811] (0/4) Start epoch 805
2023-11-14 20:49:18,584 INFO [train.py:467] (0/4) Epoch 805, batch 2, global_batch_idx: 29750, batch size: 81, loss[discriminator_loss=2.469, discriminator_real_loss=1.206, discriminator_fake_loss=1.263, generator_loss=30.77, generator_mel_loss=20.25, generator_kl_loss=2.045, generator_dur_loss=1.633, generator_adv_loss=2.242, generator_feat_match_loss=4.594, over 81.00 samples.], tot_loss[discriminator_loss=2.527, discriminator_real_loss=1.26, discriminator_fake_loss=1.267, generator_loss=30.31, generator_mel_loss=20, generator_kl_loss=2.074, generator_dur_loss=1.641, generator_adv_loss=2.136, generator_feat_match_loss=4.453, over 218.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 20:52:24,792 INFO [train.py:811] (0/4) Start epoch 806
2023-11-14 20:53:57,956 INFO [train.py:467] (0/4) Epoch 806, batch 15, global_batch_idx: 29800, batch size: 153, loss[discriminator_loss=2.391, discriminator_real_loss=1.113, discriminator_fake_loss=1.276, generator_loss=31.25, generator_mel_loss=20.05, generator_kl_loss=2.043, generator_dur_loss=1.635, generator_adv_loss=2.303, generator_feat_match_loss=5.215, over 153.00 samples.], tot_loss[discriminator_loss=2.507, discriminator_real_loss=1.243, discriminator_fake_loss=1.264, generator_loss=30.73, generator_mel_loss=19.91, generator_kl_loss=2.025, generator_dur_loss=1.638, generator_adv_loss=2.326, generator_feat_match_loss=4.828, over 1179.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 20:53:58,477 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 20:54:09,638 INFO [train.py:517] (0/4) Epoch 806, validation: discriminator_loss=2.527, discriminator_real_loss=1.069, discriminator_fake_loss=1.458, generator_loss=31.23, generator_mel_loss=20.8, generator_kl_loss=2.199, generator_dur_loss=1.638, generator_adv_loss=1.857, generator_feat_match_loss=4.737, over 100.00 samples.
2023-11-14 20:54:09,639 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 20:56:04,809 INFO [train.py:811] (0/4) Start epoch 807
2023-11-14 20:58:50,830 INFO [train.py:467] (0/4) Epoch 807, batch 28, global_batch_idx: 29850, batch size: 63, loss[discriminator_loss=2.539, discriminator_real_loss=1.392, discriminator_fake_loss=1.147, generator_loss=29.88, generator_mel_loss=19.68, generator_kl_loss=1.976, generator_dur_loss=1.661, generator_adv_loss=2.338, generator_feat_match_loss=4.227, over 63.00 samples.], tot_loss[discriminator_loss=2.54, discriminator_real_loss=1.281, discriminator_fake_loss=1.259, generator_loss=30.76, generator_mel_loss=20.16, generator_kl_loss=2.012, generator_dur_loss=1.635, generator_adv_loss=2.271, generator_feat_match_loss=4.673, over 2066.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 20:59:41,727 INFO [train.py:811] (0/4) Start epoch 808
2023-11-14 21:03:20,326 INFO [train.py:811] (0/4) Start epoch 809
2023-11-14 21:03:53,897 INFO [train.py:467] (0/4) Epoch 809, batch 4, global_batch_idx: 29900, batch size: 52, loss[discriminator_loss=2.531, discriminator_real_loss=1.296, discriminator_fake_loss=1.234, generator_loss=29.81, generator_mel_loss=19.79, generator_kl_loss=2.029, generator_dur_loss=1.677, generator_adv_loss=2.158, generator_feat_match_loss=4.152, over 52.00 samples.], tot_loss[discriminator_loss=2.551, discriminator_real_loss=1.294, discriminator_fake_loss=1.256, generator_loss=30.43, generator_mel_loss=19.88, generator_kl_loss=2.055, generator_dur_loss=1.647, generator_adv_loss=2.291, generator_feat_match_loss=4.557, over 299.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 21:06:52,484 INFO [train.py:811] (0/4) Start epoch 810
2023-11-14 21:08:44,194 INFO [train.py:467] (0/4) Epoch 810, batch 17, global_batch_idx: 29950, batch size: 52, loss[discriminator_loss=2.584, discriminator_real_loss=1.32, discriminator_fake_loss=1.264, generator_loss=30.44, generator_mel_loss=19.88, generator_kl_loss=1.964, generator_dur_loss=1.661, generator_adv_loss=2.33, generator_feat_match_loss=4.602, over 52.00 samples.], tot_loss[discriminator_loss=2.545, discriminator_real_loss=1.278, discriminator_fake_loss=1.267, generator_loss=30.7, generator_mel_loss=19.96, generator_kl_loss=2.026, generator_dur_loss=1.64, generator_adv_loss=2.281, generator_feat_match_loss=4.786, over 1364.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 21:10:25,759 INFO [train.py:811] (0/4) Start epoch 811
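Note: the epoch/batch/global_batch_idx triples in these records (which appear every 50 batches) are mutually consistent with 37 training batches per epoch, e.g. the entry just below: epoch 811, batch 30 at global index 30000. A quick check:

BATCHES_PER_EPOCH = 37  # inferred: the highest batch index seen in this log is 36

def global_batch_idx(epoch: int, batch: int) -> int:
    return (epoch - 1) * BATCHES_PER_EPOCH + batch

assert global_batch_idx(811, 30) == 30000   # matches the Epoch 811 record below
assert global_batch_idx(801, 0) == 29600    # matches the Epoch 801 record above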
2023-11-14 21:13:17,652 INFO [train.py:467] (0/4) Epoch 811, batch 30, global_batch_idx: 30000, batch size: 63, loss[discriminator_loss=2.355, discriminator_real_loss=1.29, discriminator_fake_loss=1.064, generator_loss=31.39, generator_mel_loss=19.93, generator_kl_loss=1.933, generator_dur_loss=1.64, generator_adv_loss=2.576, generator_feat_match_loss=5.309, over 63.00 samples.], tot_loss[discriminator_loss=2.589, discriminator_real_loss=1.325, discriminator_fake_loss=1.264, generator_loss=30.53, generator_mel_loss=20.01, generator_kl_loss=2.015, generator_dur_loss=1.642, generator_adv_loss=2.275, generator_feat_match_loss=4.585, over 1992.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 16.0
2023-11-14 21:13:18,203 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 21:13:28,583 INFO [train.py:517] (0/4) Epoch 811, validation: discriminator_loss=2.392, discriminator_real_loss=1.16, discriminator_fake_loss=1.232, generator_loss=32.08, generator_mel_loss=20.83, generator_kl_loss=2.196, generator_dur_loss=1.648, generator_adv_loss=2.263, generator_feat_match_loss=5.142, over 100.00 samples.
2023-11-14 21:13:28,583 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 21:14:05,673 INFO [train.py:811] (0/4) Start epoch 812
2023-11-14 21:17:42,622 INFO [train.py:811] (0/4) Start epoch 813
2023-11-14 21:18:27,687 INFO [train.py:467] (0/4) Epoch 813, batch 6, global_batch_idx: 30050, batch size: 50, loss[discriminator_loss=2.566, discriminator_real_loss=1.325, discriminator_fake_loss=1.242, generator_loss=30.16, generator_mel_loss=19.47, generator_kl_loss=1.959, generator_dur_loss=1.641, generator_adv_loss=2.371, generator_feat_match_loss=4.723, over 50.00 samples.], tot_loss[discriminator_loss=2.537, discriminator_real_loss=1.281, discriminator_fake_loss=1.256, generator_loss=30.74, generator_mel_loss=19.97, generator_kl_loss=2.012, generator_dur_loss=1.648, generator_adv_loss=2.327, generator_feat_match_loss=4.786, over 454.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 16.0
2023-11-14 21:21:13,615 INFO [train.py:811] (0/4) Start epoch 814
2023-11-14 21:23:19,715 INFO [train.py:467] (0/4) Epoch 814, batch 19, global_batch_idx: 30100, batch size: 49, loss[discriminator_loss=2.504, discriminator_real_loss=1.215, discriminator_fake_loss=1.289, generator_loss=30.75, generator_mel_loss=20.44, generator_kl_loss=2.028, generator_dur_loss=1.643, generator_adv_loss=2.125, generator_feat_match_loss=4.516, over 49.00 samples.], tot_loss[discriminator_loss=2.553, discriminator_real_loss=1.302, discriminator_fake_loss=1.251, generator_loss=30.39, generator_mel_loss=19.97, generator_kl_loss=2.029, generator_dur_loss=1.632, generator_adv_loss=2.237, generator_feat_match_loss=4.52, over 1438.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 16.0
2023-11-14 21:24:42,751 INFO [train.py:811] (0/4) Start epoch 815
2023-11-14 21:27:58,749 INFO [train.py:467] (0/4) Epoch 815, batch 32, global_batch_idx: 30150, batch size: 53, loss[discriminator_loss=2.594, discriminator_real_loss=1.171, discriminator_fake_loss=1.422, generator_loss=31, generator_mel_loss=20.18, generator_kl_loss=1.984, generator_dur_loss=1.667, generator_adv_loss=2.406, generator_feat_match_loss=4.754, over 53.00 samples.], tot_loss[discriminator_loss=2.464, discriminator_real_loss=1.252, discriminator_fake_loss=1.212, generator_loss=31.02, generator_mel_loss=19.8, generator_kl_loss=2.004, generator_dur_loss=1.642, generator_adv_loss=2.432, generator_feat_match_loss=5.142, over 2318.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 21:28:16,219 INFO [train.py:811] (0/4) Start epoch 816
2023-11-14 21:31:51,855 INFO [train.py:811] (0/4) Start epoch 817
2023-11-14 21:32:52,161 INFO [train.py:467] (0/4) Epoch 817, batch 8, global_batch_idx: 30200, batch size: 52, loss[discriminator_loss=2.609, discriminator_real_loss=1.401, discriminator_fake_loss=1.208, generator_loss=29.82, generator_mel_loss=19.85, generator_kl_loss=1.955, generator_dur_loss=1.631, generator_adv_loss=2.125, generator_feat_match_loss=4.266, over 52.00 samples.], tot_loss[discriminator_loss=2.641, discriminator_real_loss=1.345, discriminator_fake_loss=1.296, generator_loss=30.49, generator_mel_loss=20.37, generator_kl_loss=2.009, generator_dur_loss=1.634, generator_adv_loss=2.151, generator_feat_match_loss=4.327, over 580.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 21:32:52,723 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 21:33:03,424 INFO [train.py:517] (0/4) Epoch 817, validation: discriminator_loss=2.652, discriminator_real_loss=1.239, discriminator_fake_loss=1.412, generator_loss=30.8, generator_mel_loss=20.69, generator_kl_loss=2.168, generator_dur_loss=1.631, generator_adv_loss=1.917, generator_feat_match_loss=4.389, over 100.00 samples.
2023-11-14 21:33:03,425 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 21:35:39,910 INFO [train.py:811] (0/4) Start epoch 818
2023-11-14 21:38:03,010 INFO [train.py:467] (0/4) Epoch 818, batch 21, global_batch_idx: 30250, batch size: 59, loss[discriminator_loss=2.506, discriminator_real_loss=1.429, discriminator_fake_loss=1.077, generator_loss=31.04, generator_mel_loss=19.92, generator_kl_loss=2.042, generator_dur_loss=1.643, generator_adv_loss=2.416, generator_feat_match_loss=5.016, over 59.00 samples.], tot_loss[discriminator_loss=2.573, discriminator_real_loss=1.293, discriminator_fake_loss=1.279, generator_loss=30.74, generator_mel_loss=20.22, generator_kl_loss=2.016, generator_dur_loss=1.636, generator_adv_loss=2.266, generator_feat_match_loss=4.606, over 1779.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 21:39:15,897 INFO [train.py:811] (0/4) Start epoch 819
2023-11-14 21:42:35,716 INFO [train.py:467] (0/4) Epoch 819, batch 34, global_batch_idx: 30300, batch size: 85, loss[discriminator_loss=2.623, discriminator_real_loss=1.395, discriminator_fake_loss=1.229, generator_loss=29.62, generator_mel_loss=19.72, generator_kl_loss=1.964, generator_dur_loss=1.628, generator_adv_loss=2.094, generator_feat_match_loss=4.211, over 85.00 samples.], tot_loss[discriminator_loss=2.578, discriminator_real_loss=1.335, discriminator_fake_loss=1.243, generator_loss=30.42, generator_mel_loss=19.7, generator_kl_loss=2.01, generator_dur_loss=1.633, generator_adv_loss=2.307, generator_feat_match_loss=4.77, over 2678.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 21:42:51,049 INFO [train.py:811] (0/4) Start epoch 820
2023-11-14 21:46:18,426 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-820.pt
2023-11-14 21:46:22,170 INFO [train.py:811] (0/4) Start epoch 821
2023-11-14 21:47:31,067 INFO [train.py:467] (0/4) Epoch 821, batch 10, global_batch_idx: 30350, batch size: 126, loss[discriminator_loss=2.584, discriminator_real_loss=1.287, discriminator_fake_loss=1.297, generator_loss=31.06, generator_mel_loss=20.58, generator_kl_loss=2.055, generator_dur_loss=1.638, generator_adv_loss=2.146, generator_feat_match_loss=4.648, over 126.00 samples.], tot_loss[discriminator_loss=2.565, discriminator_real_loss=1.305, discriminator_fake_loss=1.26, generator_loss=30.73, generator_mel_loss=20.26, generator_kl_loss=2.03, generator_dur_loss=1.64, generator_adv_loss=2.207, generator_feat_match_loss=4.601, over 912.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2023-11-14 21:49:52,145 INFO [train.py:811] (0/4) Start epoch 822
2023-11-14 21:52:15,942 INFO [train.py:467] (0/4) Epoch 822, batch 23, global_batch_idx: 30400, batch size: 67, loss[discriminator_loss=2.535, discriminator_real_loss=1.457, discriminator_fake_loss=1.079, generator_loss=31.08, generator_mel_loss=20.09, generator_kl_loss=2.105, generator_dur_loss=1.662, generator_adv_loss=2.287, generator_feat_match_loss=4.938, over 67.00 samples.], tot_loss[discriminator_loss=2.578, discriminator_real_loss=1.296, discriminator_fake_loss=1.282, generator_loss=30.7, generator_mel_loss=20.14, generator_kl_loss=2.009, generator_dur_loss=1.637, generator_adv_loss=2.277, generator_feat_match_loss=4.64, over 1992.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 16.0
2023-11-14 21:52:16,537 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 21:52:26,906 INFO [train.py:517] (0/4) Epoch 822, validation: discriminator_loss=2.61, discriminator_real_loss=1.172, discriminator_fake_loss=1.438, generator_loss=30.48, generator_mel_loss=20.46, generator_kl_loss=2.197, generator_dur_loss=1.64, generator_adv_loss=1.842, generator_feat_match_loss=4.345, over 100.00 samples.
2023-11-14 21:52:26,907 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 21:53:36,129 INFO [train.py:811] (0/4) Start epoch 823
2023-11-14 21:57:10,137 INFO [train.py:467] (0/4) Epoch 823, batch 36, global_batch_idx: 30450, batch size: 71, loss[discriminator_loss=2.658, discriminator_real_loss=1.512, discriminator_fake_loss=1.146, generator_loss=30.51, generator_mel_loss=19.91, generator_kl_loss=2.045, generator_dur_loss=1.62, generator_adv_loss=2.441, generator_feat_match_loss=4.492, over 71.00 samples.], tot_loss[discriminator_loss=2.518, discriminator_real_loss=1.273, discriminator_fake_loss=1.244, generator_loss=30.81, generator_mel_loss=20.09, generator_kl_loss=2.021, generator_dur_loss=1.631, generator_adv_loss=2.323, generator_feat_match_loss=4.746, over 2576.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0
2023-11-14 21:57:11,263 INFO [train.py:811] (0/4) Start epoch 824
2023-11-14 22:00:36,852 INFO [train.py:811] (0/4) Start epoch 825
2023-11-14 22:02:01,439 INFO [train.py:467] (0/4) Epoch 825, batch 12, global_batch_idx: 30500, batch size: 59, loss[discriminator_loss=2.576, discriminator_real_loss=1.218, discriminator_fake_loss=1.358, generator_loss=30.94, generator_mel_loss=20.37, generator_kl_loss=2.044, generator_dur_loss=1.65, generator_adv_loss=2.156, generator_feat_match_loss=4.719, over 59.00 samples.], tot_loss[discriminator_loss=2.545, discriminator_real_loss=1.293, discriminator_fake_loss=1.252, generator_loss=30.78, generator_mel_loss=19.99, generator_kl_loss=2.014, generator_dur_loss=1.638, generator_adv_loss=2.372, generator_feat_match_loss=4.774, over 924.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0
2023-11-14 22:04:08,989 INFO [train.py:811] (0/4) Start epoch 826
2023-11-14 22:06:42,025 INFO [train.py:467] (0/4) Epoch 826, batch 25, global_batch_idx: 30550, batch size: 56, loss[discriminator_loss=2.469, discriminator_real_loss=1.238, discriminator_fake_loss=1.23, generator_loss=30.87, generator_mel_loss=20, generator_kl_loss=2.123, generator_dur_loss=1.628, generator_adv_loss=2.312, generator_feat_match_loss=4.809, over 56.00 samples.], tot_loss[discriminator_loss=2.523, discriminator_real_loss=1.274, discriminator_fake_loss=1.249, generator_loss=30.7, generator_mel_loss=19.97, generator_kl_loss=2.023, generator_dur_loss=1.633, generator_adv_loss=2.298, generator_feat_match_loss=4.77, over 1997.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0
2023-11-14 22:07:38,068 INFO [train.py:811] (0/4) Start epoch 827
2023-11-14 22:11:16,359 INFO [train.py:811] (0/4) Start epoch 828
2023-11-14 22:11:39,083 INFO [train.py:467] (0/4) Epoch 828, batch 1, global_batch_idx: 30600, batch size: 59, loss[discriminator_loss=2.559, discriminator_real_loss=1.318, discriminator_fake_loss=1.24, generator_loss=30.57, generator_mel_loss=19.85, generator_kl_loss=2.123, generator_dur_loss=1.63, generator_adv_loss=2.338, generator_feat_match_loss=4.621, over 59.00 samples.], tot_loss[discriminator_loss=2.533, discriminator_real_loss=1.254, discriminator_fake_loss=1.279, generator_loss=30.62, generator_mel_loss=19.87, generator_kl_loss=2.087, generator_dur_loss=1.621, generator_adv_loss=2.369, generator_feat_match_loss=4.674, over 130.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0
2023-11-14 22:11:39,643 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 22:11:50,898 INFO [train.py:517] (0/4) Epoch 828, validation: discriminator_loss=2.486, discriminator_real_loss=1.237, discriminator_fake_loss=1.249, generator_loss=31.89, generator_mel_loss=20.74, generator_kl_loss=2.258, generator_dur_loss=1.635, generator_adv_loss=2.294, generator_feat_match_loss=4.963, over 100.00 samples.
2023-11-14 22:11:50,899 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 22:15:00,481 INFO [train.py:811] (0/4) Start epoch 829
2023-11-14 22:16:34,448 INFO [train.py:467] (0/4) Epoch 829, batch 14, global_batch_idx: 30650, batch size: 61, loss[discriminator_loss=2.52, discriminator_real_loss=1.229, discriminator_fake_loss=1.292, generator_loss=30.1, generator_mel_loss=19.9, generator_kl_loss=2.023, generator_dur_loss=1.618, generator_adv_loss=2.107, generator_feat_match_loss=4.453, over 61.00 samples.], tot_loss[discriminator_loss=2.505, discriminator_real_loss=1.267, discriminator_fake_loss=1.239, generator_loss=30.54, generator_mel_loss=20.02, generator_kl_loss=2.028, generator_dur_loss=1.632, generator_adv_loss=2.228, generator_feat_match_loss=4.63, over 1172.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0
2023-11-14 22:18:34,087 INFO [train.py:811] (0/4) Start epoch 830
2023-11-14 22:21:14,954 INFO [train.py:467] (0/4) Epoch 830, batch 27, global_batch_idx: 30700, batch size: 53, loss[discriminator_loss=2.766, discriminator_real_loss=1.394, discriminator_fake_loss=1.373, generator_loss=29.1, generator_mel_loss=19.32, generator_kl_loss=2.057, generator_dur_loss=1.617, generator_adv_loss=2.199, generator_feat_match_loss=3.912, over 53.00 samples.], tot_loss[discriminator_loss=2.542, discriminator_real_loss=1.305, discriminator_fake_loss=1.237, generator_loss=30.83, generator_mel_loss=19.91, generator_kl_loss=2.038, generator_dur_loss=1.638, generator_adv_loss=2.375, generator_feat_match_loss=4.873, over 2153.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0
2023-11-14 22:22:08,570 INFO [train.py:811] (0/4) Start epoch 831
2023-11-14 22:25:46,251 INFO [train.py:811] (0/4) Start epoch 832
2023-11-14 22:26:19,927 INFO [train.py:467] (0/4) Epoch 832, batch 3, global_batch_idx: 30750, batch size: 95, loss[discriminator_loss=2.527, discriminator_real_loss=1.312, discriminator_fake_loss=1.216, generator_loss=30.9, generator_mel_loss=20.1, generator_kl_loss=1.938, generator_dur_loss=1.628, generator_adv_loss=2.178, generator_feat_match_loss=5.062, over 95.00 samples.], tot_loss[discriminator_loss=2.591, discriminator_real_loss=1.366, discriminator_fake_loss=1.225, generator_loss=30.53, generator_mel_loss=19.94, generator_kl_loss=1.944, generator_dur_loss=1.629, generator_adv_loss=2.271, generator_feat_match_loss=4.745, over 338.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0
2023-11-14 22:29:15,589 INFO [train.py:811] (0/4) Start epoch 833
2023-11-14 22:30:54,173 INFO [train.py:467] (0/4) Epoch 833, batch 16, global_batch_idx: 30800, batch size: 51, loss[discriminator_loss=2.641, discriminator_real_loss=1.302, discriminator_fake_loss=1.338, generator_loss=29.71, generator_mel_loss=19.81, generator_kl_loss=2.055, generator_dur_loss=1.652, generator_adv_loss=2.316, generator_feat_match_loss=3.879, over 51.00 samples.], tot_loss[discriminator_loss=2.559, discriminator_real_loss=1.299, discriminator_fake_loss=1.26, generator_loss=30.39, generator_mel_loss=20, generator_kl_loss=1.993, generator_dur_loss=1.636, generator_adv_loss=2.255, generator_feat_match_loss=4.511, over 1139.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 16.0
2023-11-14 22:30:54,681 INFO [train.py:508] (0/4) Computing validation loss
2023-11-14 22:31:05,523 INFO [train.py:517] (0/4) Epoch 833, validation: discriminator_loss=2.566, discriminator_real_loss=1.36, discriminator_fake_loss=1.206, generator_loss=30.99, generator_mel_loss=20.35, generator_kl_loss=2.14, generator_dur_loss=1.632, generator_adv_loss=2.373, generator_feat_match_loss=4.486, over 100.00 samples.
2023-11-14 22:31:05,524 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-14 22:32:55,759 INFO [train.py:811] (0/4) Start epoch 834
2023-11-14 22:35:40,722 INFO [train.py:467] (0/4) Epoch 834, batch 29, global_batch_idx: 30850, batch size: 67, loss[discriminator_loss=2.543, discriminator_real_loss=1.432, discriminator_fake_loss=1.11, generator_loss=30.51, generator_mel_loss=19.88, generator_kl_loss=2.058, generator_dur_loss=1.652, generator_adv_loss=2.389, generator_feat_match_loss=4.531, over 67.00 samples.], tot_loss[discriminator_loss=2.506, discriminator_real_loss=1.271, discriminator_fake_loss=1.235, generator_loss=30.73, generator_mel_loss=19.7, generator_kl_loss=2.023, generator_dur_loss=1.634, generator_adv_loss=2.39, generator_feat_match_loss=4.985, over 2123.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0
2023-11-14 22:36:20,513 INFO [train.py:811] (0/4) Start epoch 835
2023-11-14 22:39:51,935 INFO [train.py:811] (0/4) Start epoch 836
2023-11-14 22:40:29,150 INFO [train.py:467] (0/4) Epoch 836, batch 5, global_batch_idx: 30900, batch size: 54, loss[discriminator_loss=2.635, discriminator_real_loss=1.335, discriminator_fake_loss=1.3, generator_loss=30.62, generator_mel_loss=20.46, generator_kl_loss=2.017, generator_dur_loss=1.637, generator_adv_loss=2.057, generator_feat_match_loss=4.449, over 54.00 samples.], tot_loss[discriminator_loss=2.584, discriminator_real_loss=1.292, discriminator_fake_loss=1.292, generator_loss=30.27, generator_mel_loss=20.04, generator_kl_loss=1.975, generator_dur_loss=1.636, generator_adv_loss=2.158, generator_feat_match_loss=4.459, over 400.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0
2023-11-14 22:43:18,576 INFO [train.py:811] (0/4) Start
epoch 837 2023-11-14 22:45:18,534 INFO [train.py:467] (0/4) Epoch 837, batch 18, global_batch_idx: 30950, batch size: 79, loss[discriminator_loss=2.598, discriminator_real_loss=1.294, discriminator_fake_loss=1.305, generator_loss=30.38, generator_mel_loss=19.94, generator_kl_loss=2.081, generator_dur_loss=1.637, generator_adv_loss=2.256, generator_feat_match_loss=4.473, over 79.00 samples.], tot_loss[discriminator_loss=2.62, discriminator_real_loss=1.327, discriminator_fake_loss=1.293, generator_loss=30.47, generator_mel_loss=20.21, generator_kl_loss=2.015, generator_dur_loss=1.629, generator_adv_loss=2.179, generator_feat_match_loss=4.43, over 1369.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-14 22:46:54,074 INFO [train.py:811] (0/4) Start epoch 838 2023-11-14 22:50:00,873 INFO [train.py:467] (0/4) Epoch 838, batch 31, global_batch_idx: 31000, batch size: 73, loss[discriminator_loss=2.43, discriminator_real_loss=1.205, discriminator_fake_loss=1.225, generator_loss=31.39, generator_mel_loss=20.2, generator_kl_loss=2.061, generator_dur_loss=1.636, generator_adv_loss=2.348, generator_feat_match_loss=5.145, over 73.00 samples.], tot_loss[discriminator_loss=2.591, discriminator_real_loss=1.316, discriminator_fake_loss=1.275, generator_loss=30.58, generator_mel_loss=20.16, generator_kl_loss=2.019, generator_dur_loss=1.636, generator_adv_loss=2.232, generator_feat_match_loss=4.538, over 2217.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-14 22:50:01,504 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 22:50:12,834 INFO [train.py:517] (0/4) Epoch 838, validation: discriminator_loss=2.427, discriminator_real_loss=1.072, discriminator_fake_loss=1.355, generator_loss=31, generator_mel_loss=20.78, generator_kl_loss=2.183, generator_dur_loss=1.634, generator_adv_loss=1.901, generator_feat_match_loss=4.502, over 100.00 samples. 
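The "Computing validation loss" / "Epoch N, validation: ..." record pairs above come from a periodic validation pass over the dev cuts. Below is a minimal sketch of what such a pass typically looks like in PyTorch; compute_losses, dev_loader, and the dict bookkeeping are hypothetical stand-ins for illustration, not the actual API of this train.py.

```python
import torch

def validate(model, dev_loader, device):
    """Sample-weighted validation losses, as in the 'Epoch N, validation: ...' records."""
    model.eval()
    totals, num_samples = {}, 0
    with torch.no_grad():                       # no gradients during validation
        for batch in dev_loader:
            # compute_losses is a hypothetical helper returning a dict of scalar
            # loss tensors (mel, kl, dur, adv, feat_match, ...) plus the batch size
            losses, batch_size = compute_losses(model, batch, device)
            for name, value in losses.items():
                totals[name] = totals.get(name, 0.0) + value.item() * batch_size
            num_samples += batch_size
    model.train()
    # per-sample averages, reported "over 100.00 samples" in the log
    return {name: total / num_samples for name, total in totals.items()}
```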
2023-11-14 22:50:12,835 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 22:50:40,483 INFO [train.py:811] (0/4) Start epoch 839 2023-11-14 22:54:13,496 INFO [train.py:811] (0/4) Start epoch 840 2023-11-14 22:55:15,130 INFO [train.py:467] (0/4) Epoch 840, batch 7, global_batch_idx: 31050, batch size: 52, loss[discriminator_loss=2.48, discriminator_real_loss=1.384, discriminator_fake_loss=1.097, generator_loss=30.86, generator_mel_loss=19.99, generator_kl_loss=1.983, generator_dur_loss=1.642, generator_adv_loss=2.43, generator_feat_match_loss=4.809, over 52.00 samples.], tot_loss[discriminator_loss=2.478, discriminator_real_loss=1.253, discriminator_fake_loss=1.224, generator_loss=30.6, generator_mel_loss=19.9, generator_kl_loss=1.985, generator_dur_loss=1.64, generator_adv_loss=2.316, generator_feat_match_loss=4.757, over 471.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-14 22:57:49,311 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-840.pt 2023-11-14 22:57:52,564 INFO [train.py:811] (0/4) Start epoch 841 2023-11-14 22:59:43,226 INFO [train.py:467] (0/4) Epoch 841, batch 20, global_batch_idx: 31100, batch size: 52, loss[discriminator_loss=2.562, discriminator_real_loss=1.29, discriminator_fake_loss=1.272, generator_loss=30.11, generator_mel_loss=19.68, generator_kl_loss=2.018, generator_dur_loss=1.639, generator_adv_loss=2.205, generator_feat_match_loss=4.566, over 52.00 samples.], tot_loss[discriminator_loss=2.512, discriminator_real_loss=1.269, discriminator_fake_loss=1.243, generator_loss=30.58, generator_mel_loss=19.79, generator_kl_loss=2.032, generator_dur_loss=1.63, generator_adv_loss=2.334, generator_feat_match_loss=4.799, over 1499.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-14 23:01:14,010 INFO [train.py:811] (0/4) Start epoch 842 2023-11-14 23:04:19,359 INFO [train.py:467] (0/4) Epoch 842, batch 33, global_batch_idx: 31150, batch size: 61, loss[discriminator_loss=2.33, discriminator_real_loss=1.129, discriminator_fake_loss=1.201, generator_loss=31.91, generator_mel_loss=20.09, generator_kl_loss=2.058, generator_dur_loss=1.638, generator_adv_loss=2.379, generator_feat_match_loss=5.754, over 61.00 samples.], tot_loss[discriminator_loss=2.515, discriminator_real_loss=1.28, discriminator_fake_loss=1.235, generator_loss=30.81, generator_mel_loss=19.92, generator_kl_loss=2.032, generator_dur_loss=1.636, generator_adv_loss=2.341, generator_feat_match_loss=4.874, over 2313.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-14 23:04:39,149 INFO [train.py:811] (0/4) Start epoch 843 2023-11-14 23:08:10,258 INFO [train.py:811] (0/4) Start epoch 844 2023-11-14 23:09:09,506 INFO [train.py:467] (0/4) Epoch 844, batch 9, global_batch_idx: 31200, batch size: 67, loss[discriminator_loss=2.668, discriminator_real_loss=1.359, discriminator_fake_loss=1.308, generator_loss=30.07, generator_mel_loss=20.03, generator_kl_loss=2.059, generator_dur_loss=1.642, generator_adv_loss=2.061, generator_feat_match_loss=4.277, over 67.00 samples.], tot_loss[discriminator_loss=2.631, discriminator_real_loss=1.339, discriminator_fake_loss=1.291, generator_loss=30.21, generator_mel_loss=20.04, generator_kl_loss=2.036, generator_dur_loss=1.642, generator_adv_loss=2.144, generator_feat_match_loss=4.355, over 671.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 16.0 2023-11-14 23:09:10,046 INFO [train.py:508] (0/4) Computing validation 
loss 2023-11-14 23:09:20,748 INFO [train.py:517] (0/4) Epoch 844, validation: discriminator_loss=2.619, discriminator_real_loss=1.188, discriminator_fake_loss=1.431, generator_loss=31.26, generator_mel_loss=20.99, generator_kl_loss=2.229, generator_dur_loss=1.629, generator_adv_loss=1.894, generator_feat_match_loss=4.517, over 100.00 samples. 2023-11-14 23:09:20,750 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 23:11:51,702 INFO [train.py:811] (0/4) Start epoch 845 2023-11-14 23:14:06,024 INFO [train.py:467] (0/4) Epoch 845, batch 22, global_batch_idx: 31250, batch size: 52, loss[discriminator_loss=2.574, discriminator_real_loss=1.376, discriminator_fake_loss=1.199, generator_loss=30.64, generator_mel_loss=20.2, generator_kl_loss=1.936, generator_dur_loss=1.637, generator_adv_loss=2.207, generator_feat_match_loss=4.652, over 52.00 samples.], tot_loss[discriminator_loss=2.615, discriminator_real_loss=1.328, discriminator_fake_loss=1.288, generator_loss=30.39, generator_mel_loss=20.14, generator_kl_loss=2.018, generator_dur_loss=1.63, generator_adv_loss=2.176, generator_feat_match_loss=4.429, over 1649.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 16.0 2023-11-14 23:15:25,448 INFO [train.py:811] (0/4) Start epoch 846 2023-11-14 23:18:53,015 INFO [train.py:467] (0/4) Epoch 846, batch 35, global_batch_idx: 31300, batch size: 64, loss[discriminator_loss=2.643, discriminator_real_loss=1.301, discriminator_fake_loss=1.342, generator_loss=30.49, generator_mel_loss=20.06, generator_kl_loss=2.027, generator_dur_loss=1.637, generator_adv_loss=2.277, generator_feat_match_loss=4.488, over 64.00 samples.], tot_loss[discriminator_loss=2.616, discriminator_real_loss=1.323, discriminator_fake_loss=1.294, generator_loss=30.52, generator_mel_loss=20.21, generator_kl_loss=2.005, generator_dur_loss=1.631, generator_adv_loss=2.204, generator_feat_match_loss=4.465, over 2504.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-14 23:18:58,225 INFO [train.py:811] (0/4) Start epoch 847 2023-11-14 23:22:37,267 INFO [train.py:811] (0/4) Start epoch 848 2023-11-14 23:23:47,717 INFO [train.py:467] (0/4) Epoch 848, batch 11, global_batch_idx: 31350, batch size: 95, loss[discriminator_loss=2.504, discriminator_real_loss=1.32, discriminator_fake_loss=1.185, generator_loss=31.28, generator_mel_loss=20.25, generator_kl_loss=2.015, generator_dur_loss=1.62, generator_adv_loss=2.496, generator_feat_match_loss=4.895, over 95.00 samples.], tot_loss[discriminator_loss=2.547, discriminator_real_loss=1.289, discriminator_fake_loss=1.258, generator_loss=30.76, generator_mel_loss=20.1, generator_kl_loss=2.009, generator_dur_loss=1.636, generator_adv_loss=2.315, generator_feat_match_loss=4.698, over 897.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-14 23:26:09,675 INFO [train.py:811] (0/4) Start epoch 849 2023-11-14 23:28:37,718 INFO [train.py:467] (0/4) Epoch 849, batch 24, global_batch_idx: 31400, batch size: 51, loss[discriminator_loss=2.531, discriminator_real_loss=1.223, discriminator_fake_loss=1.309, generator_loss=30.63, generator_mel_loss=19.98, generator_kl_loss=1.979, generator_dur_loss=1.637, generator_adv_loss=2.246, generator_feat_match_loss=4.785, over 51.00 samples.], tot_loss[discriminator_loss=2.506, discriminator_real_loss=1.257, discriminator_fake_loss=1.249, generator_loss=30.75, generator_mel_loss=19.92, generator_kl_loss=2.032, generator_dur_loss=1.629, generator_adv_loss=2.327, 
generator_feat_match_loss=4.842, over 1792.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-14 23:28:38,439 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 23:28:48,940 INFO [train.py:517] (0/4) Epoch 849, validation: discriminator_loss=2.489, discriminator_real_loss=1.2, discriminator_fake_loss=1.289, generator_loss=31.3, generator_mel_loss=20.63, generator_kl_loss=2.258, generator_dur_loss=1.633, generator_adv_loss=1.962, generator_feat_match_loss=4.819, over 100.00 samples. 2023-11-14 23:28:48,941 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 23:29:55,958 INFO [train.py:811] (0/4) Start epoch 850 2023-11-14 23:33:25,597 INFO [train.py:811] (0/4) Start epoch 851 2023-11-14 23:33:40,092 INFO [train.py:467] (0/4) Epoch 851, batch 0, global_batch_idx: 31450, batch size: 52, loss[discriminator_loss=2.508, discriminator_real_loss=1.287, discriminator_fake_loss=1.222, generator_loss=30.64, generator_mel_loss=19.96, generator_kl_loss=2.045, generator_dur_loss=1.669, generator_adv_loss=2.396, generator_feat_match_loss=4.566, over 52.00 samples.], tot_loss[discriminator_loss=2.508, discriminator_real_loss=1.287, discriminator_fake_loss=1.222, generator_loss=30.64, generator_mel_loss=19.96, generator_kl_loss=2.045, generator_dur_loss=1.669, generator_adv_loss=2.396, generator_feat_match_loss=4.566, over 52.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-14 23:37:01,429 INFO [train.py:811] (0/4) Start epoch 852 2023-11-14 23:38:32,208 INFO [train.py:467] (0/4) Epoch 852, batch 13, global_batch_idx: 31500, batch size: 153, loss[discriminator_loss=2.424, discriminator_real_loss=1.144, discriminator_fake_loss=1.28, generator_loss=31.14, generator_mel_loss=19.88, generator_kl_loss=1.874, generator_dur_loss=1.598, generator_adv_loss=2.336, generator_feat_match_loss=5.453, over 153.00 samples.], tot_loss[discriminator_loss=2.524, discriminator_real_loss=1.274, discriminator_fake_loss=1.25, generator_loss=30.94, generator_mel_loss=20.01, generator_kl_loss=1.991, generator_dur_loss=1.629, generator_adv_loss=2.332, generator_feat_match_loss=4.982, over 1197.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-14 23:40:31,555 INFO [train.py:811] (0/4) Start epoch 853 2023-11-14 23:43:12,747 INFO [train.py:467] (0/4) Epoch 853, batch 26, global_batch_idx: 31550, batch size: 79, loss[discriminator_loss=2.527, discriminator_real_loss=1.274, discriminator_fake_loss=1.254, generator_loss=31.2, generator_mel_loss=20.39, generator_kl_loss=1.998, generator_dur_loss=1.635, generator_adv_loss=2.311, generator_feat_match_loss=4.863, over 79.00 samples.], tot_loss[discriminator_loss=2.553, discriminator_real_loss=1.3, discriminator_fake_loss=1.252, generator_loss=30.57, generator_mel_loss=20.06, generator_kl_loss=2.005, generator_dur_loss=1.63, generator_adv_loss=2.23, generator_feat_match_loss=4.655, over 1935.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-14 23:44:08,118 INFO [train.py:811] (0/4) Start epoch 854 2023-11-14 23:47:43,497 INFO [train.py:811] (0/4) Start epoch 855 2023-11-14 23:48:06,062 INFO [train.py:467] (0/4) Epoch 855, batch 2, global_batch_idx: 31600, batch size: 59, loss[discriminator_loss=2.443, discriminator_real_loss=1.194, discriminator_fake_loss=1.249, generator_loss=30.89, generator_mel_loss=19.87, generator_kl_loss=1.998, generator_dur_loss=1.64, generator_adv_loss=2.322, generator_feat_match_loss=5.059, over 59.00 samples.], 
tot_loss[discriminator_loss=2.516, discriminator_real_loss=1.309, discriminator_fake_loss=1.208, generator_loss=30.37, generator_mel_loss=19.77, generator_kl_loss=1.985, generator_dur_loss=1.642, generator_adv_loss=2.251, generator_feat_match_loss=4.723, over 162.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 16.0 2023-11-14 23:48:06,702 INFO [train.py:508] (0/4) Computing validation loss 2023-11-14 23:48:18,037 INFO [train.py:517] (0/4) Epoch 855, validation: discriminator_loss=2.382, discriminator_real_loss=1.122, discriminator_fake_loss=1.26, generator_loss=31.99, generator_mel_loss=20.71, generator_kl_loss=2.234, generator_dur_loss=1.63, generator_adv_loss=2.125, generator_feat_match_loss=5.289, over 100.00 samples. 2023-11-14 23:48:18,038 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-14 23:51:27,879 INFO [train.py:811] (0/4) Start epoch 856 2023-11-14 23:53:03,532 INFO [train.py:467] (0/4) Epoch 856, batch 15, global_batch_idx: 31650, batch size: 76, loss[discriminator_loss=2.424, discriminator_real_loss=1.283, discriminator_fake_loss=1.141, generator_loss=31.11, generator_mel_loss=19.89, generator_kl_loss=1.951, generator_dur_loss=1.655, generator_adv_loss=2.457, generator_feat_match_loss=5.156, over 76.00 samples.], tot_loss[discriminator_loss=2.494, discriminator_real_loss=1.262, discriminator_fake_loss=1.233, generator_loss=30.64, generator_mel_loss=19.88, generator_kl_loss=2.021, generator_dur_loss=1.631, generator_adv_loss=2.3, generator_feat_match_loss=4.813, over 1182.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-14 23:55:02,667 INFO [train.py:811] (0/4) Start epoch 857 2023-11-14 23:57:43,676 INFO [train.py:467] (0/4) Epoch 857, batch 28, global_batch_idx: 31700, batch size: 95, loss[discriminator_loss=2.43, discriminator_real_loss=1.174, discriminator_fake_loss=1.257, generator_loss=31.14, generator_mel_loss=19.91, generator_kl_loss=2.112, generator_dur_loss=1.641, generator_adv_loss=2.5, generator_feat_match_loss=4.977, over 95.00 samples.], tot_loss[discriminator_loss=2.471, discriminator_real_loss=1.243, discriminator_fake_loss=1.228, generator_loss=30.72, generator_mel_loss=19.7, generator_kl_loss=2.013, generator_dur_loss=1.634, generator_adv_loss=2.388, generator_feat_match_loss=4.986, over 1965.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-14 23:58:30,448 INFO [train.py:811] (0/4) Start epoch 858 2023-11-15 00:02:05,210 INFO [train.py:811] (0/4) Start epoch 859 2023-11-15 00:02:41,958 INFO [train.py:467] (0/4) Epoch 859, batch 4, global_batch_idx: 31750, batch size: 95, loss[discriminator_loss=2.299, discriminator_real_loss=1.15, discriminator_fake_loss=1.148, generator_loss=31.82, generator_mel_loss=19.81, generator_kl_loss=1.975, generator_dur_loss=1.6, generator_adv_loss=2.561, generator_feat_match_loss=5.867, over 95.00 samples.], tot_loss[discriminator_loss=2.391, discriminator_real_loss=1.236, discriminator_fake_loss=1.155, generator_loss=31.02, generator_mel_loss=19.6, generator_kl_loss=1.995, generator_dur_loss=1.626, generator_adv_loss=2.384, generator_feat_match_loss=5.408, over 402.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-15 00:05:43,270 INFO [train.py:811] (0/4) Start epoch 860 2023-11-15 00:07:31,663 INFO [train.py:467] (0/4) Epoch 860, batch 17, global_batch_idx: 31800, batch size: 69, loss[discriminator_loss=2.605, discriminator_real_loss=1.249, discriminator_fake_loss=1.355, generator_loss=30.95, 
generator_mel_loss=20.26, generator_kl_loss=2.019, generator_dur_loss=1.617, generator_adv_loss=2.357, generator_feat_match_loss=4.703, over 69.00 samples.], tot_loss[discriminator_loss=2.577, discriminator_real_loss=1.301, discriminator_fake_loss=1.276, generator_loss=30.41, generator_mel_loss=20, generator_kl_loss=2.006, generator_dur_loss=1.639, generator_adv_loss=2.22, generator_feat_match_loss=4.546, over 1201.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-15 00:07:32,163 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 00:07:42,688 INFO [train.py:517] (0/4) Epoch 860, validation: discriminator_loss=2.631, discriminator_real_loss=1.324, discriminator_fake_loss=1.307, generator_loss=31.27, generator_mel_loss=20.89, generator_kl_loss=2.236, generator_dur_loss=1.624, generator_adv_loss=2.088, generator_feat_match_loss=4.43, over 100.00 samples. 2023-11-15 00:07:42,689 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 00:09:31,039 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-860.pt 2023-11-15 00:09:34,377 INFO [train.py:811] (0/4) Start epoch 861 2023-11-15 00:12:28,541 INFO [train.py:467] (0/4) Epoch 861, batch 30, global_batch_idx: 31850, batch size: 64, loss[discriminator_loss=2.613, discriminator_real_loss=1.278, discriminator_fake_loss=1.334, generator_loss=30, generator_mel_loss=19.65, generator_kl_loss=1.942, generator_dur_loss=1.628, generator_adv_loss=2.371, generator_feat_match_loss=4.414, over 64.00 samples.], tot_loss[discriminator_loss=2.586, discriminator_real_loss=1.313, discriminator_fake_loss=1.272, generator_loss=30.69, generator_mel_loss=20.2, generator_kl_loss=2.023, generator_dur_loss=1.637, generator_adv_loss=2.241, generator_feat_match_loss=4.586, over 2178.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-15 00:13:08,501 INFO [train.py:811] (0/4) Start epoch 862 2023-11-15 00:16:40,558 INFO [train.py:811] (0/4) Start epoch 863 2023-11-15 00:17:27,072 INFO [train.py:467] (0/4) Epoch 863, batch 6, global_batch_idx: 31900, batch size: 69, loss[discriminator_loss=2.57, discriminator_real_loss=1.404, discriminator_fake_loss=1.167, generator_loss=30.35, generator_mel_loss=19.73, generator_kl_loss=2.014, generator_dur_loss=1.613, generator_adv_loss=2.26, generator_feat_match_loss=4.734, over 69.00 samples.], tot_loss[discriminator_loss=2.485, discriminator_real_loss=1.26, discriminator_fake_loss=1.225, generator_loss=30.87, generator_mel_loss=19.95, generator_kl_loss=2.039, generator_dur_loss=1.633, generator_adv_loss=2.33, generator_feat_match_loss=4.912, over 509.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-15 00:20:14,432 INFO [train.py:811] (0/4) Start epoch 864 2023-11-15 00:22:19,692 INFO [train.py:467] (0/4) Epoch 864, batch 19, global_batch_idx: 31950, batch size: 50, loss[discriminator_loss=2.527, discriminator_real_loss=1.226, discriminator_fake_loss=1.302, generator_loss=30.43, generator_mel_loss=19.83, generator_kl_loss=1.915, generator_dur_loss=1.629, generator_adv_loss=2.359, generator_feat_match_loss=4.695, over 50.00 samples.], tot_loss[discriminator_loss=2.545, discriminator_real_loss=1.286, discriminator_fake_loss=1.26, generator_loss=30.42, generator_mel_loss=20, generator_kl_loss=2.019, generator_dur_loss=1.63, generator_adv_loss=2.187, generator_feat_match_loss=4.579, over 1597.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2023-11-15 00:23:47,349 
INFO [train.py:811] (0/4) Start epoch 865 2023-11-15 00:26:58,629 INFO [train.py:467] (0/4) Epoch 865, batch 32, global_batch_idx: 32000, batch size: 60, loss[discriminator_loss=2.688, discriminator_real_loss=1.305, discriminator_fake_loss=1.382, generator_loss=30.02, generator_mel_loss=20, generator_kl_loss=1.987, generator_dur_loss=1.655, generator_adv_loss=2.242, generator_feat_match_loss=4.137, over 60.00 samples.], tot_loss[discriminator_loss=2.558, discriminator_real_loss=1.288, discriminator_fake_loss=1.27, generator_loss=30.54, generator_mel_loss=19.91, generator_kl_loss=2.009, generator_dur_loss=1.632, generator_adv_loss=2.292, generator_feat_match_loss=4.699, over 2167.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 16.0 2023-11-15 00:26:59,234 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 00:27:09,993 INFO [train.py:517] (0/4) Epoch 865, validation: discriminator_loss=2.595, discriminator_real_loss=1.359, discriminator_fake_loss=1.236, generator_loss=31.08, generator_mel_loss=20.52, generator_kl_loss=2.151, generator_dur_loss=1.63, generator_adv_loss=2.244, generator_feat_match_loss=4.534, over 100.00 samples. 2023-11-15 00:27:09,994 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 00:27:30,610 INFO [train.py:811] (0/4) Start epoch 866 2023-11-15 00:31:05,948 INFO [train.py:811] (0/4) Start epoch 867 2023-11-15 00:32:09,237 INFO [train.py:467] (0/4) Epoch 867, batch 8, global_batch_idx: 32050, batch size: 81, loss[discriminator_loss=2.562, discriminator_real_loss=1.174, discriminator_fake_loss=1.389, generator_loss=30.22, generator_mel_loss=19.63, generator_kl_loss=2.033, generator_dur_loss=1.619, generator_adv_loss=2.256, generator_feat_match_loss=4.676, over 81.00 samples.], tot_loss[discriminator_loss=2.476, discriminator_real_loss=1.232, discriminator_fake_loss=1.245, generator_loss=30.92, generator_mel_loss=19.73, generator_kl_loss=1.967, generator_dur_loss=1.631, generator_adv_loss=2.38, generator_feat_match_loss=5.21, over 675.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 00:34:43,484 INFO [train.py:811] (0/4) Start epoch 868 2023-11-15 00:36:46,834 INFO [train.py:467] (0/4) Epoch 868, batch 21, global_batch_idx: 32100, batch size: 67, loss[discriminator_loss=2.602, discriminator_real_loss=1.319, discriminator_fake_loss=1.281, generator_loss=30.85, generator_mel_loss=20.09, generator_kl_loss=2.075, generator_dur_loss=1.632, generator_adv_loss=2.35, generator_feat_match_loss=4.703, over 67.00 samples.], tot_loss[discriminator_loss=2.561, discriminator_real_loss=1.3, discriminator_fake_loss=1.261, generator_loss=30.54, generator_mel_loss=20.04, generator_kl_loss=2.037, generator_dur_loss=1.634, generator_adv_loss=2.217, generator_feat_match_loss=4.617, over 1551.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 00:38:13,939 INFO [train.py:811] (0/4) Start epoch 869 2023-11-15 00:41:37,540 INFO [train.py:467] (0/4) Epoch 869, batch 34, global_batch_idx: 32150, batch size: 101, loss[discriminator_loss=2.484, discriminator_real_loss=1.311, discriminator_fake_loss=1.174, generator_loss=31.04, generator_mel_loss=20.04, generator_kl_loss=2.104, generator_dur_loss=1.629, generator_adv_loss=2.324, generator_feat_match_loss=4.945, over 101.00 samples.], tot_loss[discriminator_loss=2.541, discriminator_real_loss=1.292, discriminator_fake_loss=1.248, generator_loss=30.61, generator_mel_loss=19.95, generator_kl_loss=2.016, generator_dur_loss=1.628, 
generator_adv_loss=2.273, generator_feat_match_loss=4.738, over 2681.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 00:41:47,380 INFO [train.py:811] (0/4) Start epoch 870 2023-11-15 00:45:21,999 INFO [train.py:811] (0/4) Start epoch 871 2023-11-15 00:46:29,967 INFO [train.py:467] (0/4) Epoch 871, batch 10, global_batch_idx: 32200, batch size: 58, loss[discriminator_loss=2.502, discriminator_real_loss=1.289, discriminator_fake_loss=1.213, generator_loss=30.34, generator_mel_loss=19.61, generator_kl_loss=2.075, generator_dur_loss=1.649, generator_adv_loss=2.25, generator_feat_match_loss=4.754, over 58.00 samples.], tot_loss[discriminator_loss=2.543, discriminator_real_loss=1.305, discriminator_fake_loss=1.237, generator_loss=30.77, generator_mel_loss=20.12, generator_kl_loss=2.046, generator_dur_loss=1.64, generator_adv_loss=2.278, generator_feat_match_loss=4.685, over 715.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 00:46:30,669 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 00:46:40,914 INFO [train.py:517] (0/4) Epoch 871, validation: discriminator_loss=2.558, discriminator_real_loss=1.112, discriminator_fake_loss=1.446, generator_loss=31.29, generator_mel_loss=20.74, generator_kl_loss=2.289, generator_dur_loss=1.629, generator_adv_loss=1.83, generator_feat_match_loss=4.802, over 100.00 samples. 2023-11-15 00:46:40,915 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 00:49:04,301 INFO [train.py:811] (0/4) Start epoch 872 2023-11-15 00:51:22,614 INFO [train.py:467] (0/4) Epoch 872, batch 23, global_batch_idx: 32250, batch size: 85, loss[discriminator_loss=2.727, discriminator_real_loss=1.182, discriminator_fake_loss=1.545, generator_loss=30.36, generator_mel_loss=19.8, generator_kl_loss=2.068, generator_dur_loss=1.648, generator_adv_loss=2.209, generator_feat_match_loss=4.629, over 85.00 samples.], tot_loss[discriminator_loss=2.442, discriminator_real_loss=1.234, discriminator_fake_loss=1.207, generator_loss=31.12, generator_mel_loss=19.76, generator_kl_loss=2.005, generator_dur_loss=1.631, generator_adv_loss=2.445, generator_feat_match_loss=5.273, over 1834.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 00:52:32,262 INFO [train.py:811] (0/4) Start epoch 873 2023-11-15 00:56:02,981 INFO [train.py:467] (0/4) Epoch 873, batch 36, global_batch_idx: 32300, batch size: 126, loss[discriminator_loss=2.566, discriminator_real_loss=1.333, discriminator_fake_loss=1.234, generator_loss=30.37, generator_mel_loss=19.87, generator_kl_loss=2.001, generator_dur_loss=1.634, generator_adv_loss=2.168, generator_feat_match_loss=4.691, over 126.00 samples.], tot_loss[discriminator_loss=2.492, discriminator_real_loss=1.251, discriminator_fake_loss=1.241, generator_loss=30.54, generator_mel_loss=19.75, generator_kl_loss=2.022, generator_dur_loss=1.632, generator_adv_loss=2.283, generator_feat_match_loss=4.856, over 2611.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 00:56:04,272 INFO [train.py:811] (0/4) Start epoch 874 2023-11-15 00:59:41,418 INFO [train.py:811] (0/4) Start epoch 875 2023-11-15 01:01:11,187 INFO [train.py:467] (0/4) Epoch 875, batch 12, global_batch_idx: 32350, batch size: 81, loss[discriminator_loss=2.414, discriminator_real_loss=1.22, discriminator_fake_loss=1.193, generator_loss=31.24, generator_mel_loss=20.12, generator_kl_loss=2.063, generator_dur_loss=1.615, generator_adv_loss=2.211, 
generator_feat_match_loss=5.23, over 81.00 samples.], tot_loss[discriminator_loss=2.62, discriminator_real_loss=1.342, discriminator_fake_loss=1.279, generator_loss=30.71, generator_mel_loss=19.97, generator_kl_loss=2.051, generator_dur_loss=1.625, generator_adv_loss=2.356, generator_feat_match_loss=4.707, over 1106.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 01:03:09,868 INFO [train.py:811] (0/4) Start epoch 876 2023-11-15 01:05:38,781 INFO [train.py:467] (0/4) Epoch 876, batch 25, global_batch_idx: 32400, batch size: 49, loss[discriminator_loss=2.65, discriminator_real_loss=1.32, discriminator_fake_loss=1.33, generator_loss=30.08, generator_mel_loss=19.89, generator_kl_loss=2.047, generator_dur_loss=1.611, generator_adv_loss=2.193, generator_feat_match_loss=4.34, over 49.00 samples.], tot_loss[discriminator_loss=2.553, discriminator_real_loss=1.283, discriminator_fake_loss=1.271, generator_loss=30.52, generator_mel_loss=19.84, generator_kl_loss=2.024, generator_dur_loss=1.634, generator_adv_loss=2.267, generator_feat_match_loss=4.755, over 1746.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 16.0 2023-11-15 01:05:39,311 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 01:05:49,917 INFO [train.py:517] (0/4) Epoch 876, validation: discriminator_loss=2.488, discriminator_real_loss=1.209, discriminator_fake_loss=1.279, generator_loss=31.4, generator_mel_loss=20.55, generator_kl_loss=2.287, generator_dur_loss=1.635, generator_adv_loss=2.072, generator_feat_match_loss=4.859, over 100.00 samples. 2023-11-15 01:05:49,918 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 01:06:55,206 INFO [train.py:811] (0/4) Start epoch 877 2023-11-15 01:10:28,114 INFO [train.py:811] (0/4) Start epoch 878 2023-11-15 01:10:48,330 INFO [train.py:467] (0/4) Epoch 878, batch 1, global_batch_idx: 32450, batch size: 51, loss[discriminator_loss=2.342, discriminator_real_loss=1.248, discriminator_fake_loss=1.094, generator_loss=31.19, generator_mel_loss=19.23, generator_kl_loss=2.154, generator_dur_loss=1.632, generator_adv_loss=2.518, generator_feat_match_loss=5.652, over 51.00 samples.], tot_loss[discriminator_loss=2.372, discriminator_real_loss=1.216, discriminator_fake_loss=1.156, generator_loss=31.49, generator_mel_loss=19.7, generator_kl_loss=2.067, generator_dur_loss=1.622, generator_adv_loss=2.533, generator_feat_match_loss=5.568, over 132.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 01:14:04,008 INFO [train.py:811] (0/4) Start epoch 879 2023-11-15 01:15:30,530 INFO [train.py:467] (0/4) Epoch 879, batch 14, global_batch_idx: 32500, batch size: 58, loss[discriminator_loss=2.465, discriminator_real_loss=1.317, discriminator_fake_loss=1.146, generator_loss=30.72, generator_mel_loss=19.62, generator_kl_loss=2.082, generator_dur_loss=1.619, generator_adv_loss=2.355, generator_feat_match_loss=5.051, over 58.00 samples.], tot_loss[discriminator_loss=2.537, discriminator_real_loss=1.303, discriminator_fake_loss=1.234, generator_loss=30.82, generator_mel_loss=20.1, generator_kl_loss=2.061, generator_dur_loss=1.634, generator_adv_loss=2.277, generator_feat_match_loss=4.752, over 1120.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 01:17:34,792 INFO [train.py:811] (0/4) Start epoch 880 2023-11-15 01:20:20,352 INFO [train.py:467] (0/4) Epoch 880, batch 27, global_batch_idx: 32550, batch size: 52, loss[discriminator_loss=2.504, discriminator_real_loss=1.328, 
discriminator_fake_loss=1.177, generator_loss=30.39, generator_mel_loss=19.86, generator_kl_loss=1.959, generator_dur_loss=1.645, generator_adv_loss=2.17, generator_feat_match_loss=4.754, over 52.00 samples.], tot_loss[discriminator_loss=2.446, discriminator_real_loss=1.227, discriminator_fake_loss=1.219, generator_loss=30.68, generator_mel_loss=19.57, generator_kl_loss=1.991, generator_dur_loss=1.633, generator_adv_loss=2.394, generator_feat_match_loss=5.098, over 1918.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 01:21:07,282 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-880.pt 2023-11-15 01:21:10,751 INFO [train.py:811] (0/4) Start epoch 881 2023-11-15 01:24:31,841 INFO [train.py:811] (0/4) Start epoch 882 2023-11-15 01:25:01,322 INFO [train.py:467] (0/4) Epoch 882, batch 3, global_batch_idx: 32600, batch size: 69, loss[discriminator_loss=2.48, discriminator_real_loss=1.265, discriminator_fake_loss=1.216, generator_loss=30.87, generator_mel_loss=19.71, generator_kl_loss=2.022, generator_dur_loss=1.637, generator_adv_loss=2.453, generator_feat_match_loss=5.043, over 69.00 samples.], tot_loss[discriminator_loss=2.529, discriminator_real_loss=1.281, discriminator_fake_loss=1.248, generator_loss=30.43, generator_mel_loss=19.62, generator_kl_loss=2.051, generator_dur_loss=1.645, generator_adv_loss=2.36, generator_feat_match_loss=4.755, over 244.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 01:25:01,818 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 01:25:13,538 INFO [train.py:517] (0/4) Epoch 882, validation: discriminator_loss=2.375, discriminator_real_loss=1.179, discriminator_fake_loss=1.196, generator_loss=32, generator_mel_loss=20.37, generator_kl_loss=2.176, generator_dur_loss=1.642, generator_adv_loss=2.394, generator_feat_match_loss=5.41, over 100.00 samples. 
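The grad_scale field flips between 8.0 and 16.0 in the surrounding records because fp16 training uses PyTorch's dynamic loss scaling: torch.cuda.amp.GradScaler doubles the scale after a long run of overflow-free steps and halves it whenever inf/nan gradients are detected. A minimal, self-contained sketch of that standard pattern follows; the linear model and toy loop are stand-ins, not the VITS training step.

```python
import torch
from torch.cuda.amp import GradScaler, autocast

model = torch.nn.Linear(80, 80).cuda()      # stand-in for the real generator
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-4)
scaler = GradScaler()                       # dynamic loss scale: x2 after 2000
                                            # overflow-free steps, x0.5 on overflow

for _ in range(10):                         # illustrative training steps
    x = torch.randn(4, 80, device="cuda")
    optimizer.zero_grad()
    with autocast():                        # fp16 forward where numerically safe
        loss = model(x).pow(2).mean()
    scaler.scale(loss).backward()           # backward on the scaled loss
    scaler.step(optimizer)                  # step is skipped if inf/nan grads found
    scaler.update()                         # grow or back off the scale
    print(scaler.get_scale())               # the value logged as grad_scale
```

Each drop from 16.0 back to 8.0 marks a step where the scaled gradients overflowed and the optimizer step was skipped; growth back to 16.0 happens only after a long clean run, which matches how rarely the logged value changes.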
2023-11-15 01:25:13,539 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 01:28:19,420 INFO [train.py:811] (0/4) Start epoch 883 2023-11-15 01:29:54,454 INFO [train.py:467] (0/4) Epoch 883, batch 16, global_batch_idx: 32650, batch size: 56, loss[discriminator_loss=2.422, discriminator_real_loss=1.258, discriminator_fake_loss=1.163, generator_loss=31.07, generator_mel_loss=20.09, generator_kl_loss=2.058, generator_dur_loss=1.645, generator_adv_loss=2.266, generator_feat_match_loss=5.008, over 56.00 samples.], tot_loss[discriminator_loss=2.549, discriminator_real_loss=1.278, discriminator_fake_loss=1.271, generator_loss=30.91, generator_mel_loss=20.2, generator_kl_loss=2.042, generator_dur_loss=1.636, generator_adv_loss=2.241, generator_feat_match_loss=4.79, over 1281.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 01:31:43,298 INFO [train.py:811] (0/4) Start epoch 884 2023-11-15 01:34:31,874 INFO [train.py:467] (0/4) Epoch 884, batch 29, global_batch_idx: 32700, batch size: 51, loss[discriminator_loss=2.598, discriminator_real_loss=1.273, discriminator_fake_loss=1.324, generator_loss=30.48, generator_mel_loss=19.99, generator_kl_loss=2.115, generator_dur_loss=1.634, generator_adv_loss=2.178, generator_feat_match_loss=4.562, over 51.00 samples.], tot_loss[discriminator_loss=2.564, discriminator_real_loss=1.297, discriminator_fake_loss=1.267, generator_loss=30.38, generator_mel_loss=19.77, generator_kl_loss=2.001, generator_dur_loss=1.629, generator_adv_loss=2.292, generator_feat_match_loss=4.686, over 2208.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 01:35:17,535 INFO [train.py:811] (0/4) Start epoch 885 2023-11-15 01:38:53,577 INFO [train.py:811] (0/4) Start epoch 886 2023-11-15 01:39:37,952 INFO [train.py:467] (0/4) Epoch 886, batch 5, global_batch_idx: 32750, batch size: 55, loss[discriminator_loss=2.633, discriminator_real_loss=1.287, discriminator_fake_loss=1.347, generator_loss=30.14, generator_mel_loss=19.92, generator_kl_loss=2.049, generator_dur_loss=1.678, generator_adv_loss=2.059, generator_feat_match_loss=4.441, over 55.00 samples.], tot_loss[discriminator_loss=2.568, discriminator_real_loss=1.295, discriminator_fake_loss=1.274, generator_loss=30.69, generator_mel_loss=20.26, generator_kl_loss=2.045, generator_dur_loss=1.631, generator_adv_loss=2.2, generator_feat_match_loss=4.555, over 539.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 01:42:23,397 INFO [train.py:811] (0/4) Start epoch 887 2023-11-15 01:44:16,983 INFO [train.py:467] (0/4) Epoch 887, batch 18, global_batch_idx: 32800, batch size: 73, loss[discriminator_loss=2.564, discriminator_real_loss=1.387, discriminator_fake_loss=1.178, generator_loss=30.8, generator_mel_loss=19.96, generator_kl_loss=2.002, generator_dur_loss=1.64, generator_adv_loss=2.258, generator_feat_match_loss=4.941, over 73.00 samples.], tot_loss[discriminator_loss=2.57, discriminator_real_loss=1.314, discriminator_fake_loss=1.256, generator_loss=30.63, generator_mel_loss=20.06, generator_kl_loss=2.039, generator_dur_loss=1.633, generator_adv_loss=2.247, generator_feat_match_loss=4.647, over 1407.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 16.0 2023-11-15 01:44:17,549 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 01:44:28,139 INFO [train.py:517] (0/4) Epoch 887, validation: discriminator_loss=2.489, discriminator_real_loss=1.192, discriminator_fake_loss=1.297, generator_loss=31.07, 
generator_mel_loss=20.3, generator_kl_loss=2.217, generator_dur_loss=1.644, generator_adv_loss=2.108, generator_feat_match_loss=4.801, over 100.00 samples. 2023-11-15 01:44:28,140 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 01:46:05,300 INFO [train.py:811] (0/4) Start epoch 888 2023-11-15 01:48:53,879 INFO [train.py:467] (0/4) Epoch 888, batch 31, global_batch_idx: 32850, batch size: 54, loss[discriminator_loss=2.461, discriminator_real_loss=1.324, discriminator_fake_loss=1.136, generator_loss=31.38, generator_mel_loss=19.98, generator_kl_loss=1.997, generator_dur_loss=1.63, generator_adv_loss=2.465, generator_feat_match_loss=5.312, over 54.00 samples.], tot_loss[discriminator_loss=2.526, discriminator_real_loss=1.285, discriminator_fake_loss=1.242, generator_loss=30.45, generator_mel_loss=19.89, generator_kl_loss=2, generator_dur_loss=1.634, generator_adv_loss=2.263, generator_feat_match_loss=4.666, over 2245.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 01:49:29,422 INFO [train.py:811] (0/4) Start epoch 889 2023-11-15 01:53:01,056 INFO [train.py:811] (0/4) Start epoch 890 2023-11-15 01:53:55,602 INFO [train.py:467] (0/4) Epoch 890, batch 7, global_batch_idx: 32900, batch size: 52, loss[discriminator_loss=2.477, discriminator_real_loss=1.187, discriminator_fake_loss=1.291, generator_loss=30.69, generator_mel_loss=19.82, generator_kl_loss=1.972, generator_dur_loss=1.637, generator_adv_loss=2.379, generator_feat_match_loss=4.883, over 52.00 samples.], tot_loss[discriminator_loss=2.517, discriminator_real_loss=1.257, discriminator_fake_loss=1.26, generator_loss=30.34, generator_mel_loss=19.67, generator_kl_loss=1.985, generator_dur_loss=1.629, generator_adv_loss=2.291, generator_feat_match_loss=4.762, over 502.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 01:56:34,003 INFO [train.py:811] (0/4) Start epoch 891 2023-11-15 01:58:40,556 INFO [train.py:467] (0/4) Epoch 891, batch 20, global_batch_idx: 32950, batch size: 64, loss[discriminator_loss=2.695, discriminator_real_loss=1.537, discriminator_fake_loss=1.158, generator_loss=30.62, generator_mel_loss=20.01, generator_kl_loss=1.961, generator_dur_loss=1.619, generator_adv_loss=2.467, generator_feat_match_loss=4.566, over 64.00 samples.], tot_loss[discriminator_loss=2.52, discriminator_real_loss=1.265, discriminator_fake_loss=1.255, generator_loss=30.69, generator_mel_loss=19.86, generator_kl_loss=2.003, generator_dur_loss=1.634, generator_adv_loss=2.319, generator_feat_match_loss=4.873, over 1476.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 02:00:13,529 INFO [train.py:811] (0/4) Start epoch 892 2023-11-15 02:03:30,991 INFO [train.py:467] (0/4) Epoch 892, batch 33, global_batch_idx: 33000, batch size: 56, loss[discriminator_loss=2.59, discriminator_real_loss=1.143, discriminator_fake_loss=1.448, generator_loss=31.43, generator_mel_loss=20.35, generator_kl_loss=2.015, generator_dur_loss=1.623, generator_adv_loss=2.443, generator_feat_match_loss=5.004, over 56.00 samples.], tot_loss[discriminator_loss=2.503, discriminator_real_loss=1.268, discriminator_fake_loss=1.236, generator_loss=30.75, generator_mel_loss=19.97, generator_kl_loss=2.014, generator_dur_loss=1.631, generator_adv_loss=2.286, generator_feat_match_loss=4.853, over 2509.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 02:03:31,587 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 02:03:42,570 INFO 
[train.py:517] (0/4) Epoch 892, validation: discriminator_loss=2.723, discriminator_real_loss=1.326, discriminator_fake_loss=1.397, generator_loss=30.8, generator_mel_loss=20.53, generator_kl_loss=2.251, generator_dur_loss=1.632, generator_adv_loss=1.933, generator_feat_match_loss=4.461, over 100.00 samples. 2023-11-15 02:03:42,571 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 02:03:58,449 INFO [train.py:811] (0/4) Start epoch 893 2023-11-15 02:07:32,086 INFO [train.py:811] (0/4) Start epoch 894 2023-11-15 02:08:30,675 INFO [train.py:467] (0/4) Epoch 894, batch 9, global_batch_idx: 33050, batch size: 65, loss[discriminator_loss=2.527, discriminator_real_loss=1.209, discriminator_fake_loss=1.318, generator_loss=30.69, generator_mel_loss=20.37, generator_kl_loss=1.975, generator_dur_loss=1.638, generator_adv_loss=2.084, generator_feat_match_loss=4.621, over 65.00 samples.], tot_loss[discriminator_loss=2.591, discriminator_real_loss=1.316, discriminator_fake_loss=1.275, generator_loss=30.15, generator_mel_loss=19.8, generator_kl_loss=2.006, generator_dur_loss=1.645, generator_adv_loss=2.165, generator_feat_match_loss=4.532, over 602.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 02:11:05,753 INFO [train.py:811] (0/4) Start epoch 895 2023-11-15 02:13:20,305 INFO [train.py:467] (0/4) Epoch 895, batch 22, global_batch_idx: 33100, batch size: 101, loss[discriminator_loss=2.617, discriminator_real_loss=1.253, discriminator_fake_loss=1.364, generator_loss=30.74, generator_mel_loss=20.15, generator_kl_loss=2.002, generator_dur_loss=1.628, generator_adv_loss=2.355, generator_feat_match_loss=4.602, over 101.00 samples.], tot_loss[discriminator_loss=2.551, discriminator_real_loss=1.281, discriminator_fake_loss=1.271, generator_loss=30.69, generator_mel_loss=20.12, generator_kl_loss=2.035, generator_dur_loss=1.627, generator_adv_loss=2.226, generator_feat_match_loss=4.681, over 1720.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 02:14:40,275 INFO [train.py:811] (0/4) Start epoch 896 2023-11-15 02:18:09,876 INFO [train.py:467] (0/4) Epoch 896, batch 35, global_batch_idx: 33150, batch size: 55, loss[discriminator_loss=2.588, discriminator_real_loss=1.252, discriminator_fake_loss=1.336, generator_loss=30.58, generator_mel_loss=19.97, generator_kl_loss=2.011, generator_dur_loss=1.674, generator_adv_loss=2.137, generator_feat_match_loss=4.789, over 55.00 samples.], tot_loss[discriminator_loss=2.584, discriminator_real_loss=1.311, discriminator_fake_loss=1.273, generator_loss=30.43, generator_mel_loss=19.81, generator_kl_loss=2.008, generator_dur_loss=1.632, generator_adv_loss=2.259, generator_feat_match_loss=4.718, over 2652.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 02:18:14,669 INFO [train.py:811] (0/4) Start epoch 897 2023-11-15 02:21:46,048 INFO [train.py:811] (0/4) Start epoch 898 2023-11-15 02:23:02,650 INFO [train.py:467] (0/4) Epoch 898, batch 11, global_batch_idx: 33200, batch size: 52, loss[discriminator_loss=2.652, discriminator_real_loss=1.352, discriminator_fake_loss=1.302, generator_loss=29.69, generator_mel_loss=19.64, generator_kl_loss=2.009, generator_dur_loss=1.644, generator_adv_loss=2.207, generator_feat_match_loss=4.195, over 52.00 samples.], tot_loss[discriminator_loss=2.567, discriminator_real_loss=1.282, discriminator_fake_loss=1.285, generator_loss=30.55, generator_mel_loss=19.94, generator_kl_loss=2.002, generator_dur_loss=1.63, 
generator_adv_loss=2.234, generator_feat_match_loss=4.747, over 1000.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 16.0 2023-11-15 02:23:03,128 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 02:23:13,662 INFO [train.py:517] (0/4) Epoch 898, validation: discriminator_loss=2.641, discriminator_real_loss=1.339, discriminator_fake_loss=1.302, generator_loss=31.78, generator_mel_loss=20.81, generator_kl_loss=2.316, generator_dur_loss=1.625, generator_adv_loss=2.175, generator_feat_match_loss=4.856, over 100.00 samples. 2023-11-15 02:23:13,664 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 02:25:27,250 INFO [train.py:811] (0/4) Start epoch 899 2023-11-15 02:27:58,765 INFO [train.py:467] (0/4) Epoch 899, batch 24, global_batch_idx: 33250, batch size: 101, loss[discriminator_loss=2.574, discriminator_real_loss=1.272, discriminator_fake_loss=1.302, generator_loss=30.96, generator_mel_loss=20.36, generator_kl_loss=2.036, generator_dur_loss=1.627, generator_adv_loss=2.172, generator_feat_match_loss=4.762, over 101.00 samples.], tot_loss[discriminator_loss=2.572, discriminator_real_loss=1.304, discriminator_fake_loss=1.269, generator_loss=30.49, generator_mel_loss=20.15, generator_kl_loss=2.022, generator_dur_loss=1.63, generator_adv_loss=2.164, generator_feat_match_loss=4.527, over 1815.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 16.0 2023-11-15 02:29:04,384 INFO [train.py:811] (0/4) Start epoch 900 2023-11-15 02:32:36,644 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-900.pt 2023-11-15 02:32:40,015 INFO [train.py:811] (0/4) Start epoch 901 2023-11-15 02:32:54,429 INFO [train.py:467] (0/4) Epoch 901, batch 0, global_batch_idx: 33300, batch size: 49, loss[discriminator_loss=2.551, discriminator_real_loss=1.184, discriminator_fake_loss=1.366, generator_loss=30.01, generator_mel_loss=19.35, generator_kl_loss=2.044, generator_dur_loss=1.647, generator_adv_loss=2.215, generator_feat_match_loss=4.758, over 49.00 samples.], tot_loss[discriminator_loss=2.551, discriminator_real_loss=1.184, discriminator_fake_loss=1.366, generator_loss=30.01, generator_mel_loss=19.35, generator_kl_loss=2.044, generator_dur_loss=1.647, generator_adv_loss=2.215, generator_feat_match_loss=4.758, over 49.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 16.0 2023-11-15 02:36:10,228 INFO [train.py:811] (0/4) Start epoch 902 2023-11-15 02:37:30,491 INFO [train.py:467] (0/4) Epoch 902, batch 13, global_batch_idx: 33350, batch size: 69, loss[discriminator_loss=2.576, discriminator_real_loss=1.35, discriminator_fake_loss=1.227, generator_loss=30.98, generator_mel_loss=19.95, generator_kl_loss=2.045, generator_dur_loss=1.629, generator_adv_loss=2.436, generator_feat_match_loss=4.926, over 69.00 samples.], tot_loss[discriminator_loss=2.503, discriminator_real_loss=1.262, discriminator_fake_loss=1.241, generator_loss=30.76, generator_mel_loss=19.96, generator_kl_loss=2.021, generator_dur_loss=1.635, generator_adv_loss=2.328, generator_feat_match_loss=4.82, over 841.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 16.0 2023-11-15 02:39:44,520 INFO [train.py:811] (0/4) Start epoch 903 2023-11-15 02:42:16,473 INFO [train.py:467] (0/4) Epoch 903, batch 26, global_batch_idx: 33400, batch size: 64, loss[discriminator_loss=2.363, discriminator_real_loss=1.155, discriminator_fake_loss=1.207, generator_loss=30.77, generator_mel_loss=19.4, generator_kl_loss=2.091, 
generator_dur_loss=1.625, generator_adv_loss=2.492, generator_feat_match_loss=5.16, over 64.00 samples.], tot_loss[discriminator_loss=2.476, discriminator_real_loss=1.257, discriminator_fake_loss=1.219, generator_loss=30.8, generator_mel_loss=19.68, generator_kl_loss=1.992, generator_dur_loss=1.629, generator_adv_loss=2.383, generator_feat_match_loss=5.117, over 1909.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 02:42:16,996 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 02:42:27,433 INFO [train.py:517] (0/4) Epoch 903, validation: discriminator_loss=2.411, discriminator_real_loss=1.1, discriminator_fake_loss=1.311, generator_loss=31.57, generator_mel_loss=20.23, generator_kl_loss=2.246, generator_dur_loss=1.631, generator_adv_loss=2.149, generator_feat_match_loss=5.312, over 100.00 samples. 2023-11-15 02:42:27,434 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 02:43:23,878 INFO [train.py:811] (0/4) Start epoch 904 2023-11-15 02:46:55,985 INFO [train.py:811] (0/4) Start epoch 905 2023-11-15 02:47:18,809 INFO [train.py:467] (0/4) Epoch 905, batch 2, global_batch_idx: 33450, batch size: 50, loss[discriminator_loss=2.504, discriminator_real_loss=1.333, discriminator_fake_loss=1.171, generator_loss=29.88, generator_mel_loss=19.05, generator_kl_loss=1.911, generator_dur_loss=1.607, generator_adv_loss=2.336, generator_feat_match_loss=4.977, over 50.00 samples.], tot_loss[discriminator_loss=2.571, discriminator_real_loss=1.349, discriminator_fake_loss=1.222, generator_loss=30.19, generator_mel_loss=19.48, generator_kl_loss=2.033, generator_dur_loss=1.631, generator_adv_loss=2.348, generator_feat_match_loss=4.692, over 181.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 02:50:24,481 INFO [train.py:811] (0/4) Start epoch 906 2023-11-15 02:52:02,437 INFO [train.py:467] (0/4) Epoch 906, batch 15, global_batch_idx: 33500, batch size: 58, loss[discriminator_loss=2.523, discriminator_real_loss=1.373, discriminator_fake_loss=1.15, generator_loss=30.36, generator_mel_loss=19.88, generator_kl_loss=1.982, generator_dur_loss=1.625, generator_adv_loss=2.256, generator_feat_match_loss=4.613, over 58.00 samples.], tot_loss[discriminator_loss=2.525, discriminator_real_loss=1.261, discriminator_fake_loss=1.264, generator_loss=30.56, generator_mel_loss=19.9, generator_kl_loss=2.046, generator_dur_loss=1.627, generator_adv_loss=2.259, generator_feat_match_loss=4.731, over 1112.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 02:53:54,236 INFO [train.py:811] (0/4) Start epoch 907 2023-11-15 02:56:48,376 INFO [train.py:467] (0/4) Epoch 907, batch 28, global_batch_idx: 33550, batch size: 126, loss[discriminator_loss=2.551, discriminator_real_loss=1.26, discriminator_fake_loss=1.291, generator_loss=31.3, generator_mel_loss=20.31, generator_kl_loss=2.102, generator_dur_loss=1.612, generator_adv_loss=2.391, generator_feat_match_loss=4.887, over 126.00 samples.], tot_loss[discriminator_loss=2.597, discriminator_real_loss=1.309, discriminator_fake_loss=1.288, generator_loss=30.64, generator_mel_loss=20.18, generator_kl_loss=2.041, generator_dur_loss=1.63, generator_adv_loss=2.193, generator_feat_match_loss=4.595, over 2180.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0 2023-11-15 02:57:26,627 INFO [train.py:811] (0/4) Start epoch 908 2023-11-15 03:00:55,877 INFO [train.py:811] (0/4) Start epoch 909 2023-11-15 03:01:33,982 INFO [train.py:467] (0/4) Epoch 
909, batch 4, global_batch_idx: 33600, batch size: 49, loss[discriminator_loss=2.518, discriminator_real_loss=1.31, discriminator_fake_loss=1.208, generator_loss=30.19, generator_mel_loss=19.91, generator_kl_loss=1.955, generator_dur_loss=1.631, generator_adv_loss=2.102, generator_feat_match_loss=4.586, over 49.00 samples.], tot_loss[discriminator_loss=2.511, discriminator_real_loss=1.284, discriminator_fake_loss=1.227, generator_loss=30.56, generator_mel_loss=19.96, generator_kl_loss=1.994, generator_dur_loss=1.639, generator_adv_loss=2.222, generator_feat_match_loss=4.745, over 305.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 16.0 2023-11-15 03:01:34,523 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 03:01:45,803 INFO [train.py:517] (0/4) Epoch 909, validation: discriminator_loss=2.46, discriminator_real_loss=1.126, discriminator_fake_loss=1.334, generator_loss=30.96, generator_mel_loss=20.45, generator_kl_loss=2.217, generator_dur_loss=1.627, generator_adv_loss=1.928, generator_feat_match_loss=4.737, over 100.00 samples. 2023-11-15 03:01:45,804 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 03:04:43,945 INFO [train.py:811] (0/4) Start epoch 910 2023-11-15 03:06:34,420 INFO [train.py:467] (0/4) Epoch 910, batch 17, global_batch_idx: 33650, batch size: 55, loss[discriminator_loss=2.402, discriminator_real_loss=1.183, discriminator_fake_loss=1.221, generator_loss=30.79, generator_mel_loss=19.7, generator_kl_loss=1.98, generator_dur_loss=1.677, generator_adv_loss=2.324, generator_feat_match_loss=5.109, over 55.00 samples.], tot_loss[discriminator_loss=2.524, discriminator_real_loss=1.282, discriminator_fake_loss=1.243, generator_loss=30.61, generator_mel_loss=19.79, generator_kl_loss=1.987, generator_dur_loss=1.629, generator_adv_loss=2.345, generator_feat_match_loss=4.855, over 1308.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 16.0 2023-11-15 03:08:17,633 INFO [train.py:811] (0/4) Start epoch 911 2023-11-15 03:11:20,812 INFO [train.py:467] (0/4) Epoch 911, batch 30, global_batch_idx: 33700, batch size: 52, loss[discriminator_loss=2.305, discriminator_real_loss=1.185, discriminator_fake_loss=1.12, generator_loss=31.29, generator_mel_loss=19.31, generator_kl_loss=1.984, generator_dur_loss=1.648, generator_adv_loss=2.58, generator_feat_match_loss=5.766, over 52.00 samples.], tot_loss[discriminator_loss=2.478, discriminator_real_loss=1.25, discriminator_fake_loss=1.228, generator_loss=30.88, generator_mel_loss=19.8, generator_kl_loss=2.02, generator_dur_loss=1.629, generator_adv_loss=2.372, generator_feat_match_loss=5.064, over 2065.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 03:11:55,106 INFO [train.py:811] (0/4) Start epoch 912 2023-11-15 03:15:23,975 INFO [train.py:811] (0/4) Start epoch 913 2023-11-15 03:16:10,077 INFO [train.py:467] (0/4) Epoch 913, batch 6, global_batch_idx: 33750, batch size: 73, loss[discriminator_loss=2.594, discriminator_real_loss=1.419, discriminator_fake_loss=1.176, generator_loss=30.72, generator_mel_loss=20.17, generator_kl_loss=2.029, generator_dur_loss=1.612, generator_adv_loss=2.117, generator_feat_match_loss=4.797, over 73.00 samples.], tot_loss[discriminator_loss=2.523, discriminator_real_loss=1.281, discriminator_fake_loss=1.241, generator_loss=30.6, generator_mel_loss=20.02, generator_kl_loss=2.075, generator_dur_loss=1.625, generator_adv_loss=2.206, generator_feat_match_loss=4.676, over 535.00 samples.], cur_lr_g: 1.78e-04, 
cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 03:18:52,862 INFO [train.py:811] (0/4) Start epoch 914 2023-11-15 03:20:58,332 INFO [train.py:467] (0/4) Epoch 914, batch 19, global_batch_idx: 33800, batch size: 52, loss[discriminator_loss=2.629, discriminator_real_loss=1.413, discriminator_fake_loss=1.217, generator_loss=30.13, generator_mel_loss=19.92, generator_kl_loss=1.983, generator_dur_loss=1.626, generator_adv_loss=2.143, generator_feat_match_loss=4.457, over 52.00 samples.], tot_loss[discriminator_loss=2.572, discriminator_real_loss=1.299, discriminator_fake_loss=1.272, generator_loss=30.76, generator_mel_loss=20.19, generator_kl_loss=2.015, generator_dur_loss=1.622, generator_adv_loss=2.229, generator_feat_match_loss=4.705, over 1575.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 03:20:58,865 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 03:21:09,514 INFO [train.py:517] (0/4) Epoch 914, validation: discriminator_loss=2.618, discriminator_real_loss=1.258, discriminator_fake_loss=1.361, generator_loss=31.84, generator_mel_loss=21.04, generator_kl_loss=2.192, generator_dur_loss=1.622, generator_adv_loss=2.014, generator_feat_match_loss=4.965, over 100.00 samples. 2023-11-15 03:21:09,515 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 03:22:36,828 INFO [train.py:811] (0/4) Start epoch 915 2023-11-15 03:25:40,336 INFO [train.py:467] (0/4) Epoch 915, batch 32, global_batch_idx: 33850, batch size: 126, loss[discriminator_loss=2.578, discriminator_real_loss=1.266, discriminator_fake_loss=1.312, generator_loss=30.74, generator_mel_loss=19.75, generator_kl_loss=1.996, generator_dur_loss=1.605, generator_adv_loss=2.484, generator_feat_match_loss=4.902, over 126.00 samples.], tot_loss[discriminator_loss=2.6, discriminator_real_loss=1.34, discriminator_fake_loss=1.259, generator_loss=30.69, generator_mel_loss=19.72, generator_kl_loss=1.999, generator_dur_loss=1.628, generator_adv_loss=2.398, generator_feat_match_loss=4.948, over 2375.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 03:26:08,741 INFO [train.py:811] (0/4) Start epoch 916 2023-11-15 03:29:46,032 INFO [train.py:811] (0/4) Start epoch 917 2023-11-15 03:30:40,650 INFO [train.py:467] (0/4) Epoch 917, batch 8, global_batch_idx: 33900, batch size: 49, loss[discriminator_loss=2.578, discriminator_real_loss=1.325, discriminator_fake_loss=1.252, generator_loss=30.55, generator_mel_loss=20.33, generator_kl_loss=1.968, generator_dur_loss=1.621, generator_adv_loss=2.109, generator_feat_match_loss=4.516, over 49.00 samples.], tot_loss[discriminator_loss=2.593, discriminator_real_loss=1.328, discriminator_fake_loss=1.266, generator_loss=30.55, generator_mel_loss=20.15, generator_kl_loss=2.009, generator_dur_loss=1.634, generator_adv_loss=2.219, generator_feat_match_loss=4.54, over 508.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 03:33:14,280 INFO [train.py:811] (0/4) Start epoch 918 2023-11-15 03:35:25,990 INFO [train.py:467] (0/4) Epoch 918, batch 21, global_batch_idx: 33950, batch size: 58, loss[discriminator_loss=2.541, discriminator_real_loss=1.224, discriminator_fake_loss=1.317, generator_loss=30.03, generator_mel_loss=19.68, generator_kl_loss=2.056, generator_dur_loss=1.624, generator_adv_loss=2.25, generator_feat_match_loss=4.418, over 58.00 samples.], tot_loss[discriminator_loss=2.607, discriminator_real_loss=1.317, discriminator_fake_loss=1.29, generator_loss=30.48, 
generator_mel_loss=20.16, generator_kl_loss=2.009, generator_dur_loss=1.632, generator_adv_loss=2.166, generator_feat_match_loss=4.51, over 1579.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 03:36:48,647 INFO [train.py:811] (0/4) Start epoch 919 2023-11-15 03:40:07,612 INFO [train.py:467] (0/4) Epoch 919, batch 34, global_batch_idx: 34000, batch size: 85, loss[discriminator_loss=2.428, discriminator_real_loss=1.225, discriminator_fake_loss=1.203, generator_loss=30.72, generator_mel_loss=19.67, generator_kl_loss=1.95, generator_dur_loss=1.624, generator_adv_loss=2.25, generator_feat_match_loss=5.227, over 85.00 samples.], tot_loss[discriminator_loss=2.588, discriminator_real_loss=1.317, discriminator_fake_loss=1.271, generator_loss=30.54, generator_mel_loss=19.96, generator_kl_loss=2.02, generator_dur_loss=1.632, generator_adv_loss=2.258, generator_feat_match_loss=4.666, over 2404.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0 2023-11-15 03:40:08,221 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 03:40:18,695 INFO [train.py:517] (0/4) Epoch 919, validation: discriminator_loss=2.484, discriminator_real_loss=1.073, discriminator_fake_loss=1.411, generator_loss=31.27, generator_mel_loss=20.37, generator_kl_loss=2.23, generator_dur_loss=1.627, generator_adv_loss=1.906, generator_feat_match_loss=5.136, over 100.00 samples. 2023-11-15 03:40:18,696 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 03:40:30,314 INFO [train.py:811] (0/4) Start epoch 920 2023-11-15 03:44:00,569 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-920.pt 2023-11-15 03:44:03,928 INFO [train.py:811] (0/4) Start epoch 921 2023-11-15 03:45:12,065 INFO [train.py:467] (0/4) Epoch 921, batch 10, global_batch_idx: 34050, batch size: 76, loss[discriminator_loss=2.629, discriminator_real_loss=1.234, discriminator_fake_loss=1.395, generator_loss=30.39, generator_mel_loss=19.85, generator_kl_loss=1.988, generator_dur_loss=1.647, generator_adv_loss=2.258, generator_feat_match_loss=4.648, over 76.00 samples.], tot_loss[discriminator_loss=2.579, discriminator_real_loss=1.32, discriminator_fake_loss=1.259, generator_loss=30.59, generator_mel_loss=19.87, generator_kl_loss=2.006, generator_dur_loss=1.633, generator_adv_loss=2.301, generator_feat_match_loss=4.771, over 809.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0 2023-11-15 03:47:33,809 INFO [train.py:811] (0/4) Start epoch 922 2023-11-15 03:49:50,867 INFO [train.py:467] (0/4) Epoch 922, batch 23, global_batch_idx: 34100, batch size: 49, loss[discriminator_loss=2.652, discriminator_real_loss=1.377, discriminator_fake_loss=1.276, generator_loss=30.36, generator_mel_loss=19.78, generator_kl_loss=1.976, generator_dur_loss=1.646, generator_adv_loss=2.291, generator_feat_match_loss=4.668, over 49.00 samples.], tot_loss[discriminator_loss=2.575, discriminator_real_loss=1.306, discriminator_fake_loss=1.269, generator_loss=30.48, generator_mel_loss=19.64, generator_kl_loss=2.013, generator_dur_loss=1.624, generator_adv_loss=2.321, generator_feat_match_loss=4.883, over 1734.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 03:51:01,343 INFO [train.py:811] (0/4) Start epoch 923 2023-11-15 03:54:32,370 INFO [train.py:467] (0/4) Epoch 923, batch 36, global_batch_idx: 34150, batch size: 110, loss[discriminator_loss=2.467, discriminator_real_loss=1.369, discriminator_fake_loss=1.098, 
generator_loss=31.47, generator_mel_loss=20.55, generator_kl_loss=1.956, generator_dur_loss=1.609, generator_adv_loss=2.299, generator_feat_match_loss=5.055, over 110.00 samples.], tot_loss[discriminator_loss=2.55, discriminator_real_loss=1.29, discriminator_fake_loss=1.26, generator_loss=30.59, generator_mel_loss=19.98, generator_kl_loss=1.997, generator_dur_loss=1.629, generator_adv_loss=2.237, generator_feat_match_loss=4.747, over 2694.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 03:54:33,409 INFO [train.py:811] (0/4) Start epoch 924 2023-11-15 03:58:05,099 INFO [train.py:811] (0/4) Start epoch 925 2023-11-15 03:59:25,720 INFO [train.py:467] (0/4) Epoch 925, batch 12, global_batch_idx: 34200, batch size: 63, loss[discriminator_loss=2.438, discriminator_real_loss=1.294, discriminator_fake_loss=1.145, generator_loss=30.41, generator_mel_loss=19.58, generator_kl_loss=1.937, generator_dur_loss=1.653, generator_adv_loss=2.361, generator_feat_match_loss=4.879, over 63.00 samples.], tot_loss[discriminator_loss=2.5, discriminator_real_loss=1.277, discriminator_fake_loss=1.224, generator_loss=30.83, generator_mel_loss=19.88, generator_kl_loss=2.007, generator_dur_loss=1.626, generator_adv_loss=2.356, generator_feat_match_loss=4.957, over 1008.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 03:59:26,318 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 03:59:36,633 INFO [train.py:517] (0/4) Epoch 925, validation: discriminator_loss=2.463, discriminator_real_loss=1.216, discriminator_fake_loss=1.248, generator_loss=31.49, generator_mel_loss=20.59, generator_kl_loss=2.14, generator_dur_loss=1.643, generator_adv_loss=2.13, generator_feat_match_loss=4.99, over 100.00 samples. 2023-11-15 03:59:36,634 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 04:01:46,014 INFO [train.py:811] (0/4) Start epoch 926 2023-11-15 04:04:18,666 INFO [train.py:467] (0/4) Epoch 926, batch 25, global_batch_idx: 34250, batch size: 76, loss[discriminator_loss=2.543, discriminator_real_loss=1.305, discriminator_fake_loss=1.239, generator_loss=30.31, generator_mel_loss=19.76, generator_kl_loss=2.061, generator_dur_loss=1.64, generator_adv_loss=2.346, generator_feat_match_loss=4.5, over 76.00 samples.], tot_loss[discriminator_loss=2.55, discriminator_real_loss=1.292, discriminator_fake_loss=1.258, generator_loss=30.69, generator_mel_loss=19.87, generator_kl_loss=2.033, generator_dur_loss=1.632, generator_adv_loss=2.303, generator_feat_match_loss=4.854, over 1827.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 04:05:19,792 INFO [train.py:811] (0/4) Start epoch 927 2023-11-15 04:08:52,753 INFO [train.py:811] (0/4) Start epoch 928 2023-11-15 04:09:15,325 INFO [train.py:467] (0/4) Epoch 928, batch 1, global_batch_idx: 34300, batch size: 50, loss[discriminator_loss=2.465, discriminator_real_loss=1.201, discriminator_fake_loss=1.263, generator_loss=31.12, generator_mel_loss=20.27, generator_kl_loss=2.043, generator_dur_loss=1.641, generator_adv_loss=2.24, generator_feat_match_loss=4.926, over 50.00 samples.], tot_loss[discriminator_loss=2.487, discriminator_real_loss=1.27, discriminator_fake_loss=1.217, generator_loss=30.95, generator_mel_loss=19.97, generator_kl_loss=2.022, generator_dur_loss=1.647, generator_adv_loss=2.325, generator_feat_match_loss=4.987, over 108.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 04:12:24,599 INFO [train.py:811] (0/4) Start 
epoch 929 2023-11-15 04:13:53,638 INFO [train.py:467] (0/4) Epoch 929, batch 14, global_batch_idx: 34350, batch size: 85, loss[discriminator_loss=2.549, discriminator_real_loss=1.174, discriminator_fake_loss=1.375, generator_loss=30.09, generator_mel_loss=19.72, generator_kl_loss=1.939, generator_dur_loss=1.616, generator_adv_loss=2.254, generator_feat_match_loss=4.555, over 85.00 samples.], tot_loss[discriminator_loss=2.456, discriminator_real_loss=1.22, discriminator_fake_loss=1.235, generator_loss=30.78, generator_mel_loss=19.58, generator_kl_loss=2.004, generator_dur_loss=1.626, generator_adv_loss=2.367, generator_feat_match_loss=5.194, over 1092.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 04:15:57,061 INFO [train.py:811] (0/4) Start epoch 930 2023-11-15 04:18:36,072 INFO [train.py:467] (0/4) Epoch 930, batch 27, global_batch_idx: 34400, batch size: 65, loss[discriminator_loss=2.508, discriminator_real_loss=1.281, discriminator_fake_loss=1.226, generator_loss=30.49, generator_mel_loss=19.9, generator_kl_loss=2.035, generator_dur_loss=1.618, generator_adv_loss=2.16, generator_feat_match_loss=4.773, over 65.00 samples.], tot_loss[discriminator_loss=2.549, discriminator_real_loss=1.297, discriminator_fake_loss=1.252, generator_loss=30.53, generator_mel_loss=20.01, generator_kl_loss=2.009, generator_dur_loss=1.631, generator_adv_loss=2.188, generator_feat_match_loss=4.69, over 1968.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0 2023-11-15 04:18:36,653 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 04:18:47,231 INFO [train.py:517] (0/4) Epoch 930, validation: discriminator_loss=2.608, discriminator_real_loss=1.257, discriminator_fake_loss=1.352, generator_loss=31.26, generator_mel_loss=20.85, generator_kl_loss=2.339, generator_dur_loss=1.624, generator_adv_loss=1.921, generator_feat_match_loss=4.522, over 100.00 samples. 
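Note on the grad_scale values above: the log alternates between 8.0 and 16.0 because a dynamic loss scaler grows the scale after a run of clean steps and halves it again when a scaled gradient overflows. A minimal sketch of that behavior with torch.cuda.amp, using hypothetical stand-in model/optimizer objects rather than the actual training loop in train.py:

```python
# Sketch of the dynamic loss scaling behind the logged grad_scale values.
# The Linear model and AdamW optimizer here are hypothetical stand-ins.
import torch

model = torch.nn.Linear(80, 80).cuda()
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-4)
scaler = torch.cuda.amp.GradScaler(init_scale=2.0)

for step in range(100):
    x = torch.randn(4, 80, device="cuda")
    with torch.cuda.amp.autocast():
        loss = model(x).pow(2).mean()
    optimizer.zero_grad()
    scaler.scale(loss).backward()  # backward pass on the scaled loss
    scaler.step(optimizer)         # skips the update if grads overflowed
    scaler.update()                # grows the scale after enough clean
                                   # steps, halves it after an overflow
    print(f"grad_scale: {scaler.get_scale()}")
```

With default growth settings the printed scale doubles periodically and backs off on overflow, matching the 8.0 / 16.0 / 8.0 pattern seen in the log.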
2023-11-15 04:18:47,232 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 04:19:40,080 INFO [train.py:811] (0/4) Start epoch 931 2023-11-15 04:23:08,458 INFO [train.py:811] (0/4) Start epoch 932 2023-11-15 04:23:37,129 INFO [train.py:467] (0/4) Epoch 932, batch 3, global_batch_idx: 34450, batch size: 90, loss[discriminator_loss=2.539, discriminator_real_loss=1.351, discriminator_fake_loss=1.188, generator_loss=30.04, generator_mel_loss=19.39, generator_kl_loss=2, generator_dur_loss=1.621, generator_adv_loss=2.359, generator_feat_match_loss=4.672, over 90.00 samples.], tot_loss[discriminator_loss=2.611, discriminator_real_loss=1.301, discriminator_fake_loss=1.309, generator_loss=30.55, generator_mel_loss=20.01, generator_kl_loss=2.01, generator_dur_loss=1.629, generator_adv_loss=2.253, generator_feat_match_loss=4.648, over 268.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0 2023-11-15 04:26:32,131 INFO [train.py:811] (0/4) Start epoch 933 2023-11-15 04:28:17,118 INFO [train.py:467] (0/4) Epoch 933, batch 16, global_batch_idx: 34500, batch size: 95, loss[discriminator_loss=2.492, discriminator_real_loss=1.386, discriminator_fake_loss=1.107, generator_loss=30.74, generator_mel_loss=19.95, generator_kl_loss=1.963, generator_dur_loss=1.61, generator_adv_loss=2.346, generator_feat_match_loss=4.871, over 95.00 samples.], tot_loss[discriminator_loss=2.506, discriminator_real_loss=1.271, discriminator_fake_loss=1.235, generator_loss=30.69, generator_mel_loss=19.85, generator_kl_loss=2.008, generator_dur_loss=1.623, generator_adv_loss=2.327, generator_feat_match_loss=4.889, over 1186.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 04:30:00,660 INFO [train.py:811] (0/4) Start epoch 934 2023-11-15 04:32:51,950 INFO [train.py:467] (0/4) Epoch 934, batch 29, global_batch_idx: 34550, batch size: 64, loss[discriminator_loss=2.406, discriminator_real_loss=1.186, discriminator_fake_loss=1.221, generator_loss=30.76, generator_mel_loss=19.2, generator_kl_loss=1.979, generator_dur_loss=1.61, generator_adv_loss=2.617, generator_feat_match_loss=5.359, over 64.00 samples.], tot_loss[discriminator_loss=2.485, discriminator_real_loss=1.273, discriminator_fake_loss=1.212, generator_loss=31.14, generator_mel_loss=19.8, generator_kl_loss=2.019, generator_dur_loss=1.63, generator_adv_loss=2.425, generator_feat_match_loss=5.266, over 2308.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 04:33:31,194 INFO [train.py:811] (0/4) Start epoch 935 2023-11-15 04:36:59,699 INFO [train.py:811] (0/4) Start epoch 936 2023-11-15 04:37:38,994 INFO [train.py:467] (0/4) Epoch 936, batch 5, global_batch_idx: 34600, batch size: 65, loss[discriminator_loss=2.594, discriminator_real_loss=1.364, discriminator_fake_loss=1.229, generator_loss=30.1, generator_mel_loss=19.57, generator_kl_loss=1.987, generator_dur_loss=1.626, generator_adv_loss=2.137, generator_feat_match_loss=4.777, over 65.00 samples.], tot_loss[discriminator_loss=2.613, discriminator_real_loss=1.317, discriminator_fake_loss=1.296, generator_loss=30.14, generator_mel_loss=19.88, generator_kl_loss=2.004, generator_dur_loss=1.633, generator_adv_loss=2.137, generator_feat_match_loss=4.486, over 376.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 04:37:39,639 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 04:37:51,275 INFO [train.py:517] (0/4) Epoch 936, validation: discriminator_loss=2.564, 
discriminator_real_loss=1.232, discriminator_fake_loss=1.332, generator_loss=31.25, generator_mel_loss=20.65, generator_kl_loss=2.279, generator_dur_loss=1.634, generator_adv_loss=1.962, generator_feat_match_loss=4.719, over 100.00 samples. 2023-11-15 04:37:51,277 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 04:40:43,374 INFO [train.py:811] (0/4) Start epoch 937 2023-11-15 04:42:26,963 INFO [train.py:467] (0/4) Epoch 937, batch 18, global_batch_idx: 34650, batch size: 95, loss[discriminator_loss=2.662, discriminator_real_loss=1.42, discriminator_fake_loss=1.242, generator_loss=30.96, generator_mel_loss=20.46, generator_kl_loss=2.038, generator_dur_loss=1.621, generator_adv_loss=2.227, generator_feat_match_loss=4.609, over 95.00 samples.], tot_loss[discriminator_loss=2.596, discriminator_real_loss=1.323, discriminator_fake_loss=1.272, generator_loss=30.32, generator_mel_loss=20.01, generator_kl_loss=2.014, generator_dur_loss=1.628, generator_adv_loss=2.17, generator_feat_match_loss=4.494, over 1365.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 04:44:17,112 INFO [train.py:811] (0/4) Start epoch 938 2023-11-15 04:47:22,782 INFO [train.py:467] (0/4) Epoch 938, batch 31, global_batch_idx: 34700, batch size: 58, loss[discriminator_loss=2.508, discriminator_real_loss=1.377, discriminator_fake_loss=1.132, generator_loss=30.8, generator_mel_loss=19.76, generator_kl_loss=2.037, generator_dur_loss=1.615, generator_adv_loss=2.385, generator_feat_match_loss=5, over 58.00 samples.], tot_loss[discriminator_loss=2.533, discriminator_real_loss=1.283, discriminator_fake_loss=1.251, generator_loss=30.74, generator_mel_loss=19.86, generator_kl_loss=2.017, generator_dur_loss=1.635, generator_adv_loss=2.328, generator_feat_match_loss=4.902, over 1985.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 04:47:49,227 INFO [train.py:811] (0/4) Start epoch 939 2023-11-15 04:51:21,465 INFO [train.py:811] (0/4) Start epoch 940 2023-11-15 04:52:17,830 INFO [train.py:467] (0/4) Epoch 940, batch 7, global_batch_idx: 34750, batch size: 49, loss[discriminator_loss=2.465, discriminator_real_loss=1.265, discriminator_fake_loss=1.201, generator_loss=30.37, generator_mel_loss=19.7, generator_kl_loss=1.901, generator_dur_loss=1.643, generator_adv_loss=2.41, generator_feat_match_loss=4.715, over 49.00 samples.], tot_loss[discriminator_loss=2.531, discriminator_real_loss=1.271, discriminator_fake_loss=1.26, generator_loss=30.37, generator_mel_loss=19.82, generator_kl_loss=2.005, generator_dur_loss=1.632, generator_adv_loss=2.268, generator_feat_match_loss=4.647, over 484.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 04:54:54,485 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-940.pt 2023-11-15 04:54:57,896 INFO [train.py:811] (0/4) Start epoch 941 2023-11-15 04:57:01,291 INFO [train.py:467] (0/4) Epoch 941, batch 20, global_batch_idx: 34800, batch size: 90, loss[discriminator_loss=2.916, discriminator_real_loss=1.521, discriminator_fake_loss=1.395, generator_loss=29.72, generator_mel_loss=19.35, generator_kl_loss=1.932, generator_dur_loss=1.622, generator_adv_loss=2.436, generator_feat_match_loss=4.383, over 90.00 samples.], tot_loss[discriminator_loss=2.52, discriminator_real_loss=1.264, discriminator_fake_loss=1.256, generator_loss=30.78, generator_mel_loss=19.6, generator_kl_loss=1.973, generator_dur_loss=1.629, generator_adv_loss=2.404, 
generator_feat_match_loss=5.181, over 1562.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0 2023-11-15 04:57:01,872 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 04:57:12,268 INFO [train.py:517] (0/4) Epoch 941, validation: discriminator_loss=2.681, discriminator_real_loss=1.534, discriminator_fake_loss=1.147, generator_loss=31.02, generator_mel_loss=20.36, generator_kl_loss=2.139, generator_dur_loss=1.62, generator_adv_loss=2.381, generator_feat_match_loss=4.521, over 100.00 samples. 2023-11-15 04:57:12,269 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 04:58:42,382 INFO [train.py:811] (0/4) Start epoch 942 2023-11-15 05:02:02,354 INFO [train.py:467] (0/4) Epoch 942, batch 33, global_batch_idx: 34850, batch size: 71, loss[discriminator_loss=2.6, discriminator_real_loss=1.287, discriminator_fake_loss=1.312, generator_loss=30.87, generator_mel_loss=20.17, generator_kl_loss=2.016, generator_dur_loss=1.614, generator_adv_loss=2.246, generator_feat_match_loss=4.828, over 71.00 samples.], tot_loss[discriminator_loss=2.583, discriminator_real_loss=1.31, discriminator_fake_loss=1.273, generator_loss=30.48, generator_mel_loss=20.04, generator_kl_loss=2.009, generator_dur_loss=1.626, generator_adv_loss=2.174, generator_feat_match_loss=4.633, over 2285.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0 2023-11-15 05:02:16,163 INFO [train.py:811] (0/4) Start epoch 943 2023-11-15 05:05:42,639 INFO [train.py:811] (0/4) Start epoch 944 2023-11-15 05:06:41,019 INFO [train.py:467] (0/4) Epoch 944, batch 9, global_batch_idx: 34900, batch size: 69, loss[discriminator_loss=2.584, discriminator_real_loss=1.264, discriminator_fake_loss=1.32, generator_loss=30.44, generator_mel_loss=20.02, generator_kl_loss=1.941, generator_dur_loss=1.609, generator_adv_loss=2.188, generator_feat_match_loss=4.68, over 69.00 samples.], tot_loss[discriminator_loss=2.586, discriminator_real_loss=1.315, discriminator_fake_loss=1.271, generator_loss=30.39, generator_mel_loss=19.93, generator_kl_loss=1.999, generator_dur_loss=1.624, generator_adv_loss=2.218, generator_feat_match_loss=4.614, over 648.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0 2023-11-15 05:09:15,130 INFO [train.py:811] (0/4) Start epoch 945 2023-11-15 05:11:26,094 INFO [train.py:467] (0/4) Epoch 945, batch 22, global_batch_idx: 34950, batch size: 52, loss[discriminator_loss=2.596, discriminator_real_loss=1.276, discriminator_fake_loss=1.319, generator_loss=29.99, generator_mel_loss=19.83, generator_kl_loss=2.041, generator_dur_loss=1.62, generator_adv_loss=1.951, generator_feat_match_loss=4.543, over 52.00 samples.], tot_loss[discriminator_loss=2.587, discriminator_real_loss=1.314, discriminator_fake_loss=1.272, generator_loss=30.66, generator_mel_loss=20.04, generator_kl_loss=1.98, generator_dur_loss=1.629, generator_adv_loss=2.242, generator_feat_match_loss=4.769, over 1693.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0 2023-11-15 05:12:43,593 INFO [train.py:811] (0/4) Start epoch 946 2023-11-15 05:16:11,559 INFO [train.py:467] (0/4) Epoch 946, batch 35, global_batch_idx: 35000, batch size: 50, loss[discriminator_loss=2.475, discriminator_real_loss=1.332, discriminator_fake_loss=1.143, generator_loss=31.29, generator_mel_loss=20.35, generator_kl_loss=2.078, generator_dur_loss=1.638, generator_adv_loss=2.268, generator_feat_match_loss=4.961, over 50.00 samples.], tot_loss[discriminator_loss=2.59, 
discriminator_real_loss=1.312, discriminator_fake_loss=1.279, generator_loss=30.49, generator_mel_loss=19.92, generator_kl_loss=2.01, generator_dur_loss=1.631, generator_adv_loss=2.23, generator_feat_match_loss=4.698, over 2487.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0 2023-11-15 05:16:12,247 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 05:16:22,445 INFO [train.py:517] (0/4) Epoch 946, validation: discriminator_loss=2.501, discriminator_real_loss=1.169, discriminator_fake_loss=1.332, generator_loss=31.78, generator_mel_loss=20.75, generator_kl_loss=2.266, generator_dur_loss=1.629, generator_adv_loss=2.086, generator_feat_match_loss=5.049, over 100.00 samples. 2023-11-15 05:16:22,446 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 05:16:29,196 INFO [train.py:811] (0/4) Start epoch 947 2023-11-15 05:19:59,861 INFO [train.py:811] (0/4) Start epoch 948 2023-11-15 05:21:12,327 INFO [train.py:467] (0/4) Epoch 948, batch 11, global_batch_idx: 35050, batch size: 73, loss[discriminator_loss=2.684, discriminator_real_loss=1.295, discriminator_fake_loss=1.39, generator_loss=30.12, generator_mel_loss=20.14, generator_kl_loss=2.013, generator_dur_loss=1.616, generator_adv_loss=2.055, generator_feat_match_loss=4.293, over 73.00 samples.], tot_loss[discriminator_loss=2.534, discriminator_real_loss=1.284, discriminator_fake_loss=1.25, generator_loss=30.79, generator_mel_loss=19.98, generator_kl_loss=2.027, generator_dur_loss=1.628, generator_adv_loss=2.296, generator_feat_match_loss=4.864, over 900.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0 2023-11-15 05:23:30,614 INFO [train.py:811] (0/4) Start epoch 949 2023-11-15 05:26:02,771 INFO [train.py:467] (0/4) Epoch 949, batch 24, global_batch_idx: 35100, batch size: 54, loss[discriminator_loss=2.535, discriminator_real_loss=1.245, discriminator_fake_loss=1.291, generator_loss=31.06, generator_mel_loss=19.96, generator_kl_loss=2.02, generator_dur_loss=1.659, generator_adv_loss=2.221, generator_feat_match_loss=5.195, over 54.00 samples.], tot_loss[discriminator_loss=2.554, discriminator_real_loss=1.281, discriminator_fake_loss=1.273, generator_loss=30.72, generator_mel_loss=19.97, generator_kl_loss=1.999, generator_dur_loss=1.628, generator_adv_loss=2.284, generator_feat_match_loss=4.832, over 1751.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0 2023-11-15 05:27:06,797 INFO [train.py:811] (0/4) Start epoch 950 2023-11-15 05:30:42,834 INFO [train.py:811] (0/4) Start epoch 951 2023-11-15 05:30:58,184 INFO [train.py:467] (0/4) Epoch 951, batch 0, global_batch_idx: 35150, batch size: 85, loss[discriminator_loss=2.551, discriminator_real_loss=1.277, discriminator_fake_loss=1.272, generator_loss=30.69, generator_mel_loss=19.79, generator_kl_loss=2.169, generator_dur_loss=1.642, generator_adv_loss=2.26, generator_feat_match_loss=4.832, over 85.00 samples.], tot_loss[discriminator_loss=2.551, discriminator_real_loss=1.277, discriminator_fake_loss=1.272, generator_loss=30.69, generator_mel_loss=19.79, generator_kl_loss=2.169, generator_dur_loss=1.642, generator_adv_loss=2.26, generator_feat_match_loss=4.832, over 85.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 05:34:08,015 INFO [train.py:811] (0/4) Start epoch 952 2023-11-15 05:35:39,370 INFO [train.py:467] (0/4) Epoch 952, batch 13, global_batch_idx: 35200, batch size: 101, loss[discriminator_loss=2.387, discriminator_real_loss=1.124, 
discriminator_fake_loss=1.263, generator_loss=31.16, generator_mel_loss=19.75, generator_kl_loss=2.041, generator_dur_loss=1.632, generator_adv_loss=2.311, generator_feat_match_loss=5.426, over 101.00 samples.], tot_loss[discriminator_loss=2.467, discriminator_real_loss=1.248, discriminator_fake_loss=1.22, generator_loss=31.02, generator_mel_loss=19.73, generator_kl_loss=2.018, generator_dur_loss=1.629, generator_adv_loss=2.398, generator_feat_match_loss=5.239, over 955.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0 2023-11-15 05:35:39,955 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 05:35:50,390 INFO [train.py:517] (0/4) Epoch 952, validation: discriminator_loss=2.606, discriminator_real_loss=1.169, discriminator_fake_loss=1.437, generator_loss=30.7, generator_mel_loss=20.22, generator_kl_loss=2.176, generator_dur_loss=1.628, generator_adv_loss=1.853, generator_feat_match_loss=4.83, over 100.00 samples. 2023-11-15 05:35:50,391 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 05:37:47,725 INFO [train.py:811] (0/4) Start epoch 953 2023-11-15 05:40:21,272 INFO [train.py:467] (0/4) Epoch 953, batch 26, global_batch_idx: 35250, batch size: 59, loss[discriminator_loss=2.531, discriminator_real_loss=1.266, discriminator_fake_loss=1.265, generator_loss=30.11, generator_mel_loss=19.91, generator_kl_loss=1.94, generator_dur_loss=1.63, generator_adv_loss=2.092, generator_feat_match_loss=4.543, over 59.00 samples.], tot_loss[discriminator_loss=2.55, discriminator_real_loss=1.292, discriminator_fake_loss=1.258, generator_loss=30.6, generator_mel_loss=19.85, generator_kl_loss=2.017, generator_dur_loss=1.631, generator_adv_loss=2.29, generator_feat_match_loss=4.816, over 1845.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 05:41:16,436 INFO [train.py:811] (0/4) Start epoch 954 2023-11-15 05:44:52,875 INFO [train.py:811] (0/4) Start epoch 955 2023-11-15 05:45:18,812 INFO [train.py:467] (0/4) Epoch 955, batch 2, global_batch_idx: 35300, batch size: 61, loss[discriminator_loss=2.555, discriminator_real_loss=1.397, discriminator_fake_loss=1.157, generator_loss=30.42, generator_mel_loss=19.67, generator_kl_loss=2.087, generator_dur_loss=1.62, generator_adv_loss=2.271, generator_feat_match_loss=4.777, over 61.00 samples.], tot_loss[discriminator_loss=2.631, discriminator_real_loss=1.347, discriminator_fake_loss=1.284, generator_loss=30.44, generator_mel_loss=19.95, generator_kl_loss=2.026, generator_dur_loss=1.619, generator_adv_loss=2.248, generator_feat_match_loss=4.603, over 202.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0 2023-11-15 05:48:20,637 INFO [train.py:811] (0/4) Start epoch 956 2023-11-15 05:50:02,987 INFO [train.py:467] (0/4) Epoch 956, batch 15, global_batch_idx: 35350, batch size: 85, loss[discriminator_loss=2.508, discriminator_real_loss=1.33, discriminator_fake_loss=1.177, generator_loss=31.38, generator_mel_loss=19.81, generator_kl_loss=1.937, generator_dur_loss=1.627, generator_adv_loss=2.561, generator_feat_match_loss=5.449, over 85.00 samples.], tot_loss[discriminator_loss=2.491, discriminator_real_loss=1.261, discriminator_fake_loss=1.23, generator_loss=31.14, generator_mel_loss=19.97, generator_kl_loss=2.027, generator_dur_loss=1.63, generator_adv_loss=2.393, generator_feat_match_loss=5.122, over 1066.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 05:51:59,229 INFO [train.py:811] (0/4) Start epoch 957 2023-11-15 
05:54:48,252 INFO [train.py:467] (0/4) Epoch 957, batch 28, global_batch_idx: 35400, batch size: 52, loss[discriminator_loss=2.506, discriminator_real_loss=1.222, discriminator_fake_loss=1.284, generator_loss=30.65, generator_mel_loss=19.87, generator_kl_loss=2.065, generator_dur_loss=1.633, generator_adv_loss=2.25, generator_feat_match_loss=4.836, over 52.00 samples.], tot_loss[discriminator_loss=2.472, discriminator_real_loss=1.246, discriminator_fake_loss=1.226, generator_loss=30.7, generator_mel_loss=19.5, generator_kl_loss=1.993, generator_dur_loss=1.631, generator_adv_loss=2.398, generator_feat_match_loss=5.184, over 1862.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 05:54:48,815 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 05:54:59,150 INFO [train.py:517] (0/4) Epoch 957, validation: discriminator_loss=2.546, discriminator_real_loss=1.16, discriminator_fake_loss=1.386, generator_loss=31.21, generator_mel_loss=20.37, generator_kl_loss=2.276, generator_dur_loss=1.623, generator_adv_loss=1.948, generator_feat_match_loss=4.993, over 100.00 samples. 2023-11-15 05:54:59,151 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 05:55:44,708 INFO [train.py:811] (0/4) Start epoch 958 2023-11-15 05:59:12,225 INFO [train.py:811] (0/4) Start epoch 959 2023-11-15 05:59:49,567 INFO [train.py:467] (0/4) Epoch 959, batch 4, global_batch_idx: 35450, batch size: 65, loss[discriminator_loss=2.492, discriminator_real_loss=1.299, discriminator_fake_loss=1.194, generator_loss=30.91, generator_mel_loss=20.04, generator_kl_loss=1.951, generator_dur_loss=1.635, generator_adv_loss=2.227, generator_feat_match_loss=5.059, over 65.00 samples.], tot_loss[discriminator_loss=2.535, discriminator_real_loss=1.301, discriminator_fake_loss=1.234, generator_loss=30.7, generator_mel_loss=20.04, generator_kl_loss=2.003, generator_dur_loss=1.634, generator_adv_loss=2.244, generator_feat_match_loss=4.785, over 346.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 06:02:43,763 INFO [train.py:811] (0/4) Start epoch 960 2023-11-15 06:04:35,410 INFO [train.py:467] (0/4) Epoch 960, batch 17, global_batch_idx: 35500, batch size: 101, loss[discriminator_loss=2.566, discriminator_real_loss=1.338, discriminator_fake_loss=1.228, generator_loss=31.01, generator_mel_loss=20.19, generator_kl_loss=2.128, generator_dur_loss=1.622, generator_adv_loss=2.277, generator_feat_match_loss=4.785, over 101.00 samples.], tot_loss[discriminator_loss=2.586, discriminator_real_loss=1.311, discriminator_fake_loss=1.276, generator_loss=30.42, generator_mel_loss=19.92, generator_kl_loss=2.01, generator_dur_loss=1.624, generator_adv_loss=2.215, generator_feat_match_loss=4.658, over 1253.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 06:06:19,882 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-960.pt 2023-11-15 06:06:23,319 INFO [train.py:811] (0/4) Start epoch 961 2023-11-15 06:09:11,933 INFO [train.py:467] (0/4) Epoch 961, batch 30, global_batch_idx: 35550, batch size: 126, loss[discriminator_loss=2.51, discriminator_real_loss=1.164, discriminator_fake_loss=1.346, generator_loss=31.28, generator_mel_loss=20.1, generator_kl_loss=2.021, generator_dur_loss=1.624, generator_adv_loss=2.41, generator_feat_match_loss=5.121, over 126.00 samples.], tot_loss[discriminator_loss=2.542, discriminator_real_loss=1.281, discriminator_fake_loss=1.261, generator_loss=30.58, 
generator_mel_loss=19.8, generator_kl_loss=2.034, generator_dur_loss=1.628, generator_adv_loss=2.278, generator_feat_match_loss=4.836, over 2285.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 06:09:50,216 INFO [train.py:811] (0/4) Start epoch 962 2023-11-15 06:13:27,428 INFO [train.py:811] (0/4) Start epoch 963 2023-11-15 06:14:18,877 INFO [train.py:467] (0/4) Epoch 963, batch 6, global_batch_idx: 35600, batch size: 76, loss[discriminator_loss=2.508, discriminator_real_loss=1.367, discriminator_fake_loss=1.142, generator_loss=30.79, generator_mel_loss=20.01, generator_kl_loss=1.986, generator_dur_loss=1.62, generator_adv_loss=2.18, generator_feat_match_loss=5, over 76.00 samples.], tot_loss[discriminator_loss=2.584, discriminator_real_loss=1.312, discriminator_fake_loss=1.272, generator_loss=30.76, generator_mel_loss=20.16, generator_kl_loss=2.021, generator_dur_loss=1.627, generator_adv_loss=2.219, generator_feat_match_loss=4.726, over 592.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0 2023-11-15 06:14:19,971 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 06:14:31,163 INFO [train.py:517] (0/4) Epoch 963, validation: discriminator_loss=2.487, discriminator_real_loss=1.174, discriminator_fake_loss=1.313, generator_loss=31.52, generator_mel_loss=20.6, generator_kl_loss=2.213, generator_dur_loss=1.629, generator_adv_loss=2.048, generator_feat_match_loss=5.035, over 100.00 samples. 2023-11-15 06:14:31,164 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 06:17:12,226 INFO [train.py:811] (0/4) Start epoch 964 2023-11-15 06:19:10,323 INFO [train.py:467] (0/4) Epoch 964, batch 19, global_batch_idx: 35650, batch size: 126, loss[discriminator_loss=2.5, discriminator_real_loss=1.186, discriminator_fake_loss=1.314, generator_loss=30.95, generator_mel_loss=20.03, generator_kl_loss=2.123, generator_dur_loss=1.622, generator_adv_loss=2.283, generator_feat_match_loss=4.891, over 126.00 samples.], tot_loss[discriminator_loss=2.555, discriminator_real_loss=1.299, discriminator_fake_loss=1.256, generator_loss=30.78, generator_mel_loss=19.82, generator_kl_loss=2.026, generator_dur_loss=1.621, generator_adv_loss=2.335, generator_feat_match_loss=4.978, over 1695.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 06:20:45,359 INFO [train.py:811] (0/4) Start epoch 965 2023-11-15 06:24:00,983 INFO [train.py:467] (0/4) Epoch 965, batch 32, global_batch_idx: 35700, batch size: 81, loss[discriminator_loss=2.592, discriminator_real_loss=1.316, discriminator_fake_loss=1.275, generator_loss=30.19, generator_mel_loss=20.04, generator_kl_loss=2.057, generator_dur_loss=1.644, generator_adv_loss=2.016, generator_feat_match_loss=4.43, over 81.00 samples.], tot_loss[discriminator_loss=2.546, discriminator_real_loss=1.293, discriminator_fake_loss=1.253, generator_loss=30.45, generator_mel_loss=19.82, generator_kl_loss=2.026, generator_dur_loss=1.623, generator_adv_loss=2.22, generator_feat_match_loss=4.765, over 2485.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 06:24:20,571 INFO [train.py:811] (0/4) Start epoch 966 2023-11-15 06:27:52,071 INFO [train.py:811] (0/4) Start epoch 967 2023-11-15 06:28:53,416 INFO [train.py:467] (0/4) Epoch 967, batch 8, global_batch_idx: 35750, batch size: 110, loss[discriminator_loss=2.551, discriminator_real_loss=1.255, discriminator_fake_loss=1.296, generator_loss=31.18, generator_mel_loss=20.32, generator_kl_loss=2.06, 
generator_dur_loss=1.625, generator_adv_loss=2.301, generator_feat_match_loss=4.875, over 110.00 samples.], tot_loss[discriminator_loss=2.564, discriminator_real_loss=1.272, discriminator_fake_loss=1.291, generator_loss=30.84, generator_mel_loss=20.12, generator_kl_loss=2.032, generator_dur_loss=1.621, generator_adv_loss=2.245, generator_feat_match_loss=4.819, over 778.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 06:31:29,478 INFO [train.py:811] (0/4) Start epoch 968 2023-11-15 06:33:36,852 INFO [train.py:467] (0/4) Epoch 968, batch 21, global_batch_idx: 35800, batch size: 71, loss[discriminator_loss=2.451, discriminator_real_loss=1.172, discriminator_fake_loss=1.279, generator_loss=31.35, generator_mel_loss=19.95, generator_kl_loss=2.057, generator_dur_loss=1.631, generator_adv_loss=2.393, generator_feat_match_loss=5.324, over 71.00 samples.], tot_loss[discriminator_loss=2.522, discriminator_real_loss=1.288, discriminator_fake_loss=1.233, generator_loss=30.79, generator_mel_loss=19.82, generator_kl_loss=2, generator_dur_loss=1.63, generator_adv_loss=2.364, generator_feat_match_loss=4.981, over 1507.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 06:33:37,437 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 06:33:47,700 INFO [train.py:517] (0/4) Epoch 968, validation: discriminator_loss=2.741, discriminator_real_loss=1.128, discriminator_fake_loss=1.613, generator_loss=30.96, generator_mel_loss=20.77, generator_kl_loss=2.193, generator_dur_loss=1.637, generator_adv_loss=1.671, generator_feat_match_loss=4.682, over 100.00 samples. 2023-11-15 06:33:47,701 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 06:35:06,031 INFO [train.py:811] (0/4) Start epoch 969 2023-11-15 06:38:27,696 INFO [train.py:467] (0/4) Epoch 969, batch 34, global_batch_idx: 35850, batch size: 71, loss[discriminator_loss=2.543, discriminator_real_loss=1.377, discriminator_fake_loss=1.166, generator_loss=30.73, generator_mel_loss=20.01, generator_kl_loss=2.009, generator_dur_loss=1.613, generator_adv_loss=2.303, generator_feat_match_loss=4.793, over 71.00 samples.], tot_loss[discriminator_loss=2.491, discriminator_real_loss=1.26, discriminator_fake_loss=1.23, generator_loss=30.78, generator_mel_loss=19.87, generator_kl_loss=2.011, generator_dur_loss=1.634, generator_adv_loss=2.312, generator_feat_match_loss=4.952, over 2243.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 06:38:40,359 INFO [train.py:811] (0/4) Start epoch 970 2023-11-15 06:42:10,461 INFO [train.py:811] (0/4) Start epoch 971 2023-11-15 06:43:22,030 INFO [train.py:467] (0/4) Epoch 971, batch 10, global_batch_idx: 35900, batch size: 58, loss[discriminator_loss=2.598, discriminator_real_loss=1.322, discriminator_fake_loss=1.275, generator_loss=29.88, generator_mel_loss=19.25, generator_kl_loss=2.02, generator_dur_loss=1.624, generator_adv_loss=2.1, generator_feat_match_loss=4.895, over 58.00 samples.], tot_loss[discriminator_loss=2.56, discriminator_real_loss=1.312, discriminator_fake_loss=1.248, generator_loss=30.79, generator_mel_loss=19.53, generator_kl_loss=2.03, generator_dur_loss=1.631, generator_adv_loss=2.436, generator_feat_match_loss=5.163, over 712.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 06:45:41,934 INFO [train.py:811] (0/4) Start epoch 972 2023-11-15 06:48:04,596 INFO [train.py:467] (0/4) Epoch 972, batch 23, global_batch_idx: 35950, batch size: 51, 
loss[discriminator_loss=2.551, discriminator_real_loss=1.385, discriminator_fake_loss=1.167, generator_loss=30.82, generator_mel_loss=19.98, generator_kl_loss=2.036, generator_dur_loss=1.623, generator_adv_loss=2.297, generator_feat_match_loss=4.891, over 51.00 samples.], tot_loss[discriminator_loss=2.561, discriminator_real_loss=1.297, discriminator_fake_loss=1.264, generator_loss=30.59, generator_mel_loss=20, generator_kl_loss=2.032, generator_dur_loss=1.623, generator_adv_loss=2.216, generator_feat_match_loss=4.712, over 1754.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 06:49:16,040 INFO [train.py:811] (0/4) Start epoch 973 2023-11-15 06:52:45,953 INFO [train.py:467] (0/4) Epoch 973, batch 36, global_batch_idx: 36000, batch size: 52, loss[discriminator_loss=2.523, discriminator_real_loss=1.237, discriminator_fake_loss=1.285, generator_loss=29.96, generator_mel_loss=19.29, generator_kl_loss=1.984, generator_dur_loss=1.628, generator_adv_loss=2.27, generator_feat_match_loss=4.789, over 52.00 samples.], tot_loss[discriminator_loss=2.547, discriminator_real_loss=1.279, discriminator_fake_loss=1.268, generator_loss=30.67, generator_mel_loss=19.8, generator_kl_loss=2.009, generator_dur_loss=1.629, generator_adv_loss=2.327, generator_feat_match_loss=4.909, over 2630.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0 2023-11-15 06:52:46,480 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 06:52:57,018 INFO [train.py:517] (0/4) Epoch 973, validation: discriminator_loss=2.505, discriminator_real_loss=1.19, discriminator_fake_loss=1.315, generator_loss=31.28, generator_mel_loss=20.59, generator_kl_loss=2.157, generator_dur_loss=1.638, generator_adv_loss=1.969, generator_feat_match_loss=4.919, over 100.00 samples. 
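Each batch entry above splits the discriminator loss into real/fake terms and the generator loss into mel, KL, duration, adversarial, and feature-matching terms. As a rough illustration of how the adversarial pieces can be computed, here is a least-squares-GAN sketch; the least-squares form and the feature-matching definition are assumptions for illustration, not a quotation of the recipe's loss code:

```python
# Hedged sketch: least-squares adversarial losses consistent with the
# discriminator_real_loss / discriminator_fake_loss split in the log.
# disc_real / disc_fake are hypothetical discriminator score tensors.
import torch

def discriminator_loss(disc_real: torch.Tensor, disc_fake: torch.Tensor):
    real_loss = torch.mean((disc_real - 1.0) ** 2)  # push real scores to 1
    fake_loss = torch.mean(disc_fake ** 2)          # push fake scores to 0
    return real_loss + fake_loss, real_loss, fake_loss

def generator_adv_loss(disc_fake: torch.Tensor):
    # The generator tries to make fake samples score like real ones.
    return torch.mean((disc_fake - 1.0) ** 2)

def feature_match_loss(feats_real, feats_fake):
    # L1 distance between intermediate discriminator feature maps,
    # with real-side features detached from the graph.
    return sum(torch.mean(torch.abs(r.detach() - f))
               for r, f in zip(feats_real, feats_fake))
```

Under this reading, discriminator_loss in the log is the sum of the logged real and fake terms, which the numbers above are consistent with (e.g. 2.551 = 1.26 + 1.291 up to rounding).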
2023-11-15 06:52:57,019 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 06:52:57,743 INFO [train.py:811] (0/4) Start epoch 974 2023-11-15 06:56:32,509 INFO [train.py:811] (0/4) Start epoch 975 2023-11-15 06:57:50,212 INFO [train.py:467] (0/4) Epoch 975, batch 12, global_batch_idx: 36050, batch size: 52, loss[discriminator_loss=2.516, discriminator_real_loss=1.283, discriminator_fake_loss=1.233, generator_loss=30.72, generator_mel_loss=19.88, generator_kl_loss=2.042, generator_dur_loss=1.647, generator_adv_loss=2.33, generator_feat_match_loss=4.82, over 52.00 samples.], tot_loss[discriminator_loss=2.562, discriminator_real_loss=1.295, discriminator_fake_loss=1.268, generator_loss=30.71, generator_mel_loss=19.98, generator_kl_loss=2.028, generator_dur_loss=1.632, generator_adv_loss=2.267, generator_feat_match_loss=4.807, over 895.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0 2023-11-15 07:00:06,034 INFO [train.py:811] (0/4) Start epoch 976 2023-11-15 07:02:39,617 INFO [train.py:467] (0/4) Epoch 976, batch 25, global_batch_idx: 36100, batch size: 61, loss[discriminator_loss=2.727, discriminator_real_loss=1.174, discriminator_fake_loss=1.554, generator_loss=29.44, generator_mel_loss=19.27, generator_kl_loss=1.973, generator_dur_loss=1.642, generator_adv_loss=2.146, generator_feat_match_loss=4.406, over 61.00 samples.], tot_loss[discriminator_loss=2.509, discriminator_real_loss=1.273, discriminator_fake_loss=1.237, generator_loss=30.89, generator_mel_loss=19.71, generator_kl_loss=2.002, generator_dur_loss=1.623, generator_adv_loss=2.41, generator_feat_match_loss=5.144, over 1783.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 07:03:40,859 INFO [train.py:811] (0/4) Start epoch 977 2023-11-15 07:07:13,339 INFO [train.py:811] (0/4) Start epoch 978 2023-11-15 07:07:35,419 INFO [train.py:467] (0/4) Epoch 978, batch 1, global_batch_idx: 36150, batch size: 61, loss[discriminator_loss=2.525, discriminator_real_loss=1.24, discriminator_fake_loss=1.285, generator_loss=30.5, generator_mel_loss=19.68, generator_kl_loss=2.036, generator_dur_loss=1.642, generator_adv_loss=2.258, generator_feat_match_loss=4.887, over 61.00 samples.], tot_loss[discriminator_loss=2.514, discriminator_real_loss=1.249, discriminator_fake_loss=1.265, generator_loss=30.78, generator_mel_loss=19.93, generator_kl_loss=1.999, generator_dur_loss=1.628, generator_adv_loss=2.257, generator_feat_match_loss=4.959, over 119.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 07:10:43,223 INFO [train.py:811] (0/4) Start epoch 979 2023-11-15 07:12:11,754 INFO [train.py:467] (0/4) Epoch 979, batch 14, global_batch_idx: 36200, batch size: 153, loss[discriminator_loss=2.496, discriminator_real_loss=1.259, discriminator_fake_loss=1.237, generator_loss=31.08, generator_mel_loss=20.11, generator_kl_loss=2.119, generator_dur_loss=1.63, generator_adv_loss=1.986, generator_feat_match_loss=5.227, over 153.00 samples.], tot_loss[discriminator_loss=2.592, discriminator_real_loss=1.324, discriminator_fake_loss=1.268, generator_loss=30.7, generator_mel_loss=19.99, generator_kl_loss=2.033, generator_dur_loss=1.629, generator_adv_loss=2.211, generator_feat_match_loss=4.842, over 1012.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 07:12:12,394 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 07:12:23,056 INFO [train.py:517] (0/4) Epoch 979, validation: discriminator_loss=2.618, 
discriminator_real_loss=1.056, discriminator_fake_loss=1.562, generator_loss=30.72, generator_mel_loss=20.33, generator_kl_loss=2.276, generator_dur_loss=1.629, generator_adv_loss=1.682, generator_feat_match_loss=4.806, over 100.00 samples. 2023-11-15 07:12:23,058 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 07:14:23,757 INFO [train.py:811] (0/4) Start epoch 980 2023-11-15 07:17:09,457 INFO [train.py:467] (0/4) Epoch 980, batch 27, global_batch_idx: 36250, batch size: 69, loss[discriminator_loss=2.484, discriminator_real_loss=1.292, discriminator_fake_loss=1.193, generator_loss=31.28, generator_mel_loss=19.92, generator_kl_loss=2.125, generator_dur_loss=1.641, generator_adv_loss=2.418, generator_feat_match_loss=5.172, over 69.00 samples.], tot_loss[discriminator_loss=2.531, discriminator_real_loss=1.272, discriminator_fake_loss=1.259, generator_loss=30.8, generator_mel_loss=19.85, generator_kl_loss=2.006, generator_dur_loss=1.624, generator_adv_loss=2.308, generator_feat_match_loss=5.006, over 2082.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 07:17:56,485 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-980.pt 2023-11-15 07:17:59,866 INFO [train.py:811] (0/4) Start epoch 981 2023-11-15 07:21:34,003 INFO [train.py:811] (0/4) Start epoch 982 2023-11-15 07:22:04,813 INFO [train.py:467] (0/4) Epoch 982, batch 3, global_batch_idx: 36300, batch size: 65, loss[discriminator_loss=2.535, discriminator_real_loss=1.273, discriminator_fake_loss=1.261, generator_loss=30.54, generator_mel_loss=19.73, generator_kl_loss=1.916, generator_dur_loss=1.624, generator_adv_loss=2.264, generator_feat_match_loss=5.008, over 65.00 samples.], tot_loss[discriminator_loss=2.563, discriminator_real_loss=1.297, discriminator_fake_loss=1.266, generator_loss=30.51, generator_mel_loss=19.88, generator_kl_loss=1.964, generator_dur_loss=1.627, generator_adv_loss=2.25, generator_feat_match_loss=4.783, over 249.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 07:25:06,592 INFO [train.py:811] (0/4) Start epoch 983 2023-11-15 07:26:50,766 INFO [train.py:467] (0/4) Epoch 983, batch 16, global_batch_idx: 36350, batch size: 56, loss[discriminator_loss=2.574, discriminator_real_loss=1.133, discriminator_fake_loss=1.44, generator_loss=30.65, generator_mel_loss=19.93, generator_kl_loss=2.084, generator_dur_loss=1.61, generator_adv_loss=2.285, generator_feat_match_loss=4.742, over 56.00 samples.], tot_loss[discriminator_loss=2.537, discriminator_real_loss=1.283, discriminator_fake_loss=1.254, generator_loss=30.48, generator_mel_loss=19.71, generator_kl_loss=2.027, generator_dur_loss=1.628, generator_adv_loss=2.265, generator_feat_match_loss=4.847, over 1146.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 07:28:40,873 INFO [train.py:811] (0/4) Start epoch 984 2023-11-15 07:31:33,867 INFO [train.py:467] (0/4) Epoch 984, batch 29, global_batch_idx: 36400, batch size: 58, loss[discriminator_loss=2.68, discriminator_real_loss=1.39, discriminator_fake_loss=1.289, generator_loss=30.43, generator_mel_loss=19.77, generator_kl_loss=2.075, generator_dur_loss=1.611, generator_adv_loss=2.201, generator_feat_match_loss=4.766, over 58.00 samples.], tot_loss[discriminator_loss=2.554, discriminator_real_loss=1.295, discriminator_fake_loss=1.259, generator_loss=30.63, generator_mel_loss=20, generator_kl_loss=2.041, generator_dur_loss=1.622, generator_adv_loss=2.205, 
generator_feat_match_loss=4.753, over 2228.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0 2023-11-15 07:31:34,361 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 07:31:44,583 INFO [train.py:517] (0/4) Epoch 984, validation: discriminator_loss=2.678, discriminator_real_loss=1.31, discriminator_fake_loss=1.368, generator_loss=31.65, generator_mel_loss=20.98, generator_kl_loss=2.292, generator_dur_loss=1.636, generator_adv_loss=2.001, generator_feat_match_loss=4.745, over 100.00 samples. 2023-11-15 07:31:44,584 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 07:32:24,739 INFO [train.py:811] (0/4) Start epoch 985 2023-11-15 07:36:00,867 INFO [train.py:811] (0/4) Start epoch 986 2023-11-15 07:36:45,086 INFO [train.py:467] (0/4) Epoch 986, batch 5, global_batch_idx: 36450, batch size: 95, loss[discriminator_loss=2.43, discriminator_real_loss=1.197, discriminator_fake_loss=1.232, generator_loss=30.74, generator_mel_loss=19.64, generator_kl_loss=2.003, generator_dur_loss=1.609, generator_adv_loss=2.189, generator_feat_match_loss=5.297, over 95.00 samples.], tot_loss[discriminator_loss=2.563, discriminator_real_loss=1.319, discriminator_fake_loss=1.244, generator_loss=30.65, generator_mel_loss=19.83, generator_kl_loss=2.01, generator_dur_loss=1.622, generator_adv_loss=2.263, generator_feat_match_loss=4.93, over 497.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0 2023-11-15 07:39:28,570 INFO [train.py:811] (0/4) Start epoch 987 2023-11-15 07:41:29,228 INFO [train.py:467] (0/4) Epoch 987, batch 18, global_batch_idx: 36500, batch size: 110, loss[discriminator_loss=2.586, discriminator_real_loss=1.231, discriminator_fake_loss=1.355, generator_loss=31.27, generator_mel_loss=20.53, generator_kl_loss=2.08, generator_dur_loss=1.61, generator_adv_loss=2.266, generator_feat_match_loss=4.789, over 110.00 samples.], tot_loss[discriminator_loss=2.578, discriminator_real_loss=1.295, discriminator_fake_loss=1.283, generator_loss=30.7, generator_mel_loss=20.09, generator_kl_loss=2.032, generator_dur_loss=1.628, generator_adv_loss=2.18, generator_feat_match_loss=4.769, over 1560.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0 2023-11-15 07:43:01,282 INFO [train.py:811] (0/4) Start epoch 988 2023-11-15 07:46:11,538 INFO [train.py:467] (0/4) Epoch 988, batch 31, global_batch_idx: 36550, batch size: 65, loss[discriminator_loss=2.672, discriminator_real_loss=1.314, discriminator_fake_loss=1.356, generator_loss=30.88, generator_mel_loss=20.3, generator_kl_loss=2.074, generator_dur_loss=1.615, generator_adv_loss=2.135, generator_feat_match_loss=4.762, over 65.00 samples.], tot_loss[discriminator_loss=2.612, discriminator_real_loss=1.328, discriminator_fake_loss=1.284, generator_loss=30.61, generator_mel_loss=20.1, generator_kl_loss=2.041, generator_dur_loss=1.623, generator_adv_loss=2.185, generator_feat_match_loss=4.657, over 2445.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0 2023-11-15 07:46:36,111 INFO [train.py:811] (0/4) Start epoch 989 2023-11-15 07:50:10,946 INFO [train.py:811] (0/4) Start epoch 990 2023-11-15 07:51:03,433 INFO [train.py:467] (0/4) Epoch 990, batch 7, global_batch_idx: 36600, batch size: 71, loss[discriminator_loss=2.596, discriminator_real_loss=1.312, discriminator_fake_loss=1.283, generator_loss=30.17, generator_mel_loss=19.69, generator_kl_loss=2.025, generator_dur_loss=1.619, generator_adv_loss=2.139, generator_feat_match_loss=4.691, over 71.00 
samples.], tot_loss[discriminator_loss=2.584, discriminator_real_loss=1.291, discriminator_fake_loss=1.293, generator_loss=30.51, generator_mel_loss=20, generator_kl_loss=1.987, generator_dur_loss=1.632, generator_adv_loss=2.181, generator_feat_match_loss=4.711, over 504.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0 2023-11-15 07:51:03,957 INFO [train.py:508] (0/4) Computing validation loss 2023-11-15 07:51:15,413 INFO [train.py:517] (0/4) Epoch 990, validation: discriminator_loss=2.502, discriminator_real_loss=1.136, discriminator_fake_loss=1.367, generator_loss=31.61, generator_mel_loss=20.82, generator_kl_loss=2.262, generator_dur_loss=1.631, generator_adv_loss=1.992, generator_feat_match_loss=4.902, over 100.00 samples. 2023-11-15 07:51:15,414 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB 2023-11-15 07:53:58,082 INFO [train.py:811] (0/4) Start epoch 991 2023-11-15 07:56:07,117 INFO [train.py:467] (0/4) Epoch 991, batch 20, global_batch_idx: 36650, batch size: 50, loss[discriminator_loss=2.562, discriminator_real_loss=1.324, discriminator_fake_loss=1.238, generator_loss=30.44, generator_mel_loss=19.86, generator_kl_loss=1.996, generator_dur_loss=1.624, generator_adv_loss=2.252, generator_feat_match_loss=4.715, over 50.00 samples.], tot_loss[discriminator_loss=2.609, discriminator_real_loss=1.328, discriminator_fake_loss=1.282, generator_loss=30.57, generator_mel_loss=20.07, generator_kl_loss=2.022, generator_dur_loss=1.627, generator_adv_loss=2.192, generator_feat_match_loss=4.652, over 1476.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0 2023-11-15 07:57:34,061 INFO [train.py:811] (0/4) Start epoch 992 2023-11-15 08:00:41,912 INFO [train.py:467] (0/4) Epoch 992, batch 33, global_batch_idx: 36700, batch size: 85, loss[discriminator_loss=2.582, discriminator_real_loss=1.347, discriminator_fake_loss=1.235, generator_loss=30.42, generator_mel_loss=20.03, generator_kl_loss=2.062, generator_dur_loss=1.608, generator_adv_loss=2.148, generator_feat_match_loss=4.574, over 85.00 samples.], tot_loss[discriminator_loss=2.591, discriminator_real_loss=1.307, discriminator_fake_loss=1.284, generator_loss=30.69, generator_mel_loss=20.13, generator_kl_loss=2.047, generator_dur_loss=1.62, generator_adv_loss=2.212, generator_feat_match_loss=4.688, over 2426.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0 2023-11-15 08:01:03,104 INFO [train.py:811] (0/4) Start epoch 993 2023-11-15 08:04:37,137 INFO [train.py:811] (0/4) Start epoch 994 2023-11-15 08:05:43,106 INFO [train.py:467] (0/4) Epoch 994, batch 9, global_batch_idx: 36750, batch size: 110, loss[discriminator_loss=2.562, discriminator_real_loss=1.351, discriminator_fake_loss=1.212, generator_loss=30.75, generator_mel_loss=19.68, generator_kl_loss=2.089, generator_dur_loss=1.635, generator_adv_loss=2.371, generator_feat_match_loss=4.977, over 110.00 samples.], tot_loss[discriminator_loss=2.622, discriminator_real_loss=1.331, discriminator_fake_loss=1.291, generator_loss=30.82, generator_mel_loss=19.57, generator_kl_loss=1.993, generator_dur_loss=1.624, generator_adv_loss=2.415, generator_feat_match_loss=5.218, over 821.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0 2023-11-15 08:08:07,129 INFO [train.py:811] (0/4) Start epoch 995 2023-11-15 08:10:27,347 INFO [train.py:467] (0/4) Epoch 995, batch 22, global_batch_idx: 36800, batch size: 65, loss[discriminator_loss=2.566, discriminator_real_loss=1.373, discriminator_fake_loss=1.192, 
2023-11-15 08:10:27,862 INFO [train.py:508] (0/4) Computing validation loss
2023-11-15 08:10:38,605 INFO [train.py:517] (0/4) Epoch 995, validation: discriminator_loss=2.653, discriminator_real_loss=1.189, discriminator_fake_loss=1.464, generator_loss=30.64, generator_mel_loss=20.53, generator_kl_loss=2.269, generator_dur_loss=1.626, generator_adv_loss=1.772, generator_feat_match_loss=4.444, over 100.00 samples.
2023-11-15 08:10:38,606 INFO [train.py:518] (0/4) Maximum memory allocated so far is 27269MB
2023-11-15 08:11:49,597 INFO [train.py:811] (0/4) Start epoch 996
2023-11-15 08:15:19,985 INFO [train.py:467] (0/4) Epoch 996, batch 35, global_batch_idx: 36850, batch size: 71, loss[discriminator_loss=2.555, discriminator_real_loss=1.251, discriminator_fake_loss=1.305, generator_loss=30.73, generator_mel_loss=20.17, generator_kl_loss=2.089, generator_dur_loss=1.617, generator_adv_loss=2.209, generator_feat_match_loss=4.648, over 71.00 samples.], tot_loss[discriminator_loss=2.591, discriminator_real_loss=1.302, discriminator_fake_loss=1.29, generator_loss=30.56, generator_mel_loss=19.91, generator_kl_loss=2.011, generator_dur_loss=1.626, generator_adv_loss=2.242, generator_feat_match_loss=4.771, over 2464.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0
2023-11-15 08:15:25,746 INFO [train.py:811] (0/4) Start epoch 997
2023-11-15 08:18:59,886 INFO [train.py:811] (0/4) Start epoch 998
2023-11-15 08:20:22,891 INFO [train.py:467] (0/4) Epoch 998, batch 11, global_batch_idx: 36900, batch size: 53, loss[discriminator_loss=2.566, discriminator_real_loss=1.252, discriminator_fake_loss=1.315, generator_loss=30.01, generator_mel_loss=19.71, generator_kl_loss=1.957, generator_dur_loss=1.625, generator_adv_loss=2.027, generator_feat_match_loss=4.684, over 53.00 samples.], tot_loss[discriminator_loss=2.573, discriminator_real_loss=1.305, discriminator_fake_loss=1.268, generator_loss=30.96, generator_mel_loss=20.06, generator_kl_loss=2.053, generator_dur_loss=1.627, generator_adv_loss=2.282, generator_feat_match_loss=4.941, over 926.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2023-11-15 08:22:38,784 INFO [train.py:811] (0/4) Start epoch 999
2023-11-15 08:25:03,045 INFO [train.py:467] (0/4) Epoch 999, batch 24, global_batch_idx: 36950, batch size: 90, loss[discriminator_loss=2.59, discriminator_real_loss=1.328, discriminator_fake_loss=1.262, generator_loss=30.5, generator_mel_loss=20.17, generator_kl_loss=2.122, generator_dur_loss=1.629, generator_adv_loss=2.006, generator_feat_match_loss=4.57, over 90.00 samples.], tot_loss[discriminator_loss=2.56, discriminator_real_loss=1.298, discriminator_fake_loss=1.262, generator_loss=30.57, generator_mel_loss=19.88, generator_kl_loss=2.026, generator_dur_loss=1.622, generator_adv_loss=2.229, generator_feat_match_loss=4.809, over 1880.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2023-11-15 08:26:10,788 INFO [train.py:811] (0/4) Start epoch 1000
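[Annotation] Because every per-batch record above follows the same key=value layout, the loss curves can be recovered from a log in this format with a short regex pass. A sketch, where "train.log" is a placeholder path for a file holding lines like the ones above:

    import re

    # Extract (global_batch_idx, running generator_loss) pairs from the log.
    pat = re.compile(r"global_batch_idx: (\d+).*?tot_loss\[.*?generator_loss=([\d.]+)")
    points = []
    with open("train.log") as f:  # placeholder path
        for line in f:
            m = pat.search(line)
            if m:
                points.append((int(m.group(1)), float(m.group(2))))
    print(points[-2:])  # e.g. [(36900, 30.96), (36950, 30.57)] for the records above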
2023-11-15 08:29:43,334 INFO [utils.py:245] (0/4) Saving checkpoint to vits/exp-g2p-conformer-text-encoder-new/epoch-1000.pt
2023-11-15 08:29:46,119 INFO [train.py:868] (0/4) Done!
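[Annotation] The final epoch-1000.pt saved above can be reloaded for inspection or inference. A sketch, assuming (unverified from this log alone) that the checkpoint is a dict with the network weights stored under a "model" key:

    import torch

    # Inspect the final checkpoint; the "model" key below is an assumption
    # about the layout, not something this log confirms -- check ckpt.keys().
    ckpt = torch.load(
        "vits/exp-g2p-conformer-text-encoder-new/epoch-1000.pt",
        map_location="cpu",
    )
    print(sorted(ckpt.keys()))
    # model.load_state_dict(ckpt["model"])  # once the key and model class are confirmed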