icefall-asr-librispeech-zipformer-large-cr-ctc-aed-20241020/decoding_results/attention-decoder-rescoring-no-ngram/log-decode-epoch-50_avg-20_use-averaged-model-2024-09-21-17-57-57
2024-09-21 17:57:57,911 INFO [ctc_decode.py:769] Decoding started
2024-09-21 17:57:57,911 INFO [ctc_decode.py:775] Device: cuda:0
2024-09-21 17:57:57,911 INFO [ctc_decode.py:776] {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 50, 'reset_interval': 200, 'valid_interval': 3000, 'feature_dim': 80, 'subsampling_factor': 4, 'ignore_id': -1, 'label_smoothing': 0.1, 'warm_step': 2000, 'env_info': {'k2-version': '1.24.4', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '44a9d5682af9fd3ef77074777e15278ec6d390eb', 'k2-git-date': 'Wed Sep 27 11:22:55 2023', 'lhotse-version': '1.17.0.dev+git.ccfc5b2c.dirty', 'torch-version': '1.10.0+cu102', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'cr-ctc', 'icefall-git-sha1': 'a6eead6c-clean', 'icefall-git-date': 'Mon Sep 9 10:10:08 2024', 'icefall-path': '/star-zw/workspace/zipformer/icefall_cr_ctc', 'k2-path': '/star-zw/workspace/k2/k2/k2/python/k2/__init__.py', 'lhotse-path': '/star-zw/workspace/lhotse/lhotse/lhotse/__init__.py', 'hostname': 'de-74279-k2-train-2-0904151501-7d58788f57-7cktm', 'IP address': '10.30.14.169'}, 'frame_shift_ms': 10, 'search_beam': 20, 'output_beam': 8, 'min_active_states': 30, 'max_active_states': 10000, 'use_double_scores': True, 'epoch': 50, 'iter': 0, 'avg': 20, 'use_averaged_model': True, 'exp_dir': PosixPath('zipformer/exp-large-ctc-aed-ctc-loss-scale-0.1-aed-loss-scale-0.9-cr-loss-scale-0.02-time-mask-ratio-2.5-scaled-masked-1'), 'bpe_model': 'data/lang_bpe_500/bpe.model', 'lang_dir': PosixPath('data/lang_bpe_500'), 'context_size': 2, 'decoding_method': 'attention-decoder-rescoring-no-ngram', 'num_paths': 100, 'nbest_scale': 1.0, 'hlg_scale': 0.6, 'lm_dir': PosixPath('data/lm'), 'skip_scoring': False, 'num_encoder_layers': '2,2,4,5,4,2', 'downsampling_factor': '1,2,4,8,4,2', 'feedforward_dim': '512,768,1536,2048,1536,768', 'num_heads': '4,4,4,8,4,4', 'encoder_dim': '192,256,512,768,512,256', 'query_head_dim': '32', 'value_head_dim': '12', 'pos_head_dim': '4', 
'pos_dim': 48, 'encoder_unmasked_dim': '192,192,256,320,256,192', 'cnn_module_kernel': '31,31,15,15,15,31', 'decoder_dim': 512, 'joiner_dim': 512, 'attention_decoder_dim': 512, 'attention_decoder_num_layers': 6, 'attention_decoder_attention_dim': 512, 'attention_decoder_num_heads': 8, 'attention_decoder_feedforward_dim': 2048, 'causal': False, 'chunk_size': '16,32,64,-1', 'left_context_frames': '64,128,256,-1', 'use_transducer': False, 'use_ctc': True, 'use_attention_decoder': True, 'use_cr_ctc': True, 'full_libri': True, 'mini_libri': False, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 200, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures', 'res_dir': PosixPath('zipformer/exp-large-ctc-aed-ctc-loss-scale-0.1-aed-loss-scale-0.9-cr-loss-scale-0.02-time-mask-ratio-2.5-scaled-masked-1/attention-decoder-rescoring-no-ngram'), 'suffix': 'epoch-50_avg-20_use-averaged-model'}
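The config above selects 'decoding_method': 'attention-decoder-rescoring-no-ngram'. A hypothetical sketch of the idea (not icefall's actual implementation): n-best hypotheses obtained from CTC decoding are rescored by adding a scaled attention-decoder score to the CTC score, and the grid of attention_scale values swept later in this log picks the best weight. The hypotheses and scores below are made up for illustration.

```python
# Toy sketch of attention-scale rescoring (illustrative only).
# Each hypothesis carries a CTC score and an attention-decoder score
# (log-probabilities, so higher is better).

def rescore(nbest, attention_scale):
    """Pick the hypothesis maximizing ctc_score + scale * attn_score."""
    return max(nbest, key=lambda h: h[1] + attention_scale * h[2])

# Made-up n-best list: (text, ctc_score, attn_score).
nbest = [
    ("the cat sat", -12.0, -4.0),   # CTC slightly dislikes it; AED prefers it
    ("the cat sad", -11.5, -7.0),   # CTC prefers it; AED dislikes it
]

for scale in (0.1, 2.0):
    print(scale, rescore(nbest, scale)[0])
# 0.1 the cat sad   <- small scale: CTC score dominates
# 2.0 the cat sat   <- large scale: attention decoder dominates
```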
2024-09-21 17:57:58,323 INFO [lexicon.py:168] Loading pre-compiled data/lang_bpe_500/Linv.pt
2024-09-21 17:58:03,090 INFO [ctc_decode.py:861] About to create model
2024-09-21 17:58:04,386 INFO [ctc_decode.py:928] Calculating the averaged model over epoch range from 30 (excluded) to 50
2024-09-21 17:58:28,955 INFO [ctc_decode.py:945] Number of model parameters: 174319650
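The "averaged model over epoch range from 30 (excluded) to 50" line corresponds to --use-averaged-model with epoch=50 and avg=20: the weights of epochs 31 through 50 are averaged element-wise. A naive sketch of that averaging, with plain floats standing in for model tensors (icefall's real code computes the same mean more efficiently from cached running averages):

```python
# Naive element-wise checkpoint averaging (illustrative sketch).

def average_checkpoints(checkpoints):
    """Element-wise mean of a list of state dicts (param name -> value)."""
    n = len(checkpoints)
    return {
        name: sum(ckpt[name] for ckpt in checkpoints) / n
        for name in checkpoints[0]
    }

# Toy example: three "epochs" of a two-parameter model.
ckpts = [
    {"encoder.w": 1.0, "decoder.b": 0.0},
    {"encoder.w": 2.0, "decoder.b": 3.0},
    {"encoder.w": 3.0, "decoder.b": 6.0},
]
print(average_checkpoints(ckpts))  # {'encoder.w': 2.0, 'decoder.b': 3.0}
```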
2024-09-21 17:58:28,955 INFO [asr_datamodule.py:467] About to get test-clean cuts
2024-09-21 17:58:29,094 INFO [asr_datamodule.py:474] About to get test-other cuts
2024-09-21 17:58:32,173 INFO [ctc_decode.py:653] batch 0/?, cuts processed until now is 14
2024-09-21 17:58:55,303 INFO [zipformer.py:1858] name=None, attn_weights_entropy = tensor([5.2234, 4.6148, 4.6277, 4.7056], device='cuda:0')
2024-09-21 18:01:56,602 INFO [ctc_decode.py:653] batch 100/?, cuts processed until now is 2298
2024-09-21 18:02:22,203 INFO [ctc_decode.py:674] The transcripts are stored in zipformer/exp-large-ctc-aed-ctc-loss-scale-0.1-aed-loss-scale-0.9-cr-loss-scale-0.02-time-mask-ratio-2.5-scaled-masked-1/attention-decoder-rescoring-no-ngram/recogs-test-clean-epoch-50_avg-20_use-averaged-model.txt
2024-09-21 18:02:23,262 INFO [utils.py:657] [test-clean_attention_scale_0.01] %WER 2.53% [1329 / 52576, 151 ins, 144 del, 1034 sub ]
2024-09-21 18:02:23,476 INFO [ctc_decode.py:701] Wrote detailed error stats to zipformer/exp-large-ctc-aed-ctc-loss-scale-0.1-aed-loss-scale-0.9-cr-loss-scale-0.02-time-mask-ratio-2.5-scaled-masked-1/attention-decoder-rescoring-no-ngram/errs-test-clean-epoch-50_avg-20_use-averaged-model.txt
2024-09-21 18:02:23,572 INFO [utils.py:657] [test-clean_attention_scale_0.05] %WER 2.48% [1302 / 52576, 149 ins, 133 del, 1020 sub ]
2024-09-21 18:02:23,879 INFO [utils.py:657] [test-clean_attention_scale_0.08] %WER 2.43% [1278 / 52576, 147 ins, 126 del, 1005 sub ]
2024-09-21 18:02:24,425 INFO [utils.py:657] [test-clean_attention_scale_0.1] %WER 2.39% [1259 / 52576, 143 ins, 124 del, 992 sub ]
2024-09-21 18:02:24,724 INFO [utils.py:657] [test-clean_attention_scale_0.3] %WER 2.28% [1197 / 52576, 130 ins, 113 del, 954 sub ]
2024-09-21 18:02:25,030 INFO [utils.py:657] [test-clean_attention_scale_0.5] %WER 2.17% [1143 / 52576, 124 ins, 102 del, 917 sub ]
2024-09-21 18:02:25,332 INFO [utils.py:657] [test-clean_attention_scale_0.6] %WER 2.13% [1119 / 52576, 125 ins, 96 del, 898 sub ]
2024-09-21 18:02:25,629 INFO [utils.py:657] [test-clean_attention_scale_0.7] %WER 2.08% [1096 / 52576, 124 ins, 88 del, 884 sub ]
2024-09-21 18:02:25,933 INFO [utils.py:657] [test-clean_attention_scale_0.9] %WER 2.05% [1078 / 52576, 122 ins, 85 del, 871 sub ]
2024-09-21 18:02:26,229 INFO [utils.py:657] [test-clean_attention_scale_1.0] %WER 2.04% [1073 / 52576, 123 ins, 84 del, 866 sub ]
2024-09-21 18:02:26,524 INFO [utils.py:657] [test-clean_attention_scale_1.1] %WER 2.04% [1070 / 52576, 123 ins, 83 del, 864 sub ]
2024-09-21 18:02:26,824 INFO [utils.py:657] [test-clean_attention_scale_1.2] %WER 2.02% [1060 / 52576, 121 ins, 83 del, 856 sub ]
2024-09-21 18:02:27,123 INFO [utils.py:657] [test-clean_attention_scale_1.3] %WER 2.01% [1055 / 52576, 121 ins, 82 del, 852 sub ]
2024-09-21 18:02:27,456 INFO [utils.py:657] [test-clean_attention_scale_1.5] %WER 2.00% [1052 / 52576, 121 ins, 79 del, 852 sub ]
2024-09-21 18:02:27,752 INFO [utils.py:657] [test-clean_attention_scale_1.7] %WER 1.98% [1040 / 52576, 122 ins, 75 del, 843 sub ]
2024-09-21 18:02:28,052 INFO [utils.py:657] [test-clean_attention_scale_1.9] %WER 1.98% [1039 / 52576, 122 ins, 74 del, 843 sub ]
2024-09-21 18:02:28,613 INFO [utils.py:657] [test-clean_attention_scale_2.0] %WER 1.98% [1039 / 52576, 122 ins, 73 del, 844 sub ]
2024-09-21 18:02:28,910 INFO [utils.py:657] [test-clean_attention_scale_2.1] %WER 1.97% [1037 / 52576, 121 ins, 72 del, 844 sub ]
2024-09-21 18:02:29,214 INFO [utils.py:657] [test-clean_attention_scale_2.2] %WER 1.97% [1037 / 52576, 122 ins, 71 del, 844 sub ]
2024-09-21 18:02:29,523 INFO [utils.py:657] [test-clean_attention_scale_2.3] %WER 1.97% [1037 / 52576, 122 ins, 71 del, 844 sub ]
2024-09-21 18:02:29,836 INFO [utils.py:657] [test-clean_attention_scale_2.5] %WER 1.97% [1035 / 52576, 122 ins, 70 del, 843 sub ]
2024-09-21 18:02:30,144 INFO [utils.py:657] [test-clean_attention_scale_3.0] %WER 1.96% [1033 / 52576, 122 ins, 70 del, 841 sub ]
2024-09-21 18:02:30,461 INFO [utils.py:657] [test-clean_attention_scale_4.0] %WER 1.97% [1037 / 52576, 122 ins, 71 del, 844 sub ]
2024-09-21 18:02:30,770 INFO [utils.py:657] [test-clean_attention_scale_5.0] %WER 1.96% [1030 / 52576, 119 ins, 71 del, 840 sub ]
2024-09-21 18:02:31,077 INFO [utils.py:657] [test-clean_attention_scale_6.0] %WER 1.96% [1028 / 52576, 119 ins, 71 del, 838 sub ]
2024-09-21 18:02:31,395 INFO [utils.py:657] [test-clean_attention_scale_7.0] %WER 1.96% [1030 / 52576, 120 ins, 71 del, 839 sub ]
2024-09-21 18:02:31,698 INFO [utils.py:657] [test-clean_attention_scale_8.0] %WER 1.96% [1030 / 52576, 120 ins, 71 del, 839 sub ]
2024-09-21 18:02:32,241 INFO [utils.py:657] [test-clean_attention_scale_9.0] %WER 1.96% [1031 / 52576, 121 ins, 71 del, 839 sub ]
2024-09-21 18:02:32,486 INFO [ctc_decode.py:717]
For test-clean, WER of different settings are:
attention_scale_3.0	1.96	best for test-clean
attention_scale_5.0	1.96
attention_scale_6.0	1.96
attention_scale_7.0	1.96
attention_scale_8.0	1.96
attention_scale_9.0	1.96
attention_scale_2.1	1.97
attention_scale_2.2	1.97
attention_scale_2.3	1.97
attention_scale_2.5	1.97
attention_scale_4.0	1.97
attention_scale_1.7	1.98
attention_scale_1.9	1.98
attention_scale_2.0	1.98
attention_scale_1.5	2.0
attention_scale_1.3	2.01
attention_scale_1.2	2.02
attention_scale_1.0	2.04
attention_scale_1.1	2.04
attention_scale_0.9	2.05
attention_scale_0.7	2.08
attention_scale_0.6	2.13
attention_scale_0.5	2.17
attention_scale_0.3	2.28
attention_scale_0.1	2.39
attention_scale_0.08	2.43
attention_scale_0.05	2.48
attention_scale_0.01	2.53
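The %WER lines above follow the standard decomposition: word error rate is (insertions + deletions + substitutions) divided by the number of reference words. A minimal sketch that reproduces, for example, the attention_scale_0.01 line "%WER 2.53% [1329 / 52576, 151 ins, 144 del, 1034 sub ]":

```python
# Standard WER decomposition (not icefall's exact scoring code).

def wer_percent(ins, dels, subs, ref_words):
    """Return (total errors, WER as a percentage)."""
    errors = ins + dels + subs
    return errors, 100.0 * errors / ref_words

# Figures taken from the test-clean attention_scale_0.01 line above.
errors, wer = wer_percent(ins=151, dels=144, subs=1034, ref_words=52576)
print(errors)        # 1329
print(f"{wer:.2f}")  # 2.53
```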
2024-09-21 18:02:32,913 INFO [zipformer.py:1858] name=None, attn_weights_entropy = tensor([5.1365, 4.4428, 4.9126, 5.0717], device='cuda:0')
2024-09-21 18:02:35,087 INFO [ctc_decode.py:653] batch 0/?, cuts processed until now is 17
2024-09-21 18:04:48,388 INFO [zipformer.py:1858] name=None, attn_weights_entropy = tensor([4.7769, 4.1399, 4.5622, 4.7112], device='cuda:0')
2024-09-21 18:05:13,242 INFO [zipformer.py:1858] name=None, attn_weights_entropy = tensor([5.8741, 5.7811, 5.1177, 5.4814], device='cuda:0')
2024-09-21 18:05:41,706 INFO [zipformer.py:1858] name=None, attn_weights_entropy = tensor([3.8281, 4.7939, 5.1590, 5.1042], device='cuda:0')
2024-09-21 18:05:48,184 INFO [zipformer.py:1858] name=None, attn_weights_entropy = tensor([2.5137, 2.8956, 2.5775, 2.2306], device='cuda:0')
2024-09-21 18:05:53,840 INFO [zipformer.py:1858] name=None, attn_weights_entropy = tensor([4.7181, 3.8914, 4.0782, 4.2465], device='cuda:0')
2024-09-21 18:06:02,472 INFO [ctc_decode.py:653] batch 100/?, cuts processed until now is 2530
2024-09-21 18:06:26,746 INFO [ctc_decode.py:674] The transcripts are stored in zipformer/exp-large-ctc-aed-ctc-loss-scale-0.1-aed-loss-scale-0.9-cr-loss-scale-0.02-time-mask-ratio-2.5-scaled-masked-1/attention-decoder-rescoring-no-ngram/recogs-test-other-epoch-50_avg-20_use-averaged-model.txt
2024-09-21 18:06:27,999 INFO [ctc_decode.py:674] The transcripts are stored in zipformer/exp-large-ctc-aed-ctc-loss-scale-0.1-aed-loss-scale-0.9-cr-loss-scale-0.02-time-mask-ratio-2.5-scaled-masked-1/attention-decoder-rescoring-no-ngram/recogs-test-other-epoch-50_avg-20_use-averaged-model.txt | |
2024-09-21 18:06:28,034 INFO [ctc_decode.py:674] The transcripts are stored in zipformer/exp-large-ctc-aed-ctc-loss-scale-0.1-aed-loss-scale-0.9-cr-loss-scale-0.02-time-mask-ratio-2.5-scaled-masked-1/attention-decoder-rescoring-no-ngram/recogs-test-other-epoch-50_avg-20_use-averaged-model.txt | |
2024-09-21 18:06:28,113 INFO [ctc_decode.py:674] The transcripts are stored in zipformer/exp-large-ctc-aed-ctc-loss-scale-0.1-aed-loss-scale-0.9-cr-loss-scale-0.02-time-mask-ratio-2.5-scaled-masked-1/attention-decoder-rescoring-no-ngram/recogs-test-other-epoch-50_avg-20_use-averaged-model.txt | |
2024-09-21 18:06:28,232 INFO [utils.py:657] [test-other_attention_scale_0.01] %WER 4.93% [2579 / 52343, 256 ins, 268 del, 2055 sub ]
2024-09-21 18:06:28,478 INFO [ctc_decode.py:701] Wrote detailed error stats to zipformer/exp-large-ctc-aed-ctc-loss-scale-0.1-aed-loss-scale-0.9-cr-loss-scale-0.02-time-mask-ratio-2.5-scaled-masked-1/attention-decoder-rescoring-no-ngram/errs-test-other-epoch-50_avg-20_use-averaged-model.txt
2024-09-21 18:06:28,582 INFO [utils.py:657] [test-other_attention_scale_0.05] %WER 4.83% [2530 / 52343, 257 ins, 260 del, 2013 sub ]
2024-09-21 18:06:28,931 INFO [utils.py:657] [test-other_attention_scale_0.08] %WER 4.77% [2499 / 52343, 255 ins, 253 del, 1991 sub ]
2024-09-21 18:06:29,284 INFO [utils.py:657] [test-other_attention_scale_0.1] %WER 4.74% [2481 / 52343, 252 ins, 247 del, 1982 sub ]
2024-09-21 18:06:29,612 INFO [utils.py:657] [test-other_attention_scale_0.3] %WER 4.50% [2353 / 52343, 234 ins, 218 del, 1901 sub ]
2024-09-21 18:06:29,956 INFO [utils.py:657] [test-other_attention_scale_0.5] %WER 4.33% [2269 / 52343, 221 ins, 195 del, 1853 sub ]
2024-09-21 18:06:30,275 INFO [utils.py:657] [test-other_attention_scale_0.6] %WER 4.31% [2254 / 52343, 221 ins, 193 del, 1840 sub ]
2024-09-21 18:06:30,600 INFO [utils.py:657] [test-other_attention_scale_0.7] %WER 4.26% [2230 / 52343, 220 ins, 187 del, 1823 sub ]
2024-09-21 18:06:30,920 INFO [utils.py:657] [test-other_attention_scale_0.9] %WER 4.21% [2202 / 52343, 220 ins, 182 del, 1800 sub ]
2024-09-21 18:06:31,239 INFO [utils.py:657] [test-other_attention_scale_1.0] %WER 4.20% [2197 / 52343, 219 ins, 183 del, 1795 sub ]
2024-09-21 18:06:31,837 INFO [utils.py:657] [test-other_attention_scale_1.1] %WER 4.18% [2190 / 52343, 218 ins, 182 del, 1790 sub ]
2024-09-21 18:06:32,171 INFO [utils.py:657] [test-other_attention_scale_1.2] %WER 4.17% [2182 / 52343, 217 ins, 182 del, 1783 sub ]
2024-09-21 18:06:32,493 INFO [utils.py:657] [test-other_attention_scale_1.3] %WER 4.17% [2182 / 52343, 219 ins, 181 del, 1782 sub ]
2024-09-21 18:06:32,812 INFO [utils.py:657] [test-other_attention_scale_1.5] %WER 4.15% [2173 / 52343, 219 ins, 180 del, 1774 sub ]
2024-09-21 18:06:33,135 INFO [utils.py:657] [test-other_attention_scale_1.7] %WER 4.14% [2169 / 52343, 222 ins, 179 del, 1768 sub ]
2024-09-21 18:06:33,452 INFO [utils.py:657] [test-other_attention_scale_1.9] %WER 4.13% [2161 / 52343, 221 ins, 180 del, 1760 sub ]
2024-09-21 18:06:33,797 INFO [utils.py:657] [test-other_attention_scale_2.0] %WER 4.13% [2160 / 52343, 221 ins, 180 del, 1759 sub ]
2024-09-21 18:06:34,114 INFO [utils.py:657] [test-other_attention_scale_2.1] %WER 4.13% [2161 / 52343, 222 ins, 181 del, 1758 sub ]
2024-09-21 18:06:34,431 INFO [utils.py:657] [test-other_attention_scale_2.2] %WER 4.13% [2161 / 52343, 221 ins, 182 del, 1758 sub ]
2024-09-21 18:06:34,750 INFO [utils.py:657] [test-other_attention_scale_2.3] %WER 4.12% [2159 / 52343, 221 ins, 181 del, 1757 sub ]
2024-09-21 18:06:35,065 INFO [utils.py:657] [test-other_attention_scale_2.5] %WER 4.12% [2156 / 52343, 223 ins, 181 del, 1752 sub ]
2024-09-21 18:06:35,389 INFO [utils.py:657] [test-other_attention_scale_3.0] %WER 4.11% [2153 / 52343, 225 ins, 179 del, 1749 sub ]
2024-09-21 18:06:35,711 INFO [utils.py:657] [test-other_attention_scale_4.0] %WER 4.10% [2145 / 52343, 228 ins, 180 del, 1737 sub ]
2024-09-21 18:06:36,276 INFO [utils.py:657] [test-other_attention_scale_5.0] %WER 4.11% [2153 / 52343, 231 ins, 179 del, 1743 sub ]
2024-09-21 18:06:36,591 INFO [utils.py:657] [test-other_attention_scale_6.0] %WER 4.10% [2146 / 52343, 228 ins, 180 del, 1738 sub ]
2024-09-21 18:06:36,913 INFO [utils.py:657] [test-other_attention_scale_7.0] %WER 4.09% [2143 / 52343, 227 ins, 180 del, 1736 sub ]
2024-09-21 18:06:37,234 INFO [utils.py:657] [test-other_attention_scale_8.0] %WER 4.08% [2138 / 52343, 225 ins, 179 del, 1734 sub ]
2024-09-21 18:06:37,554 INFO [utils.py:657] [test-other_attention_scale_9.0] %WER 4.08% [2137 / 52343, 225 ins, 178 del, 1734 sub ]
2024-09-21 18:06:37,782 INFO [ctc_decode.py:717]
For test-other, WER of different settings are:
attention_scale_8.0 4.08 best for test-other
attention_scale_9.0 4.08
attention_scale_7.0 4.09
attention_scale_4.0 4.10
attention_scale_6.0 4.10
attention_scale_3.0 4.11
attention_scale_5.0 4.11
attention_scale_2.3 4.12
attention_scale_2.5 4.12
attention_scale_1.9 4.13
attention_scale_2.0 4.13
attention_scale_2.1 4.13
attention_scale_2.2 4.13
attention_scale_1.7 4.14
attention_scale_1.5 4.15
attention_scale_1.2 4.17
attention_scale_1.3 4.17
attention_scale_1.1 4.18
attention_scale_1.0 4.20
attention_scale_0.9 4.21
attention_scale_0.7 4.26
attention_scale_0.6 4.31
attention_scale_0.5 4.33
attention_scale_0.3 4.50
attention_scale_0.1 4.74
attention_scale_0.08 4.77
attention_scale_0.05 4.83
attention_scale_0.01 4.93
2024-09-21 18:06:37,782 INFO [ctc_decode.py:985] Done!
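Note: each %WER figure in this log follows the standard word-error-rate definition, (insertions + deletions + substitutions) / reference words. A minimal sketch (not icefall code) reproducing the best test-other figure from the counts logged above:

```python
def wer_percent(ins: int, dels: int, subs: int, ref_words: int) -> float:
    """Word error rate as a percentage: (ins + del + sub) / #reference words."""
    return 100.0 * (ins + dels + subs) / ref_words

# Best test-other setting above: attention_scale_8.0,
# logged as %WER 4.08% [2138 / 52343, 225 ins, 179 del, 1734 sub]
print(round(wer_percent(225, 179, 1734, 52343), 2))  # 4.08
```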