7767517
1163 1275
pnnx.Input               pnnx_input_0             0 1 0
pnnx.Input               pnnx_input_10            0 1 1
pnnx.Input               pnnx_input_11            0 1 2
pnnx.Input               pnnx_input_12            0 1 3
pnnx.Input               pnnx_input_13            0 1 4
pnnx.Input               pnnx_input_14            0 1 5
pnnx.Input               pnnx_input_15            0 1 6
pnnx.Input               pnnx_input_16            0 1 7
pnnx.Input               pnnx_input_17            0 1 8
pnnx.Input               pnnx_input_18            0 1 9
pnnx.Input               pnnx_input_19            0 1 10
pnnx.Input               pnnx_input_110           0 1 11
pnnx.Input               pnnx_input_111           0 1 12
pnnx.Input               pnnx_input_112           0 1 13
pnnx.Input               pnnx_input_113           0 1 14
pnnx.Input               pnnx_input_114           0 1 15
pnnx.Input               pnnx_input_115           0 1 16
pnnx.Input               pnnx_input_116           0 1 17
pnnx.Input               pnnx_input_117           0 1 18
pnnx.Input               pnnx_input_118           0 1 19
pnnx.Input               pnnx_input_119           0 1 20
pnnx.Input               pnnx_input_120           0 1 21
pnnx.Input               pnnx_input_121           0 1 22
pnnx.Input               pnnx_input_122           0 1 23
pnnx.Input               pnnx_input_123           0 1 24
pnnx.Input               pnnx_input_124           0 1 25
pnnx.Input               pnnx_input_125           0 1 26
pnnx.Input               pnnx_input_126           0 1 27
pnnx.Input               pnnx_input_127           0 1 28
pnnx.Input               pnnx_input_128           0 1 29
pnnx.Input               pnnx_input_129           0 1 30
pnnx.Input               pnnx_input_130           0 1 31
pnnx.Input               pnnx_input_131           0 1 32
pnnx.Input               pnnx_input_132           0 1 33
pnnx.Input               pnnx_input_133           0 1 34
pnnx.Input               pnnx_input_134           0 1 35
pnnx.Input               pnnx_input_135           0 1 36
pnnx.Input               pnnx_input_136           0 1 37
pnnx.Input               pnnx_input_137           0 1 38
pnnx.Input               pnnx_input_138           0 1 39
pnnx.Input               pnnx_input_139           0 1 40
pnnx.Input               pnnx_input_140           0 1 41
pnnx.Input               pnnx_input_141           0 1 42
pnnx.Input               pnnx_input_142           0 1 43
pnnx.Input               pnnx_input_143           0 1 44
pnnx.Input               pnnx_input_144           0 1 45
pnnx.Input               pnnx_input_145           0 1 46
pnnx.Input               pnnx_input_146           0 1 47
pnnx.Input               pnnx_input_147           0 1 48
pnnx.Input               pnnx_input_148           0 1 49
pnnx.Input               pnnx_input_149           0 1 50
pnnx.Input               pnnx_input_150           0 1 51
pnnx.Input               pnnx_input_151           0 1 52
pnnx.Input               pnnx_input_152           0 1 53
pnnx.Input               pnnx_input_153           0 1 54
pnnx.Input               pnnx_input_154           0 1 55
pnnx.Input               pnnx_input_155           0 1 56
pnnx.Input               pnnx_input_156           0 1 57
pnnx.Input               pnnx_input_157           0 1 58
pnnx.Input               pnnx_input_158           0 1 59
pnnx.Input               pnnx_input_159           0 1 60
pnnx.Input               pnnx_input_160           0 1 61
pnnx.Input               pnnx_input_161           0 1 62
pnnx.Input               pnnx_input_162           0 1 63
pnnx.Input               pnnx_input_163           0 1 64
torch.unsqueeze          torch.unsqueeze_892      1 1 0 65 dim=1 $input=0
nn.Conv2d                encoder_embed.conv.0     1 1 65 66 bias=True dilation=(1,1) groups=1 in_channels=1 kernel_size=(3,3) out_channels=8 padding=(1,1) padding_mode=zeros stride=(1,1) @bias=(8)f32 @weight=(8,1,3,3)f32
pnnx.Expression          pnnx_expr_3071           1 1 66 67 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_16             1 1 67 68 $input=67
pnnx.Expression          pnnx_expr_3070           2 1 66 68 69 expr=mul(@0,@1)
nn.Conv2d                encoder_embed.conv.3     1 1 69 70 bias=True dilation=(1,1) groups=1 in_channels=8 kernel_size=(3,3) out_channels=32 padding=(0,0) padding_mode=zeros stride=(2,2) @bias=(32)f32 @weight=(32,8,3,3)f32
pnnx.Expression          pnnx_expr_3067           1 1 70 71 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_17             1 1 71 72 $input=71
pnnx.Expression          pnnx_expr_3066           2 1 70 72 73 expr=mul(@0,@1)
nn.Conv2d                encoder_embed.conv.6     1 1 73 74 bias=True dilation=(1,1) groups=1 in_channels=32 kernel_size=(3,3) out_channels=128 padding=(0,0) padding_mode=zeros stride=(2,2) @bias=(128)f32 @weight=(128,32,3,3)f32
pnnx.Expression          pnnx_expr_3063           1 1 74 75 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_18             1 1 75 76 $input=75
pnnx.Expression          pnnx_expr_3062           2 1 74 76 77 expr=mul(@0,@1)
torch.permute            torch.permute_761        1 1 77 78 dims=(0,2,1,3) $input=77
Tensor.reshape           Tensor.reshape_83        1 1 78 79 shape=(1,-1,2432) $input=78
nn.Linear                encoder_embed.out        1 1 79 80 bias=True in_features=2432 out_features=144 @bias=(144)f32 @weight=(144,2432)f32
pnnx.Expression          pnnx_expr_3053           1 1 80 81 expr=mul(@0,@0)
torch.mean               torch.mean_728           1 1 81 82 dim=(-1) keepdim=True $input=81
pnnx.Expression          pnnx_expr_3049           2 1 80 82 83 expr=mul(@0,pow(add(@1,2.500004e-01),-5.000000e-01))
Tensor.slice             slice_0                  1 1 83 84 dims=(1) ends=(-1) starts=(1) steps=(1) $input=83
torch.permute            torch.permute_762        1 1 84 85 dims=(1,0,2) $input=84
torch.tensor_split       slice_2                  1 2 85 86 87 dim=0 indices=(8)
torch.cat                torch.cat_568            2 1 87 86 88 dim=0
nn.Linear                encoder.emformer_layers.0.feed_forward_macaron.0 1 1 88 89 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_3027           1 1 89 90 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_19             1 1 90 91 $input=90
pnnx.Expression          pnnx_expr_3026           2 1 89 91 92 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.0.feed_forward_macaron.4 1 1 92 93 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_3024           2 1 88 93 94 expr=add(@0,@1)
torch.tensor_split       slice_4                  1 2 94 95 96 dim=0 indices=(2)
torch.cat                torch.cat_569            2 1 95 96 97 dim=0
nn.Linear                encoder.emformer_layers.0.attention.emb_to_query 1 1 97 98 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.cat                torch.cat_570            3 1 1 95 96 99 dim=0
nn.Linear                encoder.emformer_layers.0.attention.emb_to_key_value 1 1 99 100 bias=True in_features=144 out_features=288 @bias=(288)f32 @weight=(288,144)f32
torch.chunk              torch.chunk_712          1 2 100 101 102 chunks=2 dim=2 $input=100
Tensor.view              Tensor.view_488          1 1 98 103 shape=(10,4,36) $input=98
torch.tensor_split       slice_6                  1 2 101 104 105 dim=0 indices=(34)
torch.cat                torch.cat_571            3 1 104 2 105 106 dim=0
Tensor.view              Tensor.view_489          1 1 106 107 shape=(50,4,36) $input=106
torch.tensor_split       slice_8                  1 2 102 108 109 dim=0 indices=(34)
torch.cat                torch.cat_572            3 1 108 3 109 110 dim=0
Tensor.view              Tensor.view_490          1 1 110 111 shape=(50,4,36) $input=110
torch.permute            torch.permute_763        1 1 103 112 dims=(1,0,2) $input=103
pnnx.Expression          pnnx_expr_2972           1 1 112 113 expr=mul(@0,1.666667e-01)
torch.permute            torch.permute_764        1 1 107 114 dims=(1,0,2) $input=107
torch.permute            torch.permute_766        1 1 114 115 dims=(0,2,1) $input=114
torch.bmm                torch.bmm_536            2 1 113 115 116 $input=113 $mat2=115
F.softmax                F.softmax_67             1 1 116 117 dim=-1 $input=116
torch.permute            torch.permute_765        1 1 111 118 dims=(1,0,2) $input=111
torch.bmm                torch.bmm_537            2 1 117 118 119 $input=117 $mat2=118
torch.permute            torch.permute_767        1 1 119 120 dims=(1,0,2) $input=119
Tensor.reshape           Tensor.reshape_84        1 1 120 121 shape=(-1,144) $input=120
torch.unsqueeze          torch.unsqueeze_893      1 1 121 122 dim=1 $input=121
nn.Linear                encoder.emformer_layers.0.attention.out_proj 1 1 122 123 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.mean               torch.mean_729           1 1 96 124 dim=(0) keepdim=True $input=96
pnnx.Expression          pnnx_expr_2938           2 1 94 123 125 expr=add(@0,@1)
torch.tensor_split       slice_9                  1 2 125 126 127 dim=0 indices=(2)
torch.cat                torch.cat_574            2 1 127 126 128 dim=0
torch.permute            torch.permute_768        1 1 128 129 dims=(1,2,0) $input=128
nn.Conv1d                encoder.emformer_layers.0.conv_module.pointwise_conv1 1 1 129 130 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=288 padding=(0) padding_mode=zeros stride=(1) @bias=(288)f32 @weight=(288,144,1)f32
F.glu                    F.glu_0                  1 1 130 131 dim=1 $input=130
torch.cat                torch.cat_575            2 1 4 131 132 dim=2
nn.Conv1d                encoder.emformer_layers.0.conv_module.depthwise_conv 1 1 132 133 bias=True dilation=(1) groups=144 in_channels=144 kernel_size=(31) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,1,31)f32
pnnx.Expression          pnnx_expr_2910           1 1 133 134 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_20             1 1 134 135 $input=134
pnnx.Expression          pnnx_expr_2909           2 1 133 135 136 expr=mul(@0,@1)
nn.Conv1d                encoder.emformer_layers.0.conv_module.pointwise_conv2 1 1 136 137 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,144,1)f32
torch.tensor_split       slice_11                 1 2 137 138 139 dim=2 indices=(8)
torch.permute            torch.permute_770        1 1 139 140 dims=(2,0,1) $input=139
torch.permute            torch.permute_769        1 1 138 141 dims=(2,0,1) $input=138
torch.cat                torch.cat_576            2 1 140 141 142 dim=0
pnnx.Expression          pnnx_expr_2874           2 1 125 142 143 expr=add(@0,@1)
nn.Linear                encoder.emformer_layers.0.feed_forward.0 1 1 143 144 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_2871           1 1 144 145 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_21             1 1 145 146 $input=145
pnnx.Expression          pnnx_expr_2870           2 1 144 146 147 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.0.feed_forward.4 1 1 147 148 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_2868           2 1 143 148 149 expr=add(@0,@1)
pnnx.Expression          pnnx_expr_2863           1 1 149 150 expr=mul(@0,@0)
torch.mean               torch.mean_730           1 1 150 151 dim=(-1) keepdim=True $input=150
pnnx.Expression          pnnx_expr_2859           2 1 149 151 152 expr=mul(@0,pow(add(@1,9.635315e-01),-5.000000e-01))
torch.tensor_split       slice_13                 1 2 152 153 154 dim=0 indices=(2)
torch.cat                torch.cat_577            2 1 153 154 155 dim=0
nn.Linear                encoder.emformer_layers.1.feed_forward_macaron.0 1 1 155 156 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_2847           1 1 156 157 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_22             1 1 157 158 $input=157
pnnx.Expression          pnnx_expr_2846           2 1 156 158 159 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.1.feed_forward_macaron.4 1 1 159 160 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_2844           2 1 155 160 161 expr=add(@0,@1)
torch.tensor_split       slice_16                 1 2 161 162 163 dim=0 indices=(2)
torch.cat                torch.cat_578            2 1 162 163 164 dim=0
nn.Linear                encoder.emformer_layers.1.attention.emb_to_query 1 1 164 165 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.cat                torch.cat_579            3 1 5 162 163 166 dim=0
nn.Linear                encoder.emformer_layers.1.attention.emb_to_key_value 1 1 166 167 bias=True in_features=144 out_features=288 @bias=(288)f32 @weight=(288,144)f32
torch.chunk              torch.chunk_713          1 2 167 168 169 chunks=2 dim=2 $input=167
Tensor.view              Tensor.view_491          1 1 165 170 shape=(10,4,36) $input=165
torch.tensor_split       slice_18                 1 2 168 171 172 dim=0 indices=(34)
torch.cat                torch.cat_580            3 1 171 6 172 173 dim=0
Tensor.view              Tensor.view_492          1 1 173 174 shape=(50,4,36) $input=173
torch.tensor_split       slice_20                 1 2 169 175 176 dim=0 indices=(34)
torch.cat                torch.cat_581            3 1 175 7 176 177 dim=0
Tensor.view              Tensor.view_493          1 1 177 178 shape=(50,4,36) $input=177
torch.permute            torch.permute_771        1 1 170 179 dims=(1,0,2) $input=170
pnnx.Expression          pnnx_expr_2786           1 1 179 180 expr=mul(@0,1.666667e-01)
torch.permute            torch.permute_772        1 1 174 181 dims=(1,0,2) $input=174
torch.permute            torch.permute_774        1 1 181 182 dims=(0,2,1) $input=181
torch.bmm                torch.bmm_538            2 1 180 182 183 $input=180 $mat2=182
F.softmax                F.softmax_68             1 1 183 184 dim=-1 $input=183
torch.permute            torch.permute_773        1 1 178 185 dims=(1,0,2) $input=178
torch.bmm                torch.bmm_539            2 1 184 185 186 $input=184 $mat2=185
torch.permute            torch.permute_775        1 1 186 187 dims=(1,0,2) $input=186
Tensor.reshape           Tensor.reshape_85        1 1 187 188 shape=(-1,144) $input=187
torch.unsqueeze          torch.unsqueeze_894      1 1 188 189 dim=1 $input=188
nn.Linear                encoder.emformer_layers.1.attention.out_proj 1 1 189 190 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.mean               torch.mean_731           1 1 163 191 dim=(0) keepdim=True $input=163
pnnx.Expression          pnnx_expr_2750           2 1 161 190 192 expr=add(@0,@1)
torch.tensor_split       slice_21                 1 2 192 193 194 dim=0 indices=(2)
torch.cat                torch.cat_583            2 1 194 193 195 dim=0
torch.permute            torch.permute_776        1 1 195 196 dims=(1,2,0) $input=195
nn.Conv1d                encoder.emformer_layers.1.conv_module.pointwise_conv1 1 1 196 197 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=288 padding=(0) padding_mode=zeros stride=(1) @bias=(288)f32 @weight=(288,144,1)f32
F.glu                    F.glu_1                  1 1 197 198 dim=1 $input=197
torch.cat                torch.cat_584            2 1 8 198 199 dim=2
nn.Conv1d                encoder.emformer_layers.1.conv_module.depthwise_conv 1 1 199 200 bias=True dilation=(1) groups=144 in_channels=144 kernel_size=(31) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,1,31)f32
pnnx.Expression          pnnx_expr_2720           1 1 200 201 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_23             1 1 201 202 $input=201
pnnx.Expression          pnnx_expr_2719           2 1 200 202 203 expr=mul(@0,@1)
nn.Conv1d                encoder.emformer_layers.1.conv_module.pointwise_conv2 1 1 203 204 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,144,1)f32
torch.tensor_split       slice_23                 1 2 204 205 206 dim=2 indices=(8)
torch.permute            torch.permute_778        1 1 206 207 dims=(2,0,1) $input=206
torch.permute            torch.permute_777        1 1 205 208 dims=(2,0,1) $input=205
torch.cat                torch.cat_585            2 1 207 208 209 dim=0
pnnx.Expression          pnnx_expr_2684           2 1 192 209 210 expr=add(@0,@1)
nn.Linear                encoder.emformer_layers.1.feed_forward.0 1 1 210 211 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_2681           1 1 211 212 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_24             1 1 212 213 $input=212
pnnx.Expression          pnnx_expr_2680           2 1 211 213 214 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.1.feed_forward.4 1 1 214 215 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_2678           2 1 210 215 216 expr=add(@0,@1)
pnnx.Expression          pnnx_expr_2673           1 1 216 217 expr=mul(@0,@0)
torch.mean               torch.mean_732           1 1 217 218 dim=(-1) keepdim=True $input=217
pnnx.Expression          pnnx_expr_2669           2 1 216 218 219 expr=mul(@0,pow(add(@1,1.013315e+00),-5.000000e-01))
torch.tensor_split       slice_25                 1 2 219 220 221 dim=0 indices=(2)
torch.cat                torch.cat_586            2 1 220 221 222 dim=0
nn.Linear                encoder.emformer_layers.2.feed_forward_macaron.0 1 1 222 223 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_2657           1 1 223 224 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_25             1 1 224 225 $input=224
pnnx.Expression          pnnx_expr_2656           2 1 223 225 226 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.2.feed_forward_macaron.4 1 1 226 227 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_2654           2 1 222 227 228 expr=add(@0,@1)
torch.tensor_split       slice_28                 1 2 228 229 230 dim=0 indices=(2)
torch.cat                torch.cat_587            2 1 229 230 231 dim=0
nn.Linear                encoder.emformer_layers.2.attention.emb_to_query 1 1 231 232 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.cat                torch.cat_588            3 1 9 229 230 233 dim=0
nn.Linear                encoder.emformer_layers.2.attention.emb_to_key_value 1 1 233 234 bias=True in_features=144 out_features=288 @bias=(288)f32 @weight=(288,144)f32
torch.chunk              torch.chunk_714          1 2 234 235 236 chunks=2 dim=2 $input=234
Tensor.view              Tensor.view_494          1 1 232 237 shape=(10,4,36) $input=232
torch.tensor_split       slice_30                 1 2 235 238 239 dim=0 indices=(34)
torch.cat                torch.cat_589            3 1 238 10 239 240 dim=0
Tensor.view              Tensor.view_495          1 1 240 241 shape=(50,4,36) $input=240
torch.tensor_split       slice_32                 1 2 236 242 243 dim=0 indices=(34)
torch.cat                torch.cat_590            3 1 242 11 243 244 dim=0
Tensor.view              Tensor.view_496          1 1 244 245 shape=(50,4,36) $input=244
torch.permute            torch.permute_779        1 1 237 246 dims=(1,0,2) $input=237
pnnx.Expression          pnnx_expr_2596           1 1 246 247 expr=mul(@0,1.666667e-01)
torch.permute            torch.permute_780        1 1 241 248 dims=(1,0,2) $input=241
torch.permute            torch.permute_782        1 1 248 249 dims=(0,2,1) $input=248
torch.bmm                torch.bmm_540            2 1 247 249 250 $input=247 $mat2=249
F.softmax                F.softmax_69             1 1 250 251 dim=-1 $input=250
torch.permute            torch.permute_781        1 1 245 252 dims=(1,0,2) $input=245
torch.bmm                torch.bmm_541            2 1 251 252 253 $input=251 $mat2=252
torch.permute            torch.permute_783        1 1 253 254 dims=(1,0,2) $input=253
Tensor.reshape           Tensor.reshape_86        1 1 254 255 shape=(-1,144) $input=254
torch.unsqueeze          torch.unsqueeze_895      1 1 255 256 dim=1 $input=255
nn.Linear                encoder.emformer_layers.2.attention.out_proj 1 1 256 257 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.mean               torch.mean_733           1 1 230 258 dim=(0) keepdim=True $input=230
pnnx.Expression          pnnx_expr_2560           2 1 228 257 259 expr=add(@0,@1)
torch.tensor_split       slice_33                 1 2 259 260 261 dim=0 indices=(2)
torch.cat                torch.cat_592            2 1 261 260 262 dim=0
torch.permute            torch.permute_784        1 1 262 263 dims=(1,2,0) $input=262
nn.Conv1d                encoder.emformer_layers.2.conv_module.pointwise_conv1 1 1 263 264 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=288 padding=(0) padding_mode=zeros stride=(1) @bias=(288)f32 @weight=(288,144,1)f32
F.glu                    F.glu_2                  1 1 264 265 dim=1 $input=264
torch.cat                torch.cat_593            2 1 12 265 266 dim=2
nn.Conv1d                encoder.emformer_layers.2.conv_module.depthwise_conv 1 1 266 267 bias=True dilation=(1) groups=144 in_channels=144 kernel_size=(31) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,1,31)f32
pnnx.Expression          pnnx_expr_2530           1 1 267 268 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_26             1 1 268 269 $input=268
pnnx.Expression          pnnx_expr_2529           2 1 267 269 270 expr=mul(@0,@1)
nn.Conv1d                encoder.emformer_layers.2.conv_module.pointwise_conv2 1 1 270 271 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,144,1)f32
torch.tensor_split       slice_35                 1 2 271 272 273 dim=2 indices=(8)
torch.permute            torch.permute_786        1 1 273 274 dims=(2,0,1) $input=273
torch.permute            torch.permute_785        1 1 272 275 dims=(2,0,1) $input=272
torch.cat                torch.cat_594            2 1 274 275 276 dim=0
pnnx.Expression          pnnx_expr_2494           2 1 259 276 277 expr=add(@0,@1)
nn.Linear                encoder.emformer_layers.2.feed_forward.0 1 1 277 278 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_2491           1 1 278 279 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_27             1 1 279 280 $input=279
pnnx.Expression          pnnx_expr_2490           2 1 278 280 281 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.2.feed_forward.4 1 1 281 282 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_2488           2 1 277 282 283 expr=add(@0,@1)
pnnx.Expression          pnnx_expr_2483           1 1 283 284 expr=mul(@0,@0)
torch.mean               torch.mean_734           1 1 284 285 dim=(-1) keepdim=True $input=284
pnnx.Expression          pnnx_expr_2479           2 1 283 285 286 expr=mul(@0,pow(add(@1,1.069920e+00),-5.000000e-01))
torch.tensor_split       slice_37                 1 2 286 287 288 dim=0 indices=(2)
torch.cat                torch.cat_595            2 1 287 288 289 dim=0
nn.Linear                encoder.emformer_layers.3.feed_forward_macaron.0 1 1 289 290 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_2467           1 1 290 291 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_28             1 1 291 292 $input=291
pnnx.Expression          pnnx_expr_2466           2 1 290 292 293 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.3.feed_forward_macaron.4 1 1 293 294 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_2464           2 1 289 294 295 expr=add(@0,@1)
torch.tensor_split       slice_40                 1 2 295 296 297 dim=0 indices=(2)
torch.cat                torch.cat_596            2 1 296 297 298 dim=0
nn.Linear                encoder.emformer_layers.3.attention.emb_to_query 1 1 298 299 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.cat                torch.cat_597            3 1 13 296 297 300 dim=0
nn.Linear                encoder.emformer_layers.3.attention.emb_to_key_value 1 1 300 301 bias=True in_features=144 out_features=288 @bias=(288)f32 @weight=(288,144)f32
torch.chunk              torch.chunk_715          1 2 301 302 303 chunks=2 dim=2 $input=301
Tensor.view              Tensor.view_497          1 1 299 304 shape=(10,4,36) $input=299
torch.tensor_split       slice_42                 1 2 302 305 306 dim=0 indices=(34)
torch.cat                torch.cat_598            3 1 305 14 306 307 dim=0
Tensor.view              Tensor.view_498          1 1 307 308 shape=(50,4,36) $input=307
torch.tensor_split       slice_44                 1 2 303 309 310 dim=0 indices=(34)
torch.cat                torch.cat_599            3 1 309 15 310 311 dim=0
Tensor.view              Tensor.view_499          1 1 311 312 shape=(50,4,36) $input=311
torch.permute            torch.permute_787        1 1 304 313 dims=(1,0,2) $input=304
pnnx.Expression          pnnx_expr_2406           1 1 313 314 expr=mul(@0,1.666667e-01)
torch.permute            torch.permute_788        1 1 308 315 dims=(1,0,2) $input=308
torch.permute            torch.permute_790        1 1 315 316 dims=(0,2,1) $input=315
torch.bmm                torch.bmm_542            2 1 314 316 317 $input=314 $mat2=316
F.softmax                F.softmax_70             1 1 317 318 dim=-1 $input=317
torch.permute            torch.permute_789        1 1 312 319 dims=(1,0,2) $input=312
torch.bmm                torch.bmm_543            2 1 318 319 320 $input=318 $mat2=319
torch.permute            torch.permute_791        1 1 320 321 dims=(1,0,2) $input=320
Tensor.reshape           Tensor.reshape_87        1 1 321 322 shape=(-1,144) $input=321
torch.unsqueeze          torch.unsqueeze_896      1 1 322 323 dim=1 $input=322
nn.Linear                encoder.emformer_layers.3.attention.out_proj 1 1 323 324 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.mean               torch.mean_735           1 1 297 325 dim=(0) keepdim=True $input=297
pnnx.Expression          pnnx_expr_2370           2 1 295 324 326 expr=add(@0,@1)
torch.tensor_split       slice_45                 1 2 326 327 328 dim=0 indices=(2)
torch.cat                torch.cat_601            2 1 328 327 329 dim=0
torch.permute            torch.permute_792        1 1 329 330 dims=(1,2,0) $input=329
nn.Conv1d                encoder.emformer_layers.3.conv_module.pointwise_conv1 1 1 330 331 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=288 padding=(0) padding_mode=zeros stride=(1) @bias=(288)f32 @weight=(288,144,1)f32
F.glu                    F.glu_3                  1 1 331 332 dim=1 $input=331
torch.cat                torch.cat_602            2 1 16 332 333 dim=2
nn.Conv1d                encoder.emformer_layers.3.conv_module.depthwise_conv 1 1 333 334 bias=True dilation=(1) groups=144 in_channels=144 kernel_size=(31) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,1,31)f32
pnnx.Expression          pnnx_expr_2340           1 1 334 335 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_29             1 1 335 336 $input=335
pnnx.Expression          pnnx_expr_2339           2 1 334 336 337 expr=mul(@0,@1)
nn.Conv1d                encoder.emformer_layers.3.conv_module.pointwise_conv2 1 1 337 338 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,144,1)f32
torch.tensor_split       slice_47                 1 2 338 339 340 dim=2 indices=(8)
torch.permute            torch.permute_794        1 1 340 341 dims=(2,0,1) $input=340
torch.permute            torch.permute_793        1 1 339 342 dims=(2,0,1) $input=339
torch.cat                torch.cat_603            2 1 341 342 343 dim=0
pnnx.Expression          pnnx_expr_2304           2 1 326 343 344 expr=add(@0,@1)
nn.Linear                encoder.emformer_layers.3.feed_forward.0 1 1 344 345 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_2301           1 1 345 346 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_30             1 1 346 347 $input=346
pnnx.Expression          pnnx_expr_2300           2 1 345 347 348 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.3.feed_forward.4 1 1 348 349 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_2298           2 1 344 349 350 expr=add(@0,@1)
pnnx.Expression          pnnx_expr_2293           1 1 350 351 expr=mul(@0,@0)
torch.mean               torch.mean_736           1 1 351 352 dim=(-1) keepdim=True $input=351
pnnx.Expression          pnnx_expr_2289           2 1 350 352 353 expr=mul(@0,pow(add(@1,1.152986e+00),-5.000000e-01))
torch.tensor_split       slice_49                 1 2 353 354 355 dim=0 indices=(2)
torch.cat                torch.cat_604            2 1 354 355 356 dim=0
nn.Linear                encoder.emformer_layers.4.feed_forward_macaron.0 1 1 356 357 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_2277           1 1 357 358 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_31             1 1 358 359 $input=358
pnnx.Expression          pnnx_expr_2276           2 1 357 359 360 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.4.feed_forward_macaron.4 1 1 360 361 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_2274           2 1 356 361 362 expr=add(@0,@1)
torch.tensor_split       slice_52                 1 2 362 363 364 dim=0 indices=(2)
torch.cat                torch.cat_605            2 1 363 364 365 dim=0
nn.Linear                encoder.emformer_layers.4.attention.emb_to_query 1 1 365 366 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.cat                torch.cat_606            3 1 17 363 364 367 dim=0
nn.Linear                encoder.emformer_layers.4.attention.emb_to_key_value 1 1 367 368 bias=True in_features=144 out_features=288 @bias=(288)f32 @weight=(288,144)f32
torch.chunk              torch.chunk_716          1 2 368 369 370 chunks=2 dim=2 $input=368
Tensor.view              Tensor.view_500          1 1 366 371 shape=(10,4,36) $input=366
torch.tensor_split       slice_54                 1 2 369 372 373 dim=0 indices=(34)
torch.cat                torch.cat_607            3 1 372 18 373 374 dim=0
Tensor.view              Tensor.view_501          1 1 374 375 shape=(50,4,36) $input=374
torch.tensor_split       slice_56                 1 2 370 376 377 dim=0 indices=(34)
torch.cat                torch.cat_608            3 1 376 19 377 378 dim=0
Tensor.view              Tensor.view_502          1 1 378 379 shape=(50,4,36) $input=378
torch.permute            torch.permute_795        1 1 371 380 dims=(1,0,2) $input=371
pnnx.Expression          pnnx_expr_2216           1 1 380 381 expr=mul(@0,1.666667e-01)
torch.permute            torch.permute_796        1 1 375 382 dims=(1,0,2) $input=375
torch.permute            torch.permute_798        1 1 382 383 dims=(0,2,1) $input=382
torch.bmm                torch.bmm_544            2 1 381 383 384 $input=381 $mat2=383
F.softmax                F.softmax_71             1 1 384 385 dim=-1 $input=384
torch.permute            torch.permute_797        1 1 379 386 dims=(1,0,2) $input=379
torch.bmm                torch.bmm_545            2 1 385 386 387 $input=385 $mat2=386
torch.permute            torch.permute_799        1 1 387 388 dims=(1,0,2) $input=387
Tensor.reshape           Tensor.reshape_88        1 1 388 389 shape=(-1,144) $input=388
torch.unsqueeze          torch.unsqueeze_897      1 1 389 390 dim=1 $input=389
nn.Linear                encoder.emformer_layers.4.attention.out_proj 1 1 390 391 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.mean               torch.mean_737           1 1 364 392 dim=(0) keepdim=True $input=364
pnnx.Expression          pnnx_expr_2180           2 1 362 391 393 expr=add(@0,@1)
torch.tensor_split       slice_57                 1 2 393 394 395 dim=0 indices=(2)
torch.cat                torch.cat_610            2 1 395 394 396 dim=0
torch.permute            torch.permute_800        1 1 396 397 dims=(1,2,0) $input=396
nn.Conv1d                encoder.emformer_layers.4.conv_module.pointwise_conv1 1 1 397 398 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=288 padding=(0) padding_mode=zeros stride=(1) @bias=(288)f32 @weight=(288,144,1)f32
F.glu                    F.glu_4                  1 1 398 399 dim=1 $input=398
torch.cat                torch.cat_611            2 1 20 399 400 dim=2
nn.Conv1d                encoder.emformer_layers.4.conv_module.depthwise_conv 1 1 400 401 bias=True dilation=(1) groups=144 in_channels=144 kernel_size=(31) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,1,31)f32
pnnx.Expression          pnnx_expr_2150           1 1 401 402 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_32             1 1 402 403 $input=402
pnnx.Expression          pnnx_expr_2149           2 1 401 403 404 expr=mul(@0,@1)
nn.Conv1d                encoder.emformer_layers.4.conv_module.pointwise_conv2 1 1 404 405 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,144,1)f32
torch.tensor_split       slice_59                 1 2 405 406 407 dim=2 indices=(8)
torch.permute            torch.permute_802        1 1 407 408 dims=(2,0,1) $input=407
torch.permute            torch.permute_801        1 1 406 409 dims=(2,0,1) $input=406
torch.cat                torch.cat_612            2 1 408 409 410 dim=0
pnnx.Expression          pnnx_expr_2114           2 1 393 410 411 expr=add(@0,@1)
nn.Linear                encoder.emformer_layers.4.feed_forward.0 1 1 411 412 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_2111           1 1 412 413 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_33             1 1 413 414 $input=413
pnnx.Expression          pnnx_expr_2110           2 1 412 414 415 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.4.feed_forward.4 1 1 415 416 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_2108           2 1 411 416 417 expr=add(@0,@1)
pnnx.Expression          pnnx_expr_2103           1 1 417 418 expr=mul(@0,@0)
torch.mean               torch.mean_738           1 1 418 419 dim=(-1) keepdim=True $input=418
pnnx.Expression          pnnx_expr_2099           2 1 417 419 420 expr=mul(@0,pow(add(@1,1.284874e+00),-5.000000e-01))
torch.tensor_split       slice_61                 1 2 420 421 422 dim=0 indices=(2)
torch.cat                torch.cat_613            2 1 421 422 423 dim=0
nn.Linear                encoder.emformer_layers.5.feed_forward_macaron.0 1 1 423 424 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_2087           1 1 424 425 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_34             1 1 425 426 $input=425
pnnx.Expression          pnnx_expr_2086           2 1 424 426 427 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.5.feed_forward_macaron.4 1 1 427 428 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_2084           2 1 423 428 429 expr=add(@0,@1)
torch.tensor_split       slice_64                 1 2 429 430 431 dim=0 indices=(2)
torch.cat                torch.cat_614            2 1 430 431 432 dim=0
nn.Linear                encoder.emformer_layers.5.attention.emb_to_query 1 1 432 433 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.cat                torch.cat_615            3 1 21 430 431 434 dim=0
nn.Linear                encoder.emformer_layers.5.attention.emb_to_key_value 1 1 434 435 bias=True in_features=144 out_features=288 @bias=(288)f32 @weight=(288,144)f32
torch.chunk              torch.chunk_717          1 2 435 436 437 chunks=2 dim=2 $input=435
Tensor.view              Tensor.view_503          1 1 433 438 shape=(10,4,36) $input=433
torch.tensor_split       slice_66                 1 2 436 439 440 dim=0 indices=(34)
torch.cat                torch.cat_616            3 1 439 22 440 441 dim=0
Tensor.view              Tensor.view_504          1 1 441 442 shape=(50,4,36) $input=441
torch.tensor_split       slice_68                 1 2 437 443 444 dim=0 indices=(34)
torch.cat                torch.cat_617            3 1 443 23 444 445 dim=0
Tensor.view              Tensor.view_505          1 1 445 446 shape=(50,4,36) $input=445
torch.permute            torch.permute_803        1 1 438 447 dims=(1,0,2) $input=438
pnnx.Expression          pnnx_expr_2026           1 1 447 448 expr=mul(@0,1.666667e-01)
torch.permute            torch.permute_804        1 1 442 449 dims=(1,0,2) $input=442
torch.permute            torch.permute_806        1 1 449 450 dims=(0,2,1) $input=449
torch.bmm                torch.bmm_546            2 1 448 450 451 $input=448 $mat2=450
F.softmax                F.softmax_72             1 1 451 452 dim=-1 $input=451
torch.permute            torch.permute_805        1 1 446 453 dims=(1,0,2) $input=446
torch.bmm                torch.bmm_547            2 1 452 453 454 $input=452 $mat2=453
torch.permute            torch.permute_807        1 1 454 455 dims=(1,0,2) $input=454
Tensor.reshape           Tensor.reshape_89        1 1 455 456 shape=(-1,144) $input=455
torch.unsqueeze          torch.unsqueeze_898      1 1 456 457 dim=1 $input=456
nn.Linear                encoder.emformer_layers.5.attention.out_proj 1 1 457 458 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.mean               torch.mean_739           1 1 431 459 dim=(0) keepdim=True $input=431
pnnx.Expression          pnnx_expr_1990           2 1 429 458 460 expr=add(@0,@1)
torch.tensor_split       slice_69                 1 2 460 461 462 dim=0 indices=(2)
torch.cat                torch.cat_619            2 1 462 461 463 dim=0
torch.permute            torch.permute_808        1 1 463 464 dims=(1,2,0) $input=463
nn.Conv1d                encoder.emformer_layers.5.conv_module.pointwise_conv1 1 1 464 465 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=288 padding=(0) padding_mode=zeros stride=(1) @bias=(288)f32 @weight=(288,144,1)f32
F.glu                    F.glu_5                  1 1 465 466 dim=1 $input=465
torch.cat                torch.cat_620            2 1 24 466 467 dim=2
nn.Conv1d                encoder.emformer_layers.5.conv_module.depthwise_conv 1 1 467 468 bias=True dilation=(1) groups=144 in_channels=144 kernel_size=(31) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,1,31)f32
pnnx.Expression          pnnx_expr_1960           1 1 468 469 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_35             1 1 469 470 $input=469
pnnx.Expression          pnnx_expr_1959           2 1 468 470 471 expr=mul(@0,@1)
nn.Conv1d                encoder.emformer_layers.5.conv_module.pointwise_conv2 1 1 471 472 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,144,1)f32
torch.tensor_split       slice_71                 1 2 472 473 474 dim=2 indices=(8)
torch.permute            torch.permute_810        1 1 474 475 dims=(2,0,1) $input=474
torch.permute            torch.permute_809        1 1 473 476 dims=(2,0,1) $input=473
torch.cat                torch.cat_621            2 1 475 476 477 dim=0
pnnx.Expression          pnnx_expr_1924           2 1 460 477 478 expr=add(@0,@1)
nn.Linear                encoder.emformer_layers.5.feed_forward.0 1 1 478 479 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_1921           1 1 479 480 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_36             1 1 480 481 $input=480
pnnx.Expression          pnnx_expr_1920           2 1 479 481 482 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.5.feed_forward.4 1 1 482 483 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_1918           2 1 478 483 484 expr=add(@0,@1)
pnnx.Expression          pnnx_expr_1913           1 1 484 485 expr=mul(@0,@0)
torch.mean               torch.mean_740           1 1 485 486 dim=(-1) keepdim=True $input=485
pnnx.Expression          pnnx_expr_1909           2 1 484 486 487 expr=mul(@0,pow(add(@1,1.493289e+00),-5.000000e-01))
torch.tensor_split       slice_73                 1 2 487 488 489 dim=0 indices=(2)
torch.cat                torch.cat_622            2 1 488 489 490 dim=0
nn.Linear                encoder.emformer_layers.6.feed_forward_macaron.0 1 1 490 491 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_1897           1 1 491 492 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_37             1 1 492 493 $input=492
pnnx.Expression          pnnx_expr_1896           2 1 491 493 494 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.6.feed_forward_macaron.4 1 1 494 495 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_1894           2 1 490 495 496 expr=add(@0,@1)
torch.tensor_split       slice_76                 1 2 496 497 498 dim=0 indices=(2)
torch.cat                torch.cat_623            2 1 497 498 499 dim=0
nn.Linear                encoder.emformer_layers.6.attention.emb_to_query 1 1 499 500 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.cat                torch.cat_624            3 1 25 497 498 501 dim=0
nn.Linear                encoder.emformer_layers.6.attention.emb_to_key_value 1 1 501 502 bias=True in_features=144 out_features=288 @bias=(288)f32 @weight=(288,144)f32
torch.chunk              torch.chunk_718          1 2 502 503 504 chunks=2 dim=2 $input=502
Tensor.view              Tensor.view_506          1 1 500 505 shape=(10,4,36) $input=500
torch.tensor_split       slice_78                 1 2 503 506 507 dim=0 indices=(34)
torch.cat                torch.cat_625            3 1 506 26 507 508 dim=0
Tensor.view              Tensor.view_507          1 1 508 509 shape=(50,4,36) $input=508
torch.tensor_split       slice_80                 1 2 504 510 511 dim=0 indices=(34)
torch.cat                torch.cat_626            3 1 510 27 511 512 dim=0
Tensor.view              Tensor.view_508          1 1 512 513 shape=(50,4,36) $input=512
torch.permute            torch.permute_811        1 1 505 514 dims=(1,0,2) $input=505
pnnx.Expression          pnnx_expr_1836           1 1 514 515 expr=mul(@0,1.666667e-01)
torch.permute            torch.permute_812        1 1 509 516 dims=(1,0,2) $input=509
torch.permute            torch.permute_814        1 1 516 517 dims=(0,2,1) $input=516
torch.bmm                torch.bmm_548            2 1 515 517 518 $input=515 $mat2=517
F.softmax                F.softmax_73             1 1 518 519 dim=-1 $input=518
torch.permute            torch.permute_813        1 1 513 520 dims=(1,0,2) $input=513
torch.bmm                torch.bmm_549            2 1 519 520 521 $input=519 $mat2=520
torch.permute            torch.permute_815        1 1 521 522 dims=(1,0,2) $input=521
Tensor.reshape           Tensor.reshape_90        1 1 522 523 shape=(-1,144) $input=522
torch.unsqueeze          torch.unsqueeze_899      1 1 523 524 dim=1 $input=523
nn.Linear                encoder.emformer_layers.6.attention.out_proj 1 1 524 525 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.mean               torch.mean_741           1 1 498 526 dim=(0) keepdim=True $input=498
pnnx.Expression          pnnx_expr_1800           2 1 496 525 527 expr=add(@0,@1)
torch.tensor_split       slice_81                 1 2 527 528 529 dim=0 indices=(2)
torch.cat                torch.cat_628            2 1 529 528 530 dim=0
torch.permute            torch.permute_816        1 1 530 531 dims=(1,2,0) $input=530
nn.Conv1d                encoder.emformer_layers.6.conv_module.pointwise_conv1 1 1 531 532 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=288 padding=(0) padding_mode=zeros stride=(1) @bias=(288)f32 @weight=(288,144,1)f32
F.glu                    F.glu_6                  1 1 532 533 dim=1 $input=532
torch.cat                torch.cat_629            2 1 28 533 534 dim=2
nn.Conv1d                encoder.emformer_layers.6.conv_module.depthwise_conv 1 1 534 535 bias=True dilation=(1) groups=144 in_channels=144 kernel_size=(31) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,1,31)f32
pnnx.Expression          pnnx_expr_1770           1 1 535 536 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_38             1 1 536 537 $input=536
pnnx.Expression          pnnx_expr_1769           2 1 535 537 538 expr=mul(@0,@1)
nn.Conv1d                encoder.emformer_layers.6.conv_module.pointwise_conv2 1 1 538 539 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,144,1)f32
torch.tensor_split       slice_83                 1 2 539 540 541 dim=2 indices=(8)
torch.permute            torch.permute_818        1 1 541 542 dims=(2,0,1) $input=541
torch.permute            torch.permute_817        1 1 540 543 dims=(2,0,1) $input=540
torch.cat                torch.cat_630            2 1 542 543 544 dim=0
pnnx.Expression          pnnx_expr_1734           2 1 527 544 545 expr=add(@0,@1)
nn.Linear                encoder.emformer_layers.6.feed_forward.0 1 1 545 546 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_1731           1 1 546 547 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_39             1 1 547 548 $input=547
pnnx.Expression          pnnx_expr_1730           2 1 546 548 549 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.6.feed_forward.4 1 1 549 550 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_1728           2 1 545 550 551 expr=add(@0,@1)
pnnx.Expression          pnnx_expr_1723           1 1 551 552 expr=mul(@0,@0)
torch.mean               torch.mean_742           1 1 552 553 dim=(-1) keepdim=True $input=552
pnnx.Expression          pnnx_expr_1719           2 1 551 553 554 expr=mul(@0,pow(add(@1,1.652225e+00),-5.000000e-01))
torch.tensor_split       slice_85                 1 2 554 555 556 dim=0 indices=(2)
torch.cat                torch.cat_631            2 1 555 556 557 dim=0
nn.Linear                encoder.emformer_layers.7.feed_forward_macaron.0 1 1 557 558 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_1707           1 1 558 559 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_40             1 1 559 560 $input=559
pnnx.Expression          pnnx_expr_1706           2 1 558 560 561 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.7.feed_forward_macaron.4 1 1 561 562 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_1704           2 1 557 562 563 expr=add(@0,@1)
torch.tensor_split       slice_88                 1 2 563 564 565 dim=0 indices=(2)
torch.cat                torch.cat_632            2 1 564 565 566 dim=0
nn.Linear                encoder.emformer_layers.7.attention.emb_to_query 1 1 566 567 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.cat                torch.cat_633            3 1 29 564 565 568 dim=0
nn.Linear                encoder.emformer_layers.7.attention.emb_to_key_value 1 1 568 569 bias=True in_features=144 out_features=288 @bias=(288)f32 @weight=(288,144)f32
torch.chunk              torch.chunk_719          1 2 569 570 571 chunks=2 dim=2 $input=569
Tensor.view              Tensor.view_509          1 1 567 572 shape=(10,4,36) $input=567
torch.tensor_split       slice_90                 1 2 570 573 574 dim=0 indices=(34)
torch.cat                torch.cat_634            3 1 573 30 574 575 dim=0
Tensor.view              Tensor.view_510          1 1 575 576 shape=(50,4,36) $input=575
torch.tensor_split       slice_92                 1 2 571 577 578 dim=0 indices=(34)
torch.cat                torch.cat_635            3 1 577 31 578 579 dim=0
Tensor.view              Tensor.view_511          1 1 579 580 shape=(50,4,36) $input=579
torch.permute            torch.permute_819        1 1 572 581 dims=(1,0,2) $input=572
pnnx.Expression          pnnx_expr_1646           1 1 581 582 expr=mul(@0,1.666667e-01)
torch.permute            torch.permute_820        1 1 576 583 dims=(1,0,2) $input=576
torch.permute            torch.permute_822        1 1 583 584 dims=(0,2,1) $input=583
torch.bmm                torch.bmm_550            2 1 582 584 585 $input=582 $mat2=584
F.softmax                F.softmax_74             1 1 585 586 dim=-1 $input=585
torch.permute            torch.permute_821        1 1 580 587 dims=(1,0,2) $input=580
torch.bmm                torch.bmm_551            2 1 586 587 588 $input=586 $mat2=587
torch.permute            torch.permute_823        1 1 588 589 dims=(1,0,2) $input=588
Tensor.reshape           Tensor.reshape_91        1 1 589 590 shape=(-1,144) $input=589
torch.unsqueeze          torch.unsqueeze_900      1 1 590 591 dim=1 $input=590
nn.Linear                encoder.emformer_layers.7.attention.out_proj 1 1 591 592 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.mean               torch.mean_743           1 1 565 593 dim=(0) keepdim=True $input=565
pnnx.Expression          pnnx_expr_1610           2 1 563 592 594 expr=add(@0,@1)
torch.tensor_split       slice_93                 1 2 594 595 596 dim=0 indices=(2)
torch.cat                torch.cat_637            2 1 596 595 597 dim=0
torch.permute            torch.permute_824        1 1 597 598 dims=(1,2,0) $input=597
nn.Conv1d                encoder.emformer_layers.7.conv_module.pointwise_conv1 1 1 598 599 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=288 padding=(0) padding_mode=zeros stride=(1) @bias=(288)f32 @weight=(288,144,1)f32
F.glu                    F.glu_7                  1 1 599 600 dim=1 $input=599
torch.cat                torch.cat_638            2 1 32 600 601 dim=2
nn.Conv1d                encoder.emformer_layers.7.conv_module.depthwise_conv 1 1 601 602 bias=True dilation=(1) groups=144 in_channels=144 kernel_size=(31) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,1,31)f32
pnnx.Expression          pnnx_expr_1580           1 1 602 603 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_41             1 1 603 604 $input=603
pnnx.Expression          pnnx_expr_1579           2 1 602 604 605 expr=mul(@0,@1)
nn.Conv1d                encoder.emformer_layers.7.conv_module.pointwise_conv2 1 1 605 606 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,144,1)f32
torch.tensor_split       slice_95                 1 2 606 607 608 dim=2 indices=(8)
torch.permute            torch.permute_826        1 1 608 609 dims=(2,0,1) $input=608
torch.permute            torch.permute_825        1 1 607 610 dims=(2,0,1) $input=607
torch.cat                torch.cat_639            2 1 609 610 611 dim=0
pnnx.Expression          pnnx_expr_1544           2 1 594 611 612 expr=add(@0,@1)
nn.Linear                encoder.emformer_layers.7.feed_forward.0 1 1 612 613 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_1541           1 1 613 614 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_42             1 1 614 615 $input=614
pnnx.Expression          pnnx_expr_1540           2 1 613 615 616 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.7.feed_forward.4 1 1 616 617 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_1538           2 1 612 617 618 expr=add(@0,@1)
pnnx.Expression          pnnx_expr_1533           1 1 618 619 expr=mul(@0,@0)
torch.mean               torch.mean_744           1 1 619 620 dim=(-1) keepdim=True $input=619
pnnx.Expression          pnnx_expr_1529           2 1 618 620 621 expr=mul(@0,pow(add(@1,1.899739e+00),-5.000000e-01))
torch.tensor_split       slice_97                 1 2 621 622 623 dim=0 indices=(2)
torch.cat                torch.cat_640            2 1 622 623 624 dim=0
nn.Linear                encoder.emformer_layers.8.feed_forward_macaron.0 1 1 624 625 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_1517           1 1 625 626 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_43             1 1 626 627 $input=626
pnnx.Expression          pnnx_expr_1516           2 1 625 627 628 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.8.feed_forward_macaron.4 1 1 628 629 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_1514           2 1 624 629 630 expr=add(@0,@1)
torch.tensor_split       slice_100                1 2 630 631 632 dim=0 indices=(2)
torch.cat                torch.cat_641            2 1 631 632 633 dim=0
nn.Linear                encoder.emformer_layers.8.attention.emb_to_query 1 1 633 634 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.cat                torch.cat_642            3 1 33 631 632 635 dim=0
nn.Linear                encoder.emformer_layers.8.attention.emb_to_key_value 1 1 635 636 bias=True in_features=144 out_features=288 @bias=(288)f32 @weight=(288,144)f32
torch.chunk              torch.chunk_720          1 2 636 637 638 chunks=2 dim=2 $input=636
Tensor.view              Tensor.view_512          1 1 634 639 shape=(10,4,36) $input=634
torch.tensor_split       slice_102                1 2 637 640 641 dim=0 indices=(34)
torch.cat                torch.cat_643            3 1 640 34 641 642 dim=0
Tensor.view              Tensor.view_513          1 1 642 643 shape=(50,4,36) $input=642
torch.tensor_split       slice_104                1 2 638 644 645 dim=0 indices=(34)
torch.cat                torch.cat_644            3 1 644 35 645 646 dim=0
Tensor.view              Tensor.view_514          1 1 646 647 shape=(50,4,36) $input=646
torch.permute            torch.permute_827        1 1 639 648 dims=(1,0,2) $input=639
pnnx.Expression          pnnx_expr_1456           1 1 648 649 expr=mul(@0,1.666667e-01)
torch.permute            torch.permute_828        1 1 643 650 dims=(1,0,2) $input=643
torch.permute            torch.permute_830        1 1 650 651 dims=(0,2,1) $input=650
torch.bmm                torch.bmm_552            2 1 649 651 652 $input=649 $mat2=651
F.softmax                F.softmax_75             1 1 652 653 dim=-1 $input=652
torch.permute            torch.permute_829        1 1 647 654 dims=(1,0,2) $input=647
torch.bmm                torch.bmm_553            2 1 653 654 655 $input=653 $mat2=654
torch.permute            torch.permute_831        1 1 655 656 dims=(1,0,2) $input=655
Tensor.reshape           Tensor.reshape_92        1 1 656 657 shape=(-1,144) $input=656
torch.unsqueeze          torch.unsqueeze_901      1 1 657 658 dim=1 $input=657
nn.Linear                encoder.emformer_layers.8.attention.out_proj 1 1 658 659 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.mean               torch.mean_745           1 1 632 660 dim=(0) keepdim=True $input=632
pnnx.Expression          pnnx_expr_1420           2 1 630 659 661 expr=add(@0,@1)
torch.tensor_split       slice_105                1 2 661 662 663 dim=0 indices=(2)
torch.cat                torch.cat_646            2 1 663 662 664 dim=0
torch.permute            torch.permute_832        1 1 664 665 dims=(1,2,0) $input=664
nn.Conv1d                encoder.emformer_layers.8.conv_module.pointwise_conv1 1 1 665 666 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=288 padding=(0) padding_mode=zeros stride=(1) @bias=(288)f32 @weight=(288,144,1)f32
F.glu                    F.glu_8                  1 1 666 667 dim=1 $input=666
torch.cat                torch.cat_647            2 1 36 667 668 dim=2
nn.Conv1d                encoder.emformer_layers.8.conv_module.depthwise_conv 1 1 668 669 bias=True dilation=(1) groups=144 in_channels=144 kernel_size=(31) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,1,31)f32
pnnx.Expression          pnnx_expr_1390           1 1 669 670 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_44             1 1 670 671 $input=670
pnnx.Expression          pnnx_expr_1389           2 1 669 671 672 expr=mul(@0,@1)
nn.Conv1d                encoder.emformer_layers.8.conv_module.pointwise_conv2 1 1 672 673 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,144,1)f32
torch.tensor_split       slice_107                1 2 673 674 675 dim=2 indices=(8)
torch.permute            torch.permute_834        1 1 675 676 dims=(2,0,1) $input=675
torch.permute            torch.permute_833        1 1 674 677 dims=(2,0,1) $input=674
torch.cat                torch.cat_648            2 1 676 677 678 dim=0
pnnx.Expression          pnnx_expr_1354           2 1 661 678 679 expr=add(@0,@1)
nn.Linear                encoder.emformer_layers.8.feed_forward.0 1 1 679 680 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_1351           1 1 680 681 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_45             1 1 681 682 $input=681
pnnx.Expression          pnnx_expr_1350           2 1 680 682 683 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.8.feed_forward.4 1 1 683 684 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_1348           2 1 679 684 685 expr=add(@0,@1)
pnnx.Expression          pnnx_expr_1343           1 1 685 686 expr=mul(@0,@0)
torch.mean               torch.mean_746           1 1 686 687 dim=(-1) keepdim=True $input=686
pnnx.Expression          pnnx_expr_1339           2 1 685 687 688 expr=mul(@0,pow(add(@1,1.969823e+00),-5.000000e-01))
torch.tensor_split       slice_109                1 2 688 689 690 dim=0 indices=(2)
torch.cat                torch.cat_649            2 1 689 690 691 dim=0
nn.Linear                encoder.emformer_layers.9.feed_forward_macaron.0 1 1 691 692 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_1327           1 1 692 693 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_46             1 1 693 694 $input=693
pnnx.Expression          pnnx_expr_1326           2 1 692 694 695 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.9.feed_forward_macaron.4 1 1 695 696 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_1324           2 1 691 696 697 expr=add(@0,@1)
torch.tensor_split       slice_112                1 2 697 698 699 dim=0 indices=(2)
torch.cat                torch.cat_650            2 1 698 699 700 dim=0
nn.Linear                encoder.emformer_layers.9.attention.emb_to_query 1 1 700 701 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.cat                torch.cat_651            3 1 37 698 699 702 dim=0
nn.Linear                encoder.emformer_layers.9.attention.emb_to_key_value 1 1 702 703 bias=True in_features=144 out_features=288 @bias=(288)f32 @weight=(288,144)f32
torch.chunk              torch.chunk_721          1 2 703 704 705 chunks=2 dim=2 $input=703
Tensor.view              Tensor.view_515          1 1 701 706 shape=(10,4,36) $input=701
torch.tensor_split       slice_114                1 2 704 707 708 dim=0 indices=(34)
torch.cat                torch.cat_652            3 1 707 38 708 709 dim=0
Tensor.view              Tensor.view_516          1 1 709 710 shape=(50,4,36) $input=709
torch.tensor_split       slice_116                1 2 705 711 712 dim=0 indices=(34)
torch.cat                torch.cat_653            3 1 711 39 712 713 dim=0
Tensor.view              Tensor.view_517          1 1 713 714 shape=(50,4,36) $input=713
torch.permute            torch.permute_835        1 1 706 715 dims=(1,0,2) $input=706
pnnx.Expression          pnnx_expr_1266           1 1 715 716 expr=mul(@0,1.666667e-01)
torch.permute            torch.permute_836        1 1 710 717 dims=(1,0,2) $input=710
torch.permute            torch.permute_838        1 1 717 718 dims=(0,2,1) $input=717
torch.bmm                torch.bmm_554            2 1 716 718 719 $input=716 $mat2=718
F.softmax                F.softmax_76             1 1 719 720 dim=-1 $input=719
torch.permute            torch.permute_837        1 1 714 721 dims=(1,0,2) $input=714
torch.bmm                torch.bmm_555            2 1 720 721 722 $input=720 $mat2=721
torch.permute            torch.permute_839        1 1 722 723 dims=(1,0,2) $input=722
Tensor.reshape           Tensor.reshape_93        1 1 723 724 shape=(-1,144) $input=723
torch.unsqueeze          torch.unsqueeze_902      1 1 724 725 dim=1 $input=724
nn.Linear                encoder.emformer_layers.9.attention.out_proj 1 1 725 726 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.mean               torch.mean_747           1 1 699 727 dim=(0) keepdim=True $input=699
pnnx.Expression          pnnx_expr_1230           2 1 697 726 728 expr=add(@0,@1)
torch.tensor_split       slice_117                1 2 728 729 730 dim=0 indices=(2)
torch.cat                torch.cat_655            2 1 730 729 731 dim=0
torch.permute            torch.permute_840        1 1 731 732 dims=(1,2,0) $input=731
nn.Conv1d                encoder.emformer_layers.9.conv_module.pointwise_conv1 1 1 732 733 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=288 padding=(0) padding_mode=zeros stride=(1) @bias=(288)f32 @weight=(288,144,1)f32
F.glu                    F.glu_9                  1 1 733 734 dim=1 $input=733
torch.cat                torch.cat_656            2 1 40 734 735 dim=2
nn.Conv1d                encoder.emformer_layers.9.conv_module.depthwise_conv 1 1 735 736 bias=True dilation=(1) groups=144 in_channels=144 kernel_size=(31) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,1,31)f32
pnnx.Expression          pnnx_expr_1200           1 1 736 737 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_47             1 1 737 738 $input=737
pnnx.Expression          pnnx_expr_1199           2 1 736 738 739 expr=mul(@0,@1)
nn.Conv1d                encoder.emformer_layers.9.conv_module.pointwise_conv2 1 1 739 740 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,144,1)f32
torch.tensor_split       slice_119                1 2 740 741 742 dim=2 indices=(8)
torch.permute            torch.permute_842        1 1 742 743 dims=(2,0,1) $input=742
torch.permute            torch.permute_841        1 1 741 744 dims=(2,0,1) $input=741
torch.cat                torch.cat_657            2 1 743 744 745 dim=0
pnnx.Expression          pnnx_expr_1164           2 1 728 745 746 expr=add(@0,@1)
nn.Linear                encoder.emformer_layers.9.feed_forward.0 1 1 746 747 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_1161           1 1 747 748 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_48             1 1 748 749 $input=748
pnnx.Expression          pnnx_expr_1160           2 1 747 749 750 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.9.feed_forward.4 1 1 750 751 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_1158           2 1 746 751 752 expr=add(@0,@1)
pnnx.Expression          pnnx_expr_1153           1 1 752 753 expr=mul(@0,@0)
torch.mean               torch.mean_748           1 1 753 754 dim=(-1) keepdim=True $input=753
pnnx.Expression          pnnx_expr_1149           2 1 752 754 755 expr=mul(@0,pow(add(@1,2.125452e+00),-5.000000e-01))
torch.tensor_split       slice_121                1 2 755 756 757 dim=0 indices=(2)
torch.cat                torch.cat_658            2 1 756 757 758 dim=0
nn.Linear                encoder.emformer_layers.10.feed_forward_macaron.0 1 1 758 759 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_1137           1 1 759 760 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_49             1 1 760 761 $input=760
pnnx.Expression          pnnx_expr_1136           2 1 759 761 762 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.10.feed_forward_macaron.4 1 1 762 763 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_1134           2 1 758 763 764 expr=add(@0,@1)
torch.tensor_split       slice_124                1 2 764 765 766 dim=0 indices=(2)
torch.cat                torch.cat_659            2 1 765 766 767 dim=0
nn.Linear                encoder.emformer_layers.10.attention.emb_to_query 1 1 767 768 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.cat                torch.cat_660            3 1 41 765 766 769 dim=0
nn.Linear                encoder.emformer_layers.10.attention.emb_to_key_value 1 1 769 770 bias=True in_features=144 out_features=288 @bias=(288)f32 @weight=(288,144)f32
torch.chunk              torch.chunk_722          1 2 770 771 772 chunks=2 dim=2 $input=770
Tensor.view              Tensor.view_518          1 1 768 773 shape=(10,4,36) $input=768
torch.tensor_split       slice_126                1 2 771 774 775 dim=0 indices=(34)
torch.cat                torch.cat_661            3 1 774 42 775 776 dim=0
Tensor.view              Tensor.view_519          1 1 776 777 shape=(50,4,36) $input=776
torch.tensor_split       slice_128                1 2 772 778 779 dim=0 indices=(34)
torch.cat                torch.cat_662            3 1 778 43 779 780 dim=0
Tensor.view              Tensor.view_520          1 1 780 781 shape=(50,4,36) $input=780
torch.permute            torch.permute_843        1 1 773 782 dims=(1,0,2) $input=773
pnnx.Expression          pnnx_expr_1076           1 1 782 783 expr=mul(@0,1.666667e-01)
torch.permute            torch.permute_844        1 1 777 784 dims=(1,0,2) $input=777
torch.permute            torch.permute_846        1 1 784 785 dims=(0,2,1) $input=784
torch.bmm                torch.bmm_556            2 1 783 785 786 $input=783 $mat2=785
F.softmax                F.softmax_77             1 1 786 787 dim=-1 $input=786
torch.permute            torch.permute_845        1 1 781 788 dims=(1,0,2) $input=781
torch.bmm                torch.bmm_557            2 1 787 788 789 $input=787 $mat2=788
torch.permute            torch.permute_847        1 1 789 790 dims=(1,0,2) $input=789
Tensor.reshape           Tensor.reshape_94        1 1 790 791 shape=(-1,144) $input=790
torch.unsqueeze          torch.unsqueeze_903      1 1 791 792 dim=1 $input=791
nn.Linear                encoder.emformer_layers.10.attention.out_proj 1 1 792 793 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.mean               torch.mean_749           1 1 766 794 dim=(0) keepdim=True $input=766
pnnx.Expression          pnnx_expr_1040           2 1 764 793 795 expr=add(@0,@1)
torch.tensor_split       slice_129                1 2 795 796 797 dim=0 indices=(2)
torch.cat                torch.cat_664            2 1 797 796 798 dim=0
torch.permute            torch.permute_848        1 1 798 799 dims=(1,2,0) $input=798
nn.Conv1d                encoder.emformer_layers.10.conv_module.pointwise_conv1 1 1 799 800 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=288 padding=(0) padding_mode=zeros stride=(1) @bias=(288)f32 @weight=(288,144,1)f32
F.glu                    F.glu_10                 1 1 800 801 dim=1 $input=800
torch.cat                torch.cat_665            2 1 44 801 802 dim=2
nn.Conv1d                encoder.emformer_layers.10.conv_module.depthwise_conv 1 1 802 803 bias=True dilation=(1) groups=144 in_channels=144 kernel_size=(31) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,1,31)f32
pnnx.Expression          pnnx_expr_1010           1 1 803 804 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_50             1 1 804 805 $input=804
pnnx.Expression          pnnx_expr_1009           2 1 803 805 806 expr=mul(@0,@1)
nn.Conv1d                encoder.emformer_layers.10.conv_module.pointwise_conv2 1 1 806 807 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,144,1)f32
torch.tensor_split       slice_131                1 2 807 808 809 dim=2 indices=(8)
torch.permute            torch.permute_850        1 1 809 810 dims=(2,0,1) $input=809
torch.permute            torch.permute_849        1 1 808 811 dims=(2,0,1) $input=808
torch.cat                torch.cat_666            2 1 810 811 812 dim=0
pnnx.Expression          pnnx_expr_974            2 1 795 812 813 expr=add(@0,@1)
nn.Linear                encoder.emformer_layers.10.feed_forward.0 1 1 813 814 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_971            1 1 814 815 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_51             1 1 815 816 $input=815
pnnx.Expression          pnnx_expr_970            2 1 814 816 817 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.10.feed_forward.4 1 1 817 818 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_968            2 1 813 818 819 expr=add(@0,@1)
pnnx.Expression          pnnx_expr_963            1 1 819 820 expr=mul(@0,@0)
torch.mean               torch.mean_750           1 1 820 821 dim=(-1) keepdim=True $input=820
pnnx.Expression          pnnx_expr_959            2 1 819 821 822 expr=mul(@0,pow(add(@1,2.230928e+00),-5.000000e-01))
torch.tensor_split       slice_133                1 2 822 823 824 dim=0 indices=(2)
torch.cat                torch.cat_667            2 1 823 824 825 dim=0
nn.Linear                encoder.emformer_layers.11.feed_forward_macaron.0 1 1 825 826 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_947            1 1 826 827 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_52             1 1 827 828 $input=827
pnnx.Expression          pnnx_expr_946            2 1 826 828 829 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.11.feed_forward_macaron.4 1 1 829 830 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_944            2 1 825 830 831 expr=add(@0,@1)
torch.tensor_split       slice_136                1 2 831 832 833 dim=0 indices=(2)
torch.cat                torch.cat_668            2 1 832 833 834 dim=0
nn.Linear                encoder.emformer_layers.11.attention.emb_to_query 1 1 834 835 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.cat                torch.cat_669            3 1 45 832 833 836 dim=0
nn.Linear                encoder.emformer_layers.11.attention.emb_to_key_value 1 1 836 837 bias=True in_features=144 out_features=288 @bias=(288)f32 @weight=(288,144)f32
torch.chunk              torch.chunk_723          1 2 837 838 839 chunks=2 dim=2 $input=837
Tensor.view              Tensor.view_521          1 1 835 840 shape=(10,4,36) $input=835
torch.tensor_split       slice_138                1 2 838 841 842 dim=0 indices=(34)
torch.cat                torch.cat_670            3 1 841 46 842 843 dim=0
Tensor.view              Tensor.view_522          1 1 843 844 shape=(50,4,36) $input=843
torch.tensor_split       slice_140                1 2 839 845 846 dim=0 indices=(34)
torch.cat                torch.cat_671            3 1 845 47 846 847 dim=0
Tensor.view              Tensor.view_523          1 1 847 848 shape=(50,4,36) $input=847
torch.permute            torch.permute_851        1 1 840 849 dims=(1,0,2) $input=840
pnnx.Expression          pnnx_expr_886            1 1 849 850 expr=mul(@0,1.666667e-01)
torch.permute            torch.permute_852        1 1 844 851 dims=(1,0,2) $input=844
torch.permute            torch.permute_854        1 1 851 852 dims=(0,2,1) $input=851
torch.bmm                torch.bmm_558            2 1 850 852 853 $input=850 $mat2=852
F.softmax                F.softmax_78             1 1 853 854 dim=-1 $input=853
torch.permute            torch.permute_853        1 1 848 855 dims=(1,0,2) $input=848
torch.bmm                torch.bmm_559            2 1 854 855 856 $input=854 $mat2=855
torch.permute            torch.permute_855        1 1 856 857 dims=(1,0,2) $input=856
Tensor.reshape           Tensor.reshape_95        1 1 857 858 shape=(-1,144) $input=857
torch.unsqueeze          torch.unsqueeze_904      1 1 858 859 dim=1 $input=858
nn.Linear                encoder.emformer_layers.11.attention.out_proj 1 1 859 860 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.mean               torch.mean_751           1 1 833 861 dim=(0) keepdim=True $input=833
pnnx.Expression          pnnx_expr_850            2 1 831 860 862 expr=add(@0,@1)
torch.tensor_split       slice_141                1 2 862 863 864 dim=0 indices=(2)
torch.cat                torch.cat_673            2 1 864 863 865 dim=0
torch.permute            torch.permute_856        1 1 865 866 dims=(1,2,0) $input=865
nn.Conv1d                encoder.emformer_layers.11.conv_module.pointwise_conv1 1 1 866 867 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=288 padding=(0) padding_mode=zeros stride=(1) @bias=(288)f32 @weight=(288,144,1)f32
F.glu                    F.glu_11                 1 1 867 868 dim=1 $input=867
torch.cat                torch.cat_674            2 1 48 868 869 dim=2
nn.Conv1d                encoder.emformer_layers.11.conv_module.depthwise_conv 1 1 869 870 bias=True dilation=(1) groups=144 in_channels=144 kernel_size=(31) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,1,31)f32
pnnx.Expression          pnnx_expr_820            1 1 870 871 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_53             1 1 871 872 $input=871
pnnx.Expression          pnnx_expr_819            2 1 870 872 873 expr=mul(@0,@1)
nn.Conv1d                encoder.emformer_layers.11.conv_module.pointwise_conv2 1 1 873 874 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,144,1)f32
torch.tensor_split       slice_143                1 2 874 875 876 dim=2 indices=(8)
torch.permute            torch.permute_858        1 1 876 877 dims=(2,0,1) $input=876
torch.permute            torch.permute_857        1 1 875 878 dims=(2,0,1) $input=875
torch.cat                torch.cat_675            2 1 877 878 879 dim=0
pnnx.Expression          pnnx_expr_784            2 1 862 879 880 expr=add(@0,@1)
nn.Linear                encoder.emformer_layers.11.feed_forward.0 1 1 880 881 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_781            1 1 881 882 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_54             1 1 882 883 $input=882
pnnx.Expression          pnnx_expr_780            2 1 881 883 884 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.11.feed_forward.4 1 1 884 885 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_778            2 1 880 885 886 expr=add(@0,@1)
pnnx.Expression          pnnx_expr_773            1 1 886 887 expr=mul(@0,@0)
torch.mean               torch.mean_752           1 1 887 888 dim=(-1) keepdim=True $input=887
pnnx.Expression          pnnx_expr_769            2 1 886 888 889 expr=mul(@0,pow(add(@1,2.460389e+00),-5.000000e-01))
torch.tensor_split       slice_145                1 2 889 890 891 dim=0 indices=(2)
torch.cat                torch.cat_676            2 1 890 891 892 dim=0
nn.Linear                encoder.emformer_layers.12.feed_forward_macaron.0 1 1 892 893 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_757            1 1 893 894 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_55             1 1 894 895 $input=894
pnnx.Expression          pnnx_expr_756            2 1 893 895 896 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.12.feed_forward_macaron.4 1 1 896 897 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_754            2 1 892 897 898 expr=add(@0,@1)
torch.tensor_split       slice_148                1 2 898 899 900 dim=0 indices=(2)
torch.cat                torch.cat_677            2 1 899 900 901 dim=0
nn.Linear                encoder.emformer_layers.12.attention.emb_to_query 1 1 901 902 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.cat                torch.cat_678            3 1 49 899 900 903 dim=0
nn.Linear                encoder.emformer_layers.12.attention.emb_to_key_value 1 1 903 904 bias=True in_features=144 out_features=288 @bias=(288)f32 @weight=(288,144)f32
torch.chunk              torch.chunk_724          1 2 904 905 906 chunks=2 dim=2 $input=904
Tensor.view              Tensor.view_524          1 1 902 907 shape=(10,4,36) $input=902
torch.tensor_split       slice_150                1 2 905 908 909 dim=0 indices=(34)
torch.cat                torch.cat_679            3 1 908 50 909 910 dim=0
Tensor.view              Tensor.view_525          1 1 910 911 shape=(50,4,36) $input=910
torch.tensor_split       slice_152                1 2 906 912 913 dim=0 indices=(34)
torch.cat                torch.cat_680            3 1 912 51 913 914 dim=0
Tensor.view              Tensor.view_526          1 1 914 915 shape=(50,4,36) $input=914
torch.permute            torch.permute_859        1 1 907 916 dims=(1,0,2) $input=907
pnnx.Expression          pnnx_expr_696            1 1 916 917 expr=mul(@0,1.666667e-01)
torch.permute            torch.permute_860        1 1 911 918 dims=(1,0,2) $input=911
torch.permute            torch.permute_862        1 1 918 919 dims=(0,2,1) $input=918
torch.bmm                torch.bmm_560            2 1 917 919 920 $input=917 $mat2=919
F.softmax                F.softmax_79             1 1 920 921 dim=-1 $input=920
torch.permute            torch.permute_861        1 1 915 922 dims=(1,0,2) $input=915
torch.bmm                torch.bmm_561            2 1 921 922 923 $input=921 $mat2=922
torch.permute            torch.permute_863        1 1 923 924 dims=(1,0,2) $input=923
Tensor.reshape           Tensor.reshape_96        1 1 924 925 shape=(-1,144) $input=924
torch.unsqueeze          torch.unsqueeze_905      1 1 925 926 dim=1 $input=925
nn.Linear                encoder.emformer_layers.12.attention.out_proj 1 1 926 927 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.mean               torch.mean_753           1 1 900 928 dim=(0) keepdim=True $input=900
pnnx.Expression          pnnx_expr_660            2 1 898 927 929 expr=add(@0,@1)
torch.tensor_split       slice_153                1 2 929 930 931 dim=0 indices=(2)
torch.cat                torch.cat_682            2 1 931 930 932 dim=0
torch.permute            torch.permute_864        1 1 932 933 dims=(1,2,0) $input=932
nn.Conv1d                encoder.emformer_layers.12.conv_module.pointwise_conv1 1 1 933 934 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=288 padding=(0) padding_mode=zeros stride=(1) @bias=(288)f32 @weight=(288,144,1)f32
F.glu                    F.glu_12                 1 1 934 935 dim=1 $input=934
torch.cat                torch.cat_683            2 1 52 935 936 dim=2
nn.Conv1d                encoder.emformer_layers.12.conv_module.depthwise_conv 1 1 936 937 bias=True dilation=(1) groups=144 in_channels=144 kernel_size=(31) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,1,31)f32
pnnx.Expression          pnnx_expr_630            1 1 937 938 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_56             1 1 938 939 $input=938
pnnx.Expression          pnnx_expr_629            2 1 937 939 940 expr=mul(@0,@1)
nn.Conv1d                encoder.emformer_layers.12.conv_module.pointwise_conv2 1 1 940 941 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,144,1)f32
torch.tensor_split       slice_155                1 2 941 942 943 dim=2 indices=(8)
torch.permute            torch.permute_866        1 1 943 944 dims=(2,0,1) $input=943
torch.permute            torch.permute_865        1 1 942 945 dims=(2,0,1) $input=942
torch.cat                torch.cat_684            2 1 944 945 946 dim=0
pnnx.Expression          pnnx_expr_594            2 1 929 946 947 expr=add(@0,@1)
nn.Linear                encoder.emformer_layers.12.feed_forward.0 1 1 947 948 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_591            1 1 948 949 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_57             1 1 949 950 $input=949
pnnx.Expression          pnnx_expr_590            2 1 948 950 951 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.12.feed_forward.4 1 1 951 952 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_588            2 1 947 952 953 expr=add(@0,@1)
pnnx.Expression          pnnx_expr_583            1 1 953 954 expr=mul(@0,@0)
torch.mean               torch.mean_754           1 1 954 955 dim=(-1) keepdim=True $input=954
pnnx.Expression          pnnx_expr_579            2 1 953 955 956 expr=mul(@0,pow(add(@1,1.900982e+00),-5.000000e-01))
torch.tensor_split       slice_157                1 2 956 957 958 dim=0 indices=(2)
torch.cat                torch.cat_685            2 1 957 958 959 dim=0
nn.Linear                encoder.emformer_layers.13.feed_forward_macaron.0 1 1 959 960 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_567            1 1 960 961 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_58             1 1 961 962 $input=961
pnnx.Expression          pnnx_expr_566            2 1 960 962 963 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.13.feed_forward_macaron.4 1 1 963 964 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_564            2 1 959 964 965 expr=add(@0,@1)
torch.tensor_split       slice_160                1 2 965 966 967 dim=0 indices=(2)
torch.cat                torch.cat_686            2 1 966 967 968 dim=0
nn.Linear                encoder.emformer_layers.13.attention.emb_to_query 1 1 968 969 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.cat                torch.cat_687            3 1 53 966 967 970 dim=0
nn.Linear                encoder.emformer_layers.13.attention.emb_to_key_value 1 1 970 971 bias=True in_features=144 out_features=288 @bias=(288)f32 @weight=(288,144)f32
torch.chunk              torch.chunk_725          1 2 971 972 973 chunks=2 dim=2 $input=971
Tensor.view              Tensor.view_527          1 1 969 974 shape=(10,4,36) $input=969
torch.tensor_split       slice_162                1 2 972 975 976 dim=0 indices=(34)
torch.cat                torch.cat_688            3 1 975 54 976 977 dim=0
Tensor.view              Tensor.view_528          1 1 977 978 shape=(50,4,36) $input=977
torch.tensor_split       slice_164                1 2 973 979 980 dim=0 indices=(34)
torch.cat                torch.cat_689            3 1 979 55 980 981 dim=0
Tensor.view              Tensor.view_529          1 1 981 982 shape=(50,4,36) $input=981
torch.permute            torch.permute_867        1 1 974 983 dims=(1,0,2) $input=974
pnnx.Expression          pnnx_expr_506            1 1 983 984 expr=mul(@0,1.666667e-01)
torch.permute            torch.permute_868        1 1 978 985 dims=(1,0,2) $input=978
torch.permute            torch.permute_870        1 1 985 986 dims=(0,2,1) $input=985
torch.bmm                torch.bmm_562            2 1 984 986 987 $input=984 $mat2=986
F.softmax                F.softmax_80             1 1 987 988 dim=-1 $input=987
torch.permute            torch.permute_869        1 1 982 989 dims=(1,0,2) $input=982
torch.bmm                torch.bmm_563            2 1 988 989 990 $input=988 $mat2=989
torch.permute            torch.permute_871        1 1 990 991 dims=(1,0,2) $input=990
Tensor.reshape           Tensor.reshape_97        1 1 991 992 shape=(-1,144) $input=991
torch.unsqueeze          torch.unsqueeze_906      1 1 992 993 dim=1 $input=992
nn.Linear                encoder.emformer_layers.13.attention.out_proj 1 1 993 994 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.mean               torch.mean_755           1 1 967 995 dim=(0) keepdim=True $input=967
pnnx.Expression          pnnx_expr_470            2 1 965 994 996 expr=add(@0,@1)
torch.tensor_split       slice_165                1 2 996 997 998 dim=0 indices=(2)
torch.cat                torch.cat_691            2 1 998 997 999 dim=0
torch.permute            torch.permute_872        1 1 999 1000 dims=(1,2,0) $input=999
nn.Conv1d                encoder.emformer_layers.13.conv_module.pointwise_conv1 1 1 1000 1001 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=288 padding=(0) padding_mode=zeros stride=(1) @bias=(288)f32 @weight=(288,144,1)f32
F.glu                    F.glu_13                 1 1 1001 1002 dim=1 $input=1001
torch.cat                torch.cat_692            2 1 56 1002 1003 dim=2
nn.Conv1d                encoder.emformer_layers.13.conv_module.depthwise_conv 1 1 1003 1004 bias=True dilation=(1) groups=144 in_channels=144 kernel_size=(31) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,1,31)f32
pnnx.Expression          pnnx_expr_440            1 1 1004 1005 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_59             1 1 1005 1006 $input=1005
pnnx.Expression          pnnx_expr_439            2 1 1004 1006 1007 expr=mul(@0,@1)
nn.Conv1d                encoder.emformer_layers.13.conv_module.pointwise_conv2 1 1 1007 1008 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,144,1)f32
torch.tensor_split       slice_167                1 2 1008 1009 1010 dim=2 indices=(8)
torch.permute            torch.permute_874        1 1 1010 1011 dims=(2,0,1) $input=1010
torch.permute            torch.permute_873        1 1 1009 1012 dims=(2,0,1) $input=1009
torch.cat                torch.cat_693            2 1 1011 1012 1013 dim=0
pnnx.Expression          pnnx_expr_404            2 1 996 1013 1014 expr=add(@0,@1)
nn.Linear                encoder.emformer_layers.13.feed_forward.0 1 1 1014 1015 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_401            1 1 1015 1016 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_60             1 1 1016 1017 $input=1016
pnnx.Expression          pnnx_expr_400            2 1 1015 1017 1018 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.13.feed_forward.4 1 1 1018 1019 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_398            2 1 1014 1019 1020 expr=add(@0,@1)
pnnx.Expression          pnnx_expr_393            1 1 1020 1021 expr=mul(@0,@0)
torch.mean               torch.mean_756           1 1 1021 1022 dim=(-1) keepdim=True $input=1021
pnnx.Expression          pnnx_expr_389            2 1 1020 1022 1023 expr=mul(@0,pow(add(@1,2.083574e+00),-5.000000e-01))
torch.tensor_split       slice_169                1 2 1023 1024 1025 dim=0 indices=(2)
torch.cat                torch.cat_694            2 1 1024 1025 1026 dim=0
nn.Linear                encoder.emformer_layers.14.feed_forward_macaron.0 1 1 1026 1027 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_377            1 1 1027 1028 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_61             1 1 1028 1029 $input=1028
pnnx.Expression          pnnx_expr_376            2 1 1027 1029 1030 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.14.feed_forward_macaron.4 1 1 1030 1031 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_374            2 1 1026 1031 1032 expr=add(@0,@1)
torch.tensor_split       slice_172                1 2 1032 1033 1034 dim=0 indices=(2)
torch.cat                torch.cat_695            2 1 1033 1034 1035 dim=0
nn.Linear                encoder.emformer_layers.14.attention.emb_to_query 1 1 1035 1036 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.cat                torch.cat_696            3 1 57 1033 1034 1037 dim=0
nn.Linear                encoder.emformer_layers.14.attention.emb_to_key_value 1 1 1037 1038 bias=True in_features=144 out_features=288 @bias=(288)f32 @weight=(288,144)f32
torch.chunk              torch.chunk_726          1 2 1038 1039 1040 chunks=2 dim=2 $input=1038
Tensor.view              Tensor.view_530          1 1 1036 1041 shape=(10,4,36) $input=1036
torch.tensor_split       slice_174                1 2 1039 1042 1043 dim=0 indices=(34)
torch.cat                torch.cat_697            3 1 1042 58 1043 1044 dim=0
Tensor.view              Tensor.view_531          1 1 1044 1045 shape=(50,4,36) $input=1044
torch.tensor_split       slice_176                1 2 1040 1046 1047 dim=0 indices=(34)
torch.cat                torch.cat_698            3 1 1046 59 1047 1048 dim=0
Tensor.view              Tensor.view_532          1 1 1048 1049 shape=(50,4,36) $input=1048
torch.permute            torch.permute_875        1 1 1041 1050 dims=(1,0,2) $input=1041
pnnx.Expression          pnnx_expr_316            1 1 1050 1051 expr=mul(@0,1.666667e-01)
torch.permute            torch.permute_876        1 1 1045 1052 dims=(1,0,2) $input=1045
torch.permute            torch.permute_878        1 1 1052 1053 dims=(0,2,1) $input=1052
torch.bmm                torch.bmm_564            2 1 1051 1053 1054 $input=1051 $mat2=1053
F.softmax                F.softmax_81             1 1 1054 1055 dim=-1 $input=1054
torch.permute            torch.permute_877        1 1 1049 1056 dims=(1,0,2) $input=1049
torch.bmm                torch.bmm_565            2 1 1055 1056 1057 $input=1055 $mat2=1056
torch.permute            torch.permute_879        1 1 1057 1058 dims=(1,0,2) $input=1057
Tensor.reshape           Tensor.reshape_98        1 1 1058 1059 shape=(-1,144) $input=1058
torch.unsqueeze          torch.unsqueeze_907      1 1 1059 1060 dim=1 $input=1059
nn.Linear                encoder.emformer_layers.14.attention.out_proj 1 1 1060 1061 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.mean               torch.mean_757           1 1 1034 1062 dim=(0) keepdim=True $input=1034
pnnx.Expression          pnnx_expr_280            2 1 1032 1061 1063 expr=add(@0,@1)
torch.tensor_split       slice_177                1 2 1063 1064 1065 dim=0 indices=(2)
torch.cat                torch.cat_700            2 1 1065 1064 1066 dim=0
torch.permute            torch.permute_880        1 1 1066 1067 dims=(1,2,0) $input=1066
nn.Conv1d                encoder.emformer_layers.14.conv_module.pointwise_conv1 1 1 1067 1068 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=288 padding=(0) padding_mode=zeros stride=(1) @bias=(288)f32 @weight=(288,144,1)f32
F.glu                    F.glu_14                 1 1 1068 1069 dim=1 $input=1068
torch.cat                torch.cat_701            2 1 60 1069 1070 dim=2
nn.Conv1d                encoder.emformer_layers.14.conv_module.depthwise_conv 1 1 1070 1071 bias=True dilation=(1) groups=144 in_channels=144 kernel_size=(31) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,1,31)f32
pnnx.Expression          pnnx_expr_250            1 1 1071 1072 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_62             1 1 1072 1073 $input=1072
pnnx.Expression          pnnx_expr_249            2 1 1071 1073 1074 expr=mul(@0,@1)
nn.Conv1d                encoder.emformer_layers.14.conv_module.pointwise_conv2 1 1 1074 1075 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,144,1)f32
torch.tensor_split       slice_179                1 2 1075 1076 1077 dim=2 indices=(8)
torch.permute            torch.permute_882        1 1 1077 1078 dims=(2,0,1) $input=1077
torch.permute            torch.permute_881        1 1 1076 1079 dims=(2,0,1) $input=1076
torch.cat                torch.cat_702            2 1 1078 1079 1080 dim=0
pnnx.Expression          pnnx_expr_214            2 1 1063 1080 1081 expr=add(@0,@1)
nn.Linear                encoder.emformer_layers.14.feed_forward.0 1 1 1081 1082 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_211            1 1 1082 1083 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_63             1 1 1083 1084 $input=1083
pnnx.Expression          pnnx_expr_210            2 1 1082 1084 1085 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.14.feed_forward.4 1 1 1085 1086 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_208            2 1 1081 1086 1087 expr=add(@0,@1)
pnnx.Expression          pnnx_expr_203            1 1 1087 1088 expr=mul(@0,@0)
torch.mean               torch.mean_758           1 1 1088 1089 dim=(-1) keepdim=True $input=1088
pnnx.Expression          pnnx_expr_199            2 1 1087 1089 1090 expr=mul(@0,pow(add(@1,1.938745e+00),-5.000000e-01))
torch.tensor_split       slice_181                1 2 1090 1091 1092 dim=0 indices=(2)
torch.cat                torch.cat_703            2 1 1091 1092 1093 dim=0
nn.Linear                encoder.emformer_layers.15.feed_forward_macaron.0 1 1 1093 1094 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_187            1 1 1094 1095 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_64             1 1 1095 1096 $input=1095
pnnx.Expression          pnnx_expr_186            2 1 1094 1096 1097 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.15.feed_forward_macaron.4 1 1 1097 1098 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_184            2 1 1093 1098 1099 expr=add(@0,@1)
torch.tensor_split       slice_184                1 2 1099 1100 1101 dim=0 indices=(2)
torch.cat                torch.cat_704            2 1 1100 1101 1102 dim=0
nn.Linear                encoder.emformer_layers.15.attention.emb_to_query 1 1 1102 1103 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.cat                torch.cat_705            3 1 61 1100 1101 1104 dim=0
nn.Linear                encoder.emformer_layers.15.attention.emb_to_key_value 1 1 1104 1105 bias=True in_features=144 out_features=288 @bias=(288)f32 @weight=(288,144)f32
torch.chunk              torch.chunk_727          1 2 1105 1106 1107 chunks=2 dim=2 $input=1105
Tensor.view              Tensor.view_533          1 1 1103 1108 shape=(10,4,36) $input=1103
torch.tensor_split       slice_186                1 2 1106 1109 1110 dim=0 indices=(34)
torch.cat                torch.cat_706            3 1 1109 62 1110 1111 dim=0
Tensor.view              Tensor.view_534          1 1 1111 1112 shape=(50,4,36) $input=1111
torch.tensor_split       slice_188                1 2 1107 1113 1114 dim=0 indices=(34)
torch.cat                torch.cat_707            3 1 1113 63 1114 1115 dim=0
Tensor.view              Tensor.view_535          1 1 1115 1116 shape=(50,4,36) $input=1115
torch.permute            torch.permute_883        1 1 1108 1117 dims=(1,0,2) $input=1108
pnnx.Expression          pnnx_expr_126            1 1 1117 1118 expr=mul(@0,1.666667e-01)
torch.permute            torch.permute_884        1 1 1112 1119 dims=(1,0,2) $input=1112
torch.permute            torch.permute_886        1 1 1119 1120 dims=(0,2,1) $input=1119
torch.bmm                torch.bmm_566            2 1 1118 1120 1121 $input=1118 $mat2=1120
F.softmax                F.softmax_82             1 1 1121 1122 dim=-1 $input=1121
torch.permute            torch.permute_885        1 1 1116 1123 dims=(1,0,2) $input=1116
torch.bmm                torch.bmm_567            2 1 1122 1123 1124 $input=1122 $mat2=1123
torch.permute            torch.permute_887        1 1 1124 1125 dims=(1,0,2) $input=1124
Tensor.reshape           Tensor.reshape_99        1 1 1125 1126 shape=(-1,144) $input=1125
torch.unsqueeze          torch.unsqueeze_908      1 1 1126 1127 dim=1 $input=1126
nn.Linear                encoder.emformer_layers.15.attention.out_proj 1 1 1127 1128 bias=True in_features=144 out_features=144 @bias=(144)f32 @weight=(144,144)f32
torch.mean               torch.mean_759           1 1 1101 1129 dim=(0) keepdim=True $input=1101
pnnx.Expression          pnnx_expr_90             2 1 1099 1128 1130 expr=add(@0,@1)
torch.tensor_split       slice_189                1 2 1130 1131 1132 dim=0 indices=(2)
torch.cat                torch.cat_709            2 1 1132 1131 1133 dim=0
torch.permute            torch.permute_888        1 1 1133 1134 dims=(1,2,0) $input=1133
nn.Conv1d                encoder.emformer_layers.15.conv_module.pointwise_conv1 1 1 1134 1135 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=288 padding=(0) padding_mode=zeros stride=(1) @bias=(288)f32 @weight=(288,144,1)f32
F.glu                    F.glu_15                 1 1 1135 1136 dim=1 $input=1135
torch.cat                torch.cat_710            2 1 64 1136 1137 dim=2
nn.Conv1d                encoder.emformer_layers.15.conv_module.depthwise_conv 1 1 1137 1138 bias=True dilation=(1) groups=144 in_channels=144 kernel_size=(31) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,1,31)f32
pnnx.Expression          pnnx_expr_60             1 1 1138 1139 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_65             1 1 1139 1140 $input=1139
pnnx.Expression          pnnx_expr_59             2 1 1138 1140 1141 expr=mul(@0,@1)
nn.Conv1d                encoder.emformer_layers.15.conv_module.pointwise_conv2 1 1 1141 1142 bias=True dilation=(1) groups=1 in_channels=144 kernel_size=(1) out_channels=144 padding=(0) padding_mode=zeros stride=(1) @bias=(144)f32 @weight=(144,144,1)f32
torch.tensor_split       slice_191                1 2 1142 1143 1144 dim=2 indices=(8)
torch.permute            torch.permute_890        1 1 1144 1145 dims=(2,0,1) $input=1144
torch.permute            torch.permute_889        1 1 1143 1146 dims=(2,0,1) $input=1143
torch.cat                torch.cat_711            2 1 1145 1146 1147 dim=0
pnnx.Expression          pnnx_expr_24             2 1 1130 1147 1148 expr=add(@0,@1)
nn.Linear                encoder.emformer_layers.15.feed_forward.0 1 1 1148 1149 bias=True in_features=144 out_features=576 @bias=(576)f32 @weight=(576,144)f32
pnnx.Expression          pnnx_expr_21             1 1 1149 1150 expr=sub(@0,1.000000e+00)
F.sigmoid                F.sigmoid_66             1 1 1150 1151 $input=1150
pnnx.Expression          pnnx_expr_20             2 1 1149 1151 1152 expr=mul(@0,@1)
nn.Linear                encoder.emformer_layers.15.feed_forward.4 1 1 1152 1153 bias=True in_features=576 out_features=144 @bias=(144)f32 @weight=(144,576)f32
pnnx.Expression          pnnx_expr_18             2 1 1148 1153 1154 expr=add(@0,@1)
pnnx.Expression          pnnx_expr_13             1 1 1154 1155 expr=mul(@0,@0)
torch.mean               torch.mean_760           1 1 1155 1156 dim=(-1) keepdim=True $input=1155
pnnx.Expression          pnnx_expr_9              2 1 1154 1156 1157 expr=mul(@0,pow(add(@1,1.524804e+00),-5.000000e-01))
torch.cat                torch.cat_708            2 1 61 1129 1158 dim=0
torch.cat                torch.cat_699            2 1 57 1062 1159 dim=0
torch.cat                torch.cat_690            2 1 53 995 1160 dim=0
torch.cat                torch.cat_681            2 1 49 928 1161 dim=0
torch.cat                torch.cat_672            2 1 45 861 1162 dim=0
torch.cat                torch.cat_663            2 1 41 794 1163 dim=0
torch.cat                torch.cat_654            2 1 37 727 1164 dim=0
torch.cat                torch.cat_645            2 1 33 660 1165 dim=0
torch.cat                torch.cat_636            2 1 29 593 1166 dim=0
torch.cat                torch.cat_627            2 1 25 526 1167 dim=0
torch.cat                torch.cat_618            2 1 21 459 1168 dim=0
torch.cat                torch.cat_609            2 1 17 392 1169 dim=0
torch.cat                torch.cat_600            2 1 13 325 1170 dim=0
torch.cat                torch.cat_591            2 1 9 258 1171 dim=0
torch.cat                torch.cat_582            2 1 5 191 1172 dim=0
torch.cat                torch.cat_573            2 1 1 124 1173 dim=0
Tensor.slice             slice_289                1 1 1173 1174 dims=(0) ends=(2147483647) starts=(1) steps=(1) $input=1173
Tensor.slice             slice_287                1 1 106 1175 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=106
Tensor.slice             slice_288                1 1 1175 1176 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1175
Tensor.slice             slice_285                1 1 110 1177 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=110
Tensor.slice             slice_286                1 1 1177 1178 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1177
Tensor.slice             slice_284                1 1 132 1179 dims=(2) ends=(-2) starts=(-32) steps=(1) $input=132
Tensor.slice             slice_283                1 1 1172 1180 dims=(0) ends=(2147483647) starts=(1) steps=(1) $input=1172
Tensor.slice             slice_281                1 1 173 1181 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=173
Tensor.slice             slice_282                1 1 1181 1182 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1181
Tensor.slice             slice_279                1 1 177 1183 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=177
Tensor.slice             slice_280                1 1 1183 1184 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1183
Tensor.slice             slice_278                1 1 199 1185 dims=(2) ends=(-2) starts=(-32) steps=(1) $input=199
Tensor.slice             slice_277                1 1 1171 1186 dims=(0) ends=(2147483647) starts=(1) steps=(1) $input=1171
Tensor.slice             slice_275                1 1 240 1187 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=240
Tensor.slice             slice_276                1 1 1187 1188 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1187
Tensor.slice             slice_273                1 1 244 1189 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=244
Tensor.slice             slice_274                1 1 1189 1190 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1189
Tensor.slice             slice_272                1 1 266 1191 dims=(2) ends=(-2) starts=(-32) steps=(1) $input=266
Tensor.slice             slice_271                1 1 1170 1192 dims=(0) ends=(2147483647) starts=(1) steps=(1) $input=1170
Tensor.slice             slice_269                1 1 307 1193 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=307
Tensor.slice             slice_270                1 1 1193 1194 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1193
Tensor.slice             slice_267                1 1 311 1195 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=311
Tensor.slice             slice_268                1 1 1195 1196 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1195
Tensor.slice             slice_266                1 1 333 1197 dims=(2) ends=(-2) starts=(-32) steps=(1) $input=333
Tensor.slice             slice_265                1 1 1169 1198 dims=(0) ends=(2147483647) starts=(1) steps=(1) $input=1169
Tensor.slice             slice_263                1 1 374 1199 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=374
Tensor.slice             slice_264                1 1 1199 1200 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1199
Tensor.slice             slice_261                1 1 378 1201 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=378
Tensor.slice             slice_262                1 1 1201 1202 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1201
Tensor.slice             slice_260                1 1 400 1203 dims=(2) ends=(-2) starts=(-32) steps=(1) $input=400
Tensor.slice             slice_259                1 1 1168 1204 dims=(0) ends=(2147483647) starts=(1) steps=(1) $input=1168
Tensor.slice             slice_257                1 1 441 1205 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=441
Tensor.slice             slice_258                1 1 1205 1206 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1205
Tensor.slice             slice_255                1 1 445 1207 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=445
Tensor.slice             slice_256                1 1 1207 1208 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1207
Tensor.slice             slice_254                1 1 467 1209 dims=(2) ends=(-2) starts=(-32) steps=(1) $input=467
Tensor.slice             slice_253                1 1 1167 1210 dims=(0) ends=(2147483647) starts=(1) steps=(1) $input=1167
Tensor.slice             slice_251                1 1 508 1211 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=508
Tensor.slice             slice_252                1 1 1211 1212 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1211
Tensor.slice             slice_249                1 1 512 1213 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=512
Tensor.slice             slice_250                1 1 1213 1214 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1213
Tensor.slice             slice_248                1 1 534 1215 dims=(2) ends=(-2) starts=(-32) steps=(1) $input=534
Tensor.slice             slice_247                1 1 1166 1216 dims=(0) ends=(2147483647) starts=(1) steps=(1) $input=1166
Tensor.slice             slice_245                1 1 575 1217 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=575
Tensor.slice             slice_246                1 1 1217 1218 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1217
Tensor.slice             slice_243                1 1 579 1219 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=579
Tensor.slice             slice_244                1 1 1219 1220 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1219
Tensor.slice             slice_242                1 1 601 1221 dims=(2) ends=(-2) starts=(-32) steps=(1) $input=601
Tensor.slice             slice_241                1 1 1165 1222 dims=(0) ends=(2147483647) starts=(1) steps=(1) $input=1165
Tensor.slice             slice_239                1 1 642 1223 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=642
Tensor.slice             slice_240                1 1 1223 1224 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1223
Tensor.slice             slice_237                1 1 646 1225 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=646
Tensor.slice             slice_238                1 1 1225 1226 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1225
Tensor.slice             slice_236                1 1 668 1227 dims=(2) ends=(-2) starts=(-32) steps=(1) $input=668
Tensor.slice             slice_235                1 1 1164 1228 dims=(0) ends=(2147483647) starts=(1) steps=(1) $input=1164
Tensor.slice             slice_233                1 1 709 1229 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=709
Tensor.slice             slice_234                1 1 1229 1230 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1229
Tensor.slice             slice_231                1 1 713 1231 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=713
Tensor.slice             slice_232                1 1 1231 1232 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1231
Tensor.slice             slice_230                1 1 735 1233 dims=(2) ends=(-2) starts=(-32) steps=(1) $input=735
Tensor.slice             slice_229                1 1 1163 1234 dims=(0) ends=(2147483647) starts=(1) steps=(1) $input=1163
Tensor.slice             slice_227                1 1 776 1235 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=776
Tensor.slice             slice_228                1 1 1235 1236 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1235
Tensor.slice             slice_225                1 1 780 1237 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=780
Tensor.slice             slice_226                1 1 1237 1238 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1237
Tensor.slice             slice_224                1 1 802 1239 dims=(2) ends=(-2) starts=(-32) steps=(1) $input=802
Tensor.slice             slice_223                1 1 1162 1240 dims=(0) ends=(2147483647) starts=(1) steps=(1) $input=1162
Tensor.slice             slice_221                1 1 843 1241 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=843
Tensor.slice             slice_222                1 1 1241 1242 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1241
Tensor.slice             slice_219                1 1 847 1243 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=847
Tensor.slice             slice_220                1 1 1243 1244 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1243
Tensor.slice             slice_218                1 1 869 1245 dims=(2) ends=(-2) starts=(-32) steps=(1) $input=869
Tensor.slice             slice_217                1 1 1161 1246 dims=(0) ends=(2147483647) starts=(1) steps=(1) $input=1161
Tensor.slice             slice_215                1 1 910 1247 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=910
Tensor.slice             slice_216                1 1 1247 1248 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1247
Tensor.slice             slice_213                1 1 914 1249 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=914
Tensor.slice             slice_214                1 1 1249 1250 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1249
Tensor.slice             slice_212                1 1 936 1251 dims=(2) ends=(-2) starts=(-32) steps=(1) $input=936
Tensor.slice             slice_211                1 1 1160 1252 dims=(0) ends=(2147483647) starts=(1) steps=(1) $input=1160
Tensor.slice             slice_209                1 1 977 1253 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=977
Tensor.slice             slice_210                1 1 1253 1254 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1253
Tensor.slice             slice_207                1 1 981 1255 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=981
Tensor.slice             slice_208                1 1 1255 1256 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1255
Tensor.slice             slice_206                1 1 1003 1257 dims=(2) ends=(-2) starts=(-32) steps=(1) $input=1003
Tensor.slice             slice_205                1 1 1159 1258 dims=(0) ends=(2147483647) starts=(1) steps=(1) $input=1159
Tensor.slice             slice_203                1 1 1044 1259 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=1044
Tensor.slice             slice_204                1 1 1259 1260 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1259
Tensor.slice             slice_201                1 1 1048 1261 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=1048
Tensor.slice             slice_202                1 1 1261 1262 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1261
Tensor.slice             slice_200                1 1 1070 1263 dims=(2) ends=(-2) starts=(-32) steps=(1) $input=1070
Tensor.slice             slice_199                1 1 1158 1264 dims=(0) ends=(2147483647) starts=(1) steps=(1) $input=1158
Tensor.slice             slice_197                1 1 1111 1265 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=1111
Tensor.slice             slice_198                1 1 1265 1266 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1265
Tensor.slice             slice_195                1 1 1115 1267 dims=(0) ends=(2147483647) starts=(34) steps=(1) $input=1115
Tensor.slice             slice_196                1 1 1267 1268 dims=(0) ends=(2147483647) starts=(-8) steps=(1) $input=1267
Tensor.slice             slice_194                1 1 1137 1269 dims=(2) ends=(-2) starts=(-32) steps=(1) $input=1137
pnnx.Expression          pnnx_expr_0              64 1 1174 1176 1178 1179 1180 1182 1184 1185 1186 1188 1190 1191 1192 1194 1196 1197 1198 1200 1202 1203 1204 1206 1208 1209 1210 1212 1214 1215 1216 1218 1220 1221 1222 1224 1226 1227 1228 1230 1232 1233 1234 1236 1238 1239 1240 1242 1244 1245 1246 1248 1250 1251 1252 1254 1256 1257 1258 1260 1262 1263 1264 1266 1268 1269 1270 expr=[@0,@1,@2,@3,@4,@5,@6,@7,@8,@9,@10,@11,@12,@13,@14,@15,@16,@17,@18,@19,@20,@21,@22,@23,@24,@25,@26,@27,@28,@29,@30,@31,@32,@33,@34,@35,@36,@37,@38,@39,@40,@41,@42,@43,@44,@45,@46,@47,@48,@49,@50,@51,@52,@53,@54,@55,@56,@57,@58,@59,@60,@61,@62,@63]
Tensor.slice             slice_193                1 1 1157 1271 dims=(0) ends=(2147483647) starts=(2) steps=(1) $input=1157
torch.permute            torch.permute_891        1 1 1271 1272 dims=(1,0,2) $input=1271
prim::TupleConstruct     pnnx_4332                2 1 1272 1270 1273
pnnx.Output              pnnx_output_0            1 0 1273