Barun (bapatra)
AI & ML interests
Natural Language Processing, Large Model Scaling, Alignment Research, Multimodality
bapatra's activity
Move flash_attn assert from __init__ into calling func · 4 comments · #32 opened 3 months ago by rogerxfeng8
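
This PR title describes deferring an availability check, and the AssertionError thread further down quotes the same message. A minimal sketch of the pattern, assuming a hypothetical `SelfAttention` class standing in for the real module:

```python
# Hypothetical sketch: check flash-attn availability at call time
# instead of in __init__, so the model object can still be
# constructed when the package is missing.
try:
    from flash_attn import flash_attn_func
    _FLASH_AVAILABLE = True
except ImportError:
    _FLASH_AVAILABLE = False

class SelfAttention:
    def __init__(self):
        # No assert here: construction succeeds without flash-attn.
        pass

    def forward(self, q, k, v):
        # The assert moves into the calling function, as the PR title suggests.
        assert _FLASH_AVAILABLE, "Flash Attention is not available"
        return flash_attn_func(q, k, v)
```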
auto_map in config.json doesn't contain Phi3SmallForSequenceClassification · 1 comment · #13 opened 6 months ago by kyeongpil
Add the classifiers to the auto_map · 1 comment · #76 opened 5 months ago by mrm196
Enabled the AutoModelForSequenceClassification in the auto_map · #22 opened 5 months ago by mrm196
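
The three auto_map threads above revolve around the same config.json mechanism. A minimal sketch of the kind of change involved, assuming the standard `trust_remote_code` convention (the module name `modeling_phi3_small` is an assumption about the repo layout, not confirmed here):

```python
# Hypothetical sketch: register the sequence-classification class in
# auto_map so AutoModelForSequenceClassification can resolve it with
# trust_remote_code=True. "modeling_phi3_small" is an assumed module name.
import json

with open("config.json") as f:
    config = json.load(f)

config.setdefault("auto_map", {})["AutoModelForSequenceClassification"] = \
    "modeling_phi3_small.Phi3SmallForSequenceClassification"

with open("config.json", "w") as f:
    json.dump(config, f, indent=2)
```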
Ensure the query_states and key_states remain in bf16 · 1 comment · #21 opened 5 months ago by mrm196
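
A minimal sketch of the dtype guard this PR title suggests (the helper name is hypothetical): flash-attn kernels accept only fp16/bf16, so anything silently upcast to fp32 has to be cast back before the attention call.

```python
# Hypothetical sketch: intermediate ops (e.g. RoPE applied in fp32 for
# precision) can promote activations, while flash-attn requires fp16/bf16.
import torch

def keep_bf16(query_states: torch.Tensor, key_states: torch.Tensor):
    # Restore bf16 if an upstream op promoted the tensors to fp32.
    target = torch.bfloat16
    if query_states.dtype != target:
        query_states = query_states.to(target)
    if key_states.dtype != target:
        key_states = key_states.to(target)
    return query_states, key_states
```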
Keep getting AssertionError: Flash Attention is not available when loading the model · 1 comment · #7 opened 6 months ago by Complete-your-profile
Phi-3-small crashing error · 3 comments · #12 opened 6 months ago by aravindpai
Crash in fine-tuning · 4 comments · #14 opened 6 months ago by tanliboy
How should data be packed? · 2 comments · #16 opened 6 months ago by shiyue
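
For the packing question, one common recipe (an assumption about what the thread asks, not the repo's documented answer) is to concatenate tokenized examples with an EOS separator and slice padding-free fixed-length blocks:

```python
# Naive sequence-packing sketch for causal-LM fine-tuning: join all
# examples with an EOS separator, then cut fixed-size blocks.
def pack_examples(token_lists, block_size, eos_id):
    flat = []
    for tokens in token_lists:
        flat.extend(tokens)
        flat.append(eos_id)  # mark document boundaries
    return [
        flat[i:i + block_size]
        for i in range(0, len(flat) - block_size + 1, block_size)
    ]

# Example: two short "documents" packed into blocks of 4 tokens.
print(pack_examples([[1, 2, 3], [4, 5]], block_size=4, eos_id=0))
# -> [[1, 2, 3, 0]] (the trailing partial block [4, 5, 0] is dropped)
```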
What pad token should I use for fine-tuning? · 1 comment · #10 opened 6 months ago by faizsameerahmed96
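
On the pad-token question, a common workaround (an assumption, not an answer confirmed in the thread) is to reuse EOS as PAD, which avoids resizing the embedding matrix. The checkpoint id below is an assumption about which Phi-3-small variant is meant:

```python
# Common workaround sketch: causal LMs often ship without a pad token.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "microsoft/Phi-3-small-8k-instruct", trust_remote_code=True
)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # reuse EOS as PAD
```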
Shared memory error · 9 comments · #15 opened 6 months ago by marktenenholtz
Update tokenization_phi3_small.py · 1 comment · #18 opened 6 months ago by damajercakms
Update tokenization_phi3_small.py · 1 comment · #14 opened 6 months ago by damajercakms
RuntimeError: FlashAttention only support fp16 and bf16 data type during fine-tuning · 7 comments · #11 opened 6 months ago by faizsameerahmed96
Where can we download Phi-3-small? · 1 comment · #11 opened 6 months ago by sebastienbo
Why a different architecture from mini and medium? · 5 comments · #5 opened 6 months ago by winddude
target_modules for this Phi-3-small model · 10 comments · #3 opened 6 months ago by hackint0sh
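
The target_modules thread likely concerns PEFT/LoRA configuration. A minimal sketch under that assumption; the layer names below are guesses (fused-QKV naming), so the real ones should be read off `model.named_modules()`:

```python
# Hypothetical LoRA config sketch: target_modules must match the
# model's actual linear-layer names, which vary by architecture.
from peft import LoraConfig

lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["query_key_value", "dense"],  # assumed names
    task_type="CAUSAL_LM",
)
```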
Flash Attention error during inference · 5 comments · #7 opened 6 months ago by hackint0sh
Is it possible that this is a small version of GPT-3.5? · 1 comment · #6 opened 6 months ago by Trangle