Converting to native Transformers (21) · #81 opened about 2 months ago by cyrilvallez
Adding the Open Portuguese LLM Leaderboard Evaluation Results · #79 opened about 2 months ago by leaderboard-pt-pr-bot
Independent evaluation results · #78 opened about 2 months ago by yaronr
Fix: AttributeError when `input_ids` is None during multimodal LLM training · #77 opened about 2 months ago by lyulumos
Deployed with vLLM, cannot use outlines for guided decoding (2) · #75 opened 3 months ago by wangm23456
Fix https://github.com/huggingface/peft/issues/1974#issue-2437471248 (1) · #74 opened 3 months ago by Finger-ebic
How to trigger function_call with the ollama-adapted glm4? (3) · #71 opened 3 months ago by NSSSJSS
Issue with Submitting Model to Open LLM Leaderboard · #68 opened 4 months ago by paperplanedeemo
Rag Prompt (1) · #67 opened 4 months ago by devve1
How to reproduce the evaluation results for the classic tasks shown on the model card? · #62 opened 4 months ago by deleted
How to use this model in llm-studio? (1) · #61 opened 4 months ago by wongvio
del gradient_checkpointing_enable() (1) · #60 opened 4 months ago by chandler88
What's the chat prompt in plain string? (3) · #59 opened 5 months ago by apepkuss79
Adapt to the new transformers version (https://github.com/huggingface/transformers/pull/31116) (2) · #58 opened 5 months ago by HibernantBear
Alternative quantizations. · #57 opened 5 months ago by ZeroWw
Harness Evaluation (2) · #56 opened 5 months ago by VityaVitalich
[AUTOMATED] Model Memory Requirements · #55 opened 5 months ago by model-sizer-bot
Resolved · #54 opened 5 months ago by zhongyi1997cn
Why is GLM3 better than GLM4 on the LVEval benchmark? (1) · #48 opened 5 months ago by AnaRhisT
Could you provide a tool-call template? · #42 opened 5 months ago by WateBear
Model license · #37 opened 5 months ago by Andrewzhu100
What does "open source" mean? Need info on source code, training data, fine-tuning data · #36 opened 6 months ago by markding
Function calling and image features do not work in lobechat · #35 opened 6 months ago by jackies
FIX transformers compat · #28 opened 6 months ago by Qubitium
Multiple/Parallel function call? (1) · #27 opened 6 months ago by Yhyu13
[ISSUE] forward() requires input_ids even if inputs_embeds is provided alternatively · #23 opened 6 months ago by x5fu
FlashAttention only supports Ampere GPUs or newer. (1) · #13 opened 6 months ago by GuokLIU
Why was the quantizer section removed? (4) · #10 opened 6 months ago by fukai
Error when running inference with trans: · #4 opened 6 months ago by shams123321
Has the prompt been changed? (2) · #1 opened 6 months ago by okcwang