|
--- |
|
license: apache-2.0 |
|
--- |
|
|
|
For convenience, this single repository collects the smaller LoRA adapters I've extracted from various Qwen2.5 14B models:
|
|
|
- [CultriX/Qwen2.5-14B-Wernicke](https://huggingface.co/sometimesanotion/Qwen-2.5-14B-LoRAs/tree/main/CultriX/Qwen2.5-14B-Wernicke/Qwen/Qwen2.5-14B-Instruct-rank128) - Base Qwen2.5-14B-Instruct, rank 128. |
|
- [huihui-ai/Qwen2.5-Coder-14B-Instruct-abliterated](https://huggingface.co/sometimesanotion/Qwen-2.5-14B-LoRAs/tree/main/huihui-ai/Qwen2.5-Coder-14B-Instruct-abliterated/Qwen/Qwen2.5-Coder-14B-Instruct-rank32) - Base Qwen2.5-Coder-14B-Instruct, rank 32. |
|
- [tanliboy/lambda-qwen2.5-14b-dpo-test](https://huggingface.co/sometimesanotion/Qwen-2.5-14B-LoRAs/tree/main/tanliboy/lambda-qwen2.5-14b-dpo-test/Qwen/Qwen2.5-14B-Instruct-rank128) - Base Qwen2.5-14B-Instruct, rank 128. |
|
- [tanliboy/lambda-qwen2.5-14b-dpo-test](https://huggingface.co/sometimesanotion/Qwen-2.5-14B-LoRAs/tree/main/tanliboy/lambda-qwen2.5-14b-dpo-test/Qwen/Qwen2.5-14B-Instruct-rank32) - Base Qwen2.5-14B-Instruct, rank 32. |
|
|
|
This repo's file structure follows this naming scheme: `author/model name/base model author/base model name-rank##`
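As a sketch of how that scheme maps to an adapter path, the helper below (a hypothetical convenience function, not part of any library) builds the subfolder string for one of the adapters listed above; the commented lines show how such a path could then be passed to PEFT's `subfolder` argument, assuming you have `transformers` and `peft` installed (not run here, since it downloads 14B-scale weights):

```python
# Build the subfolder path for an adapter in this repo, following the
# naming scheme: author/model name/base model author/base model name-rank##
def adapter_subfolder(author: str, model: str, base_author: str, base_model: str, rank: int) -> str:
    return f"{author}/{model}/{base_author}/{base_model}-rank{rank}"

# e.g. the first adapter listed above:
path = adapter_subfolder("CultriX", "Qwen2.5-14B-Wernicke", "Qwen", "Qwen2.5-14B-Instruct", 128)
print(path)  # CultriX/Qwen2.5-14B-Wernicke/Qwen/Qwen2.5-14B-Instruct-rank128

# Applying an adapter to its base model with PEFT (illustrative only):
#
#   from transformers import AutoModelForCausalLM
#   from peft import PeftModel
#
#   base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-14B-Instruct")
#   model = PeftModel.from_pretrained(
#       base,
#       "sometimesanotion/Qwen-2.5-14B-LoRAs",
#       subfolder=path,
#   )
```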
|
|