English | 简体中文

GitHub | WeChat

We open-source our Aquila2 series: the base language models Aquila2-7B and Aquila2-34B, the chat models AquilaChat2-7B and AquilaChat2-34B, and the long-context chat models AquilaChat2-7B-16K and AquilaChat2-34B-16K.

2023.10.25 🔥 AquilaChat2-34B-16K v1.2 is an updated version of AquilaChat2-34B-16K. Compared with v1, it significantly improves long-text synthesis, approaching the level of GPT-3.5-16K. The v1.2 release also incorporates more conventional instruction fine-tuning corpora, which improves its performance in non-long-text scenarios as well.

Additional details of the Aquila2 models will be presented in the official technical report. Please stay tuned for updates on our official channels.

Quick Start: AquilaChat2-34B-16K (Chat Model)

1. Inference

from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig
import torch

device = torch.device("cuda:0")
model_info = "BAAI/AquilaChat2-34B-16k"
tokenizer = AutoTokenizer.from_pretrained(model_info, trust_remote_code=True)

# Optional 4-bit quantization config (NF4 with double quantization); pass it to
# from_pretrained below to reduce GPU memory usage.
quantization_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    model_info,
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
    # quantization_config=quantization_config,  # Uncomment this line for 4-bit quantization
)
model.eval()
model.to(device)

text = "Please give 10 reasons to visit Beijing."  # original prompt: "请给出10个要到北京旅游的理由。"

# predict.py ships with the model repository and applies the Aquila chat prompt template.
from predict import predict

out = predict(model, text, tokenizer=tokenizer, max_gen_len=200, top_p=0.9,
              seed=123, topk=15, temperature=1.0, sft=True, device=device,
              model_name="AquilaChat2-34B-16K")
print(out)
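
If the repository's predict.py helper is not available, the following is a minimal sketch using the standard transformers generation API with the same sampling parameters as above. Note that this bypasses the chat prompt template that predict applies, so outputs may differ from the intended chat behavior.

# Minimal sketch via the standard transformers generation API.
# Assumption: a plain-text prompt without Aquila's chat template.
inputs = tokenizer(text, return_tensors="pt").to(device)
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=200,
        do_sample=True,
        top_p=0.9,
        top_k=15,
        temperature=1.0,
    )
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))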

License

The Aquila2 series of open-source models is licensed under the BAAI Aquila Model Licence Agreement.

Citation

Feel free to cite this repository if you find Aquila2 useful.

@misc{zhang2024aquila2technicalreport,
      title={Aquila2 Technical Report}, 
      author={Bo-Wen Zhang and Liangdong Wang and Jijie Li and Shuhao Gu and Xinya Wu and Zhengduo Zhang and Boyan Gao and Yulong Ao and Guang Liu},
      year={2024},
      eprint={2408.07410},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2408.07410}, 
}