---
license: llama2
---

### Description

This is a translation model that leverages the high Japanese proficiency of Swallow-13b-hf, focused primarily on English-to-Japanese translation and, more generally, on translating other languages into Japanese. The base model, [tokyotech-llm/Swallow-13b-hf](https://huggingface.co/tokyotech-llm/Swallow-13b-hf), has been fine-tuned with a 4K context and is mainly aimed at translating relatively long texts, from around 100 tokens up to several thousand tokens. While its core strength lies in English-to-Japanese translation, it also partially supports other source languages. (Multilingual translation and long-context translation become unstable when the model is quantized.)

### Prompt

An XML-like instruction template has been adopted.

---

### Overview

This is a translation model that makes use of the strong Japanese ability of Swallow-13b-hf.

[tokyotech-llm/Swallow-13b-hf](https://huggingface.co/tokyotech-llm/Swallow-13b-hf)

It has been fine-tuned mainly for English-to-Japanese translation and supports texts of up to several thousand tokens. Translation from languages other than English into Japanese is also partially supported.

### Prompt

An instruction format based on XML-like tags is used.

## Usage

### Prompt format: English to Japanese (main function)

```
<english>: {} </english>

<japanese>: {}
```

### Prompt format: Other language to Japanese (experimental)

```
<english>: {} </english>

<japanese>: {}
```

### Prompt format: Japanese to English

```
not supported
```
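As a quick illustration of the English-to-Japanese template above, the prompt string can be assembled with a small helper like the following (a minimal sketch; `build_translation_prompt` is a hypothetical name, not part of this repository):

```
def build_translation_prompt(source_text: str) -> str:
    # Hypothetical helper: wrap the source text in the XML-like tags shown above
    # and leave the <japanese> tag open so the model completes it with the translation.
    return f"<english>: {source_text} </english>\n\n<japanese>:"

print(build_translation_prompt("Machine translation has improved rapidly in recent years."))
```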
For long texts, we recommend using a TextStreamer.

```
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer

model_name = "aixsatoshi/Honyaku-13b"

# Load the model in bfloat16 and place it on the available device(s)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Define the streamer
streamer = TextStreamer(tokenizer)

# Define the English prompt
english_prompt = """
In an era marked by rapid globalization, the intricate interplay between international law, economic policies, and political dynamics has become increasingly complex. Legal frameworks, once confined within national borders, now stretch across continents, necessitating a nuanced understanding of transnational legislation and treaties. As multinational corporations navigate the labyrinthine maze of global markets, economic theories that underpin currency fluctuations, trade imbalances, and fiscal policies are more pertinent than ever. Central to these economic considerations is the concept of market equilibrium, a delicate balance affected by myriad factors including consumer behavior, governmental regulations, and global crises.

Politically, the landscape is equally labyrinthine. Ideological shifts and the resurgence of nationalism have reshaped diplomatic relations, with international agreements and alliances being tested under the strain of geopolitical tensions. The role of supranational entities like the United Nations and the European Union in mediating these conflicts is of paramount importance, as is the need for diplomatic finesse in an increasingly multipolar world. Furthermore, the intersection of politics and economics is evident in the debate over economic sanctions and their efficacy in swaying political decisions.

In this context, understanding the subtleties of rhetoric used in political discourse, and how it interweaves with legal jargon and economic terminology, is crucial. For instance, the rhetoric surrounding fiscal austerity measures often intertwines with legal discourse on budgetary legislation and economic debates on inflation control. Similarly, discussions on constitutional amendments are frequently laden with political undertones, reflecting broader societal issues and ideological divides.

This convergence of legal, economic, and political vernacular presents a unique challenge for machine translation systems, demanding not only linguistic accuracy but also a deep comprehension of the nuanced interplay of these disciplines.
"""

# Prepare the prompt for English to Japanese translation
prompt = f"<english>: {english_prompt} </english>\n\n<japanese>:"

# Tokenize the input text and move it to the CUDA device
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")

# Generate the output, streaming tokens to stdout as they are produced
output = model.generate(
    **inputs,
    max_new_tokens=4096,
    do_sample=True,
    top_k=20,
    top_p=0.95,
    streamer=streamer,
)
```
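If you need the translation as a Python string rather than streamed console output, one possible follow-up step (continuing from the variables defined in the example above; the closing `</japanese>` tag is an assumption about the model's output and may not appear) is to decode only the newly generated tokens:

```
# Minimal post-processing sketch, reusing tokenizer, inputs, and output from above.
prompt_length = inputs["input_ids"].shape[1]   # number of prompt tokens
generated_ids = output[0][prompt_length:]      # keep only the newly generated tokens
translation = tokenizer.decode(generated_ids, skip_special_tokens=True)
# Trim a closing </japanese> tag if the model happens to emit one (assumption).
translation = translation.split("</japanese>")[0].strip()
print(translation)
```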