
# Model Card for NickyNicky/gemma-1.1-2b-it_orpo_traslate_en_es_V1


  • p95 prompt length: 620
  • p95 prompt + chosen length: 1278
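These figures can be reproduced from the training data; below is a minimal sketch, assuming they are Gemma token counts over the dataset's `prompt` and `chosen` columns (the column names and tokenizer choice are assumptions, not stated in the card):

```python
# Hypothetical sketch: recompute the p95 lengths as Gemma token counts.
import numpy as np
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/gemma-1.1-2b-it")
ds = load_dataset("NickyNicky/Colossal_Translation_EN_ES_ORPO_DPO_Gemma", split="train")

prompt_lens = [len(tokenizer(p)["input_ids"]) for p in ds["prompt"]]
full_lens = [
    len(tokenizer(p + c)["input_ids"]) for p, c in zip(ds["prompt"], ds["chosen"])
]

print("p95 prompt length:", int(np.percentile(prompt_lens, 95)))
print("p95 prompt + chosen length:", int(np.percentile(full_lens, 95)))
```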
## Gemma template

<bos><start_of_turn>system
You are an AI that translates from English to Spanish.<end_of_turn>
<start_of_turn>user
516
Total views
6-month low Ethereum fees suggest altseason is inbound: Santiment NEWS
Own this piece of crypto history

Collect this article as NFT
Join us on social networks


Gas fees on the Ethereum network have plunged to a six-month low even as the price of Ether
ETH

tickers down
$3,260

 rallied slightly over the weekend, which analysts from crypto analytics platform Santiment say could signal an upcoming altcoin rally.

On April 27, the average fee for an Ethereum translation fell as low as $1.12, according to an April 28 X post from Santiment.

“Traders historically move between sentimental cycles of feeling that crypto is going ‘To the Moon’ or feeling that ‘It Is Dead’, which can be observed through transaction fees,” wrote Santiment.
<end_of_turn>
<start_of_turn>model
516
Visitas totales
Las bajas tasas de Ethereum sugerirían que la temporada de altseason está a punto de llegar: Santiment NEWS
¡Posee este artículo como NFT
Únete a nosotros en redes sociales


Las tarifas de gas en la red de Ethereum han descendido a un mínimo de seis meses incluso mientras el precio de Ether
ETH

descendió ligeramente durante el fin de semana, lo que los analistas de la plataforma de análisis de criptomonedas Santiment dicen podría señalar un posible aumento de altco.

El 27 de abril, la tarifa promedio para una traducción de Ethereum cayó a un mínimo de $1.12, según un post de April 28 de Santiment.

“Los comerciantes han históricomente moverse entre ciclos sentimentales de sentir que el cripto está ‘A la Luna’ o sentir que ‘Está muerto’, lo cual se puede observar a través de tarifas de transacciones,” escribió Santiment.
<end_of_turn>
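The inference snippet below consumes a `prompt` string in this format. A minimal sketch of how such a prompt can be assembled (the English text and variable names here are illustrative, not taken from the card):

```python
# Illustrative sketch: assemble a prompt using the Gemma turn format shown above.
system_message = "You are an AI that translates from English to Spanish."
english_text = "Gas fees on the Ethereum network have plunged to a six-month low."

prompt = (
    "<bos><start_of_turn>system\n"
    f"{system_message}<end_of_turn>\n"
    "<start_of_turn>user\n"
    f"{english_text}<end_of_turn>\n"
    "<start_of_turn>model\n"
)
```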
from transformers import GenerationConfig

# Tokenize the prompt built from the Gemma template above; the template string
# already contains <bos> and the turn markers, so no extra special tokens are added.
inputs = tokenizer(prompt,
                   return_tensors="pt",
                   add_special_tokens=False).to(model.device)

max_new_tokens = 1000
generation_config = GenerationConfig(
    max_new_tokens=max_new_tokens,
    temperature=0.3,
    # top_p=0.55,
    top_k=50,
    repetition_penalty=1.0,
    do_sample=True,
)

# Generate the translation and decode it, keeping the special tokens visible.
outputs = model.generate(**inputs,
                         generation_config=generation_config,
                         stopping_criteria=stopping_criteria_list,
                         )
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
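The call above passes a `stopping_criteria_list` that is not defined in the card. A minimal sketch of one way it could be built, assuming the standard `transformers` `StoppingCriteria` interface and stopping once the model emits `<end_of_turn>` (this must run before `model.generate`):

```python
# Hypothetical definition of stopping_criteria_list: halt generation as soon as
# the last generated token is <end_of_turn> or EOS.
from transformers import StoppingCriteria, StoppingCriteriaList

class StopOnTokens(StoppingCriteria):
    def __init__(self, stop_token_ids):
        self.stop_token_ids = set(stop_token_ids)

    def __call__(self, input_ids, scores, **kwargs):
        # input_ids has shape (batch, sequence); inspect the last token only.
        return input_ids[0, -1].item() in self.stop_token_ids

end_of_turn_id = tokenizer.convert_tokens_to_ids("<end_of_turn>")
stopping_criteria_list = StoppingCriteriaList(
    [StopOnTokens([end_of_turn_id, tokenizer.eos_token_id])]
)
```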
Training dataset (49,252 examples):
https://huggingface.co/datasets/NickyNicky/Colossal_Translation_EN_ES_ORPO_DPO_Gemma

Original dataset:
https://huggingface.co/datasets/Iker/Colossal-Instruction-Translation-EN-ES
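A quick way to load and inspect the training split (column names are not documented in the card, so check the keys rather than assuming them):

```python
# Load the ORPO translation dataset and look at its size and columns.
from datasets import load_dataset

ds = load_dataset("NickyNicky/Colossal_Translation_EN_ES_ORPO_DPO_Gemma", split="train")
print(len(ds))       # expected: 49252 examples
print(ds[0].keys())  # inspect the available columns before training
```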
Model size: 2.51B parameters (Safetensors, BF16).