
# paraphrase-xlm-r-multilingual-v1.gguf
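A GGUF conversion of the sentence-transformers model `paraphrase-xlm-r-multilingual-v1`, for use as an embedding model with llama.cpp. The snippet below loads both the original model and the GGUF file and checks that their embeddings agree (the cosine distance between them is on the order of 1e-4).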

```python
import torch
from llama_cpp import Llama
from sentence_transformers import SentenceTransformer
from scipy.spatial.distance import cosine

# Original sentence-transformers model, loaded in fp16 for comparison
model = SentenceTransformer(
    "paraphrase-xlm-r-multilingual-v1",
    model_kwargs={"torch_dtype": torch.float16},
)

# GGUF conversion, run through llama.cpp with embedding output enabled
llm = Llama.from_pretrained(
    "mykor/paraphrase-xlm-r-multilingual-v1.gguf",
    filename="paraphrase-xlm-r-multilingual-277M-v1-F16.gguf",
    embedding=True,
    verbose=False,
)

# Korean example sentence (the model is multilingual)
text = "움츠러든 어깨를 따라서 다시 저물어가는 오늘의 끝밤이 조용히 나를 안으면 무너져가는 날 잊어버릴 수 있어"
embed1 = model.encode(text)
embed2 = llm.embed(text)

# Cosine distance between the two embeddings; a value near zero means
# the GGUF conversion closely matches the original model.
print(cosine(embed1, embed2))
# 9.908465917951581e-05
```
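
Beyond the consistency check, the GGUF model can be used on its own for sentence similarity. A minimal sketch, assuming the same repo and filename as above; the `similarity` helper is just an illustration, not part of the model's API:

```python
import numpy as np
from llama_cpp import Llama

llm = Llama.from_pretrained(
    "mykor/paraphrase-xlm-r-multilingual-v1.gguf",
    filename="paraphrase-xlm-r-multilingual-277M-v1-F16.gguf",
    embedding=True,
    verbose=False,
)

def similarity(a: str, b: str) -> float:
    """Cosine similarity between the embeddings of two sentences (illustrative helper)."""
    va, vb = np.asarray(llm.embed(a)), np.asarray(llm.embed(b))
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

# Paraphrases should score high even across languages.
print(similarity("I love hiking in the mountains.", "산에서 하이킹하는 것을 좋아해요."))
```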
## Model details

- Format: GGUF
- Model size: 277M params
- Architecture: bert
- Quantizations: 8-bit, 16-bit (F16)

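Only the 16-bit filename appears in the snippet above; if the 8-bit file is needed, its exact filename can be looked up from the repo, for example with `huggingface_hub` (a small sketch; the repo id is the one used above):

```python
from huggingface_hub import list_repo_files

# Lists all files in the repo, including the quantized GGUF variants.
print(list_repo_files("mykor/paraphrase-xlm-r-multilingual-v1.gguf"))
```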