
pierluigic/xl-lexeme

This model is based on sentence-transformers: it maps a target word in a sentence to a 1024-dimensional dense vector space and can be used for tasks like clustering or semantic search.

Usage (WordTransformer)

Install the library:

git clone git@github.com:pierluigic/xl-lexeme.git
cd xl-lexeme
pip3 install .

Then you can use the model like this:

from WordTransformer import WordTransformer, InputExample

model = WordTransformer('pierluigic/xl-lexeme')
example = InputExample(texts="the quick fox jumps over the lazy dog", positions=[10, 13])  # positions are the character offsets of the target word
fox_embedding = model.encode(example)  # the embedding of the target word "fox"
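
Because the embeddings represent the target word in context, two occurrences of the same word can be compared directly, for example with cosine similarity. Below is a minimal sketch that builds only on the encode call shown above; the sentences, character offsets, and the scipy dependency are illustrative, not part of the library:

from scipy.spatial.distance import cosine

from WordTransformer import WordTransformer, InputExample

model = WordTransformer('pierluigic/xl-lexeme')

# Two occurrences of "bank"; positions give the character span of the target word.
ex1 = InputExample(texts="She sat on the bank of the river.", positions=[15, 19])
ex2 = InputExample(texts="He deposited the money at the bank.", positions=[30, 34])

emb1 = model.encode(ex1)
emb2 = model.encode(ex2)

# Cosine similarity: higher values suggest the two usages share the same sense.
similarity = 1 - cosine(emb1, emb2)
print(similarity)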

Training

The model was trained with the following parameters:

DataLoader:

torch.utils.data.dataloader.DataLoader of length 16531 with parameters:

{'batch_size': 16, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}

Loss:

sentence_transformers.losses.ContrastiveLoss.ContrastiveLoss with parameters:

{'distance_metric': 'SiameseDistanceMetric.COSINE_DISTANCE', 'margin': 0.5, 'size_average': True}

Parameters of the fit()-Method:

{
    "epochs": 10,
    "evaluation_steps": 4132,
    "evaluator": "sentence_transformers.evaluation.SequentialEvaluator.SequentialEvaluator",
    "max_grad_norm": 1,
    "optimizer_class": "<class 'transformers.optimization.AdamW'>",
    "optimizer_params": {
        "lr": 1e-05
    },
    "scheduler": "WarmupLinear",
    "steps_per_epoch": null,
    "warmup_steps": 16531.0,
    "weight_decay": 0.0
}
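
For orientation, the hyperparameters above map onto the standard sentence-transformers training loop roughly as sketched below. This is not the original training script: the base model, the training examples, and the dataloader construction are placeholders, and the released checkpoint uses a custom target-word model (SentenceTransformerTarget) rather than a plain SentenceTransformer.

from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, losses
from sentence_transformers.losses import SiameseDistanceMetric

# Placeholder base model; the actual model encodes the target word span.
model = SentenceTransformer("xlm-roberta-large")

# Placeholder: a list of labeled InputExample pairs (WiC-style positive/negative pairs).
train_examples = []
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=16)

# Contrastive loss with cosine distance and margin 0.5, as listed above.
train_loss = losses.ContrastiveLoss(
    model=model,
    distance_metric=SiameseDistanceMetric.COSINE_DISTANCE,
    margin=0.5,
    size_average=True,
)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=10,
    warmup_steps=16531,
    optimizer_params={"lr": 1e-05},
    weight_decay=0.0,
    max_grad_norm=1,
    scheduler="WarmupLinear",
)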

Full Model Architecture

SentenceTransformerTarget(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: XLMRobertaModel 
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)

Citing & Authors

@inproceedings{cassotti-etal-2023-xl,
    title = "{XL}-{LEXEME}: {W}i{C} Pretrained Model for Cross-Lingual {LEX}ical s{EM}antic chang{E}",
    author = "Cassotti, Pierluigi  and
      Siciliani, Lucia  and
      DeGemmis, Marco  and
      Semeraro, Giovanni  and
      Basile, Pierpaolo",
    booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)",
    month = jul,
    year = "2023",
    address = "Toronto, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.acl-short.135",
    pages = "1577--1585"
}