AI & ML interests

Relation distilled BERT: a lexical relation embedding model driven by pre-trained language models.



RelBERT provides high-quality semantic embeddings of word pairs, powered by a pre-trained language model. Install relbert via pip,

    pip install relbert

and play with RelBERT models.

    from relbert import RelBERT
    model = RelBERT('relbert/relbert-roberta-large')
    vector = model.get_embedding(['Tokyo', 'Japan'])  # shape of (1024,)
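Pair embeddings like the one above are typically compared with cosine similarity to score relational similarity (e.g. whether Tokyo:Japan relates like Paris:France). A minimal pure-Python sketch of that comparison; the vectors here are short toy stand-ins, not real RelBERT outputs, which are 1024-dimensional:

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: dot product over norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional vectors standing in for RelBERT pair embeddings
# of ('Tokyo', 'Japan') and ('Paris', 'France').
vec_tokyo_japan = [0.2, 0.8, 0.1, 0.4]
vec_paris_france = [0.25, 0.75, 0.15, 0.35]
score = cosine_similarity(vec_tokyo_japan, vec_paris_france)
```

A high score indicates the two word pairs hold a similar lexical relation, which is the basis for solving analogy questions with RelBERT.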

See more information below.