---
language: en
license: cc-by-nc-4.0
library_name: sentence-transformers
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
datasets:
- beeformer/recsys-movielens-20m
pipeline_tag: sentence-similarity
---

# goodbooks-mpnet-base-v2

This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and is designed for use in recommender systems, both for content-based filtering and as side information for cold-start recommendation.

## Usage (Sentence-Transformers)

Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer

sentences = ["This is an example product description", "Each product description is converted"]

model = SentenceTransformer('beeformer/goodbooks-mpnet-base-v2')
embeddings = model.encode(sentences)
print(embeddings)
```
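For the sentence-similarity use case, the embeddings can be scored directly with cosine similarity, e.g. to rank catalogue items against the description of a new, cold-start item. The sketch below is a minimal illustration using `sentence_transformers.util.cos_sim`; the book descriptions are made-up placeholders, not items from the training data:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer('beeformer/goodbooks-mpnet-base-v2')

# Hypothetical catalogue descriptions (placeholders for illustration only).
catalogue = [
    "A young wizard discovers his heritage and attends a school of magic.",
    "A detective investigates a series of murders in a small coastal town.",
    "An epic quest across a fantasy world to destroy a powerful ring.",
]

# A new (cold-start) item that has no interaction history yet.
new_item = "A boy learns he is a wizard and is sent to a hidden school of sorcery."

# Encode everything into the shared 768-dimensional space.
catalogue_emb = model.encode(catalogue, convert_to_tensor=True)
new_emb = model.encode(new_item, convert_to_tensor=True)

# Rank catalogue items by cosine similarity to the new description.
scores = util.cos_sim(new_emb, catalogue_emb)[0]
for desc, score in sorted(zip(catalogue, scores.tolist()), key=lambda p: -p[1]):
    print(f"{score:.3f}  {desc}")
```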
## Training procedure

### Pre-training

We use the pretrained [`sentence-transformers/all-mpnet-base-v2`](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) model. Please refer to its model card for more detailed information about the pre-training procedure.

### Fine-tuning

We use the initial model without modifying its architecture or its pre-trained parameters. However, we reduce the processed sequence length to 384 to reduce the training time of the model. For the other hyperparameters, we use the same interaction data batch size of 1024 and negative sampling parameter m = 7500; we use a constant learning rate of 1e-5 and train the model for five epochs.

We fine-tuned the model on the Goodbooks-10k dataset. For details, please see our paper (link TBA). For the item ids used during training, please see (links TBA).
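At inference time, the sequence length is exposed by sentence-transformers as the `max_seq_length` attribute of the model. A minimal sketch, assuming you want to enforce the 384-token cap from the training setup explicitly (inputs longer than this are truncated):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer('beeformer/goodbooks-mpnet-base-v2')

# Cap tokenized inputs at 384 tokens, matching the fine-tuning setup above;
# longer descriptions are truncated before encoding.
model.max_seq_length = 384

embeddings = model.encode(["A long product description ..."])
print(embeddings.shape)  # (1, 768)
```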
## Evaluation Results

For the ids of the items used for cold-start evaluation, please see (links TBA). A table with results is TBA.

## Intended uses

This model was trained as a demonstration of the capabilities of the beeFormer training framework (link and details TBA) and is intended for research purposes only.

## Citation

TBA