---
pipeline_tag: sentence-similarity
language:
- de
tags:
- sentence-transformers
- sentence-similarity
- transformers
- setfit
- onnx
license: mit
datasets:
- deutsche-telekom/ger-backtrans-paraphrase
---

# German BERT large paraphrase cosine

This is a [sentence-transformers](https://www.SBERT.net) model. It maps sentences and paragraphs (text) to a 1024-dimensional dense vector space. The model is intended to be used together with [SetFit](https://github.com/huggingface/setfit) to improve German few-shot text classification.

This is the ONNX version of [deutsche-telekom/gbert-large-paraphrase-cosine](https://huggingface.co/deutsche-telekom/gbert-large-paraphrase-cosine).

## Licensing

Copyright (c) 2023 [Philip May](https://may.la/), [Deutsche Telekom AG](https://www.telekom.com/)\
Copyright (c) 2022 [deepset GmbH](https://www.deepset.ai/)

Licensed under the **MIT License** (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License by reviewing the file [LICENSE](https://huggingface.co/deutsche-telekom/gbert-large-paraphrase-cosine/blob/main/LICENSE) in the repository.
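
## Usage sketch

The card does not include a usage snippet, so the following is a minimal sketch of how such a sentence-transformers model is typically loaded and used for cosine similarity. It assumes sentence-transformers >= 3.2 (for the optional `backend="onnx"` argument) and uses the linked base model id as a placeholder; substitute the repository id of this ONNX variant.

```python
# Minimal sketch (not from the model card): embed German sentences and
# compare them by cosine similarity.
from sentence_transformers import SentenceTransformer, util

# Placeholder id: replace with the repository id of this ONNX variant.
model = SentenceTransformer(
    "deutsche-telekom/gbert-large-paraphrase-cosine",
    backend="onnx",  # requires the onnx extra: pip install sentence-transformers[onnx]
)

sentences = [
    "Das ist ein Beispielsatz.",
    "Dies ist ein Beispiel für einen Satz.",
]

embeddings = model.encode(sentences)  # shape: (2, 1024)
similarity = util.cos_sim(embeddings[0], embeddings[1])
print(similarity.item())
```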