
# CYBERT

A BERT-style model dedicated to the cyber security domain. The model has been trained on a corpus of high-quality cyber security and computer science text and is unlikely to work well outside this domain.
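A minimal usage sketch (not taken from the card itself): loading the model with the Hugging Face transformers library. The model ID comes from this page; since the card does not state a downstream task head, the base encoder is loaded here and the example sentence is purely illustrative.

```python
from transformers import AutoTokenizer, AutoModel

# Load the tokenizer and base encoder published on the Hub.
tokenizer = AutoTokenizer.from_pretrained("SynamicTechnologies/CYBERT")
model = AutoModel.from_pretrained("SynamicTechnologies/CYBERT")

# Encode a cyber-security sentence and obtain contextual embeddings.
inputs = tokenizer("The malware exfiltrates credentials over DNS.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```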

## Model architecture

The model uses the original RoBERTa architecture, and the training corpus was tokenized with a byte-level BPE tokenizer.
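Below is a hedged sketch of how such a setup is typically configured: a byte-level BPE tokenizer trained on the corpus, paired with a standard RoBERTa configuration for masked-language-model pretraining. The file paths, vocabulary size, and all other hyperparameters are illustrative assumptions, not values stated in this card.

```python
from tokenizers import ByteLevelBPETokenizer
from transformers import RobertaConfig, RobertaForMaskedLM, RobertaTokenizerFast

# Train a byte-level BPE tokenizer on a (hypothetical) corpus file.
tokenizer = ByteLevelBPETokenizer()
tokenizer.train(
    files=["cyber_corpus.txt"],          # placeholder path, not from the card
    vocab_size=50_265,                   # assumed RoBERTa-like vocabulary size
    min_frequency=2,
    special_tokens=["<s>", "<pad>", "</s>", "<unk>", "<mask>"],
)
tokenizer.save_model("cybert-tokenizer")

# Wrap the trained tokenizer for transformers and build a RoBERTa model from scratch.
hf_tokenizer = RobertaTokenizerFast.from_pretrained("cybert-tokenizer")
config = RobertaConfig(vocab_size=hf_tokenizer.vocab_size)
model = RobertaForMaskedLM(config)
print(model.num_parameters())
```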

## Hardware

The model was trained on an NVIDIA GPU (driver version 510.54, as reported by nvidia-smi).
