SecBERT

This is the pretrained model presented in SecBERT: A Pretrained Language Model for Cyber Security Text, which is a BERT model trained on cyber security text.

The training corpus was drawn from the following cyber security text sources:

  • APTnotes
  • Stucco-Data: Cyber security data sources
  • CASIE: Extracting Cybersecurity Event Information from Text
  • SemEval-2018 Task 8: Semantic Extraction from CybersecUrity REports using Natural Language Processing (SecureNLP)

SecBERT has its own WordPiece vocabulary (secvocab) built to best match the training corpus. We trained both SecBERT and SecRoBERTa versions.

Available models include:

  • SecBERT
  • SecRoBERTa
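
As a minimal sketch of how such a model would typically be loaded with the Hugging Face transformers library (the hub IDs "jackaduma/SecBERT" and "jackaduma/SecRoBERTa" are assumptions based on this repository's namespace):

```python
# Sketch: loading SecBERT for masked-token prediction via transformers.
# The model ID below is an assumption; substitute the actual hub ID if it differs.
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_id = "jackaduma/SecBERT"  # or "jackaduma/SecRoBERTa"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill a masked token in a security-flavored sentence.
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for pred in fill(f"The malware connects to a remote {tokenizer.mask_token} server."):
    print(pred["token_str"], round(pred["score"], 3))
```

Because the model ships its own domain vocabulary (secvocab), the tokenizer must be loaded from the same checkpoint as the model weights, as shown above.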

The original repo can be found here.