shatabdi/twisent_sieBert
Tags: Text Classification · Transformers · PyTorch · roberta · Generated from Trainer · Inference Endpoints
Branch: refs/pr/1
twisent_sieBert / special_tokens_map.json
Commit History
add tokenizer · 5b13add · shatabdi committed on Jun 24, 2022
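For context, `special_tokens_map.json` is the file where a Hugging Face tokenizer records its special tokens. The sketch below shows the default special-token map for a RoBERTa-based tokenizer and round-trips it through JSON the way the tokenizer save/load path does; these values are the standard RoBERTa defaults, assumed for illustration, and the actual file added in this commit may differ.

```python
import json
import os
import tempfile

# Assumed contents: the standard special-token map for a RoBERTa tokenizer.
# The real special_tokens_map.json in this repo may differ.
special_tokens_map = {
    "bos_token": "<s>",
    "eos_token": "</s>",
    "unk_token": "<unk>",
    "sep_token": "</s>",
    "pad_token": "<pad>",
    "cls_token": "<s>",
    "mask_token": "<mask>",
}

# Write the map to disk, as tokenizer.save_pretrained() would.
path = os.path.join(tempfile.mkdtemp(), "special_tokens_map.json")
with open(path, "w", encoding="utf-8") as f:
    json.dump(special_tokens_map, f, indent=2)

# Read it back, as from_pretrained() would when loading the tokenizer.
with open(path, encoding="utf-8") as f:
    loaded = json.load(f)

print(loaded["mask_token"])  # → <mask>
```

Note that in RoBERTa's convention the classifier token `<s>` doubles as the beginning-of-sequence token, and `</s>` serves as both separator and end-of-sequence token.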