shatabdi/twisent_twisent
Tags: Text Classification · Transformers · PyTorch · roberta · Generated from Trainer · Inference Endpoints
File: twisent_twisent/special_tokens_map.json (branch: main)
Commit History
add tokenizer (ab00a8d) · shatabdi committed on Jun 24, 2022