TypicaAI/magbert-ner

Tags: Token Classification · Transformers · PyTorch · French · camembert · Inference Endpoints
Files and versions
magbert-ner (revision 777be16)
1 contributor · History: 5 commits
Latest commit by system (HF staff): "Update sentencepiece.bpe.model" (777be16, almost 4 years ago)
.gitattributes             345 Bytes    initial commit                    almost 4 years ago
config.json                1.64 kB     Update config.json                almost 4 years ago
sentencepiece.bpe.model    811 kB      Update sentencepiece.bpe.model    almost 4 years ago
special_tokens_map.json    210 Bytes   Update special_tokens_map.json    almost 4 years ago
tokenizer_config.json      24 Bytes    Update tokenizer_config.json      almost 4 years ago