Tags: Text Generation · Transformers · PyTorch · Safetensors · Korean · gpt_neox · text-generation-inference · Inference Endpoints
polyglot-ko-3.8b-chat / special_tokens_map.json

Commit History

629e5d4 · Upload tokenizer · heegyu committed on
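
For reference, a minimal sketch of loading this tokenizer (including the special_tokens_map.json above) from the Hub with Transformers; the repo id heegyu/polyglot-ko-3.8b-chat is an assumption inferred from the file path and committer.

```python
from transformers import AutoTokenizer

# Assumed repo id, based on the file path "polyglot-ko-3.8b-chat / special_tokens_map.json"
# and the committer "heegyu".
repo_id = "heegyu/polyglot-ko-3.8b-chat"

# Download the tokenizer files (tokenizer config, vocab, special_tokens_map.json) from the Hub.
tokenizer = AutoTokenizer.from_pretrained(repo_id)

# Inspect the special tokens defined in special_tokens_map.json.
print(tokenizer.special_tokens_map)
```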