Tasks: Token Classification
Modalities: Text
Formats: parquet
Languages: Thai
Size: 100K - 1M
Tags: word-tokenization
License:
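The metadata above describes a Thai token-classification dataset distributed as parquet, which is the shape the Hugging Face `datasets` library loads directly. The sketch below is a minimal illustration only: the repository id and the `train` split are hypothetical placeholders, since the card excerpt here does not name the dataset or its splits.

```python
from datasets import load_dataset

# Hypothetical repository id; substitute the actual dataset name from this card.
dataset = load_dataset("user/thai-word-tokenization")

# The card lists a token-classification task over Thai text, so each record is
# expected to carry token-level fields; inspect the splits and the first example
# to see the real column names.
print(dataset)
print(dataset["train"][0])  # assumes a "train" split exists
```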
Commit History
Reorder split names (#1) (e25239f)
add dataset_info in dataset metadata (b3962be)
remove dummy data (b8aaeca, committed by mariosasko)