Datasets: best2009
Tasks: Token Classification
Modalities: Text
Formats: parquet
Languages: Thai
Size: 100K - 1M
Tags: word-tokenization
License:
mariosasko committed
Commit b8aaeca • 1 Parent(s): a1deac4

remove dummmy data
dummy/best2009/1.0.0/dummy_data.zip DELETED
@@ -1,3 +0,0 @@
-version https://git-lfs.github.com/spec/v1
-oid sha256:76493fafb238c2d7cf264354a31380029d69f66710c6385e246397d7b90688e1
-size 18586
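
The deleted file is a Git LFS pointer (version, oid, size) for a dummy_data.zip archive of the kind previously used to test dataset loading scripts; with the data now published as parquet, the dataset can be read directly from its data files. Below is a minimal loading sketch, assuming the dataset ID is "best2009" (taken from the deleted path) and that a "train" split exists; the printed record is only for inspection.

# Minimal loading sketch. Assumptions: the dataset ID is "best2009"
# (taken from the deleted path) and a "train" split exists.
from datasets import load_dataset

# The data is served as parquet, so no dummy_data.zip test fixture is
# needed to read it.
ds = load_dataset("best2009", split="train")

# The card metadata describes token classification for Thai word
# tokenization, so each record should carry text units with boundary
# labels; print one record to inspect the actual fields.
print(ds[0])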