sha | text | id | tags | created_at | metadata | last_modified
---|---|---|---|---|---|---|
6fef3d2cfc13f70cb2a4dcfda0485c202db55e39 |
Not a serious or usable dataset; I just used the https://github.com/Pleias/marginalia library and edited this [notebook](https://colab.research.google.com/drive/1dbliAbWRoidQgMqFPq4Rs8qooJ4S0Ye1?usp=sharing) to check the quality of the selections in the jondurbin/truthy-dpo-v0.1 dataset | eren23/truthy-dpo-v0.1-pruning-test-result | [
"region:us"
] | 2024-02-17T15:57:03+00:00 | {"dataset_info": {"features": [{"name": "original_source", "dtype": "string"}, {"name": "reference_evaluate", "dtype": "string"}, {"name": "reference_result", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1241977, "num_examples": 909}], "download_size": 622347, "dataset_size": 1241977}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T16:00:48+00:00 |
73814861a571b238b12417d4ac8bd4195ce3247d | damerajee/long-context-hindi | [
"region:us"
] | 2024-02-17T15:57:36+00:00 | {} | 2024-02-17T15:58:15+00:00 |
|
2d5a42b97183d91a9c60a51e52ff628589863b53 | Raja99hug/Auto_English_Thanglish_dataset | [
"region:us"
] | 2024-02-17T15:58:18+00:00 | {} | 2024-02-17T16:00:01+00:00 |
|
d8e1b3521cf2047e30907fed5655fb7ae266a0cb | CyberHarem/delacey_neuralcloud | [
"region:us"
] | 2024-02-17T15:59:31+00:00 | {} | 2024-02-17T16:00:46+00:00 |
|
156ca922a6c621448d5e05d40aa521357f32be99 |
# Dataset of antonina/アントニーナ/安冬妮娜 (Neural Cloud)
This is the dataset of antonina/アントニーナ/安冬妮娜 (Neural Cloud), containing 34 images and their tags.
The core tags of this character are `long_hair, yellow_eyes, hat, headphones, bangs, hair_between_eyes, white_headwear, aqua_hair, green_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 34 | 55.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/antonina_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 34 | 27.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/antonina_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 72 | 54.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/antonina_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 34 | 45.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/antonina_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 72 | 83.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/antonina_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/antonina_neuralcloud',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them (see the sketch after the tables below).
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 34 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, black_gloves, holding, long_sleeves, jacket, blush, closed_mouth, open_clothes, black_thighhighs, sitting, black_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | black_gloves | holding | long_sleeves | jacket | blush | closed_mouth | open_clothes | black_thighhighs | sitting | black_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------------|:----------|:---------------|:---------|:--------|:---------------|:---------------|:-------------------|:----------|:--------------|
| 0 | 34 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X |
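As a rough usage sketch (not part of the original card), the cluster above can be mined from the IMG+TXT packages listed earlier; the package choice `dataset-800.zip`, the local directory name, and the assumption that every image ships with a sidecar `.txt` of comma-separated tags are illustrative here.
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Fetch and unpack the 800px IMG+TXT package from the packages table
# (assumed layout: one image file plus one .txt of comma-separated tags).
zip_path = hf_hub_download(
    repo_id='CyberHarem/antonina_neuralcloud',
    repo_type='dataset',
    filename='dataset-800.zip',
)
data_dir = 'antonina_800'  # hypothetical local directory
os.makedirs(data_dir, exist_ok=True)
with zipfile.ZipFile(zip_path, 'r') as zf:
    zf.extractall(data_dir)

# Keep only images whose tags contain every tag of cluster #0 above
# (a subset of the cluster tags is used here for brevity).
cluster_tags = {'1girl', 'solo', 'black_gloves', 'jacket'}
matches = []
for root, _, files in os.walk(data_dir):
    for name in files:
        if not name.endswith('.txt'):
            continue
        with open(os.path.join(root, name), encoding='utf-8') as f:
            tags = {t.strip() for t in f.read().split(',')}
        if cluster_tags <= tags:
            matches.append(name[:-4])  # stem shared with the image file
print(len(matches), 'images match cluster 0')
```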
| CyberHarem/antonina_neuralcloud | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-02-17T15:59:32+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-02-17T16:08:19+00:00 |
54e96e381906fa487acf1a4d77217e581efc148d | 100k Sentence Pairs for Nepali Spelling Correction | dura-garage/nep-spell-100k | [
"license:mit",
"region:us"
] | 2024-02-17T15:59:51+00:00 | {"license": "mit"} | 2024-02-17T16:01:55+00:00 |
456facbd213224925103afeb4e42cc3b19be1fdf | ramixpe/rfc3261 | [
"region:us"
] | 2024-02-17T16:02:48+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 91790, "num_examples": 335}], "download_size": 47236, "dataset_size": 91790}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T16:02:50+00:00 |
|
8fde8356170be70518f5ce64793af11233387d51 | natnitaract/teetouchjaknamon-faissbatchall-index-2 | [
"license:cc-by-3.0",
"region:us"
] | 2024-02-17T16:02:55+00:00 | {"license": "cc-by-3.0"} | 2024-02-17T16:07:22+00:00 |
|
37fcd93d6e30fec318c4cfccf1b7044e764b0394 |
# Dataset Card for Evaluation run of vilm/Quyen-Mini-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vilm/Quyen-Mini-v0.1](https://huggingface.co/vilm/Quyen-Mini-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vilm__Quyen-Mini-v0.1",
"harness_winogrande_5",
split="train")
```
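Each run is also reachable under its timestamped split, as described above; a minimal sketch (the configuration and split names below are the ones listed in this repo's config metadata for this run):
```python
from datasets import load_dataset

# Load the gsm8k details for this specific run rather than the latest split;
# the split name is the timestamp of the run shown on this card.
run_data = load_dataset(
    "open-llm-leaderboard/details_vilm__Quyen-Mini-v0.1",
    "harness_gsm8k_5",
    split="2024_02_17T16_04_03.826177",
)
print(run_data)
```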
## Latest results
These are the [latest results from run 2024-02-17T16:04:03.826177](https://huggingface.co/datasets/open-llm-leaderboard/details_vilm__Quyen-Mini-v0.1/blob/main/results_2024-02-17T16-04-03.826177.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.43840376506422085,
"acc_stderr": 0.03434968269403666,
"acc_norm": 0.4413503435966419,
"acc_norm_stderr": 0.03507262298104947,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487360988,
"mc2": 0.4643634101883703,
"mc2_stderr": 0.015588466941925255
},
"harness|arc:challenge|25": {
"acc": 0.3703071672354949,
"acc_stderr": 0.014111298751674948,
"acc_norm": 0.39334470989761094,
"acc_norm_stderr": 0.014275101465693024
},
"harness|hellaswag|10": {
"acc": 0.4660426209918343,
"acc_stderr": 0.004978260641742204,
"acc_norm": 0.6056562437761402,
"acc_norm_stderr": 0.004877104939356237
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44150943396226416,
"acc_stderr": 0.03056159042673184,
"acc_norm": 0.44150943396226416,
"acc_norm_stderr": 0.03056159042673184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4513888888888889,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.4513888888888889,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.36416184971098264,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.36416184971098264,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715563,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.02418049716437689,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.02418049716437689
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4870967741935484,
"acc_stderr": 0.028434533152681855,
"acc_norm": 0.4870967741935484,
"acc_norm_stderr": 0.028434533152681855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5575757575757576,
"acc_stderr": 0.03878372113711274,
"acc_norm": 0.5575757575757576,
"acc_norm_stderr": 0.03878372113711274
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5707070707070707,
"acc_stderr": 0.03526552724601199,
"acc_norm": 0.5707070707070707,
"acc_norm_stderr": 0.03526552724601199
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.538860103626943,
"acc_stderr": 0.03597524411734577,
"acc_norm": 0.538860103626943,
"acc_norm_stderr": 0.03597524411734577
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.41794871794871796,
"acc_stderr": 0.02500732988246122,
"acc_norm": 0.41794871794871796,
"acc_norm_stderr": 0.02500732988246122
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184409,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184409
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4957983193277311,
"acc_stderr": 0.0324773433444811,
"acc_norm": 0.4957983193277311,
"acc_norm_stderr": 0.0324773433444811
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5596330275229358,
"acc_stderr": 0.02128431062376155,
"acc_norm": 0.5596330275229358,
"acc_norm_stderr": 0.02128431062376155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3101851851851852,
"acc_stderr": 0.03154696285656628,
"acc_norm": 0.3101851851851852,
"acc_norm_stderr": 0.03154696285656628
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.03506612560524866,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.03506612560524866
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5611814345991561,
"acc_stderr": 0.032302649315470375,
"acc_norm": 0.5611814345991561,
"acc_norm_stderr": 0.032302649315470375
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4349775784753363,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.4349775784753363,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48091603053435117,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.48091603053435117,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.042664163633521685,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.042664163633521685
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907062,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907062
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4601226993865031,
"acc_stderr": 0.03915857291436972,
"acc_norm": 0.4601226993865031,
"acc_norm_stderr": 0.03915857291436972
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7606837606837606,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.7606837606837606,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5363984674329502,
"acc_stderr": 0.01783252407959326,
"acc_norm": 0.5363984674329502,
"acc_norm_stderr": 0.01783252407959326
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5289017341040463,
"acc_stderr": 0.026874085883518348,
"acc_norm": 0.5289017341040463,
"acc_norm_stderr": 0.026874085883518348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23575418994413408,
"acc_stderr": 0.014196375686290804,
"acc_norm": 0.23575418994413408,
"acc_norm_stderr": 0.014196375686290804
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.02852638345214263,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.02852638345214263
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.44694533762057875,
"acc_stderr": 0.02823776942208533,
"acc_norm": 0.44694533762057875,
"acc_norm_stderr": 0.02823776942208533
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4845679012345679,
"acc_stderr": 0.0278074900442762,
"acc_norm": 0.4845679012345679,
"acc_norm_stderr": 0.0278074900442762
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3049645390070922,
"acc_stderr": 0.027464708442022128,
"acc_norm": 0.3049645390070922,
"acc_norm_stderr": 0.027464708442022128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3520208604954368,
"acc_stderr": 0.012198140605353599,
"acc_norm": 0.3520208604954368,
"acc_norm_stderr": 0.012198140605353599
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35661764705882354,
"acc_stderr": 0.02909720956841194,
"acc_norm": 0.35661764705882354,
"acc_norm_stderr": 0.02909720956841194
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.40032679738562094,
"acc_stderr": 0.01982184368827178,
"acc_norm": 0.40032679738562094,
"acc_norm_stderr": 0.01982184368827178
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.0478833976870286,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.0478833976870286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5551020408163265,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.5551020408163265,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5771144278606966,
"acc_stderr": 0.034932317774212816,
"acc_norm": 0.5771144278606966,
"acc_norm_stderr": 0.034932317774212816
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5380116959064327,
"acc_stderr": 0.03823727092882307,
"acc_norm": 0.5380116959064327,
"acc_norm_stderr": 0.03823727092882307
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487360988,
"mc2": 0.4643634101883703,
"mc2_stderr": 0.015588466941925255
},
"harness|winogrande|5": {
"acc": 0.5911602209944752,
"acc_stderr": 0.013816954295135686
},
"harness|gsm8k|5": {
"acc": 0.2744503411675512,
"acc_stderr": 0.0122915811708149
}
}
```
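The same figures can also be read straight from the results file linked above, without the `datasets` library; a minimal sketch using the filename from that link (the exact nesting of the JSON is not shown on this card, so the snippet handles both a top-level and a nested `results` layout):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results JSON linked under "Latest results" above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_vilm__Quyen-Mini-v0.1",
    repo_type="dataset",
    filename="results_2024-02-17T16-04-03.826177.json",
)
with open(path, encoding="utf-8") as f:
    data = json.load(f)

# The aggregated block shown above; it may sit at the top level or under a
# "results" key depending on the file layout.
all_metrics = data.get("all") or data.get("results", {}).get("all")
print(json.dumps(all_metrics, indent=2))
```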
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_vilm__Quyen-Mini-v0.1 | [
"region:us"
] | 2024-02-17T16:06:12+00:00 | {"pretty_name": "Evaluation run of vilm/Quyen-Mini-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [vilm/Quyen-Mini-v0.1](https://huggingface.co/vilm/Quyen-Mini-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vilm__Quyen-Mini-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T16:04:03.826177](https://huggingface.co/datasets/open-llm-leaderboard/details_vilm__Quyen-Mini-v0.1/blob/main/results_2024-02-17T16-04-03.826177.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.43840376506422085,\n \"acc_stderr\": 0.03434968269403666,\n \"acc_norm\": 0.4413503435966419,\n \"acc_norm_stderr\": 0.03507262298104947,\n \"mc1\": 0.2974296205630355,\n \"mc1_stderr\": 0.016002651487360988,\n \"mc2\": 0.4643634101883703,\n \"mc2_stderr\": 0.015588466941925255\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3703071672354949,\n \"acc_stderr\": 0.014111298751674948,\n \"acc_norm\": 0.39334470989761094,\n \"acc_norm_stderr\": 0.014275101465693024\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4660426209918343,\n \"acc_stderr\": 0.004978260641742204,\n \"acc_norm\": 0.6056562437761402,\n \"acc_norm_stderr\": 0.004877104939356237\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.04065771002562605,\n \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.04065771002562605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.44150943396226416,\n \"acc_stderr\": 0.03056159042673184,\n \"acc_norm\": 0.44150943396226416,\n \"acc_norm_stderr\": 0.03056159042673184\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4513888888888889,\n \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.4513888888888889,\n \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n 
\"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.36416184971098264,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.36416184971098264,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715563,\n \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715563\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.328042328042328,\n \"acc_stderr\": 0.02418049716437689,\n \"acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.02418049716437689\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4870967741935484,\n \"acc_stderr\": 0.028434533152681855,\n \"acc_norm\": 0.4870967741935484,\n \"acc_norm_stderr\": 0.028434533152681855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5575757575757576,\n \"acc_stderr\": 0.03878372113711274,\n \"acc_norm\": 0.5575757575757576,\n \"acc_norm_stderr\": 0.03878372113711274\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5707070707070707,\n \"acc_stderr\": 0.03526552724601199,\n \"acc_norm\": 0.5707070707070707,\n \"acc_norm_stderr\": 0.03526552724601199\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.538860103626943,\n \"acc_stderr\": 0.03597524411734577,\n \"acc_norm\": 0.538860103626943,\n \"acc_norm_stderr\": 0.03597524411734577\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.41794871794871796,\n \"acc_stderr\": 0.02500732988246122,\n \"acc_norm\": 0.41794871794871796,\n \"acc_norm_stderr\": 0.02500732988246122\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184409,\n \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184409\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4957983193277311,\n \"acc_stderr\": 0.0324773433444811,\n \"acc_norm\": 0.4957983193277311,\n \"acc_norm_stderr\": 0.0324773433444811\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5596330275229358,\n \"acc_stderr\": 0.02128431062376155,\n \"acc_norm\": 0.5596330275229358,\n \"acc_norm_stderr\": 0.02128431062376155\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3101851851851852,\n \"acc_stderr\": 0.03154696285656628,\n \"acc_norm\": 0.3101851851851852,\n \"acc_norm_stderr\": 0.03154696285656628\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.03506612560524866,\n \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.03506612560524866\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5611814345991561,\n \"acc_stderr\": 0.032302649315470375,\n \"acc_norm\": 0.5611814345991561,\n \"acc_norm_stderr\": 0.032302649315470375\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4349775784753363,\n \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.4349775784753363,\n \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.48091603053435117,\n \"acc_stderr\": 0.04382094705550988,\n \"acc_norm\": 0.48091603053435117,\n \"acc_norm_stderr\": 0.04382094705550988\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.04792898170907062,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.04792898170907062\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4601226993865031,\n \"acc_stderr\": 0.03915857291436972,\n \"acc_norm\": 0.4601226993865031,\n \"acc_norm_stderr\": 0.03915857291436972\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7606837606837606,\n \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.7606837606837606,\n \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5363984674329502,\n \"acc_stderr\": 0.01783252407959326,\n \"acc_norm\": 
0.5363984674329502,\n \"acc_norm_stderr\": 0.01783252407959326\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5289017341040463,\n \"acc_stderr\": 0.026874085883518348,\n \"acc_norm\": 0.5289017341040463,\n \"acc_norm_stderr\": 0.026874085883518348\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23575418994413408,\n \"acc_stderr\": 0.014196375686290804,\n \"acc_norm\": 0.23575418994413408,\n \"acc_norm_stderr\": 0.014196375686290804\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5424836601307189,\n \"acc_stderr\": 0.02852638345214263,\n \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.02852638345214263\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.44694533762057875,\n \"acc_stderr\": 0.02823776942208533,\n \"acc_norm\": 0.44694533762057875,\n \"acc_norm_stderr\": 0.02823776942208533\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4845679012345679,\n \"acc_stderr\": 0.0278074900442762,\n \"acc_norm\": 0.4845679012345679,\n \"acc_norm_stderr\": 0.0278074900442762\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3049645390070922,\n \"acc_stderr\": 0.027464708442022128,\n \"acc_norm\": 0.3049645390070922,\n \"acc_norm_stderr\": 0.027464708442022128\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3520208604954368,\n \"acc_stderr\": 0.012198140605353599,\n \"acc_norm\": 0.3520208604954368,\n \"acc_norm_stderr\": 0.012198140605353599\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.35661764705882354,\n \"acc_stderr\": 0.02909720956841194,\n \"acc_norm\": 0.35661764705882354,\n \"acc_norm_stderr\": 0.02909720956841194\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.40032679738562094,\n \"acc_stderr\": 0.01982184368827178,\n \"acc_norm\": 0.40032679738562094,\n \"acc_norm_stderr\": 0.01982184368827178\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n \"acc_stderr\": 0.0478833976870286,\n \"acc_norm\": 0.509090909090909,\n \"acc_norm_stderr\": 0.0478833976870286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5551020408163265,\n \"acc_stderr\": 0.031814251181977865,\n \"acc_norm\": 0.5551020408163265,\n \"acc_norm_stderr\": 0.031814251181977865\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5771144278606966,\n \"acc_stderr\": 0.034932317774212816,\n \"acc_norm\": 0.5771144278606966,\n \"acc_norm_stderr\": 0.034932317774212816\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5380116959064327,\n \"acc_stderr\": 0.03823727092882307,\n \"acc_norm\": 0.5380116959064327,\n \"acc_norm_stderr\": 0.03823727092882307\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n \"mc1_stderr\": 0.016002651487360988,\n \"mc2\": 0.4643634101883703,\n \"mc2_stderr\": 0.015588466941925255\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5911602209944752,\n \"acc_stderr\": 0.013816954295135686\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2744503411675512,\n \"acc_stderr\": 0.0122915811708149\n }\n}\n```", "repo_url": "https://huggingface.co/vilm/Quyen-Mini-v0.1", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|arc:challenge|25_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|gsm8k|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hellaswag|10_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T16-04-03.826177.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T16-04-03.826177.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T16-04-03.826177.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T16-04-03.826177.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T16-04-03.826177.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T16-04-03.826177.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["**/details_harness|winogrande|5_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T16-04-03.826177.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_17T16_04_03.826177", "path": ["results_2024-02-17T16-04-03.826177.parquet"]}, {"split": "latest", "path": 
["results_2024-02-17T16-04-03.826177.parquet"]}]}]} | 2024-02-17T16:06:39+00:00 |
8853543eac22fd6463809276d3f35a7c34a85b63 | fahmiaziz/alpaca-new | [
"region:us"
] | 2024-02-17T16:07:24+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2010022, "num_examples": 2000}], "download_size": 1242409, "dataset_size": 2010022}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T16:08:40+00:00 |
|
ffaedaa7aedd58825c0ea8385607bb6839bacf77 | GiulioZ94/test | [
"region:us"
] | 2024-02-17T16:17:05+00:00 | {} | 2024-02-17T16:17:25+00:00 |
|
5a6ee4a27482addf31eb78ee73750872816fda0d |
# Dataset of sol/ソル/苏尔 (Neural Cloud)
This is the dataset of sol/ソル/苏尔 (Neural Cloud), containing 21 images and their tags.
The core tags of this character are `long_hair, blonde_hair, hair_between_eyes, yellow_eyes, ponytail, very_long_hair, bangs, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 21 | 26.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sol_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 21 | 17.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sol_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 38 | 28.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sol_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 21 | 24.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sol_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 38 | 36.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sol_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sol_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
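The IMG+TXT packages (e.g. `dataset-800.zip`) can also be used without waifuc. Below is a minimal sketch, assuming the archive lays out each image next to a same-named `.txt` file holding its comma-separated tags (which is what the IMG+TXT package type denotes here; verify against the extracted files):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package instead of the raw archive
zip_file = hf_hub_download(
    repo_id='CyberHarem/sol_neuralcloud',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_800_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag file (assumed layout: foo.png + foo.txt)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    txt_path = os.path.join(dataset_dir, stem + '.txt')
    if os.path.exists(txt_path):
        with open(txt_path, 'r', encoding='utf-8') as f:
            tags = f.read().strip()
        print(name, tags)
```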
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, smile, fingerless_gloves, black_gloves, looking_at_viewer, white_shirt, belt, crop_top, midriff, navel, necklace, orange_jacket, open_jacket, black_pants, long_sleeves, standing, fur-trimmed_jacket, holding, outdoors, black_choker, boots, sky |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | fingerless_gloves | black_gloves | looking_at_viewer | white_shirt | belt | crop_top | midriff | navel | necklace | orange_jacket | open_jacket | black_pants | long_sleeves | standing | fur-trimmed_jacket | holding | outdoors | black_choker | boots | sky |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:---------------|:--------------------|:--------------|:-------|:-----------|:----------|:--------|:-----------|:----------------|:--------------|:--------------|:---------------|:-----------|:---------------------|:----------|:-----------|:---------------|:--------|:------|
| 0 | 21 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/sol_neuralcloud | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-02-17T16:20:02+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-02-17T16:24:39+00:00 |
281d8501ac7efd9106cb17dec83e22f24f1b6cb5 | CyberHarem/banxsy_neuralcloud | [
"region:us"
] | 2024-02-17T16:20:25+00:00 | {} | 2024-02-17T16:22:45+00:00 |
|
9b273331fc34c69aeec9d09ee956cb147a88eb60 | alisson40889/globo | [
"license:openrail",
"region:us"
] | 2024-02-17T16:23:39+00:00 | {"license": "openrail"} | 2024-02-17T16:24:30+00:00 |
|
260babeff49226e42d5370000ea2470867dd2474 | arbitropy/squad_bn_nested | [
"region:us"
] | 2024-02-17T16:30:26+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "story", "dtype": "string"}, {"name": "questions", "sequence": "string"}, {"name": "answers", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 61464060, "num_examples": 20079}, {"name": "test", "num_bytes": 4237327, "num_examples": 2503}, {"name": "validation", "num_bytes": 4137887, "num_examples": 2500}], "download_size": 25632628, "dataset_size": 69839274}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-02-17T16:30:37+00:00 |
|
db0f5f74ba59b7d7419f9bd21ec1b74c89178855 | Olivacker/ratoputo | [
"license:openrail",
"region:us"
] | 2024-02-17T16:31:19+00:00 | {"license": "openrail"} | 2024-02-17T16:31:44+00:00 |
|
8142451dad8abee86f239d61ca09e60d9474a36d | CyberHarem/earhart_neuralcloud | [
"region:us"
] | 2024-02-17T16:34:55+00:00 | {} | 2024-02-17T16:35:45+00:00 |
|
24a786ac7e4153e368cef1536280e3645ff4919c |
# Dataset of evelyn/イヴリン/伊芙琳 (Neural Cloud)
This is the dataset of evelyn/イヴリン/伊芙琳 (Neural Cloud), containing 19 images and their tags.
The core tags of this character are `long_hair, red_eyes, eyepatch, bangs, breasts, white_hair, large_breasts, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 38.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelyn_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 19.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelyn_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 49 | 39.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelyn_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 33.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelyn_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 49 | 58.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelyn_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/evelyn_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, nipples, solo, cum_on_breasts, nude, blush, closed_mouth, navel, pussy |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, solo, closed_mouth, holding_gun, pantyhose, red_gloves, tactical_clothes, black_footwear, boots, cape, bulletproof_vest, handgun, long_sleeves, looking_at_viewer, outdoors, pouch, rifle, shotgun_shell, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | nipples | solo | cum_on_breasts | nude | blush | closed_mouth | navel | pussy | holding_gun | pantyhose | red_gloves | tactical_clothes | black_footwear | boots | cape | bulletproof_vest | handgun | long_sleeves | outdoors | pouch | rifle | shotgun_shell | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:----------|:-------|:-----------------|:-------|:--------|:---------------|:--------|:--------|:--------------|:------------|:-------------|:-------------------|:-----------------|:--------|:-------|:-------------------|:----------|:---------------|:-----------|:--------|:--------|:----------------|:-----------|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/evelyn_neuralcloud | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-02-17T16:35:07+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-02-17T16:41:38+00:00 |
11af94e2b80b422ebb6081a18de65921330180b3 | CyberHarem/nora_neuralcloud | [
"region:us"
] | 2024-02-17T16:35:22+00:00 | {} | 2024-02-17T16:36:14+00:00 |
|
9129570552b03519306f3084d489b144f5f0d034 | # Dataset Card for "Gold-alpaca-med-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hemanth955/Gold-alpaca-med-small | [
"region:us"
] | 2024-02-17T16:39:32+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "Input", "dtype": "string"}, {"name": "Output", "dtype": "int64"}, {"name": "Instruction", "dtype": "string"}, {"name": "Text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 47684340, "num_examples": 27360}], "download_size": 0, "dataset_size": 47684340}} | 2024-02-17T16:53:09+00:00 |
954d50162ffedba0f3bfc4d7a8b317f0e8e6d944 |
# BEE-spoke-data/consumer-finance-complaints
The original `consumer-finance-complaints` data, but in a format that actually works.
Pulled Feb 2024
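Both configurations load directly with the `datasets` library; the sketch below assumes the `has-text` config keeps only the rows that contain a consumer complaint narrative, as its name and smaller row count suggest:

```python
from datasets import load_dataset

# "has-text": complaints with a written narrative; use "default" for the full dump
ds = load_dataset("BEE-spoke-data/consumer-finance-complaints", "has-text", split="train")

print(ds)
print(ds[0]["Consumer complaint narrative"][:200])
```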
| BEE-spoke-data/consumer-finance-complaints | [
"task_categories:text-classification",
"task_categories:text-generation",
"source_datasets:consumer-finance-complaints",
"license:cc0-1.0",
"region:us"
] | 2024-02-17T16:42:13+00:00 | {"language": ["en"], "license": "cc0-1.0", "size_categories": ["1M<n<10M"], "source_datasets": "consumer-finance-complaints", "task_categories": ["text-classification", "text-generation"], "dataset_info": [{"config_name": "default", "features": [{"name": "Date received", "dtype": "string"}, {"name": "Product", "dtype": "string"}, {"name": "Sub-product", "dtype": "string"}, {"name": "Issue", "dtype": "string"}, {"name": "Sub-issue", "dtype": "string"}, {"name": "Consumer complaint narrative", "dtype": "string"}, {"name": "Company public response", "dtype": "string"}, {"name": "Company", "dtype": "string"}, {"name": "State", "dtype": "string"}, {"name": "ZIP code", "dtype": "string"}, {"name": "Tags", "dtype": "string"}, {"name": "Consumer consent provided?", "dtype": "string"}, {"name": "Submitted via", "dtype": "string"}, {"name": "Date sent to company", "dtype": "string"}, {"name": "Company response to consumer", "dtype": "string"}, {"name": "Timely response?", "dtype": "string"}, {"name": "Consumer disputed?", "dtype": "string"}, {"name": "Complaint ID", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 3427420677, "num_examples": 4707579}], "download_size": 1061488683, "dataset_size": 3427420677}, {"config_name": "has-text", "features": [{"name": "Date received", "dtype": "string"}, {"name": "Product", "dtype": "string"}, {"name": "Sub-product", "dtype": "string"}, {"name": "Issue", "dtype": "string"}, {"name": "Sub-issue", "dtype": "string"}, {"name": "Consumer complaint narrative", "dtype": "string"}, {"name": "Company public response", "dtype": "string"}, {"name": "Company", "dtype": "string"}, {"name": "State", "dtype": "string"}, {"name": "ZIP code", "dtype": "string"}, {"name": "Tags", "dtype": "string"}, {"name": "Consumer consent provided?", "dtype": "string"}, {"name": "Submitted via", "dtype": "string"}, {"name": "Date sent to company", "dtype": "string"}, {"name": "Company response to consumer", "dtype": "string"}, {"name": "Timely response?", "dtype": "string"}, {"name": "Consumer disputed?", "dtype": "string"}, {"name": "Complaint ID", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1229876941.3934054, "num_examples": 1689573}], "download_size": 925128908, "dataset_size": 1229876941.3934054}], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}, {"config_name": "has-text", "data_files": [{"split": "train", "path": "has-text/train-*"}]}], "tags": ["finance", "government data", "2024"]} | 2024-02-17T16:46:00+00:00 |
823a938a918efc6b56214d575875c60d633c0973 | alirzb/SeizureClassifier_Wav2Vec_U_43828667_on_UnBal_43845380 | [
"region:us"
] | 2024-02-17T16:43:36+00:00 | {"dataset_info": {"features": [{"name": "array", "sequence": "float64"}, {"name": "label_true", "dtype": "int64"}, {"name": "label_pred", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 3693924, "num_examples": 9}], "download_size": 1095957, "dataset_size": 3693924}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T16:43:40+00:00 |
|
a1ac9a2fac73c3deac49abf59dc036377340e78f | CyberHarem/undine_neuralcloud | [
"region:us"
] | 2024-02-17T16:44:31+00:00 | {} | 2024-02-17T16:45:00+00:00 |
|
53a1781ba62188ec9b507f07a2ae9fb2f8b11ddf |
# Dataset of turing/チューリング/图灵 (Neural Cloud)
This is the dataset of turing/チューリング/图灵 (Neural Cloud), containing 22 images and their tags.
The core tags of this character are `animal_ears, brown_hair, long_hair, hair_ornament, breasts, bangs, animal_ear_fluff, large_breasts, braid, hair_between_eyes, tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 22 | 34.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/turing_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 22 | 17.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/turing_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 53 | 39.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/turing_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 22 | 29.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/turing_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 53 | 57.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/turing_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/turing_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, smile, white_shirt, simple_background, yellow_gloves, cleavage_cutout, closed_mouth, black_pants, open_clothes, white_background, id_card, long_sleeves, white_jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | smile | white_shirt | simple_background | yellow_gloves | cleavage_cutout | closed_mouth | black_pants | open_clothes | white_background | id_card | long_sleeves | white_jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:--------------|:--------------------|:----------------|:------------------|:---------------|:--------------|:---------------|:-------------------|:----------|:---------------|:---------------|
| 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/turing_neuralcloud | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-02-17T16:44:37+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-02-17T16:50:22+00:00 |
b7079b0d26e381e689896975034429a9feb049aa | CyberHarem/magnhilda_neuralcloud | [
"region:us"
] | 2024-02-17T16:44:49+00:00 | {} | 2024-02-17T16:46:07+00:00 |
|
c7dceb8ff5f7348d5c3ba37fb39b1e6f4c3cd245 | MITCriticalData/dataset_rio_de_janeiro_2018_2023 | [
"license:mit",
"region:us"
] | 2024-02-17T16:46:35+00:00 | {"license": "mit"} | 2024-02-17T16:46:35+00:00 |
|
dd382831addcffdecab68b96b411e2222bfcf2d8 | 0: ENOUGH_INFO
1: NOT_ENOUGH_INFO | iestynmullinor/evidence_reranker_fever_balanced | [
"region:us"
] | 2024-02-17T16:46:56+00:00 | {} | 2024-02-17T17:27:53+00:00 |
72fff341279b97921e9c6cac578d596f9b9873aa | alisson40889/thuca | [
"license:openrail",
"region:us"
] | 2024-02-17T16:47:45+00:00 | {"license": "openrail"} | 2024-02-17T16:48:40+00:00 |
|
081e3f0bf57b882d24f950056a3823065d89f764 | Clonadordoely/marcos | [
"license:openrail",
"region:us"
] | 2024-02-17T16:47:50+00:00 | {"license": "openrail"} | 2024-02-17T16:48:43+00:00 |
|
8f344b292b1d35da6eaf09dba8acce269dada4f5 |
# Dataset Card for Evaluation run of DreadPoor/JustToSuffer-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/JustToSuffer-7B-slerp](https://huggingface.co/DreadPoor/JustToSuffer-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__JustToSuffer-7B-slerp",
"harness_winogrande_5",
split="train")
```
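The aggregated scores can be pulled from the `results` configuration in the same way; a minimal sketch, assuming this repo follows the usual leaderboard-details layout where each configuration also exposes a `latest` split pointing at the most recent run:

```python
from datasets import load_dataset

# aggregated metrics for the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_DreadPoor__JustToSuffer-7B-slerp",
    "results",
    split="latest",
)
print(results[0])
```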
## Latest results
These are the [latest results from run 2024-02-17T16:45:59.965925](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__JustToSuffer-7B-slerp/blob/main/results_2024-02-17T16-45-59.965925.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6495053299013857,
"acc_stderr": 0.03213728048297641,
"acc_norm": 0.6511170368490796,
"acc_norm_stderr": 0.03278191957728323,
"mc1": 0.4541003671970624,
"mc1_stderr": 0.017429593091323522,
"mc2": 0.6269022526171036,
"mc2_stderr": 0.015516860373603584
},
"harness|arc:challenge|25": {
"acc": 0.6604095563139932,
"acc_stderr": 0.013839039762820167,
"acc_norm": 0.689419795221843,
"acc_norm_stderr": 0.013522292098053067
},
"harness|hellaswag|10": {
"acc": 0.7030472017526389,
"acc_stderr": 0.00455981758918207,
"acc_norm": 0.8678550089623581,
"acc_norm_stderr": 0.003379562298387565
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700476,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700476
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699813,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834838,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834838
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40893854748603353,
"acc_stderr": 0.016442830654715544,
"acc_norm": 0.40893854748603353,
"acc_norm_stderr": 0.016442830654715544
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712992,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712992
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.012731102790504515,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.012731102790504515
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4541003671970624,
"mc1_stderr": 0.017429593091323522,
"mc2": 0.6269022526171036,
"mc2_stderr": 0.015516860373603584
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625856
},
"harness|gsm8k|5": {
"acc": 0.5974222896133434,
"acc_stderr": 0.013508523063663423
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_DreadPoor__JustToSuffer-7B-slerp | [
"region:us"
] | 2024-02-17T16:48:17+00:00 | {"pretty_name": "Evaluation run of DreadPoor/JustToSuffer-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [DreadPoor/JustToSuffer-7B-slerp](https://huggingface.co/DreadPoor/JustToSuffer-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__JustToSuffer-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-17T16:45:59.965925](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__JustToSuffer-7B-slerp/blob/main/results_2024-02-17T16-45-59.965925.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6495053299013857,\n \"acc_stderr\": 0.03213728048297641,\n \"acc_norm\": 0.6511170368490796,\n \"acc_norm_stderr\": 0.03278191957728323,\n \"mc1\": 0.4541003671970624,\n \"mc1_stderr\": 0.017429593091323522,\n \"mc2\": 0.6269022526171036,\n \"mc2_stderr\": 0.015516860373603584\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6604095563139932,\n \"acc_stderr\": 0.013839039762820167,\n \"acc_norm\": 0.689419795221843,\n \"acc_norm_stderr\": 0.013522292098053067\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7030472017526389,\n \"acc_stderr\": 0.00455981758918207,\n \"acc_norm\": 0.8678550089623581,\n \"acc_norm_stderr\": 0.003379562298387565\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700476,\n \"acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700476\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.822477650063857,\n \"acc_stderr\": 0.013664230995834838,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834838\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40893854748603353,\n \"acc_stderr\": 0.016442830654715544,\n \"acc_norm\": 0.40893854748603353,\n \"acc_norm_stderr\": 0.016442830654715544\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n \"acc_stderr\": 0.012731102790504515,\n \"acc_norm\": 0.46088657105606257,\n \"acc_norm_stderr\": 0.012731102790504515\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4541003671970624,\n \"mc1_stderr\": 0.017429593091323522,\n \"mc2\": 0.6269022526171036,\n \"mc2_stderr\": 0.015516860373603584\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625856\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5974222896133434,\n \"acc_stderr\": 0.013508523063663423\n 
}\n}\n```", "repo_url": "https://huggingface.co/DreadPoor/JustToSuffer-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|arc:challenge|25_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|gsm8k|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hellaswag|10_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T16-45-59.965925.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T16-45-59.965925.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-17T16-45-59.965925.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-17T16-45-59.965925.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T16-45-59.965925.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_17T16_45_59.965925", "path": ["**/details_harness|winogrande|5_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-17T16-45-59.965925.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_17T16_45_59.965925", "path": ["results_2024-02-17T16-45-59.965925.parquet"]}, {"split": "latest", "path": ["results_2024-02-17T16-45-59.965925.parquet"]}]}]} | 2024-02-17T16:48:38+00:00 |
d5c2203314a1d0adb659c0dd3cc5828097dbe647 | arbitropy/bcoqa-test | [
"region:us"
] | 2024-02-17T16:48:51+00:00 | {"dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "story", "dtype": "string"}, {"name": "questions", "sequence": "string"}, {"name": "answers", "sequence": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 2764257.9446640317, "num_examples": 490}], "download_size": 1095052, "dataset_size": 2764257.9446640317}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T16:48:54+00:00 |
|
c5f220780786f38bcfdaca8d3c8d46581b5272ef | alirzb/SeizureClassifier_Wav2Vec_U_43828667_on_UnBal_43845590 | [
"region:us"
] | 2024-02-17T16:55:23+00:00 | {"dataset_info": {"features": [{"name": "array", "sequence": "float64"}, {"name": "label_true", "dtype": "int64"}, {"name": "label_pred", "dtype": "int64"}, {"name": "id", "dtype": "string"}, {"name": "ws", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 4304681.0, "num_examples": 9}], "download_size": 1707867, "dataset_size": 4304681.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T17:08:44+00:00 |
|
7a5d4c3ef3cb7587acd180e6a79a9a6ab07bf2f1 | CyberHarem/clotho_neuralcloud | [
"region:us"
] | 2024-02-17T16:55:42+00:00 | {} | 2024-02-17T16:57:13+00:00 |
|
55b8e6131e6bb7d576c9b5a3bd1af46c0d6616ff |
# Dataset of willow/ウィロウ/薇洛儿 (Neural Cloud)
This is the dataset of willow/ウィロウ/薇洛儿 (Neural Cloud), containing 22 images and their tags.
The core tags of this character are `pink_hair, breasts, red_eyes, short_hair, large_breasts, hat, eyewear_on_head, sunglasses, bangs, bow, hair_ornament, pink_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 22 | 44.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/willow_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 22 | 19.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/willow_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 52 | 43.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/willow_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 22 | 36.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/willow_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 52 | 71.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/willow_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/willow_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
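The IMG+TXT packages listed in the table above can presumably be fetched the same way; the following is a minimal sketch for the 800px variant (the filename is taken from the package table, everything else mirrors the raw-dataset snippet):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT archive (filename from the package table above)
zip_file = hf_hub_download(
    repo_id='CyberHarem/willow_neuralcloud',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract the paired image / tag files into a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```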
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, gloves, open_mouth, black_thighhighs, jacket, smile, holding, bowtie, skirt, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | gloves | open_mouth | black_thighhighs | jacket | smile | holding | bowtie | skirt | white_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:---------|:-------------|:-------------------|:---------|:--------|:----------|:---------|:--------|:--------------|
| 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/willow_neuralcloud | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-02-17T16:55:50+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-02-17T17:05:02+00:00 |
143d5b2a9777a7162d3c167fec17cdbaa3be6982 |
# Dataset Card for Spider-Realistic
This dataset variant contains only the Spider-Realistic dataset used in "Structure-Grounded Pretraining for Text-to-SQL". The dataset was created from the dev split of the Spider dataset (2020-06-07 version from https://yale-lily.github.io/spider). The authors modified the original questions to remove explicit mentions of column names, while keeping the SQL queries unchanged, in order to better evaluate a model's ability to align the NL utterance with the DB schema. For more details, please refer to the authors' paper: https://arxiv.org/abs/2010.12773. The SQL queries and databases from the original Spider dataset are kept unchanged.
For the official database files, please refer to the Spider release site: https://yale-lily.github.io/spider.
This dataset was copied from Zenodo: https://zenodo.org/records/5205322.
This dataset is distributed under the CC BY-SA 4.0 license.
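To inspect the examples in this copy, a minimal sketch using the `datasets` library is shown below; note that the split name `"train"` is an assumption, so adjust it to whatever split this repository actually exposes:

```python
from datasets import load_dataset

# load the Spider-Realistic examples from this repository
# (split name "train" is an assumption; check the repo's data files if it differs)
ds = load_dataset("aherntech/spider-realistic", split="train")

# each example should pair a natural-language question with its SQL query
print(ds[0])
```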
## Paper Abstract
> Learning to capture text-table alignment is essential for tasks like text-to-SQL. A model needs to correctly recognize natural language references to columns and values and to ground them in the given database schema. In this paper, we present a novel weakly supervised Structure-Grounded pretraining framework (StruG) for text-to-SQL that can effectively learn to capture text-table alignment based on a parallel text-table corpus. We identify a set of novel prediction tasks: column grounding, value grounding and column-value mapping, and leverage them to pretrain a text-table encoder. Additionally, to evaluate different methods under more realistic text-table alignment settings, we create a new evaluation set Spider-Realistic based on Spider dev set with explicit mentions of column names removed, and adopt eight existing text-to-SQL datasets for cross-database evaluation. STRUG brings significant improvement over BERT-LARGE in all settings. Compared with existing pretraining methods such as GRAPPA, STRUG achieves similar performance on Spider, and outperforms all baselines on more realistic sets.
## Citation Information
If you use the dataset, please cite the following papers, including the original Spider dataset, Finegan-Dollak et al. (2018), and the original datasets for Restaurants, GeoQuery, Scholar, Academic, IMDB, and Yelp.
```
@article{deng2020structure,
title={Structure-Grounded Pretraining for Text-to-SQL},
author={Deng, Xiang and Awadallah, Ahmed Hassan and Meek, Christopher and Polozov, Oleksandr and Sun, Huan and Richardson, Matthew},
journal={arXiv preprint arXiv:2010.12773},
year={2020}
}
@inproceedings{Yu&al.18c,
year = 2018,
title = {Spider: A Large-Scale Human-Labeled Dataset for Complex and Cross-Domain Semantic Parsing and Text-to-SQL Task},
booktitle = {EMNLP},
author = {Tao Yu and Rui Zhang and Kai Yang and Michihiro Yasunaga and Dongxu Wang and Zifan Li and James Ma and Irene Li and Qingning Yao and Shanelle Roman and Zilin Zhang and Dragomir Radev }
}
@InProceedings{P18-1033,
author = "Finegan-Dollak, Catherine
and Kummerfeld, Jonathan K.
and Zhang, Li
and Ramanathan, Karthik
and Sadasivam, Sesh
and Zhang, Rui
and Radev, Dragomir",
title = "Improving Text-to-SQL Evaluation Methodology",
booktitle = "Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
year = "2018",
publisher = "Association for Computational Linguistics",
pages = "351--360",
location = "Melbourne, Australia",
url = "http://aclweb.org/anthology/P18-1033"
}
@InProceedings{data-sql-imdb-yelp,
dataset = {IMDB and Yelp},
author = {Navid Yaghmazadeh, Yuepeng Wang, Isil Dillig, and Thomas Dillig},
title = {SQLizer: Query Synthesis from Natural Language},
booktitle = {International Conference on Object-Oriented Programming, Systems, Languages, and Applications, ACM},
month = {October},
year = {2017},
pages = {63:1--63:26},
url = {http://doi.org/10.1145/3133887},
}
@article{data-academic,
dataset = {Academic},
author = {Fei Li and H. V. Jagadish},
title = {Constructing an Interactive Natural Language Interface for Relational Databases},
journal = {Proceedings of the VLDB Endowment},
volume = {8},
number = {1},
month = {September},
year = {2014},
pages = {73--84},
url = {http://dx.doi.org/10.14778/2735461.2735468},
}
@InProceedings{data-atis-geography-scholar,
dataset = {Scholar, and Updated ATIS and Geography},
author = {Srinivasan Iyer, Ioannis Konstas, Alvin Cheung, Jayant Krishnamurthy, and Luke Zettlemoyer},
title = {Learning a Neural Semantic Parser from User Feedback},
booktitle = {Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
year = {2017},
pages = {963--973},
location = {Vancouver, Canada},
url = {http://www.aclweb.org/anthology/P17-1089},
}
@inproceedings{data-geography-original,
dataset = {Geography, original},
author = {John M. Zelle and Raymond J. Mooney},
title = {Learning to Parse Database Queries Using Inductive Logic Programming},
booktitle = {Proceedings of the Thirteenth National Conference on Artificial Intelligence - Volume 2},
year = {1996},
pages = {1050--1055},
location = {Portland, Oregon},
url = {http://dl.acm.org/citation.cfm?id=1864519.1864543},
}
@inproceedings{data-restaurants-logic,
author = {Lappoon R. Tang and Raymond J. Mooney},
title = {Automated Construction of Database Interfaces: Intergrating Statistical and Relational Learning for Semantic Parsing},
booktitle = {2000 Joint SIGDAT Conference on Empirical Methods in Natural Language Processing and Very Large Corpora},
year = {2000},
pages = {133--141},
location = {Hong Kong, China},
url = {http://www.aclweb.org/anthology/W00-1317},
}
@inproceedings{data-restaurants-original,
author = {Ana-Maria Popescu, Oren Etzioni, and Henry Kautz},
title = {Towards a Theory of Natural Language Interfaces to Databases},
booktitle = {Proceedings of the 8th International Conference on Intelligent User Interfaces},
year = {2003},
location = {Miami, Florida, USA},
pages = {149--157},
url = {http://doi.acm.org/10.1145/604045.604070},
}
@inproceedings{data-restaurants,
author = {Alessandra Giordani and Alessandro Moschitti},
title = {Automatic Generation and Reranking of SQL-derived Answers to NL Questions},
booktitle = {Proceedings of the Second International Conference on Trustworthy Eternal Systems via Evolving Software, Data and Knowledge},
year = {2012},
location = {Montpellier, France},
pages = {59--76},
url = {https://doi.org/10.1007/978-3-642-45260-4_5},
}
``` | aherntech/spider-realistic | [
"task_categories:text2text-generation",
"size_categories:n<1K",
"language:en",
"license:cc-by-4.0",
"text-to-sql",
"arxiv:2010.12773",
"region:us"
] | 2024-02-17T17:00:56+00:00 | {"language": ["en"], "license": "cc-by-4.0", "size_categories": ["n<1K"], "task_categories": ["text2text-generation"], "pretty_name": "Spider-Realistic", "tags": ["text-to-sql"]} | 2024-02-17T17:19:05+00:00 |
d3ef3e2957d4568807ba310c2acc921f7502768e | # Dataset Card for "Gold-alpaca-med-new-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hemanth955/Gold-alpaca-med-new-small | [
"region:us"
] | 2024-02-17T17:06:37+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "Input", "dtype": "string"}, {"name": "Output", "dtype": "int64"}, {"name": "Instruction", "dtype": "string"}, {"name": "Text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5229084, "num_examples": 3000}], "download_size": 1204919, "dataset_size": 5229084}} | 2024-02-17T17:06:41+00:00 |
4e92b1ddb9722c5f2358f2e68521896a8e75993a | uisikdag/embeddings_getstart | [
"license:apache-2.0",
"region:us"
] | 2024-02-17T17:10:07+00:00 | {"license": "apache-2.0"} | 2024-02-17T17:10:45+00:00 |
|
4d8a6247a05c59706a9e6a17cbf19ed0e0360fd0 |
# Annotation
Each date in the Products-Real and Products-Synth datasets is annotated with its class, bounding box coordinates, date transcription, and the image width and height. Four classes are defined in the training sets: date, due, prod, and code. Unlike in the training set, expiration dates in the test set of Products-Real are labeled with a dedicated "exp" class for easier evaluation. Each component in the Date-Real and Date-Synth datasets is annotated with its class, bounding box, and transcription, where day, month, and year serve as the component classes. Finally, the Components-Real and Components-Synth datasets consist of the individual day, month, and year components and their transcriptions.
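The exact annotation file layout is not spelled out here, so the following is only an illustrative sketch of one annotation record in Python, built from the fields listed above (all field names and values are assumptions, not the real schema):

```python
# Hypothetical representation of a single Products-Real annotation record.
# Field names, bbox convention, and values are illustrative only; the real
# files may use a different layout.
annotation = {
    "image_width": 1280,    # image width in pixels
    "image_height": 720,    # image height in pixels
    "objects": [
        {
            "class": "date",                # one of: date, due, prod, code (training sets)
            "bbox": [412, 233, 598, 271],   # [x_min, y_min, x_max, y_max] in pixels
            "transcription": "2023.05.17",  # the date text as printed on the package
        },
    ],
}
```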
# Citation
The dataset was originally published in `A Generalized Framework for Recognition of Expiration Date on Product Packages Using Fully Convolutional Networks`:
```
@article{seker2022generalized,
  title={A generalized framework for recognition of expiration dates on product packages using fully convolutional networks},
  author={Seker, Ahmet Cagatay and Ahn, Sang Chul},
  journal={Expert Systems with Applications},
  pages={117310},
  year={2022},
  publisher={Elsevier}
}
``` | dimun/ExpirationDate | [
"task_categories:object-detection",
"language:en",
"license:afl-3.0",
"region:us"
] | 2024-02-17T17:16:08+00:00 | {"language": ["en"], "license": "afl-3.0", "task_categories": ["object-detection"]} | 2024-02-17T17:27:54+00:00 |
2305e7834fca367897ad93968493a07007577b27 | STT with OpenAI Whisper large v3, used to collect the better part of the voices | sarpba/common_voice_16.1_hu_texts | [
"license:apache-2.0",
"region:us"
] | 2024-02-17T17:16:36+00:00 | {"license": "apache-2.0"} | 2024-02-17T17:19:07+00:00 |
1ce25dcb86d8c56e5c81697e48deefa094187d95 | CyberHarem/mai_neuralcloud | [
"region:us"
] | 2024-02-17T17:18:02+00:00 | {} | 2024-02-17T17:18:53+00:00 |
|
44a344e5439ca8ed2518c7f6e7603f3b55f9ebaf | CyberHarem/max_neuralcloud | [
"region:us"
] | 2024-02-17T17:18:10+00:00 | {} | 2024-02-17T17:19:03+00:00 |
|
584ac0b15a0f6640bf66968485e1f607b4665548 | CyberHarem/helix_neuralcloud | [
"region:us"
] | 2024-02-17T17:18:13+00:00 | {} | 2024-02-17T17:18:48+00:00 |
|
a896c36711e6c69fcb5c69746ed1e88e5a4d15d9 | alisson40889/cidadao | [
"license:openrail",
"region:us"
] | 2024-02-17T17:20:52+00:00 | {"license": "openrail"} | 2024-02-17T17:21:34+00:00 |
|
2330d906ebe06a5ba89655ac218cf79a4b98bd37 |
# Dataset of fresnel/フレネル/菲涅尔 (Neural Cloud)
This is the dataset of fresnel/フレネル/菲涅尔 (Neural Cloud), containing 13 images and their tags.
The core tags of this character are `purple_hair, purple_eyes, short_hair, goggles_on_head, bangs, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 26.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fresnel_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 11.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fresnel_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 32 | 25.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fresnel_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 22.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fresnel_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 32 | 39.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fresnel_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
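If you prefer to work with the IMG+TXT packages directly instead of waifuc, the minimal sketch below downloads `dataset-800.zip` and pairs each image with its tag file. It assumes the usual IMG+TXT layout (a same-named `.txt` file of comma-separated tags next to every image), which is not guaranteed by this card.
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package (filename taken from the table above)
zip_file = hf_hub_download(
    repo_id='CyberHarem/fresnel_neuralcloud',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair every image with its same-named .txt tag file (assumed layout)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    tag_path = os.path.join(dataset_dir, stem + '.txt')
    if os.path.exists(tag_path):
        with open(tag_path, encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```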
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fresnel_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, crop_top, goggles, shirt, midriff, navel, black_shorts, jacket, jewelry, open_clothes, white_background, bare_shoulders, long_sleeves, off_shoulder, closed_mouth, full_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | crop_top | goggles | shirt | midriff | navel | black_shorts | jacket | jewelry | open_clothes | white_background | bare_shoulders | long_sleeves | off_shoulder | closed_mouth | full_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-----------|:----------|:--------|:----------|:--------|:---------------|:---------|:----------|:---------------|:-------------------|:-----------------|:---------------|:---------------|:---------------|:------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/fresnel_neuralcloud | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-02-17T17:33:01+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-02-17T17:36:26+00:00 |
edcf1b3fd8142b60ad705f4490976c6e0ca96dbd | CyberHarem/groove_neuralcloud | [
"region:us"
] | 2024-02-17T17:33:35+00:00 | {} | 2024-02-17T17:33:56+00:00 |
|
2dd80cfaf4e152479da599944b4935b0e469f801 |
# Dataset of hannah/ハンナ/汉娜 (Neural Cloud)
This is the dataset of hannah/ハンナ/汉娜 (Neural Cloud), containing 12 images and their tags.
The core tags of this character are `animal_ears, brown_hair, long_hair, animal_ear_fluff, bangs, hair_ornament, multicolored_hair, white_hair, bow, grey_eyes, twintails, streaked_hair, tail, very_long_hair, ahoge, hair_between_eyes, hair_bow, breasts, low_twintails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 24.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hannah_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 11.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hannah_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 34 | 27.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hannah_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 19.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hannah_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 34 | 41.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hannah_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hannah_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, smile, white_shirt, skirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | smile | white_shirt | skirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:--------------|:--------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X |
| CyberHarem/hannah_neuralcloud | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-02-17T17:33:59+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-02-17T17:36:59+00:00 |
615e1fc9f5cb590b31c79ae9e93d77cb873a4dee | rsalshalan/MGB3_V2 | [
"region:us"
] | 2024-02-17T17:36:14+00:00 | {} | 2024-02-17T17:36:14+00:00 |
|
017fde5eee0dacd52c4651d6ab22a3b3240ce06f | YXu120/NC_Education_dataset | [
"region:us"
] | 2024-02-17T17:38:35+00:00 | {} | 2024-02-17T17:38:35+00:00 |
|
f56c78fe3fdd22795a6261b56e8659a8514be429 | Hack90/virus_tiny | [
"region:us"
] | 2024-02-17T17:41:16+00:00 | {"dataset_info": {"features": [{"name": "Accession", "dtype": "string"}, {"name": "Organism_Name", "dtype": "string"}, {"name": "Submitters", "dtype": "string"}, {"name": "Organization", "dtype": "string"}, {"name": "Org_location", "dtype": "string"}, {"name": "Release_Date", "dtype": "string"}, {"name": "Isolate", "dtype": "string"}, {"name": "Species", "dtype": "string"}, {"name": "Genus", "dtype": "string"}, {"name": "Family", "dtype": "string"}, {"name": "Molecule_type", "dtype": "string"}, {"name": "Length", "dtype": "int64"}, {"name": "Sequence_Type", "dtype": "string"}, {"name": "Publications", "dtype": "float64"}, {"name": "Geo_Location", "dtype": "string"}, {"name": "Country", "dtype": "string"}, {"name": "Host", "dtype": "string"}, {"name": "Isolation_Source", "dtype": "string"}, {"name": "Collection_Date", "dtype": "string"}, {"name": "BioSample", "dtype": "string"}, {"name": "BioProject", "dtype": "string"}, {"name": "GenBank_Title", "dtype": "string"}, {"name": "Sequence", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 172388184, "num_examples": 13829}], "download_size": 71753842, "dataset_size": 172388184}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T17:41:20+00:00 |
|
56d70d717d94b10a64f87e347a83af740a297c97 | LiukG/From_Genebank | [
"region:us"
] | 2024-02-17T17:41:34+00:00 | {} | 2024-02-17T17:41:34+00:00 |
|
652ac6e49721442e74be2cca27bbd4f7343ac7ef | Rigoleto/drone-detection | [
"region:us"
] | 2024-02-17T17:42:28+00:00 | {} | 2024-02-17T17:42:28+00:00 |
|
8e661bb46ab5fc89dd0a0b53b5ac214b4f59c308 |
# Dataset of dushevnaya/ドゥシェーヴヌイ/杜莎妮 (Neural Cloud)
This is the dataset of dushevnaya/ドゥシェーヴヌイ/杜莎妮 (Neural Cloud), containing 30 images and their tags.
The core tags of this character are `long_hair, animal_ears, braid, grey_hair, bangs, hair_between_eyes, breasts, green_eyes, blue_eyes, large_breasts, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 30 | 44.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dushevnaya_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 30 | 23.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dushevnaya_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 69 | 48.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dushevnaya_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 30 | 39.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dushevnaya_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 69 | 76.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dushevnaya_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/dushevnaya_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | blush, white_background, 1girl, open_mouth, simple_background, solo, green_hairband, holding, looking_at_viewer, bare_shoulders, collarbone, dress, hair_flower, official_alternate_costume, smile, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | white_background | 1girl | open_mouth | simple_background | solo | green_hairband | holding | looking_at_viewer | bare_shoulders | collarbone | dress | hair_flower | official_alternate_costume | smile | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:--------|:-------------|:--------------------|:-------|:-----------------|:----------|:--------------------|:-----------------|:-------------|:--------|:--------------|:-----------------------------|:--------|:-------------|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/dushevnaya_neuralcloud | [
"region:us"
] | 2024-02-17T17:42:52+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-02-17T17:42:52+00:00 |
1082a0c2b7c04724f7435d03717c6b5c4d65745a | CyberHarem/nascita_neuralcloud | [
"region:us"
] | 2024-02-17T17:42:59+00:00 | {} | 2024-02-17T17:43:26+00:00 |
|
f5599486ea67d96ec6238d206149b5ac5b6bee7f |
# Dataset of abigail/アビゲイル/阿比盖尔 (Neural Cloud)
This is the dataset of abigail/アビゲイル/阿比盖尔 (Neural Cloud), containing 30 images and their tags.
The core tags of this character are `animal_ears, blonde_hair, long_hair, blue_eyes, breasts, goggles_on_head, large_breasts, tail, bangs, fang`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 30 | 29.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/abigail_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 30 | 19.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/abigail_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 65 | 36.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/abigail_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 30 | 26.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/abigail_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 65 | 48.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/abigail_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/abigail_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------|
| 0 | 30 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, smile, open_mouth, shirt, cleavage, holding, goggles, shorts, blush, gloves, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | open_mouth | shirt | cleavage | holding | goggles | shorts | blush | gloves | looking_at_viewer |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-------------|:--------|:-----------|:----------|:----------|:---------|:--------|:---------|:--------------------|
| 0 | 30 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/abigail_neuralcloud | [
"region:us"
] | 2024-02-17T17:43:00+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-02-17T17:43:00+00:00 |