Noisy Ground Truth - Missing Words in Train Split only

Dataset of synthetic data for experimenting with noisy ground truth. The text in the dataset is based on Colette's Sido and Les Vignes; the data was processed before the images were generated with the TextRecognitionDataGenerator.
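
As an illustration only (this is not the exact generation pipeline, which is not documented here), text line images of this kind can be rendered with the TextRecognitionDataGenerator Python API; the example lines, image height, and language setting below are assumptions.

```python
# Sketch: rendering text lines to images with TextRecognitionDataGenerator (trdg).
# Assumed setup for illustration, not the pipeline actually used for this dataset.
from trdg.generators import GeneratorFromStrings

lines = [
    "Sido aimait son jardin",          # placeholder text lines (hypothetical)
    "et les vignes de la colline",
]

generator = GeneratorFromStrings(
    lines,
    count=len(lines),   # one image per input line
    size=64,            # assumed image height in pixels
    language="fr",
)

for i, (image, label) in enumerate(generator):
    image.save(f"line_{i:04d}.png")     # PIL image
    print(label)                        # ground-truth transcription
```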

In Noisy Ground Truth - Missing Words in Train Split only, the variation columns are affected by the noise only in the train split; the validation and test splits are left untouched.
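
For example, this property can be checked with the datasets library by comparing the gold column with one of the variation columns per split. This is a minimal sketch; it assumes the repository loads as a single train split and that the split column stores the strings "train", "validation", and "test".

```python
# Sketch: verifying that the noise only affects rows marked as "train".
from datasets import load_dataset

ds = load_dataset("alix-tz/noisy-gt-missing-words-train-only", split="train")

for row in ds.select(range(100)):          # small sample for illustration
    if row["split"] in ("validation", "test"):
        # Outside the train split, the variation columns should equal the gold text.
        assert row["MW-F"] == row["gold"]
    else:
        # In the train split, MW-F drops the first word of the gold line.
        print(row["gold"], "->", row["MW-F"])
```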

Data structure

The dataset is composed of the following columns:

  • gold: the original text
  • source: identifies the original work from which the text was extracted
    • e.g. "sido-colette" means that the text was extracted from Colette's Sido (or Les Vignes)
  • MW-F: the first word is missing
  • MW-L: the last word is missing
  • MW-50: half of the words are missing, chosen randomly
  • MW-50-L: half of the words are missing, chosen from the end of the text
  • MW-1: one word is missing, chosen randomly
  • MW-1-POS: position of the word removed in MW-1; "-2" means that no word is missing
  • n_line: number identifying the line within the page given by n_page
  • n_page: number identifying the page (a new page starts every 25 lines)
  • split: the assigned split (train, validation or test), applied at the page level
  • im: the synthetic text line image
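
To make the semantics of the variation columns concrete, here is a sketch of how such missing-word variants could be derived from a gold line. This is an illustration only: the actual generation code is not part of this card, and the word splitting, random sampling, and use of -2 for empty lines are assumptions.

```python
# Sketch: producing missing-word variants of a gold line (illustrative only).
# Words are split on whitespace.
import random

def mw_f(text: str) -> str:
    """MW-F: drop the first word."""
    words = text.split()
    return " ".join(words[1:])

def mw_l(text: str) -> str:
    """MW-L: drop the last word."""
    words = text.split()
    return " ".join(words[:-1])

def mw_1(text: str, rng: random.Random) -> tuple[str, int]:
    """MW-1: drop one randomly chosen word and return its position (MW-1-POS).
    Returns -2 as the position when nothing is removed (assumption)."""
    words = text.split()
    if not words:
        return text, -2
    pos = rng.randrange(len(words))
    return " ".join(words[:pos] + words[pos + 1:]), pos

def mw_50(text: str, rng: random.Random) -> str:
    """MW-50: drop half of the words, chosen at random."""
    words = text.split()
    keep = set(rng.sample(range(len(words)), len(words) - len(words) // 2))
    return " ".join(w for i, w in enumerate(words) if i in keep)

def mw_50_l(text: str) -> str:
    """MW-50-L: drop half of the words, taken from the end of the line."""
    words = text.split()
    return " ".join(words[: len(words) - len(words) // 2])

rng = random.Random(0)
line = "la maison de ma mère"           # placeholder gold line
print(mw_f(line), "|", mw_l(line), "|", mw_1(line, rng), "|", mw_50(line, rng))
```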

Dataset Card Contact

Alix Chagué (first.last@inria.fr)
