Dataset Preview
The full dataset viewer is not available. Only a preview of the rows is shown.
The dataset generation failed
Error code:   DatasetGenerationError
Exception:    TypeError
Message:      Couldn't cast array of type
struct<0: int64, 1: int64, ..., 45: int64>  (a struct with 46 int64 fields, keys '0' through '45')
to
{'0': Value(dtype='int64'), '1': Value(dtype='int64'), ..., '184': Value(dtype='int64')}  (the expected schema: 185 int64 fields, keys '0' through '184')
Traceback:    Traceback (most recent call last):
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 2011, in _prepare_split_single
                  writer.write_table(table)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 585, in write_table
                  pa_table = table_cast(pa_table, self._schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2302, in table_cast
                  return cast_table_to_schema(table, schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2261, in cast_table_to_schema
                  arrays = [cast_array_to_feature(table[name], feature) for name, feature in features.items()]
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2261, in <listcomp>
                  arrays = [cast_array_to_feature(table[name], feature) for name, feature in features.items()]
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 1802, in wrapper
                  return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 1802, in <listcomp>
                  return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2122, in cast_array_to_feature
                  raise TypeError(f"Couldn't cast array of type\n{_short_str(array.type)}\nto\n{_short_str(feature)}")
              TypeError: Couldn't cast array of type
              struct<0: int64, 1: int64, ..., 45: int64>
              to
              {'0': Value(dtype='int64'), '1': Value(dtype='int64'), ..., '184': Value(dtype='int64')}
              (the same 46-field struct vs. 185-field schema condensed above)
              
              The above exception was the direct cause of the following exception:
              
              Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1529, in compute_config_parquet_and_info_response
                  parquet_operations = convert_to_parquet(builder)
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1154, in convert_to_parquet
                  builder.download_and_prepare(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1027, in download_and_prepare
                  self._download_and_prepare(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1122, in _download_and_prepare
                  self._prepare_split(split_generator, **prepare_split_kwargs)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1882, in _prepare_split
                  for job_id, done, content in self._prepare_split_single(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 2038, in _prepare_split_single
                  raise DatasetGenerationError("An error occurred while generating the dataset") from e
              datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset
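
The two type dumps explain the failure: one Arrow batch infers a struct with 46 int64 fields while the declared feature schema expects 185, and the datasets library cannot cast between struct types with different field sets. A minimal sketch of this failure mode in pyarrow, using hypothetical data rather than the dataset's actual rows:

```python
import pyarrow as pa

# Two batches whose dicts carry different key sets infer two different,
# incompatible struct types, mirroring the 46-field vs. 185-field mismatch.
batch_a = pa.array([{str(i): i for i in range(46)}])   # struct<0: int64, ..., 45: int64>
batch_b = pa.array([{str(i): i for i in range(185)}])  # struct<0: int64, ..., 184: int64>

print(batch_a.type == batch_b.type)  # False: no single struct schema fits both
```

This pattern typically arises when column-oriented JSON (row-index keys becoming struct fields) is read in chunks with different row counts, so each chunk's struct width equals its row count; that appears to be the likely cause here, though the repository's file layout would need to be checked to confirm.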


The preview exposes three columns, each typed as dict, with one truncated sample value per column:

Unnamed: 0 (dict):
  {"0":192,"1":139,"2":153,"3":110,"4":147,"5":93,"6":220,"7":84,"8":46,"9":212,"10":36,"11":173,"12":(...TRUNCATED)

data (dict):
  {"0":"{'title': 'Atlantic_City,_New_Jersey', 'paragraphs': [{'context': 'هو على جزيرة اب(...TRUNCATED)

version (dict):
  {"0":1.1,"1":1.1,"2":1.1,"3":1.1,"4":1.1,"5":1.1,"6":1.1,"7":1.1,"8":1.1,"9":1.1,"10":1.1,"11":1.1,"(...TRUNCATED)

Arabic_SQuAD / README.md

dataset_info:
  features:
    - name: index
      dtype: string
    - name: question
      dtype: string
    - name: context
      dtype: string
    - name: text
      dtype: string
    - name: answer_start
      dtype: int64
    - name: c_id
      dtype: int64
  splits:
    - name: train
      num_bytes: 61868003
      num_examples: 48344
  download_size: 10512179
  dataset_size: 61868003
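
Given the schema declared above, the standard loading path is datasets.load_dataset; note that it may fail with the same DatasetGenerationError the viewer reports, so treat this as a sketch of the intended usage rather than a verified one:

```python
from datasets import load_dataset

# May raise DatasetGenerationError if the JSON chunks still infer
# inconsistent struct widths, as in the viewer traceback above.
ds = load_dataset("Mostafa3zazi/Arabic_SQuAD", split="train")
print(ds.features)        # index, question, context, text, answer_start, c_id
print(ds[0]["question"])  # one machine-translated Arabic question
```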

Dataset Card for "Arabic_SQuAD"

More Information needed


Citation

@inproceedings{mozannar-etal-2019-neural,
    title = "Neural {A}rabic Question Answering",
    author = "Mozannar, Hussein  and
      Maamary, Elie  and
      El Hajal, Karl  and
      Hajj, Hazem",
    booktitle = "Proceedings of the Fourth Arabic Natural Language Processing Workshop",
    month = aug,
    year = "2019",
    address = "Florence, Italy",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/W19-4612",
    doi = "10.18653/v1/W19-4612",
    pages = "108--118",
    abstract = "This paper tackles the problem of open domain factual Arabic question answering (QA) using Wikipedia as our knowledge source. This constrains the answer of any question to be a span of text in Wikipedia. Open domain QA for Arabic entails three challenges: annotated QA datasets in Arabic, large scale efficient information retrieval and machine reading comprehension. To deal with the lack of Arabic QA datasets we present the Arabic Reading Comprehension Dataset (ARCD) composed of 1,395 questions posed by crowdworkers on Wikipedia articles, and a machine translation of the Stanford Question Answering Dataset (Arabic-SQuAD). Our system for open domain question answering in Arabic (SOQAL) is based on two components: (1) a document retriever using a hierarchical TF-IDF approach and (2) a neural reading comprehension model using the pre-trained bi-directional transformer BERT. Our experiments on ARCD indicate the effectiveness of our approach with our BERT-based reader achieving a 61.3 F1 score, and our open domain system SOQAL achieving a 27.6 F1 score.",
}
