---
language_creators:
- found
languages:
- en-US
licenses:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Scifi_TV_Shows
size_categories:
- unknown
source_datasets:
- original
task_categories:
- other
task_ids:
- other-other-story-generation
tags:
- Story Generation
paperswithcode_id: scifi-tv-plots
---

Dataset Description
---

A collection of plot synopses from long-running (80+ episodes) science fiction TV shows, scraped from Fandom.com wikis in November 2017. Each episode is treated as a "story".

Contains plot summaries from:

* Babylon 5 (https://babylon5.fandom.com/wiki/Main_Page) - 84 stories
* Doctor Who (https://tardis.fandom.com/wiki/Doctor_Who_Wiki) - 311 stories
* Doctor Who spin-offs - 95 stories
* Farscape (https://farscape.fandom.com/wiki/Farscape_Encyclopedia_Project:Main_Page) - 90 stories
* Fringe (https://fringe.fandom.com/wiki/FringeWiki) - 87 stories
* Futurama (https://futurama.fandom.com/wiki/Futurama_Wiki) - 87 stories
* Stargate (https://stargate.fandom.com/wiki/Stargate_Wiki) - 351 stories
* Star Trek (https://memory-alpha.fandom.com/wiki/Star_Trek) - 701 stories
* Star Wars books (https://starwars.fandom.com/wiki/Main_Page) - 205 stories, each book is a story
* Star Wars Rebels - 65 stories
* X-Files (https://x-files.fandom.com/wiki/Main_Page) - 200 stories

Total: 2,276 stories

The dataset is "eventified" and generalized (see LJ Martin, P Ammanabrolu, X Wang, W Hancock, S Singh, B Harrison, and MO Riedl, "Event Representations for Automated Story Generation with Deep Neural Nets," Thirty-Second AAAI Conference on Artificial Intelligence (AAAI), 2018, for details on these processes) and split into train/test/validation sets for the task of converting events into full sentences. The split is made by story, so that full stories stay together.
Format
---

all-sci-fi-data.txt --

* Each story line contains data in the format: 5-tuple events in a list (subject, verb, direct object, modifier noun, preposition) ||| generalized 5-tuple events in a list ||| original sentence ||| generalized sentence

  e.g.,
  > [[u'Voyager', u'run', 'EmptyParameter', u'deuterium', u'out'], [u'Voyager', u'force', u'go', 'EmptyParameter', 'EmptyParameter'], [u'Voyager', u'go', 'EmptyParameter', u'mode', u'into']]|||[['<VESSEL>0', 'function-105.2.1', 'EmptyParameter', "Synset('atom.n.01')", u'out'], ['<VESSEL>0', 'urge-58.1-1', u'escape-51.1-1', 'EmptyParameter', 'EmptyParameter'], ['<VESSEL>0', u'escape-51.1-1', 'EmptyParameter', "Synset('statistic.n.01')", u'into']]|||The USS Voyager is running out of deuterium as a fuel and is forced to go into Gray mode.|||the <VESSEL>0 is running out of Synset('atom.n.01') as a Synset('matter.n.03') and is forced to go into Synset('horse.n.01') Synset('statistic.n.01').
* Stories end with an <EOS> (end of story) tag on its own line.
* On the line after <EOS>, there is a defaultdict of the entities found in the story, keyed by tag and listed in order of appearance (e.g., the second entity in the "<ORGANIZATION>" list in the dictionary would be <ORGANIZATION>1 in the story; indexing starts at 0). These lines start with "%%%%%%%%%%%%%%%%%".
  e.g.,
  > %%%%%%%%%%%%%%%%%defaultdict(<type 'list'>, {'<ORGANIZATION>': ['seven of nine', 'silver blood'], '<LOCATION>': ['sickbay', 'astrometrics', 'paris', 'cavern', 'vorik', 'caves'], '<DATE>': ['an hour ago', 'now'], '<MISC>': ['selected works', 'demon class', 'electromagnetic', 'parises', 'mimetic'], '<DURATION>': ['less than a week', 'the past four years', 'thirty seconds', 'an hour', 'two hours'], '<NUMBER>': ['two', 'dozen', '14', '15'], '<ORDINAL>': ['first'], '<PERSON>': ['tom paris', 'harry kim', 'captain kathryn janeway', 'tuvok', 'chakotay', 'jirex', 'neelix', 'the doctor', 'seven', 'ensign kashimuro nozawa', 'green', 'lt jg elanna torres', 'ensign vorik'], '<VESSEL>': ['uss voyager', 'starfleet']})

Files in Test-Train-Val Directory --

* File names: all-sci-fi-val.txt, all-sci-fi-test.txt, & all-sci-fi-train.txt
* Each story line contains data in the format: 5-tuple events in a list ||| generalized 5-tuple events in a list ||| original sentence ||| generalized sentence

  e.g.,
  > [[u'Voyager', u'run', 'EmptyParameter', u'deuterium', u'out'], [u'Voyager', u'force', u'go', 'EmptyParameter', 'EmptyParameter'], [u'Voyager', u'go', 'EmptyParameter', u'mode', u'into']]|||[['<VESSEL>0', 'function-105.2.1', 'EmptyParameter', "Synset('atom.n.01')", u'out'], ['<VESSEL>0', 'urge-58.1-1', u'escape-51.1-1', 'EmptyParameter', 'EmptyParameter'], ['<VESSEL>0', u'escape-51.1-1', 'EmptyParameter', "Synset('statistic.n.01')", u'into']]|||The USS Voyager is running out of deuterium as a fuel and is forced to go into Gray mode.|||the <VESSEL>0 is running out of Synset('atom.n.01') as a Synset('matter.n.03') and is forced to go into Synset('horse.n.01') Synset('statistic.n.01').
* No <EOS> tags or entity dictionaries.
* Split 80-10-10 into train, test, & validation sets, separated by story instead of by individual lines.
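The `|||`-separated line format above can be loaded with a few lines of Python. This is a minimal sketch, not an official loader; the function name and the dictionary keys are our own labels. Both event fields are valid Python list literals (entries such as "Synset('atom.n.01')" are quoted strings, and the u'...' prefixes are accepted by Python 3), so `ast.literal_eval` can parse them safely.

```python
import ast

def parse_line(line):
    """Split one story line into its four |||-separated fields."""
    events, gen_events, sentence, gen_sentence = line.rstrip("\n").split("|||")
    return {
        # Each event is a 5-tuple: (subject, verb, direct object,
        # modifier noun, preposition), stored as a list of lists.
        "events": ast.literal_eval(events),
        "generalized_events": ast.literal_eval(gen_events),
        "sentence": sentence,
        "generalized_sentence": gen_sentence,
    }

# Shortened example line from the dataset:
sample = ("[[u'Voyager', u'run', 'EmptyParameter', u'deuterium', u'out']]"
          "|||[['<VESSEL>0', 'function-105.2.1', 'EmptyParameter', "
          "\"Synset('atom.n.01')\", u'out']]"
          "|||The USS Voyager is running out of deuterium."
          "|||the <VESSEL>0 is running out of Synset('atom.n.01').")
parsed = parse_line(sample)
```

For all-sci-fi-data.txt, the same parser applies; readers would additionally need to group lines into stories at the `<EOS>` markers and handle the "%%%%%%%%%%%%%%%%%" entity-dictionary lines separately.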
Files in Input_OutputFiles Directory --

Files ending with *_input.txt --

* One generalized 5-tuple event per line, formatted as a string instead of a list

  e.g.,
  > <VESSEL>0 function-105.2.1 EmptyParameter Synset('atom.n.01') out

Files ending with *_output.txt --

* Corresponding generalized sentence for the events in the matching *_input.txt file

  e.g.,
  > the <VESSEL>0 is running out of Synset('atom.n.01') as a Synset('matter.n.03') and is forced to go into Synset('horse.n.01') Synset('statistic.n.01').

Files in OriginalStoriesSeparated Directory --

* Contains the unedited, unparsed original stories scraped from the respective Fandom wikis.
* Each line is a story with sentences space-separated. After each story, there is an <EOS> tag on a new line.
* There is one file for each of the 11 domains listed above.

Citation
---

    @inproceedings{Ammanabrolu2020AAAI,
      title={Story Realization: Expanding Plot Events into Sentences},
      author={Prithviraj Ammanabrolu and Ethan Tien and Wesley Cheung and Zhaochen Luo and William Ma and Lara J. Martin and Mark O. Riedl},
      booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
      year={2020},
      volume={34},
      number={05},
      url={https://ojs.aaai.org//index.php/AAAI/article/view/6232}
    }

Licensing
---

Creative Commons Attribution 4.0 International License: https://creativecommons.org/licenses/by/4.0/